vtkButtonWidget (with vtkTex) cannot capture events or render at the correct position when the window has more than one renderer
This issue was created automatically from an original Mantis Issue. Further discussion may take place here.
- Create a render window.
- Add two renderers to the window: one on top, one on the bottom.
- Add a vtkButtonWidget to the window.
- On a click in the window, move the button to the clicked position (on the upper renderer). Set the current renderer to the upper renderer, capture the position with `int x = this->Interactor->GetEventPosition()[0];` and `int y = this->Interactor->GetEventPosition()[1];`, then compute the bounds and place the widget with `button->GetRepresentation()->PlaceWidget(bounds);`.
Now you have two options:
- use the input x, y directly, or
- offset x, y according to the viewport of the renderer. Assuming you place the button on the upper renderer, its lower-left corner may be at (0, 1/2 the window height).
- Render. In case 1 the button is not placed at the correct position but at the top of the window; the position seems to be computed with (0,0) as the lower-left of the whole window. In case 2 the display is correct.
- Move the mouse over the button: nothing happens. You cannot click the button or raise any event.
I debugged into vtkTexturedButtonRepresentation2D: it checks the mouse position to see whether it lies inside the area of the button's texture image. For case 2, the button seems to assume the origin is at (0, 1/2 the window height), but the input to MoveAction uses display coordinates (origin at (0,0)), so it does not work. If you move the mouse down toward the lower renderer the hit test can even fire falsely. For example, with the button at (20, 60) in a 100x100 window, moving the mouse to (20, 10) changes the cursor and the widget thinks we are on the button, when actually we are not.
- So, should we change the renderer logic, or the MoveAction/SelectAction logic? At least one of them is incorrect; I assume the rendering logic is the one that is wrong.