06-15-2009 06:35 PM
I have a custom MainScreen containing a VerticalFieldManager and a number of focusable custom fields. When the user trackball-clicks one of the fields, I want to inject a keystroke into the application that launched my custom MainScreen. I have this "almost" working, but that application is capturing the trackball click first, and only then my injected keystrokes.
I have the application set up so that the MainScreen overrides the trackwheelClick event, injects the keystrokes, and returns true to stop the click event from propagating any further.
Why is the trackball click event being passed to the calling application?
06-16-2009 06:51 AM
I think we need to be careful that we understand which objects are involved. For example, when you say the keystrokes are being passed to your application, which object is that? Is it your UiApplication? Can you explain the problem again, carefully spelling out which objects are involved and how they relate?
One thing though: check carefully whether you have set up the appropriate listeners. Are you using listeners in your processing, or just the MainScreen methods that do this for you?
06-16-2009 11:31 PM
Here is the hierarchy:
- Base App (not my app)
  - MainScreen (mine)
    - CustomField (mine, generating the trackball click)
I want to be able to inject a keystroke into the Base App. Right now there are no event handlers in either the CustomField or the VerticalFieldManager. The MainScreen has:
protected boolean trackwheelClick(int status, int time) {
    EventInjector.invokeEvent(new EventInjector.KeyCodeEvent(KeyCodeEvent.KEY_DOWN, 'V', KeyListener.STATUS_ALT));
    EventInjector.invokeEvent(new EventInjector.KeyCodeEvent(KeyCodeEvent.KEY_UP, 'V', KeyListener.STATUS_ALT));
    return true; // consume the click
}
As I stated above, what happens is that the Base App registers a trackball click before it gets my injected key codes. I thought returning true would have prevented this. Do I need to implement additional handlers in the VerticalFieldManager or the CustomField?
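One way to picture why the click could be acted on before the injected keystrokes: if injected events simply join the back of the event queue, anything already pending when the handler runs is processed first. The sketch below is an illustrative plain-Java model of that queueing behaviour, not the RIM API; all names and the queue itself are mine.

```java
// Illustrative model (plain Java, not the RIM API): injected events join the
// back of the event queue, so the click is fully processed before them.
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class EventQueueModel {
    public static List<String> run() {
        Queue<String> queue = new ArrayDeque<String>();
        List<String> processed = new ArrayList<String>();
        queue.add("trackwheelClick");
        while (!queue.isEmpty()) {
            String event = queue.remove();
            if (event.equals("trackwheelClick")) {
                // The handler injects keystrokes: they are appended after
                // anything already waiting in the queue.
                queue.add("KEY_DOWN V");
                queue.add("KEY_UP V");
            }
            processed.add(event);
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println(run()); // [trackwheelClick, KEY_DOWN V, KEY_UP V]
    }
}
```

Under this model the click always completes before the injected keystrokes are seen, regardless of what the handler returns.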
06-17-2009 04:45 AM
Thanks for this information. It might be useful to know which classes are involved in the Base App. When it pushes your Screen, is there a chance that it registers a TrackwheelListener against your Screen?
In general, however, what you are suggesting should not happen. UI events should be directed to the active Screen.
I have a few other questions prompted by your response:
1) "Base App registers a Trackball Click before it gets my injected keycode."
How do you know that it does this?
2) What device and OS level are you running this on? Is the behaviour consistent, and does it obey the forward-compatibility rule for both your app and the Base App?
3) If you are running on trackball devices (as opposed to scrollwheel devices like the 8700), you can use navigationClick to trap the 'click'. From memory, this fires before trackwheelClick, and trackwheelClick (which is there for compatibility) only fires if navigationClick does not return true. Can you confirm that navigationClick is not being used anywhere?
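The fallback order described in point 3 can be sketched as a small illustrative model. This is plain Java, not the RIM API; the method names mirror the Screen methods but the dispatch logic here is mine, written only to show the "navigationClick first, trackwheelClick only if not consumed" contract.

```java
// Illustrative model (plain Java, not the RIM API): navigationClick is tried
// first; trackwheelClick fires only if navigationClick does not return true.
import java.util.ArrayList;
import java.util.List;

public class NavDispatch {
    private static List<String> log;

    static boolean navigationClick(boolean consume) { log.add("navigationClick"); return consume; }
    static boolean trackwheelClick() { log.add("trackwheelClick"); return true; }

    // Simulate one click: navigationClick first, trackwheelClick as fallback.
    public static List<String> click(boolean navConsumes) {
        log = new ArrayList<String>();
        if (!navigationClick(navConsumes)) {
            trackwheelClick(); // compatibility fallback only
        }
        return log;
    }

    public static void main(String[] args) {
        System.out.println(click(true));  // [navigationClick]
        System.out.println(click(false)); // [navigationClick, trackwheelClick]
    }
}
```

If any code in the stack implements navigationClick and returns true, trackwheelClick is never reached, which is why it matters whether navigationClick is used anywhere.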
06-17-2009 12:20 PM
To answer your questions:
1) I know that the Base App is getting a trackball/trackwheel click event because the Base App's menu pops up; after that it responds to the key-code event. The Simulator seems to work correctly (the Base App's menu never pops up); it only happens on my physical device.
2) I am using an 8310 with OS 188.8.131.52 (same device/OS in the Simulator)
I changed to navigationClick instead of trackwheelClick for better compatibility; the menu in the Base App still pops up. I have yet to try the application on other devices.
I would have thought that returning true would prevent this. Is it because I am calling this.close() on the MainScreen before it returns true? Why does it work on the Simulator but not on my device? :s
Thanks for all your help.
06-18-2009 12:08 PM
Could you further elaborate on the following:
a) what classes are involved in the Base App?
b) how does the Base App interact with your Screen - does it push your Screen?
c) Does the Base App register any listeners against your screen?
As I said in my previous post, in general what you are suggesting should not happen.
Could you attach the device, debug it, and put a breakpoint in the makeMenu of the Base App? I would be interested to know how it was invoked. I wonder if some event on your screen has decided that it needs to get the menu from the Base App.
I am surprised that you see a difference between the Simulator and the device OS in this regard if they are the same level. Can I suggest you look through the Simulator downloads to see if you can find an exact match for your device? And you are using a JDE/plug-in that is 4.5 or earlier, aren't you?
Regarding your return value, can you confirm that your code is actually invoked, and invoked before the menu is displayed? There are issues in actually being able to verify this. Here is a grubby suggestion that might work: in your navigationClick method, do an invokeAndWait on a Runnable that displays a Dialog.alert(). This needs to be invokeAndWait so that any other screen events already queued are scheduled and run before the Dialog appears. Because the Dialog does not cover the whole screen, if the menu appears underneath it, you know the menu was invoked before your code was called. Alternatively, once you dismiss the Dialog, the menu might appear. This is not conclusive, and I can see holes in it, but it is an interesting experiment.
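The reasoning behind that experiment can be sketched with a plain-Java model of a single-threaded event queue (not the RIM API; the executor here stands in for the UI event thread). A task submitted and then waited on runs only after everything already queued, so the Dialog-via-invokeAndWait acts as an ordering marker.

```java
// Illustrative model (plain Java, not the RIM API): work already queued on
// the event thread runs before a task you submit and synchronously wait on,
// which is why invokeAndWait makes the Dialog a useful ordering marker.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class InvokeAndWaitOrder {
    public static List<String> run() {
        final List<String> log = new ArrayList<String>();
        ExecutorService eventThread = Executors.newSingleThreadExecutor();
        // A menu event that is already sitting on the queue.
        eventThread.submit(new Runnable() { public void run() { log.add("menu"); } });
        try {
            // invokeAndWait analogue: submit and block until the Runnable has run.
            eventThread.submit(new Runnable() { public void run() { log.add("dialog"); } }).get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        eventThread.shutdown();
        return log;
    }

    public static void main(String[] args) {
        System.out.println(run()); // [menu, dialog]
    }
}
```

So if the menu was queued before your handler ran, it will always be processed before the waited-on Dialog appears, matching the "menu underneath the Dialog" diagnostic in the post.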