03-17-2012 07:29 PM
Hey, I've got a fully functional game together, using the structure from Toni Westbrook's excellent tutorial. I'm very pleased with it! But not all is well.
It's a scroller, and I've written it so that a skilled player can go on, and on, and on... powering up rather than dying and getting another life. So, scores or hundreds of objects have been added to the vector, and then removed with vector.removeElementAt().
I think the gradual slowdown is due to the number of (dead) objects, but I'm not sure. Anyway, I discovered vector.trimToSize() and thought, "that's it!" But it has not cured the problem.
Can anyone shed light on the CPU and memory use in a case with lots of mostly-removed objects? Or suggest something else I should be looking at as a cause of the slowdown?
The scrolling BG should not be relevant, as it's just a graphics object and a bitmap the size of two screens that I keep drawing on, moving, and copying.
Using OS 5 on Curve 8900.
Solved!
03-17-2012 10:16 PM
03-18-2012 12:56 AM
There are 4 object types other than my player, and one bitmap for each, so each bitmap is shared among however many instances exist, maybe up to 5 or so of any type at once, unless the random sequence does something unusual.
That's an interesting idea: basically flipping them off until it's time for reuse. I'll definitely hold it in reserve and consider it further in its own right. 'Course, then I'd have, say, 20 of these objects at all times, whereas with removal they could at times dwindle down near zero, where they could, theoretically, be "lighter."
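For anyone following along, the reuse idea can be sketched as a simple pool: instead of removing dead objects from the Vector, flag them inactive and reactivate one when a new object is needed. This is only a sketch under assumptions; the Enemy class, its fields, and the pool API below are hypothetical, not taken from the game.

```java
import java.util.Vector;

// Hypothetical pooled game object: instead of removing dead instances
// from the Vector, we flag them inactive and reuse them later.
class Enemy {
    boolean active = false;
    int x, y;

    void spawn(int startX, int startY) {
        x = startX;
        y = startY;
        active = true;
    }
}

public class EnemyPool {
    private final Vector pool = new Vector();

    // Reuse an inactive Enemy if one exists; otherwise grow the pool.
    public Enemy obtain(int x, int y) {
        for (int i = 0; i < pool.size(); i++) {
            Enemy e = (Enemy) pool.elementAt(i);
            if (!e.active) {
                e.spawn(x, y);
                return e;
            }
        }
        Enemy e = new Enemy();
        e.spawn(x, y);
        pool.addElement(e);
        return e;
    }

    public int size() { return pool.size(); }

    public static void main(String[] args) {
        EnemyPool p = new EnemyPool();
        Enemy a = p.obtain(0, 0);
        Enemy b = p.obtain(10, 0);
        a.active = false;          // "kill" the first enemy
        Enemy c = p.obtain(5, 5);  // reuses a instead of allocating
        System.out.println(p.size() + " " + (a == c)); // 2 true
    }
}
```

The trade-off is exactly the one described above: the pool never shrinks, but the Vector's size stays stable and nothing is ever garbage-collected mid-game.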
I would still like to know more about the efficiency of vectors and object removal, as well as thoughts on other possibilities.
03-18-2012 10:51 PM
That sounds like a good suggestion; I had not been aware of the profiler.
With regard to vector.trimToSize(), I've done some reading, and it appears that using it becomes counterproductive when the number of objects keeps changing. That is, with no spare capacity left in the vector, the very next addition forces the vector to enlarge itself again, which takes more time than the trim saved.
I've also learned that, when more room is needed, the default behavior of the vector is to double its capacity, to avoid the speed penalty of repeatedly enlarging the vector. It is unclear whether "carrying" this extra capacity brings its own penalty.
And it's possible to set the initial capacity of the vector, if the expected size range is known. That seems like it might be good for me (though I now doubt that vector size is the root of my speed problem): I could set a useful number, and not carry nearly half the capacity sitting unused, as can happen when the capacity is barely exceeded and is automatically doubled as the program runs.
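The capacity behavior described above can be demonstrated directly. This assumes the standard java.util.Vector, which starts at capacity 10 and doubles when its capacityIncrement is left at 0; the two-argument constructor sets both an initial capacity and a fixed increment so it never doubles.

```java
import java.util.Vector;

public class VectorCapacityDemo {
    public static void main(String[] args) {
        // Default: capacity 10, doubles when full (capacityIncrement 0).
        Vector v = new Vector();
        for (int i = 0; i < 11; i++) {
            v.addElement(new Integer(i));
        }
        System.out.println(v.capacity()); // 20: doubled from 10 on the 11th add

        v.trimToSize();
        System.out.println(v.capacity()); // 11: trimmed down to current size

        // Sized up front, growing by a fixed 10 slots instead of doubling:
        Vector sized = new Vector(30, 10);
        System.out.println(sized.capacity()); // 30
    }
}
```

So after a trim, the very next addElement() pays for a reallocation and copy, which is why trimming a vector whose size keeps changing can hurt more than it helps.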
03-19-2012 11:37 PM
Thanks, Maadani. The profiler did the trick. I have a routine to kill off objects that have gone off screen, and one was not working as expected. The profiler numbers let me see right away that the count for that type of object was way out of line with the others.
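For anyone hitting a similar bug: a classic mistake in this kind of kill-off routine is removing elements while iterating forward, since removeElementAt(i) shifts the next element into slot i and the loop then skips it. Iterating backward avoids that. The actual bug in the game isn't shown here; isOffScreen and the test values below are made up for illustration.

```java
import java.util.Vector;

public class CullDemo {
    // Hypothetical test: an object is "off screen" when x < 0.
    static boolean isOffScreen(int x) {
        return x < 0;
    }

    public static void main(String[] args) {
        Vector objects = new Vector();
        int[] xs = { 5, -1, -2, 7, -3 };
        for (int i = 0; i < xs.length; i++) {
            objects.addElement(new Integer(xs[i]));
        }

        // Iterate backward so removeElementAt(i) never shifts an
        // unexamined element into a slot we've already passed.
        for (int i = objects.size() - 1; i >= 0; i--) {
            int x = ((Integer) objects.elementAt(i)).intValue();
            if (isOffScreen(x)) {
                objects.removeElementAt(i);
            }
        }
        System.out.println(objects.size()); // 2: only 5 and 7 survive
    }
}
```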
The profiler is not exactly intuitive to use! In case anyone else needs it, here's the BB reference section for the Eclipse profiler, which got me through it.
I must say, this Eclipse + plugin combination is fantastic. Doing my first program in Java, and for BlackBerry, has been made much smoother by the context help Eclipse provides. Otherwise, it's always difficult getting the hang of the declarations, parameters, syntax, etc. of a new language.