Wow! I can't believe how easy rendering is and the HUGE improvement I saw versus the Box2D debug renderer. I had it in my mind that it would be overly complicated, but that's not the case at all. The results are so good that my conveyor belt actually WORKS on my phone. That's great because I've got a couple of cool ideas for some levels using this construct.
My (simplistic) design, as mentioned in the previous post, is working great. I created a new renderer class that is passed the Box2D world instance. From there, I get the list of bodies that are in the world. I thought I would have to iterate over all my fixtures in order to get things to appear. That's not really the case. Although your object may be composed of several polygons (especially if you want a concave shape), your actual object is a single entity. If you have a texture that 'fits over' that entity, then you are all set. Much easier than iterating over the fixture list to draw individual elements!
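To make the idea concrete, here's a minimal sketch of that kind of body-level render loop. The `Body` class below is a simplified stand-in I made up for illustration, not the real Box2D type (the real API exposes position/angle through getters and a body list on the world); the point is just that it's one draw per body, not one per fixture.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for a physics body; the real Box2D API differs.
class Body {
    float x, y, angle;   // world position and rotation (radians)
    Object userData;     // whatever the game attached (texture info, type, ...)
    Body(float x, float y, float angle, Object userData) {
        this.x = x; this.y = y; this.angle = angle; this.userData = userData;
    }
}

public class WorldRenderer {
    // One draw call per body: the texture covers the whole entity,
    // so there is no need to walk each body's fixture list.
    static List<String> render(List<Body> bodies) {
        List<String> drawCalls = new ArrayList<>();
        for (Body b : bodies) {
            drawCalls.add(String.format("draw %s at (%.1f, %.1f) rot %.2f",
                    b.userData, b.x, b.y, b.angle));
        }
        return drawCalls;
    }

    public static void main(String[] args) {
        List<Body> world = new ArrayList<>();
        world.add(new Body(1f, 2f, 0f, "crate"));
        world.add(new Body(3f, 4f, 1.57f, "gear"));
        render(world).forEach(System.out::println);
    }
}
```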
I added some info to my user data class -- object texture, size, origin, and color. The renderer uses these to scale textures appropriately. When I get the real textures drawn up, it will use those instead. I wanted to implement scaling because some of my objects don't have a predefined size at compile time (platforms have variable length, gears have variable radius, etc.). Other objects, on the other hand, will have static sizes, so handling those should have less of an impact on CPU time when I define the 'real' textures.
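A rough sketch of what such a user data class might look like; the field and method names here are my guesses for illustration, not the author's actual class. The scale factor just stretches a texture of a given pixel size to cover the body's world-space size.

```java
// Hypothetical per-body user data: texture, size, origin, and color,
// as described above. Names are illustrative, not the real class.
public class BodyUserData {
    String textureName;   // placeholder texture for now
    float width, height;  // desired world-space size
    float originX, originY;
    float[] color;        // RGBA tint

    BodyUserData(String textureName, float width, float height) {
        this.textureName = textureName;
        this.width = width;
        this.height = height;
        this.originX = width / 2f;   // rotate about the center by default
        this.originY = height / 2f;
        this.color = new float[]{1f, 1f, 1f, 1f};
    }

    // Scale that stretches a texture of texW x texH pixels to cover this
    // body's world size (variable-length platforms, variable-radius gears).
    float scaleX(float texW) { return width / texW; }
    float scaleY(float texH) { return height / texH; }
}
```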
I added a grand total of two textures for my own debugging -- one for circles and one for squares. I also added a simple line so I can see rotation. See the screenshot below for an example. A few notes on how things went:
1) I had defined some objects with rotation and instead of rotating the object, I rotated the fixture. This resulted in rendering images that were out of whack with the body (a vertical platform, for example, rendered a texture horizontally). Bad! This was easily corrected by setting the shape's rotation to 0 and the body's .angle parameter to the desired angle. I love quick fixes like that.
2) I need to have ordered rendering. I have some instances where objects overlap. Sometimes the player's arm/gun is rendered before the body and sometimes it is rendered after. You either see the gun peeking out when it is rotated or you see it 'correctly' on top of the player. I need to correct this. Since I'm iterating over the entire world of objects, I'll probably skip this special case and have the player object render itself separately... it knows to draw the player body and then the arm next. Skipping should be a snap because my user data includes object type info.
3) My fps on my phone on the conveyor belt map was on the order of 14 using the debug renderer. Using my renderer, it shot up to close to the max -- 50-60 fps.
4) The screenshot above shows a capture of my game running on the desktop. The fps says 4774. With the debug renderer, I was getting 1/4 of that... say around 1400 fps or so. Notably slower than what I am getting right now. Most of the operational time was spent rendering joints (40%+).
5) This doesn't make the debug renderer obsolete. I can always run it AFTER I do my rendering to 'overlay' what the physics engine thinks it should be displaying. That's a pretty neat trick.
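The rotation fix from note 1 can be sketched like this. The classes below mimic Box2D's `BodyDef`/`PolygonShape` split but are simplified stand-ins I wrote for illustration: the key is that the shape stays axis-aligned and the body carries the angle, so a renderer that reads the body's angle lines up with the physics.

```java
// Simplified stand-ins for Box2D types; real signatures differ.
class BodyDef {
    float angle;   // radians; the renderer reads the body's angle
}

class PolygonShape {
    float halfW, halfH, localAngle;
    // Axis-aligned box: local rotation stays 0.
    void setAsBox(float hw, float hh) { halfW = hw; halfH = hh; localAngle = 0f; }
}

public class RotationFix {
    public static void main(String[] args) {
        // Wrong (what I had): rotate the shape, leave the body at 0 ->
        // a vertical platform renders its texture horizontally.
        // Right: shape stays at 0, the body carries the angle.
        PolygonShape shape = new PolygonShape();
        shape.setAsBox(2f, 0.25f);          // platform half-extents
        BodyDef def = new BodyDef();
        def.angle = (float) (Math.PI / 2);  // vertical platform
        System.out.println("body angle = " + def.angle
                + ", shape angle = " + shape.localAngle);
    }
}
```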
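Note 2's plan (skip player parts in the world loop, let the player draw itself in a fixed order) could look something like the sketch below. The enum and names are hypothetical, not the author's actual classes; the idea is that the type tag in the user data makes the skip trivial.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the world loop skips player parts; the player then draws
// itself in a fixed order (body first, then the arm on top).
public class OrderedRendering {
    enum ObjectType { PLATFORM, GEAR, PLAYER_BODY, PLAYER_ARM }

    static List<String> render(List<ObjectType> worldObjects) {
        List<String> drawOrder = new ArrayList<>();
        boolean sawPlayer = false;
        for (ObjectType t : worldObjects) {
            if (t == ObjectType.PLAYER_BODY || t == ObjectType.PLAYER_ARM) {
                sawPlayer = true;   // skip: the player renders itself later
                continue;
            }
            drawOrder.add(t.name());
        }
        if (sawPlayer) {            // fixed order: body always under the arm
            drawOrder.add(ObjectType.PLAYER_BODY.name());
            drawOrder.add(ObjectType.PLAYER_ARM.name());
        }
        return drawOrder;
    }
}
```

Whatever order the arm and body come back from the world's body list, the player is always drawn body-then-arm, which fixes the flickering overlap.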
That's all for now. All in all, I'm pretty happy I got this working. It was a big disappointment yesterday to see that the conveyor belt killed things!