Hello,

I used pyglet and pymunk to make some tutorials.  I recently got a laptop 
with an i7 processor and a GeForce GT 630M graphics card.  This lets me run 
programs either on the i7's built-in graphics or on the NVIDIA card (using 
the proprietary driver on Linux; Bumblebee is the Linux library that 
supports Optimus to make this possible).

Here is a video of one demo:

http://vimeo.com/65989831

On the i7 this runs at about 80 fps, and on the NVIDIA card at about 90.  
That isn't much of a boost, which makes me think the graphics card isn't 
being used optimally.

My first thought was that the game logic is using most of the resources, 
so accelerating the graphics doesn't provide much of a boost.  But another 
example simply replaces the polygons with sprites, which AFAIK should be 
more GPU-intensive, and it runs at 30 and 45 fps respectively.  (I'll try 
to post a video of it later.)  This code uses Jonathan Hartley's SpriteItem 
<http://code.google.com/p/brokenspell/source/browse/trunk/sinisterducks/spriteitem.py?r=279>.

Is there a significantly better way to handle sprites?
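
For reference, the approach I understand pyglet itself recommends for lots 
of sprites is to give every pyglet.sprite.Sprite the same 
pyglet.graphics.Batch and draw them all with a single batch.draw() call.  
Here's a minimal sketch of what I mean (this is not the demo code; 
'ball.png', the sprite count, and the update are just placeholders):

    import pyglet

    # Minimal sketch: every sprite shares one Batch, so they are all drawn
    # with a single batch.draw() call.  'ball.png' and the update logic
    # are placeholders, not the demo code.
    window = pyglet.window.Window(800, 600)
    batch = pyglet.graphics.Batch()

    image = pyglet.resource.image('ball.png')
    image.anchor_x = image.width // 2
    image.anchor_y = image.height // 2

    sprites = [pyglet.sprite.Sprite(image, x=50 + (i % 25) * 30,
                                    y=50 + (i // 25) * 30, batch=batch)
               for i in range(100)]

    def update(dt):
        # In the real demos this would copy position/angle from the pymunk
        # bodies, e.g. sprite.position = body.position and
        # sprite.rotation = -math.degrees(body.angle).
        for sprite in sprites:
            sprite.rotation += 60 * dt

    @window.event
    def on_draw():
        window.clear()
        batch.draw()

    pyglet.clock.schedule_interval(update, 1 / 60.0)
    pyglet.app.run()

Is something along those lines likely to be faster than per-sprite drawing, 
or is there a better pattern still?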

The code in question is poly_demo.py and sprite_demo.py, found at 
http://bitbucket.org/permafacture/pyglet-pymunk-demo/overview
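
To check whether the physics or the rendering dominates a frame, I could 
also time each phase separately.  A rough sketch of that instrumentation 
(not in the demos; the window/batch/space setup here is simplified):

    import time
    import pyglet
    import pymunk

    # Rough sketch: time the pymunk step and the draw separately to see
    # which one dominates a frame.
    window = pyglet.window.Window(800, 600)
    batch = pyglet.graphics.Batch()
    space = pymunk.Space()
    space.gravity = (0.0, -900.0)

    physics_ms = []
    draw_ms = []

    def update(dt):
        t0 = time.time()
        space.step(dt)                      # physics only
        physics_ms.append((time.time() - t0) * 1000.0)
        if len(physics_ms) >= 300:          # report about every 5 s at 60 Hz
            print('avg physics %.2f ms, avg draw %.2f ms' %
                  (sum(physics_ms) / len(physics_ms),
                   sum(draw_ms) / max(len(draw_ms), 1)))
            del physics_ms[:]
            del draw_ms[:]

    @window.event
    def on_draw():
        t0 = time.time()
        window.clear()
        batch.draw()                        # rendering only
        draw_ms.append((time.time() - t0) * 1000.0)

    pyglet.clock.schedule_interval(update, 1 / 60.0)
    pyglet.app.run()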



Thank you for reading; any comments or insight are appreciated.



PS: If you're interested, the tutorial I've written can be found here: 
http://pyratesarecool.appspot.com/Integrating_Pyglet_and_Pymunk

