Hey everyone, I recently acquired a Dell Latitude C840 at an auction for $50. It's pretty minimally configured: no built-in Wi-Fi, etc., and an Intel Pentium 4-M at 1.6 GHz, but on the bright side it did come with 1 GB of RAM. It also has an Nvidia GeForce4 440 Go (integrated, of course) for video.
My desktop is a shiny new eMachines box I bought back in October for $350 from Walmart. It also has 1 GB of RAM and an AMD Sempron 3100+, but it has an Nvidia GeForce 6100 for video (integrated), with an open PCI Express slot.

Anyway, long story short: I have a test application that renders a spinning, flashing pyramid to the screen for 100,000 frames and times the run to give me a feel for how many FPS the machine is capable of (it also loads GLEW and checks for extensions, and basically gives me all the info I need to know about a machine's 3D capabilities). A rough sketch of what it does is at the end of this message.

My desktop scores a respectable 539.428 average FPS. My laptop, on the other hand, shocked me when the results came in at 813.751 average FPS.

Both machines are configured identically software-wise (other than drivers), since both are running Kubuntu Edgy, and the test suite is built from the exact same project files with the exact same options and defines.

Doing some research, I see that the Dell laptop was intended as a "desktop replacement" -- until now I always thought that was just a buzzword. Turns out that for 3D and gaming, my $50 laptop is wiping the floor with my $350 desktop.

Just curious to know if anyone else has run into surprises like this when testing older hardware against newer stuff.
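For anyone curious what the test does, here's a bare-bones sketch along the same lines. This is not my actual code -- it assumes GLUT + GLEW, a fixed-function pyramid, and gettimeofday() for timing, and the window size, extension check, and compile line are just placeholders:

/* bench.c -- minimal spinning-pyramid FPS sketch (assumes GLUT/GLEW and the
 * fixed-function pipeline).  Build with something like:
 *   gcc bench.c -o bench -lglut -lGLEW -lGLU -lGL
 */
#include <GL/glew.h>
#include <GL/glu.h>
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>

#define FRAMES 100000                 /* frames to render before reporting */

static long frames = 0;
static float angle = 0.0f;
static struct timeval start;

static double elapsed_seconds(void)
{
    struct timeval now;
    gettimeofday(&now, NULL);
    return (now.tv_sec - start.tv_sec) + (now.tv_usec - start.tv_usec) / 1e6;
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);
    glRotatef(angle, 0.0f, 1.0f, 0.0f);

    /* four-sided pyramid, one colour per face so it "flashes" as it spins */
    glBegin(GL_TRIANGLES);
        glColor3f(1, 0, 0); glVertex3f(0, 1, 0); glVertex3f(-1, -1,  1); glVertex3f( 1, -1,  1);
        glColor3f(0, 1, 0); glVertex3f(0, 1, 0); glVertex3f( 1, -1,  1); glVertex3f( 1, -1, -1);
        glColor3f(0, 0, 1); glVertex3f(0, 1, 0); glVertex3f( 1, -1, -1); glVertex3f(-1, -1, -1);
        glColor3f(1, 1, 0); glVertex3f(0, 1, 0); glVertex3f(-1, -1, -1); glVertex3f(-1, -1,  1);
    glEnd();

    glutSwapBuffers();
    angle += 0.5f;

    if (++frames >= FRAMES) {
        double secs = elapsed_seconds();
        printf("%ld frames in %.2f s = %.3f average FPS\n",
               frames, secs, frames / secs);
        exit(0);
    }
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("pyramid bench");

    glewInit();                       /* load extension entry points */
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("VBO support:  %s\n", GLEW_ARB_vertex_buffer_object ? "yes" : "no");

    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    gluPerspective(60.0, 640.0 / 480.0, 0.1, 100.0);
    glMatrixMode(GL_MODELVIEW);

    gettimeofday(&start, NULL);
    glutDisplayFunc(display);
    glutIdleFunc(glutPostRedisplay);  /* redraw as fast as the GPU allows */
    glutMainLoop();
    return 0;
}

One caveat if you try something like this yourself: make sure the driver isn't syncing to vblank, or the numbers will just track your monitor's refresh rate instead of the card's actual throughput.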
