By Jason Saundalkar
Senior Technical Editor Jason Saundalkar explains why nVidia’s ‘PhysX’ technology is important and how it will affect you in the future…
My first PC experience ever was actually playing a game over at my friend’s place back in my early school days. The game was none other than id Software’s excellent Wolfenstein 3D, and it completely blew me away.
After that first experience I promptly started saving pocket money so I could buy a PC to play that game. By today’s standards, Wolf 3D is prehistoric, especially when you compare it to games like Crysis. The exciting thing is that, as good as Crysis is, future games are going to be even more atmospheric and realistic thanks to enhanced physics.
Game designers face a monumental task each time they set out to develop a game, because absolutely everything has to be coded. The graphics you see on-screen, the audio effects you hear and everything you, as the gamer, interact with are essentially just different representations of some serious programming. Imagine you walk up to a bog-standard wall while playing a game; that wall has to be written into the game. If you were to search through the thousands upon thousands of lines of game code, you’d find code somewhere telling the game’s graphics engine how to render that wall, complete with specifics like its height, width, depth and much more.
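To give you a rough idea of what that looks like, here is a minimal sketch (not taken from any real engine; the class and field names are invented for illustration) of how a static wall might be described as data for a graphics engine to draw:

```python
from dataclasses import dataclass

# Hypothetical illustration: a static wall described as plain data.
# A real engine's representation is far richer, but the idea is the
# same: the wall's dimensions and appearance are written into the game.
@dataclass
class Wall:
    x: float        # position in the level
    y: float
    width: float
    height: float
    depth: float
    texture: str    # what the wall looks like

# The engine would read these properties every frame to render the wall.
lobby_wall = Wall(x=0.0, y=0.0, width=10.0, height=3.0, depth=0.5,
                  texture="concrete")
```

Every object you see in a level ultimately boils down to descriptions like this, fed to the rendering code.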
Static objects like that are easy enough to code for, but coding for objects you interact with, or creating realistic weather conditions, is a completely different ballgame. Objects behave realistically because of physics rules that have been programmed into the game. The zero-gravity level in Crysis is a perfect example: the developers had to write code so that when the player enters that particular area, the game’s rules for gravity change, allowing the character to float around just as he would in a real-world zero-gravity environment.
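A toy version of that idea (this is an illustration, not Crysis’s actual code) might look like a physics update step where gravity can simply be switched off for a zero-g area:

```python
# Toy physics step: gravity is just a rule the game applies each frame,
# and the rule can change depending on where the player is.
GRAVITY = -9.81  # metres per second squared, pulling downward

def step(velocity_y: float, dt: float, zero_g: bool) -> float:
    """Update the character's vertical velocity for one frame."""
    g = 0.0 if zero_g else GRAVITY
    return velocity_y + g * dt

# Normal play: the character accelerates downward every frame.
v_normal = step(0.0, dt=0.016, zero_g=False)

# Zero-gravity level: velocity never changes, so the character floats.
v_float = step(0.0, dt=0.016, zero_g=True)
```

The point is that “realistic” behaviour is nothing magical: it is rules like this, applied many times per second to everything in the scene.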
Creating realistic physics is not only a difficult task for developers; it also requires a heck of a lot of processing power. Up to now, a computer’s CPU has been burdened with processing physics code, and this has forced developers to cut down on just how realistic their games can be, simply because they have to strike a balance between using the CPU for graphics work, artificial intelligence, physics and more. Giving the processor too much to do at once would ultimately overburden it, causing games to run like a PowerPoint presentation (one frame at a time).
This is where nVidia’s ‘PhysX’ technology comes in. PhysX essentially makes it possible for a compatible GPU to offload physics calculations from the CPU. This frees the processor to perform other tasks faster, resulting in a smoother gaming experience. Numerous GPUs currently support PhysX, including the GeForce 200 family as well as most of the GeForce 9 and GeForce 8 series of graphics cards.
When you compare a CPU to a GPU in terms of design and architecture, the latter boasts a massively parallel design. Whereas today’s top desktop CPUs feature just four cores, some of today’s mid-range graphics cards feature over 80 cores! This makes GPUs far more adept at tackling intensive, highly parallel calculations such as physics processing, which means that game developers can now concentrate on making their games even more realistic and immersive than before.
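The reason physics suits this many-core design is that each object’s update is largely independent of the others, so the work splits cleanly across cores. Here is a minimal sketch of that idea using a thread pool (a GPU does the same kind of thing, but across dozens or hundreds of cores at once):

```python
from concurrent.futures import ThreadPoolExecutor

# Each particle is (position, velocity); updating one particle does not
# depend on any other, so the updates can all run in parallel.
def update_particle(p):
    x, vx = p
    dt = 0.016  # one frame at roughly 60 frames per second
    return (x + vx * dt, vx)

particles = [(float(i), 1.0) for i in range(1000)]

# Split the independent updates across worker threads.
with ThreadPoolExecutor() as pool:
    updated = list(pool.map(update_particle, particles))
```

With thousands of objects to simulate every frame, hardware that can run many of these independent updates simultaneously has an obvious advantage over a four-core CPU.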
A number of current titles, such as Unreal Tournament 3, Mass Effect and Gears of War, support PhysX technology, and many more are on the way. While I don’t expect this next batch of titles to really blow me away in terms of just how realistic they are, I can’t wait for the next generation of titles. To quote my classmate: ‘you haven’t seen anything yet’.