One thing that has always evolved in games (among other things) is the graphics. Every new generation of graphics engines packs an even more mindblowing experience. Just look at the anticipation around Doom 3 and HL2.
The evolution of graphics in games is closely tied to the evolution of hardware; better hardware is what drives better graphics. It is clear that we will see improved graphics in the future, but I think we will eventually reach a level beyond which further improvement will not matter much and will not be worth the effort. And I think that point will be reached some time before we have absolutely 100% photorealistic graphics. There are three main reasons:
1. Cost. At some point it will simply not be reasonable to improve the graphics any further: the time and money involved will no longer be outweighed by the income. Developers are already talking about how far graphics can evolve on the current generation of hardware, and we could see a shift in games towards developing other aspects instead.
2. We play to be entertained. This means we are not really interested in games being too realistic; reality imposes too many limitations. A game should only be as realistic as it can be without ruining the fun.
3. Robot builders have already run into the third reason. If they build their robots to be too human-like, people tend to be disgusted by them. It turns out our brain automatically projects an expression onto things that do not have one (like animals and robots that do not look too human), but if something looks very human-like we instead try to read the expression we expect to be there. The problem is that such robots often have a "wrong" expression on their face and/or do not follow the rules we expect humans to follow (in terms of expression). The result is that robot builders have gone back to creating simpler-looking robots. The same problem exists when developers create computer games.
In terms of hardware development we are beginning to reach the limits of the current technology. Moore's law says something like: "The speed of computers will double every 48 months." This has held true up until now. However, we are truly beginning to reach the limits of what we can do with silicon-based CPUs. Intel has recently moved to the 90 nm process and expects to reach 65 nm in the near future (a few years), but that will also be the last step in shrinking the transistor, which is the heart of all electronics today. Shrinking the transistor any further is not possible, as the amount of leakage ("stray" electricity jumping from one transistor to the next) becomes so high that it would be impossible to get anything usable through the CPU. When the transistor cannot be shrunk any more, we also hit a limit in frequency.

Even Intel has finally realized that more speed cannot be achieved just by ramping up the frequency, which has always been Intel's approach to speed increases. AMD realized a long time ago that speed could and would have to be gained by other means. So while Intel made its CPUs less efficient per clock in order to ramp up the frequency (true for the P4 family, with a few exceptions), AMD made its CPUs more efficient. That is why AMD's top model is still around 2.5 GHz while the P4 is up around 3.4 GHz. But Intel is finally coming around as well and is about to shift to a number-based rating system (instead of rating by frequency). To further increase speed, the near future will bring us CPUs with two or more cores built into the same socket.
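A back-of-the-envelope way to see the clock-vs-efficiency point (my own sketch; the IPC figures are made up for illustration, not real benchmarks):

```python
# Effective throughput is roughly clock speed times instructions per clock (IPC),
# so a slower clock with better IPC can keep up with a faster, less efficient one.
def throughput_gips(clock_ghz, ipc):
    """Billions of instructions per second = GHz * instructions per clock."""
    return clock_ghz * ipc

deep_pipeline = throughput_gips(3.4, 1.0)   # high clock, lower IPC (hypothetical)
wide_pipeline = throughput_gips(2.5, 1.4)   # lower clock, higher IPC (hypothetical)
print(round(deep_pipeline, 1), round(wide_pipeline, 1))   # 3.4 vs 3.5 -- roughly even despite the clock gap
```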
However, all these improvements cannot change the fact that we are reaching the limits imposed by silicon, and we do not have a new technology around the corner. Companies like Intel, AMD and IBM, as well as universities, are researching new technologies, but we will not see anything new for the next 10 years, so the future is quite unknown.
Yeah, it'll be a long time before photo-realism can be achieved. Probably about two decades or so. Heck, about a decade ago, 1 GB was considered massive. A decade and a half ago, even 20 MB was a lot. I know; I lived with computers back in those days and was so proud of my 1 GB HDD.
Oh well, if no one beats me to it, I'm gonna make the first VR console, complete with one of those gun thingies and a sensory glove. The main problem would be handling the turning around and keeping the headgear from ruining your eyes. But then again, monitors already ruin your eyes as it is.
Disclaimer: Any sarcasm in my posts will not be mentioned as that would ruin the purpose. It is assumed that the reader is intelligent enough to tell the difference between what is sarcasm and what is not.
Cybermaze: http://www.intel.com/research/silicon/mooreslaw.htm (I think the theory originally said it was every 18 months).
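For what it's worth, a tiny sketch of how much the doubling period actually matters (the starting count is just a rough Pentium 4-class figure for scale, purely illustrative):

```python
# Exponential growth under different Moore's-law doubling periods.
def transistors(start_count, months, doubling_period_months):
    return start_count * 2 ** (months / doubling_period_months)

start = 42_000_000   # roughly a Pentium 4-class transistor count, for scale
decade = 120         # months
print(f"{transistors(start, decade, 18):,.0f}")   # ~4.3 billion with 18-month doubling
print(f"{transistors(start, decade, 24):,.0f}")   # ~1.3 billion with 24-month doubling
```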
Anyway, I don't think we are going to have photo-realistic graphics anytime soon. Keep in mind that animation is going to be the most difficult part, even though we have technology that can capture actual human movement (motion capture). Still, you also have to ask how much it costs and how long it will take. Companies can't afford to spend several years on a game.
On a final note: realism depends a lot on the environment. Sure, we could create a realistic human that just stands around, but how about having that human also run, walk, jump, catch, fall down, and swim, in a pool with lots of mirrors and other people?
Not in the conventional manner, but we still have a few fractions of a micron to go before we hit a hard physical limit. Once chips can't get any smaller, we innovate. Imagine stacking a bunch of 1 GHz processors on top of one another and treating the stack as a single chipset. That's the direction we'll be going in the near future.
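The catch with piling on processors (my own aside, with an assumed 90% parallel workload): the speedup is capped by whatever part of the work stays serial, i.e. Amdahl's law, so a 16-way stack is nowhere near 16x faster in practice.

```python
# Amdahl's law: overall speedup for a given parallel fraction and core count.
def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 8, 16):
    print(cores, round(speedup(0.9, cores), 2))   # with 90% parallel work, gains flatten out around 6-7x
```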
my chemistry's a bit rusty, but there's a form (allotrope?) of carbon called buckminsterfullerene. apparently it would act as an ideal replacement for silicon and allow for the creation of next generation hardware. all this is from my old chemistry teacher, so i don't have any first hand knowledge. it's all greek to me, but for your clicking pleasure: http://www.google.com/search?q=buckminsterfullerene
Animation ain't much of a problem. Just look at how much 3DS Max has improved at it over just half a decade. The only real problem would be getting the polygons/vertices to flow right according to what hits them, which wouldn't take animators too long to figure out, considering how much they've done already. Things are already easy enough with motion capture and such.
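For the curious, one common way to make vertices "flow" with a character is linear blend skinning. This is just a toy sketch of the idea (the bone transforms and weights are invented for the example, not anyone's actual engine code):

```python
import numpy as np

def skin_vertex(rest_pos, bone_matrices, weights):
    """Linear blend skinning: move a vertex by a weighted blend of bone transforms."""
    v = np.append(rest_pos, 1.0)                                  # homogeneous coordinates
    blended = sum(w * M for w, M in zip(weights, bone_matrices))  # weighted sum of 4x4 matrices
    return (blended @ v)[:3]

# Hypothetical example: a vertex influenced 70/30 by two bones,
# where the second bone is translated one unit upwards.
bone_a = np.eye(4)
bone_b = np.eye(4)
bone_b[1, 3] = 1.0
print(skin_vertex(np.array([0.5, 0.0, 0.0]), [bone_a, bone_b], [0.7, 0.3]))   # -> [0.5, 0.3, 0.0]
```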
Silicon's pretty poor... and the main reason it's used in such high quantities is coz it's uber-cheap. But since carbon is easier to get, it'll be a matter of time before Buckminsterfullerene replaces it. Or maybe those quantum computers will come out sooner than we'd expect, which is usually the case with overfunded R&D these days.
I'd try to join in the quantum computer research stuff, but I suck at chemistry, so I'm stuck with typical electrical engineering.
I tend to agree with Tigs: current technology can only be pushed so far. It's only a matter of (probably a long) time before we find ourselves using quantum computers (http://www.cs.caltech.edu/~westside/quantum-intro.html), since they are almost infinitely more powerful than what we have today.
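Rough intuition for the "almost infinitely more powerful" bit (a big simplification on my part): a register of n qubits works with 2^n amplitudes at once, so the state space a classical machine would have to track explodes very quickly.

```python
# Number of complex amplitudes in an n-qubit state -- what a classical
# simulation has to store. At 16 bytes per amplitude, n = 40 is already ~17 TB.
for n in (1, 10, 20, 30, 40):
    print(n, 2 ** n)
```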
There's already talk of using DNA instead of a hard disk. It all sounds a bit odd, but DNA can store (according to IBM) around 3 terabytes per chromosome.
As for games, the graphical advancements have been getting smaller in recent years; it's only a matter of time (as Cybermaze pointed out) before it grinds to a halt and you just won't be able to tell the difference in graphical quality.
I don't think that's possible. I tried to pass a current through a flower once and it burst into flames - how are they going to turn that into a microchip?
But it is true that both DNA and atoms themselves are active research subjects. By using electrons to store data in atoms, it is possible to create computers with enormous power that take up very little space. The real challenge is controlling the atoms. Just imagine a piece of metal in which every single atom functions the same way as one transistor. I believe it was some Swedish scientists who managed to use atoms as RAM, storing and retrieving data. Another advantage of atoms is that we can build more complex computers: while a transistor supports only 2 states (on/off, i.e. binary), a computer built from atoms could possibly use 4 states (0, 1, 2, 3), meaning it can hold more data per cell than a binary computer.
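To make the "more data per cell" point concrete (my own toy example, nothing to do with the actual physics): the same value needs only half as many four-state cells as two-state ones.

```python
def cells_needed(value, states_per_cell):
    """How many cells of a given radix it takes to hold a non-negative value."""
    cells = 0
    while value > 0:
        value //= states_per_cell
        cells += 1
    return max(cells, 1)

n = 1_000_000
print(cells_needed(n, 2))   # 20 binary cells
print(cells_needed(n, 4))   # 10 four-state cells
```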
The real problem is that all these technologies are some years away ... maybe as much as 10-20 years. In the meantime we are quite limited. Once silicon transistors cannot get any smaller we will hit the ceiling, which means something around 65 nm and 6 GHz (if it is even useful to go that high). After that, the only real alternative is to place more CPU cores on the same die. A while ago I saw the estimated requirements for Windows Longhorn: it suggested a 4 GHz dual-core CPU (two CPU cores on the same die) and at least 2 GB of RAM.