The Daily Click ::. Forums ::. General Chat ::. More realistic than photo-realistic??!
 


Kramy



Registered
  08/06/2002
Points
  1888
19th July, 2004 at 15:21:42 -

Somehow I highly doubt that. Creating such scenes in real time would take far more processing power than cracking 128-bit encryption.
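For scale, here is a back-of-envelope sketch (Python; the machine throughput is a made-up assumption, not a real spec) of what that 128-bit comparison actually implies:

```python
# Rough scale comparison: exhausting a 128-bit keyspace vs. a frame budget.
# The 1e12 ops/sec throughput is a purely illustrative assumption.

keyspace = 2 ** 128                      # number of possible 128-bit keys
ops_per_second = 10 ** 12                # hypothetical, generous machine
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / (ops_per_second * seconds_per_year)
print(f"Exhausting the keyspace: ~{years:.2e} years")      # ~1.1e+19 years

# A real-time renderer, by contrast, gets roughly 1/60 s per frame:
print(f"Ops per 60 fps frame on that machine: ~{ops_per_second / 60:.2e}")
```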

 
Kramy

m. collar



Registered
  04/01/2002
Points
  271
19th July, 2004 at 16:58:41 -

What about other forms of immersion?
Touch, smell, taste?



 
-http://www.create-games.com/project.asp?id=1457

Cybermaze



Registered
  03/04/2003
Points
  853
20th July, 2004 at 05:55:10 -

One thing that has always evolved in games (among other things) is the graphics. Every new generation of graphics engines packs an even more mind-blowing experience. Just look at the anticipation around Doom 3 and HL2.

The evolution of graphics in games is closely linked to, and driven by, the evolution of hardware. It is clear that we will see improved graphics in the future, but I think we will eventually reach a level beyond which further improvements will not matter much and will not be worth making. And I think that point will be reached some time before we have absolutely 100% true photorealistic graphics. There are three main reasons:

1. Diminishing returns. At some point it will simply not be reasonable to improve the graphics any further: the time and cost involved will no longer be outweighed by the income. Developers are already talking today about how far graphics can still evolve on the current generation of hardware, and we could see games shift towards developing other aspects instead.

2. We play to be entertained. This means we are not really interested in games being too realistic; reality imposes too many limitations. A game should only be as realistic as it can be without ruining the fun.

3. Robot builders have already run into the third reason: if they build their robots to be too human-like, people tend to be disgusted by them (the so-called "uncanny valley"). It turns out that our brain automatically projects an expression onto things that do not have one (like animals, or robots that do not look very human), but if something looks very human-like we instead try to interpret the expression we expect to be there. The problem is that such robots often have a "wrong" expression on their face, and/or do not follow the rules we expect humans to follow (in terms of expression). As a result, robot builders have gone back to creating simpler-looking robots. The same problem exists when developers create computer games.

In terms of hardware development, we are beginning to reach the limits of the current technology. Moore's law says something like: "the speed of computers will double every 48 months", and this has held very well up until now. However, we are truly beginning to reach the limits of what we can do with silicon-based CPUs. Intel has recently moved to the 0.09 micron (90 nm) process and expects to reach 0.065 micron (65 nm) in the near future (a few years), but that will also be the last step in shrinking the transistor, which is the heart of all electronics today. Shrinking the transistor any further is not possible, as the amount of "stray electricity" (current leaking from one transistor to another) would be so high that it would be impossible to get anything usable through the CPU.

When the transistor cannot be shrunk any more, we also hit the limit in frequency. Even Intel has finally realized that more speed cannot be achieved by ramping up the frequency, which has always been Intel's approach to speed increases. AMD realized long ago that speed could, and would have to, be reached by other means. So while Intel made its CPUs less efficient per clock in order to ramp up the frequency (true for the P4 family, with a few exceptions), AMD made its CPUs more efficient. That is why the AMD top model is still around 2.5 GHz while the P4 is up around 3.4 GHz. Intel is finally facing the future too and is about to shift to a number-based rating system instead of rating by frequency. To increase speed further, the near future will bring us CPUs with two or more cores built into the same socket.
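To put numbers on how much the assumed doubling period matters, a quick illustrative sketch in Python (both periods are figures mentioned in this thread, used purely for back-of-envelope comparison):

```python
# Growth after `years` if capacity doubles every `doubling_months` months.

def growth_factor(years: float, doubling_months: float) -> float:
    return 2 ** (years * 12 / doubling_months)

for months in (18, 48):
    print(f"Doubling every {months} months: "
          f"x{growth_factor(10, months):.0f} after 10 years")
# Doubling every 18 months: x102 after 10 years
# Doubling every 48 months: x6 after 10 years
```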

However, all these improvements cannot change the fact that we are reaching the limits imposed by silicon. And we do not have a new technology around the corner: companies like Intel, AMD and IBM, as well as universities, are researching new technologies, but we will not see anything new for the next 10 years, so the future is quite uncertain.

 
If you knew, I would have to kill you...

ChrisB

Crazy?

Registered
  16/08/2002
Points
  5457
20th July, 2004 at 08:44:46 -

I remember a gadget called 'iSmell' which let webpages have their own unique fragrances. The name alone was enough to condemn it to doom.

 
n/a

Muz



Registered
  14/02/2002
Points
  6499

VIP Member · I'm on a Boat · I am an April Fool · Honored Admin Alumnus
20th July, 2004 at 11:02:56 -

Yeah, it'll be a long time before photo-realism can be achieved. Probably about two decades or so. Heck, about a decade ago, 1 GB was considered massive; a decade and a half ago, even 20 MB was a lot. I know. I lived with computers back in those days and was so proud of my 1 GB HDD.

Oh well, if no one beats me to it, I'm gonna make the first VR console, coming with one of those gun thingies and a sensory glove. The main problem would be handling turning around, and keeping the headgear from spoiling your eyes. But then again, monitors already spoil your eyes as it is.

 
Disclaimer: Any sarcasm in my posts will not be mentioned as that would ruin the purpose. It is assumed that the reader is intelligent enough to tell the difference between what is sarcasm and what is not.


RapidFlash

Savior of the Universe

Registered
  14/05/2002
Points
  2712
20th July, 2004 at 16:33:52 -

Cybermaze: http://www.intel.com/research/silicon/mooreslaw.htm (I think the theory originally said it was every 18 months).
Anyway, I don't think we are going to have photo-realistic graphics anytime soon. Keep in mind that animation is going to be the most difficult part, even though there is technology that can capture actual human movement (motion capture). Still, you also have to ask yourself how much it costs and how long it will take; companies can't afford to spend several years on a game.
On a final note: realism depends a lot on its environment. Sure, we could create a realistic human that just stands around, but how about having that human also run, walk, jump, catch, fall down, and swim, in a pool with lots of mirrors and other people?

 
http://www.klik-me.com

Radix

hot for teacher

Registered
  01/10/2003
Points
  3139

Has Donated, Thank You! · VIP Member · GOTW WINNER CUP 1! · GOTW WINNER CUP 2! · GOTW WINNER CUP 3! · GOTW WINNER CUP 4!
20th July, 2004 at 20:32:02 -

Not in the conventional manner, but we still have a few fractions of a micron to go before we hit a relativistic limit. Once chips can't get any smaller, we innovate. Imagine stacking a bunch of 1 GHz processors on top of one another and treating the stack as a single chipset. That's the direction we'll be going in the near future.
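One standard caveat worth sketching here (Amdahl's law, not something the post itself mentions; the serial fractions below are made-up examples): stacked chips multiply throughput, but the speedup of a single task is capped by whatever part of it cannot be parallelised.

```python
# Amdahl's law: speedup from n_procs processors when a fraction `serial`
# of the work cannot be parallelised. Fractions are illustrative only.

def amdahl_speedup(n_procs: int, serial: float) -> float:
    return 1 / (serial + (1 - serial) / n_procs)

for serial in (0.05, 0.25):
    print(f"{serial:.0%} serial work: x{amdahl_speedup(8, serial):.1f} "
          f"speedup from a stack of 8 chips")
# 5% serial work: x5.9 speedup from a stack of 8 chips
# 25% serial work: x2.9 speedup from a stack of 8 chips
```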

 
n/a

Pete Nattress

Cheesy Bits

Registered
  23/09/2002
Points
  4811
20th July, 2004 at 20:46:01 -

my chemistry's a bit rusty, but there's a form (allotrope?) of carbon called buckminsterfullerene. apparently it would act as an ideal replacement for silicon and allow for the creation of next-generation hardware. all this is from my old chemistry teacher, so i don't have any first-hand knowledge. it's all greek to me, but for your clicking pleasure: http://www.google.com/search?q=buckminsterfullerene

 
www.thenatflap.co.uk

Radix

hot for teacher

Registered
  01/10/2003
Points
  3139

Has Donated, Thank You! · VIP Member · GOTW WINNER CUP 1! · GOTW WINNER CUP 2! · GOTW WINNER CUP 3! · GOTW WINNER CUP 4!
20th July, 2004 at 20:54:42 -

Buckyballs are the particular structure of BMF that provides a nice conductor, so you're probably better off searching for buckyballs specifically.

 
n/a

Muz



Registered
  14/02/2002
Points
  6499

VIP Member · I'm on a Boat · I am an April Fool · Honored Admin Alumnus
20th July, 2004 at 22:31:11 -

Animation ain't much of a problem. Just look at how much 3DS Max has improved at it over just half a decade. The only problem would be getting the polygons/vertices to flow right according to what hits them, which wouldn't take animators too long to figure out, considering how much they've done already. Things are already easy enough with motion capture and stuff.

Silicon's pretty poor... and the main reason it's used in such high quantities is coz it's uber-cheap. But since carbon is easy to get too, it'll only be a matter of time before buckminsterfullerene replaces it. Or maybe those quantum computers will come out sooner than we expect, which is usually the case with overfunded R&D these days.

I'd try to join in the quantum computer research stuff, but I suck at chemistry, so I'm stuck with typical electrical engineering.

 
Disclaimer: Any sarcasm in my posts will not be mentioned as that would ruin the purpose. It is assumed that the reader is intelligent enough to tell the difference between what is sarcasm and what is not.


colej_uk



Registered
  15/05/2002
Points
  1627
21st July, 2004 at 16:40:57 -

I tend to agree with Tigs: current technology can only be pushed so far. It's only a matter of (probably a long) time before we find ourselves using quantum computers (http://www.cs.caltech.edu/~westside/quantum-intro.html), since they are almost infinitely more powerful than what we have today.

There's already talk of using DNA instead of a hard disk. It all sounds a bit odd, but DNA can store (according to IBM) around 3 terabytes per chromosome.
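As a sanity check on figures like that, here is a toy conversion in Python, assuming the naive textbook encoding of 2 bits per base and the often-quoted ~250 million base pairs of human chromosome 1; both numbers are illustrative assumptions, so a much larger figure like IBM's is presumably counting something else (e.g. molecular density per gram rather than one linear strand):

```python
# Toy conversion: raw information content of one DNA strand at the
# naive rate of 2 bits per base (4 bases => log2(4) = 2 bits).
# The length is the oft-quoted ~250 million base pairs of human
# chromosome 1 -- an illustrative assumption, not a device spec.

base_pairs = 250_000_000
bits = base_pairs * 2
megabytes = bits / 8 / 1_000_000
print(f"~{megabytes:.0f} MB per chromosome at 2 bits/base")   # ~62 MB
```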

As for games, graphical advancement has been slowing down a lot in recent years; it's only a matter of time (as Cybermaze pointed out) before it grinds to a halt and you just won't be able to tell any difference in graphical quality.

 
-

David Newton (DavidN)

Invisible

Registered
  27/10/2002
Points
  8322

Honored Admin Alumnus
21st July, 2004 at 19:23:15 -

On the subject of processing power doubling every 18 months, I had to write this tidal wave of dullness about it: http://wired.st-and.ac.uk/~wong/moore.doc

Something in it might be relevant here, you never know. I seem to remember that germanium was also being considered as a replacement for silicon.

(By the way, that piece of work earned me the comment "Good - even interesting!" from my tutor. I wonder if he was reading the right essay.)

 
http://www.davidn.co.nr - Games, music, living in America

ChrisB

Crazy?

Registered
  16/08/2002
Points
  5457
21st July, 2004 at 19:57:45 -

I don't think that's possible. I tried to pass a current through a flower once and it burst into flames; how are they going to turn that into a microchip?

 
n/a

Pete Nattress

Cheesy Bits img src/uploads/sccheesegif

Registered
  23/09/2002
Points
  4811
22nd July, 2004 at 06:31:40 -

/\ BAD joke

 
www.thenatflap.co.uk

Cybermaze



Registered
  03/04/2003
Points
  853
22nd July, 2004 at 12:14:32 -

Indeed a bad one.

But it is true that both DNA and atoms themselves are active research subjects. Using electrons to store data in atoms, it is possible to create computers with enormous power that take up very little space; the real challenge is controlling the atoms. But just imagine if I had a piece of metal and every atom in it functioned the same way as a transistor. I think it was some Swedish scientists who managed to use atoms as RAM, storing and retrieving data. Another good thing about atoms is that we could create more complex computers: while transistors only support 2 states (on/off, i.e. binary), a computer built from atoms could possibly take 4 or more states (0, 1, 2, 3), meaning it can process more data in one step than a binary computer.
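A minimal sketch of that last point (Python; purely illustrative): a digit with 4 states carries log2(4) = 2 bits, so n quaternary digits span 4^n = 2^(2n) values, the same range as twice as many binary digits.

```python
import math

# Information per digit: a k-state digit carries log2(k) bits.
for k in (2, 4):
    print(f"A {k}-state digit carries {math.log2(k):.0f} bit(s)")

# Same range, half the digits: 4**n == 2**(2*n)
n = 8
assert 4 ** n == 2 ** (2 * n)
print(f"{n} quaternary digits cover {4 ** n:,} values, "
      f"the same as {2 * n} binary digits")
```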

The real problem is that all these technologies are some years away... maybe as much as 10-20 years. In the meanwhile we are quite limited: once silicon transistors cannot get any smaller, we will hit the ceiling, which means something around 0.065 micron (65 nm) and 6 GHz (if it is at all useful to go that high). After that, the only real alternative is to place more CPU cores on the same die. A while ago I saw the estimated requirements for Windows Longhorn: they suggested a 4 GHz dual-core CPU (two CPU cores on the same die) and at least 2 GB of RAM.

 
If you knew, I would have to kill you...
   
