The Most Amazing OpenGL Tech Demo In 64kb
Originally posted by Vim_User:
You are really asking that? Aren't you aware that Michael has already clarified that, for him, quantity is better than quality?
Yeah, like others have mentioned, it looks pretty nice, but (backward) ray tracing and similar techniques are generally too expensive for real gaming (although the number of bounces can vary depending on load and FPS).
While it is easy to implement (a few hundred lines of code, not counting the renderer for the scene and primitives), it still can't be used for real-time work, because it costs several orders of magnitude more than doing the same thing via a conventional Z-buffer or similar.
This is an inherent flaw: the whole scene has to be retraced every frame (although I've seen experimental code that worked around this by caching ray objects and reusing results where possible).
Please note that some of the 'glitches' are caused by premature termination of rays.
Last edited by Guest; 23 April 2014, 04:34 PM.
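To make the cost argument concrete, here is a minimal sketch (illustrative Python, not the demo's code; all names are made up) of the per-pixel work backward ray tracing does: cast an eye ray, find the nearest hit, shade it. Every pixel pays an intersection plus shading cost every single frame, which is where the gap to a Z-buffer pass comes from.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a normalized ray, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # quadratic discriminant; a == 1 for a unit direction
    if disc < 0:
        return None                 # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None  # ignore hits behind the eye

def shade(origin, direction, sphere, light_dir):
    """One backward-traced sample: cast the eye ray, do Lambert shading at the hit."""
    t = intersect_sphere(origin, direction, sphere["center"], sphere["radius"])
    if t is None:
        return 0.0                  # ray escaped: background
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / sphere["radius"] for h, c in zip(hit, sphere["center"])]
    # brightness = cosine between surface normal and direction toward the light
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
```

A real tracer repeats this for every pixel, every bounce, every frame; cutting bounces (or terminating rays early, as the post notes) trades glitches for speed.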
Originally posted by b15hop:
Games that include gigabytes of textures and movies don't always have better graphics. As JC stated, visuals are always an illusion. So even with gigatextures, the whole point is that detail comes from clever illusions that make whatever is on screen look real.
Game engines have barely evolved since the 90s (notably the Quake 3 engine); in fact most current engines are based on the original Quake engine (a Z-buffer for scene depth, plus various hacks like bump mapping to make things look real).
For true quality you're going to need physical simulation: ray tracing, beam tracing and radiosity.
Most of these can be combined seamlessly, but they are computationally very expensive, since lighting must be computed for each frame.
Most people would be amazed at how difficult it is to render things like mirrors, diffraction or believable shadows.
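For contrast, a hedged sketch of the Z-buffer depth test the post refers to (illustrative Python, not from any real engine): scene depth is resolved by a cheap per-fragment comparison against a stored depth value, rather than by tracing rays through the scene.

```python
def zbuffer_write(depth_buffer, color_buffer, x, y, z, color):
    """Keep a fragment only if it is nearer than what is already stored there."""
    if z < depth_buffer[y][x]:
        depth_buffer[y][x] = z      # remember the new nearest depth
        color_buffer[y][x] = color  # and its color
        return True
    return False                    # fragment hidden behind an earlier one
```

Rasterize all triangles in any order and the nearest surface wins per pixel; that one comparison is the whole visibility test, which is why it is so much cheaper than tracing.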
Originally posted by b15hop:
.... The grains are possibly due to the low level of traces per pixel. I'm guessing less than 200 per pixel.
In ray tracing this is done in reverse: the viewer (viewpoint) sends out rays that reflect off objects, and if they manage to end up at a light source they recursively illuminate the objects along the way.
So "quality" depends not only on the number of bounces, but also on the number of rays in the scene.
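A small illustrative sketch (assumed names, not real renderer code) of why a low number of traces per pixel looks grainy: each pixel averages N random ray samples, and the variance of that average falls roughly as 1/N, so low sample counts leave visible noise.

```python
import random

def pixel_value(samples, true_brightness=0.5, rng=random):
    """Average `samples` one-bit ray results for one pixel (Monte Carlo estimate)."""
    hits = sum(1.0 if rng.random() < true_brightness else 0.0
               for _ in range(samples))
    return hits / samples
```

With 8 samples per pixel neighboring pixels disagree noticeably (grain); with 200 they all sit close to the true brightness, at 25x the tracing cost.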
Both forward and backward ray tracing are viable methods, and even a mix is.
Ray tracing, realistically speaking, will probably become mainstream when I'm an old man, if then.
It's just too heavy (and you can do the calculations on that, or just download an optimized raytracer).
Also, these are games, not simulations, and games don't need to be ultra-realistic to be fun.
Approximations... maybe, like coarse photon mapping.
Funny, I remember an old game that used path tracing to shade the terrain.
I think (not sure) that SC2 does something like that for some details.
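A rough sketch of the "coarse photon mapping" approximation mentioned above (hypothetical Python; the 2D grid, light model and counts are all made up for illustration): photons are scattered from the light into a coarse grid once, and shading then just gathers the photon count near each point instead of tracing full light paths per frame.

```python
import random

def scatter_photons(light_pos, n, grid_size, rng):
    """Pass 1: deposit n photons from a point light into a coarse 2D grid (a dict)."""
    grid = {}
    for _ in range(n):
        # each photon lands somewhere on a unit patch around the light's footprint
        x = light_pos[0] + rng.uniform(-1.0, 1.0)
        y = light_pos[1] + rng.uniform(-1.0, 1.0)
        cell = (int(x * grid_size), int(y * grid_size))
        grid[cell] = grid.get(cell, 0) + 1
    return grid

def gather(grid, point, grid_size):
    """Pass 2: estimate brightness at a point from its cell's photon count."""
    cell = (int(point[0] * grid_size), int(point[1] * grid_size))
    return grid.get(cell, 0)
```

The coarser the grid, the cheaper and blurrier the lighting, which is exactly the kind of approximation a game can afford where a full tracer cannot.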
Originally posted by gens:
Both forward and backward ray tracing are viable methods, and even a mix is.
Ray tracing, realistically speaking, will probably become mainstream when I'm an old man, if then.
It's just too heavy (and you can do the calculations on that, or just download an optimized raytracer).
Also, these are games, not simulations, and games don't need to be ultra-realistic to be fun.
Approximations... maybe, like coarse photon mapping.
Funny, I remember an old game that used path tracing to shade the terrain.
I think (not sure) that SC2 does something like that for some details.
Ray tracing is pretty standard in the movie industry (Pixar and co.), where an animated movie renders for a day or two on a number of capable servers to produce 'realistic' lighting and shading.
The demo, though, looks like a 100-year-old camera (which also had the problem of its optics not gathering enough light to project onto the film).
Originally posted by gens:
Both forward and backward ray tracing are viable methods, and even a mix is.
Ray tracing, realistically speaking, will probably become mainstream when I'm an old man, if then.
It's just too heavy (and you can do the calculations on that, or just download an optimized raytracer).
Also, these are games, not simulations, and games don't need to be ultra-realistic to be fun.
Approximations... maybe, like coarse photon mapping.
Funny, I remember an old game that used path tracing to shade the terrain.
I think (not sure) that SC2 does something like that for some details.
On topic though, I'm still impressed with this demo and with any demo that pushes the bounds of the GPU. The best part of 64k demos is that they do "MORE for LESS": the idea being that you don't need exabytes of data to render a simple scene. Remember, the whole thing is an illusion anyway. So if a hamster were powering that GPU and gave you real-time (<5 ms) shading of a carrot that made you want to eat the computer screen, and the carrot were so real that you now have an addiction to eating carrots, yet the whole demo is 64k... then so be it. Bring on the carrots.