ATI Evergreen 3D Code May Soon Go Into Gallium3D


  • #51
    Originally posted by Hephasteus View Post
    3d graphics on a 2d screen. You can stay. 3d screens go away.
    The counter-argument (which we used when trying to sell 3D in the early 80s) is that a window is flat (2D) but you can still see realistic-looking 3D through it...

    Not sure it's totally relevant, but it was a pretty effective story at the time.

    Comment


    • #52
      Originally posted by bridgman View Post
      The counter-argument (which we used when trying to sell 3D in the early 80s) is that a window is flat (2D) but you can still see realistic-looking 3D through it...

      Not sure it's totally relevant, but it was a pretty effective story at the time.
      *I'm getting tech support shivers*

      Comment


      • #53
        Originally posted by droidhacker View Post
        We're not talking Star Trek holodecks here, which by their description aren't even actual holograms (they just use the word for convenience to describe some imaginary "thing" that has not yet been invented, that can actually exert physical forces -- and is therefore NOT an actual hologram).
        Warning, Star Trek nerd alert:

        The holograms in Star Trek were actually just supposed to be 3D images made out of light, and their physicality was faked by using projected force fields.

        Comment


        • #54
          Hello from Flatland!

          I highly doubt this whole "3D" business is real and suggest you folks seek psychological help.

          Sincerely,
          Your most humble Square

          Comment


          • #55
            Originally posted by droidhacker View Post
            Note that you can't recreate a focused image from an out-of-focus image, which means that out-of-focus background or foreground information in current 3d content could never be recreated.
            Which is why analog 3D cameras never caught on. But if you're recording digitally with everything in focus, you can blur the background or foreground algorithmically when displaying the result.
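A hedged sketch of that idea (the function names and the simple box blur are my own illustration, not any real light-field pipeline): given an all-in-focus frame plus a per-pixel depth map, synthetic depth of field can be faked by blurring each pixel in proportion to its depth distance from the chosen focal plane.

```python
import numpy as np

def box_blur(img, radius):
    """Naive box blur on a 2D array; radius 0 returns the image as-is."""
    if radius == 0:
        return img.astype(float)
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # clamp at the borders so edge pixels are repeated
            ys = np.clip(np.arange(h) + dy, 0, h - 1)
            xs = np.clip(np.arange(w) + dx, 0, w - 1)
            out += img[ys][:, xs]
    return out / (2 * radius + 1) ** 2

def synthetic_dof(image, depth, focal_depth, max_radius=2):
    """Fake depth of field: pixels whose depth is far from the focal
    plane get a bigger blur radius; in-focus pixels stay sharp."""
    radii = np.minimum(np.abs(depth - focal_depth).round().astype(int),
                       max_radius)
    # blur the whole frame once per radius, then pick per pixel
    blurred = [box_blur(image, r) for r in range(max_radius + 1)]
    out = np.zeros_like(image, dtype=float)
    for r in range(max_radius + 1):
        mask = radii == r
        out[mask] = blurred[r][mask]
    return out
```

A real renderer would blend across depth layers to avoid hard seams, but the point stands: the sharp capture keeps all the information, and the blur is just a display-time choice.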

            Comment


            • #56
              Originally posted by TomcaT-SdB View Post
              So, Sony, Panasonic, Samsung, LG, etc all have no idea where the market is going? Ummm, ok. The point being, that uptake in new technologies happens pretty fast when there's a multi-vendor push. At CES, most (save one that I read of) 3D displays were based on active stereo. By all accounts, every display vendor on the floor had 3D displays as their showcase.
              I'm trying to figure out the point of this argument but it continues to elude me. There's been a huge 'multi-vendor push' for a lot of things, yet you're still not likely to find a lot of people who refuse to buy something if it's not in HD and in 5.1. Hell, a large portion of even the geek community still download VCD-quality rips off of TPB.

              You're also ignoring the rapid standardization of 3D formats
              ...yes? Courtier's Reply much? I 'ignore' something and you fail to make a point of it, well aren't we a crazy bunch.

              and the fact that 3D capable TV's are already in the sub $2000 range (albeit the DLP checkerboard stereo).
              ...while their non-3D counterparts remain significantly cheaper; it's still premium hardware. You haven't contradicted that.

              The rest of the post can be summarized as "video on *nix sucks".
              Pretty much.
              If there's going to be a change in that, we need to support current features.
              Oooooor support core features people are likely to use day-in, day-out, while leaving the frills on an 'as time permits' priority list.

              Originally posted by Hephasteus View Post
              Bet ya can't sell me a 3d tv even if you have a big gun. 3d is like the obnoxious retarded psychopath child that for some reason needs to be adopted by society but never seems to happen. My dad didn't adopt you in the 1950's and I'm not going to adopt you in the 2010's.

              3d graphics on a 2d screen. You can stay. 3d screens go away.
              Oh, how I hope you're trolling.

              Comment


              • #57
                Originally posted by bridgman View Post
                The counter-argument (which we used when trying to sell 3D in the early 80s) is that a window is flat (2D) but you can still see realistic-looking 3D through it...
                God bless linear algebra?

                Comment


                • #58
                  Originally posted by etnlWings View Post
                  Oh, how I hope you're trolling.
                  I'm also pretty unenthused about 3D screens compared to the prospect of, say, high-DPI (200+ DPI) screens going mainstream, which is something that I'd *really* appreciate.

                  Comment


                  • #59
                    Originally posted by DuSTman View Post
                    I'm also pretty unenthused about 3D screens compared to the prospect of, say, high-DPI (200+ DPI) screens going mainstream, which is something that I'd *really* appreciate.
                    I had an iPhone 4 in my hand with that 'retina' display. Man I was blown away by an HD movie playing. It is insanely awesome! I read on /. that there was now a new display technique that had 8 times the density of the retina display. It is insane and extremely lifelike!

                    Comment


                    • #60
                      Originally posted by V!NCENT View Post
                      I had an iPhone 4 in my hand with that 'retina' display. Man I was blown away by an HD movie playing. It is insanely awesome! I read on /. that there was now a new display technique that had 8 times the density of the retina display. It is insane and extremely lifelike!
                      Indeed. I wonder when we'll start seeing these displays, though. IBM made some 200 DPI 20" screens a few years back, but they needed a specially adapted Matrox card, and they were discontinued after a while (understandable, as they cost several thousand quid). Still, the iPhone's retina display proves the maturity of the manufacturing process. Historically, the real barriers were that DVI couldn't handle the bandwidth requirements and that the OSes couldn't scale properly; both have recently been fixed (DisplayPort, plus new versions of all the main OSes' GUI toolkits have established methods for coding resolution-independent UIs - of course, some apps will still have hardcoded dimensions, but that will have to be left to time).
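A back-of-the-envelope sketch of why DVI was the bottleneck for panels in that class (the ~20% blanking overhead is a rough assumption, not an exact CVT timing; the 165 MHz single-link limit is from the DVI 1.0 spec):

```python
import math

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Approximate pixel clock in MHz; the ~20% blanking overhead is a
    rough assumption (real CVT/GTF timings vary)."""
    return width * height * refresh_hz * blanking_overhead / 1e6

SINGLE_LINK_DVI_MHZ = 165.0   # DVI 1.0 spec: max TMDS pixel clock per link
DUAL_LINK_DVI_MHZ = 330.0

clock = pixel_clock_mhz(3840, 2400, 60)          # a T221-class panel
links = math.ceil(clock / SINGLE_LINK_DVI_MHZ)   # single links needed
print(f"{clock:.0f} MHz -> roughly {links} single DVI links at 60 Hz")
```

Even dual-link DVI can't carry that pixel clock, which roughly matches why IBM needed a specially adapted multi-link card; a single DisplayPort connection with enough lane bandwidth removes that juggling entirely.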

                      Comment
