
ATI Evergreen 3D Code May Soon Go Into Gallium3D


  • droidhacker
    replied
    Originally posted by TomcaT-SdB View Post
    Based on the direction that the entertainment industry is moving in, it's an important point to cover.
    The entertainment industry isn't going to be FORCING 3D on anyone, since doing so WILL cut off the potential customers who are either unable or unwilling to do 3D in the first place. And no, technological limitations requiring hardware upgrades aren't the only reason -- epileptics and people with heart conditions are advised to avoid 3D and other weird strobing effects... which means that all of the hardware and content MUST have a 2D fallback mode.

    On top of that, who is to say that 3D this time around is any less a fad than it was any of the previous times it went around? (and it's been around a lot!)

    And I'll take 2D over a headache any day.



  • TomcaT-SdB
    replied
    Originally posted by etnlWings View Post
    I'm trying to figure out the point of this argument but it continues to elude me. There's been a huge 'multi-vendor push' for a lot of things, yet you're still not likely to find a lot of people who refuse to buy something if it's not in HD and in 5.1. Hell, a large portion of even the geek community still download VCD-quality rips off of TPB.
    The vast majority of sets available at Target, Best Buy, Walmart, etc. are flat panels capable of 720p or higher. It's rare to find an SDTV on the shelf. Since the HD DVD / Blu-ray format war was resolved, Blu-ray has stayed well ahead of DVD adoption rates (year-to-year market %). When multiple content makers and hardware vendors push the same technology, the consumer base adopts it. The current direction of the entertainment industry is a strong embrace of stereo.

    ...yes? Courtier's Reply much? I 'ignore' something and you fail to make a point of it, well aren't we a crazy bunch.
    You were asserting that stereo is only appealing to enthusiasts and/or kids, as well as casting doubts about the readiness of the tech:
    No, more like writing it off as it's strictly the purview of enthusiasts and kids and they're still iffy about the technology
    Which, based on the content slated for release, the 3D Blu-ray format already being standardized, and standardization underway or complete for stereo TV broadcasts in the UK, China, the US, etc., is false. You made an assertion which flies in the face of reality. So, yes, you are ignoring the current state of standards within the industry.

    ...while their non-3D counterparts remain significantly cheaper; it's still premium hardware. You haven't contradicted that.
    The sub-$2000 figure was from around a year ago; I haven't looked into pricing for that unit recently, so I don't have a current number. All a display needs in order to be stereo compliant is a high enough refresh rate, a sync out for active stereo, and a decoder capable of handling the stereo format. As far as refresh rates go, many current LCD sets have 120Hz or higher refresh rates, and the price point for those units starts around 475 to 500 USD based on some recent pricing in stores. How much of a price premium will be charged for stereo sets is up in the air, but we'll know that early next year. The actual cost of making the hardware, though, shouldn't be much higher than what we already have, since the refresh rate of the panels was the highest barrier to overcome.
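
    As a rough illustration of why that 120Hz figure matters (the numbers below are only illustrative), active stereo alternates left and right frames, so the panel's refresh rate gets split between the two eyes:

    /* Back-of-envelope arithmetic behind the refresh-rate requirement:
       active stereo alternates left/right frames, so each eye sees half
       the panel rate. Figures are illustrative. */
    #include <stdio.h>

    int main(void)
    {
        int panel_hz = 120;          /* a typical "120Hz" LCD panel */
        int per_eye  = panel_hz / 2; /* frames each eye actually sees */

        printf("%d Hz panel -> %d Hz per eye with shutter glasses\n",
               panel_hz, per_eye);
        return 0;
    }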

    Oooooor support core features people are likely to use day-in, day-out, while leaving the frills on an 'as time permits' priority list.
    I agree that having the basics covered first is the right direction to go. My initial question was simple: does the Gallium framework have hooks in place to support active stereo, or alternatively, is stereo something that's outside the scope of the Gallium layer? My hope is that stereo is at least on the radar for the open drivers. Based on the direction that the entertainment industry is moving in, it's an important point to cover.
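
    For concreteness, this is the sort of application-side hook I mean: quad-buffered (active) stereo as exposed through OpenGL. It's only a rough sketch using GLUT's stereo flag, nothing Gallium-specific, and it only runs if the driver actually advertises a stereo-capable visual; the scene and eye separation are placeholder values.

    /* Minimal sketch of quad-buffered (active) stereo with OpenGL + GLUT. */
    #include <GL/glut.h>

    static void draw_scene(float eye_offset)
    {
        glPushMatrix();
        glTranslatef(eye_offset, 0.0f, 0.0f); /* shift the view for this eye */
        glutWireTeapot(0.5);
        glPopMatrix();
    }

    static void display(void)
    {
        const float half_iod = 0.03f;         /* half the interocular distance */

        glDrawBuffer(GL_BACK_LEFT);           /* render the left-eye view */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        draw_scene(-half_iod);

        glDrawBuffer(GL_BACK_RIGHT);          /* render the right-eye view */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        draw_scene(+half_iod);

        glutSwapBuffers();                    /* swaps both back buffers */
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        /* GLUT_STEREO requests a quad-buffered visual; window creation fails
           if the driver/display doesn't support it -- that's the "hook". */
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
        glutCreateWindow("active stereo sketch");
        glEnable(GL_DEPTH_TEST);
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }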



  • droidhacker
    replied
    Originally posted by V!NCENT View Post
    Errr... I thought that 3D photography was done using two cameras in one?
    Not necessarily in one... two cameras side by side. They have to be synchronized (i.e. time-stamped from a common time source), they have to be able to angle towards and away from each other so that they converge on the subject distance, and the two optical paths have to be separated by about the same distance as the distance between your eyes, at the "viewing" distance back from the subject.
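
    To put a rough number on the convergence part (the figures are purely illustrative), the toe-in angle per camera falls out of simple trigonometry:

    /* Illustrative toe-in calculation for two side-by-side cameras that
       converge on the subject; separation and distance are example values. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double separation_m = 0.065; /* ~65 mm, roughly human eye separation */
        double subject_m    = 3.0;   /* distance from the cameras to the subject */

        /* Each camera is angled inward by half the convergence angle. */
        double toe_in_rad = atan((separation_m / 2.0) / subject_m);
        double toe_in_deg = toe_in_rad * 180.0 / acos(-1.0);

        printf("toe-in per camera: %.2f degrees\n", toe_in_deg);
        return 0;
    }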



  • V!NCENT
    replied
    Errr... I thought that 3D photography was done using two cameras in one?



  • droidhacker
    replied
    Originally posted by highlandsun View Post
    Which is why analog 3D cameras never caught on. But if you're recording digitally with an infinite focus, then you can blur the background or foreground algorithmically when displaying the result.
    Doesn't matter if it is analog or digital. A lens is still a lens and an aperture is still an aperture. There is NO SUCH THING as "infinite focus".

    What you are describing is a large depth of field, which is just as attainable with analog as with digital. Of course, it is easier to FAKE a small depth of field using a digital source that has measures of DISTANCE for all points on the image... which are NOT entirely present in 3D photography, but which can be sort-of "guessed".

    Now unfortunately, this approach doesn't yield good results... specifically, you obtain a large depth of field by using a small aperture. A small aperture means less light and more diffraction, which generally reduces image quality, which means that you can't blow it up as much -- which means no IMAX theaters, since it would look like crap when blown up to that size.

    And then there is a whole bunch more complexity.... http://en.wikipedia.org/wiki/Depth_of_field
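
    If anyone wants to play with the trade-off, here's a rough calculation using the usual thin-lens approximations described in that article; the focal length, aperture, and circle-of-confusion values are just examples:

    /* Illustrative depth-of-field numbers using the standard approximations. */
    #include <stdio.h>

    int main(void)
    {
        double f = 0.050;    /* focal length: 50 mm */
        double N = 8.0;      /* f-number; a smaller aperture means a larger N */
        double c = 0.000030; /* circle of confusion: 30 um (full-frame 35 mm) */
        double s = 3.0;      /* focus (subject) distance in metres */

        double H      = (f * f) / (N * c) + f;           /* hyperfocal distance */
        double d_near = s * (H - f) / (H + s - 2.0 * f); /* near limit of sharpness */
        double d_far  = s * (H - f) / (H - s);           /* far limit (valid for s < H) */

        printf("hyperfocal: %.2f m, in focus from %.2f m to %.2f m\n",
               H, d_near, d_far);
        return 0;
    }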



  • droidhacker
    replied
    Originally posted by smitty3268 View Post
    Warning, Star Trek nerd alert:

    The holograms in star trek actually were just supposed to be 3D images made out of light, and the physicality of them was faked by using projected force fields.
    No, their description was actually a lot more complex than that.
    More like force-fields + tractor beam + replication + transporter.

    All of which are magical "things" that haven't been invented yet.

    It would work like this in theory:
    You replicate a thin surface to provide texture and the appropriate reflective properties, and hold it in place with a tractor beam / force field. Light would be provided by a standard light source rather than an actual hologram.

    OR, with just a force-field... that force field would itself carry the reflective properties.

    Remember that their standard "force-field" would give you a zap and light up if you touch it.... the "holodeck" didn't exhibit this characteristic, but a "holographic" brick wall would block your way just the same. By their description, that "holographic" brick wall was, in fact, an ACTUAL brick wall.



  • DuSTman
    replied
    Originally posted by V!NCENT View Post
    I had an iPhone 4 in my hand with that 'retina' display. Man I was blown away by an HD movie playing. It is insanely awesome! I read on /. that there was now a new display technique that had 8 times the density of the retina display. It is insane and extremely lifelike!
    Indeed. I wonder when we'll start seeing these displays, though. IBM made some 200DPI 20" screens a few years back, but they needed a specially adapted Matrox card, and they were discontinued after a while (understandable, as they cost several thousand quid). Still, the iPhone's retina display proves the maturity of the manufacturing process. Historically the real barriers were that DVI couldn't handle the bandwidth requirements, and that the OSes couldn't scale properly; both have recently been addressed (DisplayPort, plus new versions of all the main OSes' GUI toolkits have established methods for coding resolution-independent UIs -- of course, there'll still be some apps with hardcoded dimensions, but that'll have to be left to time).
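
    Rough numbers for the bandwidth point, assuming a 4:3 20" panel at 200 DPI, 60Hz, and ~20% blanking overhead (all assumptions, just to show the order of magnitude):

    /* Back-of-envelope pixel-clock estimate for a high-DPI panel versus the
       DVI limits; panel geometry and blanking overhead are assumptions. */
    #include <stdio.h>

    int main(void)
    {
        double diagonal_in = 20.0, dpi = 200.0, refresh_hz = 60.0;
        double blanking    = 1.20; /* ~20% extra for blanking intervals */

        /* 4:3 panel: width and height follow from the diagonal (3-4-5). */
        double width_in  = diagonal_in * 4.0 / 5.0;
        double height_in = diagonal_in * 3.0 / 5.0;
        long   px_w = (long)(width_in * dpi);
        long   px_h = (long)(height_in * dpi);

        double pixel_clock_mhz = (double)px_w * px_h * refresh_hz * blanking / 1e6;

        printf("%ldx%ld @ %.0f Hz needs roughly %.0f MHz of pixel clock\n",
               px_w, px_h, refresh_hz, pixel_clock_mhz);
        printf("single-link DVI tops out around 165 MHz, dual-link around 330 MHz\n");
        return 0;
    }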



  • V!NCENT
    replied
    Originally posted by DuSTman View Post
    I'm also pretty unenthused about 3d screens in relation to the prospect of, say, high DPI (200+ dpi) screens going mainstream, which is something that I'd *really* appreciate.
    I had an iPhone 4 in my hand with that 'retina' display. Man I was blown away by an HD movie playing. It is insanely awesome! I read on /. that there was now a new display technique that had 8 times the density of the retina display. It is insane and extremely lifelike!



  • DuSTman
    replied
    Originally posted by etnlWings View Post
    Oh, how I hope you're trolling.
    I'm also pretty unenthused about 3d screens in relation to the prospect of, say, high DPI (200+ dpi) screens going mainstream, which is something that I'd *really* appreciate.



  • nanonyme
    replied
    Originally posted by bridgman View Post
    The counter-argument (which we used when trying to sell 3D in the early 80s) is that a window is flat (2D) but you can still see realistic looking 3D through it...
    God bless linear algebra?
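
    Pretty much -- a flat surface only has to show the result of a perspective projection. A toy example (the point and plane distance are made-up values):

    /* Toy perspective projection: a 3D point mapped onto a flat image plane,
       which is all a 2D "window" needs to show a 3D view. */
    #include <stdio.h>

    int main(void)
    {
        double x = 1.0, y = 2.0, z = 5.0; /* a point in front of the viewer */
        double d = 1.0;                   /* eye-to-image-plane distance */

        /* Pinhole projection via similar triangles -- the perspective divide
           that a projection matrix encodes. */
        double u = d * x / z;
        double v = d * y / z;

        printf("(%.1f, %.1f, %.1f) lands at (%.2f, %.2f) on the plane\n",
               x, y, z, u, v);
        return 0;
    }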

