
Thread: R800 3D Mesa driver is close to release

  1. #31
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,385


    Quote Originally Posted by DuSTman View Post
    Point is, they aren't writing a complete driver - they're copying the r600 one. The Evergreen microcode format is almost identical to that of r600/r700, with the exception of needing interpolation instructions for texture reads, so the compiler could probably be used with very little modification. Then there have apparently been some changes to the register offsets, but that, again, is something that should be straightforward to fix up for someone with the information.
    It's a bit more involved than that. Parameter interpolation moves from hardware into the shaders, the hardware that was used for managing constants in the 6xx/7xx driver is gone (my understanding is that there were two ways to handle constants and one of them was removed), and there are a few not-particularly-well-documented bits that we're still figuring out. We knew about the first one, figured out the second one partway through the implementation, and are still struggling with the third. Plus the register offsets changed.
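
    To give a feel for the first change: on 6xx/7xx, interpolated vertex outputs just appear in the pixel shader's GPRs, while on Evergreen the compiler has to prepend interpolation instructions itself. Here's a rough, compilable C sketch of that prologue; every name in it is made up for illustration, so don't mistake it for actual r600/Evergreen driver code:

    #include <stdio.h>

    /* Stand-ins for the Evergreen interpolation opcodes; the real ISA
     * encodes these as ALU instructions (INTERP_XY, INTERP_ZW). */
    enum { OP_INTERP_XY, OP_INTERP_ZW };

    /* A real compiler would append an ALU instruction to the shader
     * being built; here we just log what would be emitted. */
    static void emit_alu(int op, int dst_gpr, int ij_gpr, int param)
    {
        printf("ALU %-9s dst=GPR%d src=GPR%d PARAM%d\n",
               op == OP_INTERP_XY ? "INTERP_XY" : "INTERP_ZW",
               dst_gpr, ij_gpr, param);
    }

    /* The prologue an Evergreen pixel shader needs; on 6xx/7xx this
     * loop simply does not exist, because fixed-function hardware did
     * the interpolation before the shader started. */
    static void evergreen_emit_input_interp(int num_inputs)
    {
        for (int i = 0; i < num_inputs; i++) {
            int ij_gpr = 0;  /* assume barycentric i/j arrive in GPR0 */
            emit_alu(OP_INTERP_XY, i + 1, ij_gpr, i);  /* writes .x/.y */
            emit_alu(OP_INTERP_ZW, i + 1, ij_gpr, i);  /* writes .z/.w */
        }
    }

    int main(void)
    {
        evergreen_emit_input_interp(2);  /* e.g. a color + a texcoord */
        return 0;
    }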

    My guess is that we'll finish IP review about the same time we finish figuring out the hardware.

    Quote Originally Posted by DuSTman View Post
    I mean, from what we're told, it's so similar that it would probably take a full-time dev who's familiar with the codebase and has all the info no more than a week to adapt a copy of the r600 classic driver, whereas a gallium driver would probably take a couple of months at least before it reaches the same level of maturity.
    Getting it running on the classic Mesa driver is much faster for this generation (r600g picked up texture code today; textures have been running for a while on Evergreen), but I'm hoping we can bring up the next hardware generation directly on the Gallium3D driver.

  2. #32
    Join Date
    Nov 2007
    Posts
    1,024


    It is one thing to use your hardware; it is another thing to use it within its useful lifetime.
    You appear to be confused about the useful time frame of hardware. The HD5000 series is for bleeding-edge early adopters right now. The vast majority of consumers are still running HD2000 through HD4000 hardware, if not 9800-class chips.

    This is no different than the OpenGL 4.1 threads. OpenGL 4.1 is put out now so that it can be in use 2-3 years from now. If they waited 2-3 years before releasing it, then it wouldn't be in real use until 4-6 years from now.

    Perhaps you bought brand new top-of-the-line hardware and are miffed it isn't supported. I was miffed when my home desktop's HD4000 series card wasn't supported, too. But to claim that my HD4770 is "past its useful time frame" is just freaking idiotic.

    According to Steam, all of 3% of users are even using HD5000 series hardware right now, and it's 9 months old. The most popular ATI series is still the HD4800. The most popular NVIDIA card is the 8800, which is now five generations old (it being followed by the 9x00 series, GT100 series, GT200 series, and now GT400 series).

    The lag in driver support from ATI sucks for early adopters, but it by no means leaves only useless hardware supported.

    But hey, why use facts and statistics and reasoning when you can make grandiose claims and make your post seem more dramatic and meaningful?

  3. #33
    Join Date
    Jun 2009
    Posts
    2,926


    But you have to admit that it's very amusing to observe the evolution of the complaints.

    They went roughly like: OSS drivers don't work at all -> OSS drivers are not accelerated -> OSS drivers are not as fast as binary drivers -> OSS drivers do not have powersaving -> OSS drivers cannot run obscure OpenGL 4.1 demos shortly after a bleeding edge next generation hardware hits the shelves.

    I can live with last year's graphics card and will survive not being able to run Unigine Heaven. In return, I get a fully OSS stack which works out of the box and is a full part of the operating system.

    Once the R800 driver has similar performance to the r600 one, I'll probably get an Evergreen card for a multiseat configuration.

  4. #34
    Join Date
    Mar 2009
    Posts
    8


    Quote Originally Posted by pingufunkybeat View Post
    But you have to admit that it's very amusing to observe the evolution of the complaints.
    I'd be happy if I had at least tear-free Xv acceleration on my HD5850. I'm so tired of waiting.

  5. #35
    Join Date
    Oct 2009
    Posts
    2,061


    Quote Originally Posted by elanthis View Post
    The most popular NVIDIA card is the 8800, which is now five generations old (it being followed by the 9x00 series, GT100 series, GT200 series, and now GT400 series).
    Mustn't forget that nvidia's "hardware generations" are in name alone. They take what is virtually the same exact part, slap a new sticker on it, and call it "new". Virtually everything from the 8xxx through the 3xx series is the same thing, which puts the 8800 ***ONE*** legit hardware generation behind the current 4xx series.

  6. #36
    Join Date
    Jun 2009
    Posts
    2,926


    Quote Originally Posted by boris64 View Post
    I'd be happy if I had at least tear-free Xv acceleration on my HD5850. I'm so tired of waiting.
    The initial Evergreen support is taking longer than I had hoped; I agree with that.

    But I also agree with elanthis that calling a 6-month old card "past its useful lifetime" is ridiculous.

  7. #37
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by bridgman View Post
    It's a bit more involved than that. Parameter interpolation moves from hardware into the shaders, the hardware that was used for managing constants in the 6xx/7xx driver is gone (my understanding is that there were two ways to handle constants and one of them was removed), and there are a few not-particularly-well-documented bits that we're still figuring out. We knew about the first one, figured out the second one partway through the implementation, and are still struggling with the third. Plus the register offsets changed.

    My guess is that we'll finish IP review about the same time we finish figuring out the hardware.

    Getting it running on the classic Mesa driver is much faster for this generation (r600g picked up texture code today; textures have been running for a while on Evergreen), but I'm hoping we can bring up the next hardware generation directly on the Gallium3D driver.
    You need the spec to build a free driver, but yes, you also need the free driver to write a 100% correct spec ;-)

    I think the fglrx team does not have this kind of 100% handbook ;-)

  8. #38
    Join Date
    Aug 2007
    Posts
    437


    The thing is, GPUs are an all-or-nothing kind of business. If you don't have the latest bleeding-edge GPU, it doesn't matter much what you have.

    Bragging that the HD2000 is faster than the HD1000 is like bragging that a P2 is faster than an MMX: it is faster, but not to a useful degree anymore.

    Also, you can brag all you want about how fast and glorious your HD4770 is compared to an HD3850 and such, but it doesn't really matter. Scoring 3000 vs 1000 in 3DMark Vantage doesn't matter when there are mainstream cards on the market which cost under 100 dollars and score 8000~10000 marks.

    Conclusion? If you don't push brand new hardware to its full potential, it will soon not matter, and you will have wasted your money buying it. That's what I meant by 'useful lifetime'.

    Also, I'm not calling the HD5850 past its useful lifetime NOW; I'm just predicting that will be the case BY THE TIME the OSS drivers pick up the functionality.

  9. #39
    Join Date
    Aug 2007
    Posts
    437


    I'm not confused. Having owned more than 50 cards from 1998 to 2010 and followed fglrx and the OSS ATI drivers closely for more than 5 years, I think I know what I am talking about.

    I have an HD2900XT sitting in the cupboard; why not use it? Because a $99 HD5670 is far ahead of it in performance, power consumption, noise, heat, size, almost every aspect. Yes, the HD2900XT is fast, faster than all HD1000-class cards, but no, I won't use it, because there are far better alternatives.

    The same goes for newer cards. Will I buy an HD5870 just because after 2 years I will have full OSS support for it? No, I won't, because by the time it picks up OSS support there will be far superior cards for a fraction of the price.

    It is OK to advance in technology sometimes, folks.

  10. #40
    Join Date
    Oct 2009
    Posts
    2,061


    Quote Originally Posted by FunkyRider View Post
    I'm not confused. Having owned more than 50 cards from 1998 to 2010 and followed fglrx and the OSS ATI drivers closely for more than 5 years, I think I know what I am talking about.

    I have an HD2900XT sitting in the cupboard; why not use it? Because a $99 HD5670 is far ahead of it in performance, power consumption, noise, heat, size, almost every aspect. Yes, the HD2900XT is fast, faster than all HD1000-class cards, but no, I won't use it, because there are far better alternatives.

    The same goes for newer cards. Will I buy an HD5870 just because after 2 years I will have full OSS support for it? No, I won't, because by the time it picks up OSS support there will be far superior cards for a fraction of the price.

    It is OK to advance in technology sometimes, folks.
    You seem to be missing the point, looking at it from a perspective that is not universally applicable.

    SOME people may like to have the latest and greatest, BUT that doesn't mean EVERYONE does.

    Here's an example: I just purchased a new mainboard -- an 890GX chipset with an integrated Radeon 4290 and supposed support for dual digital display outputs (I've got the manufacturer scrambling to figure out why it isn't working, but that's a different story). These were the objectives:
    1) open source 3D support (being R700, it is nice and supported),
    2) ability to drive dual 1920x1200 displays,
    3) AM3 to make effective use of a 1090T.

    You're probably sitting there scratching your head, wondering why I would mix some weakass IGP like a 4290 with a 6-core 1090T... well, the answer is this: the 4290 completely meets ALL of my graphics performance needs, and I have much more productive uses for the 1090T than playing video games. Hence I have an IGP that is basically 2-year-old tech (RV710), and I have no interest AT ALL in having any higher-performance add-on card in the machine.
