AMD working on XvMC for r300g?


  • pingufunkybeat
    replied
    Originally posted by Hoodlum View Post
    I think it's more a case of someone noticing it was hard to decode and using it for that unintended purpose.
    This is the case with most benchmarks.


  • Hoodlum
    replied
    Originally posted by BlackStar View Post
    It's a clip that's intentionally hard to decode, in order to test decoding efficiency. If this doesn't fit the definition of a benchmark I don't know what does.
    Really? So you believe they shot the clip with the intention that it would be hard to decode? That seems completely unrelated to the task of filming a nature documentary to me. I think it's more a case of someone noticing it was hard to decode and using it for that unintended purpose.


  • BlackStar
    replied
    Originally posted by popper View Post
    you seem to miss the point?, its just an h.264 clip thats hard to decode, its nothing to do with "A Benchmark" app or whatever.
    It's a clip that's intentionally hard to decode, in order to test decoding efficiency. If this doesn't fit the definition of a benchmark I don't know what does.


  • agd5f
    replied
    Originally posted by misGnomer View Post
    Meanwhile for me as an end user it looks like none of the HD 4xxx or 5xxx generation cards will be reasonably usable in the near future. Maybe a cast out 3xxx card from an upgrading windows user...
    X1xxx = r500
    HD2xxx, HD3xxx = r600
    HD4xxx = r700
    HD5xxx = evergreen

    r600 and r700 are both well supported so HD2xxx-HD4xxx cards are working well already. evergreen support is just starting to roll out now.
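For quick reference, agd5f's mapping above can be expressed as a small lookup table. This is just an illustrative sketch; the `chip_family` helper and its name normalization are my own, not part of any driver API:

```python
# Marketing-name prefix -> chip family, per the mapping agd5f gives above.
FAMILIES = {
    "X1": "r500",
    "HD2": "r600",
    "HD3": "r600",
    "HD4": "r700",
    "HD5": "evergreen",
}

def chip_family(model: str) -> str:
    """Map a Radeon marketing name (e.g. 'Radeon HD 4850') to its chip family."""
    # Normalize: uppercase, drop the brand word and any spaces.
    name = model.upper().replace("RADEON", "").replace(" ", "")
    for prefix, family in FAMILIES.items():
        if name.startswith(prefix):
            return family
    return "unknown"
```

So `chip_family("Radeon HD 5450")` would report `evergreen`, matching the note that Evergreen support is only just rolling out.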


  • misGnomer
    replied
    Well d'oh (for me ;-)

    Thanks agd5f. The table lists ATI hardware features up to R700 series (aka Radeon HD 4xxx) so I presume the current HD 5xxx line is called "Evergreen".

    Being the well-intentioned, demanding and nice fellow that I am, it'd also be nice to have regular post-release human-readable "State Of The Union of Radeon" summaries aimed at layman users. Something referring to the model #s such as HD 4xxx, 5xxx etc. and with simple explanations of what's there and what's not (or yet to come). 2D, 3D, partial/full hardware acceleration, TV-out, power-saving and so forth, with some recommendations too.

    I realize that much is still in the pipeline (pun unintended) so this might be something for the Phoronix editors to ponder in the future.

    Meanwhile, for me as an end user, it looks like none of the HD 4xxx or 5xxx generation cards will be reasonably usable in the near future. Maybe a cast-off 3xxx card from an upgrading Windows user...


  • agd5f
    replied
    Originally posted by misGnomer View Post
    Where are the ATI open drivers in terms of hardware acceleration support? Is there a status chart or table somewhere showing different families/generations and their supported capabilities at present and planned? Such a table could be very useful for laymen wishing to stick with the open drivers while hoping to squeeze the maximum hardware-supported features out of their graphics hardware.
    http://wiki.x.org/wiki/RadeonFeature


  • jrch2k8
    replied
    Well, don't get me wrong, but I think the birds clip is not a good example for now, though I agree it's a good test case for the future. Why?

    Expecting an efficiently accelerated decoder to appear out of thin air is just a dream; in reality that's not going to happen.

    So the best solution, I think, is to build a modular system, begin by accelerating the most critical parts, and then start the full optimization process.

    On Linux we don't have any acceleration at all, so anything that starts to provide some, even if only a little at the beginning, will be just peachy, even using shaders and later migrating to OpenCL.

    Even a shader-accelerated Xv at the beginning can help reduce CPU overhead a bit. The next step could be partially GPU-accelerated codec routines in libavcodec, for example, plus some more OpenMP magic, etc. Little by little we will end up with one hell of a video decoding system eventually.

    So even though you're right that my torrented Blu-ray H.264 files are nothing compared to the birds clip, we need acceleration for those first; people mostly use this tech to watch videos, not for ultra-pro jobs. Once normal Blu-ray content plays just fine, that kind of ultra-high-bitrate video is the next step.
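The staged plan above (shader-assisted Xv first, then partial GPU codec routines, eventually OpenCL) amounts to a fallback chain: use the best acceleration stage the driver offers, with pure software decode always available as the floor. A minimal sketch, where all capability names and labels are entirely hypothetical (no real driver reports these strings):

```python
def probe_backends(available):
    """Return decode stages in preference order, best first.

    `available` is a set of capabilities the driver reports, e.g.
    {"shader_xv"} on today's r300g, or later perhaps {"opencl", "shader_xv"}.
    """
    preference = [
        ("opencl", "OpenCL-assisted decode"),        # the long-term goal
        ("gpu_codec", "partial GPU codec routines"), # e.g. hooks in libavcodec
        ("shader_xv", "shader-accelerated Xv"),      # colourspace/scaling on GPU
    ]
    stages = [label for cap, label in preference if cap in available]
    stages.append("software decode")  # always works, CPU only
    return stages
```

For example, with only `{"shader_xv"}` available the chain is shader-accelerated Xv backed by software decode; adding `"opencl"` later would promote OpenCL to the front without touching the rest of the pipeline, which is the point of keeping it modular.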


  • misGnomer
    replied
    Now I'm out of my depth here and this may be off-topic in this thread (r300g?). If so, my apologies, but being interested in open-source drivers and support for hardware acceleration...

    ATI's new low-end Radeon HD 5450 looks like an interesting fanless card and Anandtech's review (http://www.anandtech.com/video/showdoc.aspx?i=3734&p=4) pointed out some decoding issues using "a specially crafted heavily interlaced 1080i MPEG-2 file called Cheese Slices, made by blaubart of the AV Science Forum". It was comparing Vector Adaptive Deinterlacing vs Motion Adaptive.

    Where are the ATI open drivers in terms of hardware acceleration support? Is there a status chart or table somewhere showing different families/generations and their supported capabilities at present and planned? Such a table could be very useful for laymen wishing to stick with the open drivers while hoping to squeeze the maximum hardware-supported features out of their graphics hardware.


  • popper
    replied
    Of course, Younes Manton's Summer of Code 2008 Gallium patches stand as they are: an unfinished proof of concept (POC because he couldn't be bothered to actually finish them and get paid?). He's not interested in them any more, and he's been almost invisible since he went back to Nouveau Mesa patches etc., so it's up to other devs to take that code (or not) and do SOMETHING that actually works and is usable by the end user. Will that dev, or group of devs, be you?


  • popper
    replied
    Originally posted by BlackStar View Post
    He is. My laptop plays every 1080p movie I've thrown at it without batting an eyelid (and I've thrown many).

    Frankly, I couldn't care less if it drops frames on a benchmark, when every real world test works smoothly.
    You seem to miss the point. It's just an H.264 clip that's hard to decode; it has nothing to do with "a benchmark" app or whatever.

    It is you who chooses what stages of decode you put that clip through to get the end result: be it patching XvMC and related code so it plays that clip smoothly in the future, or whether you take Younes Manton's Summer of Code 2008 Gallium patches, which he left to rot way back on January 18, 2009 rather than finishing them to a usable state by SoC 2009.

    Or even go the way Bridgeman prefers today: OpenCL on top of the new Gallium, rather than the UVD way....

    Either way it involves refactoring or coding new ways to play that AVC 1080p HD clip, and so laying the groundwork for future 2K and 4K super-HD. Those who actually make the hardware and code the patches make the future market. Simples.
