XvMC support

  • #81
    Originally posted by popper
    That's a shame; we're looking at months at the very least then!
    For open source, yes, but I expect fglrx will have it sooner.

    Originally posted by popper
    I don't know why (other than saving pennies they could recoup on the retail cost) you HW vendors don't just move away from these antiquated DSP SoCs and start using current, faster, and vastly more expandable FPGAs for your UVD. You (or indeed anyone) could then simply re-program on the fly for many other HW-assisted tasks and market sectors. Imagine the open source and even closed add-on FPGA (Field Programmable Gate Array) code you could market and sell into generic mass markets. Putting an FPGA on every ATI/AMD gfx and related card/MB would bring FPGA prices right down, in line with or below the cheap DSPs favoured today, perhaps...
    FPGAs need an insane number of random-logic transistors (i.e. they take a lot of die area per unit of functionality) to perform the same work as application-specific logic. I think UVD would end up roughly the same size as the RV770 shader core if we did it in an FPGA. It's a nice idea, but the cost would probably be a lot higher than you expect.
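    For a sense of scale, a back-of-envelope sketch of that area argument (the ~35x multiplier is the commonly cited academic estimate of the FPGA-vs-ASIC area gap; the die sizes below are invented placeholders, not AMD data):

```python
# Rough sanity check of the area argument: FPGA lookup-table fabric
# spends far more silicon than application-specific gates for the same
# logic. The ~35x multiplier is the widely cited academic FPGA-vs-ASIC
# estimate; the block sizes here are invented, not real AMD numbers.

FPGA_AREA_PENALTY = 35  # area multiplier: LUT fabric vs. standard-cell ASIC


def fpga_equivalent_area(asic_area_mm2: float,
                         penalty: float = FPGA_AREA_PENALTY) -> float:
    """Estimate the FPGA die area needed to host logic that occupies
    asic_area_mm2 when implemented as application-specific gates."""
    return asic_area_mm2 * penalty


# Hypothetical: if a UVD-style decode block took ~3 mm^2 as an ASIC,
# an FPGA version would land in shader-core territory.
print(fpga_equivalent_area(3.0))  # -> 105.0
```

    Whatever the exact multiplier, a 30-40x area blow-up is the reason fixed-function decode blocks stay application-specific.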

    Originally posted by popper
    Given the apparently long potential wait for anything ATI UVD related, perhaps it's finally time to move over to NV cards for now as the only viable option for many people worldwide, as CoreAVC have a Linux library available and have released test HW-assisted CUDA/VS2 on Windows.
    Again, I was talking about UVD support in the open source drivers.

    Originally posted by popper
    I don't know if it will be usable on Linux x86 as yet, though.
    The nvcuvid library that everything else builds on is Windows only AFAIK.
    Last edited by bridgman; 08 January 2009, 08:58 PM.



    • #82
      If you want an FPGA-based graphics card you should look at the Open Graphics Project. Their initial developer cards are FPGA based, and note that they use a fairly beefy FPGA just to do a "simple" graphics card. You can also gasp at the price of one of those boards (though, true, the niche volume is what keeps costs up).

      However, even the Open Graphics people want to transition to a wide-release ASIC version. FPGAs use much more power and silicon, and are less capable, than an equivalent ASIC; as a result their primary markets are developers, and applications that need specific functionality for which no equivalent ASIC exists.
      Last edited by _txf_; 08 January 2009, 09:42 PM.



      • #83
        > If you want an FPGA-based graphics card you should look at the Open Graphics Project.




        • #84
          The reason I mentioned FPGAs in passing was not to make the whole core of the gfx card out of one, but rather to finally tap into the far broader mass markets where many gfx apps and datasets could benefit IF there were an FPGA put on a common board such as a gfx card or a common-or-garden motherboard, for instance...

          NV and ATI are using a static ASIC today for their black-box video engine; I think an FPGA in this case would be better (other than, as bridgman outlined, the die space in your black-box video AV SoC [System on a Chip]), of course.

          The "FPGAs use much more power/silicon and are less powerful than an equivalent ASIC" line doesn't seem to hold as much today; for instance this:

          True, it's a large chip, but it's billed as "... claimed to offer the industry's largest density, highest performance, highest system bandwidth, and lowest power among high-end FPGA solutions."

          They, as you might expect, also offer conversion to Altera's transceiver-based HardCopy IV GX ASICs.

          But that's not my point. If all your new HW has a generally available FPGA onboard, making it just as common as a USB port, then you can start to use it to set yourself apart in the mass market, and offer lots of innovative ways for your customer base to interact with that FPGA (or FPGAs) for their own use, for instance.

          I'm trying to get away from the old-school thinking of prototyping on an FPGA and copying to your slightly cheaper static mass-market ASIC, only to later find you need to fix a bug or limitation you didn't imagine and run off a new batch/revision.

          Using FPGAs in all your kit could remove that problem, bring that convenience to the masses of home devs, and also give you the ability to fix your design errors/limitations later.

          In the case above, as bridgman says, it increases die area in a specialist SoC, so it's not as versatile there; but there's nothing stopping you putting a separate off-the-shelf part on your main PCB and linking it to your main SoC that way, to use as you see fit...

          Generally speaking, it's become cheaper now to take other people's SoCs and mix and match them to get your desired end result rather than have an ASIC made yourself; so it's just the logical next step to forget the limited static ASIC and use flexible FPGAs en masse to bring the prices down there. They are as fast or faster, and would only get better, as CPUs and gfx did, if HW vendors started using them en masse in the PC realm.

          Or that's how I see it today, and how it could be tomorrow, with some insight and innovative thinking.

          I like http://www.pldesignline.com/ for getting you thinking...

          If you wanted to get really twisted, of course, there's also the KiloCORE PPC-based FPGA with 256 and 1024 cores from way back in 2006. Mixed PPC/AltiVec and x86: now that's a sure-fire headline even today.

          Last edited by popper; 09 January 2009, 03:06 AM.



          • #85
            Might be considered ranting rather than helping, but...

            Where exactly lies Linux's appeal for the average Windows convert
            (being one myself)? Can't play some fancy 3D games? OK, I can live
            with that. Must learn a lot about things I never even knew were
            needed. For example, what's the thing with XRandR? Everyone gets
            excited about (gasp!) changing resolution, multiple displays and
            rotating them? Well, that's an amazing feature; guess it shouldn't
            be taken for granted in the third millennium.

            Going on: want to play MP3? Good luck finding your way through names
            like GStreamer, Xine, Phonon, PulseAudio... Fancy window rendering?
            No problem: first you've got to install fglrx or radeonhd or radeon
            or ati, and then configure your Composite flag in xorg.conf, but you
            can't do it if you don't su/sudo first. Fglrx? Fglrx? WTF is fglrx?
            Compiz? Metacity? Kwin? People, I just want them kube rotatin'n'shit.
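            (For the record, the xorg.conf incantation in question, assuming a stock X.Org setup, is only a few lines, edited as root:)

```
# /etc/X11/xorg.conf -- enable the Composite extension
Section "Extensions"
    Option "Composite" "Enable"
EndSection
```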

            ACPI on laptops? hald, acpid: there's always something wrong with
            them. Deeper C-states on processors have problems, you can't have
            power management on laptop IGPs (without fglrx), plus standby and
            hibernate woes.

            OK, most regular Redmond-style users quit at this point. Me? Not.
            I read, I learned, compiled quite a bit, changed text files, patched,
            recompiled. Slow and steady; one day I'll even manage to do it all
            without a blink. Not so soon, though.

            But... at the end of the day, what do I get, where's the satisfaction?
            Can't play games: checked. Can't have video under Compiz: checked.
            Can't do web development right (IEs4linux: not so good; Flash?): checked.
            Can't play a Shockwave game online during a break: checked. But wait!
            I can watch my HD H.264 DVB-T TV broadcasts and trailers... NOT.

            I always considered Linux to be the kind of OS that is smart about
            resources. You know, doing things the way they should be done.
            If you've got specialized hardware to do some kind of work, use it;
            optimize, optimize, optimize. Come on... an i810 integrated graphics
            chip has working XvMC. If it is such a hassle to program working
            MPEG-2 acceleration on a modern graphics processor (and it seems it
            is), then what's the point? Where is that magical area that Linux
            excels at? Compiling stuff? Watching DVDs on a console framebuffer?
            Showing some other friend wobbly windows?
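            (For reference, the i810 XvMC path above is reachable with the commonly cited MPlayer invocation below, assuming an MPlayer build with XvMC support compiled in:)

```shell
# Hand MPEG-2 decode to the GPU via XvMC; needs an XvMC-capable driver
# and an MPlayer built with XvMC support
mplayer -vo xvmc -vc ffmpeg12mc recording.mpg
```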

            I know what the answer is: why don't you do something about it
            yourself? Yup, criticism is hard to take when you're the one doing
            all the work. I respect that. And I promise I won't give you a
            "But...". Just a thought.



            • #86
              " I just want them kube rotatin'n'shit
              "

              ROTFL



              • #87
                [...] then what's the point? Where is that magical area that Linux excels at [...]
                Choice.

                This is a system that can scale from paltry router hardware (a 100 MHz processor, 4 MB flash) all the way to clusters with thousands of processors. You can choose exactly how you want your system to work, what software to run, and what the software will look like.

                I'm not saying this power comes for free - you've seen the price. I'm also not saying this is something most people want or need - but it's there if you need it.

                However, Linux is steadily getting better, bit by bit. 2008 brought major improvements to wireless and video drivers, the two biggest pain points. Most companies have now released, or have promised to release, specs for their hardware. The UI is improving (compare KDE 4 to KDE 3, or GNOME 2.24 to 2.16, if you want proof). More programs are seeing ports or can be adapted to run under Wine. 2009 will hopefully bring massive improvements to video support; the pieces are already in place.

                We won't see masses of people switching overnight (people never like change; XP vs Vista, anyone?), but the more viable Linux becomes, the more adopters it will gain. In fact, I was quite surprised at how many people in my immediate environment switched, said they liked the change, and never went back.

                I agree it gets frustrating at times, and I certainly share your pain about video playback, Flash support and certain development tools. However, I look forward to the (hypothetical) day when I'll be able to go to a shop and buy a computer with Linux installed by default ("Do you want Windows? It's only €80 more.")



                • #88
                  As has been stated before, it's a resource issue. Windows has 90% of the desktop market; Linux is at maybe 1-2%. I'd love to support video decode, 3D, advanced power management, etc., but these tasks are non-trivial (especially considering how complex gfx hardware is nowadays) and there are only a handful of active developers.



                  • #89
                    Originally posted by bridgman
                    The ideal solution is to have access to the dedicated hardware we all put in our chips to support BluRay playback. Unfortunately that hardware is also wrapped up in bad scary DRM stuff so making it available on Linux is a slow and painful process.
                    It seems the graphics hardware manufacturers should spend more time lobbying US legislators, both to repeal the DMCA and, preferably, to ban DRM outright. It's a boat anchor on the tech industry; it hobbles the pace of technical innovation, and everyone knows it.

                    Seriously... looking at the resources driver development (for all platforms) sinks into dealing with DRM in one fashion or another, lobbying might be time and money better spent. If DRM were taken out of the picture, anyone could use your fancy dedicated hardware on any system without any lawyers having a hemorrhoid over the whole thing. Which, frankly, would raise the intrinsic value and usefulness of your hardware product.



                    • #90
                      Originally posted by Porter
                      It seems the graphics hardware manufacturers should spend more time lobbying US legislators, both to repeal the DMCA and, preferably, to ban DRM outright. It's a boat anchor on the tech industry; it hobbles the pace of technical innovation, and everyone knows it.

                      Seriously... looking at the resources driver development (for all platforms) sinks into dealing with DRM in one fashion or another, lobbying might be time and money better spent. If DRM were taken out of the picture, anyone could use your fancy dedicated hardware on any system without any lawyers having a hemorrhoid over the whole thing. Which, frankly, would raise the intrinsic value and usefulness of your hardware product.
                      I am not a lawyer (oh GOD no, I am not a lawyer!), but if my interpretation of 17 USC section 117 (the part of U.S. copyright law covering backup copies of computer programs; the "fair use" provision itself is section 107) is anywhere near correct, you could argue that DRM is already illegal, as it prevents users from backing up media or programs, moving software to a new machine, or copying for educational and research purposes, all of which are rights granted to the users of copyrighted works.

