Intel Wants YOUR Linux Questions, Feedback


  • Originally posted by AlbertP View Post
    Multicard is not yet working within the X server
    Already working with fglrx.



    • Originally posted by eugeni_dodonov View Post
      Usually such level of support lies upon Linux distribution maintainers/packagers, as it basically only involves backporting patches from newer kernel/X.org/etc releases to the supported ones.
      I don't get it. We're asking for continued support of "older" hardware in newer drivers; what has that got to do with backporting?
      Is it because "the kernel people" themselves drop support for this old Intel hardware without your consent?

      Originally posted by eugeni_dodonov View Post
      But in general, the problem is that this process is extremely time and resource-consuming. So our team's goal is to stay focused on hardware enablement in new kernels and userspace components, which are more critical and need much more attention to be working. So we leave the backporting and support for historic (by historic I mean 'no longer manufactured nor supported by any major customer') hardware to the downstream (in this case, Linux distributions or OEM/ODM companies who provide such hardware).
      So basically it's extremely hard to do, and since Intel only has billions of dollars of net income per year, AND the best knowledge and expertise about how to do this, they're telling others to do it. Sounds like a fair plan.



      • Originally posted by RussianNeuroMancer View Post
        Already working with fglrx.
        Because both cards use the same driver in AMD+AMD hybrid configurations, a hack in the driver is possible (deceiving the X server). But with cards using different drivers it is not possible unless you use Xinerama without 3D acceleration.
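
        For reference, here is a minimal sketch of that Xinerama fallback in xorg.conf (purely illustrative; the BusID values are placeholders you would look up with lspci, and I'm assuming one Intel IGP plus one Radeon card):

            Section "Device"
                Identifier "IGP"
                Driver     "intel"
                BusID      "PCI:0:2:0"
            EndSection

            Section "Device"
                Identifier "Discrete"
                Driver     "radeon"
                BusID      "PCI:1:0:0"
            EndSection

            Section "Screen"
                Identifier "Screen0"
                Device     "IGP"
            EndSection

            Section "Screen"
                Identifier "Screen1"
                Device     "Discrete"
            EndSection

            Section "ServerLayout"
                Identifier "Layout0"
                Screen 0 "Screen0"
                Screen 1 "Screen1" RightOf "Screen0"
                Option "Xinerama" "on"
            EndSection

        With Xinerama on, the two screens merge into one desktop, but the server disables direct rendering, which is exactly the "without 3D acceleration" caveat.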



        • Originally posted by AlbertP View Post
          Because both cards use the same driver in AMD+AMD hybrid configurations, a hack in the driver is possible (deceiving the X server). But with cards using different drivers it is not possible unless you use Xinerama without 3D acceleration.
          I think a hack is possible if both drivers are FOSS.



          • On the subject of multi-card, what are the chances Intel will release a stand-alone graphics card without the CPU? I don't understand why they don't. They already make the GPUs, so what's wrong with putting one on a board for use with desktops?

            Can someone explain to me what Intel has to lose from this idea? The customers are already there, practically.



            • Originally posted by jltyper View Post
              On the subject of multi-card, what are the chances Intel will release a stand-alone graphics card without the CPU? I don't understand why they don't. They already make the GPUs, so what's wrong with putting one on a board for use with desktops?

              Can someone explain to me what Intel has to lose from this idea? The customers are already there, practically.
              The problem is they don't have the ability to design hardware that is truly competitive, performance-wise, with any of the ATI or Nvidia current-gen parts, except maybe ones that are passively cooled, or other IGPs/APUs such as Bulldozer and Tegra.

              In other words, if you take any PCI-Express 2.0 ATI or Nvidia discrete GPU manufactured since ~Q4 2009, and discard the ones that don't have an on-board fan, you're basically looking at those cards blowing Intel's GPUs out of the water in terms of performance. And Intel doesn't even have the technology developed to support the mid/high-end; they can't just make an existing IGP bigger and have it magically become competitive with a high-end Radeon or GeForce.

              Sure, from a naive point of view, they COULD take the existing ASIC and just plaster it onto an add-on board instead of integrating it with the motherboard; but the biggest design issue with that is that Intel's IGPs use system memory as their GPU "VRAM", while add-on cards always use dedicated on-board GDDR3 or GDDR5.

              Not only does this require a significant change in the hardware design, but it also means that drivers written to put textures in system memory now have to put them in on-board VRAM. And then you have to design hardware instructions for interacting with the VRAM, and so on and so forth.

              Basically Intel would have to design a new ASIC with a major departure from existing Intel IGPs, and write a completely different kind of driver, in order to offer an Intel graphics processor on a discrete board.
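
              To make the memory-management point concrete, here is a toy C sketch (purely illustrative; none of these names come from any real driver API) of the extra placement logic a discrete-card driver needs that an IGP driver never had to implement:

                  /* Hypothetical illustration of texture placement, not real driver code. */
                  #include <stdio.h>

                  /* An IGP only has system RAM (shared with the CPU);
                   * a discrete card adds a dedicated on-board VRAM pool. */
                  enum mem_domain { DOMAIN_SYSTEM_RAM, DOMAIN_VRAM };

                  struct texture {
                      const char     *name;
                      size_t          size_bytes;
                      enum mem_domain placement;
                  };

                  /* IGP-style placement: everything lives in system memory and the
                   * GPU samples it over the shared bus, so there is nothing to decide. */
                  static void place_igp(struct texture *t)
                  {
                      t->placement = DOMAIN_SYSTEM_RAM;
                  }

                  /* Discrete-style placement: the driver must manage a separate VRAM
                   * pool, decide which buffers live there, and fall back (or evict)
                   * when it fills up; this is the extra machinery discussed above. */
                  static void place_discrete(struct texture *t, size_t *vram_free)
                  {
                      if (t->size_bytes <= *vram_free) {
                          t->placement = DOMAIN_VRAM;
                          *vram_free -= t->size_bytes;
                      } else {
                          t->placement = DOMAIN_SYSTEM_RAM; /* VRAM exhausted */
                      }
                  }

                  int main(void)
                  {
                      struct texture tex = { "diffuse_map", 4u << 20, DOMAIN_SYSTEM_RAM };
                      size_t vram_free = 256u << 20;

                      place_igp(&tex);
                      printf("IGP driver:      %s -> %s\n", tex.name,
                             tex.placement == DOMAIN_VRAM ? "VRAM" : "system RAM");

                      place_discrete(&tex, &vram_free);
                      printf("Discrete driver: %s -> %s (VRAM left: %zu MiB)\n", tex.name,
                             tex.placement == DOMAIN_VRAM ? "VRAM" : "system RAM",
                             vram_free >> 20);
                      return 0;
                  }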



              • Originally posted by allquixotic View Post
                The problem is they don't have the ability.
                I agree with most of what you said, but I don't know if I can say this entirely objectively. This is Intel we are talking about, not some mom-and-pop computer repair store. Intel doesn't have the ability? Then who does? Just AMD? Just NVIDIA?

                Is it really going to be just AMD and NVIDIA forever? I thought Intel was a big CPU company. I guess that's the only thing they do. It seems strange to me. Who else can compete?

                If GPUs are regarded as essential to computing in general, then what is Intel planning to do to avoid becoming obsolete to desktop users?

                What if an AMD CPU + Radeon combination delivers better graphics? There is the potential that more users would go for that. Even if AMD CPUs are slower, if there are more perceivable benefits due to desktops' reliance on 3D acceleration, I think I might go for it.



                  • I just happen to have purchased an Intel CPU (i3-2105) + motherboard thanks to their outstanding open-source support. Gallium is wanted, guys!



                  • Mesa is good but Gallium is better.

                    Intel has got Mesa running so well that many users don't care about Gallium. But of course a Gallium3D driver would still add features through state trackers, and perhaps improve performance as well.



                    • Originally posted by jltyper View Post
                      I agree with most of what you said, but I don't know if I can say this entirely objectively. This is Intel we are talking about, not some mom-and-pop computer repair store. Intel doesn't have the ability? Then who does? Just AMD? Just NVIDIA?

                      Is it really going to be just AMD and NVIDIA forever? I thought Intel was a big CPU company. I guess that's the only thing they do. It seems strange to me. Who else can compete?

                      If GPUs are regarded as essential to computing in general, then what is Intel planning to do to avoid becoming obsolete to desktop users?

                      What if an AMD CPU + Radeon combination delivers better graphics? There is the potential that more users would go for that. Even if AMD CPUs are slower, if there are more perceivable benefits due to desktops' reliance on 3D acceleration, I think I might go for it.
                      There are several reasons I can sketch out that almost certainly serve as major roadblocks to Intel becoming a leader in high-performance 3D graphics:
                      • Software and hardware patents. With a company as huge as Intel, and with this much money (billions) at stake, you'd better believe that software and hardware patents owned by AMD and Nvidia would be used aggressively against Intel if it attempted to use anything that may be covered by an AMD or Nvidia patent. The other thing to remember is that AMD and Nvidia are under no obligation to let Intel license their patents for any fair amount of money; alternatively, their demands could be so high as to effectively nullify Intel's profit margin. The reason that AMD and Nvidia can continue to exist without suing each other into oblivion is that each company holds a large enough critical mass of 3D graphics patents that they've either (a) cross-licensed their patents to each other, or (b) found some kind of technological battle line where both companies can produce a viable product without stepping on one another's patents. (Remember, AMD only has to worry about Nvidia's patents and not its own, and Nvidia only has to worry about AMD's, if we ignore patent trolls, S3, SGI, etc. for a moment. By comparison, Intel would have to dodge BOTH AMD's AND Nvidia's patents simultaneously to produce a product.)
                      • Anti-monopoly forces within government and the public would probably notice if Intel became "king" of yet another major market segment. If you think about it, they are basically already the king of high-end consumer and server CPUs; they are already a leader in many very lucrative market segments; and they already produce rather good IGPs that offer a unique trade-off, sacrificing performance for a fantastic power profile (they are very energy-efficient compared to AMD and Nvidia discrete GPUs). As an anti-monopoly person myself, I'm kind of happy that we don't have Intel overlords owning the entire hardware stack, from the CPU to the motherboard to the GPU. Well, it's still possible to buy systems like that, but not in the high-end performance category of GPU.

