AMD A8-3500M Llano Linux Benchmarks


  • #21
    Originally posted by Qaridarium
    sure why not?
    Because it looks like the Fusion GPU is managed like a standard GPU in the current drivers. I haven't seen anything like that up to now. I am asking just in case I missed some commits...



    • #22
      It would be supported, in the sense that attempting such a transfer would be instant, I believe.
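      In OpenCL terms, here is a rough sketch of what "instant" could mean on a shared-memory APU: instead of an explicit host-to-device copy, the buffer is allocated in memory both the CPU and GPU can see and is simply mapped. This assumes a runtime that treats CL_MEM_ALLOC_HOST_PTR buffers as zero-copy on Fusion, which is exactly the open question above, so take it as an illustration rather than a statement about what the current drivers actually do.

      /* Sketch: populate a buffer the GPU can use without clEnqueueWriteBuffer.
       * Assumption: the OpenCL runtime places CL_MEM_ALLOC_HOST_PTR buffers in
       * memory shared by CPU and GPU (zero-copy) on a Fusion APU. */
      #include <stdio.h>
      #include <CL/cl.h>

      int main(void)
      {
          cl_platform_id platform;
          cl_device_id   device;
          cl_int         err;

          clGetPlatformIDs(1, &platform, NULL);
          clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

          cl_context       ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
          cl_command_queue q   = clCreateCommandQueue(ctx, device, 0, &err);

          const size_t n = 1 << 20;

          /* Let the runtime allocate the buffer in host-visible memory. */
          cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_ALLOC_HOST_PTR,
                                      n * sizeof(float), NULL, &err);

          /* Map it: the CPU writes through a pointer into the same memory the
           * GPU will read, so no separate bus transfer is enqueued here. */
          float *p = clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE, 0,
                                        n * sizeof(float), 0, NULL, NULL, &err);
          for (size_t i = 0; i < n; i++)
              p[i] = (float)i;
          clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);
          clFinish(q);

          printf("buffer ready for the GPU without an explicit copy\n");

          clReleaseMemObject(buf);
          clReleaseCommandQueue(q);
          clReleaseContext(ctx);
          return 0;
      }

      On a discrete card the same map call typically still moves data over PCI-E behind the scenes, so whether this is actually "instant" depends on the driver treating the APU's memory as shared.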



      • #23
        I will be excited when someone announces a GPCPU socket standard so that mobos and GPCPUs aren't locked together, there is direct competition again, and you don't have to buy a new mobo every time you buy a new GPCPU.

        Oh wait, that will never happen because they only care about taking more of your money...



        • #24
          Originally posted by Yfrwlf View Post
          I will be excited when someone announces a GPCPU socket standard so that mobos and GPCPUs aren't locked together, there is direct competition again, and you don't have to buy a new mobo every time you buy a new GPCPU.

          Oh wait, that will never happen because they only care about taking more of your money...
          There are always technical excuses: vastly different, quickly changing designs, pin counts, pin layouts optimised for different things.
          This would basically force both manufacturers to use similar technologies. For example, if the socket supports an on-die memory controller (extra pins), then all the CPU manufacturers would have to use one, or else the mobo makers would have to make some mobos with external memory controllers (north bridges), which would not work very well for CPUs with built-in memory controllers, in practice making those mobos CPU-specific.
          This would mean that AMD and Intel would need to cooperate in designing the motherboard standards and interconnects as well as parts of the CPU designs themselves. Two major manufacturers of competing products starting to cooperate is not good for competition, so this would require a wider committee representing more stakeholders. We've seen this kind of cooperation benefit consumers, for example with USB, SATA, PCI, AGP, PCI-E, ATX and Ethernet. The problem is that it might place overly strict restrictions on the freedom to design for maximum performance, and unlike those other standards, performance (or efficiency) is the only thing that counts in CPU design.



          • #25
            Originally posted by misiu_mp View Post
            There are always technical excuses: vastly different, quickly changing designs, pin counts, pin layouts optimised for different things.
            This would basically force both manufacturers to use similar technologies. For example, if the socket supports an on-die memory controller (extra pins), then all the CPU manufacturers would have to use one, or else the mobo makers would have to make some mobos with external memory controllers (north bridges), which would not work very well for CPUs with built-in memory controllers, in practice making those mobos CPU-specific.
            This would mean that AMD and Intel would need to cooperate in designing the motherboard standards and interconnects as well as parts of the CPU designs themselves. Two major manufacturers of competing products starting to cooperate is not good for competition, so this would require a wider committee representing more stakeholders. We've seen this kind of cooperation benefit consumers, for example with USB, SATA, PCI, AGP, PCI-E, ATX and Ethernet. The problem is that it might place overly strict restrictions on the freedom to design for maximum performance, and unlike those other standards, performance (or efficiency) is the only thing that counts in CPU design.
            That's the excuse, anyway. They are cooperating, though: cooperating to not cooperate. In reality these CPUs do exactly the same thing and accomplish the same goal, and if these supposedly huge differences in architecture were actually relevant, multiple socket standards could exist to compensate, or you could simply allow for both options. Just because pins are available doesn't mean you have to use them all. The problem is that no CPU manufacturer makes CPUs for any of the standards that you claim exist for legitimate reasons, except for their own socket type. Why is that? Oh yeah, it's because they want to fragment the market so that consumers are locked into one particular socket. It's "good for business" (them taking more of your money) to make consumers throw away their entire computer whenever they want to upgrade. This waste in the system, like all other waste, results in more money for the super rich, less for everyone else, and thus a lower quality of life for everyone else.

            Yes, you need to allow for things getting smaller and such, and maybe a couple of socket types for when certain things need to be offloaded onto other chips on the mobo (the sockets for which should also be standardised), but that doesn't mean you can't have standards. Standards can evolve when needed and still allow for much better competition than having none.



            • #26
              The problem is that no CPU manufacturer makes CPUs for any of the standards that you claim exist for legitimate reasons, except for their own socket type. Why is that? Oh yeah, it's because they want to fragment the market so that consumers are locked into one particular socket. It's "good for business" (them taking more of your money) to make consumers throw away their entire computer whenever they want to upgrade. This waste in the system, like all other waste, results in more money for the super rich, less for everyone else, and thus a lower quality of life for everyone else.
              It's not just good for business, it's good for the consumer, for the same reason an unstable hardware API is good for the Linux kernel: it gives the designer more freedom to innovate, rather than being stuck with an older feature set and socket that will never get updated because the committees can't decide on anything.

              Let's be quite blunt here: for something extremely complex that is rapidly moving and developing, say a kernel or a CPU, you don't want to lock down progress and bring it to a halt through bureaucracy. You want the company or developer to have as much freedom as they can possibly have.

              Now let's say we've got ISO holding the standard for the rapidly developing technology of CPUs, and AMD has a new feature they want to implement that nobody has ever thought of before, but it requires a new socket to make it work. In order to make it work they have to explain to ISO why they want the standard changed, thus tipping their hand on the new tech, which means Intel will come along and implement their own hack of it and AMD loses the initiative. On top of that, both companies have to wait six months for the standard to be argued over, resubmitted, and argued over again; then it has to be signed in triplicate, sent back, lost, found, subjected to public inquiry, lost again, and finally buried in soft peat for three months and recycled as firelighters before it finally gets approved.

              Let's look at ODF for a second in comparison, and why it can have a standard. ODF isn't really a moving target per se; it evolves as a standard, yes, but at glacial speed. Nobody is really developing the standard so much as the office suites behind it, which means the point is interoperability. These office suites aren't trying to tack on extra stuff that the standard doesn't support, because that's not the point; it isn't really a competition against other ODF implementations, it's against OOXML, and in fact they're trying to interoperate with the other suites.

              A processor is completely different. What you're asking for is the equivalent of telling all game developers: "You can develop a game, but only for the Unreal 3 engine, and if you want a new feature... well, you'll have to wait for us to discuss it, and eventually, just maybe, if we feel like it, we'll get around to it in Unreal 4 in the next few years." Obviously Unreal 3 is well suited to some tasks but not others; I wouldn't, for instance, want to write a 2D adventure game in it, I'd rather hack AGS to work under Linux.

