The Rumor Is Back That Future Intel CPUs To Use Radeon Graphics
-
Originally posted by cj.wijtmans View Post
Nowhere was it implied that anything was "hard".
AMD was copying Intel in the sense of slapping a 64-bit extension onto an existing architecture to compete with Intel. x86-64 is not completely their own.
Just because AMD slapped 64-bit onto the x86 architecture does not make it amd64.
-
Originally posted by juno View Post
Not gonna happen. I don't get these rumours. Intel has had this licensing agreement with Nvidia for years. It allows them to build and sell their GPU stuff without infringing patents; it's not about including IP or hardware blocks in their own designs.
Apart from that, I don't get the people celebrating this "news", either. I want a balanced market, and Intel pairing up with AMD would push Nvidia out.
I guess it's about licensing and not hardware, even if I think Intel integrating AMD hardware would be cheaper for Intel but worse for users.
I'd rather see a license deal than a hardware deal between Intel and AMD.
-
And what would Intel do with its graphics design team? Lay them all off? Or put them to work improving AMD's product? I think it would be good to have Radeons all over the place, but I don't want Intel going away as a GPU maker. I hope they take Radeon in as the iGPU, but spin HD and Iris off as discrete cards and pump money into those to kill off Nvidia.
-
Not sure about the rumor concerning Intel CPUs being paired with integrated AMD Radeon graphics, but let me start my own rumor:
Does anyone else wonder why Intel has devoted so many resources to the iGVT-g project, which will allow sharing a single Intel integrated GPU among multiple VMs?
Don't get me wrong, I'm excited by the prospect of running multiple VMs on my laptop, each with accelerated graphics. And I can foresee use cases for iGVT-g on servers and workstations that have only integrated GPUs. But an Intel integrated GPU doesn't have, and probably never will have, enough horsepower to compete with an Nvidia discrete GPU that uses GRID.
So I wonder: is Intel planning, either on its own or in partnership, to develop discrete GPUs using iGVT-g to compete with Nvidia GRID?
Last edited by GizmoChicken; 07 December 2016, 01:43 PM.
-
Originally posted by Mark625 View Post
blah blah. It was Intel that copied AMD64 from AMD, get over it. AMD64 had nothing to do with IA-64, which was what Intel was pushing so hard for. Intel also copied NUMA, HyperTransport, and multicore CPU ideas from AMD, under different names of course.
HP already had high-end system experience with their PA-RISC. Intel and HP were doing NUMA and the like for Itanium-based machines. But they wanted to milk the high-end server market first by charging high prices for each Itanium chip. The very affordable AMD Opteron kicked the legs out from under that plan.
-
Oh, and Itanium was great. I have a rack-mount box I got off eBay with Itanium 2 chips. At 1.6 GHz, these Itaniums could beat the pants off Opterons at 2.2 GHz. This did depend on using the right kind of code and heavily optimized C++ builds with profiling feedback.
If only Intel had sold a bunch of these at $300 or so, so that enthusiasts and amateur programmers could have tried them out.