Windows 8 Outperforming Ubuntu Linux With Intel OpenGL Graphics

-

Originally posted by Pawlerson:
Well, I have an i5+HD4000 laptop and I do not miss Windows 8's bloated UI at all. But it is well known that Intel's driver for Windows performs better. The gap has been closing steadily, but it isn't there yet... The open source Intel driver is alright as it stands: it plays CS:S and DoD:S fine (with a workaround) and it has worked well with Wine. It isn't my 1100T+GTX 670 with NVIDIA's blob, but the OSS driver does the job fine for my little 12-inch.
-
Originally posted by Rexilion: I find that really hard to believe. If Intel is already developing drivers for their hardware, aren't they already giving away all the details?

The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.
A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.
This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.
That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.
Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.
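The app-detection mechanism described above can be sketched roughly like this. This is a hypothetical Python illustration only: the profile names and lookup table are invented, and real drivers implement this in C inside their command-submission paths.

```python
# Hypothetical sketch of how a driver could special-case known apps.
# All names here (APP_PROFILES, "aggressive_culling", etc.) are invented
# for illustration; no real driver's internals are being quoted.

APP_PROFILES = {
    "lightsmark": {"aggressive_culling": True},  # benchmark-specific "optimization"
    "some_game":  {"reduce_af_quality": True},   # lower filtering quality for FPS
}

def select_profile(process_name: str) -> dict:
    """Return the tweaks the driver would silently apply for this process."""
    return APP_PROFILES.get(process_name.lower(), {})

# A stock benchmark run gets the cheat; a renamed or modified binary does
# not -- which is exactly how users caught the LightsMark culling trick.
print(select_profile("LightsMark"))   # {'aggressive_culling': True}
print(select_profile("lightsmark2"))  # {}
```

Renaming the benchmark executable (or, as in the LightsMark case, turning the camera somewhere the cheat didn't expect) is the classic way to expose this kind of per-app detection.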
-
Originally posted by elanthis: The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.
A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.
This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.
That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.
Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.
-
Would the AF (anisotropic filtering) quality count as cheating?
Earlier Intel Windows drivers had terrible AF quality, but with driver tweaks they got it to filter properly (and more slowly). So it was either a genuine oversight, or an attempted cheat that was removed when detected.
-
...
The title of this disgusting brown-nosing article is very misleading. It should be "Windows 8 CLOSED SOURCE DRIVERS perform better than LINUX OPEN SOURCE DRIVERS, but Linux CLOSED SOURCE DRIVERS PERFORM THE SAME AS WINDOWS 8 DRIVERS with OpenGL". That should be the title, clueless shill. BTW, Valve games run much better for me than on my wife's shitty Windows 8 computer.
-
Originally posted by flooby: The title of this disgusting brown-nosing article is very misleading. It should be "Windows 8 CLOSED SOURCE DRIVERS perform better than LINUX OPEN SOURCE DRIVERS, but Linux CLOSED SOURCE DRIVERS PERFORM THE SAME AS WINDOWS 8 DRIVERS with OpenGL". That should be the title, clueless shill. BTW, Valve games run much better for me than on my wife's shitty Windows 8 computer.

That remark is
a) insulting beyond bounds
b) suggesting an adjustment to the title for nuance
There is no reason to adjust the title. Windows GPU drivers are closed source. Linux GPU drivers are either open source or closed source. In the case of Intel, there is only open source. So there is no need to
a) insult the author of the article
b) suggest an adjustment to the title
I hope you don't ever reproduce...
-
Originally posted by Rexilion: That remark is
a) insulting beyond bounds
b) suggests adjustment to the title for nuance
There is no reason to adjust the title. Windows GPU drivers are closed source. Linux GPU drivers are either open source or closed source. In the case of Intel, it's only open source. So there is no need to
a) insult the author of the article
b) suggest adjustment to the title
I hope you don't ever reproduce...
+1
Lots of people do not like data, so they try to downplay its importance.
You CAN compare OpenGL performance on just one GPU :P
-
Originally posted by elanthis: The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.
A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.
This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.
That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.
Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.
I'm really glad that someone understands. Games and benchmarks speak to the driver, not directly to the hardware. If the driver wants to cheat, it will cheat; there is no technology available to measure the quality of the picture. In fact, when you have 2x the GPUs you only get +50% FPS; that's because the driver goes into quality and precision mode, and the same happens with double the shaders. My opinion is this:
- NV Kepler: 3.2 TFLOPS @ 64-bit (Intel comparison) = 6.4 @ 32-bit (AMD comparison) = 9.6 @ FMAC/tri-operand (AMD HD2000-6000, G80-G300, PS3, Xbox 360 comparison)
- AMD HD7000: 3.8 TFLOPS @ 32-bit = 5.7 @ FMAC/tri-operand
- Intel HD 4000: 170 GFLOPS @ 64-bit = 340 @ 32-bit = 510 @ FMAC/tri-operand
Also, there is no exact way to compare open source drivers with the closed ones, because the closed ones cheat. If you ask me, Intel's open and Intel's closed drivers are equal. Also, they share the same OpenGL code. How the hell did some of you figure out that they are different? Make your brain think!
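For what it's worth, the unit scaling in those figures (not the hardware numbers themselves, which are the poster's own estimates) can be sanity-checked: halving precision doubles the rate, and counting an FMA as three operations instead of two adds another 1.5x. A quick sketch:

```python
# Sanity-check of the scaling used in the figures above. Only the
# conversion factors are checked here; the base TFLOPS/GFLOPS values
# are the poster's own claims, not verified hardware specs.

def scale(base, to_32bit=False, fmac_trioperant=False):
    x = base
    if to_32bit:
        x *= 2    # 64-bit rate -> 32-bit rate
    if fmac_trioperant:
        x *= 1.5  # FMA counted as 3 ops instead of 2
    return x

print(round(scale(3.2, to_32bit=True), 2))                        # Kepler: 6.4
print(round(scale(3.2, to_32bit=True, fmac_trioperant=True), 2))  # Kepler: 9.6
print(round(scale(3.8, fmac_trioperant=True), 2))                 # HD7000: 5.7
print(round(scale(170, to_32bit=True), 2))                        # HD4000: 340.0
print(round(scale(170, to_32bit=True, fmac_trioperant=True), 2))  # HD4000: 510.0
```

The arithmetic is internally consistent, which is all it shows; it says nothing about whether those peak numbers are achievable in real workloads.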
-
Originally posted by artivision: I'm really glad that someone understands. Games and benchmarks speak to the driver, not directly to the hardware. If the driver wants to cheat, it will cheat; there is no technology available to measure the quality of the picture. In fact, when you have 2x the GPUs you only get +50% FPS; that's because the driver goes into quality and precision mode, and the same happens with double the shaders.
Furthermore, FPS is only a single ratio (one of many) defining 'performance'. So a 50% increase in FPS will not increase overall performance by 50%.
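To illustrate the point: two runs can report identical average FPS while one of them stutters badly, so a single FPS number can't capture 'performance' on its own. The frame-time traces below are made up for illustration:

```python
# Two hypothetical frame-time traces (in milliseconds) with the same
# average FPS but very different smoothness -- average FPS hides stutter.
smooth  = [20.0] * 10              # steady 20 ms frames
stutter = [10.0] * 9 + [110.0]     # mostly fast frames, one 110 ms hitch

def avg_fps(frame_times_ms):
    """Average FPS = frames rendered / total seconds elapsed."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """The single slowest frame, which is what the player feels."""
    return max(frame_times_ms)

print(avg_fps(smooth), worst_frame(smooth))    # 50.0 20.0
print(avg_fps(stutter), worst_frame(stutter))  # 50.0 110.0
```

Both traces average 50 FPS, but the second one has a frame more than five times slower than anything in the first, which is why frame-time consistency matters alongside raw FPS.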
Originally posted by artivision: My opinion is this:
- NV Kepler: 3.2 TFLOPS @ 64-bit (Intel comparison) = 6.4 @ 32-bit (AMD comparison) = 9.6 @ FMAC/tri-operand (AMD HD2000-6000, G80-G300, PS3, Xbox 360 comparison)
- AMD HD7000: 3.8 TFLOPS @ 32-bit = 5.7 @ FMAC/tri-operand
- Intel HD 4000: 170 GFLOPS @ 64-bit = 340 @ 32-bit = 510 @ FMAC/tri-operand
Originally posted by artivision: Also, there is no exact way to compare open source drivers with the closed ones, because the closed ones cheat.
Originally posted by artivision: If you ask me, Intel's open and Intel's closed drivers are equal. Also, they share the same OpenGL code.
I don't think that the DRM (kernel) part of the Intel driver is shared with Windows.
Originally posted by artivision: How the hell did some of you figure out that they are different? Make your brain think!