Intel Linux Graphics Shine With Fedora 12


  • AdamW
    replied
    there's probably a bug report for the Compiz crash already. Likely https://bugzilla.redhat.com/show_bug.cgi?id=539789 ?

  • m4rgin4l
    replied
    Works...almost

    I can log in now, but Compiz crashes. It's not "shiny" but it is enough for me. Thanks!

  • AdamW
    replied
    m4rgin4l: just checking - you saw I closed your bug as a dupe of https://bugzilla.redhat.com/show_bug.cgi?id=540218 and a kernel build with a candidate fix is now available (well, will be there in a few minutes), right? http://koji.fedoraproject.org/koji/b...buildID=143335

  • m4rgin4l
    replied
    Originally posted by whizse View Post
    You have filed a bug report about this, right? If not, now would be a good time, as upstream has removed support for non-kernel-modesetting.
    I tried using abrt (a kernel oops is generated), but it didn't work. I'll try Bugzilla instead, though that thing is a pain in the ass.

    Edit: Filed the bug: https://bugzilla.redhat.com/show_bug.cgi?id=541593
    Last edited by m4rgin4l; 26 November 2009, 08:40 AM.
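
    Before filing that kind of report, one quick sanity check is whether kernel modesetting is actually in force. A minimal sketch of interpreting the i915 module's `modeset` parameter; the sysfs path and the meaning of the values (1 = forced on, 0 = forced off, -1 = driver default) are assumptions about the i915 driver, not something stated in this thread:

    ```shell
    # Sketch: map the i915 "modeset" module parameter to a human-readable
    # description. Path and value meanings are assumptions about i915.
    describe_modeset() {
        case "$1" in
            1)  echo "KMS forced on" ;;
            0)  echo "KMS forced off" ;;
            -1) echo "KMS at driver default" ;;
            *)  echo "unknown (is i915 loaded?)" ;;
        esac
    }

    # Read the live value if the module is loaded; fall back gracefully.
    value=$(cat /sys/module/i915/parameters/modeset 2>/dev/null || echo unset)
    describe_modeset "$value"
    ```

    Including the resulting line in a Bugzilla report helps triagers tell KMS and non-KMS failure modes apart.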

  • BlackStar
    replied
    Originally posted by caramerdo View Post
    BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
    According to the 2.1 specs (pdf), "OpenGL 2.1 implementations must support at least revision 1.20 of the OpenGL shading language." (page 351, J.1). The latest drivers I could find would only support GLSL 1.10 - and badly at that.

    Personally, I spent the better part of last week rewriting an application to work on an X4500 chip: downgraded to GLSL 1.10, disabled FBO blits, floating-point attachments and MRT (which translates to no HDR, bloom, shadows or antialiasing), reduced texture resolution and finally... the driver produced an utterly broken picture. Imagine memory corruption, similar to TV snow, overlaid on top of the actual rendering.

    That was on Vista, by the way. On Ubuntu, the driver simply refused to render any geometry touched by a vertex shader. In the end I simply gave up: ATI and Nvidia cards run the application just fine, but Intel ones simply do not offer any meaningful OpenGL support. Maybe if you limited yourself to 15-year-old GL 1.1-level features the drivers might work correctly, but that's simply not an option.

    It's sad to see the largest IHV (50%+ market share) produce such garbage. Their competitors are shipping OpenGL 3.2 while Intel is still struggling with 2.1, and yes, this *does* drive developers away from OpenGL and into Direct3D (and Microsoft).

    Edit: Search for Intel on opengl.org to see how depressing the situation really is.
    Last edited by BlackStar; 25 November 2009, 08:42 PM.
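
    BlackStar's compliance point above can be checked mechanically: OpenGL 2.1 (appendix J.1) requires GLSL 1.20 or newer, so comparing the version string a driver reports via glGetString against that floor settles the question. A minimal sketch in Python; the sample version strings are illustrative, not taken from any real driver:

    ```python
    # Sketch: check a GL_SHADING_LANGUAGE_VERSION string against the GLSL 1.20
    # minimum that the OpenGL 2.1 spec (appendix J.1) mandates.
    # Sample inputs are illustrative only.

    def parse_glsl_version(version_string):
        """Extract (major, minor) from a string like '1.10 - Intel Build'."""
        major, minor = version_string.split()[0].split(".")[:2]
        return int(major), int(minor)

    def meets_gl21_requirement(version_string):
        """OpenGL 2.1 implementations must support at least GLSL 1.20."""
        return parse_glsl_version(version_string) >= (1, 20)

    print(meets_gl21_requirement("1.10"))  # a GLSL 1.10-only driver fails
    print(meets_gl21_requirement("1.20"))  # 1.20 meets the floor
    ```

    A driver that only ever reports 1.10 here cannot be a conformant GL 2.1 implementation, whatever the marketing material says.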

  • caramerdo
    replied
    Originally posted by FunkyRider View Post
    Oh please, not a single 'real' Intel 3D accelerator can beat a bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've looked at PC games over the last 10 years or so, there is no place for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Functionality-wise, Intel's graphics can't even render an acceptable frame in modern games. I don't know of any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it was either just too easy or too hard.

    “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” (NVIDIA's David Kirk)
    Please think before you post.

    Intel graphics cards are not aimed at "gamers" - Intel doesn't give a shit about those people. Integrated graphics are extremely useful for business people and mom-and-dad types who only want decent 2D and Google Earth. And guess what: they are the most lucrative market, since they represent 80% of graphics card sales.

    Myself, I just bought a new laptop with the Intel X4500 for three reasons:
    a) I don't play games
    b) integrated graphics give 25%+ more battery life
    c) Intel has excellent, working open source drivers, and having been burned by ATI and their crappy fglrx I really appreciate that

    Sorry if I'm being aggressive, but I'm tired of seeing people complain about integrated graphics performance in games. Come on, do you really expect them to play games? Just don't buy them for that.


    BlackStar, can you provide a link for that claim? According to Intel, the latest drivers are OpenGL 2.1 compliant...
    Last edited by caramerdo; 25 November 2009, 04:54 PM.

  • BlackStar
    replied
    Originally posted by Craig73 View Post
    Either way, Intel is also very engaged with the open source community (open source drivers, the Moblin platform, etc.), so choosing a word other than "joke" and softening your tone would show more respect to a significant contributor to the community.
    One word: Poulsbo.

    Not to mention shitty, non-conformant OpenGL drivers across the board. Check out the OpenGL forums if you wish to see how developers feel about the nightmare of supporting Intel hardware.

    You may not like it, but Intel is single-handedly holding back the adoption of OpenGL. It's one of the leading reasons why Direct3D is the only real option for consumer graphics.

    So you think *that* is good for the community? Heh, good one - tell us more!

  • Craig73
    replied
    Originally posted by FunkyRider View Post
    Oh please, not a single 'real' Intel 3D accelerator can beat a bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, ...I don't know of any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it was either just too easy or too hard.
    I don't think anyone was disputing that Intel accelerators have not served the needs of gamers. I would guess that 80% of the broader market has not cared about games or 3D to any significant degree. So while Intel chips have not served the needs of gamers, that does not mean Intel is out of touch with the broader market.

    It's also not hard to argue that 3D serves a useful need on the general desktop, that there are innovative UI designs that can be accelerated by a GPU, and that the GPU's role in general-purpose computing is becoming very significant (offloading video decoding from the CPU, compositing desktops, accelerating photo editing, etc.). And Intel can be seen bringing stronger 3D offerings to its product lineup.

    Either way, Intel is also very engaged with the open source community (open source drivers, the Moblin platform, etc.), so choosing a word other than "joke" and softening your tone would show more respect to a significant contributor to the community.

  • FunkyRider
    replied
    Oh please, not a single 'real' Intel 3D accelerator can beat a bottom-of-the-line GeForce 8500 or even a 785G integrated chipset, and if you've looked at PC games over the last 10 years or so, there is no place for Intel's graphics. We are living in the full-HD gaming era, where people treat 1920x1080 with 4x AA as "normal". Functionality-wise, Intel's graphics can't even render an acceptable frame in modern games. I don't know of any better word than 'joke' to describe this. Intel never took gaming seriously. Or maybe they tried, and it was either just too easy or too hard.

    “They’re the world’s leading designers and manufacturers of CPUs – how hard could it be to build a GPU? I mean, come on, how hard could it be? That crummy little company down the road builds them – we could build them in our sleep. Come on, how hard could it be?” (NVIDIA's David Kirk)

  • whizse
    replied
    I guess you could argue about what "real" means here; Intel really doesn't target gamers.

    Anyway, comparisons with Windows would be welcome.
