
Thread: Catalyst 10.1 Still Trash In Heaven, But Good News

  1. #11
    Join Date
    Jun 2009
    Location
    Paris
    Posts
    428

    Default

    Quote Originally Posted by Sockerdrickan View Post
    The proprietary driver does not support floating point. At least not in an OpenGL 3.2 context. The following code will produce black fragments:

    Code:
    out vec4 color;
    
    void main() {
       color = vec4(1.0) * 0.99;
    }
    One to two years ago there was a bug that parsed floating-point literals using the locale's decimal separator instead of a plain '.'. That should be fixed nowadays.
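    The locale bug mentioned above is easy to reproduce outside a driver. This is a hypothetical illustration (not the driver's actual code): C's strtod honors LC_NUMERIC, so a shader compiler that parses literals with it would misread "0.99" under a comma-decimal locale such as fr_FR:

    Code:
    ```c
    /* Sketch of locale-dependent float parsing, the class of bug described
       in the post above.  Not taken from any driver source. */
    #include <locale.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* Default "C" locale: '.' is the decimal point, so this works. */
        setlocale(LC_NUMERIC, "C");
        double a = strtod("0.99", NULL);
        printf("C locale:     \"0.99\" -> %g\n", a);   /* 0.99 */

        /* Under a comma-decimal locale (if installed on this system),
           strtod stops at the '.', yielding 0 instead of 0.99. */
        if (setlocale(LC_NUMERIC, "fr_FR.UTF-8") != NULL) {
            double b = strtod("0.99", NULL);
            printf("fr_FR locale: \"0.99\" -> %g\n", b);
        }
        return 0;
    }
    ```
    A robust GLSL parser has to use a locale-independent routine (or force the "C" locale) when converting literals.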

  2. #12
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,532

    Default

    I'm starting to think that the first card we will see run the Unigine Heaven benchmark on Linux will be a Fermi-based one.

  3. #13
    Join Date
    Nov 2008
    Posts
    755

    Default

    Quote Originally Posted by Sockerdrickan View Post
    The following code will produce black fragments:
    I'm no expert on GLSL, but does your code actually set the color somewhere? Using ShaderMaker, I just tried setting this as "Fragment Shader" and got the expected results. HD5770, catalyst 10.1.
    Code:
    void main()
    {
    	gl_FragColor = vec4(1.0)*0.99;
    }
    Not sure if it qualifies as an "OGL 3.2 context", as I didn't dissect ShaderMaker's sources.
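    For reference (not from the thread): the difference between the two snippets matters because in GLSL 1.50, the version that pairs with OpenGL 3.2 core and forward-compatible contexts, gl_FragColor is removed and the shader must declare its own output. A minimal sketch of the same shader written for that profile:

    Code:
    ```glsl
    #version 150 core

    out vec4 color;   // user-declared output; gl_FragColor no longer exists here

    void main() {
        color = vec4(1.0) * 0.99;   // should render near-white, not black
    }
    ```
    In a compatibility context (which ShaderMaker may well create), gl_FragColor still works, so the two tests are not exercising the same driver path.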

  4. #14
    Join Date
    Jan 2010
    Location
    Stockholm, Sweden
    Posts
    5

    Default

    Quote Originally Posted by rohcQaH View Post
    I'm no expert on GLSL, but does your code actually set the color somewhere? Using ShaderMaker, I just tried setting this as "Fragment Shader" and got the expected results. HD5770, catalyst 10.1.
    Code:
    void main()
    {
    	gl_FragColor = vec4(1.0)*0.99;
    }
    Not sure if it qualifies as an "OGL 3.2 context", as I didn't dissect ShaderMaker's sources.
    That is what my out vec4 color; was for, obviously. And no, ShaderMaker probably doesn't use a forward-compatible context.

  5. #15
    Join Date
    Jan 2010
    Location
    Stockholm, Sweden
    Posts
    5

    Default

    Quote Originally Posted by rohcQaH View Post
    I'm no expert on GLSL, but does your code actually set the color somewhere?
    out vec4 color; //right there


    And no, ShaderMaker probably does not use a forward-compatible context.
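    For anyone following along: "forward-compatible context" refers to a context requested through the GLX_ARB_create_context extension with the forward-compatible flag set, which removes all deprecated functionality (including gl_FragColor). A sketch of the attribute list an application passes to glXCreateContextAttribsARB; the token values match the ARB spec, and in real code they come from <GL/glx.h>:

    Code:
    ```c
    /* Illustration only: the attribute list for a forward-compatible
       OpenGL 3.2 context, as defined by GLX_ARB_create_context. */
    #include <stdio.h>

    #ifndef GLX_CONTEXT_MAJOR_VERSION_ARB
    #define GLX_CONTEXT_MAJOR_VERSION_ARB          0x2091
    #define GLX_CONTEXT_MINOR_VERSION_ARB          0x2092
    #define GLX_CONTEXT_FLAGS_ARB                  0x2094
    #define GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0002
    #endif

    static const int attribs[] = {
        GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
        GLX_CONTEXT_MINOR_VERSION_ARB, 2,
        GLX_CONTEXT_FLAGS_ARB, GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0   /* terminator */
    };

    int main(void) {
        /* Real code would now call:
           glXCreateContextAttribsARB(dpy, fbconfig, NULL, True, attribs); */
        printf("requesting GL %d.%d, flags 0x%x\n",
               attribs[1], attribs[3], attribs[5]);
        return 0;
    }
    ```
    A driver bug that only shows up in such a context would be invisible to tools that create an old-style (compatibility) context.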

  6. #16
    Join Date
    Nov 2008
    Posts
    69

    Default

    Quote Originally Posted by Sockerdrickan View Post
    out vec4 color; //right there


    And no shadermaker probably does not use a forward compatibility context.
    Does putting:

    precision highp float;
    at the top of the frag shader fix it?

  7. #17
    Join Date
    Jan 2010
    Location
    Stockholm, Sweden
    Posts
    5

    Default

    Quote Originally Posted by Kazade View Post
    Does putting:

    Code:
    precision highp float;
    at the top of the frag shader fix it?
    Actually I already use that line. It does not help at all though.

  8. #18
    Join Date
    Jul 2008
    Posts
    174

    Default

    Quote Originally Posted by bridgman View Post
    I don't think this is an OpenGL 3.x issue; it's about retrofitting support for the newest tessellation hardware (which is *not* part of any OpenGL spec today) into an existing OpenGL 3.2 driver via extensions.
    I've actually read something similar in the past. I've heard of certain ATI issues with Wine or X.Org stemming from NVIDIA providing more functionality than the spec requires. This situation has come up dozens of times. Can you comment on it?

  9. #19
    Join Date
    Oct 2008
    Posts
    2,904

    Default

    Quote Originally Posted by Joe Sixpack View Post
    I've actually read something similar in the past. I've heard of certain ATI issues with Wine or X.Org stemming from NVIDIA providing more functionality than the spec requires. This situation has come up dozens of times. Can you comment on it?
    OpenGL has a long history of vendor extensions being added; that was part of the design, to let hardware makers differentiate their cards. NVIDIA's extensions are typically used more than others simply because, for a long time, NVIDIA was the only real option for high-quality 3D graphics on Linux. Now that alternative drivers are catching up, it's less likely to be an ongoing problem, because new software will take other hardware into account from the beginning. Hopefully, anyway. Also, part of the point of the newer OpenGL 3 versions is to standardize some of the extensions both companies were supporting, which helps reduce the problem.

  10. #20
    Join Date
    Aug 2007
    Posts
    6,598

    Default

    @bridgman

    Who at ATI is responsible for the Unigine Heaven delay? I see no reason why one hardware vendor should be able to keep a software vendor from releasing its software. It does not matter whether the driver still has bugs or not.
