Radeon Gallium3D Still Long Shot From Catalyst


  • russofris
    replied
    Originally posted by bridgman View Post
You're confusing me with the "Android is an excuse" comment. 30 posts ago you said that Android was Linux and therefore its market share should be counted as Linux. Now you're saying that working on Android is just an excuse not to support Linux on the desktop?

    I don't understand the MS comment.

Click on his name on the left side, then click on "Ignore". The result will be something that resembles a Linux-related web forum.

    No need to thank me.



  • Drago
    replied
    Originally posted by Rakot View Post
Out of curiosity, did you guys consider running something like GSoC for developing some features of the driver? IMO it's a very interesting idea to pay a developer to improve, e.g., power management.
I would support a GSoC for Radeon too.



  • log0
    replied
    Sigh, another thread going down the drain.

mattst88, bridgman, please don't feed the troll, thanks.



  • Qaridarium
    replied
    Originally posted by bridgman View Post
    Step back a minute... with every response you make the work required to move userspace drivers into the kernel is going up, but the benefits are staying the same (or going down because of the diversion of existing developers). This isn't looking good.

    The term "give a f*ck" in English actually means "caring", I think the term you're looking for is "don't give a f*ck".

    You're confusing me with the "Android is an excuse" comment. 30 posts ago you said that Android was Linux and therefore its market share should be counted as Linux. Now you're saying that working on Android is just an excuse not to support Linux on the desktop ?

    I don't understand the MS comment.
Well, yes, I'm learning English here... "you don't give a fuck". Thank you very much for the lesson.

It's funny: English speakers phrase it all in the "positive" and Germans phrase it all in the "negative". In "einen Scheiß drauf geben" ("to give a shit about it", i.e. to not care), Scheiß (shit) is negative, while "fuck" is used positively. But you can translate Scheiße as "fuck": Germans use the word Scheiße the same way English speakers use "fuck", except Germans mean everything in the negative way and the English in the positive way. LOL. That's why my English failed. LOL.
*Disclaimer: I'm German; if I write something, it's negative in meaning.*
Germans invert the negative when they want to say something good: "das ist kein Scheiß" ("that's no shit", i.e. that's good).

    Originally posted by bridgman View Post
    You're confusing me with the "Android is an excuse" comment.
It's just a different viewpoint. If the driver is IN the kernel, then it's Linux. If the driver is outside the kernel and intentionally incompatible with the Linux desktop, then it isn't Linux; it's an incompatible "fork".

In my point of view, forking the Linux kernel in an incompatible way is just stupid, because it burns energy on two versions of the same product; it's much more cost-efficient if there is no fork.

And the "excuse" comes in when the companies don't want to make it compatible because they think incompatible duplicated work is better/cheaper. THAT IS AN EXCUSE!


    Originally posted by bridgman View Post
    I don't understand the MS comment.
It's a lesson from the war textbook: keep your friends close and your enemies closer.

Microsoft works together with SUSE/Novell to hurt the Linux market, and the enemy is Red Hat.

The next step for Microsoft is making their own incompatible Linux, like Android, and pushing DirectX with the Linux kernel.

It's also a lesson from the war textbook: if you can't destroy your enemy, join them and hurt them from the inside.
    Last edited by Qaridarium; 03-25-2012, 11:07 PM.



  • Rakot
    replied
    Originally posted by bridgman View Post
We have also recently added two experienced developers (three actually, but since one was replacing Richard I'm only counting two) who IMO are, and over the next year or so will be, as productive as or more productive than a half dozen developers hired and brought up to speed on the job. Just a thought...
Out of curiosity, did you guys consider running something like GSoC for developing some features of the driver? IMO it's a very interesting idea to pay a developer to improve, e.g., power management.



  • bridgman
    replied
    Originally posted by Qaridarium View Post
It's sooooo simple: if C++ is a no-go for the kernel, then you have to rewrite it to push ALL the graphics stuff into the kernel.

Sure, the other way is: give a fuck about Linux (open source) and spend all the "Linux" money on closed-source "Android".

In my point of view, Android is just another excuse not to support Linux on the "desktop".

Next thing, Microsoft will start their own Linux version with DirectX and FUD, and then they'll hurt Linux from the inside... LOL...
    Step back a minute... with every response you make the work required to move userspace drivers into the kernel is going up, but the benefits are staying the same (or going down because of the diversion of existing developers). This isn't looking good.

    The term "give a f*ck" in English actually means "caring", I think the term you're looking for is "don't give a f*ck".

    You're confusing me with the "Android is an excuse" comment. 30 posts ago you said that Android was Linux and therefore its market share should be counted as Linux. Now you're saying that working on Android is just an excuse not to support Linux on the desktop ?

    I don't understand the MS comment.
    Last edited by bridgman; 03-25-2012, 08:52 PM.



  • Qaridarium
    replied
    Originally posted by bridgman View Post
    Q, the GLSL compiler is used by all hardware, not just Intel. It takes GLSL (and GL assembler IIRC), converts it to GLSL IR, which is then optionally converted to TGSI or Mesa IR if that's what the hardware driver accepts (radeon and nouveau take TGSI since they use a Gallium3D layer).

    The IR passed down uses short vectors for pixel and coordinate data, which can be converted efficiently to VLIW in the hardware layer.

    Anyways, it's in no way an "Intel compiler", except to the extent that the Intel devs were nice enough to take on the task of writing it.
It's sooooo simple: if C++ is a no-go for the kernel, then you have to rewrite it to push ALL the graphics stuff into the kernel.

Sure, the other way is: give a fuck about Linux (open source) and spend all the "Linux" money on closed-source "Android".

In my point of view, Android is just another excuse not to support Linux on the "desktop".

Next thing, Microsoft will start their own Linux version with DirectX and FUD, and then they'll hurt Linux from the inside... LOL...



  • bridgman
    replied
    Originally posted by RussianNeuroMancer View Post
You mean the top GPU in the 8xxx series? Or something beyond?
Correct, assuming we stay with the HD <gen>xxx model. Sea Islands and the 2013 APUs are on the latest roadmap:

    http://www.anandtech.com/show/5491/a...admap-revealed

    Originally posted by Qaridarium View Post
I don't care about Intel shit in an AMD topic. The solution is easy: just put the Intel shit into the garbage. We need an AMD/VLIW compiler, not an Intel JOKE! The Intel compiler is not VLIW! You only get 1/5 of the speed with AMD hardware; this is just a FUD attack from Intel on AMD!
    Q, the GLSL compiler is up in the Mesa common layer and is used by all GPU hardware, not just Intel. It takes GLSL (and GL assembler IIRC), converts it to GLSL IR, which is then optionally converted to TGSI or Mesa IR if that's what the hardware driver accepts (radeon and nouveau take TGSI since they use a Gallium3D layer).

    The IR passed down uses short vectors for pixel and coordinate data, which can be converted efficiently to VLIW in the hardware layer. Scalar variables in the GLSL program are passed down to the hardware layer as scalars, and that's where VLIW hardware needs a fancier compiler in the HW driver layer to maximize efficiency.

    Anyways, it's in no way an "Intel compiler", except to the extent that the Intel devs were nice enough to take on the task of writing it.
    Last edited by bridgman; 03-25-2012, 07:36 PM.
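
    To make that flow concrete, here is a toy sketch of the pipeline bridgman describes. Every type and function name in it is hypothetical, invented purely for illustration; it only shows the ordering of the stages, not any real Mesa interface.

    Code:
    #include <stdio.h>

    typedef const char *glsl_source;
    typedef const char *glsl_ir;
    typedef const char *tgsi_tokens;

    /* Stage 1: the shared GLSL compiler (the C++ code in src/glsl),
     * used by all drivers regardless of vendor. */
    static glsl_ir compile_glsl_to_ir(glsl_source src)
    {
        printf("GLSL -> GLSL IR: %s\n", src);
        return "glsl-ir";
    }

    /* Stage 2: optional lowering to TGSI for Gallium3D drivers
     * (radeon, nouveau); classic drivers take Mesa IR instead. */
    static tgsi_tokens lower_ir_to_tgsi(glsl_ir ir)
    {
        printf("GLSL IR -> TGSI: %s\n", ir);
        return "tgsi";
    }

    /* Stage 3: the per-hardware backend; on pre-GCN Radeons this is
     * where a VLIW-aware scheduler packs short vectors into bundles. */
    static void codegen_vliw(tgsi_tokens toks)
    {
        printf("TGSI -> VLIW bundles: %s\n", toks);
    }

    int main(void)
    {
        codegen_vliw(lower_ir_to_tgsi(compile_glsl_to_ir("void main() {}")));
        return 0;
    }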



  • Qaridarium
    replied
    Originally posted by mattst88 View Post
    There's a serious inability on your part to participate in a reasonable discussion. I probably shouldn't bother responding, but I'll do it only once more given the lack of meaningful response.
'Serious inability'? If my inability is more important to you than the topic, then your mental fascism is not my problem.

    Originally posted by mattst88 View Post
    The GLSL compiler, you know, the one that Intel wrote? http://cgit.freedesktop.org/mesa/mesa/tree/src/glsl
    It's C++.
I don't care about Intel shit in an AMD topic.
The solution is easy: just put the Intel shit into the garbage.
We need an AMD/VLIW compiler, not an Intel JOKE!
The Intel compiler is not VLIW! You only get 1/5 of the speed with AMD hardware; this is just a FUD attack from Intel on AMD!



    Originally posted by mattst88 View Post
    I can't tell if that's a particularly sad attempt to dodge a very real difficulty of moving user space code into the kernel, or if you actually don't understand.

    I'll assume it's the latter and waste a bit more of my time explaining it to you.

    From the kernel documentation (http://git.kernel.org/?p=linux/kerne...g.tmpl;hb=HEAD)

Code:
No floating point or MMX
      The FPU context is not saved; even in user context the FPU state
      probably won't correspond with the current process: you would mess
      with some user process' FPU state.  If you really want to do this,
      you would have to explicitly save/restore the full FPU state (and
      avoid context switches).  It is generally a bad idea; use fixed
      point arithmetic first.
It's just the wrong way to "think", because you can calculate EVERYTHING without floating point; that's why you can run floating-point emulation!

Now what? More FUD from your side?
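
    For what it's worth, the fixed-point arithmetic the kernel documentation recommends is not exotic. Here is a minimal userspace sketch of Q16.16 fixed point (16 integer bits, 16 fractional bits); all the names are illustrative, not kernel APIs.

    Code:
    #include <stdint.h>
    #include <stdio.h>

    typedef int32_t fix16;              /* Q16.16 fixed-point value */
    #define FIX16_ONE (1 << 16)

    static fix16 fix16_from_double(double d) { return (fix16)(d * FIX16_ONE); }
    static double fix16_to_double(fix16 f)   { return (double)f / FIX16_ONE; }

    /* Multiply in a 64-bit intermediate so the fractional product
     * can't overflow, then shift back down into Q16.16. */
    static fix16 fix16_mul(fix16 a, fix16 b)
    {
        return (fix16)(((int64_t)a * b) >> 16);
    }

    int main(void)
    {
        fix16 a = fix16_from_double(1.5);
        fix16 b = fix16_from_double(2.25);
        printf("1.5 * 2.25 = %f\n", fix16_to_double(fix16_mul(a, b))); /* 3.375 */
        return 0;
    }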



  • RussianNeuroMancer
    replied
    Originally posted by bridgman View Post
    for the next generation we are starting sufficiently early that we should be able to hide both development and review in the pre-launch window where it doesn't impact users or community developers
You mean the top GPU in the 8xxx series? Or something beyond?

