Lightspark May Work Towards A Gallium3D State Tracker


  • fabiank22
    replied
    Originally posted by V!NCENT
    And Gallium3D, as tech, owns. No, it's not finished and (duh!) thus its API is not stable. But it will be. That's the point.
    >.<
    Troll on, bro. Haters gonna hate...

    I also love how you simply quote random strawman fanfiction; heck, you even managed to get the Hurd in there. Five thumbs up from me.



  • V!NCENT
    replied
    Originally posted by monraaf
    Yeah, it's unfinished, unstable and also not very well documented. You must be a real masochist to even consider targeting this API as an application developer.
    What's the point of documenting the Gallium API when its design isn't even finished yet?

    Hell, what was the point of porting Quake to Linux in 1996?

    What was the point of developing Linux at all when the Hurd was being developed in 1990 and microkernels were supposed to be the state of the art in kernel design? Who the hell is going to develop for Linux? What company might possibly be interested in this crap?

    What is the point of starting a project at all? I mean, it isn't usable from the very first moment you start working on it.

    Seriously, you are all _IDIOTS_



  • monraaf
    replied
    Originally posted by V!NCENT
    And Gallium3D, as tech, owns. No, it's not finished and (duh!) thus its API is not stable. But it will be. That's the point.
    Yeah, it's unfinished, unstable and also not very well documented. You must be a real masochist to even consider targeting this API as an application developer.



  • V!NCENT
    replied
    Direct "something" 3D and OpenGL are overkill because it's like building a 24km high 1000000hp truck to travel from europe to the US through water, when you can also just use a boat.

    Serialisation goes like this:
    "Hey GPU! I'm a compositing WM"
    -"Hey GPU! I'm a movie decoder!"
    GPU: "I'm busy, OK?"

    And Gallium3D, as tech, owns. No, it's not finished and (duh!) thus its API is not stable. But it will be. That's the point.

    Why do people use DX on Windows? Well, why do you think, smartass? "Hey, I'm a decoder app programmer. Can I have the necessary source code of Windows?" >.<



  • BlackStar
    replied
    Originally posted by fabiank22
    Third and most important: Microsoft is currently shipping about 30 versions of DirectX 9, 10 and 11 to every computer, because apps have started to hard-code dependencies on, for example, the July 2009 version of DX11 and will accept only that. It's DLL hell at its fullest, forcing Microsoft to ensure that ALL of these keep working. Does Mesa really have the manpower to ensure that Gallium API 0.8, 0.9, 0.91 and so on work at all times? And wasn't preventing this mess the reason OpenGL was standardized in the first place?
    That's D3DX you're talking about, and the versioning is deliberate: Microsoft didn't wish to maintain binary compatibility.



  • bridgman
    replied
    Originally posted by fabiank22
    Correct me if I'm wrong, but isn't that already done in Mesa? Memory management was merged around 2.6.31, shader compilers are there, and so on?
    Sure, but all of the low-level bits need to be changed for each new generation of hardware. That's where a lot of our developer time goes, and that is presumably what you are saying we should stop in order to make time for video decode work.



  • fabiank22
    replied
    Originally posted by bridgman
    .. and the video decode framework would require pretty much everything we do for 3D anyway (e.g. the ability to compile and run shaders, the ability to draw primitives, the ability to access memory, etc.). Remember that it's the upper levels of Mesa, shared across all GPU hardware vendors, which deal with all of the GL-specific nasties... and it's that common code which needs most of the work to support higher levels of GL.
    Correct me if I'm wrong, but isn't that already done in Mesa? Memory management was merged around 2.6.31, shader compilers are there, and so on?

    Originally posted by bridgman
    The proprietary drivers already have GL 4.x and OpenCL; video decode acceleration is WIP but is already working for some users. Mplayer supports GL output, which doesn't normally have tearing issues unless you are running with a compositor, which I doubt you would want on an HTPC system.
    Well, I have both ideological and practical problems with getting the proprietary driver to run. As for GL, on my current system (Intel i3 HD, Mobility Radeon 5850) it's actually slower than XV, while making my fan freak out more than StarCraft II on full settings in Windows.

    Originally posted by TemplarGR
    I may be wrong on this, but isn't Gallium3D a different flavour of Mesa? Why do you consider it different from OpenGL?
    Because OpenGL is a standardized API, while Gallium3D is a moving-target implementation that seeks to expose every feature a video card has?

    This question actually made me think about the problem more, and the more I do, the more this state tracker thing freaks me out.
    OpenGL was created to give developers a stable, cross-platform 3D API to target. So let's say I'm a game developer. I write my game for OpenGL 3, and it works for a few years until changes to GCC/libc break it for everyone anyway.
    Now I can easily check whether my user's card supports OGL3. Or at least I could, until every driver vendor decided to implement whatever he wanted, randomly backporting OGL3 features to cards that report 2.1 or even 1.5 (yay Intel). Oh well, then I'll check for the individual GL extensions; checking against the existing 50+ strings is a pain (a sketch of the check follows below), but at least I can count on that... or at least I could, but now I've read about how some extensions are only half-working yet report a working state, because that lets you run a few games in Wine. And don't get me started on Mesa's own extensions, which are not part of the standard.
    With getting OGL to work being this much of a game of Russian roulette, I understand why people would jump onto the state tracker bandwagon. But within a few years that environment will be fucked up too if everybody writes his own tracker.
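
    For reference, here's a minimal sketch of that kind of check (the helper name has_gl_extension is mine), assuming a legacy GL 2.x context where glGetString(GL_EXTENSIONS) still works; GL 3.0+ core profiles replace it with glGetStringi:

        #include <string.h>
        #include <GL/gl.h>

        /* Return non-zero if `name` appears as a whole, space-delimited
         * token in the GL_EXTENSIONS string. A bare strstr() is not
         * enough: "GL_ARB_pixel_buffer" would also match inside
         * "GL_ARB_pixel_buffer_object". */
        static int has_gl_extension(const char *name)
        {
            const char *ext = (const char *)glGetString(GL_EXTENSIONS);
            const char *hit;
            size_t len = strlen(name);

            if (!ext || !len)
                return 0;

            for (hit = ext; (hit = strstr(hit, name)) != NULL; hit += len) {
                int head_ok = (hit == ext) || (hit[-1] == ' ');
                int tail_ok = (hit[len] == ' ') || (hit[len] == '\0');
                if (head_ok && tail_ok)
                    return 1;
            }
            return 0;
        }

    Call it once per extension you care about, e.g. has_gl_extension("GL_ARB_pixel_buffer_object"), and cache the results at startup.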

    First of all, Gallium3D's API has been changing rapidly and, according to Zack Rusin, will continue to do so for a while yet. So not only will distros need to start packaging new kernel/libdrm/Mesa versions as soon as they hit or their state trackers will stop working, every API change could also easily break a ton of applications.
    Secondly, people will start creating inter-dependencies between state trackers, making the whole environment impossible to package and maintain.
    Third and most important: Microsoft is currently shipping about 30 versions of DirectX 9, 10 and 11 to every computer, because apps have started to hard-code dependencies on, for example, the July 2009 version of DX11 and will accept only that. It's DLL hell at its fullest, forcing Microsoft to ensure that ALL of these keep working. Does Mesa really have the manpower to ensure that Gallium API 0.8, 0.9, 0.91 and so on work at all times? And wasn't preventing this mess the reason OpenGL was standardized in the first place?



  • TemplarGR
    replied
    I may be wrong on this, but isn't Gallium3D a different flavour of Mesa? Why do you consider it different from OpenGL?



  • BlackStar
    replied
    Originally posted by Svartalf
    Originally posted by BlackStar
    The fact that everyone seems to be considering the raw Gallium3d API for acceleration goes to show how much OpenGL sucks. Think about it for a moment.
    Keep trolling...
    You, of all people, should understand how ill-suited OpenGL is for the sort of work Lightspark is doing. For instance, efficient DMAs to VRAM are essential. In OpenGL you can:

    a. set up an async transfer through a PBO (if that's available)
    b. use glTexImage2D(..., null) discarding the previous data (requires efficient garbage collection inside the driver)
    c. use glTexSubImage2D and ping-pong between two or more textures
    d. use glTexSubImage2D alone
    e. use glDrawPixels with an FBO
    f. invoke Cthulhu(*)

    Each method may be faster or slower depending on the hardware and driver configuration. Testing this falls somewhere between "nightmare" and "impossible".
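
    For what it's worth, here's a minimal sketch of option (a) combined with (b)-style orphaning of the buffer, assuming GLEW for the GL 1.5/2.1 entry points; the names upload_frame, pbo and tex are mine, and this is one possible path rather than the definitive one, precisely because the fastest path varies by driver:

        #include <string.h>
        #include <GL/glew.h> /* call glewInit() after context creation */

        /* Upload one frame of BGRA pixels to `tex` through a pixel-buffer
         * object, so the copy to VRAM can proceed asynchronously. */
        static void upload_frame(GLuint pbo, GLuint tex, const void *pixels,
                                 int width, int height)
        {
            GLsizeiptr size = (GLsizeiptr)width * height * 4;
            void *dst;

            glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);

            /* Orphan the old storage (method (b), applied to the PBO) so
             * the driver need not stall if the previous transfer is still
             * in flight. */
            glBufferData(GL_PIXEL_UNPACK_BUFFER, size, NULL, GL_STREAM_DRAW);

            dst = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
            if (dst) {
                memcpy(dst, pixels, (size_t)size);
                glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
            }

            /* With a PBO bound, the data argument is an offset into the
             * buffer rather than a client-memory pointer. */
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                            GL_BGRA, GL_UNSIGNED_BYTE, (const void *)0);

            glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
        }

    Whether this actually beats plain glTexSubImage2D on a given stack is exactly the kind of thing you end up benchmarking per driver.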

    Then you have to generate OpenGL shaders on the fly (ugh(**)), read back to system memory if necessary (double ugh(***)), compose, re-upload, and display. And all this in OpenGL 2.1 with 1.x fallbacks for older hardware.
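
    The shader-generation boilerplate alone illustrates the "ugh"; here's a minimal sketch of compiling one runtime-generated stage on a GL 2.x context (the helper name compile_stage is mine, error handling trimmed to a log dump):

        #include <stdio.h>
        #include <GL/glew.h> /* GL 2.0 entry points; glewInit() first */

        /* Compile one shader stage from a source string built at runtime.
         * Returns 0 on failure after printing the driver's info log. */
        static GLuint compile_stage(GLenum stage, const char *src)
        {
            GLuint sh = glCreateShader(stage);
            GLint ok = GL_FALSE;

            glShaderSource(sh, 1, &src, NULL);
            glCompileShader(sh);
            glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);

            if (!ok) {
                char log[1024];
                glGetShaderInfoLog(sh, (GLsizei)sizeof(log), NULL, log);
                fprintf(stderr, "shader compile failed:\n%s\n", log);
                glDeleteShader(sh);
                return 0;
            }
            return sh;
        }

    And you still have to link the program, check the link log, and cache the result per generated source string; on top of that, each driver's GLSL compiler accepts a slightly different dialect.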

    [Add obligatory joke about how Adobe's engineers *still* haven't managed to get this right after so many years]

    I can see the appeal of Gallium for these tasks. Can't you? Off-hand: dead-simple DMA; sane, low-level shader instructions; an efficient software fallback if the hardware is not up to the task.

    (*) This is said to increase upload speeds, but I haven't confirmed it. Use at your own risk.
    (**) GLSL isn't really suited to this task, and ARB programs are deprecated and stuck at 2004-level capabilities.
    (***) Think texture uploads, only more random. For example, an Nvidia 480 achieves half the download speed of a 280. Why? No one knows; have fun optimizing your code.



  • liam
    replied
    Originally posted by bridgman
    Maybe I misread things, but I don't think the developers were looking for "more features than OGL", but rather "something much smaller and simpler" where available.

    If you want to run a few shaders, Gallium3D will probably be the simplest way to do it... and if you use Gallium3D, the thing you write ends up being called a state tracker.
    Assuming you were responding to me: I was referencing the commenter named Sean. He seemed to be suggesting using only the OpenCL and OGL APIs. OCL, to my knowledge, doesn't have even the beginnings of a state tracker yet, but it is clearly something Linux could make use of. By doing things this way he would be able to make better use of the graphics card, so he wouldn't simply be using it for framebuffer/blitting.
    As for the rest of your post, I wasn't suggesting he not use Gallium, just that this poster made an interesting case for not writing a Lightspark state tracker.

