S3 Graphics Releases Linux Driver With OpenGL 3.0, VA-API


  • #16
    Originally posted by Thunderbird View Post
    I haven't seen the drivers, but judging whether it supports 3.0 by the GL string alone is wrong. Officially, 3.0 should be reported when an OpenGL 3.0-capable GLX context is created using GLX_ARB_create_context. That's the official way. Unfortunately, Nvidia broke that rule in a recent 180.x release, likely for marketing reasons. If S3 supports GLX_ARB_create_context, then they have OpenGL 3.0 support.

    I just checked their libGL.so and they do have GLX_ARB_create_context / glXCreateContextAttribsARB, so they really support OpenGL 3.0. Please update the article.
    @Michael, Thunderbird is right.

    My fglrx 9.2 doesn't report opengl 3.0 either with glx:
    OpenGL version string: 2.1.8494 Release

    But opengl 3.0 is certainly there.
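A note on checking for GLX_ARB_create_context the way described above: the reliable test is to look for the exact token in the space-separated extension list, since a plain substring check can match the wrong thing (e.g. GLX_ARB_create_context_profile). A minimal sketch, assuming a hypothetical extension list like the one glxinfo prints:

```python
def has_extension(ext_string, name):
    """Return True if `name` appears as a complete token in a
    space-separated extension list. A substring check would wrongly
    match e.g. GLX_ARB_create_context_profile."""
    return name in ext_string.split()

# Hypothetical extension list for illustration:
exts = "GLX_ARB_get_proc_address GLX_ARB_create_context_profile"
print(has_extension(exts, "GLX_ARB_create_context"))          # False
print(has_extension(exts, "GLX_ARB_create_context_profile"))  # True
```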



    • #17
      Wikipedia has the specs:

      Chrome 540 GTX: 65nm, 800MHz core, 32 unified shaders
      1700MHz effective RAM (850MHz GDDR3), 64-bit bus
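For context, the 64-bit bus makes for fairly modest memory bandwidth. A back-of-the-envelope calculation, assuming the Wikipedia figures above are right:

```python
# Peak memory bandwidth = effective transfer rate * bus width in bytes.
# From the specs above: 850MHz GDDR3, double data rate => 1700 MT/s, 64-bit bus.
effective_mts = 1700e6      # memory transfers per second
bus_bytes = 64 / 8          # a 64-bit bus moves 8 bytes per transfer
bandwidth_gbs = effective_mts * bus_bytes / 1e9
print(bandwidth_gbs)        # 13.6 (GB/s)
```

That ~13.6 GB/s is roughly half what a comparable 128-bit card of the era would manage at the same memory clock.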



      • #18
        Not that I would buy a S3 graphics card, but hey, at least they have something now.

        I just hope they recognize the value that is added to their hardware if the driver is opened up.



        • #19
          They are not producing high-end 3D cards, so why oh why don't they go for free drivers?



          • #20
            Originally posted by tball View Post
            @Michael, Thunderbird is right.

            My fglrx 9.2 doesn't report opengl 3.0 either with glx:
            OpenGL version string: 2.1.8494 Release

            But opengl 3.0 is certainly there.
            The only reason nvidia changed the strings (which is actually very bad, since when you don't use an OpenGL 3 context you aren't getting the real OpenGL 3) is for marketing reasons. A new glxinfo would be needed.
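This is also why the glxinfo version string alone under-reports what a driver can do: a driver may return a 2.1 string for a legacy context while still offering 3.0 through GLX_ARB_create_context. A hypothetical sketch of pulling the major/minor version out of a string like the fglrx one quoted above:

```python
import re

def parse_gl_version(version_string):
    """Extract (major, minor) from an OpenGL version string such as
    '2.1.8494 Release'. The trailing vendor text is ignored."""
    m = re.match(r"(\d+)\.(\d+)", version_string)
    if m is None:
        raise ValueError("unrecognized version string: %r" % version_string)
    return int(m.group(1)), int(m.group(2))

major, minor = parse_gl_version("2.1.8494 Release")
print((major, minor))  # (2, 1) -- says nothing about 3.0 context support
```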



            • #21
              Originally posted by nico342 View Post
              What's happening, ATI?
              ATI seems to be making progress. I checked out radeon hd code today from a git repo and it seemed to have some 3d support for my radeon 2600. If this extends to the r700 series too, I will absolutely buy an ati card in the price range of the chrome 540.. Hopefully one of those new rv740 based cards.



              • #22
                for those that do want to buy:
                https://s3gstore.s3graphics.com/



                • #23
                  Originally posted by Thunderbird View Post
                  The only reason nvidia changed the strings (which is actually very bad, since when you don't use an OpenGL 3 context you aren't getting the real OpenGL 3) is for marketing reasons. A new glxinfo would be needed.
                  So nvidia lied, that's it.



                  • #24
                    Originally posted by kraftman View Post
                    So nvidia lied, that's it.
                    More like they are waiting for the FOSS side to get their act together and update their supporting code to today's standards.



                    • #25
                      Originally posted by sreyan View Post
                      ATI seems to be making progress. I checked out radeon hd code today from a git repo and it seemed to have some 3d support for my radeon 2600. If this extends to the r700 series too, I will absolutely buy an ati card in the price range of the chrome 540.. Hopefully one of those new rv740 based cards.
                      I just checked: within the price range of the 540, which will more than likely be seriously gimped by its 64-bit memory bus even though it comes with GDDR3, you can get an AMD HD3850, an HD4670, or the Nvidia 9600GSO.

                      All three retail for less, come with 128- or 256-bit GDDR3 or GDDR4 memory, and all three will kick the crap out of the 540 up and down the block. I'm not actually sure which of the three is the most powerful.

                      But I'd wager that the three are fairly close in raw performance. The HD3 series wasn't that powerful, with the HD3870 usually coming in around the performance of the 8800GT and 9600GT, so the 9600GSO and HD3850 should be fairly close; the HD4670 looks pretty close as well and should have a notably lower power draw than the HD3850.

                      You can check the GPU charts at http://www.guru3d.com/category/vga_charts/ It sucks that they haven't updated the ET:QW charts though, due to its quick fall-off in popularity, since it's pretty much the most demanding game on Linux...


                      Oh and for anyone interested, here's a preview of the new AMD HD4750/rv740 GPU: http://www.guru3d.com/article/radeon...-preview-test/ It looks to fall between the HD4830 and the HD4850, so if you have either it's probably not worth it to you.

                      I've long believed that nVidia and VIA/S3 should merge. So what if we lose the chance of them opening up their docs? It's not like anyone will actually knowingly buy S3 hardware these days, not to mention that you can't find any VIA hardware in normal shops; in the last 3+ years I've only ever seen it for sale in shops that specialize in SFF and *-ITX sized mobos. They don't even seem to make AMD or Intel chipsets anymore.

                      Why should they merge with nVidia? Simple: we could use a third real competitor in the CPU and mobo market that offers a complete system. AMD already has their complete low-to-high-end CPU+chipset+GPU platform, and Intel is going to complete the final piece of the puzzle with the release of Larrabee. This leaves nVidia out in the cold, since they licensed SLI to Intel for the X58 chipset but Intel won't let them make chipsets for the i7 series CPUs.

                      nVidia also can't just go out and make their own x86 CPU, since Intel controls who gets a license to the designs, having only signed out those licenses due to IBM and government contract requirements for a second hardware supplier. After years of failed companies in the tech market, I think the only ones with x86 licenses are AMD and VIA.

                      VIA merging with nVidia would let them enter the CPU market and offer at least a high-end chipset and GPU lineup to start with, while they apply nVidia's know-how to the VIA Nano to get a decent performer out of it. It's already a pretty good CPU at the low end, but with a little help from nVidia's team there's no reason they couldn't have a competitive mid-to-high-end platform within 5 years.



                      • #26
                        Originally posted by Duo Maxwell View Post
                        VIA merging with nVidia would let them enter the CPU market and offer at least a high-end chipset and GPU lineup to start with, while they apply nVidia's know-how to the VIA Nano to get a decent performer out of it. It's already a pretty good CPU at the low end, but with a little help from nVidia's team there's no reason they couldn't have a competitive mid-to-high-end platform within 5 years.
                        That should have happened around the time AMD acquired ATI. 5 years is an eternity in the cpu/gpu market; it's probably too late now.



                        • #27
                          nVidia could also flex their immense engineering muscle and move away from x86 to produce really high-performance CPUs if they wanted; there are other licensable CPU architectures (MIPS/POWER) which can be made much more efficient and powerful than the aging x86. The problem will be how to market it (and get [commercial] support for such a platform[1]), not to mention that even though nVidia is big, Intel is MUCH bigger (and not even they were able to shift consumer PCs away from x86).

                          Back to VIA's S3 Chrome 500 series: I welcome their effort to compete in this rather tough market. However, a 64-bit bus width is a bit too low (nowadays low-to-mid-range cards use a 128-bit bus). I really hope they'll produce improved versions and price them more in tune with AMD/nVidia; then again, AMD/nVidia can price their products much lower due to the volume they sell, so VIA would have to make a heavy investment to price these products competitively.
                          [1] We know Linux is most likely to support it, provided there is enough information released.



                          • #28
                            That's the problem though: due to the Windows monoculture, all the other CPU archs have gone under. Even the last holdout in the consumer market, Apple, with the IBM POWER4-derived G5, gave up and went to x86. The only place you're seeing other archs these days in the consumer market is in standalone devices like game consoles, e.g. the Cell in the PS3.

                            Nvidia already has the ARM-based Tegra platform http://en.wikipedia.org/wiki/Nvidia_Tegra but you won't be seeing that powering anything more powerful than your next cellphone.

                            As for my stated 5-year mark: yes, that is an eternity in tech years, but with current trends the CPU is becoming less and less important while the GPU is becoming more and more essential. I don't see why they couldn't, though, have a version of the Nano die-shrunk to 55-45nm and at least dual-core at 2.5GHz within a year, which would at least get them in the door at the low end against the Pentium Dual-Core, a few of the low-end C2Ds, and the AMD X2 line. They would be able to hold onto their GPU position for a while as well, to remain profitable while they claw their way up. The real challenge though is whether they can get into the OEMs; nVidia has had little trouble getting their GPUs and chipsets into the OEMs for the last few years. I've seen more of them there recently than Intel and AMD GPUs when poking around the laptops at the local worst buy.

                            But seriously, you don't think it'd take close to 5 years for them to get a CPU within a few flops of the best that AMD and Intel have to offer at the time? By then we'll be up to 12-16 core CPUs pushing something close to 4.5GHz a core. It's not something anyone could build overnight.

                            Maybe someday soon we'll get multicore CPUs that have several different archs running on the same die, with some level of interpreter that decides which arch executes what code, for some truly amazing CPU power. But alas, I think the difficulty of making such a CPU will leave it an infeasible pipedream...

                            No matter what, it's an uphill battle for both nVidia and VIA, since it's now much harder to bring an antitrust case against Intel for monopolistic practices now that AMD has bought ATI. The only thing we can hope is that AMD has something up their sleeve to retake the CPU market, and maybe a new chipset that can give us at least four x16 PCIe slots, or even better a seven-slot x16 PCIe chipset with two CPU sockets; anything to get them more market space ahead of Intel. Just think if you could get a 14-way CrossFire mobo with two 4-4.5GHz OC'd quads and 12GB of tri-channel DDR3. Sure, there's no real reason for it, but it would get AMD much-needed advertising all over the web on various tech forums, and it'd let them just brute-force games like Crysis Warhead into actually running lol.



                            • #29
                              Originally posted by deanjo View Post
                              More like they are waiting for the FOSS side to get their act together and update their supporting code to today's standards.
                              So they lied. Nvidia drivers replace much of X, so I don't think they were/are waiting for FOSS.
                              Last edited by kraftman; 03-01-2009, 09:21 AM.



                              • #30
                                Originally posted by d2kx View Post
                                They are not producing high-end 3D cards, so why oh why don't they go for free drivers?
                                lol... as if high-end has anything to do with it...

