ATI Radeon HD 4850 512MB

  • #41
    Originally posted by miles View Post
    Also, how can the result be CPU dependent in tests where the NVIDIA cards get higher FPS?
    There are two parts to a CPU limited game.
    1. The game engine itself
    2. The SW portion of the driver


    Ignoring the game engine itself, it basically means that our driver is "heavier" than Nvidia's when the GPU is not the limiting factor. This is where a lot of the driver optimization effort goes. Remember that we are part way through our journey (and that the 9800GTX is beaten by the 3870 on ET:QW under Linux).

    In general, if you look at comparative results you will see one of two things occur.
    1. A drop off as the HW class scales
    2. A drop off as the resolution or IQ settings are increased.


    If you don't see that, then generally the HW is not the limiting factor. In the case of most of the tests in the 4850 article, both NV and AMD suffer this.
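
    To make that concrete, here is a minimal sketch (purely illustrative numbers and threshold, not real results) of how you might classify a run as GPU-limited or CPU/driver-limited from how the FPS responds to resolution:

    # Rough sketch: decide whether a benchmark looks GPU-limited or
    # CPU/driver-limited by checking how FPS responds to resolution.
    # The data and the 10% threshold below are made up for illustration.
    def classify(results):
        """results: resolution -> average FPS, ordered from lowest to highest resolution."""
        fps = list(results.values())
        drop = (fps[0] - fps[-1]) / fps[0]   # relative FPS loss going to the highest resolution
        if drop > 0.10:                      # FPS falls noticeably as pixel count grows
            return "GPU-limited"
        return "CPU/driver-limited (flat FPS across resolutions)"

    # A flat curve like this one means the GPU isn't the bottleneck, so you are
    # really measuring the game engine plus the driver's CPU overhead.
    example = {"1024x768": 142.0, "1280x1024": 140.5, "1680x1050": 139.8}
    print(classify(example))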

    (Although I do agree we'll have to find native Linux games that stress the cards more; testing 100+ FPS games isn't telling much, and it's in heavy GPU loads that the card will show its value compared to the previous generation.)
    There are some engines that stress the HW quite strongly; hopefully we will see some of those (like Lightsmark and Unigine) appear in the not too distant future. Having Linux-native engines lowers the bar considerably for making games available under Linux.

    Hopefully UT3 will eventually arrive and will be demanding enough to separate the cards and drivers even at default settings. We'll wait until next week to see Michael's review with AA+AF pushed way up.



    • #42
      I bet the card would run just as fast on x8 PCIe 2.0. I'd like to see the PCIe lanes cabled off the mobo so you could customize how many lanes you use, how many slots, what sort of slots, and the spacing of the slots, to give you as much room as you like; maybe a double case design with expansion slots front and back and a peg board so you could move the slots around.

      Right now, you get what you get: a fixed number and type of slots, hardwired PCIe lanes, and no room for the cards you do have slots for.

      alexforcefive,

      I bet the price of women's shoes or maybe a nice purse and some frilly underwear, compares better. It's just not fair......or is it? God save the Queen? Good God ya' all!!!

      Live responsibly and please don't drink and breathe.



      • #43
        Originally posted by Aradreth View Post
        Anyone who says they need 100+ is lying out their arse, seeing as LCDs refresh at 60Hz, so anything above that is pointless (not to mention the human eye can't detect changes that quickly).
        Actually LCDs do not refresh at any hertz. Pixels are on or off. Refresh rate is a totally useless comparison in the digital realm.



        • #44
          Originally posted by deanjo View Post
          Actually LCDs do not refresh at any hertz. Pixels are on or off. Refresh rate is a totally useless comparison in the digital realm.
          Yes and no.

          The concept is more or less the same, since the pixel state change and the phosphor illumination will only happen at a rate relative to your refresh rate. The refresh rate is the maximum rate at which the scanline will have new information presented. The phosphor decay period and minimum LCD transistor switch rate are also relatively important, but these days they can more or less be assumed to match the refresh rate.

          I still contend that an average frame rate above the refresh rate is worthwhile, since it gives enough headroom to ensure that even in graphically demanding parts of the game the rate doesn't fall below the refresh rate.
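
          A quick back-of-the-envelope simulation of that headroom argument (made-up jitter figures and a very simplified vsync model in which a late frame simply waits for the next refresh):

          import math
          import random

          REFRESH = 60.0
          INTERVAL = 1.0 / REFRESH

          def displayed_fps(avg_fps, jitter=0.002, frames=10000, seed=1):
              """Displayed rate when render times vary around 1/avg_fps and snap to vsync."""
              random.seed(seed)
              total = 0.0
              for _ in range(frames):
                  render = max(0.001, random.gauss(1.0 / avg_fps, jitter))
                  total += math.ceil(render / INTERVAL) * INTERVAL  # a late frame costs a whole extra refresh
              return frames / total

          # An average of exactly 60 FPS misses many refreshes; headroom above the
          # refresh rate keeps the displayed rate close to 60 through the heavy scenes.
          for avg in (60, 75, 100):
              print(f"average render rate {avg:>3} FPS -> displayed ~{displayed_fps(avg):.1f} FPS")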

          I understand the concept of the eye not being able to distinguish rates faster than 25-30 Hz, but the eye does detect artefacts at rates up to around 70 Hz. This is why most people see flicker on CRTs at 60Hz. It is the eye detecting the gap left by the phosphor decay; if the user looks closely at the screen, they will still see a continual unbroken image, but will still feel the difference.

          On an LCD this particular issue becomes mostly irrelevant, since the pixels are turned on and off discretely, rather than the electron beam charging phosphor with the associated decay. I believe this is really what you are talking about.

          This is also the reason that most LCD TVs are now refreshing at 120Hz, since the image feels "sharper" and "more responsive". The eye can't discern the steps, but the secondary visual effects are still there.

          Regards,

          Matthew



          • #45
            Originally posted by mtippett View Post
            I understand the concept of the eye not being able to distinguish rates faster than 25-30 Hz, but the eye does detect artefacts at rates up to around 70 Hz. This is why most people see flicker on CRTs at 60Hz. It is the eye detecting the gap left by the phosphor decay; if the user looks closely at the screen, they will still see a continual unbroken image, but will still feel the difference.

            The primary reason why people can detect flicker @ 60 Hz is the occasional synchronization of artificial light sources and the refresh rate of the monitor (in countries that have a 50 Hz power grid this effect happens at that refresh rate). It is the same effect as why, when filming a CRT screen with a camera, you will see scan lines on the screen. This effect does not happen on LCDs as the pixel state (as you have mentioned) changes discretely.



            • #46
              Originally posted by deanjo View Post
              The primary reason why people can detect flicker @ 60 Hz is the occasional synchronization of artificial light sources and the refresh rate of the monitor (in countries that have a 50 Hz power grid this effect happens at that refresh rate). It is the same effect as why, when filming a CRT screen with a camera, you will see scan lines on the screen. This effect does not happen on LCDs as the pixel state (as you have mentioned) changes discretely.
              I think we are both being a little bit liberal and coarse with our interpretation and description of phenomena and effects. But to continue the interesting discussion...

              The eye (and the brain, which is part of the picture) is analog; you may be able to perceive something you can't see under close examination. CRT flicker is more noticeable in your peripheral vision than in your direct vision. Your eye does not "capture" at 25 Hz; it has a response time, some persistence in light detection, and the brain also applies its own interpretation. You can detect a flash of 1/4000 of a second, but would not be able to see a light that is oscillating on and off at 1/60th of a second.

              LED lights demonstrate this. Look at the light directly and it is 100% solid and unwavering; scan your eyes to the left or right, and you will see a collection of discrete images of the light as you "capture" it being on and off in different parts of your field of view.

              I agree that there may be *some* aliasing between the local power source (as applied to lighting) and CRTs, but almost all domestic lighting has some persistence: either the phosphors in fluorescent tubes and CFLs, or, in the case of tungsten lighting, a filament that doesn't cool down enough before it heats up again on the next AC cycle.

              Also, bear in mind that the VESA "flicker-free" standards applied just as much moving from 60 to 72 Hz in Australia as they did in North America.

              The aliasing you are talking about when filming is a related, but more or less separate, effect. The frames capture a finite amount of light, and the CRT monitors on the set will be scanning out a fixed number of scanlines in that period. As a result the captured image will never be full frames; this, coupled with phosphor decay, results in the historically familiar dark horizontal lines on TV and in movies. With LCDs, this simplifies down to tearing on screen. Framelock and genlock provide a solution, by ensuring that all displays on a set are synchronized to a common clocking signal that is in sync with the camera.
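
              As a toy illustration of that aliasing (idealized instantaneous shutter and scanout, made-up rates): the fractional number of scanouts left over per camera frame is what makes the dark band crawl, and a locked integer ratio makes it stand still.

              # Toy model of filming a CRT: each camera frame captures refresh/camera_fps
              # scanout cycles.  The fractional remainder is how far the partially-scanned
              # band drifts between camera frames; zero remainder (genlocked or an integer
              # ratio) gives a stationary pattern.
              def band_drift(refresh_hz, camera_fps):
                  cycles = refresh_hz / camera_fps
                  frac = cycles % 1.0            # leftover fraction of a scanout per camera frame
                  return cycles, frac, frac * camera_fps

              for refresh, cam in [(60.0, 24.0), (60.0, 25.0), (60.0, 30.0), (72.0, 30.0)]:
                  cycles, frac, drift = band_drift(refresh, cam)
                  state = "stationary (locked)" if frac == 0 else f"band cycles ~{drift:.0f} times/s"
                  print(f"{refresh:.0f} Hz CRT filmed at {cam:.0f} fps: {cycles:.2f} scans/frame -> {state}")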

              Regards,

              Matthew



              • #47
                How to run fglrx on it?

                The bottom line right now is there are a few troubles with the Catalyst 8.6 for Linux and the Radeon HD 4850.
                So it seems that you have managed to run it. How did you do that? I'm stuck on a SIGSEGV somewhere in the userland part of fglrx while trying to start X.

                Regards,
                rle



                • #48
                  Originally posted by rlewczuk View Post
                  So it seems that you have managed to run it. How did you do that? I'm stuck on a SIGSEGV somewhere in the userland part of fglrx while trying to start X.

                  Regards,
                  rle
                  What motherboard/chipset are you using? You may need to update your BIOS to circumvent a bug with the driver.
                  Michael Larabel
                  https://www.michaellarabel.com/



                  • #49
                    Originally posted by Michael View Post
                    What motherboard/chipset are you using? You may need to update your BIOS to circumvent a bug with the driver.
                    Whoops, wrong driver version (I downloaded 8.501 but then installed 8.49.4), sorry for this redundant rambling :-). Now everything seems to be working fine.

                    The good news is that the AMD Stream SDK is also working (I bought this card just for it). I've compiled and run all the CAL and Brook+ examples from the SDK. Except for some assertion messages (this is a known CAL/Ubuntu problem), all are working well.

                    My system was Asus P5B + E6300 + 2GB RAM (reduced from 4GB to get the card running first and solve MTRR problems later).
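
                    For anyone hitting the same MTRR issue, a quick way to see whether the BIOS has left an awkward MTRR layout is to look at /proc/mtrr. A rough sketch (the output format varies a little between kernels, so the parsing here is deliberately loose):

                    import re

                    # fglrx wants a usable write-combining range for the framebuffer; with 4GB
                    # of RAM some BIOSes leave no free MTRR register or an overlapping layout.
                    with open("/proc/mtrr") as f:
                        entries = f.read().splitlines()

                    total_wb = 0
                    for line in entries:
                        print(line)
                        m = re.search(r"size=\s*(\d+)MB", line)
                        if m and "write-back" in line:
                            total_wb += int(m.group(1))

                    print(f"\n{len(entries)} MTRR entries in use, ~{total_wb}MB mapped write-back")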

                    I still haven't benchmarked CAL/Brook+ as I'm a newbie in this area (and need to move the card to a more decent machine first).

                    Regards,
                    rle



                    • #50
                      Michael, since you've got both the HD 4850 and HD 4870, could you also mention in the tests whether the cards suspend/hibernate/resume nicely, using both the proprietary drivers and the open source ones?

                      Since all the cards released at the moment are the same (only the branding differs), that information would prove really useful!

                      Thanks a lot.
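
                      P.S. In case it helps automate that, a rough sketch of a suspend/resume loop (assumes pm-utils is installed, needs root, and assumes the X server is on display :0; `xset q` is just a cheap liveness probe):

                      import subprocess
                      import time

                      CYCLES = 5  # arbitrary number of suspend/resume iterations

                      for i in range(CYCLES):
                          # pm-suspend (from pm-utils) blocks until the machine resumes.
                          subprocess.run(["pm-suspend"], check=True)
                          time.sleep(10)  # give the driver a moment to reinitialize the display

                          # If the driver came back in a bad state the X server usually stops
                          # responding, so poke it and report.
                          probe = subprocess.run(["xset", "-display", ":0", "q"],
                                                 stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
                          print(f"cycle {i + 1}: X {'alive' if probe.returncode == 0 else 'NOT responding'}")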

