The Open-Source Linux Graphics Card Showdown


  • #21
    Originally posted by droidhacker View Post
    I think he's already using turquoise, blue, and green.....

    Edit: also red, brown, and orange...
    Shouldn't an algorithm along the lines of

    Code:
    // num_of_lines and an array of pointers to line objects, called line, assumed here
    const int ColorMax = 0xFFFFFF;
    int ColorStep = ColorMax / num_of_lines;
    for (int i = 0; i < num_of_lines; i++)
    {
         Line* templine = line[i];
         templine->setColor(ColorStep * i);
    }
    work out for the best? That way the colors are all equidistant from each other.
    Last edited by Luke_Wolf; 27 April 2012, 02:09 PM.

    Comment


    • #22
      Originally posted by droidhacker View Post
      I think he's already using turquoise, blue, and green.....
      Well, those are different shades of those colors, and they're not as close.
      And you shouldn't use the same color in different shades in the same graph unless they are clearly bright and dark.
      If you use them as he does now, then you need to alter the lines and start using different line styles, like dotted ... ----
      It should be easy to follow the graph's curves and know which curve represents each graphics card.

      Comment


      • #23
        Originally posted by Luke_Wolf View Post
        Shouldn't an algorithm along the lines of

        Code:
        // num_of_lines and an array of pointers to line objects, called line, assumed here
        const int ColorMax = 0xFFFFFF;
        int ColorStep = ColorMax / num_of_lines;
        for (int i = 0; i < num_of_lines; i++)
        {
             Line* templine = line[i];
             templine->setColor(ColorStep * i);
        }
        work out for the best? That way the colors are all equidistant from each other.
        Interesting idea, but it won't work.
        It will be ESPECIALLY bad if you select an EVEN number of steps.

        The problem is that a color is not just one 3-byte number; it is 3 distinct ONE-byte numbers representing the red, green, and blue components.
        You see, the problem works like this: when you divide the byte space into some number of chunks in the manner you propose, you'll end up varying mostly the most significant (red) byte by that factor, leaving the less significant bytes pinned near FF. I.e., you can end up with the following sequence when choosing 8 steps:
        #1FFFFF
        #3FFFFF
        #5FFFFF
        ...
        #FFFFFF
        That will, of course, be a sequence of turquoise shades getting progressively brighter as they approach #FFFFFF.

        Similarly, you can't just drop down into the three bytes and progress as 000000, 111111, 222222, ..., FFFFFF, because that will get you greyscale.

        You might want to pick numbers that are along the basis of ON/OFF like so;
        FFFFFF <-- don't use this one though, because white on white is a bad choice.
        FFFF00 <-- yellow
        FF00FF <-- purple
        00FFFF <-- turquoise
        FF0000 <-- red
        00FF00 <-- green
        0000FF <-- blue
        000000 <-- black

        Now you can go through a second time using "halves".
        7F7F7F <-- grey
        7F7F00 <-- dark yellow
        7F007F <-- dark purple
        ...
        etc.
        Darks are good.... they tend to contrast well with brighter colors.

        Mix and match the darks and light bytes;
        FF7F7F <-- pink
        7F7FFF <-- light blue
        ...
        Lights, on the positive side, tend to contrast VERY well against darks, but they're a bad choice against a WHITE background.
        FF7F00 <-- orange. Now we're talking.

        Maybe some QUARTER colors;
        7F3F00 <-- brown
        7F003F <-- kind of a dark reddish purple, quite distinct.


        Colors are funny things to work with mathematically. The main thing you need to keep in mind is that you have to treat colors as THREE numbers, not just ONE.

        Comment


        • #24
          Originally posted by Nille_kungen View Post
          Well, those are different shades of those colors, and they're not as close.
          And you shouldn't use the same color in different shades in the same graph unless they are clearly bright and dark.
          If you use them as he does now, then you need to alter the lines and start using different line styles, like dotted ... ----
          It should be easy to follow the graph's curves and know which curve represents each graphics card.
          That was a joke.... but yes, absolutely. You need high contrast between similar types of colors. You can easily tell the difference between FFFFFF, 7F7F7F, and 000000, even though they are just varying intensities of the same neutral grey.

          Comment


          • #25
            To put this problem to rest, Michael should FORGET about algorithms for picking colors. Just hard code numbers in an array of, say, 50 numbers, ordered such that you can pick them in a LINEAR fashion for maximum contrast.

            Comment


            • #26
              @Qaridarium

              First of all, Intel already produces CPUs for other companies. Also, until recently, when AMD paid a lot of money to become fully independent from its factory arm, it had a contract requiring it to produce its CPUs there. Now they could produce anywhere, but I don't know if they want to pay Intel to make their chips. Usually the layout needs to be optimized for the production process as well, so you can't just shrink it and have it still work. AMD already pays for x86 patents, so Intel gets a few cents from every CPU AMD sells (or maybe dollars, I don't know the contract). The 22nm production is most likely not fully optimized, as Intel only sells the quad cores right now; those are much more expensive compared to the dual cores. For good profit the yield must be high enough; that was the main problem for AMD and its factory, as well as for TSMC, which produces the GPUs for AMD and NVIDIA. Every shrink needs some time to be fully optimized. If AMD pays well and Intel has spare capacity, I don't see why Intel wouldn't produce the chips; they get money either way, but of course less when AMD produces them somewhere else.

              Comment


              • #27
                Originally posted by Kano View Post
                @Qaridarium

                First of all, Intel already produces CPUs for other companies. Also, until recently, when AMD paid a lot of money to become fully independent from its factory arm, it had a contract requiring it to produce its CPUs there. Now they could produce anywhere, but I don't know if they want to pay Intel to make their chips. Usually the layout needs to be optimized for the production process as well, so you can't just shrink it and have it still work. AMD already pays for x86 patents, so Intel gets a few cents from every CPU AMD sells (or maybe dollars, I don't know the contract). The 22nm production is most likely not fully optimized, as Intel only sells the quad cores right now; those are much more expensive compared to the dual cores. For good profit the yield must be high enough; that was the main problem for AMD and its factory, as well as for TSMC, which produces the GPUs for AMD and NVIDIA. Every shrink needs some time to be fully optimized. If AMD pays well and Intel has spare capacity, I don't see why Intel wouldn't produce the chips; they get money either way, but of course less when AMD produces them somewhere else.
                None of that matters.
                All that matters is that AMD's got

                Comment


                • #28
                  Originally posted by droidhacker View Post
                  I also question the performance tests. Intel is known for playing games in order to benchmark higher. It wouldn't surprise me one bit if there was something sneaky in the CPU that knows when someone else's GPU is attached and does something like cutting memory bandwidth. Intel is known for pulling sneaky crap, like designing CPUs FOR the benchmarks, rather than the real workload.
                  This is also why Intel was sued over ICC. At best it only lets non-Intel 64-bit CPUs have SSE2; in 32-bit mode it pushes them down to a 486 code path. The most obvious example of this is the SuperPi benchmark.

                  Comment


                  • #29
                    Originally posted by Kano View Post
                    That's incorrect. The GPU part has got different voltage pins; you can set the GPU voltage differently on Z boards. The default is that it is disabled as soon as you add a PCI-E gfx card, but you can force it to stay enabled if you want to use it together with Virtu (on Windows). Intel even sells chips with the GPU part disabled, not yet for IVB, but for SNB, like:



                    It's just a matter of time; eventually Intel has got too many IVB CPUs without a working GPU and wants to sell those too.
                    If I'm not mistaken, aren't Intel's GPUs just a 2nd chip on the same CPU package, while AMD's have both as part of the same chip? Hence why the APUs have their own CPU socket?

                    As for the non-IGP parts, that's not what the OEMs want; they want just the single chips so they can cut costs, and they don't want to have to install a GPU as well.
                    Last edited by Kivada; 27 April 2012, 04:46 PM.

                    Comment


                    • #30
                      Often you get PCI-E cards bundled together with CPUs that have an integrated GPU. Not because of much higher speed (as they mainly use low-cost cards, that can't be the real reason), but just to put something fancy like "2GB VRAM" (for the extra-slow chip) or similar onto the flyer.

                      Comment
