GNOME Prompt Becomes Ptyxis


  • #51
    Originally posted by DumbFsck View Post

    I call bullshit. Cite the research.

    >inb4 it is actually a 2ms delta on some arbitrarily large number. Not the difference in latency from 2 to 4ms, or 2 to 3.
    OT I know, but I just wanted to say yup, this ^^^



    • #52
      Originally posted by DumbFsck View Post

      I call bullshit. Cite the research.

      >inb4 it is actually a 2ms delta on some arbitrarily large number. Not the difference in latency from 2 to 4ms, or 2 to 3.
      Very well:


      It's not as crazy as it sounds. A simple 60Hz screen refreshes every 16-17ms, and we can detect each frame change at that rate. Studies show that we can tell the difference in smoothness past 144Hz, which refreshes every ~7ms. They've tested up to (or past?) 200Hz, which is still noticeably different (though much less obvious than 60 >> 144), and that is only 5ms per frame.
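
      To sanity-check those numbers: frame time is just the reciprocal of the refresh rate. A quick sketch (plain Python, nothing assumed beyond the arithmetic):

      # Frame time in milliseconds for a given refresh rate: 1000 / Hz.
      for hz in (60, 144, 200):
          print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
      # 60 Hz -> 16.7 ms | 144 Hz -> 6.9 ms | 200 Hz -> 5.0 ms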

      Humans notice things at that low latency mostly when they are actively interacting with elements on a screen. It has something to do with the brain's expectation of "input latency" being basically 0 in the real world.



      • #53
        Originally posted by Okki View Post

        You must be the only one to run all your graphical applications from a terminal all day long... You'll have to explain to me the point of using a desktop environment (especially GNOME) in this case 🤔

        Do you also run your terminal from a terminal? 😂
        The software I use often consists of customized versions that are not available from the UI... that happens a lot with advanced Linux use.
        And on top of that, I use lots of command-line tools, some of them written by myself.
        So, yes, I basically use the DE only for managing windows and virtual desktops, and that's it.
        And when looking for a tool, I need to be able to find it just by guessing its name...



        • #54
          Originally posted by You- View Post

          OK now I have a headache.
          Don't gnash your teeth and writhe in pain! Perhaps you need a psychiatrist from Ctesiphon riding a bicycle with pneumatic tyres to visit and wrangle the knots - just hope they're knowledgeable - a Czar of the psyche - and not a pseudo or a psycho. You might need to knuckle down and wrestle with the problem, perhaps learn some mnemonics to remember the techniques. It'll take time: for that infamous 50-minute hour, watch the shadow of the gnomon move around the sun-dial, which, if you are fortunate, will be richly decorated with knurls and curlicues from careful knife-work, perhaps with pictures of pterosaurs like Pteranodon and Pterodactylus; and Cnidarians. Of course, to be honest, the visiting professional might need to stay in an hotel, so you would need to be heir to a fortune to pay for all of this - maybe you have access to a stock of bdellium?

          I wonder if a psephologist will be needed to count the likes? Pshaw! - unlikely.



          • #55
            Originally posted by User29 View Post
            Can anyone tell me, why is GPU acceleration a thing? Why is it needed? IMHO writing to a terminal is slow as hell anyway, so who cares?
            It is not slow.

            Linux has an abundance of excellent terminal applications. Interestingly, I could not find any decent comparison of their text display performance. Since I use the command line a lot, I want text output that is as fast as possible. When you compile a large project, you don't want the console output to be the limiting factor. Test system: Ubuntu 7.04, GNOME, an ATI Radeon Mobile 9600 with the fglrx driver, and a 1.5 GHz Pentium M. The benchmark: I took it upon myself to do a comprehensive comparison of...


            Actually, back in the 2000s a lot of work was invested in making terminals as fast as possible. And now again, with further enhancements and hardware acceleration. This affects scrolling speed in less, Vim, and the reaction time of TUIs. And compile times - you cannot print the unit-test output while your terminal still needs to display the configure step.
            How fast a log is displayed (think less in follow mode, or tail -f) can affect the performance of other parts of an application, in the same way your FPS can affect the engine speed in even modern games. Decoupling output from processing is sometimes desired and sometimes avoided: you cannot ask for input until the output with the instructions is done.

            I once implemented a clone of Pong on Windows in CMD. What I didn't expect was the abysmal performance. I had to invest extra work to keep a playable framerate.


            TLDR
            The -s or -v switches exist not just to avoid information overload; they are often needed to gain performance, because the terminal becomes a bottleneck.
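
            If you want to see the bottleneck yourself, here is a minimal sketch (Python, assuming a POSIX /dev/null; absolute numbers will vary a lot between terminals):

            import sys, time

            def bench(stream, n=100_000):
                # Write n compiler-style lines and time how long it takes.
                start = time.perf_counter()
                for i in range(n):
                    print(f"line {i}: some compiler-style output", file=stream)
                stream.flush()
                return time.perf_counter() - start

            with open("/dev/null", "w") as null:
                quiet = bench(null)      # pure formatting/IO cost
            loud = bench(sys.stdout)     # same work, but the terminal has to render it
            print(f"terminal: {loud:.2f}s vs /dev/null: {quiet:.2f}s", file=sys.stderr)

            The gap between the two timings is time spent purely on terminal rendering and scrolling - exactly what silent switches avoid.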

            // edit
            For the nerds: NOTCURSES (C++) should be the state-of-the-art benchmark for terminal performance and capabilities:
            Last week (2021-05-09) I cut Notcurses 2.3.0, containing most of the work necessary for Notcurses 3.0.0. In particular, Notcurses now supports portable (Sixe...


            Last edited by hsci; 01 March 2024, 06:57 AM.



            • #56
              Originally posted by dwagner View Post
              Do you really believe it takes a contemporary CPU more time to render typed characters than it takes to wait for the next frame to be sent to the monitor? Highly unlikely, even if a 120Hz display "only" takes ~8ms until the next frame is displayed.


              Unless, of course, the GPU spends considerable power "waking up" or "clocking up" upon every key press, while even the lowest-powered CPU core (which registered the key-press event anyway) was already "awake" for a moment and took only microseconds to write a few bytes into a frame buffer.
              There are no frame buffers anymore. You should really stop living in the 90s and read up on how modern computers work.

              The entire GNOME desktop is composited, so the GPU is always "in the loop" of rendering each frame. And if you take this into account, then it should be obvious that rendering glyphs directly on the GPU (thus only sending a command over the bus) will be faster and more efficient than rendering them on the CPU and then sending the entire texture to the same GPU over the bus.
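
              Rough, illustrative numbers behind that claim (the texture size is real for a 1080p frame; the per-glyph command size is an assumption for the sketch):

              # Illustrative bus-traffic comparison for one fully redrawn 1080p frame.
              width, height, bytes_per_pixel = 1920, 1080, 4
              full_texture = width * height * bytes_per_pixel   # CPU renders, then uploads the whole frame
              glyphs, cmd_bytes = 2000, 32                      # assumed: ~32 bytes of draw command per glyph
              gpu_commands = glyphs * cmd_bytes
              print(f"full texture upload: {full_texture / 1e6:.1f} MB")  # ~8.3 MB
              print(f"GPU draw commands:   {gpu_commands / 1e3:.1f} KB")  # ~64 KB

              Even if the real command stream is larger than assumed here, it stays orders of magnitude below a full-frame upload.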
              Last edited by intelfx; 01 March 2024, 06:57 AM.



              • #57
                I feel like there is a new terminal app being released every month now...



                • #58
                  I agree it isn't as crazy as it sounds. But, and there's a big but, the research you showed does not argue in favour of what you said.

                  The research you linked (and the link was broken for me, but I believe it is this one) shows that for tracing a line with a stylus directly on screen the median threshold was much higher, i.e. 7ms, and previous research by Ng even argued that for "buttons" on a touch screen, 40ms seemed to be the lowest perceivable amount, with people not being able to tell between 40 and 7 (which we know isn't true).

                  Your first comment was about rendering characters on a screen, and I couldn't find any research from this team (I searched submissions from Albert Ng and Jota independently) about that topic. Although their setup for this specific experiment was sick af, I also couldn't find any other publications using it, which I believe is untapped potential.


                  As a side note, I somehow believe you didn't read the research. Not only because of the broken link, as that could be an issue on my end, but also because it did not reach the same conclusions as you did, and your later comments are also not applicable. If you just googled "2ms jnd" and sent the first link without even clicking it, shame on you. I am very much in favour of open research etc., but I won't link the PDF as that might be against some law or even a Phoronix rule - but there is a friendly crow that can help people read the research for free.


                  Now back to the conversation. I can reliably tell the difference between 200 and 360Hz, 10/10 times. And back in my Windows days (on a 60Hz screen) there was some Win32 program that blacked out your screen, with the right side responding to your input after one arbitrary delay and the left side after another; it randomised these delays and asked you to press the arrow key corresponding to the faster side. We couldn't type, only move the pointer. I couldn't tell the difference between 14 and 15ms, but I could between 16 and 18. These numbers are suspiciously close to the frame time of a 60Hz screen, so I wish I still had the binary to test again - it was 10 years ago, but maybe I'll search my backups later. I do remember that I sent it to a couple of friends, both Global in CSGO MM and owners of 144Hz screens at the time. One of them couldn't get lower than 20ms lol; the other was more similar to me. So maybe it is something in the software that bottlenecked us.
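
                  For anyone who wants to recreate that kind of test, here is a rough sketch of the forced-choice procedure (Python; the delays, trial count and terminal "flash" are stand-ins - a terminal can't reproduce the original program's display timing, this only shows the protocol):

                  import random, time

                  def flash(delay_ms):
                      # Stand-in for the original program's per-side response delay.
                      time.sleep(delay_ms / 1000)
                      print("*", flush=True)

                  def trial(fast_ms, slow_ms):
                      # Randomly assign the two delays to the two sides.
                      delays = dict(zip(random.sample(["left", "right"], 2), (fast_ms, slow_ms)))
                      for side in ("left", "right"):
                          input(f"press Enter to test the {side} side")
                          flash(delays[side])
                      answer = input("which side felt faster? (left/right) ").strip()
                      return answer == min(delays, key=delays.get)

                  hits = sum(trial(14, 18) for _ in range(10))
                  print(f"{hits}/10 correct")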


                  Now, again, if you do have research on whether we can perceive differences in a character being rendered on screen, as you said you had, I'm looking forward to reading it. This one was interesting, though unrelated, so I bet the correct one will be even more so.

                  But for now, people noticing the dragging of a large box with a stylus (perceiving differences down to 2ms ± 4ms), dragging small boxes (4ms ± 6ms) and tracing a line (>7ms) isn't good enough.



                  • #59
                    Originally posted by M@GOid View Post
                    Another option would have been Gprompt, Gterminal or Gonsole. A less boring and killer one could have been G-Spot.

                    /S
                    Gonsole is a good name. Gprompt or Gterminal are terrible.



                    • #60
                      Originally posted by jacob View Post

                      Gonsole is a good name. Gprompt or Gterminal are terrible.
                      ...or, if it has to be a hard-to-google reuse of a common word, how about Germinate?

