
OnLive - Why Linux Gamers Should Take Notice


  • #31
But when I ping any of the sites local to me, the latency is in the 10-20 ms range... Some of you seem to think that OnLive is suggesting their service will work regardless of server location, which is not true; at the demo they said they tried playing from Australia and couldn't because of lag.

    Provided they've really made a low-latency codec (and, with much fanfare, they say they have), the only problem left is the dynamic latency...

    It's quite possible for them to strike deals with ISPs to guard against that; a bit of money goes a long way.

    I'm just saying it's not theoretically impossible; personally, I'd consider a delay of up to 100 ms playable.



    • #32
From what I hear, the delay at GDC was 1 ms. Impossible for a human brain to notice. If it's already doing that well, then even if lag pushes it into the 10-20 ms range I'd still be able to play comfortably. Even if it got up to 100 ms I think I'd still be fine, and I'm sure it'd take a lot of lag to get that high from 1 ms.

      I don't know, they may just have pulled that number out of their ass though. But if it's true, it is amazing and this would definitely be doable.
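      For the sake of argument, here's a back-of-the-envelope latency budget. A minimal sketch: only the 1 ms encode figure comes from the GDC claim above; every other number is an illustrative guess of mine, not OnLive's.

```python
# Back-of-the-envelope latency budget for streamed gaming.
# Only the 1 ms encode figure comes from the GDC claim above;
# the other figures are illustrative guesses.

budget_ms = {
    "capture + encode": 1.0,     # the ~1 ms claimed at GDC
    "network round trip": 20.0,  # a typical ping to a nearby server
    "decode + display": 5.0,     # client-side work
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:>20}: {ms:5.1f} ms")
print(f"{'total':>20}: {total:5.1f} ms (playable if well under ~100 ms)")
```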



      • #33
Nope. Don't compare apples with oranges. On a LAN you have at best one switch in between. A switch is fast; it just duplicates the data packet on the appropriate port. On a WAN, though, you have routers all over the place. Each router processes data packets, which adds delay. Striking deals with ISPs doesn't help much, since their routers are not the only players in this game. Latency builds up at various places, so this is utopian. As mentioned above, they even said themselves it didn't work around the globe, and if they can't do that, which IS the heart of their idea, then it's just bull from A to Z.
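        To put rough numbers on the switch-versus-router point, here's a sketch; every per-hop figure is made up for illustration, since real delays depend heavily on load:

```python
# Compare one-way delay on a LAN (one switch) vs a WAN path (many routers).
# All per-hop figures are made up for illustration; real delays vary with load.

SWITCH_US = 10              # a LAN switch forwards in ~microseconds
ROUTER_US = 500             # a loaded router can take ~0.5 ms to queue + forward
PROPAGATION_US_PER_KM = 5   # light in fibre travels roughly 5 us per km

def one_way_delay_us(hops, per_hop_us, distance_km):
    return hops * per_hop_us + distance_km * PROPAGATION_US_PER_KM

lan = one_way_delay_us(hops=1, per_hop_us=SWITCH_US, distance_km=0.1)
wan = one_way_delay_us(hops=15, per_hop_us=ROUTER_US, distance_km=500)

print(f"LAN: {lan / 1000:.2f} ms   WAN: {wan / 1000:.2f} ms")
```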

        Besides, the resolution is crappy, and with heavy compression come heavy artifacts. If you compress lossily (and they have to if they want to claim such a thing), then you get visible degradation of quality, and if you compress even more heavily than the best video compressors on the market, which already yield not-too-blistering quality with small videos, then their quality is going to be more shitty than shit.

        Besides, why pay for a broken system where you are cut off from playing your games whenever the Internet is overloaded again, or your ISP line (which is shared, by the way, unless you pay BIG bucks) is crammed full? Or you want to play on a laptop? Always hunting for a fat wired pipe, since WiFi obviously isn't usable for that kind of thing at all? God, the problems and restrictions are so numerous, how can any sane person even remotely consider using that trash?



        • #34
          Originally posted by Dragonlord View Post
          As mentioned above, they even said themselves it didn't work around the globe, and if they can't do that, which IS the heart of their idea
          That's not their idea at all; the servers are supposed to be local, so the link quality is more under your ISP's control and generally much better.

          I'm getting a consistent 20 ms pinging a local site, and that's over WLAN (which undoubtedly adds some lag).



          • #35
            Originally posted by igor View Post
            That's not their idea at all, the servers are supposed to be local, and therefore link quality more under your ISP's control and generally much better.

            I'm getting consistent 20 ms pinging a local site, and that's over wlan (which undoubtedly adds some lag)
            Heh... I'm more than familiar with "local" and how painful doing that will be for them.

            For them to have the "locality" you're describing, they'll have to sit on every ISP or sit on the backbone. I know of a company that tried this sort of play with content delivery. They experienced the LARGEST flameout of the dot-com boom, right at the tail end of it: 85 million US gone in less than one year's time. The name of the company: epicRealm. I did part of the software that resided in the 50 data centers worldwide. It really did work; the burn rate doing it, though, was hideous, as high as 3 million US per month in data access and server operations. And that doesn't get into what this bunch will need to make it really, really work.

            When I say it's unlikely, there's a reason: I've been in that space repeatedly, and it's not cheap. Typically, unless you've got really, really deep pockets (IBM deep or GE deep...), you can't afford to play in the first place.



            • #36
              They've made standard rack-mounted systems, if that helps anything.

              It's really not as centralised a platform as you dudes perceive it; you should check out the videos where they speak about it. They claim both hardware and software advances in order to achieve this kind of latency.

              I believe the public beta starts soon, so we'll just have to wait and see.



              • #37
                Originally posted by igor View Post
                They claim both hardware and software advances in order to achieve this kind of latency.
                Claims are one thing.

                Delivery's another. Unless they've got the sort of money epicRealm had in hand, they're not going to last past the beta.

                Everyone keen on this keeps missing the math.

                It's a brutal reality, and software and hardware "advances" do NOT offset the way the connected Internet actually works.

                1 Gbit links will only be able to handle about 500 people, tops.

                Keeping it local means you're going to have to handle thousands. 10 Gbit links will give you about 4000 or so. (Most gear can't COPE with full-on 10 Gbit, not to mention that packet overheads will trim about 1/4 off the peak "speed" of the links... we're still struggling with how best to do packet analysis at near wire speed in the 10 Gbit realm at Tektronix, my current day job.) That's 4000 or so for a major metropolitan area's main ISP. So you could probably serve MAYBE 12k people within a town the size of Dallas, TX. Seriously.
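                The arithmetic behind those figures is easy to reproduce. A quick sketch, assuming roughly 2 Mbit/s per stream (which is what "500 players on a 1 Gbit link" implies); the 25% overhead figure is the one from the post above:

```python
# Reproduce the players-per-link arithmetic above.
# Assumes ~2 Mbit/s per video stream, which is what "500 players
# on a 1 Gbit link" implies; the 25% overhead figure is from the post.

STREAM_MBIT = 2.0

def players_per_link(link_gbit, overhead=0.0):
    usable_mbit = link_gbit * 1000 * (1 - overhead)
    return int(usable_mbit / STREAM_MBIT)

print(players_per_link(1))                  # -> 500
print(players_per_link(10, overhead=0.25))  # -> 3750, i.e. "about 4000"
```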

                Realistic? Not on your life.



                • #38
                  This will go nowhere.
                  a) Latency
                  b) Latency
                  c) Bandwidth
                  d) "Cloud" is just a buzzword.
                  e) Remember the disaster called "thin clients" in the '90s? Same stuff.
                  f) Remember "smart terminals" in the '70s and '80s? Same stuff.

                  This is made of failure, designed to rip off investors.



                  • #39
                    I don't have much to say here, since you've all already made my points.

                    It seems like quite an ambitious project; however, there is one thing that will perhaps be its undoing:
                    Networking.
                    Think about where they demoed it: at GDC! Which means this was done in a hall with a very fast dedicated connection to the Internet. Even then, it was basically an in-house demo of the product, which most likely means GIGABIT LAN speeds.
                    Years ago I tried streaming games over TightVNC using 8-bit color and Hextile encoding (by trial and error, that was the fastest setup I came up with), and I got four boxes that were never in sync and updated every second or more. Now imagine that over a link at least ten times slower.
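                    That slideshow behaviour is easy to explain with raw frame sizes. A rough sketch; the 1024x768 desktop and the 10 Mbit/s link are my guesses about that old setup, not figures from the post:

```python
# Why uncompressed(-ish) screen streaming turns into a slideshow.
# The resolution and link speed are guesses about the old TightVNC setup.

WIDTH, HEIGHT = 1024, 768
BITS_PER_PIXEL = 8      # 8-bit colour, as in the post
LINK_MBIT = 10.0        # assume a 10 Mbit/s LAN of the era

frame_bits = WIDTH * HEIGHT * BITS_PER_PIXEL
fps = LINK_MBIT * 1_000_000 / frame_bits
print(f"raw frame: {frame_bits / 8 / 1024:.0f} KiB -> {fps:.1f} fps uncompressed")
```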
                    The very logistics of it seem to boggle the mind. I would have loved to have been there (and I'd get an excused absence (even credit!) from school for going) to see how much these demoers actually knew about the product they're selling. They'd better have had a lead tech there!

                    EDIT: Disregard my foresight; it seems to betray me.
                    I foresee this as probably a separate dedicated server and a separate client on the same network (Slingbox, anyone?). The server does all the legwork and streams to the client; that's the only way they could combat all the lag the Internet produces. At that point, you might as well save a few hundred bucks and build your own box.

                    I do see the need for them to get industry support; after all, what good is a product with no titles at launch?
                    I, for one, will hold back on developing for this until it's proven and reliable.

                    ... I guess I had more to say than I thought.

                    EDIT: Okay, I hadn't really looked at the site all the way through before writing; now I have. I just don't know how this is really going to work over the Internet; games are all about instant command and reaction. Ever try to shoot someone in Quake 3 on a laggy network?

                    EDIT 2: God, can you imagine the bandwidth ISPs would have to push if this service became popular? I shudder to think of even 20 people in one area doing this simultaneously, or 2+ people simultaneously on the same connection.
                    Last edited by me262; 27 March 2009, 06:19 PM. Reason: Post site lookover.



                    • #40
                      Originally posted by Kevin View Post
                      This is a huge deal and could mean that Linux gamers finally step on Microsoft's face and finally get the gaming fix they have waited so long for.
                      By that logic, I have been playing Burnout on Linux for ages: my PS3 is plugged into a TV card and I play on Linux using tvtime!

