
Rich Geldreich On The Concerns Of Open-Sourcing In The Game Industry


  • #41
    Originally posted by Sonadow View Post
    This whole 'open source is more secure' thing is really complete bull at this point in time. Even with large names like Oracle, Red Hat, Apple and Canonical involved, nobody caught the Heartbleed and Shellshock vulnerabilities until close to a decade later.
    I am not saying that open-source developers are perfect (far from it, and I believe things are getting worse), but it would almost certainly have been caught a lot later had all the code been closed source. Plus, the OpenBSD devs had been complaining about the buggy mess that is OpenSSL for many years before Heartbleed (suggesting that a lot of peer review was already taking place). One issue was that the OpenSSL guys were not accepting patches readily enough, but another was that the codebase was simply full of archaic platform and protocol support that needed cleaning up.

    Originally posted by Sonadow View Post
    Windows 10 (32-bit) is still binary compatible with most software from Win98
    Heh, that is probably because Windows itself has barely changed since Windows NT 4.0 (other than the obvious regressions). That aside, 32-bit is getting quite hard to run these days; many hardware vendors only ship 64-bit Windows drivers. The days of Windows backwards compatibility are quickly fading.
    But what I was really getting at is that games like Quake 1, 2 and 3 have all been ported to "modern platforms" such as the web (via Emscripten), Android (via the NDK), Linux, BSD, heck, even weird sh*t such as printers. My point is that this hasn't been done for e.g. Half-Life, Thief and other closed-source games from the 90s / early 2000s... because it can't be. You need emulation or virtualization, and things like Hyper-V, VirtualBox etc. are all starting to drop support for those older operating systems too.
    Also, if this old software you effectively have to run in emulation has a security issue (such as Heartbleed), you cannot fix the darn thing unless you can find a leaked copy of the source on torrents.

    Originally posted by Sonadow View Post
    they don't even survive a few glibc or libstdc++ minor version updates, and older unmaintained code often cannot even build on newer libraries even with complete source access.
    This I do actually agree with. If you cannot build the source, it is almost worthless. And yes, game developers often produce some of the worst code and build systems I have ever seen. But with a large community effort, it can be modernized and built. With a closed-source binary, yes, you can patch out the DRM, patch in a few Sleep() calls or even inject DLLs to support a few new features, but that is pretty much it.

    As for glibc and libstdc++... that is more a case of Linux being sh*t than an issue with the open-source ethos. For example, the BSDs suffer from this less, and traditional UNIXes such as Solaris can still run extremely ancient binaries without a problem.
    Last edited by kpedersen; 01 December 2017, 06:31 AM.



    • #42
      He's a greedy bastard: it's kind of ironic, because his name is "Rich".

      Originally posted by RealNC View Post
      Name checks out :P

      "Rich Geldreich"

      "Geldreich" is German for "Moneyrich." Not a soundalike or anything. It literally means that. I'm not making it up.
      Hahaha. That's exactly what I thought. His name seems like something out of an old comic book, totally obvious...
      Last edited by timofonic; 01 December 2017, 07:57 PM.



      • #43
        Hrrrmph, when someone puts something under a certain license, people could, would and have every legal right to take it for granted, as long as they use it in accordance with that license. Just because the author HAS granted those rights, dammit. So when someone dumps code under a permissive license, they HAVE to expect that it could and would be used in every possible way, and that most entities won't even bother contributing or paying anything back. Just because the license allows it to be this way, and it happens to be the most convenient way of doing things for most companies around. So one could end up doing a busload of unpaid work for greedy asshat companies. Which isn't the best experience ever, obviously. Well, maybe that's why there are the GPL and LGPL, after all. These at least encourage giving improvements back. The author can also relax the GPL restrictions for a particular commercial entity if they've paid some $$$, though that requires some attention to detail, like a CLA.
        Last edited by SystemCrasher; 01 December 2017, 09:57 PM.



        • #44
          Originally posted by SystemCrasher View Post
          Hrrrmph, when someone puts something under a certain license, people could, would and have every legal right to take it for granted, as long as they use it in accordance with that license. Just because the author HAS granted those rights, dammit. So when someone dumps code under a permissive license, they HAVE to expect that it could and would be used in every possible way, and that most entities won't even bother contributing or paying anything back. Just because the license allows it to be this way, and it happens to be the most convenient way of doing things for most companies around. So one could end up doing a busload of unpaid work for greedy asshat companies. Which isn't the best experience ever, obviously. Well, maybe that's why there are the GPL and LGPL, after all. These at least encourage giving improvements back. The author can also relax the GPL restrictions for a particular commercial entity if they've paid some $$$, though that requires some attention to detail, like a CLA.
          Well, I consider CLAs a big issue too. Look at what happened to CUPS, now owned by Apple.

          But I agree he should have used a GPL or LGPL license instead of crying about it later...



          • #45
            Originally posted by andrebrait View Post
            I don't think that's viable even in ideal LAN conditions, latency-wise, let alone over the Internet. That would be like what Steam's In-Home Streaming does, and that doesn't work as well as running natively for me, even over a GbE setup.

            Plus, people here kinda pointed me to how it's usually done and I'll read about it later.

            EDIT: This would be thin-client-like stuff, but yeah, it wouldn't work unless you're playing at low resolution and the game isn't sensitive to latency.

            Also, remember the server would be rendering stuff for everyone. Nope, not viable for someone who doesn't have the financial resources of Google, Facebook, NVIDIA and the like.
            Well, from what I see people saying, it's basically what I said, with obvious optimizations (I just pointed out the basic idea; there's usually a lot of optimization on top of that to get optimal results):
            • If players are within a certain radius of each other, then their info (position, speed, acceleration, bearing) is passed along, so the client can 'guess' what's going to happen and render accordingly
            • Instead of rendering the whole thing, the server only simulates (rather than actually rendering) the positions, bearings and little more
            This way a lot of cheating is reduced without a big impact (a rough sketch of the idea follows below).
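
            To make that concrete, here is a minimal, hypothetical C++ sketch of radius-based interest management, assuming made-up names (Player, broadcast_tick) and a plain printf standing in for a real network send: the server keeps simulating everyone, but each tick it only tells a client about players inside a chosen radius, and it ships velocity along with position so the client can extrapolate ("guess") between updates. A real engine would layer delta compression, client-side prediction and lag compensation on top of this.

            ```cpp
            // Hypothetical sketch of radius-based interest management ("area of interest").
            // All names are illustrative; printf() stands in for a real network send.
            #include <cstdio>
            #include <vector>

            struct Player {
                int   id;
                float x, y;    // position (server-simulated, never server-rendered)
                float vx, vy;  // velocity, so the client can extrapolate between updates
            };

            // Each tick, send a client only the players inside its interest radius.
            void broadcast_tick(const std::vector<Player>& players, float radius)
            {
                const float r2 = radius * radius;
                for (const Player& receiver : players) {
                    for (const Player& other : players) {
                        if (other.id == receiver.id) continue;
                        const float dx = other.x - receiver.x;
                        const float dy = other.y - receiver.y;
                        if (dx * dx + dy * dy <= r2) {
                            std::printf("to %d: player %d at (%.1f, %.1f) vel (%.1f, %.1f)\n",
                                        receiver.id, other.id, other.x, other.y, other.vx, other.vy);
                        }
                    }
                }
            }

            int main()
            {
                std::vector<Player> players = {
                    {1,   0.0f,   0.0f, 1.0f, 0.0f},
                    {2,  50.0f,  10.0f, 0.0f, 0.5f},
                    {3, 900.0f, 900.0f, 0.0f, 0.0f},  // far away: filtered out for clients 1 and 2
                };
                broadcast_tick(players, 100.0f);      // 100 m interest radius
            }
            ```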



            • #46
              Originally posted by nomadewolf View Post

              Well, from what I see people saying, it's basically what I said, with obvious optimizations (I just pointed out the basic idea; there's usually a lot of optimization on top of that to get optimal results):
              • If players are within a certain radius of each other, then their info (position, speed, acceleration, bearing) is passed along, so the client can 'guess' what's going to happen and render accordingly
              • Instead of rendering the whole thing, the server only simulates (rather than actually rendering) the positions, bearings and little more
              This way a lot of cheating is reduced without a big impact.
              Yes, in which case the client would be doing the rendering, not the server. The rest is OK, but you added server-side rendering to the mix, and that's not viable AFAIK. Without rendering, your idea and theirs probably work and are probably what some engines already implement.



              • #47
                Originally posted by nomadewolf View Post

                Well, from what I see people saying, it's basically what I said, with obvious optimizations (I just pointed out the basic idea; there's usually a lot of optimization on top of that to get optimal results):
                • If players are within a certain radius of each other, then their info (position, speed, acceleration, bearing) is passed along, so the client can 'guess' what's going to happen and render accordingly
                • Instead of rendering the whole thing, the server only simulates (rather than actually rendering) the positions, bearings and little more
                This way a lot of cheating is reduced without a big impact.
                That 'certain radius' could be the size of the whole map the players are on, which would render the 'certain radius' approach moot and just add code overhead.
                Imagine 64 players on a 2x2 km map. Most sniper rifles IRL can throw a bullet twice that far, though 2 km is about the limit for aimed hits. The ancient Mosin M1891 or Kar98k have a lethal range of ~5 km. The longest distance between two points on a 2x2 km square would be its ~2.83 km diagonal.
                It would work for close-quarters combat in and around a cityscape, but not really on huge open maps. It certainly would not work in a game with in-game artillery, unless shells move like slugs and gravity is several times stronger than normal. Nor would it work against the concept of indirect fire: the shooter does not always have to literally see the enemy. He could simply fire at a set of coordinates read to him by forward observers over in-game VoIP. Games with such mechanics exist and are played.
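
                For reference, that diagonal figure is just Pythagoras on the 2x2 km square:

                $$ d = \sqrt{(2\,\text{km})^2 + (2\,\text{km})^2} = 2\sqrt{2}\,\text{km} \approx 2.83\,\text{km} $$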



                • #48
                  Originally posted by timofonic View Post
                  Well, I consider CLAs a big issue too. Look at what happened to CUPS, now owned by Apple.
                  Sure thing. But in this case things would at least be a bit more fair and balanced for the original author, right? Though it puts contributors at a disadvantage, as Apple has shown us.

                  But I agree he should have used a GPL or LGPL license instead of crying about it later...
                  Well, it at least attempts to encourage cooperation, while BSD/MIT/Apache are of little help in this regard. So they only make sense when authors are damn sure they can handle it alone and do not care about contributions. If that isn't the case, BSD/MIT/Apache aren't a great choice, are they? C'mon, if one grants rights in a license, it is too late to regret it later, so it makes sense to think it through beforehand. BSD/MIT/Apache could make sense if, e.g., promoting a certain data format is the topmost priority, but only if widespread use of that format outweighs the other concerns by a good margin. Looking at Rich's speech, it seems that wasn't the case.



                  • #49
                    Originally posted by aht0 View Post

                    That 'certain radius' could be the size of the whole map the players are on, which would render the 'certain radius' approach moot and just add code overhead.
                    Imagine 64 players on a 2x2 km map. Most sniper rifles IRL can throw a bullet twice that far, though 2 km is about the limit for aimed hits. The ancient Mosin M1891 or Kar98k have a lethal range of ~5 km. The longest distance between two points on a 2x2 km square would be its ~2.83 km diagonal.
                    It would work for close-quarters combat in and around a cityscape, but not really on huge open maps. It certainly would not work in a game with in-game artillery, unless shells move like slugs and gravity is several times stronger than normal. Nor would it work against the concept of indirect fire: the shooter does not always have to literally see the enemy. He could simply fire at a set of coordinates read to him by forward observers over in-game VoIP. Games with such mechanics exist and are played.
                    Like I said... it's a matter of optimization...
                    So objects like bullets, shells and the like get exceptions and are always passed along...
                    Someone uses a cheat to grab positions from those points of origin? Since the server already knows with fair certainty where a shot is going the moment it is fired, it can choose to pass that information only to the clients that will actually be affected by it.

                    It's a game of cat and mouse...
                    Cheaters will always come up with new ideas, but there's always a counter-solution.
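
                    A minimal, hypothetical sketch of that shot-relevance idea, assuming the server resolves the shot itself: it projects the bullet's path and relays the "shot fired" event only to clients close enough to that path to be affected. The names (Client, relay_shot, the relevance distance) are illustrative rather than taken from any real engine, and a real game would also sweep the path against geometry and apply lag compensation.

                    ```cpp
                    // Hypothetical sketch: relay a shot only to clients near its projected path.
                    #include <algorithm>
                    #include <cmath>
                    #include <cstdio>
                    #include <vector>

                    struct Vec2   { float x, y; };
                    struct Client { int id; Vec2 pos; };

                    // Distance from point p to the segment a->b (the projected bullet path).
                    static float dist_to_segment(Vec2 p, Vec2 a, Vec2 b)
                    {
                        const float abx = b.x - a.x, aby = b.y - a.y;
                        const float apx = p.x - a.x, apy = p.y - a.y;
                        const float len2 = abx * abx + aby * aby;
                        float t = (len2 > 0.0f) ? (apx * abx + apy * aby) / len2 : 0.0f;
                        t = std::clamp(t, 0.0f, 1.0f);
                        const float cx = a.x + t * abx - p.x;
                        const float cy = a.y + t * aby - p.y;
                        return std::sqrt(cx * cx + cy * cy);
                    }

                    // The server already knows where the shot is going, so it can restrict
                    // the "shot fired" event to clients the shot could plausibly affect.
                    void relay_shot(const std::vector<Client>& clients,
                                    Vec2 muzzle, Vec2 impact, float relevance)
                    {
                        for (const Client& c : clients) {
                            if (dist_to_segment(c.pos, muzzle, impact) <= relevance) {
                                std::printf("notify client %d of shot (%.0f, %.0f) -> (%.0f, %.0f)\n",
                                            c.id, muzzle.x, muzzle.y, impact.x, impact.y);
                            }
                        }
                    }

                    int main()
                    {
                        std::vector<Client> clients = {
                            {1, {10.0f, 5.0f}}, {2, {400.0f, 400.0f}}, {3, {95.0f, 2.0f}},
                        };
                        // Only clients 1 and 3 sit near the 100 m path, so only they are notified.
                        relay_shot(clients, {0.0f, 0.0f}, {100.0f, 0.0f}, 25.0f);
                    }
                    ```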



                    • #50
                      Originally posted by aht0 View Post
                      Most sniper rifles IRL can throw a bullet twice that far, though 2 km is about the limit for aimed hits.
                      A little nitpick: sniper rifles require a pretty well-trained sniper operating them to hit anything beyond 1 km.

                      He could simply fire at a set of coordinates read to him by forward observers over in-game VoIP. Games with such mechanics exist and are played.
                      Arma (I, II and III) is the only game series I can think of where you find this level of coordination.

