Thread: A New Internet Draft Of HTTP 2.0 Published

  1. #1
    Join Date
    Jan 2007
    Posts
    15,126

    Default A New Internet Draft Of HTTP 2.0 Published

    Phoronix: A New Internet Draft Of HTTP 2.0 Published

    The Hypertext Transfer Protocol Bis Working Group of the IETF has published a new Internet draft of the HTTP 2.0 protocol...

    http://www.phoronix.com/vr.php?view=MTQwNjU

  2. #2
    Join Date
    Sep 2011
    Posts
    704

    Default

    So is this based on Google SPDY?

  3. #3
    Join Date
    Sep 2008
    Posts
    267

    Default

    It also introduces unsolicited push of representations from servers to clients.
    That's like heaven for black hats and script kiddies around the world, including but not limited to organisations like the NSA.

    In other words: how retarded are the people who design something like that? A 5-year-old knows that is just wrong.

    Also, I really, really hope that someone manages to get the people responsible for this onto a data-limited plan and sets their start page to a website that transmits a few gibibytes per page request...
    Last edited by Detructor; 07-09-2013 at 12:59 PM.

  4. #4
    Join Date
    Jan 2012
    Posts
    186

    Default

    Quote Originally Posted by Detructor View Post
    That's like heaven for black hats and script kiddies around the world, including but not limited to organisations like the NSA.

    In other words: how retarded are the people who design something like that? A 5-year-old knows that is just wrong.

    Also, I really, really hope that someone manages to get the people responsible for this onto a data-limited plan and sets their start page to a website that transmits a few gibibytes per page request...
    Would you like to explain further what this is and why it's a problem, or at least point me in the right direction to read up on it?

  5. #5
    Join Date
    Jan 2012
    Posts
    65

    Default

    Quote Originally Posted by Detructor View Post
    That's like heaven for black hats and script kiddies around the world, including but not limited to organisations like the NSA.

    In other words: how retarded are the people who design something like that? A 5-year-old knows that is just wrong.

    Also, I really, really hope that someone manages to get the people responsible for this onto a data-limited plan and sets their start page to a website that transmits a few gibibytes per page request...
    If your web client does not want the data from the server, it doesn't have to look at it. It's no more dangerous than the responses to requests it actually made.

    As for extra data blowing bandwidth caps... what is the difference between normal HTTP/1.1 and this? Do you review and approve each URL request before it transmits? Since Firebug shows most sites making 20-30 requests, I doubt that you do. Any current website using HTTP/1.1 could already load gigabytes of images or JavaScript data and most people would never realize it was happening.
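
    A client can also refuse push outright by advertising SETTINGS_ENABLE_PUSH = 0 in its SETTINGS frame (that is the name the setting carries in later revisions of the spec; I haven't checked that this exact draft already defines it). A rough sketch using the Python h2 library, so the identifiers below are that library's, not necessarily this draft's:

    Code:
    import h2.config
    import h2.connection
    import h2.settings

    # Client-side HTTP/2 state machine.
    config = h2.config.H2Configuration(client_side=True)
    conn = h2.connection.H2Connection(config=config)
    conn.initiate_connection()

    # Advertise SETTINGS_ENABLE_PUSH = 0: the peer is then not allowed to
    # send PUSH_PROMISE frames on this connection at all.
    conn.update_settings({h2.settings.SettingCodes.ENABLE_PUSH: 0})

    wire_bytes = conn.data_to_send()  # bytes to write to the (TLS) socket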

  6. #6
    Join Date
    Oct 2012
    Posts
    148

    Default

    Quote Originally Posted by Zan Lynx View Post
    If your web client does not want the data from the server, it doesn't have to look at it. It's no more dangerous than the responses to requests it actually made.

    As for extra data blowing bandwidth caps... what is the difference between normal HTTP/1.1 and this? Do you review and approve each URL request before it transmits? Since Firebug shows most sites making 20-30 requests, I doubt that you do. Any current website using HTTP/1.1 could already load gigabytes of images or JavaScript data and most people would never realize it was happening.
    I don't have to review every link; I can just outright disable all images on a site and only download specific ones.

    I do it all the time when using tethering.

  7. #7
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,187

    Default

    With a download you yourself started, you can press "Stop". With a push download in the background, will there even be any indication that a transfer is ongoing, let alone a way to stop it? That is the concern.
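
    From skimming the draft, the push is at least announced first: the server sends a PUSH_PROMISE frame on the stream you requested before the pushed response itself, so a client could in principle refuse it by resetting the promised stream. Whether browsers will expose that as a "Stop" button is another matter. A rough sketch of the mechanism using the Python h2 library (its event and error-code names, not necessarily this draft's wording):

    Code:
    import h2.config
    import h2.connection
    import h2.errors
    import h2.events

    config = h2.config.H2Configuration(client_side=True)
    conn = h2.connection.H2Connection(config=config)
    conn.initiate_connection()

    def on_socket_data(raw_bytes):
        # Feed bytes from the socket into the protocol state machine.
        for event in conn.receive_data(raw_bytes):
            if isinstance(event, h2.events.PushedStreamReceived):
                # The promise precedes the pushed response, so cancelling the
                # promised stream here limits how much actually gets sent.
                conn.reset_stream(event.pushed_stream_id,
                                  error_code=h2.errors.ErrorCodes.CANCEL)
        return conn.data_to_send()  # RST_STREAM frame(s) to write back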

  8. #8
    Join Date
    May 2012
    Posts
    866

    Default

    Quote Originally Posted by AJenbo View Post
    So is this based on Google SPDY?
    Basically yes, but while Google's solution makes webpages load faster, HTTP 2.0 makes pages load even faster, before you hit Enter.

  9. #9
    Join Date
    Sep 2011
    Posts
    704

    Default

    Quote Originally Posted by mark45 View Post
    Basically yes, but while Google's solution makes webpages load faster, HTTP 2.0 makes pages load even faster, before you hit Enter.
    How would that work? The web server doesn't know you are going to connect before you... connect. Or is it more like internal links on a site? We should already be able to handle that using JS to prefetch the content.

  10. #10
    Join Date
    Jan 2009
    Posts
    1,438

    Default

    Quote Originally Posted by AJenbo View Post
    So is this based on Google SPDY?
    Somewhat.
    From the abstract, it compresses the headers, multiplexes the connection, and preemptively sends data to the client (called Server Push, implemented with PUSH_PROMISE frames; this requires the use of GET).
    SPDY compresses the headers, multiplexes the connection, preemptively sends data to the client, and prioritizes requests (http://tools.ietf.org/html/draft-mbe...ttpbis-spdy-00).
    So, at first glance, the only difference is the prioritization of requests... BUT section 5.3 of HTTP/2.0 talks about stream priority, which addresses this issue. The odd thing to me is that SPDY uses only 3 bits to represent the various priorities while HTTP/2.0 uses a full 31 bits.

    IOW, to a layman, they look pretty damn similar.
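
    For the curious, here is roughly what the server side of that PUSH_PROMISE exchange looks like, sketched with the Python h2 library (the library tracks later revisions of the spec, and the host and paths below are made up):

    Code:
    import h2.config
    import h2.connection

    # Server-side HTTP/2 state machine.
    config = h2.config.H2Configuration(client_side=False)
    conn = h2.connection.H2Connection(config=config)
    conn.initiate_connection()

    def on_request(event):
        # event: an h2 RequestReceived event for, say, GET /index.html
        pushed_id = conn.get_next_available_stream_id()

        # PUSH_PROMISE carries a synthetic request (hence "requires GET"):
        # it tells the client which request the pushed response answers.
        conn.push_stream(
            stream_id=event.stream_id,          # the stream the client opened
            promised_stream_id=pushed_id,       # new server-initiated stream
            request_headers=[
                (":method", "GET"),
                (":path", "/style.css"),        # made-up resource
                (":scheme", "https"),
                (":authority", "example.com"),  # made-up host
            ],
        )

        # The promised stream is then answered like any normal response.
        conn.send_headers(pushed_id, [(":status", "200"),
                                      ("content-type", "text/css")])
        conn.send_data(pushed_id, b"body { color: #333; }", end_stream=True)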
