W3C Prepares Guidance For Web Development In A Post-Spectre World


  • uid313
    replied
    Originally posted by coder View Post
    Stylesheets aren't evil. If they're appropriately limited, they serve to encourage use of semantic markup vs. presentational. And if you want to make the web more accessible, proper use of semantic markup is a very good thing.
    I agree, I really like style sheets and semantic markup, I am a huge fan of that.

    Originally posted by coder View Post
    None of that is intrinsic to scripting. Furthermore, Firefox has fixes or mitigations for most of those and Google just announced Chrome would no longer allow 3rd party cross-site cookies.
Yeah, but they're replacing third-party cookies with some new tracking mechanism of their own.
There are so many different ways to track people, such as web fonts, canvas fingerprinting, ETags, HTTP headers, web beacons, IP address leaking over WebRTC, etc. There seems to be no end.
    And the privacy is just one problem. Another problem is the ever increasing attack surface with new APIs introduced all the time, like Web USB, WebGL, Web Bluetooth, Web NFC, WebXR, WebVR, Service Workers, WebRTC, etc.
And it's not just privacy and security either; there's also text colored the same as its background, text set in font-size: 1px, poor accessibility (insufficient contrast, etc.), lack of dark mode support, floating and/or overlapping elements, etc.
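ETag tracking, one of the mechanisms mentioned above, is simple enough to sketch. The class and names below are hypothetical, for illustration only: a server hands each new visitor a unique ETag, and the browser's cache echoes it back in If-None-Match on later visits, so it behaves like a cookie the user never agreed to and can't clear by deleting cookies.

```python
import uuid

class ETagTracker:
    def __init__(self):
        self.visits = {}          # etag -> number of requests seen

    def handle_request(self, if_none_match=None):
        """Return (etag, status) as a tracking server would for a resource."""
        if if_none_match in self.visits:
            # Recognised returning visitor: serve 304 and keep the identifier.
            self.visits[if_none_match] += 1
            return if_none_match, 304
        # New visitor: mint a fresh identifier and send it as the ETag.
        tag = uuid.uuid4().hex
        self.visits[tag] = 1
        return tag, 200

tracker = ETagTracker()
tag, status = tracker.handle_request()       # first visit: 200 + new ETag
tag2, status2 = tracker.handle_request(tag)  # revisit: 304, same identifier
print(status, status2, tag == tag2)          # → 200 304 True
```

Clearing the browser cache (not just cookies) is what breaks this, which is exactly why it gets used.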

    Originally posted by coder View Post
    I'm left with the feeling that maybe what we need is a well-defined subset of HTML 5. It has all the ingredients we need for a good static web experience, but a lot of other things that we could often do better without.
    Yeah, I think you're right about that. I think HTML 5 is good, I like that it has many semantic elements such as header, footer, main, nav, aside, dialog, etc.



  • coder
    replied
    Originally posted by uid313 View Post
Yeah, but Gemini isn't proposed as a successor to the web; it does not aim to replace it.
    Yeah scripting and client-side execution is cool and opens up for very cool things and experiences and apps, but it also has its downsides.
    Yes, totally agree -- it's a tradeoff, and yes, the availability of scripting virtually guarantees that it'll be abused.

    Originally posted by uid313 View Post
It's not just about security, it's also about privacy: not being tracked across websites, not having popovers and clickjacking, zombie cookies, supercookies, cross-site tracking, third-party cookies, cryptomining, fingerprinting, etc.
    None of that is intrinsic to scripting. Furthermore, Firefox has fixes or mitigations for most of those and Google just announced Chrome would no longer allow 3rd party cross-site cookies.

    Originally posted by uid313 View Post
    There are no style sheets, and you configure your own font, font-size, color scheme, etc.
    Stylesheets aren't evil. If they're appropriately limited, they serve to encourage use of semantic markup vs. presentational. And if you want to make the web more accessible, proper use of semantic markup is a very good thing.

    I'm left with the feeling that maybe what we need is a well-defined subset of HTML 5. It has all the ingredients we need for a good static web experience, but a lot of other things that we could often do better without.
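Such a subset might look like plain semantic HTML 5 with scripting and embedding stripped out. A hypothetical skeleton, for illustration:

```html
<!-- hypothetical "static subset" page: semantic elements only, no <script> -->
<body>
  <header><h1>Site title</h1></header>
  <nav><a href="/">Home</a> <a href="/about">About</a></nav>
  <main>
    <article>
      <h2>Post title</h2>
      <p>Body text.</p>
    </article>
  </main>
  <footer><small>Contact details</small></footer>
</body>
```

Everything a screen reader or crawler needs is in the markup itself; the contentious parts of the platform are simply absent.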



  • ed31337
    replied
The web of today frankly sucks ass. It's a lot better than when Microsoft was dominating the browser wars, but still, Google's advertising revenue means they're not likely to make web browsers good for privacy, security, and efficiency. Sure, they'll pay lip service to those goals, but ultimately, as long as they're in charge, we're going to have an advertising monkey wrench or two thrown into the works.

    Gemini sounds great! I hope they live long and prosper. I look forward to stepping out of "the world, according to Google" and into Gemini-Space.



  • uid313
    replied
    Originally posted by coder View Post
    All things being equal, sure. But it's no accident that scripting and programmability keeps getting pushed to clients (e.g. Java, JS, ActiveX, Flash, Web Asm, WebGL, etc.). Besides scaling better, it results in a far richer and more responsive experience.

Yeah, if you're just browsing Wikipedia, then static is fine. But there are plenty of sites that would be infeasible without client-side programmability, whether using a rich web app or a custom thick client. And these days, web technologies have received such focus on security that I'd honestly feel safer running a web app than a completely random closed-source thick client.
Yeah, but Gemini isn't proposed as a successor to the web; it does not aim to replace it.
    Yeah scripting and client-side execution is cool and opens up for very cool things and experiences and apps, but it also has its downsides.

It's not just about security, it's also about privacy: not being tracked across websites, not having popovers and clickjacking, zombie cookies, supercookies, cross-site tracking, third-party cookies, cryptomining, fingerprinting, etc.

    I can see Gemini being easy to index (unlike some SPA applications like React) and very compatible with screen readers and accessible to people with disabilities. There are no style sheets, and you configure your own font, font-size, color scheme, etc.
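For context, a Gemini page is written in "gemtext", a line-oriented format with no inline styling at all; fonts, sizes, and colors are entirely the client's choice. A small illustrative page:

```
# Example page
Plain paragraph text, one line per paragraph.

=> gemini://example.org/notes.gmi  A link occupies its own line
* an unordered list item
> a quoted line
```

Because every line has exactly one type, indexing and screen-reader support fall out almost for free.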



  • coder
    replied
    Originally posted by uid313 View Post
This is not for the developer, it is for the user. If I want to visit some site then I want it to be static; I don't want any dynamic content.
    All things being equal, sure. But it's no accident that scripting and programmability keeps getting pushed to clients (e.g. Java, JS, ActiveX, Flash, Web Asm, WebGL, etc.). Besides scaling better, it results in a far richer and more responsive experience.

Yeah, if you're just browsing Wikipedia, then static is fine. But there are plenty of sites that would be infeasible without client-side programmability, whether using a rich web app or a custom thick client. And these days, web technologies have received such focus on security that I'd honestly feel safer running a web app than a completely random closed-source thick client.



  • uid313
    replied
    Originally posted by coder View Post
    Sounds fairly useless. Maybe good for those few cases when you could also just use static HTML, but then why wouldn't you just use static HTML?
This is not for the developer, it is for the user. If I want to visit some site then I want it to be static; I don't want any dynamic content.



  • coder
    replied
    Originally posted by uid313 View Post
    Yes, it takes inspiration from both Gopher and the Web, and is something in between.
The whole point of Gemini is that it will not evolve; it is designed to be minimal and not extensible.
    Sounds fairly useless. Maybe good for those few cases when you could also just use static HTML, but then why wouldn't you just use static HTML?

Maybe this could get some traction in ultra-secure government, military, and financial environments, but then I'd hope they've already done the work to ban any sort of scripting, Web Assembly, WebGL shaders, etc. from conventional browsers.
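For what it's worth, much of that banning can already be done per-site with a Content-Security-Policy response header; blocking script execution also takes WebGL and WebAssembly with it, since both are driven from JavaScript. A sketch (the exact policy would need tuning per deployment):

```
Content-Security-Policy: default-src 'self'; script-src 'none'; object-src 'none'
```

A locked-down environment would serve this from every internal site and have the gateway reject pages without it.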



  • coder
    replied
Originally posted by M@yeulC View Post
    TL;DR: don't run code that you don't trust?

    Like, disable javascript? Is there a way to selectively enable mitigations when launching a program, like running `spectredo proprietaryapp`?
    One thing would be to avoid mixing threads from different processes on the same core. I think Google submitted a kernel patch for optionally enabling that?

    Or, if you could distinguish between trusted and untrusted apps, then you could even mix threads from trusted apps and just worry about sandboxing the threads from the untrusted ones. Of course, then there'd need to be some way of certifying packages and signing them to assert that they're "trusted" and have no ability to run external code.
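The kernel feature alluded to above landed as "core scheduling" in Linux 5.14: a task can ask never to share a physical SMT core with tasks outside its cookie group, which blunts cross-process Spectre-style leakage through shared core state. A hedged sketch via prctl(2) from Python; the constants come from <linux/prctl.h>, and this is illustrative rather than hardened:

```python
import ctypes
import os

PR_SCHED_CORE = 62          # prctl option (Linux >= 5.14)
PR_SCHED_CORE_CREATE = 1    # create a new core-scheduling cookie
PIDTYPE_PID = 0             # apply the cookie to this task only

def enable_core_scheduling():
    """Ask the kernel to give this task its own core-scheduling cookie, so
    it never shares an SMT core with tasks outside its group. Returns False
    on kernels that are too old or built without SCHED_CORE."""
    try:
        libc = ctypes.CDLL(None, use_errno=True)
        ret = libc.prctl(PR_SCHED_CORE, PR_SCHED_CORE_CREATE,
                         os.getpid(), PIDTYPE_PID, 0)
        return ret == 0
    except (OSError, AttributeError):
        return False

print("core scheduling enabled:", enable_core_scheduling())
```

There's a throughput cost, since sibling hardware threads idle whenever no same-cookie task is runnable, which is why it's opt-in rather than the default.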



  • M@yeulC
    replied
    TL;DR: don't run code that you don't trust?

    Like, disable javascript? Is there a way to selectively enable mitigations when launching a program, like running `spectredo proprietaryapp`?



  • uid313
    replied
    Originally posted by juarezr View Post

Project Gemini looks more like a renewed Gopher than a multi-role platform like the Web with HTML/JavaScript and extensions.

The web will probably evolve toward functionality more diverse and powerful, like Java applets or Flash were some time ago (WASM and new browser capabilities indicate this), rather than toward simple, well-defined, and organized protocols like Gopher and Gemini.
    Yes, it takes inspiration from both Gopher and the Web, and is something in between.
The whole point of Gemini is that it will not evolve; it is designed to be minimal and not extensible.

