Firefox 83 Released With Warp'ed JavaScript, HTTPS-Only Mode Option


  • polarathene
    replied
    Originally posted by dev_null View Post
    Yes, I’m aware of Let’s Encrypt and use it. However, it still stinks a little bit: if I have a load balancer, on which server should I install certbot?
    Apple certs for apps, btw, are unrelated to HTTPS, but the intent is the same: authentication, proving something is from who it claims to be.

    If you're using a load balancer, that is the server that handles your HTTPS certs. If that's not something you control directly on your own server but via some service instead, then that service likely has HTTPS settings for you to configure.

    I've used nginx to serve web content while having another nginx instance as the reverse proxy that handles load balancing. The latter uses HTTPS for all my sites, while the web server instances can be reached over plain HTTP as long as they're not directly accessible from the public internet (e.g. all running on the same machine or in some VPN cluster). Caddy is also great at this and comes with HTTPS by default, with Let's Encrypt fully automated for you (a rough sketch of that setup is at the bottom of this comment).

    What stinks about LetsEncrypt?
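    As a rough sketch of that kind of front proxy, here's roughly what it can look like in Go with the golang.org/x/crypto/acme/autocert package (the domain, backend address and cache directory below are made up for illustration, adapt to your setup):

    // tls-front.go: TLS-terminating front proxy, backends stay plain HTTP.
    package main

    import (
        "log"
        "net/http"
        "net/http/httputil"
        "net/url"

        "golang.org/x/crypto/acme/autocert"
    )

    func main() {
        // Plain-HTTP backend behind the proxy (internal address is made up).
        backend, err := url.Parse("http://10.0.0.2:8080")
        if err != nil {
            log.Fatal(err)
        }
        proxy := httputil.NewSingleHostReverseProxy(backend)

        // autocert obtains and renews Let's Encrypt certificates automatically.
        m := &autocert.Manager{
            Prompt:     autocert.AcceptTOS,
            HostPolicy: autocert.HostWhitelist("example.com"), // your public domain
            Cache:      autocert.DirCache("certs"),            // where issued certs are cached
        }

        // Port 80 answers ACME HTTP-01 challenges and redirects everything else to HTTPS.
        go http.ListenAndServe(":80", m.HTTPHandler(nil))

        srv := &http.Server{
            Addr:      ":443",
            Handler:   proxy,
            TLSConfig: m.TLSConfig(),
        }
        log.Fatal(srv.ListenAndServeTLS("", "")) // cert/key come from TLSConfig
    }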


  • polarathene
    replied
    Originally posted by dev_null View Post
    I cannot see anything bad with HTTP; it was specifically created as a text protocol in order to be readable and debuggable. HTTPS requires certificates which only ‘good’ companies can issue, costs CPU power, and complicates debugging and code writing. It's great where it matters, like privacy or money, but why enforce it everywhere?
    Getting a cert for HTTPS / TLS is very easy and often automated these days. Let's Encrypt does it for free, so I'm not sure what you're on about with "only 'good' companies". You can issue your own too, but if your service is meant to be available to the public, only certs chaining to a root CA cert present in users' system trust stores will be accepted. That's intentional, for security benefit; otherwise it'd be much easier to spoof the identity of a domain and pretend to be someone you're not.

    It doesn't have notable CPU overhead anymore; that was addressed long ago. In fact, HTTPS paired with HTTP/2 (pretty much a given these days) is often more efficient than plain HTTP, as multiple requests can all utilize the same connection instead of each initiating a separate one. The largest overhead of HTTPS is the handshake, which is barely anything; after that you're likely using hardware-accelerated AES or an efficient software cipher like ChaCha20-Poly1305.

    Complicates debugging network traffic in what way? If you are the dev, you can still debug over HTTP, or use HTTPS with an RSA key exchange instead of DHE/ECDHE; you'll have no problem decrypting and observing the traffic then (you shouldn't use an RSA key exchange for your users, though). What complications with code writing? Some web features only work in secure contexts, which require HTTPS. Are you referring to self-signed certificates and browsers flagging them as insecure during development? That can be avoided, or you can set up a local dev certificate (check out mkcert or smallstep; Caddy will also automate all of this for you; a minimal local-HTTPS sketch is at the bottom of this comment).

    It's not just about privacy (encryption), but trust. Without a secure connection you don't know if you're really talking to the real server; it could be a phishing web server that answers for the domain because your DNS was compromised. A login/payment form can be presented on an HTTPS page yet submit your credentials over a plain HTTP request; that's not immediately apparent to the user but could be sniffed. Knowing HTTPS will always be used, and being notified of any attempt otherwise, helps avoid that problem.

    Additionally, while a website that just provides some information might seem harmless, depending on context a passive attacker can observe which web pages you view. Monitoring your activity across various HTTP sites can glean insights about you that could be leveraged, such as health conditions or interests you might otherwise not reveal or discuss. It doesn't even have to be malicious intent: an employer might notice you've carelessly been using their network to look at job listings for a new role during your break time (or work hours). Or perhaps you've been having some financial trouble recently and your internet activity indicates this to an employer (whom you're working for remotely at home, but you need to connect via their VPN and don't have split DNS set up, so they get all your DNS history as well, regardless of HTTPS); they could potentially leverage that to pressure or persuade you to their benefit in certain situations.

    With HTTP, content can also be manipulated in transit. If you were reading political articles ahead of an election, an ISP with a certain bias or motivation could notice particularly popular articles against its interests and modify them over HTTP to reduce their impact without raising much suspicion; you'd be none the wiser. It can also inject other content, be it tracking or ads (some airlines have done this when you connect to their free wifi).

    EDIT: Just realized someone already covered some of this with a link; I had recalled that information but couldn't seem to remember that very same link.
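    For the local dev cert part, here's a minimal sketch in Go, assuming you've already run "mkcert -install" and "mkcert localhost" (the certificate file names below are what mkcert typically prints; adjust to your output):

    // dev-https.go: serve HTTPS locally with a mkcert-generated certificate.
    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    func main() {
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "hello over local TLS")
        })
        // The browser trusts this cert because mkcert installed its own local CA.
        log.Fatal(http.ListenAndServeTLS(":8443", "localhost.pem", "localhost-key.pem", nil))
    }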


  • kescherPh
    replied
    Originally posted by dev_null View Post
    Yes, I’m aware of Let’s Encrypt and use it. However, it still stinks a little bit: if I have a load balancer, on which server should I install certbot?
    To the load balancer. (If you mean a pass-through load balancer, not a server director)


  • Jaxad0127
    replied
    Originally posted by dev_null View Post
    Yes, I’m aware of Let’s Encrypt and use it. However, it still stinks a little bit: if I have a load balancer, on which server should I install certbot? ... okay, Google at least suggests their own HTTPS certificate for that case, but it smells of ‘good’ companies again.
    In our stuff, we have the load balancer (or another machine in front of it) handle HTTPS.


  • arQon
    replied
    Originally posted by treba View Post
    The biggest features "unlocked" by Webrender ...
    None of which work on my HTPC (Intel), AFAICT: every time I've tried forcing it on there, it's been either buggy, or MUCH slower, or both. Maybe that's just a matter of how far from ready it was at the time, and WHY I had to force it on - but it's HW with VAAPI support, which it supposedly worked on even back when I FIRST tried forcing it (which is, yknow, WHY I did. :P). And the second time, after a later announcement noting bugfixes and better / wider support. And the third time, after a similar claim months later still. Every time, it's been broken, buggy, or just plain slower - and the HTPC is exactly the sort of machine it was intended for.

    So at that point, I just gave up on it as 95% hype and 5% actual real-world use cases. It'll be interesting to see what happens in 84/85/etc, but I'm long since past caring about it. If it works, great. If not, also great, since it hasn't delivered for years already and I've managed just fine without it.

    I'm not stupid enough to be trying to drive a 4K display on an Intel IGP potato, but it was worse even on 720p, so meh. For the small but apparently non-zero number of machines it does actually work on, great: I've been on Linux more than long enough to remember when fullscreen video that worked *at all* was a pipe dream, and the memory of browsers running Flash for it still makes my skin crawl, so any progress in that area is nice to hear about regardless of how small the success rate is, whether it ever benefits me personally or not.


  • dev_null
    replied
    It's optional for now. Like with Apple, if you're aware: they initially asked for signing, then disallowed launching unsigned apps by default, then added optional notarization (i.e. your app has to be approved by an Apple service), then made it mandatory, and now if you open an app which is not notarized, Apple just suggests moving it to trash right away without even showing an "open" button.

    Yes, I’m aware of Let’s Encrypt and use it. However, it still stinks a little bit: if I have a load balancer, on which server should I install certbot? ... okay, Google at least suggests their own HTTPS certificate for that case, but it smells of ‘good’ companies again.


  • kescherPh
    replied
    Originally posted by dev_null View Post
    Am I the only one sad about the HTTPS-Only Mode option? I cannot see anything bad with HTTP
    As a long-time user of this mode (it was previously exposed only in about:config), I avoid HTTP-only websites like the plague.

    Originally posted by dev_null View Post
    HTTPS requires certificates which only ‘good’ companies can issue
    Have you heard of Let's Encrypt?

    Originally posted by dev_null View Post
    CPU power
    Barely.

    Originally posted by dev_null View Post
    complicates debugging and code writing
    This mode exempts localhost, for instance, so locally hosted services are still easily debuggable for devs. Also, devs usually don't implement HTTPS in their services directly; that is the job of a frontend webserver such as NGINX (a tiny sketch of that split is at the bottom of this comment).

    Originally posted by dev_null View Post
    why enforce it everywhere
    Because sites that still don't support HTTPS are a liability to every single visitor. See "Does my site need HTTPS?"

    Edit: Besides, HTTPS-only mode is optional... So, why are you complaining? You can use your filthy HTTP-only sites without an issue!
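    To illustrate that split with a minimal Go sketch (port and path are made up): the service only ever speaks plain HTTP on localhost, and the frontend webserver terminates HTTPS and proxies requests to it.

    // app.go: the service never touches TLS itself; it listens on plain HTTP
    // on localhost, and a frontend webserver such as NGINX terminates HTTPS
    // and proxies requests to 127.0.0.1:8080.
    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    func main() {
        http.HandleFunc("/api/hello", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "hello from behind the HTTPS frontend")
        })
        log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
    }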
    Last edited by kescherPh; 18 November 2020, 03:51 AM. Reason: Remove duplicate sentence


  • dev_null
    replied
    Am I the only one sad about the HTTPS-Only Mode option? I cannot see anything bad with HTTP; it was specifically created as a text protocol in order to be readable and debuggable. HTTPS requires certificates which only ‘good’ companies can issue, costs CPU power, and complicates debugging and code writing. It's great where it matters, like privacy or money, but why enforce it everywhere?


  • treba
    replied
    Originally posted by arQon View Post

    oh? My impression of WebRender on Linux is that the FF team's position is more like "if you have exactly this GPU, then maybe", but not much more than that. The HW-exclusion list seems to be almost infinitely large, with no reasoning behind it other than "well, we haven't tested this specific device" rather than simply assessing a device's capabilities, all of which are well-defined and have been exposed for a very long time now.

    Maybe I'm misunderstanding the situation; none of my machines seem slow enough anyway for me to care about trying out WebRender, since I don't benchmark browsers and don't care about single-digit performance improvements. But I've lost track of how many YEARS it's been since the "WebRender will make everything sooo much better" hype, and in all that time I don't think I've ever seen any real effort to actually have it work on anything except Windows. (Which, to be clear, is a position I totally understand: I'm not blaming FF for throwing 99% of the resources at 99% of the market.)

    It'll be cool if it happens, I suppose? But like I say, a difference of a few %, or slightly better battery life or whatever doesn't mean anything to me anyway.
    It's quite a bit more than a few percent. The biggest features "unlocked" by WebRender (in combination with EGL, on both Wayland and X11) are zero-copy WebGL[1] and hardware video decoding (so far limited to VAAPI)[2]. Both can have a huge impact. Apart from that, there are many cases that benefit from WebRender itself, especially as screens keep getting bigger. Rendering 4K in software works, but it is far from efficient.

    WebRender is now also used on macOS and Android[3].

    1: https://mastransky.wordpress.com/202...on-on-wayland/
    2: https://mastransky.wordpress.com/202...pi-on-wayland/
    3: https://wiki.mozilla.org/Platform/GFX/WebRender_Where


  • arQon
    replied
    Originally posted by uid313 View Post
    I recently had some performance problems with Firefox; just changing tabs took a long time. I solved it by clearing the cache, going to about:preferences#privacy, clicking "Manage Data", and manually deleting some cookies. Some domains such as YouTube had 100+ cookies, and some, such as YouTube, had multi-megabyte cookies (8.6 MB in one case).
    Yeah, I recently had to help someone out with a Google-driven cookie problem, and came across the same thing. Even weirder is that the *YouTube* cookies are being set by things like Gmail.
    It seems to be a side-effect of some attempt by G to preserve their tracking against something - privacy tools like containers or uBlock or whatever. Now, they're practically putting an entire DB in the cookies, and I'd guess maybe merging all the data from every G site into the cookies for all of them. Backups, effectively, so that if one ever gets deleted they can restore all the linkages via the cookies of any other G site you use.

    I don't think the megacookies themselves are the cause of any problems with FF, though in fairness you REALLY don't expect a cookie to run to multiple MB, so maybe there's some issue with how the FF code handles pathological cases like that.
