Firefox 82 Released With Performance Improvements, Video Playback Enhancements


  • Guiluge
    replied
    Thanks to Quantum, Firefox is now quite fast.
    Benchmarks will still put Chromium and its derivatives in the lead, but I don't feel the difference on most websites.
    I'm currently using Vivaldi on all my devices (the Android version is really great, btw), as it offers me many more features, and one feature in particular: a full-featured speed dial.
    I know many users don't care for it, but I find it useful for my own workflow.

    Firefox does offer many speed-dial extensions, but they're not really on par with Vivaldi's (or Opera's) default page: before Quantum there were better SD extensions (e.g. Super Start), but they're long gone now.



  • Mike Frett
    replied
    I really...hate PIP.

    Website devs should be forced to create a website, and use it, on an old PII with 384MB of RAM during their learning process. Bloated, unoptimized websites are a scourge on the Internet.

    In my opinion, of course.



  • topolinik
    replied
    Originally posted by Adarion View Post
    WTF?!
    Once there was Mozilla Suite. (formerly Netscape)
    People said "OMG it's bloated! Let's take the browser part alone and make it fast. And let's call it Firefox."
    That happened around v3. After that it became more and more obese so sometimes the whole Seamonkey suite (successor of Mozilla Suite) felt faster and had better handling.
    Now they're desperately rowing back, trying to catch up with the mess they made.
    Nah, this may have been true a looong time ago; today Firefox is getting better with every new release, and for speed and responsiveness my experience is about what Opera offered in its golden age.
    Further, I would say Firefox performs better under Linux than under Windows, but that may just be my personal impression.



  • crystall
    replied
    Originally posted by Adarion View Post
    unwanted cookie notices, unwanted shit that is used to track you and mine bitcoins on your CPU and whatnot: often 2 - 4 MiB, that'd be up to 4194304 bytes, vs. 4096 bytes of actual information.
    For the cookie notice you can use the "I don't care about cookies" add-on: https://addons.mozilla.org/en-US/fir...content=search

    It's a really nice addition, though it might cause breakage on things like YouTube embeds if you're not logged into YouTube or haven't previously accepted its cookies.

    For tracker/miner code, Firefox now has Enhanced Tracking Protection enabled by default, and if you turn it up to "Strict" in the "Privacy & Security" settings it will block basically any kind of tracking activity known to man (but might cause websites to break from time to time).

    Full disclaimer: I work for Mozilla



  • polarathene
    replied
    Originally posted by Adarion View Post
    Second half is websites becoming obese to a level that is beyond silly.

    A colourful web is okay, but this is just senseless bloat. And it makes things slow and wastes energy and time.

    But this is because "web designers" never learnt how to code and never handled an old computer (8086-class, or a C64 if you want, or microcontrollers). You learn to fight for every byte of RAM there...
    Analytics is useful, and not just for tracking/ads: it helps a website or service understand users and their activity, spot where things need improvement, catch problems or bottlenecks in certain funnels (like the signup process), and test which UI/UX changes work better (A/B testing). Other kinds of analytics are not necessary or strictly beneficial to those goals, and can be classed as PII data that needs a legal disclaimer and an opt-out.

    Websites can certainly be bloated, especially with DIY website builders and services aimed at web designers like Webflow. One of their marketing pages, a kitchen-sink demo, was 15MB to download before any scrolling, with a 30-second preload screen that looked frozen on my phone (yes, 15MB over a mobile connection took a while). It also had a JS animation for one part of the page that kept playing even when not in view and stressed the CPU; from what I could tell it didn't need to be a JS animation, and it was possibly not limiting its frame rate, just updating as fast as possible. That kind of website is definitely bad (though I've seen many in the EU not care, because they often boast about how fast their internet speeds are and so don't see it as an issue).

    JS isn't all bad: you can pre-render the page and serve a noscript version when the interactivity (if any) doesn't need JS to work. If JS is available, it then loads additional JS in chunks as needed, giving a good/smooth experience. Images can all have placeholders/previews to prevent reflows without requiring them to be downloaded until the user scrolls near them. JS can be deferred (any third-party JS should not block the page load and should be async/deferred, tbh). Plenty of optimizations can be done, and some developer-focused projects try to enable them and make them easier to use.
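    The "load JS in chunks as needed" idea can be sketched with a tiny promise cache. The names here (loadChunk, loadGallery) are hypothetical, and the loader function stands in for a real dynamic import('./gallery.js'):

```javascript
// Minimal sketch of on-demand chunk loading (hypothetical names).
// In a real app the loader would be a dynamic `import('./chunk.js')`.
const chunkCache = new Map();

function loadChunk(name, loader) {
  // Re-use the in-flight or completed promise so each chunk is
  // fetched at most once, no matter how often it is requested.
  if (!chunkCache.has(name)) {
    chunkCache.set(name, loader());
  }
  return chunkCache.get(name);
}

// Stand-in loader: counts "downloads" instead of hitting the network.
let downloads = 0;
const loadGallery = () =>
  loadChunk('gallery', async () => {
    downloads += 1;
    return { render: () => 'gallery rendered' };
  });

(async () => {
  await loadGallery();             // first request triggers the "download"
  const mod = await loadGallery(); // second request hits the cache
  console.log(downloads, mod.render()); // prints: 1 gallery rendered
})();
```

    The same pattern is what bundlers generate under the hood: navigation requests a chunk, and repeated visits to the same page re-use the already-loaded module.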

    Any common third-party JS is often cached, so while you might count it in the page weight, if it's a common script like an ad library served from the same third-party origin, it's not downloaded again, and the browser can optimize for that internally (depending on how you have it handle tab contexts; Chrome has multiple options for this).

    Here's an example of some of these optimizations; it's not a proper website, but there's a lot going on under the hood: https://css-grid-pugs.netlify.app/

    100KB of JS is transferred over the network, but split into chunks. On a proper website this lets navigation to other pages load only the compressed JSON data that page needs and add the HTML dynamically, updating the page in place instead of a hard refresh re-downloading everything, as it traditionally was. It will also grab any extra JS chunks needed to display parts of the new page. It's much lighter that way, and the JS can be re-used.

    In this website's case, the CSS is inlined into the HTML or adjusted dynamically via JS, so styling code is also part of that JS weight. The document weight is a bit hefty at 17KB, mostly because of the inlined image previews for lazy loading and noscript support, plus the inlined CSS. The advantage is that the page renders quickly with no additional resource requests, so there are no repaint/reflow issues with content jumping around as it loads in. It could be optimized further, but that would require much more effort.

    Image weight depends on the device you view it on: not only does viewport width come into play, but also the display's DPI and the browser's support for WEBP vs. JPG. Those factors affect which image URLs the browser chooses to download (no JS required). I see 100KB for a narrow mobile view on my desktop (1080p), or 400KB at desktop width. Chrome has supported native image lazy loading for a year or so now, which is more aggressive, so there's only 70KB more of images to download at desktop width for roughly 20 images total. Those images come in a variety of scaled sizes, but mobile devices will often download larger ones due to their higher DPI, which gives more detailed images but can raise the weight to around 1MB (still not bad considering the number of images; image optimization helps).

    On older browsers lacking native lazy-load support, JS steps in with a more conservative viewport distance to trigger loading of the real images (if JS is disabled, all images simply load). JS additionally provides a smoother transition, using some CSS to fade the blurred placeholder into the real image. On in-page navigation, an image already downloaded is remembered, and all of that can be skipped to just show it. It gets more complicated when you also want art direction in the mix (different images for different viewport widths or orientations, with differing aspect ratios).
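    The viewport-distance check such a JS lazy-loader performs can be sketched as a pure function. The 200px margin below is purely illustrative, not what any particular library uses; real implementations typically drive this with IntersectionObserver and a rootMargin:

```javascript
// Sketch: decide whether a placeholder should be swapped for the
// real image, based on how far it sits below the visible viewport.
// The 200px margin is an illustrative default, not a library value.
function shouldLoadImage(elementTop, scrollY, viewportHeight, margin = 200) {
  // Load once the element's top edge is within `margin` px of
  // entering the viewport from below.
  return elementTop < scrollY + viewportHeight + margin;
}

// An image 1500px down the page, viewport 800px tall, not yet scrolled:
console.log(shouldLoadImage(1500, 0, 800));   // false (still 700px away)
// After scrolling 600px, the image is within the 200px margin:
console.log(shouldLoadImage(1500, 600, 800)); // true
```

    Native loading="lazy" does effectively the same thing in the browser engine, just with larger, connection-aware margins.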

    If you are on a desktop, you can adjust the viewport width to see the images scale or re-arrange to fit the given space. You'll notice the images aren't always displayed at 100%; there's some allowance for cropping/clipping to avoid the distortion that stretching would cause. Columns likewise adjust with width and are tailored to common device/display widths, with the various image scale variants sized with that in mind.

    Granted, I am into delivering optimized experiences, but there can be a lot to learn, and since "premature optimization is the root of all evil" and all that, I'm not surprised it isn't a higher priority. Longer-lived projects with decent financial backing and earnings may bother to fund optimization for meaningful gains. The agencies that sell clients on visuals/functionality alone and use website builders with minimal coding are usually the ones at fault for bloat, though; they optimize for delivery speed and meeting the client's requirements, and don't continue to benefit afterwards from spending additional time/effort on optimization.

    It's the same reason AWS is popular with businesses (with actual developers): it makes things easier for them. There are better and cheaper alternatives, but it's easier to accept vendor lock-in and hire developers who already know those platforms well; that's common knowledge that doesn't require as much upskilling.


    Originally posted by Adarion View Post
    unwanted cookie notices
    It's a legal requirement that some can ignore, I think, but especially under the GDPR in the EU you want to comply and not be fined heavily (there have been a few cases against those who didn't take privacy laws seriously). Some sites don't actually need those notices, but you have to go through the legal texts that define the requirement and really understand the conditions where it doesn't apply. That's something you're best off paying a lawyer who understands the GDPR well for; sometimes it's just cheaper to play it safe with the notice.

    Additionally, there's another EU law that applies to social or user-content networks/services which also needs to be taken seriously. Then there are other laws around the world; California has one too. Cookies are covered by one of the older laws and are easier to deal with, but it had a follow-up that required websites to inform visitors about them, often in a way invasive enough that the user couldn't claim they didn't notice the disclaimer.



  • bug77
    replied
    Originally posted by birdie View Post
    A note for Firefox users who 1) force WebRender (via gfx.webrender.all=true) 2) use NVIDIA proprietary drivers with 3) desktop compositing turned off.

    There's a regression in this Firefox release which manifests itself in the top half of the screen being unusable, see bug 1663273 for more details. There are currently two workarounds both with their pitfalls:
    • Either export MOZ_X11_EGL=1 however this will effectively disable WebGL 2.0 (not that many websites use it anyways)
    • Or export MOZ_GTK_TITLEBAR_DECORATION=system but in this case you'll see window decorations.
    I chose the former because I don't use any websites which utilize WebGL.
    I've hit that. I've just disabled hw acceleration. One day later I thought I'd enable it again and it was all peachy. Not sure if it was a fluke or something Arch updated the second day, but it's ok now.
    But I do have the compositor turned on.



  • HighValueWarrior
    replied
    Lot of Firefox nuthugging going on ..... undeservedly so imho.
    Linux users have made excuses for a long time ...... for me at least it's wearing thin.



  • Mitch
    replied
    My experience with the last few Firefox releases has been extremely positive. It has been fluid and stable. I use Dev edition on Windows and Linux. I've also used Wayland and X, but I generally prefer X, only because KDE has issues for me on Wayland (even with the new 5.20). Firefox itself plays nicely with Wayland for me, whether I set its Wayland flag or not.

    I think one big-ticket item is enabling video hardware acceleration by default on Linux. I've played with the about:config settings here, but I haven't thoroughly tested the outcome.



  • cl333r
    replied
    Originally posted by Adarion View Post
    (bloated websites content)
    True, but it seems to me websites haven't been getting more bloated over the last 10 years. Even this site: did Phoronix really get more bloated? Imo, no.



  • Marc Driftmeyer
    replied
    Originally posted by Adarion View Post
    WTF?!
    Once there was Mozilla Suite. (formerly Netscape)
    People said "OMG it's bloated! Let's take the browser part alone and make it fast. And let's call it Firefox."
    That happened around v3. After that it became more and more obese so sometimes the whole Seamonkey suite (successor of Mozilla Suite) felt faster and had better handling.
    Now they're desperately rowing back, trying to catch up with the mess they made.

    Second half is websites becoming obese to a level that is beyond silly.
    Sheer informational content in ASCII text: e.g. 4 Kilobytes (Kibibytes). That is 4096 bytes.
    Okay, add a little HTML formatting if you need.
    Images 1.2 MiB. Should be ~1258291 bytes.
    embedded ads, "social" network buttons and the likes: several KiB to MiB
    active JavaScript code that really slows down CPU/browser just for the sake of displaying unwanted ads, unwanted cookie notices, unwanted shit that is used to track you and mine bitcoins on your CPU and whatnot: often 2 - 4 MiB, that'd be up to 4194304 bytes, vs. 4096 bytes of actual information.

    A colourful web is okay, but this is just senseless bloat. And it makes things slow and wastes energy and time.

    But this is because "web designers" never learnt how to code and never handled an old computer (8086-class, or a C64 if you want, or microcontrollers). You learn to fight for every byte of RAM there...
    Question to you: Are you paying the publishers for providing whatever site contents you visit?

