Firefox 82 Released With Performance Improvements, Video Playback Enhancements


  • Michael_S
    replied
    Originally posted by polarathene View Post

    I can relate to that and it is a bit worrisome. I'm not sure much can be done when we consider what's causing such trends, other than convincing/motivating businesses, in a language they can grok and value, to justify doing things properly (or closer to it).
    I don't think anything can be done, which is why I wrote that our whole industry is fucked.

    Originally posted by polarathene View Post
    Sure? I'm only talking about performance from a pragmatic standpoint. It takes more and more effort for declining ROI, especially when the visitor isn't likely to be in a position to become a customer. Supporting such users can additionally be a security risk.

    I don't think the resources need to grow like that. There's a consistent effort within the industry towards doing things better and more efficiently; those efforts take time, and I can't say how well they surface to non-devs who use software services to compose a website via a GUI.

    Some websites today can be built far more efficiently and easily than they would have been 5-10 years ago. A whole ecosystem contributes to making that possible, and complexity has increased notably to do it well, but that doesn't mean the benefits aren't accessible to less experienced developers. Gatsby is an example of that (although lately it's been giving WordPress vibes a bit, and I have seen some inexperienced developers bloat the weight of a page..).
    I'm not thinking in terms of pragmatism, I'm thinking in terms of environmental waste, customer satisfaction, and harm to the bottom third of households by income in wealthy countries and even more people elsewhere.

    A laptop or desktop purchased in 2005 should be usable for day-to-day tasks today - but the only people who could make effective use of one are pretty sophisticated tech professionals who can replace the factory OS with Linux or a *BSD and figure out how to browse sites with Lynx and do most of their other actions from a terminal prompt.

    I look forward to my hardware upgrades because I've got the budget for it. Friends and relatives with a tiny fraction of my income don't.

    Originally posted by polarathene View Post
    You're comparing very different devices. The iPhone was capable of considerably more than what you compared it to, which I doubt had much generic support. Do you want websites as fixed-width terminal output instead?
    Definitely, the iPhone 1 is capable of considerably more than the Apollo 11 nav computer. I grant that completely. But the question is: is the iPhone capable of 100 times as much? Because in every raw performance metric, it's at least 100 times as capable. So even considering all the things the iPhone can do that the 1969 computer can't, the iPhone makes horrifically inefficient use of its resources.

    Originally posted by polarathene View Post
    It wouldn't function with the majority of websites at all.

    Even if it were able to use an updated browser, it would run into TLS 1.2 enforcement (all current browsers deprecate earlier versions with an error/warning page, ahead of removing support completely in near-future releases). TLS 1.2 was official from 2008 and more widely available in implementations around 2010, but server distro releases lacked packages until later releases, which also saw slow adoption; this is especially the case with mail servers. iOS 5 added TLS 1.2 support, I think, but only for HTTPS; for other protocols, like email, support was delayed until iOS 9.

    Not only that, the number of cipher suites that servers make available for the earlier protocol versions is fairly limited now. Some legacy clients, like old versions of Java, cannot use some of the cipher suites offering perfect forward secrecy, due to limitations in the Java runtime back then that would be considered insecure to allow now. You'll also find old mobile devices lacking hardware acceleration for encryption, where software AES is something like 30x slower; there's a modern alternative (ChaCha20-Poly1305) with near hardware-accelerated performance in software, but it was only standardised for TLS in 2016.
    But again, all of these security protocols were designed and adopted by the industry with the idea that any device past a certain age could be ignored.

    Now, I'm sure that if there were lots of secure cryptographic protocols as easy to run on a 486 as on a 5th-generation Core i7, those would have been adopted in TLS 1.3. So designing cryptographic algorithms and protocols that allow older devices to stay relevant must be difficult. But in a better world, billions would have been poured into research on protocols that were still secure but supported older hardware, because it would have allowed hundreds of billions of dollars' worth of older hardware to continue in wide use.

    Instead, this is our world: "If we roll this out, people with a Samsung Galaxy S3 or iPhone 4 won't be able to use our service any more. Their only option will be to buy new devices." "So what?"

    Originally posted by polarathene View Post
    The browsers themselves, if not updated, would probably break in other ways too, due to lacking newer CSS features, browser APIs, and other support like image formats.
    Again, the old devices are abandoned by manufacturers because it's not profitable to update them. So even if you wanted to browse a website that uses a modest amount of RAM - like I said, Phoronix uses less than 6MB on most pages for me - your Samsung Galaxy S1 probably won't work because Samsung saw no profit in updating it.

    Originally posted by polarathene View Post
    I've had old devices from back then and remember even some native apps (Android) weren't all that snappy, especially on single-core, low-clockrate SoCs (I had a Samsung Galaxy S1, which was still quite a leap ahead of the original iPhone, I think). Websites weren't the only poor performers.

    I'm just glad we're not working with hardware resources like the ones I grew up with in the 90s; programming on those, especially games, I recall developers went to quite extensive lengths to be efficient. Now we have embedded hardware with that processing power for a few dollars or less, and it runs off bugger-all power.



  • polarathene
    replied
    Originally posted by Michael_S View Post
    I understand what you're saying, and I know you and a small but significant portion of the people in our field are doing really well compared to the industry average on things like this. But the trend is still heading in the wrong direction, and it's moving so quickly it carries even the best of us along for the ride.
    I can relate to that and it is a bit worrisome. I'm not sure much can be done when we consider what's causing such trends, other than convincing/motivating businesses, in a language they can grok and value, to justify doing things properly (or closer to it).

    Originally posted by Michael_S View Post
    So even by being efficient, you're going to cut people at the bottom out. And I'm not angry at you for that, because I understand you can't pay your bills catering to those people. But the result is the same - in 10 years you will be one of the comparatively efficient people, which by then will mean you're targeting smartphones that only have 2GB of RAM and 10 Mb/s connection speeds, while the rest of the market assumes 8GB of RAM and 50 Mb/s.
    Sure? I'm only talking about performance from a pragmatic standpoint. It takes more and more effort for declining ROI, especially when the visitor isn't likely to be in a position to become a customer. Supporting such users can additionally be a security risk.

    I don't think the resources need to grow like that. There's a consistent effort within the industry towards doing things better and more efficiently; those efforts take time, and I can't say how well they surface to non-devs who use software services to compose a website via a GUI.

    Some websites today can be built far more efficiently and easily than they would have been 5-10 years ago. A whole ecosystem contributes to making that possible, and complexity has increased notably to do it well, but that doesn't mean the benefits aren't accessible to less experienced developers. Gatsby is an example of that (although lately it's been giving WordPress vibes a bit, and I have seen some inexperienced developers bloat the weight of a page..).

    Originally posted by Michael_S View Post
    Remember, nobody uses an iPhone 1 for anything today. It's an antiquity. And it has more than 200 times the raw speed of the computer that powered the moon landings and more than 50,000 times as much memory.
    You're comparing very different devices. The iPhone was capable of considerably more than what you compared it to, which I doubt had much generic support. Do you want websites as fixed-width terminal output instead?

    Originally posted by Michael_S View Post
    Today, just 13 years later, it's good enough for nobody, and I'd be surprised if 10 of the 100 most popular sites on the internet work on it at all.
    It wouldn't function with the majority of websites at all.

    Even if it were able to use an updated browser, it would run into TLS 1.2 enforcement (all current browsers deprecate earlier versions with an error/warning page, ahead of removing support completely in near-future releases). TLS 1.2 was official from 2008 and more widely available in implementations around 2010, but server distro releases lacked packages until later releases, which also saw slow adoption; this is especially the case with mail servers. iOS 5 added TLS 1.2 support, I think, but only for HTTPS; for other protocols, like email, support was delayed until iOS 9.

    Not only that, the number of cipher suites that servers make available for the earlier protocol versions is fairly limited now. Some legacy clients, like old versions of Java, cannot use some of the cipher suites offering perfect forward secrecy, due to limitations in the Java runtime back then that would be considered insecure to allow now. You'll also find old mobile devices lacking hardware acceleration for encryption, where software AES is something like 30x slower; there's a modern alternative (ChaCha20-Poly1305) with near hardware-accelerated performance in software, but it was only standardised for TLS in 2016.
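
    For anyone curious what that looks like in practice, here's a minimal sketch (mine, not anything from this thread) that probes what a server will actually negotiate, using Node's built-in tls module; the host below is a placeholder, not a specific site:

    ```typescript
    // Probe which TLS protocol and cipher suite a server negotiates.
    // Sketch only; the host passed in is a placeholder.
    import * as tls from "tls";

    function probe(host: string, minVersion: tls.SecureVersion): void {
      const socket = tls.connect({ host, port: 443, servername: host, minVersion }, () => {
        // getProtocol()/getCipher() report what was actually negotiated.
        console.log(`${host}: ${socket.getProtocol()} / ${socket.getCipher().name}`);
        socket.end();
      });
      socket.on("error", (err) => {
        // A server stuck on TLS 1.0/1.1 (or offering only ciphers the client
        // rejects) fails the handshake once the client sets a floor.
        console.log(`${host}: handshake failed (${err.message})`);
      });
    }

    // Browsers now effectively set the floor at TLS 1.2; legacy clients are the
    // mirror image, with a ceiling below what servers will still accept.
    probe("example.com", "TLSv1.2");
    ```

    Flip minVersion to "TLSv1.3" and you can see which servers are ready for the newest protocol, too.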

    The browsers themselves, if not updated, would probably break in other ways too, due to lacking newer CSS features, browser APIs, and other support like image formats.

    I've had old devices from back then and remember even some native apps (Android) weren't all that snappy, especially on single-core, low-clockrate SoCs (I had a Samsung Galaxy S1, which was still quite a leap ahead of the original iPhone, I think). Websites weren't the only poor performers.

    I'm just glad we're not working with hardware resources like the ones I grew up with in the 90s; programming on those, especially games, I recall developers went to quite extensive lengths to be efficient. Now we have embedded hardware with that processing power for a few dollars or less, and it runs off bugger-all power.





  • Michael_S
    replied
    Originally posted by polarathene View Post

    I get what you're saying about software dev in general, but for the web it doesn't always apply: when you make money from customers and they are in regions or on devices where data is expensive and/or slow, reducing network overhead can make a world of difference.

    If the Webflow kitchen-sink demo were an actual product page for some business, you'd have an incredibly high bounce rate from all the customers who gave up before that 15MB finished downloading and the preload screen blocking any interaction was removed. I wasn't the only one; it was a sponsored Facebook ad, and the comments were full of people asking if the website was broken because nothing changed for over 30 seconds. For those with the patience or faster connections, the constant CPU overhead from animation logic further down the page was slowing down responsiveness.

    You don't have to go hardcore with optimizations, but a page should ideally be loaded and accessible within 3 seconds at most (based on UX studies). As a business, besides the desire to retain potential customer interest, you also don't want to waste money on bandwidth unnecessarily (it did not need to download 15MB per user to begin with). If it were a high-traffic site, they could have cut their bandwidth costs by 90%; depending on expenses, that could be a pretty decent saving.

    It's not always about saving money though; as you point out, other priorities mean convenience and speed of development come out on top. Using BunnyCDN instead of AWS services for the CDN alone has its own fair share of savings. Not having a high bounce rate due to poor UX is the bigger justification though.
    I understand what you're saying, and I know you and a small but significant portion of the people in our field are doing really well compared to the industry average on things like this. But the trend is still heading in the wrong direction, and it's moving so quickly it carries even the best of us along for the ride.

    So for example, you're targeting a 3-second page load time on low-end hardware over low bandwidth for customer engagement. You trim your website down to 80k of content, not counting images, and the JavaScript is optimized until it runs quickly enough not to ruin the user experience on a Moto G phone from 2015.

    The bright news is that your content is more efficient than literally 99% of other web content, and 90% of the people with a smart phone in the world can use your site.

    The bad news is that smart phones and tablets from 2010 which should be perfectly viable computing devices today still struggle with your content, or can't handle it at all. And spending the extra time to take the site content down from 80k to 40k or less will never pay for itself in increased customer engagement, because the people with devices that need that 40k improvement are by definition the ones with no discretionary money for your products.

    So even by being efficient, you're going to cut people at the bottom out. And I'm not angry at you for that, because I understand you can't pay your bills catering to those people. But the result is the same - in 10 years you will be one of the comparatively efficient people, which by then will mean you're targeting smartphones that only have 2GB of RAM and 10 Mb/s connection speeds, while the rest of the market assumes 8GB of RAM and 50 Mb/s.

    Remember, nobody uses an iPhone 1 for anything today. It's an antiquity. And it has more than 200 times the raw speed of the computer that powered the moon landings and more than 50,000 times as much memory. Apple sold millions of that phone and the users loved it. It was good enough for six million people in the first year it was out. Today, just 13 years later, it's good enough for nobody, and I'd be surprised if 10 of the 100 most popular sites on the internet work on it at all.



  • polarathene
    replied
    Originally posted by Citan View Post
    And that's putting aside the fact that Chrome doesn't even have the most basic functionality of a modern browser correctly supported: nestable tabs in a sidebar.
    Pretty sure I've seen something like that in one of the derivatives, Vivaldi or Opera.

    Chrome recently got a tab group feature; it might not be sidebar-managed, I only saw a brief clip of it, but it's probably good for organization. I've used an extension in the past that gave me a tree view of my session (windows + tabs, with tabs that had been navigated to from another tab nested under it in the tree); I think it supported other organizational features too.

    I think the main reasons I stopped using Firefox were when it changed extension support, preventing some APIs from working, and that saving my session (either by Firefox or the session extension, every 30 mins) would consume large amounts of RAM (10GB or so) and bring the browser to a halt from the CPU activity (this might have been before multi-process arrived). There were times it wasn't able to recover my sessions, and one update insisted on wiping my profile (tabs/bookmarks/etc.), which really annoyed me (the message wasn't that clear about what it was going to do, either).



  • polarathene
    replied
    Originally posted by Michael_S View Post
    Performance optimization for any software aside from games, video editors, and image editors is rarely a priority.

    I don't see that changing, really.
    I get what you're saying about software dev in general, but for the web it doesn't always apply: when you make money from customers and they are in regions or on devices where data is expensive and/or slow, reducing network overhead can make a world of difference.

    If the Webflow kitchen-sink demo were an actual product page for some business, you'd have an incredibly high bounce rate from all the customers who gave up before that 15MB finished downloading and the preload screen blocking any interaction was removed. I wasn't the only one; it was a sponsored Facebook ad, and the comments were full of people asking if the website was broken because nothing changed for over 30 seconds. For those with the patience or faster connections, the constant CPU overhead from animation logic further down the page was slowing down responsiveness.

    You don't have to go hardcore with optimizations, but a page should ideally be loaded and accessible within 3 seconds at most (based on UX studies). As a business, besides the desire to retain potential customer interest, you also don't want to waste money on bandwidth unnecessarily (it did not need to download 15MB per user to begin with). If it were a high-traffic site, they could have cut their bandwidth costs by 90%; depending on expenses, that could be a pretty decent saving.
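
    As a rough illustration of checking a page against that kind of budget (my own sketch with made-up budget numbers, runnable in any browser's devtools console), the standard Resource Timing API is enough:

    ```typescript
    // Estimate page weight and load time against a budget using standard browser APIs.
    // Budget values are hypothetical examples, not from any study cited above.
    const BUDGET_BYTES = 1_000_000; // e.g. 1MB over the wire
    const BUDGET_MS = 3_000;        // the ~3-second target mentioned above

    const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
    // transferSize is bytes over the wire (0 for cache hits and opaque cross-origin responses).
    const totalBytes = resources.reduce((sum, r) => sum + r.transferSize, 0);

    const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
    // loadEventEnd stays 0 until the load event has actually finished.
    const loadMs = nav && nav.loadEventEnd > 0 ? nav.loadEventEnd : performance.now();

    console.log(`~${(totalBytes / 1024).toFixed(0)} KiB transferred, loaded in ~${loadMs.toFixed(0)} ms`);
    console.log(totalBytes > BUDGET_BYTES || loadMs > BUDGET_MS ? "over budget" : "within budget");
    ```

    It undercounts cached and cross-origin resources, but it's enough to catch a 15MB page long before a user complains.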

    It's not always about saving money though; as you point out, other priorities mean convenience and speed of development come out on top. Using BunnyCDN instead of AWS services for the CDN alone has its own fair share of savings. Not having a high bounce rate due to poor UX is the bigger justification though.



  • Michael_S
    replied
    Originally posted by polarathene View Post
    The agencies that get clients sold on visuals/functionality alone and use website builders with minimal coding are usually the ones at fault for bloat, though. They prefer speed and meeting the client's requirements; it's not something they continue to benefit from afterwards by spending additional time/effort to optimize.

    It's the same reason AWS is popular with businesses (ones with actual developers): it makes things easier for them. There are better and cheaper alternatives, but it's easier to accept vendor lock-in and hire developers who know those platforms well; it's common knowledge, without having to upskill so much.
    Initially, time to market outweighs most other concerns, and in the long term, supporting APIs for long periods carries weight. Performance optimization for any software aside from games, video editors, and image editors is rarely a priority.

    I don't see that changing, really. If priorities were different, a PC from 2003 would be usable today. But I might as well ask for a unicorn.



  • enigmaxg2
    replied
    In other news, MS Edge for Linux has popped up in the AUR (and other distribution methods) and it totally DESTROYS the competition performance-wise.

    I have Manjaro KDE on a very old testbed (Athlon X2, 4 gigs of RAM, Radeon X1200). Firefox struggled a lot to run (and even froze the computer sometimes), Chrome was better, but Edge manages to run on that toaster better than anything else.

    Firefox and Chrome are now wiped from it.



  • Citan
    replied
    Originally posted by timofonic View Post
    Still slower than Chrome/Chromium, what a shame...
    On desktop, that assessment can hold true only as long as the following conditions are all fulfilled:
    1) No more than 10-20 tabs at any given time.
    2) No more than one extension, and none related to tab management.
    3) Adblocker installed and properly configured.

    Having run extensive tests with all browsers, and making extensive, near-industrial use of everything a browser can offer, first and foremost the combo of tab management + sessions + bookmarks, I can assure you that Chrome (and all the browsers derived from it, which is nearly all of them) is a real tadpole compared to Firefox as soon as you start doing serious work.

    And that's putting aside the fact that Chrome doesn't even have the most basic functionality of a modern browser correctly supported: nestable tabs in a sidebar.



  • SystemCrasher
    replied
    Originally posted by Adarion View Post
    WTF? A colourful web is okay, but this is just senseless bloat. And it makes things slow and wastes energy and time.

    But this is because "web designers" never learnt how to code, never handled an old computer (8086-class, or a C64 if you want, or microcontrollers). You learn to fight for every byte of RAM there...
    That's why web devs are colloquially known as webmonkeys. The bad news is that the Firefox team has gone webmonkey & marketing as well. That's why they're doomed in the first place. A few more years and we'll see the Mozilla corp leftovers sold to some obscure Chinese company or something like that, just like Opera was.



  • Michael_S
    replied
    Originally posted by Marc Driftmeyer View Post

    Question to you: are you paying the publishers for providing whatever site content you visit?
    Lemme see - I have a membership to TheGuardian, Arstechnica, Wired, SourceHut, and Phoronix, so when I visit those sites, yes.

    I have a simple web app I wrote that uses no cookies, a single CSS file for a very simple Grid layout, and no JS. Pages load instantly (well, it's written in a slow programming language so it still takes a few hundred ms - I plan to rewrite it) and the Firefox Task Manager says pages use from 500KB-1.3MB of RAM.
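
    For what it's worth, the whole shape of a page like that fits in a few lines. This is just a sketch of the same idea (one HTML document, one stylesheet, no JS, no cookies) in Node, not my actual app; the content and port are made up:

    ```typescript
    // Minimal no-JS, no-cookie page: one HTML document plus one small stylesheet.
    // Illustrative sketch only; content and port are invented for the example.
    import * as http from "http";

    const css = "body { display: grid; grid-template-columns: 1fr 3fr; gap: 1rem; }";
    const html = `<!doctype html>
    <html><head><link rel="stylesheet" href="/style.css"></head>
    <body><nav>menu</nav><main>content</main></body></html>`;

    http.createServer((req, res) => {
      // No Set-Cookie header and no <script> tags: the browser keeps nothing
      // alive beyond the DOM and one small stylesheet.
      if (req.url === "/style.css") {
        res.writeHead(200, { "Content-Type": "text/css" }).end(css);
      } else {
        res.writeHead(200, { "Content-Type": "text/html" }).end(html);
      }
    }).listen(8080);
    ```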

    When I visit SourceHut, which also doesn't use JS, most pages use under 1.5MB of RAM.
    When I visit Phoronix, the Firefox task manager states that the main Phoronix page uses 5MB of RAM and an individual article uses 2.6MB.
    When I visit my Roundcube page (an open source web-based email client), it takes 8MB of RAM with no messages open and 12MB or so with messages open.

    Wired gets into the 15-40MB range. Arstechnica and TheGuardian are somewhere in between.
    Yahoo Mail (admittedly, all of the following except JIRA are ad-funded) takes 40MB.
    Outlook and GMail take 70MB.
    Github pages take 80MB.
    Reddit pages take 100MB.
    Youtube pages take 100MB.
    JIRA (for work) takes 150MB-500MB.

    That's bloat for you. And Michael Larabel, by the looks of it, is doing pretty well at fighting it. I'd prefer a SourceHut level of efficiency, but an old computer can run 20 tabs of Phoronix.com for every Reddit tab.

    Should support for advertising really take web pages from using 5MB of RAM (Phoronix Premium) to 100MB (Reddit, Youtube)? Really?

