Is Focusing on Browser Performance Important?

Rob Williams

Editor-in-Chief
Staff member
Moderator
Whenever a new version of a Web browser is released, the release notes are sure to mention something to do with performance. It could be anything from a more optimized JavaScript engine to a smaller memory footprint. All in all, any performance increase is appreciated, but I do wonder whether such a hardcore focus is necessary.


Read the rest of our post and then discuss it here!
 

RainMotorsports

Partition Master
Yeah, I don't understand it at all. If a page appears instantaneously after onmouseclickup, do I really need it any faster? Most pages that don't load that fast for me are on shared servers, slow connections, etc. While many do take as long as a second, some only take a fraction of that.

As an occasional web designer, I understand that dynamic pages occasionally need to load another page, and the speed can affect the feel, fluidity, and enjoyment of using a site.

I think low power computing is where browser performance really starts to matter.

With Fx's bad habits, memory usage will always matter :)
 

marfig

No ROM battery
I wrote about it last year on my blog. It was mostly aimed at programmers, but I used web browsers specifically as the basis for my argument. Here's the link: The false prophets of performance

And here's an excerpt if you don't feel like reading the whole thing:

[...] But let's not rush to a conclusion yet. 300 milliseconds is still a significant period of time. More significant than, say, 1 millisecond. So we need to start contextualizing. Let us assume we are deciding on what engine to choose for a web browser. Now, this is an event-driven application. An application that is event-driven will have different performance requirements. Can we agree? I mean, the normal usage pattern is to click links and bookmarks or enter web addresses to fetch, parse and render a new web page. Under these conditions, that 300 millisecond difference between Rabbit and Turtle is less significant than it would be if we needed the algorithm for an application meant to fetch a large number of pages in a loop. In that case, if the application were to fetch 100 pages in quick succession, Turtle would require 2:20 minutes, while Rabbit would only need 1:50 minutes. A 30 second difference, meaning Rabbit can fetch more pages in the same amount of time. But since this is an event-driven application, we can't really support this argument for Rabbit, since event-driven applications depend on the user activating the algorithm at irregular, and many times quite spaced, intervals. I can stay on a page anywhere from 5 to 30 minutes to a few hours before moving to a different web page. As far as a web browser usage pattern is concerned, a 300 millisecond difference is still just a 300 millisecond difference at the end of the day.
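For the curious, here's the back-of-envelope arithmetic behind the quoted numbers as a minimal Python sketch. The per-page times are inferred from the quoted totals, and the 5-minute dwell time is purely a made-up assumption for illustration:

```python
# Back-of-envelope check of the Rabbit/Turtle numbers in the excerpt above.
# Per-page times are inferred from the quoted totals (100 pages: 2:20 vs 1:50),
# i.e. 1.4 s and 1.1 s per page -- a 300 ms gap.

PAGES = 100
turtle_per_page = 1.4  # seconds per page (hypothetical "Turtle" engine)
rabbit_per_page = 1.1  # seconds per page (hypothetical "Rabbit" engine)

turtle_total = PAGES * turtle_per_page  # 140 s -> 2:20
rabbit_total = PAGES * rabbit_per_page  # 110 s -> 1:50
print(f"Turtle: {turtle_total:.0f} s, Rabbit: {rabbit_total:.0f} s, "
      f"gap: {turtle_total - rabbit_total:.0f} s")

# In an event-driven browser the pages are not fetched back to back; the user
# dwells on each page, so the 300 ms gap never gets a chance to accumulate.
dwell_s = 5 * 60  # assume a (made-up) 5-minute dwell time per page
gap_share = 0.3 / (dwell_s + rabbit_per_page)
print(f"Share of the session the 300 ms gap represents: {gap_share:.4%}")
```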

In that article I essentially argue that performance is never a requirement after a certain usability threshold is met. That is, after a web page loads at a certain speed, any more performance becomes useless. The user won't care. But more than that -- after a certain threshold, any performance gain becomes imperceptible to the human eye or mind.

I don't explore in that article another characteristic of this so-called "useless performance". Bottlenecks will rapidly move to other parts of the system, once performance reaches a certain level. If you consider the user as part of the entire "system" that makes up web browser usage, it's very easy to see that a user's reading speed, reading comprehension, keyboard and mouse skills, and hand-eye coordination are all slower processes than the time it takes for a browser to load a web page. So, overall, any software-driven performance gains after a certain threshold will also be completely negated by the user's own usage of that web browser.

Essentially, I think we reached good parsing performance levels some 3 or 4 years ago. Since then the real bottleneck has been our bandwidth and that of the destination servers. Any additional speed is, at least to me, not something I care about when choosing a browser. The whole "which browser is faster" debate we sometimes see on the web has thus become hollow, and one I'm completely uninterested in. I have a lot more interest in browser features, memory consumption and usability.
 

DarkStarr

Tech Monkey
Meh, I have no real complaints. If you have a decently new PC, it shouldn't matter how much RAM it uses. That and my internet connection is typically more than fast enough to load a page within a few seconds.

Currently FF is using 1GB with about 90 tabs open, and I am using only 19% of all my RAM with The Sims 3 and several other programs open. Then again, my system is above average: 16GB of RAM and about 3.5GB used. My other machine has 8GB and maybe 2GB used. The laptop has 6GB, so I mean really, as long as a machine is within, say, 5 years old, it should have plenty of RAM or be easily upgradable.
 

Kougar

Techgage Staff
Staff member
I'll agree a bit with what marfig already said: after a certain point, performance isn't a consideration for me. Most browsers have reached that point.

Stability, features, and footprint are my selection criteria. Well, and interface + customization, I guess. Nothing much seems to be changing on the interface and customization front with browsers, although as much as I like Opera, I am finding it increasingly annoying that they break more and more features with every subsequent release. After using Opera for several years running, I can safely say I now have a better understanding of why their market share is never able to grow.
 

Psi*

Tech Monkey
I am teetering on boosting my download to 101 Mbps. If I do follow through with that, perhaps there are some tests that could be devised that I could run.

In other words, I think browser speed is mysteriously governed by the ISP download speed. Although your testing doesn't really support that... I still think it.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
marfig said:
In that article I essentially argue that performance is never a requirement after a certain usability threshold is met.

This is the thinking I've had for a while. In your example, you mention that differences could be noticeable if dealing with 100 pages, but that's not a typical workload, nor do most people run more than a couple of tabs at once (I guess DarkStarr is a major exception). At the point where someone opens up a slew of pages, one would think Internet latency would be the bottleneck, not browser speed.

marfig said:
Bottlenecks will rapidly move to other parts of the system, once performance reaches a certain level.

That's the thing... moving forward the GPU will play a bigger role, but that will mostly have to do with video acceleration and gaming, where 10% faster JavaScript performance wouldn't be appreciated.

Psi* said:
I am teetering on boosting my download to 101 Mbps. If I do follow through with that, perhaps there are some tests that could be devised that I could run.

A common misconception that some people have is that having a faster download speed means that pages will load faster. While it's true that larger elements will download faster, latency is the true ruler of them all. It'd be possible to have a 100Mbit/s connection, but still have webpages take a couple of seconds to load, because the connection has to be established with the server before elements begin downloading.

A perfect example of this is HDD vs. SSD. On an SSD, applications load faster, but it's not because the transfer speed is so high. Rather it's because the latency is much, much lower (0.1ms vs. 10ms, roughly). The same basic principles apply to the Internet.
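To put rough numbers on that, here's a minimal sketch of a page load modeled as one round trip per element plus transfer time. All of the figures below (element count, sizes, 30ms latency) are my own illustrative assumptions, not measurements of any real page:

```python
# Naive serial-fetch model of a page load: every element costs one round trip
# (latency) plus its transfer time (size / bandwidth). All numbers here are
# illustrative assumptions, not measurements.

def load_time_s(num_elements, avg_kb, latency_ms, mbit_per_s):
    transfer = (num_elements * avg_kb * 8) / (mbit_per_s * 1000)  # KB -> kbit
    round_trips = num_elements * (latency_ms / 1000.0)
    return transfer + round_trips

# A hypothetical page: 60 elements averaging 20 KB each, 30 ms to the server.
for speed in (10, 100):  # Mbit/s
    total = load_time_s(num_elements=60, avg_kb=20, latency_ms=30, mbit_per_s=speed)
    print(f"{speed:>3} Mbit/s: ~{total:.2f} s")

# 10x the bandwidth only shrinks the transfer portion; the 1.8 s of pure
# round-trip latency in this toy model stays exactly the same.
```

Even in this crude model, going from 10 to 100Mbit/s only trims the load from roughly 2.8s to 1.9s, because the round trips don't get any cheaper.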

With a 100Mbit/s connection, it'd be -hard- to properly test it. The best test might be to queue up a ton of torrents and then begin downloading them all at once. But by the time a lot of them began downloading, half of them might already be complete ;-)

Oh, and I'd be jealous of that connection. Just an FYI.
 

Kougar

Techgage Staff
Staff member
Thanks Rob, you took the words right out of my mouth! I was getting all set to use that HDD example too. :D

To add to Rob's post, I'd like to point out that if the user is on WiFi, then the latency and typical interference involved will hide any boost to the actual ISP speeds. WiFi itself can easily increase page load times depending on a large range of factors, especially signal strength. Perhaps it is just my laptop, but I've never felt pages load even half as fast over WiFi as they do on my wired connection.

I'm not trying to promote Opera here, but incidentally this is why Opera devised "Opera Turbo"... they decrease page load times by decreasing the data sent down the pipe. I imagine it'd be a huge boost to dial-up and even some WiFi users for that reason.
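As a rough illustration of why sending less data helps on a slow link, here's a toy sketch. The page weight, compression ratio, and link speeds are all made-up numbers for illustration, not anything published about Opera Turbo:

```python
# Toy illustration of what shrinking the payload buys on a slow link, in the
# spirit of a compressing proxy like Opera Turbo. Page weight, compression
# ratio, and link speeds are all made-up assumptions.

PAGE_KB = 1200        # hypothetical uncompressed page weight
COMPRESSED_KB = 400   # assume the proxy shrinks it to roughly a third

def transfer_s(kilobytes, kbit_per_s):
    return kilobytes * 8 / kbit_per_s

for name, kbps in (("dial-up, 56 kbit/s", 56), ("weak WiFi, 2 Mbit/s", 2000)):
    print(f"{name}: {transfer_s(PAGE_KB, kbps):.1f} s "
          f"-> {transfer_s(COMPRESSED_KB, kbps):.1f} s")
```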
 

DarkStarr

Tech Monkey
I agree WiFi can slow down your connection. I have used Clear 4G and it was decent until you wanted to do anything ping-based. Horrible ping of ~500ms, so no XBL or anything. Oh, and Kougar, Opera is pretty good, but I don't really use anything but the stock browser, so I have no experience with the broken features.

On another slightly related note:

WOW. Maybe I am a noob here, but... can anyone who has Safari and uses it a bit check this? I was poking around to delete crap on my drive, and under this location - C:\Users\DarkStarr\AppData\Local\Apple Computer\Safari\Webpage Previews - guess what I found? ~90MB of images, images of pretty much every time I have used Safari. Personally, that's effing creepy and I am getting rid of Safari.
 

Rob Williams

Editor-in-Chief
Staff member
Moderator
Kougar said:
To add to Rob's post, I'd like to point out that if the user is on WiFi, then the latency and typical interference involved will hide any boost to the actual ISP speeds. WiFi itself can easily increase page load times depending on a large range of factors, especially signal strength. Perhaps it is just my laptop, but I've never felt pages load even half as fast over WiFi as they do on my wired connection.

I have not experienced that at all, to be honest. I just did a test with both my desktop and laptop (both are right beside each other), and loaded up a page I haven't visited in a while. Both effectively finished loading the page at the exact same time.

Wireless -will- add some latency, but it should never be that much. While my desktop pings the router at well under 1ms, my laptop pings it at about 2ms (the router is two rooms away, so I consider this to be quite good).

If a regular ping time to a server is about 30ms, a mere 2ms piled on shouldn't reveal a noticeable difference. You wouldn't think so, anyway.
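For what it's worth, the arithmetic backs that up. A quick sketch, with the base latency and the number of round trips per page being illustrative assumptions rather than measurements:

```python
# Sanity check on the "2 ms extra hop" point. Base latency and the number of
# round trips per page load are illustrative assumptions.

base_rtt_ms = 30    # typical ping to a web server
wifi_extra_ms = 2   # added by the wireless hop to the router
round_trips = 20    # hypothetical round trips needed to finish loading a page

wired_ms = round_trips * base_rtt_ms
wireless_ms = round_trips * (base_rtt_ms + wifi_extra_ms)
print(f"wired: {wired_ms} ms, wireless: {wireless_ms} ms, "
      f"penalty: {wireless_ms - wired_ms} ms")
# A 40 ms penalty spread over a ~600 ms load is well below what anyone notices.
```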
 

Kougar

Techgage Staff
Staff member
DarkStarr said:
WOW. Maybe I am a noob here, but... can anyone who has Safari and uses it a bit check this? I was poking around to delete crap on my drive, and under this location - C:\Users\DarkStarr\AppData\Local\Apple Computer\Safari\Webpage Previews - guess what I found? ~90MB of images, images of pretty much every time I have used Safari. Personally, that's effing creepy and I am getting rid of Safari.

I'd always avoided Safari, but that's news to me? Are they like screenshots of pages you were viewing or something??

Rob Williams said:
Wireless -will- add some latency, but it should never be that much. While my desktop pings the router at well under 1ms, my laptop pings it at about 2ms (the router is two rooms away, so I consider this to be quite good).

If a regular ping time to a server is about 30ms, a mere 2ms piled on shouldn't reveal a noticeable difference. You wouldn't think so, anyway.

2ms sounds pretty tight to me. But WiFi is typically used when mobile, in less-than-perfect environments. Take any college campus, public hotspot, restaurant, conference, or hotel, and pages always load slower over the WiFi regardless. So I really don't see a point in quibbling over milliseconds... only something like Opera Turbo, which actually routes data to Opera's server, where it's compressed and then sent on to the user, has the potential to boost load speeds in those sorts of situations. My point being, any browser can be the fastest on earth, but it still has to wait to load a page until it receives the data.
 

DarkStarr

Tech Monkey
Yeah, they are screenies of sites I was on, nothing too major, just weird. It also grabbed shots of me poking through my router.
 