How Webpage Load Time Is Related to Visitor Loss

By Ryan Kelly, August 7, 2009 (updated September 10, 2014)

Have you ever been to a website that takes forever to load?  What do you do?

We’ve taken some past research and developed a way to estimate how many visitors you could potentially be losing based on your webpage load time, over a range of 0-30 seconds. This was not easy: only a couple of studies have actually been done, and not only are they “aging”, they have also been controversial and only cover roughly the first 4 seconds of load time. Obviously, many factors determine how long you are willing to wait for a page to load, and with tabbed browsing, faster connection speeds, and more, perhaps that is why a real study has not been done since 2006.

Here are some key takeaway points from the research we were able to come up with:

– Zona Research said in 1999 that you could lose up to 33% of your visitors if your page took more than 8 seconds to load.
– Akamai said in 2006 that you could lose up to 33% of your visitors if your page took more than 4 seconds to load on a broadband connection.
– Tests done at Amazon in 2007 revealed that for every 100ms increase in load time, sales would decrease 1%.
– Tests done at Google in 2006 revealed that going from 10 to 30 results per page increased load time by a mere 0.5 seconds, but resulted in a 20% drop in traffic.

Wow. Half a second? Is that even enough time to take a breath? Yet when browsing, most people will lose patience and leave your website before they even have time to breathe. How this relates to e-commerce sites is pretty important. If your website is selling a fairly generic item, your site had better load pretty damn fast or you just lost your sale to some other guy. At Christmas, when every parent is looking for this season’s must-have toy, you had better hope your website loads in under 2 seconds. When a husband forgets his anniversary and is quickly looking for a flower delivery place while the boss isn’t looking, your pictures had better not be so big that they take forever to load.
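
To put Amazon’s number in perspective, here is a quick back-of-the-envelope sketch in Python. The revenue figure is hypothetical, and extending the 1%-per-100ms relationship linearly is purely our assumption for illustration:

    # Back-of-the-envelope illustration of Amazon's reported 1% sales
    # drop per 100 ms of added load time. The revenue figure is
    # hypothetical, and the linear extrapolation is an assumption.
    baseline_revenue = 1_000_000  # hypothetical annual revenue, in dollars
    drop_per_100ms = 0.01         # 1% per 100 ms, per the 2007 Amazon tests

    for added_ms in (100, 500, 1000):
        lost = baseline_revenue * drop_per_100ms * (added_ms / 100)
        print(f"{added_ms:>4} ms slower -> ~${lost:,.0f} in lost sales per year")

At that rate, a one-second slowdown would cost roughly a tenth of your sales.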

So how long does your webpage take to load? Check out Pingdom.com/Tools, and then come back here and approximate your potential visitor loss:

[Graph: potential visitor loss by page load time]
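
If you would rather measure from a script than from a web tool, here is a minimal sketch in Python (ours, not Pingdom’s) that times a page download. The URL is a placeholder, and note that this measures transfer time only, not how long the browser takes to render the page:

    import time
    import urllib.request

    url = "http://www.example.com/"  # placeholder; substitute your own page

    start = time.time()
    with urllib.request.urlopen(url) as response:
        body = response.read()  # download the full page body
    elapsed = time.time() - start

    print(f"Downloaded {len(body):,} bytes in {elapsed:.2f} seconds")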

If you prefer to “geek out” and read our entire white paper, you can download it here. (Fair warning: it mentions terms like “mathematical model”, “radioactive first-order decay” and “non-linear regression”.)
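
For the curious, the general shape of such a model can be sketched in a few lines of Python using non-linear regression. The retention figures below are made-up stand-ins to show the technique, not the data points the white paper actually used:

    import numpy as np
    from scipy.optimize import curve_fit

    def retention(t, k):
        # Fraction of visitors retained at load time t: full retention
        # up to a 4-second grace period, then first-order ("radioactive")
        # decay, the general form the white paper describes.
        t = np.asarray(t, dtype=float)
        return np.where(t <= 4, 1.0, np.exp(-k * (t - 4)))

    # Illustrative stand-in observations, NOT the paper's actual inputs:
    t_obs = np.array([4.0, 8.0, 30.0])   # load times in seconds
    r_obs = np.array([1.0, 0.67, 0.05])  # fraction of visitors retained

    k_fit, _ = curve_fit(retention, t_obs, r_obs, p0=[0.2])
    print(f"fitted decay constant k = {k_fit[0]:.3f} per second")

    for t in (5, 10, 20, 30):
        lost = 1 - retention(t, k_fit[0]).item()
        print(f"{t:>2} s load time -> ~{lost:.0%} of visitors lost")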

Ryan Kelly


Ryan is the founder and CEO of Pear Analytics and has helped hundreds of customers with their Internet marketing since 2003. He has spoken on SEO, analytics, and other marketing topics at conferences in New York, Chicago, and Vancouver. His consulting clients include Sears, KMart, CareerBuilder, and PEER 1 Hosting. Ryan currently teaches two Internet marketing classes at Trinity University in San Antonio.

Comments

  • - says:

    Interesting and redundant.

    I managed to geek out for a while and read your white paper and some of the referenced material. I would like to add some points.

    The exponential function (or the radioactive decay function, as you have written) is a very standard function for describing many scientific phenomena, so it is understandable that you assumed it applicable to your case. However, even a Pareto function, which is more widely used for demographic, economic, and psychological phenomena, would have served the purpose. For that matter, there are numerous other functions that are similar in nature and shape, and a simple curve-fitting exercise would yield a number of different results.

    Moreover, in certain cases (like buying flight tickets, books, or apparel), most customers will choose to search and hang around a while longer if they can strike a better deal. So while a researcher managed to establish that 40% of online shoppers preferred sites that loaded faster, did he actually address the bias that comes with sampling (the price range of products purchased, the types of products, and other factors)? In other words, was the sample representative of all shoppers and products?

    Again, tolerable wait time depends on a lot of factors, as established in earlier research (for example, feedback on the page load time increases the tolerable wait to 38 seconds; I am quoting from one of your sources). There are a lot of Flash-heavy websites that are amazing to be on, and I would love to shop on one of those (have a look at some Webby-winning websites). Moreover, web traffic might not be a very accurate indicator of actual sales, because many users might simply have been “e-window shopping” or have stumbled upon the page.

    The point is that it is redundant to model the number of users lost over a 30-second period; that is probably one reason why not much research has been done in this area after 2006. It is fascinating to quantify something as abstract and complicated as human behavior, yet in this case it is not very accurate, as a case-by-case approach would be more relevant. What this paper sums up to is: “if you are selling something trivial, better keep your web pages as light as possible”.

  • Romy Misra says:

    A lot of research went into which curve to choose, and we did consider the Pareto curve in the preliminary analysis.

    The problem with the Pareto curve was that, apart from the minimum value of x (defined as 4), it has only one parameter, while we had two other points we knew the curve had to pass through. This creates a situation with infinitely many candidate Pareto curves, none of which pass through all the points the curve needs to pass through.

    The radioactive decay curve, on the other hand, was found to be the best approximation, with the lowest error percentage when fit to previous years’ data, which made it the best choice from a predictive-model perspective.
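
    To make that concrete, here is a small numerical sketch of the one-parameter problem, using hypothetical retention values rather than the actual data: fit the shape parameter to one known point, and the curve misses the other.

        import math

        x_min = 4.0  # load time (seconds) at which visitor decay begins

        def pareto_retention(t, alpha):
            # Fraction of visitors retained at load time t, for t >= x_min.
            return (x_min / t) ** alpha

        # Two hypothetical points the curve would have to pass through:
        p1, p2 = (8.0, 0.67), (30.0, 0.05)

        # Solve (x_min / t) ** alpha = r for alpha using the first point...
        alpha = math.log(p1[1]) / math.log(x_min / p1[0])
        print(f"alpha fitted to the first point: {alpha:.3f}")

        # ...then check the second: the one-parameter curve misses it.
        predicted = pareto_retention(p2[0], alpha)
        print(f"predicted retention at {p2[0]:.0f} s: {predicted:.0%} (needed {p2[1]:.0%})")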

  • - says:

    Thanks for the clarification on curve selection.


  • Adolfo says:

    As someone said, tolerance always depends on the kind of website you are visiting. For example, if you were looking up information on Wikipedia and the page took more than 10 seconds to load, you would obviously say “next”. But if you were shopping for a car or something like that, waiting a little probably doesn’t matter, because you want to be immersed in a multimedia experience.

    There is another factor: most people visit many pages at the same time, so when something takes longer than expected, you go read or look at other pages while that one loads. In this era everybody looks at many options to choose the best one. The key is the quality of the content: if your content is worth the wait, then you wait. Simple as that.

    Regards from Mexico.

  • Vasu Adiga says:

    Twitter would have no visitors left if this were true for all websites. Usefulness comes first, speed next.

  • Johan van de Merwe says:

    A nice article, but I have one critical point. More and more, commercial websites (like newspapers) include links to third-party providers of advertising content. This increases page load considerably, and still the number of ads keeps growing. On some newspaper sites it takes 10 to 20 seconds to load a page, and visitors still come back. In the article I didn’t find a relation between content quality and load time. If I really want to know or have something, I will hold my patience a little longer. And after all, Amazon (not always the fastest) is still the biggest. I think that to a certain extent your article also contains laboratory statistics. Just keep the quality of your content in shape to avoid hasty website zapping.

  • Ryan Kelly says:

    That’s a good point… people are willing to wait for the fail whale to go away so they can start tweeting their lives away again. I wonder if the same is true for e-commerce sites? I think people will wait for a slow page if they think it’s the best deal they can get.

  • Ryan Kelly says:

    Hi Johan,

    Your comment is similar to Vasu’s, and yes, I would agree with that point as well. We just don’t have any data relating content quality to load time, or “patience”, but I think it’s safe to say that people do wait longer for trusted brands, quality content, lower prices, etc. The problem is probably a bigger deal for unknown brands, which have only a fraction of a second to capture your attention before you move on to the next search result.


  • Astrogremlin says:

    first *oder* radioactivity (your last line).  I’ve heard of radioactive “decay” but thought it was odorless! 🙂
    Super article and especially appreciate the graph showing when traffic begins to decay.

  • dragonseoman says:

    I was going to ask about something Alex seems to have mentioned: things like jQuery image carousels or lightboxes, which I tend to put in before the page loads. It looks bad when the carousel images are large, or you have a lot of them, and you see them load on the page without the effect first… your thoughts on that?

  • Vennweb says:

    A nice article; this is something we are wrestling with in our company as we develop new pages.

    I think your graph is labeled incorrectly: a load time of 0-5 seconds shows 100% while 30 seconds shows 0, so it is plotting retention rather than loss. Change the title to visitor retention.

    I had a look at the Pingdom link; it appears to measure only the download time, not the execution of any scripting. I have found that on some sites scripting is a major factor in time to view, rather than download time. I am hoping to release an exe app for PC on my site soon which incorporates an IE9 element and measures time to visualization rather than time to transfer data. I anticipate adding a set of standard pages as a reference, so that the page being observed can be examined independently of the local connection speed.

    Check out our lab at vennweb.co.uk; it should be on-line for free download by the weekend.
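
    In the meantime, a rough way to measure time to visualization rather than transfer time is to script a real browser. Here is a minimal Python sketch using Selenium and the browser’s Navigation Timing API; it assumes a WebDriver such as chromedriver is installed, and the URL is a placeholder:

        from selenium import webdriver

        driver = webdriver.Chrome()  # assumes chromedriver is on your PATH
        try:
            driver.get("http://www.example.com/")  # placeholder URL
            # The Navigation Timing API distinguishes transfer time from
            # the time at which the page has fully loaded and rendered.
            timing = driver.execute_script(
                "var t = window.performance.timing;"
                "return {transfer: t.responseEnd - t.navigationStart,"
                "        loaded: t.loadEventEnd - t.navigationStart};"
            )
            print(f"bytes transferred after {timing['transfer']} ms")
            print(f"page fully loaded after {timing['loaded']} ms")
        finally:
            driver.quit()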

  • Ryan Kelly says:

    Thanks for catching that! I’ll have to update the graph…

  • TimDuckitt says:

    @- I run a small business and e-commerce site; feedback and experience have left me relying on our Facebook page for orders, because the host I chose (PowWeb) is just appalling.
    Load times and customer impatience are a real issue, no matter how big a block of refutation you post.

  • I often lose visitors this way; however, it has not cost me sales, considering my site is for buying and selling website promotion and advertising.
