Call-to-Action: benchmarking 10 web services

by Paul Veugen

The sign-up button or link is an important call-to-action on the homepage of most web services. In a recent demo case, Usabilla compared the sign-up call-to-action on the homepages of 10 different web services. Users found the sign-up button on the Twitter homepage in 1.8 seconds. Animoto was a good runner-up at 2.3 seconds. On average, it took participants 3.5 seconds to find a way to sign up for these web services.

The differences in performance on this important task are considerable. But what makes the Twitter homepage stand out in this test? Why do the sign-up buttons at Animoto, Vimeo, and MyNameisE catch attention faster than those of Wakoopa, Basecamp, and PayPal? We would love to hear your opinion about these test results.


Twitter – 1.8 seconds

Twitter - Heatmap

Animoto – 2.3 seconds

Animoto - Heatmap

Vimeo – 2.9 seconds

Vimeo - Heatmap

MyNameisE – 3 seconds

MyNameisE - Heatmap

Usabilla – 3 seconds

Usabilla - Heatmap

Facebook – 3 seconds

Facebook - Heatmap

Foursquare – 3.3 seconds

Foursquare - Heatmap

Wakoopa – 4.3 seconds

Wakoopa - Heatmap

Basecamp – 5.3 seconds

Basecamp - Heatmap

PayPal – 6 seconds

PayPal - Heatmap
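As a sanity check, the per-site times listed above reproduce the 3.5-second average quoted in the introduction:

```python
# Time to find the sign-up call-to-action, in seconds per site,
# copied from the results above.
times = {
    "Twitter": 1.8, "Animoto": 2.3, "Vimeo": 2.9, "MyNameisE": 3.0,
    "Usabilla": 3.0, "Facebook": 3.0, "Foursquare": 3.3,
    "Wakoopa": 4.3, "Basecamp": 5.3, "PayPal": 6.0,
}

average = sum(times.values()) / len(times)
print(f"Average time to find the sign-up: {average:.1f} seconds")  # 3.5
```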

Test data

Tested with Usabilla

This usability test was conducted with the online usability tool Usabilla. Usabilla allows you to collect direct and valuable feedback from your users at any stage of the design process. Users share their opinion or perform tasks by adding points and notes directly on a webpage, mockup, wireframe, screenshot, or image. Sign up for a free account to try it yourself.

Article by

Paul Veugen

Founder / CEO @ Usabilla. User experience designer, entrepreneur, and metrics junkie.

Share your thoughts

  • Cristian Cretu

    I appreciate the effort you put into this test.

    I would be more interested in a discussion of a test run within a single website category (e.g. video-sharing homepages).

    Comparing the Basecamp and Twitter homepages is, from my point of view, not very useful.

  • Thanks Cristian. Comparing multiple websites in the same category could be very interesting. If we do a second round, we'll try to pick websites in comparable categories.

  • Nice study!

    I agree with Cristian, but I still think this study clearly demonstrates two factors that influence time and success rate:

    1. The number of outbound links/form fields on the pages you tested. Less choice leads to faster decisions and fewer errors.

    2. Placement of the main sign-up call-to-action. Central placement in the primary viewing area (above the fold) seems to work best.

  • Maybe I misunderstood, but a small correction:

    I think the time measured here is really the call-to-action's time to success. It includes both the time to first fixation and the time needed to understand the button/link.

    So it might be better to call it "time to success": the first fixation only means the element was found, while the click event is what defines the success rate and the time to success.

    Sorry, my English isn't good; I hope you can understand what I'm trying to say.

    But this report is fantastic, you are the star!! :)

    Kind regards,

  • @Szabolcs

    We measured the time between the start of a task (which starts when a user has closed the task introduction) and a click (when a user clicks anywhere on the screen). That's not completely accurate. What we should have measured is the time it took to successfully click (on the right spot). We're going to implement the option to display the average time for all clicks in a certain area in the near future.
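The distinction discussed above can be sketched with a hypothetical click log; the record layout (the `t` and `on_target` fields) is made up for illustration, not Usabilla's actual data format:

```python
# Hypothetical click log for one task: each click has a timestamp (seconds
# since the task introduction was closed) and whether it hit the target area.
clicks = [
    {"t": 2.1, "on_target": False},  # first click, missed the sign-up button
    {"t": 4.8, "on_target": True},   # later click on the right spot
]

# What the test measured: time until the first click anywhere on the screen.
time_to_first_click = clicks[0]["t"]

# What the comment suggests reporting: time until the first successful click.
time_to_success = next(c["t"] for c in clicks if c["on_target"])

print(time_to_first_click, time_to_success)  # 2.1 4.8
```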
