With 1,500 Website User Experience Reviews under its belt, Forrester has come out with a jaw-dropping finding, according to a Forrester blog post:
“only 3% of the sites earned a passing score (that’s a total of 45 sites out of the 1,500. Yes, you read that right: 45).”
Websites failed their users most often by being sloppy at the basics, including text legibility, use of space, task flow, and links to privacy and security policies.
Forrester analyst Adele Sage notes in her blog post that “some of these problems are relatively easy to fix, like increasing font sizes and foreground-background contrast to make text easy to read.”
The global research and advisory firm has been looking at web user experience for more than 12 years, it says. It evaluated B2C and B2B sites, intranets, and employee portals on 25 criteria, each graded -2 (severe failure), -1 (fail), +1 (pass), or +2 (best practice). Total scores could range from -50 to +50. Forrester’s passing score is +25 or higher.
It’s worth noting that this is tough grading. If, for instance, a site passed 24 criteria (scoring +24) and then failed the 25th, its final grade would be +23, a fail by Forrester’s standards. It’s also worth noting that the data includes sites reviewed as far back as 1999, when everything, including design, was more rudimentary than it is today.
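The scoring arithmetic described above can be sketched in a few lines. This is purely an illustration of the published scheme; the function names, validation, and example grades are mine, not Forrester's:

```python
# Sketch of Forrester's scoring scheme as described in the text:
# 25 criteria, each graded -2, -1, +1, or +2; passing bar is +25.
VALID_GRADES = {-2, -1, 1, 2}

def site_score(grades):
    """Sum the 25 criterion grades into a total from -50 to +50."""
    if len(grades) != 25 or any(g not in VALID_GRADES for g in grades):
        raise ValueError("expected 25 grades, each in {-2, -1, +1, +2}")
    return sum(grades)

def passes(score, threshold=25):
    """Forrester's passing score is +25 or higher."""
    return score >= threshold

# The near-miss example from the text: 24 passes (+1 each) plus one fail (-1).
score = site_score([1] * 24 + [-1])
print(score, passes(score))  # 23 False
```

This makes the harshness of the bar concrete: even a single -1 among otherwise uniform +1s drops a site below the +25 threshold, since a perfect-pass score of +25 leaves no slack.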
Still, many sites seem to be doing about as many things wrong as right, because “the average score across all of our reviews was only +1.1,” writes Sage.
The one beacon of light in the findings? “There was a significant increase in the average score over the years just prior — a trend we hope to see continue,” writes Sage. “There’s a similar pattern when we compare B2C and B2B sites.”