
Poynter's Eye-tracking Follies

April 13, 2008

Poynter released their EyeTrack ’07 study findings some months ago, and while this effort was not a completely useless study like their previous one, I thought it would be good to revisit how these sorts of studies can unnecessarily distract us in our design efforts. What follows is an updated, edited, and expanded version of my previously published article on Poynter’s EyeTrack III study.

Update: The bulk of this article was originally published in 2006. Because this article refers to both EyeTrack III (Poynter's older study) and EyeTrack '07 (Poynter's latest study), the original article, republished here and mixed with observations on the new study, may cause confusion for some readers. For that, I sincerely apologize.

I ask that you keep in mind that the bulk of this article was written in the context of EyeTrack III, published in 2004. Poynter’s latest study for 2007 does not present nearly the level of irresponsibility that their previous study’s published “findings” embodied. The 2007 study does, however, have many flaws, and the vast majority of the findings could have been determined for free, simply by asking a competent designer. So while Poynter continues to waste money and present some flimsy data, they’re getting better. This is not saying much, though.

I hold that everyone should take anything Poynter offers up with an enormous grain of salt. They would generally seem to lack the design competence and understanding necessary for conceiving of and conducting their studies, and they certainly lack the credibility for presenting the conclusions they presume to put forth to their clients and to designers eager to learn.

So while Poynter shows signs of improving, this still constitutes little more than being not quite as awful as they used to be. I suggest that designers largely ignore what they present, except as cautionary tales for how not to utilize data that impacts our design efforts.

* * *
Poynter is about to begin the research phase (Update: they’ve now completed this study and have published findings) of its run-up to launching version 4.0 of its perennially popular game, The Poynter Institute’s Complete Waste of Time. Many of you may know it by its more common name, the EyeTrack study. It’s a game where knowledge is fun! – knowledge of anecdotal, irresponsibly useless, inapplicable trivia about the viewing habits of website visitors. Let’s play along.

Wait, before we play along, let’s first see if these guys know what they’re doing. I’ll save the suspense and skip to the end for you: no, they don’t know what they’re doing. Instead they’re wasting your time and the time of the editors and publishers they claim to want to help. Oh, and they’re wasting lots of money, too. In fact, if you have any doubts about whether or not Poynter understands design, you need look no further than their own website. The chaotic and psychedelic experience you’re presented with there is a clear indication of their grasp of design.

Not unexpectedly, in EyeTrack III Poynter completely ignores the fact that design exists and so fails to allow for the impact of design in the context of their study. Their work and published findings create and perpetuate vacuous ideas about Web page layout and design effectiveness. Contrary to their expressed purpose, Poynter would seem to want to keep online publications in the Stone Age while making it harder for design professionals to do their jobs and appropriately serve their clients (many of whom are online publications).

They do this harm by releasing “conclusions” that are completely anecdotal and which fail to account for varying and vital contexts. Virtually every one of their past conclusions from eye-tracking studies can be refuted or reversed with even the most basic application of design or a varied context. Further, they fail to account for the effect of perhaps the most important element in the study material: the content. The results can be nothing but irresponsibly flimsy data.

They’re making these mistakes, I believe, largely because they’re clearly not designers! The Poynter Institute is “a school for journalists, future journalists and teachers of journalists.” That’s great, but it clearly does not stand them in good stead when they so casually and unwittingly wander into the realm of design.

If I sound a bit indignant it’s because I care about my profession and its people—students and pros—so I’m particular when it comes to what others offer up to us as fodder for consumption. If someone is going to presume to teach facets of design to designers, or anyone else, they’d better damn well get it right. Poynter gets it wrong.

Eye-tracking Myopia

Here are a few examples of Poynter’s EyeTrack conclusions. Let’s examine some of their claims from the studies about online readers’ habits.

1. Users spend a good deal of time initially looking at the top left of the page and upper portion of the page before moving down and right-ward.

Not if the designer doesn’t want that to happen. What they’re referring to here are the behaviors of their study subjects when presented with the specific layouts used in the study. Any competent designer can craft a layout and design to elicit any specific entry/focus behavior on the page.

2. Ads perform better in the left hand column over the right column of a page. The right column is treated by users as an "after-thought" area and should be designed with that in mind.

Hogwash. Yes, a designer can make this so, but she can also design the page so that ads perform better in any area of it. This is, in fact, the designer’s responsibility on many projects. The “after-thought” area of a page is created by the design, if it is designed to exist at all, and is not always relegated to the right column.

3. Navigation placed at the top of a homepage performed best.

…For the specific designs used in the study, perhaps. But this again is wholly contextual to the design, the content, and the intent of the designer.

This is the sort of pap they would have designers, publishers, and editors believe and invest in. In their current pre-study promo article, they say, “Because the study adheres to the highest research standards, we'll be able to offer industry leaders scientific accuracy on which to base the editing decisions they make every day.” However, as their conclusions are farcical and myopic, they don’t actually offer much in the way of valuable information. Their definitions of “the highest research standards” and “scientific accuracy” would seem to be unreliable.

Misapplication

One of the reasons they so completely miss the mark with their 2004 study is that they don’t fully understand the usefulness of eye-tracking studies, they abuse their purpose, and they mischaracterize the applicability of the data. Eye-tracking studies as they conduct them have one and only one use: the evaluation of a specific design as applied to a specific layout in a specifically relevant context. In other words, what can be gleaned from the properly conducted eye-tracking evaluation of one design/layout is almost completely worthless to any other page or context, UNLESS you are a designer and practiced at accurately extrapolating variations based on the data.

With the 2007 study, Poynter actually did have an expressed purpose that would seem to be valid: to reveal the differences and similarities in reading print and online publications, and how specific elements factored into that experience. Unlike with the 2004 study, they did not publish purely anecdotal findings and then suggest supposedly concrete, broadly applicable standards for design.

If, however, you read how they conducted the 2007 study, the errors and problems become apparent. What’s more, you’ll notice that they would seem to have wasted many people’s time, and who knows how much money, unnecessarily, because the vast majority of the findings could have been determined simply by asking a competent designer or two.

Another of the problems inherent in these studies is the fact that they invariably use subject publications that have serious design and layout flaws (as they may have done with this latest study; we just don’t know because they don’t publish all of those details). Doing so results in flawed data, because conclusions drawn from examinations of poor-quality designs in an artificially created context are generally inapplicable to good design and natural contexts. This is something they either don’t grasp or simply don’t care to mention.

I wonder: did it ever occur to Poynter to consult designers about the design quality of the subject publications and the context under which the studies would be conducted? Apparently not, because their studies don’t appear to consider even the fundamental sort of end-user roles designers consider as a matter of course in the average design process. Context matters. Content matters. Design matters. Poynter (and others) can’t be bothered by these trivial facts.

For instance, Poynter (and every other eye-tracking study I’ve ever seen) presumes to scientifically measure how deeply people read into articles. This sort of data might be generally applicable to similar types of online news sites, but it’s wholly inapplicable to specialized publications or commercial sites because of one vital factor: the reader’s interest in the content subject matter. Anyone not interested in the content will read very little of it. This behavior has nothing at all to do with the layout configuration or font size. Yet none of these studies seem to take this vital bit of context into account. And this is only one contextual factor. There are many others.

Any competent designer can design and lay out a page and its content so that the content is easily and readily consumed in full (in the right context), or so that it is merely scanned. Any competent designer can design a page to compel readers to concentrate on one specific area or to systematically cover the entirety of the page’s content. But again, context matters and impacts reader behavior. Good designers take context into account.

I’m greatly disappointed that Poynter makes such feeble and casual forays into design matters without accounting for relevant context issues. This is irresponsible of them, and the fact that their work is held up as consequential and important harms the design profession, not to mention their own credibility. It also misleads our clients as to what sort of understanding is most relevant (study findings or an individual’s design competence?).

Designers should first rely on their own practiced understanding of these issues before swallowing what others offer up. I’ve said it before and I’ll continue to maintain this theme: we designers must know our craft and understand the psychological and behavioral impact of our design decisions. Without this knowledge we can easily fall prey to the naïve pseudoscience common to these eye-tracking studies.

In short, if you want to know how a reader will consume a page of content in a specific context, don’t ask a journalism study coordinator; ask a competent designer.
