Tuesday, April 28, 2009

Improving Readability: Data

I claimed in last week's post about empathy that you should never listen to your players. To better illustrate what I mean, watch this talk about spaghetti sauce. Unlike all those other boring talks about spaghetti sauce that you've seen, this one is by Malcolm Gladwell at TED. I have no hesitation in saying this is the best pasta-related talk I've ever seen. Seriously, go watch it, I'll wait.

There are two excellent points being made in Gladwell's discussion of Howard Moskowitz's experimentation. The first comes early, in Moskowitz's observation that the search for the perfect Pepsi should have really been the search for the perfect Pepsis. There isn't necessarily a perfectly satisfactory recipe (read: design) that will suit everyone. If something isn't working, finding the median isn't necessarily the right solution. Diverge towards two alternatives and satisfy two groups instead of serving everyone poorly.

Gladwell's second point demonstrates why you should never listen to your players. In the search for the perfect spaghetti sauce, Moskowitz tested countless varieties, none of which the market showed any indication of wanting. He was rigorous and thorough in his evaluation. Game design should demonstrate the exact same rigor.

Mike Ambinder, a Ph.D. cognitive psychologist at Valve, gave a fantastic talk at GDC about applying clinical research methodologies to create better games. He opened with the claim that game design is a hypothesis and playtesting is an experiment. Essentially, he said that evaluating game design ought to be treated like a science. This is exactly what Moskowitz did to find Prego's troika of ideal spaghetti sauces.
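
I won't pretend to reproduce Ambinder's actual methodology here, but to make the "design is a hypothesis, playtest is an experiment" framing concrete, here's a minimal sketch in Python. Everything in it is invented for illustration: the metric (level completion time), the numbers, and the 0.05 threshold are my assumptions, not data from any real playtest.

# A hypothetical sketch: treat a design change as a hypothesis and a
# playtest as an experiment. All numbers below are made up.
from scipy import stats

# Hypothesis: the revised level signposting (design B) helps players
# finish the level faster than the original layout (design A).
design_a_times = [412, 388, 455, 430, 401, 398, 444, 420]  # seconds
design_b_times = [365, 372, 340, 398, 355, 381, 349, 367]  # seconds

# Two-sample t-test: is the observed difference likely real, or could it
# plausibly be playtester-to-playtester noise?
t_stat, p_value = stats.ttest_ind(design_a_times, design_b_times)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is unlikely to be noise; the hypothesis holds up so far.")
else:
    print("Not enough evidence yet; test with more players before deciding.")

The particular statistic isn't the point. The point is that the conclusion comes from observed behaviour, not from whichever playtester happened to agree with the designer.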

Leigh mentioned this on our podcast, saying some developers will respond to unfavourable playtest results with "Those playtesters were dumb and just didn't get it." Then they'll repeat the test until they find playtesters who provide satisfactory results. Essentially, they're looking for data that supports the hypothesis they want and disregarding any contradictory evidence. This is, of course, the worst kind of bad science. It's actually worse than not testing at all, because without testing there may at least still be some doubt about the design's validity.

As I discussed in the last post, the development team is too close to the project to evaluate it objectively. Data-based analysis is vital because it provides the objectivity our knowledge of the game intrinsically precludes.

While this may seem obvious, it's pretty clear that some studios lack dedication to this kind of evaluation. To be fair, few have Valve's unending wellspring of money and time, which makes these experiments much easier to perform.

Finally, when I say you should never listen to your players, I'm being hyperbolic (but not by much). Playtesters probably shouldn't be gagged on entering the building. Playtester surveys and post-hoc discussions are an important part of playtesting. But they're good for identifying problems, not solutions.

Watching what playtesters do is more important than listening to what they say. And when even successful game designers can be dead wrong about what a game is missing, how much weight a random player's proposed solutions deserve should be obvious.

In an attempt to keep this post from becoming truly colossal, I'll defer to a few links for some excellent examples of how to perform this kind of data-driven analysis. Mike Darga, a designer at Cryptic Studios, has been running an outstanding series about exactly this. And it's great to see we're on the same page about the importance of this practice.

This weekend I'll post the culmination of this series and finally provide some tangible examples of my own. I'll be discussing how the adventure genre ate itself and how I believe readability issues contributed significantly to this tragic autocannibalism.


6 Comments:

Blogger Alan Jack said...

http://www.facebook.com/note.php?note_id=79711598368&ref=mf

Denki use this method when testing their games. Thoroughly agree, and thank you for the link to the TED talk - I recently had my first experience of attempting a scientific study involving gathering responses from the public. Wish I'd watched that video before I started.

April 29, 2009 at 8:14 AM  
Blogger Nels Anderson said...

@Alan Heh, that's awesome.

I think it's important not to rely too heavily on in-house testing, even if participants are "kidnapped," since there's still bias there. They're familiar with the projects and might be less forthcoming with negative feedback. People don't always want to dish on coworkers' efforts, after all.

Nintendo has it a little bit easier since they have so many employees.

In-house is fine to test the basics of whether or not something works, but to get truly unbiased evaluation, you need completely fresh eyes and minds.

And yeah, who knew one could get so much out of a talk about spaghetti sauce, eh?

April 29, 2009 at 9:16 AM  
Anonymous said...

This is a great post! You've touched on several topics here that I really believe in:

Make players happier by making more focused games for more specific audiences.

Treat game design as a hypothesis and test against it heavily. You don't get to decide if you're right; your players do. I wish I'd seen that GDC talk.

Don't listen to your players, watch them. They don't know what makes them happy. That spaghetti talk was really excellent and enlightening.

Thanks for the links as well.

Mike
mikedarga.blogspot.com

April 30, 2009 at 11:44 PM  
Blogger Nels Anderson said...

@Mike Exactly. Experienced, skilled designers can use their gut to get closer to the target. You might need less evaluation, since you start closer to where you want to be, but no amount of talent or experience can eliminate the need for evaluation.

It's amazing how much you can learn from a talk about spaghetti sauce, eh?

May 1, 2009 at 4:58 PM  
Blogger Lucas said...

Reading about Gladwell's second point reminds me of the story of the Ford Edsel: the car designed by focus groups. (Much like the car Homer Simpson designed for his brother.) The Edsel is more of a cautionary tale than a success like the spaghetti sauce, but it still illustrates the point.

May 12, 2009 at 6:42 PM  
Blogger Nels Anderson said...

@Lucas Great, thanks for the link. I hadn't heard about the Edsel, but it seems there's definitely something important to learn here.

May 12, 2009 at 9:48 PM  
