This page illustrates the design research methods I have hands-on experience with.
Unless otherwise mentioned, I feel very comfortable with each method. Many examples include student work; it is included here only when I worked very closely with the students, in a mentor-apprentice model.
I take pride in mastering a rich methodological toolbox. Rigorous knowledge of the ins and outs of various methods enables me to mix and match them to solve a research problem within given time and resource constraints. I generate creative research plans that enable even beginners (M.S. students) to get the work done quickly, without compromising the credibility of the results.
In-depth interviewing – I have conducted in-person, video, and phone interviews for several projects. A few examples:
- I prepared this report for a corporate sponsor based on phone interviews. (contact me for the password)
- This is a persona report – while I did not write every word, I closely supervised its structure and presentation. We used affinity diagramming to analyze the data. The work was published in the proceedings of HCI International.
Focus groups – I use focus groups when I need ideas that emerge from group interaction.
- In this study of perceptions of corporations on Facebook, I wanted to capture the emerging social norms of this user group. This paper is a good illustration of my beliefs about writing: even academic writing should be clear and accessible.
- My team and I developed this set of personas based on focus groups conducted inside the NSF. I did not write all the text, but supervised the structure, organization, and overall presentation of the report. Here is the research article summarizing the same insights.
Card sorting – I worked with a research assistant to redesign the information architecture of nanoHub, a central resource for nanotechnology learning and research. The wealth of resources on nanoHub had grown over time, and users complained that they could not find what they needed. I structured the open card sort to capture both users’ understanding of the terms used in the menus and their logic for organizing information.
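Open card sort results are commonly summarized as a co-occurrence matrix: how often each pair of cards landed in the same group across participants, which then feeds clustering or similarity analysis. A minimal sketch of that bookkeeping in Python (the data format here is an illustrative assumption, not the actual nanoHub study pipeline):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(sorts):
    """Count how often each pair of cards was grouped together.

    `sorts` is one sort per participant; each sort is a list of groups,
    and each group is a list of card labels (assumed format).
    """
    counts = Counter()
    for sort in sorts:
        for group in sort:
            # Sort labels so each pair has one canonical key.
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

# Two hypothetical participants sorting three cards:
sorts = [
    [["pricing", "licensing"], ["tutorials"]],
    [["pricing", "licensing", "tutorials"]],
]
counts = cooccurrence(sorts)
# ("licensing", "pricing") co-occurs twice; the other pairs once.
```

Pairs grouped together by most participants suggest menu categories users would agree on; pairs that rarely co-occur signal labels users interpret differently.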
Design workshops – I love facilitating design workshops to extract design requirements from the artifacts created by participants. In a workshop I ran at SRI International, I prototyped a rapid method for articulating personas.
Formative usability testing – Learnability: while time on task and task success are measures of ease of learning, in earlier design stages I like to collect qualitative data about learnability. I guide users through an interview structured around these questions:
- What is this? Can you explain to me what you are seeing here?
- What do you think you can do here?
- What do you expect will happen if you perform the action you mentioned?
- [Upon performing the action] Is this what you expected would happen?
This method works really well in a Wizard of Oz setup, and I have used it to get early feedback on sketches and wireframes. Here is an example from such a test conducted on the alpha version of DIA2. I used moderated remote testing in this case.
I am a strong believer in inspection methods; they can save a lot of time and resources. I don’t jump into collecting user feedback until my team and I have addressed all the issues we can identify in-house.
Heuristic evaluation – This report is a recent example of a heuristic evaluation conducted by a group of three undergraduate students under my close supervision.
Cognitive walkthrough – I use Spencer’s streamlined cognitive walkthrough frequently while designing. I rarely prepare formal reports; I usually just capture a list of issues. I have recently started working with Dr. Margaret Burnett, creator of GenderMag – a modified cognitive walkthrough meant to identify gender bias in interfaces – and have learned her method.
Website Experience Analysis (WEA) – I created this evaluation method for my Ph.D. dissertation: a framework and research protocol for evaluating the user experience of corporate websites from a public relations perspective. I wanted to bring communication, interpretation, and meaning making into the area of user experience. WEA enables researchers to understand what perceptions users form upon experiencing a website, and which specific website elements account for those perceptions. Read more about WEA:
- My book, Web Site Public Relations: How Corporations Build and Maintain Relationships Online.
- Shorter explanations of WEA can be found in research articles such as this one or this one. Practitioner-oriented descriptions of the research protocol are in chapters in the Handbook of Research on Electronic Surveys and Measurements and New Media and Public Relations.
- M.S. thesis that applied WEA to institutional Facebook pages.
- M.S. thesis that applied WEA to interactive information visualization.
Usability testing – I have directed, collaborated on, taught, supervised, and graded more usability testing sessions than I can count. I prepared this worksheet to help graduate students plan a usability test. I worked very closely with this student group on our report for nanoHub. Another usability test done as part of my graduate UX course resulted in a publication in the proceedings of HFES (Human Factors and Ergonomics Society).
Social media analytics – I used social media analytics to monitor and evaluate the visitor experience in Indianapolis, when the city hosted the Super Bowl. The case study was published in the Journal of Direct, Data and Digital Marketing Practice.
Surveys – While I’m comfortable with survey research that results in descriptives, in this study we used a mixed-methods approach that included both qualitative data and a quasi-experiment. The experimental part was led by a colleague with expertise in educational psychology. I designed this research, which involved pre- and post-measures of an educational intervention, but relied on my co-authors for the statistical analyses. That being said, I use the System Usability Scale (SUS) and Net Promoter Score (NPS) as needed in combination with usability testing.
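For readers unfamiliar with SUS, its standard scoring arithmetic (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is scaled by 2.5 to a 0–100 range) can be sketched as:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The total (0-40) is multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = sum(
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(responses, start=1)
    )
    return total * 2.5

# A neutral respondent (all 3s) lands at the midpoint:
sus_score([3] * 10)  # 50.0
```

Note that SUS is not a percentage; a 50 is well below the commonly cited average of around 68, which is why the scaled score needs interpretation rather than a face-value reading.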
Content analysis – I conducted qualitative content analysis and collaborated on or supervised a number of projects that used quantitative content analysis, such as this article, or this one. While content analysis can produce useful insights in media studies, I am wary of reducing rich user insights to numbers.
The methods above are deep within my comfort zone. I taught Qualitative Research Methods for PhD students in Technology and UX Design for many years. I have not used all the methods I taught. Here is a list of methods I am familiar with but don’t have much experience using:
- I only conducted one small ethnography.
- I conducted one small diary study.
- I taught contextual inquiry but never had an opportunity to try it out myself.
- I have never created a user journey map myself, but I directed student groups who did. This is an example of student work I supervised.
- I have never extracted user mental models using Indi Young’s method. One of my M.S. students used this method in her thesis but another professor supervised this part of her work. This is a method I look forward to mastering.
The professor in me has to leave you with a list of books. These are some of the books I’ve taught from, or simply love:
- The UX Book – for the most comprehensive account of A-Z UX work.
- Universal Methods of Design – for those who need a quick reminder/catalog of methods in our toolbox.
- Measuring the User Experience – has to be my favorite book for teaching usability testing, though in case of emergency, I recommend:
- Rocket Surgery Made Easy – because it lowers the barrier of entry to collecting user feedback.
- Quantifying the User Experience: Practical Statistics for User Research and Jeff Sauro’s blog – although I am a qualitative researcher through and through, Jeff’s explanations make a lot of sense to me.
- Observing the User Experience – this could be an entry-level textbook, though I’ve never used it as such.
- Qualitative Research and Evaluation Methods – if you want to get real about qualitative research.
- Qualitative Research Design – a lot less scary than the previous one, very practical. Also quite post-positivist, but he’s very honest and reflective about it.
- Designing and Conducting Mixed Methods Research – explains how to scaffold different methods in meaningful ways.
- Transforming Qualitative Information: Thematic Analysis and Code Development – for those who think they are doing a “grounded theory approach.”
- Hidden in Plain Sight: How to Create Extraordinary Products for Tomorrow’s Customers – because when I grow up, I want to be like Jan Chipchase. He is a professional ethnographer who claims that opportunities for innovation are hidden in plain sight – all we have to do is learn to observe the world around us to identify them.