Final Reflections on AI’s Place in Generative Research

Part 6 of the AI vs. Human: A User Research Showdown series

Summary

There are many new platforms claiming to accelerate or even replace traditional user research with AI. We put these claims to the test in a head-to-head study comparing the effectiveness of AI tools and human UX researchers throughout the generative research process. As part of this test, both human researchers and AI tools (and combinations of the two) created experience maps to help a fictional client understand and empathize with their target audience’s current-state experience of planning family trips.

Read on to explore our key learnings from this research, which overall were:

  • AI is great at summarizing data at a high level, but struggles to synthesize it into meaningful insights.

  • Most AI tools don’t follow the same process that a human researcher would, which can make it difficult for humans to get the results they expect.

  • To our surprise, AI tools were most successful during the data collection phase, and we feel comfortable incorporating AI-moderated studies into our toolkit.

Study Recap

In the rapidly evolving world of user experience research, the emergence of novel artificial intelligence tools presents a new frontier. As we stand on the cusp of potentially transformative changes, the question arises: How might AI impact traditionally human-led endeavors like generative UX research? 

At Brilliant Experience, we performed an empirical study to directly compare the effectiveness of AI models and human researchers in conducting qualitative interviews. Our study centered on parents of young children planning international travel, examined under four distinct conditions, each varying in AI and human involvement. Across these scenarios, the researchers (or AI tools) produced standard deliverables: a slide deck report of key insights, a set of personas, and corresponding experience maps for each persona. We evaluated each approach not only for output quality but also for efficiency and depth of understanding.

Read more about our method and goals in our Executive Summary.

This Edition: What We've Learned

In our previous posts in this series, we discussed in detail how AI tools compare to human research at each phase of a generative qualitative interview study.

If you haven’t already, we encourage you to dive into the findings in those posts.

In this edition, we take some time to reflect on our learnings from the study overall and how AI stands to impact UX research going forward.

Key Insight #1: AI is a Summarizer, not a Synthesizer

As we learned when we began to leverage AI to analyze the qualitative data from our study, AI tools are excellent at finding broad, high-level commonalities in a set of data - in other words, summarizing.

We used general AI tools like ChatGPT and research-specific tools to generate summaries of large datasets almost instantaneously. This is something that just wasn’t possible before the advent of AI, and that we found extremely useful to review before digging into the more detailed analysis ourselves. We also used AI tools to generate instant summaries of individual interviews from transcripts, which can greatly speed up the data entry process. AI tools were also able to effectively apply their keen summarization abilities to creating structure for personas and experience maps by suggesting persona groupings and experience map phases based on the data.
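The transcript-summarization workflow described above can be sketched as a small preprocessing step: split each transcript into chunks that fit a model’s context window, then wrap each chunk in a summarization prompt. This is a minimal illustration, not any specific tool’s implementation; the function names, chunk size, and prompt wording are our own assumptions.

```python
def chunk_transcript(transcript: str, max_chars: int = 4000) -> list[str]:
    """Split a transcript into chunks on speaker-turn boundaries."""
    chunks, current = [], ""
    for turn in transcript.splitlines():
        # Start a new chunk if adding this turn would exceed the limit.
        if current and len(current) + len(turn) + 1 > max_chars:
            chunks.append(current)
            current = ""
        current = f"{current}\n{turn}" if current else turn
    if current:
        chunks.append(current)
    return chunks


def build_summary_prompt(chunk: str) -> str:
    """Wrap one chunk in an instruction that keeps the model close to the data."""
    return (
        "Summarize the key points of this interview excerpt. "
        "Use only information stated in the excerpt; do not add outside "
        "knowledge or speculate.\n\n" + chunk
    )


if __name__ == "__main__":
    transcript = (
        "Moderator: How do you plan trips?\n"
        "Participant: We usually start about a year out."
    )
    prompts = [build_summary_prompt(c) for c in chunk_transcript(transcript)]
```

The resulting prompts would then be sent to whichever LLM you use; instructing the model to stay within the excerpt helps limit the “world knowledge” hallucination risk discussed later in this piece.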

However, synthesizing data into meaningful, actionable insights is very different from summarizing, and we found that the vast majority of AI tools we tested just weren’t able to make the leap from summary to synthesis. When asked to generate insights from a dataset, general and research-specific AI tools alike tended to produce high-level summaries that lacked novelty and business application. This limitation of AI came out not only in our synthesis phase, but also in our persona and experience map creation. 

Takeaway:  If you really want to uncover novel, actionable insights from human data, keep humans in the loop for synthesis.

Key Insight #2: Most AI Tools Don’t Think Like Humans - But We Wish They Did

When embarking on this research, we were looking forward to handing all of our tedious and repetitive analysis tasks over to AI so we could focus on identifying insights and opportunities. However, we quickly learned that AI works very differently than human researchers, and can’t simply be plugged into the process as a researcher replacement.

For example, human researchers often analyze qualitative data in a bottom-up fashion - starting with raw data from which findings are extracted and affinitized into insights and themes - but AI doesn’t follow these steps, and instead goes directly from data to summary. This means that AI tools ultimately struggle to take on some of the tasks they would otherwise skip, like data tagging or pulling representative quotes to support a theme. Concerningly, AI tools would often produce errors when asked to perform a task such as pulling quotes - errors that could have detrimental effects on the output of the research.

We also found that unlike human researchers, AI tools tended to give inconsistent responses to the very same prompt, both in content and format. This made it difficult to rely on AI tools to assist with repetitive tasks.

Takeaway:  Leveraging AI tools for UX research support can be helpful and time-saving, but requires the researcher to carefully consider how the AI tool “thinks” and adapt their process accordingly.

Key Insight #3: AI is the Most Useful in the Data Collection Phase

In this research we set out to comprehensively evaluate the promise of AI tools throughout the generative research process including study planning and recruitment, data collection, and data synthesis (with insights, personas, experience maps as deliverables). At each phase we graded the effectiveness of the AI tools on a five-point scale from Risky Resource to Skilled Researcher:

Overall, we were most impressed by the ability of AI tools for data collection. Tools like Wondering, UserCue, and Listen Labs function as interactive surveys where participants respond to open-ended (and closed-ended, if desired) questions via text or voice. The AI tool “moderates” the session by dynamically asking follow-up questions based on participant responses. While we were skeptical at first, we found that tools like these were effective for qualitative data collection, freeing up weeks of human researcher time.

Because there is still great value in a human researcher’s ability to pick up on subtle non-verbal cues, we recommend leveraging AI-moderated interviews for specific scenarios only:

  • Known Population: We trust AI tools the most for audiences we are at least somewhat familiar with. When charting new territory, we’ll still rely on human moderators who can pivot the focus of sessions when needed.

  • Minimal Visual Stimuli: While many AI-moderated interview tools can handle visual stimuli, a human moderator can better guide participants’ attention to specific details.

  • Traditional Interview Structure: While we think AI tools are effective at running qualitative interviews and asking great follow-up questions, they aren’t yet equipped to take over co-creation sessions or other interactive study designs.

For the reasons outlined in Key Insights #1 and #2, we’re least comfortable handing over insight generation and experience mapping to AI tools - at least with their current capabilities.

Takeaway:  AI-moderated interview tools are a great way to save time during the data collection phase.

Choosing The Right AI Tools

New AI tools are emerging every day, promising to support UX researchers and product leaders in a variety of new and different ways. Regardless of the specific features, these tools generally fall into the following categories:

Throughout the course of this research we tested numerous AI tools across all of the categories listed above, and would recommend considering the following characteristics when choosing a tool for a research task:

  • Data Privacy Policy: It is essential to ensure that the AI tool you are using maintains data privacy within the confines of the conversation and does not use your data to train its model. For participant privacy reasons, we recommend only sharing research data with AI systems that guarantee data confidentiality.

  • Research-Specific Focus: Chatbot-style general AI tools like ChatGPT are surprisingly effective at UX research tasks if given the right prompts, but tools designed specifically for research can come with useful features and are sometimes more accurate with synthesis.

  • Scope of Knowledge: One reason that research-specific tools can be more accurate with synthesis is that they are more likely to limit the scope of their response to the data itself, instead of leveraging world knowledge to fill perceived gaps. While that world knowledge can be helpful for some tasks like developing an interview guide or brainstorming opportunities, it can be dangerous when it causes the tool to “hallucinate” insights from your data.

  • State of Development: Since many of these tools are relatively new, they are changing and adding new functionality often. This can be both a benefit and a risk - consider the stability you need to be comfortable with the tool.

  • Allowed File Types: Some tools allow both videos and transcripts, while others accept only one or the other.

  • Cost: These tools can be expensive, especially those designed specifically for UX research. Access to the full feature set often requires signing a contract for an enterprise license.

The Future of AI in UX Research

While AI tools already offer great support in UX research, as discussed above, there is still room for growth. In our interview series with founders of AI tools, leaders in the space gave their perspectives on where they see the future of AI in UX research.

They shared that:

AI research tools are likely to become more integrated and less piecemeal than they are today.

“Imagine a world where we could actually output entire user journeys, right? ‘Take my research from last week and produce a user journey for me and then plot this on Miro for me, please,’  So I think we're just scratching the surface of what the output power of AI is going to be.” - Moodi Mahmoudi, NEXT

We might also expect to see analysis quality improve…

“I think we are going to see a lot of improvement in not just the speed of analysis but in the quality of analysis because remember, these models are only getting better. They're being trained with more and more data as we speak and they will become extremely competent at helping you analyze [your research data].” - Vlad Racoare, Research Studio

… in part due to the inclusion of AI 'agents' that can act like trained human researchers.

“I think we're gonna see a lot more agent style interactive AI coming up. So, what I mean by that is, for instance, we are going to probably have AI user interviewers, right? Where the researcher doesn't actually write the survey questions. They just say, you know, this is the stuff that we want to learn about, figure out how to get that out of the user, and then the AI will chat with them to pull that information out and get it to you.” - Jeff Erickson, Viable

These changes aren’t likely to replace UX researchers (just yet), but may shift the focus of UXRs from producers of data to curators of insights. 

“We’re working more like directors, or curators of information and knowledge and data. I think that's probably going to be an impact on how we all work. We're not necessarily going to create posts from scratch, but we’ll be editing, reviewing, selecting, discarding, and so on.” - Valerie Pegon, QoQo

Additionally, the hope is that AI tools will make UXR more accessible to those who would otherwise forgo it.

“I don't think a lot of startup founders really understand how to analyze the data that they're getting when they're talking to a customer. And some of these tools today are so complex that you need to know actually the background of how to do good research to use them appropriately. Well, one of the big goals for us on the AI side is that we can tell startups how to do better research without them. It's helping teams understand what the next steps they should take in their research are.” - Max Fergus, UserCue

Conclusion

While AI offers exciting possibilities for streamlining certain aspects of generative research, it’s clear that human researchers are still essential for deeper analysis and synthesis of insights. AI excels at data collection and high-level summarization, but struggles to think like a human when it comes to generating actionable insights. As AI technology continues to evolve, the most effective approach will likely involve a thoughtful combination of human expertise and AI capabilities, leveraging the strengths of each to enhance the UX research process.

Want to keep up to date with the latest advancements in AI? Subscribe to our newsletter to make sure you don’t miss an issue of our ongoing AI 4 UX video interview series featuring founders of some of the most popular AI tools for UX research.
