
Tuesday, May 15, 2012

How Can I Get the Most Out of Ideation or Brainstorming Research Sessions?

There are a number of established techniques for ideation and/or brainstorming (which are similar, but not exactly the same) that can be used effectively as part of a systematic search for targeted opportunities in the form of new features, new products, new markets, and/or new services within various categories of interest.

The fundamental premise of these techniques is to start with an issue or challenge and then generate a broad range of different possible ideas to address that challenge. Often, there are two important components to a brainstorming project: “Divergence” is the process of generating ideas followed by “Convergence,” which consists of selecting and developing the top ideas.

Although brainstorming sessions share some characteristics with focus groups, brainstorming research sessions are quite different from traditional focus groups. Within the field of new product development, brainstorming sessions are about exploring possibilities, generating new concepts and discovering new opportunities, whereas traditional focus groups are best used to validate ideas, weed out bad concepts and improve existing concepts.

The distinctions between brainstorming sessions and regular focus groups carry through to some critical differences in how the groups are conducted:

  1. Brainstorming sessions last longer than most focus groups to ensure there is sufficient time for both training and ideation. Typically, each brainstorming session is scheduled to last between 2-1/2 and 3 hours whereas focus groups generally do not go beyond 2 hours.
  2. Participants are recruited specifically to be natural “lateral thinkers” or “intuitors” because this thinking style has been shown to correlate positively with the ability to generate new ideas. However, this isn’t a common talent – most consumers are very good at reacting to ideas they are presented with, but they’re not as good at coming up with new ideas on their own. In addition, the “creativity” recruiting specifications are over and above the need to invite participants who have experience with the topic under discussion.
  3. Participants in brainstorming sessions are given a homework assignment to complete in advance of the session and are required to start generating ideas before attending. This helps to get them “primed” for the discussion and ensures that each session can begin with idea sharing right away.
  4. During the recruiting phase of the project, a member of the project team will contact each qualified participant by phone to encourage their idea generation and answer any questions about the process or expectations for the sessions.
  5. Brainstorming sessions are not as much of a “discussion” as a focus group is – rather, the goal is to keep things moving and to use the ideas shared to spark additional ideas.
  6. Ideally, the client team (often consisting of 4 to 6 people) is encouraged to be fully engaged in the process and to use the ideas from the consumers to help spark their own thinking. In the end, it is often the client team members who end up generating the best, most workable ideas.

Getting the best value from brainstorming sessions also requires following a number of important steps to ensure that good quality ideas are generated. In our experience, the most effective brainstorming sessions consist of:

  1. Introductions and training in the rules of brainstorming.
  2. IDEA GENERATION. Each participant shares one idea at a time, the facilitator probes for clarification if necessary, and other participants share any “builds” they have on the idea. A “build” is a new idea that is sparked by the original idea shared. The participants continue to generate and share ideas throughout the session, while the client team listens in the backroom and builds their own ideas.
  3. Negative comments quickly shut down the idea-generating process; therefore, participants are taught to approach ideas with a specific mindset. If they hear a new idea they dislike, rather than share this negative reaction, they instead focus on generating a new idea that fixes what they don’t like or simply move on to sharing another idea they have generated.
  4. The client team is brought in with the participants mid-way through the session and the client team members work in small teams with the consumers. Typically, each small team is asked to consider the ideas they heard throughout the session and then develop their own “ideal” solution to the project’s challenge. This co-creation process yields a range of different “ideal” solutions for the client team to consider after the session, as they choose and develop their final ideas.

Tuesday, April 24, 2012

Getting the Best Value from Open-Ended Questions

An article by Carolyn Lindquist called “For Better Insights From Text Analytics, Elicit Better Comments” in the most recent edition of Quirks Marketing Research (April 2012) gives three recommendations for improving the quality of consumer responses to open-ended questions. These three recommendations are:

1. Target your questions

2. Ask why

3. Be sensitive to placement

Based on my own experience, these are worthwhile considerations when designing surveys. I think most quantitative researchers – including me! – can fall into the twin traps of asking too many open-ends in a single survey and not defining those open-ends as clearly as possible.

I’m a strong believer in what I consider “directed open-ends,” meaning that the wording is specific to the situation rather than a catch-all “please list comments below.” For example, in concept tests, I strongly believe in asking for strengths and weaknesses separately, which makes the survey both easier to answer and easier to analyze. This is consistent with Carolyn’s recommendation to “target your questions” – the example she gives is to vary the open-ended question text according to the stated level of overall satisfaction.

I’m intrigued by Carolyn’s suggestion to ask “why” rather than “what” questions. She has found that asking “why” (such as “please tell us why you were less than satisfied with your experience”) yields longer and more useful answers than asking “what” (as in “please tell us what we can do to improve your next experience”); the responses to “what” questions contain less detail and emotion than the answers to “why” questions. I think this suggestion is worth testing out. However, this does not mean we should ask “why” after every rating question, as some clients have requested a few times over the years!

I also agree with her third recommendation on being sensitive to the placement of open-ended questions, although I don’t agree with her suggestion that open-ends should only be asked at the end of a survey. In my experience, open-ended questions should appear where they make the most sense in a survey, and a nice balance of quantitative rating questions and open-ends makes for a more pleasant and natural survey-taking experience. One caveat, though: I avoid having too many open-ended questions listed sequentially, as too many open-ends in a row can make the survey feel longer than it actually is and lead to respondent fatigue.