Market research that relies on big data often focuses on consumer behavior. Data scientists look for patterns that explain how and when consumers take action. The problem, says Walter Blotkamp, is that not all patterns are meaningful.
“If you just look at data patterns, you’re not going to understand some of the fundamentals behind [the research],” says Blotkamp, vice president of account services at MMR Research, during the AMADC How-To Session “Consumer Insights in the Digital Age – How ‘Old School’ Tactics Make You Smarter in the Era of Social Media and Big Data.”
To prove his point, Blotkamp asked the audience to look at four stuffed animals and write down which one didn’t belong. Then he held up each animal and asked the attendees to raise their hands if it was the one they chose.
Not everyone chose the same animal. The answers depended on which similarities and differences each attendee picked up on. For example, three of the animals were bears and one was a bird; three of the animals shared the same colors and one didn’t. A data scientist who looks only at patterns would be left with hardly enough information to drive an effective marketing plan. In other words, the ‘what’ matters, but so does the ‘why.’ Finding patterns for their own sake leads to error.
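The exercise can be sketched in a few lines of code. This is a hypothetical recreation (the animal names and attributes below are invented, not from the talk): the same four items produce a different “odd one out” depending entirely on which attribute you happen to compare.

```python
from collections import Counter

def odd_one_out(items, feature):
    """Return the name of the item whose feature value appears only once."""
    counts = Counter(item[feature] for item in items)
    for item in items:
        if counts[item[feature]] == 1:
            return item["name"]
    return None

# Invented stand-ins for the four stuffed animals: three bears and a bird,
# three sharing a color and one not.
animals = [
    {"name": "bear A", "species": "bear", "color": "brown"},
    {"name": "bear B", "species": "bear", "color": "brown"},
    {"name": "bear C", "species": "bear", "color": "green"},
    {"name": "bird",   "species": "bird", "color": "brown"},
]

print(odd_one_out(animals, "species"))  # -> bird
print(odd_one_out(animals, "color"))    # -> bear C
```

Both answers are “correct” patterns; the data alone can’t tell you which one is meaningful for your marketing question.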
Instead of looking simply for patterns, Blotkamp urged attendees to think of market research as a dialogue, where preparation and listening are key skills.
To illustrate, think of the differences in talk show hosts. The type of information revealed depends on who asks the questions and what the questions are. For instance, “you get different insights from somebody sitting on David Letterman’s couch than on Oprah’s couch,” says Blotkamp.
He offered a personal example. When cell phones first came out, sales reps and others in business bought them because they boosted productivity. In other words, a cell phone was a tool to make money. Then people began buying cell phones for safety, specifically as a way to get in touch with others in emergencies.
Although both segments purchased cell phones, a company Blotkamp worked for had trouble selling to the second segment. Patterns about buying behavior couldn’t explain the reasons why, so they performed qualitative and attitudinal research. What they discovered was that the second segment was skeptical about the company’s offering. (They had been burned by offers in the past.) With that information, management decided on a new marketing plan that included testimonials and endorsements.
For startups, algorithm-based analytics are useful for testing and iterating on new ideas, but for organizations with a lot of money at stake, they’re a risky way to do marketing. For instance, suppose behavioral research reveals that prospects are more likely to convert after they visit your site five times, says Blotkamp. As a marketer, your instinct is to focus on prospects who have reached their fourth visit. But as soon as you start making changes to encourage that fifth visit, you’ve changed the ecosystem. Behavior shifts, and the algorithm is no longer effective.
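A pattern like the five-visit finding is just a correlation snapshot. As a toy illustration (the visit logs below are entirely made up, not Blotkamp’s data), here is how such a pattern might surface in an analysis — and why it describes prospects who happened to behave this way, not a rule you can safely engineer toward:

```python
from collections import defaultdict

# Invented (visit_count, converted) pairs for twelve hypothetical prospects.
visit_log = [
    (1, False), (2, False), (3, False), (5, True),
    (2, False), (5, True),  (6, True),  (1, False),
    (4, False), (5, False), (7, True),  (3, False),
]

# Bucket prospects by whether they reached five visits,
# then compute the conversion rate per bucket.
totals = defaultdict(lambda: [0, 0])  # bucket -> [conversions, prospects]
for visit_count, converted in visit_log:
    bucket = "5+" if visit_count >= 5 else "<5"
    totals[bucket][0] += converted
    totals[bucket][1] += 1

for bucket, (conv, n) in sorted(totals.items()):
    print(f"{bucket} visits: {conv}/{n} converted ({conv / n:.0%})")
```

The snapshot shows a strong pattern, but it says nothing about *why* the five-visit prospects converted. Intervene to push everyone toward a fifth visit, and the population generating these numbers is no longer the one the pattern described.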
In other words, once you make changes, a behavioral insight for one set of prospects won’t necessarily apply to a greater number of people—prospects or otherwise. And at the end of the day, predictive modeling is just that—a probability rather than an inevitability. “You can’t assume that if people are doing something now that it won’t change,” says Blotkamp. “You have to understand why people are visiting and why they’re not.”