The Problem – How the Industry is Failing
The traditional market research industry has depleted its most valuable resource – the people who actually participate in research. In fact, 91% of people today said that they would NOT be willing to participate in telephone polling. And this number is only slated to get worse: only 25% of participants were satisfied with their most recent research experience, and more than 55% of people said a bad experience would affect their future willingness to participate in other research.
You can quickly imagine how perpetually pinging the remaining 9% of the population willing to participate could lead to a complete revolt against research. Some firms have already seen it, with AI bots or disinterested respondents increasingly skewing valuable results.
At the Market Insights Forum last week, Rob Stone, Ph.D., Vice Chair, Insights Association, presented these findings on consumer participation in research and called on research suppliers and buyers to reconsider how they engage consumers as a finite resource.
A Solution – Using What’s Available
In response to this call to action, one option that the industry should consider is looking at integrated business intelligence methods. Business intelligence allows for the fusion of different data sets to provide insights and evidence-based recommendations. While business intelligence usually includes internal data, researchers can start with open-source data (e.g., data.gov, PEW Research Center, and healthdata.gov) or other readily accessible data sets, such as online and social media.
Combining open-source data with social media discussions can help researchers find potential insights for their clients and reduce how often consumer and other stakeholder groups are asked about their perceptions in surveys. This works especially well because perceptions among most stakeholder groups do not change quickly without an event or dramatic change in circumstance – exactly the kind of shift that online and social media tracking can help identify.
Data sets like historical social media discussions are great for evaluating changes in most stakeholder group perceptions. People have already opted in to sharing their perceptions of products and issues openly on social media in exchange for new communication tools, so it’s only a matter of mining this data for information.
How to Get Started – Leveraging Social Media
So where do you start? Well, let’s begin with how social media can be used in exchange for asking monthly or quarterly perception questions. The following steps outline how to adapt a traditional survey to analyze social media discussions:
- Look to Previous Research. Survey research is helpful to determine the topics or drivers that impact a stakeholder’s perceptions. By looking at previous research within the industry or for the company, for example, you can determine which topics (e.g., product quality vs. customer service) might be more impactful to a company’s reputation and focus your searches accordingly.
- Identify the Right Tool. The best social media mining and visualization tools are always changing, but look for ones that will provide the right access to the most relevant data. If your audience is younger, for example, then you need a tool that is good at searching Instagram rather than just Twitter.
- Craft the Search / Filters. To pull in relevant conversations on the company and related perception factors, you will need to consider the various ways that people reference the topics of interest. Desk research and savvy social media knowledge can help you find some of the terms, which will go into your searches.
- Baseline the Data. Like most data, you are looking for outliers or changes in discussions, so you need to look at least six months back to determine if this month is within or outside the norm. Comparing against competitors or the industry overall is sometimes helpful as well if you don’t have historic context.
- Keep Research Principles. Just like a survey, you need a relevant sample size to provide insights and recommendations to clients. As a rule of thumb, reading all of the individual social posts on a topic should take your clients longer than reading your insights and recommendations – so 10 tweets won’t cut it.
- Test Sentiment Engines. Not all automated sentiment engines are equal, with accuracy usually ranging from 50% to 80%. Accuracy declines even further in certain fields, like healthcare, where symptoms can be confused with adverse events. Some tools have machine-learning capabilities that improve sentiment scoring over time, but otherwise manual sampling of sentiment is usually necessary.
- Align Insights to Business Impacts. The fundamental business problem should drive the presentation of results – just like a traditional research presentation. Start with setting the stage of the current social media environment, then connect discussions into actionable recommendations that a company can use to adapt their strategy or approach. Social media does not have to be about likes, comments, and reactions. The volume of negative and positive discussion on relevant business topics should drive results.
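The filtering and baselining steps above can be sketched in a few lines of Python. Everything here is hypothetical – the posts, search terms, and monthly volumes are invented for illustration, and a real project would pull this data from a social media mining tool:

```python
from statistics import mean, stdev

def matches(post: str, terms: list[str]) -> bool:
    """Crude keyword filter: does the post mention any search term?"""
    text = post.lower()
    return any(term in text for term in terms)

def baseline_flag(monthly_counts: list[int], current: int, k: float = 2.0) -> bool:
    """Flag the current month if it falls outside mean +/- k*stdev
    of the trailing months (at least six back is recommended)."""
    mu, sigma = mean(monthly_counts), stdev(monthly_counts)
    return abs(current - mu) > k * sigma

# Hypothetical posts and perception-driver search terms
terms = ["product quality", "customer service"]
posts = [
    "Their customer service was slow again",
    "Loving the new colorway",
    "Product quality has really dropped this year",
]
relevant = [p for p in posts if matches(p, terms)]

# Six months of mention volume, then a spike in the current month
history = [42, 38, 45, 40, 44, 41]
print(len(relevant))               # number of relevant posts
print(baseline_flag(history, 97))  # spike well outside the norm -> True
```

In practice the search logic would be far richer (misspellings, hashtags, brand nicknames), but the shape of the analysis – filter, count, compare to a trailing baseline – stays the same.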
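Testing a sentiment engine, as the steps above suggest, can be as simple as comparing its output against a hand-coded sample of posts. A minimal sketch, where both sets of labels are hypothetical stand-ins for real engine output and manual coding:

```python
def spot_check_accuracy(auto_labels: list[str], manual_labels: list[str]) -> float:
    """Share of posts where the automated sentiment engine agrees
    with a manually coded sample of the same posts."""
    assert len(auto_labels) == len(manual_labels)
    hits = sum(a == m for a, m in zip(auto_labels, manual_labels))
    return hits / len(manual_labels)

# Hypothetical engine output vs. a manually coded sample of 10 posts
auto   = ["neg", "pos", "neg", "neu", "pos", "neg", "pos", "neu", "neg", "pos"]
manual = ["neg", "pos", "pos", "neu", "pos", "neg", "neg", "neu", "neg", "pos"]
print(spot_check_accuracy(auto, manual))  # 0.8
```

A result near the low end of the 50–80% range quoted above is a signal to re-tune the engine or to keep manual sampling in the workflow.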
Once an initial analysis is done, running more complex correlation analyses between online/social media discussions, plus internal or third-party data, and survey responses can further identify changes in conversation that predict shifts in perceptions over time. For example, dramatic increases in negative discussions about a company’s brand promise will likely show a notable shift in stakeholders who say that they distrust the company.
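A simple version of such a correlation analysis is a Pearson coefficient over two monthly series. The numbers below are hypothetical, chosen only to illustrate negative discussion volume rising alongside stated distrust:

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly series: share of negative brand discussion
# and % of surveyed stakeholders saying they distrust the company
neg_share = [0.10, 0.12, 0.11, 0.25, 0.30, 0.28]
distrust  = [18, 20, 17, 26, 32, 29]
r = pearson(neg_share, distrust)
print(round(r, 2))  # strong positive correlation
```

A consistently strong correlation like this is what lets a team treat the social series as a leading indicator and ask the survey question less often.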
And social media is just one potential data set. Depending on the topic or stakeholder group of focus, other data sets may be better suited as predictors of survey responses. For market research to survive, though, it must evolve and be open to considering what’s already available – instead of overwhelming the finite resources it has left. These resources – willing participants – should instead be regarded as valuable commodities and tapped only when other data on a topic is not already available or up to date.
- Pew Research Center, “What Low Response Rates Mean for Telephone Surveys” (http://www.pewresearch.org/2017/05/15/what-low-response-rates-mean-for-telephone-surveys/)
- GreenBook, “Q1/Q2 2017 GreenBook Research Industry Trends (GRIT) Report” (https://www.greenbook.org/grit/)