We sat down with Derek Schoettle, General Manager of Watson Data Platform (IBM Watson), to get his thoughts on where marketing research companies can perform most effectively in a business intelligence space increasingly influenced by AI and machine learning. We'll learn even more at the Insights Leadership Conference, September 26-28 in Palm Beach, where Derek will be presenting.

What would you say is the biggest challenge posed by AI/ML for “traditional” market research firms, which focus mainly on capturing consumers’ opinions about their preferences or experiences with a product or service?

In the era of Big Data and Machine Learning, organizations are increasingly building out marketing execution systems that are driven directly by advanced algorithms operating on large volumes of collected UX data. The output of a traditional market research exercise is summarized and aggregated in a way that makes it more difficult to incorporate directly into these workflows. At the extreme end, the temptation is to eschew traditional market research entirely during the implementation of such a system.

Where do you see the greatest opportunity for these companies to contribute in a meaningful (and, for them, profitable) way in this new paradigm?

One opportunity is to develop better methods for aligning market research and surveys to the customer segments produced by the ML clustering systems analyzing customer data. The market research then becomes another input to the algorithm, and a very clean one at that.
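To make that concrete, here is a minimal sketch (not an IBM product or Derek's actual pipeline) of the pattern he describes: cluster customers on usage data, then attach segment-level survey results as one more clean input. The feature names, data values, and satisfaction scores are all hypothetical.

```python
import random
from collections import defaultdict

def kmeans(points, k, iterations=20, seed=0):
    """Toy k-means over 2-D behavioral features, e.g. (monthly_visits, avg_spend)."""
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iterations):
        clusters = defaultdict(list)
        for p in points:
            # assign each customer to the nearest centroid (squared distance)
            nearest = min(range(k),
                          key=lambda i: (p[0] - centroids[i][0]) ** 2
                                      + (p[1] - centroids[i][1]) ** 2)
            clusters[nearest].append(p)
        for i, members in clusters.items():
            # recompute each centroid as the mean of its members
            centroids[i] = (sum(m[0] for m in members) / len(members),
                            sum(m[1] for m in members) / len(members))
    return centroids

def assign_segment(point, centroids):
    """Return the index of the nearest centroid for a customer."""
    return min(range(len(centroids)),
               key=lambda i: (point[0] - centroids[i][0]) ** 2
                           + (point[1] - centroids[i][1]) ** 2)

# Hypothetical usage data: (monthly_visits, avg_spend) per customer
usage = [(2, 10), (3, 12), (25, 200), (30, 220), (4, 9), (28, 210)]
centroids = kmeans(usage, k=2)
segments = [assign_segment(p, centroids) for p in usage]

# Survey results aggregated per segment become another clean algorithm input
survey_satisfaction = {0: 0.61, 1: 0.87}  # hypothetical scores
enriched = [(p, seg, survey_satisfaction[seg]) for p, seg in zip(usage, segments)]
```

A real system would use a hardened clustering library, but the shape of the join is the point: survey findings keyed to algorithmically derived segments flow straight back into the marketing workflow.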

What have been the most transformative recent advancements that bring UX data into the insights generation process?

Over the past five years, cloud providers have put a tremendous amount of on-demand computing power directly into the hands of LoB decision makers. They have also enabled a plethora of SaaS vendors to introduce new analytics and dashboards tailored specifically to various aspects of marketing analysis and execution, all without the traditional upfront capital expenditures of an on-premises approach. This trend seems to be accelerating, not slowing down.

What new technologies or techniques that are not yet widely adopted (or which may be coming soon) do you consider to have the most potential?

The best predictive analytics integrated into marketing execution systems today tend to rely on highly skilled data scientists to craft models expressly for the problem at hand. Deep learning techniques are maturing to the point where relatively standard network designs outperform more "artisanal" traditional models, while also proving more resilient to unexpected inputs. Companies that can amass and cleanse the data needed to train these networks may find themselves at an even greater advantage than they had in the past.

What do you see this feedback loop looking like in two years? How do you see non-UX/non-usage data market research playing a role in this environment in two years’ time?

I think we will continue to see adoption of AI, but it will expand beyond today's data-hungry neural network designs to new algorithms capable of learning effectively from "small data". AI systems will also become easier and easier for non-experts to train. I think this will open up an opportunity for market researchers to be more continuously engaged in the training and experimentation workflow, and taking that opportunity will be critical for them to stay relevant.

Can you point to two “takeaways” that you hope to impart to the audience at the upcoming Insights Leadership Conference?

Data may be the new natural resource, but it needs to be refined in order to be put to use, and no amount of ML or AI is going to change that. The key to becoming a truly data-driven business is to streamline the path for feeding data into smart algorithms operating as close to the decision point as possible.
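One way to picture that "refining" step, as a hypothetical sketch rather than any specific IBM workflow: raw records are cleansed and normalized before a simple model sitting at the decision point ever sees them. All field names, thresholds, and records below are illustrative assumptions.

```python
def refine(record):
    """Cleanse one raw record: drop incomplete rows, normalize fields."""
    if record.get("spend") is None or record.get("region") is None:
        return None  # incomplete data never reaches the algorithm
    return {
        "spend": max(0.0, float(record["spend"])),   # clamp bad negative values
        "region": record["region"].strip().upper(),  # normalize label formatting
    }

def decide(record, threshold=100.0):
    """Toy decision-point 'algorithm': flag high-value customers."""
    return record["spend"] >= threshold

# Hypothetical raw feed, exactly as it might arrive from source systems
raw = [
    {"spend": "250.0", "region": " emea "},
    {"spend": None, "region": "AMER"},   # incomplete, dropped during refining
    {"spend": "40", "region": "apac"},
]
refined = [r for r in (refine(rec) for rec in raw) if r is not None]
flags = [decide(r) for r in refined]
```

The refining stage is deliberately boring code, which is the takeaway: no amount of ML or AI removes the need for it, and streamlining this path is what keeps the algorithm close to the decision point.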