5 MUST DOs to SUCCESSFULLY Implement Text Analytics Software and Maximize its Potential in your Company!
Introducing a new technology or approach for generating insights in any organization can be more challenging than one realizes. There is a succession of hurdles to overcome if you really want to achieve traction and make a lasting impact, and a misstep in any one of these can doom an otherwise promising new addition to your insights arsenal.
In fact, one of the top questions I get from managers these days is how to effectively implement something like text analytics software in the organization. The process begins well before a technology solution has been selected.
I’ve spoken with hundreds of users over the past few years at different types of companies of varying size in various industries. This post is based on those conversations, and to keep it simple—although we support a host of internal functions and disciplines—I’ll focus on one of the most popular and arguably best use cases: customer listening (marketing research).
1. Establishing a Need for Text Analytics (Do you have mixed data?)
This one may sound obvious, but unfortunately it isn’t. I often talk to prospective text analytics users who want a software demo, but they don’t have a data set to use in the demo. In other words, they haven’t thought far enough along to determine specifically what data they would actually analyze using text analytics software.
Almost any company of middle-market size or above—especially if they are consumer-facing—will have data from various sources of VOC (Voice of Customer) that would be perfectly suited for text analytics, and which, it goes without saying, are not being exploited to their full insights potential. These data may be small or large, more or less frequently collected, and longitudinal or ad hoc in nature. Sources include survey data, customer feedback and email, online research community threads, and call center transcripts (to name just a few).
The point is, wherever there is at least one unstructured/text “comment” field in a dataset, there is an opportunity to tremendously enrich analysis by leveraging this data. Furthermore, most of the time, truly valuable data consists of some mix of structured and unstructured data (i.e., text and numeric data).
Inventory the data you already have and identify which data sets look ripe for exploring with text analytics. Then select one that fairly represents the data you expect to analyze with your future tool and use this data set for your demos.
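The inventory step can even be partially automated. As a rough illustration (the column names, data, and word-count threshold below are my own assumptions, not a prescribed method), a script can flag which columns in a data set look like free-text "comment" fields versus structured fields:

```python
# A rough heuristic for inventorying a data set: treat columns whose
# values are long (many words on average) as candidate free-text
# "comment" fields. Column names and the threshold are illustrative.

def classify_columns(rows, min_avg_words=5):
    """Split column names into 'structured' and 'text' buckets.

    rows: list of dicts, one per record. A column whose values average
    at least `min_avg_words` words is treated as unstructured text.
    """
    structured, text = [], []
    for col in rows[0]:
        values = [str(r.get(col, "")) for r in rows]
        avg_words = sum(len(v.split()) for v in values) / len(values)
        (text if avg_words >= min_avg_words else structured).append(col)
    return structured, text

# Example: a tiny survey extract mixing a numeric rating with a verbatim.
survey = [
    {"respondent_id": 1, "nps": 9,
     "comment": "Fast checkout and support resolved my issue the same day"},
    {"respondent_id": 2, "nps": 4,
     "comment": "The app keeps logging me out and nobody answers the phone"},
    {"respondent_id": 3, "nps": 7,
     "comment": "Decent overall but shipping took longer than promised"},
]

structured_cols, text_cols = classify_columns(survey)
print(structured_cols)  # ['respondent_id', 'nps']
print(text_cols)        # ['comment']
```

A data set like this, with structured scores sitting next to verbatims, is exactly the kind of mixed data worth bringing to a demo.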
What about social media?
Yes, social media is an increasingly popular source of data that text analytics users are eager to analyze. I’ll emphasize here that when people refer to social media data, social listening, etc., in the context of research, they are almost exclusively talking about Twitter data (sometimes without even realizing it).
When I sit down with clients, I prefer to distinguish and separate social/Twitter from the aforementioned other sources of data because traditional sources have already proven themselves valuable enough to collect and analyze on an ongoing basis. This is often not the case with social media/Twitter data.
Many people are under the impression that Twitter will yield a goldmine of insights. In actuality, the extent to which Twitter has any meaningful insights value is limited and depends highly on the category/industry. In fact, many CPG companies will find very little of interest on Twitter, while high-ticket service industry brands may find a bit more.
The point is that if your company has not yet collected or looked at social/Twitter data, it’s probably not critically important for you (and it shouldn’t be the primary reason you adopt text analytics).
Moreover, if you have already determined that social media is actually important, then you should be able to articulate more than one research objective around what you expect to be able to answer with that data. If you cannot, then social/Twitter data will probably not provide a good insights ROI for your organization, and I would strongly suggest that you focus on a traditional data set first.
Bottom line: Get your feet wet with text analytics using data that has already shown clear value!
2. Identify the Text Analytics Software User(s) (The “Analytics” in Text Analytics Requires an Analyst!)
Good text analytics is two parts science and one part art; it therefore requires a human analyst. It’s incumbent upon you to figure out who that person (or those people) will be.
What you need to know about artificial intelligence and machine learning…
This may come as a shock to some folks, as many people have been led to believe that software leveraging artificial intelligence and machine learning effectively removes the need for a human analyst. This is utter nonsense.
IBM’s Watson comes to mind here. I’m not picking on Watson, but the notion that a human-like computer can automatically intuit anything and everything about any dataset, and conduct a meaningful, insight-producing analysis without a human analyst, is a complete fallacy and a PR gimmick.
I’ve blogged about AI & machine learning before here. Luckily for those of us in market research, a human analyst still needs to be involved for anything meaningful or useful to come from the analysis. (I say “luckily” because if this weren’t the case you and I would both be out of a job! But don’t worry; human analysts will not be replaced by machines any time soon.)
Back to identifying a human user…
Having hopefully dispelled any myths about not needing a human analyst, I want to emphasize that this does not and should not mean that you need to hire a data scientist. On the contrary, if the tool requires expertise in scripting, for example, chances are it’s not very intuitive and more of a programming tool better suited to academics.
Good text analytics tools for researchers should provide immediate applied value, and allow a common business practitioner to start analyzing any data set right away or with minimal training (I usually recommend about an hour or two). With a good tool, text analytics will be learned in the trenches using actual data for actual analysis that has real value for your company.
So who will this analyst be? Who will use the software? How much will they use it? Surprisingly, a lot of companies get tripped up at this step, too, which overlaps with step #3.
Hint: Unicorns don’t exist!
They say a camel is a horse designed by a committee, but in my experience the enterprise designs a unicorn. The user should never be “everyone.”
Many companies—especially those in which procurement departments play a significant role in the decision—tend to oversimplify steps #1 and #2, and these buyers are more likely to fall for sweeping marketing promises by providers that claim to offer an all-in-one solution for everyone that can do anything.
Frequently in these cases, a long wish list of features/attributes is compiled by a committee, often by adding wishes from various potential users in different functions across the company or by cobbling together features from very different types of text analytics software. This list ends up looking pretty unrealistic and usually calls for a solution that is suitable for all kinds of data, even demanding some sort of imaginary “merging” ability for completely non-complementary data sets that share no unique identifiers or even meta-level merge fields.
This theoretical software is also supposed to be equally useful for marketing, marketing research, customer service, sales, HR, PR, operations, and legal departments, and, of course, IT, too. Not only that, but it must be simple enough for everyone to understand the output without any training or prior analytical knowledge (i.e., static dashboards).
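The "merging" item on such wish lists shows why they break down: two data sets can only be joined meaningfully if they share a unique identifier. A minimal sketch (field names here are hypothetical) makes the requirement concrete:

```python
# Joining two record sets requires a shared unique key; without one,
# no tool can "merge" them meaningfully. Field names are hypothetical.

def join_on(key, left, right):
    """Inner-join two lists of dicts on a common key field."""
    right_index = {r[key]: r for r in right}
    return [{**l, **right_index[l[key]]}
            for l in left if l[key] in right_index]

surveys = [{"customer_id": 101, "nps": 9},
           {"customer_id": 102, "nps": 3}]
call_logs = [{"customer_id": 101, "transcript": "asked about billing"}]

merged = join_on("customer_id", surveys, call_logs)
print(merged)
# [{'customer_id': 101, 'nps': 9, 'transcript': 'asked about billing'}]
```

Anonymous tweets, by contrast, carry no such customer identifier, so there is simply no key on which to join them to your survey or CRM data.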
This is an insane expectation!
Applying the same logic, imagine if a hospital bought tools this way—if doctors across all departments from neurology to OB-GYN had to settle for, say, one scalpel. Oh, and by the way, it should also be useful for the maintenance department, because, after all, they need to cut things, too (like electrical wires or plumbing). This universal scalpel should also be useful for the administrative staff, because they have envelopes. A scalpel should be able to open an envelope, right?
Here’s the frank talk: if you put together a $150-$500K RFP, someone will answer it and claim to have the perfect one-size-fits-all universal scalpel. Good luck with that. (I feel especially sorry for the patients.)
There is no one-size-fits-all product. A text analytics decision should be handled at the department level according to that department’s unique data, objectives and staffing needs.
Will YOU be using it? Then you are the user. Congratulations!
3. Identify a Text Analytics Software Solution
You’ve identified that you have data of value, and that you have at least one user to whom this new task will fall and who will be directly responsible for it. Now it’s time to find the right tool for this user with the best ROI.
Provided you’re not looking for the mythical unicorn I mentioned in step 2, this step should be an easy one.
ALWAYS request a demo with your own data. Text analytics software providers should be happy to sign a mutual NDA; in fact, most enterprise companies require it. This MNDA covers your data and any discussions regarding your business, as well as the IP of the software provider, so it’s a win-win.
Why is this so important? Anyone can put a mock demo together on a mock data set and make it look like it works. The ONLY way to evaluate a software provider is to do so with your own data—data that you are familiar with and that is relevant to you and your business objectives.
One more thing (touched on in step 2): You should approach vendors with an open mind each time. Do not use one vendor’s approach as the basis for assessing another vendor; judge them based on actual output. Does the software have all the features needed to discover/answer your business questions and meet your objectives AND is it easy to learn and use?
One more important tip…
Do NOT allow the vendor a lot of time alone with your data beforehand. If they have it, you will have no idea how much time they put into setting up the demo. For a $250K contract, a company might well invest two full-time analysts over a couple of weeks to make your demo impressive. Sadly, they may even use “mechanical Turk” (human) coding.
I would advise allowing a vendor no more than a day or two with your data, so make sure to schedule the demo within a day or two of giving them the data. In some instances we’ve even been asked to do the demo the same day, just an hour or two after receiving said data. Which data is that? The data set you chose in step 1, of course!
4. Expect Immediate and Ongoing Results
Congratulations! You’ve purchased your software, and hopefully you’ve received some basic training. Ideally you’ve begun using the software immediately after that training.
You won’t be a text analytics master on day one, but if you have real data and real objectives and at least one person is responsible for using the tool (and that means that they will have at least a few hours per month for this purpose), then you are in very good shape.
By the way, if you are just getting started with text analytics or you have staffing constraints, some text analytics vendors can offer initial support, take on special-request ad hoc analyses, or suggest trusted third-party agencies trained in the use of their tool to help you out in those cases.
Hopefully you didn’t buy the dashboard-only solution—the one everyone uses on all data with no analytical firepower. Instead, you were informed enough to select the tool that does what you need it to do using your data whenever you need it. Now you’re able to answer business-critical questions in new ways and management will take notice!
5. Socializing Text Analytics Findings (Recognition and Growth)
This last step is often neglected. It’s only fair that you get noticed for your smart software decision, and more importantly for the incredibly useful insights that you generate using text analytics. Often formerly stale data will come alive, and in my experience unstructured data frequently carries predictive power that structured data alone cannot match.
Be prepared to evangelize your findings, and don’t be afraid to ask your software provider for suggestions about how to do so. In some cases, an initial small use case in one department ends up spreading to other departments. HR comes to marketing research asking, “Hey, I heard about that analysis you did. We think this data is kind of similar. Would you take a look at it?”
And then there are more formal opportunities, of course, if you are willing to share a case study in an article or conference presentation. These formal channels are no more important than the informal, internal ones; in fact, internal word of mouth is how your work will be judged first.
I hope the above was helpful. Please reach out if you have questions about any of the steps above.