How to think about tools when analysing Voice of the Customer data
Imagine: you want to analyse free-text data to surface insights from the Voice of the Customer. It’s hard and time-consuming, but you know your unstructured data holds a treasure trove of actionable insights that can create business outcomes that really matter.
What if you go beyond traditional surveys? By 2025, 60% of organizations with VoC programs are expected to supplement surveys with voice and text interactions with customers (Gartner). These data types can be even more valuable, but they are also much harder to analyse.
Then you have an ‘aha’ moment you never thought you’d reach with your current technology stack. The game has changed. After all, analysing unstructured data doesn’t have to be that hard and time-consuming. But how do you get to this ‘aha’ moment? What should you know about the old versus new era of text analytics?
We summarized our research on the best-known VoC tools to help decision-makers evaluate their most critical capabilities. We conducted 100+ interviews with VoC professionals and carried out extensive desktop research. We also signed up for free trials and tested the tools where possible.
Looking to read more?
If you’re looking for a deep dive into the most critical capabilities of VoC tools, check out our comprehensive guide for free.
Download the eBook
8 things VoC tool providers don’t want you to know
1. There is a reason why your analysis takes 4 weeks.
Setting up your VoC tool can be labour-intensive. Depending on the use case, it can take a lot of time to train or even just configure the software for more accurate results, and a new rules-based setup is needed every time unknown themes come up in the data. It doesn’t have to be this way: actionable insights can be surfaced in real time rather than in weeks or even months.
2. The best tech-stack is no tech-stack. It’s only one tool, end-to-end.
You should be able to easily consolidate and analyse all data points that are spread across multiple channels and data formats. You should also be able to use the same analytics tool to tell a compelling story about the insights, without exporting your analysis and rebuilding graphs in Excel and PowerPoint.
3. You could easily save most of the professional services costs.
When a platform is too complex to set up and configure, and you need to do it every time a new use case emerges, chances are you’ll need to spend on professional services. You could pay much less, without investing in highly specialised users. Out-of-the-box technology and a less steep learning curve make this possible.
4. Deep learning disrupted keyword analysis years ago.
A system powered by keyword analysis must contain a manually written rule for every word combination in its library, and creating and maintaining these rules requires tedious manual labour. Deep learning models, by contrast, capture the semantic meaning of unstructured data: machines can now encode the meaning of text in a way that comes closest to human interpretation.
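To see the difference concretely, here is a minimal sketch. The rule list, feedback sentences and two-dimensional "embeddings" are all invented for illustration; real deep learning models produce vectors with hundreds of dimensions from pre-trained language models, not hand-written numbers.

```python
from math import sqrt

# A rules-based system needs an explicit keyword rule for every phrasing.
KEYWORD_RULES = {
    "billing": ["invoice", "charged", "refund"],
}

def keyword_match(text: str) -> list[str]:
    """Tag text with every theme whose keyword list it matches."""
    words = text.lower().split()
    return [theme for theme, kws in KEYWORD_RULES.items()
            if any(kw in words for kw in kws)]

# "billed" is not in the rule list, so this feedback slips through:
print(keyword_match("I was billed twice"))        # []
print(keyword_match("Please refund my invoice"))  # ['billing']

# A deep learning model instead maps text to vectors, so different
# phrasings of the same complaint land close together. Toy 2-d vectors:
toy_embeddings = {
    "I was billed twice":     (0.90, 0.10),
    "They charged me twice":  (0.85, 0.20),
    "The app keeps crashing": (0.10, 0.95),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction (same meaning)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

# Semantically similar complaints score high even with no shared keyword:
print(cosine(toy_embeddings["I was billed twice"],
             toy_embeddings["They charged me twice"]))
```

The point of the sketch: the keyword system fails on any phrasing its rules don’t anticipate, while vector similarity groups meanings rather than strings.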
5. A solution that takes years to perfect is not a great solution.
You shouldn’t have to work for years to make sure the insights you’re getting are good. Supervised artificial intelligence solutions require training with specific datasets, and it can take months or even years to fine-tune supervised AI models and rules-based systems to reach 70%+ accuracy. With pre-trained AI models, it is possible to get up to speed within hours.
6. Actionable insights are not really “actionable” after all.
It’s often the “why”, not just the “what”, that makes an insight actionable. With traditional text analytics, the discovered insights may be too generic, focusing on the “what”, and humans must apply additional analysis to make them actionable. The right technology makes it possible to ask more open-ended questions and to visualise raw text data exactly as the customer said it. Good visualisation points directly to action.
7. There are unknown unknowns in your data. But that doesn’t mean you can’t get to know them easily.
Supervised AI approaches and hand-coded rules create a pre-defined dictionary. This fixed taxonomy can only change when the logic is updated and maintained manually, so the approach can’t pick up unknown unknowns as they emerge in the data. Unsupervised AI can auto-generate insights that are unique to your unstructured dataset, without the need to set up code frames and rules in advance.
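A crude illustration of the idea, using nothing more than word counts (the taxonomy, feedback lines and stopword list are invented for this example; real unsupervised AI clusters semantic embeddings or fits topic models rather than counting words):

```python
from collections import Counter

# A supervised setup only reports on the themes it was coded for.
FIXED_TAXONOMY = {"pricing", "support", "delivery"}

feedback = [
    "delivery was late again",
    "the new dark mode is unreadable",
    "dark mode makes the text unreadable",
    "support answered quickly",
    "dark mode needs a contrast fix",
]

STOPWORDS = {"the", "is", "was", "a", "again", "makes", "needs"}

# An unsupervised pass simply learns from what actually appears in the
# data, so an emerging theme ("dark mode") surfaces without any rule.
counts = Counter(
    word
    for text in feedback
    for word in text.lower().split()
    if word not in STOPWORDS
)

emerging = [(word, n) for word, n in counts.most_common()
            if word not in FIXED_TAXONOMY and n > 1]
print(emerging)  # [('dark', 3), ('mode', 3), ('unreadable', 2)]
```

The fixed taxonomy would have filed all five comments under its three known themes (or dropped them), while the data-driven pass immediately flags the dark-mode complaint nobody wrote a rule for.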
8. Even the most advanced technology can be translated into a no-code platform.
AI technologies could lead to a performance gap between front-runners and non-adopters; for most companies, adopting is a no-brainer. But can you adopt AI without a dedicated data science team? It is possible to translate AI functionalities into self-service features without requiring additional professional services, custom development, or training for your team. No-code platforms provide the opportunity for true self-service.