Best Practices for IVR Testing
Interactive Voice Response (IVR) is more important than ever, spurred on by the automation of branded customer experiences (CX), the growing number of calls on mobile devices, and the rise of speech-recognition conversational interfaces. The market for IVR software was $3.7 billion in 2017 and is projected to reach $5.5 billion by 2023, according to Research and Markets. Companies continue to update their IVR platforms and software with new capabilities, and many are moving their IVRs to the cloud.
The pace of IVR development has accelerated as companies race to offer personalization, omnichannel journeys, and differentiated experiences. But IVR software has become more complex.
Most voice self-service environments incorporate different technical components such as the IVR (voice portal), VoiceXML applications, speech recognition, the voice user interface (VUI), text-to-speech, call routing logic, IP telephony infrastructure, web services, and underlying transaction systems that must all work together.
IVR testing has become a tremendous challenge for companies. The IVR software is continually being updated and IVR test scripts need to keep up with the code. The IVR is one of multiple channels in an omnichannel journey. But to ensure a good experience for the customer, you should test the entire end-to-end customer journey. As you take on the challenge of IVR testing, here are some best practices to help you.
1. Be an Advocate for Your Customer
Sometimes software is implemented and tested blindly without looking at the customer experience. Bad logic, poor usability, or IVR branches with dead ends slip into production. Look at the bigger picture of the IVR software and think holistically to determine if the design makes sense from the customer’s perspective. If you have any doubts about your ability to accurately represent your customer’s point of view, conduct user testing or a usability study. A focus on your customer’s success will help you find design issues before they are put into production.
2. Strive to Automate Everything
Plan for the future and automate everything. For speed and scalability, an ever-growing automated testing process is the only way to increase development pace while assuring high quality. Any manual testing should be targeted for automation. To accelerate the automation process, use a tool that creates functional IVR test scripts, regression tests, load tests, and monitoring tests when the IVR design is updated. Because today’s IVRs have become so complex, focus on tools that can generate test scripts for you automatically, so you can achieve high coverage across the myriad paths and situations.
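As a minimal sketch of what one automated functional test can look like: the IVR menu below is simulated in-process, whereas a real testing tool would place an actual call and send DTMF tones. All names and prompts here are illustrative, not from any particular product.

```python
# A simple menu tree: each node maps a DTMF key to the next node.
IVR_MENU = {
    "prompt": "Welcome. Press 1 for balance, 2 for an agent.",
    "1": {"prompt": "Your balance is $42.00. Goodbye."},
    "2": {"prompt": "Transferring you to an agent."},
}

def run_ivr_path(menu, keys):
    """Walk the menu with a sequence of DTMF keys, collecting prompts."""
    prompts = [menu["prompt"]]
    node = menu
    for key in keys:
        node = node.get(key)
        if node is None:
            # A dead-end branch is exactly the kind of defect to catch here.
            raise AssertionError(f"Dead end: no branch for key {key!r}")
        prompts.append(node["prompt"])
    return prompts

def test_balance_path():
    prompts = run_ivr_path(IVR_MENU, ["1"])
    assert "balance" in prompts[-1].lower()

test_balance_path()
print("balance path OK")
```

The point of structuring tests this way is that each designed path becomes a repeatable script, which is what makes regression, load, and monitoring reuse possible later.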
3. Adopt a Continuous Integration Mindset
Whether you are using DevOps (an extension of Agile software development) or not, a continuous integration mindset is important. Continuous integration (CI) is the practice of regularly merging code changes into a central repository so that automated builds and tests can run, validating each change quickly. Continuous testing is part of CI and is the process of executing automated tests. Taking a multi-tier approach (#4) is a best practice.
As an example, one global financial services company is constantly integrating and regression testing their CX. Literally every single time a developer changes code, an IVR integration environment is updated and a “smoke test” kicks off. The smoke test takes 20-30 minutes at most, but it runs through a lightweight regression suite to make sure nothing major was broken with the update. If there is an issue, feedback goes to the development team immediately, so the issue can be fixed while the details are fresh in the developer’s mind. Then, at the end of the day, they run a larger daily suite that runs everything against one primary testing database. That test may take about six hours. Over the weekend, they run the full gamut of everything they have, testing over 20-36 hours, against multiple test data sets.
4. Take a Multi-Tier Approach to Testing
A multi-tier approach is part of a continuous testing approach. When your test case library gets so large that you cannot run all the tests quickly, break down the tests into tiers and run them on different schedules. As a rule of thumb, when tests take more than 30 minutes to run, you need a multi-tier approach. Note that you can also shorten test runs by adding concurrent connections, using parallelism to increase testing throughput.
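To illustrate the parallelism point, here is a hedged sketch of cutting wall-clock test time with concurrent runs. Each "call" sleeps briefly to stand in for a real scripted test call over one concurrent connection; the names are illustrative.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_test_call(case_id):
    time.sleep(0.05)  # stand-in for one scripted IVR test call
    return f"{case_id}: pass"

cases = [f"case-{n}" for n in range(8)]

start = time.perf_counter()
# Four workers model four concurrent connections into the IVR.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_test_call, cases))
elapsed = time.perf_counter() - start

print(f"{len(results)} tests in {elapsed:.2f}s with 4 concurrent connections")
```

With four workers, eight 0.05-second calls complete in roughly two batches rather than eight sequential calls, which is the throughput gain concurrency buys.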
A best practice is to break down the testing as follows:
Tier 1 – Unit testing
Select lighter weight tests that test smaller pieces of the code. This testing should take no more than 30 minutes and can be run multiple times a day.
Tier 2 – Integration testing
When large tests take too long to get quick feedback (>30 minutes), move those tests into an integration tier. Integration tests are more exhaustive tests which can be scheduled at night or on weekends when there is more time to test.
Tier 3 – Systems testing
If large tests take too long to run in Tier 2, then create Tier 3. Systems testing that performs load tests on the whole system can be put into this tier. These tests can be run on a weekly basis.
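The tiering above can be sketched as data plus a selector. Real suites would use a framework's markers for this (for example, pytest's `-m` tag filtering); this standalone version, with illustrative test names, just shows the idea of picking tiers per schedule.

```python
TESTS = [
    {"name": "menu_prompts",     "tier": 1},  # unit: fast, runs per commit
    {"name": "account_lookup",   "tier": 2},  # integration: nightly
    {"name": "full_system_load", "tier": 3},  # system/load: weekly
]

def select_tests(max_tier):
    """Return the tests eligible for a run covering tiers 1..max_tier."""
    return [t["name"] for t in TESTS if t["tier"] <= max_tier]

print(select_tests(1))  # per-commit smoke run
print(select_tests(3))  # weekend full run
```

Tagging every test with a tier at creation time is what lets the same library serve the per-commit, nightly, and weekly schedules without duplication.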
5. Perform Omnichannel Journey Testing
Customer journeys are now omnichannel journeys and channel transitions are common points of CX failures. Don’t just test the IVR, test the whole journey including the context passing between channels. If customers tend to start on the web, go to the IVR, and then go to a live agent, test the whole chain.
Even if your team only owns the IVR code, end-to-end customer journey testing is your responsibility. Create an omnichannel test plan with test scripts. Determine how to have the other channels participate in the testing, using a black-box or other method. Omnichannel journey testing requires a more extensive environment so all the channels can be tested.
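A hedged sketch of the web-to-IVR-to-agent chain described above: the test checks that context set on the web channel survives each hand-off. The channel functions here are stand-ins for real channel test harnesses, and the field names are invented for illustration.

```python
def web_start(journey):
    journey["case_id"] = "C-1001"  # customer opens a case on the web
    return journey

def ivr_step(journey):
    # The IVR should see the web case and not ask the customer again.
    assert "case_id" in journey, "context lost at web -> IVR hand-off"
    journey["ivr_verified"] = True
    return journey

def agent_step(journey):
    # The agent desktop should inherit the IVR's verification.
    assert journey.get("ivr_verified"), "context lost at IVR -> agent hand-off"
    return journey

journey = agent_step(ivr_step(web_start({})))
print("journey context preserved:", journey["case_id"])
```

Failures at the hand-off assertions are exactly the channel-transition CX breaks the section warns about, and they are invisible to single-channel tests.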
6. Use Design to Drive Your Tests
In a typical software development lifecycle, the developers write all the code, and when it is complete, they hand it over to the QA testers. In the world of CX, we do not want to wait until that code is complete. Instead, use a design-driven testing approach. Once the business analyst or VUI designer provides a roadmap and design of what the IVR should do, create tests from the design. Then, when the design and code are complete, you are already set to test. This level of preparation has the added effect of finding bugs early in the design and code, where they are much easier to fix.
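One way to make design-driven testing concrete, sketched under the assumption that the VUI design can be captured as a simple data structure (the spec format below is illustrative, not any real tool's): generate one test case per designed branch before any code exists.

```python
# A fragment of a VUI design: menu name -> {DTMF key: designed outcome}.
DESIGN_SPEC = {
    "main_menu": {
        "1": "check_balance",
        "2": "transfer_to_agent",
        "0": "repeat_menu",
    }
}

def generate_test_cases(spec):
    """One test case per designed branch: keys pressed and expected state."""
    cases = []
    for menu, branches in spec.items():
        for key, target in branches.items():
            cases.append({"menu": menu, "input": [key], "expect": target})
    return cases

for case in generate_test_cases(DESIGN_SPEC):
    print(case)
```

Because the cases are derived mechanically from the design, a branch the designer forgot to specify simply produces no test, which itself surfaces the gap for review.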
7. Democratize Testing
It’s good for developers to write test cases and not just leave the job to a QA team. Writing test cases requires taking a customer’s perspective which will lead to better design, better code, and better tests.
As an example, one company used to have one half of the development team write test cases while the other half wrote the code for a portion of the project. Then they flipped the teams for a different part of the project. This enabled two different minds to look at the design specifications and user scenarios from a coding and a testing perspective early in the cycle. It also enabled test cases to be written at the same time as the code, which created a feedback loop to identify issues early.
8. Stay Close to the Production Environment
Mimic your production environment so you can perform “apples to apples” testing. To do this, build your staging environment with software applications, operating system, hardware, and network configuration that simulate the production environment. A real DevOps shop will build each testing environment using the same tools used to build out production environments. If your test or staging environments are not representative of the real production environment, bugs can get through to production, or you may spend time fixing problems that can never occur in production.
9. Simulate Customer Account Data
It is important to test with customer data that represents the most frequent scenarios. Create simulation data from live data so that you have 100% control of the data and that the data is not changing on you. But make sure that the data represents a large portion of customer activity and gives you a cross section of all the customer types and situations that happen in real life. As a rule of thumb, build your test cases based on what the customer experiences should be, then find the test data to fit those test cases (versus the other way around).
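A minimal sketch of building deterministic simulated account data covering a cross section of customer types. The profiles, fields, and values below are invented for illustration; seeding the generator is what gives you 100% control, since the same seed always yields the same data.

```python
import random

# Illustrative cross section of customer types seen in real traffic.
CUSTOMER_PROFILES = [
    {"type": "new",        "accounts": 1, "past_due": False},
    {"type": "long_term",  "accounts": 3, "past_due": False},
    {"type": "delinquent", "accounts": 2, "past_due": True},
]

def build_test_accounts(seed=42, per_profile=2):
    """Deterministic fixtures: identical seed means identical data."""
    rng = random.Random(seed)
    accounts = []
    for profile in CUSTOMER_PROFILES:
        for i in range(per_profile):
            accounts.append({
                "id": f"{profile['type']}-{i}",
                "balance": round(rng.uniform(0, 500), 2),
                **profile,
            })
    return accounts

data = build_test_accounts()
print(len(data), "simulated accounts")
```

Starting from the test cases and then generating data to fit them, as the section advises, keeps the fixture set small and purposeful instead of a raw copy of production.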
10. Perform Reliability Testing
Make sure that the system performs 24/7, all the time (e.g., while the backup is running and during shift changes). Create monitoring tests that provide ongoing vigilance to catch any reliability issues. Write monitoring tests early in the process. Start running the monitoring scripts at least a day before you go live and continue to run them every few minutes, around the clock.
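A hedged sketch of the monitoring loop described above: run a health-check probe on a schedule and alert when consecutive failures cross a threshold. The probe here is a stand-in for an actual scripted test call into the IVR, and the threshold is an illustrative choice.

```python
def monitor(probe, checks, failure_threshold=2):
    """Run `probe` for `checks` iterations; return True if an alert fired."""
    consecutive_failures = 0
    for _ in range(checks):
        if probe():
            consecutive_failures = 0  # a success resets the streak
        else:
            consecutive_failures += 1
            if consecutive_failures >= failure_threshold:
                return True  # page the on-call team
    return False

# A probe that fails twice in a row should trigger the alert.
results = iter([True, False, False, True])
assert monitor(lambda: next(results), checks=4) is True
print("alert fired on consecutive failures")
```

Requiring consecutive failures before alerting is a common way to filter out one-off blips (a dropped call, a transient network hiccup) while still catching genuine outages within a few polling intervals.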
11. Share Knowledge Extensively
It is crucial that everyone involved with development have access to the code, test cases, and all the information on the project. Create a central hub to share information. Then QA testers (whether staff or contractors), project managers, and management can obtain the latest information, test status, builds, and documentation.
12. Choose an Easy-to-Use, Integrated Testing Suite
Companies usually have to decide between using best-of-breed technologies or an integrated suite. For IVR testing, the best solution is an integrated suite that works enterprise-wide and can work across all channels. An integrated suite will enable functional tests to be leveraged across regression, load, and monitoring tests.
Because overall team productivity is so vital in today’s world, choose an IVR automation testing tool that is easy to use. Between QA testing staff and contractors, your testing team changes all the time. You want to have a tool that is easy enough that new team members can learn it quickly so you do not have a knowledge gap in testing.