When running a VoC program that aims to benchmark your organization's business processes against customer preferences, keep in mind that the order in which questions are asked matters as much as the set of questions you ask for a particular benchmark.
Pay close attention to the metrics that define benchmark-focused surveys.
For example, quantifying something as open-ended as how a smartphone app's UI feels is difficult: an aspect like "navigation" alone involves a multitude of factors. Uncovering the underlying problems therefore calls for a broader, more granular study than, say, choosing the color scheme for a CTA button.
Because benchmarking is inherently comparative, the survey experience itself must be consistent with the benchmarking process. In practice, this means delivering the same logical set of survey questions to every respondent, regardless of their actual navigation route through the website.
If executed poorly (irrelevant questions, an illogical order, or both), the survey can turn into a long, irrelevant exercise for respondents.
Response quality also depends on how well the expression method (ratings, written opinions) fits the question, and on relevance; the longer and less relevant the questions, the less likely your VoC effort will yield explicit, accurate responses.
Keep in mind that your benchmarks are only as meaningful as the comparisons they are built on, so the quality of the survey directly shapes your performance comparison and the results that follow.
Factors that govern the response to sub-question #1.2: Continuous improvement
If the strategic purpose is continuous improvement, then the tactical objective of the VoC process is to understand causation.
Ratings are certainly valuable here, but they become considerably more powerful when coupled with follow-up questions aimed at determining why the respondent gave a particular component of the app a certain score. For example:
- “How effective was this feature for what you were trying to do at the time?”
followed by the follow-up question:
- “Please explain why you failed [or succeeded] to complete the action.”
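The rating-plus-follow-up pattern above can be sketched in a few lines of code. This is a minimal, hypothetical illustration; the function name, thresholds, and question wording are assumptions for demonstration, not any survey platform's actual API:

```python
# Minimal sketch of a rating question with a conditional follow-up.
# All names and thresholds here are hypothetical illustrations.

def follow_up_for(rating: int, max_rating: int = 5) -> str:
    """Pick a 'why' question based on the score the respondent gave."""
    if rating <= 2:
        # Low score: probe what went wrong.
        return "Please explain why you failed to complete the action."
    elif rating >= max_rating - 1:
        # High score: probe what went right.
        return "Please explain why you succeeded in completing the action."
    # Middle score: ask for an improvement suggestion.
    return "What would have made this feature work better for you?"

# Example: a respondent rates the feature 2 out of 5, so we probe the failure.
question = follow_up_for(2)
```

The key design point is that the follow-up is conditioned on the rating itself, so every "why" question the respondent sees is relevant to the score they just gave.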
Clearly, continuous improvement-focused surveys have a considerably higher proportion of rating questions and open-ended follow-ups.
Naturally, answering an open-ended question thoughtfully demands much more effort from the respondent, so the questions should be as brief and to the point as possible.
In other words, continuous improvement surveys should be personalized so that the questions reflect each respondent's specific journey through the mobile app and take their unique user experience into account.
Customizing each set of questions in this way calls for a sophisticated survey system, one capable of analyzing user behavior and generating a series of dynamic, response-based questions that probe the user experience.
For example, you may ask:
“Hello, we noticed you were browsing our app's ‘latest deals’ section under the pet accessories category. What do you think of our new stock?”
followed by a question such as:
“Please tell us why you feel positive/negative about the new stock.”
Of course, there is a limit to how many times you can ask a user to elaborate, so be selective about where, when, and what you ask.
When continuous improvement is the main use case, responses that reveal what drives user behavior in a particular area of the mobile app become invaluable.
They provide a deep, granular grasp of causation: by pinpointing the root cause of an issue, they make it possible to resolve the problem permanently.
On the flip side, positive responses about the experience tell you:
- What features and design components to highlight and improve further.
- What not to do while attempting to resolve other similar issues.
At times, you will come across survey ideas that seem to serve both of the aims above. What matters more, however, is what the responses will actually allow you to do, not what the questions appear to focus on at face value.