Head of Experience Design
Associate Experience Designer and Researcher
Today I want to talk about an ETR offering called Data Insight Services. In a lot of ways we're already doing this, but the goal of formally announcing it is to include it more in our process and enhance what we're doing. I'll explain, of course, what Data Insight Services is in this talk and give a few examples.
So, as I was saying, we want to talk about this new offering: how it fits into the process, the benefits of these services, when we can use them, and how it will fit in at ETR in the future. Then, at the end, maybe we can have an open discussion about where we think this is going, what should or shouldn't be included, and how we can best utilize it.
So what exactly is Data Insight Services? It's an umbrella term we're using to describe the following services: usability testing and information architecture testing, which we're already doing a lot of, and impact measurement, which is essentially measuring goal completion for our clients. It also covers gathering more data on visual design and on the impact and ROI of our UX work. So it means going beyond just doing the testing and coming out of it with actionable, interesting metrics that we can use to show the impact we've had and to validate future work. We'll have harder numbers that explain the impact we've made.
With Data Insights, we want to go beyond assumptions about what the best solutions are and instead create research-backed, measurable insights on what truly works and what can be improved to increase value.
So why Data Insights? Because testing our designs and measuring their impact makes us better designers and better developers, and creates better products. We're also living up to one of Purpose Driven Design's core principles: "you can't guess your way to credibility." As a recap, that principle says we believe an unverified fact is just an opinion. Credible designs are based on facts, not opinions. We want to leverage data to verify assumptions and test our designs often. This helps us ensure that the things we create behave the way users expect them to. We've already got this written into our core language, and we want to make sure we're living up to it.
It's really valuable for our clients to know how their users think. It can give a perspective that they didn't have before, and we definitely want to be able to offer that value. We can prove our effectiveness with numbers, whether it's showing our current clients the impact that we had, or being able to sell to future clients, showing the effect we've had in the past.
And of course it's a valuable potential upsell, and not just for current projects: it also instills the idea that, going forward, we might want more of a long-term relationship with ongoing metrics on these projects, as opposed to jumping off as soon as we're done, the moment we launch the site or the product.
This is a sticker from the Nielsen Norman Group event I was at this week: "If you're not checking, you're guessing." That's basically at the core of what we want to do with Data Insights. Also, a note on the term "data insights": I did a little research into the SEO behind it, and while not a lot of people are searching explicitly for "data insights," people are searching for the services underneath it. I think that gives us an opportunity to have a unique name for our grouping of these offerings without losing the SEO value of people searching for the very common services underneath it. I saw other possible terms out there, like "data-driven UX," which I think might clash with "purpose-driven," but the term "data insights" itself is open for suggestions since this is a new grouping. That's what we're going with for now.
So I want to talk about all the different ways that Data Insights can be incorporated into a project. We'll have some examples of these and some from projects we've already done as well that can show how we're already doing this. So, first in Discovery, one of the ways that we can use Data Insights at the beginning of a project is to run an initial usability test. Usually, we do a lot of usability testing currently during design, but a way that we can better utilize this tool in Discovery is to create a baseline on an existing product, just to see where we are and to identify problems with the current experience, as well as figure out ways that we can enhance it in the future. And the way we would do that is just to work with the client to understand their target audience and the primary use cases, and then recruit and test users to complete that task with the current design.
And in this way we're able to identify what's working and what isn't before we get into design. We can report any unexpected results and use the data to fix those issues once we get into design. A lot of times we do information architecture where we're just going off of best practices. With information architecture tests, we can test the product’s current navigation, and use that as a baseline so that we understand the individual users coming to this particular product. Usually that’s via tree testing to determine how easily users are able to navigate to the pages they need and where there's confusion. And then we can use this information to inform our IA in the redesign.
Another way we can use Data Insights is with goal completion analysis and setup. Analytics are often really important to our clients for tracking the KPIs that matter to the business, and a lot of the time we have to make sure those analytics are actually set up before we can start tracking anything.
So at the beginning of a project, we'll go into the analytics and make sure that we are tracking the things that are important to them and then later we’ll be able to reference those analytics and see what changes we've been able to improve upon.
Then there is visual design testing. This is something we don't do as often right now, but it could be great to get an initial read on brand perception with logo tests or a broader brand analysis, and then use that data to inform the visual design going forward. It's another way to get data at the beginning, rather than relying purely on intuition, and a complement to what we're already doing.
So here is an example of how we can use the data we gather in the Discovery phase to predict for our clients the kind of improvement they might see. Imagine a company with a certain number of potential shoppers and an average spending value, and we think we can improve their cart completion rate with a better design.

Rather than saying, "this shopping cart is badly designed, the usability is not great, and we think we can improve it," and leaving it there, if we have this data, which we can get from an initial test or simply from their existing analytics, we can start to predict the improvements we might see and translate them into the value and KPIs the business cares about.
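To make that concrete, here's a rough sketch of the back-of-the-envelope math involved. Every number here is hypothetical, just to illustrate how a completion-rate improvement translates into revenue:

```python
# All figures here are hypothetical, for illustration only.
def projected_annual_uplift(monthly_shoppers: int,
                            avg_order_value: float,
                            current_rate: float,
                            projected_rate: float) -> float:
    """Estimate extra yearly revenue from an improved cart completion rate."""
    extra_orders_per_month = monthly_shoppers * (projected_rate - current_rate)
    return extra_orders_per_month * avg_order_value * 12

# e.g. 50,000 shoppers a month, $80 average order,
# completion rate improving from 30% to 33%
uplift = projected_annual_uplift(50_000, 80.0, 0.30, 0.33)
print(f"Projected annual uplift: ${uplift:,.0f}")
```

Even a small percentage-point gain compounds into a number the business can immediately weigh against the cost of the redesign.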
So that was initial Discovery, but we can also apply these insights to the design phase. Usability testing throughout design is something we're getting better at doing, and there are a lot of ways it can be applied even further throughout the design process, including after initial wireframes, the first designs we come out with, to get early feedback, make sure we're on the right track, and course-correct early if necessary. And if we ran a previous test, of course we might want to make changes and then test again. So that's another thing we can do.
Another example: if there's a small design challenge, say, over a single feature where two groups have different opinions and you're not sure, you can run a really quick test to get a data-driven answer. A/B tests help when there are two different versions that might make sense and we want to see which one performs better. And we can test branding during visual design to gauge users' perception and level of understanding of a concept we've come up with.
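As a sketch of how "which one performs better" can be read from A/B numbers, here's a standard two-proportion z-test. The conversion counts are made up, not from an actual project:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test: do variants A and B convert at different rates?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal-approximation two-sided p-value via the error function
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: variant A converted 120 of 1,000 visitors, B 160 of 1,000
z, p = two_proportion_z(120, 1000, 160, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below the usual 0.05 threshold would suggest the difference between the two variants isn't just noise.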
So here's an example of a usability test. The person I'm talking to is a user or a potential user of the product. We used Respondent.io and Proto.io and walked them through a prototype and had them use it as if it were a real product. And what we learned from this was we got a better understanding of the audience perception of the brand and how they expected the process to work and where the instructions could be clearer.
Here's an example of a quick usability test, where we had one interface idea and weren't sure how users would expect to use it. We ran a quick poll with about 35 to 40 people and easily got an answer on how people expected it to work.
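For a sense of how much a poll of that size can tell you, here's a quick sketch that computes a Wilson score confidence interval for a poll proportion. The 28-of-35 split is hypothetical:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score confidence interval for a poll proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# Hypothetical: 28 of 35 respondents expected the same behaviour
lo, hi = wilson_interval(28, 35)
print(f"Observed 80%, 95% CI roughly {lo:.0%} to {hi:.0%}")
```

The interval is wide with so few respondents, but when the result is lopsided, even a small poll can settle a design debate.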
This is an example of an A/B test where we actually had four designs with slightly different variants go head to head to determine which would deliver the greatest value.
Another example of a test we could use is card sorting, which is really valuable once we're starting to get into defining the navigation structure. Before jumping into a structure, we can see how users themselves would choose to group those items. And the way you do that is there are tools like Optimal Workshop where you can give users a bucket of lots of different terms, and they can sort them themselves either with predefined categories or make their own categories. And that's just a way for us to see how people group things before we design the nav.
Tree testing is similar. It also has to do with navigation, but the difference here is that we have a structure in place already, and we want to see if people are able to navigate and find the things that they need when they're only looking at the navigation structure, kind of divorced from the design.
So here's an example of this, where we asked the same question with two different designs of the navigation, and there was a clear winner. You can see that the second one is almost 10 seconds faster, with a greater success rate and greater directness. That's something we simply wouldn't know if we had gone with our first gut instinct on the navigation.
A first-click test is also good for navigation, but this time we're actually starting to consider the final visual design and the context it provides. This is an example of a test we did with a client where we had two different versions of the navigation. It's clear to see here that one version has far fewer hotspots, fewer scattered areas where people are clicking, so we were able to see which navigation is more effective at grouping information in the context of a menu with the visual design.
Continuing with visual design testing: this would be a great way to A/B test multiple brand options and understand the perception of each. And if we pair it with Discovery, we can see whether our redesign is making the impact we hoped for, since we have a baseline from Discovery.
So here's an example of a quick survey we were able to run with the logo, to determine how it was perceived and whether it was favored by the audience.
So then finally, we've gone through Discovery and Design, and now we're at post-launch and the ways we can use Data Insights after launch. First, there's usability testing on a live product. There will always be differences between a design or prototype and a live product where people are entering real data, so we can run usability testing on the live product with users to make sure there are no usability issues, and to find opportunities for small tweaks or improvements.
Then there's the goal completion ROI analysis, which is basically following up on the analytics setup from Discovery. A month or so after the product is live, once we've had time to collect data, we can return to the analytics and determine what improvements have been made.
So here's an example of one of these we did with a tool called Pulse Insights, which is basically an on-site micro survey. We ran the test before and after, and were clearly able to see some improvements. The last one there was a visual design test where we gave users a set of words they might associate with the brand, and we saw an increase in the positive ones and a decrease in the negative.
Then there's the post-launch micro survey within a product. You've seen those popups that ask things like "did you find what you need?" They let us catch any small issues that arise and assess visitor perception.
Finally, there's visual design impact measurement, where we can ask people questions and analyze the overall impact the visual design has had.
So here's an example of determining return on investment for a UX change after we've finished the project. Just as an example, imagine that before the project we looked at Google Analytics or similar and saw that a task takes an average of 60 seconds, and we were able to determine how many hours that task costs an employee. From there you can easily calculate the savings, report it back to the client, and use it as a metric going forward about the impact we've been able to have.
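As a sketch, that time-savings math looks something like this. All figures are hypothetical, just to show the shape of the calculation:

```python
def annual_time_savings(seconds_before: float,
                        seconds_after: float,
                        tasks_per_day: int,
                        workdays_per_year: int,
                        hourly_rate: float) -> float:
    """Translate a faster task into estimated yearly labor-cost savings."""
    seconds_saved = seconds_before - seconds_after
    hours_saved = seconds_saved * tasks_per_day * workdays_per_year / 3600
    return hours_saved * hourly_rate

# Hypothetical: a 60-second task cut to 20 seconds,
# 100 tasks a day, 250 workdays a year, $40/hour labor cost
savings = annual_time_savings(60, 20, 100, 250, 40.0)
print(f"Estimated annual savings: ${savings:,.0f}")
```

The point is that a seconds-level UX improvement, multiplied across real usage, becomes a dollar figure the client can report on.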
Here's a recap of those offerings, which is what we're currently hoping to start pitching and selling. They can of course be split into different packages depending on the needs of the business and its capacity. The goal here is for everyone to start thinking a little more about these testing options, how they could improve the design and development of our products, and where there are opportunities to start using them.
If you need an expert team of researchers to help make your project a success, consider reaching out to ETR to discuss your unique user research needs.