Boring but necessary
Integrating quality assurance processes in data projects
In tech teams, it’s common to have at least one QA engineer. QA stands for Quality Assurance, and their job is to test the organisation’s products—software, websites, apps, etc.—to discover bugs. The QA engineer then communicates the issues to developers or designers for fixing. The ultimate objective is, of course, a product of the highest possible quality.
Have you ever heard of people with QA roles in data teams? Chances are, you haven’t. I hear they’re still a rare breed. However, the need for quality assurance in data projects is no less than in tech.
At Parabole, we were recently working with Airbnb on a data-powered interactive site. I cannot tell you what it was about just yet (more on that next year—stay tuned!) but I will say that the dataset was no joke. It included over 35,000 rows and required hours of meticulous manipulation and transformation.
So as we developed the prototype of the site, I felt the urge to make 1000% sure all data points were in the right place and format. Unfortunately, we don’t have a dedicated quality assurance analyst at the studio, so I performed the checks myself.
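For teams without a dedicated QA analyst, even a small script can take over the most repetitive of those checks. Here’s a minimal sketch of the idea in Python, using only the standard library; the column names and validation rules are hypothetical illustrations, not the actual project’s dataset:

```python
import csv
import io

# Hypothetical required columns; a real project would take these from its data spec.
REQUIRED_COLUMNS = {"listing_id", "city", "price"}

def validate_rows(reader):
    """Return a list of (row_number, problem) tuples for a CSV DictReader."""
    problems = []
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        # Check that every required field is present and non-empty.
        for col in REQUIRED_COLUMNS:
            if not (row.get(col) or "").strip():
                problems.append((i, f"missing value in '{col}'"))
        # Check that price, when present, looks numeric (allows one decimal point).
        price = row.get("price") or ""
        if price and not price.replace(".", "", 1).isdigit():
            problems.append((i, f"non-numeric price: {price!r}"))
    return problems

# Tiny in-memory sample standing in for the real file.
sample = io.StringIO(
    "listing_id,city,price\n"
    "1,Lisbon,120\n"
    "2,,95\n"
    "3,Porto,abc\n"
)
issues = validate_rows(csv.DictReader(sample))
for line, msg in issues:
    print(f"row {line}: {msg}")
```

A checklist like this won’t replace a human review, but it turns hours of eyeballing 35,000 rows into a report you can re-run after every transformation.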
Since then, I’ve been thinking about the best way to add QA engineering to data projects implemented by small teams. Here are some thoughts on how to make it part of your design process:
Dedicate resources for it in advance. Is quality assurance part of your development cycle? How many hours do you spend doing it? Allot a fixed amount of time for QA at the beginning of a project and let your stakeholders know how it will unfold.
Enlist users to test the product. Whether you’re building an interactive site or a dashboard, make sure you have external people testing it before it goes live. Testers from your target audience are ideal, but don’t hesitate to solicit contributors outside it, too.
Give the testers a blueprint. You want to see how people naturally interact with the product and what bugs they find. But it can also be great to give them specific prompts. One of those prompts could be to test worst-case scenarios: the highest possible load for a website, the longest name in a filter of a dashboard, the most detailed selection of categories, etc.
Review and compile the fixes. The clearer the list of changes to be implemented, the faster designers and developers can work through it, and the more they’ll appreciate it.
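Those worst-case prompts can even be generated programmatically, so every tester gets the same stress-test inputs. A minimal sketch, with hypothetical category names and limits:

```python
# Hypothetical categories; a real project would pull these from the dataset itself.
CATEGORIES = ["Housing", "Transport", "Health", "Education", "Leisure"]

def worst_case_prompts(categories, max_name_length=80):
    """Build a few edge-case inputs for testers to try against filters."""
    return {
        # The longest value a filter should accept.
        "longest_name": "X" * max_name_length,
        # Every category selected at once (the heaviest query).
        "all_selected": list(categories),
        # Nothing selected (the emptiest state).
        "empty_selection": [],
    }

prompts = worst_case_prompts(CATEGORIES)
print(len(prompts["longest_name"]), len(prompts["all_selected"]))
```

Handing testers a shared set of inputs like this makes their bug reports comparable, which in turn makes the compiled list of fixes clearer.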
That’s all nice, but let’s be honest for a moment.
QA is not the most fascinating topic. If you’re thinking wow, what a boring thing to be talking about right before the holidays, I don’t blame you. Quality checks are possibly my least favourite part of data communication. But they are necessary to ensure that what we craft is accurate and of the highest quality. And we all want that, don’t we?
Perhaps you already have QA processes in place for your team. If you do, tell us about it in the comments! How do you allocate time for it? How do you prepare user testing sessions? And if you don’t, let’s work towards it together.
Thanks for reading this nerdy edition of The Plot! I’ll see you next week for the last—and lighter—newsletter of the year.
Looking to improve your data storytelling skills? Keep reading.
The next round of my data storytelling bootcamp will kick off on January 29. There are still a few spots left and it’s perfect timing for the remainder of your 2023 learning budget. Join us today for two weeks of learning, practice, and fun.