If you’ve ever remodeled a room, you know how satisfying it is to watch macro-level changes happen. Imagine that before-and-after moment — one day you have a drab couch atop faded carpeting, and the next day, “Tada!” a teal loveseat resting on a hardwood floor. ✨

But, you did remember to add felt pads to the feet of that new loveseat, right?

Macro changes get the attention, but micro changes make the beauty last.

Macro vs. Micro

One more example (I promise), and then we’ll get to website testing.

During college, I worked on a construction crew that was converting the interior of a vacant library into a new private preschool. I mostly loved it, until the last few weeks.

The macro changes felt fast and looked dramatic: out went the heavy, gray laminated desks and in came the colorful toy bins. But, when all the big tasks were complete, there followed what felt like an endless string of tiny tasks.

  • An exit sign in the back hallway needed a battery.
  • A strip of vinyl baseboard needed to be installed in the janitor’s closet.

The work continued, but the worksite was visibly unchanged. And there was the caulking. So much caulking.

At every point where a conduit traveled through a cinder block wall, we caulked around it with a fireproof material to seal the gap. No casual observer could tell if we forgot to do this or used the wrong material. But overlooking that little detail would cause us to fail inspection, or worse, increase the risk of fire spreading unchecked in an emergency.

Micro tasks are not unimportant tasks.

Yes, you cannot open a building for public occupancy without a major feature, like “walls.” But neither can you open it without fireproof caulk.

QA Testing — Because Details Matter

Quality Assurance (QA) Testing is the process we follow to verify that all features and functions of a new website are working correctly. Just like inspecting a new building, it’s the final bit — a comprehensive, cross-departmental effort to track issues, squash bugs, and ensure that launch day is as error-free as a team of humans can make it.

Once we enter the QA period, we’re no longer developing major features. Instead, we’re looking for the small variables that could lead to out-of-proportion problems after launch. For example:

  • A form with improper error handling that prevents submission
  • A mobile implementation of a module that isn’t usable
  • A touch target that’s too small
  • Missing or ineffective keyboard navigation
  • Insufficient color contrast on a button

These types of issues tend to surface only once a site is used heavily. QA Testing lets us catch them early, before our clients or their users find them.
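
Some of these checks are mechanical enough to script. WCAG, for example, defines color contrast as a ratio of relative luminances, so a small helper can flag a button whose text falls short of the 4.5:1 threshold for normal-size text. Here’s a minimal sketch (not our exact tooling, and the hex colors are purely for illustration):

```typescript
// Minimal WCAG 2.0 contrast check (a sketch, not production tooling).

function channelToLinear(c: number): number {
  const srgb = c / 255;
  return srgb <= 0.03928 ? srgb / 12.92 : Math.pow((srgb + 0.055) / 1.055, 2.4);
}

function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  );
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05)
function contrastRatio(fg: string, bg: string): number {
  const [l1, l2] = [relativeLuminance(fg), relativeLuminance(bg)];
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// 4.5:1 is the AA threshold for normal-size text.
const ratio = contrastRatio("#767676", "#ffffff"); // example button colors
console.log(ratio >= 4.5 ? "Passes AA" : `Fails AA (${ratio.toFixed(2)}:1)`);
```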

Our Process for QA Testing

Our process begins with a project’s Lead Developer, expands outward to include the Project Team, and concludes with the client. Let’s look at each stage in turn:

Developer Testing

The Lead Developer is responsible for testing their own code and any code other team members contribute to the site. To accomplish this, they’ll use a combination of Chrome extensions, macOS apps, and scripts we’ve written in-house to conduct manual and automated reviews.


Empowered with an array of testing tools, the Lead Developer creates and completes a checklist generated from our Asana templates. It’s comprehensive, covering categories from accessibility and author experience to security. Craft CMS makes it easy to provide a highly customizable author experience, so they leverage that to its full potential. And they pay special attention to the project’s accessibility compliance target.

Some example checklist tasks include:

  • Author Experience
    • Do all image fields in the CMS include recommended dimensions?
    • Has help text been added where appropriate?
  • Accessibility (Target WCAG 2.0 AA)
    • Have all color contrast errors been resolved?
    • Can the user navigate with a keyboard?
  • Security
    • Do all forms have CSRF tokens?
    • Do all forms have a spam filtering mechanism in place?
  • SEO
    • Do all pages have an H1?
    • Are default metadata fields configured and are they outputting appropriately?

Once the checklist is complete, the vast majority of low-hanging quality issues have been caught and addressed, and the Lead Developer is ready to engage the Project Team.

Pre-Client Testing with the Project Team

Once the Lead Developer’s QA is complete, we expand our testing to include the full project team (Account Managers, Lead Designer, Creative Director, and the Director of Development) for a meticulous manual review.

  • The account team checks for variances from the agreed-upon scope of work
  • The creative team checks for variances from the approved Figma UI Design (such as spacing inconsistencies or a button missing hover styling)
  • And the Director of Development checks for any technical issues missed by the Lead Developer in their testing

We use a Kanban board Asana template here, paired with a widget from Marker.io. Marker adds a convenient reporting tool to both the frontend (UI) and the backend (Craft CMS), so the team can report issues wherever they find them, and it integrates with Asana to automatically create a new task for each report.
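
Marker.io handles that hand-off for us out of the box, but to illustrate the shape of the integration, here’s a rough sketch of what creating an Asana task from a bug report looks like against Asana’s REST API (the token, project GID, and report fields are placeholders):

```typescript
// Rough sketch: turning a bug report into an Asana task via Asana's REST API.
// Marker.io does this for us automatically; this only illustrates the mechanics.
// The token, project GID, and report fields below are placeholders.

interface BugReport {
  title: string;
  description: string;
  pageUrl: string;
}

async function createAsanaTask(report: BugReport): Promise<void> {
  const response = await fetch("https://app.asana.com/api/v1/tasks", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.ASANA_TOKEN}`, // personal access token
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      data: {
        name: report.title,
        notes: `${report.description}\n\nReported on: ${report.pageUrl}`,
        projects: [process.env.ASANA_QA_PROJECT_GID], // the QA Kanban board
      },
    }),
  });

  if (!response.ok) {
    throw new Error(`Asana API error: ${response.status} ${await response.text()}`);
  }
}

// Example usage with a made-up report:
await createAsanaTask({
  title: "Button missing hover styling on /contact",
  description: "Submit button does not match the approved Figma hover state.",
  pageUrl: "https://example.com/contact",
});
```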

Enter the QA Master

The Lead Account Manager on the project also serves as the QA Master. They are the on-the-ground leader responsible for prioritizing, categorizing, and assigning incoming tasks to the appropriate team member.

Kanban boards are again ideal here, because they offer a convenient drag-and-drop sorting interface for the QA Master, and they give real-time visibility into bottlenecks in the testing process.

[Illustration: an inspector with a clipboard examining web pages]

On a daily basis, the QA Master will be:

  • Watching for and escalating critical issues (anything that prevents us or the client from adding content)
  • Setting due dates on tasks assigned to the Lead Developer or Lead Writer
  • Confirming that bug fixes made by the Lead Developer were successful
  • Consolidating duplicate tasks
  • And minimizing back-and-forth communication


Client Testing

Once the Project Team has resolved their outstanding tasks, the client is invited to join the QA process during a few defined testing windows:

  • Content Population
  • Post-Content
  • Pre-Launch Last Look
  • and Post-Launch

The time allotted for each window varies with the scope and length of the project. It might be one week or two, with the aim of giving clients ample time to do hands-on testing and ask questions.

The QA Master proves even more vital to the process at this stage.

  • They translate our team’s tech speak to the client (web development comes with a lot of jargon)
  • They watch for spiraling tasks that could impact the launch date
  • They distinguish true bug reports from non-bug requests

All requests can be addressed, but only true bug reports are fixed within the scope of QA. Examples of non-bug requests include:

  • Client changed their mind and would like to modify the look of a page or module
  • Client would like to add new modules or page types
  • A new need arises that the client didn't bring up before launch

The Post-Launch Window

The last opportunity to submit a bug report is the Post-Launch Window. After the new website is up and running, our team remains on hand to rapidly address any bugs discovered by the client or their users. While this window is rarely needed, it gives our clients a good deal of peace of mind.

What Could Go Wrong: The Perilous Path of Not QA Testing

When budgets are tight and timelines are short, it may be tempting to abbreviate or skip QA. But the risk of show-stopping bugs or lost revenue is too great to ignore.

We include QA as a line item on each of our estimates and have always found the investment of time to be worth the return. It has enabled us to maintain high standards and provide positive, seamless experiences for clients and visitors alike.

Read More: Process Spotlight — How We Create Engineering Estimates

Testing Leads to Trusting

Have you experienced a bad case of the Launch Day Hiccups on a past project? Reach out to our team, and let’s chat about how our process can help you avoid it next time. Until then, happy testing!