"Tests are annoying". "They can't even test the visual output properly". "They can never check this functionality". If you asked me about what I thought about tests, this is what I would have told you several years ago. In my past self's defense, testing frontend can seem pointless. The real purpose of the frontend is to render a visual output and there is still no way to programmatically check with 100% certainty whether the visuals are rendered correctly. So what is the point?

This is what I told my friend Peter when testing came up during one of our hangouts. As a more senior backend developer, he didn't buy my argument. He then looked on incredulously as I told him that the companies I had worked for until then had little to no frontend tests for anything they had in production. "That is horrible," he said.

His reaction stuck with me so when things calmed down on the project I was working on, I volunteered to implement some end-to-end tests. Since then, I have implemented dozens of integration and end-to-end tests and now I have a different stance on testing. I really wish that someone had explained to me the real benefits of having tests apart from the obvious "it catches bugs". So in this post, I will walk you through the benefits I discovered and the various pieces of knowledge I picked up about testing frontends.

Hand-drawn sketch of a heart sign with a checklist within it

Tests are not free

Handwritten text of free being crossed out

I think many developers are not fully prepared for the cost of tests. Tests require constant maintenance to be truly valuable, and if they are not maintained, they just become a nuisance. On many of the projects I worked on, someone had introduced tests to a part of the codebase but then never increased the coverage or continued maintaining them. This happens when a developer or team underestimates the cost of tests and then abandons them once that cost becomes apparent.

Tests require significant development effort upfront before returning any value. It is easy to underestimate this effort and then give up too early, claiming that the tests didn't provide any value. But tests provide most of their value in the long term, which brings me to my next point.

The biggest value of tests is the confidence they provide

Hand-drawn sketch of a stick figure with sunglasses and a look of confidence

Bugs are an inevitable part of software development and naturally, we try to avoid introducing them as much as possible. One common tactic is to avoid big changes, or any changes to the core, because these are the changes most likely to introduce bugs. But this tactic leads to increasingly convoluted code, which is harder to maintain and even more prone to bugs. The longer a project is in use, the more this happens.

When a project has sufficient test coverage, you don't have to resort to the same tactic. You develop the confidence to make big changes because you are not afraid that something will break without your knowledge. The tests have got your back.

I have seen how this confidence is especially important on "legacy" projects that have been live for several years and are still in use. Without it, and the bold changes it allows, these projects accumulate even more cruft and technical debt and become even harder to work with.

Integration tests provide the most value per effort

Handwritten text - 11 VPE

Most frontend courses cover unit tests since they are the easiest to get started with. As a result, unit tests are probably the most common form of frontend tests. But in my experience, they are also the least useful.

The most useful tests are those that check the most important user flows through a website or a web app. These tests can be either end-to-end (E2E) or integration tests. Testing terminology is confusing but the way I interpret these terms in a frontend context is that integration tests are tests with mocked API responses and E2E tests are tests where there is a live backend.

The benefit of E2E tests is that they allow you to catch bugs that are caused by a misbehaving backend. But implementing them takes more effort since

  1. You need to run the backend during each test run
  2. You need to orchestrate the backend into different states

The second point is particularly troublesome since extra functionality usually needs to be added to the backend so that it can be orchestrated into different states. This often means coordinating with the designated backend developer or team, which instantly increases the effort needed to implement the tests.

Integration tests using mocks don't face this shortcoming. They still catch the majority of bugs within the frontend and, more importantly, they can be implemented faster, which is crucial during initial adoption. (I will describe initial adoption in more detail later.)

(Kent C. Dodds also agrees that integration tests are the sweet spot for frontend tests.)

Snapshot tests are not worth it

Hand-drawn sketch of a polaroid camera printing out a snapshot containing HTML elements

Snapshot tests seem very appealing initially. They assert on the rendered HTML output, which is very close to the actual visual output of the frontend. You don't have to write complicated selectors or traverse across elements; you can just snapshot the parent element. But what I have seen in practice is that snapshot tests break down quickly. Even the smallest change in markup causes them to fail, and it is way too easy to mark a bad snapshot as the correct output because figuring out what the correct snapshot should look like requires diving into the code. As a result, snapshot tests tend to work during initial development, while the original developer is around, and then fall apart as other developers start working on the code.

In my experience, most frontend functionality can be checked by simply asserting that the correct text is rendered or that the correct HTML attributes are set on elements. Despite my initial gripes about the usefulness of such "non-visual" tests, they have saved my bacon more times than I can count. Most test runners provide ways to scope assertions to a particular element, which can be used to test complex layouts with repeating elements. (For example, Cypress has the within command for this.) For non-textual elements, you can add and use ARIA attributes, which will also improve accessibility.
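For instance, a Cypress test in this style might scope its assertions to one row of a repeating list. The selectors, copy, and ARIA label below are hypothetical, and this fragment only runs inside a Cypress spec:

```javascript
// Sketch: assert on text and ARIA attributes within one repeated row.
cy.get('[data-testid="user-row"]')
  .first()
  .within(() => {
    cy.contains("Ada Lovelace"); // the correct text is rendered
    cy.get("button").should("have.attr", "aria-label", "Delete user");
  });
```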

You don't need 100% test coverage

Hand-drawn sketch of a progress bar currently on OK%

You don't need 100% test coverage to start benefiting from tests. Covering the major flows of the application results in around 60% test coverage, and this alone is enough for the tests to be of value. Personally, I think chasing 100% test coverage is not the best use of time. The real value of coverage reports is in revealing functionality that is not covered by tests. If all user-facing functionality is covered by tests, then that coverage, usually around 80%, is sufficient. There are diminishing returns for any coverage beyond this on most projects.
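Once you reach a level you are happy with, most coverage tools can enforce it so it doesn't silently erode. As a sketch, assuming Vitest with its coverage support enabled (option names have shifted between versions, so check the docs for yours):

```javascript
// vitest.config.js (sketch) — fail the run if line coverage drops
// below the target; the 80% figure is just the ballpark from above.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    coverage: {
      provider: "v8",
      thresholds: { lines: 80 },
    },
  },
});
```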

But going from 0% to 60% is hard

Handwritten text - 0-60: 2 months

Maintaining test coverage is easier than growing it from zero, and I have usually found myself in the latter situation. In addition to the technical effort of setting up the test infrastructure from scratch, some human effort is also required: if an existing project has little to no test coverage, the development team and/or the organization has not yet understood the value of tests, and they will need to be convinced.

The following practices have helped me get the initial test coverage in place. The same practices can also be useful for any big changes in a project.

Brace for the long haul

Building up the initial test coverage is going to take some time. Probably much longer than you think. So brace yourself for the fact that you will be doing this for some time. It is very likely that you will do this in the background alongside normal feature development for a few months.

Break it down

Don't try to achieve "acceptable" test coverage in one go. Instead, break the task down into logical pieces and merge them piece by piece. This will clearly demonstrate progress to yourself and to your stakeholders which is very helpful when working on long-term projects.

Use metrics to demonstrate progress

Another way to demonstrate progress is to regularly report numbers. In the case of testing, the natural metric is test coverage: set a target, then measure and report progress against it. This gives non-technical management a quantifiable measure to focus on.

(Measuring the test coverage of Vite React apps was much harder than I thought, so let me know if you would like to hear how I achieved it.)

Use GenAI

GenAI has gotten to the point where it can automatically generate tests or even entire test suites. But its output always needs to be reviewed carefully, since it often generates code that looks right at a glance but is subtly flawed. The middle ground I stick to is to write the test cases myself and use GenAI to implement them. This still returns garbage sometimes, but the suggestions tend to improve after I implement a few scenarios myself.

Test your code as a user would use it

Hand-drawn sketch of a stick figure(you) wearing a shoe

This is the guiding principle of Testing Library and, over time, it has become one of mine too. It means writing tests that interact with the code as closely as a real user would, and that do not assert on internals (e.g. component state).
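In React Testing Library terms, that looks something like the sketch below. The ProfileForm component and its copy are invented, and the toBeInTheDocument matcher assumes jest-dom is set up, so treat this as a shape rather than a drop-in spec:

```javascript
import { render, screen } from "@testing-library/react";
import userEvent from "@testing-library/user-event";

test("saves the profile", async () => {
  render(<ProfileForm />);

  // Interact the way a user would: by visible labels and roles,
  // not by reaching into component state.
  await userEvent.type(screen.getByLabelText(/display name/i), "Ada");
  await userEvent.click(screen.getByRole("button", { name: /save/i }));

  expect(await screen.findByText(/profile saved/i)).toBeInTheDocument();
});
```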

Flakiness is inevitable

Hand-drawn sketch of a checklist with the last item having a cross

In an ideal world, tests would fail only when functionality breaks. But despite my efforts, I have still not managed to implement E2E or integration tests that are 100% flake-free.

People will hate me for saying this, but I deal with this problem by not dealing with it. If I can get all green within two test runs, then that is good enough for me. (I really wish I had discovered Cypress's test retries feature sooner.)
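If you take the same pragmatic approach, Cypress can apply it for you automatically. A sketch of its retries configuration:

```javascript
// cypress.config.js (sketch) — retry failed tests up to 2 times in
// headless (run) mode, but never in the interactive runner.
const { defineConfig } = require("cypress");

module.exports = defineConfig({
  retries: { runMode: 2, openMode: 0 },
});
```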

Conclusion

As a junior developer, I found tests annoying. I just wanted to build features and change the world. Tests just got in the way of that.

As a senior developer, I still find tests annoying. But I now see them as a necessary part of maintaining software, as something that has got my back while the crazy frontend world tries to take me down. Getting to the state where the tests have you covered is not easy. But I can promise you that it will be worth it.

What do you think?

Please let me know what you think about this post on the forum. You can also just stop by to say hello, I would love to hear from you!

You can find the discussion about this post here.


Written by

Prabashwara Seneviratne

(bash)
