Are you looking for an easier way to write integration tests? We've got you covered! Read the following article and learn how to make it possible.
Modern application development is based on one simple rule:
Use composition
We compose classes, functions, and services into bigger pieces of software. That last element is the foundation of microservices and hexagonal architecture. We want to use existing solutions, integrate them with our software, and get to market quickly.
Do you want to handle account registration and store user data? You can pick one of the OAuth services. Maybe your application offers some kind of subscription or payment? There are many services that can handle this for you. Do you need analytics on your website, but don't want to wrestle with GDPR? Feel free to take one of the ready-to-go solutions.
But the very thing that makes development so easy from a business point of view can give you a headache – the moment you need to write a simple test.
The Fantastic Beasts: Queues, databases and how to test them
Unit testing is pretty simple. As long as you follow the rules, your test environment and code stay healthy. What are those rules?
Easy to write – a unit test should be easy to write because you write a lot of them. Less effort means more tests are written.
Readable – the test code should be easy to read. The test is a story. It describes the behavior of software and could be used as a documentation shortcut. A good unit test helps you to fix bugs without debugging the code.
Reliable – the test should fail only if there is a bug in the system that is being tested. Obvious? Not always. Sometimes tests pass if you run them one by one but fail when you run them as a set. They pass on your machine, but fail on CI (Works on My Machine). A good unit test has only one reason for failure.
Fast – tests should be fast. Setup, startup and the test execution itself should be swift. Otherwise you will write them, but not run them. Slow tests mean lost focus: you wait and stare at the progress bar.
Independent – finally, the test should be independent. This rule stems from the previous ones. Only truly independent tests can form a unit. They do not interfere with each other, can be run in any order, and their failures do not depend on the results of other tests. Independent also means no dependency on external resources like databases, messaging services or the file system. If you need to communicate with external systems, use mocks, stubs or dummies.
Everything becomes complicated when we want to write integration tests. Testing a few services together is not so bad. But when we need to test services that use external resources like databases or messaging services, we are asking for trouble.
To run the test, you need to install…
Many years ago, when we wanted to make some integration tests and use, e.g., databases, we had two options:
We can install a database locally, set up a schema and connect to it from our tests;
We can connect to an existing instance "somewhere in space".
Both had pros and cons, but both introduced an additional level of complexity. Sometimes it was technical complexity arising from the characteristics of certain tools, e.g. installing and managing Oracle DB on your localhost. Sometimes it was an inconvenience in the process, e.g. having to agree with the test team about JMS usage… each time you want to run the tests.
Containers to the rescue
Over the last 10 years, the idea of containerization has gained recognition in the industry. So, a natural decision is to pick containers as the solution for our integration test issue. It is a simple, clean approach: you just run your build and everything works! You can't believe it? Take a look at this simple configuration of a Maven build:
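The original snippet is not reproduced here, so below is a minimal sketch of what such a configuration might look like. It assumes the community docker-compose-maven-plugin and a docker-compose.yml that defines the postgres and pgAdmin services; the plugin choice and version are illustrative, not taken from the original build:

```xml
<!-- pom.xml fragment: assumed setup, bound to the integration-test phases -->
<plugin>
  <groupId>com.dkanejs.maven.plugins</groupId>
  <artifactId>docker-compose-maven-plugin</artifactId>
  <version>4.0.0</version>
  <configuration>
    <composeFile>${project.basedir}/docker-compose.yml</composeFile>
    <detachedMode>true</detachedMode>
  </configuration>
  <executions>
    <!-- start the containers before the integration tests -->
    <execution>
      <id>up</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>up</goal>
      </goals>
    </execution>
    <!-- tear the containers down after the tests -->
    <execution>
      <id>down</id>
      <phase>post-integration-test</phase>
      <goals>
        <goal>down</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```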
The example above is very simple: just one Postgres database plus pgAdmin, and that's all. When you run
```bash
$ mvn clean verify
```
then the Maven plugin starts the containers and turns them off after the tests. The problems start when the project grows and our compose file grows with it. Each time, you need to start all the containers, and they stay alive through the entire build. You can improve the situation a little by changing the plugin execution configuration, but it is not enough. In the worst-case scenario, your containers exhaust system resources before the tests even start!
And this is not the only issue. You cannot run a single integration test from your IDE; first, you need to start the containers by hand. Moreover, the next Maven run will tear those containers down (take a look at the down execution).
So this solution is like a big cargo ship: if everything works well, it's fine, but any unexpected or uncommon behavior leads us to some kind of disaster.
Testcontainers – run containers from tests
But what if we could run our containers from the tests themselves? The idea sounds good, and it has already been implemented. Testcontainers, because that is the project we are talking about, is a solution to our problems. Not ideal, but nobody's perfect.
It is a Java library which supports JUnit and Spock tests and provides a lightweight, easy way to run Docker containers. Let's take a look at it and write some code!
Prerequisites and configuration
Before we start, we need to check our configuration. Testcontainers needs:
Docker in version 17.09 or newer,
Java in version 1.8 or newer,
network access, in particular to Docker Hub.
More about the requirements for specific OSes and CI setups can be found in the documentation.
I use Testcontainers version 1.17.3, but feel free to use the newest one.
Tests with Postgres container
The first step is to prepare our instance of a container. You can do that directly in the test, but an independent class looks better.
public class Postgres13TC extends PostgreSQLContainer<Postgres13TC> {

    private static final Postgres13TC TC = new Postgres13TC();

    private Postgres13TC() {
        super("postgres:13.2");
    }

    public static Postgres13TC getInstance() {
        return TC;
    }

    @Override
    public void start() {
        super.start();
        // expose the connection details to the rest of the test configuration
        System.setProperty("DB_URL", TC.getJdbcUrl());
        System.setProperty("DB_USERNAME", TC.getUsername());
        System.setProperty("DB_PASSWORD", TC.getPassword());
    }

    @Override
    public void stop() {
        // do nothing. This is a shared instance; let the JVM handle this operation.
    }
}
At the beginning of the tests, we will create an instance of Postgres13TC. This class carries the information about our container, most importantly the database connection string and credentials. Now it's time to write a very simple test.
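The test itself is missing from this version of the article; the following is a minimal sketch of what it could look like with JUnit 5 (the class and assertion names are illustrative, Postgres13TC is the class defined above):

```java
import org.junit.jupiter.api.Test;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertTrue;

@Testcontainers
class Postgres13TCTest {

    // the JUnit extension finds this field and manages the container lifecycle
    @Container
    private static final Postgres13TC POSTGRES = Postgres13TC.getInstance();

    @Test
    void shouldStartPostgresContainer() {
        // the container is up and its JDBC URL was exported as a system property
        assertTrue(POSTGRES.isRunning());
        assertNotNull(System.getProperty("DB_URL"));
    }
}
```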
I use JUnit 5 here. The @Testcontainers annotation is part of an extension that controls containers in the test environment. It finds all fields annotated with @Container and starts and stops those containers accordingly.
Tests with Spring Boot
As I mentioned before, I use Spring Boot in the project. In this case, we need to write a little more code. The first step is to create an additional configuration class.
This class overrides the existing properties with values from the test container. The first three properties are standard Spring properties. The next five are additional, custom properties that can be used to configure other resources and extensions, such as Liquibase.
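The configuration class itself is not shown in this version of the article; here is a sketch of what it could look like, assuming an ApplicationContextInitializer combined with Spring Boot's TestPropertyValues. The property names beyond the standard spring.datasource.* ones are illustrative:

```java
import org.springframework.boot.test.util.TestPropertyValues;
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;

public class ContainerInit
        implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    private static final Postgres13TC TC = Postgres13TC.getInstance();

    @Override
    public void initialize(ConfigurableApplicationContext context) {
        // start the shared container before the Spring context is refreshed
        TC.start();
        TestPropertyValues.of(
                // standard Spring datasource properties
                "spring.datasource.url=" + TC.getJdbcUrl(),
                "spring.datasource.username=" + TC.getUsername(),
                "spring.datasource.password=" + TC.getPassword(),
                // custom properties, e.g. for Liquibase (names are illustrative)
                "app.liquibase.url=" + TC.getJdbcUrl(),
                "app.liquibase.username=" + TC.getUsername(),
                "app.liquibase.password=" + TC.getPassword()
        ).applyTo(context.getEnvironment());
    }
}
```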
Now it’s time to define a simple integration test.
@SpringBootTest(webEnvironment = RANDOM_PORT)
@AutoConfigureTestDatabase(replace = NONE)
@ContextConfiguration(initializers = ContainerInit.class)
@Testcontainers
class DummyRepositoryTest {

    @Autowired
    private DummyRepository dummyRepository;

    @Test
    void shouldReturnDummy() {
        var byId = dummyRepository.getById(10L);

        var expected = new Dummy();
        expected.setId(10L);

        assertThat(byId).completes().emitsCount(1).emits(expected);
    }
}
We have some extra annotations here.
@SpringBootTest(webEnvironment = RANDOM_PORT) – marks the test as a Spring Boot test and starts the Spring context.
@AutoConfigureTestDatabase(replace = NONE) – tells the Spring test extension not to replace the Postgres database configuration with an in-memory H2 configuration.
@ContextConfiguration(initializers = ContainerInit.class) – adds an extra Spring context configuration in which we set up the properties from Testcontainers.
@Testcontainers – as previously mentioned, this annotation controls the container lifecycle.
In this example, I use reactive repositories, but it works the same with common JDBC and JPA repositories.
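For reference, the repository in the test above could be a reactive one along these lines; the entity and method names come from the test, while the base interface (here Spring Data's ReactiveCrudRepository) is an assumption:

```java
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import reactor.core.publisher.Mono;

// Assumed shape of the repository used in the test. getById is a derived
// query method; with a JDBC or JPA repository the interface would extend
// CrudRepository or JpaRepository instead and return Dummy directly.
public interface DummyRepository extends ReactiveCrudRepository<Dummy, Long> {

    Mono<Dummy> getById(Long id);
}
```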
Now we can run the test. On the first run, the engine needs to pull the images from Docker Hub, which can take a moment. After that, we will see that two containers have started: one is Postgres, the other is the Testcontainers control container. That second container manages the running containers, and even if the JVM stops unexpectedly, it turns off the containers and cleans up the environment.
Let’s sum up
Testcontainers is a very easy-to-use library that helps us create integration tests which use Docker containers. It gives us more flexibility and increases development speed. A proper test setup also reduces the time needed to onboard new developers: they don't need to set up all the dependencies, just run the existing tests with the selected configuration files.