Integration tests with Go, Docker and Makefiles

Integration tests are a fundamental part of every application. They ensure that different components communicate correctly and that side effects occur in a controlled, predictable way. But how can we ensure that our API's data layer is talking to our database correctly? With this in mind, I'll show you a structured approach to implementing integration tests in Go, using Docker Compose to provide our infrastructure and a Makefile to automate the test runs, ensuring a clean and reproducible environment.

Setting up the Docker image

First things first, we need a Dockerfile so we can build our application image and run our tests with Docker Compose. Here's our image:

Dockerfile

FROM golang:1.23-alpine

RUN apk add build-base

RUN mkdir -p /go/src/github.com/dinizgab/golang-tests

WORKDIR /go/src/github.com/dinizgab/golang-tests

COPY . ./
RUN go mod download

This is a simple Dockerfile that has everything we need. Let's break it down:

FROM golang:1.23-alpine -> This line sets the base image that we're building from: in our case, the Alpine variant of the Go 1.23 image, which is a lightweight version of the Go base image.

RUN mkdir -p /go/src/github.com/dinizgab/golang-tests

WORKDIR /go/src/github.com/dinizgab/golang-tests
COPY . ./

These three instructions set up our working directory inside the container. We create a directory with the same path as our package, then set it as the default working directory. Finally, we copy all the files from the build context into the container.

RUN go mod download -> This line downloads all our package dependencies. Since we don't need to run our API, the image needs no entrypoint for our main file; the test command will be supplied by our Compose file.


Setting up Docker Compose

First things first, we need a compose.yaml file that defines the services we're going to use: in this case, an API and a PostgreSQL database. This file ensures that both services are running, configured and communicating correctly.

# compose.yaml

networks:
  database:
services:
  apitest:
    build:
      context: .
    ports:
      - 8000:8000
    profiles: ["test"]
    environment:
      - DB_HOST=database
      - DB_PORT=5432
      - DB_USER=postgres
      - DB_PASSWORD=mysecretpassword
      - DB_NAME=local_db
    networks:
      - database

  database:
    profiles: ["test"]
    ports:
      - 5432:5432
    image: postgres:13.5-alpine
    environment:
      POSTGRES_PASSWORD: mysecretpassword
      POSTGRES_DB: local_db
    networks:
      - database

This setup:

  • Creates a Postgres container that acts as our database.
  • Defines and builds an API test container, running it in an isolated environment.
  • Connects both services through a shared virtual network called database.
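
The environment variables passed to apitest are how the test binary finds the database. The post's db.NewDBConfig is not shown, but a minimal sketch of reading this configuration from the environment might look like the following (the DBConfig struct, its field names, and the DSN method are assumptions for illustration, not the post's actual code):

```go
package main

import (
	"fmt"
	"os"
)

// DBConfig holds the connection settings read from the environment.
// The struct and field names here are hypothetical.
type DBConfig struct {
	Host, Port, User, Password, Name string
}

// NewDBConfig reads the same variables the compose file sets for apitest.
func NewDBConfig() DBConfig {
	return DBConfig{
		Host:     os.Getenv("DB_HOST"),
		Port:     os.Getenv("DB_PORT"),
		User:     os.Getenv("DB_USER"),
		Password: os.Getenv("DB_PASSWORD"),
		Name:     os.Getenv("DB_NAME"),
	}
}

// DSN builds a key=value connection string that Postgres drivers accept.
func (c DBConfig) DSN() string {
	return fmt.Sprintf("host=%s port=%s user=%s password=%s dbname=%s sslmode=disable",
		c.Host, c.Port, c.User, c.Password, c.Name)
}

func main() {
	// Inside the apitest container these values come from the compose file.
	cfg := NewDBConfig()
	fmt.Println(cfg.DSN())
}
```

Because the API container and the database share the network, the hostname is simply the service name, database, as set in DB_HOST.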

Automating tests with a Makefile

After that, we also need a Makefile, so that we can automate the execution of the tests (it's possible to run them manually with the docker compose command too). This allows us to:

  • Spin up the database automatically.
  • Run all the migration files.
  • Run all tests in our codebase.
Makefile

args?=./...
test: up-db migrate
    docker compose --profile test run --rm apitest go test -failfast -v -count=1 $(args)

up-db:
    docker compose up -d database

migrate:
    goose -dir ./migrations postgres "user=postgres password=mysecretpassword host=127.0.0.1 port=5432 dbname=local_db sslmode=disable" up

Commands explanation

test instruction: Runs all tests in an isolated environment:

docker compose --profile test run --rm apitest go test -failfast -v -count=1 $(args)

Here we run a docker compose command that selects every service whose profiles array in compose.yaml contains test, and runs the command go test -failfast -v -count=1 $(args) inside the apitest service (I'll get back to that command shortly). Before the service name there is one small flag, --rm, which deletes the container after it stops.

Now with the go test command:

go test -failfast -v -count=1 $(args)

First, it calls the go test tool, but, as you can see, there are multiple flags in this command. Starting with -failfast: this flag stops the tool at the first test failure (assertion or error) that occurs, saving us time. -v enables verbose output, logging each test as it runs. -count=1 forces each test to run exactly once, ignoring any previously cached results. $(args) is a variable passed in when we call make test, representing a specific package or test we want to run; it is optional, so if we don't pass anything, it falls back to ./..., which matches all test files.
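
In practice, invoking the target looks like this (the package path below is just an illustrative example, not a path from the post's repository):

```sh
# Run the whole suite; args falls back to ./...
make test

# Run only the tests under one package (hypothetical path)
make test args=./internal/repository
```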

up-db instruction: docker compose up -d database

This command starts the database service in the background (the -d flag runs it detached).

migrate instruction: goose -dir ./migrations postgres "user=postgres password=mysecretpassword host=127.0.0.1 port=5432 dbname=local_db sslmode=disable" up

This command migrates our database, applying all migration files present inside the ./migrations directory using the goose tool.
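
goose applies versioned SQL files from the ./migrations directory, each annotated with its up and down steps. A minimal migration for the users table used in the tests below might look like this (the file name and schema are assumptions based on the repository code, not taken from the post):

```sql
-- migrations/00001_create_users.sql

-- +goose Up
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    first_name TEXT NOT NULL,
    username TEXT NOT NULL
);

-- +goose Down
DROP TABLE users;
```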


Writing and running tests

After creating our infrastructure, we can now write our integration tests that interact with our database. Below is an example of a repository function that saves a User model into a users table and its corresponding tests.

// user_repository.go

func (r *userRepositoryImpl) Save(user models.User) error {
	query := `INSERT INTO users (first_name, username) VALUES ($1, $2)`

	_, err := r.db.Exec(query, user.FirstName, user.Username)
	if err != nil {
		return fmt.Errorf("UserRepository.Save: error saving user - %w", err)
	}

	return nil
}
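
The test below also calls repo.FindAll, which is not shown in the post. A minimal sketch of it, assuming the same users table, a models.User with FirstName and Username fields, and an r.db that behaves like a *sql.DB (matching the Save method above), might look like:

```go
// FindAll returns every user in the users table.
// Sketch only: models.User and r.db are assumed to match the Save method above.
func (r *userRepositoryImpl) FindAll() ([]models.User, error) {
	rows, err := r.db.Query(`SELECT first_name, username FROM users`)
	if err != nil {
		return nil, fmt.Errorf("UserRepository.FindAll: error listing users - %w", err)
	}
	defer rows.Close()

	var users []models.User
	for rows.Next() {
		var u models.User
		if err := rows.Scan(&u.FirstName, &u.Username); err != nil {
			return nil, fmt.Errorf("UserRepository.FindAll: error scanning user - %w", err)
		}
		users = append(users, u)
	}

	return users, rows.Err()
}
```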
// user_repository_test.go

func TestRepository(t *testing.T) {
	dbConfig, err := db.NewDBConfig()  // Gets database config from env
	if err != nil {
		t.Fatal(err)
	}
	db, err := db.New(dbConfig)   // Creates a new database connection
	if err != nil {
		t.Fatal(err)
	}
	repo := NewUserRepository(db) // Creates a new repository

	t.Run("Test create new user", func(t *testing.T) {
		db.Exec("TRUNCATE users CASCADE")  // Resets users table

		user := models.User{
			FirstName: "Gabriel",
			Username:  "dinizgab",
		}

		err = repo.Save(user)         // Saves a user in the database

		assert.NoError(t, err)        // Asserts that no errors occurred

		users, err := repo.FindAll()  // Fetch all users

		assert.NoError(t, err)        // Asserts no errors
		assert.Equal(t, "Gabriel", users[0].FirstName)  // Asserts user information
		assert.Equal(t, "dinizgab", users[0].Username)
	})
}

Conclusion

This setup provides a clean and automated way to run integration tests in our application. With little more than Go's standard test tooling, a Makefile with a few targets, and Docker with Docker Compose, we were able to create a platform-agnostic test suite that verifies our database interactions work correctly in a reproducible environment. And all of it runs with a single make test.

If you want the full example code, check out this GitHub repository. In it, you can find a complete implementation of this setup, with a simple API and a database connection.