Installing GitLab in an internal Docker Container behind a public Nginx
There are several options available to install GitLab on your server. GitLab recommends the Omnibus package, but that comes bundled with its own Nginx server. If you already run your own instance of Nginx, you probably do not want to install another one - it cannot listen on the default ports anyway. Thus, you have to look for another method to install GitLab. Another possibility would be to install GitLab from source. In that case, however, you have to install PostgreSQL, Redis and Node.js yourself.
I already had a PostgreSQL instance available that could have been shared, but since I had neither Redis nor Node.js, I decided to use the Docker container of GitLab. This is a GitLab Omnibus installation inside a Docker container: it ships with all requirements (including the Nginx server), but they are confined to one container. Thus, GitLab's Nginx server does not interfere with my own Nginx server and I can get rid of all GitLab components easily if I decide I do not like it.
Installing Docker
First, we need to install Docker. The details of how to install Docker depend on the operating system and can be found on the official website, so I will not go into them here. I will, however, give an insight into my ansible tasks. In my case, the task was defined for a Debian system. On Debian, you have to add Docker's repository to your available repositories, which also includes importing the GPG keys.
At first we have to make sure that the required packages for the docker repository are available. The docker repository specifically requires apt-transport-https, but also a few others listed on their website.
- name: Ensure dependency packages are installed
  package:
    name: "{{ item }}"
    state: present
  become: true
  with_items:
    - apt-transport-https
    - ca-certificates
    - curl
    - gnupg2
    - software-properties-common
Next, we need to import the GPG key for their repository. Otherwise, we cannot verify the integrity of the packages. I had two options here: Either I could download the key directly from their website each time or I could add it to my role as a file and then upload it from the role to the server.
I decided to download it, verify it once and add the file to the role. Otherwise, I would have to verify the fingerprint of the GPG key each time within my ansible tasks or bear the risk of falling victim to a forged GPG key at some point.
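For reference, the one-time verification can be done locally with a few shell commands; this is only a sketch, assuming gnupg2 is installed on your workstation, and the fingerprint has to be compared manually against the one published in Docker's install guide:
# Download the key once (Docker's official key URL for Debian)
curl -fsSL https://download.docker.com/linux/debian/gpg -o files/docker_repo_gpg
# Show the key's fingerprint without importing it
# (on older GnuPG versions: gpg --with-fingerprint files/docker_repo_gpg)
gpg --show-keys files/docker_repo_gpg
The verified file then ships with the role: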
- name: Ensure we have a directory for docker related files
  file:
    path: "{{ gitlab_docker_data_dir }}"
    state: directory
  become: true
- name: Ensure docker official GPG key is available
  copy:
    src: "files/docker_repo_gpg"
    dest: "{{ gitlab_docker_data_dir }}/docker-stretch-stable.gpg"
  become: true
  register: gpgkey
- name: Ensure GPG key is installed
  command: "apt-key add {{ gitlab_docker_data_dir }}/docker-stretch-stable.gpg"
  become: true
  when: gpgkey is changed
The final step is to add the repository file, update the package list and install the docker package.
- name: Ensure docker repository is set up
  template:
    src: "templates/docker.list.j2"
    dest: "/etc/apt/sources.list.d/docker.list"
  become: true
  register: dockerrepo
- name: Ensure repository is updated
  apt:
    update_cache: yes
  become: true
  when: dockerrepo is changed
- name: Ensure docker is available
  package:
    name: docker-ce
    state: present
  become: true
  notify: Start Docker
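The templates/docker.list.j2 referenced above is not shown here. For Docker's Debian repository it boils down to a single deb line; a minimal sketch, assuming you derive the suite from the ansible fact ansible_distribution_release (on Debian 9 this resolves to stretch):
deb [arch=amd64] https://download.docker.com/linux/debian {{ ansible_distribution_release }} stable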
Installing GitLab
Next, we need to pull and run the docker image for GitLab. Since I use it for personal projects, I'm only interested in the Community Edition. The image for the Community Edition is called gitlab/gitlab-ce.
The basic steps here are very simple now:
- Pull the image
- Start the container
We have to map a few folders into the container, so that the configuration files and data persist across container restarts. I also mapped the SSH port directly with Docker to the outside world, but TCP proxying with nginx would also be possible. The HTTP port, on the other hand, will not be opened to the outside world. My public nginx on the host opens an HTTPS port to the outside and relays the traffic internally.
docker pull gitlab/gitlab-ce:latest
And then we can run it with:
docker run --detach --hostname gitlab.example.com \
    --publish 127.0.0.1:8080:80 --publish 2222:22 \
    --name gitlab \
    --volume /opt/gitlab/config:/etc/gitlab \
    --volume /opt/gitlab/logs:/var/log/gitlab \
    --volume /opt/gitlab/data:/var/opt/gitlab \
    gitlab/gitlab-ce:latest
The option --hostname is important here, because it lets GitLab know under which hostname it is known to the outside world and allows it to display the correct URLs. Otherwise, it will use the auto-generated hostname to generate URLs.
Starting the docker container will take a few minutes, but it will display the state in the docker container list (docker ps -a) as starting or healthy.
I also restricted port 8080 to localhost in order to allow the host to access the GitLab container through 8080, but not the outside world. If you run your public nginx in another docker container, you do not need to publish the port at all, but can use Docker's networking mechanisms and put both containers into the same network.
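As a sketch of that alternative (nginx-proxy is a placeholder name for your public nginx container), both containers would join a user-defined network and the proxy would reach GitLab by container name:
docker network create gitlab-net
docker network connect gitlab-net gitlab
docker network connect gitlab-net nginx-proxy
# inside the proxy's nginx configuration, target the container name directly:
# proxy_pass http://gitlab:80;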
Setting up Nginx
Finally, we need to set up Nginx to forward the requests to GitLab in the docker container. For this, I created an ansible role called nginx-passthrough which I generally use to reverse proxy from nginx to other HTTP applications.
Since I want GitLab to be available via HTTPS, I redirect HTTP traffic to HTTPS. I use letsencrypt to generate my SSL certificates, thus I redirect queries to /.well-known/acme-challenge to the appropriate folder. All values in between curly braces are ansible configuration variables.
server {
    listen {{ nginx_passthrough_listen_port }};
    listen [::]:{{ nginx_passthrough_listen_port }};

    charset utf-8;
    client_max_body_size 75M;

    server_name {{ nginx_passthrough_server_name }};
    root /var/www/{{ nginx_passthrough_name }}/public_html;

    location / {
        rewrite ^ https://{{ nginx_passthrough_server_name }}$request_uri? permanent;
    }

    location /.well-known/acme-challenge {
        root /var/www/letsencrypt;
        allow all;
    }
}
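The acme-challenge location matches the webroot from which I serve ACME challenges. If you use certbot, the certificate for the GitLab hostname can be requested against this webroot; a sketch, with gitlab.example.com as a placeholder:
certbot certonly --webroot -w /var/www/letsencrypt -d gitlab.example.com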
The SSL rule is a bit more complicated. In this rule we have to load the SSL certificates, and I include an external snippet to harden the SSL settings. Again, I redirect queries to /.well-known/acme-challenge to my letsencrypt folder.
The most important part is the redirect to the internal HTTP server, in this case GitLab’s nginx from the container (exposed on port 8080 to the host).
server {
    listen {{ nginx_passthrough_listen_port_ssl }} ssl;
    listen [::]:{{ nginx_passthrough_listen_port_ssl }} ssl;

    charset utf-8;
    client_max_body_size 75M;

    server_name {{ nginx_passthrough_server_name }};

    ssl_certificate /etc/letsencrypt/live/{{ nginx_passthrough_server_name }}/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/{{ nginx_passthrough_server_name }}/privkey.pem;
    include /etc/nginx/ssl/ssl_secure.conf;

    root /var/www/{{ nginx_passthrough_name }}/public_html;

    location / {
        proxy_pass http://{{ nginx_passthrough_target_host }}:{{ nginx_passthrough_target_port }};
        proxy_read_timeout 86400;

        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $http_host;

        proxy_redirect off;
        proxy_buffering off;

        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    location /.well-known/acme-challenge {
        root /var/www/letsencrypt;
        allow all;
    }
}
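After deploying both server blocks, it is worth validating the configuration and reloading nginx:
nginx -t
systemctl reload nginx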
Enable SSL on GitLab with a Reverse Proxy
Next, we need to make sure that GitLab will display the correct HTTPS URLs on its web pages and in the checkout links.
Telling GitLab that we use an SSL endpoint and that it should display all URLs with HTTPS requires a few options to be set. By default, GitLab auto-detects from your external_url whether it should listen on port 443 or port 80. With an external reverse proxy we want to set the external_url to HTTPS, but GitLab should still listen on port 80 (because internally we transfer traffic with standard HTTP).
Thus, you’ll have to set the following three settings at the same time:
external_url 'https://example.com'
nginx['listen_port'] = 80
nginx['listen_https'] = false
If you do not specify the nginx settings, GitLab will configure its internal nginx to listen for SSL traffic on port 443 and your GitLab instance will be unreachable (unless you pass all traffic from your external nginx through HTTPS and also set up SSL certificates on your GitLab instance).
We can also encode these settings into an ansible task list:
- name: Ensure external_url is configured
  lineinfile:
    path: "{{ gitlab_docker_data_dir }}/config/gitlab.rb"
    regexp: "^external_url"
    insertafter: "^# external_url"
    line: "external_url 'https://{{ gitlab_docker_hostname }}'"
  become: true
  when: gitlab_docker_external_ssl
- name: Ensure listen port is configured for external reverse proxy
  lineinfile:
    path: "{{ gitlab_docker_data_dir }}/config/gitlab.rb"
    regexp: "^nginx\\['listen_port'\\]"
    insertafter: "^# nginx\\['listen_port'\\]"
    line: "nginx['listen_port'] = 80"
  become: true
  when: gitlab_docker_external_ssl
- name: Ensure listen https is configured for external reverse proxy
  lineinfile:
    path: "{{ gitlab_docker_data_dir }}/config/gitlab.rb"
    regexp: "^nginx\\['listen_https'\\]"
    insertafter: "^# nginx\\['listen_https'\\]"
    line: "nginx['listen_https'] = false"
  become: true
  when: gitlab_docker_external_ssl
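Keep in mind that changes to gitlab.rb only take effect after GitLab has been reconfigured. Since the configuration directory is mapped into the container, you can trigger this from the host (or wire it into an ansible handler):
docker exec gitlab gitlab-ctl reconfigure
# alternatively, restarting the container also re-applies gitlab.rb:
docker restart gitlab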
While you can use most of GitLab's functionality without setting the external URL to HTTPS, it will strike you as soon as you want to use LFS to upload large files. LFS does not work if external_url is set to HTTP and you use HTTPS (with a redirect from HTTP to HTTPS) on your external reverse proxy. In this situation, git will receive redirects from HTTP to HTTPS all the time and fail to upload files to LFS.
SSH Port for checkout URLs
If you do not expose your SSH connection on port 22 (probably true, because your host also has an SSH connection) you will have to make sure that the GitLab instance knows which SSH port you use externally. Otherwise the checkout URIs for SSH connections will be wrong and you have to fix them on each git clone or git remote add.
While GitLab Omnibus configures a lot of services, the SSH server is not one of them. This means that, independent of the options you set in the GitLab configuration, SSH will still listen on the default SSH port inside the docker container. This is actually good for our dockerized environment, because it allows us to configure the external SSH port while using standard Docker mapping features to actually map the port.
This can be done easily with the configuration setting gitlab_rails['gitlab_shell_ssh_port'] = 2222 (if you want to use port 2222).
Again, I created some ansible tasks for this:
- name: Ensure SSH is available on the right port to display correct checkout URLs
  lineinfile:
    path: "{{ gitlab_docker_data_dir }}/config/gitlab.rb"
    regexp: "^gitlab_rails\\['gitlab_shell_ssh_port'\\]"
    insertafter: "^# gitlab_rails\\['gitlab_shell_ssh_port'\\]"
    line: "gitlab_rails['gitlab_shell_ssh_port'] = {{ gitlab_docker_ssh_port }}"
  become: true
- name: Ensure GitLab container is running (public)
  docker_container:
    name: gitlab
    image: gitlab/gitlab-ce
    state: started
    recreate: no
    ports:
      - "{{ gitlab_docker_http_port }}:80"
      - "{{ gitlab_docker_ssh_port }}:22"
    volumes:
      - "{{ gitlab_docker_data_dir }}/config:/etc/gitlab"
      - "{{ gitlab_docker_data_dir }}/logs:/var/log/gitlab"
      - "{{ gitlab_docker_data_dir }}/data:/var/opt/gitlab"
      - "{{ gitlab_docker_lfs_storage_path }}:/data/lfs"
    hostname: "{{ gitlab_docker_hostname }}"
  become: true
  when: gitlab_docker_http_public
Now, your checkout URLs will include the right port and Docker will handle the actual port mapping.
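For example, with gitlab_docker_ssh_port set to 2222 and the hostname gitlab.example.com, GitLab will display SSH checkout URLs of this form (user and project names are placeholders):
git clone ssh://git@gitlab.example.com:2222/myuser/myproject.git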