UPDATE: Caddy 2.0 is now in beta (found here). Some instructions or links in this post may be slightly outdated, but the process of setting up a Caddy webserver remains the same.


The past few years of software development and architecture have witnessed multiple revolutions. The rise of containers, unruly package management ecosystems, and one-click deployments holds an unspoken narrative: most people probably don’t care about how things work beneath the top layer. Sure, advancements in application infrastructure have undoubtedly made our lives easier. I suppose I just find this lack of curiosity and unwillingness to dig into the innards an unrelatable trait. Yet I digress.

I’ve never found web server configurations to be particularly tricky, but apparently, most consider this enough of a nuisance to make something even easier to use. That’s where I came across Caddy.

Caddy is a web server and free SSL service where most of the actual work happens via its download GUI. It’s great. Even though I never expected us to reach a place where apt install nginx and apt install certbot are considered too much of a burden, it only took a few minutes of wrestling with a Docker container running on a VPS before I realized there was a better way.

Serve Anything With Caddy

In my particular example, the Docker container I was running exposed an API endpoint. For some reason, this service forcefully insists that this endpoint is your machine’s localhost, or it simply won’t work. While scoffing at vanity URLs for APIs is fine, what isn’t fine is that you can’t assign an SSL certificate to an IP address. That means whatever app is consuming your API will fail, because your app surely has a cert of its own, and HTTPS-to-HTTP API calls just aren’t gonna happen.

Caddy trivializes SSL certs to the point where you might not notice you’ve acquired one. Any host specified in a Caddyfile immediately receives an SSL cert, but we'll get to that in a moment.

Caddy’s download page is like a shopping cart for the things you might want your web server to do. Proxies, CORS, you name it: just put it in the (free) shopping cart:

Soon we won't even have to write code at all!

Selecting your platform, plugins, and license will provide you with a convenient one-liner which downloads your exact package, unpacks, and installs it on your machine. For example, a Caddy installation for Linux with no bells or whistles looks like this:

$ curl https://getcaddy.com | bash -s personal

Notice that we've added the flag -s personal to the command listed on the download page. This specifies a personal (free) license; I believe it was simply left out of the page's command by accident. Running this command will install Caddy, leaving only some trivial configuration before you're up and running.
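
If you did toss some plugins into the cart, the download page bakes them into the same one-liner — as I recall, as a comma-separated list after the license. A sketch, with http.cors and http.realip standing in for whatever plugins you actually picked:

$ curl https://getcaddy.com | bash -s personal http.cors,http.realip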

Configuring Your Caddyfile

Caddy is configured via what is simply named a Caddyfile, a file which can conveniently live in your project folder, as opposed to a distant land called /etc/nginx/sites-enabled. Go ahead and create your Caddyfile.

The first line in our Caddyfile config is both simple and magic. It contains merely the domain you’re intending to listen on, something like hackersandslackers.com. No matter what else happens in your config, the mere existence of this line will generate an SSL cert for you when you run caddy.
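
To see just how little that takes, here's a minimal sketch of a complete Caddyfile — the domain and folder are placeholders, but a file this short is already enough to get a cert and serve static files over HTTPS:

hackersandslackers.com

root /var/www/mysite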

You can serve content via any method that Nginx or Apache can, albeit much easier. A few examples:

  • root path/to/project serves static files out of the specified folder.
  • websocket path/to/socket command serves an application over a WebSocket at the given path, backed by the specified command.
  • rewrite [/original/folder/path] [/new/folder/path] reroutes requests made to the first path to the second, internally.

The point I’m trying to make here is that no matter what your desired configuration might be, it’s dead simple and likely won’t exceed more than 5 lines.
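
As a hedged example — the domain, folder, and paths here are all placeholders — a config that serves a static folder, compresses responses, and quietly rewrites an old URL path fits comfortably under that line count:

example.com

root /srv/mysite
gzip
rewrite /old-blog /blog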

Serving Our Docker Container via Proxy

If you’re using Node, chances are you’re going for a proxy configuration. In my case, I had no choice: I somehow needed to interact with an HTTP URL while also passing the authentication headers necessary to make the app work. Luckily, this is trivial:

example.com

proxy / localhost:4466/my_api/prod {
    transparent
}

errors proxieserrors.log

Yes, really. Our proxy block simply takes every request that hits example.com and serves it from localhost:4466/my_api/prod.

transparent is a magic phrase which passes through all our headers to the target. It's shorthand for the following:

header_upstream Host {host}
header_upstream X-Real-IP {remote}
header_upstream X-Forwarded-For {remote}
header_upstream X-Forwarded-Port {server_port}
header_upstream X-Forwarded-Proto {scheme}

Despite our Docker app requiring an authentication token to work, hitting example.com will still result in a working endpoint thanks to the headers we're pushing upstream.
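
If your upstream app expects a header that transparent doesn't cover — say, a static auth token — you can push that along too. A quick sketch, assuming a hypothetical API_TOKEN environment variable (Caddy substitutes {$VAR} placeholders from the environment):

proxy / localhost:4466/my_api/prod {
    transparent
    header_upstream Authorization "Bearer {$API_TOKEN}"
}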

I even went the extra mile and included errors proxieserrors.log as a way to log errors. I didn't need to; I only got two errors total. Caddy works obnoxiously well.

In case you need anything more, I’d recommend reading the documentation. Even then, this basically summarizes the things you can potentially configure:

proxy from to... {
	policy name [value]
	fail_timeout duration
	max_fails integer
	max_conns integer
	try_duration duration
	try_interval duration
	health_check path
	health_check_port port
	health_check_interval interval_duration
	health_check_timeout timeout_duration
	fallback_delay delay_duration
	header_upstream name value
	header_downstream name value
	keepalive number
	timeout duration
	without prefix
	except ignored_paths...
	upstream to
	ca_certificates certs...
	insecure_skip_verify
	preset
}
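
Most of these only matter once you have more than one upstream to juggle. As a hedged sketch of the handful you're most likely to reach for — two hypothetical backends, round-robin load balancing, and a basic health check path:

proxy / localhost:4466 localhost:4467 {
    policy round_robin
    health_check /healthz
    fail_timeout 10s
    transparent
}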

Run Caddy And Never Worry About It Again

Saving your Caddyfile and running $ caddy will issue your cert and run Caddy as a process. This will result in a message letting you know that Caddy is listening on both ports 80 and 443.

Caddy won’t run as a background process by default. To keep it running after you close your session, simply use the command $ nohup caddy & and you're good to go.
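
If your Caddyfile doesn't live in the directory you're launching from, you can also point Caddy at it explicitly and stash its output somewhere useful — a sketch, with the paths being whatever suits your setup:

$ nohup caddy -conf /path/to/Caddyfile >> caddy.log 2>&1 &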