# Using Cloudflared Database Tunnels in GitHub Actions
I needed to run a script in GitHub Actions that syncs data from my repo to a PostgreSQL database—but the DB is only accessible via a Cloudflare Access-protected TCP tunnel (like `posts.r2pi.co`). This required automating `cloudflared access tcp` in the CI pipeline. I also wanted to be able to test this whole pipeline locally using `act`.
**Goal:**

- Run `cloudflared access tcp` to open a tunnel from the CI runner to my database.
- Connect via `psql` (or a Node script) to `localhost:<port>` using that tunnel.
- Ensure this setup fails fast if the DB is not accessible.
- Make local iteration with `act` as seamless as possible.
## What Actually Works in GitHub Actions

**TL;DR:** Use a Cloudflare Access Service Token and set these as secrets in your repo:

- `TUNNEL_SERVICE_TOKEN_ID`
- `TUNNEL_SERVICE_TOKEN_SECRET`

In your workflow, do:
```yaml
- name: Setup Cloudflare Access TCP
  env:
    TUNNEL_SERVICE_TOKEN_ID: ${{ secrets.TUNNEL_SERVICE_TOKEN_ID }}
    TUNNEL_SERVICE_TOKEN_SECRET: ${{ secrets.TUNNEL_SERVICE_TOKEN_SECRET }}
    LOCAL_PROXY_PORT: ${{ vars.LOCAL_PROXY_PORT }} # "35432" in my case
    REMOTE_HOSTNAME: ${{ vars.REMOTE_HOSTNAME }} # "posts.r2pi.co" in my case
  run: |
    mkdir -p ~/.cloudflared
    cloudflared access tcp --hostname "${REMOTE_HOSTNAME}" --url "0.0.0.0:${LOCAL_PROXY_PORT}" --loglevel debug > ~/.cloudflared/access.log 2>&1 &
    ACCESS_PID=$!
    echo "$ACCESS_PID" > ~/.cloudflared/access.pid
    sleep 15
    if ! ps -p "$ACCESS_PID" > /dev/null; then
      echo "cloudflared tunnel died. Check logs:" && cat ~/.cloudflared/access.log
      exit 1
    fi
```
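Note: this step assumes `cloudflared` is already on the runner's PATH. GitHub's hosted Ubuntu runners don't ship it, so you'll usually want an install step before this one; a minimal sketch using the `.deb` from Cloudflare's GitHub releases:

```yaml
- name: Install cloudflared
  run: |
    # Grab the latest published .deb and install it
    curl -fsSL -o cloudflared.deb \
      https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64.deb
    sudo dpkg -i cloudflared.deb
    cloudflared --version
```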
Then, test the DB connection immediately:
```yaml
- name: Test Database Connection
  env:
    DATABASE_URL: ${{ secrets.DATABASE_URL }}
  run: |
    sudo apt-get update && sudo apt-get install -y postgresql-client
    # Note: Actions runs bash with -e, so a bare `$?` check after a failing
    # command never executes; test the command directly instead.
    if ! timeout 30s psql "$DATABASE_URL" -c "SELECT 1 AS connection_test;"; then
      echo "Database connection via tunnel failed!"
      cat ~/.cloudflared/access.log
      exit 1
    fi
```
- Set `DATABASE_URL` to use `localhost:35432` or `127.0.0.1:35432` (matching your tunnel port).
- Shut down the tunnel and upload logs on failure for easy debugging (see the sketch below).
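A minimal sketch of that cleanup, assuming the PID file written in the setup step and the standard `actions/upload-artifact` action:

```yaml
- name: Stop tunnel
  if: always()
  run: |
    # Kill the background cloudflared process recorded in the setup step
    kill "$(cat ~/.cloudflared/access.pid)" 2>/dev/null || true

- name: Upload tunnel logs
  if: failure()
  uses: actions/upload-artifact@v4
  with:
    name: cloudflared-access-log
    path: ~/.cloudflared/access.log
```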
## Common Confusions & Mistakes

**1. Service Token Auth:**
You must use `TUNNEL_SERVICE_TOKEN_ID` and `TUNNEL_SERVICE_TOKEN_SECRET`. Don't use `CF_CLIENT_ID`/`CF_CLIENT_SECRET` or `cert.pem` (origin cert)—those are not accepted for `cloudflared access tcp` machine auth unless you pass them explicitly with `--service-token-id` and `--service-token-secret`.
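For reference, the explicit-flag form looks like this (a sketch using the hostname and port from above):

```bash
# Equivalent to the env-var auth, but passed explicitly on the CLI
cloudflared access tcp \
  --hostname "posts.r2pi.co" \
  --url "0.0.0.0:35432" \
  --service-token-id "$TUNNEL_SERVICE_TOKEN_ID" \
  --service-token-secret "$TUNNEL_SERVICE_TOKEN_SECRET"
```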
**2. Tunnel Tokens & Tunnel Credentials JSON:**
These are only for running `cloudflared tunnel run` (making the machine an origin). They don't work for `access tcp`. This was a big gotcha for me.
**3. Multi-line Secrets with `act`:**
Best to use a `.secrets` file, with one variable per line, to pass secrets to local `act` runs, as sketched below. But—see the next section!
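A sketch of such a `.secrets` file (placeholder values; the names match the repo secrets used above):

```
TUNNEL_SERVICE_TOKEN_ID=<your-token-id>
TUNNEL_SERVICE_TOKEN_SECRET=<your-token-secret>
DATABASE_URL=postgres://user:pass@localhost:35432/mydb
```

`act` reads this via `--secret-file .secrets` (which is also the flag's default value).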
## Issues with Testing Locally With act

This is the main gotcha. When running with `act` (which uses Docker containers for jobs), you will nearly always run into these issues:

- **Docker Networking:** `cloudflared` starts and binds to `127.0.0.1:PORT` or even `0.0.0.0:PORT` inside its container, which, because of Docker's network isolation, may not be reachable from `psql` running in another container step—even if `network="host"` is used.
  - I wasn't able to get it to work with `host.docker.internal` as the `DATABASE_URL` host either; I still got "connection refused."
- **Summary:** You will likely never get a local end-to-end test working via `act` unless you do additional, advanced Docker networking (like using `docker run --network host` on Linux, as sketched after this list, or running both steps in the same process).

In practice:

- CI (GitHub Actions) = reliable
- `act` (local, Docker) = unreliable for network tunnels
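If you still want to experiment with the host-networking route on Linux, one option is to bypass `act` for the tunnel itself and run Cloudflare's published Docker image directly (a sketch, assuming the `cloudflare/cloudflared` image and the token variables exported in your shell):

```bash
# --network host makes the listener on 127.0.0.1:35432 reachable
# from psql running directly on the host.
docker run --rm --network host \
  -e TUNNEL_SERVICE_TOKEN_ID \
  -e TUNNEL_SERVICE_TOKEN_SECRET \
  cloudflare/cloudflared:latest \
  access tcp --hostname posts.r2pi.co --url 127.0.0.1:35432
```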
## Final Checklist for CI

- [x] Use a service token (`TUNNEL_SERVICE_TOKEN_ID`, `TUNNEL_SERVICE_TOKEN_SECRET`)—not certs, not a tunnel token.
- [x] Bind `cloudflared` to `0.0.0.0:<PORT>` (safe for both CI and most local runs).
- [x] In CI, use `localhost`/`127.0.0.1` in your DB connection string.
- [x] Test the DB connection immediately after starting the tunnel—fail fast if it doesn't work.
- [x] Upload tunnel logs as artifacts if a failure occurs.
- [x] Don't expect `act` to behave like CI for tunneled network services; test on GitHub if in doubt.
## If You See This in Your Logs:

```
Please open the following URL and log in with your Cloudflare account:
```

You're still missing the correct service token credentials or using the wrong environment variable names.
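A quick local sanity check (placeholder values): export the variables and rerun; with valid credentials the proxy should start listening instead of printing a login URL.

```bash
export TUNNEL_SERVICE_TOKEN_ID="<your-token-id>"
export TUNNEL_SERVICE_TOKEN_SECRET="<your-token-secret>"
# Expect it to start listening on 127.0.0.1:35432, not print a login URL
cloudflared access tcp --hostname posts.r2pi.co --url 127.0.0.1:35432
```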