A privacy-focused website for legends

exitprocess0.net - Blog

How to properly host an anonymous "bulletproof" onion website with a reverse proxy to clearnet like a legend

| Introduction

Ye so I thought...
I see so many articles online, quick tutorials on hosting onion websites. Most of them get the point, but their goal is simply to host an onion website, not to do so securely. In this guide we will host a "bulletproof" website on Tor.
For my first article here, I thought it would be a good idea to share my insight and knowledge on the subject. I have quite a lot of experience hosting on Tor, exitprocess0 being one of the latest examples.
The setup we will discuss here is pretty much the one used by exitprocess0.net. It won't be the exact same one, for OPSEC and security-through-obscurity reasons, but the core idea will more or less remain the same.

| Hosting service

While not the main subject of this article, I believe it's critical to mention which hosting service to use, as many people wonder which hosting services are suitable for .onion. There is obviously no definitive answer to that question, but here are the things you should look for:

1. Accepts crypto
2. Doesn't sell itself as a darknet service
3. Has (near) 100% uptime

Here are some explanations for these criteria:
1. Accepts crypto: Being able to buy anonymously will obviously simplify the process. You could use a normal service that doesn't accept crypto and go through a middleman, such as a service that sells debit cards for crypto without KYC, but that's simply harder, takes more time, and adds fees.

2. "Shouldn't sell itself as a darknet service" : This is a more personal take, I do believe some service providing anonymous hosting, ignoring DMCA are legit. In fact I do know some but the thing is, its simply one of the first thing to look for in honeypot if you wanna be secure, and anonymous you should aim for services used by the masses, the "everyone hosting". It's safe to assume that feds do activetely scans these IP ranges and monitor these services in the case of these em not being a honeypot. While if you use a random ASN you are less likely to be activetely monitored. Think about it these hosting services basically yell "WE HELP CRIMINAL HOST THEIR SERVICES ON THIS IP RANGE", feds know the onion services IP range, thus are vulnerable to complex timing and traffic analysis attack and therfore can easily be monitored / are no longer anonymous. Like if I had time in this article, I would just masscan crazy RDP ASN, I swear I will find some C2 HTTP servers with no auth. Feds do the same but have better tools, you can also then monior their downtime with the one of the onion service, you get the idea. That leads us to the third point

3. Have 100% uptime: Downtime correlation is the easiest way to de-anonymize an onion website. If the hosting provider takes your VPS offline for a while (for maintenance reasons, say), an observer can match that outage against your onion service going down and easily guess which hosting service you use; you do not need the power of the NSA to make these educated guesses. Everyone using cheap hosting is "vulnerable" to this, which is why true darknet websites use fast-flux bulletproof hosting. You can't do much about it, which is why I would even advise simply rotating VPS from time to time.
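To make the correlation idea concrete, here is a toy sketch (my own illustration, not a real attack tool): an observer records the hours where the onion was unreachable and the hours where a candidate provider announced maintenance, then intersects the two lists. Real attacks are statistical, but the principle really is this simple:

```shell
# Hours (toy data) where the onion service was observed down:
printf '%s\n' 03 07 15 22 > /tmp/onion_down_hours
# Hours where a candidate hosting provider announced maintenance:
printf '%s\n' 03 07 15 > /tmp/provider_maintenance_hours
# comm -12 prints only lines common to both files (inputs must be sorted):
comm -12 /tmp/onion_down_hours /tmp/provider_maintenance_hours
# -> 03, 07, 15: the maintenance windows line up with the outages
```

A strong overlap across enough observation windows is all it takes to shortlist providers.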


With that said, just choose your hosting provider. It doesn't matter if it doesn't match all the criteria; if properly set up, almost any hosting can be considered "good enough".

| Getting started

Let's say you got your first VPS, the main origin server (you obviously installed a custom ISO and used LUKS full-disk encryption (jk, but why not)). If the VPS is clean, the first step is installing Tor. The tor daemon is the program that opens a SOCKS port on 9050 and routes the traffic sent into it through the Tor network; 3 hops, layered encryption, you know the game. You could just use the one from your package manager:

sudo apt-get install tor
However, if you are using Debian for example, the tor package in the default repos is terribly outdated, so the best way is to compile it from source or simply add the Tor Project mirrors to your package manager.
You could follow the tutorial on the official website; in this guide I will simply give commands that do the job.
Some dependencies:
sudo apt-get update && sudo apt-get upgrade -y && sudo apt-get install -y apt-transport-https curl
Adding the Tor Project mirrors:
curl -fsSL https://deb.torproject.org/torproject.org/A3C4F0F979CAA22CDBA8F512EE8CBC9E886DDD89.asc | gpg --dearmor | sudo tee /usr/share/keyrings/deb.torproject.org-keyring.gpg >/dev/null
printf 'Types: deb deb-src\nURIs: https://deb.torproject.org/torproject.org/\nSuites: %s\nComponents: main\nSigned-By: /usr/share/keyrings/deb.torproject.org-keyring.gpg\n' "$(lsb_release -cs)" | sudo tee /etc/apt/sources.list.d/tor.sources >/dev/null && sudo apt-get update && sudo apt-get install tor -y
Since we are hosting a website we obviously need a web server. There are two popular choices you are probably aware of: apache2 and nginx. I personally prefer nginx, for no real reason other than being used to its configuration, and it's the one we are going to use in this guide.
sudo apt-get install nginx
We will set up a firewall later in this guide. We could use the pre-installed iptables, but I tend to use ufw instead; ufw is just a friendlier frontend over the same kernel-level netfilter firewall, and both work great.
sudo apt-get install ufw
With that boring part done let's get to the core of the subject....

| Preparing the onion server and configuring tor

There are multiple ways / mindsets to configure your servers.

One could argue that it doesn't matter how you harden the box since the onion service will be hidden anyway, but I personally like to have the server reject all inbound traffic on EVERY PORT (yes, even SSH port 22) and only keep loopback (internal) TCP traffic. Tor is great, right? Not only can you route and more or less "anonymize" traffic to some degree with it, you can also run services on a server without "anyone" knowing.

Since SSH traffic will be blocked, we will have to run it as a hidden service (so it becomes loopback TCP traffic). I personally like to use a separate tor daemon for the SSH service.
Not sharing a daemon with the website ensures that the two cannot be correlated in any way: both will have different circuits and torrc configs. You will not see many others do this; most articles online simply run both services through the same tor daemon, so SSH gets a different onion address but runs under the same process. SSH must be kept completely separate from the web server.

After installing tor you should have the configuration files in
/etc/tor/
What we will do is create a secondary tor service for SSH; we can simply copy the default torrc configuration into something like torrc2:
cd /etc/tor/ && cp torrc torrc2
Both instances would bind the same SocksPort, so we have to choose a different local port for the second daemon; any non-privileged port (>1024) will do. In this example I will use 9051, since the default is 9050 and it's easy to remember (just make sure the main daemon isn't already using 9051 as its ControlPort). The second instance also needs its own DataDirectory (e.g. /var/lib/tor2), because two tor daemons cannot share one. So in the torrc2 file I changed the line
SocksPort 9050
into
SocksPort 9051
Now all we need to do is set HiddenServicePort, which maps a virtual port on the onion address to a local target, and HiddenServiceDir, the directory that will hold the keys for the .onion address. So I simply added these lines:
HiddenServiceDir /var/lib/tor/onionservice/
HiddenServicePort 22 127.0.0.1:22
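Pulling the pieces together, a minimal /etc/tor/torrc2 for the SSH hidden service could look like this (the /var/lib/tor2 data directory is my own illustrative choice; whatever directories you pick must exist and be owned by the tor user, debian-tor on Debian):

```
SocksPort 9051
DataDirectory /var/lib/tor2
HiddenServiceDir /var/lib/tor/onionservice/
HiddenServicePort 22 127.0.0.1:22
```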

Now, I know it's in theory more secure to use a unix socket, since traffic then doesn't go through the IP stack, but sshd doesn't support listening on a unix socket (at least not directly). Honestly, who cares; there are far bigger security and privacy challenges, and I'm curious how this could realistically be an attack vector. The Tor Project's own documentation on hosting onion services binds local ports. For the torrc2 config, you could also use

HiddenServiceAuthorizeClient

so that the onion service is only usable by clients holding a matching credential (note that HiddenServiceAuthorizeClient is the old v2 option; for modern v3 onions you instead drop a file with the client's x25519 public key into <HiddenServiceDir>/authorized_clients/). But this is getting extremely paranoid, because even if someone has the SSH onion address, the SSH host fingerprint does not leak the server IP. There are some additional things you can add to that config file; here are some parameters that I consider important:

StrictNodes 1 # Tell tor to strictly follow the Exclude rules below
ExcludeExitNodes {de} # German police run tons of tor nodes; if there is a country to avoid it's this one
MaxCircuitDirtiness 120 # (seconds) After that time the circuit will be rotated; a small value is nice, but use a common one, otherwise you'll get fingerprinted by your super custom circuit lifetime

All we need now is to have this second daemon up and running; to do so we can just add a systemd unit that runs:

tor -f /etc/tor/torrc2

-f simply tells tor to use an alternate config file. For easy control I wrapped the one-liner in a script in /bin (start-tor-ssh):

#!/bin/sh
/bin/tor -f /etc/tor/torrc2
Created the systemd unit for it:
touch /etc/systemd/system/ssh-tor.service
And this is what it contains :
[Unit]
Description=SSH Tor service
After=network.target

[Service]
User=debian-tor
Group=debian-tor
Type=simple
Restart=on-failure
ExecStart=/bin/start-tor-ssh

[Install]
WantedBy=multi-user.target

Reload systemd and enable it with systemctl daemon-reload && systemctl enable --now ssh-tor.

Now, one last thing: tell SSH to listen only on the local address. The SSH server config file is /etc/ssh/sshd_config by default.

By default the server listens on 0.0.0.0 (accepting outside connections on its public address); we just have to change that to the loopback address, so I simply changed the ListenAddress line into:

ListenAddress 127.0.0.1

Now if you restart SSH, the server will no longer listen on the public port 22; your only way to connect after restarting will be through the .onion. The address should be in /var/lib/tor/onionservice/hostname. So just cross your fingers and

sudo systemctl restart ssh
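On the client side, ssh has no idea how to resolve .onion addresses, so you have to push the connection through your local tor SOCKS proxy. A minimal ~/.ssh/config entry could look like this (this assumes tor is running locally with its SOCKS port on 9050 and that you have the OpenBSD variant of netcat; adjust to your setup):

```
Host *.onion
    ProxyCommand nc -X 5 -x 127.0.0.1:9050 %h %p
```

After that, a plain ssh user@youraddress.onion works; torsocks ssh user@youraddress.onion is an alternative if you don't want to touch your config.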

If unfortunately you cannot connect using the onion address, something is wrong in your config and you will have to pop an emergency KVM shell from your VPS provider... If it works, or if you just fixed some broken permissions and it ended up working, great; now we can wall off the server:

sudo ufw allow in on lo
sudo ufw default deny incoming
sudo ufw enable

A good thing to do would be to disable IPv6; it can be used to de-anonymize you, and it's simply good practice to remove things we do not use. I also personally like to block ICMP traffic. All of this can be edited in /etc/sysctl.conf on Debian; I added these lines (apply them with sudo sysctl -p):

net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv4.icmp_echo_ignore_all=1

Now if everything is right, this is what should happen when you run an intensive nmap scan against your host IP:

~$ sudo nmap -sV -Pn -O -T4 [redacted]
Starting Nmap [redacted]
Nmap scan report for [redacted]
Host is up.
All 1000 scanned ports on [redacted] are in ignored states.
Not shown: 1000 filtered tcp ports (no-response)
Too many fingerprints match this host to give specific OS details

OS and Service detection performed. Please report any incorrect results at https://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in [redacted] seconds

This is what I call a "ghost server": the host is up, but the only thing you see is a broad TCP fingerprint; it's doing something, yet you cannot see any services running. In fact, since we even block ICMP, nmap without the -Pn option believes the host is down! Now we can finally make nginx listen on a unix socket the same way we set up SSH: edit the default /etc/tor/torrc configuration file and add HiddenServicePort and HiddenServiceDir. Tiny thing: you can even bruteforce a custom onion vanity address using mkp224o, and just put the generated keys in the HiddenServiceDir directory. Anyway, here is the torrc for the onion service I came up with:

Log notice file /var/log/tor/log
RunAsDaemon 1
DataDirectory /var/lib/tor
StrictNodes 1
ExcludeExitNodes {de}
HiddenServiceDir /var/lib/tor/onionwebsite/
HiddenServicePort 80 unix:/var/run/nginx.sock
MaxCircuitDirtiness 120
HiddenServicePoWDefensesEnabled 1

You may have noticed HiddenServicePoWDefensesEnabled, a very interesting option that makes your website very hard to DDoS: clients using Tor Browser will solve a tiny cryptographic challenge in the background, otherwise their requests get deprioritized. There are always things you can add / tweak and it's up to you; just don't over-customize, it will make your tor daemon fingerprintable. If you no longer look like the average tor user, you are no longer anonymous; you just become the uncommon guy with a "paranoid setup", and node operators can see / monitor this.


As for nginx, the file /etc/nginx/sites-enabled/default contains the listen address; I just used the unix socket.

So I changed the listen 80 line inside the server {} block into this one:

listen unix:/var/run/nginx.sock;

You can sanity-check it locally with curl --unix-socket /var/run/nginx.sock http://localhost/.

Now, inside /etc/nginx/nginx.conf there are multiple options you could add to make your server more secure, but that isn't really the subject of this guide. So here are some lines that I've added to my nginx.conf inside the http block:

# Hide nginx version
server_tokens off;
# Some XSS protection
add_header X-XSS-Protection "1; mode=block";
# Avoid GZIP bomb
gzip off;

You can also play with the client_body_buffer_size setting to limit request size, but that depends on what your service is about, e.g. whether clients are supposed to transfer files. You should also definitely use ngx_http_limit_req_module to rate-limit / throttle requests; keep in mind that for requests arriving through the tor unix socket nginx doesn't see a real client IP, so per-IP limiting behaves differently than on clearnet. Example (limit_req_zone goes in the http block, limit_req in a server or location block):

limit_req_zone $binary_remote_addr zone=one:10m rate=10r/s;
limit_req zone=one burst=20 nodelay;

If you set up PHP (which you will if you are going to make a dynamic website), do not forget that if you use PHP functions like curl_exec() or exec(), the outbound traffic does not go through a tor proxy, and thus the server IP will get leaked. You can use torsocks or some namespace isolation around phpx.x-fpm. Honestly, simply adding torsocks to the systemd unit will do the job (it's not failproof, leaks can happen).
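As a sketch of the torsocks-in-systemd idea: a drop-in override for the php-fpm unit could look like the following. The unit name, binary path, and PHP version here are placeholders to adapt to your distro, not gospel:

```
# /etc/systemd/system/php8.2-fpm.service.d/torsocks.conf (version-specific, adjust)
[Service]
ExecStart=
ExecStart=/usr/bin/torsocks /usr/sbin/php-fpm8.2 --nodaemonize --fpm-config /etc/php/8.2/fpm/php-fpm.conf
```

The empty ExecStart= clears the unit's original command before replacing it; run systemctl daemon-reload and restart the service afterwards.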

| Fail2Ban additional bot detection

With all of this set up you are not supposed to be afraid of tools like masscan, censys bots, crawlers... This section makes sense for both the reverse proxy and the origin server, even though it's more meaningful on the reverse proxy, as that's the server that sits exposed with open ports for hungry skid bots and scanners.

sudo apt-get install fail2ban

Now, we could go really complex with pattern detection and all, but truth be told most skids use the default config and do not care much.
Did you know tools like Hydra literally use "Hydra" in their user agent, and you can't easily edit it? Fail2ban will simply look at the lines in nginx's access.log and see which match a certain regex. Here is an example filter file (/etc/fail2ban/filter.d/nginx_ban.conf):

[Definition]
failregex = ^<HOST> -.*"(GET|POST).*"(?:censys|Hydra|nikto|masscan)
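To sanity-check a pattern like this before deploying it, fail2ban ships a tester (fail2ban-regex <logfile> <filterfile>), but you can also eyeball the idea with plain grep. This is a simplified grep -E version of the regex above (no <HOST> capture), run against a fabricated log line:

```shell
# A fake nginx access.log line with a telltale "Hydra" user agent:
line='203.0.113.7 - - [01/Jan/2025:00:00:00 +0000] "GET / HTTP/1.1" 403 153 "-" "Mozilla/4.0 (Hydra)"'
# Does it match the ban pattern?
echo "$line" | grep -Eq '^[0-9.]+ -.*"(GET|POST).*(censys|Hydra|nikto|masscan)' && echo MATCH
# -> MATCH
```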
The filter file only defines what to match; we also need a jail file (/etc/fail2ban/jail.d/nginx_ban.conf), which essentially points to the log file and defines the penalty for the IP address. Here is an example of that file:
[nginx_ban]
enabled = true
filter = nginx_ban
action = iptables[name=NGINX, port=http, protocol=tcp]
logpath = /var/log/nginx/access.log
findtime = 10
maxretry = 20
bantime = 1h
Anyway, that's pretty much it if your goal was to host a secure onion website. The next section will be about securely serving a clearnet mirror of this website, so that clearnet users can access it through a normal domain WITHOUT compromising the origin server IP.

| Reverse proxy to clearnet

exitprocess0.net is hosted in the same way; the clearnet side is just a client of the onion. You first need a secondary server, the reverse proxy; even multiple of them, though in this example we will use a single one. A reverse proxy essentially just forwards the client's request and hands back the response. It's relatively easy if your server allows it; in fact that's what scammers use to capture credentials while the victim can still log in. (There are multiple defenses against that, like X-Frame-Options and cookie domain checks, but truth is there is always a way; exitprocess0 has none of these, I just trust my users to be smart enough to use exitprocess0.net on exitprocess0.net, lol.)

I won't go into much detail because it gets extremely boring, and clearnet server configuration is less fun than the darknet one. I'd say that as long as your reverse proxy server does not have the origin server IP in any of its files, your configuration is fine, because reverse proxies are meant to be quick to set up and rotate. Just don't expose the SSH port or other administration services you may use; set them up as a hidden service like we did earlier for the origin server's SSH.

Nginx already has a module for reverse proxying, so it's relatively easy to set one up; you can read the official nginx documentation for the details. In this guide I will simply show an example configuration (/etc/nginx/conf.d/nginx.conf):

location / {
# privoxy is used to forward traffic on the local port 8118, it's not mandatory
proxy_pass http://127.0.0.1:8118/;
proxy_http_version 1.1;
proxy_set_header Host <onion address>.onion;
# keep the client IP
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Connection "";
proxy_set_header Proxy "";
proxy_set_header Accept-Encoding "";
# useful: rewrites links and mentions of the onion on the page to the clearnet domain
proxy_redirect ~^http://<onion address>\.onion(/.*)$ https://<clearnet domain>$1;
sub_filter 'http://<onion address>.onion' 'https://<clearnet domain>';
sub_filter_once off;

}
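The proxy_pass above points at privoxy on 8118 because nginx cannot talk SOCKS to tor by itself; privoxy bridges plain HTTP to tor's SOCKS port. A minimal addition to /etc/privoxy/config for that job could be something like this (assumes tor's SOCKS on 9050; the trailing dot is part of the syntax):

```
listen-address 127.0.0.1:8118
forward-socks5t / 127.0.0.1:9050 .
```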
That's it ?

| The end ?

And with that, we have an onion website mirrored to clearnet securely through a reverse proxy.
There are many things I could have added, but this blog post is already long and took me quite some time to write. I'm working on a tiny system for exitprocess0.net that rotates the A record during outages using a tiny DNS TTL; I will probably write a blog post about it, or maybe about Freenet & VPN crowd manipulation. Both are planned, we'll see.
I hope this will be helpful to someone. Have a nice rest of your day; exit the process, cleanly.

Thank you for reading