Crafting a Personal Cloud
Over the years I have used just about every cloud-based file storage service there is. I still have 14.88 GB of storage available to me in Dropbox. I recently decided to use Keybase to store my personal documents... that is, until I read that it had been acquired by Zoom, and then heard Zoom's CEO say the company was actively making product decisions to accommodate law enforcement. I don't think I can trust my data to a company that actively works to undermine the security of its users' data. So I decided it was time to survey the open source personal cloud storage landscape and see if I could host my own cloud storage system.
There are several options out there today for hosting your own personal cloud. I ended up choosing OwnCloud. I went with OwnCloud because it supports encryption at rest, it not only supports but recommends SSL on the front end, it has good guides on setting up and running the system with security in mind, and it provides a Docker container plus a guide for using it with docker-compose. OwnCloud also has sync clients for every major operating system and a mobile client for iOS.
The design I had in mind for this system had some challenges. I wanted a public endpoint to access it, but I did not want to run the OwnCloud instance in the cloud or pay for cloud storage for it. That meant I needed some way to run it on my own hardware with my own storage while keeping it available from outside my home network. I also did not want to punch holes in my home network firewall for it. A setup like that would have been hard to achieve not too long ago, before Zerotier-One. I started using Zerotier a while back, when I discovered it as part of my gaming rig setup with Moonlight. Zerotier is essentially a virtual LAN: I have it running on all my machines now, and I can reach them from anywhere as long as they have an internet connection.
Then I found zyclonite/zerotier-docker on GitHub. It lets me run Zerotier in a container on the cloud server hosting my website, giving that server a virtual LAN connection to my other Zerotier-enabled systems.
It was super easy to set up:
docker run --name zerotier-one --device=/dev/net/tun --net=host --cap-add=NET_ADMIN --cap-add=SYS_ADMIN -v /var/lib/zerotier-one:/var/lib/zerotier-one --restart=always -d zyclonite/zerotier
docker exec zerotier-one zerotier-cli join <my private network id>
Now I had a virtual network adapter which gave me access to my entire virtual LAN including the server in my house I am planning to run OwnCloud on. Step 1 complete.
Setting up OwnCloud on my server in my house was super easy. I used this guide from OwnCloud to stand up my instance on my server. I changed only one thing in their config. Instead of using
OWNCLOUD_VERSION=10.0.7 I used
OWNCLOUD_VERSION=latest, giving me the latest and greatest version of OwnCloud available on Docker Hub. Needless to say, I also used 1Password to generate a good password for the admin account.
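For context, the guide drives docker-compose from a small .env file, which is where that variable lives. Mine ended up looking roughly like this (a sketch: the variable names come from OwnCloud's docker-compose guide, but the values shown here are illustrative placeholders, not my real settings):

```shell
# .env consumed by OwnCloud's docker-compose setup.
# Values below are examples -- substitute your own domain,
# port, and a properly generated admin password.
OWNCLOUD_VERSION=latest
OWNCLOUD_DOMAIN=localhost:8080
ADMIN_USERNAME=admin
ADMIN_PASSWORD=<generated-by-1Password>
HTTP_PORT=8080
```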
The next thing I needed to figure out was how to make my server in the cloud proxy connections through the Zerotier LAN to my server in my house running OwnCloud. I use NGINX as a proxy for my website. I have Ghost running in a container as well as some APIs I wrote for various things. That setup was pretty easy. Now I needed to figure out how to make it proxy these connections as well.
This was the most difficult part of the setup by far. I started out trying to get this to work using a path location as I have done with several other things running behind my public domain.
/owncloud was my first attempt. That did not work: OwnCloud redirects requests at its root to
/login, so every time I tried to get to OwnCloud I ended up on a 404 page for Ghost. Scrap that idea. Since the path approach did not work, I decided to run it as a subdomain instead. I set up a new DNS record for the subdomain and started in on the NGINX configuration. I added a new
server block to my NGINX config and pointed it at the Zerotier IP and port of the machine hosting OwnCloud. This was actually way easier than I thought it was going to be, but I did run into a few issues. First: I use Let's Encrypt for my SSL certificates.
I ran into a cert error first thing. That was fixed by running
sudo certbot --nginx --no-redirect -d <subdomain>.d4v3y0rk.com
This is the same command I ran for the apex domain, with
<subdomain>. added to the front. Now I was able to get to the URL without any cert issues. The final configuration change I had to make was in OwnCloud. There is a file called
config.php, and I had to add one line to it to get things working behind the subdomain:
'overwritehost' => '<subdomain>.d4v3y0rk.com'
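Zoomed out, the relevant fragment of config.php looks something like the following. This is a sketch, not my exact file: the overwritehost line is the one addition described above, while the trusted_domains entry and 'overwriteprotocol' are keys I would expect to need behind an SSL-terminating proxy, so treat them as assumptions to verify against OwnCloud's reverse proxy documentation.

```php
<?php
// Fragment of owncloud/config/config.php (illustrative).
$CONFIG = array (
  // OwnCloud only answers requests for hosts it trusts.
  'trusted_domains' =>
  array (
    0 => '<subdomain>.d4v3y0rk.com',
  ),
  // Force generated URLs to use the public subdomain
  // instead of the internal Zerotier address.
  'overwritehost' => '<subdomain>.d4v3y0rk.com',
  // TLS terminates at the NGINX proxy, so tell OwnCloud
  // to generate https:// links.
  'overwriteprotocol' => 'https',
);
```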
Now I had OwnCloud running on my own personal server inside my home network and it was available from my public subdomain.
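For reference, the proxy side of that setup amounts to one NGINX server block shaped roughly like this. It is a sketch rather than my exact config: the Zerotier IP and port are placeholders for whatever your own network assigns, and certbot adds the SSL listener and certificate directives on top of it.

```nginx
# Proxy the public OwnCloud subdomain to the home server
# over the Zerotier virtual LAN.
server {
    listen 80;
    server_name <subdomain>.d4v3y0rk.com;

    # File syncing means large uploads pass through this proxy,
    # so raise the default 1m request body limit.
    client_max_body_size 512m;

    location / {
        # Placeholder: the Zerotier-assigned address and port
        # of the machine running OwnCloud.
        proxy_pass http://<zerotier-ip>:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```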
I wanted to make sure this setup was as secure as possible, so I went a little further. I set up 2FA OTP, which is an "App" available for free in the OwnCloud Market. I also installed another "App" from the Market called "Brute-Force Protection". I spent some time hardening the SSL setup: I disabled weak ciphers and created a CAA record in Route 53 so that only certificates issued by Let's Encrypt can be used for my domain. This earned my site an A rating from ssllabs.com. Finally, even though backup and restore is considerably more involved for OwnCloud when encryption at rest is enabled, I turned on encryption at rest for the server filesystem as well.
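The CAA record itself is a single line of zone data. In standard zone-file syntax (RFC 8659) it looks like the following; Route 53's console takes just the value portion, and the TTL here is illustrative:

```
d4v3y0rk.com.  300  IN  CAA  0 issue "letsencrypt.org"
```

With this in place, any certificate authority other than Let's Encrypt is instructed to refuse to issue certificates for the domain.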
This was a fun little project that only took me about 2 hours to complete, and I am pretty happy with it so far. I hope OwnCloud will eventually improve the backup and recovery capabilities for the encryption at rest functionality. Either enabling 2FA OTP or disabling weak ciphers broke the iOS client's ability to connect, but I can still access my files from Safari on iOS, so I am not worried about the app not working; I deleted it. I also had to generate an app password for the desktop client after enabling 2FA OTP. Helpfully, the Desktop Sync app suggests this when it fails to authenticate.