Automating Node App Deployments with Bash Scripts

Part 1 of 2 on how I automate setting up a server and running Node applications using Bash scripts and GitHub Actions.

When deploying side projects, including my latest one, FeedMirror, I run through the same list of actions to go from code running on my local machine to code running on a DigitalOcean Droplet.
 
These steps are:
  1. Create a new Droplet
  2. Install Node, Nginx and PM2
  3. SSH into the Droplet
  4. Clone the project’s repository and install its dependencies
  5. Start the app with PM2 and check that it’s working with pm2 list
  6. Configure Nginx to point to the app
  7. Configure a domain to point to the Droplet’s IP
  8. Check the domain is pointing to the app
 
After doing this several times, it occurred to me that I should probably try automating parts of it, as it gets tedious and takes up valuable time. It gets worse when I want to deploy changes to the API and, if I haven’t got GitHub Actions running, have to repeat steps 2 - 4 time and time again.
 
So whilst building FeedMirror, I had a go at automating a few of the tasks with Bash scripts and GitHub Actions.
This article will cover the Bash scripts that I use, whilst part two will cover how I use GitHub Actions to automate deploying changes to the apps.

Summary of the Scripts

Setup
The setup script did four things: install Node, install PM2, install Nginx, and configure the firewall to allow Nginx HTTP.
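As a rough sketch of what that looks like, assuming an Ubuntu Droplet with ufw, and Node installed from NodeSource (the Node version and package source here are assumptions, not taken from the original script):

```bash
#!/bin/bash
# Minimal sketch of a setup script for an Ubuntu Droplet.
# The Node version and package source below are assumptions.

# Install Node via the NodeSource setup script
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs

# Install PM2 globally so it can manage the Node app
sudo npm install -g pm2

# Install Nginx and open the firewall for HTTP traffic
sudo apt-get install -y nginx
sudo ufw allow 'Nginx HTTP'
```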
Nginx
This script took arguments passed in from the terminal to create a server block in the sites-available directory, link the file to sites-enabled, and reload the Nginx configuration.
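A minimal sketch of that kind of script, assuming the two arguments are the domain name and the port the Node app listens on (both are placeholders, not the original argument names):

```bash
#!/bin/bash
# Sketch of an Nginx script: $1 = domain, $2 = app port (assumed argument order).
DOMAIN=$1
PORT=$2

# Create a server block in sites-available that proxies to the Node app
cat <<EOF | sudo tee /etc/nginx/sites-available/$DOMAIN
server {
    listen 80;
    server_name $DOMAIN;

    location / {
        proxy_pass http://localhost:$PORT;
        proxy_http_version 1.1;
        proxy_set_header Host \$host;
        proxy_set_header Upgrade \$http_upgrade;
        proxy_set_header Connection 'upgrade';
    }
}
EOF

# Link the file into sites-enabled and reload the Nginx configuration
sudo ln -s /etc/nginx/sites-available/$DOMAIN /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```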
Gitaccess
Like the Nginx script, this took two arguments from the terminal: the user’s GitHub username and an access token. These allowed the script to create a new SSH key and add it to the user’s GitHub account.
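A sketch of how that could work, assuming the token has permission to add SSH keys and using GitHub’s REST API endpoint for public keys:

```bash
#!/bin/bash
# Sketch of a gitaccess script: $1 = GitHub username, $2 = personal access token.
USERNAME=$1
TOKEN=$2

# Generate a new SSH key with no passphrase
ssh-keygen -t ed25519 -C "$USERNAME@droplet" -f ~/.ssh/id_ed25519 -N ""

# Add the public key to the GitHub account via the REST API
curl -s -X POST https://api.github.com/user/keys \
  -u "$USERNAME:$TOKEN" \
  -d "{\"title\": \"droplet\", \"key\": \"$(cat ~/.ssh/id_ed25519.pub)\"}"
```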
Deploy
With everything set up, the last script cloned the repository from GitHub, installed its dependencies with npm install, and started it with pm2 start app.js.
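A minimal sketch, assuming the repository URL is passed in as an argument and the entry point is app.js:

```bash
#!/bin/bash
# Sketch of a deploy script: $1 = repository URL (assumed argument).
REPO=$1
APP_DIR=$(basename "$REPO" .git)

# Clone the project and install its dependencies
git clone "$REPO"
cd "$APP_DIR" || exit 1
npm install

# Start the app with PM2
pm2 start app.js
```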

Conclusion

These scripts did pretty well at automating steps 2 through 5, and by using DigitalOcean’s User Data parameter when creating a Droplet, I could host them elsewhere and have them run on first setup. They helped to save time when setting up a Droplet for the first time, but still meant I had to SSH into the Droplet each time I wanted to deploy a change to the API. For that, I use GitHub Actions, which I’ll cover in my next article.
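For reference, the User Data field just takes a script that runs on first boot, so it can be as simple as pulling the setup script down and running it (the URL below is a placeholder, not the real bucket):

```bash
#!/bin/bash
# Example User Data for a new Droplet; runs once on first boot.
# The URL is a placeholder for wherever the setup script is hosted.
curl -fsSL https://example-bucket.s3.amazonaws.com/setup.sh | bash
```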
 
I’ve included a link to the repository below with these scripts for anyone to use. Simply host them online somewhere (I host them in an S3 bucket) and call them using curl on your server.
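For example, a script that takes arguments (like the Nginx one) can be fetched and run in one line; the URL and arguments here are placeholders:

```bash
# Fetch a hosted script and run it, passing arguments through to bash
curl -fsSL https://example-bucket.s3.amazonaws.com/nginx.sh | bash -s -- example.com 3000
```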
 
Thanks for reading! 🙂