You can't talk about hashtag JAMstack without mentioning Netlify. Just like every other decision in our process, we wanted to take advantage of modern practices - without paying somebody else for their service. So... we decided to build our own Netlify.
Version Controlled Content
Since we're serving hundreds of sites, we had to think carefully about what kind of server infrastructure we'd need. In our current iteration, we have sites spread across more than 10 servers. The process of making updates (especially global updates) can sometimes feel a bit fragile. Updating global content can mean manually pushing changes to each server without hogging resources and slowing our sites down.
With that pain point in mind, we knew we wanted to automate as much as possible and limit duplication of assets. One key decision that would help is focusing our efforts on static sites. Since most of our sites are static, we decided to limit the number of unique servers and build a load balancer backed by a cluster of site servers.
Multiple Servers and Deploying Code
Thankfully, Laravel Forge makes it fairly trivial to set up any number of servers behind a load balancer. We followed this incredibly helpful guide to set up not only our site servers, but also our application, database, and queue servers.
Once the architecture was set up, we needed a way to handle deploying content changes from the CMS to the site servers. Thankfully, Forge provides a way to connect a site to a Bitbucket repository and trigger deploys as needed. In other words - push to deploy. AKA - Netlify.
Our current site setup involves:
- Create the site in our CMS
- Initialize a local Bitbucket repository and push the repo to Bitbucket
- Create a site on the load balancer provisioned by Forge
- On the load-balancer, assign which servers should be used to deliver the site
- Create a site on each of the site servers
- Install the Bitbucket repo for the site on each of the site servers
- Deploy the site to each site server
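The per-server steps above can be sketched as calls against the Forge API. This is a dry-run sketch, not our actual setup code: the server IDs, domain, and repo name are placeholders, the site ID would really come from the create-site response, and the endpoint paths follow Forge's API documentation but should be verified against the current version.

```shell
#!/usr/bin/env sh
# Sketch: drive the site-setup steps through the Forge API for each
# site server. Requests are echoed rather than executed so the flow is
# easy to inspect; swap echo for a real curl invocation in production.

FORGE_API="https://forge.laravel.com/api/v1"
SITE_SERVERS="101 102 103"         # hypothetical site-server IDs
DOMAIN="example-client.com"        # hypothetical site domain
REPO="our-team/example-client"     # hypothetical Bitbucket repo
SITE_ID="SITE_ID_FROM_RESPONSE"    # parsed from the create-site response

# Build the curl command for a Forge POST request.
forge_post() {
  path="$1"; payload="$2"
  echo "curl -s -X POST '$FORGE_API/$path'" \
       "-H 'Authorization: Bearer \$FORGE_TOKEN'" \
       "-H 'Content-Type: application/json'" \
       "-d '$payload'"
}

for server in $SITE_SERVERS; do
  # Create the site on this server
  forge_post "servers/$server/sites" "{\"domain\":\"$DOMAIN\",\"project_type\":\"html\"}"
  # Attach the site's Bitbucket repository
  forge_post "servers/$server/sites/$SITE_ID/git" "{\"provider\":\"bitbucket\",\"repository\":\"$REPO\"}"
  # Run the first deploy
  forge_post "servers/$server/sites/$SITE_ID/deployment/deploy" "{}"
done
```

Looping over server IDs like this is what lets the CMS treat "add a site" as a single action, even though it fans out to every machine behind the load balancer.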
Of course, this all happens automatically with a few button clicks via our CMS UI. Thanks to Forge's API, we also make some updates to the NGINX script and deploy script for each site.
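As one example of those per-site tweaks, updating a site's deploy script through the API might look like the sketch below. The IDs and the script body are placeholders, and the `/deployment/script` route follows Forge's API docs but should be double-checked; the request is echoed rather than sent so the payload can be inspected.

```shell
#!/usr/bin/env sh
# Sketch: customize a site's deploy script via the Forge API after the
# site has been created. For a static site, the deploy script can be as
# simple as pulling the latest build from the repo.

FORGE_API="https://forge.laravel.com/api/v1"
SERVER_ID="101"   # hypothetical server ID
SITE_ID="5550"    # hypothetical site ID

# Minimal deploy script for a static site (\n is a JSON newline escape).
DEPLOY_SCRIPT='cd /home/forge/example-client.com\ngit pull origin main'

# Build (rather than send) the update request.
update_deploy_script() {
  echo "curl -s -X PUT '$FORGE_API/servers/$SERVER_ID/sites/$SITE_ID/deployment/script'" \
       "-H 'Authorization: Bearer \$FORGE_TOKEN'" \
       "-H 'Content-Type: application/json'" \
       "-d '{\"content\": \"$DEPLOY_SCRIPT\"}'"
}

update_deploy_script
```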
Deploying Content Changes
Since each site has its own Bitbucket repo, deploying content changes is fairly straightforward. The process for "Publishing" is:
- Build the production content
- Create a commit on the build server
- Push the commit to Bitbucket
- Tell Forge to deploy the new content to each site server
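The publish steps above could be sketched as a single build-server script. Again, this is an illustration rather than our actual script: the build command is CMS-specific, the server and site IDs are placeholders, and each step is echoed so the flow can be inspected without side effects.

```shell
#!/usr/bin/env sh
# Sketch: the "Publish" flow from the build server's point of view.
# Echoes each command instead of running it; replace echo with the real
# commands (and real Forge IDs) to make it live.

FORGE_API="https://forge.laravel.com/api/v1"
SITE_SERVERS="101 102 103"   # hypothetical Forge server IDs
SITE_ID="5550"               # hypothetical Forge site ID

# Helper so the deploy URL for each server can be checked in isolation.
deploy_url() {
  echo "$FORGE_API/servers/$1/sites/$2/deployment/deploy"
}

publish() {
  # 1. Build the production content (CMS-specific, placeholder only)
  echo "+ build production content"
  # 2-3. Commit the build and push it to Bitbucket
  echo "+ git add -A && git commit -m 'Publish' && git push origin main"
  # 4. Tell Forge to deploy the new content on every site server
  for server in $SITE_SERVERS; do
    echo "+ curl -X POST $(deploy_url "$server" "$SITE_ID") -H 'Authorization: Bearer \$FORGE_TOKEN'"
  done
}

publish
```

Because the deploy fan-out is just a loop over server IDs, adding or removing a site server only changes the list, not the publish process itself.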
Since we have a build -> commit -> deploy process, we also have the ability to "preview" sites. Preview sites are always up-to-date with the latest state of the site in the CMS, letting our clients review any changes before they go live. Once the content is approved, we run the "Publish" steps to make those changes live.
We've often used Forge to provision servers, but we didn't always tap into Forge for managing sites. Our first pass at deploying sites was actually a set of our own scripts that would sync site files from our build server to any number of site servers. While this worked, it still left us doing a lot of manual load-balancer and NGINX configuration for each site.
Forge has spent a number of years figuring these steps out. The servers Forge provisions are configured to match the way sites are added through Forge. Removing these scripts and steps from our own process means we can focus more and more on the code that matters to us.