Pizza as a service

How To Deploy a Microsite App With Docker-Compose for Multiple Customers

There is a thrill to deploying a microsite web application once you understand it. We recently launched a product at the company I work for, so we needed to deploy a microsite web application. The web application assigns an ID to every customer that will use it. And that is where the fun starts!

the pain point

So, the Product Lead reached out to me to find out the possibility of deploying the same web application for multiple customers, each with their own unique ID. The deployment means each customer uses their own environment variable file at build time; however, only one code base will be used for all of them. In clear terms: one source code repository, one production branch, different environment variables, and only one server to pay for.

This is so that maintenance of the application is a little easier, and features and updates stay uniform across the application. Again, the only variable here is that each customer will have their own environment variables. The variables include the colour scheme of each customer’s web app, a token for the API, and so on.

Well, in my head, this seems like a simple deal. However, one challenge is how to use a different environment variable file for each deployment from one source code for the microsite web application. Also, the budget to achieve this, for a start, is very humbling. Oops! This is a slightly different case from the previous app I deployed, which worked well with Elastic Beanstalk and App Runner.

the plan and solution

Since this is going to be a microsite for each customer, the first technology to employ is containerization. So, the plan is to deploy each customer’s web app as a Docker container, with a registry for the images. This can be achieved using an M-class EC2 server with 8GB RAM, a 160GB disk, and 3TB of transfer. However, there is room to increase the server size when the need arises.

After that, the next consideration is how to build each customer’s image, and then a container, without having to build each one individually. That is where docker-compose comes in. With docker-compose, I can declare each application as a service and specify the build for each one under the customer’s name. Docker-compose will also push the images to the registry if need be.
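A minimal docker-compose.yml sketch of that idea is below. The second customer, the registry address, and the ports are placeholders I made up for illustration, not the real values.

```yaml
version: "3.8"

services:
  customer109:
    build:
      context: .
      dockerfile: customer109/Dockerfile   # each customer has their own Dockerfile
    image: registry.example.com/microsite:customer109  # image tag carries the customer's name
    ports:
      - "3001:3000"
    restart: unless-stopped

  customer110:                              # hypothetical second customer
    build:
      context: .
      dockerfile: customer110/Dockerfile
    image: registry.example.com/microsite:customer110
    ports:
      - "3002:3000"
    restart: unless-stopped
```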

Once that is done, the next step is to figure out how to attach environment variables for each customer to their image. At first, attaching the environment variable file at run time seemed feasible. However, Next.js requires that the variables be available during the build of the web application.

So, rather than attach it at run time, I created a subdirectory for each customer in the root of the source code, e.g. mkdir customer109. In each subdirectory I placed a .env file containing the specific settings for that customer.
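Roughly, the layout looks like the sketch below. The second customer directory and the variable names are assumptions for illustration; the only real detail from the setup is the customer109 subdirectory and its .env file.

```text
.
├── customer109/
│   ├── .env              # this customer's build-time settings
│   └── Dockerfile
├── customer110/           # hypothetical second customer
│   ├── .env
│   └── Dockerfile
├── docker-compose.yml
└── ...                    # the shared Next.js code base

# customer109/.env (placeholder values)
NEXT_PUBLIC_PRIMARY_COLOR=#0044cc
NEXT_PUBLIC_API_TOKEN=replace-with-the-customer-token
```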

the deployment

Once I have the environment file in the subdirectory, I add a COPY step in each customer’s Dockerfile. That copies the environment file from the subdirectory into the docker image’s WORKDIR. I think this is more efficient than explicitly stating the environment variables in each customer’s Dockerfile, since that would expose the variables to the public when I push the Dockerfile to the GitHub repository. I do not want that. It also keeps the Dockerfile clean and concise for anyone to read. I personally do not like intimidating configuration files.
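Here is a sketch of what such a Dockerfile could look like, assuming a fairly standard Next.js build; the base image, package manager, and every step other than the COPY of the .env file are my assumptions, not the exact file.

```dockerfile
FROM node:18-alpine

WORKDIR /app

# Shared code base
COPY package*.json ./
RUN npm ci
COPY . .
# (a .dockerignore entry can keep the other customers' folders out of the image)

# The COPY step described above: pull this customer's .env into the
# image's WORKDIR so Next.js can read the variables at build time
COPY customer109/.env .env

RUN npm run build

EXPOSE 3000
CMD ["npm", "start"]
```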

Meanwhile, I add each customer’s subdirectory .env file to the .gitignore file so I do not commit it to the project’s GitHub repository. That keeps it a little more secure.
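One way to express that in the project’s .gitignore, assuming the naming convention above:

```gitignore
# keep every customer's environment file out of the repository
customer*/.env
```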

Now, to get the application up and running for each customer, all I need to do is run docker-compose up -d, and docker-compose will build each docker image, tag it with the name of the customer, and run the container. That deploys the microsite web application for each customer.
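In practice that is just the commands below; the --build flag and the push step are how I would expect it to be run based on the description above, not a verbatim transcript.

```bash
# build each image, tag it per customer, and start the containers
docker-compose up -d --build

# optionally push the tagged images to the registry
docker-compose push
```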

the automation

As the last step, I need to automate the whole deployment process. GitHub Actions comes to the rescue. With GitHub Actions, I can connect to the server so it runs the deployment each time there is a change in the project repository, all without any intervention from inside the server.
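A minimal sketch of such a workflow is below. The secret names, branch, server path, and the SSH action are placeholders for illustration, not the exact workflow used here.

```yaml
name: deploy-microsites

on:
  push:
    branches: [production]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Run the deployment on the server over SSH
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USER }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            cd /opt/microsite        # placeholder path to the project on the server
            git pull origin production
            docker-compose up -d --build
```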

conclusion

This method of deployment is still a little too manual. I still have to create each directory manually on the server, and also manually add each customer’s web application to the docker-compose file. However, I will keep improving the architecture and automating as much as I can.

Also, the security of the application is a little concern to me, even though I follow all the standard security practices: firewall rules, no ports opened directly since I use a reverse proxy, and the infrastructure configured with Terraform so I can enjoy its immutability. In addition, I use an SSL certificate, and I take regular backups and snapshots of the server. There is also a plan to add a second server to load balance the traffic later on.
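For the Terraform part, a minimal sketch of the kind of setup described could look like the block below; the region, AMI, instance type, and open ports are assumptions, not the actual configuration.

```hcl
provider "aws" {
  region = "us-east-1" # placeholder region
}

# Firewall: only the reverse proxy's HTTPS port is reachable from outside
resource "aws_security_group" "microsite" {
  name = "microsite-sg"

  ingress {
    from_port   = 443
    to_port     = 443
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

resource "aws_instance" "microsite" {
  ami                    = "ami-0abcdef1234567890" # placeholder AMI
  instance_type          = "m5.large"              # an 8GB-RAM M-class instance
  vpc_security_group_ids = [aws_security_group.microsite.id]

  root_block_device {
    volume_size = 160 # GB, matching the disk size mentioned above
  }

  tags = {
    Name = "microsite-host"
  }
}
```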

There is the thought of using a Bash script to automate some of the steps, and of finding a better service to meet this use case. Nonetheless, I have already delivered the web application for the first set of customers. The next set will be added to the compose file in a few days’ time.

Let me know if you have done something like this and how you went about the deployment, automation, and security of the web application. Till then, keep enjoying the potential that cloud technology brings.

