How to Configure Environment Variables in Docker Deployed to EC2

When deploying applications to the cloud, Docker and AWS EC2 are a powerful combination. This article will guide you through configuring environment variables in Docker, specifically for deployments on EC2. Environment variables are crucial for managing configuration settings, ensuring security, and enabling easy modifications without changing code.

Understanding Environment Variables

Environment variables are key-value pairs used by applications to configure settings. They can store sensitive information such as API keys, database credentials, and other configuration parameters. Proper management of these variables enhances security and flexibility.
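For example, an application typically reads these values from its process environment at startup instead of hardcoding them. A minimal Python sketch (the DB_* names are the placeholder values used throughout this article):

python

import os

# Read configuration from the environment rather than hardcoding it
db_host = os.environ.get("DB_HOST", "localhost")  # optional, with a default
db_user = os.environ.get("DB_USER", "root")
db_pass = os.environ["DB_PASS"]                   # required; raises KeyError if missing

print(f"Connecting to {db_host} as {db_user}")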

Setting Up Docker on EC2

Before diving into environment variables, let’s ensure Docker is set up on your EC2 instance.

  1. Launch an EC2 Instance:
    • Go to the AWS Management Console.
    • Navigate to EC2 and click “Launch Instance”.
    • Select an Amazon Machine Image (AMI). Amazon Linux 2 is a good choice for its compatibility with Docker.
    • Choose an instance type (e.g., t2.micro for testing).
    • Configure instance details and add storage as needed.
    • Review and launch the instance.
  2. Connect to Your EC2 Instance:
    • Use SSH to connect to your EC2 instance. For Linux or macOS:

      bash

      ssh -i /path/to/your-key.pem ec2-user@your-ec2-instance-public-ip
    • For Windows, use an SSH client like PuTTY.
  3. Install Docker:
    • Update the package list:

      bash

      sudo yum update -y
    • Install Docker:

      bash

      sudo amazon-linux-extras install docker
    • Start the Docker service:

      bash

      sudo service docker start
    • Add your user to the Docker group to run Docker commands without sudo:

      bash

      sudo usermod -aG docker ec2-user
    • Log out and log back in to apply the group changes.

Configuring Environment Variables in Docker

There are several ways to configure environment variables for Docker containers. Here are the most common methods:

1. Using Docker CLI

When running a Docker container, you can pass environment variables directly using the -e flag.

bash

docker run -e ENV_VAR_NAME=value your_image

Example:

bash

docker run -e DB_HOST=localhost -e DB_USER=root -e DB_PASS=secret my_app_image

2. Using an Environment File

You can store environment variables in a file and pass the file to Docker using the --env-file flag.

  1. Create a file named .env:

    plaintext

    DB_HOST=localhost
    DB_USER=root
    DB_PASS=secret
  2. Run the Docker container with the environment file:

    bash

    docker run --env-file .env my_app_image
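
The same file is also handy outside Docker: during local development, a library such as python-dotenv can load it straight into the process environment, so the application behaves the same with or without the --env-file flag. A minimal sketch, assuming the python-dotenv package is installed:

python

import os
from dotenv import load_dotenv

# Load key-value pairs from .env into os.environ (existing variables are not overwritten)
load_dotenv()

db_host = os.environ.get("DB_HOST")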

3. Using Docker Compose

Docker Compose allows you to define and run multi-container Docker applications. You can specify environment variables in the docker-compose.yml file.

Example docker-compose.yml:

yaml

version: '3'
services:
  db:
    image: mysql:5.7
    environment:
      - MYSQL_ROOT_PASSWORD=secret
  web:
    image: my_app_image
    environment:
      - DB_HOST=db
      - DB_USER=root
      - DB_PASS=secret

Storing Environment Variables Securely

Using AWS Secrets Manager

AWS Secrets Manager helps you manage and retrieve database credentials, API keys, and other secrets throughout their lifecycle.

  1. Store a Secret:
    • Go to the AWS Management Console.
    • Navigate to Secrets Manager and click “Store a new secret”.
    • Select the secret type (e.g., RDS database credentials).
    • Enter the secret information and configure the secret name and description.
    • Choose encryption settings and save the secret.
  2. Retrieve a Secret:
    • Use the AWS SDK in your application to retrieve the secret.
    • Example code using the AWS SDK for Python (boto3):

      python

      import boto3
      import json

      def get_secret():
          secret_name = "my_secret_name"
          region_name = "us-west-2"

          # Create a Secrets Manager client and fetch the stored secret
          client = boto3.client("secretsmanager", region_name=region_name)
          response = client.get_secret_value(SecretId=secret_name)

          # SecretString holds the secret as a JSON document of key-value pairs
          secret = json.loads(response["SecretString"])
          return secret

Using AWS Systems Manager Parameter Store

AWS Systems Manager Parameter Store provides secure, hierarchical storage for configuration data management and secrets management.

  1. Store a Parameter:
    • Go to the AWS Management Console.
    • Navigate to Systems Manager and click “Parameter Store”.
    • Click “Create parameter”.
    • Enter the parameter name, description, and value. Choose the parameter type (String, StringList, or SecureString).
    • Save the parameter.
  2. Retrieve a Parameter:
    • Use the AWS SDK in your application to retrieve the parameter.
    • Example code using the AWS SDK for Python (boto3):

      python

      import boto3

      def get_parameter(name):
          # Create an SSM client and fetch the parameter, decrypting SecureString values
          client = boto3.client("ssm", region_name="us-west-2")
          response = client.get_parameter(Name=name, WithDecryption=True)
          return response["Parameter"]["Value"]
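
Parameters are often organized under a shared path such as /myapp/prod/, which makes it possible to pull an entire set of settings in one call. A minimal sketch using the get_parameters_by_path API (the path and output file name are assumptions):

python

import boto3

def export_parameters(path="/myapp/prod/", out_file=".env"):
    # Fetch every parameter under the path, decrypting SecureString values
    client = boto3.client("ssm", region_name="us-west-2")
    paginator = client.get_paginator("get_parameters_by_path")
    with open(out_file, "w") as f:
        for page in paginator.paginate(Path=path, WithDecryption=True):
            for param in page["Parameters"]:
                # Use the last path segment as the environment variable name
                name = param["Name"].split("/")[-1]
                f.write(f"{name}={param['Value']}\n")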

Automating Docker Deployment with Environment Variables on EC2

To streamline the deployment process, consider using a CI/CD pipeline. Here’s a basic example using GitHub Actions:

1. Create a Dockerfile

Your Dockerfile should define your application’s environment.

Example Dockerfile:

Dockerfile

FROM python:3.8-slim

WORKDIR /app

COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

COPY . .

CMD ["python", "app.py"]
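
The CMD above starts app.py, which is expected to read its settings from the environment variables supplied at run time. A minimal sketch of such an entry point (the variable names match the earlier examples; the actual application logic is omitted):

python

import os
import sys

# Fail fast with a clear message if a required setting is missing
required = ["DB_HOST", "DB_USER", "DB_PASS"]
missing = [name for name in required if name not in os.environ]
if missing:
    sys.exit(f"Missing required environment variables: {', '.join(missing)}")

print(f"Starting application, connecting to database at {os.environ['DB_HOST']}")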

2. Define GitHub Actions Workflow

Create a .github/workflows/deploy.yml file in your repository.

Example deploy.yml:

yaml

name: Deploy to EC2

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1

      - name: Log in to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v1

      - name: Build, tag, and push Docker image
        id: build-image
        run: |
          docker build -t my_app_image .
          docker tag my_app_image:latest ${{ steps.login-ecr.outputs.registry }}/my_app_image:latest
          docker push ${{ steps.login-ecr.outputs.registry }}/my_app_image:latest

      - name: Deploy to EC2
        run: |
          ssh -o StrictHostKeyChecking=no -i /path/to/your-key.pem ec2-user@your-ec2-instance-public-ip << EOF
          docker pull ${{ steps.login-ecr.outputs.registry }}/my_app_image:latest
          docker run -d -e ENV_VAR_NAME=value ${{ steps.login-ecr.outputs.registry }}/my_app_image:latest
          EOF

Final Thoughts

Configuring environment variables in Docker deployed to EC2 is an essential aspect of modern application deployment. By securely managing these variables and automating the deployment process, you can enhance your application’s security, flexibility, and scalability.

Frequently Asked Questions (FAQs)

Q1: Why are environment variables important in Docker? Environment variables allow you to configure your application without hardcoding sensitive information into your codebase. This enhances security and makes it easier to manage configuration changes.

Q2: Can I use AWS Secrets Manager and Parameter Store together? Yes, you can use both AWS Secrets Manager and Parameter Store together, depending on your use case. Secrets Manager is ideal for storing and rotating sensitive information, while Parameter Store is great for configuration data that may not need frequent rotation.

Q3: How can I ensure my environment variables are secure? To ensure security, avoid hardcoding sensitive information. Use tools like AWS Secrets Manager and Parameter Store to manage and retrieve secrets securely.

Q4: What is the best practice for managing environment variables in a CI/CD pipeline? Store sensitive information in a secure environment like AWS Secrets Manager or Parameter Store. Use environment files or secret managers to inject variables into your CI/CD pipeline during deployment.

Q5: Can I pass environment variables to Docker containers at runtime? Yes, you can pass environment variables at runtime using the -e flag in the docker run command or by specifying an environment file with the --env-file flag.

By following these best practices and leveraging AWS tools, you can effectively manage environment variables in your Docker deployments on EC2, ensuring a secure and streamlined workflow.

