Saad Fazal

Jul 05, 2024

Exploring DevOps: A New Adventure, Again? 🤔


Introduction

Hey everyone! As you know, I've always been a data guy, diving deep into the realms of data science. But recently, I've decided to give DevOps a try. Why, you ask? Well, thanks to the amazing DevOps community and a special shoutout to "kubeden" for enlightening me on how fascinating this field can be, I thought, why not explore it? So, I decided to take a detour from my data journey and spend some time in the DevOps world.

My Background

I’ve been a developer for three years now, with some experience in cloud computing and AWS. So, I figured learning DevOps might be a bit easier given my background. Here’s a rundown of what I’ve learned in the past four days during my free time:

AWS

IAM

  1. Users: Managing user accounts and permissions.
  2. Groups: Organizing users into groups for easier management.
  3. Policies (Permissions): Defining and assigning permissions to users and groups.
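Policies are what tie users and groups to actual permissions. As a sketch, here is a minimal IAM policy document granting read-only access to a single S3 bucket (the bucket name is a placeholder); it could be attached to a user or, better, to a group:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3ReadOnly",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/*"
      ]
    }
  ]
}
```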

Amazon Elastic Container Service (ECS) and ECR

  1. Deploying a Docker Container:
    • Create Cluster: Setting up a new ECS cluster.
    • Service:
      • Tasks: Running individual containers from a task definition.
      • Load Balancer: Distributing traffic among containers.
      • Health Checks: Monitoring container health.
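To make that concrete, here is a rough sketch of an ECS task definition for a containerized Node.js app pulled from ECR, including a health check (the account ID, image name, and port are all made up for illustration):

```json
{
  "family": "my-node-app",
  "containerDefinitions": [
    {
      "name": "web",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-node-app:latest",
      "portMappings": [{ "containerPort": 3000 }],
      "healthCheck": {
        "command": ["CMD-SHELL", "curl -f http://localhost:3000/health || exit 1"],
        "interval": 30,
        "retries": 3
      },
      "essential": true
    }
  ],
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512"
}
```

An ECS service then keeps the desired number of these tasks running behind the load balancer.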

Elastic Beanstalk

Deploying and managing applications without worrying about the underlying infrastructure.

Docker

Basics

  1. Installation of Docker CLI and Desktop: Getting Docker up and running on my machine.
  2. Understanding Images vs. Containers: Learning the difference between Docker images and containers.
  3. Running Ubuntu Image in Container: Starting a container with Ubuntu.
  4. Multiple Containers: Managing multiple containers simultaneously.
  5. Port Mappings: Mapping container ports to host ports.
  6. Environment Variables: Setting environment variables for containers.
  7. Dockerization of Node.js Application:
    • Dockerfile: Creating a Dockerfile for a Node.js app.
    • Caching Layers: Using caching to speed up builds.
    • Publishing to Hub: Pushing images to Docker Hub.
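A minimal Dockerfile along these lines, copying the package manifests before the rest of the source so the dependency layer stays cached between builds (the entry point and port are placeholders for whatever the app actually uses):

```dockerfile
FROM node:20-alpine
WORKDIR /app

# Copy manifests first so `npm ci` is cached unless dependencies change
COPY package*.json ./
RUN npm ci --omit=dev

# Copy the rest of the source; only this layer rebuilds on code changes
COPY . .

EXPOSE 3000
CMD ["node", "index.js"]
```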

Advanced

  1. Docker Compose:
    • Services: Defining multi-container applications.
    • Port Mapping: Configuring port mappings for services.
    • Env Variables: Setting environment variables for services.
  2. Docker Networking:
    • Bridge: Default network driver.
    • Host: Using the host’s networking stack.
  3. Volume Mounting: Persisting data using volumes.
  4. Efficient Caching in Layers: Optimizing Dockerfile for caching.
  5. Docker Multi-Stage Builds: Using multi-stage builds to reduce image size.
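The Compose pieces above fit together in one file; here is a hypothetical two-service setup (a web app plus Postgres) showing services, port mapping, environment variables, and a named volume for persistence:

```yaml
# docker-compose.yml — a hypothetical two-service setup
services:
  web:
    build: .
    ports:
      - "3000:3000"        # host:container port mapping
    environment:
      - DATABASE_URL=postgres://app:secret@db:5432/appdb
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=secret
      - POSTGRES_DB=appdb
    volumes:
      - dbdata:/var/lib/postgresql/data   # named volume persists data
volumes:
  dbdata:
```

Both services share Compose's default bridge network, which is why `web` can reach the database simply as `db`.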

Nginx

Setting Up

  1. Launching an EC2 Instance:
    • Create and configure a virtual machine using EC2: Choosing an instance type and region.
    • Assign a static IP: Ensuring consistent access.
    • Set up security groups: Allowing HTTP and HTTPS traffic.
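The same launch steps can be scripted with the AWS CLI; a rough sketch, with the AMI ID, key name, and IDs all as placeholders:

```
# Launch a t2.micro instance (AMI and key name are hypothetical)
aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t2.micro \
  --key-name my-key --security-group-ids sg-xxxxxxxx

# Allocate an Elastic IP and attach it for a stable address
aws ec2 allocate-address
aws ec2 associate-address --instance-id i-xxxxxxxx --allocation-id eipalloc-xxxxxxxx

# Open HTTP and HTTPS in the security group
aws ec2 authorize-security-group-ingress --group-id sg-xxxxxxxx \
  --protocol tcp --port 80 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-id sg-xxxxxxxx \
  --protocol tcp --port 443 --cidr 0.0.0.0/0
```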

Configuration

  1. Accessing the EC2 Instance: Connecting via SSH.
  2. Updating and Installing Necessary Packages: Keeping everything up-to-date.
  3. Cloning the Project Repository: Downloading my Node.js app.
  4. Installing Project Dependencies: Using npm install.
  5. Running the Node.js Application: Managing with pm2.
  6. Setting Up a Domain: Registering and pointing a domain to my Elastic IP.
  7. Configuring Nginx: Proxying requests to the Node.js app.
  8. Setting Up SSL with Let's Encrypt: Using Certbot for SSL certificates.
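The Nginx side of steps 6–8 comes down to a server block that proxies requests to the Node.js process; a sketch, assuming the app listens on port 3000 and `example.com` stands in for the real domain:

```nginx
# /etc/nginx/sites-available/example.com — domain and port are placeholders
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://localhost:3000;   # forward to the Node.js app
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_cache_bypass $http_upgrade;
    }
}
```

Running `certbot --nginx -d example.com` afterwards obtains the Let's Encrypt certificate and rewrites this block to serve HTTPS.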

Kafka

Key Concepts

  1. High Throughput, Low Storage Overhead: Optimized for streaming large volumes of data rather than long-term storage.
  2. Components:
    • Producers and Consumers: Sending and receiving messages.
    • Topics and Partitions: Organizing messages.
    • Consumer Groups: Managing multiple consumers.

Models

  1. Queue and Pub/Sub: Handling different messaging patterns.
  2. ZooKeeper: Coordinating Kafka brokers and cluster metadata.
  3. Admin, Producers, and Consumers: Setting up and using Kafka.
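Kafka ships command-line tools in its `bin/` directory that make these concepts concrete; a rough sketch against a local broker (the topic and group names are made up):

```
# Create a topic with 3 partitions
kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic orders --partitions 3 --replication-factor 1

# Produce messages from stdin
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic orders

# Consume them as part of a consumer group (pub/sub vs. queue behavior
# depends on how many consumers share the group)
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic orders --group billing --from-beginning
```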

Serverless

Overview

  1. No Server Management: Focusing on code, not servers.
  2. Event-Driven Execution: Functions triggered by events.
  3. Automatic Scaling: Scaling based on load.
  4. Pay-per-Invocation: Billing based on function usage.

Practical Example

  1. Creating a Lambda Function: Deploying a function to AWS Lambda.
  2. Trigger Setup: Using API Gateway to invoke the function.
  3. Testing: Verifying with a browser and Postman.
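The Lambda-plus-API-Gateway pattern boils down to a handler like the following; it can be invoked locally with a hand-built event, so no AWS account is needed to try the logic (the event shape assumes an API Gateway proxy integration, and the greeting is just an illustrative payload):

```python
import json

def lambda_handler(event, context):
    """Return a JSON greeting; `event` follows the API Gateway proxy format."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation with a fake event, as API Gateway would send it
response = lambda_handler({"queryStringParameters": {"name": "Saad"}}, None)
print(response["body"])  # {"message": "Hello, Saad!"}
```

Deployed behind API Gateway, hitting the endpoint with `?name=Saad` in a browser or Postman returns the same JSON body.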

What's Next: My Learning Plan for the Next 4 Days

In the next four days, I plan to dive deeper into the following areas:

  1. More AWS Services: Expanding my knowledge of various AWS services beyond the basics.
  2. Azure: Getting familiar with Microsoft's cloud platform and its unique features.
  3. Terraform: Learning infrastructure as code to manage cloud resources efficiently.
  4. Ansible: Exploring configuration management and automation.
  5. CI/CD: Strengthening my understanding of continuous integration and continuous deployment practices.
  6. GitHub Workflows: Refining my skills in creating and managing workflows on GitHub.

Conclusion

These days, my free time goes into learning DevOps. I'll keep sharing what I learn, and in an upcoming post I'll also share the resources I've been using.

For the DevOps community: do let me know your thoughts. What should I focus on more in this DevOps realm?

Stay curious, keep learning, and happy coding!

Latest Comments

    Kuberdenis (Algorithm Adept, Level 26) · 2 months ago

    This is great! I always give people interested in learning DevOps the following task:

    - deploy kubernetes cluster, container registry, and a SQL database in any cloud provider with terraform
    - install argocd, nginx ingress controller, and cert manager on it (if you have a domain)
    - push some dummy data to the database
    - create a simple express.js app for the frontend
    - create a simple node backend api that pulls data from the db
    - containerise the applications with a pipeline (github actions) and push to the container registry
    - create a pipeline to deploy automatically to argocd (github action)

    Kuberdenis (Algorithm Adept, Level 26) · 2 months ago

    hmm.. I see a few bugs in the platform (can’t write multi-line comments). I’ll fix it tomorrow and also create a blog post about the topics mentioned on top :))

    Saad Fazal (Rookie, Level 3, Author) · 2 months ago

    Yes, I was going to mention this because when I tried to edit this blog, it kept showing me lines instead of numbers and bullet points. I thought I had forgotten how to write in Markdown language. 😂

    One more thing: in Post Settings, if the image is more than 2 MB, it gives an error page that says "413 Request Entity Too Large nginx/1.18.0 (Ubuntu)". You could add some error handling to show a proper notification about the image size limit.

    Kuberdenis (Algorithm Adept, Level 26) · 2 months ago

    You are absolutely right about the image error handling.

    I will get to it shortly.

    Saad Fazal (Rookie, Level 3, Author) · 2 months ago

    I have completed most of the things mentioned. Currently, I am focusing on DevOps in my free time and preparing for an entry-level job position. My goal is to master and practice all the skills required for an intermediate-level DevOps engineer. This way, when I go for interviews, I’ll be well-prepared at an intermediate level but aiming for an entry-level position.


© 2024 Geeklore - DEV Community RPG
