✨ Introduction
Over the past two months, I took a deep dive into the world of CI/CD pipelines, application security, and cloud fundamentals. My goal was to simulate a real-world enterprise environment using tools like Jenkins, Docker, Tomcat, and SonarQube—all while improving my grasp of DevOps principles. The cherry on top: I successfully cleared the AWS Cloud Practitioner certification.
This blog documents my learnings, setups, and the hands-on challenges I tackled. Whether you're starting your DevOps journey or looking to refine your pipeline, I hope this helps.
1. Setting Up Jenkins for Real-World CI/CD
I started by setting up a Jenkins controller and then created custom Jenkins agents using Docker (see the Dockerfile sketch after this list). This allowed me to:
- Isolate build environments
- Control security boundaries
- Reuse Docker images across pipelines
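As an illustration, a custom agent image can start from the official inbound agent and layer build tools on top; the Maven install here is just an example of the kind of tooling you might bake in:

```dockerfile
# Start from the official Jenkins inbound (JNLP) agent image
FROM jenkins/inbound-agent:latest

# Install build tooling as root, then drop back to the jenkins user
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends maven \
    && rm -rf /var/lib/apt/lists/*
USER jenkins
```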
I structured separate pipelines for DEV and QA profiles so that each environment had its own configuration and behavior. Jenkins' flexibility made it easy to parameterize the builds and manage them per environment.
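Here's a minimal declarative pipeline sketch of that idea, assuming a Docker-based agent label and Maven profiles named after the environments (both names are illustrative):

```groovy
pipeline {
    // Run on a Docker-based agent; the label is whatever you
    // assigned to your custom agent image (placeholder here)
    agent { label 'docker-maven-agent' }

    parameters {
        // One pipeline definition, parameterized per environment
        choice(name: 'DEPLOY_ENV', choices: ['dev', 'qa'],
               description: 'Target environment')
    }

    stages {
        stage('Build') {
            steps {
                // Activate the matching Maven/Spring profile
                sh "mvn clean package -P${params.DEPLOY_ENV}"
            }
        }
        stage('Deploy') {
            steps {
                echo "Deploying with the ${params.DEPLOY_ENV} configuration"
            }
        }
    }
}
```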
2. Security & Secrets Management
One major focus was improving how environment variables and secrets are handled:
- Used the Jenkins Credentials plugin to store secrets like JWT keys and database credentials
- Passed these into Docker containers via `docker-compose` and environment variables in `JAVA_OPTS`
- Ensured no secrets were hardcoded in the source or in pipeline definitions
This was crucial for simulating a real enterprise-grade deployment pipeline.
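As a rough sketch of the pattern (the credential IDs, variable names, and compose file name are placeholders of my own), a deploy stage can bind secrets from the Jenkins credentials store and expose them to `docker-compose` as environment variables:

```groovy
stage('Deploy QA') {
    steps {
        // Bind secrets to environment variables for this block only;
        // the credential IDs below are placeholders
        withCredentials([
            string(credentialsId: 'jwt-signing-key', variable: 'JWT_SECRET'),
            usernamePassword(credentialsId: 'qa-db-creds',
                             usernameVariable: 'DB_USER',
                             passwordVariable: 'DB_PASS')
        ]) {
            // The compose file can reference ${JWT_SECRET}, ${DB_USER},
            // and ${DB_PASS}, so nothing sensitive lives in the repo
            sh 'docker-compose -f docker-compose.qa.yml up -d'
        }
    }
}
```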
3. Code Quality & Test Coverage
I integrated SonarQube with Jenkins to continuously analyze code quality. Here's what I did:
- Installed the SonarQube plugin in Jenkins
- Set up a local SonarQube server running in a Docker container
- Used the Maven goal `sonar:sonar` to trigger analysis from Jenkins
To measure test coverage, I integrated JaCoCo and ensured JUnit tests were run and reported independently; the stage below sketches how the pieces fit together.
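A condensed version of that stage, assuming the JaCoCo plugin is bound in the `pom.xml` and that `MySonarQube` matches the server name configured under Manage Jenkins (both are assumptions on my part):

```groovy
stage('Quality Analysis') {
    steps {
        // Run the unit tests first so JaCoCo writes a coverage report
        // (assumes the jacoco-maven-plugin is configured in pom.xml)
        sh 'mvn clean verify'
        // withSonarQubeEnv injects the server URL and auth token
        // configured in Jenkins; 'MySonarQube' is a placeholder name
        withSonarQubeEnv('MySonarQube') {
            sh 'mvn sonar:sonar'
        }
    }
}
```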
These tools helped me:
- Identify code smells and bugs early
- Maintain a clean, modular codebase
- Track testing metrics over time
4. Tomcat & Docker Compose Insights
I deployed my Spring Boot application using Tomcat within a Dockerized setup. Key takeaways:
- Understood how Tomcat manages servlet deployment
- Learned how to pass environment-specific properties (`application-qa.properties`, etc.) via `JAVA_OPTS`
- Used `docker-compose.yml` to streamline configuration and ensure consistent environments
This was especially helpful in managing profile-specific behavior.
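For reference, a trimmed-down compose service along these lines (the image name and port are placeholders); Tomcat's `catalina.sh` entrypoint reads `JAVA_OPTS`, which is what makes the profile selection work:

```yaml
services:
  app:
    image: my-app:qa          # placeholder image name
    ports:
      - "8080:8080"
    environment:
      # catalina.sh picks up JAVA_OPTS, so the Spring profile
      # (and hence application-qa.properties) is selected at startup
      JAVA_OPTS: "-Dspring.profiles.active=qa"
```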
5. SSH & GitHub Automation
I explored `ssh-keygen` and how SSH keys can be leveraged to:
- Authenticate to GitHub repositories from Jenkins
- Enable secure, automated Git pulls
- Avoid manual authentication during builds
This added a new layer of automation and security to the CI/CD flow.
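The setup itself boils down to a few commands (the key path and comment are my own choices):

```bash
# Generate a dedicated key pair for Jenkins; an empty passphrase
# keeps builds non-interactive, so guard the private key carefully
ssh-keygen -t ed25519 -C "jenkins-ci" -f ~/.ssh/jenkins_github -N ""

# Print the public key and add it to GitHub (deploy key or account key)
cat ~/.ssh/jenkins_github.pub

# Verify authentication; GitHub responds with a greeting and exits
ssh -T -i ~/.ssh/jenkins_github git@github.com
```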
6. Cloud Achievement: AWS Cloud Practitioner
In parallel with my DevOps practice, I prepared for and passed the AWS Cloud Practitioner certification. Key concepts I explored:
- AWS Core Services (EC2, S3, RDS, Lambda)
- Identity and Access Management (IAM)
- Shared Responsibility Model
- Cloud pricing and billing concepts
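To make the IAM piece concrete, here is the shape of a minimal least-privilege policy of the kind the exam expects you to read (the bucket name is made up):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadExampleBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/*"
      ]
    }
  ]
}
```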
This knowledge gave me a stronger foundation in cloud architecture and how it supports DevOps.
✅ Key Takeaways
- Jenkins pipelines can be secured and structured to simulate enterprise-grade scenarios
- Docker agents are essential for isolating builds and improving reproducibility
- Tools like SonarQube and JaCoCo provide critical insights into code health and coverage
- Managing SSH and environment variables correctly enhances both automation and security
- Understanding cloud fundamentals bridges the gap between development and operations
Conclusion
This journey was packed with hands-on learning, challenges, and rewarding moments. I now feel much more confident in designing CI/CD pipelines, securing deployments, and integrating quality gates into builds.
I'd love to hear how others approach similar setups—feel free to connect, comment, or share your tips. You can also find some of my sample configs and scripts on my GitHub.
Thanks for reading!