- Led migration of approximately 3 million customer records from a legacy enterprise CRM to Salesforce on Azure across 4 business units; designed and executed a multi-phase data pipeline delivering zero-downtime cutover with no data loss.
- Built Python and SQL ETL automation scripts to extract, transform, and load large volumes of customer data - reducing manual processing effort by 40% and eliminating recurring data quality errors.
- Configured Azure Data Factory and Azure SQL Database as the core pipeline infrastructure; enforced RBAC and audit logging to maintain data security and regulatory compliance throughout all project phases.
- Executed UAT data validation test plans and resolved critical data integrity issues prior to go-live; authored migration runbooks and delivered weekly pipeline status updates to business and compliance stakeholders.
- Operated within an Agile/Scrum framework with 2-week sprints; tracked data migration milestones and cross-team dependencies in Jira.
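The extract/transform/load pattern described in the bullets above can be sketched minimally in Python. This is an illustrative sample only; the table names, columns, and validation rule are hypothetical placeholders, not the actual bank pipeline (which ran on Azure Data Factory and Azure SQL Database), and SQLite stands in for the real source and target systems.

```python
# Illustrative ETL sketch: extract from a legacy source, apply basic
# transforms and validation, and load in an idempotent batch. All
# connection targets, table names, and column names are hypothetical.
import sqlite3

def extract(conn):
    """Pull raw customer rows from the (hypothetical) legacy table."""
    return conn.execute(
        "SELECT id, name, email FROM legacy_customers"
    ).fetchall()

def transform(rows):
    """Normalize fields and drop rows that fail a basic quality gate."""
    cleaned = []
    for row_id, name, email in rows:
        email = (email or "").strip().lower()
        if "@" not in email:  # data-quality gate: skip invalid emails
            continue
        cleaned.append((row_id, name.strip().title(), email))
    return cleaned

def load(conn, rows):
    """Batch upsert into the target table; safe to re-run."""
    conn.executemany(
        "INSERT OR REPLACE INTO crm_customers (id, name, email) "
        "VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()

if __name__ == "__main__":
    # In-memory demo standing in for the real source/target databases.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE legacy_customers (id INTEGER, name TEXT, email TEXT)"
    )
    conn.execute(
        "CREATE TABLE crm_customers "
        "(id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    )
    conn.executemany(
        "INSERT INTO legacy_customers VALUES (?, ?, ?)",
        [(1, " ada lovelace ", " Ada@Example.com "),
         (2, "bad row", "no-email")],
    )
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM crm_customers").fetchall())
    # → [(1, 'Ada Lovelace', 'ada@example.com')]
```

Keeping the load step idempotent (upsert rather than plain insert) is what allows a multi-phase cutover to re-run a batch safely after a validation failure.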
Available for Cloud DevOps & Data Engineering roles
Hi, I'm Rajan Kumar -
I build cloud systems
that don't break at 3am.
Cloud Data & DevOps Engineer with 17 months of enterprise experience at TD Bank, the Microsoft Azure Data Engineer Associate certification (DP-203), and hands-on projects spanning Terraform, Docker, Kubernetes, AWS EKS, and Prometheus - built to production standards, not just for a course.
About Me
Rajan Kumar
Cloud Data & DevOps Engineer - Azure - AWS - Python - Terraform
I'm a Cloud Data and DevOps engineer with 17 months of enterprise experience at TD Bank, where I led the migration of approximately 3 million customer records from a legacy CRM to Salesforce on Azure - building Python and SQL ETL pipelines, configuring Azure Data Factory, and enforcing RBAC and audit logging across all project phases.
I hold the Microsoft Azure Data Engineer Associate (DP-203) certification and have completed the DataExpert.io Data Engineering and Analytical Engineering bootcamps. Currently completing a Cloud Computing diploma at George Brown College, with a mandatory co-op placement in Fall 2026.
My academic projects go beyond coursework - The Migration Arc takes an application from a local Vagrant VM through Docker, Terraform, and AWS ECS to a Kubernetes deployment on AWS EKS, monitored end-to-end by a production-grade Prometheus and Grafana stack.
Technical Profile
Cloud-focused skill set built around Azure, Python, and production DevOps tooling.
Experience
- Completed the DataExpert.io Data Engineering Bootcamp and Analytical Engineering track; earned Microsoft Azure Data Engineer Associate (DP-203) certification.
- Completed the DataCamp Data Engineer in Python track; earned the Agentic AI on AWS credential through BESA.
- Coursework: Cloud Infrastructure & Virtualization, Linux Administration, Cybersecurity Fundamentals, DevOps, Database Management.
- Built The Migration Arc and Infrastructure Monitoring Stack as academic projects - taking an application from a local Vagrant VM to a Kubernetes deployment on AWS EKS with full observability.
- Mandatory industry co-op placement in Fall 2026.
Core Competencies
Projects & GitHub
Real-world cloud infrastructure projects built to production standards - not just tutorials.
Academic Background
Formal education in cloud and technology, supplemented by certifications and continuous self-directed learning.
Let's Connect
Open to Cloud DevOps and Cloud Data Engineering opportunities. Let's talk.