SRB.
Code • Create • Conquer
Saivihari Reddy Bandi
Senior Data Engineer

Building Enterprise-Scale Data Solutions

Senior Data Engineer with 5+ years of expertise building enterprise-scale data pipelines and cloud architectures. Demonstrated success in migrating large-scale datasets, implementing CI/CD automation, and optimizing SQL performance within regulated government and commercial environments.

About Me

I'm a Senior Data Engineer with a proven track record of delivering secure, analytics-ready platforms using Azure Data Factory, Python, and T-SQL while ensuring data quality and regulatory compliance.

My expertise spans cloud platforms, data warehousing, analytics, and DevOps automation. I specialize in building robust ETL pipelines that handle billions of records, implementing CI/CD best practices, and optimizing database performance to drive significant cost savings.

With a Master's degree earned with a perfect 4.0 GPA and Microsoft Azure certifications, I combine academic excellence with real-world impact, having reduced deployment times by 88% and cut cloud costs by $12K per month at Deloitte.

Experience

June 2025 – Present

Project Delivery Senior Analyst

Deloitte — State of Michigan (Unemployment Insurance Agency) • Remote
  • Architected and operated mission-critical data infrastructure supporting 1.5B+ record datasets for state unemployment systems, focusing on production stability, performance optimization, and compliance
  • Designed Azure Data Factory orchestration layers managing 50+ parallel pipelines with dependency chains, reducing end-to-end processing time by 35% through optimized scheduling and resource allocation
  • Built CI/CD deployment framework using Azure DevOps YAML pipelines to automate ADF artifact promotion across Dev/UAT/Prod, eliminating manual deployment errors and reducing release cycles from 3 days to 4 hours
  • Implemented real-time monitoring dashboards integrating ADF metrics, SQL DMVs, and Azure Monitor logs to track pipeline health, data freshness, and SLA compliance, achieving 99.7% uptime
  • Conducted deep-dive performance analysis using SQL Server execution plans and query store analytics, redesigning indexing strategies that improved compute efficiency by 40% and reduced monthly Azure costs by $12K
  • Engineered Python-based data quality framework validating 15+ business rules per load cycle, automatically flagging duplicates, orphaned records, and referential integrity violations with detailed exception reporting (a simplified sketch of the rule-check pattern appears below)
  • Developed automated PII masking pipelines in T-SQL using dynamic data masking and custom obfuscation functions to generate compliant test datasets for 200+ downstream consumers
  • Partnered with infrastructure teams to implement role-based access control (RBAC) across 8 Azure SQL instances, defining granular permissions for 30+ user groups while maintaining SOC 2 audit compliance
  • Led quarterly disaster recovery exercises simulating complete environment failures, executing database restores, ADF pipeline redeployment, and end-to-end validation within 6-hour RTO requirements
  • Mentored 4 junior analysts through pair programming sessions and structured code reviews in Azure DevOps, establishing team coding standards and documentation practices
Technologies: Azure Data Factory, Azure SQL Database, Azure DevOps, T-SQL, Python, Azure Monitor, PowerShell, Git
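
For illustration, the core rule-check pattern behind such a framework can be sketched in a few lines of Python. This is a minimal, hypothetical example: the table names, connection string, and pyodbc usage are assumptions for the sketch, not the production code.

  import pyodbc

  # Hypothetical connection string and schema -- placeholders only, not the production setup.
  CONN_STR = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;Trusted_Connection=yes"

  # Each rule maps a name to a query that returns offending rows.
  RULES = {
      "duplicate_claims": """
          SELECT ClaimId FROM dbo.Claims
          GROUP BY ClaimId HAVING COUNT(*) > 1""",
      "orphaned_claims": """
          SELECT c.ClaimId FROM dbo.Claims AS c
          LEFT JOIN dbo.Claimants AS p ON p.ClaimantId = c.ClaimantId
          WHERE p.ClaimantId IS NULL""",
  }

  def run_rules(conn_str=CONN_STR):
      """Run every rule and return the violation count per rule."""
      results = {}
      with pyodbc.connect(conn_str) as conn:
          cursor = conn.cursor()
          for name, sql in RULES.items():
              violations = cursor.execute(sql).fetchall()
              results[name] = len(violations)
              if violations:
                  # The real framework routes failures into exception reports;
                  # printing keeps the sketch self-contained.
                  print(f"[{name}] {len(violations)} violation(s) flagged")
      return results

  if __name__ == "__main__":
      print(run_rules())

Keeping the rules in a plain dictionary makes it easy to add a new business rule without touching the execution loop.
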
July 2023 – May 2025

Project Delivery Analyst

Deloitte — State of Michigan (Unemployment Insurance Agency) • Remote
  • Built resilient data integration platform connecting legacy mainframe systems with modern cloud analytics infrastructure, focusing on reliability and regulatory compliance
  • Developed 40+ ADF pipelines orchestrating incremental and full-refresh patterns for COBOL flat files, FTP sources, and legacy databases into normalized Azure SQL star schemas supporting Power BI analytics
  • Created complex T-SQL stored procedures implementing slowly changing dimension (SCD) Type 2 logic, processing 500K-2M records per execution with transactional integrity and error handling
  • Implemented rule engine enforcing 10-year data retention policies by evaluating 12 regulatory criteria, archiving 200M+ records quarterly to Azure Blob Storage with audit trail documentation
  • Established production support runbooks for 24/7 operations, performing root cause analysis on failed pipelines using ADF debug mode, SQL Profiler traces, and transaction log analysis to restore service within 2-hour SLA
  • Built Python automation suite for environment refresh workflows, dynamically generating 500+ SQL scripts to recreate indexes, constraints, and stored procedures across refreshed databases, reducing manual effort from 16 hours to 45 minutes (a simplified sketch of the metadata-driven approach appears below)
  • Designed pre/post-ETL reconciliation framework comparing source row counts, checksums, and aggregates against target loads, producing automated variance reports distributed via email and SharePoint
  • Optimized nightly batch windows from 8 hours to 5 hours through strategic non-clustered index placement, partition switching techniques, and parallelized stored procedure execution
  • Collaborated with business analysts to translate 50+ functional requirements into technical specifications, data models, and ETL logic documented in Confluence
Technologies: Azure Data Factory, T-SQL, Python, Azure SQL Database, Azure Blob Storage, SSMS, Azure DevOps, Power BI
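
As a minimal sketch of the metadata-driven generation idea (not the original suite, which also handled constraints and stored procedures), the snippet below reads nonclustered index definitions from SQL Server catalog views and emits CREATE INDEX statements to re-apply after a refresh; the connection string and output path are placeholders.

  import pyodbc

  # Catalog query for nonclustered, non-primary-key indexes; STRING_AGG needs SQL Server 2017+.
  # Included columns, filters, and compression options are omitted to keep the sketch short.
  METADATA_QUERY = """
  SELECT s.name, t.name, i.name,
         STRING_AGG(c.name, ', ') WITHIN GROUP (ORDER BY ic.key_ordinal) AS key_columns
  FROM sys.indexes AS i
  JOIN sys.tables  AS t ON t.object_id = i.object_id
  JOIN sys.schemas AS s ON s.schema_id = t.schema_id
  JOIN sys.index_columns AS ic ON ic.object_id = i.object_id AND ic.index_id = i.index_id
  JOIN sys.columns AS c ON c.object_id = ic.object_id AND c.column_id = ic.column_id
  WHERE i.type_desc = 'NONCLUSTERED' AND i.is_primary_key = 0 AND ic.is_included_column = 0
  GROUP BY s.name, t.name, i.name
  """

  def generate_index_scripts(conn_str, out_path="recreate_indexes.sql"):
      """Write one CREATE INDEX statement per index and return how many were generated."""
      count = 0
      with pyodbc.connect(conn_str) as conn, open(out_path, "w", encoding="utf-8") as out:
          for schema, table, index, columns in conn.cursor().execute(METADATA_QUERY):
              out.write(f"CREATE NONCLUSTERED INDEX [{index}] "
                        f"ON [{schema}].[{table}] ({columns});\nGO\n")
              count += 1
      return count

Generating the scripts from catalog metadata, rather than maintaining them by hand, is what lets the same workflow be replayed against any refreshed database.
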
June 2018 – July 2021

SQL Developer / Data Engineer

Think Champ Private Limited • Tirupathi, India
  • Developed data infrastructure and optimized database performance for B2B SaaS platform supporting customer analytics and reporting applications
  • Designed normalized database schemas (3NF) for transactional OLTP systems managing customer, product, and order entities with 50+ tables and referential integrity constraints
  • Built 60+ T-SQL stored procedures encapsulating multi-step business logic including order processing workflows, inventory calculations, and commission computations with exception handling and transaction management
  • Created indexed views and materialized aggregations for analytical queries, reducing report generation time from 45 seconds to 3 seconds for executive dashboards
  • Implemented view-level data masking for customer PII (emails, phone numbers, addresses) in staging environments using SQL Server CASE expressions and hash functions
  • Performed systematic query tuning analyzing execution plans to identify missing indexes, parameter sniffing issues, and costly operations, achieving 25-30% average performance improvement across 100+ production queries
  • Developed reusable SQL validation scripts comparing staging-to-production data loads across dimensions (row counts, null checks, value ranges), generating CSV reconciliation reports for QA sign-off (a simplified sketch appears below)
  • Coordinated with application developers during release windows to deploy database schema changes, migrate reference data, and execute rollback procedures when needed
Technologies: SQL Server 2012-2016, T-SQL, SSMS, SQL Server Profiler, SQL Server Agent, IIS
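
A simplified illustration of the staging-versus-production check: the table list and connection strings below are placeholders rather than the original scripts, and only the row-count dimension is shown.

  import csv
  import pyodbc

  # Hypothetical table list; the real scripts also covered null checks and value ranges.
  TABLES = ["dbo.Customers", "dbo.Orders", "dbo.OrderLines"]

  def count_rows(conn, table):
      # Table names come only from the fixed list above, so string formatting is safe here.
      return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

  def reconcile(staging_conn_str, prod_conn_str, report_path="reconciliation.csv"):
      """Compare staging vs. production row counts and write a CSV report for QA sign-off."""
      with pyodbc.connect(staging_conn_str) as stg, \
           pyodbc.connect(prod_conn_str) as prod, \
           open(report_path, "w", newline="") as f:
          writer = csv.writer(f)
          writer.writerow(["table", "staging_rows", "production_rows", "status"])
          for table in TABLES:
              s, p = count_rows(stg, table), count_rows(prod, table)
              writer.writerow([table, s, p, "MATCH" if s == p else "MISMATCH"])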

Additional Experience

August 2022 – May 2023

Teaching Assistant – Computer Networks & Cybersecurity

University of North Texas • Denton, TX
  • Facilitated weekly lab sessions for 60+ students covering TCP/IP protocols, network security fundamentals, and packet analysis using Wireshark
  • Provided technical support for troubleshooting network configurations, firewall rules, and security tools through structured debugging methodology
May 2022 – August 2022

Engineering Intern

StreamSets Inc. • Remote
  • Contributed to testing and validation of cloud-native data integration platform supporting real-time streaming and batch ingestion to modern data warehouses
  • Executed test plans for 25+ data pipeline configurations spanning sources (MySQL, PostgreSQL, REST APIs) to destinations (Snowflake, Google BigQuery, S3), validating data accuracy and schema evolution handling
  • Analyzed pipeline execution logs and metrics to identify bottlenecks in transformation logic, working with engineering team to reproduce performance issues and verify optimization fixes
  • Participated in sprint ceremonies and regression testing cycles for platform releases, documenting defects in Jira with detailed reproduction steps, log excerpts, and expected vs actual outcomes
December 2017 – May 2018

HR Intern

Capgemini • Hyderabad, India
  • Supported onboarding processes for 50+ new hires including background verification documentation, access provisioning tracking, and compliance checklist completion

Projects

Project 01

FIFA World Cup Statistical Analysis

Developed Python-based data pipelines and interactive Tableau dashboards to analyze multi-year FIFA match and player data, engaging 1,000+ users with actionable performance insights.

Python • Tableau • Data Analysis • ETL
View on GitHub →
Project 02

Enterprise CI/CD Pipeline Automation

Architected end-to-end automated deployment system for government agency handling 1.5B+ records, achieving 88% reduction in deployment time and 99.7% system uptime.

Azure DevOps • PowerShell • CI/CD • IaC
View on GitHub →
Project 03

SQL Validation Script Generator

Created intelligent automation suite dynamically generating 500+ SQL scripts for data validation, index validation, and obfuscation across environment refreshes.

Python • T-SQL • Automation • Validation
View on GitHub →

Hobbies & Game Development

Game 01

Maze Game

An interactive maze puzzle game built with HTML5, CSS3, and JavaScript. Features procedural maze generation, smooth controls, and engaging gameplay mechanics.

JavaScript • HTML5 • CSS3 • Game Dev
Game 02

More Games Coming Soon

Constantly experimenting with new game concepts and mechanics. From puzzle games to arcade classics, exploring different aspects of interactive design and programming.

Unity • C# • Game Design • Physics
View GitHub →

Education

Aug 2021 – May 2023

Master of Science

University of North Texas
Computer Science • Denton, TX
GPA: 4.0/4.0
2012 – 2016

Bachelor of Technology

Audisankara College of Engineering and Technology
Computer Science & Engineering

Skills

Programming Languages

Python – Expert
T-SQL – Expert
SQL – Expert
PowerShell – Advanced

Cloud & Data Platforms

Azure Data Factory – Expert
Azure SQL Database – Expert
SQL Server – Expert
PostgreSQL – Advanced
MySQL – Advanced
Azure Blob Storage – Advanced

Data Warehousing

Snowflake – Intermediate
Google BigQuery – Intermediate

Analytics & Visualization

Power BI – Advanced
Tableau – Advanced
Excel – Expert
Power Query – Advanced

DevOps & Automation

Azure DevOps – Expert
Git – Advanced
CI/CD Pipelines – Expert
Infrastructure as Code – Advanced

Performance Monitoring

Azure Monitor – Advanced
SQL Server Profiler – Advanced
Query Execution Plans – Expert

Key Achievements

Delivered a time-critical, ad hoc client data solution at Deloitte, reducing a typical 3-month analysis and delivery timeline to 2 weeks with 100% accuracy by implementing dynamic SQL transformations with automated validation and error handling

Earned multiple Outstanding Performance Awards and was promoted to Senior Analyst within two years for consistent high-impact delivery

Recognized as a Best Team Player for strong collaboration, ownership, and mentoring across data engineering initiatives

Served as a Teaching Assistant during my Master's program, providing technical instruction and mentoring that contributed to a 100% student pass rate

Certifications

🏆

Azure Data Engineer Associate

Microsoft
Exam: DP-203
📜

Azure Data Fundamentals

Microsoft
Exam: DP-900

Let's Work Together

I'm available for senior data engineering opportunities where I can leverage my expertise in Azure, Python, and enterprise-scale data solutions to drive meaningful business impact.

Get In Touch