Data Engineer - Project Delivery Analyst
Deloitte
Actively reviewing applications
On-site
₹7 LPA
Posted 2 weeks ago • Apply by June 16, 2026
Job Description
Are you an experienced, passionate pioneer in technology who wants to work in a collaborative environment? If so, consider an opportunity with Deloitte under our Project Delivery Talent Model. As a Data Engineer - Project Delivery Analyst, you will share new ideas and collaborate on projects as a consultant without the extensive demands of travel. The Project Delivery Model (PDM) is a talent model tailored specifically for long-term, onsite client service delivery.
Recruiting for this role ends on April 10th, 2026.
Work You'll Do/Responsibilities
You will support a Data & Analytics Foundry across numerous business product teams (a scaled program with ~235 onshore/offshore resources), building reliable pipelines and curated datasets for analytics and downstream consumption.
- Build and enhance data pipelines on AWS using Python to ingest, transform, and deliver data to Snowflake and downstream consumers (a brief pipeline sketch follows this list).
- Develop and maintain Snowflake objects (schemas, tables, views) and performant SQL transformations to produce curated, analytics-ready datasets.
- Implement workflow automation and scheduling (e.g., Airflow/MWAA, Step Functions, Glue) with proper dependencies, retries, and logging.
- Apply data quality checks and basic observability (validation rules, reconciliation, alerts) and support incident triage and remediation.
- Optimize pipeline and query performance with guidance (efficient Python, partitioning/file formats in S3, Snowflake warehouse usage and query tuning).
- Follow CI/CD and IaC standards (e.g., Git-based workflows, Terraform/CloudFormation changes) to promote code across environments.
- Collaborate with analysts, product owners, and source-system teams to clarify requirements and validate outputs; participate in sprint ceremonies and estimations.
- Contribute to code reviews (give/receive), unit tests, and peer debugging; learn and apply team engineering standards.
- Communicate regularly with Engagement Managers (Directors), project team members, and representatives from various functional and/or technical teams, including escalating any matters that require additional attention and consideration from engagement management.
- Independently and collaboratively lead client engagement workstreams focused on improvement, optimization, and transformation of processes, including implementing leading-practice workflows, addressing quality deficits, and driving operational outcomes.
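To make the first two bullets concrete, here is a minimal sketch of the kind of Python pipeline this role builds: read Parquet from S3, apply a basic data quality gate, and load the curated result into Snowflake via write_pandas. Every name in it (bucket, warehouse, schema, table, columns) is a hypothetical placeholder, not a detail of this engagement.

```python
# Minimal pipeline sketch: ingest Parquet from S3, apply a basic data
# quality gate, and load the curated result into Snowflake.
# All names (bucket, schema, table, columns, credentials) are
# hypothetical placeholders, not values from this posting.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

SOURCE_PATH = "s3://example-raw-bucket/orders/2026/01/"  # hypothetical
TARGET_TABLE = "CURATED_ORDERS"                          # hypothetical

def extract() -> pd.DataFrame:
    # Reading s3:// paths with pandas requires the s3fs package.
    return pd.read_parquet(SOURCE_PATH)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Example curation step: normalize column names for Snowflake
    # and standardize a date field.
    df.columns = [c.upper() for c in df.columns]
    df["ORDER_DATE"] = pd.to_datetime(df["ORDER_DATE"]).dt.date
    return df

def validate(df: pd.DataFrame) -> None:
    # Basic quality gate: fail fast instead of loading bad data.
    if df.empty:
        raise ValueError("No rows extracted; aborting load.")
    if df["ORDER_ID"].isna().any():
        raise ValueError("Null ORDER_ID values found; aborting load.")

def load(df: pd.DataFrame) -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical
        database="ANALYTICS_DB",    # hypothetical
        schema="CURATED",           # hypothetical
    )
    try:
        write_pandas(conn, df, TARGET_TABLE, auto_create_table=True)
    finally:
        conn.close()

if __name__ == "__main__":
    frame = transform(extract())
    validate(frame)
    load(frame)
```

The validation step mirrors the data quality bullet above: fail fast before the load rather than reconciling bad rows downstream.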
AI & Data - AI & Engineering leverages cutting-edge engineering capabilities to build, deploy, and operate integrated, verticalized sector solutions across software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, transforming mission-critical operations. We enable clients to stay ahead of the latest advancements by transforming engineering teams and modernizing technology and data platforms. Our delivery models are tailored to each client's unique requirements.
Qualifications
Required
- 1+ year of experience building/enhancing data pipelines and curated datasets for analytics/downstream consumers.
- 1+ year of hands-on experience with SQL and Python, including Snowflake and/or PySpark for transformations and scalable processing.
- 1+ year of experience with cloud data engineering on AWS (preferred) or Azure/GCP, including orchestration/scheduling (e.g., Airflow/MWAA, Step Functions, Glue, ADF/Fabric Data Factory); a brief DAG sketch follows this list.
- Understanding of ELT patterns and Lakehouse/warehouse concepts; familiarity with S3 file formats/partitioning (e.g., Parquet/Delta).
- Working knowledge of DevOps practices (Git-based workflows, CI/CD) and exposure to Infrastructure-as-Code (Terraform/CloudFormation).
- Understanding of data quality, basic observability, and metadata/governance fundamentals.
- Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience.
- Limited immigration sponsorship may be available.
- Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve.
- Agile delivery experience.
- Analytical ability to manage multiple projects, prioritizing tasks into manageable work products.
- Ability to operate independently or with minimal supervision.
- Excellent written and verbal communication skills.
- Ability to deliver technical demonstrations.
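As a sketch of the orchestration/scheduling experience listed above, the hypothetical Airflow DAG below wires two tasks with an explicit dependency, retries with a delay, and task-level logging. The DAG id, schedule, owner, and task bodies are illustrative placeholders, not artifacts of this program.

```python
# Minimal Airflow DAG sketch: daily schedule, explicit dependencies,
# retries with a delay, and task-level logging. The DAG id, schedule,
# owner, and task bodies are hypothetical placeholders.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)

def extract_fn(**context):
    # Airflow passes runtime context (e.g., the logical date "ds").
    log.info("Extracting source data for %s", context["ds"])

def load_fn(**context):
    log.info("Loading curated data for %s", context["ds"])

default_args = {
    "owner": "data-engineering",          # hypothetical
    "retries": 2,                         # retry failed tasks twice
    "retry_delay": timedelta(minutes=5),  # wait between attempts
}

with DAG(
    dag_id="curated_orders_daily",        # hypothetical
    start_date=datetime(2026, 1, 1),
    schedule="0 6 * * *",                 # daily at 06:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_fn)
    load = PythonOperator(task_id="load", python_callable=load_fn)

    extract >> load  # explicit dependency: load runs only after extract
```

On MWAA the same file would be deployed to the environment's DAGs folder; Step Functions or Glue workflows would express the equivalent dependencies and retries in their own constructs.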
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Required Skills
Communication
Engineering
Git
Agile
Automation
Python
SQL
Training
AWS
Scheduling
Reconciliation
Snowflake
Terraform
Azure
Airflow
Parquet
Data Engineering
DevOps
CI/CD
Analytics
Debugging
Information Technology
Data quality
Validation
ADF
Metadata
Governance
Client Service
Orchestration
Recruiting
Immigration
Service Delivery
Remediation
Hybrid Cloud
Client engagement
Schemas
Data platforms
Delta
PDM
Workflow Automation
Data pipelines
PySpark
Logging
Query tuning
Unit tests
Agile Delivery
CloudFormation
Cloud Data
Cloud Infrastructure
Computer Engineering
Observability
Lakehouse
Incident
ELT
Dependencies
Validation Rules
Supervision
Partitioning
Glue
Computer Science
Data Factory
Step Functions