Data Platform DevOps Engineer

What You'll Do

  • Cloud & Platform Infrastructure (IaC): Deploy and maintain Databricks workspaces and AWS infrastructure (VPC, PrivateLink, IAM, S3, Lambda, EKS, and Fargate) using Terraform.
  • Unity Catalog Implementation: Automate the governance layer, including metastore configuration, external locations, and access controls within Unity Catalog.
  • Security & Compliance: Ensure the platform adheres to enterprise security standards by implementing automated security controls for cloud infrastructure and data protection.
  • Workspace Lifecycle Management: Use Terraform for end-to-end workspace provisioning, ensuring consistent setup across Dev, Acc, and Prod environments.
  • Governance & Cost Control (Policies): Design and implement policies and guardrails to enforce standards.
  • Identity & Access Automation: Automate assignment of permissions using Terraform.
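Cost-control guardrails of this kind are typically expressed as Databricks cluster policies. A minimal sketch in Python, assuming illustrative limits (the worker cap, node-type allowlist, and 60-minute autotermination are example values, not requirements from this posting):

```python
import json

def build_cluster_policy(max_workers: int, allowed_node_types: list[str]) -> str:
    """Render a Databricks cluster-policy definition as JSON.

    The attribute paths (autoscale.max_workers, node_type_id,
    autotermination_minutes) follow the Databricks cluster-policy
    schema; the specific limits here are illustrative only.
    """
    policy = {
        # Cap cluster size to control cost.
        "autoscale.max_workers": {"type": "range", "maxValue": max_workers},
        # Restrict instance types to an approved list.
        "node_type_id": {"type": "allowlist", "values": allowed_node_types},
        # Force idle clusters to shut down within an hour.
        "autotermination_minutes": {"type": "range", "maxValue": 60},
    }
    return json.dumps(policy, indent=2)

if __name__ == "__main__":
    print(build_cluster_policy(8, ["m5.xlarge", "m5.2xlarge"]))
```

In practice the same JSON would be attached to a `databricks_cluster_policy` Terraform resource so the guardrail is versioned alongside the rest of the workspace configuration.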
  • Manage Service Principals for pipelines and map groups to specific Workspace roles and Unity Catalog grants.

DevOps & Automation (CI/CD)

  • Pipeline Architecture: Oversee GitLab CI/CD pipelines for data projects, transitioning the team from manual notebook deployments to automated workflows.
  • Databricks Asset Bundles (DABs): Standardize deployment strategies using DABs.
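The group-to-grant mapping described above can be driven from a small declarative table. A sketch, assuming hypothetical group, role, and catalog names (in practice the same mapping would feed `databricks_grants` resources in Terraform rather than raw SQL):

```python
# Privileges per role: names follow the Unity Catalog privilege model;
# the two roles and their privilege sets are illustrative assumptions.
ROLE_PRIVILEGES = {
    "reader": ["USE CATALOG", "USE SCHEMA", "SELECT"],
    "writer": ["USE CATALOG", "USE SCHEMA", "SELECT", "MODIFY"],
}

def grants_for(group: str, role: str, catalog: str) -> list[str]:
    """Expand one group/role pair into Unity Catalog GRANT statements."""
    return [
        f"GRANT {priv} ON CATALOG {catalog} TO `{group}`"
        for priv in ROLE_PRIVILEGES[role]
    ]

if __name__ == "__main__":
    for stmt in grants_for("data-engineers", "writer", "analytics_prod"):
        print(stmt)
```

Keeping the mapping in data rather than hand-written GRANTs makes the access model reviewable in Git and easy to promote across environments.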
  • Develop templates and presets for Data Engineers to deploy jobs and workflows.
  • Release Management: Implement branching strategies, code review policies, and environment promotion rules (Dev → Acc → Prod).

Service Organization & Operations

  • Observability: Configure monitoring, alerting, and logging (using system tables or integration with tools like CloudWatch) to ensure platform stability.
  • Support & Incident Management: Serve as an escalation point for platform-related incidents.
  • Knowledge Sharing: Document best practices and conduct workshops to upskill data engineers on effective platform usage.
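A template for Data Engineers might stamp out the skeleton of a Databricks Asset Bundle with one target per environment. A sketch, assuming placeholder workspace hosts and job names (real bundles are serialized to `databricks.yml`; the top-level keys `bundle`, `targets`, and `resources` follow the DAB schema, shown here as a Python dict):

```python
def bundle_skeleton(project: str) -> dict:
    """Build a minimal Asset Bundle config with Dev/Acc/Prod targets.

    Workspace hosts are placeholders; only the dev target runs in
    development mode, matching the Dev -> Acc -> Prod promotion flow.
    """
    targets = {
        env: {
            "mode": "development" if env == "dev" else "production",
            "workspace": {"host": f"https://{env}.example.cloud.databricks.com"},
        }
        for env in ("dev", "acc", "prod")
    }
    return {
        "bundle": {"name": project},
        "targets": targets,
        "resources": {
            "jobs": {
                f"{project}_job": {
                    "name": f"[${{bundle.target}}] {project}",
                    "tasks": [
                        {
                            "task_key": "main",
                            "notebook_task": {"notebook_path": "./src/main"},
                        }
                    ],
                }
            }
        },
    }
```

A preset like this lets engineers get a deployable job by filling in only the project name, while promotion rules stay encoded in the shared targets.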
Job Qualifications

  • Bachelor’s in computer science, software engineering, mathematics, or a related field.
  • 5+ years of industry experience in Data Engineering, Cloud Infrastructure, or DevOps; 3+ years with Databricks in enterprise settings.
  • Advanced Terraform skills for managing cloud infrastructure and Databricks resources.
  • Extensive knowledge of the AWS service portfolio.
  • Expertise in CI/CD pipelines using GitLab CI and Databricks Asset Bundles.
  • Deep understanding of Databricks Lakehouse architecture, Unity Catalog, Serverless Compute, Delta Lake, and Workflow orchestration.
  • Solid grasp of SDLC/DataOps, including unit testing, modular code, and Git strategies.
  • Proficient in Python (e.g., automation, PySpark, pandas) and Bash/Shell scripting for CI/CD.
  • Excellent communication, documentation, mentoring, and collaboration skills.
  • Preferred: Databricks Certified Data Engineer Professional or AWS Solutions Architect certification.

Sourced directly from NXP Semiconductors’ career page

Job ID
/job/Bangalore/Data-Platform-DevOps-Engineer_R-10062432
