C2C requirement for Azure Databricks Architect


Title: Azure Databricks Architect

Location: Plano, TX (Onsite)

Job Type: Contract

Job Summary:

We are seeking an experienced Azure Databricks Architect to lead the design, development, and implementation of scalable and high-performing data solutions using Azure cloud services, Databricks, and the broader Azure ecosystem. The ideal candidate will have a strong background in big data architecture, Spark-based ETL pipelines, cloud security, and performance tuning.

Key Responsibilities:

· Architect and design end-to-end data pipelines on Azure Databricks, including ingestion, transformation, and serving layers.

· Lead migration of legacy data platforms to Azure-based lakehouse architectures using Delta Lake, ADLS Gen2, and Synapse.

· Implement and optimize large-scale ETL/ELT workflows using Apache Spark, PySpark, and Databricks notebooks.

· Define and enforce data governance, lineage, and security best practices using Unity Catalog, Purview, RBAC, and ABAC.

· Collaborate with business stakeholders, data engineers, data scientists, and DevOps teams to translate business requirements into scalable solutions.

· Enable CI/CD pipelines and automation for Databricks using Terraform, GitHub Actions, or Azure DevOps.

· Guide performance tuning, cost optimization, and monitoring strategies for large-scale data processing jobs.

· Create architecture diagrams, documentation, and knowledge-sharing artifacts.

Required Skills & Experience:

· 10+ years of experience in data engineering, with 4+ years in the Azure cloud ecosystem.

· Strong expertise in Azure Databricks, Delta Lake, Spark (PySpark/Scala).

· Deep understanding of Azure Data Lake (Gen2), ADF, Synapse Analytics, Azure SQL, Key Vault, and Azure Monitor.

· Experience with DevOps practices: CI/CD pipelines, Infrastructure as Code (Terraform, ARM, Bicep), Git.

· Hands-on experience with Unity Catalog, data lineage, audit logging, and access control models.

· Familiarity with data modeling, data mesh, lakehouse, and MPP-based architectures.

· Strong problem-solving, leadership, and communication skills.

Preferred Qualifications:

· Azure or Databricks certifications (e.g., Azure Data Engineer, Databricks Certified Data Engineer Professional).

· Experience integrating with Power BI, MLflow, Azure ML, or Snowflake.

· Familiarity with event-driven architecture (Kafka/Event Hub) and streaming data pipelines.

· Exposure to ML/AI workloads on Databricks.


To apply for this job email your details to vishal.p@arkhyatech.com