Role: Data Architect (Databricks)
Location: Remote
Experience: 15+ Years (with 4–5+ years in Databricks)
Certification: Databricks Certification (Mandatory)
Key Responsibilities:
- Lead migration from on-prem/legacy platforms to Databricks Lakehouse.
- Architect and optimize batch and streaming pipelines for AI/ML and GenAI workloads.
- Re-engineer Spark workloads with Delta Lake and Unity Catalog.
- Integrate ML models into operational pipelines for real-time insights.
- Design orchestration using Airflow or Databricks Workflows with CI/CD.
- Implement governance, security, and data quality frameworks.
- Collaborate with data scientists, ML engineers, and business teams.
Required Skills:
- 15+ years in data engineering/architecture, with 4–5+ years in Databricks.
- Hands-on with Databricks, Spark, Delta Lake, Unity Catalog.
- Strong knowledge of Airflow, Databricks Jobs, Kafka, and streaming frameworks.
- Proven experience with AI/ML pipeline integration (MLflow or equivalent).
- Excellent communication and stakeholder engagement.
Preferred Skills:
- Experience with CI/CD, Terraform, and DevOps practices.
- Exposure to MLOps, Feature Stores, and GenAI/LLM integration.
If you are interested, please reply with your updated resume.
Regards,
Raiyyan
raiyyank061@gmail.com