Databricks Solutions Architect


Experience Level: Mid-Level

Practice Area: Architecture

Engagement Type: Full-Time

Location: Hybrid

Role Summary

We are seeking a Databricks Solutions Architect to design, implement, and scale modern data and AI platforms for financial services clients. The role involves architecting Lakehouse patterns and ML/AI workloads using Databricks Serverless compute, Delta Lake, and Unity Catalog across secure cloud environments. You will work closely with enterprise architects, data engineers, security teams, and business stakeholders to deliver compliant, scalable, and cost-optimised Databricks solutions aligned with industry standards. Experience designing Lakehouse-based platforms and implementing Databricks Serverless compute is essential.

The Difference You’ll Make

In this role you will help financial services industry (FSI) clients modernise legacy data estates and transition to unified Lakehouse and AI platforms. Your architectural decisions will improve how organisations ingest, process, govern, and activate data for analytics, risk modelling, fraud detection, and customer insights. You will directly contribute to faster decision cycles, stronger governance, improved cost efficiency, and the enablement of AI-driven products across the FSI sector.

Key Responsibilities

  • Architect secure, scalable Lakehouse platforms using Databricks, Delta Lake, Unity Catalog, and Databricks Serverless
  • Design end-to-end data ingestion and transformation patterns for structured, semi-structured, and streaming workloads
  • Lead architecture design for FSI use cases such as customer 360, claims analytics, regulatory reporting, fraud detection, and risk modelling
  • Develop architecture artefacts including HLDs, LLDs, integration diagrams, and data flow models
  • Define and enforce governance patterns using Unity Catalog, IAM, data lineage, and access policies
  • Design network and connectivity patterns including PrivateLink, VPC/VNet design, APIM, controlled egress, and secure service-to-service interactions
  • Provide architectural guidance on Databricks Serverless compute, cluster policies, and cost-optimisation models
  • Collaborate with engineering teams to implement CI/CD, Infrastructure as Code, DevOps, and automated deployment workflows
  • Work with AI/ML engineers to integrate MLflow, feature engineering workflows, model training, and model serving
  • Conduct technical reviews, performance tuning, and architecture assurance for production workloads
  • Advise clients on best practices across governance, security, platform scaling, workload migration, and FinOps
  • Support practice development by contributing to reusable patterns, templates, accelerators, and technical standards

Required Skills and Competencies

  • Strong hands-on experience designing and delivering Databricks solutions in enterprise environments
  • Practical experience with Databricks Serverless and associated architecture patterns
  • Solid understanding of Lakehouse principles, Delta Lake internals, Unity Catalog, data versioning, and data governance
  • Experience with streaming frameworks and real-time ingestion (Autoloader, Delta Live Tables, Structured Streaming)
  • Familiarity with AI/ML workflows including MLflow, model registry, feature stores, and end-to-end ML lifecycle
  • Strong understanding of cloud-native architecture on AWS or Azure, including storage, compute, IAM, networking, and security
  • Experience designing secure connectivity patterns (PrivateLink, APIM, reverse proxies, VPC/VNet segmentation)
  • Proficiency with Python, SQL, and notebook-based development
  • Knowledge of architectural frameworks and standards relevant to FSI, including:
      ◦ Databricks Lakehouse and Serverless reference architectures
      ◦ Unity Catalog governance patterns
      ◦ APRA CPS 234 / CPS 231 considerations
      ◦ Responsible AI principles
  • Strong communication skills for presenting designs to senior stakeholders and architecture review boards
  • Ability to work across cross-functional teams in global, hybrid, and agile environments

Qualifications and Experience

  • 5+ years in data architecture, data engineering, or platform engineering
  • 3+ years' experience designing Databricks or Lakehouse-based platforms
  • Certifications preferred:
      ◦ Databricks Certified Data Engineer Professional
      ◦ Databricks Certified Machine Learning Professional
      ◦ AWS/Azure Architect certifications
  • Prior experience in financial services (superannuation, insurance, or banking) is strongly preferred
  • Experience producing enterprise-grade design artefacts aligned to governance and architectural frameworks

What We Offer

  • Competitive salary with performance-based incentives
  • Exposure to large-scale platform transformations across leading FSI organisations
  • Opportunities to work with cutting-edge cloud, data, and AI technologies
  • A collaborative environment with global architecture and engineering teams
  • Clear pathways for senior technical and leadership growth

Apply Now