Specialist Solutions Architect : ML

As a Specialist Solutions Architect (SSA) in Machine Learning, you will guide customers in building big data solutions on Databricks that span a large variety of use cases. This is a customer-facing role in which you will work with and support the Solution Architects; it requires hands-on production experience with Apache Spark™ and expertise in other data technologies. SSAs help customers through the design and successful implementation of essential workloads while aligning their technical roadmap for expanding usage of the Databricks Lakehouse Platform. As a deep go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty, whether that be performance tuning, machine learning, industry expertise, or more. The individual can be based in BLR, DEL, or MUM.

The impact you will have:

  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production level workloads, including end-to-end pipeline load performance testing and optimization
  • Become a technical expert in an area such as machine learning, big data, data management, cloud platforms, data science, or architecture
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
  • Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
  • Contribute to the Databricks Community

What we look for:

  • 9+ years of experience in a customer-facing technical role, with expertise in at least one of the following: Machine Learning, Data Engineering, or Data Science
  • Experience maintaining and extending production data systems using DevOps principles to evolve with complex needs.
  • Production programming experience in Python, R, Scala, Java or SQL
  • Experience scaling big data workloads in a performant and cost-effective way.
  • Experience designing data solutions on cloud infrastructure and services, such as AWS, Azure, or GCP using best practices in cloud security and networking.
  • Experience implementing industry-specific data analytics use cases.

Benefits:

  • Private medical insurance
  • Accident coverage
  • Employee's Provident Fund
  • Equity awards
  • Paid parental leave
  • Gym reimbursement
  • Annual personal development fund
  • Work headphones reimbursement
  • Business travel insurance

Outer Join is the premier job board for remote jobs in data science, analytics, and engineering.