
Databricks Platform Engineer (f/m/d)

Deutsche Börse Group
Full-time
On-site
Frankfurt, Germany
Group Company: Deutsche Börse AG 
 

Full-time | unlimited

 

Ready to make a real impact in the financial industry? At Deutsche Börse Group, we'll empower you to grow your career in a supportive and inclusive environment. With our unique business model, driven by 15,000 colleagues around the globe, we actively shape the future of financial markets. Join our One Global Team!

 

Your area of work:

As part of Deutsche Börse Group’s digital transformation and cloud migration journey, the Big Data & Advanced Analytics (BDAA) team is seeking a skilled and forward-thinking Databricks Platform Engineer. This role is central to designing, building, and optimizing our enterprise-grade data platform on Databricks, enabling scalable data product development across the value chain from BI to AI.

 

You will join a dynamic team of data & automation engineers and architects, contributing to the definition and deployment of cloud data architecture, platform automation, and operational excellence. Your work will directly support business use cases in big data and advanced analytics across one of the world’s leading financial exchanges.

 

Your responsibilities:

  • Provision and configure Databricks workspaces using Terraform, CLI, and SDK.
  • Manage workspace-level settings including clusters, libraries, compute policies, and access controls.
  • Define and maintain catalogs, schemas, and tables across workspaces using Unity Catalog.
  • Ensure the data platform is secure, scalable, and highly available.
  • Contribute to the implementation of the Data Mesh paradigm across domains.
  • Act as a technical advisor to data scientists, analysts, and business users.
  • Drive onboarding, training, and enablement programs to scale platform adoption.
  • Maintain technical documentation and contribute to Agile ceremonies and sprint planning.
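For a sense of what the platform-provisioning work above looks like in practice, here is a minimal sketch using the Databricks Terraform provider: a compute policy plus Unity Catalog objects. All resource names, attribute values, and the "trading" domain are hypothetical illustrations, not the team's actual configuration.

```terraform
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Hypothetical compute policy constraining cluster settings for a domain team.
resource "databricks_cluster_policy" "analytics" {
  name = "analytics-team-policy" # example name
  definition = jsonencode({
    autotermination_minutes = { type = "fixed", value = 30 }
    num_workers             = { type = "range", maxValue = 8 }
  })
}

# Unity Catalog objects: a catalog and a schema for an example data domain.
resource "databricks_catalog" "trading" {
  name    = "trading" # example domain catalog
  comment = "Data products of the trading domain"
}

resource "databricks_schema" "raw" {
  catalog_name = databricks_catalog.trading.name
  name         = "raw"
}
```

Keeping workspace settings, policies, and catalog structure in Terraform like this makes the platform configuration reviewable and reproducible across environments.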

 

Your profile:

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • 5+ years of experience with cloud big data platforms, preferably Azure or GCP.
  • Hands-on experience with Databricks, Delta Lake, and Spark; familiarity with Kafka, Flink, or similar frameworks is a plus.
  • Experience building data pipelines using tools like Apache Airflow, Data Factory, or Apache Beam.
  • Proficiency in Terraform, Python, and CI/CD tools such as GitHub Actions.
  • Strong understanding of data management, monitoring, security, and privacy.
  • Excellent communication and collaboration skills across cross-functional teams.
  • Fluent in English; additional language skills are a plus.

 

Deutsche Börse Group fosters an international environment in which diversity is a matter of course. This is evident across the board: in our diverse workforce, our day-to-day responsibilities, and our wider areas of activity. We are looking for employees who enjoy working in a dynamic and flexible environment and are willing to put forward innovative ideas for the company. An open mindset, a proactive approach, and self-motivation are prerequisites.
We offer our employees an attractive remuneration package. Benefits include a high level of trust and autonomy in modern, centrally located workplaces where our corporate culture and values are lived day to day.


We value diversity and therefore welcome all applications, regardless of gender, nationality, ethnic and social origin, religion or belief, disability, age, sexual orientation, and identity.


Have we piqued your interest? Then we encourage you to apply now!

 

Do you have questions about the application process or this position?

Please contact us at careers@deutsche-boerse.com or by phone at 069-211-11810. We look forward to getting to know you!


Deutsche Börse Group, Human Resources
https://careers.deutsche-boerse.com/