Scala Developer / Data Engineer (f/m/d)
Publication date: 01 April 2025
Workload: 100%
Place of work: Baden
Workload: 80–100%
Are you passionate about data engineering, large-scale data processing, and building robust pipelines in the cloud? We’re looking for a Scala Developer / Data Engineer to support and evolve our Spark-based credit risk platform while maintaining efficient ETL workflows in Azure Data Factory.
What you will do:
- Develop and maintain features for a Scala-based application running on Apache Spark
- Design and optimize Spark jobs and workflows for scalable data processing
- Identify and resolve performance issues and bugs in the Spark ecosystem
- Build and maintain ETL pipelines in Azure Data Factory (ADF)
- Automate and monitor data integration processes across various platforms
- Collaborate with risk managers, analysts, and engineers to implement data-driven solutions
What you bring & who you are:
- Master’s degree in Computer Science, Engineering, IT, or a related field
- Strong hands-on experience with Scala and Apache Spark
- Proven skills in building and maintaining Azure Data Factory pipelines
- Proficiency in SQL and version control systems (e.g. Git)
- A team-oriented mindset and clear, structured communication skills
- Independent, detail-focused, and adaptable to dynamic environments
About the team:
Join Axpo’s Quant Development team in Baden. We work closely with stakeholders across risk management and business domains to ensure the stability and evolution of key data applications supporting energy trading and credit risk.