Data Engineer
Key information
- Publication date: 18 March 2025
- Workload: 100%
- Contract type: Permanent position
- Place of work: Rue de Langallerie 11, 1003 Lausanne
We, Tradition, are a leading international institutional brokerage group with a 65-year track record of trust, innovation and expertise. Our customers are the largest trading participants in financial and commodity-related markets, such as global and regional investment and commercial banks, hedge funds, energy trading companies and more. Altogether, around 2,500 professionals work in our group, which operates in more than 30 countries across three time zones. We are passionate about building connections in a fast-paced market environment.
At the core of our success lies a strong belief that our people define our business. Teamwork, creativity, reliability, and integrity have been fundamental values since our founding in 1959. We take pride in fostering a workplace that prioritizes leadership, career growth, and superior client service.
Join us and be part of a heritage-driven, innovation-powered environment, where your skills and ambitions can directly make an impact.
In the rapidly growing Central Data Platform Team, we develop, deploy, and maintain robust data pipelines, high-performance web applications, and scalable data warehouse solutions. We implement the latest technologies to drive our activities towards a data-driven digital organization in the AI age. Our mission? Ensure seamless and optimized data access to power Business Intelligence tools, Data Science and AI solutions, and future internal and external digital services.
Your role:
- Develop, deploy, and maintain various data pipelines originating from diverse data sources, serving data scientists, various systems, and many end users.
- Participate in the development of the enterprise data warehouse to support different analytical use cases.
- Conduct code reviews with data scientists and business intelligence developers to provide technical guidance and best practices.
- Work closely with business IT teams, data scientists, and, where applicable, external providers to deploy, maintain, and monitor data processing and machine learning pipelines.
- Develop and maintain advanced CI/CD pipelines to ensure continuous integration, testing, and deployment of the pipelines.
- Develop and maintain documentation of the data architecture and data management processes.
Your profile
- Experience: 2 years of experience in data engineering, DevOps engineering, or software engineering.
- Education: Master’s degree in computer science, information technology, engineering, or an equivalent field.
- Knowledge: Linux, Python, and SQL. Familiarity with orchestration tools such as Airflow.
- Tech Skills: Data storage solutions such as data lakes, data warehouses, and relational databases. Developing and deploying Python and Spark pipelines running on Docker. Building data vaults and developing business-focused data models, with a strong understanding of dbt for data transformation and modelling. Building CI/CD pipelines with Jenkins, GitLab, or similar is a plus.
- Soft Skills: Strong organizational and communication skills within the team and with external stakeholders. Excellent spoken and written English.
- Mindset: A proactive and solution-oriented team player.
Why Join Us?
- Career Growth: Opportunity for development within a global financial powerhouse
- Supportive Culture: A diverse and innovative workplace that values collaboration.
- Competitive Compensation: Attractive salary and variable compensation.
- Comprehensive Benefits: Including pension fund, benefit platforms and more.
- Flexible Work Policy: A clear policy on remote work options to support work-life balance.
It is an exciting time to join our teams. All applications will be handled in the strictest confidence.
Contact
- Marija Mijalova