Data Engineer & Automation Specialist
Key information
- Publication date: 06 February 2025
- Workload: 100%
- Contract type: Permanent position
- Language: English (Fluent), French (Intermediate), Russian (Fluent)
- Place of work: Rue de l'Athénée 27, 1206 Genève
I. Job Responsibilities
1) API Development & Data Extraction
- Develop, optimize, and maintain REST API interactions for structured data extraction.
- Design API request structures to collect pricing, market intelligence, and business data.
- Implement web scraping pipelines using Python (Requests, BeautifulSoup, Selenium) for automated data retrieval.
- Build real-time data synchronization systems that aggregate and process market-related information.
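As an illustration of the response handling this role involves, a minimal sketch of flattening a nested pricing payload into flat records. The payload schema, field names, and symbols below are invented for illustration; in production the payload would come from `requests.get(...).json()`.

```python
# Hypothetical sketch: flatten a nested JSON pricing payload into flat records.
# The {"quotes": [...]} shape is an assumption, not a real API's schema.

def flatten_quotes(payload):
    """Turn a nested {'quotes': [...]} payload into a list of flat dicts."""
    records = []
    for quote in payload.get("quotes", []):
        records.append({
            "symbol": quote["symbol"],
            "price": float(quote["price"]["value"]),
            "currency": quote["price"]["currency"],
            "ts": quote["timestamp"],
        })
    return records

# Example payload, as a stand-in for a live API response.
payload = {
    "quotes": [
        {"symbol": "BRN", "price": {"value": "82.41", "currency": "USD"},
         "timestamp": "2025-02-06T09:00:00Z"},
        {"symbol": "WTI", "price": {"value": "78.12", "currency": "USD"},
         "timestamp": "2025-02-06T09:00:00Z"},
    ]
}

rows = flatten_quotes(payload)
```

The flat records can then be loaded directly into a database table or a Pandas DataFrame downstream.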
2) Automation & System Optimization
- Automate data acquisition, processing, and validation using Python-based workflows.
- Develop lead generation and market intelligence systems by extracting structured data from public and private sources.
- Implement data pipelines for tracking price movements, supply chain trends, and industry-specific datasets.
- Design alerting mechanisms that trigger notifications on pricing fluctuations, API failures, or system anomalies.
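A threshold-based price alert of the kind described above might be sketched as follows; the 5% default threshold and the message format are illustrative assumptions, not prescribed values.

```python
# Hypothetical sketch: flag price moves that exceed a percentage threshold.
# In a real pipeline the returned message would feed a notification channel.

def check_price_alert(previous, current, threshold_pct=5.0):
    """Return an alert message if the move exceeds threshold_pct, else None."""
    if previous == 0:
        return "invalid baseline price"
    change = (current - previous) / previous * 100
    if abs(change) >= threshold_pct:
        direction = "up" if change > 0 else "down"
        return f"price moved {direction} {abs(change):.1f}%"
    return None
```

Called on each new observation, this returns `None` for quiet periods and a message only when the fluctuation crosses the configured threshold.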
3) Database Management & Cloud Infrastructure
- Manage, query, and optimize SQL (PostgreSQL, BigQuery, MySQL) databases for storing extracted data.
- Automate ETL (Extract, Transform, Load) workflows to maintain data integrity and accuracy.
- Deploy and manage Docker-based containerized applications for scalable data processing.
- Utilize Google Cloud Platform (GCP) and Azure for cloud-based data storage, VM management, and distributed computing.
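A minimal, self-contained ETL sketch of the workflows described above. SQLite stands in here for the PostgreSQL/BigQuery/MySQL targets named earlier so the example runs without external services; the table layout and row shape are assumptions.

```python
import sqlite3

def run_etl(raw_rows, conn):
    """Extract raw rows, drop invalid entries (transform), load into a prices table."""
    conn.execute("CREATE TABLE IF NOT EXISTS prices (symbol TEXT, price REAL)")
    # Transform: keep only rows with a symbol and a price, coercing price to float.
    clean = [(r["symbol"], float(r["price"])) for r in raw_rows
             if r.get("symbol") and r.get("price") is not None]
    conn.executemany("INSERT INTO prices VALUES (?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
raw = [{"symbol": "BRN", "price": "82.41"},
       {"symbol": None, "price": "10"},   # dropped: missing symbol
       {"symbol": "WTI", "price": 78.12}]
loaded = run_etl(raw, conn)
```

The same extract-validate-load structure carries over to a production warehouse; only the connection object and DDL change.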
4) Data Processing & Analytics
- Apply statistical models to detect pricing trends, outliers, and market anomalies.
- Use Pandas and NumPy for data manipulation, aggregation, and feature engineering.
- Develop Power BI dashboards for real-time data visualization and reporting.
- Implement automated reporting systems that integrate with APIs and internal databases.
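Outlier detection of the kind described above can be sketched with a simple z-score filter in Pandas; the 2.0 threshold is an illustrative choice, not a prescribed value.

```python
import pandas as pd

def flag_outliers(prices, z_threshold=2.0):
    """Return the prices whose z-score magnitude exceeds z_threshold."""
    s = pd.Series(prices, dtype=float)
    z = (s - s.mean()) / s.std()   # sample standard deviation (ddof=1)
    return s[z.abs() > z_threshold].tolist()

flagged = flag_outliers([100, 101, 99, 100, 100, 101, 99, 150])
```

In practice such a filter would run over a rolling window of recent observations rather than the full series.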
II. Required Skills & Technologies
1) Programming & Data Engineering
- Python (Requests, Selenium, BeautifulSoup, Pandas, NumPy)
- SQL (BigQuery, PostgreSQL, MySQL)
- C# / Scala (for backend system automation and API integration)
- Docker (for deploying and managing containerized data processing applications)
- ETL Pipelines (Automated Extract, Transform, Load processes)
- Machine Learning & AI: TensorFlow, K-Nearest Neighbors (KNN), Sentiment Analysis, Named Entity Recognition (NER)
2) Automation & Cloud Infrastructure
- Web Scraping & API Integration (Requests, JSON parsing, OAuth authentication)
- Google Cloud Platform (GCP) / Azure (VM management, cloud storage, API hosting)
- Power BI (Automated reporting, business intelligence dashboards)
- Data Workflow Automation (Task scheduling, event-driven scripting)
3) Data Processing & Optimization
- Automated Data Cleansing & Validation (Error detection, duplicate handling)
- Statistical & Trend Analysis (Pattern recognition, outlier detection)
- Anomaly Detection Systems (Real-time alerting for pricing changes, API failures)
4) Languages
- Russian: native-level proficiency required, necessary for:
o day-to-day interactions with partners in Eastern European and post-Soviet countries
o processing Russian-language market data, API documentation, and business intelligence reports
- English (fluent, for technical documentation and communication)
- French (basic, A2 level, for local interactions)
III. Education & Experience
- Previous Experience in:
- API automation, cloud-based data pipelines, market intelligence systems
- Price extraction, lead generation, competitor tracking
- Data analytics, dashboard development, system control optimization
- Financial Data Knowledge: understanding of financial markets, pricing models, and economic indicators to support data extraction and analysis. Ability to interpret financial datasets and integrate them into automated workflows.
- Business & Collaboration Tools: proficiency in Microsoft Excel (advanced formulas, VBA, pivot tables), Microsoft Teams, Zoho CRM, and other collaboration platforms for workflow automation, reporting, and team coordination.
IV. Additional Requirements
- Experience in scaling automation workflows for high-frequency data collection.
- Ability to analyze and process unstructured financial or market data.
- Strong understanding of cloud-based data engineering, automated monitoring, and anomaly detection.
- Familiarity with business process automation, API management, and cloud integration.
Contact
Swiss Petroleum Card SA