Senior Platform Engineer
Publication date: 22 October 2024
Workload: 100%
Contract type: Permanent position
Place of work: Nagavara, Bangalore
Job Description
The Future Begins Here
At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet.
Bengaluru, India’s epicenter of innovation, has been selected as home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine contributing to global impact and improvement.
At Takeda’s ICC we Unite in Diversity
Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators’ journey at Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team.
The Opportunity
As a Data Platforms Engineering leader, you will have direct business impact and alignment with the vision of the Head of Data Platforms and Architecture. The role is a key enabler for Takeda’s strategy to become a Data-Driven Enterprise. By connecting with Business Units and Business Functions within Takeda’s Global Business and with their data teams, the data platform lead will strategically architect data, processes, and technology to achieve faster time to market for life-saving products. Ultimately, this work helps Takeda make better decisions that improve the quality and efficiency of care for patients. You will develop data-driven solutions utilizing current and next-generation technologies to meet evolving business needs. You will quickly identify opportunities and recommend possible technical solutions. You will develop application systems that comply with the standard system development methodology and concepts for design, programming, backup, and recovery to deliver solutions with superior performance, reliability, and integrity.
As part of our transformational journey on Data & AI in Operations, we are taking steps to advance to a Data Mesh architecture. The current Datalake gives all Operations units access to critical data and analytic tools at pace, accelerating their work on life-saving medicines. The vision of Enterprise Data Services (EDS) is also accelerating Operations’ data strategy of making our data Findable, Accessible, Interoperable, and Reusable. This is being achieved by creating a distributed data architecture and managing our data and data products, which will sit as the centerpiece of this strategy and the future evolution of Data Science.
Responsibilities
- Create best practices and thought leadership content for the federated delivery teams building data solutions and products on Enterprise Data platforms that cater to batch, streaming, and real-time data.
- Influence stakeholders at all levels through complex engagement models with the wider cloud ecosystem, including but not limited to AWS foundations for infrastructure and data technologies, Databricks, Informatica, Kafka, Managed File Transfer, and 3rd-party applications, ensuring they are excited by the Enterprise Data Services vision and solution strategy.
- Work closely with Enterprise Architects and Business Engagement leads to develop a holistic understanding of the evolving Data Platforms landscape and how it aligns with Business Units and Business Functions within Japan.
- Provide a roadmap for modernizing legacy capabilities inherent to the current platform. Support all data platform initiatives – Data Lake Strategy, Data Engineering and Platform development, Data Governance, Security Models, and Master Data Management.
- Establish a collaborative engineering culture based on trust, innovation, and a continuous improvement mindset. Utilize industry best practices and agile methodologies to deliver solutions and extract efficiency through continuous integration and delivery automation. Manage efforts to problem-solve engineering challenges and coordinate with project consultants and delivery/engagement managers.
- As a leading technical contributor, consistently take a poorly defined business or technical problem, refine it into a well-defined data problem or specification, and execute it at a high level, maintaining a strong focus on metrics for both the work’s impact and its engineering and operations.
- Understand the data platform investments, create data tools for the consumption of services, and uncover opportunities for cost optimization, assisting the team in building and optimizing our platforms into an innovative unit within the company.
Skills and Qualifications
- Bachelor’s degree or higher in Computer Science/Information Technology, or relevant work experience.
- Knowledge of Enterprise Architecture.
- Identify and highlight risks and issues within the project and escalate appropriately for resolution. Devise effective mitigation and escalation strategies for projects to address risks and issues.
- Assist in developing one or multiple product strategies and drive the product priority-setting in response to business needs aligned with IT architecture, deployment, and release management.
- Evaluate existing business processes and find opportunities to integrate and align a broad set of stakeholders' perspectives seamlessly.
- Support teams in transforming their ways of working, mindsets, and behaviors toward product-centricity and digital dexterity.
- Monitor portfolio progress and related milestones, identify gaps, and make strategic recommendations.
- 5+ years of relevant work experience in data platforms, solutions, and delivery methodologies (Java, Python, Spark, Hadoop, Kafka, SQL, NoSQL, Postgres, and/or other modern programming languages and tools such as JIRA, Git, Jenkins, Bitbucket, Confluence).
- Familiarity with the core technology stack, including Databricks Lakehouse (Delta Lake) or an equivalent such as BigQuery or Snowflake; SQL, Python, and Spark; AWS; and Prefect or Airflow.
- Deep Specialty Expertise in at least one of the following areas:
- Experience scaling big data workloads that are performant and cost-effective.
- Experience with Development Tools for CI/CD, Unit and Integration testing, Automation and Orchestration, REST API, BI tools, and SQL Interfaces (e.g., Jenkins)
- Experience designing data solutions on cloud infrastructure and services, such as AWS, Azure, or GCP, using best practices in cloud security and networking.
- 5+ years’ experience in a customer-facing technical role with expertise in at least one of the following:
- Software Engineer/Data Engineer: data ingestion; streaming technologies such as Spark Streaming and Kafka; performance tuning; and troubleshooting and debugging Spark or other big data solutions (a minimal illustrative sketch follows this list).
- Experience with ETL/Orchestration tools (e.g., Informatica, Airflow, etc.)
- Industry experience working with public cloud environments (AWS, GCP, or Azure) and an associated deep understanding of failover, high availability, and high scalability.
- Data ingestion using one or more modern ETL computing and orchestration frameworks (e.g., Apache Airflow, Luigi, Spark, Apache NiFi, Flink, and Apache Beam).
- 3+ years of experience with SQL or NoSQL databases: PostgreSQL, SQL Server, Oracle, MySQL, Redis, MongoDB, Elasticsearch, Hive, HBase, Teradata, Cassandra, Amazon Redshift, Snowflake.
- Advanced working SQL knowledge, experience authoring SQL and working with relational databases, and working familiarity with a variety of database systems.
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Working knowledge of message queuing, pub/sub and stream processing, and highly scalable ‘big data’ data stores.
- Outstanding communication and relationship skills, ability to engage with a broad range of partners, and ability to lead by influence.
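For illustration of the streaming ingestion skills named above, the sketch below shows a minimal Spark Structured Streaming job that reads a Kafka topic and appends it to a Delta Lake table. It is a hypothetical example, not a Takeda pipeline; the broker address, topic name, and storage paths are placeholder assumptions.

```python
# Minimal sketch (hypothetical): ingest a Kafka topic into a Delta Lake table
# with Spark Structured Streaming. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-to-delta-ingestion")
    # Assumes the Kafka and Delta connectors are on the cluster classpath
    # (bundled on Databricks; elsewhere add the spark-sql-kafka and delta packages).
    .getOrCreate()
)

# Read the raw event stream from Kafka; key/value arrive as binary columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
    .option("subscribe", "manufacturing.events")          # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(
        col("key").cast("string").alias("event_key"),
        col("value").cast("string").alias("payload"),
        col("timestamp").alias("ingested_at"),
    )
)

# Append the stream to a Delta table; the checkpoint lets the query recover
# from restarts without duplicating writes.
query = (
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/datalake/_checkpoints/events")  # placeholder path
    .start("/mnt/datalake/bronze/events")                               # placeholder path
)

query.awaitTermination()
```

In practice, the same pattern scales from a single topic to the batch, streaming, and real-time workloads described in the responsibilities above; the checkpoint location is what allows the stream to be stopped, tuned, and resumed safely.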
BENEFITS:
It is our priority to provide competitive compensation and a benefits package that bridges your personal life with your professional career. Amongst our benefits are:
- Competitive Salary + Performance Annual Bonus
- Flexible work environment, including hybrid working.
- Comprehensive Healthcare Insurance Plans for self, spouse, and children
- Group Term Life Insurance and Group Accident Insurance programs.
- Employee Assistance Program
- Broad Variety of learning platforms
- Diversity, Equity, and Inclusion Programs
- Reimbursements – Home Internet & Mobile Phone
- Employee Referral Program
- Leaves – Paternity Leave (4 Weeks), Maternity Leave (up to 26 weeks), Bereavement Leave (5 days)
ABOUT ICC IN TAKEDA:
- Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day.
- As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization.
#LI-Hybrid
Locations: IND - Bengaluru
Worker Type: Employee
Worker Sub-Type: Regular
Time Type: Full time