Senior Data Engineer

Permanent employee, Full-time · Mumbai

Overview
We are looking for a Senior Data Engineer to take responsibility for designing, developing, and maintaining data pipelines and for building, deploying, and enabling Azure cloud services, with hands-on development throughout.
The Senior Data Engineer will oversee GPI's data integration work, including developing robust data models, building data warehouses and data lakes, and writing scripts for data integration and analysis. The candidate will ensure that automated daily data updates and data management are successfully implemented, alongside enablement of our designated Azure services.
This role will work closely and collaboratively with members of our established Data teams, Product team, and Software Engineering teams to define requirements, mine and analyse data, integrate data from a variety of sources, and deploy high-quality data pipelines in support of the analytics needs of GPI. 
We value people who are engaged and who care about customers, about the code they write, and about the contribution they make to GPI. People who can apply themselves to a wide range of problems and challenges, and who can work across teams, do great things here at GPI.
We’re open-minded when it comes to hiring, and we care more about culture fit and technical aptitude than anything else. If you care enough to find elegant solutions to difficult technical problems, we’d love to hear from you.
Responsibilities and Duties
  • Design, implement, test, deploy, and maintain stable, secure, and scalable data engineering solutions and pipelines in support of data and analytics projects, including integrating new sources of data into our central data warehouse/data lake, and moving data out to applications and affiliates.
  • Deliver the design and implementation of data lakes and data warehouses for our Azure architecture.
  • Design and deliver the data models required to migrate GPI's manual data collection (from the different data teams) and overall data management to an online workflow on our new Azure architecture.
  • Ensure all requirements are meticulously captured from the different data teams and software engineers when designing the Azure stack, data models, and overall architecture.
  • Maintain and provide continuous support to the data teams and other technical teams, ensuring the overall Azure architecture implementation is optimised for the needs of the business.
  • Develop and maintain all ETL/ELT pipelines on our Azure platform for the different technical teams.
  • Able to begin this position as an individual contributor and ensure that the overall architecture transformation is realised within the first year.
  • Build reports and data visualizations, using data from the data warehouse and other sources.
  • Be able to build, lead, and manage a data engineering team at GPI.
Key Subject Matter / Technical Skills:
  • At least 6 years of experience designing Microsoft Azure architecture/services, as well as data warehouse and data lake modelling and database systems.
  • Extensive experience with Kubernetes and containerisation technologies and development
  • Previous experience working on SaaS products as a Senior Data Engineer or Data Engineer
  • Extensive experience in designing Azure Synapse Analytics solutions and providing strong integration with other Azure cloud architecture services. 
  • Ability to build and deliver data pipelines that ingest structured and unstructured data from multiple sources into the target data lake and data warehouse. 
  • Experience integrating a wide variety of web-scraped data into the Azure platform.
  • Extensive experience in Python development and scripting
  • Able to develop and optimise queries in SQL and NoSQL databases
  • Strong grasp of Extract, Transform, Load (ETL) and ELT concepts
  • Experience creating large-scale data pipelines using Azure Data Factory for data teams, automating daily data manipulation and data updates.
  • Strong experience with Azure: ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Cassandra, Azure Functions, Serverless Architecture, ARM Templates
  • Experience working with varied forms of data infrastructure, including Hadoop, Spark, and column-oriented databases.
  • Previous hands-on work with real data challenges involving high volume, velocity, and variety of data.
  • Familiarity with APIs (REST, SOAP, and GraphQL)
  • Experience with CI/CD workflows (e.g., Azure DevOps) and Git best practices
  • Experience in Microsoft .NET Framework development.
  • Familiarity with working on Agile projects
  • Excellent analytical and problem-solving skills, with a willingness to take ownership and resolve issues.
About us
At GPI, our vision and goal is to support the biopharma industry and payers in achieving a sustainable balance between the pricing of medicines and patient access through the use of data-driven, evidenced-based strategies, analytics and technology innovation. Our progressive thinking and creative approach to market access is what sets us apart.
We are a dynamic, growing organisation with a diverse group of motivated employees, all working together in an environment that fosters creativity and innovation. Our company values drive the way we do business and help us attract the best talent worldwide.
I’m proud of the team at GPI and the valuable work we deliver for our clients.

Preeti Patel 
GPI Chief Executive Officer
We are looking forward to hearing from you!
Thank you for your interest in GPI. Please fill out the following short form. If you have difficulties uploading your data, please send an email to careers@globalpricing.com.