Data Engineer (ETL Developer)

📊AI / Data Scientist
🌏Taguig
🏢 On-site
💼Full time

Job Description

The Data Engineer participates in the design, development, and implementation of data warehouse solutions. The data warehouse focuses on data and analytics that support business operations across Clinical, Hospital Performance, and Corporate functions.


This role is cloud-facing and requires experience building and managing data engineering code in modern cloud-based technologies such as Google BigQuery or equivalent. We are looking for a high-energy individual who is willing to learn and evolve and would like to contribute to a high-impact healthcare environment.


Primary duties and responsibilities:

  • Assembling small to medium-complexity data sets that meet functional and non-functional business requirements
  • Building data pipelines for optimal extraction, transformation, and loading of data from various sources using GCP and SQL technologies (a minimal pipeline sketch follows this list)
  • Building analytical tools that use the data pipelines to provide actionable insights into key business performance metrics
  • Working with stakeholders, including data, design, and product teams, and assisting them with data-related technical issues
  • Participating in unit and system testing, following existing change control processes for promoting solutions to production, and escalating issues as needed
  • Designing, developing, reviewing, testing, and deploying data warehouse solutions using CI/CD
  • Identifying data integrity issues and analyzing data and process flows for process improvement opportunities
  • Monitoring system performance and evaluating query execution plans to improve overall performance
  • Working with the Integration Architect to develop, test, and deploy data pipelines
  • Participating in troubleshooting and maintaining existing solutions as required
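
For context, a pipeline step of the kind described above might look like the following. This is a minimal sketch, assuming the official google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical placeholders, not this team's actual schema. The dry run illustrates the kind of cost check used when evaluating query performance.

```python
# A minimal sketch of one BigQuery ELT step, assuming google-cloud-bigquery.
# All project/dataset/table names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

sql = """
    SELECT patient_id, DATE(admitted_at) AS admit_date, COUNT(*) AS visits
    FROM `example-project.clinical.encounters`
    GROUP BY patient_id, admit_date
"""

# Dry-run first to estimate scanned bytes before promoting the query.
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
dry_job = client.query(sql, job_config=dry_cfg)
print(f"Query would scan {dry_job.total_bytes_processed} bytes")

# Run for real, reloading a reporting table (WRITE_TRUNCATE = full refresh).
dest = bigquery.TableReference.from_string("example-project.reporting.daily_visits")
job_cfg = bigquery.QueryJobConfig(destination=dest, write_disposition="WRITE_TRUNCATE")
client.query(sql, job_config=job_cfg).result()  # blocks until the job finishes
```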


Qualifications:

  • Bachelor’s degree from an accredited college/university in a technology-related field, or an equivalent combination of education, training, and experience.
  • Ability to build and optimize data sets and ‘big data’ pipelines.
  • Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
  • Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata
  • Strong SQL and relational database design/development skills
  • Development experience with modern cloud-based data warehouses such as Google BigQuery or equivalent.
  • Demonstrated understanding of and experience with relational SQL databases, including BigQuery or equivalent, and functional/object-oriented scripting languages, including Scala, Java, and Python.
  • Familiarity with big data tools such as Kafka and Spark, and workflow management tools such as Airflow (a minimal DAG sketch follows this list).
  • Prior experience with health information systems and/or patient financial service systems is a plus.
  • Experience with Databricks, Data Lakehouse architecture, and modern data pipelines is a plus.
  • Excellent written and verbal communication skills.
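
As an illustration of the Airflow qualification above, here is a minimal DAG sketch, assuming Airflow 2.4+ with the Google provider package (apache-airflow-providers-google) installed. The DAG id, schedule, and table names are illustrative placeholders, not an actual pipeline in this environment.

```python
# A minimal Airflow DAG sketch: one nightly BigQuery transformation task.
# Assumes Airflow 2.4+ and apache-airflow-providers-google; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_warehouse_load",   # hypothetical DAG name
    schedule="0 2 * * *",            # run nightly at 02:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # One ELT step: transform staged rows into a reporting table inside BigQuery.
    load_daily_visits = BigQueryInsertJobOperator(
        task_id="load_daily_visits",
        configuration={
            "query": {
                "query": """
                    SELECT patient_id, DATE(admitted_at) AS admit_date, COUNT(*) AS visits
                    FROM `example-project.clinical.encounters`
                    GROUP BY patient_id, admit_date
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "reporting",
                    "tableId": "daily_visits",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```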