Data Engineer

Details of the offer

As a Data Engineer, you will report directly to the Head of Data and Analytics and will be responsible for the design and development of data models and data pipelines supporting the company's platform and apps, as well as for providing data and reporting support for business end users.

What you'll be working on:

- Design and develop the reporting data model and data transformation jobs, including the modelling of very large data sets.
- Identify and implement the most efficient ways of performing data transformation tasks using best-practice methods and tooling.
- Prepare and maintain documentation such as business requirements documents, design specifications and test cases.
- Work with stakeholders (including the data team, software engineers and the product team) to understand business requirements and translate them into technical specifications.
- Lead the data migration and modelling process from GCP to the data warehouse.
- Take responsibility for data warehouse administration, user access and security.
- Contribute to the design and implementation of our data model and ETL framework.

What we're looking for:

- Minimum 2 years of experience in a data engineering environment, with hands-on experience building and maintaining complex data environments in the cloud (preferably GCP BigQuery and/or Snowflake).
- Extensive experience with SQL (Postgres preferred), with a core focus on analyzing and validating complex and disparate data sets to find gaps between datasets, requirements and source systems.
- Demonstrated understanding of and experience with the following data engineering competencies:
  - Data warehousing principles, including data architecture, modelling, database design and performance optimization best practices.
  - Building group data assets and pipelines from scratch by integrating large quantities of data from disparate internal and external sources.
  - Supporting analytics solutions through to production, including deployment, automation, orchestration, monitoring and logging, preferably with an ETL tool such as Matillion, DBT or equivalent.
- Experience deploying cloud infrastructure as code (IaC) using Terraform or similar.
- Experience using Python to develop scripts and small programs for job orchestration and/or data manipulation.
- Ability to interact with business end users to draw out and distil business requirements into data pipeline designs and reporting solutions.
- Ability to prioritize on the fly and work in a high-performing, outcomes-focused environment with multiple competing and ambiguous deliverables.
- Experience working in an Agile development environment.

Location: Makati or Iloilo
Work Arrangement: Hybrid/Remote, Dayshift


Nominal Salary: To be agreed

Source: Whatjobs_Ppc

