
Data Cloud Engineer
- Lisboa
- Permanent
- Full-time
- Ensure that data engineers can deploy data pipelines at scale on GCP.
- Manage and maintain data repositories using GitLab.
- Utilize dbt for data transformation and modeling, ensuring data quality and consistency.
- Audit data transformation workflows via code reviews.
- Employ Terraform for infrastructure as code (IaC), automating the provisioning and management of our data infrastructure.
- Design, develop, and maintain robust and scalable data pipelines using Airflow.
- Implement and maintain CI/CD pipelines for data engineering workflows.
- Monitor and troubleshoot data pipeline performance, identifying and resolving bottlenecks.
- Implement data quality checks and monitoring to ensure data accuracy and reliability.
- Contribute to the development of data engineering best practices and standards.
- Automate data-related tasks to improve efficiency and reduce manual effort.
- Cooperate with the support teams and provide L3 support to comply with predefined SLAs.
- University degree in computer science, engineering, or a comparable field.
- 3-5 years of experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL and Python, and experience with containerized applications.
- Experience with data warehousing concepts and technologies.
- Hands-on experience with Google Cloud Platform (GCP) data services (e.g., BigQuery, Cloud Storage).
- Knowledge of dbt and version control tools (e.g., Git).
- Experience with infrastructure as code (IaC) using Terraform.
- Experience with workflow orchestration tools such as Airflow.
- Experience with DevOps workflows.
- Strong problem-solving and analytical skills, with a mindset oriented toward system optimization.
- Excellent communication and collaboration skills.
- Experience with CI/CD pipelines.
- Experience working in a terminal/command-line environment.
- Clean coding mindset.
- Experience with the Kotlin programming language, including Gradle.
- Knowledge of data visualization tools (e.g., Looker).
- Familiarity with the Scrum methodology.