About the Role:
Join us as a Data Developer and help build and maintain data solutions in AWS for one of our largest clients. You'll design, develop, test, and deploy new features, working with large datasets, Python, Spark, and a range of AWS services. You'll also mentor others, lead projects, and shape the team's success.
We are looking for experienced Data Engineers to help a large customer migrate legacy ETL jobs (currently built on Glue, Lambda, and Snowflake for business systems) to their Airflow/Databricks standard.
Candidate Location:
Latin America
Remote
Requirements:
● Level of Experience - 10+ years overall
● 8+ years of data engineering experience.
● 4+ years with Python.
● 4+ years with Airflow.
● 4+ years with Databricks and PySpark.
● 2+ years with AWS Glue.
● 2+ years with Snowflake.
● 4+ years with SQL; some NoSQL experience.
● Experience processing large datasets.
● CI/CD experience (Git, Jenkins).
● Experience with databases (Oracle, Redshift, Aurora).
● Experience with AWS compute, database, and management tools.
● Skills Required - ETL, data flows, ingestion, reporting, analytics, general communication, and technical troubleshooting.
● Primary Technology Stack - AWS, Glue, Snowflake, Airflow, Databricks, ETL, Lambda
● Primary Responsibilities - Help migrate a number of legacy ETL jobs (Lambda, Glue, Snowflake) to the customer's new ETL standard (Airflow + Databricks); a minimal sketch of the target pattern follows this list. The consultants will be part of the broader customer team executing this project, but are expected to operate independently.
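
To give a flavor of the target stack, below is a minimal sketch of an Airflow DAG that submits a PySpark job to Databricks using the Databricks provider. The DAG id, connection id, cluster spec, and file path are illustrative assumptions for this posting, not the customer's actual configuration.

# A minimal sketch of the target pattern: an Airflow DAG submitting a
# PySpark job to Databricks. All names below (dag_id, connection id,
# cluster spec, file path) are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="legacy_etl_migration_example",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a one-off PySpark run to Databricks; this plays the role
    # the legacy Glue job or Lambda handler used to play.
    run_etl = DatabricksSubmitRunOperator(
        task_id="run_migrated_etl",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        spark_python_task={
            # Hypothetical location of the migrated PySpark script.
            "python_file": "dbfs:/jobs/migrated_etl.py",
        },
    )

In practice, each legacy job would become one or more tasks like this, with scheduling and dependencies moved from Lambda triggers and Glue workflows into the DAG itself.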
Good to Have:
● Java/Spring Boot web app development experience.
● Microservices/RESTful API experience.
● AWS Certification/Cloud migration experience.