As a Senior Data Engineer at Andina Tech, you’ll play a key role in managing and optimizing large-scale data infrastructures in the cloud. You’ll work with cutting-edge technologies, solving complex problems in data processing, ETL pipelines, and cloud environments to deliver scalable and high-performance solutions.
Your Mission
- Work with AWS technologies to build and maintain large-scale data processing solutions, ensuring data integrity and high performance.
- Develop and optimize ETL processes in a cloud environment, ensuring efficient data ingestion, transformation, and loading.
- Contribute to the design and maintenance of Data Lakes, Data Warehouses, and Data Lakehouses, ensuring scalability and security.
- Work with multidimensional modeling to support analytics and business intelligence.
- Manage code versioning and collaborate efficiently using Git and GitHub.
- Automate CI/CD pipelines to ensure quick and secure delivery of data solutions.
What We’re Looking For
- A strong passion for cloud computing, data infrastructure, and modern data engineering, with a focus on performance, reliability, and security.
- Proven ability to deliver high-quality, scalable solutions that meet clients' evolving needs.
- A track record of taking ownership of projects, and an appreciation for a culture of trust, accountability, and mutual respect.
Must-Have Skills
- Experience with AWS Glue, Amazon S3, Amazon Athena, Amazon EC2, Amazon DynamoDB, and AWS Lambda.
- Expertise in ETL processes within cloud environments.
- Proficiency in Python and PySpark for large-scale data processing.
- Strong knowledge of SQL and NoSQL databases.
- Hands-on experience in Terraform and Docker for infrastructure automation and containerization.
- Proficiency in Git and GitHub for efficient collaboration and version control.
- Experience in automating CI/CD pipelines.
- Proficiency in English (advanced level).
Nice-to-Have Skills
- Experience with AWS Step Functions, Redshift, and EMR.
- Familiarity with agile methodologies and tools (Scrum, Jira).
- Linux/Shell scripting skills.
- Experience with pipeline orchestration tools such as Airflow.
- Familiarity with lakehouse table formats (Apache Iceberg, Apache Hudi, Delta Lake).
- Knowledge of GitHub Actions for continuous integration and delivery.
- A Computer Science degree is preferred, but we value non-traditional backgrounds as well.