Bachelor Degree · Portuguese · Airflow · Cloud · ETL · Google Cloud Platform · NoSQL · Python · Spark · SQL
Job Description
- Implement monitoring systems and routines (data, applications, queries, etc.)
- Evolve data models, architecture, and data pipelines
- Implement data migration, processing, and storage routines (ETL)
- Develop integrations between different data sources (RDS, external APIs, etc.)
- Implement and execute load testing
- Monitor running data pipelines
Job Requirements
- Experience with Python
- Experience with GCP
- Experience with continuous integration and cloud deployment
- Experience with data lakes, data warehouses, and data marts
- Knowledge of SQL and NoSQL databases
- Familiarity with cloud platform services
- Knowledge of event-driven architecture
- Experience with streaming projects is a plus
- Experience with Airflow is a plus
- Distributed data processing (Spark or similar) is a plus
Benefits
- Remote work
- Full-time
- Opportunities for professional development