Data Engineer

Full Time · Remote · Team of 474 · Since 2014

Location

New York

Posted

16 days ago

Salary

Not specified

Bachelor's Degree · 2 yrs exp · English · Apache Airflow, AWS Lake Formation, AWS Lambda, AWS SNS, Azure Blob Storage, Azure Data Warehouse, Databricks, dbt, Delta Lakes, Fivetran, Informatica, JavaScript, Matillion, Python, Redshift, S3, Shell, Snowflake, Spark, SSIS, Talend

Job Description

Data Engineer

Location: US / Canada (Eastern Time), home based
Job Type: Full-time, permanent

About AllCloud

AllCloud is a leader in amplifying organizations’ cloud potential through AI and in AI-led professional and managed services. With a track record of hundreds of successful migrations and implementations across AWS and Salesforce, AllCloud has developed strategies and solutions that enable businesses of all sizes to remain at the forefront of innovation. As an AWS Premier and audited Managed Services Partner and a Salesforce Consulting Partner, AllCloud provides comprehensive AI-led cloud journey support, from initial migration to ongoing management through its Engage Managed Services. Its expertise ensures that clients remain aligned with ecosystem best practices while focusing on their core business growth. AllCloud serves clients across the globe with offices in EMEA and North America. www.allcloud.io

Job Summary

Are you passionate about data and about delivering solutions that turn clients’ data into valuable, actionable information for their business? We are hiring Data Engineers with strong experience across the entire cloud data stack. The ideal candidate will have extensive experience in data pipelines (ELT/ETL), data replication, data warehousing and dimensional modeling, and the curation of data sets for data scientists and business intelligence users, along with excellent problem-solving ability when dealing with large volumes of data.

How You'll Make Your Mark

  • Building scalable cloud data solutions using MPP data warehouses (Snowflake, Redshift, or Azure Data Warehouse/Synapse), data storage (S3, Azure Blob Storage, Delta Lakes, or AWS Lake Formation), and analytics platforms (e.g. Spark, Databricks)
  • Creating data pipelines and transformations with ETL tools such as Matillion, Fivetran, Informatica, Talend, or SSIS
  • Building transformations with dbt
  • Working with Elasticsearch/OpenSearch
  • Loading historical data into a data warehouse
  • Scripting in Python, JavaScript, or shell
  • Orchestrating workflows using Apache Airflow, AWS Step Functions, etc.
  • Applying automated promotions, SCM tools, and CI/CD best practices
  • Modeling and curating data for visualization and predictive-modeling users
  • Designing and implementing AWS and/or Azure services such as Lambda and SNS
  • Creating data integrations with scripting languages such as Python
  • Writing complex SQL queries, stored procedures, etc.

Summary of Requirements & Experience

  • Experience working at a consulting company
  • Experience with Data Vault architecture
  • Matillion Associate Certification
  • Snowflake SnowPro
  • AWS Data & Analytics Specialty
  • AWS Database Specialty
  • AWS Solutions Architect Associate
  • AWS Developer Associate
  • AWS Glue

Why Work for Us?

Our team inspires progress in each other and in our customers through our relentless pursuit of excellence; you will work with leaders who promote learning and personal development. AllCloud is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics, or any other basis forbidden under federal, provincial, or local law.

Job Requirements

  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field; commensurate work experience will be considered in lieu of a degree
  • 2+ years’ experience building scalable cloud data solutions using MPP data warehouses (Snowflake, Redshift, or Azure Data Warehouse/Synapse), data storage (S3, Azure Blob Storage, Delta Lakes, or AWS Lake Formation), and analytics platforms (e.g. Spark, Databricks)
  • 3+ years’ experience writing complex SQL queries and scripts
  • 3+ years’ experience building data pipelines via Python, Spark, or GUI-based tools
  • 3+ years’ experience loading historical data to data warehouses
  • 3+ years’ experience with AWS and/or Azure Cloud
  • 3+ years’ experience developing and deploying scalable enterprise data solutions (enterprise data warehouses, data marts, ETL/ELT workloads, etc.)
  • 3+ years of supporting business intelligence and analytic projects
  • 2+ years’ DevOps experience
  • 2+ years’ experience working in an environment with automated promotions to production
  • Familiarity with ETL/ELT patterns and methodologies
  • Good understanding of code repositories such as Git
  • Excellent written and oral communication skills
Pluses

  • Experience with a business intelligence tool such as Tableau, Power BI, Sigma, etc.
