Data Engineer
Job Description
We are seeking a skilled Data Engineer to design, build, and maintain efficient data pipelines (ETL processes). This role involves integrating data from various source systems into our data warehouse, developing and optimizing data warehouse schemas, and crafting complex SQL queries. You'll also use scripting languages such as Python to transform and aggregate large datasets. Implementing data quality measures and collaborating with cross-functional teams are key responsibilities, and documenting pipeline designs and data flows is essential for transparency and future reference. The role also requires handling multiple projects, prioritizing tasks, and communicating progress to stakeholders to meet deadlines.
Qualifications
1. Bachelor's or Master's degree in Computer Science, Mathematics, Physics, or a related field.
2. Minimum of 3 years of experience in data engineering or backend data development.
3. Expertise in SQL with experience in data modeling and building data warehouse solutions.
4. Proficiency in a programming language (e.g., Python) for data processing and pipeline automation.
5. Familiarity with ETL tools and workflow orchestration frameworks like Apache Airflow.
6. Experience implementing data quality checks and working with large datasets.
7. Strong problem-solving, communication, and teamwork skills for cross-functional collaboration.
Benefits
- Stock grant opportunities based on role, employment status, and location.
- Additional perks and benefits dependent on employment status and country.
- Remote work flexibility with optional WeWork access.
Apply Now
