Job Description
Responsibilities:
- Design and implement ETL processes to consolidate data from multiple sources.
- Build and maintain automated data pipelines that refresh the warehouse regularly.
- Ensure data quality through validation checks, monitoring, and troubleshooting.
- Document data sources, transformation rules, and maintain clear data dictionaries.
- Collaborate with business teams to translate reporting requirements into technical data structures.
- Support Data Analysts by structuring data for dashboards and reports.
Requirements:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- 2-3 years of experience with SQL and database management.
- Experience with ETL tools and data pipeline development.
- Familiarity with cloud data warehouses (BigQuery, Snowflake, Redshift, or similar).
- Basic programming skills in Python or similar languages.
- Understanding of data modeling concepts and best practices.
Preferred Skills:
- Experience with data quality frameworks and testing.
- Knowledge of version control (Git).
- Exposure to analytics platforms like Amplitude or Google Analytics.
- Understanding of event tracking and instrumentation.
What We Offer
- Competitive salary
- Health insurance
- Flexible working hours
- Professional development
- Team events
- Modern equipment