Data Engineer II
Job Summary
About the Role
We're looking for a Data Engineer who's passionate about building reliable, scalable, and cost-efficient data systems. You'll work with a modern stack (Kafka, Google Cloud Platform (GCP), AWS) to design and maintain the pipelines that power analytics, machine learning, and product insights.
This is an ideal role for someone with solid foundational skills in data engineering who's ready to deepen their expertise, take ownership of workflows, and collaborate across teams.
If you don't know every tool in our stack yet, that's okay. We value curiosity, problem-solving, and a willingness to learn just as much as existing technical skills.
What's On Your Plate
Design, build, and maintain data pipelines and workflows for batch and streaming use cases.
Work with Kafka to manage real-time data ingestion and event-driven architectures.
Leverage GCP and AWS services for storage, processing, and orchestration (e.g. BigQuery, Dataflow, S3, Lambda).
Orchestrate workflows using tools like Airflow or similar schedulers.
Ensure data quality and reliability through monitoring, alerting, and automated validation.
Collaborate with analysts, data scientists, and product teams to understand requirements and deliver data solutions that drive business impact.
Optimize for cost and performance across cloud environments.
Participate in code reviews, documentation, and knowledge sharing to raise the bar for the team.
Our Tech Stack
Data Ingestion & Streaming: Apache Kafka, Kafka Connect
Cloud Platforms: Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage), AWS (S3, Lambda, Glue)
Workflow Orchestration: Apache Airflow
Programming Languages: Python, SQL (bonus: Java/Scala)
Infrastructure & DevOps: Terraform, CI/CD pipelines, Docker
Monitoring & Observability: Grafana, Prometheus, cloud-native tools
Qualifications:
What We're Looking For
Experience (1-3 years) in data engineering, software engineering, or a related field.
Proficiency in SQL and at least one programming language (Python preferred).
Understanding of data modeling, ETL/ELT concepts, and cloud-based data warehouses.
Familiarity with streaming platforms (Kafka, Kinesis, or similar).
Comfort working in cloud environments (GCP, AWS, or Azure).
Strong communication skills: able to explain technical concepts to non-technical audiences.
Growth mindset: eager to learn, adapt, and take on new challenges.
Nice-to-Have (Not Required, as Long as You're Willing to Learn)
Experience with infrastructure-as-code (Terraform, CloudFormation).
Exposure to containerization (Docker, Kubernetes).
Knowledge of data governance, security, and compliance best practices.
Additional Information:
Why You'll Love Working Here
Impact: Your work will directly influence how data powers decisions across the company.
Learning culture: We invest in your growth from mentorship to training budgets.
Modern stack: Work with cutting-edge tools and cloud platforms.
Collaboration: Partner with talented engineers, analysts, and product managers.
Flexibility: We care about outcomes, not where you work from.
Our Hiring Philosophy
We know that a great data engineer isn't defined by checking every box. If you're excited about data engineering, have a solid foundation, and are eager to grow, we want to hear from you.
Remote Work:
No
Employment Type:
Full-time
About the Company
As the world's leading local delivery platform, our mission is to deliver an amazing experience: fast, easy, and to your door. We operate in over 70 countries worldwide, powered by tech but driven by people. As one of Europe's largest tech platforms, we enable ambitious talent to del ...