Founded by Michael Lahyani in 2005 as a magazine (Al Bab World), Property Finder today is a single technology platform and brand across multiple countries in the MENA region. We offer the most advanced tools and best-in-class user experience for home seekers, real estate brokers, and developers. Property Finder's most recent valuation secures our status among the Middle East's emerging unicorns, affirming a growth-oriented identity.
Over the years we've expanded our operations to Bahrain, Egypt, Qatar, and Saudi Arabia, and secured a strategic shareholding in Hepsiemlak, the leading property portal in Turkey. With over 600 dedicated people in six regional offices, we facilitate more than 14 million monthly visits across our platforms, solidifying our position as a regional powerhouse in the proptech space.
As the pioneering portal for home seekers in the region, we are on a mission to motivate and inspire people to live the life they deserve.
Position Summary:
We are seeking a Professional Data Engineer, Analytics to join our dynamic Data team, where you will play a crucial role in developing and maintaining robust data solutions.
As a Professional Data Engineer, Analytics, you will drive innovation in analytics and uncover new perspectives and angles for analysis, serving as the critical link between raw data and actionable business insights. Your primary responsibility is to transform raw data into clean, organised, and accessible datasets that are optimised for analysis and decision-making.
This role combines data engineering, software development best practices, and business intelligence expertise to enable and empower data-driven decision-making across the organisation.
The Data team operates with engineering precision, prioritising security, privacy, and regulatory compliance in every initiative. You will contribute to the team's commitment to utilising the latest tools and methodologies, ensuring that our data solutions align with industry best practices.
Our Tech Stack:
- Languages: SQL & Python
- Pipeline orchestration tool: Dagster (legacy: Airflow)
- Data stores: Redshift, Snowflake, ClickHouse
- Platforms & Services: Docker, Kubernetes
- PaaS: AWS (DMS, Kinesis, Glue, Athena, S3, and others)
- ETL: Fivetran, with dbt for transformation
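For a sense of how these pieces fit together, below is a minimal, illustrative sketch (not production code) of a Dagster asset pipeline that pulls a raw extract from S3 and tidies it for loading into the warehouse. The bucket, key, asset, and column names are hypothetical.

```python
# Illustrative sketch only: bucket, key, and column names are hypothetical.
import io

import boto3
import pandas as pd
from dagster import Definitions, asset


@asset
def raw_listings() -> pd.DataFrame:
    """Pull a raw CSV extract from S3 (hypothetical bucket/key)."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="example-data-lake", Key="raw/listings.csv")
    return pd.read_csv(io.BytesIO(obj["Body"].read()))


@asset
def clean_listings(raw_listings: pd.DataFrame) -> pd.DataFrame:
    """Standardise column names, drop duplicates, and parse timestamps."""
    df = raw_listings.rename(columns=str.lower).drop_duplicates()
    df["listed_at"] = pd.to_datetime(df["listed_at"], errors="coerce")
    return df


# Register the assets so Dagster can orchestrate and schedule them.
defs = Definitions(assets=[raw_listings, clean_listings])
```

In practice, downstream modelling on top of such assets would typically live in dbt, with the resulting warehouse tables exposed to BI tools.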
Key Responsibilities:
- Apply Software Engineering Best Practices to Analytics Code: Bring engineering rigour to analytics workflows by applying software engineering principles such as version control, code review, and testing.
- Build Semantic Layers for Analytics: Build and manage the semantic layer of the analytics stack, which serves as the bridge between raw data and end-user reporting tools.
- Define Data Models: Design, implement, and maintain logical data models that represent business concepts, key metrics, and relationships, making data intuitive and easy to query.
- Establish Consistent Metrics: Create and enforce consistent definitions for business metrics and dimensions, reducing ambiguity and ensuring everyone in the organisation works with a single source of truth.
- Build a Self-Service Analytics Environment: Create a self-service analytics culture, enabling business users to explore and analyse data independently without heavy reliance on data teams.
- Automation and Pre-Built Templates: Develop reusable templates, automated reports, and predefined data models to streamline common analytical workflows.
- Data Governance and Security: Implement robust data governance frameworks, access controls, and data security measures to ensure data integrity, privacy, and compliance while enabling self-service capabilities.
- Create User-Friendly and Innovative Dashboards for the Business: Design and develop intuitive, interactive, and visually appealing dashboards tailored to business needs.
- Performance Optimisation: Ensure dashboards are optimised for performance, handling large data volumes efficiently while providing real-time or near-real-time updates.
- User-Centric Design: Work closely with business stakeholders to understand their requirements and craft dashboards that provide actionable insights with a clear and easy-to-navigate interface.
- Develop and maintain ETL pipelines using SQL and/or Python.
- Collaborate with cross-functional teams to understand and deliver data requirements.
- Use data transformation tools like dbt to prepare datasets that enable business users to self-serve.
- Ensure data quality and consistency in all data stores (an illustrative check is sketched after this list).
- Monitor and troubleshoot data pipelines for performance and reliability.
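As a rough illustration of the kind of quality check referenced in this list (an assumption of how it could be done, not a prescribed implementation), a transformation step might validate a dataset before publishing it to the warehouse; the dataset and column names below are hypothetical.

```python
# Illustrative sketch only: dataset and column names are hypothetical.
import pandas as pd


def validate_listings(df: pd.DataFrame) -> pd.DataFrame:
    """Fail fast if the dataset violates basic quality expectations."""
    problems = []
    if df["listing_id"].isna().any():
        problems.append("null listing_id values")
    if df["listing_id"].duplicated().any():
        problems.append("duplicate listing_id values")
    if (df["price"] <= 0).any():
        problems.append("non-positive prices")
    if problems:
        raise ValueError("Data quality check failed: " + "; ".join(problems))
    return df


# Example usage with a small in-memory frame:
frame = pd.DataFrame({"listing_id": [1, 2], "price": [1200.0, 950.0]})
validated = validate_listings(frame)
```

In practice, checks like this are often expressed as dbt tests or Dagster asset checks rather than hand-rolled functions.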
Essential Experience:
- 7 years of experience in data engineering and data modelling.
- Extensive experience with BI tools such as Amazon QuickSight, Tableau, Qlik, Power BI, or Looker for creating dashboards, reports, and data visualisations.
- Proficiency in SQL.
- Proficiency in programming languages such as Python or R for data manipulation, automation, and building data workflows.
- Experience with modern cloud data warehouse and data lake solutions such as Snowflake, BigQuery, Redshift, or Azure Synapse.
- Experience with ETL/ELT, batch, and streaming data processing pipelines.
- Excellent ability to investigate and troubleshoot data issues, providing fixes and proposing both short- and long-term solutions.
- Knowledge of AWS services (such as S3, DMS, Glue, and Athena).
- Familiarity with dbt or other data transformation tools.
Other Desired Experience:
- Experience with Python.
- Experience with orchestration tools such as Dagster, Airflow, or AWS Step Functions.
- Familiarity with CI/CD pipelines and automation.
- Experience with AWS services and concepts (such as EC2, ECS, EKS, VPC, and IAM).
- Familiarity with Terraform and Terragrunt.
- Experience with pub/sub, queuing, and streaming frameworks such as AWS Kinesis, Kafka, SQS, or SNS.