- Lead planning, execution, and delivery of enterprise data and MI automation projects using Databricks and Confluent.
- Develop detailed requirements, project plans, delivery roadmaps, and work breakdown structures.
- Ensure resource allocation, budgeting, and adherence to timelines and quality standards.
- Manage vendors, deliverables, and quality of output.
- Manage issues and conflicts, and prepare mitigation plans.
- Collaborate with data engineers, architects, business analysts, and platform teams to align on project goals.
- Act as the primary liaison between business units, technology teams, and vendors.
- Facilitate regular status updates, steering committee meetings, and issue/risk escalations.
- Oversee solution delivery on Databricks (for data processing, ML pipelines, and analytics).
- Manage real-time data streaming pipelines via Confluent Kafka.
- Ensure alignment with data governance, security, and regulatory frameworks (e.g. GDPR, CBUAE, BCBS 239).
- Ensure all regulatory reporting data flows comply with local and international financial standards.
- Manage controls and audit requirements in collaboration with the Compliance and Risk teams.
- 7 years of experience in project management within the banking or financial services sector.
- Experience leading data and MI automation projects (especially on Databricks and Confluent Kafka).
- Strong understanding of data architecture, data pipelines, and streaming technologies.
- Experience managing cross-functional teams (onshore/offshore).
- Strong command of Agile/Scrum and Waterfall methodologies.
- PMP, PRINCE2, or Scrum Master certification.
- Familiarity with regulatory frameworks: BCBS 239, GDPR, and CBUAE regulations.
- Strong understanding of data governance principles (e.g. DAMA-DMBOK).
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- On-time, on-budget delivery of data initiatives
- Uptime and SLAs of data pipelines
- User satisfaction and stakeholder feedback
- Compliance with regulatory milestones