Databricks with SAP BO
Job Summary
Key Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines using Databricks (PySpark, Delta Lake, SQL Warehouse)
- Transform and curate data into bronze, silver, and gold layers following medallion architecture best practices
- Publish and expose gold-layer datasets through Databricks SQL Warehouse for consumption by SAP BO
- Collaborate with BO developers to ensure semantic layer alignment
- Conduct data validation and reconciliation between Databricks outputs and BO report datasets
- Optimize data models, queries, and partitions for performance, cost, and scalability

Required Skills and Experience
- 5 years of experience with Azure Databricks, PySpark, Delta Lake, and SQL Warehouse
- Proficiency in SQL and data modelling (star and snowflake schemas)
- Familiarity with SAP BusinessObjects universes and report structures; able to validate and support BO data consumption
- Experience working in banking or financial data environments preferred
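The bronze/silver/gold flow named in the responsibilities can be sketched conceptually as follows. This is a minimal plain-Python illustration of the medallion layering idea (raw ingest, then cleaning, then aggregation), not actual Databricks or PySpark code; all record shapes, field names, and values are hypothetical.

```python
# Conceptual medallion layering: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# Field names ("account", "amount") are hypothetical illustrations.

def to_silver(bronze_rows):
    """Clean raw rows: drop records missing a key, cast amounts to float."""
    return [
        {"account": r["account"], "amount": float(r["amount"])}
        for r in bronze_rows
        if r.get("account") and r.get("amount") is not None
    ]

def to_gold(silver_rows):
    """Aggregate cleaned rows into per-account totals for reporting."""
    totals = {}
    for r in silver_rows:
        totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]
    return totals

bronze = [
    {"account": "A1", "amount": "100.5"},
    {"account": "A1", "amount": "50"},
    {"account": None, "amount": "10"},   # dropped in silver: missing key
    {"account": "B2", "amount": "7.25"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'A1': 150.5, 'B2': 7.25}
```

In an actual Databricks pipeline, each layer would typically be a Delta table, with the gold table exposed through SQL Warehouse to SAP BO.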
Required Skills
- Academics
- Facilities Management
- Content Management System
- Life Sciences
- LINQ
- Hospital
About the Company
At Virtusa, we are builders, makers, and doers. Digital engineering is in our DNA. It’s at the heart of everything we do.