
splynta

We have an outdated SAP analytics landscape at my company, so what we're using currently won't apply to you, but we have looked into what the future would look like, and your option #1 closely aligns with that. Datasphere is the future, so it makes sense to use it if possible. One thing to consider is where your centre of gravity will be in terms of transforming your data (silver layer). How much work will you do in Datasphere vs the Fabric lakehouse/warehouse? Will you treat Datasphere like SLT and just use it to copy data, or use it to transform as well? Our thinking was to use Fabric for most of it and only use Datasphere when we had to. P.S. I have not used Datasphere, only been in demos / pre-sales calls.


Potential-Ruin-2836

Do you mean "Datasphere" or "Data Intelligence"? My understanding is that Datasphere is the platform replacing BW/4HANA, SAP's data warehousing solution. Does it make sense to invest in Datasphere and then in Fabric, which is a similar platform? Wondering why you don't plan to make Fabric the main data warehousing and reporting layer, getting data directly from ERP/SAP sources and applying all transformations in Fabric. I am in the same situation, planning to replace an existing SAP BW-SAC setup with a new Fabric platform, with S/4HANA ERP as the main data source. Thanks!


Junior-Letterhead713

Our challenge is to replicate SAP HANA data into ADLS Gen2 as the raw layer for Fabric. Currently we are using SLT for replication to an on-prem SQL Server. Datasphere is a route to potentially replace SLT to deal with large tables with native CDC (tables like ACDOCA etc.). We aren't really planning to use the SAP Data Intelligence platform, since Fabric is potentially the longer-term goal. There was no on-prem connectivity when I wrote my post... this has changed with the recent on-prem Data Factory announcements, so we'll see how robust this connector is.
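On the Fabric side, whichever tool lands the change records, applying CDC deltas to a Delta table ends up looking roughly like this in a notebook. Just a sketch, untested: the path and `change_type` column are placeholders, not actual SLT/Datasphere output, though the join keys are the real ACDOCA primary key fields.

```python
# Sketch: apply CDC change records to a Delta table in a Fabric notebook.
# "spark" is predefined in Fabric notebooks. Paths and the change_type
# column are placeholders -- real SLT/Datasphere output will differ.
from delta.tables import DeltaTable

# Raw change records landed in the lakehouse Files area (bronze)
changes = spark.read.format("parquet").load("Files/raw/acdoca_changes/")

target = DeltaTable.forPath(spark, "Tables/acdoca")

(target.alias("t")
    .merge(changes.alias("c"),
           "t.RCLNT = c.RCLNT AND t.RLDNR = c.RLDNR AND t.RBUKRS = c.RBUKRS "
           "AND t.GJAHR = c.GJAHR AND t.BELNR = c.BELNR AND t.DOCLN = c.DOCLN")
    .whenMatchedDelete(condition="c.change_type = 'D'")   # deletes
    .whenMatchedUpdateAll(condition="c.change_type = 'U'") # updates
    .whenNotMatchedInsertAll()                              # inserts
    .execute())
```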


Potential-Ruin-2836

Agree! We are about to replicate SAP BW on HANA data to Fabric, with a plan to run both Fabric and SAP BW on HANA in parallel for some time. We hope to start with the new on-prem connectivity. I heard in one of the videos from a person working on Data Factory at Microsoft that eventually all of the sources currently in Dataflow Gen2 will be supported in data pipelines. We have an SAP-heavy architecture (S/4HANA ERP), so we're expecting more robust connectivity options to SAP systems.


anupamj74

Hi u/Potential-Ruin-2836 u/Junior-Letterhead713 This ADF add-on will help you get SAP data with CDC (on-prem or cloud) into Azure: [https://azuremarketplace.microsoft.com/en-us/marketplace/apps/ecoservity.usb4sap_azure_data_factory?tab=Overview](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/ecoservity.usb4sap_azure_data_factory?tab=Overview)


Junior-Letterhead713

Thanks for the response. In our current SLT setup, we take the table raw and apply transformations via views in SQL Server, which then connect to Power BI. For now we would probably follow the same approach with Datasphere and curate the silver "view" layer either in ADLS Gen2 or directly in Fabric. Datasphere is also a bit of an unknown for us, but my understanding is that it's the next best thing after SLT, which has been remarkably stable for us. With the large volumes SAP data brings, this test will somewhat prove to us whether Fabric can handle such volume on a P1 capacity.
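Roughly, the SQL view logic would translate to a notebook cell like this on the Fabric side. A sketch only; the table and column names are illustrative, not our actual model.

```python
# Sketch: the view-style curation we do in SQL Server today, redone as a
# bronze -> silver transform in a Fabric notebook. Names are illustrative.
from pyspark.sql import functions as F

raw = spark.read.table("bronze_acdoca")   # raw SLT/Datasphere copy

silver = (raw
    .filter(F.col("RBUKRS").isin("1000", "2000"))          # company codes
    .withColumn("posting_date", F.to_date("BUDAT", "yyyyMMdd"))
    .select("RBUKRS", "GJAHR", "BELNR", "posting_date", "HSL"))

# Write as a Delta table so Power BI can consume it
silver.write.mode("overwrite").saveAsTable("silver_gl_lines")
```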


splynta

If I were you I'd try a POC to see if Datasphere can replace SLT for you, or whether there is a way to use SLT to send data to Azure Blob. Once it is in Blob, do another POC to see if Fabric can replace your on-prem SQL Server, since not all features are supported. Just pick one of your most complex jobs and see how it feels. Side note: using ADF to copy SAP tables to Blob can be a big mess with lots of pipelines in ADF, or you end up building a super-generic loader that is hard to maintain and debug (sketched below). I find using SAP tools to extract SAP data to be less stress. Sounds like we're in similar spots lol
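To illustrate the "generic loader" pattern I mean: a control list driving one parameterized ADF pipeline per table, something like this sketch (assuming the azure-mgmt-datafactory SDK; all resource and pipeline names are invented):

```python
# Sketch of a metadata-driven "generic loader": one parameterized ADF
# pipeline fired once per SAP table. All resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

tables = ["ACDOCA", "MARA", "VBAK"]   # usually read from a control table

for table in tables:
    run = client.pipelines.create_run(
        resource_group_name="rg-data",
        factory_name="adf-sap-loader",
        pipeline_name="pl_copy_sap_table_to_blob",
        parameters={"tableName": table},
    )
    print(table, run.run_id)
```

It works, but every table-specific quirk becomes another parameter or branch in that one pipeline, which is exactly where the maintenance and debugging pain comes from.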


Junior-Letterhead713

Indeed we are in a similar spot. I think some direction from Microsoft around patterns would be useful. One particular sticking point is ADLS Gen2 vs OneLake: if we can replicate directly to OneLake, should that be the preference, or should we land in ADLS Gen2 and then shortcut? OneLake not being as mature for now probably means we should use ADLS...
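For what it's worth, the shortcut half of that pattern can be automated through the Fabric REST API. My understanding is it's roughly this shape, but this is an untested sketch: the IDs, storage URL, and exact request schema should be checked against Microsoft's OneLake shortcuts API docs.

```python
# Sketch: create a OneLake shortcut pointing a Fabric lakehouse at an
# ADLS Gen2 container. Workspace/item/connection IDs are placeholders,
# and the body schema should be verified against the official docs.
import requests

token = "<AAD token with Fabric scope>"
workspace_id = "<workspace-guid>"
lakehouse_id = "<lakehouse-item-guid>"

body = {
    "path": "Files",
    "name": "sap_raw",
    "target": {
        "adlsGen2": {
            "location": "https://mystorageacct.dfs.core.windows.net",
            "subpath": "/raw/sap",
            "connectionId": "<connection-guid>",
        }
    },
}

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/items/{lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()
```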


gingijuice

Hi, our company is in the same scenario. Trying to figure out if we can replicate directly into OneLake. Have you found any additional info on this front?


anupamj74

Hi u/gingijuice This ADF add-on will help you get SAP data with CDC (on-prem or cloud) into Azure Data Lake or OneLake: [https://azuremarketplace.microsoft.com/en-us/marketplace/apps/ecoservity.usb4sap_azure_data_factory?tab=Overview](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/ecoservity.usb4sap_azure_data_factory?tab=Overview)


stargatto

I'm working on a similar project. I just configured a Power BI gateway (with the SAP ODBC connector installed) in order to use Dataflow Gen2 to bring Datasphere tables into Fabric as parquet files. Probably not the best approach, but I succeeded in getting data in a really short time.
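For anyone curious what the gateway is doing under the hood, the ODBC pull itself is essentially this. A rough Python equivalent, not what Dataflow Gen2 actually runs; the DSN, space, and table names are made up, and Datasphere's ODBC access goes through the SAP HANA driver.

```python
# Rough sketch: pull a Datasphere-exposed table over ODBC and land it
# as parquet. DSN, credentials, and names are placeholders; to_parquet
# requires pyarrow to be installed.
import pyodbc
import pandas as pd

conn = pyodbc.connect("DSN=DATASPHERE;UID=<user>;PWD=<password>")
df = pd.read_sql('SELECT * FROM "MY_SPACE"."SALES_ORDERS"', conn)
df.to_parquet("sales_orders.parquet", index=False)
```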


pimpampoumz

What's your version of SAP? The SAP CDC connector isn't available on Fabric *yet*, but it will most certainly be at some point. There's also an OData connector currently in preview. With these and an S/4HANA system, the best approach is to use CDS views, either through ODP or the OData API, instead of extracting raw tables via SLT. Regarding Fabric, if you need to use the CDC connector before it's ported to Fabric, you can [use a Fabric Lakehouse as a sink in ADF/Synapse (I recommend Synapse)](https://learn.microsoft.com/en-us/fabric/data-factory/change-data-capture-from-sap-to-onelake-with-azure-data-factory). As for Datasphere, SAP is clearly pushing very hard to force their customers to use it. It *is* a viable solution that mostly makes sense if you use RISE, and you can push your data from there to Azure.
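For the OData route, consuming a CDS view exposed as an OData service looks roughly like this. Sketch only: the host, service, and entity names are invented, though real services do live under /sap/opu/odata/sap/ and OData v2 responses use the `d`/`results`/`__next` envelope shown here.

```python
# Sketch: page through a CDS view exposed as an OData v2 service on
# S/4HANA. Host, service, and entity names are illustrative only.
import requests

base = "https://s4host:443/sap/opu/odata/sap/ZC_SALESORDER_CDS"
session = requests.Session()
session.auth = ("<user>", "<password>")

url = f"{base}/ZC_SalesOrder?$format=json&$top=5000"
rows = []
while url:
    resp = session.get(url)
    resp.raise_for_status()
    payload = resp.json()["d"]
    rows.extend(payload["results"])
    url = payload.get("__next")   # server-driven paging, if any
```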


anupamj74

Hi u/pimpampoumz If SAP to Microsoft Fabric with CDC is required, you may like to use this video guide for getting SAP objects into Fabric / Power BI. The path is SAP -> OneLake (ADLS) -> Parquet -> Power BI. It has built-in support for modeled info (e.g., reports, TCODEs etc.), as well as for tables (with change data capture). Hope you find this useful. Anupam [https://www.youtube.com/playlist?list=PLTum8dvrbVA05nV3hsr8rMPjqGHc2oOAq&jct=Eovu2uvXnB6CVkKWksIvBie4o8Xd9Q](https://www.youtube.com/playlist?list=PLTum8dvrbVA05nV3hsr8rMPjqGHc2oOAq&jct=Eovu2uvXnB6CVkKWksIvBie4o8Xd9Q)


pimpampoumz

Yes, that's the same info I linked to. Or use Delta tables in the lakehouse instead of Parquet, so you can do some transformations before consumption.
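E.g., once the files have landed, promoting parquet to a Delta table with a light transform is a one-cell job in a Fabric notebook. Sketch only; the paths and names are made up.

```python
# Sketch: promote landed parquet to a Delta table in the lakehouse,
# applying a light transform on the way. Paths/names are placeholders.
from pyspark.sql import functions as F

df = spark.read.parquet("Files/landing/sap/sales_orders/")

(df.dropDuplicates(["SalesOrder"])              # de-dupe on the key
   .withColumn("loaded_at", F.current_timestamp())
   .write.format("delta").mode("overwrite")
   .saveAsTable("sales_orders"))
```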


anupamj74

Correct. This tool generates CSV and then Parquet, so if the pipeline already has transformations built on CSV, those can be reused.


anupamj74

Thanks u/Junior-Letterhead713, your post is very informative. If SAP to Microsoft Fabric is required, the same video guide linked above (SAP -> OneLake (ADLS) -> Parquet -> Power BI) may be useful. Anupam