SAP Data Intelligence - Extracting data from an ECC or S/4HANA system to Google BigQuery with SLT

  • Published Sep 10, 2024
  • In this short video, I demonstrate how you can extract data from an ECC system or an S/4HANA system into Google BigQuery with delta load, using SAP Data Intelligence and SAP SLT. You can perform an initial load, delta loads, or replication (initial load plus delta loads).

COMMENTS • 4

  • @othmaneallaoui1260
    @othmaneallaoui1260 10 months ago

    Great, Raphael. I started to use RMS, but we have a lot of limitations!

    • @raphaelwalter7412
      @raphaelwalter7412 10 months ago

      Hello, thank you very much for your kind message. What are the limitations that you have encountered? Let me know and I'll see if I can help. ;) Also, just to let you know, SAP Datasphere may be a good option to look at once the Google BigQuery connector for Replication Flow becomes available. It should be released by the end of the year.

    • @othmaneallaoui1260
      @othmaneallaoui1260 10 months ago

      ​@@raphaelwalter7412
      Hello, thank you for your reply.
      We are using RMS with SLT to send data from an ECC 6 system to GCS (CSV files), and we have run into feature limitations such as:
      - We can't stop a single task in an RMS flow: when we undeploy an RMS flow and change a single task, the redeployment restarts the initial load and delta for all tasks (workaround: duplicate RMS flows ...).
      - RMS ignores the LTRS settings and does not offer the possibility of producing larger files (small files of about 1 MB are generated on the GCS side).
      - You cannot make joins (filters) between tables in replications, for example to send only the invoice line items that are in the (already filtered) headers table. So we are obliged to send the entire table, which makes the initial load take a long time.
      - Sometimes a single task errors out, so we fix the source system and unsuspend the task, but RMS does not take our modification into account; we have to restart the entire RMS flow.
      Currently we don't have Datasphere in our SAP landscape, but it is part of our roadmap to gradually move the BW objects to Datasphere in order to source our SAC tenant.
      Thanks and have a good evening.
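(Editor's note: the semi-join filter the commenter describes — sending only the invoice line items whose document number appears in an already-filtered header table — can be sketched in plain Python. This is only an illustration of the intended filter semantics, not RMS functionality; the table and column names are hypothetical.)

```python
# Sketch of the semi-join filter the commenter wants: keep only the item
# rows whose document key appears in the (already filtered) header table.
# Column name "doc_number" is a hypothetical example.

def filter_items_by_headers(header_rows, item_rows, key="doc_number"):
    """Return only the item rows whose key value exists in the header rows."""
    header_keys = {row[key] for row in header_rows}
    return [row for row in item_rows if row[key] in header_keys]

headers = [{"doc_number": "90001"}, {"doc_number": "90002"}]
items = [
    {"doc_number": "90001", "line": 1},
    {"doc_number": "90003", "line": 1},  # its header was filtered out upstream
]
print(filter_items_by_headers(headers, items))
# [{'doc_number': '90001', 'line': 1}]
```

Without such a filter in the replication tool itself, the full item table has to be replicated and filtered on the target side, which is exactly the long initial load the commenter describes.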

  • @sunilupadhyay1979
    @sunilupadhyay1979 9 months ago

    Thanks for sharing the details, Raphael.
    We are trying to replicate ECC data into Snowflake using a custom Python operator (to avoid staging the data load in S3).
    We are facing the following issues:
    1) For the initial load, the graph does not terminate automatically (we have used a Graph Terminator).
    2) After manually stopping the graph, if we try to start the delta load, it fails with the error "massid is already in use".
    3) To work around the "massid is already in use" scenario, we tried the delta load with a new massid, but then the system does not read any change records and also does not terminate automatically.
    Can you please suggest what might be causing these issues and how to resolve them?
    Thanks