Azure Data Factory: Rerun Pipeline From Failed Activity

  • Published Sep 30, 2024

COMMENTS • 6

  • @pravinsingh9799 • 1 year ago +1

    One of the stages in my pipeline has a web activity, and I have set Secure output and Secure input to true.
    Now if my pipeline fails due to some other activity and I rerun it from the failed activity, the web activity is skipped and the status of that stage comes out as failed with the error message:
    Operation on target failed: Operation on target : Activity '' references the SecureOutput from previous activity ''. Since ADF does not cache secure output, please rerun from activity '' or rerun the whole pipeline run instead.
    Any thoughts on how to handle this?

    • @austinlibal • 1 year ago

      Thanks for the question.
      This one seems to be at odds with the functionality of rerunning a pipeline.
      I tested this in my environment with a Get Metadata activity and a Filter activity that referenced the output of the Get Metadata. I set secureOutput to true on the Get Metadata, attempted the rerun, and it failed with the same error message you mentioned.
      From what I have read, you lose the ability to rerun from a failed activity if downstream activities reference secure output from activities upstream in your pipeline.
      The end result is that you would need to run the whole pipeline again, as the error suggests, because ADF does not store secure outputs for security reasons.
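
      A minimal sketch of the scenario described above, assuming the azure-mgmt-datafactory Python SDK; the activity, variable, and resource names are placeholders, not anything from the video:

      from azure.identity import DefaultAzureCredential
      from azure.mgmt.datafactory import DataFactoryManagementClient
      from azure.mgmt.datafactory.models import (
          ActivityDependency, ActivityPolicy, PipelineResource,
          SetVariableActivity, VariableSpecification, WebActivity,
      )

      client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

      # Web activity with Secure input/output switched on; because the output is
      # never cached, a downstream reference to it blocks "Rerun from failed activity".
      web = WebActivity(
          name="MyWebActivity",
          method="GET",
          url="https://example.com/api/status",
          policy=ActivityPolicy(secure_input=True, secure_output=True),
      )

      # Downstream activity that references the secure output.
      capture = SetVariableActivity(
          name="CaptureStatus",
          variable_name="status",
          value="@string(activity('MyWebActivity').output)",
          depends_on=[ActivityDependency(activity="MyWebActivity",
                                         dependency_conditions=["Succeeded"])],
      )

      pipeline = PipelineResource(
          activities=[web, capture],
          variables={"status": VariableSpecification(type="String")},
      )
      client.pipelines.create_or_update("<resource-group>", "<factory-name>",
                                        "SecureOutputDemo", pipeline)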

  • @rams27787 • 1 year ago +1

    How do I check the rerun pipeline? I can't locate it after rerunning from the failed activity.

    • @austinlibal • 1 year ago +1

      There should be two pipeline runs grouped under the original failed run: the failed one and the rerun that succeeded. There may be a caret or arrow icon in the Monitor hub that you need to click to expand and view it.
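
      A hedged sketch of finding a rerun programmatically, assuming the azure-mgmt-datafactory Python SDK: reruns share the same runGroupId as the original failed run, so grouping query results on run_group_id surfaces both the failed attempt and the successful rerun. Pipeline and resource names are placeholders:

      from collections import defaultdict
      from datetime import datetime, timedelta, timezone

      from azure.identity import DefaultAzureCredential
      from azure.mgmt.datafactory import DataFactoryManagementClient
      from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

      client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

      # Query the last 24 hours of runs for one pipeline.
      now = datetime.now(timezone.utc)
      runs = client.pipeline_runs.query_by_factory(
          "<resource-group>", "<factory-name>",
          RunFilterParameters(
              last_updated_after=now - timedelta(days=1),
              last_updated_before=now,
              filters=[RunQueryFilter(operand="PipelineName",
                                      operator="Equals",
                                      values=["MyPipeline"])],
          ),
      )

      # Group the original run and its reruns under the shared run group id.
      groups = defaultdict(list)
      for run in runs.value:
          groups[run.run_group_id].append(run)

      for group_id, members in groups.items():
          print(f"Run group {group_id}:")
          for r in sorted(members, key=lambda r: r.run_start or now):
              print(f"  {r.run_id}  {r.status}")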

  • @gobindamohan • 2 years ago +2

    How do we automate this?

    • @PragmaticWorks • 2 years ago +1

      That's a great question! There is some powerful conditional logic you can bring into a pipeline so that activities only run upon the success or failure of other activities, using precedence constraints. You can also use the expression builder in pipelines and reference the outputs of activities that are further upstream in a pipeline. Those are two ways you can automate your ETL and run large loads sequentially in one batch.
      For this specific instance, though, there was an error that needed to be corrected before rerunning from the failed activity. Once a pipeline run fails, you need to manually correct that issue before it can be rerun. Therefore, there is not a great way to fully automate this specific process using that feature from the Monitor hub.
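
      For completeness, the Monitor hub's "Rerun from failed activity" button does have a programmatic counterpart: the Create Run API accepts isRecovery and startFromFailure parameters that point at a failed run. A hedged sketch assuming the azure-mgmt-datafactory Python SDK, with placeholder names; as noted above, the underlying failure still has to be fixed before the rerun will succeed:

      from azure.identity import DefaultAzureCredential
      from azure.mgmt.datafactory import DataFactoryManagementClient

      client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

      failed_run_id = "<run-id-of-the-failed-pipeline-run>"
      rerun = client.pipelines.create_run(
          "<resource-group>", "<factory-name>", "MyPipeline",
          reference_pipeline_run_id=failed_run_id,  # tie the new run to the failed one
          is_recovery=True,                         # recovery-mode rerun
          start_from_failure=True,                  # resume at the failed activity
      )
      print("Rerun started:", rerun.run_id)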