Working with Large Data in Power Automate through Pagination

  • Published Nov 4, 2024

COMMENTS • 68

  • @gregoryoliveira8358
    @gregoryoliveira8358 1 year ago +1

    Thanks for sharing this with us. I was thinking that the limit was the number of rows per page, but now I completely understand this feature. Thanks again!

  • @kimsalas8197
    @kimsalas8197 1 year ago +5

    At timestamp 10:05 you mention you will show how you are parsing out to get the skip token value, but you don’t show it. For the action ‘update skip token variable’, what expression are you using for the value field? I’ve tried a few things, but I don’t get correct results.
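
    A sketch of the expression asked about here, assuming the Dataverse "List rows" action is named List_rows (all names are placeholders, not necessarily what the video uses). The skip token can be parsed out of the @odata.nextLink property of the response:

        if(empty(outputs('List_rows')?['body/@odata.nextLink']),
           '',
           decodeUriComponent(last(split(outputs('List_rows')?['body/@odata.nextLink'], 'skiptoken='))))

    The split on 'skiptoken=' keeps everything after that marker in the next-page URL; whether decodeUriComponent is needed depends on how the connector returns the token, so test with and without it.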

  • @thnkngwlnts
    @thnkngwlnts 1 year ago

    This is exactly the solution I need. I looked all over YouTube and couldn't find the right video that highlights all the components you included in this one, so thanks for doing that. My only thing is that I am trying to get the data from SharePoint (over 100k rows), not Dataverse, and then pipe the data to Cosmos DB. I get that the concept would be the same, just a different data source. Thanks much.

  • @macward7495
    @macward7495 3 years ago +1

    Thank you Steve!! Pagination and Concurrency are saving me so much time :) Wonderful video.

  • @jt9228
    @jt9228 7 months ago

    Just stumbled across this and subscribed to your channel. Great video on the topic; I have been searching for how to handle large datasets in Automate, and this will do it for me! Thanks!!!

  • @Alex-sq7ck
    @Alex-sq7ck 2 years ago +1

    Best video I've come across.

  • @linamariasilva2141
    @linamariasilva2141 3 months ago

    Thanks so much, this information was really useful. thx

  • @SMILE4EVER_Pa1
    @SMILE4EVER_Pa1 1 month ago

    Hi Steve,
    I have Power Apps for Dynamics 365 and Power Apps guides in my license list.
    Does it work with this license?

  • @user-is7xl1bu4t
    @user-is7xl1bu4t 4 months ago

    Hello Steve, I am in the process of attempting to use Excel to create/update records in a SharePoint list. I currently have my flow functioning the way I want. However, I am stuck at that 5,000 limit. My data will be upwards of 200 times that. I know this will move slowly, but it will work for what I need.
    How do I implement one of these methods into it? I understand that I will have to turn my flow into a solution; I'm just not sure what that will look like with my needs.
    Currently I am using a Scheduled flow -> Get files (properties only) -> Apply to each -> Update file properties -> List rows present in a table -> Apply to each -> Get items -> Conditions -> Delay -> Update file properties.
    The video I watched to do this is:
    Reza Dorrani - Add & Update Excel Data to SharePoint List using Power Automate | Excel Import using flow

  • @RAHULPATEL-oz3vg
    @RAHULPATEL-oz3vg 2 months ago

    Awesome tutorial.

  • @johnfromireland7551
    @johnfromireland7551 3 years ago

    Very clear video. More please!

  • @seang2012
    @seang2012 2 years ago +2

    Would you create a similar video showing how to use Pagination with Flow pulling from an API/Web source?

  • @cmichael7807
    @cmichael7807 1 year ago

    Concurrency actually defaults to 20, not 1, even with concurrency control off. To run truly sequentially (one at a time), you need to turn concurrency control ON and slide the slider down to 1.
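
    For reference, this setting lives on the Apply to each action itself; in the flow's code view it appears roughly as below (action names are placeholders), where repetitions is the degree of parallelism and 1 means one iteration at a time:

        "Apply_to_each": {
          "type": "Foreach",
          "foreach": "@outputs('List_rows')?['body/value']",
          "runtimeConfiguration": {
            "concurrency": { "repetitions": 1 }
          }
        }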

  • @vaibhavsingh594
    @vaibhavsingh594 2 years ago +2

    Hi Steve,
    Very well explained.
    One question:
    After the do until loop, with what value did you update the skip token variable?
    Thanks in advance!

    • @SteveWinward
      @SteveWinward  2 years ago +1

      Sorry I'm not sure I fully understand your question. I parse the OData response to get the next page URL. If that value is not empty, meaning there is a URL present, that means you can request the next "batch" of data. Hopefully that helps answer your question.

    • @vaibhavsingh594
      @vaibhavsingh594 2 years ago

      @@SteveWinward Thanks Steve👍
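
    To fill in the loop shape being discussed in this thread (a sketch only; action and variable names are assumptions): because Do until evaluates its condition after each pass, the skip token variable can start out empty and still let the first page run:

        Initialize variable   SkipToken (String) = ''
        Do until              @empty(variables('SkipToken'))
            List rows         (Skip token = @variables('SkipToken'))
            ...process outputs('List_rows')?['body/value']...
            Set variable      SkipToken = the skip-token expression sketched near the top of the comments

    When the response has no @odata.nextLink, SkipToken is set to an empty string and the loop exits.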

  • @iainpritchard803
    @iainpritchard803 3 years ago +2

    Brilliant... just what I need! Thank you... 😊
    ...I can't find the link to the documentation / example. Has that gone? Could you help me get to it somehow? Desperate to use it... 😊

    • @SteveWinward
      @SteveWinward  3 years ago +2

      Here's the link on GitHub,
      github.com/SteveWinward/PowerApps/blob/master/WriteUps/PowerAutomatePagination.md

  • @sunny48285
    @sunny48285 1 month ago

    You should have shown how you are parsing the skip token as well; that would be valuable.

  • @hafidazer1634
    @hafidazer1634 3 years ago +1

    Great work! Thanks

  • @robertmerritt5245
    @robertmerritt5245 3 years ago +1

    Hi! This is an awesome video! Thanks so much! Does this only work with DataVerse data, or can I use it with SharePoint data?

    • @SteveWinward
      @SteveWinward  3 years ago

      It will work against any API that uses OData. SharePoint APIs support OData, so it should work there too.

    • @robertmerritt5245
      @robertmerritt5245 3 years ago +1

      @@SteveWinward Awesome! Thanks again!
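
    If you try this against SharePoint with the "Send an HTTP request to SharePoint" action, the next-page link comes back in the response body. A sketch, with the list and action names as placeholders:

        Uri:      _api/web/lists/getbytitle('MyList')/items?$top=5000
        Headers:  Accept = application/json;odata=nometadata

        Next-page URL (absent, i.e. null, on the last page):
        @body('Send_an_HTTP_request_to_SharePoint')?['odata.nextLink']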

  • @aditya10039
    @aditya10039 2 years ago +2

    Hey Steve! I want to export data from Power BI to Excel using Power Automate, but it only updates 1,000 rows per run. Can this method also be applied when the source file is from Power BI?

    • @SteveWinward
      @SteveWinward  2 years ago

      Great question. If the Power BI API lets you do paging, then yes, you can use this same technique. I'm not aware off the top of my head whether the Power BI export API supports this, but the pattern in Power Automate does.

  • @Brunos2Costa
    @Brunos2Costa 1 year ago

    Can I use pagination to get pages of 100 records each? And if so, how do I configure that?

  • @madihasalman1
    @madihasalman1 1 year ago

    Hi,
    Good video. But how can we do this if we are using the SQL connector to extract more than 5,000 records?
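
    One hedged option for the SQL connector: the "Get rows (V2)" action exposes Top Count and Skip Count parameters, so you can page manually in a Do until. A sketch with placeholder names that exits once a page comes back smaller than the page size:

        Initialize variable   Skip (Integer) = 0
        Do until              @less(length(outputs('Get_rows_V2')?['body/value']), 2000)
            Get rows (V2)     Top Count = 2000, Skip Count = @variables('Skip')
            ...process the page...
            Increment variable   Skip  by 2000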

  • @yppatel4223
    @yppatel4223 3 years ago +1

    Instead of Dataverse, will it work for a SharePoint list? Also, will it work for the seeded version of Power Automate?

    • @SteveWinward
      @SteveWinward  3 years ago

      I believe this same pattern will work with SharePoint. I haven't tested it myself, but my guess is it would. Also, if you are doing this against SharePoint only, you can use the seeded version of Power Automate.

  • @bairikrishna0808
    @bairikrishna0808 3 years ago +1

    Thank you Steve. Will this pagination apply to Excel as a data source?

    • @SteveWinward
      @SteveWinward  3 years ago +1

      I believe it should work with the Excel connector. I have not tried that one myself, but I imagine the API has the same pagination capabilities.

    • @SteveWinward
      @SteveWinward  3 years ago

      @Ahmad Syakir Shahruddin For the Excel for Business connector. On the action, click on the 3 dots > Settings. Enable pagination and set the threshold to higher than 5,000. This will automatically do the paging for you and return a larger dataset size.
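
    In the action's code view, the setting described here shows up roughly as follows (a sketch, not an exact copy from the video); minimumItemCount is the threshold to raise above 5,000:

        "runtimeConfiguration": {
          "paginationPolicy": {
            "minimumItemCount": 100000
          }
        }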

  • @ThisIsFrederic
    @ThisIsFrederic 1 year ago

    Hi Steve,
    Thanks for your video!
    While your flow works on my data, there's still something I'm not totally clear about: when discussing the "Do until" loop at 10m02s, you said "as long as the next URL property is not null", then we loop. That makes sense for a "Do while" loop, but not in this case, since it's a "Do until". At least, I would run the loop UNTIL the next link variable becomes null, meaning the NextLink property no longer exists. Why does it work in that case?

    • @coolbeing163
      @coolbeing163 1 year ago

      Probably, if the NextLink property is not found, it evaluates to null.
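
    To close this out: Do until always runs its body at least once, then exits as soon as its condition turns true. So a condition like the sketch below (action name is a placeholder) keeps looping while a next-page link exists and stops once it is gone; the ? operator returns null for a missing property, which empty() treats like an empty value:

        @empty(outputs('List_rows')?['body/@odata.nextLink'])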

  • @ksowjanya4488
    @ksowjanya4488 2 years ago

    I want to import more than 1 lakh (100,000) records into Dataverse from a CSV file in SharePoint using a dataflow instead of Power Automate. Can you guide me on possible ways, please?

  • @shashankk1682
    @shashankk1682 1 year ago

    Hi Steve. Great stuff. I wanted to know how this would work with SharePoint.
    I have a flow that creates/updates SharePoint items, sourcing from an Excel file on SharePoint. The data will of course go beyond 5,000 rows in Excel, so at some point I have to worry about paging. Could you please create a video or help so that creating/updating SharePoint list items for newly added/existing rows (no matter the number of rows to be listed in Automate) is possible? P.S. I'm an O365 user.

  • @yilu3219
    @yilu3219 2 years ago

    Hi, but anything more than 5,000 is for premium users, right? I have the basic plan, where the pagination limit is 5,000.

  • @dynamics365series2
    @dynamics365series2 1 year ago

    Can you provide the expression used in the Initialize variable value?

  • @srik9814
    @srik9814 3 years ago

    Nice information

  • @tarunverma6352
    @tarunverma6352 3 years ago +1

    How do we import data from SharePoint or OneDrive into Dataverse? Or can we directly upload an Excel file from a canvas app, read the data, and insert it into Dataverse as a table? Please make a video on this.

    • @SteveWinward
      @SteveWinward  3 years ago

      Dataflows are how you can do what you want to do here.
      docs.microsoft.com/en-us/power-query/dataflows/overview-dataflows-across-power-platform-dynamics-365

  • @ShilabhadraSahu
    @ShilabhadraSahu 6 months ago

    You should elaborate on the skip token more.

  • @klemetsrudj
    @klemetsrudj 2 years ago

    I'm outputting 25 fields from an entity, using FetchXML, parsing the paging cookie, and doing it very similarly... however, I'm trying to write these 25 fields over 10k rows (limit of 5k per run). It's been running for 24 minutes now already (ugh, so long). It seems like there should be a much, much faster way to get 10k rows into Excel, i.e., one should not even have to Apply to each; instead, just get the rows and add them all to a table in one shot. Let me know if you have any ideas on how to do that. I guess I could maybe append to a CSV table over and over, and then write all the contents in one push?

    • @SteveWinward
      @SteveWinward  2 years ago

      Another idea is to use Azure Synapse and set up a data pipeline to convert Dataverse records to a CSV file.

    • @klemetsrudj
      @klemetsrudj 2 years ago +3

      @@SteveWinward Actually, just using an array variable to loop over and append to, then putting that into a Create CSV table action, then a Create file for SharePoint with its output as input, took about 9 minutes. Much more acceptable for now.

    • @SteveWinward
      @SteveWinward  2 years ago +1

      @@klemetsrudj excellent. Thanks for sharing your solution.
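
    A sketch of the pattern described in this thread, with made-up action and variable names: append each record to an array inside the paging loop, then write the file once at the end instead of row by row:

        Initialize variable   Rows (Array) = []
        (inside the paging loop)
            Apply to each     record in the current page
                Append to array variable   Rows = @items('Apply_to_each')
        (after the loop)
        Create CSV table      From = @variables('Rows')
        Create file           File Content = @body('Create_CSV_table')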

  • @babajidebantale
    @babajidebantale 3 years ago

    Hi, thanks for this content; however, I experienced some issues.
    1. Getting all data from the contact entity gave this error: "HTTP request failed as there is an error: 'Cannot write more bytes to the buffer than the configured maximum buffer size: 104857600.'"
    2. I retrieved some fields in the select columns and the error in No. 1 above was resolved, but "NextLink" was empty.
    3. I also used FetchXML (which has a limit of 5k rows) and NextLink was still empty.
    Any help is appreciated. I have a 12k contact DB to work on.

    • @SteveWinward
      @SteveWinward  3 years ago

      If you are getting the maximum buffer size error, you probably need to create a flow like I do here (youtube.com/watch?v=5NtzcfmSGes), which is the custom pagination example. If you go to the settings of the action and enable pagination, that step still has a max limit that you are running into. If you loop over the records 5,000 at a time, I don't think you will hit the max buffer size error. I did not test my example with a fetchXML query specified.

  • @mohamedaffi3816
    @mohamedaffi3816 2 years ago

    What about SharePoint lists?

  • @waltertiggeloven
    @waltertiggeloven 3 years ago

    No matter what I try, the import keeps failing with a message saying "something went wrong, try again later". I also tried removing the "Workflows" folder and creating a new ZIP with the 3 JSONs and without that subfolder, but nothing works. Importing the ZIP is impossible. 🤷‍♂️ Any idea?

    • @SteveWinward
      @SteveWinward  3 years ago +1

      Are you following these steps to import a solution file?
      docs.microsoft.com/en-us/powerapps/maker/data-platform/import-update-export-solutions

    • @SteveWinward
      @SteveWinward  3 years ago +1

      I added some more detailed steps to get the sample installed
      github.com/SteveWinward/PowerApps/blob/master/WriteUps/PowerAutomatePagination.md

    • @waltertiggeloven
      @waltertiggeloven 3 years ago

      @@SteveWinward Well, many thanks for your quick response and additional info. I did not follow the correct procedure. However, even when I do, the import is not possible.
      Anyway, don't bother. Your video was helpful enough to get me on the right track. I had been struggling (for almost a week) to get pagination solved for MS Planner (through MS Graph). There were several obstacles to overcome, and partly because of the limited information in the event of an error, it was very difficult to discover what and where exactly went wrong.
      After getting the permissions right (Admin Consent on the Microsoft Graph ApiConnectors), pagination (handling NextLink) remained. I now have a working model. Many thanks for your video and support. You made one of the most complete and intelligible explanations.

    • @SteveWinward
      @SteveWinward  3 years ago

      @@waltertiggeloven I'm glad you found this video helpful!

  • @brajgopal108
    @brajgopal108 2 years ago

    What is the length expression?

  • @travissmithoh
    @travissmithoh 2 years ago +1

    Will this work for over 100,000 records?

    • @SteveWinward
      @SteveWinward  2 years ago +1

      Hey Travis. I haven't tested this with 100,000 records. My intent was to show how to use paging if you have datasets just over the 5,000 or 10,000 mark that would get truncated by default. If you have on the order of 100,000 items you need to regularly work with, you may want to find another approach outside of Power Automate. Dataflows might be a better option, or even something in Azure to process the records.

    • @travissmithoh
      @travissmithoh 2 years ago

      @@SteveWinward Thanks! I will look into Power BI Dataflows.

  • @richardma2655
    @richardma2655 2 years ago

    How do I deal with 15,000 records? Thanks.

    • @PodOConaill
      @PodOConaill 7 months ago

      Did you find a solution to this?
      I have to do this for 2.5 million records.

  • @9028129188
    @9028129188 4 months ago

    Hi Steve, thanks for sharing the useful information. I have an "Export to file for paginated reports" action inside an Apply to each; it passes a few parameter values from Dataverse and then calls a Send email action, and it is taking quite a long time. Kindly suggest how to optimize the flow's performance.