This is brilliant for me, thanks
Great to hear!
Amazing, thanks.
You’re welcome.
Very cool, thanks Wyn
You’re welcome
Consider this scenario: I have a PBI report that refreshes every day, and I want to extract that data and append it. I don't want it to overwrite the table but always append, so that I have a historical record. How can I do this, please?
The CSV files could become your data source, and then you could use the “From SharePoint folder” connector to consolidate them?
ua-cam.com/video/-XE7HEZbQiY/v-deo.htmlsi=YUWL7sGbRfEclx0H
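As a rough sketch of that consolidation step, here is the same append pattern in Python/pandas rather than the Power Query connector itself, assuming the daily CSV exports all land in one folder (the folder path, file pattern and column name below are hypothetical):

```python
from pathlib import Path
import pandas as pd

# Hypothetical folder where the daily CSV snapshots are saved
# (e.g. a synced SharePoint document library).
SNAPSHOT_FOLDER = Path("daily_exports")

# Read every daily snapshot and stack them into one historical table.
frames = []
for csv_file in sorted(SNAPSHOT_FOLDER.glob("*.csv")):
    df = pd.read_csv(csv_file)
    df["snapshot_file"] = csv_file.name  # record which daily export each row came from
    frames.append(df)

history = pd.concat(frames, ignore_index=True)
history.to_csv("combined_history.csv", index=False)
```

Because each day's export is kept as its own file, nothing is ever overwritten; the combined table simply grows as new snapshots arrive.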
I found a bug (perhaps); does anyone know a solution? Issue: although the query defines the column order, I have found that if the first row (record) has null cells (items), that column is moved to the end of the columns in the output CSV.
Not a scenario I’ve come across yet
Worth mentioning how to loop through the data model, as there's a limit on how much data you can call in one go.
Any tips?
@AccessAnalytic Usually you want to index your data in the model and set a bunch of variables, one of which will be the batch size, then an initial value and optionally a count for the loops. At the end you want to append everything, and each CSV will have its own headers, which you don't want. The loop number helps with that, as you can skip the first row of the CSV for every loop beyond the first one.
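A minimal sketch of that batching idea (the batch size, file prefix and helper name below are assumptions for illustration, not anything from the video): pull the indexed data in fixed-size chunks, write each chunk to its own CSV, and keep the header only in the first chunk so the pieces can simply be appended.

```python
import csv

def export_in_batches(rows, batch_size=10000, prefix="export"):
    """Write `rows` (a list of dicts already sorted by an index column)
    to numbered CSV files, keeping the header only in the first file."""
    if not rows:
        return
    header = list(rows[0].keys())
    for loop_number, start in enumerate(range(0, len(rows), batch_size)):
        batch = rows[start:start + batch_size]
        with open(f"{prefix}_{loop_number}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            if loop_number == 0:          # only the first batch keeps headers,
                writer.writerow(header)   # so later files append without duplicate header rows
            for row in batch:
                writer.writerow([row[col] for col in header])
```

Concatenating export_0.csv, export_1.csv, … in order then reproduces one table with a single header row, which is the same trick described above.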
Cheers, so is this a workaround for a paginated report when you want large data exports and don't have Premium capacity?
@AccessAnalytic No, there's a limit when you query a dataset in Power Automate regardless of Premium. In your case you called a single value, so there's no risk there, but if you want to capture a snapshot this is the only way.
Sorry, what I’m saying is a paginated report would be a simpler solution, but that requires Premium capacity.
Any idea on what the data limit is?