Faster SharePoint folder consolidation using Incremental Refresh (see warning in the notes)

  • Published 29 Nov 2024

COMMENTS • 68

  • @AccessAnalytic
    @AccessAnalytic  6 months ago

    ⚡⚡ Enabling incremental refresh means you will not be able to download the file from the service, so keep hold of that PBI desktop copy ⚡⚡
    Always take a look at the description for these sorts of updates

  • @zzota
    @zzota 6 months ago +8

    This is just what I need. I thought Incremental Refresh was only applicable to SQL datasets, so this is brilliant. Thanks Wyn 🙂

    • @AccessAnalytic
      @AccessAnalytic  6 months ago +2

      Same here… wish I’d known sooner 😀

  • @vladog1834
    @vladog1834 17 days ago +1

    Great - exactly what I was looking for, for months

    • @AccessAnalytic
      @AccessAnalytic  16 days ago

      Fantastic. Make sure you read my warning in the description

  • @mkcorrea
    @mkcorrea 6 months ago +3

    Pretty nice!
    For really big data we can also segment by generating files with that segmentation and naming them so they can be filtered when consolidating from the SharePoint folder (see the M sketch after this comment).
    That will solve a need I have at my company. Thank you so much!👏🏻👏🏻👏🏻
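
    Note: a minimal Power Query (M) sketch of that file-name segmentation idea (not from the video - the site URL, folder path and "_North_" naming pattern are made-up examples):

        let
            // connect to the SharePoint site (hypothetical URL)
            Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Finance", [ApiVersion = 15]),
            // keep only files sitting in the target folder (hypothetical path)
            FolderFiles = Table.SelectRows(Source, each [Folder Path] = "https://contoso.sharepoint.com/sites/Finance/Shared Documents/Sales/"),
            // keep only the wanted segment, based on the naming convention, e.g. "Sales_North_2024-11.xlsx"
            SegmentOnly = Table.SelectRows(FolderFiles, each Text.Contains([Name], "_North_")),
            // and only Excel workbooks
            ExcelOnly = Table.SelectRows(SegmentOnly, each Text.EndsWith([Name], ".xlsx"))
        in
            ExcelOnly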

  • @data-made-simple
    @data-made-simple 6 months ago +3

    Great timing - I'm up to a 30-minute refresh time on one setup and have been putting off looking into incremental refresh. Explained clearly and simply as usual.
    Thanks for your videos - they've really helped with using SharePoint folders as a quick no-code approach over the alternatives.

    • @data-made-simple
      @data-made-simple 6 months ago +1

      What about refreshing back-dated files - e.g. an updated sales file from 2022? I assume there's a choice between incremental refresh and a full refresh?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      You're welcome - glad it helps

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      @@data-made-simple - good question. Simplest approach is to do a refresh in the desktop file and re-publish.

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      If you have Premium you can get fancy with XMLA endpoints and just refresh particular partitions (apparently 😀)

    • @data-made-simple
      @data-made-simple 6 months ago +1

      Ah ok got it. Thanks!

  • @SergiyVakshul
    @SergiyVakshul 6 months ago +2

    Wyn, I'd like to add that if a file name points to a future date, that file won't be processed at refresh time. Another important point is what counts as the current date: the time zone specified for the scheduled refresh can shift your date by a day, which may affect the result (the docs cover this aspect). See the sketch after this comment.

  • @jawadahmadehssan6251
    @jawadahmadehssan6251 6 months ago +5

    I went from 6 minutes to 8 seconds. Thank you. Can we set up something similar for dataflows?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago +1

      Yes, but it needs PPU or P1 (or F64, I think). 10:56 shows a brief pop-up of where to click to enable it for a dataflow.

  • @JorgePerez-bu4ph
    @JorgePerez-bu4ph 6 months ago +3

    Great video as always. I got confused about something: The "RangeEnd" parameter... Am I supposed to change it manually for the next update or will it automatically update to the last date found in my data files? How does it work? I want to understand if this RangeEnd parameter is something I have to maintain, or if once the incremental refresh is set I can forget about it? Thanks!

    • @AccessAnalytic
      @AccessAnalytic  6 months ago +2

      Set and forget. It gets overridden by the date of refresh each time.

    • @JorgePerez-bu4ph
      @JorgePerez-bu4ph 6 months ago

      @@AccessAnalytic Does it matter at which step the date-range filter is applied? I did it immediately after the source step; after that I have multiple steps adding and removing columns and applying some other data-cleaning transformations. It did not work. I received a bunch of new files over the weekend, all stored in the same SharePoint folder. I followed your instructions, but the latest files were not added after today's refresh. I changed RangeEnd manually and it worked. I'm importing data from PDF files and using the file creation date as my date column. What could possibly be wrong?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      @JorgePerez-bu4ph - I don't think it matters, as long as the column is of date/time type. This is new to me so I don't have much experience to draw on here, sorry. (The parameter setup is sketched just below this thread.)
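
    Note: for reference, a hedged sketch of what the two generated Date/Time parameter queries typically look like (the dates are placeholders - after publishing, the service derives both values from the refresh policy and the refresh date, which is why they are "set and forget"):

        // "RangeStart" parameter query (Manage Parameters > New Parameter, type Date/Time):
        #datetime(2023, 1, 1, 0, 0, 0) meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true]

        // "RangeEnd" parameter query - same pattern, later placeholder date:
        #datetime(2024, 12, 31, 0, 0, 0) meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true]

        // In the files query, the filter step references the parameters against a date/time column,
        // e.g. (assuming a [FileDate] column has already been added):
        // Table.SelectRows(AddFileDate, each [FileDate] >= RangeStart and [FileDate] < RangeEnd)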

  • @suheilsamara7955
    @suheilsamara7955 5 months ago +1

    This is brilliant!
    The issue is that my file names don't include any indication of the date, so my options are either the Date created or the Date modified column. What are the concerns with that approach? I'd have thought using Date modified is safer than the file name, because if I rely on the file name while the content is modified for any reason, those changes won't be picked up by the incremental refresh. Agree?

    • @AccessAnalytic
      @AccessAnalytic  4 months ago +1

      A file can potentially be opened at any point and the modified date may change (especially if opened online).
      It's just safer / more deliberate to use a date in the file name. (The folder-column option is sketched below this thread.)
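
    Note: a hedged sketch of the folder-metadata option discussed above - SharePoint.Files already exposes "Date created" and "Date modified" columns, so the RangeStart / RangeEnd filter can sit directly on one of them (site URL is a made-up example, and the parameters are assumed to exist):

        let
            Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Finance", [ApiVersion = 15]),
            // "Date created" is the more stable of the two folder columns:
            // "Date modified" can shift just because someone opens the file online
            InWindow = Table.SelectRows(Source, each [Date created] >= RangeStart and [Date created] < RangeEnd)
        in
            InWindow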

  • @AshishSingh-tw2vd
    @AshishSingh-tw2vd 6 months ago

    Great content sir ❤ The only problem I am facing with this is the dates. It would have been much better if you could have explained the purpose and role of all three dates (the parameterised one, "archive data starting" and "incrementally refresh data") as well.

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      Thanks. Not too sure what you mean sorry.

  • @zorankrekic3006
    @zorankrekic3006 6 months ago +1

    Simple and nice when I have a new file. What should I do when new data is appended to the last file? And is it possible to set incremental refresh on a dataflow?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago +1

      If you set the incremental refresh period to a couple of months, then any updates to files falling in that time period will be pulled in.
      Dataflows do have incremental refresh, but it requires Premium / PPU.

  • @kiasca3489
    @kiasca3489 6 months ago +1

    Thanks, really helpful. Is it possible to do this in Power Query for Excel, so it doesn't load the data again when refreshing Power Pivot?

  • @mjb4365
    @mjb4365 6 months ago +3

    Is this also available for Excel PQ, or just Power BI?

  • @aigbekennethomorodion4026
    @aigbekennethomorodion4026 6 months ago +1

    Does incremental refresh now work on data sources that don’t support query folding? Just curious or maybe I have missed something in any new Microsoft updates.

    • @AccessAnalytic
      @AccessAnalytic  6 months ago +1

      I think this non-query-folding technique is limited to SharePoint and Azure Blob Storage (maybe there are other storage locations too).

  • @okmr7706
    @okmr7706 6 months ago

    Hi Wyn, I have 2 questions:
    1. What happens if the current time is beyond RangeEnd? Should I adjust the value of RangeEnd to a future DateTime?
    2. I guess this only works for consolidating Excel files in a SharePoint folder, NOT for importing an individual Excel file via Web?
    Thank you Wyn, great video again!

    • @AccessAnalytic
      @AccessAnalytic  6 months ago +1

      This is only for consolidating files. You can't make RangeEnd look forward. What sort of scenario do you have in mind where a future-dated file is added on a regular basis?

  • @woliveiras
    @woliveiras 6 months ago +1

    Amazing video, thank you. Incremental refresh in Power BI is crazy - how does Microsoft manage to turn something simple into a hard thing?
    To me it's not clear which parameters are best to choose if I just want to add the current day to my dataset.
    And why would I update the entire last month (for example) if this is an incremental refresh? I already added it in my previous update. Do you understand my point?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      Yes, it's taken me a while to understand. If you're not concerned about previous days' data ever needing to be updated, choose daily (1 day) for the incremental refresh window (maybe consider 2 or 3 days in case an error is spotted). Then for historic storage you could do annual or monthly - it doesn't really make much difference.

  • @LukasMissias
    @LukasMissias 5 months ago +1

    Excellent video! Is it possible to apply this to SharePoint lists?

    • @AccessAnalytic
      @AccessAnalytic  5 months ago +1

      Thanks. I don’t know sorry.

    • @LukasMissias
      @LukasMissias 5 months ago

      @@AccessAnalytic Thanks for your attention 😀

  • @StephenBrincat
    @StephenBrincat 3 months ago

    Excellent. One question: would a datetime like 20/08/2024 00:00:00 work? I mean the time will always be midnight. (See the boundary sketch after this comment.)
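
    Note: midnight-only values are fine; the thing to watch is the boundary condition, so a row stamped exactly on a window boundary is only ever picked up once. A minimal M sketch (toy data; the RangeStart / RangeEnd parameters are assumed to exist):

        let
            // toy rows whose datetimes are always midnight
            Data = #table(
                type table [FileDate = datetime],
                {{#datetime(2024, 8, 19, 0, 0, 0)}, {#datetime(2024, 8, 20, 0, 0, 0)}}
            ),
            // keep one side exclusive: a value equal to RangeEnd (e.g. 20/08/2024 00:00:00)
            // belongs to the next window rather than being loaded twice
            InWindow = Table.SelectRows(Data, each [FileDate] >= RangeStart and [FileDate] < RangeEnd)
        in
            InWindow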

  • @AngelMartinez-zz2ww
    @AngelMartinez-zz2ww 2 days ago

    Thank you for your video, it has been very helpful. One question: if I upload my pbix file to the Power BI service with an "archive data starting" of 365 days and an "incrementally refresh data" of 90 days, and later I upload the file again with an "incrementally refresh data" of 10 days, does it perform the 365-day "archive data starting" load again? The problem is that I have daily CSV files covering 11 months and Power BI is not able to run more than 9 months of them, and I don't know how to do the initialisation. Thanks

    • @AccessAnalytic
      @AccessAnalytic  1 day ago

      Yes I believe that any re-publishing will result in a full refresh.
      Why can't it run more than 9 months?

    • @AngelMartinez-zz2ww
      @AngelMartinez-zz2ww 1 day ago

      @@AccessAnalytic Because they are very heavy files, with many fields, and several transformations and joins are made between them. I have a daily file since 1 January 2024, and by the end of October the refresh wasn't finishing because of the amount of data it had to process, until I saw your video - but I have the initialisation problem. My idea was to do an initialisation with 9 months of data and a refresh window of 70 days, and once that was consolidated decrease the refresh window to 5 days.

    • @AccessAnalytic
      @AccessAnalytic  1 day ago +1

      @AngelMartinez-zz2ww I’d be tempted to consolidate and export them into 2 or 3 big CSVs then use those instead of the daily files

  • @samirvaghasiya9918
    @samirvaghasiya9918 6 months ago +1

    Thanks :)) Very informative

  • @gopichand5717
    @gopichand5717 6 months ago +1

    As usual great video😊

  • @marupbi
    @marupbi 5 months ago +1

    Amazing Job🔝

  • @munawarhussain1400
    @munawarhussain1400 5 months ago

    I have completed the setup. However, when I publish a file with incremental refresh and then go to the scheduled refresh, it shows a refresh error: "column Debit is not available", although the column is available in the dataset and there is no error in Power BI Desktop. However, when I turn off incremental refresh and publish again, there is no such error. Any idea how I can solve this?

    • @AccessAnalytic
      @AccessAnalytic  5 months ago

      That’s a new one to me, not heard of that before sorry.

  • @iiiiii-w8h
    @iiiiii-w8h 6 months ago

    Please can you show us this for dataflows? I'm pretty sure I have a PPU licence assigned by my company but I've never used any of its capabilities.

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      I briefly show where you have to click at 10:56 in the screenshot that appears. It's the same process, but it requires PPU or Premium capacity, which you say you may have (in the workspace settings, assign the workspace to PPU).

  • @shidubravinarunasalam1995
    @shidubravinarunasalam1995 4 months ago

    Hi, is this option available in Power BI Desktop on-premises (Report Server)? I couldn't find it when right-clicking on the table - I followed the same steps you did, but the option isn't available there. If I open the file in the regular Power BI Desktop I can do this, but then uploading the huge file to the cloud isn't possible since I only have a Power BI Pro licence. Any other suggestion from your side?

    • @AccessAnalytic
      @AccessAnalytic  4 months ago +1

      I'm not familiar with Power BI Desktop for Report Server (I assume that's what you mean by on-prem).
      I know it's limited in functionality compared to the standard Desktop.

    • @shidubravinarunasalam1995
      @shidubravinarunasalam1995 4 months ago +1

      @@AccessAnalytic Yes, I mean Power BI Report Server.

  • @worldofdata
    @worldofdata 5 days ago

    Would this replace old data or just add new data?

  • @Bhavik_Khatri
    @Bhavik_Khatri 6 months ago

    Thank you for the video. Can you please share the files, too?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      You’re welcome. There won’t be much point as the query will immediately error as you won’t have access to the source files.

  • @RonDavidowicz
    @RonDavidowicz 6 months ago

    Is there an equivalent for Excel?

  • @FahadHameed-uq5zg
    @FahadHameed-uq5zg 6 months ago

    Can you show this in a slightly easier, more understandable way? It is too complex, sir.

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      I tried my best to make it as simple as possible. Is there a particular part you’re struggling to understand?

    • @AccessAnalytic
      @AccessAnalytic  6 months ago

      Maybe start with watching this one ua-cam.com/video/-XE7HEZbQiY/v-deo.htmlsi=n9s8VFSD61yxDK9_