Check out our newly launched M Language course ↗ - goodly.co.in/learn-m-powerquery/
This walk through was fantastic and easy to understand. It was exactly what I was looking for too. Thanks very much for sharing this!
Hats off to you sir, goodly, for this brilliant and clean way to address the challenge. It is exactly what I am looking for. Great instruction. Really appreciate it.
Thanks Veer! I've been trying to do this for two and a half years, and today I finally found your perfect video.
Your video helped me make a SCD type 2 using an external table! Thanks 🙏
Hi,
Firstly, I'd like to extend my gratitude for your invaluable Power BI videos. They are remarkably easy to understand and immensely informative.
I have a specific request. Could you please produce detailed videos on the WINDOW and OFFSET functions, encompassing a wide array of examples? I've scoured available resources and found a significant gap in comprehensive explanations for these two functions. It would be particularly helpful if you could delve into date-related examples, such as determining a running total for every week or month and computing weekly or monthly averages. To give you a frame of reference, in SQL, we frequently employ window functions to calculate running totals or weekly average values.
Your guidance would be immensely beneficial.
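For readers who want the Power Query side of the running-total idea described above (the analogue of SQL's `SUM(...) OVER (ORDER BY ...)`), here is a minimal, hypothetical M sketch on made-up sample data; the table, column names, and values are illustrative, not from the video:

```m
// Hedged sketch: a running total in Power Query M,
// analogous to SUM(Sales) OVER (ORDER BY Month) in SQL.
let
    Source = #table(
        type table [Month = text, Sales = number],
        {{"Jan", 100}, {"Feb", 150}, {"Mar", 120}}
    ),
    // Buffer the column so it isn't re-evaluated for every row
    sales = List.Buffer(Source[Sales]),
    // Running total: for each row n, sum the first n values
    running = List.Transform(
        {1 .. List.Count(sales)},
        (n) => List.Sum(List.FirstN(sales, n))
    ),
    Result = Table.FromColumns(
        Table.ToColumns(Source) & {running},
        Table.ColumnNames(Source) & {"Running Total"}
    )
in
    Result
```

On the sample data this yields 100, 250, 370. Note this approach is O(n²); for large tables, pushing the work to the source database (as a later comment suggests) is usually faster.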
That was amazing to watch!!- I’m going to have to watch it over and over, really love your approach to these problems Chandeep!!
Awesome! Thank you!
ChatGPT couldn't explain or solve this issue the way you did, not in a million years. 🙂 Thank you so much
Enjoy! 😉
This is awesome, Chandeep. Brilliant! Love the clean PQ way of extracting previous row - I have always used the good old row index approach. I tell you, there are a number of my reports I will be updating in the coming days :-D
Thanks for the video 👍😃
Hey Bare! Glad you liked it!
Cheers
Thanks for this video! :) Not to take anything away from the Power Query expertise here, but I found that code like this runs relatively slowly in Power BI; pushing the problem down to the source database, where it's easily solved, worked better for me, especially when building reports of 400–500+ MB.
It's also a question of how many lists/objects/structs are being used, how quickly your reports compile and become ready, how well query folding works in such solutions, and how they behave in Premium reports over one GB,
versus how much simpler a SQL solution would be, with no additional columns, array_agg, or list-based tricks, just relational architecture.
I hope everyone weighs these factors too and picks the right place to build their solutions! :) Especially since most of us have access to a database.
Awesome, Chandeep. I wish you had explained the DAX formulas too.
This is way over my head… Will have to watch again 🙈
This is crazy.. You are AWESOME!!!!
Thank you 💚
Thanks for sharing this Chandeep. I have a bias for Power Query. And if something can be done better in PQ than DAX, won't even think about touching DAX.
I am a bit wary of doing everything in Power Query. I tend to use Power Query to shape data until my DAX becomes ridiculously simple.
Plus it's a creative process, there are no hard rules. See what works, make it accurate and fast then move on :)
Genius.
This saved my ass at work today. Keep it up
Glad it was helpful!
Really enjoyed watching your PQ solutions! Thanks 😊😊
Glad you enjoyed it 💚
This is what I wanted. Thanks Chandeep, you are brilliant!
Glad it was helpful!
Thank you so much Chandeep for your great efforts.
Glad you like it 💚
You have explained this in a very nice and clean way... love it!
Thank you 💚
excellent!!
Thanks, this is awesome, I applied in my work it was really helpful
Glad it was helpful 😊
Excellent
Thanks 💚
Brilliant, Chandeep...!! Let me tweak the offboarding part... :)
Sure.. Also recently recorded a video on basics on M.
That should be helpful to understand different syntactical requirements.
Cheers
@GoodlyChandeep Many thanks, Chandeep! I guess Basics of M is yet to be published...
Thank you for the video and the different ways to run this operation. Unfortunately, none of the options is really fast when you have hundreds of files with millions of lines. Even a file with a hundred thousand lines slows things down terribly. I now run a Python script beforehand to do this operation and the calculations; Python does the job in seconds. It would be great to run the Python script directly from Excel to keep everything in one closed environment.
This is very helpful! Instead of null as the date for the first row, I want that row to show its own date. How do you do it?
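One possible answer (not from the video): if the shifted "previous" column is null in the first row, that null can be replaced with the row's own value. A minimal sketch, assuming a hypothetical table with a [Date] column and a shifted [Previous Date] column:

```m
// Hypothetical example: [Previous Date] is null in the first row
// because there is no previous row to pull from.
let
    Source = #table(
        type table [Date = date, #"Previous Date" = nullable date],
        {
            {#date(2024, 1, 1), null},
            {#date(2024, 1, 2), #date(2024, 1, 1)},
            {#date(2024, 1, 3), #date(2024, 1, 2)}
        }
    ),
    // Where [Previous Date] is null, fall back to the row's own [Date]
    FillFirst = Table.ReplaceValue(
        Source,
        each [Previous Date],
        each if [Previous Date] = null then [Date] else [Previous Date],
        Replacer.ReplaceValue,
        {"Previous Date"}
    )
in
    FillFirst
```

The `each`-based old/new value pattern lets `Table.ReplaceValue` look at other columns of the same row while replacing.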
Thanks for your sharing, Mr. Chandeep.
Would you mind showing me how to get previous row for multiple columns in Power Query?
This is basically magic 😮
cool man!
really good work.....
Thank you!
I think you should do a video on variables in Power Query. I've never really understood them. Yes, I know `each` is sugar for `(_) =>` and I've used them, but sometimes you declare them before `let` and sometimes after?
Anything after the arrow sign => becomes a function.
It depends on which part of the query you'd like to declare as a function.
A video on the basics of the M language is coming soon :)
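To illustrate the distinction discussed above, here is a small, hypothetical M sketch (the names `fnAddTax` and `taxRate` are made up for illustration): a parameter declared before `=>` makes a function, while a name bound inside `let` is an ordinary step that is evaluated once:

```m
// Sketch: where a name is declared changes what it is.
let
    // fnAddTax is a one-parameter function; "price" comes before "=>"
    fnAddTax = (price as number) as number => price * 1.18,

    // taxRate is an ordinary let-variable, evaluated once
    taxRate = 0.18,

    // a function can also close over let-variables defined beside it
    fnAddTax2 = (price as number) => price * (1 + taxRate),

    Result = fnAddTax(100)   // 118
in
    Result
```

And `each price * 1.18` would be yet another spelling: sugar for the one-parameter function `(_) => _ * 1.18`, with `_` as the implicit parameter.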
Hi Goodly,
When I was trying to implement the DAX model for this task, I got this type of error: "the arguments in the GENERATESERIES function cannot be blank".
Either your start date or end date is blank. You should fix that!
How do I bring the calendar table into the data model?
Is there any way to speed up the PQ editor? My table has 5,000 rows and my query (fifteen steps) takes forever to refresh. Each additional step takes 5–6 minutes to execute. CPU and memory usage is VERY high. My machine is pretty fast, so I don't know what's going on. PQ is supposed to be able to handle big data.
Take a look at the buffer functions (Table.Buffer, List.Buffer) and consider using custom functions for some tasks. Avoid `if`s as much as you can.
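As a rough illustration of the buffering advice above (a hedged sketch with made-up data, not the video's exact query): materializing a table or list once in memory can prevent upstream steps from being re-evaluated on every row access, which is a common cause of the slowdown described:

```m
// Sketch: buffer once, then reuse, instead of re-reading the source
// for every row lookup (e.g. when building a "previous row" column).
let
    Source = Table.FromColumns({{1 .. 100000}}, {"Value"}),

    // Materialize the table in memory a single time
    Buffered = Table.Buffer(Source),

    // Shift the column down by one: null, then all but the last value
    PrevValues = List.Buffer(
        {null} & List.RemoveLastN(Buffered[Value], 1)
    ),

    AddPrev = Table.FromColumns(
        {Buffered[Value], PrevValues},
        {"Value", "Previous Value"}
    )
in
    AddPrev
```

Buffering trades memory for speed, so it helps most when the same data is read many times; it also breaks query folding, so it's best applied after any steps the source can fold.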
You did not mention anything about the grouping, so I'm unable to proceed with this valuable practice.
Hello! A question on how to make a Table.Group dynamic, since the date columns increase and decrease each time I refresh the query:
Table.Group(#"Remove", {"Prod Line"},
{
{"Date1", each List.Sum([Behind]), type nullable number},
{"Date2", each List.Sum([#"05/24"]), type nullable number},
{"Date3", each List.Sum([#"05/25"]), type nullable number},
{"Date4", each List.Sum([#"05/26"]), type nullable number},
{"Date5", each List.Sum([#"05/27"]), type nullable number}
}
)
Note: I don't unpivot the other columns into a dynamic column because there are millions of records, and that slows things down.
Thanks in advance, and success with your YouTube channel!
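One possible way to make that grouping dynamic (a hedged sketch, not a tested answer): build the aggregation list from whatever columns exist at refresh time. This assumes the commenter's own step name `#"Remove"` and the grouping key `"Prod Line"`; everything else is derived:

```m
// Sketch: dynamic Table.Group - aggregate every column except the key,
// instead of hard-coding "05/24", "05/25", ... by name.
let
    Source = #"Remove",  // the same previous step as in the comment above

    // Every column except the grouping key becomes a List.Sum aggregation
    valueCols = List.RemoveItems(Table.ColumnNames(Source), {"Prod Line"}),
    aggs = List.Transform(
        valueCols,
        (col) => {col, each List.Sum(Table.Column(_, col)), type nullable number}
    ),

    Grouped = Table.Group(Source, {"Prod Line"}, aggs)
in
    Grouped
```

Each aggregation spec is `{new name, function over the group's sub-table, type}`; here the output columns keep their source names, but `valueCols` could be transformed further if the `Date1`, `Date2`, ... renaming is required.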
Power Query takes too long to load a million rows... do you have any solution for that, bro?
Watch this - ua-cam.com/video/UPddzZnsf5w/v-deo.html