Where to create your columns in Power BI | Data Modeling Best Practices

  • Published 10 Jul 2024
  • Patrick continues the journey of data modeling best practices by looking at where to create your columns within Power BI: in the data source, in Power Query, or using a DAX calculated column?
    High Memory Usage and Calculated Columns (@tlachev)
    prologika.com/high-memory-usa...
    VertiPaq Analyzer (Free): www.sqlbi.com/tools/vertipaq-...
    Guy in a Cube courses: guyinacu.be/courses
    **Gear**
    Check out my Tools page - guyinacube.com/tools/
  • Science & Technology

COMMENTS • 124

  • @dangelo90 · 4 years ago +1

    I have recently started expanding my knowledge with PBI and your channel has amazing information, examples and tips. I appreciate your work very much! Thank you for your efforts!

  • @aubreymckillop9156 · 4 years ago

    Thanks Patrick! I have asked this question so many times. Luckily, in most ways I have been doing as you describe, with views being my first stop as much as possible.
    You also showed me some new tricks. Love the videos!

  • @driouchemountasir7894 · 4 years ago

    Hi Patrick! Thanks for the advice and for sharing your tips and best practices. It's a great help.

  • @pisastero · 4 years ago +3

    Thanks so much! After a month of struggle, I learned to push the calculated column back to the data source using SQL. Glad to hear I was on the right track.

    • @GuyInACube · 4 years ago

      Awesome! Assuming it helped.

  • @navjeet41 · 4 years ago

    BAM. Like your energy. Been building knowledge blocks with SQL to get better at this. Good work.

  • @sushmitmehta1415 · 4 years ago

    Thanks Patrick! Keep uploading, it helps us a lot! Best wishes from India

  • @KN-wi4ns · 2 years ago

    Thanks for sharing your knowledge…

  • @JohnSpanos · 4 years ago

    Great info, thanks!

  • @Jaszczer93 · 6 months ago

    Great content, helped me understand the process of creating columns in Power BI, which was previously unclear to me.

  • @PeterNovakIng · 4 years ago +2

    I like these videos because I know where I can find the stuff I need when I don't remember where or how to do it. Thank you for your help, Patrick!

    • @GuyInACube · 4 years ago +1

      Most welcome! Glad we can help you out 👊

  • @HeliSal700 · 4 years ago +11

    Great summary on this topic. Thank you.
    You can summarise this in one sentence: add or transform columns as early as possible, ideally at the source.

  • @kaltoumrachdi2086 · 4 months ago

    I can't thank you enough for this informative video

  • @swerick · 4 years ago +2

    Awesome topic, Patrick. This is a question I constantly ask myself. Thanks for the info!

    • @GuyInACube · 4 years ago

      Most welcome. It is definitely something that should be thought about.

  • @kyleparsons1735 · 4 years ago +1

    Such helpful advice. Thanks again! This channel is fantastic!

    • @GuyInACube · 4 years ago

      Love it! Appreciate the kind words. Thanks for watching 👊

  • @FredKaffenberger · 4 years ago +4

    Absolutely Patrick. DB > PQ > CC. The one thing I would add as a Pro user is dataflows, so: DB > DF > PQ > CC. Dataflows mean we have a stable starting point for ad hoc reporting where we need something that's not in the certified datasets.

    • @GuyInACube · 4 years ago +2

      YES! Dataflows can definitely help in that space and allow business users to pull from a consistent location without worrying about what's downstream. Thanks for calling that out Fred. 👊

  • @jatindersingh2241 · 3 years ago

    Simply Awesome !!!!!!!

  • @OmisileKehindeOlugbenga · 4 years ago

    Thanks a lot Patrick

  • @hermesjrd · 4 years ago +2

    Patrick, thanks for the tips and best practices.

    • @GuyInACube · 4 years ago

      Most welcome! Thanks for watching.

  • @2404Pepe · 3 years ago

    Excellent, dude!!! Thanks so much

  • @giacomosorbino5789 · 3 years ago

    You're great!!!! Thanks for your tips... ultra useful!! Regards from Italy

  • @marleyozziestekkers9791 · 2 years ago

    Thanks bro, using the database is definitely the best option

  • @EricaDyson · 4 years ago +1

    Good advice. Keep it coming!

    • @GuyInACube · 4 years ago

      Much appreciated! Thanks for watching 👊

  • @kyiwinhtun · 4 years ago +1

    That is awesome!!! Thanks for the info :D

    • @GuyInACube · 4 years ago

      Most welcome! Thanks for watching. 👊

  • @JackOfTrades12 · 4 years ago +5

    Thanks Patrick, this is almost spot on to what I do, though I avoid M unless I really need it like with JSON parsing. In my experience when working with other analysts, M becomes difficult to reverse engineer when there are 15+ steps, especially when they are nested steps. Plus, like you said, it can greatly increase refresh time if not done well.
    Since I don't have perms to create views in many sources, I use the native query option as my source, and write everything I can in SQL unless I need to reference another table after processing.
    I will then write the rest of my columns in DAX if SQL couldn't be used, or if I'm testing different methods since DAX doesn't require a full refresh.
    The last best practice I have implemented is field name prefixes that make it easier to determine what type of field I'm working with and where it came from. It's not business user friendly, but our business users typically consume the report from the service and we rename all columns in the visuals.
    This concept especially helps when you've done a bunch of nesting and you don't want to scroll through the entire list of fields.
    I do know about folders, but they aren't friendly enough yet as you have to use the Model tab to edit folders.
    Prefixes:
    '_sc_' : Sql Calc Column : Generated column in the view/query, not native to the table
    '_mc_' : M Calc Column
    '_dc_' : DAX Calc Column
    '_dm_' : DAX Measure
    _sc_FullName would be the name I give the column you added in the video if done in SQL.
    _dc_FullName if done in DAX.
    _dm_CustomerCount = COUNTROWS(Customer) // Would be my DAX measure name
    Let me know what you think.
    Thanks!

    • @GuyInACube · 4 years ago +1

      This is great advice and practice. Thanks for sharing!!! Never thought about that approach, but makes a lot of sense.

    • @arklur3193 · 4 years ago +3

      I kind of disagree with the prefixes and wouldn't recommend them at all. I know it somewhat makes it easier to read "what's what and from where", but I seriously question the usefulness of such a thing. I can already separate calculated columns and measures from other objects in a table (I can create folders, and PBI gives them a different icon), and if I ask myself the question "Is it really important for me to see in the model whether a column came from the DB or was created in Power Query M?", the answer is no. For me it makes a model scary to look at.
      And don't get me wrong, I'm not fully against this, just giving my thoughts in the spirit of "I don't like it, please people don't do this" :). Do as you do.

  • @jlmacalisang · 3 years ago

    Wow! Nice video you got there! and also I like your T-Shirt! hehehe...

  • @OPetry1 · 4 years ago

    Thank you sooooo much!

  • @takuram4616 · 4 years ago +4

    Push it real good!!!, glad to know I am doing so already :-)

    • @GuyInACube · 4 years ago

      Woot! That is great to hear. 👊

  • @alfredlear4141 · 4 years ago +3

    Great tips.
    My PBI reports are very CPU- and memory-intensive.
    Coming from Power Pivot I do love DAX, but with some multi-million-row datasets I need to reduce this as much as possible.

    • @GuyInACube · 4 years ago +1

      YES! You can get some serious performance improvements by doing a few basic things. I'm always amazed at the gains you can get from a few tweaks.

  • @DavidBridgeTechnology · 3 years ago

    Patrick, I am totally with you on where to put the calculation (as close to the data as possible), but I am not so sure about the extra memory required by a DAX calculation.
    If you add a column in the source or PowerQuery or Dax then you still have to load and store the data in the pbix, and theoretically (as long as the data is the same) the size should be the same.
    However, I thought I had better prove this to myself before commenting so I tried it.
    In a SQL view I added
    [fullname SQL] = CONCAT_WS(' ',[FirstName],[MiddleName],[LastName])
    In Power Query I created a column using the column-from-examples method (and yes, it does create double spaces here if no middle name exists, but it's pretty close)
    #"Inserted Merged Column" = Table.AddColumn(dbo_pbi_Person, "Fullname - PowerQuery", each Text.Combine({[FirstName], " ", [MiddleName], " ", [LastName]}), type text)
    and finally, I added a DAX calculated column (which I have formatted to ensure it is DAX ;-))
    fullname DAX =
    var firstname = pbi_Person[FirstName]
    var middlename = IF(ISBLANK(pbi_Person[MiddleName]),"", " " & pbi_Person[MiddleName])
    var lastname = IF(ISBLANK(pbi_Person[LastName]),"", " " & pbi_Person[LastName])
    RETURN
    (firstname & middlename & lastname)
    Dax Studio reports the data size as 40k for each column, hierarchy always 160k and, for the SQL View and PowerQuery, the Dictionary space is roughly 600k, however for the DAX calculated column the dictionary space is 1,484k which is quite a lot higher. This makes the overall data size for DAX twice that of the other two methods.
    So whilst it is true that DAX calculations are using more memory, it's not because of the data; it's just the dictionary. Any idea why this is?
    I used the Person table in AdventureWorks for this test.
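    As a hedged aside on the double-space issue mentioned above: Text.Combine accepts a separator argument, so combining only the non-null parts avoids the extra space. A minimal sketch against the same hypothetical dbo_pbi_Person query:

```
#"Merged Full Name" = Table.AddColumn(
    dbo_pbi_Person,
    "Fullname - PowerQuery",
    // join only the name parts that exist, separated by a single space
    each Text.Combine(List.RemoveNulls({[FirstName], [MiddleName], [LastName]}), " "),
    type text
)
```

    Because nulls are removed before combining, a missing middle name produces "First Last" with a single space.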

  • @douglasgarciatorres9767 · 4 years ago

    Thanks Patrick!
    What if you need to implement dynamic currency conversion between several currencies and different exchange rates in a fact table (e.g. sales from different countries) with 1 billion rows?
    a) Would you implement that as one column per currency in the data source?
    b) as one column per currency in the Power Query editor?
    c) would you go for DAX?

  • @Xperiun_ · 4 years ago +2

    Great as always. I have the same feeling you expressed at the beginning of the video: why do people hear all this advice that we always give and never follow it?! That's crazy :D
    Thanks for the tips Patrick, you are amazing!

    • @GuyInACube · 4 years ago +1

      haha love it! Appreciate it Leonardo! 👊

  • @lydriarivera1692 · 4 years ago

    This was great. I have a question: I am having issues setting up the table. What guidance can you give me?

  • @DanyAllard · 4 years ago

    Thanks Patrick! Nice T-shirt!

    • @GuyInACube · 4 years ago

      You are welcome. One of my favorite shirts.

    • @anun_ymoose · 4 years ago

      where can we get one?

  • @culpritdesign · 2 years ago

    Creating dashboard views in the database is a great idea! Do everything at the lowest level, create development workflow, organize the developers, create governance processes and owners.

  • @sidharathsarvesh6904 · 4 years ago

    Thanks Patrick! I was wondering if we can combine/merge the output data with homogeneous data in a different input sheet. To explain further: my Power BI dashboard displays category-wise data for a month. I have another input sheet with information on the category-wise trend over the last few years. Now I want to combine the latest data from the dashboard with the historical trend data and display that in a chart/graph. Also, the trend sheet has to be UNPIVOTED before creating a chart. I've been banging my head against a wall on this one for a while. Really appreciate any help, please!

  • @manu143gr · 4 years ago

    Hi Patrick,
    When we are not the owner of the DB, in SQL Server we can instead click on the source settings icon and add a new SQL query there, such as (FN+MN+LN) as FullName.
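    For instance, a native query along those lines (a sketch; CONCAT_WS requires SQL Server 2017 or later, and the table/column names are illustrative):

```sql
SELECT FirstName,
       MiddleName,
       LastName,
       -- CONCAT_WS skips NULL arguments, so no double separators appear
       CONCAT_WS(' ', FirstName, MiddleName, LastName) AS FullName
FROM dbo.Customer;
```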

  • @MrDaverock69 · 2 years ago

    Hi Patrick, I am struggling with a custom sort for a legend over my bar graph (the sort needs to be dynamic based on sales), and I was asked to use a calculated column. Not sure if you can point me to a video that addresses this. Thanks, I appreciate your videos, the best I have seen.

  • @srinivaschigullapalli6165 · 4 years ago

    This is really a great video. I have a quick question: I have Table A and Table B joined using a relationship, and I am using DirectQuery to pull the data from the database live. Is it possible to create a calculated column without merging these tables, or is it possible using a DAX formula? If so, please advise.

  • @David-yq6un · 4 years ago

    If we have to work with Excel (stored in SharePoint), which approach do you recommend? Azure SSIS to populate a table?

  • @MusicPancho · 3 years ago

    Hello @Guy in a Cube, awesome video. At 4:30, for the list of views imported to Power BI, which method are they connected with? SQL Server Import, DirectQuery, or Live connection? :)

  • @sme1983 · 2 years ago

    Hi, is it possible to create a custom column that can have cells which are manually editable? E.g. a comments column where I can add my own text to each cell. Thanks

  • @granand · 3 years ago

    Patrick, thanks a ton bro. I just started and I had a situation: in the data model I can see my 30K+ rows showing up in the filter, but in the Query Editor, even as I load more when prompted, the data I want to filter doesn't show up, forcing me to use DAX in the data model.
    Worse, the table created using DAX doesn't show up in the Query Editor for further processing. Please share your experience on how to handle this scenario.

  • @megmagruder7124 · 2 years ago

    Question: the data I'm pulling in from a Salesforce object is a super-wide table that I want to break up into many tables to get a lovely star schema. Is it better to connect to my object for each table, or to split it up after collecting the columns I want for all my detail tables?

  • @francoisdutoit8840 · 4 years ago

    Question: If you are preparing a dataset imported from a product using REST, but you want to create the Power BI connector from the queries, do you include only the tables, or do you include all the possible relationships between the tables too? My Power BI desktop mem usage jumped to 8GB after adding a bunch of relationships and became very slow. This is for a 2MB data test set. What would you suggest?

  • @shajulislam95 · 2 years ago

    Nice video

  • @jorgearmandoglez · 4 years ago

    Hello, thank you for your time giving us this great info. I have a question: what is the best practice to load only filtered rows? For example, on a sales table, I want to bring in only the rows sold in the last year. (I only have read access on the DB, so I can't create views.) Thanks in advance.

    • @alt-enter237 · 4 years ago

      I am sure you have already figured this out, but one possibility is to filter the older sales out in the Power Query Editor step. That way they never get loaded into Power BI.
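      A minimal sketch of that Power Query filter step (the [SaleDate] column name is hypothetical; with a relational source the filter can fold back to the database when query folding is supported):

```
#"Filtered Rows" = Table.SelectRows(
    Source,
    // keep only rows from the last year
    each [SaleDate] >= Date.AddYears(Date.From(DateTime.LocalNow()), -1)
)
```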

  • @ruthquintero3727 · 2 years ago

    If I add a column in the Data view, that's not going to show up in the Query Editor, right? Because I'm not adding it directly to my "data source"; or how would you explain it?
    Thanks in advance!

  • @zchase6631 · 3 years ago

    What if there are multiple fact tables, e.g. PO and PO items? Should I import all the related tables and manage the data model in Power BI?

  • @joaquinmaverick82 · 4 years ago

    Hi, a question... when I "unselect" columns, are they not loaded into Power BI, or are they loaded but hidden?
    I have a very large SQL query and would like to use it in several reports, but without loading all the data in each report.
    Keep being awesome!

  • @smitas6005 · 3 years ago

    I have a new project with 10 multi-tenant databases in MySQL. I would like to build a single data model applicable to all these DBs in order to reduce maintenance overhead. What would be the best data modeling architecture in this scenario?

  • @dmatthieu2 · 4 years ago

    Totally agree... and even before the existence of PBI, any BI developer should know this as standard practice. The DB server is made for robust processing, so use it first! Save your users the headache of poor report performance, especially when they need quick answers.

  • @WyneeKariuki · 4 years ago

    I have a question not related to the content in this tutorial. The "filled map" visuals available in the Power BI marketplace do not have a conditional formatting feature that allows users to define data colors. This function was previously available in one of the earlier versions of Power BI, but when I go to the data colors options within the filled map visual there is no longer a provision for conditional formatting. The choropleths were also able to automatically adjust the color schemes using filters in that version of Power BI. Is it possible for you to demonstrate how one could use the currently available choropleths to achieve the same (conditional formatting of data colors for the filled map visual, responding to filters)? It would also be wonderful to see how to upload and use shapefiles (.shp).

  • @bradpears8789 · 3 years ago

    Where can I find the screen that shows the tables, cardinality, table sizes, column sizes, dictionary size, etc.? I cannot find that!

  • @brucestrom5317 · 4 years ago

    Based on your comments, it appears to me that the two sales tables that were joined in a star table in Part 1 were really logically one table that could be constructed in a SQL VIEW using the UNION command, or UNION ALL if you know for certain that the rows in both SELECTs that are merged with UNION ALL are unique. Your example may be preferable if the two sales tables are from two sources, or if they are both SQL tables, from two separate databases. Is my thinking correct?

  • @tonyandlisab · 4 years ago

    Hi guys, maybe a simple solution, but I'm struggling to find an answer. I have a table of data which calculates the number of days between 2 dates (prompt payment of invoices data). How would I show these totals graphically? E.g. 60% 1-20 days, 30% 21-40 days and 10% 40+ days. Any advice you can provide will be greatly appreciated.

  • @mahesh84 · 4 years ago

    Great, thanks... I think it's better to create views and then import them into the report. That's the best way to improve report performance.

    • @GuyInACube · 4 years ago

      Agreed. If you can push it down to the data source, that is your best bet. But if you don't have access, you still have some options.

  • @powerbinoob8535 · 2 years ago

    How do you create a column using data from 2 columns in 2 different tables? I have a relationship between the 2 tables, but I can't do a simple IF statement to bring information from both tables into this new column. E.g. col A in Table A has 1 2 3 and col B in Table B has 3 2 1. I want to create a new column in Table B with: if col A = col B then "ok", else "not ok". I don't get to see the col B option via IntelliSense.
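    For what it's worth, when the relationship goes from the many side (Table B) to the one side (Table A), a calculated column on Table B can reach across with RELATED. A hypothetical sketch using the names from the question:

```
Match Check =
IF (
    'Table B'[Col B] = RELATED ( 'Table A'[Col A] ),
    "ok",
    "not ok"
)
```

    RELATED only works across an active many-to-one relationship; without one, the other column simply won't appear in IntelliSense.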

  • @neeharikakusampudi7375 · 3 years ago

    Is there a way to automate the creation of a column?

  • @himokya · 2 years ago

    Hi there...
    Is there any trick to have a few reports/visualizations in a PBI report where others can add text feedback, or an action-plan status against each item, like a status or chart?

    • @davidlopez-fe2lb · 2 years ago

      In PBI Service, yes; there's a comment button where folks can leave comments/feedback.

  • @elmerparica5494 · 4 years ago +2

    My new project is a heavily Excel-based environment; I have been telling them to move to SQL, and these are the questions that come up about pulling from a table versus views.

    • @GuyInACube · 4 years ago

      I feel your pain on the heavy excel environment. Moving to a central source can definitely be helpful.

    • @David-yq6un · 4 years ago

      @@GuyInACube What ETL do you recommend for getting the Excel source into SQL?

    • @GuyInACube · 4 years ago

      @@David-yq6un There are lots out there. If you stay with the Microsoft stack, you could go with SQL Server Integration Services (SSIS) or Azure Data Factory. If you didn't necessarily want to go to SQL, you could also have a look at Power BI dataflows, which are basically Power Query online.

    • @David-yq6un · 4 years ago

      @@GuyInACube Thanks for your answer. I have experience with on-premises SSIS, but we want to work with the Azure platform. Could you recommend any SSIS or ADF YouTuber? Someone as good as you two but on ETL products? 😅

  • @mohitgarg360 · 3 years ago

    What if we use a summarized table to separate the tables?

  • @eagillum · 4 years ago

    Wait, did I miss it? How do you push it back to the source without closing Power BI, updating the source data, then reloading into a brand-new pbix document? You update the source and then click refresh?

  • @ashleywilkins9644 · 3 years ago

    Can you help with creating an age category column? I work with medical data, and with each new report I have to recreate the age groups. Is there a way to create a column that says If (Table Name [Age] =
    0-10 - Child
    11-17 - Youth
    18- 30 - Young Adult
    31-45 - Adult
    46-64 - Mid Life
    65 - 100 - Senior
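    One common pattern for those bands (a sketch, assuming a hypothetical table named 'Patients' with an [Age] column) is a DAX calculated column using SWITCH with TRUE(), which tests the conditions top to bottom:

```
Age Group =
SWITCH (
    TRUE (),
    'Patients'[Age] <= 10, "Child",
    'Patients'[Age] <= 17, "Youth",
    'Patients'[Age] <= 30, "Young Adult",
    'Patients'[Age] <= 45, "Adult",
    'Patients'[Age] <= 64, "Mid Life",
    "Senior"
)
```

    Per the video's advice, the same banding could instead be pushed into the source as a CASE expression in a view so it doesn't have to be recreated per report.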

  • @joserojas625 · 4 years ago +2

    Hi,
    Do you know how to dynamically change the column names?
    We have a DB showing financial results for the last 12 months, but when we upload a new month (in April 2020 we upload March 2020 data), the tool will calculate results for the last 12 months. That means column #1 will no longer be Feb, because it will be March.

    • @GuyInACube · 4 years ago

      It's better to use a calendar table with a relationship to a date field in your fact table; then you won't need to worry about that.
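      A minimal date-table sketch along those lines (CALENDARAUTO scans the model's date columns to pick the range; marking it as a date table is a separate step):

```
Calendar =
ADDCOLUMNS (
    CALENDARAUTO (),
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM" )
)
```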

  • @rodrigocorrea6348 · 2 years ago

    Is it a bad practice to do data preparation in Python?

  • @hajaaa100 · 2 years ago

    I have tried, but when we do Close & Apply it loads all 6 lakh (600k) records, even though we reduced the rows in the manage parameters section.

  • @BernatAgulloRosello · 3 years ago

    @ferrarialberto made a recent video saying there's no great difference between a Power Query column and a DAX column. Of course the takeaway is to push it to the source if that's an option, but then again only if that source is you or somebody *really* reliable.

  • @johnhartley2438 · 4 years ago +1

    Never seen column from example before. When did it get released? Thanks for the tip

    • @GuyInACube · 4 years ago

      It has been around for quite a while. Check out this video for more examples: ua-cam.com/video/GUwtPIKtqO0/v-deo.html

    • @NeumsFor9 · 4 years ago

      @@GuyInACube Does "Column From Example" use the same SQL Server Data Mining algorithm that the Excel Fill From Example feature uses from the Data Mining Add-In? I believe it is the Logistic Regression algorithm if I remember correctly as it picks the most probable value based on your input.

    • @alt-enter237 · 4 years ago

      @@NeumsFor9 -- I am pretty sure you have figured this out by now, but I would be curious to know what you found. I would say that if it isn't the same algorithm it certainly BEHAVES like the same one.

    • @NeumsFor9 · 4 years ago

      @@alt-enter237 Honestly have not done the research, but I can tell you this.....It feels as though PBI is the greatest hits of SSAS, SSRS, SSIS, Performance Point, Proclarity, and SQL Server Data Mining.....all glued together and marketed to a more "low code crowd" in order to bridge the BA/SA/Data Engineer/BI Developer/Data Scientist gap.

  • @vrelearning5245 · 4 years ago

    Hi Patrick, how can we pull data directly from SAP (not Hana) to SQL?

    • @GuyInACube · 4 years ago

      Are you asking can you import data from SAP to SQL? If so, yes, but not with Power BI. You would need to use something like SSIS or Azure Data Factory.

  • @pieterlinden8292 · 1 year ago

    oooh, so create a schema for my PowerBI objects! Then put all my views in there, and just link to that.

  • @whats_next_31 · 2 years ago

    I have to concat 4 columns for 1 million rows, and I have to do it in around 20 tables to create a unique reference column. Where should I do it? As the data source is Excel, should I do it in Excel only and then import? Please help.

    • @davidlopez-fe2lb · 2 years ago

      1. Stop using excel as a main data source lol 2. I'd do it in Power Query Editor

  • @MortenHannibalsenOlsen · 4 years ago +1

    What about using a SQL statement in the source step? Where would you rank that solution?
    I often find it performs way faster than building the steps in the Query Editor (M), but maybe there is a reason you don't list it as an option?

    • @kevinwthornton · 4 years ago

      Morten, I've found that using a SQL statement as the source, while convenient, is usually a bad idea. Not only can the DBAs not see the code (so they can't let you know about a database change that will impact your query), but it is also harder to maintain. It's much better to move that statement to a view or stored procedure if possible, even if you are the DBA, which also allows for reuse with other reports. Sometimes you don't have a choice, particularly if you are querying data from a server you have no control over and the DBAs won't allow you to create views or stored procedures, but that should ideally be fairly rare.
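      Moving such a native query into a view might look like this sketch (all object names are illustrative):

```sql
-- the report then selects from dbo.vw_CustomerReport instead of
-- embedding the SQL text inside the pbix
CREATE VIEW dbo.vw_CustomerReport
AS
SELECT CustomerKey,
       FirstName,
       LastName,
       CONCAT_WS(' ', FirstName, MiddleName, LastName) AS FullName
FROM dbo.Customer;
```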

    • @MortenHannibalsenOlsen · 4 years ago

      @@kevinwthornton True, and Patrick also mentions that as the preferred option in the video. But, as also mentioned, sometimes that isn't an option. So my question is how a SQL statement would rank compared to creating the column in the Query Editor or as a calculated column.

    • @kevinwthornton · 4 years ago

      @@MortenHannibalsenOlsen In performance terms, it should be comparable to a view since it is still pushing the calculation back to the database server.

    • @arklur3193 · 4 years ago

      One thing though: hopefully you create a separate folder in Power Query, put all the queries (the SQL script/code) inside it, and then reference those, like "Source = Value.NativeQuery(DatasourceName, QueryString)". It makes the M code easier to read, and you can easily share your SQL code with others, which makes it much easier to review that code and make views out of it.
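      The pattern described there, sketched out (server, database, and query names are illustrative):

```
let
    // connection and query text kept as separate, reusable steps
    DatasourceName = Sql.Database("myserver", "mydatabase"),
    QueryString = "SELECT CustomerKey, FirstName, LastName FROM dbo.Customer",
    Source = Value.NativeQuery(DatasourceName, QueryString)
in
    Source
```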

    • @MortenHannibalsenOlsen · 4 years ago

      @@kevinwthornton indeed, that's why I wonder what the reasons are behind Patrick not mentioning it

  • @DenverBronx12 · 4 years ago +1

    It almost looked like, for the people who didn't have a middle initial, there were extra spaces between their first and last names?

  • @vinayakparchande7506 · 4 years ago

    Hello Patrick,
    I have 2 columns: total line count, which I have to reduce by total missed line count, with the result in a 3rd column as a %. I also have to filter the 3rd column to remove 100% and keep the below-100% lines. Secondly, how can I plot it country-wise, week-wise, and by the % column?

  • @Alexey0795 · 3 years ago +1

    10:22 How do you get such a pivot table?

    • @reed311 · 3 years ago

      VertiPaq Analyzer

  • @mario17-t34 · 2 years ago

    Sorry, where is that Query Editor like at @1:18? It looks like all the menus are running away.

    • @davidlopez-fe2lb · 2 years ago

      In PBI Desktop, if you click the "Transform Data" button on the ribbon, it'll open up the Power Query Editor for you.

  • @alessandrofrancisco9149 · 4 years ago

    Okay, but I can't use incremental refresh with views; they just don't work.

    • @GuyInACube · 4 years ago +1

      I haven't tested that, but it should work as long as Query Folding kicks in. Will add it to my list of things to try out.

  • @mariafriend644 · 4 years ago

    I just had this issue... timely .

  • @paulnandes2841 · 3 years ago

    You're a data warehouse extremist, man. A little column merge in Power Query? No? OMG. Maybe a huge portion of us build models by merging Excel exports from organizations where we have no control over, and can't influence, the ETL processes. M, Power Query and dataflows are great for doing ETL, don't you think?

    • @davidlopez-fe2lb · 2 years ago

      Luckily, if orgs have Power BI Service, they can leverage dataflows almost as a little "ETL" process themselves. That's what we have to do, seeing how we can't touch the database, just consume the data. In this example, "have your DBAs create a column": our DBAs would laugh at us, so we just consume the data from their DB and create our own ETL process.

  • @NeumsFor9 · 3 years ago

    Self-service MDM... the first steps :)
