Maven Power Challenge - Power Query Clean - Pt2

  • Published Oct 1, 2024

COMMENTS • 21

  • @onlyoneadeleke
    @onlyoneadeleke 9 months ago +1

    @dganalysis I keep getting this error: "STATE_NAMES
    Failed to save modifications to the server. Error returned: 'Column
    Index' in Table 'STATE_LIST' contains a duplicate value '3088' and
    this is not allowed for columns on the one side of a many-to-one
    relationship or for columns that are used as the primary key of a
    table." Any idea on how to go about it would save my day.
    Thanks

    • @dganalysis
      @dganalysis 9 months ago +1

      Remove the auto detect relationships between your tables, and it should load without issue. It sounds like you have a duplicate state name in one of your tables
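      The duplicate-key problem Gerard describes can be illustrated outside Power BI. A minimal Python sketch (with hypothetical state data, not the challenge's actual rows) showing why the "one" side of a one-to-many relationship must contain unique key values, and how keeping only distinct rows resolves it:

      ```python
      # Minimal sketch (hypothetical data): the "one" side of a
      # one-to-many relationship must have unique key values, which is
      # what Power BI's duplicate-value error is enforcing.

      state_list = [
          (3088, "Texas"),
          (3088, "Texas"),   # duplicate key row: relationship would fail to load
          (3090, "Ohio"),
      ]

      def has_duplicate_keys(rows):
          """Return True if any key (first tuple element) appears more than once."""
          keys = [k for k, _ in rows]
          return len(keys) != len(set(keys))

      print(has_duplicate_keys(state_list))  # True: this table cannot be the one side

      # Fix: keep only distinct rows before using the table as a dimension,
      # analogous to Remove Duplicates in Power Query.
      deduped = sorted(set(state_list))
      print(has_duplicate_keys(deduped))  # False: safe for a one-to-many relationship
      ```

      In Power Query terms, this is the Remove Duplicates step applied to the dimension table before the relationship is created.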

    • @onlyoneadeleke
      @onlyoneadeleke 9 months ago +1

      I'll do just that.
      I got the same error for the NERC table after cleaning the NERC region column.

    • @dganalysis
      @dganalysis 9 months ago

      @@onlyoneadeleke And make sure that if you apply a relationship, it is between the index numbers, not between state names or NERC region names

  • @refkatheanalyst264
    @refkatheanalyst264 9 months ago +1

    It would be a great opportunity and very helpful if you could create the visualization for this project. Thank you

    • @dganalysis
      @dganalysis 9 months ago +1

      Yes, I plan to do this in the new year

    • @refkatheanalyst264
      @refkatheanalyst264 9 months ago

      @@dganalysis Thank You, I'm waiting for that

  • @nash_life
    @nash_life 10 months ago +2

    Thank you Gerard.

  • @vanlessing
    @vanlessing 10 months ago +1

    About 10 years ago I power-queried similar messy Excel files based on election results; headers/columns etc. changed over the years. At least no timestamp was involved. Great PQ session. Thx

    • @dganalysis
      @dganalysis 10 months ago

      Thank you. I think it's important for people to practise this type of cleaning (especially if starting out). In many companies, you might have to perform all steps in cleaning, transformation, loading, modelling, analysis and visualisation, and the state of this data set is actually what you will likely come across in your job

  • @grow-with-abi
    @grow-with-abi 10 months ago

    I have a doubt, Gerard: since there are multiple indexes in the state list, that's a duplicate value, right? How could we load it to the front end?

    • @dganalysis
      @dganalysis 10 months ago

      Create a model connecting the main table to the state table. Use the index number as the primary and foreign keys
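      The model Gerard describes can be sketched in plain Python (with hypothetical data and column names, not the challenge's actual schema): the fact table carries a state index as its foreign key, the state table maps each index to a name as its primary key, and lookups resolve through the index rather than through name strings:

      ```python
      # Sketch of an index-keyed star schema (hypothetical data): joining
      # on the index number avoids mismatches caused by inconsistent
      # state-name spellings in the raw data.

      state_dim = {1: "Texas", 2: "Ohio"}   # index -> state name (primary key)

      fact_rows = [                          # each fact row carries the foreign key
          {"state_index": 1, "outage_minutes": 120},
          {"state_index": 2, "outage_minutes": 45},
          {"state_index": 1, "outage_minutes": 30},
      ]

      # Resolve each fact row's state name through the index relationship
      enriched = [
          {**row, "state": state_dim[row["state_index"]]}
          for row in fact_rows
      ]
      print(enriched[0]["state"])  # Texas
      ```

      This is the same shape Power BI's model view enforces: the dimension table holds one row per index, and the relationship filters the fact table through that key.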

    • @grow-with-abi
      @grow-with-abi 10 months ago

      @@dganalysis Thanks!!

  • @andrew_hubbard
    @andrew_hubbard 10 months ago +1

    Many thanks for this Gerard, it helped me get past my sticking point in preparing the data for this challenge.

    • @dganalysis
      @dganalysis 10 months ago

      That's great, glad to hear it.
      I'll keep a lookout for your entry

  • @royalhouse7778
    @royalhouse7778 10 months ago +1

    This was so helpful! Thank you Gerard!!!

    • @dganalysis
      @dganalysis 10 months ago

      Thanks, happy it helped out - hope you join in the challenge.

  • @wwpharmacist
    @wwpharmacist 10 months ago +1

    Thank you Gerard,
    learned a lot from you

    • @dganalysis
      @dganalysis 10 months ago

      Thanks, hope it helped. Will you be taking part in the challenge?

    • @wwpharmacist
      @wwpharmacist 10 months ago

      Unfortunately my laptop (8 GB RAM) freezes when I open Power Query; I don't know what the problem is. Waiting for my new laptop.
      I cleaned all the date columns without M code, but your way is better.
      Your way of slicing and dicing the data helped me a lot.
      Thanks Gerard

    • @dganalysis
      @dganalysis 10 months ago +1

      @@wwpharmacist Sometimes you can stage your tables, or turn off auto refresh on the tables (right-click on the table); it can stop the queries refreshing all the way back to the source data.