@dganalysis i keep getting this error statement: "STATE_NAMES: Failed to save modifications to the server. Error returned: Column 'Index' in Table 'STATE_LIST' contains a duplicate value '3088' and this is not allowed for columns on the one side of a many-to-one relationship or for columns that are used as the primary key of a table." Any idea on how to go about it would save my day. Thanks
Remove the auto-detect relationships between your tables, and it should load without issue. It sounds like you have a duplicate state name in one of your tables.
I'll do just that.
I got the same error for the NERC table after cleaning the NERC region column.
@@onlyoneadeleke And make sure that if you apply a relationship, it is between the index numbers, not between state names or NERC region names.
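The duplicate-value error above comes from putting a non-unique column on the "one" side of a relationship. A minimal pandas sketch (hypothetical table and column names, not from the actual dataset) shows how to spot the offending values before building the model:

```python
import pandas as pd

# Hypothetical state lookup table: Index should uniquely identify each state,
# but the value 3 appears twice, so it cannot serve as a primary key
state_list = pd.DataFrame({
    "Index": [1, 2, 3, 3],
    "State": ["Lagos", "Kano", "Abia", "Abia"],
})

# Rows whose key value occurs more than once (these break the "one" side)
dupes = state_list[state_list["Index"].duplicated(keep=False)]
print(dupes["Index"].unique())        # -> [3]

# A valid key column must have no duplicates
print(state_list["Index"].is_unique)  # -> False
```

Dropping or re-indexing the duplicated rows until `is_unique` is `True` is the same fix the error message is asking for.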
It would be a great opportunity and very helpful if you could create the visualisation for this project. Thank you
Yes, I plan to do this in the new year
@@dganalysis Thank You, I'm waiting for that
Thank you Gerard.
about 10 years ago I power-queried similar messy excel files based on election results; headers/columns etc. changed over years. At least no time stamp was involved. Great PQ session. Thx
Thank you. I think it's important for people to practise this type of cleaning (especially if starting out). In many companies, you might have to perform all steps in cleaning, transformation, loading, modelling, analysis and visualisation, and the state of this data set is actually what you will likely come across in your job
I have a doubt, Gerard. Since there are multiple indexes for the state list, it's a duplicate value, right? How could we load it to the frontend?
Create a model connecting the main table to the state table. Use the index number as the primary and foreign keys.
@@dganalysis Thanks!!
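The primary/foreign key relationship described above can be sketched in pandas (hypothetical table and column names, chosen only for illustration). The `validate="m:1"` flag enforces the same many-to-one rule the model applies:

```python
import pandas as pd

# Hypothetical tables: state_list is the lookup ("one" side),
# readings is the main fact table ("many" side)
state_list = pd.DataFrame({
    "StateIndex": [1, 2, 3],           # unique -> valid primary key
    "State": ["Lagos", "Kano", "Abia"],
})
readings = pd.DataFrame({
    "StateIndex": [1, 1, 3],           # foreign key into state_list
    "HoursOfSupply": [5, 7, 2],
})

# Many-to-one join on the index column, mirroring the model relationship;
# validate="m:1" raises an error if the right-hand key is not unique
model = readings.merge(state_list, on="StateIndex", how="left", validate="m:1")
print(model)
```

If `state_list` contained a duplicate `StateIndex`, the merge would raise a `MergeError` rather than silently duplicating rows, which is the pandas analogue of the server error in the thread above.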
Many thanks for this Gerard, it helped me get past my sticking point in preparing the data for this challenge.
That's great, glad to hear it
I'll keep a lookout for your entry.
This was so helpful! Thank you Gerard!!!
Thanks, happy it helped out - hope you join in the challenge.
Thank you Gerard, I learned a lot from you.
Thanks - hope it helped - will you be taking part in the challenge?
Unfortunately my laptop freezes when I open Power Query (8 GB RAM); I don't know what the problem is. Waiting for my new laptop.
I cleaned all the date columns without M code, but your way is better.
Your way to slice and dice the data helped me a lot.
Thanks Gerard
@@wwpharmacist Sometimes you can stage your tables, or turn off auto refresh on the tables (right-click on the table); that can stop the queries refreshing all the way back to the source data.