Nice video, man! So is it possible now to say that Tableau Prep has become an ETL tool?
In the spirit of how you're asking the question, yes, but in general, no, I don't think so. It's a data preparation tool. Tableau's target customer here is the advanced user who doesn't want to become a DBA but wants to be empowered to fix issues in data sources themselves. ETL, in my view, is far more complex than just data prep; it includes things like staging, transformation, and database design, which this doesn't really do.
I have watched your videos of this and the Desktop crash course. I appreciate your effort to help people like me. The Tableau folks fixed the issue you faced with appending and replacing records in SQL Server, but I'm not sure about the case of Postgres. Thanks again.
Hi Tim, great video. I'm looking to import and upload a big group of tables into Redshift. It's easy to do a mass upload, but my challenge is how to write the tables without a lot of manual work! My biggest questions are: can I skip creating a new table name and default to the name of the input table? And can I do a mass write to Redshift to avoid using an S3 bucket?
Hi Tim,
Great video, but I have a couple of questions:
1) If the output table has an auto-increment value as its primary key, Tableau passes a null value, which violates the constraint. Is it possible to ignore that field while inserting?
2) Is it possible to perform an insert and update based on keys?
Sir, can you please explain the custom SQL at the output pane?
Sure, it's documented here: help.tableau.com/current/prep/en-us/prep_save_share.htm#output-options. To quote it: "Enter your custom SQL and select whether to run your script before, after or both before and after data is written to the database tables. You can use these scripts to create a copy of your database table before the flow data is written to the table, add an index, add other table properties, and so on."
Essentially, you can write some SQL that Tableau Prep will run alongside the data write. Maybe you need to do a specific operation after it writes the data; this will help with that.
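For example (a rough sketch only; the table and column names are made up, and the syntax shown is Postgres-style), a "before" script could snapshot the target table and an "after" script could add an index:

```sql
-- Before script: keep a copy of the target table before the flow overwrites it
-- (staging_sales and order_date are hypothetical names; adjust to your schema)
DROP TABLE IF EXISTS staging_sales_backup;
CREATE TABLE staging_sales_backup AS SELECT * FROM staging_sales;

-- After script: add an index once Prep has finished writing the rows
CREATE INDEX IF NOT EXISTS idx_staging_sales_order_date
    ON staging_sales (order_date);
```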
Hi bro, thanks for your video. I have a question: if I want to grant a user authorization to upload a file to one specific database table (to prevent other tables from being adjusted), how should I do that?
Tableau is a read-only tool, so something like that has to be done with a Tableau extension. There are a few write-back extensions in the Tableau Extension Gallery, but ideally you should build your own bespoke extension to do this.
Can I upload the Prep output as a dataset in Tableau CRM (aka Einstein Analytics) and get it scheduled, just like we schedule a recipe?
I tried this with a Teradata database, and every time I run it, it creates a blank table and says something like "No field match: data ignored". Since it's creating the field names and their data types automatically, I don't know what's causing the issue.
It works the same as MS Access: in order to append data, you will need to assign a key column first.
Hello Tim, thanks for the video. I have a question:
Every day I receive a CSV on the server where I've installed Tableau Server.
I want to do some transformations, then save the result as a CSV in another folder on the same server.
How can I achieve this, given that when I choose the output location in Tableau Prep Builder I can only access my own folders, not the server's folders? Should I install Tableau Prep Builder on my server machine?
Thanks!
Hi Azel. This isn't possible; it would be a huge security concern if you could publish to the server's file structure. You'd instead need to publish it as a published data source to achieve this. If you have Prep Conductor on your server, you could then automate the running of the flow as well, but you'd likely need a shared filesystem that Tableau could access. Long story short, it's much better and easier to use a database.
@@TableauTim Thank you, Tim. My goal was to clean the CSV files using Prep so I can send them to an FTP server. I think I'll go with a database and then use a script to export CSVs from the database to the FTP :)
Do you think I can use a Python script in Prep to export CSV files to an FTP?
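For reference, the FTP upload part is straightforward with Python's standard library. A minimal sketch, assuming the CSV has already been exported locally (the host, credentials, and paths below are placeholders):

```python
# Minimal sketch: push a locally exported CSV to an FTP server using only
# the standard library. Host, credentials, and paths are placeholders.
from ftplib import FTP

LOCAL_CSV = "cleaned_output.csv"    # file exported from the database
REMOTE_NAME = "cleaned_output.csv"  # name to store on the FTP server

with FTP("ftp.example.com") as ftp:
    ftp.login(user="username", passwd="password")
    ftp.cwd("/incoming")  # target directory on the server
    with open(LOCAL_CSV, "rb") as f:
        ftp.storbinary(f"STOR {REMOTE_NAME}", f)  # binary upload
```

A standalone script like this, run on a schedule, is often simpler than doing the upload as a side effect inside a Prep flow.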
Hi Tim, awesome upload as usual.
I have a question: can we use something like fuzzy matching in Prep to fix values with spelling mistakes?
I tried some of the techniques in the cleaning steps, like grouping by "pronunciation" and "spelling", but was hoping there are more efficient ways.
There's nothing native in the tool beyond what you've tried; however, you can use the R or Python capabilities inside the script tool to essentially bring this feature in, with an R or Python script that does the fuzzy matching. The idea would be that the Prep flow feeds in a list of dimension values, and the script tool gives you back the same list with a new column showing the cleaned or fixed variant, so you can tie it back to the data.
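As a rough illustration of that idea (not Tableau's own implementation; the canonical list and column names are made up, and the fuzzy matching here just uses the standard library's difflib), a Prep/TabPy-style script could look like this:

```python
# Sketch of a Prep (TabPy) script-step function: Prep passes the step's rows
# in as a pandas DataFrame and expects a DataFrame back. CANONICAL and the
# "City" column are hypothetical; swap in your own reference list and field.
import difflib
import pandas as pd

CANONICAL = ["London", "Manchester", "Birmingham", "Edinburgh"]

def fix_spelling(df: pd.DataFrame) -> pd.DataFrame:
    def best_match(value: str) -> str:
        # Return the closest canonical spelling, or the original value if
        # nothing is similar enough (cutoff is a tunable 0..1 threshold).
        matches = difflib.get_close_matches(str(value), CANONICAL, n=1, cutoff=0.8)
        return matches[0] if matches else value

    # New column with the cleaned variant, so it ties back to the raw data
    df["City_Cleaned"] = df["City"].apply(best_match)
    return df
```

Note that Prep's script step may also ask for a get_output_schema() function when the returned columns differ from the input; check the script-step docs for the exact helpers.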
@@TableauTim Thanks man, I will surely try this out!
What is the use of doing this?
Wouldn't the performance be low?
Why don't we clean the data as required in Snowflake itself?
For example, my Prep flow takes data from Snowflake and writes back to Snowflake.
I don't understand why we are doing this here.
Because the target audience for Prep is people who don't want to write SQL in a database :D and it's a pretty large market. Alteryx Designer has a very similar customer base.