I just built a data ingestion proof of concept to show my boss. Coming from an SSIS background, most of the steps in Data Factory seemed very awkward. This tutorial helped me complete the last step of moving CSV files to another container after loading them into a SQL Server database, thank you very much!
Very good and really handy for anyone in ETL who is new to ADF.
I just moved to Azure after a long Oracle journey. I find your channel very nice. Great job!!
I come from an Oracle background and am now looking for opportunities in the cloud. Can you guide me on how to become an Azure data engineer with SQL?
Thanks so much... every step is perfectly explained with its functionality... I was able to follow everything! Keep posting stuff like this!! Thanks
Hello! Thank you so much for this video. It helped me understand the ForEach loop. I wish you healthy days.
Thank you, good demo, it helped me understand what was going wrong in my ForEach loop.
Nice :) Interesting, and I could grasp the content easily. Thanks for the tutorials!!
Thank you so much, ma'am, for your explanation. Please post more ADF and Databricks videos as well.
Thanks for the detailed explanation
your videos are very useful. thank you so much.
Great!! Thanks! However, one thing I'd like to mention: you should have shown the output folder after successful execution (I know we get these 2 files, but still, for more clarity).
Lastly, please also show the output folder in the container, so it's helpful to see how the data gets copied there. Thank you! It was useful!
Yes, after submission, how do we know whether the data was copied into the output or not? Anyway, thanks, nice teaching.
Thank you for the good explanation. I have a scenario where I need to loop through subfolders within a folder to get to the file. Could you suggest how to achieve that?
Thanks
Great videos and detailed explanations. This channel is amazing.
@@AllAboutBI What if we have a list of table names in a file, extract them using Lookup, and then iterate using ForEach? Do you have any video on this scenario?
@@vru5696 No, but it's pretty straightforward. Did you run into any issue in the implementation?
Simple explanation.
This was brilliant!
Good lecture , Thank you ❤️
Here, childItems is an array type, so we have to call it with an index to get both files, like childItems[0] and childItems[1], right?
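Rather than indexing childItems[0] and childItems[1] by hand, the usual pattern is to pass the whole array to the ForEach activity and pick up one entry per iteration with item(). A minimal sketch of the ForEach settings JSON, assuming the Get Metadata activity is named 'Get Metadata1' (that name is an assumption; use whatever yours is called):

```json
{
  "name": "ForEach1",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('Get Metadata1').output.childItems",
      "type": "Expression"
    }
  }
}
```

Inside the loop, each file's name is then available as @item().name, so the pipeline works for any number of files, not just two.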
Nice explanation, but instead of using Get Metadata, can we use the Lookup activity? And what is the difference between the two: when should we use each, and when not?
Firstly, thanks for this awesome video. Do you really have to use Get Metadata in this case? Using just the Copy activity, I can put "*" in the filename box of the wildcard file path option and check the Recursive option. I am assuming it should be able to pick up all the files under this folder. I understand if the files in the folder need extra vetting, in which case Get Metadata would help.
Right. If we need to filter by file name or last modified date, Get Metadata would help.
And thanks for your feedback and comment 🙏
While selecting the source dataset for the Get Metadata activity, if we have an input folder with 4 more folders inside it to copy, should we select the path only up to the input folder in the source dataset, or how?
What if... career.txt is already in the Output folder? Would it be replaced or updated, or would the pipeline fail?
Could you please explain how you added the output folder in the storage account without uploading any files inside it?
Fantastic
Please help me: how do I load multiple MongoDB collections into ADLS Gen2 in Azure Data Factory?
Thank You !!
Hi ma'am, here we are copying multiple files from one place to another, so we used a ForEach loop. But I tried this implementation using only the Copy Data activity, as you explained in a previous video, and I was able to copy multiple files with just the Copy Data activity. Then why are we using ForEach, Get Metadata, and all? 😅
Hello ma'am, as per the above scenario, I tried copying multiple files from source to target, but no luck, even though the whole activity succeeded. Please advise!
It would be good if you applied a noise reducer.
That was when I had no experience recording 😊 My latest videos won't have much noise 👍
I have a query: if we have a different number of columns in each CSV file, how can we do ForEach? I'm asking because while doing a Copy activity for different CSV files from one ADLS to another, it throws an error. If anybody knows the answer, please let me know.
Excellent
Please give any idea on how to kill an infinite loop created inside the Until activity in Azure Data Factory.
Hi, we can copy multiple files from one folder to another using the Copy activity. Then what is the purpose of using Get Metadata & ForEach? Could you please explain?
It was a very good explanation. My only suggestion is that the audio has some noise; please try to reduce it.
Thanks so much 🙏 Yes, I have improved the audio to a great extent in my other videos.
How do I overwrite files? Is it one of those copy behaviours?
Hi ma'am, I am getting an error, "The length of execution output is over limit (around 4MB currently)", when I try using the Get Metadata activity to get the metadata of all the containers inside the source storage account. Is there any workaround for this issue?
The storage account is an ADLS Gen1, HNS-disabled one and has more than 80K containers.
I want to copy all the containers and the blobs inside them to an ADLS Gen2 (HNS-enabled) storage account.
Hi Ajay, I guess it's a limitation right now. Not sure of a workaround. Did you get a chance to go through the forums?
Hi Subha, I am facing an issue while uploading a file manually to the container, and I am using a free subscription.
Awesome one... Could you please make a video on fetching a particular file type from a folder? For example, I want to fetch and move only the .csv files from the source folder to the sink. Thanks.
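One hedged way to do this: put a Filter activity between the Get Metadata and ForEach activities and keep only the names ending in .csv. A sketch, where the activity names 'Get Metadata1' and 'FilterCsv' are placeholders:

```json
{
  "name": "FilterCsv",
  "type": "Filter",
  "typeProperties": {
    "items": {
      "value": "@activity('Get Metadata1').output.childItems",
      "type": "Expression"
    },
    "condition": {
      "value": "@endsWith(item().name, '.csv')",
      "type": "Expression"
    }
  }
}
```

The ForEach would then iterate over @activity('FilterCsv').output.value instead of the raw childItems array.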
Great tutorial. I want to copy files which are nested; how do I do that?
Which pipeline is needed to read text file content from storage and, based on a value from the text file, trigger another pipeline?
I have a question: how do I restrict the number of files passed to ForEach? Suppose Get Metadata gives 10 files; restrict the Copy activity to execute only 5 of them.
We might have to use a Filter activity with some expression that restricts the number of files.
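One possible sketch: since the Filter activity's condition is evaluated per item (it can't count), an alternative is to trim the array itself with the take() function when wiring up the ForEach items, e.g. (the activity name 'Get Metadata1' is an assumption):

```json
"items": {
  "value": "@take(activity('Get Metadata1').output.childItems, 5)",
  "type": "Expression"
}
```

That passes only the first 5 entries of childItems into the loop.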
Hi
Have you done anything to get data from SFTP to a data lake? If you have any such video, could you please share it with me?
Hi, I have a scenario: I have multiple Excel files (4 files) in blob storage and need to upload them into 4 different tables in SQL. I have created stored procedures in SQL for the 4 files. Can you help me with videos on how to upload them automatically on a regular basis?
I too would like a video on this scenario. @Shruthi, if you come across any video on this, please share :)
Thanks so much, madam.
I tried the ForEach loop activity; there is no error, but after Get Metadata it is not executed.
Hi
How do we copy folders recursively from one location to another? For example, one ADLS has 10 folders in the root which I want to copy to another ADLS.
Thanks
Can we use the same steps to load files present in nested folders from one ADLS to another?
In the Copy activity there is a Recursive option, which should allow you to load files from the subfolder levels as well.
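For reference, a hedged sketch of what that looks like in the Copy activity's source store settings for ADLS Gen2 (property names as in the AzureBlobFSReadSettings schema; the wildcard values are placeholders):

```json
"storeSettings": {
  "type": "AzureBlobFSReadSettings",
  "recursive": true,
  "wildcardFolderPath": "input/*",
  "wildcardFileName": "*"
}
```

With recursive set to true and a wildcard file name, the Copy activity walks the subfolders under the input path and copies every matching file.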
Actually, I have created an FTP server on my Windows 10 machine using IIS Manager, and while creating a linked service to connect with it I get the error "connection failed".
Can you please do one thing: instead of storing these files directly in blob storage,
create a local FTP server on your local machine, then use Data Factory to connect to the local FTP server and copy the data to blob storage?
How do we copy nested folders from one storage account to another?
Can we copy different types of files, like CSV, JSON, and Parquet, together with the same ForEach activity?
Just a doubt here: why do we add the source information in Get Metadata if we have to provide all that information again in the third step (the Copy activity)? Isn't that doubling the work?
@@AllAboutBI Yes, I mean we get all the details of the input file location from the Lookup.
So in the Copy activity we should just have to move the file by providing the sink location,
but we have to add the input details again in the activity. Why?
Sorry, maybe I missed a point in the video...
Can you please create a video on how to upload multiple Excel files' data into SQL Server using ForEach and Data Flow, and please also use data conversion? It doesn't seem as easy as it is in SSIS.
In SSIS also, we need to hard-code the sheet name. Here too, we need to mention the sheet name in the dataset.
I have posted a video on processing multiple sheets in ADF. Please check the playlist 🙏
How do I give 2 arrays of parameters to ForEach, which should run a script inside the ForEach based on these 2 parameters?
@@AllAboutBI I have to pass 2 parameters to my script, year and month, and every time it has to take a new year and month. How do I achieve this? I tried a ForEach loop with a Script activity inside it, but it takes only one parameter. Please let me know.
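One way this is commonly handled (a sketch; the parameter name 'periods' and the values are made up): give the pipeline an array-of-objects parameter, so each iteration carries both values at once:

```json
{
  "periods": {
    "type": "Array",
    "defaultValue": [
      { "year": 2023, "month": 1 },
      { "year": 2023, "month": 2 }
    ]
  }
}
```

Set the ForEach items to @pipeline().parameters.periods, and inside the loop reference @item().year and @item().month in the Script activity, so both values change together on every iteration.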
Do you know of a good synapse training online?
thanks
Ma'am, I want to copy from one source to two different sink locations using ForEach. Please help.
What if we have multiple folders with multiple files in each folder? Can anyone answer, please?
Two ForEach loops are required. There's an example in the playlist itself.
Delete multiple files in multiple folders.
Hello ma'am, I am getting this error for the pipeline: ErrorCode=UserErrorSourceBlobNotExist,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The required Blob is missing. Folder path: lab1-05/finance.txt/.,Source=Microsoft.DataTransfer.ClientLibrary,'. I have followed all the steps.
I have a file format like this:
FirstName|LastName|Age
Naima|Khatun|26
Prati|Vermani|27
Mehzan|Ali|23
I mentioned the column delimiter as pipe (|) and the row delimiter as default.
Ma'am, please suggest.
I was using Blob storage; now I have changed it to Data Lake Storage and it worked. I didn't understand how.
Then you must have created a Data Lake storage account instead of a Blob one while creating the storage account, Nasima.
How can we ignore just one file while copying using ForEach?
You can use a Filter activity to filter the files to be loaded.
Or, before copying, use an If Condition and skip it.
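A hedged sketch of the Filter option (the file name 'skip_me.csv' is a placeholder): the Filter activity's condition keeps every child item except the one to ignore:

```json
"condition": {
  "value": "@not(equals(item().name, 'skip_me.csv'))",
  "type": "Expression"
}
```

The ForEach then loops over the Filter activity's output.value, which no longer contains that file.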
Why can't we use Lookup for getting the files?
Lookup just gives the content of a file, not the names of the files under a folder.
My files are present directly in the container, not in a folder within the container. I am unable to copy them to another container. Please help.
@@AllAboutBI I need to transform each row in the CSV into a separate JSON, i.e., one row needs to become one JSON. Is that possible in Data Factory?
@@AllAboutBI Great. What's the option for that? Do I need to use Data Flow, or will Copy work?
Hi ma'am, I have a scenario: I have .txt files in blob storage and I want to load those files into SQL through SSIS.
Can you help me with this?
Check the playlist; I have a video related to it. Otherwise, check my SSIS playlist.
Very good demo, better than many online blogs and articles. Thanks 👍
Thanks, sis!