Append Variable activity in Azure Data Factory: Story of combining things together

(2018-Oct-29) There are only a few sentences on the official Microsoft web page that describe the newly introduced activity task (Append Variable) to add a value to an existing array variable defined in Azure Data Factory - Append Variable Activity in Azure Data Factory. Yet it significantly improves your ability to control the workflow of the data transformation activities in your Data Factory pipeline.



Suppose you have several folders in your Azure Blob storage container where daily product inventory data from different stores is saved. You need to transfer those files with daily product stock snapshots to a database.



A natural way to do this is to get a list of files from all sourcing folders and then load them all into your database.

Technically:
1) We can read metadata of our sourcing folders from the Blob storage
2) Then we can extract all the file names and save them in one queue object
3) And finally, use this file list queue to read and transfer data into a SQL database (a minimal sketch of this flow follows right after this list).
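Purely as an illustration (shown for a single store folder, with typeProperties omitted and all names except "Metadata Store 01" being placeholders), the three steps could be chained in the pipeline JSON roughly like this:

"activities": [
  { "name": "Metadata Store 01", "type": "GetMetadata" },
  {
    "name": "ForEach Store 01 Files",
    "type": "ForEach",
    "dependsOn": [ { "activity": "Metadata Store 01", "dependencyConditions": [ "Succeeded" ] } ]
  },
  {
    "name": "ForEach File Load",
    "type": "ForEach",
    "dependsOn": [ { "activity": "ForEach Store 01 Files", "dependencyConditions": [ "Succeeded" ] } ]
  }
]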

Let's recreate this use case in our Azure Data Factory pipeline.

1) To get metadata of our sourcing folders, we need to select "Child Items" for the output of our [Get Metadata] activity task:



This provides a list of sub-folders and files inside the given folder, along with the name and type of each child item.
Here is an output of this task using a list of files from my first store:
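Its shape is similar to the following (the file names here are purely hypothetical):

{
  "childItems": [
    { "name": "store01_inventory_20181025.csv", "type": "File" },
    { "name": "store01_inventory_20181026.csv", "type": "File" }
  ]
}

And, as a hedged sketch with a placeholder dataset name, the [Get Metadata] activity definition behind it could look like this in the pipeline JSON:

{
  "name": "Metadata Store 01",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": { "referenceName": "ds_blob_store01_folder", "type": "DatasetReference" },
    "fieldList": [ "childItems" ]
  }
}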


2) Then I need to extract the file names. To make this happen, I pass the output of the [Get Metadata] activity task to my [ForEach] container,
where the Items parameter is set to @activity('Metadata Store 01').output.childItems.



And then, within this loop container, I start using this new [Append Variable] activity task and specifically choose to extract only the names, not the types, from the previous task's output set. Please note that this task can only be used with 'Array' type variables in your pipeline.
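Assuming the activity and variable names used above, a minimal sketch of this [ForEach] container with the [Append Variable] activity inside it could look like this in the pipeline JSON (the activity names inside the container are placeholders):

{
  "name": "ForEach Store 01 Files",
  "type": "ForEach",
  "dependsOn": [ { "activity": "Metadata Store 01", "dependencyConditions": [ "Succeeded" ] } ],
  "typeProperties": {
    "items": { "value": "@activity('Metadata Store 01').output.childItems", "type": "Expression" },
    "activities": [
      {
        "name": "Append File Name",
        "type": "AppendVariable",
        "typeProperties": {
          "variableName": "var_file_list",
          "value": { "value": "@item().name", "type": "Expression" }
        }
      }
    ]
  }
}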



Then all I have to do is replicate this logic for the other sourcing folders and stream the list of file names into the very same variable.
In my case it's var_file_list. Don't forget to define all the necessary variables within your pipeline.
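In the pipeline JSON, the variables section would look something like this (var_file_list_check is the testing variable mentioned at the end of this post):

"variables": {
  "var_file_list": { "type": "Array" },
  "var_file_list_check": { "type": "Array" }
}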


3) Once all the file names are extracted into my array variable, I can use it as a queue for my data load task. My next [ForEach] loop container's Items parameter will be set to this value: @variables('var_file_list')


And the internal [Copy Data] activity task within this loop container will receive the file names by using individual item names from my variable loop set [@item()].
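A hedged sketch of this second [ForEach] container with the [Copy Data] activity inside it is shown below; the activity and dataset names and the file_name dataset parameter are hypothetical and assume the source dataset uses that parameter to build its file path:

{
  "name": "ForEach File Load",
  "type": "ForEach",
  "typeProperties": {
    "items": { "value": "@variables('var_file_list')", "type": "Expression" },
    "activities": [
      {
        "name": "Copy File To SQL",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "ds_blob_inventory_file",
            "type": "DatasetReference",
            "parameters": { "file_name": "@item()" }
          }
        ],
        "outputs": [ { "referenceName": "ds_sql_inventory_table", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}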


The list of files is appended from each sourcing folder, and then all the files are successfully loaded into my Azure SQL database. Just to check the final list of file names, I copied the content of my var_file_list variable into another testing variable, var_file_list_check, to validate its content.



Azure Data Factory allows more flexibility with this new [Append Variable] activity task, and I do recommend using it more and more in your data flow pipelines! :-)

Comments

  1. Are you dynamically writing the filename variable value to a SQL column using Copy Data? If yes, would you mind showing the mapping of your Copy Data in Data Factory?

  2. Yes, the filenames are dynamically looked up and passed via the @item() expression.
    The mapping in my case is consistent for all the files, so I keep it hard-coded to the list of 4 columns (store_id, date_id, product_id, stock_amt). Your use case may vary.

    Replies
    1. Hey, first of all, your article is really very helpful. Thanks a lot for this.
      My concern is: just like your case, I also have four files with the same structure, and my target table contains one additional column called 'file_name'. How would I populate this target table column from the variable value (item()) in each loop, for each file individually?

    2. If my item() has just the file names, or if this name is represented as a JSON sub-structure within my array variable, then I would add a dynamic statement of item().file_name as an additional element (column) in the Mapping of my Copy Data activity task.

  3. I am copying data from FTP -----> Blob -----> DB. I am using the blob to archive the historical files. I want to check if an FTP file already exists in the blob (using the file name). If yes, then I want to send a mail. I am wondering if I can do a comparison of file name lists between FTP and blob, the way you created the list of file names in the end.

    Replies
    1. Yes, you can try to run Get Metadata on both your FTP and Blob locations, save the output of both activities to separate Array type variables and then compare them.

    2. Please also check the Validation activity in Azure Data Factory; it could help with your use case as well: https://www.sqlservercentral.com/blogs/validation-activity-in-azure-data-factory-traffic-light-of-your-operational-workflow

  4. Can you please share the ARM template for the above pipeline? I am still not able to map item().file_name into my table. I must be missing something while connecting to the Blob storage dataset.

    Replies
    1. JSON code for all the ADF pipelines can be found here:
      https://github.com/NrgFly/Azure-DataFactory/tree/master/Samples/pipeline

  5. This is so awesome Rayis! Like you said above, "there are only a few sentences in the official Microsoft web page ....", and that is so true. I couldn't figure out exactly how to use this and the Set Variable until I read your blog! Thanks for taking the time to do this; it saved me a lot of time..... Mike

    Replies
    1. I'm very happy, Mike, that you've found this blog post helpful. For me, this concept of working with variables, and especially appending to array variables, seemed a bit confusing at first. But once I started probing and applying this knowledge in my real project work, it all became a lot simpler. I believe it's all about starting small and adding a few more complicated steps when testing new things. I'm sure you will be fine with all the ADF components!

