Append Variable activity in Azure Data Factory: Story of combining things together

(2018-Oct-29) The official Microsoft web page describing the newly introduced activity task for adding a value to an existing array variable in Azure Data Factory - Append Variable Activity in Azure Data Factory - contains only a few sentences. But this activity significantly improves your ability to control the workflow of data transformation activities in your Data Factory pipeline.



Suppose you have several folders in your Azure Blob storage container, where daily product inventory data from different stores is saved. You need to transfer those files with daily product stock snapshots to a database.



A natural way to do this is to get a list of files from all sourcing folders and then load them all into your database.

Technically:
1) We can read the metadata of our sourcing folders from the Blob storage
2) Then we can extract all the file names and save them in one queue object
3) And finally, use this file list queue to read and transfer data into a SQL database.

Let's recreate this use case in our Azure Data Factory pipeline.

1) To get metadata of our sourcing folders, we need to select "Child Items" for the output of our [Get Metadata] activity task:
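For reference, a minimal JSON sketch of such a [Get Metadata] activity could look like the one below; the dataset name "BlobStore01Folder" is only a placeholder for your own Blob storage folder dataset:

{
    "name": "Metadata Store 01",
    "type": "GetMetadata",
    "description": "Reads the list of child items (files and sub-folders) from the sourcing Blob folder",
    "typeProperties": {
        "dataset": {
            "referenceName": "BlobStore01Folder",
            "type": "DatasetReference"
        },
        "fieldList": [ "childItems" ]
    }
}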



This provides a list of sub-folders and files inside the given folder, with the name and type of each child item.
Here is an output of this task using a list of files from my first store:
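In JSON form, this output has the following shape (the file names below are only illustrative, not my actual store files):

{
    "childItems": [
        { "name": "store_01_inventory_20181026.csv", "type": "File" },
        { "name": "store_01_inventory_20181027.csv", "type": "File" }
    ]
}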


2) Then I need to extract the file names. To make it happen, I pass the output of the [Get Metadata] activity task to my [ForEach] container,
where the Items parameter is set to @activity('Metadata Store 01').output.childItems.



And then within this loop container, I actually start using this new [Append Variable] activity task and specifically choose to extract only names, and not types, from the previous task's output set. Please note that this task can only be used with variables of the 'Array' type in your pipeline.
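Put together, the loop over the first store's file list could be sketched in JSON like this (the activity names are placeholders, and only the relevant properties are shown; activity dependencies are omitted):

{
    "name": "ForEach Store 01 Files",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@activity('Metadata Store 01').output.childItems",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "Append File Name",
                "type": "AppendVariable",
                "description": "Adds only the file name, not the type, to the array variable",
                "typeProperties": {
                    "variableName": "var_file_list",
                    "value": "@item().name"
                }
            }
        ]
    }
}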



Then all I have to do is replicate this logic for the other sourcing folders and stream the list of file names into the very same variable.
In my case it's var_file_list. Don't forget to define all the necessary variables within your pipeline.
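In pipeline JSON terms, the variable definition itself is very small (the pipeline name here is a placeholder; var_file_list_check is the extra testing variable used further below):

{
    "name": "pl_store_inventory_load",
    "properties": {
        "variables": {
            "var_file_list": { "type": "Array" },
            "var_file_list_check": { "type": "Array" }
        }
    }
}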


3) Once all the file names are extracted into my array variable, I can use it as a queue for my data load task. My next [ForEach] loop container's Items parameter will be set to this value: @variables('var_file_list')


And the [Copy Data] activity task within this loop container will receive the file names by using individual item names from my variable loop set [@item()].
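A trimmed-down sketch of that load loop could look like the JSON below; the dataset names and the fileName parameter are placeholders and will depend on how your own Blob and SQL datasets are parameterized:

{
    "name": "ForEach File Load",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@variables('var_file_list')",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "Copy File To SQL",
                "type": "Copy",
                "description": "Reads one daily snapshot file from Blob storage and writes it to the Azure SQL table",
                "inputs": [
                    {
                        "referenceName": "BlobInventoryFile",
                        "type": "DatasetReference",
                        "parameters": { "fileName": "@item()" }
                    }
                ],
                "outputs": [
                    { "referenceName": "AzureSqlInventoryTable", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "SqlSink" }
                }
            }
        ]
    }
}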


The list of files is appended from each sourcing folder, and then all the files are successfully loaded into my Azure SQL database. Just to check the final list of file names, I copied the content of my var_file_list variable into another testing variable, var_file_list_check, to validate its content.
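One way to make that copy is a [Set Variable] activity, sketched below (the activity name is a placeholder); note that the Append Variable activity cannot append one Array variable to another, as also discussed in the comments:

{
    "name": "Copy List For Validation",
    "type": "SetVariable",
    "description": "Copies the collected file list into a second Array variable for a quick content check",
    "typeProperties": {
        "variableName": "var_file_list_check",
        "value": "@variables('var_file_list')"
    }
}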



Azure Data Factory allows more flexibility with this new [Append Variable] activity task, and I do recommend using it more and more in your data flow pipelines! :-)

Comments

  1. Are you dynamically writing the filename variable value to a SQL column using Copy Data? If yes, would you mind showing the mapping of your Copy Data activity in Data Factory?

  2. Yes, the filenames are dynamically looked up and passed via the @item() expression.
    The mapping in my case is consistent for all the files, so I keep it hard-coded to the list of 4 columns (store_id, date_id, product_id, stock_amt). Your use case may vary.

    Replies
    1. Hey, first of all, your article is really very helpful. Thanks a lot for this.
      My concern is: just like your case, I also have four files with the same structure, and my target table contains one additional column called 'file_name'. How would I populate this target table column from the variable value (item()) in each loop, for each file individually?

    2. If my item() holds just the file names, or the name is represented as a JSON sub-structure within my array variable, then I would add a dynamic expression such as item().file_name as an additional element (column) in the Mapping of my Copy Data activity task.

    3. This was not very clear! By default the Copy activity does not provide the file name, and processing a file-wise Copy activity within a ForEach loop will not help!

    4. Please check the "Copy multiple files containers between File Stores" pipeline template in ADF. It explains support for both folder and file names. The Copy Data activity also supports recursive folder content copying with no need to use a ForEach loop container.

  3. I am copying data from FTP -> Blob -> DB. I am using the blob to archive the historical files. I want to check whether an FTP file already exists in the blob (using the file name). If yes, then I want to send a mail. I am thinking I can do a comparison of the lists of file names between FTP and blob the way you created the list of file names at the end.

    Replies
    1. Yes, you can try to Get Metadata on both of your FTP and Blob locations, save the output of both activities to separate Array type variables, and then compare them.
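      As a rough sketch (the variable and activity names here are only placeholders), an If Condition activity could then test whether the two name lists overlap and run the e-mail branch when they do:

      {
          "name": "If File Already Archived",
          "type": "IfCondition",
          "typeProperties": {
              "expression": {
                  "value": "@greater(length(intersection(variables('var_ftp_files'), variables('var_blob_files'))), 0)",
                  "type": "Expression"
              },
              "ifTrueActivities": [
                  {
                      "name": "Send Duplicate File Mail",
                      "type": "WebActivity",
                      "description": "Placeholder notification step, e.g. calling a Logic App that sends the e-mail",
                      "typeProperties": {
                          "url": "https://<your-logic-app-endpoint>",
                          "method": "POST",
                          "body": "A file from FTP already exists in the Blob archive"
                      }
                  }
              ]
          }
      }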

    2. Please also check the Validation activity in Azure Data Factory, it could help your use case as well: https://www.sqlservercentral.com/blogs/validation-activity-in-azure-data-factory-traffic-light-of-your-operational-workflow

  4. Can you please share the ARM template for the above pipeline? I am still not able to map item().file_name into my table. I must be missing something while connecting to the Blob storage dataset.

    Replies
    1. JSON code to all the ADF pipelines can be found here:
      https://github.com/NrgFly/Azure-DataFactory/tree/master/Samples/pipeline

  5. This is so awesome Rayis! Like you said above, "there are only a few sentences in the official Microsoft web page ...." and that is so true. I couldn't figure out exactly how to use this and the Set Variable until I read your blog! Thanks for taking the time to do this; it saved me a lot of time..... Mike

    Replies
    1. I'm very happy, Mike, that you've found this blog post helpful. For me, this concept of working with variables, and especially appending to array variables, seemed a bit confusing at first. But once I started probing and applying this knowledge in my real project work, it all became a lot simpler. I believe it's all about starting small and adding a few more complicated steps while testing new things. I'm sure you will be fine with all the ADF components!

  6. Do we have any scenario where we can save the file name in the Copy activity? Once the file is read, we cannot trace the data back to the file. This is very strange. I have a workaround running which gets the file names using the Get Metadata activity and then passes them into a ForEach loop; the Copy activity copies the data, and I have to execute a separate DB statement to update the file name. The problem with this approach is that it is sequential and also slow, since we are executing multiple statements for a simple job.

  7. Nice solution, but there will be a lot of Append Variable activity runs in the monitor.

    The Lookup activity can also do it now.

    Select the query option for the Lookup activity, and input the query below.

    select JSON_VALUE(value,'$.name') from openjson('@{string(activity('Get Metadata1').output.childItems)}') A

    Replies
    1. If that works, this will simplify my solution that I had built 10 months ago. It's great!

    2. OPENJSON is not supported by default; you will have to change the database compatibility level for that. If your organization will not allow you to do so, then it will not work. I also didn't get from the post how we push only the file name into the Append Variable activity.

  8. Great solution, it helped me a lot. However, I am trying to fetch the input details of a Lookup activity. E.g. I need to extract the stored procedure name used in a Lookup activity and populate that value into a variable. Is it possible to fetch that information?
    I tried using @activity('Lookup1').Input.childitems, which failed the task.
    I would appreciate it if you could provide a solution.

  9. I have multiple folders, each containing a text file with the same filename.
    I am using a ForEach loop, and the data is getting loaded from the text file to Azure SQL.
    I want to pass the filename as a column in Azure SQL along with the rest of the columns.
    How do I achieve this?

    Replies
    1. It depends on how you transfer your data from a text file to Azure SQL DB. But you can always define the output dataset and add new columns if needed.

  10. Thanks for this informative blog.
    I am trying to build a pipeline to move files from a blob container to ADLS. The requirement is that the blob container holds a list of files from Jan 2017 till Sept 2019 (e.g. A_20170101, B_20171020, C_20181208).
    I want to read these files and put them into different target ADLS folders dynamically, in this format: ADLS/YYYY/MM/Filename
    E.g.:
    ADLS/2017/01/A_20170101
    ADLS/2017/10/B_20171020
    ADLS/2018/12/C_20181208

    So far I have used Get Metadata to get the source list, and for the destination folder I have used dynamic content to create the folder structure. How can I achieve splitting the files by checking the source file names?

    Replies
    1. If you can parse your input file names, then using Folder and Filename parameters on your ADLS linked service should give you the flexibility to define your desired folder/filename structure.

  11. Wonderful, Rayis - is there a way to join the Lookup output with the Get Metadata output before passing it to a ForEach?
    I would like to process the files from Gen2 -> SQL DB based on modified time.
    1) I am using the Lookup to get the destination table name and source directory name
    2) I am using the Get Metadata to get the file names from the passed source directory based on modified time.
    I need to combine the information from steps 1 and 2 and pass it to a ForEach to process the list of files added to a source directory into the destination SQL table.

    Thanks

    Replies
    1. Please check my blog post on working with Arrays in Azure Data Factory, where I did join the output of two tasks into one array. Let me know if it works for your case.
      http://datanrg.blogspot.com/2019/06/working-with-arrays-in-azure-data.html

    2. Hi, I have multiple JSON files in one Blob Storage folder, and in a "Copy Data" activity I am trying to sink and map the JSON file names into a SQL Database table. In a ForEach activity I am able to get the correct files and data, but when it comes to actually getting the filename and sinking/mapping/inserting it into the SQL table, I get blank values. I have tried using "@activity('Get MetaData1').output.childitems" from the Get Metadata activity, and I have also tried to append a variable using "@item().name", but I don't know how to map them correctly in order for the sink and mapping to work. Do you have any ideas?
      Thanks

    3. Use @activity('Get MetaData1').output.childItems for the Items setting of your ForEach loop container. And then, for the tasks within this ForEach loop container, you can reference the items from your Get Metadata activity as @item().name. Also, if this doesn't work, try to open one of the ADF templates that Microsoft included and see this very example as well.

  12. Hi Rayis,

    Very informative, it helped me to get started. But I have a question: how will it work if I want to update both the filename as well as the type in the DB? Thanks.

    Replies
    1. If by type you mean the file extension, then this could be done by setting the file name via two variables: the actual name of your file and then its extension.

  13. I was wondering if there is a way to capture the metadata (structure) and load it into a DB table, i.e. the physical name should go in as the column name and the physical type should go in as the column data type in the DB table.

    Replies
    1. Hello, can you provide an example of what you're trying to achieve? I'm a bit confused by the wording of your question.

  14. Any ideas on how to union many files into a single file based on a substring match? For example, iterate through 100 tables and union the ones with a matching substring? Customer001.CSV would be unioned with Customer002.CSV, and Products001.CSV would be unioned with Products002.CSV, etc.?

    Replies
    1. This is an interesting case, but I would rather find a way to get access to the backend that produced those files and generated those extracts from the database, rather than stitching multiple files together.

  15. Hi Rayis, this is Mallik. When I tried to copy the content of my var_file_list variable into another testing variable, var_file_list_check, I got an error: "type 'Array' cannot be appended to the variable 'var_file_list_check' of type 'Array'. The action type 'Append To Array Variable' only supports values of types 'Float, Integer, String, Boolean, Object'". Note: I have declared var_file_list_check as an Array, and the value as "@variables('var_file_list')".

    Replies
    1. Hi Mallik, if you only intend to copy the content of the var_file_list variable to var_file_list_check, then you can use Set Variable activity to set the var_file_list_check variable with the value of the var_file_list variable.
