Setting Variables in Azure Data Factory Pipelines

(2018-Oct-15) Working with Azure Data Factory, you always tend to compare its functionality with the well-established ETL packages in SSIS. Data Flow tasks have been recreated as Copy Data activities; logical components have found their cloud-based siblings; and new kids on the block, such as the Databricks and Machine Learning activities, could boost the adoption rate of Azure Data Factory (ADF) pipelines.

Support for local variables wasn't always available in ADF; it was only recently introduced alongside the existing pipeline parameters. This addition makes it more flexible to create interim properties (variables) that you can adjust multiple times within the workflow of your pipeline.



Here is my case study to test this functionality.

I have a simple SQL Database with 2 tables that hold daily and monthly sales data, which I plan to load from a sample set of CSV data files in my Blob storage in Azure.



My new ADF pipeline has an event trigger that passes the file path and file name values from newly created objects in my Blob storage container:
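To illustrate, a Blob event trigger can hand those values to pipeline parameters. This is a minimal sketch of such a trigger definition; the pipeline name, parameter names (SourceFolder, SourceFile), and container path are my assumptions, not taken from the original screenshots:

```json
{
  "name": "BlobEventTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/sales/blobs/",
      "events": [ "Microsoft.Storage.BlobCreated" ]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "LoadSalesData",
          "type": "PipelineReference"
        },
        "parameters": {
          "SourceFolder": "@triggerBody().folderPath",
          "SourceFile": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

The @triggerBody().folderPath and @triggerBody().fileName system outputs are what the Blob event trigger exposes for newly created blobs.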


The logic then is to check the data feed type (Daily or Monthly) based on the file name and load the data into the corresponding table in the SQL Database in Azure.


And here is where the [Set Variable] activity comes in as a very handy tool to store a value based on a defined expression for my variable:
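A [Set Variable] activity for this could look like the sketch below. The exact expression is my assumption (the original screenshots aren't reproduced here); it presumes a SourceFile pipeline parameter and that daily files carry "daily" in their names:

```json
{
  "name": "Set FeedType",
  "type": "SetVariable",
  "typeProperties": {
    "variableName": "FeedType",
    "value": "@if(contains(toLower(pipeline().parameters.SourceFile), 'daily'), 'Daily', 'Monthly')"
  }
}
```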





Then I define two sub-tasks to copy data from those flat files into the corresponding tables based on the value of the FeedType variable:
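One way to express this branching is an If Condition activity testing the FeedType variable, with a Copy activity in each branch. This is a sketch only; the activity names are illustrative and the Copy activities' source/sink dataset details are omitted:

```json
{
  "name": "Check FeedType",
  "type": "IfCondition",
  "typeProperties": {
    "expression": {
      "value": "@equals(variables('FeedType'), 'Daily')",
      "type": "Expression"
    },
    "ifTrueActivities": [
      { "name": "Copy DailySales", "type": "Copy" }
    ],
    "ifFalseActivities": [
      { "name": "Copy MonthlySales", "type": "Copy" }
    ]
  }
}
```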



And once this is all done, I place two different data files into my Blob storage container, and the ADF pipeline trigger successfully executes the same pipeline twice to load data into two separate tables:






Which I can further check and validate in my Azure SQL Database:


My [Set Variable] activity has been tested successfully! 
And it's one more point toward using ADF pipelines more often.

Comments

  1. What condition have you written in the true activity based on the variable value?
    I mean, how would we access the variable value if we have an array-type variable?

    1. Yes, you can assign your array-type variable to a ForEach container and then iterate through each of the array values using item() as a reference point for individual values. I've blogged about this as well: https://datanrg.blogspot.com/2018/10/append-variable-activity-in-azure-data.html

    2. I am already inside a ForEach container, and I am setting this variable to the list of error files created during the run of the Copy Data activity, which iterates over a list of objects. I have assigned the values to my array variable named FileList as below:

      @activity('CheckforSFerrorfile').output.childitems


      where CheckforSFerrorfile is a GetMetadata activity.

      Now in the next hop I want to check the first value of this array for the type of item returned by the GetMetadata activity, i.e. either 'File' or 'Folder', and based on that value I want to perform an operation in an If Condition activity.

      My expression for the If Condition activity is as below:

      @contains(variables('FileList'),'Folder')

      Please help me

  2. Hi Gaurav, getting the first element from your array variable is easy. You don't need to process it through a loop container.

    This expression will give you the first element of your array: variables('FileList')[0].

    I've created an example and posted it in my personal GitHub: https://github.com/NrgFly/Azure-DataFactory/blob/master/Samples/pipeline/bdata_adf_variable_array_first_element.json

    First, I manually create an array variable: FileList := @createArray('File', 'Folder')
    And then I set another variable to the first element of the array: FirstArrayElement := @variables('FileList')[0]

    And then you can do whatever you want with the FirstArrayElement variable.
    I'm thinking of writing a quick blog post about it; thanks for your idea :-)
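    A minimal sketch of those two Set Variable activities chained together (variable names as described above; FileList is an Array variable, FirstArrayElement a String; no datasets needed, so it drops into any pipeline):

```json
{
  "activities": [
    {
      "name": "Set FileList",
      "type": "SetVariable",
      "typeProperties": {
        "variableName": "FileList",
        "value": "@createArray('File', 'Folder')"
      }
    },
    {
      "name": "Set FirstArrayElement",
      "type": "SetVariable",
      "dependsOn": [
        { "activity": "Set FileList", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "variableName": "FirstArrayElement",
        "value": "@variables('FileList')[0]"
      }
    }
  ]
}
```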

    1. Blog post created:
      http://datanrg.blogspot.com/2019/04/azure-data-factory-extracting-array.html

  3. Hi Rayis,

    We have a specific use case somewhat similar to your above two blog posts, in which we have a delimited file containing two columns, ID and Data; for example, ID = 1 and Data = A.

    Our requirement is to generate text files out of it using Data Factory. The name of each output file should be the same as the ID column value (we may achieve this using variables), and the corresponding Data value should be written inside the file. We don't need any column headers included in the files, just the data text. And we need to upload these files to Azure Blob storage.

    We have been stuck on this requirement for the past 2 days. It would be a great help if you could suggest a pipeline flow for the above requirement.

    1. 1) A Lookup activity to read your file (ID, Data)
      2) Output the IDs into an array variable
      3) A ForEach container using the array variable
      4) Within this container, use a Copy Data or any other file-creation task to create your files (create Folder/Filename parameters for your sink dataset) and then pass your ID value into those sink dataset parameters.

      Logic may vary, but at least you can get an idea how to process your files.
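      The flow described in the steps above could be sketched as below. This is only an outline: the activity names are illustrative, the Lookup and Copy source/sink dataset details are omitted, and the sink dataset would expose a FileName parameter fed from @item().ID inside the loop:

```json
{
  "activities": [
    {
      "name": "Read Ids",
      "type": "Lookup",
      "typeProperties": { "firstRowOnly": false }
    },
    {
      "name": "ForEach Id",
      "type": "ForEach",
      "dependsOn": [
        { "activity": "Read Ids", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "items": {
          "value": "@activity('Read Ids').output.value",
          "type": "Expression"
        },
        "activities": [
          { "name": "Write One File", "type": "Copy" }
        ]
      }
    }
  ]
}
```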

  4. Hi Rayis,

    Thanks for your great help !!

    I achieved this as per your suggested flow. But one quick question: from my research, the Lookup activity in ADF can only process 5000 rows, up to 2 MB in size.
    Here we have a requirement of processing 1 million rows; we generally process even more records in BI projects through SSIS.

    Just want to confirm: is there any functionality through which we can implement the same in ADF?

    1. Yes, I agree, the Lookup activity in the Control Flow of ADF is for small datasets only; it has its own memory limitation.
      For larger datasets I would suggest using the Lookup/Exists components of Mapping Data Flows in ADF, where along with a parameterized sink you can name your destination files dynamically.
      However, I haven't tested this route yet, so you will be the first one! Tell me how it goes in your case.

