06. Frequently used Nodes
Before we dive into more sophisticated scenarios, we'll take some time to get familiar with some of the Nodes that are used in most Workflows. These are Nodes that sit directly before or after the Connector Nodes because they prepare data or evaluate the result of an action.
While there isn't a graded exercise for this section, it's recommended that you follow along to gain familiarity with them.
We also recommend that you review Nodes you should know so you have a general awareness of the full set of frequently used Nodes.
If Node
Let's begin by recapping the If Node, which is the most frequently used flow control Node.
Follow these steps in a new Workflow.
- Add If and Variable Bar.
- Connect Start.RunNow → If.
- Add Custom Property Variable Bar.animal as an output, then set Variable Bar.animal to cat.
- Connect Variable Bar.animal → If.Value.
- Set If.Expression to Value = "cat".
- Run the Workflow and you'll see that the Workflow Log for the If Node shows it fired → True.
- Set Variable Bar.animal to dog.
- Run the Workflow again to see that the If Node fired → False.
Review the If help article for more information about this Node.
Choose Node
In some cases you may want to branch flow control in more than two ways - the Choose Node is a good way to do this.
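If it helps to relate this to conventional code, the Choose Node is analogous to a multi-way conditional. The Python sketch below is an analogy only (the output strings are hypothetical stand-ins), not how Flowgear evaluates the Node:

```python
# Analogy only: multi-way branching on a single expression, similar to the
# Choose Node routing execution to a custom Execution Output per match.
def route(animal):
    if animal == "cat":
        return "cat output"   # hypothetical stand-ins for Execution Outputs
    elif animal == "dog":
        return "dog output"
    else:
        return "no match"

print(route("dog"))  # dog output
```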
Follow these steps in a new Workflow.
- Replace the If Node with a Choose Node.
- Connect Variable Bar.animal → Choose.Expression.
- Click the + button on the Choose Node to add an Execution Output to define an entry for cat and another for dog. Unlike other Nodes, the Choose Node supports custom Execution Outputs. These allow you to conditionally affect the flow control of the Workflow.
- Run the Workflow and you'll see the Workflow Log for the Choose Node fired → dog.
Review the Choose help article for more information about this Node.
Error Node
The Error Node, which we looked at earlier, allows you to throw a custom error. You can use it to simplify technical errors, or to raise errors based on data validation failures or other internal conditions.
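As a rough analogy outside Flowgear, this is similar to catching a low-level exception and re-raising it with a clearer message. The Python sketch below is illustrative only; the function and messages are hypothetical.

```python
# Analogy only: translate a technical failure into a simpler, clearer error.
def get_customer(code):
    raise ConnectionError("socket timeout after 30000ms")  # hypothetical low-level failure

try:
    get_customer("ABC001")
except ConnectionError:
    raise RuntimeError("The customer system is unavailable - please retry later")
```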
Review the Error help article for more information about this Node.
For Each Node
The For Each Node enables iteration over a set of records in a document. This is useful for cases where subsequent steps in a Workflow need to work with individual records or smaller numbers of records.
Follow these steps in a new Workflow.
- Add For Each.
- Set For Each.SourceDocument to the value below:
  { "order": [ { "orderId": 123, "customer": "abc" }, { "orderId": 456, "customer": "def" }, { "orderId": 789, "customer": "ghi" } ] }
- Focus For Each.Path (i.e. click the Property value) and a picker will display, allowing you to select the element in the document you want to use to split items out. Click orderId (just under order).
- On the Node Header, click v → Run from this Node.
- In the Workflow Logs, you'll see the For Each Node fire the → Item Output three times, followed by the → Finished Output. Under the Item Property, you'll see the individual order id values emitted, one per row.
- Focus the For Each.Path Property value again and this time select the order Property. This time when you run the Workflow, you'll see that the entire order element is emitted into the Item Property.

By adjusting the element that is selected in Path, the For Each Node can emit either entire records or values for specific fields within a document.

In some cases, you may have a large number of records that need to be split into smaller chunks rather than individual records. The ChunkSize Property allows you to control this behavior.

- Set For Each.ChunkSize to 2 and set For Each.Encapsulation to ParentNode. Encapsulation controls how we wrap the matched elements. Since the document can only have one top-level element, we need to set Encapsulation when ChunkSize is more than one.
- When you run the Workflow now, you'll see there are only two Workflow Logs because the first two order records are merged into the same iteration.
This technique can be used for cases where a target system has a limit to the number of records it can process at once.
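To make the Path and ChunkSize behaviour concrete, here is a minimal Python sketch of the equivalent logic. It is an illustration only, not Flowgear's implementation; the document mirrors the example above.

```python
# Illustration only: what For Each does conceptually with the example document.
source = {
    "order": [
        {"orderId": 123, "customer": "abc"},
        {"orderId": 456, "customer": "def"},
        {"orderId": 789, "customer": "ghi"},
    ]
}

# Path pointing at orderId: three iterations, one field value each.
for record in source["order"]:
    print(record["orderId"])

# Path pointing at order with ChunkSize = 2: records emitted in groups of two,
# re-wrapped under a parent element (what the Encapsulation setting controls).
records, chunk_size = source["order"], 2
for i in range(0, len(records), chunk_size):
    print({"order": records[i:i + chunk_size]})
```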
Review the For Each help article for more information about this Node.
Loop Node
The Loop Node provides iteration over a sequence of integers.
Follow these steps in a new Workflow.
- Add Loop, If and Loop Exit.
- Set Loop.Group to loop1, Loop.Start to 1, Loop.Stop to 5 and Loop.Increment to 1.
- Connect Start.RunNow → Loop.
- When you run the Workflow, you'll see five Workflow Logs that fired → Loop along with a final Log for → Finished.

You can exit a loop before it completes by using the Loop Exit Node.

- Connect Loop.Current → If.Value.
- Set If.Expression to Value = 3.
- Set Loop Exit.Group to loop1.
- Connect Loop.Loop → If and If.True → Loop Exit.
- When you run the Workflow, the conditional statement should cause the Loop Exit Node to invoke on the third iteration and the Loop Node should fire → Finished instead of starting the next iteration.
Review the Loop help article for more information about this Node.
Formatter Node
The Formatter Node allows string (text) data to be prepared by translating Property values into a templated string.
Follow these steps in a new Workflow.
- Add Formatter.
- Set Formatter.Expression to Hello, {object}.
- Add Formatter.object and set its value to world.
- Run the Workflow to see the text Hello, world emitted from the Result Property.
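Conceptually, the Expression works like a string template whose placeholders are filled from the Node's custom Properties. The Python sketch below is an analogy only, not Flowgear's implementation:

```python
# Analogy only: placeholders in the template are replaced by Property values.
expression = "Hello, {object}"
properties = {"object": "world"}

result = expression.format(**properties)
print(result)  # Hello, world
```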
Review the Formatter help article for more information about this Node.
Escaping and Injection Attacks
The Formatter Node supports some common types of escaping (see the Escaping Property). However, care should be exercised even when escaping untrusted input data.
Consider an example where you are querying a SQL database for a company based on its name. An example query template is shown below:
SELECT * FROM company WHERE name like '{name}'
If the Formatter Node is used to replace the name Property with a filter value a user has provided, the user could provide a value like %'; DROP TABLE company; -- causing the full SQL statement to resolve to:
SELECT * FROM company WHERE name like '%'; DROP TABLE company; --'
This is known as an injection attack. In the example above, we considered SQL but this type of attack can be applied to almost any type of service.
Take these steps to ensure your solutions are not vulnerable to this type of attack:
- Be aware of cases where user-provided data is being processed and, where possible, filter or sanitize it before using it to query data sources.
- Wherever possible, allow the appropriate Connector to handle the concern of translating parameters into a query. For example, our SQL Query Connectors handle this without using string translation and are therefore not vulnerable to this problem (a minimal sketch of this approach follows this list).
- If there is no other option, use the Formatter but ensure you are using the appropriate Escaping option and test some adversarial cases (i.e. data that would be problematic if it wasn't escaped correctly).
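To make the difference concrete, the sketch below contrasts string substitution with a parameterized query using Python and SQLite. It illustrates the general principle only - it is not how Flowgear's SQL Query Connectors are implemented - and the table mirrors the example above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE company (name TEXT)")
conn.execute("INSERT INTO company VALUES ('Acme')")

user_input = "%'; DROP TABLE company; --"

# Unsafe: substituting the value into the statement makes it part of the SQL itself.
unsafe_sql = "SELECT * FROM company WHERE name LIKE '{name}'".format(name=user_input)
print(unsafe_sql)  # resolves to the injected statement shown above

# Safe: pass the value as a parameter so the driver treats it purely as data.
rows = conn.execute("SELECT * FROM company WHERE name LIKE ?", (user_input,)).fetchall()
print(rows)  # [] - the input is matched literally and never executed
```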
See Avoiding SQL Injection Attacks for more information.
JSON Convert Node
You may encounter cases where you need to provide or accept data externally in a certain format. In other cases, certain Nodes may only be able to accept or emit data in a certain format. For these scenarios, JSON Convert supports conversion of data between XML and JSON.
Follow these steps in a new Workflow.
- Add JSON Convert.
- Set JSON Convert.Json to the value below:
  { "order": [ { "orderId": 123, "customer": "abc" } ] }
- Run the Workflow to see the XML representation of the above JSON in the JSON Convert Workflow Log.
- Copy the XML from the Workflow Log entry and paste it into JSON Convert.Xml.
- Set JSON Convert.Action to XmlToJson.
- Run the Workflow again. Note that the "order" object is no longer considered an array (i.e. the order isn't wrapped in square brackets). This is because in XML, when there is only one item in a parent element, it's not possible to determine whether the parent should be treated as an array container. This issue and a way to work around it are discussed in the JSON Convert help article.
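The same ambiguity exists in any XML-to-JSON conversion. As an illustration (assuming the third-party xmltodict package, which is not part of Flowgear), a lone child element comes back as an object unless you explicitly force it to be treated as a list:

```python
import xmltodict  # assumption: pip install xmltodict

xml = "<root><order><orderId>123</orderId><customer>abc</customer></order></root>"

# A single <order> element converts to an object, not a one-item array.
print(xmltodict.parse(xml)["root"]["order"])
# {'orderId': '123', 'customer': 'abc'}

# Forcing the element to always be a list restores the array shape.
print(xmltodict.parse(xml, force_list=("order",))["root"]["order"])
# [{'orderId': '123', 'customer': 'abc'}]
```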
Review the JSON Convert help article for more information about this Node.
Reduce Node
An important concept in app integration is the ability to exclude data that has not changed since it was last processed.
Often, it's not possible to precisely query a data source for the required delta data because the filters you need aren't supported in the third-party API. For example, if you sync customers daily but the source doesn't allow you to filter for customers changed after a certain date, you'll want to discard unchanged records early in the Workflow so that you aren't wasting resources processing them.
One way of doing this is by using the Reduce Node which maintains a hash of records that it has previously encountered and is then able to remove unchanged records before continuing with the next stage of the Workflow.
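Conceptually, the Node behaves like the sketch below: keep a hash per record key and drop any record whose hash matches what was previously stored. This is an illustration of the idea only, not Flowgear's actual storage or hashing scheme.

```python
import hashlib
import json

seen = {}  # key -> hash of the record as it was last committed

def hash_of(record):
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def reduce_commit(records, key_field):
    """ReduceCommit behaviour: return new/changed records and mark them as seen immediately."""
    changed = [r for r in records if seen.get(r[key_field]) != hash_of(r)]
    for r in changed:
        seen[r[key_field]] = hash_of(r)
    return changed
```

Calling reduce_commit twice with the same data returns every record the first time and nothing the second, which is what the steps below demonstrate.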
Follow these steps in a new Workflow.
- Add Reduce 2.
- Set Reduce 2.Group to contacts and set Reduce 2.SourceDocument to the value below:
  { "order": [ { "orderId": 123, "customer": "abc" }, { "orderId": 456, "customer": "def" }, { "orderId": 789, "customer": "ghi" } ] }
- Focus Reduce 2.Path and select order from the tree view that displays. This tells the Node how to identify a single record in the document.
- Set Reduce 2.KeyPath to orderId. This tells the Node that the orderId field located under the order element (as specified in the Path Property) is the unique identifier or key for the record.
- Run the Workflow and drill in to the ReducedDocument Property - you'll see the full set of order records is shown there.
- Run the Workflow a second time and you'll notice that no orders are returned. This is because that data is now considered processed and unchanged.
- Open the SourceDocument Property and change the customer of the first order from abc to test.
- Run the Workflow again to see that only the changed order is shown in the ReducedDocument Property in the Workflow Logs.
Two-step Reduce
In the example above, we used the Node in what is called ReduceCommit mode. In other words, it's removing unchanged records and committing (storing) the remaining records as 'seen' so that they too will be excluded if they have not been modified by the next time the Node is invoked.
This approach is normally too simplistic for production scenarios because if a step in the Workflow fails after the Reduce Node, the data that was being processed will be removed by the Reduce Node the next time it runs and we'll have no opportunity to correct the failure.
To get around this we split the Reduce operations into two - a Reduce action, and then later, a Commit action.
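Continuing the earlier sketch (still an illustration only), splitting the operation means the hash store is only updated after downstream processing succeeds; process and orders below are hypothetical stand-ins for the intermediate Workflow steps and the source data.

```python
def reduce_only(records, key_field):
    """Reduce action: return new/changed records without updating the hash store."""
    return [r for r in records if seen.get(r[key_field]) != hash_of(r)]

def commit(records, key_field):
    """Commit action: mark records as seen, run only after processing succeeded."""
    for r in records:
        seen[r[key_field]] = hash_of(r)

changed = reduce_only(orders, "orderId")
process(changed)            # if this fails, commit never runs...
commit(changed, "orderId")  # ...so the failed records are returned again next run
```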
- Rename Reduce 2 to Reduce Step.
- Set Reduce Step.Action to Reduce and change some of the data in the Reduce Step.SourceDocument Property. For example, change a customer code or two.
- Add a second Reduce 2 and rename it to Commit Step.
- Set Commit Step.Group to contacts and Commit Step.Action to Commit.
- Copy the value of Reduce Step.Path to Commit Step.Path and the value of Reduce Step.KeyPath to Commit Step.KeyPath.
- Connect Reduce Step.ReducedDocument → Commit Step.SourceDocument.
- Run the Workflow a few times and you'll notice that the same data is returned each time. This is because the Commit stage of the Reduce is not running.
- Connect Reduce Step → Commit Step.
- Run the Workflow again and then a second time. Notice that no data is returned the second time because the Commit stage has run.
In a more complete Workflow, there will be a series of steps between the first and second Reduce 2 Nodes. If any of those steps fail, the second Reduce 2 Node won't run, which means the data that failed to process will be returned again on the next run.
Review the Reduce 2 help article for more information about this Node.
String Builder Node
When a Workflow uses iterative Nodes like For Each, you may need to progressively build up a document as each iteration completes. One way of doing this is with the String Builder Node.
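In conventional code this is the familiar accumulator pattern: append inside the loop, read the result once the loop has finished. A minimal Python analogy (not Flowgear's implementation) follows.

```python
# Analogy only: accumulate a value per iteration, then read it after the loop.
parts = []                     # the shared variable named by VariableName
for current in range(1, 4):    # Loop.Start = 1, Loop.Stop = 3
    parts.append(str(current))     # String Builder: Append
print("".join(parts))          # String Builder: Read -> "123"
```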
Follow these steps in a new Workflow.
- Add Loop, add a String Builder renamed String Append, and add a second String Builder renamed String Read.
- Set Loop.Start to 1, Loop.Stop to 3 and Loop.Increment to 1.
- Connect Loop.Loop → String Append.
- Connect Loop.Finished → String Read.
- Set String Append.Action to Append.
- Set String Append.VariableName to example.
- Connect Loop.Current → String Append.Value.
- Set String Read.Action to Read.
- Set String Read.VariableName to example.
- Run the Workflow and check that the value of the Value Property in the last String Builder Workflow Log is 123.
Review the String Builder help article for more information about this Node.
Workflow Node
A key aspect of reducing effort is being able to reuse what you have already built. To support this, Flowgear allows not just Nodes but also other Workflows to be added to a Workflow. By enabling Workflows to call other Workflows, you're able to build reusable components.
Exercise 05: Using Sub-Workflows
Follow these steps in a new Workflow.
- Add Loop.
- Click + to the right of the Loop Node to open the Node Chooser again and this time, click the Workflows tab.
- Filter for the Get Employee exercise Workflow you created earlier and select it. A Node representing the chosen Workflow will be added to the design canvas. Properties that were defined on Variable Bar Nodes in the chosen Workflow show up as Properties, but notice that they are swapped around - an output Property on a Variable Bar is an input Property on the Workflow Node.
- Set Loop.Start to 0, Loop.Stop to 3 and Loop.Increment to 1.
- Connect Loop.Current → Get Employee.id.
- Connect Loop.Loop → Get Employee.
- Run the Workflow to see it iterate through employees with ids 0 through 3. Note that id 0 will fail because that employee id doesn't exist.
Review the Workflow Node help article for more information about calling Workflows from other Workflows.
- Connect Start.RunNow → Loop.
Save and run your Workflow, then click Submit Exercise to grade it.
Key/Value Nodes
Storing Key/Values
In the String Builder example we looked at how we can store or accumulate data as a Workflow executes but this data is not retained after the Workflow completes.
By contrast, the Key/Value Nodes allow you to tag data and then report on it later. They are called Key/Value Nodes because they allow you to associate a key with a value.
For example, if you're integrating sales orders, the key component could be the order number while the value component could be the success or failure information, potentially including an error message if a failure occurred.
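As a mental model, think of it as a persistent dictionary that outlives the Workflow run and also records a Status per entry. The sketch below is an analogy only; the order numbers and messages are made up for illustration.

```python
# Analogy only: associate an outcome with each record's key so it can be reported on later.
key_values = {}  # in Flowgear this store persists beyond the Workflow run

key_values["SO-1001"] = {"status": "Success", "value": "Synced"}
key_values["SO-1002"] = {"status": "Error", "value": "Customer account on hold"}
```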
Exercise 06: Using Key-Values
Repeat the steps from the Workflow Node example above in a new Workflow before following these steps.
- Connect Start.RunNow → Loop.
- Add a Set Key-Value 2 Node, renamed Store Success, after the Get Employee Node.
- Connect Get Employee → Store Success.
- Set Store Success.Group to contacts.
- Set Store Success.Status to Success.
- Connect Loop.Current → Store Success.Key.
- Connect Get Employee.name → Store Success.Value.
- Add a second Set Key-Value 2, renamed Store Error, below the existing Store Success Node.
- Connect Get Employee.Error → Store Error.
- Set Store Error.Group to contacts.
- Set Store Error.Status to Error.
- Connect Loop.Current → Store Error.Key.
- Connect Start.Last_Error_Info → Store Error.Value.
- Run the Workflow to see the error key/value fire for employee id 0 and the success key/value fire for all other employees.
Where the employee is successfully returned, we create a key/value that correlates the id of an employee with their name.
Where the employee does not exist, we create a key/value that correlates the id of the employee with an error message.
Review the Set Key-Value 2 and Set Key-Values 2 help articles for more information about these Nodes.
Save and run your Workflow, then click Submit Exercise to grade it.
Reading Key/Values
In the example above, we recorded success and error information. Now we'll look at how to query that information.
Exercise 07: Reporting with Key-Values
Follow these steps in a new Workflow.
- Add Get Key-Values 2, Excel and Variable Bar.
- Set Get Key-Values 2.MatchGroup to contacts.
- Set Get Key-Values 2.Emit to Xml. We're going to convert the Key/Values data to an Excel sheet and the Excel Node requires XML rather than JSON.
- Connect Get Key-Values 2.Result → Excel.TableXml.
- Set Excel.Action to Create.
- Add Variable Bar.Report. Change the Property type from Text to File, then set the File Extension to xlsx.
- Connect Excel.ExcelDocument → Variable Bar.Report.
- Connect Start.RunNow → Get Key-Values 2 and Get Key-Values 2 → Excel.
- Run the Workflow and click the Download Report.xlsx Property in the Workflow Log entry for the Start Node to see the Excel presentation of the Key/Value data.
Review the Get Key-Value 2 and Get Key-Values 2 help articles for more information about these Nodes.
Save and run your Workflow, then click Submit Exercise to grade it.
Communication
There are often cases where you want a lightweight way to send a notification to yourself or team without having to do any special configuration.
The Email Alert Node allows you to set a recipient, subject and email body for this purpose. Emails sent from this Node will always use the sender alert@flowgear.net.
Review the Email Alert help article for more information about this Node.
While this Node is intended for lightweight internal notifications, production workloads would normally use Single Email, direct support ticket creation or push notifications.