Friday, 6 August 2021

How to retrieve a random YouTube video from a playlist with Power Automate


This is something fun that I had originally planned to do for the MVP Advent Calendar last year by the wonderful Megan V Walker but it didn't go ahead for good reasons. Months later I decided to do this as a WTF episode to share a way to motivate your peers/colleagues using the awesomeness of YouTube and Microsoft Teams with Power Automate.

The idea

Since most organisations now have their workforce in a hybrid or fully working-from-home arrangement, teams don't have much in-person engagement with each other anymore. Being isolated indoors has also reduced what we used to regard as everyday activities, such as going to evening classes for learning, going to the gym, or catching up with friends over a bite to eat.

I wanted to come up with a way of motivating teams to reignite their mind set. One thing that I thought of was sharing a random video from a YouTube playlist with a channel in Microsoft Teams. This can be sharing 
  • something that's related to a team's professional development such as videos from the official Microsoft Power Platform YouTube channel
  • or something that's not related to work such as at home exercises, cooking videos to try since people would be isolating indoors
A random YouTube video from a playlist can be shared using a schedule of choice. Examples are
  • Daily at 12pm
  • Every Friday at 4pm for weekend motivation
  • Once a month on Tuesdays
With Power Automate cloud flows you can motivate the team using an automated schedule with a random YouTube video from a playlist.

And that is how and why this episode came to life 😊 Keep reading if you're interested and want to try this out within your organisation.

The use case

I want to share a random video from a playlist on a weekly basis with my team so that they can learn something new or feel motivated.

Prerequisites

OK firstly, I want to make it clear that I am not a developer and the following may sound scary because it might involve something that you have never done before as a non-developer. It is not scary and you can do it too!

YouTube is part of the Google family and to interact with the Google Cloud Platform, you need a Google account - eg. what you use for Gmail. 

Once you're signed into the Google Cloud Platform you need to create a Project.

Once your project is created you will need to create an API Key to use with the YouTube API. This API Key is what will be used in a cloud flow in Power Automate. The YouTube Data API v3 is the API that will be used from the Google Cloud Platform.

Disclaimer

By having a Project with API Keys in the Google Cloud Platform and calling the YouTube API, there are quota limits involved, so keep an eye on your project's quota usage.

Understanding the YouTube API Request

The API request I'm using in my automation is PlaylistItems with the GET method. The API Key created from the project is used for authentication. The GET method has three groups of parameters, which are explained in the YouTube API documentation
  • Required
  • Filter
  • Optional
For the required parameter of part I'm referencing snippet which provides properties that I'll be using downstream in the cloud flow.

For the filter parameter I'm referencing playlistId as I'm only interested in retrieving a random video from a single playlist. I have not tried this with multiple playlists by referencing id for the filter parameter.

For the optional parameter, maxResults is referenced as a maximum of 50 items (videos) can be returned per request. This means if there are more than 50 videos in a playlist, another request is performed to retrieve the next lot of items.

This is where the pageToken parameter comes in handy as it will be used in the next API request to retrieve the next lot of items.
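
To make this concrete, a single page request ends up looking something like the following - the playlist ID, API key and page token are placeholders for your own values:

GET https://www.googleapis.com/youtube/v3/playlistItems?part=snippet&maxResults=50&playlistId=YOUR_PLAYLIST_ID&pageToken=NEXT_PAGE_TOKEN&key=YOUR_API_KEY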

The logic applied

The cloud flow is going to use the HTTP request action to call the YouTube API and loop through each page and 
  • retrieve a random video and append to an array
  • use the nextPageToken value returned in the response to process the next page
From the array created of random videos from each page, a random video will be selected.

The selected random video is then shared to a Microsoft Teams channel.

Example for your learning

The following is a diagram I shared in my WTF episode.


Let's say there are a total of 180 videos in a playlist. There will be a total of 4 pages to loop through. This means one random video will be selected from each page (4 random videos in total), and then finally a random video will be selected from those 4 (the array), which will be shared to a Microsoft Teams channel.

What the playlistItems API response looks like

The following are the properties from the API response that will be used downstream in the cloud flow (a trimmed sketch of the response shape follows the list):
  • nextPageToken
  • url of the VideoThumbnail
  • videoId
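
Below is a trimmed sketch of what the response looks like - most properties are removed and the values are placeholders, but it shows where the three properties above sit:

{
  "kind": "youtube#playlistItemListResponse",
  "nextPageToken": "CDIQAA",
  "items": [
    {
      "snippet": {
        "title": "Video title",
        "thumbnails": {
          "high": { "url": "https://i.ytimg.com/vi/VIDEO_ID/hqdefault.jpg" }
        },
        "resourceId": {
          "kind": "youtube#video",
          "videoId": "VIDEO_ID"
        }
      }
    }
  ],
  "pageInfo": { "totalResults": 180, "resultsPerPage": 50 }
}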

The playlist I'm retrieving a random video from

I'm a fan of at home workouts and one channel I do follow that has awesome workout videos for all levels is Mr and Mrs Muscle. Their video formats are really great and they also provide modified movements to help viewers out. They have a large variety of themes such as
  • Knee friendly workouts
  • No equipment HIIT workouts
  • Chair only home workouts
The playlist I'm using in this WTF episode is their Full Body HIIT Workouts.

Let's Automate

This is what my cloud flow in Power Automate looks like.

The trigger

For learning purposes the trigger is Manually trigger a flow. In real life the Schedule trigger would be used where you define the frequency of when the cloud flow should be triggered.

Initialize nextPageToken variable

Since YouTube allows a maximum of 50 videos to be retrieved per request from a playlist, an Initialize variable action is used to hold the nextPageToken value in the response returned by the YouTube API.

This nextPageToken value will be used for the HTTP request in order to retrieve the next page in the playlist so that the array of random videos can be created.

Initialize random videos array

This is another Initialize variable action that will hold the array of random videos selected from each page of a YouTube playlist.

Do until

This is the first time I believe I'm using this action in a WTF episode. It's an action that keeps running the actions inside it until the defined criteria has been met. In this use case the actions inside the Do until action are processed until the nextPageToken is blank - it no longer exists. When there is no nextPageToken in the response it means there are no pages left in the playlist to retrieve videos from.
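
As a sketch, assuming the variable is named nextPageToken, the exit condition in the Do until action's advanced mode could be expressed as

@equals(variables('nextPageToken'), '')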

HTTP request

  • The Method is GET
  • The URI is the YouTube API request
  • The Content-Type in the header is the usual application/json
  • For the values in Queries,
    • pageToken is the optional parameter and the variable is referenced
    • part is the required parameter where snippet is referenced
    • maxResults is the optional parameter where 50 is the maximum value of items returned
    • playlistId is the filter parameter - the ID of the YouTube playlist, which can be found in the URL of the playlist
  • key is the API Key created in the project within the Google Cloud Platform

Set variable - nextPageToken continuous

Using a Set variable action, the nextPageToken value returned from the HTTP request is stored in the variable created by the Initialize nextPageToken variable action. This is so that the Do until action can continue to perform the HTTP request action that calls the YouTube API in order to retrieve the next page of videos.


Append array - secondary videos

Using an Append to array variable action, a random video (row) from the page retrieved will be selected and appended to the array. A reminder that a maximum of 50 videos will be returned per page of a playlist.

The expression is
body('HTTP_request')?['items'][rand(0, length(body('HTTP_request')?['items']))]


Array value - EXPLANATION PURPOSES ONLY

This is a Compose action that I used in the WTF episode to show you the array of the random videos selected from each page returned by the API response. In the WTF episode I showed how there were only two videos selected as there were only two pages.

Random video selected

This is a Compose action to select a random video from the Random selected videos array (from the Initialize random videos array action).

The expression is
variables('Random selected videos')[rand(0, length(variables('Random selected videos')))]

Post your own adaptive card as the Flow bot to a channel

To share the random YouTube video with colleagues/peers, I am using the Post your own adaptive card as the Flow bot to a channel action. This will post an adaptive card to a channel in Microsoft Teams. I modified the Activity update sample for adaptive cards.

There is a Product video sample for adaptive cards but at the time of writing this WTF blog post, it is not supported with Microsoft Teams.

I kept the adaptive card simple by having
  • some text
  • an image which is the thumbnail of the YouTube video
  • an action which launches a browser (web or mobile) based on the embedded URL when clicked. This will be the YouTube video URL

This was all built in the adaptive cards designer site and then I copied the card payload into the action in my Power Automate cloud flow.


For the expression that is used for the image of the adaptive card, it is
outputs('Random_video_selected')?['snippet']

For the expression that is used for the action that launches the embedded URL in a browser, it is
outputs('Random_video_selected')?['snippet']['resourceId']['videoId']
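
A couple of notes on those two expressions. The thumbnail URL itself sits further down under snippet.thumbnails, so depending on how your card payload references the image you may need to drill into it, for example (assuming the high resolution thumbnail):

outputs('Random_video_selected')?['snippet']?['thumbnails']?['high']?['url']

Similarly, the videoId on its own is not a full link, so for the card's action URL you can build the watch link with something like:

concat('https://www.youtube.com/watch?v=', outputs('Random_video_selected')?['snippet']?['resourceId']?['videoId'])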

Cloud flow in action

When the cloud flow runs on the scheduled time, it will call the YouTube API to retrieve the available pages from the defined YouTube playlist and will select a random video which will be shared to a channel in Microsoft Teams for learning or motivation 😄

Summary

For some daily or weekly motivation why not use Power Automate with cloud flows to share a random YouTube video to help spark some motivation in the team? It can be a workout video that you can do from home, a cooking video for "Cook off Fridays" or an educational video from the Power Platform YouTube channel. Give it a go and when it works, give me a shout on Twitter 🙂

Friday, 23 July 2021

5 Tips & Tricks with Power Automate Cloud Flows

On Tuesday July 13, 2021 I shared five tips and tricks with Power Automate Cloud Flows at my former Melbourne BusApps, D365 & Power Platform user group.

In this WTF episode I shared what these five tips and tricks are so that you can learn about them too in your own time 😊

1. Naming your connection references

When you add an action and it's the first action to be added for that connector, as the flow maker you are usually signed in automatically, and that sign-in is used as the connection for the connector. Your credentials are used for the authentication.

Below is an example where I have added a Microsoft Dataverse action from the Microsoft Dataverse connector and I am immediately signed in.


In the WTF episode I talked about how it will automatically use the name of the connector as the Default Name value for the connection reference. In this scenario, it will appear as Microsoft Dataverse.

Even if you create a new connection from within the action, the same behaviour will appear as demonstrated in the WTF episode where the newly created connection reference will appear as Microsoft Dataverse.

How to name your connection reference

Head over to your solution and add a new connection reference. In the side panel that appears 
  1. Enter your desired name in the Display name field
  2. Provide a description
  3. Choose a connector. In my scenario it is Microsoft Dataverse
  4. Choose your credentials as the Connection
  5. Click Create
The newly created connection reference will appear in your solution.


When you view the connections in the action back in the cloud flow, you'll now see the named connection reference.

2. Create HTML table action #hack

If you try inserting a space in a header value when using the Create HTML table action, it won't work. In my scenario the header value I needed to enter was "Case Number" but when you hit the space bar on your keyboard, the space will not be rendered. The value will appear as "CaseNumber" instead.

How to insert a space in the header value

The hack I found was buried in a Power Automate community forum post. Open Notepad and type in your value, then copy and paste. That's it! Totally works.

3. Pieter's method for Apply to Each

There may be some use cases where you need to grab values from records or rows returned in an Apply to Each action to use them downstream in the cloud flow. The use case I mentioned in my WTF episode was about looping through the Cases to retrieve the fullname of the Owner so that it can be used in a notification for high priority cases.

Typically what has been done by flow makers (including myself in the past) is to 
  1. Have an Initialize Variable action before the Apply to Each which acts as the array variable
    1. In my WTF example I used this to form an array of the fullname values
  2. Within the Apply to Each, have an Append to Array Variable action which grabs the fullname of the Owner from each Case and surfaces it up into the array variable (No.1)
  3. Any action downstream in the cloud flow can then use the array variable (No.1)

The infamous Pieter's method

Pieter shared a blog post about what he discovered: variables are not needed and you can surface your looped values as an array outside of the Apply to Each action. A couple of tricks here, with a bonus tip.

Within the Apply to Each action, have a Compose action that will retrieve the value you're after. In my use case it was the fullname of the Owner associated to the high priority Case.


After this Compose action, add a new Compose action that references the output. Then, using the "Peek Code" feature, copy the expression.



Below the Apply to Each action, or anywhere else downstream in the cloud flow, you can use the copied expression. I used a Compose action and copied the value to demonstrate that the fullname values were correctly retrieved.
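
For reference, the copied expression looks something along the lines of @outputs('Compose_fullname') - the name in the brackets will be whatever you called the Compose action inside the loop (Compose_fullname here is just a made up name). Referencing it outside the Apply to Each returns the output from every iteration as an array.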


You'll notice however that it displays it as an array and will show duplicate values of the fullnames. This is because it grabbed all fullname values from all the Cases as expected. 


What if we only need to see the fullnames once, in other words the distinct values?

BONUS TIP TIME ✨

  1. To only display the distinct values, use the union function.
  2. To display the returned distinct values as a single string rather than an array, use the join function.
What's cool about the join function is that it is one smart cookie! 🍪 
The join function knows that after the last row in the array, it will not insert the delimiter you have defined in your expression. In my WTF episode use case, the expression was

join(union(variables('Fullnames'), variables('Fullnames')), ', ')


My defined delimiter is a comma and a space.

The cloud flow this time round will now display the array as a string with distinct values only. No more duplicate fullname values 👍🏼 and also notice how there's no comma and a space after the last row/fullname. Told you the join function is one smart cookie 😁
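
As a quick made up example, if the Fullnames array contained ["Alex Smith", "Alex Smith", "Jo Bloggs"], the union function reduces it to ["Alex Smith", "Jo Bloggs"] and the join function then produces the single string:

Alex Smith, Jo Bloggs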

4. Custom date and time formats

When you want to move away from the standard date and time formats, you can easily use a custom date and time format. This is useful when you want a particular format for use cases that require a notification to be posted to a Microsoft Teams channel or via email.

Refer to the documentation for the format specifiers

The docs.microsoft.com documentation has been updated and outlines the different format specifiers.

For example the first format specifier, d, displays days 1 to 9 as a single digit without a leading zero, so they will appear as 1, 2, 3 and so forth.

The second format specifier in the document, dd, displays days 1 to 9 with a leading zero, so they will appear as 01, 02, 03 and so forth.

In my use case in the WTF episode I wanted the date and time to display as the following

Sun Jul 18 03:00 PM

The format I used in my expression is the following

ddd MMM dd hh:mm tt

I also explained that I was using the convertFromUtc function in my expression as Microsoft Dataverse, like other systems, stores date and time values in UTC. If you use the value as-is from Microsoft Dataverse it will be in UTC. I therefore used the function to convert the UTC value into New Zealand local time. My expression is the following

convertFromUtc(triggerOutputs()?['body/createdon'], 'New Zealand Standard Time', 'ddd MMM dd hh:mm tt')
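
For example, a createdon value stored in Dataverse as 2021-07-18T03:00:00Z would come out of this expression as Sun Jul 18 03:00 PM, since New Zealand is UTC+12 in July.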

If you want to dynamically reference the timezone value because let's say you're in a country where there are multiple timezones, I got you! I covered this in previous WTF episodes
  • How to identify a Contacts' time zone in Flow using CDS and Bing Maps
    • This covers how to get a Contacts' time zone so that you can format the date to their local time. This is useful for external communications.
  • How to populate a date and time field based on a user's local time zone
    • This covers how to get the user's time zone so that you can format the date to their local time. This is useful for internal communications amongst the users who have access to Microsoft Dataverse.
    • Note: This would only work if the communication is sent to a single person. In the scenario where it is sent to a group of users who may be in different timezones, then the suggestion here is to make a reference to the timezone such as "Sun Jul 18 03:00PM NZST"

5. Dataverse trigger condition

I covered this in my last WTF episode and this time round the use case is different to demonstrate how trigger conditions for the Microsoft Dataverse connector can be applied.

The example I gave in this WTF episode is about triggering the cloud flow only if the customer associated to the Case is an Account. The customer column is special in that it can hold two types - Account or Contact.

In the scenario where the customer column value is a Contact, the cloud flow should not trigger.

The trick

For every trigger in cloud flows, there will always be a response/output provided. The trigger in my cloud flow is the "When a row is added, modified or deleted." In the run history of a cloud flow that succeeded, you can review the outputs.


If you scroll down you'll see a property of 

_customerid_type

Copy this property and in the edit mode of the cloud flow, 
  1. Click on the ellipsis (. . .)
  2. Navigate to Settings
  3. Enter an expression in the Trigger Conditions

The expression I used is

@equals(triggerBody()?['_customerid_type'], 'accounts')

This expression references the property and therefore the condition is that the _customerid_type property must equal accounts.

The cloud flow will now only trigger when the customer associated to the Case is an Account, and not a Contact.

If you want to learn how to do AND OR statements in your condition refer to my second WTF episode #2018Throwback 

What I shared in WTF Episode 2 is in reference to the deprecated Condition action which is no longer available but the same principles apply to the trigger conditions in the Settings of the trigger for the Cloud Flow.
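
As an example of combining conditions, if you also only wanted the cloud flow to trigger for high priority Cases, the trigger condition could look something like the following - note that prioritycode and its value of 1 are assumptions for illustration, so check the schema name and option values of your own column:

@and(equals(triggerBody()?['_customerid_type'], 'accounts'), equals(triggerBody()?['prioritycode'], 1))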

Summary

These were the 5 tips and tricks I shared in the user group session. 
  1. Create the connection reference in a solution in order to enter a named connection reference
  2. Use Notepad and copy + paste your value for the Header into your Create HTML table action to have a space in the Header value
  3. Outside of the Apply to each action reference the output of the Compose action within the Apply to each action and use a couple of functions to display the array as a string
    1. Union - to retrieve only the distinct values in the array
    2. Join - to display the distinct values from the array as a single string
  4. Refer to the Power Automate Cloud Flow documentation to format the desired custom date and time value
  5. Use the Microsoft Dataverse trigger conditions under Settings to enter an expression that references the properties in the body of the trigger
Awesome sauce. I hope you learnt something new and thanks for stopping by 😄

Thursday, 6 May 2021

Better management of SharePoint Document Libraries for Dynamics 365

Today integration with SharePoint for Document Management with Dynamics 365 is supported out-of-the-box. When you go through the Document Management configuration for Dynamics 365 you define which tables you want to enable for the integration and it will also automatically create default Document Libraries for each table in your SharePoint Site.

When the configuration has been completed the end user experience will be as follows:

  1. End user views a record, such as Case.
  2. The end user uploads a file by navigating to Documents under the Related tab.
  3. A Dynamics 365 Document Location record (row) is created which represents the folder in SharePoint under the default Document Library of Case.
  4. The file is uploaded against the Document Location and from the Case record the file can be viewed, as well as browsed to in SharePoint.

An issue you don't know of until it hits...

One of the known SharePoint issues is that

A library (or list) cannot have more than 50,000 unique security scopes

The Microsoft documentation implies that the recommendation is to have less than 5,000 unique scopes and one of the methods to apply is to split your SharePoint structure in a manner where documents can live under different Document Libraries.

What can I do to manage my Document Libraries?

Well there are many ways but in this WTF episode you'll learn how to manage your Document Libraries using a period structure to split the document libraries based on a date column's year value. 

In the Case table I created a new column that represents when the Case was acknowledged. For example any files associated to a Case that was acknowledged/raised/received in the year 2020 will be stored in a Document Library that represents the year - Case 2020. This applies continuously,
  • Case 2020
  • Case 2019
  • Case 2018
  • etc...

The logic




Whenever a document is uploaded for a Case, check if a Document Library for the period exists based on the year of the date column's value.

If it does not exist create a new Document Library in SharePoint using the year value and naming convention of "Case - YYYY." 

The same needs to be applied to Dynamics 365 where a Document Location record is created. A Dynamics 365 Document Location can represent both a Document Library and a Document Location (folder) for SharePoint.

The document would have initially been uploaded in a Document Location (folder) that lives under the default Document Library of "Case" in SharePoint. This folder needs to be moved to the Document Library that represents the period of the Case. This is done within SharePoint so Dynamics 365 also needs to be updated to point to the new Document Library after the folder has been moved.

The Dynamics 365 Document Location record needs to be updated to point to the Document Library that represents the period so that
  1. It's not under the default Document Library of "Case"
  2. The end user experience is not broken if they view the Documents tab from within the Case record and click on the button that directly opens the SharePoint document location in a new browser tab.
With regards to No.2, if the Dynamics 365 Document Location is not updated an error will be encountered by the end user because in SharePoint the folder has moved but the Document Location record in Dynamics 365 is still pointing to the default Document Library where the folder no longer lives.

If you're scratching your head, don't worry - keep reading + watch my vlog and it will make sense 😅

[BTW Ryan Maclean if you're inspecting my diagram and thinking about the lack of thoroughness, I'm sorry! Lol. I know you're into diagrams correctly reflecting a process.]

Let's Automate

This is what my cloud flow in Power Automate looks like.

Some details before we dive into the steps

  • I configured the Dynamics 365 integration with SharePoint by linking it to a SharePoint site in the same tenant.
  • The table I am using in my automation that has the Regarding lookup column is Document Location.
  • The table of the Regarding object type that will be used in the trigger condition is Case.
  • In the Case table I have a custom date column called "Date Acknowledged" which is used to identify the period the Case is associated with so that it is reflected in the Document Library structure. I chose not to use the Created On date field as customers I have worked with in the past can retrospectively enter a past date when a Case is created in Dynamics 365.
  • The naming convention I am using is "Case - YYYY"

The trigger

The Dataverse trigger is "When a row is added" where the table referenced is Document Locations.

Delay

In my vlog I explained the purpose of using the Delay action in the cloud flow. As soon as an end user clicks on the "Upload Document" button within the Case record, a Document Location record is automatically created, which is by system design for the Dynamics 365 integration with SharePoint. This Document Location record is created before the document is actually uploaded. The automation in this cloud flow is only applicable when a document is uploaded, so we need to allow time for the document to actually be uploaded.

The delay I'm using is 2 minutes.



Note: As I write this I now realise that I should have additional logic to terminate the cloud flow when a document is not uploaded (the end user can choose to cancel the upload within the Dynamics 365 Case record). That's something you should factor in if you follow the steps in this WTF episode.

Get Case row by ID

This is the Dataverse Get row by ID action. In this action the Case is retrieved through the Regarding object value (remember this is the GUID of the row) and only my date column that represents the date the Case was acknowledged is returned. This is so that we can extract the year value from the date column to identify the period of the Case, which is then reflected in the Document Library structure.

Received Year value

This is a Compose action where only the year value is extracted from the Date Acknowledged column of the Case the Document Location is in regards to. The date field is referenced from the previous Dataverse action.

The expression is
 
formatDateTime(outputs('Get_Case_row_by_ID')?['body/ben_dateacknowledged'], 'yyyy')

List Document Location rows

In this step the Dataverse List rows action is used to retrieve a Dynamics 365 Document Location where the relativeurl column equals the period of the Case. The relativeurl is the Document Library.
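
As a sketch, the Filter rows value in the List rows action could look something like the following - the 'Case ' prefix is an assumption based on the naming convention used later in the flow, so adjust it to match yours:

relativeurl eq 'Case @{outputs('Received_Year_value')}'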

Initialize Parent SiteLocation variable

An Initialize variable action is used so that we can set the document location based on
  • the Document Location record that represents the Document Library for the period if it already exists. This comes from the previous Dataverse List rows action
  • the Document Location record that represents the newly created Document Library for the period if it does not already exist.

Condition

The condition action is checking whether a record is returned in the Dataverse List Document Location rows action.

The expression used is

length(outputs('List_Document_Location_rows')?['body/value'])

And the operation used is "is equal to" where the value is "0".


If the value is equal to 0 then the cloud flow will proceed to the Yes path where
  1. A new Document Library is created in SharePoint
  2. A new Document Location is created in Dataverse
If the value is not equal to 0 then the cloud flow will proceed to the No path as it means a Document Location exists for the period, which will be used downstream in the cloud flow to move the folder from the default Document Library to the Document Library that represents the period.

If Yes - Send an HTTP request to SharePoint

I couldn't see an action to create a new Document Library in SharePoint using the SharePoint connector but I did find this blog post by Alan which I followed. Cheers Alan! 😃

In this action I'm providing a Description and using my naming convention of "Case YYYY" for the SharePoint Document Library, which references the year value extracted from the Date Acknowledged column of the Case table in the earlier Received Year value Compose action. This is so that a new Document Library in SharePoint is created for the period the folder should live in, instead of the default Document Library.
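
A rough sketch of what such a request can look like is below - it uses the standard SharePoint REST endpoint for creating a list, where BaseTemplate 101 is the document library template, so check Alan's post for the exact details he uses:

Method: POST
Uri: _api/web/lists
Headers: Accept and Content-Type both set to application/json;odata=verbose
Body:
{
  "__metadata": { "type": "SP.List" },
  "BaseTemplate": 101,
  "Title": "Case @{outputs('Received_Year_value')}",
  "Description": "Documents for Cases acknowledged in @{outputs('Received_Year_value')}"
}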

Add a new Document Location row

In the previous step we created the SharePoint Document Library and now the Dynamics 365 Document Location needs to be created so that it can be used for any documents uploaded against a Case. In this Dataverse Add a new row action I am creating a new Document Location record using the same naming convention as the previous step, where the year value from the Date Acknowledged column is used.

Three more fields that need to be configured for the Dataverse action are the
  • Location Type
  • Parent Site or Location field
    • Note that I have explicitly referenced the GUID of the SharePoint Site record in Dynamics 365; best practice is to use an environment variable but that's something I am not covering in this WTF episode. For the purposes of learning I am explicitly referencing the GUID.
  • Relative URL
    • The relative URL is set using the same naming convention. This is important as this points to the Document Library and is used whenever a new Document Location is created that needs to refer to the Document Library that represents the period. In other words, this will be what is used in the "No" path of the condition.

Set Parent SiteLocation variable

In this Set variable action the GUID of the newly created SharePoint Document Location is used to define the Parent SiteLocation variable value from the earlier Initialize Parent SiteLocation variable step.

If No - Set Parent SiteLocation variable 2

In this Set variable action the GUID of the SharePoint Document Location is retrieved from the earlier step that uses the Dataverse List rows action - List Document Location rows. A Document Location already exists for the period and its GUID will be used to define the Parent SiteLocation variable value from the earlier Initialize Parent SiteLocation variable step.

Move folder

In this SharePoint action we are moving the folder in SharePoint from the default library to the Document Library that represents the period of the Case. The relative url value from the trigger is referenced in the Folder to Move field by using the expression of

/incident/@{triggerOutputs()?['body/relativeurl']}

The Destination Folder represents where the folder should be moved to which will be the Document Library that represents the period so again, the year value from the Date Acknowledged column in the Case table is referenced. The expression is

/Case @{outputs('Received_Year_value')}

Update Document Location for Case

In this Dataverse action the Document Location that was originally created from when the end user clicked on the "Upload document" button in the ribbon within the Document tab in Dynamics 365 needs to be updated to point to the correct Document Library that represents the period. 

This is where the Parent SiteLocation variable is referenced to set the GUID of the Document Library based on the Yes or No path taken.

Cloud flow in action

Time to give this automation a whirl. When a document is uploaded where the Regarding is Case and it is for a period that does not already exist as a Document Library in SharePoint, the Yes path will be processed.


When a document is uploaded where the Regarding is Case and it is for a period that does exist as a Document Library in SharePoint, the No path will be processed.

An alternative to Power Automate

I shared in my vlog that there is an ISV provider called Connecting Software that provides an add-on to one of their products to resolve the unique permissions issue I mentioned earlier. This product is called SharePoint Structure Creator and it will automatically manage your Document Libraries in SharePoint whenever a file is uploaded from Dynamics 365 into SharePoint.

I came to know of Connecting Software as one of the customers I work with uses their CB SharePoint Permissions Replicator product and they provide great support. I had a good support experience with them, the person was awesome.

It is important to note that the SharePoint Structure Creator is an add-on to the CB SharePoint Permissions Replicator.

I also want to make it clear this is not a sponsored blog post by Connecting Software, this is me simply sharing that this is an alternative as what I share in this WTF episode is limited to a single table whereas their add-on product can provide automation across all tables through simple configuration.

Summary

Structuring Document Libraries for a table in Dynamics 365 can be achieved using a cloud flow in Power Automate. It's a matter of using a date column to determine the period the Case is associated with, and checking whether a Document Library in SharePoint exists for that period.

I hope you found this useful and learnt something from this WTF episode.

Shout out to...

My former work bestie Rex 🦖 for helping me out with my cloud flow in a couple of areas that needed to be refined. Miss you my friend.

Another SharePoint + Dynamics 365 + Power Automate blog post

If you want to learn other methods that can be applied to SharePoint integration with Dynamics 365, I recommend checking out this blog post by Ryan Maclean. Ryan shares how you can configure SharePoint differently to the default set up that is defined by Dynamics 365 using Power Automate.

Thursday, 29 April 2021

Trigger cloud flows based on the Regarding object type for Dynamics 365 or Dataverse

This WTF episode is how to trigger your cloud flow for Dynamics 365 or Dataverse based on the Regarding object type. The trigger I refer to in this WTF episode is the When a row is added, modified or deleted trigger from the Dataverse connector.

In an activity record and some other tables such as the SharePoint Document Location table, there is a lookup field/column that goes by the name of Regarding. This field on a form allows you to select another record/row from a table that it is in regards to. For example,

  1. An email is associated to a Case. The case would be the value in the Regarding field.
  2. A document has been uploaded against the Case. Behind the scenes, a SharePoint Document Location record is created to represent the location of where the file has been uploaded to in SharePoint.
What if you need to trigger your cloud flow only when a document is uploaded against a Case and not trigger when a document is uploaded against an Account? This is what you'll learn in this WTF episode.

Anatomy of the Regarding lookup column

There are different types of lookups and, as mentioned earlier, the Regarding field on a form allows you to select another record/row from a table. Refer to this blog post for more details.

In model-driven apps, you see it as "Regarding," where you select the table followed by the record/row.


If you review the attribute settings of the column you'll see there is a reference to "Target Record Type" which represents the table of the Regarding object. Below is a screenshot from the classic editor.


Underneath the hood there are two parts to the Regarding column (regardingobjectid),
  1. Regarding object value - this is the GUID of the record/row
  2. Regarding object type - this is the table of where the record/row resides in
In my earlier screenshot of the Document Location the Regarding lookup column is a row from the Case (incident) table.

Why can't I use the Filter row field in the trigger?

You can try like I did but you're going to run around in circles. You'll be asking yourself why the cloud flow did not trigger.

If you attempt to filter by calling the Web API directly, you're also going to find it difficult to achieve. If you search for it online using your Google powers, you'll see what I mean when you read the responses to people who have tried to filter by the Regarding object type.

The other clue is when you review the properties returned in the API after calling it directly, you won't see a property that is in reference to the Regarding object type, only the value.

So what can you do?

One of the beautiful things about cloud flows is that we can reference any property returned in the response of a trigger or an action. This can be a blessing in disguise and in this scenario, it is.

In my vlog I showed you that I had a cloud flow with the When a row is added, modified or deleted trigger and a compose action with a dummy value so that I could show you what properties are available in the response of the trigger.


Let's take a look at the run history and review the properties returned. As you can see one of the properties returned is _regardingobjectid_type


This is a property we can reference but it's not going to be in the Filter row field of the trigger. It's going to be in the old school Conditions option within the Settings of the trigger. Click on the ellipsis and select Settings.



In here you'll see the ability to add a Condition and this is where we can enter an expression that references the property.


In my second WTF episode (#throwback) I explained how to write expressions using the legacy Conditions action (note this no longer exists) where it's separated into three parts as per the screenshot below.


The first part represents the condition and in our use case we want the Regarding object type to equal the Case table.

The second part represents where the attribute (column) is being retrieved from and in our use case it will be retrieved from the Trigger.

The third part represents what value of the attribute needs to equal and in our use case it is incidents.

The expression to use when filtering by table based on the Regarding object type is

@equals(triggerBody()?['_regardingobjectid_type'],'incidents')


That's it!

Cloud flow in action

Upload a document against the Case and the cloud flow will be triggered.


If a document is uploaded against an Account, the cloud flow will not be triggered.

Summary

By using the trigger conditions within the Settings of the trigger, you can use an expression that references the _regardingobjectid_type property from the trigger. This is something we can't do today in Dynamics 365 classic workflows or when you call the API directly. Cheers Power Automate!

#LetsAutomate

The truth

It took me hours when I couldn't get it working with the Filter row field, and when I reviewed all the questions + responses on Stack Overflow about trying it directly through the Dynamics 365 API, it wasn't leading anywhere either. Then I thought - hey, why not try the trigger conditions since we can call properties of the trigger. It worked!!! I slept peacefully that night LOL 😆

If you do use this, please let me know by giving me a shout out on Twitter - @benitezhere 😊