Showing posts with label Office 365. Show all posts

Wednesday, 2 December 2020

How to authenticate as an application with Microsoft Graph API with flow

What is Microsoft Graph API?

Microsoft Graph API is an access point for all your data across your applications and services in your Microsoft cloud. 

Previously there were different SDKs, each with its own security, messaging and data format requirements - examples include OneDrive for Business, Outlook, Azure Active Directory and the Discovery Service. The inconsistency and learning curve were a challenge for developers. Microsoft Graph centralises these APIs, providing a standardised structure for building applications on top of the different services. It also provides capabilities for extending the experience of an application or service, such as Microsoft Teams. This allows developers to build with reduced effort compared to learning the different SDKs.

Authentication with Microsoft Graph API

There are two permission types:
  1. Delegated permissions which requires user consent
  2. Application permissions which requires admin consent
Today the connectors in flow within Power Automate authenticate through a user account. An example is the Office 365 Users connector: as the flow maker, your user account is used as the credentials and you are required to give consent for the Microsoft Graph API to authenticate as you. Below is a screenshot that shows the authentication method being a user account.


For Microsoft Dataverse, there's the option of authenticating as a Service Principal Account, which is essentially an application user. The Service Principal Account can be used as the authentication method for the CDS current environment connector. But what about the other connectors, such as the Office 365 Users connector I mentioned?

This is where you would need to do some investigation to find what option is suitable for your organisation and what you're trying to achieve with flow in Power Automate. One option is to authenticate as an application which is what I'll be sharing in this WTF episode.


This WTF episode is a companion to a webinar I hosted in the week of Thanksgiving 2020 where I demonstrated how the Power Platform can be used to support a hybrid workforce of teams working from the office or remotely. I give an overview of the following:
  1. How a Canvas app with a PCF control can be used to allow employees to communicate where they are working from for the week and, if working from the office, select a floor and desk of where to sit
  2. How to change the look and feel of cards in Microsoft Lists using JSON
  3. How a manager can interact with a bot through Power Virtual Agents in Microsoft Dataverse for Teams, where a flow does the magic to display the list of employees back to the manager - this is where authenticating as an application with Microsoft Graph API is used
You can watch the webinar recording here.

Part 1 - Create an app registration

To authenticate as an application with the Microsoft Graph API, an app registration needs to be created which can be done in the Microsoft Azure portal. I did cover this in a previous WTF episode but I'll run through it again.

Log into Azure portal, click on App Registrations and click on +New Registration. If you don't see the following below when you log into the Azure portal, search for App Registration.


Enter a Name and select whether you want a single-tenant or multi-tenant registration, followed by the Register button. Refer to the table in this docs.microsoft.com article that explains the differences between the account options.


Once created you'll see details of the App Registration which will be referenced in the connector in flow within Power Automate.


Two more items need to be configured afterwards. The first one I showed in my vlog was creating a Client Secret, which will be used to secure the connector in flow within Power Automate. Head to Certificates & secrets and click +New client secret.


Enter a name for the Client Secret. Select a suitable option for the expiry setting and then click Add.


The Client Secret will be created. Copy the Client Secret value and save it somewhere safe, as it will be used when configuring the authentication of a connector in flow within Power Automate.


The second configuration is to enable API permissions for Microsoft Graph API. The documentation available for Microsoft Graph API is great because each API request has a Permissions section that outlines the permission required for delegated or application access. The example I used in my vlog was the List directReports API request, which outlines the permissions required for application access. I used the "User.ReadWrite.All" permission.

Head over to API permissions and click on +Add a permission and select the Microsoft Graph option.


Select Application permissions.


Search for the User.ReadWrite.All permission, select it and click on Add permissions. A message will appear confirming the permission has been added and it will be listed.


Next grant admin consent and that will enable the API permission for the application.



For further reading on app registration, this docs.microsoft.com article provides some further explanation of what an App Registration is.

Part 2 - Using the app registration in flow within Power Automate to authenticate as an application with Microsoft Graph API

The connector to use in flow within Power Automate is the HTTP connector. This is a premium connector so you would need to ensure that you have the relevant licensing that allows you to use premium connectors.

The following is what would be configured in the HTTP action.


1 - The API request method. This is defined by the API request in the documentation for Microsoft Graph API. In my vlog I was calling the List directReports API request and the method defined in the documentation is GET.

2 - The API request as defined in the documentation is what would be inserted in this field of the action. Again, refer to the Microsoft API documentation for the API request URI that you are using. In my use case I am retrieving the list of employees that report to a specific manager so I am using the second option listed in the documentation.

The parameter id is the Object ID of the user record in Azure Active Directory and the other parameter that can be used is the userPrincipalName which is also in the user record in Azure Active Directory.

You will also notice in my screenshot that I've also referenced a select statement which will only retrieve the two properties specified which are displayName and userPrincipalName.

3 - This uses the standard application/json value for the Content-Type header.

4 - Authentication is Active Directory OAuth to reference the app registration in Microsoft Azure portal.

5 - This is the Tenant ID from the app registration in Microsoft Azure portal.

6 - This is the Base Resource URI of Microsoft Graph API.

7 - This is the Client ID from the app registration in Microsoft Azure portal.

8 - The option to use is Secret.

9 - This is the Client Secret from the app registration.
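The authentication the HTTP action performs under the hood can be sketched outside of flow. Below is a minimal Python sketch of the client credentials grant using only those same app registration values; the tenant ID, client ID, client secret and manager object ID are placeholders, and the URLs and body fields follow the Microsoft identity platform and Graph documentation:

```python
import json
import urllib.parse
import urllib.request

GRAPH = "https://graph.microsoft.com"

def build_token_request(tenant_id, client_id, client_secret):
    # Client credentials grant: authenticate as the application itself,
    # with no user sign-in involved.
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests the application permissions already granted in the portal.
        "scope": f"{GRAPH}/.default",
    }
    return url, body

def build_direct_reports_url(user_id):
    # The same request the HTTP action makes: List directReports with a $select.
    return f"{GRAPH}/v1.0/users/{user_id}/directReports?$select=displayName,userPrincipalName"

def fetch_direct_reports(tenant_id, client_id, client_secret, manager_id):
    # Not called here - running this requires a real app registration.
    url, body = build_token_request(tenant_id, client_id, client_secret)
    req = urllib.request.Request(url, data=urllib.parse.urlencode(body).encode())
    token = json.load(urllib.request.urlopen(req))["access_token"]
    graph_req = urllib.request.Request(
        build_direct_reports_url(manager_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    return json.load(urllib.request.urlopen(graph_req))["value"]
```

Fields 5, 7 and 9 in the action map directly onto the token request here, and field 6 (the Base Resource URI) is the `https://graph.microsoft.com` audience.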

Flow in action

Once the flow has been configured you're good to go in testing your HTTP action authenticating as an application with the Microsoft Graph API.


As I demonstrated in my vlog, the response returned when authenticating as an application is the same as the response returned through user authentication.

Summary

To authenticate as an application in flow within Power Automate you can use the HTTP connector and reference the details of an Azure app registration. The standard connectors (such as Office 365 Users) would not be used since they authenticate as a user, whereas with the HTTP connector you can define the authentication for an application to interact with the Microsoft Graph API.

If you can't grant admin consent in the Microsoft Azure portal - it may not be accessible or allowed for some users in your organisation - check out this docs.microsoft.com documentation. Big shout out to Audrie Gordon for the tip! 💡 Subscribe to her YouTube channel, she has great content.

Catch you in the next #WTF episode

Monday, 23 September 2019

Delaying emails to certain individuals based on their time zone

07:24 Posted by Benitez Here
The free Flow Online Conference ran for the second time on September 10, 2019. I was lucky enough to be included in the awesome line-up of speakers. My model-driven app was Star Wars themed too 😁


I presented a use case where emails needed to be delayed for certain individuals based on their time zone. To be more specific, as an example: delay sending emails to only primary contacts based on their local time zone. This use case came about because previous clients I have worked with had customers nationally or globally and wanted emails delayed for a group of customers.

But we can already delay emails in Outlook? 🤔

Outlook will send the emails at the single time selected, in the time zone of the person sending them. The same applies to any email marketing software: delayed emails are sent at a date and time based on the sender's time zone - not the recipients' time zones.

Furthermore, Outlook has no functionality to query contacts and include the resulting contacts in an email where they don't see each other's names.

As demonstrated in previous WTF episodes, through the power of Flow the art of sending emails based on local time zones is 100% possible.

Watch my Flow Online Conference session in the below YouTube video. I present at 2:30.


The remainder of this blog post will outline what was presented.

User Story

The following is the user story from my presentation.

As a customer service technician,

I want to send delayed emails to primary contacts based on their time zone,

so that they can receive the email in their local time.

1.1 - The trigger

The trigger can be a recurrence or a Flow button. It is entirely up to how the Flow will be triggered.

1.2 - Retrieve Accounts

The email addresses of the individuals are stored in CDS in the contact entity. Since we are only sending the email to primary contacts of an organisation the CDS List Records action is used to retrieve the organisations from the account entity.


For the purpose of this Flow I did not enter a filter query. In the scenario where you need to email primary contacts of organisations that meet criteria such as primary contacts of VIP organisations, you can apply a filter query in the CDS List Records action against the account entity.

1.3 - Apply to Each

For every account returned the primary contact will need to be identified. The Apply to Each action is therefore used.

1.4 - Primary Contacts

The next step is to identify the primary contact of an organisation in CDS. The account entity has a single primary contact which is represented by a lookup field. The Apply to Each action is used alongside the CDS List Records action to retrieve the contact entity with a filter query of

contactid eq @{items('1.3_Identify_Primary_Contact')?['_primarycontactid_value']}

This filter query will only return contacts based on the GUID in the primary contact field associated to the organisation. The dynamic content value is the Primary Contact field from the previous action that returns accounts.
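As an illustration of what the filter query amounts to, here is a small Python sketch; the helper name and the sample GUID are made up for illustration:

```python
def primary_contact_filter(primary_contact_id):
    # Build the OData filter the List Records action uses: match the contact
    # whose GUID equals the account's _primarycontactid_value lookup.
    return f"contactid eq {primary_contact_id}"

# e.g. for one account returned by the earlier List Records action:
account = {"_primarycontactid_value": "9f4be8a0-0000-0000-0000-000000000000"}
print(primary_contact_filter(account["_primarycontactid_value"]))
```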

1.5 - Apply to Each

For every identified primary contact from the previous action, a bunch of actions will be executed in order to send the delayed email to the primary contact. To ensure these actions are executed against each primary contact an Apply to Each action is used.

1.6 - Get location by address

As seen in a previous WTF episode there is a Bing Maps action "Get location by address" that allows the location latitude and longitude to be identified based on address information. The Address 1 fields in the contact record of the primary contact will be used. The assumption I made here is that the primary contact inherits the address of the account it is associated to.

1.7 - Identify local time zone of Contact

There are additional Bing Maps Time Zone APIs available which you can read from this Bing Maps blog post. These APIs are not available as Bing Maps action in Flow however the HTTP action can be used to call the APIs.

In order to find out the time in the primary contact's time zone, the "Given location coordinates, find the time zone of the place" Bing Maps API can be used through a HTTP action.

Simply reference the latitude and longitude outputs from the previous step followed by inserting a Bing Maps key.

https://dev.virtualearth.net/REST/v1/TimeZone/latitude, longitude?key=<bingmaps-key>
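To illustrate how that URI is assembled, here is a small Python sketch; the helper name is made up and the key stays a placeholder:

```python
def timezone_request_url(latitude, longitude, key):
    # "Find the time zone of the place" endpoint: the coordinates go in the
    # path, the Bing Maps key in the query string.
    return f"https://dev.virtualearth.net/REST/v1/TimeZone/{latitude},{longitude}?key={key}"

# Wellington, NZ coordinates as an example point (key left as a placeholder):
print(timezone_request_url(-41.2889, 174.7772, "<bingmaps-key>"))
```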

1.8 - Retrieve localTime property

To ensure the primary contact receives the email in their local time including the date such as September 21 at 4.30pm, we need to know what is their date and time in their local time zone.

To achieve this the localTime property in the response of the previous HTTP action can be used. Below is a screenshot of the response.


If you attempt to select localTime as dynamic content in a subsequent action you're out of luck - it won't be displayed. You will only see "Body" since the action used is HTTP, and Flow is only aware of the Body output rather than the individual JSON properties.


There are two methods that can be applied to get around this.
  1. Use the Parse JSON action which will allow you to view and select the properties as dynamic content values.
  2. Use a Compose action which allows you to reference the property without using the Parse JSON action but simply an expression instead.
Both of these methods have been used and talked about in previous WTF episodes.

The Compose action with an expression is used in this Flow.

body('1.7_Local_time_of_Contact').resourceSets[0].resources[0].timeZone.convertedTime.localTime


Below is a screenshot to help you understand the expression used.

My way of explaining this expression is to work backwards:

From the HTTP action, get the localTime property from the convertedTime object, whose parent object is timeZone which is in the resources array whose parent object is resourceSets.

If you have a better way of explaining this to a non-technical soul, you can tweet to me 😁
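The same traversal can be illustrated in Python against an assumed, trimmed-down response shape (the values here are sample data, not a real Bing Maps response):

```python
# A trimmed-down, assumed shape of the Bing Maps response body:
response = {
    "resourceSets": [
        {
            "resources": [
                {
                    "timeZone": {
                        "windowsTimeZoneId": "New Zealand Standard Time",
                        "convertedTime": {"localTime": "2019-09-21T16:30:00"},
                    }
                }
            ]
        }
    ]
}

# Walk the same path the expression walks: resourceSets -> resources ->
# timeZone -> convertedTime -> localTime.
local_time = response["resourceSets"][0]["resources"][0]["timeZone"]["convertedTime"]["localTime"]
print(local_time)  # 2019-09-21T16:30:00
```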

1.8.1 - Split the date string

The localTime property is an ISO 8601 date-time string.


Only the date is required - not the time - to understand the date in the primary contact's local time zone.

To retrieve only the date we need to separate the date from the time in the localTime string value. The split function can be used in an expression within a Compose action, where 'T' is the separator.

split(outputs('1.8_Retrieve_LocalTime'), 'T')

1.8.2 - Retrieve date only string

The output of the previous action will return the following,


This results in an array where the date and time are separated. The first element is the date and the second is the time.

Another Compose action is used to retrieve the first element in order to reference only the date of 2019-09-21.

outputs('1.8.1_Split_the_date_string')?[0]
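The equivalent of steps 1.8.1 and 1.8.2 in Python, using the same sample value:

```python
local_time = "2019-09-21T16:30:00"

# 1.8.1 - split on the literal "T" that separates date from time
parts = local_time.split("T")
print(parts)        # ['2019-09-21', '16:30:00']

# 1.8.2 - take the first element to keep only the date
date_only = parts[0]
print(date_only)    # 2019-09-21
```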

1.9 - Retrieve windowsTimeZoneId

The same technique is used as in 1.8, where the windowsTimeZoneId property is referenced in a Compose action.

body('1.7_Local_time_of_Contact').resourceSets[0].resources[0].timeZone.windowsTimeZoneId


The two actions 1.8 and 1.9 are within a parallel branch as they are not dependent on each other. Both actions can be performed simultaneously.

1.10 - Convert time zone

The next step is important as the Convert Time Zone action is used to ensure the email is delayed based on the local time of the primary contact.

Flow uses UTC which means the intended local time of the primary contact (for example 9am) needs to be converted to UTC. I've covered this in a previous WTF episode which can be referred to for more learning.


For example in order to send the email at 4.30pm in the local time zone of a primary contact who is based in Wellington, it has to be converted to UTC which is equal to 4.30am. Below is a screenshot from Savvy Time to illustrate the equivalent time in UTC of the Wellington time zone.

1.11 - Delay until

The Delay Until action is used where the date and time is the output from the previous 1.10 Convert time zone action. The email will be sent when the date and time is met which is equal to the local time of the primary contact in their time zone.

1.12 - Send an email

The final action used is the Outlook Send an email (V2) (Preview) action where the content of the message is entered and the recipient is the Primary Contact from the 1.4 Primary Contacts action.

Show time!

When you run the Flow it will go ahead and do its magic.


I'll use a primary contact in Wellington, New Zealand from the Flow run history as an example. If we look at the convert time zone output, we can see that the UTC time has correctly been reflected as 4.30am. 


As seen in the earlier screenshot, 4.30pm in Wellington is equal to 4.30am UTC. Therefore the email is delayed until 4.30pm in the Wellington time zone. The Primary Contact will receive the email at 4.30pm in their time zone.

Summary

Delayed emails based on an individual's time zone can be achieved through the power of Flow. To go one step further you can also apply queries in the business scenario where only selected individuals need to receive the email.

After reading one of the reviews of my presentation, I realised one step is missing from my original Flow. Maybe I was low on energy that day and didn't recognise the gap. I used the date from localTime, which is great; however, the expression should also add 1 day so that the email is sent the following day.

My Flow is available for download from the TDG Power Platform Bank.

I also want to take the time to say big thanks to Jon Levesque and Gabriel for their massive effort in organizing and putting together a free Flow Online Conference for everyone. It was really great and I can't wait to see the next one with a new line up of speakers.

Sunday, 14 April 2019

Storing the date input value in a Dynamics 365 or CDS DateTime field

In Flow there are triggers and one of the CDS triggers available is the "When a record is selected." In this trigger you can configure an input that will allow the end user to enter information. The available inputs today are the following:


When you configure your trigger to use the date input, it might puzzle you at first when you look at the value in your model-driven app. For example in my Flow the end user is selecting a date and the value will be used as the Due Date of a Task that is created from the Flow.


When you review your Flow run history it may look like this:


The Flow will be successful however when you look into the run detail you'll notice that in the Task action the default time value associated is midnight.


When the end user views the Task in the model-driven app it will look like the following:


This may not make sense at all to the end user. The reason behind it is due to how Flow, Dynamics 365 and CDS use UTC for DateTime fields. In a model-driven app the end user is viewing it in their time zone. In my case I am in the Melbourne, Australia time zone and UTC midnight equals 11am in Melbourne.


So how do you display it accordingly? That's what this WTF episode covers.

Retrieving the time zone of the user who triggered the Flow

Since the trigger is "When a record is selected," it's not possible to use the "Get record" action to retrieve the user's time zone as you would've seen in my previous WTF episode. This action is more suited to when you want to retrieve information based on a user lookup field. However since the Flow will be triggered from an end user selecting a record, the method this time has to be different.

Enter two other MVP sources
  1. Tip #1205: Local time in Flow using Common Data Service
  2. CDS, Microsoft Flow and DateTime formats
Both outline methods that allow you to identify a user's time zone. For this WTF episode I am using the method described in the CRM Tip of the Day post. The Flow in the CRM Tip of the Day post is available for download as well.

For the remainder of this blog post I'll now break down the Flow.

1 - The trigger

The trigger is when a record is selected in CDS through the model-driven app and the date input is used as seen earlier.

2 - Retrieving the date input value

Use a compose action to grab the date input value of the trigger. This is so that it can be referenced in the last CDS action.

3 - Get the end user's Office 365 profile information

Use either the Office 365 Get My Profile or Get User Profile action - either will work - referencing the User ID from the trigger.



The purpose of this action is to retrieve a value that identifies the Dynamics 365 or CDS user through their Office 365 ID, so that their time zone can be retrieved from their personal settings. This value will be used in the next action to match the user's Azure Active Directory Object ID.

4 - List records to grab details of the user in Dynamics 365 or CDS

The entity to reference is Users and the filter query is the following,


This will identify the user in Dynamics 365 or CDS. If you look at the output of this action and view it in Notepad++, it's this property that lives in the Users entity:

5 - Retrieve the user GUID

This step is not necessary but it keeps your Flow tidy. If you skip it, an Apply to Each action will appear. Use a Compose action to retrieve the systemuserid value with the first function. A List Records action usually returns multiple results; when you know your query only returns one record, the first function is handy because it makes the output appear as "one" record rather than a list of records, which prevents the Apply to Each from appearing. Your expression will look like this

first(body('1.4_List_executing_user')?['value']).systemuserid
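The idea behind the first function can be illustrated in Python against an assumed List Records response shape (sample data only):

```python
# Assumed shape of a List Records response body: a "value" array of records.
body = {
    "value": [
        {"systemuserid": "11111111-0000-0000-0000-000000000000", "fullname": "Ada"},
    ]
}

# Equivalent of first(body(...)?['value']).systemuserid: take the first
# record so no Apply to Each wraps the subsequent actions.
system_user_id = body["value"][0]["systemuserid"]
print(system_user_id)
```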

6 - Retrieving the time zone information of the user

This get record action is the same action from my last WTF episode where you retrieve the user's personal settings to identify their time zone through the timezonecode property. Reference the output of the previous compose action that has the systemuserid.

7 - Retrieving the time zone name of the user's time zone

This list records action is the same action my last WTF episode where you reference the time zone definition entity to retrieve the time zone name. The time zone of the user is defined through the property timezonecode.

8 - Retrieve the time zone name

Similar to #5, use the first function in an expression to retrieve the time zone name from the standardname property. This keeps your Flow tidy.

first(body('1.7_Get_Time_Zone_Name_of_User')?['value']).standardname

9 - Set the Time of the DateTime value

When you select a date from within the trigger (#1) the default time will be midnight. To avoid confusing the end user when they view the DateTime field that references the selected date, provide a set/fixed time. In my vlog I used 9am, which is 09:00:00.

Use a Compose action, and the function to use in your expression is formatDateTime, referencing the output of the Date Input followed by a string format of the date and time. The string format is where you include your desired set/fixed time.

formatDateTime(outputs('1.2_Date_input_value'), 'yyyy-MM-ddT09:00:00')
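As an illustration of what that expression produces, here is a Python sketch (the helper name is made up; it mirrors the yyyy-MM-ddT09:00:00 format string):

```python
from datetime import datetime

def with_fixed_time(date_input, hour=9):
    # Mirror formatDateTime(..., 'yyyy-MM-ddT09:00:00'): keep the selected
    # date, replace the default midnight time with a fixed 9am.
    selected = datetime.fromisoformat(date_input)
    return selected.strftime(f"%Y-%m-%dT{hour:02d}:00:00")

print(with_fixed_time("2019-04-14T00:00:00"))  # 2019-04-14T09:00:00
```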

10 - Convert time zone action

This is your best friend when it comes to storing date and time values in Dynamics 365 or CDS DateTime fields. To display the desired set/fixed time of 9am, you must convert 9am in the context of the end user's time zone into UTC.  Since Flow, Dynamics 365 and CDS use UTC, this is required whenever you want to display a DateTime value that makes sense to the end user based on their time zone.
  1. For the base time, the output of the compose action in #9 is used.
  2. For the format time, use the usual ISO format. To do this, select custom and enter the format.
  3. For the source time zone, use the output of the compose action in #8.
  4. For the destination time zone, use UTC.

You can also use a compose action and use the convertTimeZone function in an expression if you want to be ultra nerdy and not use the official convert time zone action. This works too #TriedAndTested
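The conversion the Convert time zone action performs can be sketched in Python. This is only an illustration with a fixed UTC offset rather than a named Windows time zone (resolving named zones and their DST rules is what the real action handles for you):

```python
from datetime import datetime, timedelta, timezone

def to_utc(base_time, utc_offset_hours):
    # Equivalent of the Convert time zone action with UTC as the destination:
    # interpret the naive local time at the given offset, then convert to UTC.
    local_zone = timezone(timedelta(hours=utc_offset_hours))
    local = datetime.fromisoformat(base_time).replace(tzinfo=local_zone)
    return local.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")

# 9am in Melbourne (UTC+10 outside daylight saving) is 11pm the previous day in UTC:
print(to_utc("2019-04-14T09:00:00", 10))  # 2019-04-13T23:00:00
```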

11 - Grabbing the converted time

In my action that is creating a new Task record, I am referencing the output of the convert time zone action as the Due Date. 

Ta da!

Now when you run the Flow and view the record in the model-driven app, the date and time now displays correctly according to the end user's defined time zone.

Summary

Whenever you use the Date Input in your "When a record is selected" trigger and need to display the value back to the end user, it has to be defined in the local time zone of the user and then converted into UTC. To reference the personal settings of the user who triggered the Flow for their time zone information, you'll need to grab details of their Office 365 profile to identify their user record in Dynamics 365 or CDS.