You can now get the complete set of the Bare Bones Configuration Guide for Configuring Product Information Management within Dynamics 365 for Operations at the bundled price through the Dynamics Companions store; it's free for Premium members.
You can now get the complete set of the Bare Bones Configuration Guide for Configuring Inventory Management within Dynamics 365 for Operations at the bundled price through the Dynamics Companions store; it's free for Premium members.
You can now get the complete set of the Bare Bones Configuration Guide for Configuring Procurement and Sourcing within Dynamics 365 for Operations at the bundled price through the Dynamics Companions store; it's free for Premium members.
You can now get the complete set of the Bare Bones Configuration Guide for Configuring Sales Order Management within Dynamics 365 for Operations at the bundled price through the Dynamics Companions store, and Premium members can download it for free.
The Waterdeep Trading Company Project
I have been reviving an old project that I started a while ago. As a lifelong fan of Dungeons & Dragons, with the unfortunate problem that I cannot find anyone to play with, I have decided to create a test implementation of Dynamics 365 in the AD&D format, just to see how it would work and whether I can find some creative ways to use Dynamics 365. I chose to implement the Waterdeep Trading Company as an example where I can track their many legal (and not so legal) entities within Faerûn.
Overview
The Waterdeep Trading Company is the purveyor of the finest adventuring supplies to travelers, rogues, wizards, and clerics in all Faerûn, and is headquartered in the great city of Waterdeep on the Sword Coast.
Recently they have been experiencing a huge upswing in traffic through their store and have realized that the old quill-and-scroll-based financial system is not going to scale any more, and that their manual supply chain management processes will not be able to handle their forecasted future demand.
As a result, they have taken the step to modernize their finance and supply chain systems and implement Microsoft Dynamics 365 to manage all of their legal (and not so legal) entities, and the following is a journal of how they set up their system, and how they tweaked the system to make it work perfectly for them.
If you want to follow along in the journey and set up your own copy of the system, then feel free.
Creating a Custom Mockaroo Flow Connector
Mockaroo is a great tool for creating dummy data for our demonstrations, because we can call it from Flow and then integrate it into some very cool examples. But if we want to make it look even simpler, we can create a custom connector in Flow that surfaces the Mockaroo APIs as if they were native connections.
This allows us to register the returned fields and use them in other Flow steps without having to manipulate the data manually or remember the connection information for the Mockaroo APIs.
If you want to see how this is done, then here is everything that you need to know.
Introduction
I have a number of APIs defined in Mockaroo that I use to create random sample data.
If we drill into any of the APIs, we can see the URL that we can call to create the demo data.
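If you want to sanity-check one of these endpoints outside of Flow first, here is a minimal sketch in Python (the schema name random_address.json is a hypothetical example, and the key value is a placeholder for your own Mockaroo API key):

```python
import requests

BASE_URL = "https://my.api.mockaroo.com"
SCHEMA = "random_address.json"        # hypothetical schema name
API_KEY = "YOUR_MOCKAROO_API_KEY"     # placeholder - use your own key

# Mockaroo authenticates with a "key" query string parameter on the URL.
response = requests.get(f"{BASE_URL}/{SCHEMA}", params={"key": API_KEY})
response.raise_for_status()

# The JSON body printed here is the same sample that we will paste into
# the "Import from sample" panel later on to define the response fields.
print(response.json())
```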
How to do it…
To create a new connector, click on the settings icon in the menu bar and then select the Custom Connectors menu item.
This will open up the list of all of the custom connectors that we have created.
To create a new connector, click on the + Create custom connector link and then select the Create from blank option.
This will open up a dialog box to start creating the Custom connector.
We just need to enter in a Custom connector name and then click on the Continue button.
This will open up the connector setup form on the first configuration step.
If we want, we can add a custom icon for the connector by clicking on the Upload link and selecting the icon for the connector.
Now we have a nice and snazzy icon for our connector.
Next we will want to specify the Icon background color that matches the background of the icon.
And then we will want to add a Description for the connector.
Now return back to the Mockaroo API page and find the base URL for the API.
For this example the base URL for the API is my.api.mockaroo.com.
Now we will want to paste the URL that we got from the Mockaroo API into the Host field and then click on the Security link.
That will take us to the Security step. Right now it's set to No authentication, which is OK for the Mockaroo API, so we can click on the Definition link to move to the next step.
This will take us to the Definition page, where we will need to set up the Actions that will be linked back to the Mockaroo APIs. To do this, just click on the New Action link.
This will take us to the action configuration page.
Start off by giving the new Action a Summary.
Then add a Description for the action.
Then enter in an Operation ID that you will reference the action with.
Next we will need to specify the format of the Request section of the connector action. The good thing is that we can get Flow to do most of the work by clicking on the + Import from sample link, which will open up a panel for us.
Return back to the Mockaroo API page and copy the URL for the API.
Then paste the entire URL into the URL field and then click on the Import button.
This will build the Request section of the action for us.
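As a side note, what the wizard is really assembling behind the scenes is an OpenAPI (Swagger 2.0) definition for the connector. Purely as an illustration (the exact document the wizard stores will differ, and the path and operation names here are hypothetical), the imported action boils down to something like this, sketched as a Python dict:

```python
# A rough sketch of the OpenAPI 2.0 definition behind the connector -
# not the literal document that the wizard generates.
connector_definition = {
    "swagger": "2.0",
    "info": {"title": "Mockaroo", "version": "1.0"},
    "host": "my.api.mockaroo.com",
    "basePath": "/",
    "schemes": ["https"],
    "paths": {
        "/random_address.json": {                 # hypothetical path
            "get": {
                "operationId": "RandomAddress",   # the Operation ID we entered
                "parameters": [
                    {"name": "key", "in": "query",
                     "type": "string", "required": True},
                ],
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
}
```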
Now scroll down to the Response section. This will be used to define what fields are returned back by the action. There is a default response that is generated with the connector but we will want to update that. So click on the default bubble.
This will open up an Import from sample panel that we can use to create our response variables.
Return to the Mockaroo API and click on the URL to generate the sample data. Then copy the entire JSON body that is returned.
Now paste the JSON data into the Body field and click on the Import button.
That will return you back to the Definition page. It doesn’t look like anything has happened, but click on the default button again.
That will open up the detail of the body and we can see that there are fields that are being returned that match the fields that were returned by the Mockaroo API.
Before we continue on, click on the Create connector link in the header of the page.
Now we can test the connector from the Test page. Start off by clicking on the + New connection button.
This will register a connection for us that is linked back to the Mockaroo API.
There is one last thing that we need to do and that is to specify the API key for the Mockaroo API.
Return back to the Mockaroo API and select the end of the URL, which is the API Key.
Then paste it into the key field.
All we need to do now is click on the Test operation button.
Oh no, Mr. Bill! It looks like it isn't working.
Don’t worry – we think that there is a lag in the registration of the API with the Azure service.
Wait a couple of minutes and try again.
Now when we click on the Test operation it will return back data.
How it works…
Now we can see the new custom connector in action.
Start off by going into Flow and clicking on + Create from blank.
Then click on the Create from blank link again on the next page.
When the Trigger selector is displayed, select the Flow button for mobile to create a simple trigger.
Then click on the Manually trigger a flow trigger.
This will add a manual trigger for us.
Now click on the + New step button.
When the Choose an action browser is displayed, we can search for our custom connector and see all of the actions that we have published.
For this example we will select the Random address action.
This will open up the action and we will see that we need to specify a key again.
Just paste in the same API Key that we used when we were testing the connector.
Now that we have our random address we can use the data from Mockaroo in the next step. Click on + New step, then select the Bing Maps connector and the Get location from address action.
This will create an action which is looking for some address information.
If we click on the Address line field then we will see the Content browser and we will also see each of the fields that are being returned back from the Mockaroo API.
We can add the respective address fields into the Bing action.
Before we finish, we can give the Flow a better and more descriptive name, save it, and then click on the Test button.
Since this is the first time that we are going to run through the test we need to select the I’ll perform the trigger action and then click on the Save & Test button.
When the Run flow dialog is displayed, click on the Continue button.
Then click on the Run flow button.
When the Flow is kicked off, click on the Done button.
And we will see that the Flow runs and that the random address information from Mockaroo is part of the flow.
Extra credit…
If we want to add more APIs to the connector, then we just repeat the process for each of the APIs in Mockaroo that we want to surface in Flow.
And then all of the actions will show up as we browse for the new custom connector in the Actions browser.
Summary
How cool is that? Although we could have connected to Mockaroo in Flow using the HTTP connector, this is so much easier, and we bypass a number of additional parsing steps when the call to Mockaroo is made.
Using Profiles in Chrome to manage different demo personas
Although Internet Explorer and Edge are great browsers, Chrome does have one feature that is really useful when we are demonstrating Dynamics 365. Chrome allows us to have multiple personas up and running within different browser sessions, and it keeps all of their session information separate. So if we are trying to show how multiple people work through a scenario, we can have them all up and running at the same time without having to resort to multiple InPrivate sessions or multiple browser types to keep the people in check.
So even though it pains me to say it, it might be a good idea to use Chrome just for this reason.
How to do it…
Start off by opening up Chrome.
If you haven’t noticed the different profiles that you can switch between within Chrome then click on the name up in the top right of the browser and you will see that there is an option to manage the people profiles.
To see all of the people just click on the Manage people link.
This will open up a list of all of the different people that you can impersonate.
To create a new persona, click on the Add person button.
This will open up a form where we can add a new persona.
Start off by giving your new demo persona a name. In this case we added Alicia Thomber, along with her role within the organization (Purchasing Agent), so that we can easily see what her role is.
After we have done that all we need to do is click on the Add button.
That will open up a new Chrome session for us that is linked to Alicia’s profile and we can open up the Dynamics 365 home page.
How it works…
This will allow us to log into Dynamics 365 with her account information, and we will be able to save this information within her profile.
Also, when she logs in she will only see the apps that she is allowed to access.
Additionally she will be secured down just to the functions that she has access to within Dynamics 365.
And we can even see all of her information within the Employee Self Service portal.
The great thing is that if we are working with multiple personas within the same session, they don’t step on each other’s toes and we can see their details at the same time.
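As an aside, if you like to script your demo setup, each of these personas can also be launched directly from code using Chrome's --profile-directory switch. Here is a minimal sketch (the Chrome path, profile directory name, and URL are all placeholders; check chrome://version in each persona's window to find its real Profile Path):

```python
import subprocess

# Placeholder values - adjust for your machine and tenant.
CHROME = r"C:\Program Files\Google\Chrome\Application\chrome.exe"
D365_URL = "https://yourtenant.operations.dynamics.com"

# Each Chrome persona lives in its own profile directory (e.g. "Default",
# "Profile 1", "Profile 2"), which is what keeps the sessions separate.
personas = {
    "Alicia Thomber (Purchasing Agent)": "Profile 2",
}

for name, profile_dir in personas.items():
    subprocess.Popen([CHROME, f"--profile-directory={profile_dir}", D365_URL])
```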
Conclusion
How cool is that? Not only is it easy to separate out the different sessions that we need to run during a Dynamics 365 demonstration, but we can also easily tag the people with roles and even personalize each profile just like we would do in the real world.
Exporting Selected Rows to Excel
Dynamics 365 has an export function that allows us to export all of the data from a form over to Excel, but did you know that you don't have to export everything? If you just want to export some of the data, then you can do it selectively from the list page and export only the records that you want.
In this quick walkthrough, we will show you how to do this.
How to do it…
Start off by opening up the list page that you want to export the rows from and select the rows that you want to export.
Then right-click on the check marks beside the records.
When the context menu is displayed, there will be two new options: Export all rows and Export marked rows.
To export just the rows that you selected, click on the Export marked rows option.
This will open up the Export to Excel panel, and we can click on the Download option to download the records locally.
When the file dialog is displayed, we can just click on the Open button to open up the Excel workbook.
This will open up Excel, and only the records that we selected in the list page will be exported out to Excel.
If we switch to Edit mode, then we can manipulate the data as much as we like.
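And if you would rather manipulate the exported records programmatically instead of in Excel itself, the download is just a regular workbook. Here is a quick sketch with pandas (the file name is a placeholder for whatever the browser saved the export as, and the openpyxl package is assumed to be installed):

```python
import pandas as pd

# Placeholder file name - use whatever the export was saved as.
df = pd.read_excel("Customers.xlsx")

print(df.shape)    # only the marked rows made it into the export
print(df.head())   # peek at the first few exported records
```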
Review
How cool is that?
Configuring a BYODB and Creating a Full and Incremental Entity Export
In a production environment for Dynamics 365 for Finance & Operations, the primary database is locked down so that you cannot query or maintain the database directly.
But there are a lot of times when we would like to have access to the data so that we can create integrations with other systems, publish data out for others to view, or even report off it through other reporting tools.
So the big question is how do you do this?
The answer is through the BYODB (Bring Your Own Database) feature within Dynamics 365 that allows us to attach an external SQL database to our instance and then publish any of the data entities that are available within the Data Management framework out to the database.
In addition to doing full exports of the data, we can also schedule incremental updates that take advantage of the change tracking feature to push only the data that has changed within the entity, rather than all of the data.
Although this may seem like a daunting effort, it’s not really that hard, and in this walkthrough we will show how to get everything wired up.
Topics Covered
- Creating an Azure SQL Database
- Configuring an Entity Export to Database
- Publishing Entities to the BYODB Datasource
- Creating an Entity Export Project
- Enabling Change Tracking on Data Entities
- Creating an Incremental Data Export Project
Creating an Azure SQL Database
The first step in the process is to create a new Azure SQL Server Database that we will use as the BYODB for Dynamics.
Spoiler Alert: This will cost you some Azure $ to set up.
How to do it…
We will start off by opening up our Azure Portal by going to portal.azure.com.
Here we also created a new dashboard that we can use to track all of the resources that we create for this project.
Now we will want to create a new SQL Server resource that we will house our BYODB in.
To do this, just click on the Create a resource button on the left hand side of the Azure portal.
To find the SQL server resource template, just type in sql into the filter box. Then select the SQL server (logical server) resource type.
When the details for the SQL server (logical server) are displayed, just click on the Create button to create the resource.
This will take us to the configuration pane for the SQL server that we are creating.
We will now need to give our SQL Server a Server name.
For this database we will set the Server name to mufifebyodb.
Next we will want to set the Server admin login username that we will be using to authenticate with.
For this example we set the Server admin login to the same as the Server name to make it easier to remember.
And then we will want to specify the Password that we will be using to authenticate against the database with.
Next we will want to either create a new Resource group that we will put the SQL Server resource into, or use an existing Resource group.
To make it easier to track down all the resources used for this project we will create a new Resource group and set it to mufifebyodb.
And finally, we will want to add this resource to our dashboard, so we check the Pin to dashboard option.
After we have done that we can just click on the Create button to get Azure to configure the resource for us.
That will return us to our dashboard, and we will see that the resource is queued for creation.
After a couple of seconds, we will have a new SQL server resource and we will be taken to the configuration page.
Now we will want to add a new SQL Database to the server.
To do this, just click on the + New database link in the menu bar for the SQL server.
This will open up the SQL database configuration form where we will be able to specify the details for our new BYODB database.
Start off by giving the new SQL database a Database name.
For this example we will keep everything consistent and give our SQL database a Database name of mufifebyodb.
Next we will want to choose the type of database that we want to create. If we click on the Pricing tier field we will see all of the different SQL options and also how much it will cost us per month to run the SQL database.
By default the Standard database is selected, which (at the time of writing this) is going to cost about $15 a month.
If you are feeling flush, then you could select a Premium database, but for this example it's a little bit too expensive.
So we will select the Standard database template and click on the Apply button.
After we have done that we can check the Pin to dashboard option and then click on the OK button.
This will kick off the process to create the Azure SQL database.
After a few seconds we will have a newly minted SQL database that we can start using.
Before we move on to the next step we will want to find the Connection string for the database, and to do that we will want to click on the Show database connection string link.
This will open up a page with all of the connection string examples that we will need later on in the process.
Review
Congratulations. We now have a SQL server database that we can use to export out our data entities to.
Configuring an Entity Export to Database
Now that we have created our BYODB database that is hosted outside of the main production environment we can link it with Dynamics 365.
How to do it…
To do this we will want to open up the Data management workspace by clicking on the Data management tile on the home page of Dynamics 365.
This will open up the Data management workspace.
Now we will want to click on the Configure Entity export to database tile within the Import/Export panel.
This will open up the Entity store maintenance form. Here we can create new BYODB data sources and also maintain existing ones.
We will want to create a new connection, so we will want to click on the + New button in the menu bar.
This will open up the New Entity Store form where we will want to define our connection details.
We will start off by giving the Entity store a Source name.
For this example we will set it to MUFIFEBYODB.
Next we will add a Description for the BYODB.
For this example we set the Description to MuFife BYODB.
Now we will return back to our Azure portal and copy the connection string for the database that we just created.
And then we will paste it into the Connection string field.
By default, the connection string doesn’t include the actual username and password for the SQL database, so we will want to update the {your username} and {your password} placeholders in the connection string.
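As an illustration (using this walkthrough's example names, with an obviously made-up password), the edited connection string ends up looking roughly like this:

```python
# Illustrative only - the real string comes from the Azure portal's
# connection strings page; only the username and password placeholders
# should need to be swapped out.
connection_string = (
    "Server=tcp:mufifebyodb.database.windows.net,1433;"
    "Initial Catalog=mufifebyodb;"
    "User ID=mufifebyodb;"           # was {your username}
    "Password=MyS3cretP@ssword!;"    # was {your password} - made up here
    "Encrypt=True;Connection Timeout=30;"
)
```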
After we have done that we can click on the Validate button to validate that the connection is correct.
If everything is good then the test will complete without any issues.
Since this is not a Premium database, we will also want to uncheck the Create clustered column store indexes option.
After we have done that we can click on the Save button.
When we return back to the Entity store page we will see that we now have a connection to the external SQL database defined.
Review
How easy was that? Now we have a connection between Dynamics 365 and an external database that we can start publishing entity data to.
Publishing Entities to the BYODB Datasource
The next step is to allow the data entities that we want to export to the BYODB to be published to the new BYODB entity store.
How to do it…
To do this we will want to find the entities within the Data management workspace by clicking on the Data entities tile.
This will open up a list of all of the entities that are available to be published.
To whittle down the options we can filter the list down to the usual suspects. Here we set the filter to customers so that we see the entities that relate to the customer record.
Now we will want to select the entity that we want to publish to the entity store.
In this example we will want to select the Customers V3 entity, and then we will want to click on the Publish button in the menu bar.
This will take us over to a list of all the Entity Stores that we have configured. Here we just select the BYODB that we just configured and click on the Publish button.
This will kick off a job to publish the entity to the BYODB.
When we return back we will see that the entity has been published.
If we want to see what we have just done, we can return to our Azure SQL database and click on the Query editor link.
This will open up the SQL Data Explorer form. To start seeing the data in the database we will want to log in by clicking on the Login button.
This will open up a Login page with most of the data defaulted in.
All we need to do here is type in the Password that we have set up for the SQL Admin and click on the OK button.
After logging in we will be able to see the database within the SQL explorer.
If we expand the Tables folder we will be able to see the new entity has been added to the database.
Review
Now we are cooking with gas. We have linked our entity to the BYODB and we have a table that we can start accessing. But we still don’t have any data there.
Creating an Entity Export Project
To populate the table that we have within the BYODB, we will create an Export project within the Data management tools and copy all of the data that we have in the entity over to it.
How to do it…
To do this we will return back to the Data management workspace and click on the Export tile.
This will open up the Export configuration page where we can build our data export.
We will start off by giving our project a Group name.
For this example we set the Group name to CustomerBYODBExport-Full.
Then we can add a Description to the project.
For this Export we set the Description to Customer BYODB Export (Full).
Now we will want to add our entity that we just published to the BYODB data source into the project.
We do that by clicking on the + Add entity button.
Now we can select the published entity that we want to include in the export.
Next, we will want to choose the Target data format that we will want to export the data in. When we scroll through the list of different options, we will see that our BYODB Entity Store is now showing up on the list.
So now we will select the MUFIFEBYODB as our Target data format.
Now we will want to select the Default refresh type that we will use for this export. We can either just select the incremental updates or we can get all of the data and do a full update.
For this export we want to publish all of the data that is in the table, so we will want to set the Default refresh type to Full push only.
After we have done that we can click on the Add button to add the entity and format mapping to the project.
Once we have done that we will see that behind the form, the entity mapping has been added. At this point we can just close the Add entity dropdown dialog.
Now that we have defined our project and the entity mapping that we want to use to push the data over to the BYODB Entity store, we can start the transfer process.
To do this we just need to click on the Export button in the menu bar.
This will schedule the export job to run in the background, and we can click on the Close button to dismiss the message box.
We will then see the current export job’s details. To see the progress we just click on the Refresh button.
It won’t take too long before the job will have processed and we will see that the data has been pushed from the entity over to the BYODB Entity Store.
In this example 48 customer records have been exported.
To see the data in the Entity Store we can return over to the Azure SQL database and click on the + New Query to start a new SQL query pane.
Inside the query we can enter a simple SQL select statement to see all of the data in the table.
All we need to do is type Select * from dbo.CustCustomerV3Staging and then click on the Run icon.
This will run the query, and in the Results panel we will see all of the records that are now in the BYODB database.
If we scroll over just a little we will be able to see the customer details including the name and the customer account.
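And because the BYODB is just a regular Azure SQL database, any external tool can run that same query, which is the whole point of the exercise. Here is a quick sketch (assuming the pyodbc package and the ODBC Driver 17 for SQL Server are installed, and reusing this walkthrough's example names with a made-up password):

```python
import pyodbc

# Connection details from the walkthrough - substitute your own server,
# database, and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=mufifebyodb.database.windows.net,1433;"
    "DATABASE=mufifebyodb;"
    "UID=mufifebyodb;"
    "PWD=MyS3cretP@ssword!;"   # placeholder password
    "Encrypt=yes;"
)

cursor = conn.cursor()
# The staging table that the entity export project populated.
cursor.execute("SELECT * FROM dbo.CustCustomerV3Staging")

for row in cursor.fetchmany(10):   # peek at the first few records
    print(row)

conn.close()
```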
Review
How easy was that? Once we have created our export project in the Data management workspace, we can add any published entity to it and push the data over to the BYODB.
Enabling Change Tracking on Data Entities
Now that we have all of our data transferred over to the BYODB Entity Store, we don’t want to stop there. We want to also make sure that any new data that is added to the table within Dynamics 365 is also transferred over.
We could just rerun the full export job that we just created to add new data, but that would be a little inefficient, especially for entities that may have a lot of data in them.
So now we will want to create another export project that just pushes the incremental changes. To do this we will first need to turn on Change tracking on the entity that we want to push, so that the system will watch for changed records.
To do this we will open up the Target entities list and find the entity that we just created the full export for. In this case it’s the Customers V3 entity.
Then we will want to open up the CHANGE TRACKING action panel and we will see that there are a few different options for tracking changes on the entity.
For this example we will want to track all of the changes, including the changes to subsidiary tables, so we will click on the Enable entire entity button.
This will enable change tracking on the entity.
Review
That was easy.
Creating an Incremental Data Export Project
Now that we have enabled change tracking on the Customers V3 entity, we can use it to create an incremental update export job that will run in the background and continuously check for new and changed data to move over to the BYODB Entity Store.
To do this we will return to the Data management workspace and click on the Export tile to create another export project.
This will open up the Export project definition form.
We will want to give our export project a unique Group name.
In this example we will set the Group name to CustomerBYODBExport-Incremental.
Next we will give our export project a Description.
Here we set the Description to Customer BYODB Export (Incremental).
Now we can start adding our entities that we want to export by clicking on the + Add entity button.
We will want to click on the Entity name dropdown list and select the entity that we want to do the incremental update for.
In this example we will select the Customers V3 entity from the list, which is the one that we enabled change tracking on.
And next we will select the BYODB Entity Store from the Target data format dropdown.
For this example we selected the MUFIFEBYODB Target data format.
This time we will want to select Incremental push only from the Default refresh type dropdown, which will tell the job to look only for the changed data.
After we have done that we can add it to the export project by clicking on the Add button.
This will add the entity export to the project and we can close the Add entity dropdown form.
Now that we have the project created we will want to schedule this to run periodically rather than having to manually kick the process off.
To do this we will want to click on the Create recurring data job menu item.
This will open up the panel that will allow us to define the batch job details.
We will start off by giving our batch job a unique Name.
For this example we set the Name to be CustomerBYODBExport-Incremental.
Now we want to schedule the recurrence frequency of the job.
We can do that by clicking on the Set processing recurrence link.
This will open up the Define recurrence panel where we are able to set the frequency of the job runs.
For this job, we will want it to refresh each day, so we will select the Day recurrence pattern.
Then we can click on the OK button.
This will return us back to the Create recurring data job panel.
There is one last thing that we need to do here, and that is to give the batch job an Application ID from Azure that it will use to authenticate with.
Here is a quick cheat: open up the Azure Active Directory applications form and there should be some applications there that have already been authorized to use Dynamics 365, and we can copy the Client Id from there.
All we need to do is paste in the Application Id into the field.
Finally, we will just want to check the Enable flag for the Application Id that we just added and then click on the OK button.
This will open up a dialog box asking us if we want to enable the recurring job and we will want to click on the Yes button.
After we have done that, we are done. The incremental export will run every day and update the BYODB Entity Store with any new or changed data.
Review
How easy was that?
Conclusion
Congratulations. We have now set up a new SQL database in Azure that other applications can access and we have not only published the data from an existing entity in Dynamics 365, but we have also created a refresh job that will go out and add any incremental data to the BYODB Entity store periodically.
This is very cool.