Step by step to deploy D365FO with Azure DevOps

In this article we will show you how to deploy Dynamics 365 Finance & Operations features using Azure DevOps Pipelines and Microsoft Dynamics Lifecycle services.

Vinicius Almir Weiss

October 24, 2023 | 7 minute read

dev

Dynamics 365 Finance & Operations is a cloud-based or on-premises Enterprise Resource Planning (ERP) system. It offers a rich environment for managing finances and inventory for companies of any size. Azure DevOps is a platform that provides automated tools for end-to-end software life cycle management. In this article we will show you how to deploy Dynamics 365 Finance & Operations features using Azure DevOps Pipelines and Microsoft Dynamics Lifecycle Services.

Azure Configuration

  • Create a service account

The first step is to create a service account in Azure Active Directory (AAD). The account password must not expire and validations such as MFA must be disabled. In our case, we named the account after the role it will perform, to make it easy to identify.

1.png

2.png

  • Create an App registration

The next step is to create an app registration, because the Microsoft identity platform performs identity and access management (IAM) only for registered applications. On the AAD screen, add a new app registration.

3.png

Use any name you prefer, leave the single-tenant option selected, and set the redirect URI type to Web.

4.png

When registration is finished you will see the details of the newly created registration, but a few more configurations are still required.

5.png

In the Authentication menu, add the Mobile and desktop applications platform and select the redirect URI. Set supported account types to single tenant, and enable Allow public client flows under the advanced settings.

6.png

7.png

In the API permissions menu, add the Dynamics Lifecycle services permission and grant admin consent (this step must be performed by a user with admin privileges).

8.png

The screen to grant admin consent to LCS can be accessed with the Enterprise Applications link at the bottom of the screen.

9.png

Now you have all the permissions needed for the next steps.

LCS configuration

The LCS configuration is simpler: just add the service account created in step 1. This account will be responsible for authentication. Open LCS and, in the project users menu, add the created account.

10.png

Remember to grant the correct permissions to the user and to log in once with that account in the LCS workspace so the registration completes. If you do not log in with the account, the user registration will remain pending and not accepted, as in the image below.

11.png

DevOps service connection configuration

This step requires the service account (step 1) as well as the application (client) ID from the app registration (step 2). It is also necessary to install the "Dynamics 365 Finance and Operations tools" extension. Open the extensions menu in the Azure DevOps page and install it.

12.png

Go to the project settings and, in the service connections menu, select the LCS service.

13.png

Fill in the connection parameters: the service account (step 1) and the application (client) ID (step 2).

14.png

Artifacts Configuration

Azure DevOps uses NuGet as its package manager; this NuGet-as-a-service offering is called Azure Artifacts. It hosts the packages used to generate the build package that will be sent to LCS. The first step is to download the packages from LCS's Shared asset library menu. You need the four packages described below, and they must match the version of the environment you want to deploy.

15.png

The next step is to create a feed in the Artifacts menu and connect to it. The connection page shows the nuget.config file, which should be saved to the folder where the NuGet packages were downloaded. The nuget.exe executable and the credential provider script must also be downloaded through the get tools link.

16.png

17.png
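For reference, the nuget.config shown on the feed's connection page follows this general shape. The feed name and organization in the URL below are placeholders: copy the exact entry Azure DevOps displays for your own feed.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <!-- "MyFeed" and "MyOrganization" are placeholders: use the exact
         source entry from your feed's "Connect to feed" page. -->
    <add key="MyFeed" value="https://pkgs.dev.azure.com/MyOrganization/_packaging/MyFeed/nuget/v3/index.json" />
  </packageSources>
</configuration>
```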

I saved everything in one folder named NuGet, but if you choose a different layout, remember to fix the paths. Run the credential provider script as an administrator to set the credentials used to access Azure Artifacts.

17.png

In Visual Studio go to the tools menu and open the package manager console.

18.png

It will open a terminal. Change to the NuGet folder created in the previous step and publish the four packages with the command below.

19.png
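The push commands look roughly like this. The feed name and .nupkg file names below are assumptions for illustration: use your own feed name and the exact file names of the packages you downloaded from the LCS Shared asset library.

```shell
# Run from the folder containing nuget.exe, nuget.config and the
# downloaded .nupkg files. "MyFeed" is a placeholder for your feed name;
# with the Azure Artifacts credential provider configured, "az" is
# passed as a dummy API key.
nuget.exe push -Source "MyFeed" -ApiKey az microsoft.dynamics.ax.platform.devalm.buildxpp.nupkg
nuget.exe push -Source "MyFeed" -ApiKey az microsoft.dynamics.ax.application.devalm.buildxpp.nupkg
nuget.exe push -Source "MyFeed" -ApiKey az microsoft.dynamics.ax.applicationsuite.devalm.buildxpp.nupkg
nuget.exe push -Source "MyFeed" -ApiKey az microsoft.dynamics.ax.platform.compilerpackage.nupkg
```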

As a result, the Artifacts page will look like this:

20.png

Pipeline creation

Now we will finally start the pipeline configuration. Azure Pipelines combines continuous integration, continuous delivery, and continuous testing to build, test, and deliver your code. Before starting, complete the setup below:

  • Define which branch will be deployed;
  • Create a folder called "build" in the branch;
  • Create an empty solution in the "build" folder (in my case, I named it AutomaticDeploy).

21.png
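By the end of this section, the build folder in the branch will look roughly like this. The folder and file names follow my example; adjust them to your own structure.

```
Branch/
└── build/
    ├── AutomaticDeploy.sln   <- the empty solution
    ├── nuget.config          <- copied from the NuGet folder
    └── packages.config       <- lists the published package versions
```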

Copy the nuget.config file from the NuGet folder to the build folder. Also create, in the same folder, a file called "packages.config", which must list the versions of the NuGet packages published to Artifacts.

22.png
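A packages.config along these lines should work. The package IDs below are the four standard D365FO build packages; confirm them, and especially the version numbers (shown here as placeholders), against the files you downloaded from LCS and published to your feed.

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Versions are placeholders: they must match the packages published
       to your Artifacts feed, which in turn must match the version of
       the environment you are deploying to. -->
  <package id="Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp" version="7.0.0.0" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Application.DevALM.BuildXpp" version="7.0.0.0" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp" version="7.0.0.0" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Platform.CompilerPackage" version="7.0.0.0" targetFramework="net40" />
</packages>
```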

After this step, create or import the new pipeline in the Pipelines menu. If you want to start from a base pipeline, Microsoft provides a template repository. Then choose the repository where your code is maintained; in my case, TFVC.

23.png

24.png

The pipeline tool offers a catalog of tasks to choose from. In this case, the pipeline will build the solution and create a package to be sent to LCS by the release. It consists of the following tasks.

  • Get Sources: select the branch that will be built, then add the tasks below to the pipeline.

25.png

  • Nuget Installer Packages: this step restores the packages from your Artifacts feed using the config files we added to the build folder.

26.png

  • Update Model Version: helps differentiate the model metadata in the deployable packages and tie them back to their originating build definition, providing end-to-end traceability of code changes.

27.png

  • Build Solution: It will get the empty solution and build it with the latest changes.

28.png

  • Copy Compile Logs: Like the name of the task already says, it will save the logs for each build.

29.png

  • Use Nuget: Install the latest version to create the deployable package.

30.png

  • Create Deployable Package: Use Nuget to create the deployable package.

31.png

  • Publish artifact: Send the deployable package to artifacts.

32.png

That's it: with these steps we have a deployable package ready to be released. There are other configurations available, such as how the pipeline is triggered; select the best option on the Triggers tab. Among the available options, gated check-in validates a change before it is committed to version control, whereas CI verifies a change after it is checked in. Choose whichever fits your process best.

33.png

Azure Release

The release pipeline is responsible for deploying the new package to LCS. The first step is to create a new release in the DevOps main menu.

34.png

Select the option empty job.

35.png

Select the package source.

36.png

Name the pipeline.

37.png

Create environment variables to pass the LCS secrets.

38.png

Define the pipeline stage name.

39.png

Configure the agent.

40.png

  • Install MSAL.PS: create a one-liner task to install the MSAL.PS PowerShell module. I installed it before each step that integrates with LCS to avoid any trouble.

41.png
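The one-liner can be an inline PowerShell script along these lines. This is a sketch; the exact parameters may need adjusting for your build agent.

```powershell
# Install the MSAL.PS module used by the LCS tasks for authentication.
# -Force and -AcceptLicense avoid interactive prompts on a build agent;
# -Scope CurrentUser avoids requiring admin rights on the machine.
Install-Module -Name MSAL.PS -Force -AcceptLicense -Scope CurrentUser
```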

  • Dynamics LCS asset upload: Upload the package to the LCS. Don't forget to create the reference name. It will be used to identify assets in the LCS deploy stage.

42.png

  • Install MSAL.PS: copy the previous task.

  • Dynamics LCS asset deployment: deploy the package to LCS. Note that the LCS asset ID field uses the reference name defined in the upload step.

43.png

  • Release triggers: defines how the release pipeline will be triggered.

44.png

45.png

46.png

Final Considerations

This is an example of the main features needed to implement a deployment pipeline with Azure DevOps, D365FO, and LCS. It's a good starting point to adapt to your needs. Hope you guys enjoy it.

Vinicius Almir Weiss

Software Engineer | Postdoctoral researcher in Bioinformatics with more than 10 years in biotechnology and data analysis. Passionate about family and motorsports.

LinkedIn