DevOps pipeline for Azure Functions

This article walks through building build and release pipelines in Azure DevOps for Azure Functions on .NET Core/.NET 6.

The steps to build the build and release pipelines are as follows:

  1. Select your Repo
  2. Configure a pipeline template, e.g. ASP.NET, ASP.NET Core, or Starter
  3. Under a starter pipeline, there are three steps:
    1. Build a script for compiling and building the code
    2. A script to archive code files in a zip folder
    3. Publish the artifact

The first step is to create a build pipeline.

In Azure DevOps -> create a new pipeline -> Select Azure Repos Git -> Select the Repository -> Select a starter pipeline

In the Review your pipeline step, remove the default steps from the script; we will use the task assistant to build the scripts instead. From the assistant on the right side, select the .NET Core task, fill in the command, path to project, and arguments, and click the Add button.

The arguments are: --output $(Build.BinariesDirectory)/publish_output --configuration Release

You can also make a variable for the argument value and use the variable.

Let’s add the code to archive this project into a zip file.

Select Archive files from the Task assistant. Archive the files under the publish_output folder and uncheck the "Prepend root folder name to archive paths" option.

This is going to create an archive under the staging directory.

Let’s go ahead and publish this artifact. Select “publish build artifacts” from the Task assistant

Click “Save and Run”
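Putting the three steps together, the generated azure-pipelines.yml will look roughly like the sketch below. The pool image, project path, and artifact name are assumptions; the values the assistant generates for your project may differ.

```yaml
trigger:
- main

pool:
  vmImage: 'windows-latest'

steps:
# Step 1: build the function project
- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
    projects: '**/*.csproj'
    arguments: '--output $(Build.BinariesDirectory)/publish_output --configuration Release'

# Step 2: archive the build output into a zip file
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.BinariesDirectory)/publish_output'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'

# Step 3: publish the zip as a build artifact
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```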

You can see the published artifact in the build job.
The published folder will look like this:

Power Automate Flow – Flow client error returned with status code “Bad Request”

In Power Automate Flow, you may encounter the error below when trying to save the flow.

Request to XRM API failed with error: 'Message: Flow client error returned with status code "BadRequest" and details "{"error":{"code":"InvalidOpenApiFlow","message":"Flow save failed with code 'InvalidTemplate' and message 'The template validation failed: 'The repetition action(s) 'Apply_to_each' referenced by 'inputs' in action 'Create_a_new_record' are not defined in the template.'.'."}}". Code: 0x80060467 InnerError: Type: System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35]]

This issue may happen when you're copying a value (for example, a lookup) from one flow to another, because the flows may have different collection (list-of-records) names. When you copy a value from one flow to the other, you're also copying the collection name.

For example, in one flow, your collection name is


In the other flow, if the collection name is different and you paste the copied value into the flow, you'll see the "Bad Request" error.

To fix the issue, I deleted the value and re-selected the lookup field from the existing flow using dynamic content. That let me save the flow. This is a simple issue that people can often figure out on their own, but I wanted to put it out there in case someone runs into it.

Azure DevOps Pipeline for CRM – Solution Packager

The first step in implementing the Azure DevOps pipeline process for Dynamics 365 is to extract/decompose the CRM solution into its individual components. This is done through the solution packager provided by Microsoft. You can download and install the solution packager from this link

After you install the solution packager on your disk, you will see the following list.

You can find the SolutionPackager.exe file in the CoreTools folder.

Open a Windows PowerShell prompt and navigate to the SolutionPackager.exe folder – D:\NuGetTools\Tools\CoreTools

Before we go to the next step, create a new empty solution in the CRM instance and export it. The exported solution file is a zip file containing the customizations and solution XML.

We will try to extract this solution using the solution packager and see what it looks like. Go back to the PowerShell prompt and type the following:

.\SolutionPackager.exe /action:Extract /zipfile:"PathToSolutionZipFile" /folder:"OutputFolder"

Below, I specified where my solution zip file is and the output folder where I want to extract its contents.

.\SolutionPackager.exe /action:Extract /zipfile:"D:\Other\DevOps\<SolutionName>.zip" /folder:"D:\Other\DevOps\devopsfiles"

Let’s see what the output folder looks like:

Because our solution was empty, there are no Entities, Plugins, or WebResources folders. Let's add a few components to our Dynamics CRM solution.

Export this solution and run SolutionPackager.exe from PowerShell again; you'll notice the different components being extracted.

After you navigate to the output folder, you’ll see the different folders

Navigating inside the Entities folder, you’ll notice that the entities are split into their own folders and files containing the forms, views, charts etc.

Now you can upload the root folder to Git/GitHub/TFS, or any other repo, and take the next step in the integration of the Azure DevOps pipeline for CRM.
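The reverse operation, repacking the extracted folder back into a solution zip for deployment, uses the same tool with /action:Pack. The paths below are placeholders, not the exact files from this walkthrough:

.\SolutionPackager.exe /action:Pack /zipfile:"D:\Other\DevOps\PackedSolution.zip" /folder:"D:\Other\DevOps\devopsfiles"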

Retain overriddencreatedon and createdby in Dynamics 365 during data import

In a Dynamics 365 data import, imported records are created with today's date and logged against the currently logged-in user, even though the original createdon date and createdby user could be different.

In many cases, when you're importing existing or legacy data into Dynamics 365, you want to retain the original createdon date and the user who created the record. To import your data with the date when it was originally created, you can use overriddencreatedon to override the createdon date. You can also create the record against the original createdby user through impersonation. There are a few ways to achieve this in Dynamics 365; for example, you can use the KingswaySoft SSIS package. In this blog, however, you will see how overriddencreatedon and createdby can be handled through an xRM solution.

We have to use the OrganizationServiceProxy class to impersonate the createdby user. The class exposes a property called CallerId that is used to impersonate a user.

Here’s the code:

using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Query;
using System;
using System.Net;
using System.ServiceModel.Description;

namespace DynamicsProxy
{
    class Program
    {
        static void Main(string[] args)
        {
            string crmUserName = "";
            string crmPassword = "xxxxxx";
            string crmOrgHost = "";

            try
            {
                ClientCredentials clientCredentials = new ClientCredentials();
                clientCredentials.UserName.UserName = crmUserName;
                clientCredentials.UserName.Password = crmPassword;

                // For Dynamics 365 Customer Engagement v9.x, set the security protocol to TLS 1.2
                ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

                string crmApiPath = "XRMServices/2011/Organization.svc";
                Uri crmUri = new Uri(crmOrgHost + crmApiPath);

                OrganizationServiceProxy orServiceProxy =
                    new OrganizationServiceProxy(crmUri, null, clientCredentials, null);

                // CallerId: the GUID of the user we want to impersonate (the original createdby user)
                orServiceProxy.CallerId = new Guid("47d57787-433b-eb11-a813-000d3a31c841");

                Entity ent = new Entity("contact");
                ent["lastname"] = "Hameed";
                // overriddencreatedon: the original createdon date to retain
                ent["overriddencreatedon"] = new DateTime(2021, 1, 19);

                // Create the record; it is logged against the impersonated user
                orServiceProxy.Create(ent);
            }
            catch (Exception ex)
            {
                Console.WriteLine("Exception: " + ex.Message);
            }
        }
    }
}

During the code execution, you may run into an error:

{"Principal user (Id=xxxx-xxx-xxx-xxx-000d3a31c841, type=8, roleCount=5, privilegeCount=811, accessMode=0), is missing prvOverrideCreatedOnCreatedBy privilege (Id=d48cf22f-f8c2-xxxx-89eb-49f8281dxxxx)

To resolve this issue, enable the "Override Created on or Created by for Records During Data Import" privilege on the security role that the user is currently assigned to.

Power Automate – Invalid type. Expected Object but got Array

In Power Automate, in the Parse JSON action, if the response doesn't align with the schema, you'll notice different errors, one of them being "Invalid type. Expected Object but got Array".

The Parse JSON schema was expecting an object as the response.

But instead it received an array of objects, which results in the following error.

This error happens when you're passing a list (array) of records from a child workflow, or the list of records from a previous step.

So the Parse JSON step was expecting an object in the format of:





But instead it received an array of objects:







The output value outputs('Get_List_record_of_Person_Engagement_Event')?['body/value'] returns an array/list of objects.

To solve this problem, you need to return just the first object from the result. Use the following expression (in the above Response body value) to return the first item in the array:

first(outputs('Get_List_record_of_Person_Engagement_Event')?['body/value'])

This will return the following object





This is now in line with the format that the Parse JSON step was expecting.
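Schematically (the field names below are placeholders, not the actual response), first() converts the array response into the single object the schema expects:

Received: [ { "name": "record 1" }, { "name": "record 2" } ]
After first(): { "name": "record 1" }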

An undeclared property ‘new_bulkadjustment’ which only has property annotations in the payload but no property value was found in the payload.

Recently, I was running into an issue with Xrm.WebApi.createRecord when trying to set a lookup attribute on the entity object.

The OData naming convention for lookup attributes, and the documentation around it, asks you to use the schema name of the field with @odata.bind:

entity["new_BulkAdjustment@odata.bind"] = "/new_revcontacts(00000000-0000-0000-a000-000d0a000a00)";

This did not resolve the issue. I tried a few other combinations using the REST builder, but unfortunately nothing worked.

The way to set the lookup is:



new_BulkAdjustment is a lookup field on the new_revcontact entity.

Azure Virtual Network (AVN) and Subnets

Azure Virtual Network is a home for virtual machines

AVN consists of an IP address range

Subnet is a logical separation of resources that you can have in a virtual network.

Each subnet has an address range that is a subset of the address range of the AVN.

You can spin up VMs in each of the subnets

Each VM has an IP address which is part of the subnet address range
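This subset relationship can be checked programmatically. Here is a minimal sketch using Python's ipaddress module; all address ranges below are assumed examples, not values from an actual deployment:

```python
import ipaddress

vnet = ipaddress.ip_network("10.0.0.0/16")       # the AVN address range
subnet_a = ipaddress.ip_network("10.0.1.0/24")   # e.g. a web-tier subnet
subnet_b = ipaddress.ip_network("10.0.2.0/24")   # e.g. a data-tier subnet

# each subnet's range must fall inside the virtual network's range
assert subnet_a.subnet_of(vnet) and subnet_b.subnet_of(vnet)

# a VM's private IP address is drawn from its subnet's range
vm_ip = ipaddress.ip_address("10.0.1.4")
print(vm_ip in subnet_a)  # True
```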

The VM IP addresses are private IP addresses, used for internal communication between the VMs. A VM can also be assigned a public IP address, which lets users access the application hosted on the VM from the internet.

The other subnet does not have a public IP address. Its VM could be used to host a database that should not be exposed to the public internet.

Network Security Groups

  • are used to control the flow of traffic into and out of the virtual machine
  • is a separate resource defined in the Azure platform
  • gets attached to the network interface that is attached to the virtual machine
  • can be attached to the network interface card of a single VM (in this case, it impacts just that VM) or linked to the whole subnet (in this case, it affects all the VMs on that subnet)
  • an NSG consists of inbound and outbound security rules
  • when an NSG is created, some default inbound and outbound rules are created automatically; these cannot be removed or changed
  • by default, the virtual machine does not allow traffic from the outside world, so you need to add an inbound rule to open port 80 (HTTP listener)
  • you have to set up rules accordingly to allow traffic on port 80: the source is the IP address of your computer or the internet (for all users); the destination is your virtual machine/virtual network

If you want to connect to the VM using RDP, add an inbound rule for RDP on port 3389.

Source of the Inbound traffic rules

Denying inbound traffic from a certain source is controlled by rule priority. Example:

A request is sent and goes through the rules in priority order; if a match is found, that rule is applied.
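Conceptually, that evaluation can be sketched as follows: rules are sorted by priority (lower number wins) and the first match decides. This is a simplified model for illustration, not the actual Azure implementation, and the rule fields and values are assumptions:

```python
# Simplified model of NSG rule evaluation:
# lower priority number = evaluated first, first matching rule decides.
def evaluate(rules, port, source):
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if rule["port"] == port and rule["source"] in (source, "Any"):
            return rule["action"]
    return "Deny"  # NSGs end with a default DenyAllInbound rule

rules = [
    {"priority": 100, "port": 80,   "source": "Any",      "action": "Allow"},  # HTTP open to all
    {"priority": 110, "port": 3389, "source": "Internal", "action": "Allow"},  # RDP from inside only
    {"priority": 200, "port": 3389, "source": "Any",      "action": "Deny"},   # deny RDP from internet
]

print(evaluate(rules, 80, "Internet"))    # Allow (matches priority 100)
print(evaluate(rules, 3389, "Internet"))  # Deny (110 doesn't match, 200 does)
```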

The destination depends on the network interface (VM specific) or the subnet (group of VMs).

If the network security group is attached to the subnet, specify the IP addresses of the virtual machines that should allow the incoming traffic.