Logic Apps PG 1:1 meeting

I had the pleasure of a 30-minute 1:1 meeting with the Azure Logic Apps PG, and in particular with Divya Swarnkar.

Main points of this exchange:

-Some of my financial clients, whose data must reside only within their own sites, are not encouraged to migrate their BizTalk flows to Azure. With the new Logic Apps on the Azure Functions runtime, a workflow can be hosted in a Docker container, so these clients can use their own infrastructure. But what about using enterprise connectors?

=>To run only on the local runtime, only "Built-in connectors" must be used; Azure connectors run only on Azure, so connections and data cross the Azure network to reach partners. The good news is that the list of built-in connectors can grow depending on usage. Divya asked what kinds of connectors are most used, and my answer was: at least the connectors provided by BizTalk: SAP, File, FTP/SFTP, MQSeries.

So far, the available "Built-in" connectors are:

-Coming from the BizTalk world, I'm comparing what BizTalk offered in order to plan how existing solutions can be upgraded to Logic Apps. What can be the alternative to BAM for business monitoring?

=>For business monitoring, Azure Log Analytics data can be used. Within the workflow we can define tracked properties on actions (equivalent to BizTalk promoted properties); they will be injected into Log Analytics and retrieved by queries on that data.
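As an illustration, tracked properties are declared directly on an action in the workflow definition. The action name, URI and property paths below are made-up examples, not something from the PG discussion:

```json
{
  "Send_to_partner": {
    "type": "Http",
    "inputs": {
      "method": "POST",
      "uri": "https://partner.example.com/orders",
      "body": "@triggerBody()"
    },
    "runAfter": {},
    "trackedProperties": {
      "OrderId": "@action()['inputs']['body']['orderId']",
      "Partner": "Contoso"
    }
  }
}
```

Once diagnostics are routed to Log Analytics, these values appear on the action's tracking records and can be filtered in queries.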

Finally, I want to thank the LA PG team for this great initiative and encourage them to keep running this kind of short 1:1 exchange.

Self-provision a BizTalk developer machine, Part II: Azure DevOps

In this second part (Part I), we will see how to create an Azure DevOps pipeline that provisions a new VM from an ARM template.

1-Create ARM template

To start, let's create a template of a BizTalk 2016 VM. From the lab created previously, click "Add" on the "Overview" blade:


and search for the BizTalk Server 2016 Developer image (SQL Server + Visual Studio pre-installed):



select the template content and save it locally as a JSON file: Biztalk2016DeveloperAzureDevTestLabTemplate.json

2-Add the template to Git

This template must be added to a Git repository so it can be referenced from the Azure DevOps pipeline:


3-DevOps pipeline

Create a new pipeline:

select the Git repository where the JSON template was added:


the pipeline will run on a hosted agent:


add the "Create Azure DevTest Labs VM" task:


Configure the task:

-Select your Azure subscription

-The lab created in Part I

-The ARM template from Git

-The template parameters


for the template parameters, I created 3 pipeline parameters to prompt the user for the VM name, user name and password:


the values of these parameters will replace the template's parameters.
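For reference, the same setup can be sketched in YAML form. This is only an assumed sketch: the task comes from the Azure DevTest Labs Tasks extension, and its exact name, version and input names should be verified in the task assistant before use:

```yaml
# Runtime parameters: the user is prompted for these when queuing the run.
parameters:
  - name: vmName
    type: string
  - name: userName
    type: string
  - name: password
    type: string

pool:
  vmImage: 'windows-latest'   # hosted agent

steps:
  # Task and input names below are assumptions based on the DevTest Labs extension.
  - task: AzureDevTestLabsCreateVM@3
    inputs:
      azureSubscription: 'my-service-connection'   # assumed service connection name
      LabId: '$(LabId)'                            # resource id of the lab from Part I
      TemplateFile: 'Biztalk2016DeveloperAzureDevTestLabTemplate.json'
      ParameterOverrides: >-
        -newVMName '${{ parameters.vmName }}'
        -userName '${{ parameters.userName }}'
        -password '${{ parameters.password }}'
```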

Now, all you need to do is share the pipeline link with each new joiner so they can create a new developer environment:

Hope this blog post helps you automate dev VM creation.

Self-provision a BizTalk developer machine, Part I: Azure DevTest Labs

The aim of this post is to share a quick solution for creating a developer machine for a new joiner on a team. Previously, we had to share a VHD image from one of the team's developers, so the new joiner spent time cleaning up that VM, and we sometimes forgot settings such as the source control credentials. By using Azure DevTest Labs and an Azure DevOps pipeline, a new joiner can create their own VM with all the artifacts necessary to start developing, and unused VMs can be deleted when a developer leaves the team.

First of all, we need to create an Azure DevTest Labs resource:


Click the Create button and set the lab's information:


NB: keep auto-shutdown enabled for cost optimisation. This option can be updated later.

that's all for this first part. In Part II, we will see how to use DevOps pipelines to create a new VM within the created lab.

BizTalk Server 2020: View the audit log of common management operations with Power BI

BizTalk Server 2020 has just been released with some useful features. Among them is an audit log to track BizTalk artifact changes. In this blog post I'll try to explain how to ingest the API data and display it in a human-readable way with MS Power BI.

The first thing is to install BizTalk 2020. In my case I chose to create a VM on Azure from a built-in image in the Azure Marketplace:
but a few days after my VM creation, Microsoft published an image for BTS 2020:


the installation guide is the same as for previous versions of BizTalk. You can refer to the installation procedure in this link.

After installing and configuring your BizTalk environment, we need to enable audit of management operations in the BizTalk console:

Configure BizTalk Server audit management operations


the new API operation that lists the audited operations is reached at the following URL:


on first access I got an access denied error, so I enabled anonymous authentication and gave my user privileges on the IIS site folder.

as a result of the management operations API call, a JSON document is returned:


to display the API result in a human-readable format, I installed Power BI Desktop.

In the Power BI client (no need to create an account), click on:

1-Get Data



3-Copy/paste the audit log API URL and click OK:


4-a new window opens; click on the List value:


5-convert this list of records into a table:


in the Create Table pop-up, leave the default values:


6-custom columns: click on the button at the top right of Column1 and choose the following columns:

-UserPrincipal: User who performed the operation.

-ArtifactType: Type of artifact on which operation was performed, for example SendPort, ReceivePort, Application etc.

-ArtifactName: User configured name of the artifact, for example, FTP send port.

-OperationName: Action performed on the artifact, for example Create.

The following table summarizes the possible operation names on the different artifacts:

Artifact type           Operation name
Ports                   Create/Update/Delete
Service Instances       Suspend/Resume/Terminate
Application resources   Add/Update/Remove
Binding file            Import

-Payload: Contains information about what is changed in JSON structure, for example, {"Description":"New description"}.

-CreatedDate: Timestamp when the operation was performed.


as a result, we get the list of changes by artifact type and the user who made each change:


for more details on which properties changed, you can click the Record link in the Payload column.

This is very useful information for a large BizTalk group with a big team of administrators.

That's all for this lightweight version. For a V2 we could parse the Payload column based on the artifact type, because its content differs per artifact.
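As a sketch of what that V2 shaping could look like outside Power BI, here is a small Python example. The sample entries are made up but follow the documented columns:

```python
import json

# Made-up sample entries following the documented audit columns
# (UserPrincipal, ArtifactType, ArtifactName, OperationName, Payload, CreatedDate).
sample = [
    {
        "UserPrincipal": "CONTOSO\\admin1",
        "ArtifactType": "SendPort",
        "ArtifactName": "FTP send port",
        "OperationName": "Update",
        "Payload": '{"Description": "New description"}',
        "CreatedDate": "2020-02-01T10:15:00Z",
    }
]

def flatten(entries):
    """One row per audit entry, with the Payload JSON parsed into a dict
    so the individual changed properties become visible."""
    rows = []
    for e in entries:
        row = {k: e[k] for k in ("UserPrincipal", "ArtifactType",
                                 "ArtifactName", "OperationName", "CreatedDate")}
        row["Changes"] = json.loads(e["Payload"]) if e.get("Payload") else {}
        rows.append(row)
    return rows

rows = flatten(sample)
print(rows[0]["Changes"])  # -> {'Description': 'New description'}
```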

Azure Container Registry: import & run a Docker image

First of all, we need to create a Container Registry resource from the Azure Portal.

once the deployment has succeeded:


from the Cloud Shell command line, we will import the nginx image from the public Docker registry into the newly created private registry:

1-get credentials:

az acr credential show -n nameofthecontainerregistry


2-log in to the container registry:

docker login nameofthecontainer.azurecr.io


3-import the nginx image:

az acr import --name nameoftheregistry --source docker.io/library/nginx:latest --image nginx:latest

4-run the nginx image:

4.1-Deploy the image to a web app


4.2-open the created site in the browser to view the running container



Visual Studio / BizTalk pipeline template error: unable to find transmitpipeline.vstemplate, please repair the product to fix this issue

After a fresh BizTalk 2016 installation, I faced the following problem when trying to create a new send pipeline:


to address the problem, Sandro Pereira's blog suggests repairing the BizTalk Server installation. It's a good solution, but not the only one. My workaround is to copy the missing templates without a repair install. But how?

1-Copy the missing templates from the BizTalk install ISO:

BizTalk Server\MSI\Program Files\Developer Tools\BizTalk\Pipeline Files

to the Visual Studio project templates folder:

Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\ProjectTemplates\BizTalk\Pipeline Files

2-reinstall the Visual Studio templates via cmd:

devenv.exe /installvstemplates

hope this saves you time investigating the issue!


Git: ignore BizTalk generated ".cs" files

Developing BizTalk Server artifacts can become a headache when filtering real file changes from auto-generated ones after building from Visual Studio. Building a BizTalk project generates the 4 following file types:

*.odx.cs : orchestrations

*.btm.cs : maps

*.xsd.cs : schemas

*.btp.cs : pipelines

to configure Git to ignore these files and keep them out of the Changes tab, we have to do some extra work.

1-Update the principal branch locally (develop or master, depending on your branching flow)

2-update the ".gitignore" file to add the previous 4 file extensions
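The resulting ".gitignore" additions are just the four patterns:

```
*.odx.cs
*.btm.cs
*.xsd.cs
*.btp.cs
```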

3-delete the already existing generated files:

-open a cmd window at the root of the Git repository and execute these commands:

del /S *.odx.cs
del /S *.btm.cs
del /S *.xsd.cs
del /S *.btp.cs

commit the changes from Visual Studio.

4-update the Git cache to stop tracking future generated ".cs" files, from the same cmd window:

git rm -r --cached *.odx.cs

git rm -r --cached *.btm.cs

git rm -r --cached *.xsd.cs

git rm -r --cached *.btp.cs


Hybrid Integration: Push data from BizTalk to Azure MySQL using Azure Functions

This article presents a solution to integrate with a partner's Azure MySQL DB and push data from the client's on-premises BizTalk 2013 (not R2) application. The big question was which solution to implement; the choices were:

1-No built-in adapter for MySQL within BizTalk, so we would need to buy one

2-Create a .NET helper to connect with. I'm not a fan of customisation like this, because it may become a source of limitations regarding future maintenance.

3-Use a cloud built-in solution: Azure Functions

we implemented the 3rd solution for several reasons:

-My client already has Azure subscriptions (so no extra cost)

-The maintainability and high availability of the solution.

-Solution reuse, as the Azure Function can serve as an elegant REST API callable from other flows

-Adopt a hybrid solution to prepare a migration of the flows to Azure

-Use a serverless solution to run code on-demand without having to explicitly provision or manage infrastructure


How to achieve this?

Create the Function

First, we have to create a resource group; please refer to the following link:


Steps to create an Azure Function service:


Full steps to achieve that:


for this sample, it will be an HTTP-triggered function with C# as the language.

The function content (the run.csx file) must look like this:

#r "System.Data"
#r "System.Net"
#r "System.Web"
#r "MySql.Data"
#r "Newtonsoft.Json"

using System.Net;
using System.Text;
using System.Web;
using System.Data;
using MySql.Data;
using MySql.Data.MySqlClient;
using System.Configuration;
using Newtonsoft.Json;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");
    string data_request = await req.Content.ReadAsStringAsync();
    string errorCode = "";
    string errorMessage = string.Empty;
    Response response = new Response();
    var connStr = System.Configuration.ConfigurationManager
        .ConnectionStrings["MySQLAzureDb"].ConnectionString;
    using (MySqlConnection con = new MySqlConnection(connStr))
    {
        await con.OpenAsync();
        log.Info("Init sql command.");
        using (MySqlCommand cmd = new MySqlCommand("sp_UpsertData", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add(new MySqlParameter("jDocument", data_request));
            log.Info("ExecuteReader.");
            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                {
                    errorCode = reader.GetString(0);
                    errorMessage = reader.GetString(1);
                }
                else
                {
                    errorCode = "1111";
                    errorMessage = "no result returned by the stored procedure";
                }
            }
        }
    }
    log.Info("errorCode: " + errorCode);
    log.Info("errorMessage: " + errorMessage);
    response.ErrorCode = errorCode;
    response.ErrorMessage = errorMessage;
    return new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StringContent(JsonConvert.SerializeObject(response, Formatting.Indented),
            Encoding.UTF8, "application/json")
    };
}

public class Response
{
    public string ErrorCode { get; set; }
    public string ErrorMessage { get; set; }
}
and the function.json file:

{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [ "get", "post" ]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ],
  "disabled": false
}

The "frameworks" section goes in the project.json file, which pulls in the MySQL driver:

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "MySql.Data": "7.0.7-m61"
      }
    }
  }
}
these files are located here:
the called stored procedure takes a parameter of type JSON.
Note: this article does not focus on how to generate JSON from BizTalk.

How to call the function from BizTalk

Now we need to configure a send port to call the Azure function.
In BizTalk, add a new WCF-WebHttp send port with the following configuration:
URI: the Azure function URL
HTTP Method: POST
Outbound HTTP Headers:
x-functions-key: your function key
to get the function key, click the function name > Manage and show the hidden key:
Send pipeline: a JSON send pipeline
Receive pipeline: a receive pipeline with an XML Disassembler pipeline component in the Disassemble stage that accepts the JSON response schema (the Response class defined within the function).
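Before wiring the send port, the function can be smoke-tested outside BizTalk. The sketch below only builds the HTTP request; the URL and key are placeholders, and it shows where the x-functions-key header goes:

```python
import json
import urllib.request

# Placeholders -- substitute your real function URL and key.
FUNCTION_URL = "https://myfuncapp.azurewebsites.net/api/UpsertData"
FUNCTION_KEY = "<your-function-key>"

payload = json.dumps({"orderId": 42}).encode("utf-8")

req = urllib.request.Request(
    FUNCTION_URL,
    data=payload,
    method="POST",
    headers={
        "Content-Type": "application/json",
        "x-functions-key": FUNCTION_KEY,  # same header the send port must set
    },
)

# urllib.request.urlopen(req) would actually send it; here we only
# inspect what BizTalk's Outbound HTTP Headers have to reproduce.
print(req.get_method(), req.get_header("X-functions-key"))
```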

Securing the connection between the MySQL server and the Azure Function

In my case, the MySQL partner secures access with an IP whitelist, so the administrator asked for the function's IP addresses to add to it.
To determine the Azure Function IP addresses, follow the steps below:
1-Sign in to the Azure Resource Explorer.
2-Select subscriptions > {your subscription} > providers > Microsoft.Web > sites.
3-In the JSON panel, find the site with an id property that ends in the name of your function app.
See outboundIpAddresses and possibleOutboundIpAddresses.
The set of outboundIpAddresses is currently available to the function app.
The set of possibleOutboundIpAddresses includes IP addresses that will be available only if the function app scales to other pricing tiers.
These steps are described in the official MS Docs.
possibleOutboundIpAddresses contains the list of outboundIpAddresses, so allow access to the whole possibleOutboundIpAddresses list:
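Since both properties are flat comma-separated strings in the site JSON, turning them into whitelist entries is a one-liner. The sample document below is made up for illustration:

```python
import json

# Made-up sample of the site document shown in Azure Resource Explorer;
# the real JSON contains many more properties.
site = json.loads("""
{
  "id": "/subscriptions/xxx/providers/Microsoft.Web/sites/myfuncapp",
  "properties": {
    "outboundIpAddresses": "40.68.10.1,40.68.10.2",
    "possibleOutboundIpAddresses": "40.68.10.1,40.68.10.2,13.81.5.7"
  }
}
""")

def whitelist_entries(site_doc):
    """All IPs to send to the partner: possibleOutboundIpAddresses is a
    superset of outboundIpAddresses, so the superset alone is enough."""
    raw = site_doc["properties"]["possibleOutboundIpAddresses"]
    return [ip.strip() for ip in raw.split(",")]

print(whitelist_entries(site))  # -> ['40.68.10.1', '40.68.10.2', '13.81.5.7']
```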
for more details, you can leave a comment and I'll try to provide the needed info asap.

JSON null response from a REST API in BizTalk Server

Calling a REST API from BizTalk is a straightforward operation with the WCF-WebHttp adapter, but sometimes the API returns an unexpected response, typically from an unmaintained application. To deal with this scenario we have 2 options:

-Create a custom pipeline component

-Custom behavior with message inspector

in my case, the API returns a message body containing the string "null" with a 200 HTTP status. I chose to create a custom behavior to handle the message as early as possible, at the adapter level.

My custom behavior project contains:

-a message inspector implementing the IClientMessageInspector interface

the AfterReceiveReply method is fired when the response is returned from the API:

public void AfterReceiveReply(ref Message reply, object correlationState)
{
    try
    {
        var replyMsg = reply.ToString();
        var mystr = replyMsg.Substring(replyMsg.IndexOf('>') + 1, 8);
        byte[] data = Convert.FromBase64String(mystr);
        string decodedString = Encoding.UTF8.GetString(data);

        if (decodedString == "null")
        {
            string response = "";
            Message newReply = Message.CreateMessage(MessageVersion.None, null, new TextBodyWriter(response));
            // very important - forces WCF to serialize the message body contents as
            // "Raw" which effectively means no serialization
            newReply.Properties[WebBodyFormatMessageProperty.Name] = new WebBodyFormatMessageProperty(WebContentFormat.Raw);
            var responseProperty = new HttpResponseMessageProperty();
            responseProperty.Headers["Content-Type"] = "text/plain";
            newReply.Properties[HttpResponseMessageProperty.Name] = responseProperty;

            reply = newReply;
        }
    }
    catch { }
}

the response is received in Base64 format: <Binary>xxxxxxxxx</Binary>

after converting the content, if the message is "null", I create a new empty message with a custom body writer. But why an empty one? To take advantage of the JSON decoder's new "AddMessageBodyForEmptyMessage" property and produce an empty JSON message "{}"

in this way, I can avoid receiving a null reference exception.
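The detection itself is just a Base64 decode and compare, sketched here in Python for illustration (the C# inspector above does the same on the first 8 characters of the <Binary> content):

```python
import base64

def is_null_body(binary_text: str) -> bool:
    """Mimic the inspector's check: take the first 8 Base64 characters of
    the <Binary> element content and see if they decode to "null"."""
    chunk = binary_text[:8]
    return base64.b64decode(chunk).decode("utf-8") == "null"

# "null" encodes to the 8-character Base64 string bnVsbA==
print(is_null_body(base64.b64encode(b"null").decode("ascii")))  # -> True
```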

-an endpoint behavior implementing the IEndpointBehavior interface

After building and GACing the DLL, we have to configure this new extension:

1-register the extension in both C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Config\machine.config and C:\Windows\Microsoft.NET\Framework\v4.0.30319\Config\machine.config.


2-JSON receive pipeline configuration:


3-add the extension in the send port behavior section



You can find the source code here.

if you liked this tip, please share.

Issue with Schedule adapter in BizTalk 2013 R2 Application deployment

Naga Learnings

Issue: While working on a BTS 2010 to 2013 R2 migration, I was getting the below error while deploying BizTalk builds created using BTDF:


error : Could not validate TransportTypeData, Address or Public Address properties for Receive Location ‘******’. Could not load file or assembly ‘Microsoft.BizTalk.Scheduler, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35’ or one of its dependencies. The system cannot find the file specified.

I spent a good amount of time checking all the project files, binding info and BTDF settings to identify the dependency on a scheduler DLL with an empty version.

However, the actual issue is that "Microsoft.BizTalk.Scheduler.dll" is not installed to the GAC by default with the BTS 2013 R2 installation. Because of this, BizTalk is unable to resolve the reference to this DLL while trying to enable Schedule adapter locations, which internally refer to "Microsoft.BizTalk.Scheduler.dll".


Install "Microsoft.BizTalk.Scheduler.dll" to the GAC from "C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2" using gacutil. Also, make sure to follow the same step in the other BTS environments.

View the original article
