FTP Adapter - "No such host is known. Please check the configuration" and "The URI scheme is not valid"

Earlier today I hit the following errors (found in the event log) when trying out a newly deployed FTP send port:


The adapter failed to transmit message going to send port "DCSendLimaPurchaseOrder_FTP" with URL "ftp://<servername>:21/uat/GO_PO_XML/inbound/pending/File_To_Lima_%datetime_bts2000%.xml". It will be retransmitted after the retry interval specified for this Send Port. Details:"DNS Lookup for the server "ftp://<servername>:21" failed with the following error message: No such host is known

 Please check the configuration. ".




The adapter failed to transmit message going to send port "DCSendLimaPurchaseOrder_FTP" with URL "<servername>:21/uat/GO_PO_XML/inbound/pending/File_To_Lima_%datetime_bts2000%.xml". It will be retransmitted after the retry interval specified for this Send Port. Details:"Invalid URI: The URI scheme is not valid.".


The FTP adapter is very particular about the format of its <uri> and <serverAddress> elements. After much head-scratching today I finally came up with a set of values for the btdfproj file that would work:





<TransportType Name="FTP" Capabilities="80907" ConfigurationClsid="3979ffed-0067-4cc6-9f5a-859a5db6e9bb" />
<AdapterConfig vt="8">
<Config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<accountName />
<firewallAddress />

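Since the snippet above survives only in fragments, here is a minimal sketch of the shape that resolves both errors; the element names come from the FTP adapter's binding configuration, and the server name, folder and file name are placeholders:

```xml
<!-- uri must carry the ftp:// scheme; serverAddress must be the bare host name -->
<uri>ftp://myftpserver:21/uat/GO_PO_XML/inbound/pending/File_To_Lima_%datetime_bts2000%.xml</uri>
<serverAddress>myftpserver</serverAddress>
<serverPort>21</serverPort>
<targetFolder>uat/GO_PO_XML/inbound/pending</targetFolder>
<targetFileName>File_To_Lima_%datetime_bts2000%.xml</targetFileName>
```

Putting the scheme into serverAddress produces the "No such host is known" DNS error, while leaving the scheme off the uri produces "The URI scheme is not valid".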

ESB Configuration Tool - Null Object Ref

While building a new BizTalk 2013 R2 VM for the first time today, I ran into a problem that I guess may hit a few people.

When I started the ESB Configuration tool I received the message "Object reference not set to an instance of an object". It turns out the tool relies on some of the IIS 6 management tools. Once I'd added these from Server Manager I restarted the config tool and it worked OK.

Received unexpected message type ''

Starting new year's resolution early - two blog posts in one day!


After receiving a response from a WCF solicit response port, my orchestration was raising the following exception:


Inner exception: Received unexpected message type '' does not match expected type 'http://xxx/dynamics/2013/03/services#EXDSalesLineShippingServiceFindResponse'


I updated the BizTalk config to trace the received WCF message. It looked OK: it had the expected namespace and root node.


The problem was, I'd forgotten to set the receive pipeline of the physical two-way send port to XML Receive - it was still set to PassThruReceive. This meant that the message type wasn't being promoted. Because the receive on my orchestration's logical solicit-response port was strongly typed, it threw the exception.
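For context, the XML disassembler promotes BTS.MessageType as the target namespace and root node name joined by '#'. A response shaped like the following (body elided) would therefore promote the type the orchestration expected:

```xml
<!-- root node + namespace determine the promoted BTS.MessageType:
     http://xxx/dynamics/2013/03/services#EXDSalesLineShippingServiceFindResponse -->
<ns0:EXDSalesLineShippingServiceFindResponse
    xmlns:ns0="http://xxx/dynamics/2013/03/services">
  <!-- ... -->
</ns0:EXDSalesLineShippingServiceFindResponse>
```

With PassThruReceive nothing is promoted, so the message type compares as the empty string seen in the exception.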

Multiple Orchestration Instances

Had a funny problem with BizTalk today that I thought worth blogging about in case anyone else makes the same mistake.


I have the following setup:


File Receive Port (map to canonical) --> Orchestration --> Send Port (map to external)


After deploying with the BizTalk Deployment Framework I kicked off an integration test which dropped a single record flat file in the receive location. The send port did its thing and all looked good. Then I noticed many (and I mean many!) messages were being sent.


My first thought was that perhaps the source file wasn't being deleted by the file adapter, but that drew a blank. It turned out the problem was caused by a simple mistake with the orchestration's logical activating receive port: I had set the Binding to Direct!


So, the canonical message entered the orchestration but also left the orchestration (because I have mapping on the send port rather than within the orchestration). This meant that the send (from the orchestration to the MessageBox) triggered a new activating receive and another orchestration instance!

BizTalk Automated Multi Server Deploy


I’ve been working with BizTalk since 2006r1 (the one without the handy WCF adapters). Since that time I’ve tried various community offerings to improve the deployment process. In 2008 I was working for a large retailer. In order to support parallel development for phased but overlapping releases (config management fun!), they had up to four different BizTalk groups, each containing between two and four servers. At the time, we had approximately eighteen different BizTalk applications of varying complexity running on each BizTalk group. As you can imagine, this made management of the binding file time consuming and difficult. I turned to Michael Stephenson’s BizTalk Configuration Management tool on CodePlex (http://configsettingstool.codeplex.com/). This was a great help since it allowed us to maintain all the settings for the bindings in a single SQL Server database.

Last year I joined a BizTalk development team where they were already using the BizTalk Deployment Framework (BTDF) (http://biztalkdeployment.codeplex.com/) along with BizTalk 2010 and TFS 2010. I was really pleased because I’d heard a lot about the BTDF but hadn’t previously found the time to work with it. Working with the deployment framework can be quite daunting initially because it provides so much functionality. Fortunately the existing team members were able to quickly answer my questions.

At the time, the deployment framework was being used in the “conventional” way:

  • click the VS toolbar button for a quick deploy to developer’s local BizTalk environment
  • click another VS toolbar button to create an MSI to be manually copied and deployed to servers

It was at this time "I had a dream!". Wouldn’t it be great if a check-in of source code triggered a build, automated deployment and test of our BizTalk applications? Of course, this is not a very original dream, and Continuous Integration for non-BizTalk applications is very common. Just to make the task more challenging, I wanted to keep the build server free of BizTalk and instead deploy to a remote two-node BizTalk server group (known as our “dev” group). If I could get this to work then it should be easy to adapt so that the BizTalk application could be deployed to other environments higher up the food chain, namely “test”, “pre-prod” and, dare I say it, “prod”.

The required development / configuration can be broken up into three categories:

  • BTDF configuration changes
  • Creation of a custom TFS build template (xaml)
  • Creation of Powershell scripts and server configuration to enable the remote execution of these scripts

BTDF Configuration Changes

When building with TFS, the folder structure is different to that used when simply building from within Visual Studio. With a standard VS build you may find your solution at c:\development\solution name\solution.sln. When TFS builds, it first does a “get latest” of the source code into the build agent folder. The path for this is determined by the Build Agent’s “Working Directory”, which can be accessed via Start\All Programs\Microsoft Team Foundation Server 2010\Team Foundation Administration Console. From the UI select Build Agent Properties. Another dialog will be displayed, enabling you to set the “Working Directory” for any particular build agent. The default is as follows: $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath). The following comparison shows various build paths for a standard VS build and a TFS build where SystemDrive = “C”, BuildAgentId = “3” and BuildDefinitionPath = "\Solution\OvernightBuild”.

Standard VS Build Structure

c:\development\solution name\solution.sln
c:\development\solution name\project name\project.btproj
c:\development\solution name\project name\bin\debug\project.dll

TFS Build Structure

c:\builds\3\tfs project name\overnightbuild\sources\solution name\solution.sln
c:\builds\3\tfs project name\overnightbuild\sources\solution name\project name\project.btproj
c:\builds\3\tfs project name\overnightbuild\binaries\project.dll

The btdfproj file has item groups describing the location of the binaries for the solution’s Schemas, Components, Pipelines, Pipeline Components, Orchestrations and Transforms. We need to configure these item groups to select from the TFS build structure when a TFS build is being used.

An example of the ItemGroup describing the binary locations (pulled from the BizTalkDeploymentFramework.targets file) can be seen below (note this wouldn’t actually be needed here, since we’re using the default name for the orchestrations assembly, but it illustrates the difference between this and the item group used with a TFS build):


We are able to override this by adding the item group into the solution’s btdfproj file as follows:


Note how this overriding group will only be used if the “TeamBuild” variable is true, explained in the TFS section.
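The original snippets did not survive, so the following is only a sketch of what such an override can look like. BTDF item groups carry a LocationPath metadata element; the assembly name and the relative path to the TFS Binaries folder here are assumptions for illustration:

```xml
<!-- Hypothetical override: only active for TFS builds, where all assemblies
     land in a single flat Binaries folder rather than per-project bin folders -->
<ItemGroup Condition="'$(TeamBuild)' == 'True'">
  <Orchestrations Include="MySolution.Orchestrations.dll">
    <LocationPath>..\..\Binaries</LocationPath>
  </Orchestrations>
</ItemGroup>
```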


Virtual Directories

For the BizTalk Deployment Framework to create virtual directories, it expects to find their content in the folder $(RedistDir)\ProjectName\bin. With a TFS build, the components of the virtual directory (e.g. .svc, .dll etc.) won’t be in the correct place.

In order to remedy this, it is necessary to override the “CustomRedist” target, as illustrated by the following example:


Note: the WebServiceBinFolderPath has been declared as a property at the top of the btdf file, as illustrated below:
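The examples referred to above were lost, so here is a hedged sketch of the pattern: a WebServiceBinFolderPath property plus a CustomRedist target that copies the built web service binaries into the folder BTDF expects. The project name, file patterns and folder paths are assumptions; CustomRedist and RedistDir are BTDF's own names:

```xml
<!-- Hypothetical property: where the TFS build dropped the web service binaries -->
<PropertyGroup>
  <WebServiceBinFolderPath Condition="'$(TeamBuild)' == 'True'">..\..\Binaries</WebServiceBinFolderPath>
</PropertyGroup>

<!-- Override BTDF's CustomRedist target to place files where virtual
     directory creation expects them: $(RedistDir)\ProjectName\bin -->
<Target Name="CustomRedist">
  <ItemGroup>
    <WebServiceFiles Include="$(WebServiceBinFolderPath)\*.dll;$(WebServiceBinFolderPath)\*.svc" />
  </ItemGroup>
  <MakeDir Directories="$(RedistDir)\MyService\bin" />
  <Copy SourceFiles="@(WebServiceFiles)" DestinationFolder="$(RedistDir)\MyService\bin" />
</Target>
```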




Solutions are built by TFS using what are known as build definitions.

An example of the VS UI used to manage settings within a build definition can be seen below:


One critical item from the above page that must be set is “Items to build”. When not using BTDF this would typically include the project files for any dependent projects, then the .sln of the BizTalk solution to be built. However, for an automated BTDF build, the .btdfproj file must be appended to the list of projects / solutions that need to be built.

From the previous screen grab, notice the pane labelled “Build process template”. This points to a .xaml Windows Workflow file which I have adapted from a standard build template.

One critical change can be seen in the following screen grab:


Note the condition that has been added to determine whether the particular loop iteration is dealing with a btdfproj. If not, then MSBuild is called in the normal way. However, if it has been asked to build a btdfproj then the following are passed as arguments to MSBuild:

“/p:TeamBuild=True /t:Installer”

The critical parameter is TeamBuild; its use is explained in the section “BTDF Configuration Changes”.
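Putting that together, the MSBuild call the workflow issues for a btdfproj is equivalent to something like the following; the file name is a placeholder:

```shell
msbuild MySolution.Deployment.btdfproj /p:TeamBuild=True /t:Installer
```

/t:Installer is the BTDF target that packages the MSI, and TeamBuild=True switches the item groups described earlier onto the TFS folder structure.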


After the required assemblies have been built (into the binaries folder rather than \bin), the next tasks are:

  • package the assemblies into MSIs
  • copy the MSIs to the target BizTalk servers
  • undeploy any existing installation of the BizTalk application
  • deploy the new installation of the BizTalk application

In order to achieve this, a couple of custom Powershell scripts are called from the build xaml, one for the undeploy and another for the deploy. These can be seen in the following screen grab:


In order for these Powershell scripts to be generic, so that they can be used for any BizTalk application, it has been necessary to create many custom arguments that can be passed into the xaml execution and then on into the Powershell scripts. These custom arguments are defined in a section at the base of the xaml designer window, as can be seen in the following screen grab:


Any argument defined in the xaml designer will become available on the build definition UI, once the xaml has been selected as the required build template. In the following screen grab, notice how the “ApplicationNameInBizTalk” argument has been made available to be configured for the build definition.


The Powershell Scripts

As described in the previous section, toward the end of the xaml build execution Powershell is called to undeploy then deploy the BizTalk application.

The sequence of events required to achieve this are illustrated in the following diagram:


An illustration of the required Powershell scripts, their functions and organisation can be seen below:


Enabling Powershell Remoting

The two main Powershell scripts (UndeployBizTalkApp.ps1 and DeployBizTalkApp.ps1) will be started from the build server. However, they must execute scripts on the target BizTalk servers. In order to support this, the build server must be configured as a Powershell client and granted permission to pass credentials to the BizTalk servers. The initial request to Powershell will be made by the TFS Build Service, and the credentials of this service will be passed to the BizTalk server. It’s important that the account used for the TFS Build Service is a member of the BizTalk Administrators group.

The following steps are required to enable Powershell remoting between the Build Server and BizTalk Servers:

  • On the build server execute the following from a Powershell command line: Enable-WSManCredSSP -Role client -DelegateComputer wsman/server name (repeat for each target BizTalk server)
  • On each target BizTalk server execute the following from a Powershell command line: Enable-WSManCredSSP -Role server
  • On the build server, update Group Policy to allow your credentials to be delegated to the remote server.
    Open gpedit.msc and browse to Computer Configuration > Administrative Templates > System > Credentials Delegation.
    Double-click "Allow delegating fresh credentials with NTLM-only Server Authentication".
    Enable the setting and add the BizTalk servers to the server list as WSMAN/BizTalkServerName. (You can enable all servers by entering WSMAN/*.)
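With CredSSP enabled in both directions, the deploy/undeploy scripts can run commands on the BizTalk nodes via Invoke-Command. A minimal sketch of the idea, assuming a server list and an MSI already copied to each node; all names and paths here are hypothetical, and in the real scripts they arrive as arguments passed down from the build xaml:

```powershell
# Hypothetical server list and credentials
$servers = "BizTalkNode1", "BizTalkNode2"
$cred = Get-Credential

foreach ($server in $servers) {
    # CredSSP lets the remote session re-use our credentials, which the
    # deployment work on the far side needs
    Invoke-Command -ComputerName $server -Authentication CredSSP -Credential $cred -ScriptBlock {
        # install the previously copied MSI silently
        msiexec /i "C:\Drop\MyBizTalkApp.msi" /quiet
    }
}
```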


Problem Resolution

When running the TFS build you may receive errors reporting that the schema .cs files used for unit tests (e.g. modified_OP-Order-v1.xsd.cs) could not be found.

Ensure that the “MSBuild Platform” property within the “Process” tab of the build definition is set to “X86” (rather than the default value of “Auto”).

TFS build complains that an MSI is missing

Ensure you’ve included the .btdfproj in the projects to build on the Process tab.

Problem with assemblies missing from MSI

Check you have added the TeamBuild property to the btdfproj file: <TeamBuild Condition=" '$(TeamBuild)' == '' ">False</TeamBuild>


Thanks to Tom Abrahams (http://www.tfabraham.com/blog/) for his amazing effort in taking the BTDF to where it is today. All those who think Tom deserves a Connected Systems MVP - have a word with Microsoft!

Thanks also to Randy Aldrich Paulo (http://randypaulo.wordpress.com/2012/01/31/automating-silent-install-biztalk-deployment-framework-btdf-using-powershell/) for a great post that provided me with the basis for the development of my Powershell scripts.

Generate SQL Server Test Data

I came across (thanks Paul) a great way to quickly generate test data for SQL Server today.

DECLARE @rows INT
SET @rows = 50

SELECT *
FROM (SELECT TOP (@rows)
        SomeInt = ABS(CHECKSUM(NEWID())) % 50000 + 1,
        SomeLetters2 = CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65) + CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65),
        SomeMoney = CAST(ABS(CHECKSUM(NEWID())) % 10000 / 100.0 AS MONEY),
        SomeDate = CAST(RAND(CHECKSUM(NEWID())) * 3653.0 + 36524.0 AS DATETIME),
        SomeHex12 = RIGHT(NEWID(), 12),
        TrueRandomZerotoOne = RAND(CHECKSUM(NEWID()))
      FROM sys.all_columns ac1 CROSS JOIN sys.all_columns ac2 CROSS JOIN sys.all_columns ac3
     ) AS a
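To keep the generated rows around rather than just returning them, the same derived-table trick can feed a SELECT ... INTO; the target table name here is a placeholder:

```sql
-- Persist @rows random rows into a new table for later test runs
DECLARE @rows INT
SET @rows = 50

SELECT a.*
INTO dbo.TestData  -- hypothetical target table
FROM (SELECT TOP (@rows)
        SomeInt = ABS(CHECKSUM(NEWID())) % 50000 + 1,
        SomeDate = CAST(RAND(CHECKSUM(NEWID())) * 3653.0 + 36524.0 AS DATETIME)
      FROM sys.all_columns ac1 CROSS JOIN sys.all_columns ac2
     ) AS a
```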

XML Namespaces


I've been working with XML for several years now but I've never been entirely sure that I fully understand all there is to know about namespaces. Of course namespaces are central to how document instances are defined by BizTalk, so I certainly had some understanding. However, it's quite easy to just "get by" without a complete understanding because of the IntelliSense and other helpful features provided by tools like Visual Studio and XML Spy. If I'd had only Notepad in which to create the XSDs and XML documents then I'm sure I would have been stuck.
Anyhow, this has been bothering me for a while, so today I decided I was going to learn all there is to know about XML namespaces - and make it stick! This learning involved reading sections from the excellent book "The XML Schema Companion" by Neil Bradley, and otherwise messing around with XSDs and XML documents in Visual Studio.
Key Points
  • It is possible to create an XSD that does not define a namespace. Such schemas can be used to validate unqualified XML documents. Below I have pasted a simple XSD that does not define a namespace:
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified" attributeFormDefault="unqualified">
  <xs:element name="RootNode">
    <xs:annotation><xs:documentation>Comment describing your root element</xs:documentation></xs:annotation>
    <xs:complexType>
      <xs:sequence>
        <xs:element name="Name"/>
        <xs:element name="Age"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
I then used XML spy to create an XML document instance from the above XSD, this is what it came up with:
<?xml version="1.0" encoding="UTF-8"?>
<RootNode xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="D:\Rob\LearnXMLNamespaces\Person.xsd">
  <Name/>
  <Age/>
</RootNode>
Note, the root node has the attribute xmlns:xsi. This simply defines a prefix for a namespace - it does not set the document-level namespace. In effect it says that elements, attributes etc. from the namespace http://www.w3.org/2001/XMLSchema-instance can be used by prefixing them with xsi; where there is no prefix, the element or attribute does not belong to a namespace - it is unqualified. The attribute noNamespaceSchemaLocation is interesting. This gives the URL reference to the XSD that will be used when validating the XML document. It also implies that the XSD does not define a namespace, and that this document is not "namespace qualified" at document level.
I then went back to the XSD and defined a namespace for it by adding a targetNamespace attribute to the root node as follows:
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified" attributeFormDefault="unqualified" targetNamespace="http://samplenamespace">
<xs:element name="RootNode">
I then used XML Spy to generate a new sample XML instance document from this updated XSD - this is what it came up with:
<?xml version="1.0" encoding="UTF-8"?>
<!--Sample XML file generated by XMLSpy v2005 rel. 3 U (http://www.altova.com)-->
<RootNode xmlns="http://samplenamespace" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://samplenamespace D:\Rob\LearnXMLNamespaces\Person.xsd">
A couple of things have changed:
  1. The attribute xmlns has been added to the root node and assigned the value "http://samplenamespace". This specifies that, unless explicitly qualified with a different namespace, all elements and attributes defined within the XML instance belong to the namespace "http://samplenamespace".
  2. The attribute schemaLocation (which happens to belong to the xsi schema-instance namespace) has been added to the root node. The attribute contains a pair of values: the name of the namespace defined by the schema, followed by a space, then a URL reference to the XSD that defines the namespace.
The above just skims the surface of what can be done with namespaces but it does provide the basics. There are many cases where an XML document will contain items defined in multiple namespaces. However, understanding such a document is straightforward once you are aware of the key concepts and implementation described above.

Update: 4th April 2012, had some more fun with XSDs today. Must remember: adding xmlns="x" to an XSD sets the default namespace for that XSD; setting targetNamespace for an XSD gives the namespace that other XSDs and XML can reference with an import.
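As a sketch of that note, one schema can pull in another's target namespace with xs:import; the second namespace and both file names are placeholders:

```xml
<!-- Importing schema: refers to components from http://samplenamespace -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:p="http://samplenamespace"
           targetNamespace="http://othernamespace">
  <!-- schemaLocation is a hint to where the imported XSD lives -->
  <xs:import namespace="http://samplenamespace" schemaLocation="Person.xsd"/>
  <xs:element name="Wrapper">
    <xs:complexType>
      <xs:sequence>
        <!-- reference the imported global element via its prefix -->
        <xs:element ref="p:RootNode"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```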

While creating the new schemas I came across a problem when trying to validate what seemed to be valid XML. Under the root element was <Invoice>; on attempting to validate from within XML Spy I received the message: "Unexpected element 'Invoice' in element 'InvoiceBatch'. Expected: Invoice". It turned out that this was because I had removed elementFormDefault="qualified"; once I added this back in, it validated OK.



Scripting Perfmon



We have a test team in India that I would like to run some load and performance tests. They have access to the required test server and load tools, but have limited experience with tools such as Perfmon.


I have previously only used Perfmon to take a real-time view of performance counters, or used the UI to configure logging to a data file for later evaluation. In this particular case I didn't have access to the test server to manually configure the logging, and I knew that trying to walk someone through this on the phone would be painful. After a little searching on the web I discovered that it's possible to script the Perfmon counter logging - exactly what I needed!


I started by creating a config file containing the counters that I wished to monitor. I called this baseline.cfg. It simply contains one line for each counter to be logged, as follows:


"\Memory\Available MBytes"

"\Memory\Pool Nonpaged Bytes"

"\Memory\Pool Paged Bytes"

"\PhysicalDisk(*)\Current Disk Queue Length"

"\PhysicalDisk(*)\Disk Reads/sec"

"\PhysicalDisk(*)\Disk Reads Bytes/sec"

"\PhysicalDisk(*)\Disk Writes/sec"

"\PhysicalDisk(*)\Disk Writes Bytes/sec"

"\Process(*)\% Processor Time"

"\Process(*)\Private Bytes

"\Process(*)\Virtual Bytes"


I then created a batch file containing the required command line to setup the data capture. I called the file create_data_collector.bat, it contains the following:


logman create counter BASELINE -f bincirc -max 200 -si 00:00:05 --v -o "c:\perfmon\serverbaseline" -cf "c:\perfmon\baseline.cfg"


To find out what the various parameters mean and what other options are available, just run a quick Google search for 'logman'. In my particular case:


  • I am creating a data collector called 'BASELINE', logging to file up to a maximum size of 200 MB
  • The file is a binary circular file, meaning that once it reaches maximum size it will begin overwriting the oldest records within the file
  • The log is to be updated every 5 seconds
  • The file will be output to the folder c:\perfmon\serverbaseline
  • Counters to include in the log are contained within the previously mentioned configuration file: "c:\perfmon\baseline.cfg"


On first running the batch file locally (to test it) I received a message telling me that the data collector couldn't be created because of a permissions issue. I adjusted the permissions on the logging folder "c:\perfmon\serverbaseline" and all was good.


All that remained was a script to start the data collection. For this I created the batch file start_collector.bat, containing the following single line:


logman.exe start baseline


A few minutes after starting the logging I could see the log file beginning to grow. Great, I thought! I went back to the Perfmon UI, clicked the 'view log data' button (looks like a db drum) and selected my new and continually growing log file. It didn't complain, and the current data view was cleared - but where was the historic data that should have been logged? I opened the log file in Notepad - just binary data. I wanted to check that it was logging the data I expected, so I deleted the BASELINE data collector, then re-created it, choosing to log to a text file rather than binary. I then started logging, gave it a few minutes and opened the newly logged data in Notepad. It appeared just as I expected, with column headers for my counters and lots of data in each subsequent row.


Great - I was logging the data but couldn't get it to appear in the Perfmon graph! Then it struck me: in addition to choosing the data file as the data source for the Perfmon graph, I needed to add the counters to the graph. I hit the add button on the toolbar (+ symbol) and was presented with a dialog offering a limited set of counters - just the ones I'd specified in the baseline.cfg file. After selecting the counters, my historic data was displayed in the graph. Such a result called for a celebratory cup of tea!
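If the testers need to share results without the Perfmon UI, the binary output can also be converted after the fact with relog, which ships with Windows. A sketch following the file names above, assuming the collector wrote serverbaseline.blg:

```shell
:: stop the collector before converting its output
logman stop BASELINE

:: convert the binary circular log to CSV for easy inspection
relog "c:\perfmon\serverbaseline.blg" -f csv -o "c:\perfmon\serverbaseline.csv"
```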




I've now also started working on a script to run a SQL Profiler trace. My initial results were disappointing: the trace file was generated but my filters were not applied.


I had used the profiler UI to generate the script: file\export\script trace definition. Unfortunately, it seems that there's a bug in SQL 2005 which means that the script generated by this option will OR each of your filters. For details see http://www.devnewsgroups.net/group/microsoft.public.sqlserver.tools/topic60213.aspx





Slow File Receive


I encountered a problem a few days ago with files not being picked up from a receive location. The configuration was very simple: a two-node group with a file receive location pointing at a share on a Windows 2003 server in a different domain. The volume of files being dropped for collection by BizTalk wasn't high - about ten per minute. The behaviour was: files collected very slowly - say one file every five minutes. If I restarted the host instance then all the remaining files would be collected very quickly, but ten minutes later it would be back to slow mode.


After digging around the blogs, it occurred to me that the problem may be due to BizTalk throttling the receive host. One very helpful blog I used to determine this is at http://blogs.msdn.com/biztalkcpr/default.aspx


It is quite simple to identify whether throttling is being applied. Run Perfmon on the BizTalk box and add counters for "BizTalk:Message Agent - Message publishing throttling state" (publishing to the MsgBox) and "Message delivery throttling state" (delivery to endpoints). Ensure that you select the host instance hosting your receive port.


The following tables describe what throttling, if any, is active:


Message delivery throttling state

A flag indicating whether the system is throttling message delivery (affecting XLANG message processing and outbound transports).

  • 0: Not throttling
  • 1: Throttling due to imbalanced message delivery rate (input rate exceeds output rate)
  • 3: Throttling due to high in-process message count
  • 4: Throttling due to process memory pressure
  • 5: Throttling due to system memory pressure
  • 9: Throttling due to high thread count
  • 10: Throttling due to user override on delivery

Message publishing throttling state

A flag indicating whether the system is throttling message publishing (affecting XLANG message processing and inbound transports).

  • 0: Not throttling
  • 2: Throttling due to imbalanced message publishing rate (input rate exceeds output rate)
  • 4: Throttling due to process memory pressure
  • 5: Throttling due to system memory pressure
  • 6: Throttling due to database growth
  • 8: Throttling due to high session count
  • 9: Throttling due to high thread count
  • 11: Throttling due to user override on publishing


After configuring perfmon I could see that I had a counter value of 6 for the publish state - the receive host was being throttled due to database growth. I needed to take a look at the MsgBox!


After posting the question on StackOverflow it was suggested that I run the MsgBox viewer application against the BizTalk db (thanks Chris). This is a great tool; it can be downloaded at http://blogs.technet.com/jpierauc/pages/msgboxviewer.aspx. After running the MsgBox viewer I was presented with a report which reinforced the case that BizTalk was running low on DB space.


On checking the MsgBox db through SQL Workbench I could see that it was just over 4 GB in size and had 0 MB available! The db was set to autogrow and the disk hosting the db had plenty of available space to grow. The problem was the rate at which the db would grow. This can be checked by right-clicking the db in the SQLWB object explorer and selecting Properties \ Files. Both the data and log were set to grow by 1 MB. This value is set based on the value of the model database at the time that the BizTalk databases are created. One to remember for the future: ensure the growth is set to a more reasonable chunk; 10% would be good.
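The growth setting can be inspected and fixed from T-SQL as well as from the UI. A sketch, assuming the default BizTalkMsgBoxDb database and its default logical file names (check sys.database_files for the real names on your install):

```sql
-- Inspect current size and growth settings for the MessageBox files
USE BizTalkMsgBoxDb;
SELECT name, size, growth, is_percent_growth FROM sys.database_files;

-- Bump the 1 MB growth increment to something more sensible
ALTER DATABASE BizTalkMsgBoxDb
    MODIFY FILE (NAME = BizTalkMsgBoxDb, FILEGROWTH = 100MB);      -- data file
ALTER DATABASE BizTalkMsgBoxDb
    MODIFY FILE (NAME = BizTalkMsgBoxDb_log, FILEGROWTH = 100MB);  -- log file
```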



BizTalk Assembly Viewer


I'm currently preparing to deliver Microsoft Course 2934A - Deploying and Managing Business Process and Integration Solutions Using Microsoft BizTalk Server 2006. 

So far, I've been impressed with the content; it's well structured and I think the topic selection has been good.

One benefit of delivering a course is that you're bound to run through every lab beforehand; there's nothing worse than the trainer getting stuck on a lab! This in turn encourages reading on a topic that you may otherwise skip, thinking that you already know about it or that it's not relevant to your day job. I ran through such a lab today and learned of a feature of BizTalk that I'm sure I will use in future. The feature is the BizTalk Assembly Viewer; here are the instructions for setup:

    • Open a command prompt window and navigate to <BizTalk Server Install Directory>\Developer Tools\
    • Run regsvr32 BtsAsmExt.dll


After registering the dll, load Windows Explorer and double-click My Computer. In the Other section, you'll see a system folder called 'BizTalk Server Assemblies'. Open this up and you'll see all of the BizTalk assemblies which have been deployed to the local GAC. From here, you're able to drill into each assembly to display an entry for each artefact that it contains. At the next level down you can double-click the artefact to view the XML behind each schema, map or orchestration - nice!