SignalR

Recently David Fowler announced the release of the SignalR 1.1.0 Beta. So I decided to do some dabbling and get a prototype application up and running. The solution is pretty simple: it uses a SignalR hub to broadcast the current processor usage percentage, and renders it in a nice visual graph using HighCharts.

[important]The completed solution can be found on GitHub at https://github.com/eoincampbell/signalr-processor-demo [/important]

First things first, we'll need a bare-bones web application into which we can pull the relevant NuGet packages. I started with a basic empty web application running under .NET 4.5. Installing the SignalR & HighCharts packages is a breeze: open up the NuGet Package Manager Console and run the following commands. HighCharts gets installed as a solution-level package, so you'll need to manually copy the relevant JavaScript files into your application's scripts directory.

Install-Package HighCharts
Install-Package Microsoft.AspNet.SignalR

The Hub

SignalR relies on a "Hub" to push data back to all connected clients. I've created a ProcessorDataHub which inherits from the SignalR base Hub to manage this process. It contains a constructor for initializing a static instance of my ProcessorTicker class, and a Start method to start the ticker. The HubName attribute specifies the name by which the hub will be accessible on the JavaScript side.

[HubName("processorTicker")]
public class ProcessorDataHub : Hub
{
    private readonly ProcessorTicker _ticker;

    public ProcessorDataHub() : this(ProcessorTicker.Instance) { }

    public ProcessorDataHub(ProcessorTicker ticker)
    {
        _ticker = ticker;
    }

    public void Start()
    {
        _ticker.Start(Clients);
    }
}
The ProcessorTicker
The heavy lifting is then done by the ProcessorTicker. This is instantiated with a reference to the Clients object, a HubConnectionContext which exposes dynamic members allowing you to push notifications to some or all connected client-side callers. The implementation is fairly simple, using a System.Threading.Timer which reads the current processor level from a performance counter once per second and broadcasts that value to the client side.
Since the Clients.All connection is dynamic, calling "updateCpuUsage" on this object will work at runtime, so long as the relevant client-side wiring up to that expected method has been done correctly.
```csharp
Clients.All.updateCpuUsage(percentage);
```
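The ticker implementation isn't shown in full here (the real one is in the GitHub repo), but a minimal sketch might look like the following. The performance-counter names, the timer wiring, and the use of HubConnectionContext are my assumptions rather than code lifted from the demo source:

```csharp
using System.Diagnostics;
using System.Threading;
using Microsoft.AspNet.SignalR.Hubs;

// Minimal sketch of a ticker exposing the Instance/Start members the hub uses.
public class ProcessorTicker
{
    public static readonly ProcessorTicker Instance = new ProcessorTicker();

    private readonly PerformanceCounter _cpu =
        new PerformanceCounter("Processor", "% Processor Time", "_Total");

    private HubConnectionContext _clients;
    private Timer _timer;

    public void Start(HubConnectionContext clients)
    {
        _clients = clients;

        // Sample the counter once per second and broadcast to all clients.
        if (_timer == null)
            _timer = new Timer(_ => Broadcast(), null, 0, 1000);
    }

    private void Broadcast()
    {
        var percentage = (int)_cpu.NextValue();

        // Dynamic call; resolved against the client-side handler at runtime.
        _clients.All.updateCpuUsage(percentage);
    }
}
```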
The Client Side
One change since the previous version of SignalR is the requirement for the developer to manually & explicitly wire up the dynamically generated JavaScript endpoint where SignalR creates its JavaScript. This can be done on application start by calling the RouteTable.Routes.MapHubs() method.

```csharp
protected void Application_Start(object sender, EventArgs e)
{
    RouteTable.Routes.MapHubs();
}
```

Finally we're ready to consume these published messages on our client page. SignalR requires the following JavaScript includes in the head section of your page.

<script type="text/javascript" src="/Signalr/Scripts/jquery-1.6.4.js"></script>
<script type="text/javascript" src="/Signalr/Scripts/jquery.signalR-1.1.0-beta1.js"></script>
<script type="text/javascript" src="/Signalr/signalr/hubs"></script>

With those in place, we wire up our own custom JavaScript function to access our ProcessorTicker, start the hub on a button click, and begin receiving and processing the published data.

<script type="text/javascript">
    $(function () {
        var ticker = $.connection.processorTicker;

        //HighCharts JS Omitted..

        ticker.client.updateCpuUsage = function (percentage) {
            $("#processorTicker").text("" + percentage + "%");

            var x = (new Date()).getTime(), // current time
                y = percentage,
                series = chart.series[0];

            series.addPoint([x, y], true, true);
        };

        // Start the connection
        $.connection.hub.start(function () {
            //alert('Started');
        });

        // Wire up the buttons
        $("#start").click(function () {
            ticker.server.start();
        });
    });
</script>

The result is that I can fire up a number of separate browser instances and they'll all get the correct values published to them from the hub over a persistent, long-running connection. Obviously this is an extremely powerful system that could be applied to live operations systems where dashboards have traditionally relied on polling the server at some regular interval.

Live Processor Data to Multiple Browsers via SignalR

~Eoin Campbell

Global Windows Azure Bootcamp

This weekend, I attended the Global Windows Azure Deep Dive conference in the National College of Ireland, Dublin. This was a community-organised event, run in conjunction with Microsoft, where local & national IT organisations, educational institutions & .NET communities ran a series of events in parallel in a number of cities around the world. The purpose: deep-dive into the latest Microsoft technology, as well as take part in a massively parallel lab where participants from all over the world would spin up worker roles to contribute to 3D graphics rendering based on depth data from a Kinect. Alas, Deep it was not, and Dive we didn't.

I suppose I can't complain too much. You get what you pay for, and it was a free event, but I'd have serious reservations about attending this type of session again. Don't get me wrong, I don't want to sound ungrateful, and fair dues to the organisers for holding the event, but if you're going to advertise something as a "Deep Dive" or a "Bootcamp", then that carries certain connotations that there will actually be some advanced, hands-on learning.

Instead, the day would barely have qualified as a Level 100 introduction to 2 or 3 Windows Azure technologies, interspersed with sales pitches, student demos of their project work, and filler talks relating to cloud computing in general. Probably most disappointingly, we didn't actually take part in the RenderLab experiment, which kinda torpedoed the "Global" aspect of the day as well. You can see the agenda below. I've highlighted the practical aspects in red.

Time Topic
0930 Welcome - Dr Pramod Pathak, Dean, School of Computing, NCI
0935 Schedule for the day - Vikas Sahni, Lecturer, School of Computing, NCI
0940 How ISIN can help - Dave Feenan, Manager, ISIN
0945 Microsoft’s Best Practice in Data Centre Design - Mark O’Neill, Data Center Evangelist, Microsoft
1000 Virtual Machines – Demo and Lab 1 - Vikas Sahni, Lecturer, School of Computing, NCI
1100 Careers in the Cloud - Dr Horacio Gonzalez-Velez, Head, Cloud Competency Center, School of Computing, NCI
1110 Graduates available today - Robert Ward, Head of Marketing, NCI
1120 Break
1135 Web Sites – Demo and Lab 2 - Vikas Sahni, Lecturer, School of Computing, NCI
1235 Building the Trusted Cloud - Terry Landers, Regional Standards Officer for Western Europe, Microsoft
1300 Lunch
1400 Tools for Cloud Development - Colum Horgan, InverCloud
1410 Windows Azure Mobile Services – Overview and Showcase -  Vikas Sahni, Lecturer, School of Computing, NCI and Students of NCI
1440 Developing PaaS applications – Demo - Michael Bradford, Lecturer, School of Computing, NCI
1530 Break
1545 Windows Azure – The Big Picture - Vikas Sahni, Lecturer, School of Computing, NCI
1645 Q&A

Alas, even the practical aspects of the day were extremely basic, and the kind of thing that most people in the room had done, or could do, in their own spare time.

  • During the Virtual Machines Lab, we spun up a Virtual Machine from the Windows Azure Gallery and remote desktop connected into it.
  • During the Websites Lab, we deployed a WordPress install... unless you were feeling brave enough to do something else. To be fair, I hadn't done a hands-on GitHub deploy of code before, so that was interesting.
  • During the PaaS Application Demo... well, it was supposed to be a Hello World web/worker role deployment, but God love the poor chap, he was out of his depth with Visual Studio, had a few technical hiccups, and it was just a bad demo. The upshot was that we ran out of time before there was an opportunity for any hands-on time in the room.

At 15:30 we left... I didn't have another lecture in me, although at least we'd had the common courtesy to stay that long. Half the room didn't come back after lunch.

The takeaway: I know that a lot of time and effort goes into these events, and particularly when they are free, that time and effort is greatly appreciated. But you need to make sure you get your audience right. If you advertise advanced and deliver basic, people will be disappointed. That was clear from the mass exodus that occurred during the day... I'm kinda curious to know if there was anyone around for the Q&A at all. I'll be sure as heck checking the agenda on these types of events before committing my time to them in future. We aren't currently using Windows Azure in our company, and embarrassingly I had been promoting it internally and had convinced several of my colleagues to give up their Saturday for it.

~Eoin Campbell

Works On My Machine

I ran into a pretty horrible problem with ILMerge this week when attempting to build and deploy a Windows service I'd been working on. While the merged executable & subsequently created MSI worked fine on my own machine, it gave the following rather nasty problem when run on a colleague's machine.

[error]

Could not load type 'System.Runtime.CompilerServices.ExtensionAttribute' from assembly mscorlib

[/error]

It turns out that between .NET 4.0 & .NET 4.5, this attribute was moved from System.Core.dll to mscorlib.dll. While that sounds like a rather nasty breaking change in a framework version that is supposed to be 100% compatible, a [TypeForwardedTo] attribute is supposed to make this difference unobservable.
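For illustration, a type forward is just an assembly-level attribute; .NET 4.5's System.Core.dll redirects old references to ExtensionAttribute through to mscorlib along these lines:

```csharp
using System.Runtime.CompilerServices;

// In .NET 4.5's System.Core.dll: references to ExtensionAttribute compiled
// against this assembly are forwarded at runtime to mscorlib, where the
// type now actually lives.
[assembly: TypeForwardedTo(typeof(ExtensionAttribute))]
```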

Unfortunately, things break when ILMerge is used to merge several assemblies into one. When I merge my .NET 4.0 app with some other assemblies on a machine with .NET 4.5 installed, ILMerge's target platform is set to .NET 4.0. This in turn looks in C:\windows\Microsoft.NET\Framework\v4.0.30319 to find the relevant DLLs. But since .NET 4.5 is an in-place upgrade, these have all been updated with their .NET 4.5 counterparts.

Breaking Changes "Every well intended change has at least one failure mode that nobody thought of"

You need to specify that ILMerge should use the older .NET 4.0 reference assemblies, which are still available in C:\Program Files\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0 (or Program Files (x86) if you're on a 64-bit box). There's more info on the Stack Overflow question where I finally found a solution, and in a linked blog post by Matt Wrock.

http://stackoverflow.com/questions/13748055/could-not-load-type-system-runtime-compilerservices-extensionattribute-from-a

and

http://www.mattwrock.com/post/2012/02/29/What-you-should-know-about-running-ILMerge-on-Net-45-Beta-assemblies-targeting-Net-40.aspx

To override this behavior you need to specify this target platform directory as part of your ILMerge command. e.g.

"C:\Path\To\ILMerge.exe"
    /out:"$(TargetDir)OutputExecutable.exe"
    /target:exe
    /targetplatform:"v4,C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0"
      "$(TargetDir)InputExecutable.exe"
      "$(TargetDir)A.dll"
      "$(TargetDir)B.dll"
I had previously been using the ILMerge.MSBuild.Tasks tool from NuGet, but unfortunately this library doesn't currently support specifying the TargetPlatform. There's an unactioned open item on their issue tracker on Google Code.

~Eoin Campbell

Automatic AssemblyFileVersion Updates

There is support in .NET for automatically incrementing the AssemblyVersion of a project by using the ".*" notation. e.g.
[assembly: AssemblyVersion("0.1.*")]

Unfortunately, the same functionality isn't available for the AssemblyFileVersion. Oftentimes I don't want to bump the AssemblyVersion of an assembly, since it will affect the assembly's strong-name signature, and perhaps the change (a bug fix) isn't significant enough to warrant it. However, I do want to automatically increment the file version, so that in a deployed environment I can right-click the file and establish when it was built & released.

[important]Enter the Update-AssemblyFileVersion.ps1 file.[/important]

This PowerShell script (heavily borrowed from David J Wise's article) runs as a pre-build command on a .NET project. Simply point the command at an assembly info file (or GlobalAssemblyInfo.cs if you're following my suggested versioning tactics) and ta-da, automatically updating AssemblyFileVersions.
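For reference, the line the script rewrites in AssemblyInfo.cs looks like this (the version values below are purely illustrative):

```csharp
// Major.Minor ("0.1") are left alone; the script recalculates the
// Build and Revision components on every build using the formulas below.
[assembly: AssemblyFileVersion("0.1.4910.12345")]
```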

The Build component of the version number is set using the following formula, based on a day-count since the year 2000.

# Build = (201X-2000)*366 + (1==>366)
#
    $build = [int32](((get-date).Year-2000)*366)+(Get-Date).DayOfYear

For example, a build on 1 June 2013 (day 152 of the year) produces (2013-2000)*366 + 152 = 4910. The Revision component of the version number is set using the following formula, based on the seconds elapsed in the current day.

# Revision = (1==>86400)/2 # .net standard
#
    $revision = [int32](((get-date)-(Get-Date).Date).TotalSeconds / 2)

The Major & Minor components are not set to update although they could be. Simply add the following command to your Pre-Build event and you're all set.

%SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe 
    -File "C:\Path\To\Update-AssemblyFileVersion.ps1"  
    -assemblyInfoFilePath "$(SolutionDir)\Project\Properties\AssemblyInfo.cs"

~Eoin Campbell

WCF Header Man - He has a WCF Header... Get it?

Often, you'll need to pass some piece of information on some or all of your WCF service operations. My team had recently exposed some functionality in an old WCF endpoint via a web front end, and we wanted to log some auditing information on each and every service call. Obviously, modifying every single method signature to accept the new parameters would be a pretty significant breaking change for all the consumers, so instead we looked at passing this information as a custom WCF message header. WCF exposes a number of interfaces which you can leverage to inspect & modify messages on the fly, and to add custom behaviors to service endpoints. In the following demo, we'll go through the process of building a custom header for our WCF service & subsequently passing that information from the consumer client back to the service. In our contrived example I'll be attempting to pass 3 pieces of information to the WCF service as part of every message call:

  • The username of the currently logged in web front-end user
  • Since our website is deployed across multiple nodes, the id of the web node
  • The "Special Session Guid" of the current users Session
[important]The completed solution can be found on GitHub at https://github.com/eoincampbell/wcf-custom-headers-demo [/important]
This scenario could apply to a number of other real-world situations. Perhaps the service is secured and called using a single WS-Security account, but we need to log the user whose session the service call originated from on every call. Or perhaps you're exposing your service to the public, and as well as providing a username & password to authenticate, the caller also needs to provide some sort of "Subscription Account Number" in addition to their credentials. Any of these scenarios are candidates for adding a custom header to a WCF service.

The Quick & Dirty Solution

Of course, I could just edit the method signature of each service call to accept this additional information as extra parameters. That might be feasible for a small in-house service or dummy application, but it causes a number of issues in reality.
  • I need to add additional parameters to every service call which isn't a particularly elegant solution
  • If I need to add more information in the future, I must edit every service call signature which at worst is a breaking change for every client and at best, a significant amount of work & re-factoring.
  • Depending on the scenario, I'm potentially intermingling Business Logic with Authentication/Authorization or some other sort of Service Wide Validation logic which is going to increase the complexity of any future re-factoring.

Operation Context & Custom Message Headers

A better solution would be to "tag" each service call with some sort of header information. In this way, we could piggy-back our additional data along without interfering with the individual method signatures of each service call. Thankfully WCF includes built-in support for MessageHeaders. The services OperationContext includes Incoming & Outgoing Message Header collections.

//Service Contract
[ServiceContract]
public interface ISimpleCustomHeaderService
{
    [OperationContract]
    string DoWork();
}
//Client Code
using (var client = new SimpleCustomHeaderServiceClient())
using (var scope = new OperationContextScope(client.InnerChannel))
{
    var webUser = new MessageHeader<string>("joe.bloggs");
    var webUserHeader = webUser.GetUntypedHeader("web-user", "ns");
    OperationContext.Current.OutgoingMessageHeaders.Add(webUserHeader);
    client.DoWork();
}

For now, I've created a very simple service contract which has a single method on it called DoWork(). Adding a custom header to this service call is relatively trivial. First we instantiate a new instance of our WCF client proxy. We also need to create an OperationContextScope using the WCF client channel. Since the OperationContext is accessed via the static Current property, instantiating this scoping object stores the current context, and the OperationContext of the current client's IContextChannel becomes the one returned by the Current property. This allows us to modify the OutgoingMessageHeaders collection of the client's channel. Once disposed, the state of the original Current OperationContext is restored. Message headers are passed as untyped data as they travel on the wire. In the example above I've created a strongly typed .NET object, which is then converted to an untyped header, keyed by name & namespace, for transmission in the OutgoingMessageHeaders collection. If we observe the data that travels across the wire using Fiddler, we can see our custom header data has been appended to the SOAP header section of the message.

Fiddler WCF Headers

Finally, these header values can be retrieved from the IncomingMessageHeaders collection as part of the service call processing. Since we're already in the scope of the current OperationContext, we can just directly access that context's header collection to read our headers. I've added a simple generic helper method which tests whether the header can be found and, if so, returns it.

public class SimpleCustomHeaderService : ISimpleCustomHeaderService
{
    public string DoWork()
    {
        //Do Work
        //...
        //Capture Headers
        var userName = GetHeader<string>("web-user", "ns");
        var webNodeId = GetHeader<int>("web-node-id", "ns");
        var webSessionId = GetHeader<Guid>("web-session-id", "ns");
        Debug.WriteLine("User: {0} / Node: {1} / Session: {2}", userName, webNodeId, webSessionId);
        var s = string.Format("HeaderInfo: {0}, {1}, {2}",
            userName,
            webNodeId,
            webSessionId);
        return s;
    }

    private static T GetHeader<T>(string name, string ns)
    {
        return OperationContext.Current.IncomingMessageHeaders.FindHeader(name, ns) > -1
            ? OperationContext.Current.IncomingMessageHeaders.GetHeader<T>(name, ns)
            : default(T);
    }
}

Leveraging Client & Dispatch Message Inspectors

The above solution comes with some pros and cons. On the plus side, headers can be added in an ad-hoc manner with little friction to existing code. On the downside, it's not really ideal from a code maintenance/organisation point of view: header data is relatively unstructured and disparate, and we end up with a lot of touch-points. Adding headers requires creating an OperationContextScope in close proximity to every service call; similarly, accessing the resultant header values must be done in the service methods. Imagine our WCF service had 100 service methods, and all we wanted to do was send a single additional header to be logged on the server. That results in hundreds of lines of additional code.

A better solution would be to use the built-in message inspector interfaces in WCF. Message inspectors provide you with a way to plug directly into the WCF communications pipeline on either the client or server side of the communication channel. The IDispatchMessageInspector allows us to affect messages either just after the request arrives on the server (AfterReceiveRequest) or just before the response leaves the server (BeforeSendReply). The IClientMessageInspector allows us to affect messages either just before the request leaves the client (BeforeSendRequest) or just after the response is received by the client (AfterReceiveReply).

public class CustomInspectorBehavior : IDispatchMessageInspector, IClientMessageInspector
{
    #region IDispatchMessageInspector
    public object AfterReceiveRequest(ref Message request, IClientChannel channel, InstanceContext instanceContext)
    { ... }
    #endregion
    #region IClientMessageInspector
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    { ... }
    #endregion
}

Injecting ourselves into the message pipeline like this offers a number of advantages. We now have the opportunity to add our headers to the outbound client request in a single place, and we have a single touch-point for capturing the request on the server side. If this were a licence-code validation check, this would save us from peppering every single service call with the validation-check code.

In the following sections we'll look at creating a custom data header object, creating a message inspector implementation to manage injecting and extracting this data from our WCF Service, creating Client & Service Behaviors to attach our Message Inspectors and creating a behavior extension to allow these behaviors to be applied to our service through configuration.

Solution Organisation

Since both our web application (which will host the web services) and the console application will have some common dependencies, I've split out the majority of the code in these next sections into a common class library which both applications can then reference.

Solution Organisation

Custom Header Data Contract

The first thing I'll create is a simple data contract object to represent our custom header information. This POCO Data Contract provides a simple way for us to encapsulate our header information into a single payload which will be transmitted with our Service Calls.

    [DataContract]
    public class CustomHeader
    {
        [DataMember]
        public string WebUserId { get; set; }
        [DataMember]
        public int WebNodeId { get; set; }
        [DataMember]
        public Guid WebSessionId { get; set; }
    }

Message Inspectors

Next I create our message inspectors. The two interfaces required are System.ServiceModel.Dispatcher.IDispatchMessageInspector (which hooks into our pipeline on the service side) and System.ServiceModel.Dispatcher.IClientMessageInspector (which hooks into our pipeline on the consumer side). Within these two interfaces, the two methods I'm most interested in are IClientMessageInspector.BeforeSendRequest, which allows me to modify the outgoing header collection on the client, and IDispatchMessageInspector.AfterReceiveRequest, which allows me to retrieve the data on the service side.

    #region IDispatchMessageInspector
    public object AfterReceiveRequest(ref Message request, IClientChannel channel, InstanceContext instanceContext)
    {
        //Retrieve inbound custom header from the request, if present
        var headerIndex = request.Headers.FindHeader("custom-header", "s");
        if (headerIndex > -1)
        {
            var header = request.Headers.GetHeader<CustomHeader>(headerIndex);
            OperationContext.Current.IncomingMessageProperties.Add("CustomHeader", header);
        }
        return null;
    }
    #endregion
    #region IClientMessageInspector
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        //Instantiate new HeaderObject with values from ClientContext;
        var dataToSend = new CustomHeader
            {
                WebNodeId = ClientCustomHeaderContext.HeaderInformation.WebNodeId,
                WebSessionId = ClientCustomHeaderContext.HeaderInformation.WebSessionId,
                WebUserId = ClientCustomHeaderContext.HeaderInformation.WebUserId
            };

        var typedHeader = new MessageHeader<CustomHeader>(dataToSend);
        var untypedHeader = typedHeader.GetUntypedHeader("custom-header", "s");
        request.Headers.Add(untypedHeader);
        return null;
    }
    #endregion

I'll also need to create a simple static Client Context class which will provide the conduit for the Consumer Application to set header values to be picked up inside the message inspector methods.

    public static class ClientCustomHeaderContext
    {
        public static CustomHeader HeaderInformation;
        static ClientCustomHeaderContext()
        {
            HeaderInformation = new CustomHeader();
        }
    }

WCF Custom Behaviors

WCF service behaviors define how the endpoint (the actual service instance) interacts with its clients. Attributes like security, concurrency, caching, logging, and attached message inspectors are all part of the behavior. We're going to implement a new custom behavior for both the service side and the client side of this interaction. Since these are still just interface implementations, there's no need to create a new class to implement them; we can add this functionality to the same class which contains our message inspector functionality. Of course, if you wanted to be a purist about it, there's nothing to stop you implementing the two message inspectors and the two behaviors in four completely separate classes.

The two behavior contracts I'm interested in here are System.ServiceModel.Description.IEndpointBehavior, which is responsible for the client-side behavior, and System.ServiceModel.Description.IServiceBehavior, which is responsible for the service-side behavior. Implementing these interfaces allows me to attach an instance of the message inspectors to the service.

#region IEndpointBehavior

public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
{
    var channelDispatcher = endpointDispatcher.ChannelDispatcher;
    if (channelDispatcher == null) return;
    foreach (var ed in channelDispatcher.Endpoints)
    {
        var inspector = new CustomInspectorBehavior();
        ed.DispatchRuntime.MessageInspectors.Add(inspector);
    }
}

public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
{
    var inspector = new CustomInspectorBehavior();
    clientRuntime.MessageInspectors.Add(inspector);
}

#endregion

#region IServiceBehavior

public void ApplyDispatchBehavior(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase)
{
    foreach (ChannelDispatcher cDispatcher in serviceHostBase.ChannelDispatchers)
    {
        foreach (var eDispatcher in cDispatcher.Endpoints)
        {
            eDispatcher.DispatchRuntime.MessageInspectors.Add(new CustomInspectorBehavior());
        }
    }
}

#endregion

Adding the Custom Behavior to a WCF Service

Behaviors can be applied to services using a special service behavior attribute which decorates the service contract. The last step is to extend the Attribute class in our CustomInspectorBehavior class and then decorate each of our services with that attribute.

[AttributeUsage(AttributeTargets.Class)]
public class CustomInspectorBehavior : Attribute, ... { ... }
[CustomInspectorBehavior]
public class ComplexCustomHeaderService : IComplexCustomHeaderService { ... }

Configuring a WCF Client to use a specific behavior

On the client side, I need to do a tiny bit more work. I could manually configure the behavior on the WCF client proxy every time I instantiate it, but this is extra bloat, and eventually I'll forget to set it somewhere and lose my behavior functionality.

using(var client = new ComplexCustomHeaderServiceClient()) {
    client.ChannelFactory.Endpoint.Behaviors.Add(new CustomHeaderInspectorBehavior());
}

Instead, I'd prefer to set this once in configuration and never have to worry about it again. I can achieve this by creating a BehaviorExtensionElement as follows and adding it to my application configuration file.

public class CustomInspectorBehaviorExtension : BehaviorExtensionElement
{
    protected override object CreateBehavior()
    {
        return new CustomInspectorBehavior();
    }
    public override Type BehaviorType
    {
        get { return typeof (CustomInspectorBehavior);}
    }
}

And below is the equivalent configuration file.

    <system.serviceModel>
      <behaviors>
        <endpointBehaviors>
          <behavior name="CustomInspectorBehavior">
            <CustomInspectorBehavior />
          </behavior>
        </endpointBehaviors>
      </behaviors>
      <extensions>
        <behaviorExtensions>
          <add name="CustomInspectorBehavior"
               type="WCFCustomHeaderDemo.Lib.Extensions.CustomInspectorBehaviorExtension,WCFCustomHeaderDemo.Lib" />
        </behaviorExtensions>
      </extensions>
        <bindings>
            <basicHttpBinding>
                <binding name="BasicHttpBinding_ISimpleCustomHeaderService" />
                <binding name="BasicHttpBinding_IComplexCustomHeaderService" />
            </basicHttpBinding>
        </bindings>
        <client>
            <endpoint address="http://localhost/TestService/ComplexCustomHeaderService.svc"
                behaviorConfiguration="CustomInspectorBehavior"
				binding="basicHttpBinding"
                bindingConfiguration="BasicHttpBinding_IComplexCustomHeaderService"
                contract="ComplexCustomHeaderService.IComplexCustomHeaderService"
                name="BasicHttpBinding_IComplexCustomHeaderService" />
        </client>
    </system.serviceModel>

Calling our Client

Finally, we can call our client and test to see if our Server side application can see the headers being submitted and echo them back.

using(var client = new ComplexCustomHeaderServiceClient())
{
    ClientCustomHeaderContext.HeaderInformation.WebNodeId = 465;
    ClientCustomHeaderContext.HeaderInformation.WebSessionId = Guid.NewGuid();
    ClientCustomHeaderContext.HeaderInformation.WebUserId = "joe.bloggs";
    System.Console.WriteLine(client.DoWork());
}

WCF Header Demo Result

Excellent.

~Eoin Campbell