Neudesic Blogs

Passion for Innovation

Slickgrid Currency Column Formatter


Slickgrid is a JavaScript grid component by Michael Leibman. It is a flexible, customizable grid that loads and displays data very quickly. Over the next few weeks we will be blogging about Slickgrid in general, as well as a few areas in which we have extended its functionality. This post addresses one of those extensions.


I recently needed to display pricing information for products and could not find a ready-made currency column formatter. The raw data returned from my web service came back as 0.0000: no currency symbol ($), and four decimal places. I wanted the data formatted like typical US currency, with a $ and two decimal places, like this: $0.00. Maybe I didn't look hard enough for an existing solution, but at any rate I quickly wrote my own currency formatter and thought it might help someone else looking to do something similar.


The column formatting functions for Slickgrid are contained in the slick.formatters.js file.  I appended this file with a new function by adding the following bit of code:


function CurrencyFormatter(row, cell, value, columnDef, dataContext) {
    // Null, empty, and non-positive values render as "$0".
    if (value === null || value === "" || !(value > 0)) {
        return "$" + Number();
    } else {
        return "$" + Number(value).toFixed(2);
    }
}




Additionally, you will need to modify the top section of the script to register the newly created formatter in the namespace. I added the "Currency" entry to the existing registration function:


(function ($) {

  // register namespace
  $.extend(true, window, {
    "Slick": {
      "Formatters": {
        "PercentComplete": PercentCompleteFormatter,
        "PercentCompleteBar": PercentCompleteBarFormatter,
        "YesNo": YesNoFormatter,
        "Checkmark": CheckmarkFormatter,
        "Currency": CurrencyFormatter
      }
    }
  });

  // ... formatter function definitions follow ...

})(jQuery);





Now, when you define your columns in the HTML page that will display your Slickgrid control, add the following to the column definition to reference the currency formatting function we just created:


                columns = [
                    { id: "Price", name: "Price", field: "Price", width: 80, cssClass: "cell-title", formatter: Slick.Formatters.Currency }
                ];



That's it.  Now the data shown in the currency-formatted column will display as "$0.00", rather than the raw, unformatted data returned by your web service or database call.  If the value being evaluated is null, an empty string, or not greater than zero, it renders as "$0".  This did the trick for me!  Stay tuned for more Slickgrid information in future blogs.  Happy coding.
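For reference, the formatter is plain JavaScript and can be exercised outside the grid: SlickGrid simply invokes it as a function per cell. The sketch below calls it directly with a few representative values (the sample inputs are mine, not from the original post):

```javascript
// Standalone exercise of the CurrencyFormatter described above. SlickGrid
// calls formatters as plain functions (row, cell, value, columnDef, dataContext).
function CurrencyFormatter(row, cell, value, columnDef, dataContext) {
    // Null, empty, and non-positive values render as "$0".
    if (value === null || value === "" || !(value > 0)) {
        return "$" + Number();
    } else {
        return "$" + Number(value).toFixed(2);
    }
}

console.log(CurrencyFormatter(0, 0, "19.9900", null, null)); // prints "$19.99"
console.log(CurrencyFormatter(0, 0, "0.0000", null, null));  // prints "$0"
```

Because the raw value arrives as a string like "0.0000", the `Number(value)` conversion is what makes `toFixed(2)` work.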

Announcing the release of Neuron ESB 2.6!

I'm very happy to announce the release of Neuron ESB 2.6. If you are using an earlier release of Neuron ESB, you can download this latest release from here: Neuron ESB 2.6. This release significantly extends the Neuron ESB platform by introducing new capabilities that will allow businesses to more easily scale, develop, connect and operationally manage their solutions. Businesses turn toward Neuron ESB to solve their service and integration problems while reducing the total cost of ownership of their solutions.  Neuron ESB accomplishes this by continually refining and extending its core capabilities in ways that can be effectively leveraged in agile environments by Microsoft .NET developers.

Neuron ESB 2.6 introduces many new features and enhancements, some of which include:

* Multi Instance Runtime
This capability allows organizations to install and run multiple instances of the Neuron ESB Runtime Service on a single server. Each instance can be configured as either a 32- or 64-bit process, capable of running side by side. This allows organizations to easily partition business solutions to run on a single server and support multiple developers.

* Process Designer
Neuron ESB 2.6 now offers a flexible, easy-to-use Process Designer with a drag-and-drop interface that ships with over 30 configurable Process Steps, covering everything from calling a service or updating a database or queue to parsing an Excel file. Organizations can create more complex business processes using newly introduced Process Steps such as "While" and "For" looping constructs, as well as an "Excel to Xml" parsing Process Step.

* Adapters and Connectivity
Several new adapters are introduced, as well as enhancements to help organizations connect, compose and expose new capabilities within their environment. This release also adds a fifth transport, which can be used to configure Topics. Some of these adapters and enhancements are:

- Event Based SharePoint 2010 Publication
- Event Based Dynamics CRM 2011 Publication
- FTP/FTPS Adapter
- ODBC Enhancements to support ReturnValue and OutPut type parameters
- Topics configurable with Named Pipes

* Monitoring and Reporting
New administrative, monitoring and reporting capabilities are introduced with the release of Neuron ESB 2.6. One of the most sweeping changes affecting custom reporting is the refactoring of the Neuron ESB database structure: the Neuron Audit tables have been refactored specifically to enable custom reporting. For example, custom properties that are added to the ESB Message are stored in an XML-typed column that supports XQuery. Some of the features are:

- WMI Performance Counters for Topics and Endpoints
- WMI Performance Counter for Request/Reply call Total Time
- WMI Failed Message Events
- Database Enhancements for Custom Reporting
- Filter Query User Interface for Reporting

* Deployment Management
Neuron ESB supports XCopy deployment through its Environmental Variable feature set. Neuron ESB 2.6 augments this by extending Environmental Variable support to the Neuron Audit Database. Some of the features included to assist deployments are:

- Assignment of the Same Neuron Audit Database to more than one Deployment Group
- Environmental Variable support for Neuron Audit database
- Environmental Variable support for Service Endpoint ACL property
- Read Only ESB Solution Configuration (*.ESB) support within the Neuron ESB Explorer and Runtime

* Service Endpoints
Neuron ESB 2.6 plays a critical role as organizations look toward Cloud offerings for service hosting or business-to-business communication. As a web service broker, Neuron ESB 2.6 now includes all of the Microsoft Azure Service Bus Relay Bindings to facilitate hybrid approaches that bridge on-premise integration with Cloud-based solutions. Neuron ESB 2.6 ensures that communication between the Cloud and on-premise systems can be handled securely and reliably, while managing the integration requirements of the organization. Other important features in this release are:

- Azure Service Bus Integration
- Delegation Support for REST based services
- Pluggable WCF Service Behaviors


The full details of "What's New in Neuron ESB 2.6" can be downloaded from our web site here:

This release includes all the accumulated fixes. The complete list of changes can be found in the Neuron Change Log located at the root of the Neuron ESB installation directory as well as on our web site.

Posted: Apr 02 2012, 05:19 by marty.wasznicky | Comments (0) RSS comment feed

Categories: Azure | Connected Systems | Custom Application Development | Headlines | Neuron | Neuron ESB | WCF

C# Code to look up Current User in Active Directory

Here's some reusable C# code to look up the currently logged-in user in Active Directory and retrieve AD properties such as first name, last name, and email. The method IsExistInAD() below is handy in intranet applications where your ASCX or ASPX can assume the current user is authenticated in the domain and you need that user's properties from Active Directory. IsExistInAD() takes as input the user name in the format DOMAIN\\alias and performs a directory search using the .NET Directory Services namespaces (System.DirectoryServices and System.DirectoryServices.ActiveDirectory). If successful, it populates the private SearchResult _result variable with the properties from Active Directory and returns true. If the directory search does not find the current user, IsExistInAD() returns false. Note that this code handles multiple domains: if some of your users have user names like NORTHAMERICA\\bobsmith and others like SOUTHAMERICA\\juanparamo, the code parses out the domain name and uses it for the root directory of the DirectorySearcher, so it finds the user in the correct Active Directory domain.

Note that your ASP.NET application gets this user name for the currently logged-in user automatically from the Page.User.Identity.Name property, ready to pass as input to IsExistInAD(), when the web application is configured for Windows Authentication.

The first time you set up the target server that will run the Site Info Web Application, you must configure IIS to use Windows Authentication. The Site Info Web Application depends on this, and it is not the default configuration of IIS. This is a Windows/IIS configuration setting and does not require adjustment on future deployments of new builds or upgrades.

How to Configure Windows Authentication

On the target IIS server

From Server Manager, Open Internet Information Services (IIS) Manager

In the left side panel, select the server (e.g. ZLMRCWEB31)

Double-click the Authentication icon to open the Authentication Applet

  1. Enable Windows Authentication

  2. Disable Anonymous Authentication

Code Default.aspx.cs

using System;
using System.Collections.Generic;
using System.Collections;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.DirectoryServices;
using System.DirectoryServices.ActiveDirectory;

namespace TestADLookupUsersEmail
{
    public partial class _Default : System.Web.UI.Page
    {
        public class ContactADFields
        {
            public string FirstName;
            public readonly string FirstNameProp = "givenname";
            public string LastName;
            public readonly string LastNameProp = "sn";
            public string Email;
            public readonly string EmailProp = "mail";
            public string FullName;
            public readonly string FullNameProp = "displayname";
        }

        private SearchResult _result;
        private ContactADFields contact = new ContactADFields();

        protected void Page_Load(object sender, EventArgs e)
        {
        }

        public string getUserIdentityName()
        {
            return Page.User.Identity.Name;
        }

        public string getUserEmail()
        {
            if (IsExistInAD(Page.User.Identity.Name))
            {
                if (_result.Properties.Contains(contact.FirstNameProp))
                    contact.FirstName = (string)_result.Properties[contact.FirstNameProp][0];

                if (_result.Properties.Contains(contact.LastNameProp))
                    contact.LastName = (string)_result.Properties[contact.LastNameProp][0];

                if (_result.Properties.Contains(contact.EmailProp))
                    contact.Email = (string)_result.Properties[contact.EmailProp][0];

                if (_result.Properties.Contains(contact.FullNameProp))
                    contact.FullName = (string)_result.Properties[contact.FullNameProp][0];

                try
                {
                    // Debug aid: walk every property returned by the search.
                    int propCount = _result.Properties.PropertyNames.Count;
                    foreach (string propName in _result.Properties.PropertyNames)
                    {
                        string propVal = _result.Properties[propName][0] as string;
                    }
                }
                catch (Exception)
                {
                    // Ignore properties whose values are not strings.
                }
            }
            return contact.Email;
        }

        /// <summary>
        /// Parse a User Identity Name e.g. "REDMOND\\billg" setting the out accountName and out domainName
        /// </summary>
        /// <param name="path"></param>
        /// <param name="accountName"></param>
        /// <param name="domainName"></param>
        /// <returns>true if successful parsing the input user name</returns>
        private bool ParseUserName(string path, out string accountName, out string domainName)
        {
            bool retVal = false;
            accountName = String.Empty;
            domainName = String.Empty;

            string[] userPath = path.Split(new char[] { '\\' });
            if (userPath.Length > 0)
            {
                retVal = true;
                accountName = userPath[userPath.Length - 1];
            }
            if (userPath.Length > 1)
            {
                domainName = userPath[userPath.Length - 2];
            }
            return retVal;
        }

        /// <summary>
        /// Lookup user in AD, and if successful, set SearchResult _result and return true.
        /// </summary>
        /// <param name="loginName">The Page.User.Identity.Name e.g. "REDMOND\\billg"</param>
        /// <returns>True if found in AD. Also sets SearchResult _result.</returns>
        private bool IsExistInAD(string loginName)
        {
            DirectorySearcher search = null;
            string userName;
            string domainName;
            if (ParseUserName(loginName, out userName, out domainName))
            {
                // Search the user's own domain so multi-domain forests resolve correctly.
                DirectoryContext dirCtx = new DirectoryContext(DirectoryContextType.Domain, domainName);
                Domain usersDomain = System.DirectoryServices.ActiveDirectory.Domain.GetDomain(dirCtx);
                if (usersDomain != null)
                {
                    DirectoryEntry rootDirEntry = usersDomain.GetDirectoryEntry();
                    if (rootDirEntry != null)
                    {
                        search = new DirectorySearcher(rootDirEntry);
                        search.Filter = String.Format("(SAMAccountName={0})", userName);
                    }
                }
            }
            if (search == null)
            {
                // Fall back to the default domain context.
                search = new DirectorySearcher();
                search.Filter = String.Format("(SAMAccountName={0})", loginName);
            }

            // Adding properties to the DirectorySearcher is supposed to make the
            // query more efficient by only returning the fields we want. However,
            // doing so seems to always make the Last Name prop ("sn") return blank.

            _result = search.FindOne();

            if (_result == null)
                return false;
            return true;
        }
    }
}
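The multi-domain handling boils down to splitting the identity name on the backslash and taking the last two segments. Here is that splitting logic as a tiny sketch (shown in JavaScript purely to illustrate the rule; the function name and sample values are mine):

```javascript
// Illustration of the DOMAIN\alias split used by ParseUserName above:
// the last segment is the account name; the segment before it, if any,
// is the domain. A name with no backslash yields an empty domain, which
// lets the caller fall back to a default directory context.
function parseUserName(path) {
    var parts = path.split("\\");
    return {
        accountName: parts[parts.length - 1],
        domainName: parts.length > 1 ? parts[parts.length - 2] : ""
    };
}

console.log(parseUserName("NORTHAMERICA\\bobsmith").domainName); // prints "NORTHAMERICA"
```
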

Posted: Mar 22 2012, 17:35 by Martin.Cox | Comments (0) RSS comment feed

Categories: Custom Application Development

Neuron ESB 2.5.14 Release Launched!

· Neuron ESB October Feature Release 2.5.14
· National Archives of the UK launches with Neuron ESB!
· Neuron ESB Architectural Assessments

I'm very happy to announce the October Feature Release, 2.5.14, for Neuron ESB 2.5. If you are using an earlier released build, you can download this latest release from here: Neuron ESB 2.5.14.

The product team is very excited about this release as it contains many new features which our customers will immediately benefit from. Some of the features included are:

- ODBC Adapter - This adapter supports Publish, Subscribe, Query and Execute modes. It allows the use of both stored procedures and SQL statements, and supports the FOR XML clause when the SQL Client ODBC driver is selected. In Publish mode it supports schema generation as well as before and after statements. It also supports dynamic connection strings for connecting to the database; the connection string can be set within a pipeline Code step, e.g. context.Data.SetProperty("odbc", "ConnectionString", <somevalue>)

- SMTP Adapter - This adapter supports dynamic setting of SMTP properties, XSL transform for message bodies, attachments, InfoPath and delivery notifications.

Service Policies
Support has been added under the Availability tab of Service Policies for customizing the SOAP fault returned to the calling client when Limited Availability of an endpoint is enabled. The message can be customized to include the service name, the service endpoint URL, or both, e.g. "The Neuron end point, '{0}', configured with the following url, '{1}', is not available."

New capabilities regarding how Neuron handles metadata have been added. For example, custom message properties can now be preserved on Request/Response calls. Also, an end-to-end TransactionId property has been introduced, along with a ParentId property, to facilitate more advanced and customized tracking and service-monitoring scenarios.

- Msmq Pipeline Step - enhanced with PEEK capability

- Pipeline Execution Pipeline Step - this allows users to call a pipeline within a pipeline, essentially creating composable processes. This step can execute pipelines at runtime that exist in external/secondary ESB solution files, allowing a library of patterns to be centrally maintained, developed and reused.

- Audit Pipeline Step - now accepts an XPath value to determine what part of the message to store in the Neuron database.

Dynamic Connection Strings
Both the SQL and ODBC adapters have been enhanced with the ability for their connection strings to be set at runtime.

Performance Counters:
This release adds new WMI performance counters that capture error and throughput statistics from Neuron Endpoints, Neuron Topics and Neuron Parties. These can be used within 3rd-party monitoring tools.

Resubmit Failed Messages:
The ability to edit and resubmit messages within the Message Viewer has been completely refactored. Users can now edit a single message and resubmit that message directly to a Party (running associated pipeline processes) or to an Adapter or Service Endpoint (circumventing those same associated pipeline processes).

Bulk edit and Resubmit of messages has also been added. Users may select multiple messages in either the Message History or Failed Message reports and choose to edit and then resubmit all of them at once directly to a Party or to an Adapter or Service Endpoint.

Besides new features, this release includes all the accumulated fixes since our previous 2.5.13 release in July 2011. Important fixes are included which positively affect the following:

- Msmq Topics
- Pipeline testing, configuration and runtime performance
- Service Policies
- Event Logging

The complete list of changes can be found in the Neuron Change Log located at the root of the Neuron ESB installation directory as well as on our support site.

To determine the current version you are working with, see What version of Neuron are you running?

As an addendum to our shipped documentation, I'll be regularly posting information on how to use the new features, so continue to monitor our blog and forums.

Upgrade Instructions
Upgrading to this release is identical to the previous release and is fairly straightforward, requiring updates to existing ESB Configuration files. You can read more about the update considerations on our forum.

Neuron ESB Architectural Assessments
Remember, architectural reviews, pre- and post-production rollout assessments, as well as advanced training, are available through the Neuron ESB Product Team. Our goal is to ensure your success.

Stay tuned to this channel! Once I finish the "What's New" and "How to use it" documentation, I'll post the download link and the docs.

Kind regards,

Marty Wasznicky
Neuron ESB

Neudesic, L.L.C.
Work: (949) 754-5223

Fax: (949) 754-6523

Posted: Nov 19 2011, 02:28 by marty.wasznicky | Comments (1) RSS comment feed

Categories: Neudesic Main | AppFabric | Connected Systems | Custom Application Development | General | Neuron | Neuron ESB | WCF

Visual Studio 2010 and MsBuild Tasks for Azure Deployment

While the Windows Azure SDK and Windows Azure Tools for Visual Studio help a lot when developing and deploying Cloud Services, there's still much to be desired, especially around managing application settings and numerous connection strings between development, cloud staging and production, as well as "mixed mode" deployments. Where there's no place for human error, automation rules, and this is where MsBuild can help.


Let’s say I have this service configuration


<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="WindowsAzureProject1" xmlns="" osFamily="1" osVersion="*">
  <Role name="MvcWebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
               value="UseDevelopmentStorage=true" />
      <Setting name="Greeting" value="Hello from Azure (Dev) Web Role!" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>


and want to deploy the service to the cloud with:


<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="WindowsAzureProject1" xmlns="" osFamily="1" osVersion="*">
  <Role name="MvcWebRole1">
    <Instances count="2" />
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
               value="DefaultEndpointsProtocol=https;AccountName=[cloud account];AccountKey=[account key]" />
      <Setting name="Greeting" value="Hello from Azure (Published) Web Role!" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>


While it is possible to modify the .cscfg via a generic XSLT transformation, it is preferable to use a transform file using the XML Document Transform (XDT) syntax, as used with ASP.NET 4.0 web.config transforms. The transform file, ServiceConfiguration.Publish.cscfg, could look like:

<?xml version="1.0" encoding="utf-8"?>
<sc:ServiceConfiguration serviceName="WindowsAzureProject1" osFamily="1" osVersion="*"
    xmlns:sc="" xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <sc:Role name="MvcWebRole1" xdt:Locator="Match(name)">
    <sc:Instances count="2" xdt:Transform="Replace" />
    <sc:ConfigurationSettings>
      <sc:Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
          value="DefaultEndpointsProtocol=https;AccountName=[cloud account];AccountKey=[account key]"
          xdt:Locator="Match(name)" xdt:Transform="Replace" />
      <sc:Setting name="Greeting" value="Hello from Azure (Published) Web Role!" xdt:Locator="Match(name)" xdt:Transform="Replace" />
    </sc:ConfigurationSettings>
  </sc:Role>
</sc:ServiceConfiguration>


This resembles the original configuration file but contains only the items that need to be added, modified or removed (notice the xdt:Locator and xdt:Transform attributes).
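To make the Locator/Transform semantics concrete, here is a toy model (all names are mine, purely illustrative) of what xdt:Locator="Match(name)" combined with xdt:Transform="Replace" does for the Setting elements: find the base element whose name attribute matches, and replace it with the transform's version.

```javascript
// Toy model of xdt:Locator="Match(name)" + xdt:Transform="Replace":
// each transform setting replaces the base setting with the same name;
// unmatched base settings are kept unchanged.
function applyReplaceByName(baseSettings, transformSettings) {
    return baseSettings.map(function (s) {
        var match = transformSettings.filter(function (t) { return t.name === s.name; })[0];
        return match ? match : s;
    });
}

var base = [
    { name: "Greeting", value: "Hello from Azure (Dev) Web Role!" }
];
var transform = [
    { name: "Greeting", value: "Hello from Azure (Published) Web Role!" }
];

console.log(applyReplaceByName(base, transform)[0].value);
// prints "Hello from Azure (Published) Web Role!"
```

The real TransformXml task does this matching at the XML element level, which is why every element in the transform file that should line up with the base file carries the Match(name) locator.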

To make the transformation work, one needs to import and use the TransformXml task within a target that executes after the "CorePublish" target.


Here is how (because the Cloud Service project allows only limited actions through the VS UI, the file needs to be added and the .ccproj file edited manually):


1. Copy the transform file (ServiceConfiguration.Publish.cscfg) to the cloud service folder (in Windows Explorer).

2. Unload the Cloud Service project (right-click the project, then select Unload Project).

3. Open the project file for editing (right-click the project, then select Edit [project name].ccproj).

4. Add the following after the last target in the project file:


<Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.targets" />

  <ItemGroup>
    <EnvironmentConfiguration Include="ServiceConfiguration.Publish.cscfg">
      <BaseConfiguration>ServiceConfiguration.cscfg</BaseConfiguration>
    </EnvironmentConfiguration>
    <None Include="ServiceConfiguration.Publish.cscfg" />
  </ItemGroup>

  <Target Name="TransformEnvironmentConfiguration" AfterTargets="CorePublish">
    <MakeDir Directories="$(OutDir)TransformCscfg" />
    <Copy SourceFiles="$(OutDir)Publish\%(EnvironmentConfiguration.BaseConfiguration)" 
        DestinationFiles="$(OutDir)TransformCscfg\%(EnvironmentConfiguration.BaseConfiguration)" />
    <TransformXml Source="$(OutDir)TransformCscfg\%(EnvironmentConfiguration.BaseConfiguration)" 
        Transform="%(EnvironmentConfiguration.Identity)"
        Destination="$(OutDir)Publish\%(EnvironmentConfiguration.BaseConfiguration)" />
    <Message Importance="high" 
        Text="Transformed %(EnvironmentConfiguration.BaseConfiguration) using %(EnvironmentConfiguration.Identity) into $(OutDir)Publish\%(EnvironmentConfiguration.BaseConfiguration)" />
  </Target>


5.       Save and close the file, then reload the project.



The above fragment imports the ASP.NET 4.0 MsBuild targets, includes the transform file in the project folder, and defines a new target named "TransformEnvironmentConfiguration" to be executed after "CorePublish" (one of the Cloud Service project's targets). The transform target internally creates a working folder, TransformCscfg, where it copies the service configuration file, after which the required transformation takes place. Note that copying to the working folder is a workaround for a bug in TransformXml, which keeps a file lock and prevents successful completion of the MsBuild script.


Now, when the Cloud Service project is published to the cloud or to an intermediary location, the transformation kicks in.

If the project was published locally, it can be executed in "mixed mode", i.e. running in the compute emulator while using the cloud storage service and other resources, by adding yet another target to the project:


  <Target Name="RunPublishedServiceConfig">
    <Exec Command="&quot;$(ServiceHostingSDKInstallDir)bin\csrun.exe&quot; /run:$(OutDir)$(ProjectName).csx;$(OutDir)Publish\%(EnvironmentConfiguration.BaseConfiguration) /launchbrowser" />
  </Target>


The target can then be invoked as an external tool from Visual Studio as follows:

Command: C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe
Arguments: $(ProjectDir)$(ProjectFileName) /T:RunPublishedServiceConfig

This works well for an ideal solution where the roles use the RoleEnvironment class to obtain settings and connection strings. In Azure deployments, storing these in web.config and app.config files is very close to "hard coding", as any change requires re-deployment of the service. On a previous project I was able to add similar transforms for each web and worker role .config. What remains is to enable multiple transforms for different deployment targets, such as staging and production. I hope I'll figure it out before the next Azure project.

Posted: Jul 01 2011, 10:11 by Milenko.Djuricin | Comments (1) RSS comment feed

Categories: Azure | Custom Application Development

Persistence in WF 4.0

From time to time I would run into problems with persisted workflows, e.g. I could not resume or terminate one because the system was unable to deserialize it. If you change a workflow, or any type that is serialized with it, there could be problems with long-running, persisted workflows. Actually, there will be problems, for sure.

WF 4.0 is much easier in this respect, since the new persistence provider serializes only the activities containing active bookmarks and the objects in the current scope. In many cases the versioning effort can be limited by focusing on the fields added to custom types, where .NET serialization attributes such as [System.Runtime.Serialization.OptionalField] can be of great help.


Posted: Aug 31 2010, 11:40 by Milenko.Djuricin | Comments (2) RSS comment feed

Categories: Custom Application Development | Workflow Foundation

Continuous Delivery

Continuous integration is a concept that most in the industry are familiar with, but what about continuous delivery? Continuous integration, if you’re not aware, is the process of automating the building of your software project such that every time a change set is committed to the project’s version control repository, a dedicated build server will automatically build the software to ensure that there are no errors in the checked-in source code. Continuous delivery is an extension of continuous integration. During continuous delivery, the source code is built, but the end result is a packaged software project that can be released to QA (and possibly end users) with every check-in.

Posted: Aug 08 2010, 04:20 by Michael.Collins | Comments (0) RSS comment feed

Categories: Custom Application Development

Hosting the Workflow Designer

In my last post, I introduced you to the Managed Extensibility Framework for creating composable applications made out of parts. In this post, I’m going to start exploring the new Workflow Foundation technology that was re-introduced with .NET 4.0. Over the next few days, I’ll explore different aspects of Workflow Foundation and look at how it can be used to create customizable and extensible applications. But in this post, we’re going to start with the basics: hosting the workflow designer.

Posted: Jul 30 2010, 09:59 by Michael.Collins | Comments (0) RSS comment feed

Categories: Custom Application Development | Workflow Foundation




