Detecting SharePoint Forms Services

by Scosby Friday, September 30, 2011


The reader will learn how to use the Forms Services Feature Detection Protocol. The solution demonstrated posts an HTTP request to a SharePoint detector page without calling a SharePoint web service. This approach allows even a SharePoint Reader (a user with read-only permissions) to use the Forms Services Feature Detection Protocol. Future posts about Forms Server will utilize the solution presented in this post.

InfoPath In SharePoint

InfoPath Forms Services, included with Microsoft SharePoint Server 2010, provides a Web browser experience for filling out InfoPath forms. InfoPath 2010 integrates with SharePoint’s Business Connectivity Services (BCS) enabling users to connect their organization's forms to important business data that is stored in external line-of-business systems such as SAP, Oracle or even Twitter! Read the InfoPath Forms Services Overview for more information.

Forms Server Detector

Using the Forms Services Feature Detection Protocol, the reader can easily detect whether Forms Server exists and is enabled on a specific site. After posting a query string to the FormServerDetector.aspx page, the server returns HTTP 204 No Content if the request succeeded but Forms Server features are not enabled, and HTTP 200 OK if Forms Server features are enabled. When Forms Server is enabled, the response body contains the detection result, which should always be true according to the protocol.


The solution first constructs the detector URI from the following parts: the site URL, the FormServerDetector.aspx page and the protocol query parameter. Using the detector URI, the solution posts an HTTP request and inspects the HTTP response to detect the Forms Server status for the site. The protocol specifies the client request headers should include an Accept header with */* as the value. If Forms Server is enabled, the response body will contain the following text:

<server IsFormServerEnabled = 'true' />
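Interpreting that body is straightforward with LINQ to XML. A minimal sketch follows; the class and method names here are illustrative, not part of the protocol:

```csharp
using System;
using System.Xml.Linq;

static class DetectorResponse
{
    // Parses a detector response body such as <server IsFormServerEnabled = 'true' />
    // and returns the reported value; false when the attribute is missing.
    public static bool IsEnabled(string responseBody)
    {
        XElement server = XElement.Parse(responseBody);
        XAttribute attribute = server.Attribute("IsFormServerEnabled");
        return attribute != null && bool.Parse(attribute.Value);
    }
}
```

The full code sample below performs the same parse inline after checking the status code.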

Code Sample

        public static bool IsFormsServicesEnabled()
        {
            UriBuilder builder = new UriBuilder("http://scvm1");
            builder.Path = "/_layouts/FormServerDetector.aspx";
            builder.Query = "IsFormServerEnabled=check";

            string servicePath = builder.Uri.GetComponents(UriComponents.AbsoluteUri, UriFormat.SafeUnescaped);

            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(servicePath);
            request.Accept = "*/*";
            request.UseDefaultCredentials = true;

            HttpWebResponse response = null;

            try
            {
                response = (HttpWebResponse)request.GetResponse();
            }
            catch (WebException ex)
            {
                response = (HttpWebResponse)ex.Response;
            }

            if (response.StatusCode == HttpStatusCode.OK)
            {
                using (System.IO.StreamReader sr = new StreamReader(response.GetResponseStream()))
                {
                    System.Xml.Linq.XElement doc = XElement.Parse(sr.ReadToEnd());

                    XAttribute fsAttribute = doc.Attribute("IsFormServerEnabled");

                    return bool.Parse(fsAttribute.Value);
                }
            }

            return false;
        }


The reader should take note of the protocol query parameter assigned to the builder.Query property. By using a value of “check” for the detection protocol query parameter, the request will be processed by the protocol server. This solution provides broad support for remote clients to detect Forms Server capabilities without resorting to checking for SharePoint Feature definitions.

More Information

·         InfoPath Forms Services Overview

·         Forms Services Feature Detection Protocol Specification [MS-FSFDP]

·         Request for Forms Server Detection


IT | Programming

Remotely Opening Office Files From SharePoint 2010

by Scosby Monday, August 22, 2011


Developing a client side application for SharePoint capable of opening files in their native Office application does not require very much code for a robust solution. This post will explore how to use the well-documented SharePoint.OpenDocuments ActiveX control from managed code instead of from JavaScript in a web browser. This approach will provide the end users with a familiar SharePoint experience in the application.


There are at least three obvious and important questions to ask regarding this topic:

·         Why not start a new process using the file’s URL?

·         Why not download the file to a temporary location?

·         Why not write a monolithic switch statement, using a handful of Office interop assemblies?

Start Process

The first question asks what is wrong with starting a new process for the file’s URL. This approach is indeed the simplest. However, undesired behavior can occur when attempting to open files from within folders of a document library.
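For illustration, the naive approach looks like the following sketch; the URL is an assumption matching the sample file used later in this post:

```csharp
using System;
using System.Diagnostics;

// Hand the file's URL to the shell and let Windows pick the application.
// Simple, but the browser can intercept the URL, and files inside document
// library folders may not open as expected.
ProcessStartInfo startInfo = new ProcessStartInfo("http://scvm1/Shared Documents/Document 1.docx");
startInfo.UseShellExecute = true; // resolve the URL through the shell

// Process.Start(startInfo); // commented out: launches the associated application
```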

Download File

The second question asks what is wrong with downloading the file. This approach seems more reasonable. However, it offers no out-of-the-box integration with SharePoint. Additionally, it creates a burden on the application to clean up the downloaded file after it is no longer needed.
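A sketch of the download approach (the URL and file name are illustrative) makes the cleanup burden visible:

```csharp
using System.IO;
using System.Net;

// Download a private copy; SharePoint sees no relationship between this
// copy and the document in the library.
string tempPath = Path.Combine(Path.GetTempPath(), "Document 1.docx");

using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = true;
    client.DownloadFile("http://scvm1/Shared Documents/Document 1.docx", tempPath);
}

// ...open tempPath, wait for the user to finish editing...

// The application, not SharePoint, must clean up the copy.
File.Delete(tempPath);
```

This sketch requires a reachable SharePoint server, so it is not runnable as-is.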

Interop Assemblies

The third question asks why not write a switch statement to determine which Office interop assembly should be used to open the file? This approach requires much more code. Additionally, it would be an edge case to need such a high level of complexity for opening Office files.


Fortunately, the three problems described in this post are easily solved using the existing Office tools provided by Microsoft. In particular, SharePoint developers are likely familiar with the SharePoint.OpenDocuments ActiveX control, used by Javascript in a Web Browser, to perform the task at hand.

The SharePoint.OpenDocuments control is defined in the OWSSUPP.dll file that is installed in the %ProgramFiles%\Microsoft Office\Office14 directory on the client computer during Microsoft Office Setup. This assembly is not well documented, but it is the one used by the SharePoint.OpenDocuments control.

Given this information, the reader can navigate through the OWSSUPP assembly with the Visual Studio Object Browser and discover the matching methods needed for the managed code solution. The next section will explain the SharePoint.OpenDocuments control, so the reader can gain a foundation to build upon in the code sample.

SharePoint.OpenDocuments Control

According to the documentation, the control is defined as “An ActiveX control that enables users of Microsoft SharePoint Foundation 2010 to create documents based on a specified template, or to edit documents using their associated applications.”

In other words, this control enables a developer to open a specific Office file from SharePoint in its native Office application on the client. When used in managed code, the control will display the familiar Open Document dialog for Office applications (see figure 1).


Figure 1 – The Open Document dialog is shown when opening a file from SharePoint

This post will focus on the OpenDocuments.ViewDocument method. As stated in the documentation, this method “opens the document for reading instead of editing, so that the document is not locked on the server.” This behavior is best for a majority of scenarios. By using this method, the developer allows the end user to decide when to edit or lock the document on the SharePoint server.

Code Sample

This code sample uses a hardcoded URL to open a file. However, real world applications will be working with URLs from many different sources, such as search results or views. The developer needs the following version of Office to complete this code sample:

·         Microsoft Office 2010 (2007 should work but not 2003)

o    32-Bit

The code sample will use a console application to open a hardcoded URL for an Office file by using the COWSNewDocument interface and the IOWSNewDocument2 interface.

1.       Create a new console application in Visual Studio.

2.       Right click the project’s References and select Add Reference.

3.       The developer will not find the OWSSUPP components in the COM tab. Thus, select the Browse tab and navigate to %ProgramFiles%\Microsoft Office\Office14\OWSSUPP.DLL to add the reference to the project.

4.       Double click the newly added OWSSUPP assembly in the console application’s References list to open the Object Browser.

5.       Using the Object Browser, navigate into the Interop.OWSSUPP assembly. Expand the OWSSUPP namespace and explore its members. Be sure to look at the COWSNewDocument interface and the IOWSNewDocument2 interface, since these are the building blocks for the code sample.

6.       Modify the Main method in the Program class to look like the following code snippet; be sure the fileUri variable represents a file that is accessible.

        static void Main(string[] args)
        {
            Uri fileUri = new Uri("http://scvm1/Shared Documents/Document 1.docx");

            string fileUrl = fileUri.GetComponents(UriComponents.AbsoluteUri, UriFormat.UriEscaped);

            OWSSUPP.IOWSNewDocument2 control = new OWSSUPP.COWSNewDocument() as OWSSUPP.IOWSNewDocument2;

            control.ViewDocument(fileUrl);
        }


7.  Start debugging and watch the Open Documents dialog pop up.


The code sample is very short. Note how the COWSNewDocument interface is instantiated and cast to the IOWSNewDocument2 control interface. This interface is the earliest implementation of the ViewDocument method. Thus, the reader should use the IOWSNewDocument2 interface, unless additional functionality in the newer versions is needed.


For additional functions in the OWSSUPP interfaces, such as EditDocument or CreateNewDocument, it is wise to compare the documentation for the SharePoint.OpenDocuments control with the OWSSUPP members in the Object Browser. Since Microsoft doesn’t document OWSSUPP on MSDN, it is extremely beneficial to refer back to the SharePoint documentation. Most of the time, it is OK to omit the varProgId parameter, which tells the control to default to the currently installed application for that file type.
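As an unverified sketch, and assuming the interop signatures mirror the documented ActiveX control, editing a file while omitting varProgId might look like the following (the URL is the same illustrative one used earlier):

```csharp
// EditDocument checks the file out and opens it for editing, locking it on
// the server until the user finishes. With varProgId omitted, the control
// defaults to the application installed for the file type.
OWSSUPP.IOWSNewDocument2 control = new OWSSUPP.COWSNewDocument() as OWSSUPP.IOWSNewDocument2;
bool opened = control.EditDocument("http://scvm1/Shared Documents/Document 1.docx");
```

This sketch requires the OWSSUPP interop reference and an Office installation on the client, so it is not runnable as-is.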


This post showed how easy it is to translate the SharePoint.OpenDocuments ActiveX control into a managed code solution for opening SharePoint Office files in their native application. The reader has learned how to compare the existing documentation for the SharePoint.OpenDocuments control against the OWSSUPP members in the Object Browser. The end users will appreciate a familiar experience in the reader’s application, as the Open Document dialog is the same experience provided by SharePoint from the browser.



SharePoint 2010 Content Organizer - Client Object Model

by Scosby Wednesday, March 16, 2011


SharePoint 2010 introduced the Content Organizer to provide a more formal routing infrastructure for site content. Based on a list of configured rules, content is routed to a library based on conditions such as column values or content type. Read more about the Content Organizer on MSDN. This post assumes the user has installed the SharePoint 2010 Client Object Model redistributable.


Using the Content Organizer

One particular feature of the Content Organizer is to redirect users to the Drop Off library. This setting is configured in the Content Organizer Settings link located under Site Administration on the Site Settings page. (See figure 1)

Figure 1 – Content Organizer Redirect Setting

Enabling this setting will send all content for a library that has a configured rule to the drop off library. In other words, if users upload a document to the library it will be sent to the drop off library instead. This only applies to libraries that are listed as “target libraries” in a Content Organizer Rule. (See figure 2).

Figure 2 – Content Organizer Rule configured to a target library

Once a new item has been routed to the drop off library, either a manager or a timer job will move the item to its final location once the rule’s conditions have been met. The final location is the target library defined by the rule. A daily timer job, conspicuously called “Content Organizer Processing” (see figure 3), is created for each web application with a content organizer enabled site. This job will evaluate items in the drop off library against the rule’s conditions. The item must satisfy all the rule’s conditions in order to be moved to the target library. Unless a manager submits content before the job is run, any content matching the rule’s conditions will not be routed to the target library (final location) until the job has run. It is possible to run the job on demand, or have a manager submit the item in the drop off library, to move items immediately.

Figure 3 - Content Organizer Timer Job


When using the SharePoint Client Object Model, there is no implementation provided for determining content organizer rules. Additionally, uploading with the client object model ignores the redirect users setting and bypasses the content organizer along with any rules defined for a target library. By design, when redirecting users to the drop off library, the content organizer restricts a library’s available content types for new documents or uploads to the content types referenced by content organizer rules that specify that library as a target. If no content organizer rules exist, then the library behaves as it does without the content organizer.


Despite the lack of built in support with the client object model for the content organizer, it is possible to discover the rules and honor the redirect setting. The solution is to mimic the content organizer redirect behavior by releasing to the drop off library instead of uploading to the target library. This post will demonstrate how to retrieve the list of enabled content organizer rules using a CAML query. If there is a rule for the release content type and the redirect users setting is enabled, then files should go to the drop off library instead of the target library.

Designing the Solution

The solution has three high-level steps to perform.

Checking the Content Organizer

Using the client object model, it is possible for the user to determine if a Microsoft.SharePoint.Client.Web (Web) is using the content organizer. The client_routerenforcerouting property should be checked for a bool value indicating whether the Web redirects users to the drop off library.

The user should not simply check for this property on the Web and assume all libraries have rules, however. SharePoint requires the user to define content organizer rules to control the routing of content. Thus, the user must retrieve the content organizer rules and evaluate them against the intended content type for an upload. If a rule matches the content type, then the content should be sent to the drop off library instead of the target library, again, only if client_routerenforcerouting is true.

If a Web does not enforce the content organizer redirect, it is safe to upload directly to any library. Thus, retrieving rules for a Web without the content organizer redirect is not necessary. The method below assumes the user has loaded the Web.AllProperties property, for example with context.Load(context.Web, x => x.AllProperties); otherwise, an exception is thrown.
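A minimal usage sketch (the server URL is illustrative) showing the required load before the call:

```csharp
using Microsoft.SharePoint.Client;

using (ClientContext context = new ClientContext("http://scvm1"))
{
    // AllProperties must be loaded first, or RequiresContentOrganizer throws.
    context.Load(context.Web, x => x.AllProperties);
    context.ExecuteQuery();

    bool redirectsToDropOff = RequiresContentOrganizer(context.Web);
}
```

This sketch requires a reachable SharePoint server and the client object model assemblies, so it is not runnable as-is.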

Code Sample – checking for content organizer rules

        private static bool RequiresContentOrganizer(Web web)
        {
            if (!web.IsObjectPropertyInstantiated("AllProperties"))
            {
                throw new InvalidOperationException("Web.AllProperties is not initialized!");
            }

            string fieldName = "client_routerenforcerouting";

            Dictionary<string, object> fieldValues = web.AllProperties.FieldValues;

            if (fieldValues.ContainsKey(fieldName))
            {
                object value = fieldValues[fieldName];

                if (value != null)
                {
                    bool result = false;

                    if (bool.TryParse((string)value, out result))
                    {
                        return result;
                    }
                    else
                    {
                        throw new InvalidOperationException("Unexpected field value in Web properties for content organizer redirect setting!");
                    }
                }
            }

            return false;
        }


Query the Content Organizer Rules

The content organizer rules are created in a site list called “Routing Rules”.  This special list has default views which group the rules. The user can choose to display the rules grouped by Content Type or Target Library. Either grouping will provide a collapsible section so the user can easily navigate complex sets of routing rules.

In order for the client object model to release to the drop off library, instead of the target library, it is necessary to construct a CAML query against the routing rules list. The query will return the view fields defined in the special “Rule” content type and allow the user to inspect the routing rule from the client object model.

Querying the rules list can be split into two parts:

·         Find the list.

·         Query the list.

When using the SharePoint client object model, it is possible for the user to easily construct a simple “All Items” CAML query. The user can specify a row limit and a collection of fields to include in the search results. This approach reduces the amount of XML manipulation the user must perform to build the CAML query.

Code Sample – Part One: Find the List

The routing rules list URL should be consistent across Language packs; you can check the RoutingRuleList_ListFolder key in the dlccore resx file located in the SharePoint %14 Hive%\Resources directory. This means that regardless of your site’s language, you can rely on SharePoint naming the routing rules list URL as “…/RoutingRules”.

        string webUrl = context.Web.ServerRelativeUrl;

        //Handle root sites & sub-sites differently
        string routingRulesUrl = webUrl.Equals("/", StringComparison.Ordinal) ? "/RoutingRules" : webUrl + "/RoutingRules";

        List routingRules = null;

        ListCollection lists = context.Web.Lists;

        IQueryable<List> queryObjects = lists.Include(list => list.RootFolder).Where(list => list.RootFolder.ServerRelativeUrl == routingRulesUrl);

        IEnumerable<List> filteredLists = context.LoadQuery(queryObjects);

        context.ExecuteQuery();

        routingRules = filteredLists.FirstOrDefault();

        if (routingRules == null)
        {
            throw new InvalidOperationException("Could not locate the routing rules list!");
        }


Code Sample – Part Two: Query the List

The query should filter the list results to include only the active (based on the RoutingEnabled field) content organizer rules. The user should notice how the CAML query is restricted to a row limit of 100 and defines a custom set of view fields in the static CamlQuery.CreateAllItemsQuery method. This method generates a basic CAML query, which is then parsed with LINQ to XML and modified to include the query element.

        private CamlQuery GetContentOrganizerRulesCaml()
        {
            //The rule fields referenced by this post; add other Rule fields as needed.
            string[] viewFields = new string[]
            {
                "RoutingEnabled",
                "RoutingContentTypeInternal"
            };

            CamlQuery caml = CamlQuery.CreateAllItemsQuery(100, viewFields);

            XElement view = XElement.Parse(caml.ViewXml);

            XElement routingEnabled = new XElement("Eq",
                new XElement("FieldRef", new XAttribute("Name", "RoutingEnabled")),
                new XElement("Value", new XAttribute("Type", "YesNo"), "1"));

            XElement query = new XElement("Query", new XElement("Where", routingEnabled));

            //Add query element to view element
            view.Add(query);

            caml.ViewXml = view.ToString();

            return caml;
        }


Evaluate the Rules and Conditions

After the user gets the Content Organizer rules, it is important to match the release content type ID against the rules. Should there be a match, the user must release to the drop off library instead of the content type’s library. If no rules match the upload’s content type, then the content can be sent to the content type’s library.

Code Sample – Evaluate Rules and Send Content

This method demonstrates how the user can parse the search results from the Rules List. The “RoutingContentTypeInternal” field needs to be split, in order to determine the rule’s content type and content type ID. If a rule matches, then the user can determine where to correctly send the content.

        private static void EvaluateRules(ListItemCollection items)
        {
            string yourContentTypeId = "0x01010B"; //replace with your upload content type ID.

            ListItem rule = null;

            foreach (ListItem item in items)
            {
                string contentType = null;

                string contentTypeId = null;

                if (item.FieldValues.ContainsKey("RoutingContentTypeInternal"))
                {
                    object value = item.FieldValues["RoutingContentTypeInternal"] ?? string.Empty;

                    string[] values = value.ToString().Split(new char[] { '|' }, StringSplitOptions.None);

                    if (values.Length == 2)
                    {
                        contentType = values[1];
                        contentTypeId = values[0];
                    }
                }

                if (yourContentTypeId == contentTypeId)
                {
                    rule = item;
                    break;
                }
            }

            if (rule != null)
            {
                //send to drop off library...
            }
            else
            {
                //send to content type library...
            }
        }



This post explained the difficulty of using the client object model to release content to a Web with the content organizer enabled (redirecting users to the drop off library) and determining which libraries are impacted by the content organizer rules. It also walked through several code snippets from the attached sample class file, so the user can better understand how to implement and use the object model.

Scosby Content (1.73 kb)



Custom Search Results in KnowledgeLake Imaging for SharePoint - Part 3

by Scosby Friday, December 10, 2010


This post uses Imaging for SharePoint version 4.1 and requires the SDK. Contact KnowledgeLake to learn more about Imaging for SharePoint or the SDK. Contact KnowledgeLake Support to obtain the latest SDK if your company already has a license.

This post will demonstrate how to create a Silverlight Search Results control in a SharePoint Solution. This post will use the DataGrid control available in Microsoft’s Silverlight Control Toolkit. When doing any development work, one should always test in a staging/testing environment before going live to a production server. Class files are available for download at the end of the post.

The User Control

The project will use the Silverlight Toolkit’s DataGrid for displaying our Search Results to the end user. Add a new user control to the KLSearchExtensions project, and name it CustomQueryResults. This name should be used exactly as shown; otherwise, KnowledgeLake Imaging for SharePoint will not recognize the user’s custom control.


The User Control’s data context is an instance of our ViewModel. In XAML, the User Control should look similar to the following, where the XML namespace viewModel points to your ViewModel’s project namespace (in this case it is KLSearchExtensions.ViewModel):

<UserControl x:Class="KLSearchExtensions.CustomQueryResults"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             xmlns:sdk="http://schemas.microsoft.com/winfx/2006/xaml/presentation/sdk"
             xmlns:controlsToolkit="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Toolkit"
             xmlns:viewModel="clr-namespace:KLSearchExtensions.ViewModel">
    <UserControl.DataContext>
        <viewModel:CustomQueryResultsViewModel x:Name="Model"
                                               PropertyChanged="Model_PropertyChanged" />
    </UserControl.DataContext>
    <Grid x:Name="LayoutRoot">
        <controlsToolkit:BusyIndicator IsBusy="{Binding IsBusy}">
            <sdk:DataGrid x:Name="dataGrid"
                          AutoGenerateColumns="False"
                          Margin="4" />
        </controlsToolkit:BusyIndicator>
    </Grid>
</UserControl>

The User Control’s root element is a Grid. Add the Silverlight Toolkit’s BusyIndicator to the Grid. The BusyIndicator control hosts the actual content of our control, but allows the user to display an information message during loading or searching. Add the Silverlight Toolkit’s DataGrid to the BusyIndicator control. Be sure to set the AutoGenerateColumns property to false; the Search Results will be dynamically built for the user to bind to the grid.

Code Behind

A Strongly Typed ViewModel

The user will benefit from creating a convenience property to get an instance of the ViewModel from the User Control. Create a new private property named TypedModel. This property will encapsulate a strongly typed instance of the ViewModel for our User Control. In other words, the property returns the User Control’s DataContext property as an instance of our ViewModel. The code should look like the following:


Strongly Typed ViewModel Code Sample

        private CustomQueryResultsViewModel TypedModel
        {
            get
            {
                return this.DataContext as CustomQueryResultsViewModel;
            }
        }



IQueryResults Interface

Implement the IQueryResults interface from the namespace KnowledgeLake.Silverlight.Search.Contracts. After implementing the interface, the Error event will be defined. This event is raised by KnowledgeLake Imaging for SharePoint when an error occurs during the loading of the Search Result extension. The user should not raise this event, but the user can handle it. It is important to note that, since this is a Silverlight application, the user will have to log to IsolatedStorage on the client’s machine or call a web service to record the error. However, handling this error is beyond the scope of this post.


The IQueryResults interface members OpenExecuteSearch (method) and Query (property) will be implemented as wrappers around the User Control’s ViewModel members. The ExecuteQueryStringSearch method is beyond the scope of this post and will not be implemented. The interface members’ implementations should look similar to the following:


IQueryResults Interface Code Sample

        public event EventHandler<KnowledgeLake.Silverlight.EventArgs<Exception>> Error;

        public void ExecuteQueryStringSearch()
        {
            throw new NotImplementedException();
        }

        public void OpenExecuteSearch(string keywords)
        {
            this.TypedModel.OpenExecuteSearch(keywords);
        }

        public string Query
        {
            get { return this.TypedModel.Query; }
            set { this.TypedModel.Query = value; }
        }

ViewModel PropertyChanged Event

The PropertyChanged event is declared in the User Control’s XAML and wired up to an event handler called Model_PropertyChanged. This handler must respond to the Query and SearchResultSource properties. The method should look like the following:

ViewModel Property Changed Code Sample

        private void Model_PropertyChanged(object sender, PropertyChangedEventArgs e)
        {
            var model = sender as CustomQueryResultsViewModel;

            switch (e.PropertyName.ToUpperInvariant())
            {
                case "QUERY":
                    //respond to the query changing...
                    break;

                case "SEARCHRESULTSOURCE":
                    this.RenderResults(model.SearchResultSource);
                    break;

                case "MESSAGE":
                    //display the message to the end user...
                    break;
            }
        }

Displaying Search Results

Create a method called RenderResults which takes a parameter of type SearchResultSource. This method will render the search results into the DataGrid. If the user has added or removed columns to the query since it was last executed, the UpdateColumns property will evaluate to true. In this scenario, it is necessary to clear the grid of all columns and rebuild the grid.


Each of the SearchResultSource.ViewColumns items is represented as a property of each item in the SearchResultSource.Results collection. The SearchResultSource.Results property is dynamically built to contain the corresponding ViewColumn’s value when bound to the DataGrid.ItemsSource property. In other words, these are the rows of the grid with the column values included.


Displaying Search Results Code Sample

        private void RenderResults(SearchResultSource result)
        {
            if (result.ViewColumns != null)
            {
                if (result.UpdateColumns) //user changed result columns on the query builder
                {
                    this.dataGrid.Columns.Clear();

                    foreach (ViewColumn column in result.ViewColumns)
                    {
                        var item = new DataGridTextColumn();
                        item.IsReadOnly = true;
                        item.Header = column.FriendlyName;
                        item.Binding = new Binding(column.InternalName);

                        this.dataGrid.Columns.Add(item);
                    }
                }

                this.dataGrid.ItemsSource = result.Results;
            }
        }


If the end user added or removed columns on the query, the method clears the grid’s columns and rebuilds them from the SearchResultSource.ViewColumns collection. As the user iterates the ViewColumns collection, be sure to create a new Binding for the column using the ViewColumn.InternalName property. This binding enables the grid to display the appropriate column value for each item in the SearchResultSource.Results collection. This is powerful functionality provided by KnowledgeLake Imaging for SharePoint. Essentially, the user does not need to worry about which columns were chosen for the query or how to bind those columns to the Search Results.





Download Files:

Blog Posts In This Series


Finished Custom Control



Custom Search Results in KnowledgeLake Imaging for SharePoint - Part 2

by Scosby Friday, December 10, 2010


This post uses Imaging for SharePoint version 4.1 and requires the SDK. Contact KnowledgeLake to learn more about Imaging for SharePoint or the SDK. Contact KnowledgeLake Support to obtain the latest SDK if your company already has a license.

This post will demonstrate how to create a Silverlight Search Results control in a SharePoint Solution. This post will use the DataGrid control available in Microsoft’s Silverlight Control Toolkit. When doing any development work, one should always test in a staging/testing environment before going live to a production server. Class files are available for download at the end of the post.

The ViewModel

Create a new folder in the KLSearchExtensions project named ViewModel. Next, add a new class named CustomQueryResultsViewModel to the folder. This class will be responsible for managing the service layer which retrieves search results. Additionally, the User Control will bind to the ViewModel for display of results and status information.

Service Layer

The ViewModel’s constructor should create an instance of the SearchService class, located in the KnowledgeLake.Silverlight.Imaging.Search.Services namespace, and store it in a private member field. The constructor should also wire up the SearchService.FacetSearchCompleted event to a handler that will process the results of our query. In order to use the SearchService class, the Silverlight application needs specific bindings in its ServiceReferences.ClientConfig file. The SearchService class dynamically constructs the client proxy with its current URL, so the client endpoint addresses can be omitted. A downloadable ClientConfig file is included in this post. It should look like the following:






<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="FacetQuerySearchSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
        <binding name="LoggerSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
        <binding name="FileInformationSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
        <binding name="WorkflowServiceSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
        <binding name="TaxonomywebserviceSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
        <binding name="RecordsRepositorySoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
        <binding name="IndexWebServiceSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
          <security mode="None" />
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint binding="basicHttpBinding" bindingConfiguration="FacetQuerySearchSoap" contract="SearchClient.FacetQuerySearchSoap" name="FacetQuerySearchSoap" />
      <endpoint binding="basicHttpBinding" bindingConfiguration="LoggerSoap" contract="LoggingClient.LoggerSoap" name="LoggerSoap" />
      <endpoint binding="basicHttpBinding" bindingConfiguration="FileInformationSoap" contract="ViewClient.FileInformationSoap" name="FileInformationSoap" />
      <endpoint binding="basicHttpBinding" bindingConfiguration="WorkflowServiceSoap" contract="WorkflowClient.WorkflowServiceSoap" name="WorkflowServiceSoap" />
      <endpoint binding="basicHttpBinding" bindingConfiguration="TaxonomywebserviceSoap" contract="TaxonomyClient.TaxonomywebserviceSoap" name="TaxonomywebserviceSoap" />
      <endpoint binding="basicHttpBinding" bindingConfiguration="RecordsRepositorySoap" contract="OfficialFileClient.RecordsRepositorySoap" name="RecordsRepositorySoap" />
      <endpoint binding="basicHttpBinding" bindingConfiguration="IndexWebServiceSoap" contract="IndexClient.IndexWebServiceSoap" name="IndexWebServiceSoap" />
    </client>
    <extensions />
  </system.serviceModel>
</configuration>




ViewModel Constructor Code Sample

        public CustomQueryResultsViewModel()
        {
            this.searchService = new SearchService();

            this.searchService.FacetSearchCompleted += new EventHandler<FacetSearchEventArgs>(searchService_FacetSearchCompleted);
        }

        private void searchService_FacetSearchCompleted(object sender, FacetSearchEventArgs e)
        {
            this.IsBusy = false;

            if (e.Results != null && e.Error == null)
            {
                SearchResultSource results = SearchProcessor.ProcessNewSearchResult(e.Results);

                results.UpdateColumns = this.UpdateResultColumns(results);

                this.SearchResultSource = results;
            }
            else
            {
                this.Message = "Service error getting search!";
            }
        }

In the FacetSearchCompleted handler notice two things:

  1. There are a few properties used in the method. The IsBusy and Message properties notify the UI to display certain information to the user, such as a busy indicator or an error message. The SearchResultSource property holds the processed results of our query for later use.
  2. We pass the SearchResults (e.Results) to the SearchProcessor class and the UpdateResultColumns method on the ViewModel. 

The SearchProcessor class greatly simplifies the task of binding dynamic results to a data grid. Its ProcessNewSearchResult method is publicly accessible as of KnowledgeLake Imaging for SharePoint (hotfix 1) or higher. The return value of ProcessNewSearchResult is of type SearchResultSource, a class that exposes an IEnumerable collection of the search results through its Results property.

Detecting New Results Columns

When the Search Service is processing the results, it is important for the grid to know if the end user has added or removed a column so it can rebuild the columns. Thus, the ViewModel needs a method to determine if the results have been updated by the client since a previous query was executed. Create a method named UpdateResultColumns that returns a Boolean and has a parameter of type SearchResultSource. The method should look similar to the following:


Detecting New Results Columns Code Sample

        private bool UpdateResultColumns(SearchResultSource results)
        {
            bool updateColumns = true;

            if (this.SearchResultSource != null)
            {
                //We need to determine if the user added/removed columns if SearchResultSource is not null.
                if (this.SearchResultSource.ViewColumns.Count < results.ViewColumns.Count)
                {
                    //More columns than before: the user added columns.
                    var uniqueColumns = results.ViewColumns.Except(this.SearchResultSource.ViewColumns, new ViewColumnComparer());

                    updateColumns = uniqueColumns != null && uniqueColumns.Count() > 0;
                }
                else
                {
                    //Same count or fewer: check for columns the user removed.
                    var uniqueColumns = this.SearchResultSource.ViewColumns.Except(results.ViewColumns, new ViewColumnComparer());

                    updateColumns = uniqueColumns != null && uniqueColumns.Count() > 0;
                }
            }

            return updateColumns;
        }

The UpdateResultColumns method checks whether the user added or removed columns in the query. If the SearchResultSource parameter has more columns than the model property, the user added columns to the query; otherwise, the user may have removed columns. In either case, if any columns do not belong, the method returns true. The result of the method is used in the FacetSearchCompleted handler to flag the model’s SearchResultSource instance to update the columns. This value is assigned to the SearchResultSource.UpdateColumns property, informing the UI to render the results.
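The column comparison relies on a ViewColumnComparer. The SDK supplies this type, but conceptually it is just an IEqualityComparer<T> keyed on the column's identity. A minimal sketch, assuming a hypothetical ViewColumn class with a Name property (the real SDK types may differ):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

//Hypothetical stand-in for the SDK's view column type.
public class ViewColumn
{
    public string Name { get; set; }
}

//Sketch of an equality comparer that treats columns with the same name as equal.
public class ViewColumnComparer : IEqualityComparer<ViewColumn>
{
    public bool Equals(ViewColumn x, ViewColumn y)
    {
        if (x == null || y == null)
            return x == y;

        return string.Equals(x.Name, y.Name, StringComparison.OrdinalIgnoreCase);
    }

    public int GetHashCode(ViewColumn obj)
    {
        return (obj == null || obj.Name == null) ? 0 : obj.Name.ToUpperInvariant().GetHashCode();
    }
}
```

With a comparer like this, Enumerable.Except returns only the columns present in one collection but not the other, which is exactly the added/removed test UpdateResultColumns performs.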


Query Layer

The ViewModel should expose two public methods, to be called by the IQueryResults interface from the User Control, named ExecuteQuery and OpenExecuteSearch. The methods should look similar to the following:

Query Layer Code Sample

        public void ExecuteQuery()
        {
            if (!string.IsNullOrWhiteSpace(this.Query))
            {
                this.IsBusy = true;
            }
        }

        public void OpenExecuteSearch(string keywords)
        {
            if (!string.IsNullOrWhiteSpace(keywords))
            {
                string query = QueryManager.GetSearchQuery(keywords, "And", false);

                this.Query = query;
            }
        }

The OpenExecuteSearch method is called by the KnowledgeLake Search Center’s implementation of the OpenSearch protocol. Review the KnowledgeLake Imaging for SharePoint documentation to learn more about how the Search Center leverages the OpenSearch protocol.

The QueryManager class is responsible for constructing a query in the format KnowledgeLake Search uses to perform a SharePoint Search query. The user should never attempt to manipulate this string; again, the QueryManager class encapsulates all necessary logic to construct the query. The User Control will be notified, via a BindingExpression, of the updated Query property value and call the ExecuteQuery method to begin retrieving search results.
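That BindingExpression notification depends on the ViewModel raising change notifications for its Query property. A minimal sketch of how such a property is typically implemented in a Silverlight MVVM ViewModel (the class and member names here are illustrative, not taken from the SDK):

```csharp
using System.ComponentModel;

public class QueryViewModelSketch : INotifyPropertyChanged
{
    private string query;

    public event PropertyChangedEventHandler PropertyChanged;

    //Setting Query raises PropertyChanged, which refreshes any BindingExpression bound to it.
    public string Query
    {
        get { return this.query; }
        set
        {
            if (this.query != value)
            {
                this.query = value;
                this.OnPropertyChanged("Query");
            }
        }
    }

    protected virtual void OnPropertyChanged(string propertyName)
    {
        PropertyChangedEventHandler handler = this.PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}
```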

Part three of this series will look at the User Control for the Custom Search Results.

Download Files:

Blog Posts In This Series


IT | Programming

Custom Search Results in KnowledgeLake Imaging for SharePoint - Part 1

by Scosby Friday, December 10, 2010


This post uses Imaging for SharePoint version 4.1 and requires the SDK. Contact KnowledgeLake to learn more about Imaging for SharePoint or the SDK. Contact KnowledgeLake Support to obtain the latest SDK if your company already has a license.

This post will demonstrate how to create a Silverlight Search Results control in a SharePoint Solution. This post will use the DataGrid control available in Microsoft’s Silverlight Control Toolkit. When doing any development work, one should always test in a staging/testing environment before going live to a production server. Class files are available for download at the end of the post.

Getting Started

Extending KnowledgeLake Imaging for SharePoint

KnowledgeLake Imaging for SharePoint allows for the extension of the Search Results control in the search center and the web part with a custom Silverlight 4 control. This is a powerful feature because it gives users the ability to customize the display of KnowledgeLake Search Results in a variety of ways. However, any custom Search Results control will replace the default KnowledgeLake control in both the search center and the web part. There is no way to add functionality to the default KnowledgeLake Search Results control. In fact, a custom control will not inherit any functionality from the default control. All the functionality of a custom control must be implemented by the developer from the ground up. See Figure 1 below for a picture of the default control in the search center; it would look similar in the web part.

Figure 1 - Default Search Results Control


Setting Up the SharePoint Solution Project

Create a new Silverlight 4 application and name it KLSearchExtensions. The Silverlight application does not need to be hosted in a new web site. Uncheck the “Host the Silverlight application in a new Web site” box in the New Silverlight Application dialog; it should then appear similar to Figure 2 below.

Figure 2 - Create a New Silverlight Application

In my previous post on Creating SharePoint 2010 Solutions for Silverlight Applications I described how to easily deploy Silverlight applications to SharePoint 2010. This approach allows the developer to hit F5 to build, deploy, and debug the Silverlight extension. Pretty slick stuff! Create a new SharePoint Solution and name it ExtensionSolution, as shown in Figure 3 below.

Figure 3 - Create a New SharePoint Solution


The extension must be placed in the %SharePointRoot%\Template\Layouts\KLClientBin directory, thus set the Deployment Location’s Path property appropriately. This is shown in Figure 4 below. Read more about project output references in my previous post mentioned above.

Figure 4 - Set Deployment Location Path


Creating the Extension Project

The extension will be designed to use the Model-View-ViewModel (MVVM) pattern. Add to the project the following references from the KnowledgeLake Imaging for SharePoint SDK and Silverlight Control Toolkit:

·         KnowledgeLake.Silverlight.dll

·         KnowledgeLake.Silverlight.Imaging.Search.dll

·         KnowledgeLake.Silverlight.Search.Contracts.dll

·         System.Windows.Controls.Input.Toolkit.dll

·         System.Windows.Controls.Toolkit.dll

KnowledgeLake Imaging for SharePoint requires a user control to be specifically named and implement a specific interface. Add a new User Control to the KLSearchExtensions project and name it CustomQueryResults; if the control is not named exactly this way, the extension won’t be recognized.

Part two of this series will look at the ViewModel for the Custom Search Results.

Download Files: (3.46 kb)

Blog Posts In This Series


IT | Programming

Changing the port for SharePoint 2010 Central Administration

by Scosby Tuesday, September 21, 2010

After reading a rather ridiculous blog post, where the author's solution was to create Alternate Access Mappings, I thought I'd address how to change the port with less confusion. 

Using the Set-SPCentralAdministration cmdlet, you can change the port hosting your Central Administration web application. You can easily change it as follows:

Set-SPCentralAdministration -Port 8282
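As a quick sanity check after changing the port, you can confirm the Central Administration URL from the same PowerShell session. This sketch uses the standard Get-SPWebApplication cmdlet and filters for the administration web application:

```powershell
Get-SPWebApplication -IncludeCentralAdministration |
    Where-Object { $_.IsAdministrationWebApplication } |
    Select-Object Url
```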

You can read more about the Set-SPCentralAdministration cmdlet in the SharePoint 2010 PowerShell documentation on TechNet.

In conclusion, I find a browser bookmark is quite capable of getting me to Central Admin with a single click on the Favorites Bar. This is a quick and easy solution that relieves me from having to remember what port is hosting Central Admin. However, I understand the need to change it may arise, and PowerShell is there to save the day!


Custom Health Rules in SharePoint 2010

by Scosby Monday, April 26, 2010

You can build your own health rules in SharePoint Foundation or SharePoint Server 2010. SharePoint 2010 includes a new, integrated health analysis tool, named SharePoint Health Analyzer, that enables you to check for potential configuration, performance, and usage problems. A health rule runs a test and returns a status that tells you the outcome of the test. When any rule fails, the status is written to the Health Reports list in SharePoint Foundation 2010 and to the Windows Event Log. The SharePoint Health Analyzer also creates an alert in the Health Analyzer Reports list on the Review problems and solutions page in Central Administration. You can click an alert to view more information about the problem and see steps to resolve it. You can also open the rule that raised the alert and change its settings.

Let's take a brief walk-through of the MSDN example, so I can point out how to use the new VS2010 SharePoint Project to simplify the deployment of your new custom health rule. Of course, nothing is free and there are always unintended consequences, so be sure to review the guidelines for designing health rules before you get started. The following screen shots illustrate the key points for building your new rule and the output you can expect in Central Administration.

You can follow along with developing and deploying a custom health rule on MSDN.

When creating your health rule, derive from the abstract base class SPHealthAnalysisRule and implement the abstract methods. The only thing I will note here is to simplify your first rule by implementing the SPHealthAnalysisRule.Check override as follows:

        public override SPHealthCheckStatus Check()
        {
            //Implement test logic...
            return SPHealthCheckStatus.Failed;
        }
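For context, a complete rule must also implement the other abstract members of SPHealthAnalysisRule. A minimal sketch follows; the summary, explanation, and remedy strings are placeholders of my own, and you should consult the SDK documentation for the exact member set and enum values in your version:

```csharp
using Microsoft.SharePoint.Administration.Health;

//Minimal health rule sketch: always reports a failure so the alert pipeline can be observed.
public class SampleHealthRule : SPHealthAnalysisRule
{
    public override string Summary
    {
        get { return "Sample rule that always fails."; }
    }

    public override string Explanation
    {
        get { return "This rule exists only to demonstrate the Health Analyzer pipeline."; }
    }

    public override string Remedy
    {
        get { return "No action required; disable the rule when finished testing."; }
    }

    public override SPHealthCategory Category
    {
        get { return SPHealthCategory.Configuration; }
    }

    public override SPHealthCheckErrorLevel ErrorLevel
    {
        get { return SPHealthCheckErrorLevel.Error; }
    }

    public override SPHealthCheckStatus Check()
    {
        //Implement test logic...
        return SPHealthCheckStatus.Failed;
    }
}
```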

In order to deploy our new health rule, be sure to create a class derived from SPFeatureReceiver and follow the MSDN example for the FeatureActivated and FeatureDeactivating method overrides. It is OK to declare your feature receiver in the same assembly as your health rule; you can always change this later. Once you have completed this task, build the feature receiver and open it with Red Gate's Reflector so you can obtain the assembly's full name. Write this down for later.

Next, let's add an empty SharePoint project to our solution. Once it's created, add a Feature and wire in the assembly containing our FeatureReceiver and custom health rule.

If we double click our Package.package file (not shown above), we can add our assembly to the GAC in the "Advanced" tab. Choose to add an assembly from project output, as shown in the following 2 screens:


Now we can edit our Feature by double clicking it in the solution explorer. In the properties window, we need to declare the receiver assembly and class using the assembly full name from Reflector.


Be sure that you follow along with the MSDN example and set your scope and always force install options accordingly (as shown above).

Next, we can set the SharePoint project as our startup project for our Visual Studio solution and hit "F5" to deploy our new feature and have it run the overrides in the FeatureReceiver class.

Now we can visit the Monitoring section in Central Administration to see our new health rule:

We can now run it from the "View Item" menu and see what happens:


After a moment, we should see the health analyzer ribbon on the Central Administration home page:

 If we click on "view these issues", we see a nice summary of what's going on in our farm:


Finally, we can check the Windows Event Log to see how the Health Analyzer reports our health rule:

Good luck with creating your own Health Rules and be sure to follow the above mentioned guidelines!



IT | Programming

Creating SharePoint 2010 Solutions for Silverlight Applications

by Scosby Tuesday, March 9, 2010

The goal of this post is to explain the process of creating a SharePoint Solution that deploys a Silverlight application. At the time this post was written, I was using Visual Studio 2010 RC and MSS 2010 "beta 2".

There are two primary benefits to using the SharePoint Solution. First, we can initially develop the SL application using our SharePoint Solution and then have an easier transition to production. Last, but not least, we can have our SL app compiled and added to the SharePoint Solution during the build process. This is probably the most beneficial reason for using this technique. Admittedly, there are other ways of deploying your SL app, but this one makes a lot of sense when you consider that you'll probably want a custom Site Definition for hosting your SL app anyway. But enough background information already, on with the show!

  1. Create a new site definition project, even if you’re not creating a site definition. This creates the special “Site Definition” folder that you cannot add later. So it’s better to have it and not use it, than to need it and not have it.

  2. Enter your development machine’s URL.

  3. Close the newly created onet.xml file for the site definition, unless, of course, you are actually creating a new site definition.
  4. Set your new SharePoint Solution as the Startup Project.
  5. Add your Silverlight Application to your solution.
  6. In order to dynamically compile the Silverlight Application project’s output into our SharePoint solution, we need to use a Module. Add a new module to the project.

  7. Close the newly created Elements.xml file and then delete the Sample.txt file underneath your module.
  8. Click on the Module in Solution Explorer. Open the Project Output References property window.

    You should see a screen similar to the following:

  9. Click add and then change the Deployment Type to TemplateFile. Choose your Silverlight project in the Project Name property drop down. Next, expand the Deployment Location property and then change the Path to whatever location you want. See Developing SharePoint solutions for a more comprehensive chart of the Deployment Type file locations.

  10. A Module is deployed with a Feature. If you have not already defined one, a default Feature will be created to host the module.
  11. Press F5 to deploy the solution and begin debugging.
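As a point of reference, a module's Elements.xml generally takes the shape below. Note that with the TemplateFile deployment type the packaging tools place the output under the TEMPLATE folder, so the manifest Visual Studio generates for your project output may differ; the module and file names here are placeholders:

```xml
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="SilverlightModule">
    <!-- Path is relative to the module folder in the package;
         Url is where the file is provisioned. -->
    <File Path="SilverlightModule\MyApp.xap" Url="ClientBin/MyApp.xap" />
  </Module>
</Elements>
```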


IT | Programming