Jills blog – of technology and everyday life

Integrate LinkedIn with Sharepoint


Recently I worked on a project where the customer wanted to publish and share information from their website to LinkedIn, to reach a broader audience and to encourage user interaction. The solution is based on SharePoint 2010. LinkedIn should not be used as a fully integrated identity provider, but as a connector to some of the services on LinkedIn. Here are some of the experiences I gained through the project.


LinkedIn restrictions

We wanted to share articles on LinkedIn to enable discussion both on LinkedIn and on the portal site. When logged in with LinkedIn on the portal, the user could post comments directly from the site and see all comments from LinkedIn. The LinkedIn API requires all service calls to be made on behalf of an authenticated user; otherwise no data can be retrieved from the API. This was a drawback, because comments from other users could not be shown in the portal without being logged in, even if the comments are in a public LinkedIn group. There was also no way to share rich data using the Group API. To share content we created a new discussion post on LinkedIn. This has limitations on content length and rich text formatting, but it has a submitted-url property, so we used that url to share a link to the article in the portal.

Authentication overview

LinkedIn uses OAuth for authentication and API access. OAuth provides a standard for authentication and authorization functionality for client and server applications. This allows the SharePoint portal to accept and verify identities from LinkedIn and access data on behalf of the user in a secure way, without the users ever revealing their passwords.

The OAuth protocol uses security tokens to communicate between client and server. The API key identifies the application and is required to make API calls. The initial request uses the API key to get an access token. The access token is needed in all further calls to access data from LinkedIn on behalf of the user; no data access is allowed without one. The access token is retrieved by redirecting to LinkedIn’s authorization dialog.

The permissions that are granted to the application are listed in the authorization box (these permissions correspond to the data the portal will be using). If the user allows access, there is a redirect back to the portal with a valid access token. We store this access token in the session and use it throughout all further calls to the LinkedIn API from the portal.

Authentication api: https://developer.linkedin.com/documents/authentication

LinkedIn certificate

We had some trouble making server requests to LinkedIn with this exception: The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel. —> System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure. at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, Exception exception) at…

The solution is to install the SSL certificate on the web server. We solved it by adding the VeriSign root 3 certificate as a trusted source (via Central Administration > Security > Manage trust):


Root 3 – VeriSign Class 3 Primary CA – G5  (file: PCA-3G5.pem)

However, the exception was still thrown on one of the servers after the certificate was installed. To fix this issue we added a workaround in code: before calling LinkedIn, a bypass of the SSL validation check was added. Note that this disables certificate validation entirely, so it should only be used as a last resort and scoped as narrowly as possible:

    // workaround: bypass certificate validation errors
    // (uses the System.Net, System.Net.Security and
    // System.Security.Cryptography.X509Certificates namespaces)
    ServicePointManager.ServerCertificateValidationCallback =
        delegate(object s, X509Certificate certificate,
                 X509Chain chain, SslPolicyErrors sslPolicyErrors)
        { return true; };



To implement the OAuth flow required for the LinkedIn integration we used a third-party OAuth library for .NET called DotNetOpenAuth. DotNetOpenAuth requires a token manager that handles the storage and retrieval of the OAuth tokens needed for user authentication in the OAuth standard. In our solution this is a custom class implementing the IConsumerTokenManager interface, with the storage handled by server session variables. It stores the LinkedIn API keys initially, before the authentication process starts.
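A simplified sketch of such a session-backed token manager could look like this. The class name and session keys are illustrative, and the member signatures are based on DotNetOpenAuth 4.x, so they may differ in other versions:

```csharp
// Hypothetical session-backed token manager (names and session keys are illustrative)
using System.Web;
using DotNetOpenAuth.Messaging;
using DotNetOpenAuth.OAuth.ChannelElements;
using DotNetOpenAuth.OAuth.Messages;

public class SessionTokenManager : IConsumerTokenManager
{
    // The LinkedIn API keys are put in the session before authentication starts
    public string ConsumerKey
    {
        get { return (string)HttpContext.Current.Session["LinkedInApiKey"]; }
    }

    public string ConsumerSecret
    {
        get { return (string)HttpContext.Current.Session["LinkedInApiSecret"]; }
    }

    public string GetTokenSecret(string token)
    {
        return (string)HttpContext.Current.Session[token];
    }

    public void StoreNewRequestToken(UnauthorizedTokenRequest request,
                                     ITokenSecretContainingMessage response)
    {
        HttpContext.Current.Session[response.Token] = response.TokenSecret;
    }

    public void ExpireRequestTokenAndStoreNewAccessToken(string consumerKey,
        string requestToken, string accessToken, string accessTokenSecret)
    {
        HttpContext.Current.Session.Remove(requestToken);
        HttpContext.Current.Session[accessToken] = accessTokenSecret;
    }

    public TokenType GetTokenType(string token)
    {
        // Simplified: a real implementation would track request vs. access tokens
        return TokenType.AccessToken;
    }
}
```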

DotNetOpenAuth provides a WebConsumer class for the service requests. This class uses the token manager and the URIs for the requests. We implemented the API URIs and processed the request/response data from LinkedIn, mapping it to ServiceEntities objects that represent the structure of the data expected or returned from LinkedIn. When this structure is in place the objects can easily be mapped. To generate the API URIs and the object formats, study the API documentation: https://developer.linkedin.com/rest
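As a hedged sketch, an authorized profile request through WebConsumer might look like the following. The class and method are illustrative, the DotNetOpenAuth calls are based on version 4.x, and the token manager and access token are assumed to come from the earlier authorization step:

```csharp
// Illustrative sketch of calling the LinkedIn profile API with WebConsumer
using System.IO;
using System.Net;
using DotNetOpenAuth.Messaging;
using DotNetOpenAuth.OAuth;
using DotNetOpenAuth.OAuth.ChannelElements;

public class LinkedInClient
{
    public static string GetProfileXml(IConsumerTokenManager tokenManager, string accessToken)
    {
        // LinkedIn's OAuth endpoints; the request token scope is set here
        var linkedIn = new ServiceProviderDescription
        {
            RequestTokenEndpoint = new MessageReceivingEndpoint(
                "https://api.linkedin.com/uas/oauth/requestToken?scope=r_fullprofile+rw_groups",
                HttpDeliveryMethods.PostRequest),
            UserAuthorizationEndpoint = new MessageReceivingEndpoint(
                "https://www.linkedin.com/uas/oauth/authorize",
                HttpDeliveryMethods.PostRequest),
            AccessTokenEndpoint = new MessageReceivingEndpoint(
                "https://api.linkedin.com/uas/oauth/accessToken",
                HttpDeliveryMethods.PostRequest),
            TamperProtectionElements = new ITamperProtectionChannelBindingElement[]
                { new HmacSha1SigningBindingElement() },
        };

        var consumer = new WebConsumer(linkedIn, tokenManager);

        // Prepare a signed request to the profile API on behalf of the user
        var profileEndpoint = new MessageReceivingEndpoint(
            "https://api.linkedin.com/v1/people/~", HttpDeliveryMethods.GetRequest);
        HttpWebRequest request = consumer.PrepareAuthorizedRequest(profileEndpoint, accessToken);

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd(); // map this XML onto ServiceEntities objects
        }
    }
}
```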

LinkedIn permission scopes are set up in the RequestTokenEndpoint. If permissions need to be changed, this endpoint needs to be changed: https://api.linkedin.com/uas/oauth/requestToken?scope=r_fullprofile rw_groups r_contactinfo r_emailaddress. The scope determines which permissions the application wants to access on behalf of the user.


April 01 / 2013
Author Jill
Category .Net, Sharepoint
Comments No Comments

New look

This blog has been dead for a while. So maybe a new look will make this site alive again? I’m planning on writing shorter posts, making it easier to write something, and it doesn’t need to be so perfectly well written.. because that will only stop me from posting anything. So.. new posts coming soon!

September 09 / 2012
Author Jill
Category Myself
Comments No Comments

Custom WCF REST service in SharePoint

Recently I have been working with a Windows Communication Foundation (WCF) service in SharePoint 2010. We built a jQuery-AJAX-heavy web application on top of the SharePoint platform, with business data exposed from a custom Microsoft SQL database. The solution should be accessed from within a SharePoint intranet site, within its security boundaries and SharePoint groups. Based on those requirements we chose a WCF service as the communication architecture to expose the object model and the business objects.

WCF configuration

Windows Communication Foundation (WCF) is a unifying programming model for creating service-oriented applications, and it supports many hosting options and communication protocols. The technology may seem complex in terms of all the configuration possibilities. For this solution we have used a JSON-enabled WCF REST service, which is a service configured to use the HTTP protocol. The requests and responses are text based (not SOAP) and the formatting is JSON. The serializing and deserializing of objects on both sides is taken care of by the service, so you simply pass JavaScript objects on the frontend and .NET objects on the backend.

To set up a SharePoint project with a WCF service, some dll references need to be added manually: System.ServiceModel, System.ServiceModel.Web and System.Runtime.Serialization. To expose the service inside the SharePoint project, an ISAPI mapped folder is added to the solution with a service file (.svc). This is a text file containing information about the service so that it can be run by Microsoft Internet Information Services.

The svc file points to the implementation of the service:

<span style="color: #808080;">&lt;%@ ServiceHost Language="C#" Debug="true" Service="Myservice.Service,Myservice, Version=,Culture=neutral, PublicKeyToken=a0b806fa45919e70" %&gt;</span>

The service is defined by a service contract which is an interface class defining data and operations, and a service class which implements the contract. The service is a wrapper for business logic class which processes the requests and responses, and the data layer which accesses the database.

The configuration of the service is set up in the web.config file:

<span style="color: #808080;">&lt;system.serviceModel&gt;
  &lt;serviceHostingEnvironment aspNetCompatibilityEnabled="true"/&gt;
  &lt;!-- Behavior setting for wcf service --&gt;
  &lt;behaviors&gt;
    &lt;endpointBehaviors&gt;
      &lt;!-- configuration for rest relies on web http --&gt;
      &lt;behavior name="RestBehavior"&gt;
        &lt;webHttp /&gt;
      &lt;/behavior&gt;
    &lt;/endpointBehaviors&gt;
  &lt;/behaviors&gt;
  &lt;services&gt;
    &lt;!-- register wcf service --&gt;
    &lt;service name="Myservice.Service"&gt;
      &lt;endpoint address="" binding="webHttpBinding" behaviorConfiguration="RestBehavior" contract="Myservice.IService" bindingConfiguration="WindowsAuthenticationBasicHttpBinding"&gt;
      &lt;/endpoint&gt;
    &lt;/service&gt;
  &lt;/services&gt;
  &lt;bindings&gt;
    &lt;!-- webhttp binding for service --&gt;
    &lt;webHttpBinding&gt;
      &lt;binding name="WindowsAuthenticationBasicHttpBinding"&gt;
        &lt;security mode="TransportCredentialOnly"&gt;
          &lt;transport clientCredentialType="Windows" /&gt;
        &lt;/security&gt;
      &lt;/binding&gt;
    &lt;/webHttpBinding&gt;
  &lt;/bindings&gt;
&lt;/system.serviceModel&gt;</span>

<serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
Setting this value to true indicates that all WCF services running in the application run in ASP.NET compatibility mode. In WCF, a binding determines how WCF is going to communicate. For a RESTful endpoint, the binding is set to <webHttpBinding>. An endpoint behavior for the service is set to enable the web programming model for WCF.

The communication channel for WCF is HTTP instead of SOAP, configured by <webHttpBinding> as the binding configuration. This channel is configured with security on the transport channel, using Windows credentials (which gives integrated SharePoint authentication). Note that clientCredentialType is set to “Windows”. Another option is “Ntlm”, but when Kerberos is used as the authentication provider, only “Windows” will work.

The service itself is registered in the configuration with its service contract interface and the mapping to the binding and behavior shown above.

WCF implementation

The WCF service is defined with a service contract, which specifies the operations the client can perform on the service.

[ServiceContract]
public interface IService
{
    [OperationContract]
    [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, ResponseFormat = WebMessageFormat.Json)]
    String HelloWorld(string Message);
}

To pass a custom object as a request or response, specify a DataContract that both sides agree on as the exchange format. The data members must be serializable:

[DataContract]
public class Tag
{
    [DataMember]
    public Int32 TagID { get; set; }

    [DataMember]
    public string KeyT { get; set; }
}

The service class implementation:

[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
public class Service : IService
{
    ...
}
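On the frontend, a jQuery AJAX call to an operation like HelloWorld above might look like this. The service URL is an assumption that depends on where the .svc file is deployed; since the contract uses WrappedRequest, the parameters are wrapped in a JSON object keyed by parameter name:

```javascript
// Hypothetical jQuery call to the HelloWorld operation (service URL is an assumption)
$.ajax({
    type: "POST",
    url: "/_vti_bin/Myservice/Service.svc/HelloWorld",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    data: JSON.stringify({ Message: "Hello from the frontend" }),
    success: function (result) {
        console.log(result); // the JSON response, deserialized to a JavaScript object
    }
});
```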

Security and permissions

The application is hosted in a SharePoint site and uses SharePoint authentication. The WCF service is also configured to use the same security provider. To expose different functionality to different user groups, a set of SharePoint groups was created. The access level is implemented on both frontend and backend, authenticating the user against the specified groups. The service call itself does no access check apart from the authentication; the business layer checks what access level the current user has and returns appropriate data. This is possible because the service runs with the current user’s credentials. It is also possible to configure permission levels on the WCF service itself by using role-based security. This would be easy to set up for AD user groups, but I’m not sure how it would work for SharePoint groups.

Testing WCF services

The code in the service itself is very simple; it does almost nothing because it is an interface and external entry point to the application. A service could do data validation, but its main job is to transfer objects and call methods on the business model. Still, since the service contains code, it is prone to defects and bugs like other parts of the application. Therefore it is a good idea to do some validation and verification to ensure proper quality. The idea was to do unit testing and to use Test Driven Development (TDD). But to make a WCF service testable and take advantage of TDD, the code has to be refactored to accept dependencies.

What made unit testing so difficult to start with is that the service has dependencies on the business and data access layers. When testing relies on resources like a database, it becomes unstable and hard to debug, as there are dependencies across different layers. In addition there were dependencies on the SharePoint framework as well. This results in another type of test, called an integration test. So I ended up implementing integration tests even though that was not the intention when I started. They still add value, as you can test the overall system. But I experienced them to be very fragile, and it was hard to pinpoint errors because each test goes through several layers and paths in the system. There were also problems running the tests when the system depended on SharePoint authentication.

A unit test should test only one single specified requirement of the system. I started looking at the Pex and Moles framework for unit testing. There are other testing and mock frameworks like NUnit and Moq, but I needed a framework that is also able to mock SharePoint libraries, so that I could test the business and database layers. But I did not have time to do it properly because of the dependency injection learning curve. I have studied an example of using Ninject for WCF dependency injection, so maybe I will do a post on it later when I have tested it out.

The frontend

The frontend was implemented with Knockout, a JavaScript MVVM-inspired framework for separating view from model. Maybe another post on that later.. As a developer tool it is simply great. It uses declarative data binding and templates to build rich and responsive user interfaces. We also did some unit testing on the frontend using the QUnit framework. This part of the application was not my main responsibility, but was developed by one of my great colleagues :)

September 13 / 2011
Author Jill
Category .Net, Sharepoint
Comments No Comments

Sharepoint deployment with powershell

My first powershell script. The name says it all: it’s powerful, and you can do a lot with it. But what is PowerShell? From Microsoft’s own description:

Windows PowerShell is a new Windows command-line shell designed especially for system administrators. The Windows PowerShell includes an interactive prompt and a scripting environment that can be used independently or in combination.

Unlike most shells, which accept and return text, Windows PowerShell is built on top of the .NET Framework common language runtime (CLR) and the .NET Framework, and accepts and returns .NET Framework objects. This fundamental change in the environment brings entirely new tools and methods to the management and configuration of Windows.

Windows PowerShell introduces the concept of a cmdlet (pronounced “command-let”), a simple, single-function command-line tool built into the shell. You can use each cmdlet separately, but their power is realized when you use these simple tools in combination to perform complex tasks. Windows PowerShell includes more than one hundred basic core cmdlets, and you can write your own cmdlets and share them with other users.

Like many shells, Windows PowerShell gives you access to the file system on the computer. In addition, Windows PowerShell providers enable you to access other data stores, such as the registry and the digital signature certificate stores, as easily as you access the file system.
Working with SharePoint, you can do most of the things STSADM does, and much more. A good starting point is the PowerShell reference for SharePoint on TechNet. There is also a nice overview and collection of resources in this article by “SharePoint Joel”, from the very beginner to more advanced.
My first script simply installs all wsp files located in a drop folder (retracting and removing them first).
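A minimal sketch of such a script might look like this. The drop folder path is an assumption, the cmdlets are the standard SharePoint 2010 deployment cmdlets, and -AllWebApplications only applies to web-application-scoped solutions:

```powershell
# Load the SharePoint cmdlets (illustrative sketch; the folder path is an assumption)
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$dropFolder = "C:\Deploy\Drop"

Get-ChildItem $dropFolder -Filter *.wsp | ForEach-Object {
    $name = $_.Name
    $solution = Get-SPSolution | Where-Object { $_.Name -eq $name }

    # Retract and remove an existing deployment first
    if ($solution -ne $null) {
        if ($solution.Deployed) {
            Uninstall-SPSolution -Identity $name -AllWebApplications -Confirm:$false
            # Wait for the retraction timer job to finish
            do { Start-Sleep -Seconds 5 } while ((Get-SPSolution $name).JobExists)
        }
        Remove-SPSolution -Identity $name -Confirm:$false
    }

    # Add and deploy the new package
    Add-SPSolution -LiteralPath $_.FullName
    Install-SPSolution -Identity $name -AllWebApplications -GACDeployment
}
```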

November 17 / 2010
Author Jill
Category Sharepoint
Comments No Comments

Sharepoint 2010 – filter content on managed metadata term

We want to filter content on managed terms. Create a new column of the managed metadata field type and bind it to a term set from the term store. Create new content types with the metadata columns. Add some content based on them (lists, calendars, pages) with different metadata. To retrieve data you can for example use the ContentByQueryWebPart or SPQuery. We want to set the query to a CAML query filtered on a metadata field. A metadata field is a lookup type, so you cannot write the query as if it were a text field (although the term itself is text).

<code>&lt;Eq&gt;
  &lt;FieldRef ID='" + metacolumnid + "' LookupId='TRUE' Nullable='TRUE'/&gt;
  &lt;Value Type='Lookup'&gt;" + id + "&lt;/Value&gt;
&lt;/Eq&gt;</code>

The FieldRef ID is the GUID of the metadata column. The lookup value is the GUID of the term we want to filter on. Or rather, it is not the GUID itself, but a lookup property: WssId. This is the identifier (ID) of the list item that the lookup field points to. And how to find it? Well, it took me some time to find out.. There is some hidden magic behind the scenes! All terms are actually stored in a list (everything in SharePoint is lists, so not a big surprise :). This list is hidden, but can be found at: [site url]/Lists/TaxonomyHiddenList/AllItems.aspx.

Based on the IdForTerm we can retrieve the WssId with TaxonomyField.GetWssIdsOfTerm:

<code>TaxonomyField.GetWssIdsOfTerm(site, termStoreId, termSetId, termId, includeDescendants, limit)</code>

But we don’t want to hardcode the id values. To get the data more dynamically, we can create term store objects based on the default term store, since we probably know which group and term set we filter on (because a metadata column is bound to a term set).

<code>TaxonomySession session = new TaxonomySession(curSite);
TermStore termStore = session.DefaultSiteCollectionTermStore;
Group termGroup = termStore.Groups["MyGroup"];
TermSet termSet = termGroup.TermSets["MyTermset"];
String metacolumnid = "{603E67C1-99C9-4EB7-B4D8-A299DF4AF468}"; // metadata column id

............

private int GetTermWssId(SPWeb rootWeb, string termtitle, TermStore termStore, TermSet termSet)
{
    Guid guidid = new Guid();
    int result = 0;
    if (rootWeb.Properties.ContainsKey("TaxonomyHiddenList"))
    {
        Guid taxonomyHiddenListId = new Guid(rootWeb.Properties["TaxonomyHiddenList"]);
        SPList taxonomyHiddenList = rootWeb.Lists[taxonomyHiddenListId];
        SPQuery query = new SPQuery();
        // we might have included the IdForTermSet in the query but we assume
        // that the Guid is really unique, so there should not be terms in
        // other term sets having the same ID
        string guid = termSet.Id.ToString();
        guid = guid.Replace("{", "");
        guid = guid.Replace("}", "");
        query.Query = String.Format(
            @"&lt;Where&gt;&lt;And&gt;&lt;Eq&gt;&lt;FieldRef Name='Title' /&gt;&lt;Value Type='Text'&gt;{0}&lt;/Value&gt;&lt;/Eq&gt;&lt;Eq&gt;&lt;FieldRef Name='IdForTermSet' /&gt;&lt;Value Type='Text'&gt;{1}&lt;/Value&gt;&lt;/Eq&gt;&lt;/And&gt;&lt;/Where&gt;",
            termtitle, guid);
        // query.Query = String.Format(@"{0}", guid);
        SPListItemCollection items = taxonomyHiddenList.GetItems(query);
        if (items.Count == 1)
        {
            guidid = new Guid(items[0]["IdForTerm"].ToString());
            int[] wssIds = TaxonomyField.GetWssIdsOfTerm(rootWeb.Site, termStore.Id, termSet.Id, guidid, false, 1);
            result = wssIds[0];
        }
    }
    return result;
}</code>

This is the solution I came up with, but maybe it could be solved more elegantly than this?

Another thing to remember.. To be able to query the metadata store and to get the above code to work, make sure content type syndication is activated on your site.

November 16 / 2010
Author Jill
Category Sharepoint
Comments No Comments

Sharepoint webpart error causing page to fail

It happens.. You have a web part with an error, causing the whole page to fail. What to do? You can quickly access the web part and delete it through the web part maintenance page: [sitecoll]/?contents=1

November 16 / 2010
Author Jill
Category Sharepoint
Comments No Comments

Sharepoint 2010 – term store management

Managed metadata with term store management is an improvement of taxonomy management in SharePoint 2010. It is a hierarchical collection of centrally managed terms that you can define, and then use as attributes for items. A term set is a collection of related terms.
Management of the taxonomy takes place within the Term Store Management Tool, which is accessible through either Central Administration or Site Administration.

At the top we have the term store, which can contain several term groups. A term group defines a security boundary, and sets permissions for administration of terms. A group can contain several term sets (up to 1,000), and a term set can contain terms (up to 30,000). Managed metadata terms can be defined in multiple languages. This is done by distinguishing the term itself from the words that represent it: a label is a word or a phrase that represents a term, and a term can have multiple labels.

First you need to create a managed metadata service, which sets up a database to be used as the term store, and a connection that provides access to the service.

Go to Central Administration and manage service applications. Create a new managed metadata service.

When the service is created, select it in the list and choose the Properties option from the ribbon.

In the properties edit window, set the content type hub by entering the url of the site collection you want to be the main term store service for your site collections. Other site collections can subscribe to the content types and use the term store published from the hub site collection. This means you can share content types and terms across site collections. But first the content hub has to be published. Select the managed metadata service connection from the service list and then select Properties from the ribbon. Check the publish settings below. Then publish the selected service connection with Publish from the ribbon.

The hub synchronization runs as a timer job every 15 minutes. If you want to speed up the publishing, you can run the Content Type Hub and Content Type Subscriber jobs manually from the Monitoring page of Central Administration.

To check which service a site collection is subscribing to, go to Site Collection Administration > Content Type Publishing.

November 16 / 2010
Author Jill
Category Sharepoint
Comments No Comments

New job

I started a new job at Bouvet on 1 October, joining a Microsoft team with almost 40 talented and really nice people. I’m very happy and proud to be a part of it. And I have started looking into SharePoint 2010. Lots of fun and new challenges!

November 16 / 2010
Author Jill
Category Myself
Comments 1 Comment


Summer and vacation time! There have been few updates here lately.. and there will not be any updates until after a long summer vacation :)

July 17 / 2010
Author Jill
Category Myself, Non-tech
Comments No Comments

Playing with Surface

This post has been in a draft version for a couple of months now, so it’s time to publish..
This year I have worked with some exciting technologies, including iPhone development, the Microsoft Surface table, and now I’m investigating the iPad (which I own :). What they have in common: multitouch screens, taking the user experience to the next level, easy and intuitive to use, making it possible to create visualizations of data with added value for the users. This is the main reason why I think this is such an exciting area to work with, together with investigating new technology ;)


Surface table

Earlier this year we got our own Surface table in our office at work, and I had the pleasure of playing with it for a while. This technology is not completely new and has existed for some years now.
From Microsoft’s own site: Microsoft Surface is a revolutionary multi-touch computer that responds to natural hand gestures and real-world objects, helping people interact with digital content in a simple and intuitive way. With a large, horizontal user interface, Surface offers a unique gathering place where multiple users can collaboratively and simultaneously interact with data and each other. The Surface table has some differences compared to normal touch screens: it can read “tagged” objects and has multiple (52) touch spots, which makes for a true multi-user experience where users can work together simultaneously. Another difference is of course that it is not only a screen, but a table standing on the floor that people can gather around. It has a 360-degree user interface, so the content appears correctly for everybody.
The screen is a 30-inch XGA DLP® projector and the table is heavy (almost 100 kg!). The resolution is 1024 x 768 px, which is the only thing that didn’t impress me..
The Surface is really just a computer running Windows Vista, so to develop applications you use .NET (WPF or XNA). This makes it easy for .NET developers to get started playing with it. For development you can download a Surface client, which is a simulator, but it can only be installed on Windows Vista. I developed directly on the Surface: just connect a monitor and you can use it as if it were a normal computer, doing your development in Visual Studio as usual.


This is how a standard byte tag looks. Surface can recognize this identity tag. It is possible to make your own tags according to certain specifications.

You need to set up and get to know the Surface SDK, which includes the APIs, documentation, and tools to help you develop Surface touch-enabled applications. A core control of the SDK is the ScatterView, and it serves as a useful starting point for understanding how it all works. To make an object move, rotate or resize, you use the ScatterView control and add ScatterViewItems containing your objects. It’s so simple, and you have already made a cool demo in no time!
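As a sketch of how little code this takes: the snippet below is illustrative (the window class and note contents are assumptions, and it assumes a project created from the Surface SDK template with a reference to Microsoft.Surface.Presentation):

```csharp
// Illustrative sketch: a ScatterView with a few draggable, rotatable notes
using System.Windows.Controls;
using Microsoft.Surface.Presentation.Controls;

public partial class NotesWindow : SurfaceWindow
{
    private void AddStickerNotes()
    {
        var scatterView = new ScatterView();
        Content = scatterView;

        for (int i = 1; i <= 3; i++)
        {
            var note = new ScatterViewItem
            {
                Width = 200,
                Height = 150,
                // a TextBox makes the note editable via the virtual keyboard
                Content = new TextBox { Text = "Note " + i, AcceptsReturn = true }
            };
            scatterView.Items.Add(note); // each item can now be moved, rotated and resized
        }
    }
}
```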

I created a small and simple demo app: a whiteboard with virtual sticker notes, which can be added, moved around and written on using the virtual keyboard. It can be used for project planning and so on. I also made a simple log-in where employees can unlock the notes and edit information by placing their employee card on the table (with a tag stuck to it for identification). I have a video showing this demo app and the source code, but I can’t find it at the moment, because it has been some months since I worked with it. And the end of the story is that the table just a while ago somehow managed to get broken.. someone slipped it onto the floor? Well, it was not me, so it seems there will not be any more development soon..


Surface demo app. Wish I had a better photo or video

June 23 / 2010
Author Jill
Category .Net
Comments No Comments