Accessing WCF services using the SharePoint Secure Store Service


With the advent of the Secure Store Service (SSS), we often find ourselves needing to access third-party WCF services using credentials stored in SSS target applications. Sometimes the third-party service provider also requires that every request after the first carry a session header obtained from the initial response.

A possible way to implement this is to store the credentials in cache objects. But if those credentials are cached or sent over the wire in plain text, it defeats the whole purpose of SSS, so it is important to encrypt them before passing them on.

This blog will cover this scenario.

The algorithm goes like this:

1) Retrieve credentials from SSS

2) Encrypt the credentials

3) Store the credentials in the HTTP cache object

4) Make the first request to the web service with the above credentials and cache the session from the response

5) Make subsequent requests to the web service using the cached HTTP header

1) Retrieve credentials from SSS:

This is pretty straightforward. Use the SecureStoreCredentialCollection class from the Microsoft.BusinessData.Infrastructure.SecureStore namespace to retrieve the credentials.

private SecureStoreCredentialCollection GetCredentials(string targetApplicationID)
{
    // Get the Secure Store for the default service context
    SPServiceContext context = SPServiceContext.GetContext(SPServiceApplicationProxyGroup.Default, SPSiteSubscriptionIdentifier.Default);
    SecureStoreServiceProxy ssp = new SecureStoreServiceProxy();
    ISecureStore iss = ssp.GetSecureStore(context);

    // Read the credentials stored for the given target application
    SecureStoreCredentialCollection credentials = iss.GetCredentials(targetApplicationID);
    return credentials;
}
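The returned collection holds one entry per field defined in the target application. Below is a minimal sketch to pull the user name and password out as SecureStrings (the helper name is mine, and it assumes the target application defines a user name field and a password field; it needs the System.Security and Microsoft.BusinessData.Infrastructure.SecureStore namespaces):

// Hypothetical helper: picks the user name and password fields out of the
// credential collection, assuming the target application defines those two fields.
private void GetUserNameAndPassword(SecureStoreCredentialCollection credentials, out SecureString userName, out SecureString password)
{
    userName = null;
    password = null;

    foreach (ISecureStoreCredential credential in credentials)
    {
        switch (credential.CredentialType)
        {
            case SecureStoreCredentialType.UserName:
            case SecureStoreCredentialType.WindowsUserName:
                userName = credential.Credential;
                break;
            case SecureStoreCredentialType.Password:
            case SecureStoreCredentialType.WindowsPassword:
                password = credential.Credential;
                break;
        }
    }
}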

2) Encrypt the credentials:

Cryptography is a big topic in its own right, so I am not going deep into it here. The idea is to use the cryptographic algorithms provided by the .NET Framework, which encrypt the data using a passphrase together with the cipher mode and padding attributes we select.

public string EncryptString(string message, string passphrase)
{
    byte[] Results;
    System.Text.UTF8Encoding UTF8 = new System.Text.UTF8Encoding();

    // Step 1. Hash the passphrase using MD5.
    // The MD5 hash is a 128 bit byte array, which is a valid key length
    // for the TripleDES encoder used below.
    MD5CryptoServiceProvider HashProvider = new MD5CryptoServiceProvider();
    byte[] TDESKey = HashProvider.ComputeHash(UTF8.GetBytes(passphrase));

    // Step 2. Create a new TripleDESCryptoServiceProvider object
    TripleDESCryptoServiceProvider TDESAlgorithm = new TripleDESCryptoServiceProvider();

    // Step 3. Set up the encoder
    TDESAlgorithm.Key = TDESKey;
    TDESAlgorithm.Mode = CipherMode.ECB;
    TDESAlgorithm.Padding = PaddingMode.PKCS7;

    // Step 4. Convert the input string to a byte[]
    byte[] DataToEncrypt = UTF8.GetBytes(message);

    // Step 5. Attempt to encrypt the string
    try
    {
        ICryptoTransform Encryptor = TDESAlgorithm.CreateEncryptor();
        Results = Encryptor.TransformFinalBlock(DataToEncrypt, 0, DataToEncrypt.Length);
    }
    finally
    {
        // Clear the TripleDES and hash provider objects of any sensitive information
        TDESAlgorithm.Clear();
        HashProvider.Clear();
    }

    // Step 6. Return the encrypted string as a base64 encoded string
    return Convert.ToBase64String(Results);
}

 

Reverse the process to decrypt:

public string DecryptString(string message, string passphrase)
{
    byte[] Results;
    System.Text.UTF8Encoding UTF8 = new System.Text.UTF8Encoding();

    // Step 1. Hash the passphrase using MD5.
    // The MD5 hash is a 128 bit byte array, which is a valid key length
    // for the TripleDES decoder used below.
    MD5CryptoServiceProvider HashProvider = new MD5CryptoServiceProvider();
    byte[] TDESKey = HashProvider.ComputeHash(UTF8.GetBytes(passphrase));

    // Step 2. Create a new TripleDESCryptoServiceProvider object
    TripleDESCryptoServiceProvider TDESAlgorithm = new TripleDESCryptoServiceProvider();

    // Step 3. Set up the decoder
    TDESAlgorithm.Key = TDESKey;
    TDESAlgorithm.Mode = CipherMode.ECB;
    TDESAlgorithm.Padding = PaddingMode.PKCS7;

    // Step 4. Convert the base64 input string to a byte[]
    byte[] DataToDecrypt = Convert.FromBase64String(message);

    // Step 5. Attempt to decrypt the string
    try
    {
        ICryptoTransform Decryptor = TDESAlgorithm.CreateDecryptor();
        Results = Decryptor.TransformFinalBlock(DataToDecrypt, 0, DataToDecrypt.Length);
    }
    finally
    {
        // Clear the TripleDES and hash provider objects of any sensitive information
        TDESAlgorithm.Clear();
        HashProvider.Clear();
    }

    // Step 6. Return the decrypted string in UTF8 format
    return UTF8.GetString(Results);
}

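A quick round-trip check of the two helpers (the passphrase literal is illustrative; in practice keep it out of source code):

const string PASSPHRASE = "my-strong-passphrase"; // illustrative only
string encrypted = EncryptString("P@ssw0rd", PASSPHRASE);
string decrypted = DecryptString(encrypted, PASSPHRASE); // gives back "P@ssw0rd"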

Convert a SecureString to a string:

To convert a SecureString to a plain string, use the interop APIs and free the unmanaged BSTR (via Marshal) once it has been copied.

private string ReadSecureString(SecureString sstrIn)
{
    if (sstrIn == null)
    {
        return null;
    }

    // Copy the SecureString to an unmanaged BSTR, read it, then zero and free it
    IntPtr ptr = Marshal.SecureStringToBSTR(sstrIn);
    string str = Marshal.PtrToStringBSTR(ptr);
    Marshal.ZeroFreeBSTR(ptr);
    return str;
}
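For example, the SecureStrings extracted from the credential collection in step 1 can be converted to plain strings just before they are encrypted and cached (variable names are illustrative):

string strUsername = ReadSecureString(userNameSecure);
string strPwd = ReadSecureString(passwordSecure);
string encryptedPwd = EncryptString(strPwd, PASSPHRASE);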

3) Store the credentials in the HTTP cache object

If the credentials are not already secured, encrypt them using the cryptography method described above and cache the value for 30 minutes.

string cacheValue = isSecured ? crypto.EncryptString(returnValue, PASSPHRASE) : returnValue;

// Cache the (encrypted) credentials for 30 minutes, with a sliding expiration
HttpContext.Current.Cache.Insert(strCacheName, cacheValue, null,
    System.Web.Caching.Cache.NoAbsoluteExpiration,
    TimeSpan.FromMinutes(30),
    System.Web.Caching.CacheItemPriority.Normal, null);

return returnValue;

Decrypt the cache object and return the credentials on subsequent requests:

Since the cached value is encrypted, even if it is somehow exposed, an attacker cannot retrieve the credentials (unless they know our encryption scheme and passphrase :)).
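A minimal sketch of that retrieval path, reusing the illustrative cache key, passphrase and crypto helper from the snippet above:

// Returns the decrypted credential from the HTTP cache, or null if the
// cache entry has expired and the value must be re-read from the Secure Store.
private string GetCachedCredential(string strCacheName, bool isSecured)
{
    string cachedValue = HttpContext.Current.Cache[strCacheName] as string;
    if (cachedValue == null)
    {
        return null; // expired - go back to step 1 and read from SSS again
    }

    return isSecured ? crypto.DecryptString(cachedValue, PASSPHRASE) : cachedValue;
}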

Now, coming back to the WCF service: for the first request the credentials are passed, the session value from the response is cached, and all subsequent requests are sent with that value as a cookie header.

Set up the XML document and namespace manager:

XmlDocument xDoc = new XmlDocument();
xDoc.PreserveWhitespace = true; // keep all line breaks
xDoc.XmlResolver = new HtmlResolver(); // custom resolver that resolves XHTML entities (not a built-in .NET class)
XmlNamespaceManager ns = new XmlNamespaceManager(xDoc.NameTable);
ns.AddNamespace("html", "http://www.w3.org/1999/xhtml");

4) Make the first request to the web service with the above credentials and cache the session from the response

Now get the secured credentials using the methods described above, use them to download the data from the WCF service, and load it into the XML document:

using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = false;
    client.Credentials = new NetworkCredential(strUsername, strPwd);
    StringBuilder strBuilder = new StringBuilder(client.DownloadString(new Uri(URI)));
    xDoc.LoadXml(strBuilder.ToString());
}

Please note that this initial response supplies the value used as a cookie header for subsequent requests, and the URI stays the same for all requests. The session value in the initial response is located by the XPath expression SESSIONCLASS (e.g. "//html:span[@id='session_key']"); SelectSingleNode retrieves the first element that matches it, and SESSIONCOOKIEFORMAT defines the format of the session cookie (e.g. "Session={0}"). Cache the value until the WCF session expires.

if (xDoc != null)
{
    XmlNode node = xDoc.SelectSingleNode(SESSIONCLASS, ns);
    string strSessionId = string.Format(SESSIONCOOKIEFORMAT, node.InnerText);

    // Cache the session id for 23 hours (absolute expiration)
    HttpContext.Current.Cache.Insert(SESSIONCACHENAME, strSessionId, null,
        System.DateTime.UtcNow.AddHours(23),
        System.Web.Caching.Cache.NoSlidingExpiration,
        System.Web.Caching.CacheItemPriority.Normal, null);

    return strSessionId;
}

5) Subsequent requests to the web service using the cached HTTP header

Now build the request URL for the WCF service and start the request with a client header containing the session id retrieved from the cache via the GetSessionCookie method (a sketch of that helper follows the snippet below).

using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = false;
    string strToken = GetSessionCookie();
    client.Headers.Add(HttpRequestHeader.Cookie, strToken);

    StringBuilder strBuilder = new StringBuilder(client.DownloadString(new Uri(url)));

    XmlDocument xDoc = new XmlDocument();
    xDoc.PreserveWhitespace = true; // keep all line breaks
    xDoc.XmlResolver = new HtmlResolver(); // custom XHTML entity resolver
    xDoc.LoadXml(strBuilder.ToString());
    return xDoc;
}
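The GetSessionCookie helper referenced above can be as simple as a cache lookup that falls back to the first-time request from step 4. A sketch (CreateSessionCookie is a hypothetical method standing in for the step 4 logic):

private string GetSessionCookie()
{
    // Return the cached session id if it is still valid
    string strSessionId = HttpContext.Current.Cache[SESSIONCACHENAME] as string;
    if (!string.IsNullOrEmpty(strSessionId))
    {
        return strSessionId;
    }

    // Cache miss: authenticate with the decrypted SSS credentials,
    // cache the new session id for 23 hours and return it (step 4)
    return CreateSessionCookie();
}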

The returned XmlDocument can then be consumed by the presentation layer. In short, to consume WCF services from SharePoint it is crucial to understand the structure and security features implemented by the WCF service provider.

Importance of DB architecture in SharePoint


I have seen many SharePoint databases poorly architected, which leads to poor performance of the overall farm. If you are dealing with huge SharePoint farms holding more than 1 or 2 TB of data, database architecture becomes essential for ease of maintenance.

Often we end up splitting content databases using the STSADM MergeContentDBs operation, which can fail because of significant site collection size, user traffic, or SQL Server load. When the MergeContentDBs command fails, both the source and destination databases can be corrupted: http://support.microsoft.com/?id=969242

Database architecture also matters during migration. Since SharePoint 2013 supports only database-attach upgrades, it is crucial to keep the databases easy to maintain.

This blog post discusses a few DB governance tactics to avoid these issues.

DB Governance:

Microsoft’s recommendation is to keep content databases under 100 GB, with 200 GB as the maximum hard limit. To achieve this, analysing database growth is important. If you are migrating from earlier SharePoint versions or from another CMS such as Lotus Notes or Documentum, you will already have an idea of this; if not, a survey with the stakeholders (department managers) is a good idea.

I am going to discuss a few DB governance strategies. Before adopting any of them, the following factors should be considered:

  • Data growth
  • Site collection sizes
  • Ease of database maintenance
  • Data isolation

Strategy 1:

Round robin: This is the simplest database architecture you can have. If site collection growth is more or less the same across all site collections and you do not have enough resources for SharePoint DB maintenance plans, follow the strategy below.

Scenario:

If database growth is estimated at 2 TB of data over the next 2-3 years (by which time you will probably upgrade to the next version), follow the steps below (a PowerShell sketch follows the list):

  1. Create 2,000 GB / 100 GB = 20 content databases, named WSS_Content_1 to WSS_Content_20
  2. Verify that all the databases are online
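A minimal PowerShell sketch for these two steps (the web application URL is a placeholder):

# Step 1: create 20 content databases, WSS_Content_1 .. WSS_Content_20
$webApp = "http://sharepoint.contoso.com"   # placeholder
1..20 | ForEach-Object {
    New-SPContentDatabase -Name ("WSS_Content_{0}" -f $_) -WebApplication $webApp
}

# Step 2: verify that all of them are online
Get-SPContentDatabase -WebApplication $webApp | Select-Object Name, Status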

Whenever a new site collection is created, SharePoint places it in the least-used database (the one with the fewest site collections) in a round-robin fashion. One caveat with this approach: if you have a huge site collection (>50 GB), you should use a dedicated content DB, as the steps below show (a PowerShell sketch follows them):

  1. Create a new content DB, WSS_Content_Large
  2. Set the other DBs to the offline state so that the site collection will be created in the new DB
  3. Create the new site collection
  4. Bring the other DBs back online
  5. Set WSS_Content_Large offline to avoid creation of further site collections in it
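The same sequence sketched in PowerShell (URLs, owner and template are placeholders). Note that New-SPSite accepts a -ContentDatabase parameter, which lets you target the new database directly instead of taking the other databases offline:

$webApp = "http://sharepoint.contoso.com"   # placeholder

# 1. Create the dedicated content database
New-SPContentDatabase -Name "WSS_Content_Large" -WebApplication $webApp

# 2-4. Create the site collection directly in that database
New-SPSite -Url "$webApp/sites/largesite" -ContentDatabase "WSS_Content_Large" `
    -OwnerAlias "CONTOSO\spadmin" -Template "STS#0" -Name "Large Site"

# 5. Take the database offline (Disabled corresponds to "Offline" in Central Administration)
Get-SPContentDatabase -Identity "WSS_Content_Large" | Set-SPContentDatabase -Status Disabled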

Strategy 2:

If your farm demands high performance and your SharePoint administrator is able to move sites as part of quarterly or half-yearly DB maintenance plans, the strategy below will be helpful. If you have a farm with more than 200 site collections, you can consider adopting this strategy.

The idea is to segregate databases according to site collection size and run a DB maintenance routine to keep them consistent. First create content DBs whose names represent the site collection sizes they hold (e.g. WSS_Content_2GB, WSS_Content_4GB, etc.). Initially all sites reside in WSS_Content_2GB; when a site collection grows beyond 2 GB it is moved to the 4 GB database, and so on. If the size of a DB grows beyond 100 GB, create a new DB (e.g. WSS_Content_2GB_01).

Detailed steps are explained below:

Create databases WSS_Content_2GB, WSS_Content_4GB, WSS_Content_6GB

Set WSS_Content_4GB and WSS_Content_6GB offline so that new site collections are created in WSS_Content_2GB

Quarterly DB maintenance activity: every 3 or 4 months, perform the steps below to keep the DB sizes consistent.

List all the site collections with their sizes using PowerShell or STSADM:

Get-SPSite -Limit ALL | select url, @{label="Size in MB";Expression={$_.usage.storage/1MB}} | Sort-Object -Descending -Property "Size in MB" | ConvertTo-Html -title "Site Collections sort by size" | Set-Content SizeReport.html
  • Identify all the site collections larger than 2 GB. Move the sites that are between 2 and 4 GB to WSS_Content_4GB, and those larger than that to WSS_Content_6GB. You can use either the STSADM MergeContentDBs operation or the PowerShell command Move-SPSite, as shown in the sketch after this list.
  • If any of the DBs exceeds 90 GB, take that DB offline and create a new one. For example, if WSS_Content_2GB exceeds 90 GB, create a new DB WSS_Content_2GB_01 and take the former offline.
  • If there is a requirement to create a site collection of, say, 25 GB, the best practice is to allocate a separate content DB for it. Create a new content DB, WSS_Content_25GB, and create the site collection in it using PowerShell rather than Central Administration.
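A minimal PowerShell sketch of the quarterly move and of the dedicated 25 GB database (site URLs, owner and template are placeholders):

# Move a site collection that has grown past 2 GB into the 4 GB tier
Move-SPSite "http://sharepoint.contoso.com/sites/finance" -DestinationDatabase "WSS_Content_4GB"
# An IIS reset on all web servers is required after Move-SPSite completes

# Dedicated database for a planned 25 GB site collection, created via PowerShell
New-SPContentDatabase -Name "WSS_Content_25GB" -WebApplication "http://sharepoint.contoso.com"
New-SPSite -Url "http://sharepoint.contoso.com/sites/records" -ContentDatabase "WSS_Content_25GB" `
    -OwnerAlias "CONTOSO\spadmin" -Template "STS#0" -Name "Records"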
This process ensures that no database ever exceeds 100 GB, and it makes it much easier for the SharePoint administrator to maintain the databases during migrations to later versions.