Jan 20
Formatting numbers as string in PowerShell

I had to look this up again today (easily the 5th time in the last month). So, to make it easier to look up the 6th time, I'm putting it here.

If you want to format a number as a string, the quickest way is with the -f format operator:

    $serverName = "ABC{0:000}D" -f $srvID

The above line, with the $srvID variable equal to 13, will set the $serverName variable to "ABC013D".
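The {0:000} portion is a standard .NET format string, so any .NET format specifier works with -f. A few more variations (a quick sketch; the N output shown assumes en-US culture):

```powershell
$srvID = 13
"ABC{0:000}D" -f $srvID    # ABC013D - zero-padded to three digits
"{0:N2}" -f 1234.5678      # 1,234.57 - group separator, two decimals (culture-dependent)
"{0:X4}" -f 255            # 00FF - hexadecimal, padded to four characters
"{0,10}" -f "right"        # right-aligned in a ten-character field
```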

More information: http://social.technet.microsoft.com/wiki/contents/articles/4250.string-formatting-in-powershell.aspx#NET_formatting

Jan 14
ChicagosNext Hackathon

I am thrilled to point you to a community/charity event that has been scheduled to leverage the large number of technology professionals in Chicago for Ignite: ChicagosNext Hackathon.

The goal is to build the next app that creatively solves issues within our local neighborhood using Azure and Office 365.

I have signed up to help organize and participate in the event. I cannot wait to see you there!

More information is available in the Sway below.

Jan 13
Troubleshooting S2S configuration

I was working on a project recently with an interesting issue. The organization has an on-premises SharePoint farm. They are developing apps for SharePoint that will leverage the server-to-server trust approach (a.k.a. S2S or high-trust) for authenticating the apps. I was engaged to help troubleshoot an "Access Denied" error when testing the app in a QA/Staging environment. So we started troubleshooting. What follows is a dump of links that help with this process. Bing/Google will include these, but I'm storing them on my blog for my future use. You're welcome. :)

More TroubleShooting Tips for High Trust Apps on SharePoint 2013 (S. Peschka)

Creating High Trust SharePoint Apps with Microsoft Office Developer Tools for Visual Studio 2012 - Preview 2 (K. Evans)

Configure an environment for apps for SharePoint (TechNet)

How to: Package and publish high-trust apps for SharePoint 2013 (TechNet)

In case you are interested, what I found in this particular environment was three items:

  1. While the certificate used to sign the S2S tokens was trusted by SharePoint, the certificate had a signing chain that was not. Every cert in the chain, all the way up to the Root CA, must be trusted by SharePoint.
  2. The issuer id that was used in the Publish Wizard was entered with both the specific issuer id *and* the realm id. The value entered was two GUIDs with an "@" sign. Only the specific issuer id (the GUID before the "@") is necessary if you are sharing a cert across apps.
  3. The IIS server was not authenticating the user. The remote web was allowing anonymous access. When the app made a call back to SharePoint, which was facilitated by TokenHelper's CreateUserContextForSPHost() method, the user identifier was an empty string. So of course, SharePoint rejected the call because it could not verify that the user had permission to the site.

    As a side note, I was able to verify that the S2S configuration was correct by changing to app-only permissions.

Having fixed the issue, I was on my way out the door when an interesting question was raised. A large portion of the user community will be on mobile devices and not on the corporate network. If we need the remote web to authenticate the user, what will that look like? The answer is two logon prompts: one for SharePoint and one for the remote web. And as you can imagine, that user experience is sub-optimal.

Fixing this identity problem is not trivial, and I'll try to cover some of those points in the future. But the moral of the story – be sure to plan for authentication at the beginning. You should understand how *every* piece of your solution will authenticate and authorize users. And you should be prepared to bridge any gaps that you find in the AuthN/AuthZ story. It is very important, and can be very difficult.

If you need assistance, please reach out to me. Contact information is on the home page.

Jan 13
Speaking at SPS DFW

I am delighted to be speaking at SPS DFW on March 7, 2015. With the timing of the event, being just a few months before Microsoft's Ignite and //build/ conferences, Eric and I have chosen a session that can help those new to Office 365 to get up to speed. My topic is Getting Started with Office 365 Development, in which I will broadly discuss the capabilities of the Office 365 platform and technologies from which it can be accessed.

I look forward to seeing my Texas friends again! Please be sure to turn on the heat. :)

Jan 07
Function keys on Surface Type Cover 2

I use a Surface Pro 3 as my main computer these days. For the most part, it is docked on my desk with an external keyboard and two external monitors. I am quite happy with the performance of the machine; my only real complaints are the connected standby issue and the external display handling that Paul Thurrott has detailed.

I ran into a particularly frustrating issue today. I'm working without the docking station and without a mouse – just touch and the Type Cover. I have become quite proficient at using the Charms (yes, I know they are going away, but they can be useful!). My issue was that pressing the keys on the keyboard for the Charms (in the yellow box in the image below) did not work.

As you can see, the top row of keys has an alternative state – the function (Fn) keys. To enable the function keys, you press the Fn button (in green). It turns out that you can lock the function keys by holding Fn and pressing Caps Lock. That is what happened to me, although not on purpose.

Jan 05
Reading the SharePoint change log from CSOM

I am quite often asked to design or review SharePoint-based solutions. A very common business requirement in these solutions is a business process that is based on a list. Of course, this is how SharePoint workflows are designed, and this is a natural approach for many businesses. As you can imagine, there are instances where workflow is not used. (The reasons for not using workflow are outside the scope of this article.) A common replacement for workflow is event receivers, but using them creates a very brittle dependency between SharePoint and your process, with the risk of server performance degradation, data loss, or both. Some organizations will read the entire list and compare it to a known state. (I call this the "rewrite SharePoint search" approach, and I do not recommend it.)

One possible alternative approach is the SharePoint change log. This approach will not work for all scenarios, but it was very helpful in a recent customer solution. That solution was using a SharePoint list to make requests. The request process uses only a few columns, and once a request is processed, the SharePoint list is forgotten. (Long-time programmers like me call this a fire-and-forget process: start the process and then forget about it.) As I set out to code the proof-of-concept, I noticed a distinct lack of examples. And thus a blog post idea was born! And, since we are in a cloud-first world, I will use the Client-side Object Model (CSOM) for my example.

SharePoint Change Log object model

Accessing the SharePoint change log is accomplished using the GetChanges() method on the primary objects (Site, Web, and List):
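In CSOM, those entry points look like this (a sketch; site, web, and list are assumed to come from a loaded ClientContext, and changeQuery is a configured ChangeQuery):

```csharp
ChangeCollection siteChanges = site.GetChanges(changeQuery); // whole site collection
ChangeCollection webChanges  = web.GetChanges(changeQuery);  // a single web and its lists
ChangeCollection listChanges = list.GetChanges(changeQuery); // a single list and its items
```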

As you would imagine, the methods return a collection of objects that are descendants of the object on which the call is made. For example, calling GetChanges on a list will return changes made to the list and to items (and folders and files) in that list, but will not return changes made to other lists in the same web or to the site collection-based galleries.

Each of these methods has a parameter of type ChangeQuery, and this type provides insight into the capabilities of the Change Log that far surpasses the MSDN reference.

ChangeQuery properties

The properties of the ChangeQuery object can be separated into two general categories – change actions and objects changed. (Abstracting the properties to a couple of Enumerations would be a nice exercise for some ambitious reader.)

Change Action    

  • Add    
  • DeleteObject    
  • GroupMembershipAdd    
  • GroupMembershipDelete    
  • Move    
  • Rename
  • Restore
  • RoleAssignmentAdd
  • RoleAssignmentDelete
  • RoleDefinitionAdd
  • RoleDefinitionDelete
  • RoleDefinitionUpdate
  • SystemUpdate
  • Update

Objects Changed

  • Alert
  • ContentType
  • Field
  • File
  • Folder
  • Group
  • Item
  • List
  • Navigation
  • SecurityPolicy
  • Site
  • User
  • View
  • Web

(Note that not all actions apply to all object types.)

Change class and its inheritance hierarchy

As mentioned previously, the GetChanges methods return a ChangeCollection. The items in the collection all inherit from the Change class. This inheritance hierarchy is crucial to understanding the items returned by the query.

System.Object
  Microsoft.SharePoint.Client.ClientObject
    Microsoft.SharePoint.Client.Change
      Microsoft.SharePoint.Client.ChangeAlert
      Microsoft.SharePoint.Client.ChangeContentType
      Microsoft.SharePoint.Client.ChangeField
      Microsoft.SharePoint.Client.ChangeFile
      Microsoft.SharePoint.Client.ChangeFolder
      Microsoft.SharePoint.Client.ChangeGroup
      Microsoft.SharePoint.Client.ChangeItem
      Microsoft.SharePoint.Client.ChangeList
      Microsoft.SharePoint.Client.ChangeSite
      Microsoft.SharePoint.Client.ChangeUser
      Microsoft.SharePoint.Client.ChangeView
      Microsoft.SharePoint.Client.ChangeWeb

With the understanding of the returned objects, we can process the changes by casting the change to the appropriate type, which then makes available properties that further identify the specific item that was changed.

Processing all the changes in a site collection

The following code snippet will get all the changes to a site collection:

ClientContext clientContext = new ClientContext(siteUrl);
clientContext.Credentials = new SharePointOnlineCredentials(
                        "user@tenant.domain",
                        ConvertToSecureString("SuperSecurePassword"));

var site = clientContext.Site;
clientContext.Load(site);
ChangeQuery siteCQ = new ChangeQuery(true, true);
var siteChanges = site.GetChanges(siteCQ);
clientContext.Load(siteChanges);
clientContext.ExecuteQuery();

foreach (Change change in siteChanges)
{
    Console.WriteLine("{0}, {1}", change.ChangeType, change.TypedObject);
}

 

When run against my test site, the following types and objects are returned:

Getting changes for a specific action and/or object type

Using the ChangeQuery properties, a subset of the actions and types can be queried:

 

//ChangeQuery siteCQ = new ChangeQuery(true, true);
ChangeQuery siteCQ = new ChangeQuery(false, false);
siteCQ.Item = true;
siteCQ.Add = true;
siteCQ.SystemUpdate = true;
siteCQ.DeleteObject = true;

 

 

Processing the changed items

To process the items, simply test the object type and cast as appropriate:

 

foreach (Change change in changes)
{
    if (change is Microsoft.SharePoint.Client.ChangeItem)
    {
        ChangeItem ci = change as ChangeItem;
        string changeType = ci.ChangeType.ToString();
        string itemId = ci.ItemId.ToString();
        Console.WriteLine("{0}: {1}", itemId, changeType);
    }
}

 

 

Armed with the object type and its identifier, we can make the necessary call to get the object. The list item example would be:

 

ListItem li = list.GetItemById(ci.ItemId);
clientContext.Load(li);
clientContext.ExecuteQuery();

 

 

However, there is a significant issue we need to account for: there is no guarantee that the item referenced in the change log still exists. While you may be processing an Add change, that change could have been followed by a delete. This is precisely why the change log is not appropriate for all scenarios. So, be sure to handle exceptions for missing items. (For list items, issue a CamlQuery for the ItemId and test for 0 records returned.)
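Following the CamlQuery suggestion above, a defensive sketch (assuming list, clientContext, and a ChangeItem named ci from the processing loop) checks the record count before touching the item:

```csharp
// Query for the item by the ID reported in the change log entry.
CamlQuery query = new CamlQuery();
query.ViewXml = string.Format(
    "<View><Query><Where><Eq>" +
    "<FieldRef Name='ID'/><Value Type='Counter'>{0}</Value>" +
    "</Eq></Where></Query></View>", ci.ItemId);

ListItemCollection items = list.GetItems(query);
clientContext.Load(items);
clientContext.ExecuteQuery();

if (items.Count == 0)
{
    // The item was deleted after the logged change; skip it.
}
```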

Getting Changes since last run

One additional processing item – if you poll the change log on a schedule, you will not want to re-read all the changes from the last run. The ChangeQuery has two properties that indicate which changes are desired: ChangeTokenStart and ChangeTokenEnd. Your processing loop should store the ChangeToken of the current Change object, and then persist the last token. During subsequent runs of the program, read the last-processed token from storage and set it as the ChangeTokenStart property of the ChangeQuery. (The ChangeTokenEnd property would usually be left blank.)

 

ChangeQuery siteCQ = new ChangeQuery(false, false);
siteCQ.ChangeTokenStart = lastProcessedToken;
siteCQ.Item = true;
siteCQ.Add = true;
siteCQ.SystemUpdate = true;
siteCQ.DeleteObject = true;
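Inside the processing loop, capturing the token of the most recent change and persisting it after a successful run might look like this sketch (the file-based storage is just an illustration; any durable store works):

```csharp
string lastTokenValue = null;
foreach (Change change in siteChanges)
{
    // ... process the change ...
    lastTokenValue = change.ChangeToken.StringValue; // newest token so far
}

// Persist only after the run completes successfully.
if (lastTokenValue != null)
{
    System.IO.File.WriteAllText("lastToken.txt", lastTokenValue);
}

// On the next run:
// siteCQ.ChangeTokenStart = new ChangeToken { StringValue = System.IO.File.ReadAllText("lastToken.txt") };
```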

 

 

Summary

Processing the change log in SharePoint is not too different from processing any other SharePoint object: create a query, execute the query, and process the objects. While the change log is not appropriate for all scenarios, when used it can dramatically reduce the load on the SharePoint farm and substantially reduce code execution time.

Jan 02
Using Office 365 Exchange Online for on-premises SharePoint outgoing email

A client of mine is in the process of migrating to Office 365. In this particular migration, their Exchange infrastructure moved first, with the result that all user mailboxes are hosted in Exchange Online and the on-premises Exchange servers are being decommissioned. This has an impact on SharePoint's outgoing mail functionality. This blog post details the approach that we used to resolve the issue.

Install SMTP Server as an intermediary

Since the outbound email in SharePoint uses the basic SMTP functionality, we need to have it communicate with an SMTP service that accepts simple requests. Exchange Online has some advanced requirements for messages: transport-level security (TLS); domain or IP Address whitelisting; authenticated sending. (The actual requirements depend on the mail distribution scenario.) Since SharePoint cannot meet these requirements, an intermediary SMTP server is required. In the new configuration, SharePoint will send mail to the intermediary via basic SMTP (port 25). The SMTP server will then relay the messages to Exchange Online, and combined with a mail connector or outbound security, the message will be accepted by Exchange Online for delivery to the intended mailbox.
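Once the intermediary is in place, a quick way to smoke-test the relay path from the SharePoint server is PowerShell's Send-MailMessage (the host name and addresses below are hypothetical placeholders):

```powershell
Send-MailMessage -SmtpServer "smtp-relay.contoso.local" -Port 25 `
    -From "sharepoint@contoso.com" -To "admin@contoso.com" `
    -Subject "SharePoint relay test" -Body "Outbound mail path check"
```

If the message arrives, SharePoint's outgoing email settings can point at the same server and port.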

TechNet articles

The information above, and the configuration steps necessary, are spelled out on TechNet. You may find them hard to locate, since they do not contain SharePoint-specific information. The configuration for the intermediary SMTP server that we are implementing is covered in How to configure Internet Information Server (IIS) for relay with Office 365. This solution uses outbound security on IIS SMTP to send emails from an Exchange Online account.

If you wish for the sending address to be an account that does not have an Exchange Online mailbox, there is an alternative that uses a Mail Flow connector in Exchange Online. This information can be found at How to Allow a Multi-function Device or Application to Send E-mail through Office 365 Using SMTP.

Dec 16
Microsoft Cloudshow Episode on Identity

In mid-November, I had the privilege of sitting down with Andrew Connell to talk about my favorite topic for his podcast (http://www.microsoftcloudshow.com/podcast/Episodes/060-microsoft-cloud-identity-an-interview-with-paul-schaeflein). I had a great time!

Anyone who knows about my (lack of) memory when it comes to names and faces will appreciate that I have a similar (lack of) talent for acronyms. First question that AC asked, I answered wrong. I hope you don't hold that against me. :) (For the record, ADAL is the Active Directory Authentication Library.)

As I mentioned at the end of the show, here are links to the relevant technologies/products/libraries that we discussed. It was a pretty wide-ranging interview as you can tell from the lengthy list. I encourage you to subscribe to the podcast. It is absolutely worth an hour of your time each week.

Azure Active Directory documentation: http://msdn.microsoft.com/library/azure/jj673460.aspx

Active Directory Authentication Library: http://msdn.microsoft.com/en-us/library/azure/jj573266.aspx

Azure AD Graph API (maintain users and groups via REST): http://msdn.microsoft.com/en-us/library/azure/hh974476.aspx

Azure Access Control Service (ACS): http://msdn.microsoft.com/en-us/library/azure/hh147631.aspx

Authentication Scenarios for Azure AD: http://msdn.microsoft.com/en-us/library/azure/dn499820.aspx

OAuth spec: http://oauth.net/ and documentation: http://oauth.net/documentation/

OpenID Connect: http://openid.net/connect

    OpenID Connect preview for Azure: http://msdn.microsoft.com/en-us/library/azure/dn645541.aspx

Federated Identity:

    Wikipedia: http://en.wikipedia.org/wiki/Federated_identity

ASP.NET Identity: http://www.asp.net/identity

    AD FS 3.0: http://technet.microsoft.com/en-us/library/hh831502.aspx

Directory Integration (Azure AD & On-premises Active Directory): http://msdn.microsoft.com/en-us/library/azure/jj573653.aspx

OWIN (Open Web Interface for .Net): http://owin.org

    ASP.NET OWIN/Katana: http://www.asp.net/aspnet/overview/owin-and-katana

ChicagosNext Hackathon: http://chicagosnext.com

Dec 12
Understanding OAuth tokens and their lifetime

I received a question in email the other day – what is the lifetime of a SharePoint OAuth token? Interesting question, so I did some research.

First, go read Kirk's post on the content of SharePoint's app tokens: Inside SharePoint 2013 OAuth Context Tokens

So, the token is just a string in JSON format that contains relevant properties. The property that we want to understand is the expiration value. (Well, actually the delta between "Not valid before" and "Expires.") To find out the answer, I followed Kirk's post to grab a token from Fiddler. Then, I used some NuGet goodness to look at the token. Microsoft has published a NuGet package that contains a helper library for JWT tokens: System.IdentityModel.Tokens.JWT

Using this library, you can perform basic operations on the token:

var at = new JwtSecurityToken(accessToken);
var lifetime = at.ValidTo - at.ValidFrom;
Console.WriteLine("Access Token lifetime: {0}", lifetime);

Turns out that the answer to the question is 1 hour. (Actually 65 minutes.)

There are some interesting methods in the System.IdentityModel.Tokens.JWT namespace, things like ValidateToken. I mention this because if you are processing tokens in your code, you *really* should make sure that they are valid. So, instead of just using the JwtSecurityToken constructor to create the token, use the JwtSecurityTokenHandler class:

var h = new JwtSecurityTokenHandler();
SecurityToken at = null;
h.ValidateToken(accessToken, new TokenValidationParameters(), out at);

To learn more about token validation, read Vittorio's post: http://www.cloudidentity.com/blog/2014/03/03/principles-of-token-validation/

But, I didn't stop there. What about refresh tokens?

Using the JwtSecurityToken class did not work with the refresh token. Thinking I knew more than Microsoft, I manually tried to decode the token, with no luck. So, when all else fails, read the manual, right?

RFC 6749 OAuth 2.0 October 2012

1.5. Refresh Token

Refresh tokens are credentials used to obtain access tokens. Refresh tokens are issued to the client by the authorization server and are used to obtain a new access token when the current access token becomes invalid or expires, or to obtain additional access tokens with identical or narrower scope (access tokens may have a shorter lifetime and fewer permissions than authorized by the resource owner). Issuing a refresh token is optional at the discretion of the authorization server. If the authorization server issues a refresh token, it is included when issuing an access token (i.e., step (D) in Figure 1).

A refresh token is a string representing the authorization granted to the client by the resource owner. The string is usually opaque to the client. The token denotes an identifier used to retrieve the authorization information. Unlike access tokens, refresh tokens are intended for use only with authorization servers and are never sent to resource servers.

    http://tools.ietf.org/html/rfc6749

 

"Opaque to the client" means that the token is not intended to be decoded or decrypted. You just cache it and use it to get an access token as needed. Since issuing a refresh token is at the discretion of the authorization server, the authoritative source for SharePoint refresh tokens is Microsoft. And their description of the refresh tokens is not necessarily clear on the matter:

Handling Refresh Tokens

   

Refresh tokens do not have specified lifetimes. Typically, the lifetimes of refresh tokens are relatively long. However, in some cases, refresh tokens expire, are revoked, or lack sufficient privileges for the desired action. The client application needs to expect and handle errors returned by the token issuance endpoint correctly. When you receive a response with a refresh token error, discard the current refresh token and request a new authorization code or access token. In particular, when using a refresh token in the Authorization Code Grant flow, if you receive a response with the interaction_required or invalid_grant error codes, discard the refresh token and request a new authorization code.

   

From <http://msdn.microsoft.com/en-us/library/azure/dn645536.aspx>

 

So, what do we do about refresh tokens? One thing to keep in mind is that every time you get an access token from SharePoint, you also get a refresh token. So, whenever your application uses the refresh token to get a new access token, you should replace the refresh token you have stored with the new refresh token received with the access token. In most scenarios, this should suffice.
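In a provider-hosted app built on the standard TokenHelper class, that swap might look like the following sketch (the persistence helper is a hypothetical placeholder; GetAccessToken is the TokenHelper overload that accepts a refresh token):

```csharp
// Exchange the stored refresh token for a fresh access token.
OAuth2AccessTokenResponse response = TokenHelper.GetAccessToken(
    storedRefreshToken, targetPrincipalName, targetHost, targetRealm);

string accessToken = response.AccessToken;

// If the response carries a new refresh token, replace the stored one
// so the rolling expiration window keeps extending.
if (!string.IsNullOrEmpty(response.RefreshToken))
{
    SaveRefreshToken(response.RefreshToken); // hypothetical persistence helper
}
```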

 

Dec 03
MSIS7102: Requested Authentication Method is not supported on the STS

A quick post, which if it gets enough Search Engine love will save someone else time…

In response to a customer issue in which an iPad on the corporate network could not log on to an AD FS 3.0 (Windows Server 2012 R2) secured resource, I saw the above error in the Event Viewer on the federation servers. The federation farm was set up in a typical architecture:

http://i.technet.microsoft.com/dynimg/IC698143.jpg

In this architecture, combined with a split DNS entry for the AD FS endpoints, external clients will resolve to the proxy servers and internal clients (on the corporate network) to the federation servers.

The default authentication policy for ADFS 3.0 is Forms Authentication for the Extranet and Windows Authentication (IWA) for the Intranet. You can see these settings in AD FS Manager under Authentication Policies:

In PowerShell:
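A sketch using the AD FS module cmdlets on Windows Server 2012 R2 (the Set- example previews the fix described below):

```powershell
# View the current global authentication policy, including the intranet/extranet providers
Get-AdfsGlobalAuthenticationPolicy

# Add Forms Authentication alongside Windows Authentication for the intranet
Set-AdfsGlobalAuthenticationPolicy `
    -PrimaryIntranetAuthenticationProvider @("WindowsAuthentication", "FormsAuthentication")
```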

 

If you recall, the scenario that was failing was the iPad on the intranet. Since the iPad is not domain-joined, Windows Authentication will fail. AD FS does provide for "falling back" to a different authentication method – you can see the property WindowsIntegratedFallbackEnabled is set to True. However, on the intranet there is no other provider configured, so there is nothing to fall back on. (For more information about which browsers/clients will use Windows Authentication, refer to Configuring intranet forms-based authentication for devices that do not support WIA.)

To resolve my issue, I enabled Forms Authentication on the Intranet. Even though the authentication provider name is "FormsAuthentication", this is not the FBA approach that you may remember from .NET/SharePoint. In IIS terms, this is Basic Authentication with TLS/SSL.

Now that there is another authentication method available, AD FS logic for using IWA will apply. And since the User Agent for the iPad is not configured for IWA, the server renders a login form.

HTH!
