Programmatically inserting Internet Calendar subscriptions into Outlook Web Access via Exchange Web Services

As part of the work required for moving all of our students from Google Apps for Education to Microsoft Office365, we’ve got to somehow replicate our timetabling information calendars.  We did these in Google a year or so ago, and they’ve been pretty successful and popular in production use, both with the students and the folk who run the timetabling. Losing the functionality they provide would be unpopular, and make the Office365 migration somewhat less compelling for the University as a whole.

We’d naively hoped that Microsoft’s new Graph API would let us do this using Office365 groups, but it has limitations when you need to do things as overnight batch jobs running on behalf of thousands of users.  We did hack up a script that used a combination of Graph and Exchange Web Services (EWS) (the older, SOAP based API from Microsoft) to create Office365 group calendars, but we found that we’d need a large number of fake “system” users to own these groups/calendars, due to per-user calendar/group creation and ownership limits.  We put this to the project team, but it wasn’t a popular idea (despite it being similar to how we do things in Google land – but then Google Apps for Education users are free, so we don’t care how many we create).

As an alternative it was suggested that we look at generating iCalendar (ICS) format files that the students could subscribe to. We’d already done something similar for staff who are already on Office365, so creating these from our intermediate JSON data files that we currently use to feed into Google was relatively easy. It should be noted that this is subscribing to constant periodic updates from an Internet delivered ICS file, not just loading a single instance of an ICS file into a user’s default calendar – the same file format is used for both operations!
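To give a flavour of the process, here’s a rough Python sketch of turning timetable data into a subscribable ICS file. The JSON field names used here (id, start, end, module, room) are invented for illustration – our real intermediate JSON format and generator aren’t shown in this post.

```python
# Sketch: convert a JSON list of timetable events into a minimal ICS feed.
# Field names in the JSON are illustrative, not our real intermediate format.
import json
from datetime import datetime

def json_to_ics(events_json):
    events = json.loads(events_json)
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//Example//Timetable//EN"]
    for ev in events:
        # ICS wants compact local timestamps like 20241001T090000
        start = datetime.fromisoformat(ev["start"]).strftime("%Y%m%dT%H%M%S")
        end = datetime.fromisoformat(ev["end"]).strftime("%Y%m%dT%H%M%S")
        lines += [
            "BEGIN:VEVENT",
            f"UID:{ev['id']}@example.ac.uk",   # placeholder domain
            f"DTSTART:{start}",
            f"DTEND:{end}",
            f"SUMMARY:{ev['module']}",
            f"LOCATION:{ev['room']}",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    # ICS lines are CRLF terminated
    return "\r\n".join(lines) + "\r\n"
```

The same file then serves both use cases mentioned above: one-off import, or periodic subscription if it is republished at a stable URL.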

It was then suggested as part of the project that we investigate how we could force calendar subscriptions to these ICS files in Office365.  The existing staff subscriptions are optional and each individual staff user has to subscribe themselves.  Not many do to be honest, and this option would potentially create a lot of work for the service desk and IT support folk when students are migrated, and then at the start of each year.  If the students could have the subscription to the pre-generated ICS file made for them, and kept up to date (so that if they accidentally delete it, it gets reinstalled automatically within 24 hours), we’d have something that would look very much like the existing Google calendars solution.

A quick search (via Google!) showed that quite a few people have asked before how to create subscriptions to Internet calendars automatically… and the usual answer that comes back from Microsoft is that you can’t.  Indeed we’d asked initially if the Graph API could do this, but got told “not yet supported”, although it was a “great scenario”.  This is rather surprising when you consider that Microsoft are selling Office365 partly on its calendaring abilities. We know the underlying Exchange system beneath Office365 can do it, as you can set it up from Outlook Web Access (which appears to use its own proprietary API that looks like a JSON encapsulation of EWS with some extra bits tacked on!).

Intrigued (and tasked with “investigate” by the project), we attempted to hunt down how Office365 sets up and stores Internet calendar subscriptions.  This was a long and tortuous path, involving sifting through large amounts of EWS and Exchange protocol documentation, using MFCMAPI to look at the “hidden” parts of Exchange accounts, quite a lot of trial and error, and an awful lot of bad language! 😉

It transpires that subscriptions to Internet shared calendars generate what appears to be a normal calendar folder under the “calendars” distinguished folder ID.  This calendar folder has a “Folder Associated Item” attached to it, with a class of “IPM.Sharing.Binding.In”.  We’re not sure what that associated item is for yet, but it doesn’t appear to contain the URL pointing to the remote ICS file.  It’s most likely metadata used by the internal system for keeping track of the last access, etc.

The Internet calendar file URL itself is actually stored in a completely different item, in a different folder.  There is a folder called “Sharing” in the root folder (note this is the user’s top level root folder, above the “msgrootfolder” that contains the mailbox, calendars, etc) and this contains items for each internet calendar subscription, including ones that have been deleted from the OWA front end.  These items are in the class “IPM.PublishingSubscription” and, just to make something that is hard even harder, the ICS URL is hidden in some “Extended Properties”. MFCMAPI is a handy tool in the Exchange hacker’s toolbox!

Extended Properties are effectively places that the underlying MAPI objects provide to applications to store custom, non-standard data. You can access and manipulate them using EWS, but only if you know something about them.  The GetItem method in EWS lets you ask for an ItemShape that returns all properties, but it transpires that Microsoft’s idea of “all” means “all the standard properties, but none of the extended ones”. Luckily the MFCMAPI application uses MAPI rather than EWS and so exposes all of these Extended Properties.

The Extended Properties that appear to contain Internet calendar subscription information appear in the property set with the GUID ‘F52A8693-C34D-4980-9E20-9D4C1EABB6A7’.  Thankfully they are named in the property set so we can guess what they are:

Property Tag   Property Name                          Example Contents
0x8801         ExternalSharingSubscriptionTypeFlags   0
0x80A0         ExternalSharingUrl
0x80A1         ExternalSharingLocalFolderId           LgAAAACp0ZZjbUDnRqwc4WX[…]
0x80A2         ExternalSharingDataType                text/calendar
0x80A6         ExternalSharingRemoteFolderName        Timetable for
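As a concrete illustration, EWS addresses these named properties by the property set GUID plus the property ID in decimal, so asking for (say) ExternalSharingUrl needs an ExtendedFieldURI element along these lines. This is a sketch: the PropertyType="String" default is an assumption that fits text values like the ICS URL.

```python
# Sketch: build the ExtendedFieldURI element that asks EWS for one of the
# named properties above. EWS wants the property set GUID plus the property
# ID in decimal (0x80A0 = 32928). PropertyType="String" is an assumption
# that fits text-valued properties such as the URL.
PROPERTY_SET = "F52A8693-C34D-4980-9E20-9D4C1EABB6A7"

def extended_field_uri(prop_tag_hex, prop_type="String"):
    prop_id = int(prop_tag_hex, 16)  # hex tag from the table, converted to decimal
    return (f'<t:ExtendedFieldURI PropertySetId="{PROPERTY_SET}" '
            f'PropertyId="{prop_id}" PropertyType="{prop_type}"/>')
```

These elements go into the AdditionalProperties part of a GetItem ItemShape, which is how you work around the “all properties (except the extended ones)” behaviour described above.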

We guessed that the ExternalSharingLocalFolderId Extended Property would point to the normal calendar folder.  It does, but there’s still a twist: EWS returns this ID in a different format to all the others.  Normally, EWS returns folder and item IDs as Base64 encoded binary strings in a format called “EwsId”.  Whilst ExternalSharingLocalFolderId is indeed a Base64 encoded binary string, it is in a different format called “OwaId”. If you feed an “OwaId” format identifier that you’ve got from a FindItem or GetItem call back into another EWS call, you’ll get an error.  You have to take the Base64 encoded “OwaId”, pass it through an EWS ConvertId call to get a Base64 encoded “EwsId”, and then use that.  No, it doesn’t make any sense to us either!
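For reference, the ConvertId call involved looks roughly like this as a SOAP body. This is a sketch built as a plain string; the mailbox SMTP address is a placeholder.

```python
# Sketch: the EWS ConvertId SOAP body that turns the "OwaId" form of
# ExternalSharingLocalFolderId into an "EwsId" usable in later EWS calls.
# The mailbox address is a placeholder value.
def convert_id_request(owa_id, mailbox):
    return (
        '<m:ConvertId DestinationFormat="EwsId">'
        '<m:SourceIds>'
        f'<t:AlternateId Format="OwaId" Id="{owa_id}" Mailbox="{mailbox}"/>'
        '</m:SourceIds>'
        '</m:ConvertId>'
    )
```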

So that lets us get the ICS calendar data back out for calendar subscriptions.  We could theoretically then generate forced subscriptions by creating these folders/items and filling in the extended properties with suitable data.  However, from the point of view of providing a supportable, production service, we decided that we’d gone too far down the Exchange rabbit hole. We would be relying on internal details of Exchange that Microsoft haven’t documented, and indeed have said aren’t supported. At the moment we’re deciding what to do instead. The default fallback is to just give the students the ICS calendar file URL and explain to them how to set up subscriptions using the OWA web interface. This will give a poorer user experience than the current Google calendars do, but there’s not really much else we can do unfortunately.

Getting an X.509 certificate into an Office365 Azure AD app manifest

We’ve been playing with Office365 apps as part of the preparation for a move from Google Apps for Education to Office365 for our students (not my idea!). I’ve been trying to use the new REST based Microsoft Graph API to talk to Office365 from a daemon process so that we can update timetable information in calendars nightly (which is what we do at the moment with Google’s calendar API, and it works well). Unfortunately Graph API is relatively new and isn’t really ready for prime time: for one thing it doesn’t support a daemon process using the confidential client OAuth2 authentication flow creating/updating calendar entries on Unified Groups (even though it does support the same daemon creating and deleting the Unified Groups themselves and adding/removing members. No, I’ve no idea why either… I assume it’s just because it is a work in progress!).

So the next trick to try is to use Exchange Web Services (EWS) via SOAP to see if that can do what Graph API can’t yet. EWS can use OAuth style bearer tokens, so I hoped I could use the nicer Graph API where possible and just have to put the nasty SOAP stuff in a dark corner of the code somewhere. Unfortunately, the SOAP OAuth didn’t like the access tokens that Graph API was giving back: it complained that they weren’t strong enough.

It turns out that this is because I was using a client ID and secret with the Graph API’s OAuth2 code to get the access token, but EWS SOAP calls require the use of X.509 certificates. And this is where, once again, developing against Office 365 gets “interesting” (or, to put it another way, “massively painful”). The Azure AD management portal offers a nice interface for managing client IDs/secrets but no user interface for doing the same with X.509 certificates. So how do you link an X.509 certificate to an Azure AD app?

The answer is via the “app manifest”. This is a JSON format file that you can download from the Azure AD management portal’s page for the app. It’s also very lightly documented if you don’t happen to be sitting at a desk in the bowels of Microsoft. Luckily there are very helpful Microsoft folk on Stack Overflow who came to my rescue as to where I should stick my certificate information to get it uploaded successfully.  My certificates were just self signed ones generated with openssl:

openssl req -x509 -days 3650 -newkey rsa:2048 -keyout key.pem -out cert.pem

The key information I was missing was that the X.509 information goes into its own section in the app manifest JSON file – an array of hashes called “keyCredentials”.  In the original app manifest I’d downloaded, this had been empty, so I’d not noticed it. The structure looks like this:

"keyCredentials": [
  {
    "customKeyIdentifier": "++51h1Mw2xVZZMeWITLR1gbRpnI=",
    "startDate": "2016-02-24T10:25:37Z",
    "endDate": "2026-02-21T10:25:37Z",
    "keyId": "<GUID>",
    "type": "AsymmetricX509Cert",
    "usage": "Verify",
    "value": "MIIDvTCCAqWgAwIBAgIJAJ+177jyjNVfMA0GCSqGSJWTIb3DQEBBQUAMHUxCzAJBgN<Lots more Base64 encoded PEM certificate data here...>"
  }
],

The keyId is any old GUID you can generate and isn’t anything to do with the cryptography. The customKeyIdentifier is the fingerprint of the X.509 certificate, and the value field contains the Base64 encoded PEM data for the certificate without its ASCII armour. The startDate and endDate fields have to match up with the corresponding lifetime validity timestamps minted into the certificate (the Azure AD management portal will barf on the upload if they don’t).
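If it helps, the two derived fields can be computed from the PEM file without openssl at all: customKeyIdentifier is the Base64 of the SHA1 hash of the certificate’s DER bytes, and value is just the PEM body with the armour lines removed. A quick Python sketch of that logic:

```python
# Sketch: derive the keyCredentials fields from a PEM certificate.
# customKeyIdentifier = Base64(SHA1(DER bytes)); value = PEM body with
# the "-----BEGIN/END CERTIFICATE-----" ASCII armour stripped.
import base64
import hashlib

def key_credential_fields(pem_text):
    # Keep only the Base64 body lines between the armour markers.
    body = "".join(line.strip() for line in pem_text.splitlines()
                   if "-----" not in line and line.strip())
    der = base64.b64decode(body)
    fingerprint = base64.b64encode(hashlib.sha1(der).digest()).decode()
    return {"customKeyIdentifier": fingerprint, "value": body}
```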

One nice feature is that a single app can have multiple authentication methods available. Not only can you have more than one client ID/secret pair (which you need to have as they have a lifespan of at most 2 years, so every year you’ll need to generate a new one before the old one dies), but it appears that having the X.509 certificates in place does not stop the easier client ID/secret OAuth2 access from working.

OK, so let’s have a bit of Perl that can get us a suitable Bearer access token once we’ve got the X.509 certificate stuffed in place inside the Azure app manifest:


# Script to test getting an OAuth2 access token using X.509 certificates.

use strict;
use warnings;
use LWP::UserAgent;
use Data::Dumper;
use JSON;
use JSON::WebToken;
$| = 1;

my $tenantUuid = '<Our Azure Tenant ID>';
# The Azure AD (v1) token endpoint for our tenant.
my $tokenEndpoint = "https://login.microsoftonline.com/$tenantUuid/oauth2/token";

# Slurp in the certificate and private key PEM files.
open(CERT, '<', 'cert.pem') or die "Cannot open cert.pem: $!";
my $cert = '';
while (my $line = <CERT>) {
    $cert .= $line;
}
close CERT;

open(KEY, '<', 'key.pem') or die "Cannot open key.pem: $!";
my $key = '';
while (my $line = <KEY>) {
    $key .= $line;
}
close KEY;

# Turn the certificate's SHA1 fingerprint into the Base64 form that the
# JWT header's x5t field expects.
my $fingercmd = '(openssl x509 -in cert.pem -fingerprint -noout) | sed \'s/SHA1 Fingerprint=//g\' | sed \'s/://g\' | xxd -r -ps | base64';
my $fingerprint = `$fingercmd`;
chomp $fingerprint;

my $now = time();
my $expiryTime = $now + 3600;
my $claims = {
    'aud' => $tokenEndpoint,
    'iss' => '< Our client ID for the app >',
    'sub' => '< Our client ID for the app >',
    'jti' => '< Our client ID for the app >',
    'nbf' => $now,
    'exp' => $expiryTime,
};

my $signedJWT = JSON::WebToken->encode($claims, $key, 'RS256',
                                       { 'x5t' => "$fingerprint" });

my $authorizeForm = {
    grant_type            => 'client_credentials',
    client_id             => '< Our client ID for the app >',
    resource              => '< Resource URI for the API >',
    scope                 => 'Group.ReadWrite.All',
    client_assertion_type => 'urn:ietf:params:oauth:client-assertion-type:jwt-bearer',
    client_assertion      => $signedJWT,
};

my $ua = LWP::UserAgent->new;
my $response = $ua->post($tokenEndpoint, $authorizeForm);
if (!$response->is_success) {
    warn "Failed to get access token: " . $response->status_line . "\n";
    die "Response dump:\n\n" . Dumper($response) . "\n";
}

my $json = JSON->new->allow_nonref;
my $oauth2_info = $json->decode($response->decoded_content);

print Dumper($oauth2_info);

The $oauth2_info structure contains the required access token, plus its expiry information and a few other bits and bobs. The access token is used with SOAP requests by adding an HTTP Authorization header containing the word “Bearer”, a space and then the access token as returned (the access token will be a Base64 encoded JWT itself).  If you need the WSDL file for the SOAP requests, you can get it from the EWS endpoint (though note you’ll need to authenticate to get it!).

One last nuance: for Office365 you seem to need to tell it what domain it’s talking to. I don’t know why – once it has decoded and verified the authorization bearer access token you’d think it should know what client ID you’re using, and thus what Azure AD domain it is in. It doesn’t.  So, you need to add another HTTP header called “X-AnchorMailbox” that contains an email address of a person/resource in the domain you’re playing with.  Once that’s there alongside the Authorization header with the Bearer access token, you’re good to go with some SOAPy EWS requests.
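Pulling that together, the headers on each EWS SOAP request end up looking like this (a trivial Python sketch; the token and anchor address are placeholders, and the Content-Type shown is the usual one for SOAP 1.1):

```python
# Sketch: the HTTP headers an EWS SOAP request needs in this setup.
# Token and mailbox address are placeholder values.
def ews_headers(access_token, anchor_mailbox):
    return {
        "Authorization": f"Bearer {access_token}",   # Base64 encoded JWT
        "X-AnchorMailbox": anchor_mailbox,           # someone in the target domain
        "Content-Type": "text/xml; charset=utf-8",   # usual SOAP 1.1 content type
    }
```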

Apps for Education

Google Apps for Education allows HE and FE institutions to make use of Google Apps services for their staff and students. This includes services such as email, calendaring and file store.

At Loughborough University we’ve been investigating how to integrate some of the Google Apps for Education services into our local IT ecosystem to improve the student experience. What follows is a brief overview of two of these developments – one for Google Drive and another extending the use of Google Calendar.

Google Drive

We were asked to look at using Google Drive to provide additional file store for students. This would not only cut down on our file storage costs for students, but they would also be able to more easily access their files when off campus or using mobile devices. Additionally they could potentially share their files with 3rd parties (eg work placement employers, friends at other institutions, etc) which would be difficult to do with locally held file stores.

Google provide a web browser based user interface to this service and also a Windows file system plugin. We had hoped that the Windows plugin could be used on our lab PCs around campus to give students a seamless view of the Google Drive file store, but unfortunately it appeared to only use the Google credentials for the user that originally installed it. This was fine on staff machines with only one user, but our lab machines may have many people using them and are locked down to prevent students installing new software.

We therefore looked to the Google Drive Application Programming Interface (API) to see if there was a way we could work round this.  As well as providing multiple users on the same machine with access to the file store, we really wanted to avoid having to ask the user for their password again to access the Google Drive. Instead we intended to make use of the Active Directory log ins that they had already made on the lab PCs.

We decided the easiest way to support this was to use a Linux machine running Samba as a “bridge” between the Windows file sharing world and the Google Drive APIs.  Samba is a free, open source implementation of the Common Internet File System (CIFS), which is the underlying protocol that supports Windows file sharing.  As well as being open source, Samba also has the concept of Virtual File System (VFS) modules, which are extensions that allow developers to intercept CIFS operations to provide additional services.

After testing basic functionality via the pre built PHP client libraries, we started prototyping in Perl in order to work out how to generate the OAuth2 delegated authority JSON Web Token (JWT). OAuth2 allows a Google Apps service account to authenticate to Google’s servers and then act on behalf of one of our Google Apps users.  These prototype Perl scripts were designed to be “thrown away” – they merely allowed us to experiment rapidly in a comfortable development environment.
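The heart of those prototypes is the claim set inside the JWT. For illustration, here’s a Python sketch of the delegated-authority claims for a Google service account; the email addresses and scope are placeholders, and signing the JWT with the service account’s RSA key (which the real scripts do) is omitted here.

```python
# Sketch: claim set for a Google service-account JWT with delegated ("sub")
# access to one user's account. RS256 signing with the service account's
# private key is omitted; addresses and scope are placeholders.
import time

def delegation_claims(service_account_email, user_email, scope):
    now = int(time.time())
    return {
        "iss": service_account_email,  # the service account acting...
        "sub": user_email,             # ...on behalf of this user
        "scope": scope,
        "aud": "https://oauth2.googleapis.com/token",  # Google's token endpoint
        "iat": now,
        "exp": now + 3600,             # Google caps JWT lifetime at one hour
    }
```

Posting the signed JWT to the token endpoint returns the limited-lifespan bearer access token described below.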

Once we had the Perl prototype authenticating successfully and issuing the RESTful Google Drive API calls, we then turned our attention to implementing a Samba VFS module in C. Whilst using the Samba version 4 distribution we were actually working with the Samba 3 part of the code, as much of the new Samba 4 code is concerned with Active Directory (AD) domain controller provision, and all we needed our bridge server to be was a fileserver bound into the campus AD.

This new VFS module would first take the username of the user making the CIFS connection and then look up their email address in the campus AD using LDAP.  Once the email address is returned it can then be used to create a JWT for passing to Google’s OAuth2 service to authenticate the delegated service account access to the user’s Google Drive account. A successful login returns a limited lifespan bearer access token which is then used with subsequent RESTful Google API calls to authenticate each transaction.

The rest of the Samba VFS module coding was concerned with mapping the CIFS operations to Google Drive operations.  This is quite difficult: not only do CIFS and Google Drive have some underlying differences in the way the file system structure is designed, requiring workarounds in the VFS module, but we also had to learn about the CIFS protocol itself.  CIFS has been developed over many years, with additions made as new operating systems and services have been deployed, often with little or no documentation.

Eventually we managed to provide enough mappings for the following simple operations to work:

  1. file or directory creation,
  2. file or directory deletion,
  3. file contents editing,
  4. directory listings,
  5. stat() type system calls to return file information (some of which we had to fake!),
  6. file contents caching (to improve performance),
  7. simple whole file locking on the Windows side (there is no real concept of locking on the Google Drive side as it is naturally a shared space).

The result of our development is that we can now mount one (or even more) Google Drive file stores on a Windows PC simply by doing a CIFS file sharing mount onto our “bridge” Linux server.  Performance is not stellar but is acceptable for many users – caching the files returned from Google on the file store of the bridge server helps a lot, especially as many people read lots of files but only edit a few in any one session.


Google Drive shown being accessed via both Google’s own web interface and our Samba CIFS bridge


At the moment it is being tested by a number of staff on campus and will be rolled out to shared lab PCs in the Library shortly. One additional advantage of the Samba bridge is that it allows non-Windows machines to also mount Google Drive as a file store.  For example Linux machines can now mount Google Drive storage using this system, which is a platform that Google themselves do not yet support directly.


Google Calendar

The student Google Apps users get access to Google’s excellent Calendar service. However there are other, pre-existing University calendars with information of interest to students held elsewhere. One example of this is the lecture, lab and tutorial timetabling system. We were asked if we could bring the timetable data into the Google Calendar system so that students would have their timetables available alongside their personal calendar entries.

There were a number of issues to solve here. We could have created an extra calendar for each student and injected their personal timetable into it, but this would have required many thousands of calendars to be created.  As lecture details often change during the term, we would also have many more events to maintain. It would not simply be a case of creating the calendar events at the start of the year, term or semester, as the calendars would have to be updated daily.

Instead we decided to create a new calendar for every module being taught. This was still around 1500 calendars for one semester but we then further sub-divided them into the “part” of the course (first years are Part A, second years are Part B) with six new users being created to handle the calendars for one part each.


A view of the first year Mathematics undergraduate module timetables for one week in a Google Calendar.

Google Calendar API then allows us to share an individual module calendar from the “part user” to the student users that are taking that module.

Our system needs at least three different Perl scripts to run:

  1. Create the initially empty module calendars in each of the “part user” accounts (this is the most heavily time dependent part, due to API limitations on the speed and number of calendars you can create at any one time),
  2. Extract data from our existing timetable system’s Oracle database and inject these as events in the appropriate module calendar. This script also needs to handle events that were once in the timetabling system but which have been deleted or changed,
  3. Attach the correct module calendars to each student user. This is done by interrogating both the module group membership from our Active Directory and current state of module calendar sharing, and then issuing the required remove/add ACL rules using the Google Calendar API.
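For example, the ACL rule that the third script issues for each student/module pair is a small JSON structure. A Python sketch of the request body (the role and scope type are the standard Calendar API ACL values; the address is a placeholder):

```python
# Sketch: request body for sharing a module calendar with one student via
# the Google Calendar API's ACL insert method. "reader" grants read-only
# access; the email address is a placeholder.
def acl_rule(student_email):
    return {
        "role": "reader",
        "scope": {"type": "user", "value": student_email},
    }
```

The script diffs the current ACL state against the module membership from Active Directory, so each run only issues the add/remove operations actually needed.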

Alongside the ability to share a full calendar of module lectures with our students, we are also able to inject events into their primary calendar directly, using the same core Perl modules to manage the OAuth2 delegation. We have begun to use this by creating “all day” events listing a student’s library books due for return.
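An all-day event differs from a timed one in that its start and end use a date rather than a dateTime. A Python sketch of the kind of event body involved (the title, date and reminder settings are illustrative):

```python
# Sketch: an all-day "library books due" event body for the Calendar API.
# All-day events use "date" rather than "dateTime"; the reminder override
# illustrates per-event notification control. Values are illustrative.
def book_due_event(title, due_date):
    return {
        "summary": title,
        "start": {"date": due_date},   # YYYY-MM-DD, no time component
        "end": {"date": due_date},
        "reminders": {
            "useDefault": False,
            "overrides": [{"method": "popup", "minutes": 9 * 60}],
        },
    }
```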


A Google Calendar showing a library book return reminder entry.

Thanks to the APIs we have created for our library systems, we are able to deduce which students have books due for return in seven days and can create events, with controlled notifications, in the student’s calendar prior to the book’s return date.

We are currently testing these systems, with a subset of the students, to gain feedback and aim to go live for all undergraduate and taught course students in Autumn Term 2014.