Update: Wednesday, 8th January 2014. This post has seen an incredible amount of traffic, which I have found to be a very rewarding experience – thank you! I’ve lived up to my word and managed to export the speaker information. Both the spreadsheet and PowerShell script have been updated to include this information.
Something I have found frustrating with the SharePoint Conference 2014 website over the holidays is that you have to browse through the sessions as pages of search results. It makes planning how I want to fill my days at the conference very difficult.
I also wanted to sit down with my colleagues to see if there are any particular sessions that interest them. Without all the sessions available in a format such as a spreadsheet, this would become a very tiresome task.
There was no way I was going to do this by hand – at the time of writing there are about 183 published sessions and 12 pieces of information per session, which would require me to copy and paste 4392 times and click between a browser and Excel 600 times… no thank you.
Jumping the gun a little maybe, as I have yet to register (this is top of my to-do list when I’m back in the office on Monday, and I’ll kick myself if this is available when you register!), but I broke out PowerShell and wrote a script to download all the information for the sessions from the SharePoint Conference (#SPC14) website to a spreadsheet – both of which you can download.
By no means is this script particularly complex or elegant – but I really wanted this information in a spreadsheet, and pretty fast, so forgive me if it is not up to my usual standard… the key thing is I achieved what I set out to do and can share it with you all. The last piece of information which I’m still trying to export is the speakers for the sessions – I’ll update if I manage it.
I hope you find the #SPC14 session spreadsheet helpful – see you at the conference!
Each time the conference comes around I’d try to make a business case or, as in recent weeks, plead to attend, but for various reasons it has never been possible. This year, however, the answer was yes!
It’s been a long wait but I think this is for the better – I now have far more experience with SharePoint than ever and feel that the company and I will gain more by attending this time around. My focus has shifted and I will no longer be attending the conference with just SharePoint in mind, but instead with SharePoint, Cloud & Windows Azure, Office 365 & SharePoint Online and Yammer to think about!
A few highlights I am looking forward to:-
Meeting some of my peers who I have followed online for so long (Todd Klindt, Bill Baer, Wictor Wilén and Spencer Harbar to name a few)
Attending some of the outstanding sessions that I’m sure speakers such as Shane Young, Laura Rogers, Fabian Williams, Jennifer Mason, Andrew Connell and Joel Oleson will be delivering
Sessions such as Office 365 identity federation using Windows Azure and Windows Azure Active Directory; Best practices for Information Architecture and Enterprise Search and Real-world SharePoint architecture decisions
Networking with other like-minded people
Obtaining the invaluable material that I will have access to through attending the conference
Learning more about what is or will become available to those who wanted to pursue the Microsoft Certified Solutions Master (MCSM) certification now that the programme is no more
Discovering what the next big thing for SharePoint might be
I will no doubt be blogging about the conference before, during and after the event – you can follow all of this content with the tag #SPC14. The fun of organising this trip now begins!
It’s official: you can now customise the Office 365 login page with your own branding – OK, not quite yet, but Microsoft has just released a preview feature that will soon allow us to.
The dull Californian highway image that we are all accustomed to can now be replaced with your own image, along with your own logo and login information, which is impressive in itself – but these elements can also be localised for different languages. This is all achieved with a preview feature Microsoft has recently released for Windows Azure Active Directory Premium, which of course underpins Office 365. Pricing for this feature is not yet available but I imagine it will be published soon.
What can be customised?
Microsoft has made the following elements of the login page customisable.
The “Banner Logo” which is displayed on the sign-in page and access panel.
The “Sign In Page Illustration” displayed on the sign in page to the left of the login form.
The “Sign In Page background colour” visible when there is no sign in page illustration present or for low bandwidth connections.
The “Sign In Page Text” that appears below the login form and can be used to give more information to users such as where to get support.
The “Tile Logo” which is not currently used but might in future replace the “organisational account” pictogram.
A “User ID Label” which again is not currently used but could be set to “Company email” or “User ID”.
To enable the preview, browse to the Active Directory page in the Windows Azure Management Portal and select your Office 365 directory.
Click on the “Enable Active Directory Premium” link on the summary page.
Then from the summary page click on the “Customise your organisation’s Sign In and Access Panel pages” link where you will be able to upload your logo and other assets.
As this is a preview feature, Microsoft has decided that for the first few weeks of the preview users must opt in on each device to experience the customised sign-in page. To opt in, you must visit https://login.microsoftonline.com/optin.srf.
Demo customised sign-in pages
Microsoft has also provided some fictitious demo sign-in pages that you can access to experience a customised login page.
Last week for my birthday I treated myself to a new Canon PowerShot SX280 HS digital camera and shelved my rather old and tired Canon IXUS 220 HS. This was in part because I could no longer look at blurred and distorted images. Before anyone comments, this was not due to my shaky hands but because the camera has been a gem and put up with a battering over the last three years – in fact, I’m surprised it lasted this long!
By no means is this post meant to be an in-depth feature-by-feature review – I read what felt like a million of these before deciding on this camera, so instead I’ve shared some of those reviews with you below. I wanted to use this post to share my first thoughts on the camera, how it immediately delivered on my surprise trip to Berlin for my birthday, and generally how excellent I think the Canon PowerShot SX280 HS is.
A lot of the reviews I read highlighted an issue with the firmware whereby the battery becomes flat after only a few seconds of video recording. With this in mind, I updated the firmware, which fixed the issue, and touch wood I’ve not had any problems thus far.
The reviews also advised purchasing a spare battery if you want to use the GPS logging feature due to how much battery the feature uses. As I quite liked the idea of using the GPS logger, especially on the trip to Berlin, I bought a couple of spares! Even with this feature enabled I was still getting pretty much two-thirds of a day of rather intensive use out of the camera, but it was reassuring knowing I had a spare battery in my pocket. One thing to remember is to disable the GPS logger when you’re not using it – I forgot, and after a few days of not using the camera I went to use it only to find the battery was flat.
Using the built-in WiFi you can share pictures and video to various services such as Facebook and Flickr. You have to configure these services on the camera via your computer, which I found quite tedious, but after a little patience and perseverance, 30 minutes later I had set up all my sharing accounts. It’s really neat being able to sync pictures straight to my phone using the Canon CameraWindow iPhone app or send them straight to my Flickr account.
My favourite feature on the camera is the Hybrid Auto mode, which records a small four-second video clip prior to every picture you take; these are then merged to form a movie of your day known as Movie Digest. I found that taking lots of Hybrid Auto pictures causes the camera to slow down while it saves and merges the movie, but using a high-speed SD card resolved this.
The 20x optical zoom on the camera is incredible, but at this level of zoom I found my best photos were those taken with support from a Gorilla Grip tripod or similar.
One thing I would suggest is to play around with all the modes of the camera as there are a number of effects that you can apply to images along with some great functions that unfortunately are not so obvious until you navigate through the various menus.
I just love this camera: it has enough manual controls to keep me happy, and the Hybrid Auto mode is just perfect for instantly creating a fantastic video memory of the pictures you’ve taken throughout the day. Changing from my old Canon to my new one was almost seamless thanks to the way Canon has kept the user interface and controls very similar. The camera really lived up to all the reviews I read and even my expectations of Canon. Overall the camera was competitively priced against similar models on the market, and with Canon throwing in £30 cashback, it’s a bargain.
A customer asked me if I could help troubleshoot their SharePoint environment – they had extended a web application and configured it to use Forms Based Authentication (FBA) with SSL; however, they were getting errors when accessing the new site.
I started troubleshooting the configuration across all the servers in their SharePoint 2013 farm. I stepped through the configuration for the web application in Central Administration – reviewing the authentication provider settings and alternate access mappings. I then reviewed the web.config and made sure that the FBA settings were present and correct along with the IIS website bindings. This is when I noticed that there was no hostname against the https/443 binding – the option to add one was also disabled.
After a little research, I found an article from ArmgaSys. It turns out that my customer’s wildcard SSL certificate had been issued without an * in its Friendly Name, and as a result the hostname cannot be specified once that SSL certificate is selected in the binding. I followed the steps in the article and the customer was able to access their SharePoint site without any errors this time.
A summary of these instructions is included below:
To resolve this and make the hostname field editable, launch Microsoft Management Console (MMC) and open the Certificates snap-in.
Locate the wildcard certificate, right-click on it and select Properties.
If the Friendly Name property doesn’t start with a *, then add one and apply your changes.
Now go back to IIS and select the SSL certificate in the bindings of the SharePoint website with the issue.
The hostname field should now be editable; enter the hostname for your SharePoint site. (The same change can also be scripted – see the sketch below.)
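If you’d rather make the change from PowerShell than MMC, a minimal sketch is below. It assumes the wildcard certificate sits in the local machine Personal store and uses *.contoso.com as a placeholder for your own domain – adjust the filter to match your certificate and run it from an elevated prompt.

```powershell
# Sketch only: "contoso.com" is a placeholder - match on your own certificate's subject.
# Run from an elevated PowerShell prompt.
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*contoso.com*" } |
    Select-Object -First 1

# Prefix the friendly name with * so IIS lets you edit the host name on the binding
if (-not $cert.FriendlyName.StartsWith("*")) {
    $cert.FriendlyName = "*" + $cert.FriendlyName
}
```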
If you’re working with Windows Azure and want to use PowerShell to perform management tasks you will first need to install and configure Windows Azure PowerShell as per this article “How to install and configure Windows Azure PowerShell”.
Select Windows Azure PowerShell and then click Install
Launch PowerShell as an Administrator
Type Get-Help *Azure* to see all the Windows Azure cmdlets – you will be asked to update help
Download your Windows Azure subscription publish settings file by typing Get-AzurePublishSettingsFile or by browsing to the download publish profile page
Save the publish settings file to a directory – in my case I store this alongside my Windows Azure scripts directory that I have synchronised with Dropbox
Import the publish settings file by typing Import-AzurePublishSettingsFile <PathToPublishSettingsFiles>
Check that you are connected to your Windows Azure subscription by entering Get-AzureSubscription, which should return information about your subscription (the commands are pulled together in the sketch below).
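Pulled together, the publish-settings steps look something like this – the file path is only an example and will differ in your environment.

```powershell
# Requires the Windows Azure PowerShell module to already be installed.
# Opens a browser so you can sign in and download your .publishsettings file
Get-AzurePublishSettingsFile

# Import the file you saved - the path below is an example only
Import-AzurePublishSettingsFile "C:\AzureScripts\MySubscription.publishsettings"

# Confirm PowerShell can now see your subscription
Get-AzureSubscription
```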
Update: while the VIP address is guaranteed for the lifetime of the deployment, a customer recently lost their VIP address, which resulted in their custom domain name becoming unresolvable. Whilst this was acceptable, as we were still in a phase of testing, it did cause me some concern – why had the VIP address changed without our knowledge when we had not made any configuration changes to cause this? I did some further research and found an article (Using custom domain names with Windows Azure Cloud Service) in the Documentation section of the Windows Azure website which advises that you should use CNAME records and point these to your *.cloudapp.net domain name. I asked the customer to do this and we have been able to use the system since.
This post describes how I configured one of my Windows Azure hosted virtual machines with my domain name registrar’s DNS – this meant that I could make SharePoint 2013 available using my domain name.
On the virtual machine instances page in the Windows Azure Management Portal (https://manage.windowsazure.com), browse to the virtual machine you want to configure with your external DNS.
On the right, in the “Quick Glance” section, you will see that a “Public Virtual IP (VIP) Address” is listed (for security purposes I have changed my VIP to 111.111.111.111 in this post). The VIP address is the IP address I need to direct my external DNS to.
This VIP address is guaranteed to remain for the lifetime of the cloud service deployment – therefore, if the deployment is removed the VIP address will no longer be available. Corey Sanders has written a great article on the Windows Azure MSDN blog (http://blogs.msdn.com/b/windowsazure/archive/2011/07/06/windows-azure-deployments-and-the-virtual-ip-address.aspx) where he confirms that the VIP is guaranteed for the lifetime of the deployment and provides a great alternative if the virtual machine deployment has to be removed.
“If that is not possible (e.g. you must delete/deploy), then a little planning beforehand can still help here: just create a new hosted service, update CNAME and A record to new hosted service (but keep old deployment there). Wait 24 hours and it should be safe to delete the older deployment.”
I also decided to add a shorter TTL to my A record just in case the VIP address does ever change for whatever reason and I need to propagate a change quickly. I’m not sure if this is advisable or not and am seeking confirmation on this.
A quick test you can do before making any changes to your DNS is to browse directly to your VIP address (http://111.111.111.111). This in my case took me to the default IIS site however this will depend on your configuration.
After I confirmed that the VIP address was accessible, I then made the changes to my external DNS through my domain registrar’s control panel – in this example, I wanted to point a host record (or an A record) to my virtual machine.
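A quick way to sanity-check the change from PowerShell is to compare what your own record resolves to against the *.cloudapp.net name, which always resolves to the deployment’s current VIP. This is just a sketch – mycloudservice and sharepoint.contoso.com are placeholders for your own cloud service and domain names, and Resolve-DnsName needs Windows 8 / Server 2012 or later.

```powershell
# The *.cloudapp.net name resolves to the deployment's current VIP
Resolve-DnsName "mycloudservice.cloudapp.net" -Type A

# After updating the registrar's DNS, check your own record points at the same address
Resolve-DnsName "sharepoint.contoso.com" -Type A
```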
This post is quite a fun one. Whilst I was working with a customer today, someone came up to me and asked if it was possible to add tabs to their content pages. I gave it a few seconds’ thought and responded, “sure, that’s absolutely possible – leave it with me!”.
I then spent my commute home thinking about how tabs could be delivered for end users to make use of without them having to meddle around with any code. Sure, getting tabs to work in SharePoint is pretty straightforward and something we’ve all done on at least a couple of occasions, but I wanted to give more thought to making it easy for end users to consume rather than just meeting the customer’s requirement with a solution that is neither pretty nor easy to use.
Solution
I eventually decided to use what I thought was a very simple approach to giving users the option to use tabs. My solution makes use of the tabs widget from the jQuery UI (http://jqueryui.com/tabs/) library. It starts with a small modification to the master page that is currently being used. The following code should be added before the closing </head> tag.
I then added the following to the “Reusable Content” list in the root site of the site collection where I was adding tabs. Make sure that “Automatic Update” is unchecked for this piece of reusable content.
Below is the code that should be added to the Reusable HTML field.
To add the tabs to a content page, simply click on the “Insert” tab whilst editing the page, expand the “Reusable Content” menu and select the item that has just been added to the “Reusable Content” list.
Rich text that represents the HTML markup for the tabs is then added onto the page. Each tab is represented by a bullet list item “<li>” and a content area “<div>”. The names of tabs you require can then be added by carefully overtyping the existing tab names. You must be careful not to introduce or remove any markup as this might prevent the tabs from working correctly.
Once you have entered the names of the tabs you can then add the appropriate content by overtyping the content that you wish to include in that tab. This content can consist of rich text such as tables, images and also web parts. Again you must be careful not to introduce or remove any markup. Any tabs that are no longer required can be carefully removed by deleting the bullet list item and content area.
There are other ways to achieve the same result but I thought this was a simple approach using out-of-the-box functionality. Happy tabbing!
The default interval for Windows Azure Active Directory Sync (DirSync) synchronisations is 3 hours. If, for instance, your Active Directory has lots of changes, you may want to consider shortening the sync interval.
The schedule can be modified by changing the “Microsoft.Online.DirSync.Scheduler.exe.Config” configuration file. Before proceeding to make any changes to the sync interval you should evaluate how long it takes to complete synchronisation. You can do this by reviewing the application event log for entries that indicate when sync has started and completed.
To modify the configuration file, open “C:\Program Files\Windows Azure Active Directory Sync\Microsoft.Online.DirSync.Scheduler.exe.Config” in Notepad. You will then need to modify the value of the “SyncTimeInterval” key – the notation for this is Hours:Minutes:Seconds.
Save the configuration file and restart the “Windows Azure Active Directory Sync Service” Windows service (via PowerShell: Restart-Service MSOnlineSyncScheduler) to apply the change.
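If you’d rather script the change than edit the file in Notepad, a sketch along the following lines should work. It assumes the default install path and that the setting is an appSettings key named SyncTimeInterval, so check your own config file first and run the script from an elevated prompt.

```powershell
# Sketch only - assumes the default DirSync install path and the SyncTimeInterval appSettings key.
# Run from an elevated prompt (the file lives under Program Files).
$configPath = "C:\Program Files\Windows Azure Active Directory Sync\Microsoft.Online.DirSync.Scheduler.exe.Config"

[xml]$config = Get-Content $configPath
$setting = $config.configuration.appSettings.add | Where-Object { $_.key -eq "SyncTimeInterval" }
$setting.value = "1:0:0"    # Hours:Minutes:Seconds - sync every hour instead of every three

$config.Save($configPath)

# Restart the sync service so the new interval takes effect
Restart-Service MSOnlineSyncScheduler
```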
When configuring Windows Azure Active Directory Sync (or DirSync as it was previously known) it’s useful to be able to run various synchronisation tests. The default synchronisation schedule is 3 hours so unless you want to wait you will need to force a full synchronisation using PowerShell.
To do this you need to load the Windows Azure Active Directory Sync PowerShell module and run a cmdlet. Start by navigating to “C:\Program Files\Windows Azure Active Directory Sync” in PowerShell and then run “.\DirSyncConfigShell.psc1” from this directory. This will launch a new PowerShell console with the Windows Azure Active Directory Sync PowerShell module loaded (Add-PSSnapin Coexistence-Configuration). Then to force a full synchronisation you need to run the Start-OnlineCoexistenceSync cmdlet.
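As a single run-through, those steps look something like this (default install path assumed):

```powershell
# Launch the DirSync configuration shell (loads the Coexistence-Configuration snap-in)
cd "C:\Program Files\Windows Azure Active Directory Sync"
.\DirSyncConfigShell.psc1

# Then, in the new console that opens, kick off the synchronisation
Start-OnlineCoexistenceSync
```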
You can verify that synchronisation has occurred by reviewing the application event log on the server running DirSync – there should be several items in the log such as “Directory Synchronization, Event ID 114, Export cycle completed”. There is also an Active Directory synchronisation status shown on the “Users and Groups” page in the Office 365 admin portal. There are two other ways to see the status of synchronisation jobs which I will go into in more detail in a later post – these involve using the Forefront Identity Manager (FIM) client and the Fiddler web debugging proxy.
You can create a shortcut to “C:\Program Files\Windows Azure Active Directory Sync\DirSyncConfigShell.psc1” on the desktop for ease of administration. I, however, take this one step further and create a shortcut that performs a synchronisation as well – an example target is shown below.
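As a sketch of what that shortcut target could look like – the powershell.exe path below is the default 64-bit location and the DirSync path is the default install location, so adjust both for your own system:

```powershell
# Example shortcut target (all on one line): opens the DirSync console file and runs the sync cmdlet
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -PSConsoleFile "C:\Program Files\Windows Azure Active Directory Sync\DirSyncConfigShell.psc1" -Command "Start-OnlineCoexistenceSync"
```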
The first batch of our sloe gin has been prepared and laid down until at least Christmas – or should that be for as long as I can keep my hands off it!!!
The Method
The method I decided to use was to prick the sloes multiple times by hand with a fork and divide them between some lovely 1L Kilner bottles. I then filled the bottles to one-third of their size with sloes and topped up the remaining two-thirds with Gordon’s gin.
I decided not to add any sugar at this stage as I wanted to allow the sugar from the sloes to be extracted first (I always find sloe gin to be on the sweet side and I have a very sweet tooth). That’s not to say I won’t be adding any sugar, but not for a few weeks at least and when I do I have been advised to either use honey or to heat the sugar in a small amount of water to allow the sugar to dissolve more easily before adding it to the gin. The bottles will be given the occasional shake and swirl to ensure that all that lovely flavour from the sloes mixes in properly.
I intend to leave the gin flavouring until Christmas (that’s almost three months!) and may even hold back one bottle for longer to see how it changes with even more time… but hopefully I will be able to make a few more bottles over the coming weeks.
If you’re visiting Cromer anytime soon then you must try and have a meal in The Red Lion on the seafront. It was very reasonable for two of us to have two courses there – both of which were very delicious and adequately filling.
The Red Lion, Cromer, Norfolk, NR27 9HD (01263 514964)