The blog of Wictor Wilén

  • SPC14: Scripts for Mastering Office Web Apps 2013 operations and deployments

    Tags: Office Web Apps, Presentations, WAC Server, Exchange 2013, SharePoint 2013

    Here’s another post with scripts from my sessions at SharePoint Conference 2014 – this time from the Mastering Office Web Apps 2013 Operations and Deployments session (SPC383). To get a more in-depth explanation of all the details, please watch the recording at Channel 9.

    Let’s start…but first! OWA = Outlook Web App and WAC = Office Web Apps (Web Application Companion).

    Preparing the machine before installing Office Web Apps

    Before you install the Office Web Apps bits on the machine you need to install a set of Windows Features. The following script is the one you should use (not the one on TechNet) and it works for Windows Server 2012 and Windows Server 2012 R2.

    # WAC 2013 preparation for Windows Server 2012 (R2)
    Import-Module ServerManager
    # Required Features
    Add-WindowsFeature NET-Framework-45-Core,NET-Framework-45-ASPNET
    # Recommended Features
    Add-WindowsFeature Web-Stat-Compression,Web-Dyn-Compression
    # NLB
    Add-WindowsFeature NLB, RSAT-NLB

    Note that I also add the NLB features here – they are not required if you use another load balancer.

    Starting a sysprepped WAC machine

    The following script is used when booting up a sysprepped machine that has all the Office Web Apps binaries installed (including patches and language packs), but no WAC configuration whatsoever. It simply configures a NIC on the box, joins the machine to a domain and renames the machine. A very simple script that can be automated. Most of the scripts, just like this one, contain a set of variables at the beginning, which makes them much easier to modify and work with.

    $domain = "corp.local"
    $newName = "WACSPC2"
    $ou = "ou=WAC,dc=corp,dc=local"
    $ethernet = "Ethernet1"
    $ip = "" 
    $prefix = 24
    $dns = ""
    # Set IP
    Write-Host -ForegroundColor Green "Configuring NIC..."
    New-NetIPAddress -InterfaceAlias $ethernet -IPAddress $ip -AddressFamily IPv4 -PrefixLength $prefix 
    Set-DnsClientServerAddress -InterfaceAlias $ethernet -ServerAddresses $dns
    # Verify 
    # Get creds
    $credentials = Get-Credential -Message "Enter credentials with add computer to domain privileges"
    # Join domain
    Write-Host -ForegroundColor Green "Joining domain..."
    Add-Computer -Credential $credentials -DomainName $domain -OUPath $ou
    # Rename
    Write-Host -ForegroundColor Green "Renaming machine..."
    Rename-Computer -NewName $newName
    # Reboot
    Write-Host -ForegroundColor Green "Restarting..."
    Restart-Computer

    Once this script is executed the machine should reboot and be joined to a domain.

    Configure a new Office Web Apps Farm

    Once the machine is joined to the domain it is time to configure Office Web Apps. If you want more information about the variables/parameters I use I recommend watching the session! These variables are solely for demo purposes and you should adapt them to your needs. Also this step requires that you have a valid certificate (pfx file) that conforms to the WAC certificate requirements.

    # New WAC Farm
    Import-Module OfficeWebApps
    $farmou = "WAC"                           # Note the format!!
    $internalurl = "https://wacspc.corp.local"
    $externalurl = ""
    $cachelocation = "c:\WACCache\"           # %programdata%\Microsoft\OfficeWebApps\Working\d\
    $loglocation = "c:\WACLog\"               # %programdata%\Microsoft\OfficeWebApps\Data\Logs\ULS\
    $rendercache = "c:\WACRenderCache\"       # %programdata%\Microsoft\OfficeWebApps\Working\waccache
    $size = 5                                 # Default 15GB
    $docinfosize = 1000                       # Default 5000
    $maxmem = 512                             # Default 1024
    $cert = "wacspc.corp.local.pfx"              # File name
    $certname = "wacspc.corp.local"              # Friendly name
    $certificate = Import-PfxCertificate -FilePath (Resolve-Path $cert) -CertStoreLocation  Cert:\LocalMachine\My -ea SilentlyContinue 
    $certificate.DnsNameList | ft Unicode
    New-OfficeWebAppsFarm -FarmOU $farmou `
        -InternalURL $internalurl `
        -ExternalURL $externalurl `
        -OpenFromUrlEnabled `
        -OpenFromUncEnabled `
        -ClipartEnabled `
        -CacheLocation $cachelocation `
        -LogLocation $loglocation `
        -RenderingLocalCacheLocation $rendercache `
        -CacheSizeInGB $size `
        -DocumentInfoCacheSize $docinfosize `
        -MaxMemoryCacheSizeInMB $maxmem `
        -CertificateName $certname `
        -EditingEnabled
    (Invoke-WebRequest https://wacspc1.corp.local/m/met/participant.svc/jsonAnonymous/BroadcastPing).Headers["X-OfficeVersion"]

    As a last step I do a verification of the local machine and retrieve the current Office Web Apps version.

    Create the NLB cluster

    In my session I used NLB for load balancing. The following script creates the cluster and adds the machine as the first node. The script will also install the DNS RSAT feature and add two DNS A records for the internal and external names of the Office Web Apps server. That step is not required and might/should be managed by your DNS operations team.

    # Create NLB Cluster
    $ip = ""
    $interface = "Ethernet1"
    # New NLB Cluster
    New-NlbCluster -ClusterPrimaryIP $ip -InterfaceName $interface -ClusterName "SPCWACCluster" -OperationMode Unicast -Verbose
    # DNS Bonus
    Add-WindowsFeature  RSAT-DNS-Server  
    Import-Module DnsServer
    Add-DnsServerResourceRecordA -Name "wacspc" -ZoneName "corp.local" -IPv4Address $ip -ComputerName ( Get-DnsClientServerAddress $interface  -AddressFamily IPv4).ServerAddresses[0]
    ping wacspc.corp.local
    Add-DnsServerResourceRecordA -Name "wacspc" -ZoneName "" -IPv4Address $ip -ComputerName ( Get-DnsClientServerAddress $interface  -AddressFamily IPv4).ServerAddresses[0]

    Adding additional machines to the WAC farm

    Adding additional machines to the WAC farm is easy – just make sure you have the certificate (pfx file) and run the following script on each additional machine:

    Import-Module OfficeWebApps
    $server = "wacspc1.corp.local"
    $cert = "wacspc.corp.local.pfx"  
    Import-PfxCertificate -FilePath (Resolve-Path $cert) -CertStoreLocation  Cert:\LocalMachine\My -ea SilentlyContinue 
    New-OfficeWebAppsMachine -MachineToJoin $server
    # Verify

    Configuring NLB on the additional WAC machines

    And of course you need to configure NLB and add the new WAC machines into your NLB cluster:

    $hostname = "WACSPC1"
    $interface = "Ethernet1"
    Get-NlbCluster -HostName $hostname | Add-NlbClusterNode -NewNodeName $env:COMPUTERNAME -NewNodeInterface $interface

    That is all of the scripts I used in the session to set up my WAC farm. All that is left is to connect SharePoint to your Office Web Apps farm.

    Configure SharePoint 2013

    In SharePoint 2013 you need to add WOPI bindings to the Office Web Apps farm. The following script will add all the WOPI bindings and also start a full crawl (required for the search previews):

    The first part (commented out in this script) should only be used if your SharePoint farm is running over HTTP (which it shouldn’t of course!).

    asnp microsoft.sharepoint.powershell -ea 0
    # SharePoint using HTTPS?
    #$config = Get-SPSecurityTokenServiceConfig
    #$config.AllowOAuthOverHttp = $true
    # Create New Binding
    New-SPWOPIBinding -ServerName wacspc.corp.local
    Get-SPWOPIBinding | Out-GridView
    # Check the WOPI Zone
    # Start full crawl
    $scope = Get-SPEnterpriseSearchServiceApplication | 
        Get-SPEnterpriseSearchCrawlContentSource | 
        ?{$_.Name -eq "Local SharePoint Sites"}
    $scope.StartFullCrawl()
    # Wait for the crawl to finish...
    while($scope.CrawlStatus -ne [Microsoft.Office.Server.Search.Administration.CrawlStatus]::Idle) {
        Write-Host -ForegroundColor Yellow "." -NoNewline
        Sleep 5
    }
    Write-Host -ForegroundColor Yellow "."

    Connect Exchange 2013 to Office Web Apps

    In the session I also demoed how to connect Office Web Apps and Exchange 2013. The important things to remember here are that you need to specify the full URL to the discovery endpoint and that you need to restart the OWA web application pools.

    # WAC Discovery Endpoint
    Set-OrganizationConfig -WACDiscoveryEndpoint https://wacspc.corp.local/hosting/discovery
    # Recycle OWA App Pool
    Restart-WebAppPool -Name MSExchangeOWAAppPool
    # (Opt) Security settings
    Set-OwaVirtualDirectory "owa (Default Web Site)" -WacViewingOnPublicComputersEnabled $true -WacViewingOnPrivateComputersEnabled $true


    I know I kept all the instructions in this blog post short. You really should watch the recording to get the full picture. Good luck!

  • SPC14: Scripts for Real-world SharePoint Architecture decisions

    Tags: Conferences, SharePoint 2013

    As promised, I will hand out all the scripts I used in my SharePoint Conference 2014 sessions. The first set of scripts is from the demo used in the Real-world SharePoint Architecture decisions session (SPC334). This session only contained one demo, in which I showed how to set up a Single Content Web Application and use Host Named Site Collections when creating Site Collections.

    Creating the Web Application and the Root Site Collection

    The first part of the script was to create the Web Application using SSL, configure the certificate in IIS and then create the Root Site Collection. The Web Application is created using the –Url parameter pointing to a FQDN, instead of using the server name (which is used in the TechNet documentation, and causes a dependency on that specific first server). Secondly the script assumes that the correct certificate is installed on the machine and we grab that certificate using the friendly name (yes, always have a friendly name on your certificates, it will make everything much easier for you). A new binding is then created in IIS using the certificate. Finally the Root Site Collection is created (it is a support requirement) – the Root Site Collection uses the same URL as the Web Application and we should not specify any template or anything. This will be a site collection that no end-user should ever use.

    asnp *sh*
    # New Web Application 
    $wa = New-SPWebApplication `
        -Url 'https://root.spc.corp.local/' `
        -SecureSocketsLayer `
        -Port 443 `
        -ApplicationPool 'Content Applications' `
        -ApplicationPoolAccount 'CORP\spcontent' `
        -Name "SP2013SPC Hosting Web App" `
        -AuthenticationProvider (New-SPAuthenticationProvider) `
        -DatabaseName 'SP2013SPC_WSS_Content_Hosting_1' `
        -DatabaseServer 'SharePointSql'
    # Get Certificate
    $certificate = Get-ChildItem cert:\LocalMachine\MY | 
        Where-Object {$_.FriendlyName -eq "spc.corp.local"} | 
        Select-Object -First 1
    $certificate.DnsNameList | ft Unicode
    # Add IIS Binding
    Import-Module WebAdministration
    $binding = "IIS:\SslBindings\!443"
    Get-Item $binding -ea 0 
    $certificate | New-Item $binding
    # Root site
    New-SPSite `
        -Url https://root.spc.corp.local `
        -OwnerAlias CORP\spinstall

    Creating Host Named Site Collections

    Next we created a Host Named Site Collection (HNSC) in our Web Application. Creating an HNSC can only be done in PowerShell, not in Central Administration, and we MUST use the –HostHeaderWebApplication parameter with the value of the Web Application URL.

    New-SPSite `
        -Url https://teams.spc.corp.local `
        -Template STS#0 `
        -OwnerAlias CORP\Administrator `
        -HostHeaderWebApplication https://root.spc.corp.local


    My Site host and Personal Sites

    If you would like to have Personal Sites and the My Site Host in the same Web Application (which in many cases is a good approach), then you must make sure Self Service Site Creation is enabled on the Web Application and then use the following script. The script will first create a farm-level Managed Path for the Web Application by using the –HostHeader parameter. Then we just create the My Site Host as a Host Named Site Collection.

    # Personal Sites
    New-SPManagedPath `
        -RelativeUrl 'Personal' `
        -HostHeader
    # My Site Host
    New-SPSite `
        -Url https://my.spc.corp.local `
        -Template SPSMSITEHOST#0 `
        -OwnerAlias CORP\Administrator `
        -HostHeaderWebApplication https://root.spc.corp.local


    Configure search

    In the session I also explained why you should have a dedicated Content Source for People Search (watch the session for more info). And using the following script we add the correct start addresses to the two content sources, based on the Site Collections created above:

    # Configure Search
    $ssa = Get-SPEnterpriseSearchServiceApplication
    # The root web application
    $cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
    $cs.StartAddresses.Clear()
    $cs.StartAddresses.Add("https://root.spc.corp.local")
    $cs.Update()
    # People search
    $cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local People Results"
    $cs.StartAddresses.Clear()
    $cs.StartAddresses.Add("sps3s://my.spc.corp.local")
    $cs.Update()

    Once this is done, you just kick off a full crawl of the People Search and then wait for a couple of hours (so the Analytics engine does its job) before you start the crawl of the content.


    Those were all the scripts I used during the demo in the SPC334 session. It’s a proven method and I hope you can incorporate them into your deployment scripts.

  • SPC 14 sessions, recordings and wrap-up

    Tags: Presentations, SharePoint 2013, Office Web Apps, Conferences

    Wow, that was an awesome conference! SharePoint Conference 2014 is over and I’m very glad I attended the conference – both as a speaker and attendee. Finally Microsoft and the SharePoint Product Group told us about their future and vision for SharePoint and SharePoint Online. If you knew how long we have waited for this…

    I’m glad they have started to sort out the service (i.e. Office 365) and can now add new capabilities to the platform.
    I’m glad Jeff Teper officially said that there will be at least one more version of SharePoint on-premises.
    I’m glad that the product group is listening to our and our customers’ feedback.
    I’m glad that we have such a strong community.
    I’m excited about the future of SharePoint (to be honest, it’s been some time since I had that feeling).

    My sessions

    As a first-time speaker at this event I was a bit nervous, which those who attended my sessions might have noticed. I’m proud that so many people turned up at my sessions, especially the Architecture session, where we had people standing in the back and 90 minutes of Q&A at the end! That was cool! Unfortunately the room where I had all three of my sessions suffered from severe microphone issues (which impacted my session ratings), but apart from that everything except one demo was a success. Everything was recorded, so if you did not have time to attend my sessions or just want to see them again, here they are:

    Real-world SharePoint architecture sessions (SPC334)

    Mastering Office Web Apps 2013 operations and deployments (SPC383)

    Designing, deploying, and managing Workflow Manager farms (SPC356)

    Co-presented with Spencer Harbar.


    If you have any questions on my sessions, feel free to post them here. And before you ask – yes, I will post all the PowerShell scripts I used, but in a separate blog post(s).

    If you’d like to watch more videos from SPC14, head on over to Channel 9 and take a look at any of the keynotes and sessions for free. I’m really looking forward to seeing what’s up next with SharePoint – I think the next conference (whatever it will be called) will be something very different from this one.

  • Using SQL Server Resource Governor to optimize SharePoint 2013 performance

    Tags: SharePoint 2013, SQL Server


    We all know that one of the most important parts of SharePoint 2013 (and 2003, 2007 and 2010) is SQL Server. Bad SQL Server performance will lead to bad SharePoint performance! That’s just how it is! There are tons of ways of improving SQL Server performance: having enough cores, adding more RAM, using fast disks, using multiple instances and even multiple servers. You should already be familiar with this.

    Search is one of the components in SharePoint that requires A LOT of resources, especially when crawling and doing analytics. For both SQL Server and SharePoint Search there is plenty of documentation on how to optimize the hardware and configuration of these components. In this post I will explain and show you how to use the SQL Server Resource Governor to optimize the usage of SQL Server, especially for Search.

    SQL Server Resource Governor

    Default Resource Governor configuration

    The Resource Governor was introduced in SQL Server 2008 and is a feature that allows you to govern the system resource consumption using custom logic. You can specify limits for CPU and memory for incoming sessions. Note that the Resource Governor is a SQL Server Enterprise feature (but also present in the Developer and Evaluation editions).

    The Resource Governor is by default disabled and you have to turn it on. Just turning it on doesn’t do anything for you. You have to configure the Resource Pools, Workload Groups and the Classification.

    Resource Pools

    Resource Pools represent the physical resources of the server, that is CPU and memory. Each resource has a minimum value and a maximum value. The minimum value is what the Resource Governor guarantees the resource pool has access to (those resources are not shared with other resource pools) and the maximum value is the upper limit (resources below that limit can be shared with other pools). By default SQL Server creates two Resource Pools: internal and default. The internal pool is what SQL Server itself uses and the default pool is a …. default pool :-). Resource Pools can be created using T-SQL or using SQL Server Management Studio.

    Workload Groups

    Each Resource Pool can have one or more Workload Groups, and the Workload Groups is where the sessions are sent to (by the Classifier, see below). Each Workload Group can be assigned a set of policies and can be used for monitoring. Workload Groups can be moved from one Resource Pool to another. Workload Groups can be created using T-SQL or using the SQL Server Management Studio.


    Classifier

    The classification of requests/sessions is done by the Classifier function. The Classifier function (there can be only one) handles the classification of incoming requests and sends them to a Workload Group using your custom logic. The Classifier function can only be created using T-SQL.

    Using SQL Server Resource Governor to optimize Search Database usage

    So, how can we use the Resource Governor to improve or optimize our SharePoint 2013 performance? One thing (among many) we can look at is how Search crawling affects your farm. While crawling, the crawler, apart from hammering the web servers being crawled (which you should have dedicated servers for), also uses lots of SQL Server resources. In cases where you only have one SQL Server (server, cluster, availability group etc.) all your databases will be affected by this, and one thing you don’t want to do is annoy your users with a slow SharePoint farm during their work day. What we can do with the Resource Governor is make sure that during normal work hours the Search databases are limited to a certain amount of CPU (or RAM).

    Configure the SQL Server Resource Governor to limit resource usage of Search databases

    The following is one example of how you can configure SQL Server to limit the resource usage of the SharePoint Search databases during work hours and not limit them during night time. All the following code is executed as a sysadmin in the SQL Server Management Studio.

    Create the Resource Pools

    Our customized Resource Governor

    We need two resource pools in this example – one for sessions using the Search databases during work hours (SharePoint_Search_DB_Pool) and one for sessions using the Search databases during off-work hours (SharePoint_Search_DB_Pool_OffHours). We configure the work hours Resource Pool to use at most 10% of the total CPU resources and the off-hours pool to use at most 80%. In T-SQL it looks like this:

    USE master
    GO
    CREATE RESOURCE POOL SharePoint_Search_DB_Pool
    WITH (
    	MAX_CPU_PERCENT = 10
    )
    GO
    CREATE RESOURCE POOL SharePoint_Search_DB_Pool_OffHours
    WITH (
    	MAX_CPU_PERCENT = 80
    )
    GO

    Create the Workload Groups

    The next thing we need to do is to create two Workload Groups (SharePoint_Search_DB_Group and SharePoint_Search_DB_Group_OffHours) and associate them with the corresponding Resource Pool:

    CREATE WORKLOAD GROUP SharePoint_Search_DB_Group
    USING SharePoint_Search_DB_Pool
    CREATE WORKLOAD GROUP SharePoint_Search_DB_Group_OffHours
    USING SharePoint_Search_DB_Pool_OffHours

    After this we need to apply the configuration and enable the Resource Governor, which is done using this T-SQL:

    ALTER RESOURCE GOVERNOR RECONFIGURE

    Create the Classifier function

    The Resource Pools and Workload Group are now created and the Resource Governor should start working. But all the incoming requests are still going to the default Resource Pool and Workload Group. To configure how the Resource Governor chooses Workload Group we need to create the Classifier function. The Classifier function is a T-SQL function (created in the master database) that returns the name of the Workload Group to use.

    The following Classifier function checks if the name of the database contains “Search” – then we assume that it is a SharePoint Search database (of course you can modify it to use “smarter” selection). During normal hours it will return the SharePoint_Search_DB_Group and between 00:00 and 03:00 it will return the SharePoint_Search_DB_Group_OffHours group for the Search databases. For any other database it will return the “default” Workload Group.

    CREATE FUNCTION fn_SharePointSearchClassifier()
    RETURNS sysname
    WITH SCHEMABINDING
    AS
    BEGIN
    	DECLARE @time time
    	DECLARE @start time
    	DECLARE @end time
    	SET @time = CONVERT(time, GETDATE())
    	SET @start = CONVERT(time, '00:00')
    	SET @end = CONVERT(time, '03:00')
    	IF PATINDEX('%search%',ORIGINAL_DB_NAME()) > 0 
    	BEGIN
    		IF @time > @start AND @time < @end 
    			RETURN N'SharePoint_Search_DB_Group_OffHours'
    		RETURN N'SharePoint_Search_DB_Group'
    	END
    	RETURN N'default'
    END

    This is the core of our logic to select the appropriate Workload Group. You can modify this method to satisfy your needs (you need to set the Classifier to null and reconfigure the Resource Governor, and then set it back and reconfigure again whenever you need to change the method). An important thing to remember is that there can only be one Classifier function per Resource Governor, and this method will be executed for every new session started.
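    If T-SQL isn’t your daily driver, the branching in the classifier can be illustrated as a plain function. This JavaScript sketch is purely illustrative and not part of the scripts from the session; the database name and a zero-padded "HH:MM" time string stand in for ORIGINAL_DB_NAME() and GETDATE():

```javascript
// Illustration only – the same branching as the T-SQL classifier above,
// written as a plain function so the rules are easy to read and test.
function pickWorkloadGroup(dbName, hhmm) {
    if (dbName.toLowerCase().indexOf("search") !== -1) {
        // Off-hours window: between 00:00 and 03:00
        if (hhmm > "00:00" && hhmm < "03:00") {
            return "SharePoint_Search_DB_Group_OffHours";
        }
        return "SharePoint_Search_DB_Group";
    }
    return "default";
}
```

    Zero-padded "HH:MM" strings compare correctly with plain string comparison, which keeps the sketch short; the T-SQL version uses the time data type for the same check.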

    To connect the Classifier function to the Resource Governor there is one more thing we need to do: first make the connection, and then tell the Resource Governor to update its configuration:

    ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_SharePointSearchClassifier)
    ALTER RESOURCE GOVERNOR RECONFIGURE

    You should now immediately be able to see that the Resource Governor starts to use this Classifier function. Use the following DMVs to check the usage of the Resource Pools and Workload Groups respectively.

    SELECT *  FROM sys.dm_resource_governor_resource_pools
    SELECT *  FROM sys.dm_resource_governor_workload_groups

    Resource Governor DMV views

    As you can see from the image above, the Resource Governor has started to redirect sessions to the SharePoint_Search_DB_Group.

    Another useful T-SQL query for inspecting the usage is the following one, which will list all the sessions and which Workload Group they use and from where they originate.

    SELECT CAST(g.name as nvarchar(40)) as WorkloadGroup, s.session_id, CAST(s.host_name as nvarchar(20)) as Server, CAST(s.program_name AS nvarchar(40)) AS Program
              FROM sys.dm_exec_sessions s
         INNER JOIN sys.dm_resource_governor_workload_groups g
              ON g.group_id = s.group_id


    In this post you have been introduced to the SQL Server Resource Governor and how you can use it to configure your SharePoint environment so that crawling has minimal impact on the SQL Server databases during normal work hours.

    Remember that this is a sample and you should always test and verify that the Resource Pool configurations and the Classifier logic work optimally within your environment.

  • Office Web Apps 2013: Excel Web App ran into a problem - not rendering Excel files

    Tags: Office Web Apps, WAC Server


    This is a story from the trenches where the Excel Web App in Office Web Apps 2013 refuses to render Excel documents, while other apps such as Word and PowerPoint work just fine. The end-users are met with the generic error message: “We’re sorry. We ran into a problem completing your request.”

    Houston - we got a problem

    The problem is easy to solve but can be somewhat difficult to locate and in this post I will show you how to find the issue and fix it.


    Whenever Office Web Apps 2013 fails to render a document it shows the end-users a generic error message without any details. Fortunately the Office Web Apps Server has good logging mechanisms that will in most cases give you an idea of where the problem is, and in some cases it’s written in clear text.

    This specific issue, for the Excel Web App, shows itself in three different places (apart from the error message shown in the user interface). First of all, “normal” sys admins will see a couple of errors in the System Event Log, manifesting themselves like this:

    System log

    Event Id 5011:

    A process serving application pool 'ExcelServicesEcs' suffered a fatal 
    communication error with the Windows Process Activation Service. 
    The process id was '2168'. The data field contains the error number.

    Event Id 5002:

    Application pool 'ExcelServicesEcs' is being automatically disabled due to a series 
    of failures in the process(es) serving that application pool.


    Pretty nasty messages that don’t give you a clue, except that something is horribly wrong. There are also lots of Dr Watson log entries in the Application log, which might cause the admin to start looking up the Microsoft support phone number.

    The more “clever” admin then knows that Office Web Apps actually has its own log in the Event Viewer. When checking that log, messages like the following are shown for the Excel Web App:

    Event Id 2026:

    An internal error occurred.
       at System.Diagnostics.PerformanceCounterLib.CounterExists(String machine, String category, String counter)
       at System.Diagnostics.PerformanceCounter.InitializeImpl()
       at System.Diagnostics.PerformanceCounter..ctor(String categoryName, String counterName, String instanceName, Boolean readOnly)
       at System.Diagnostics.PerformanceCounter..ctor(String categoryName, String counterName, Boolean readOnly)
       at Microsoft.Office.Excel.Server.CalculationServer.ExcelServerApp.Initialize()
       at Microsoft.Internal.Diagnostics.FirstChanceHandler.ExceptionFilter(Boolean fRethrowException, 
          TryBlock tryBlock, FilterBlock filter, CatchBlock catchBlock, FinallyBlock finallyBlock)

    This should actually start to give you an idea – something is wrong with the Performance Counters on this machine. The worst thing to do here is to start fiddling with the registry to try to fix it, or to start adding users/groups to the performance counter groups.

    The “smartest” Office Web Apps admin then takes a look at the Trace Logs (ULS) (and that admin most likely read my SharePoint post “The Rules of SharePoint Troubleshooting” – if not, he/she should!). This is what will be found:

    Excel Web App                 	Excel Calculation Services    	cg34	Unexpected	Unexpected exception occured 
      while trying to access the performance counters registry key. Exception: System.InvalidOperationException: Category does not 
      exist.     at System.Diagnostics.PerformanceCounterLib.CounterExists(String machine, String category, String counter)     at ...
    Excel Web App                 	Excel Calculation Services    	89rs	Exception	ExcelServerApp..ctor: An unhandled exception 
      occurred during boot. Shutting down the server. System.InvalidOperationException: Category does not exist.     at 
      System.Diagnostics.PerformanceCounterLib.CounterExists(String machine, String category, String counter)     at 
      System.Diagnostics.PerformanceCounter.InitializeImpl()     at ...
    Excel Web App                 	Excel Calculation Services    	89rs	Exception	...atchBlock, FinallyBlock 
      finallyBlock) StackTrace:  at uls.native.dll: (sig=4635455b-a5d6-499c-b7f2-935d1d81cf8f|2|uls.native.pdb, offset=26E32) at 
      uls.native.dll: (offset=1F8A9)	 

    The key thing here is the “Category does not exist” message.

    When the Excel Web App Calculation Services is starting (and the Excel Calc Watchdog) it is trying to read a performance counter. If that performance counter is not found – it will just crash!

    Unfortunately there is no good way to find out which performance counter it is trying to use, except firing up good ole Reflector. Using that tool we can find that it is trying to access an ASP.NET performance counter.


    The fix for the problem is easy – we just need to register/update the performance counters for ASP.NET. This is done using the lodctr.exe tool like this:

    lodctr C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_perf.ini

    Give it a few seconds and then retry to load an Excel file using Office Web Apps and all your users should once again be happy.


    A simple fix to an annoying problem, which could be difficult to locate unless you know where to look (and in this case also have the skillz to read some reflected code).

    This error might not be so common, but it shows the importance of having a correctly installed machine and that you shouldn’t go fiddling with settings or the registry if you’re not really sure on what you’re doing – ok, not even then…

  • SharePoint 2013: How to refresh the Request Digest value in JavaScript

    Tags: SharePoint 2013, JSOM, Security


    SharePoint 2013 (and previous versions) uses a client side “token” to validate posts back to SharePoint, to prevent attacks where the user might be tricked into posting data back to the server. This token is known by many names: form digest, message digest or request digest. The token is unique to a user and a site and is only valid for a (configurable) limited time.

    When building apps or customizations on top of SharePoint, especially when using patterns such as Single Page Applications (SPA) or frameworks such as knockout.js, it is very common to see errors because the token has been invalidated – you have not reloaded the page and the token has timed out. The purpose of this article is to show you how you can refresh this form digest using JavaScript.

    How to use the Request Digest token

    When working with CSOM or REST you need to add the Request Digest token to your requests. Well, with CSOM (JSOM) it is done for you under the hood, but when you handcraft your REST queries you need to add it manually. It usually looks something like this:

    $.ajax({
        url: _spPageContextInfo.siteAbsoluteUrl + "/_api/web/,,,",
        method: "POST",
        headers: { 
            "Accept": "application/json; odata=verbose", 
            "X-RequestDigest": $('#__REQUESTDIGEST').val() 
        },
        success: function (data) {},
        error: function (data, errorCode, errorMessage) {}
    });

    Here we add the “X-RequestDigest” header, with the value of the hidden input field “__REQUESTDIGEST” – which is the form digest value. I will not dig deeper into this, since it’s part of basically every SharePoint POST/REST sample on the interwebs.

    But what happens if you’ve built a page (an SPA) where you don’t reload the page, and the users work on the page for longer than the token timeout (30 minutes by default)? Then they will get exceptions like this:

    HTTP/1.1 403 FORBIDDEN
    {"error":{"code":"-2130575252, Microsoft.SharePoint.SPException","message":{
    "value":"The security validation for this page is invalid and might be corrupted. 
    Please use your web browser's Back button to try your operation again."}}}

    How to refresh the token

    So how do we get a new and updated token? There are multiple ways to refresh the token, or retrieve a new and updated one. The two most common are these. First, use ye olde web services: call /_vti_bin/sites.asmx and its GetUpdatedFormDigest method. For this you have to create a SOAP message, parse the response and retrieve the updated token; you can then either pass the new token in your subsequent requests or, even better, update the hidden __REQUESTDIGEST field. Second, use the new REST POST endpoint /_api/contextinfo and parse that response, which can be either XML or JSON. This is how to do it the JSON way:

        $.ajax({
            url: _spPageContextInfo.webAbsoluteUrl + "/_api/contextinfo",
            method: "POST",
            headers: { "Accept": "application/json; odata=verbose" },
            success: function (data) {
                // update the hidden field with the new form digest value
                $('#__REQUESTDIGEST').val(
                    data.d.GetContextWebInformation.FormDigestValue);
            },
            error: function (xhr, errorCode, errorMessage) {}
        });

    It’s also worth noting that for every REST query SharePoint actually generates a new form digest, which is sent back in a response header (X-RequestDigest), so you could always read that response header and update the form digest.
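    As a sketch, reading that header could look like this – updateDigestFromResponse is a hypothetical helper name of my own, not a SharePoint API; in the browser the setter would typically be function (v) { $('#__REQUESTDIGEST').val(v); }:

    ```javascript
    // Hypothetical helper: pull the refreshed digest out of a completed REST
    // response and hand it to a setter that stores it (e.g. the hidden field).
    // Works with both jqXHR and plain XMLHttpRequest, since both expose
    // getResponseHeader().
    function updateDigestFromResponse(xhr, setDigest) {
        var digest = xhr.getResponseHeader("X-RequestDigest");
        if (digest) {
            setDigest(digest);
        }
        return digest;
    }
    ```

    You would call this from the success/complete callback of each REST request, so every response keeps your stored digest fresh for free.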

    Also, if you’re not using JavaScript but instead building an app using other frameworks, platforms or languages – you can always use the two aforementioned methods to update the token. Well, you actually need to, since you don’t have any hidden input field :)

    How to do it the best and native way

    The drawback with the methods above is that you either have to request a new form digest all the time to make sure it is up to date, or catch the exception and retry your query. And as we all know, this leads to bad performance and/or cluttered JavaScript code (as if we didn’t have enough of that already!).

    Fortunately there is a native method for refreshing the token, built into the SharePoint JavaScript libraries: a function called UpdateFormDigest(), defined in INIT.js. The method takes two parameters: first the URL of the current site (remember, the token is only valid for one site), and second an update interval. The update interval value is also already given to us, in a global variable called _spFormDigestRefreshInterval. This is how you should use the function:

    UpdateFormDigest(_spPageContextInfo.webServerRelativeUrl, _spFormDigestRefreshInterval)
    As you can see, it’s very simple, it only uses built-in stuff and there’s no magic to it. Under the hood this method calls the /_vti_bin/sites.asmx web service, and it does so synchronously. This means that all you have to do is copy and paste this line into your own code just before your REST calls. The other smart thing about this method is that it uses the update interval and only updates the form digest when needed – so no extra calls.
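    A minimal sketch of wrapping this around your own code – callWithFreshDigest is an illustrative name of mine, not a SharePoint API; UpdateFormDigest, _spPageContextInfo and _spFormDigestRefreshInterval are the built-in globals described above and are assumed to already be loaded on the page:

    ```javascript
    // Sketch: assumes a SharePoint page where INIT.js has defined the globals
    // UpdateFormDigest, _spPageContextInfo and _spFormDigestRefreshInterval.
    function callWithFreshDigest(restCall) {
        // Synchronous; only actually refreshes the digest when the update
        // interval has elapsed, otherwise it is a cheap no-op.
        UpdateFormDigest(_spPageContextInfo.webServerRelativeUrl,
                         _spFormDigestRefreshInterval);
        return restCall();
    }
    ```

    Invoke your POSTs as callWithFreshDigest(function () { return $.ajax({ /* ... */ }); }) and the digest will always be fresh when the request goes out.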


    There you have it – no more security validation issues with your SPA applications in SharePoint 2013. All you have to do is copy and paste the line above and stay on the safe side.

  • Presenting at TechX Office 365 January 23-24 2014

    Tags: Conferences, Office 365

    This year has barely started but the conference season is already running at full speed. The first conference of the year for me will be TechX Office 365, here in Stockholm, Sweden.

    This is a conference organized by Microsoft with sole focus on Office 365. There will be national and international speakers of top class. I will do two presentations: one about Building Apps for SharePoint [Online] 2013 and one about Building Apps for Office 2013.

    The conference is held January 23rd to 24th at the Microsoft offices in Stockholm (Akalla). There are still a few seats left, so hurry over to get your ticket.

  • Summing up the year of 2013 and embracing 2014

    Tags: SharePoint, Azure, Personal, SharePoint 2013, Office 365

    Wow, 2013 was an interesting year, and the time has come for my annual blog post to sum up the year that has soon passed us and to look a bit into the crystal ball for the next one. This is my seventh summary post and it is always fun to look back at what has happened during the last 12 months (2012, 2011, 2010, 2009, 2008, 2007 and 2006).

    For me the year has been really intense on all levels; I don't think I've ever experienced such a huge demand for my professional services as now, there is so much new stuff to learn and it's harder and harder to keep up, I have a hard time resisting doing tons of community stuff, and at the same time we had a huge construction project at our house – and of course, having two soon-to-be teenage girls takes its toll!


    The number of blog posts I write every year continues to decrease, but I do hope the quality improves and that you still get some decent value out of my posts. There are so many good bloggers out there and I don't want to repeat what everyone else is writing about. There are a couple of posts that I'm quite proud of, and here's the list of the ones you have visited the most over the last 12 months:

    It's no coincidence that four of the top five posts written this year are about Office Web Apps Server 2013 (WAC) – it is my new favorite server product, and I think it is one of the core server products/services that will be a huge and integral part of "One Microsoft" and its services.

    I also had the benefit of participating in a "real" writing project – as a co-author of "Inside Microsoft SharePoint 2013". This was my second book, written together with some of the most famous SharePoint book authors. If you still haven't ordered yourself a copy, then you're missing out on something!


    I've continued to do sessions at conferences, perhaps not that many this year. I try to choose conferences that fit me, my family and my clients, and I also try to focus on creating good, new and interesting content. I'm not the kind of person that likes to do the same content over and over again. I'm incredibly lucky to be in this position and to be able to travel and meet all the awesome people around the world. I know there are a couple of conferences that I would have liked to present at, but had to turn down due to other commitments…maybe next year. To read more about the presentations I've done over the last year and see the decks and some video recordings, check out my Presentations page.


    I got re-awarded the MVP status once again, now for the fourth time. It's always really nice to be given this award.


    Talking about the Microsoft Certified Solutions Master, MCSM, could fill a couple of posts on its own, but let's try to crunch it down. Early January this year I attended the beta rotation of the brand new MCSM program for SharePoint. This program was totally redone to suit both on-premises SharePoint 2013 and Office 365/SharePoint Online (contrary to what some people think and say). There is/was no better training for SharePoint, in the cloud or not, than this program – and there will never be such a good program again! I was fortunate to pass both the written exam and the qualification lab "in rotation" (that is, no retakes or anything), being one of the first ones. Unfortunately the whole MCSM program was cancelled during the year. But once a Master, always a Master. I'm really proud to be one of the few who have passed the MCM for SharePoint 2010, the MCA for SharePoint 2010 and the MCSM for SharePoint (2013) – a bit sad I didn't get the chance to take the 2007 exam and get a full hand :-(


    Can't write this post without a little section about SharePoint. What will happen to SharePoint – will it cease to exist? To some degree I do think so. SharePoint as a product has played out its role in my opinion. SharePoint is just a piece in the puzzle of future collaboration software. Look at how Workflow has been moved out of SharePoint, how Office Web Apps is a separate product, how applications now are Apps outside SharePoint, how Enterprise Social now is a service (in Yammer). SharePoint will be there in the form of building sites and acting as a glue between all the other services. Will it be known as SharePoint? I don't really know, but in a few years, I don't think so. It sounds like judgment day for some, and it might be, unless you are prepared. I think this "brave new world" will, for the ones who can accept the change, be full of opportunities…and I'm looking forward to it! On the other hand, the recent messages from Redmond ensure the current on-premises customers that SharePoint as a product will be here for another couple of years, which is good – it gives you good options to slowly move from a product to a service. But the innovation will be in the services, Office 365 and Azure, not in the products.


    Last year's predictions were not that far off the chart. The Cloud message is ever increasing, and there's no end to it. I also predicted a "collapse" of the SharePoint community, and to some degree I think that has started to happen. The community is still thriving, but there is not a single community as there used to be. Several new community sites and community conferences have started this year. Not that this is a totally bad thing, but in my opinion it does not help the community move forward. We're also seeing many of the old community celebs and influencers moving away from SharePoint as a specialty and instead focusing on the new set of services.

    So what about 2014 – what do you think Wictor?

    SharePoint is dead, long live SharePoint. I wrote it above; SharePoint as a product is slowly going away, and instead "SharePoint as a service" is where the focus should be. If any of you watched the PDC08 keynote, when Azure was announced – do you remember the slide where "SharePoint Services" was one of the listed services? I think this is where we're going, six years later.

    Azure domination! The Azure team at Microsoft is really impressive right now – look at all the different services they announce and improve. They were a bit late to the game, but are now captains of the Cloud movement. If there was one thing I would bet my career on now, it would be Azure.

    Services, services, services! Everything will be services. Combine the things I said about SharePoint with the things about Azure, and add the recent announcements of the killed ForeFront products (and others). Microsoft is all in on the Devices and Services thing, and you should be too. This changes the way we design, build and sell our professional services.

    The future does look a bit cloudy, doesn't it?

    Happy New Year

    That was it! I do have a lot more to say, but you all should be on vacation right now, stocking up on energy for 2014, so I'll keep it short. Next year will be an intensive year for me, I know it. I'm already excited about the new engagements I have planned for early 2014, about the SharePoint Conference in Las Vegas (the last SP conference?) where I will be a presenter for the first time, and also about the big change at Microsoft with a new CEO – how will that affect me!?

    So, to all of you I wish a Happy New Year, and I'm looking forward to seeing a lot of you out there next year!

  • SharePoint 2013 Architecture Survey

    Tags: SharePoint 2013, Conferences, Surveys

    Happy Holidays everyone!

    At the upcoming SharePoint Conference, next year in Las Vegas, I will be presenting a session called Real World SharePoint 2013 Architecture Decisions. The session will discuss and give examples of real world decisions and trade-offs you might face as a SharePoint Architect. In order to make the session even more interesting, I would like you all to help out with some statistics. Therefore I have created a small survey with a few questions. Filling it out should not take you more than a few minutes, so there is no excuse not to do it.

    You can answer the questions either from your own company's standpoint, or, if you are a consultant, you are free to answer for your current project(s).

    The survey in its full glory can be found here:

    Feel free to spread this link across all your networks – the more data I have, the more interesting it will be.

    The results will contribute to the SPC334 session and will shortly after that be presented here on this blog.

  • I will be speaking at SharePoint Conference 2014 in Las Vegas

    Tags: SharePoint 2013, Conferences

    I’m really proud to announce that I will be speaking at the long anticipated SharePoint Conference 2014 in Las Vegas, March 3-6 2014. The SharePoint Conference, hosted by Microsoft, is returning to Las Vegas, but this time located at the Venetian – bigger and perhaps more interesting than in a long time. If you are in the SharePoint business as a developer, IT-Pro, architect, business analyst, power user or executive, then this is the conference where you want to be next year.

    For the first time I will be speaking at this conference, which I’m really excited and a bit nervous about. I will present two sessions that I’m really passionate about:

    SPC383: Mastering Office Web Apps 2013 deployment and operations

    Tuesday 4/3 9:00-10:15 @ Lido 3001-3103

    Microsoft Office Web Apps 2013 is a crucial part of any SharePoint, Exchange and Lync on-premises deployment. In this session we will dive into the details of planning, deploying and operating your Office Web Apps server farm. Through a great number of demos we will create a new farm from scratch, make it highly available, and then connect it to SharePoint and Exchange. We will cover aspects such as scale considerations, patching with minimum downtime, and security decisions you have to make as an Office Web Apps farm admin.

    SPC334: Real-world SharePoint architecture decisions

    Wednesday 5/3 13:45-15:00 @ Lido 3001-3103

    Being a SharePoint architect can be challenging – you need to deal with everything from hardware, resources, requirements and business continuity management to budgets and, of course, customers. You, the architect, have to manage all this and in the end deliver a good architecture that satisfies all the needs of your customer. Along the way you have to make decisions based on experience, facts and sometimes gut feeling. In this session we will cover some of the architectural changes in SharePoint 2013, some of the new guidance from Microsoft, and provide insight into a number of successful real-world scenarios. You will see what decisions were made while designing and implementing these projects, with emphasis on why they were made.

    SPC356: Designing, deploying, and managing Workflow Manager farms

    Wednesday 5/3 10:45-12:00 @ Lido 3001-3103

    Workflow Manager is a new product that provides support for SharePoint 2013 workflows. This session will look at the architecture and design considerations that are required for deploying a Workflow Manager farm. We will also examine the business continuity management options to provide high availability and disaster recovery scenarios. If you want a deep dive on how Workflow Manager works, then this is the session for you.

    I’m very glad to be doing this Workflow session since it will be co-presented with my buddy, the almighty MCA/MCM/MCSM/MCx Spencer Harbar!

    I’m really looking forward to seeing you there, in my sessions, in the other interesting sessions, in the casinos and … I’ll better stop there – what happens in Vegas stays in Vegas…

    [Updated 2014-02-19] Added rooms and time slots as well as the SPC356 session.


About Wictor...

Wictor Wilén is a Director and SharePoint Architect working at Connecta AB. Wictor has achieved the Microsoft Certified Architect (MCA) - SharePoint 2010, Microsoft Certified Solutions Master (MCSM) - SharePoint and Microsoft Certified Master (MCM) - SharePoint 2010 certifications. He has also been awarded Microsoft Most Valuable Professional (MVP) for four consecutive years.
