PowerShell Warmup script with a browser object

Whenever you create a new web application, SharePoint configures the IIS application pool to recycle at a specific time every night, and additionally whenever memory limits are exceeded. If you write proper code and dispose of the objects you no longer need, you are less likely to hit recycles due to memory limits; however, you are still stuck with the nightly app pool recycle.

Now with SharePoint 2010 (or 2013) the first hit following an app pool recycle can be a bit slow because of all the data that needs to be cached. Especially when you are running a public-facing site, or an internal site hosting news, you do not want to burden the first user on your farm with a slow-loading page. So you will probably end up with some sort of warmup script (like the one my colleague Marc blogged about in PowerShell warmup script). However, most of the scripts out there use a web client to hit your pages. That would be fine if you had ‘static’ HTML output, but any Ajax calls on your page will not be executed by a WebRequest. (Basically, a WebRequest does not execute any client scripts, and might skip some of the assets you want loaded or cached.)

Therefore I created a new version of the PowerShell script that uses the browser object instead of a WebRequest to load the pages. When the script is executed, it first generates a list of pages to hit; it then starts a new browser, visits each and every page on the list, and writes out the document's readyState.
This way we can be sure all pages are loaded into the cache and that every page has been hit at least once. Since it's a PowerShell script, you can easily schedule it to run after your app pool recycle.
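As an illustration, scheduling could be done with the built-in schtasks.exe, shortly after the nightly recycle. The task name, script path, URLs and start time below are my own example values, not something prescribed by the script:

```powershell
# Sketch: register a daily task that runs the warmup script shortly after
# the default nightly app pool recycle. Task name, paths, URL and start
# time are example values - adjust them to your own farm.
schtasks.exe /Create /TN "SharePointWarmup" `
    /TR "powershell.exe -NoProfile -File C:\Scripts\Warmup.ps1 -filename C:\Scripts\urls.txt -webAppUrl http://www.mavention.com" `
    /SC DAILY /ST 05:15 /RU SYSTEM
```

Running the task as SYSTEM avoids storing a password, but you may prefer a dedicated service account that has read access to the web application.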

Making calls to the browser from within a PowerShell script is fairly easy once you know which objects to use; the following snippet shows you how to load a URL into an IE window and wait until the page is loaded.

# new IE object
$oIE=new-object -com internetexplorer.application
# visibility can be set to false so we don't see the actual browser window
$oIE.visible=$true;

# wait till browser is loaded
while ($oIE.busy) {
	sleep -milliseconds 50
}
# request new url
$oIE.navigate2("http://www.mavention.com");

# wait till url is loaded
while ($oIE.busy) {
	sleep -milliseconds 50
}
$doc = $oIE.Document;
$doc.readyState;

# Close browser
$oIE.Quit();

In the script I use a text file to save the URLs, so we can see which pages are getting hit by the script. The script itself contains some additional comments on how it works, and you can download the version as a zipped warmupscript, or use the following snippet as a reference.

#############################
# 23-05-2012 Albert-Jan Schot 
# Quick way to iterate through all pages and hit them with a webrequest
#
# 07-02-212 Albert-Jan Schot
# Updated script to use an IE instance instead of a webrequest to cache js (as pointed out by Waldek)
#
# The script expects the following parameters:
# -filename             required          the name and location of the file to store all page urls to
# -webappurl            required          the URL of the webapplication to warm up
# -deleteFile           optional          default: true; switch whether the file should be deleted first (if false, the script appends to an
#                                         existing file, at the risk of hitting pages multiple times)
# -warmup               optional          default: true; switch whether the pages should be warmed up by loading them in an IE instance
#############################

param([string]$filename = $(throw "fileName is required."), [string]$webAppUrl = $(throw "webAppUrl is required."), [bool]$deleteFile = $true, [bool]$warmup = $true)

Write-Host "Adding SharePoint Snapin.."
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

# Delete file if required
if($deleteFile) {
      Write-Host -ForegroundColor Green "Deleting file containing urls ..." 
      Remove-Item $filename -ErrorAction SilentlyContinue
}

# Filter helper 
filter Get-PublishingPages { 
      if([Microsoft.SharePoint.Publishing.PublishingWeb]::IsPublishingWeb($_)){
      $pubweb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($_)
      $query = new-object Microsoft.SharePoint.SPQuery 
      $query.ViewAttributes = "Scope='Recursive'"
      $pubweb.GetPublishingPages($query)    
      }
      else {
            Write-Host -ForegroundColor Red "Web is not of type publishing" $_.Url;
      }
} 

# Get the web application and iterate through all sites and their pages
Write-Host -ForegroundColor Green "Processing all pages of all sites in the webapp..." 
$wa = Get-SPWebApplication -identity $webAppUrl
$wa.Sites | % {
      Write-Host -ForegroundColor Green "Processing site: " $_.Url 
      $web = $_.AllWebs
      $web | % { $_ | Get-PublishingPages | select Uri | Add-Content $filename} 
  }
Write-Host -ForegroundColor Green "Finished processing all sites and their pages." 

# Load each url in an IE instance to make sure everything is cached
if($warmup) {
      # new IE object
      $oIE=new-object -com internetexplorer.application
	  # visibility can be set to false so we don't see the actual browser window
      $oIE.visible=$true;

      while ($oIE.busy) {
            sleep -milliseconds 50
      }

      Get-Content $filename | ForEach-Object {
            #each line looks like @{Uri=...} so we strip the first 6 characters and the last one to retrieve a callable url
            $url = $_.Substring(6, ($_.Length - 7)); 
            $oIE.navigate2($url);

            #Wait for request 
            while ($oIE.busy) {
                  sleep -milliseconds 50
            }

            $doc = $oIE.Document;
            Write-Host $url is: $doc.readyState
      }

      $oIE.Quit();
}
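Assuming you save the script as Warmup.ps1 (the file name is my choice; any name works), it can be invoked from a PowerShell prompt on a farm server like this. The paths and web application URL are example values:

```powershell
# Hypothetical invocation - adjust the file path and web application URL
# to match your own environment.
.\Warmup.ps1 -filename "C:\Scripts\urls.txt" -webAppUrl "http://www.mavention.com" -deleteFile $true -warmup $true
```

The -deleteFile and -warmup switches can be omitted since both default to $true; passing -warmup $false only regenerates the URL file without hitting the pages.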
