If your system is slow, the IIS logs give you the true server-side performance statistics; review these first.

We have recently delivered a large solution with 3000+ users and millions of records. The system went from a low user base and record count to its current level in a very short time. We were keen to measure the performance of the system early on. We wanted to measure the following:

  • % Slowpages (pages that take longer than 5 seconds to load)
  • SDK calls per hour
  • User calls per hour
  • Number of logins
  • Unique visitors

Authentication is via ADFS, so we don’t get the username in the IIS logs, and we are behind a reverse proxy, so we don’t get the client IP. We followed these steps to set up logging to give us the detail we needed.

Configure IIS Logs

We added the time-taken field and set the IIS logs to roll over every hour.
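A minimal sketch of that configuration using the WebAdministration module; the site name (“Microsoft Dynamics CRM”) and the exact field list are assumptions to adapt to your environment:

[code language="powershell"]
# Sketch only: roll IIS logs hourly and include time-taken.
# "Microsoft Dynamics CRM" is an assumed site name; substitute your own.
Import-Module WebAdministration

$site = "IIS:\Sites\Microsoft Dynamics CRM"

# Roll over to a new log file every hour so the import job always works on closed files
Set-ItemProperty $site -Name logFile.period -Value "Hourly"

# Log the fields the later queries rely on, including time-taken and cs-uri-query
Set-ItemProperty $site -Name logFile.logExtFileFlags -Value "Date,Time,ClientIP,UserName,Method,UriStem,UriQuery,HttpStatus,TimeTaken,Referer"
[/code]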

PowerShell Script to Import

Log Parser is a great tool; we wrote a script to import the logs from IIS into SQL. We look for files older than 70 minutes that still have the archive bit set and import them, running this step at 15 minutes past each hour. Once a file is imported we clear its archive bit. The archive bit technique is old school and really shows my age: any new file has the bit set because it is ready for archive (backup), and once it is backed up the bit gets cleared. A side effect is that the file may never get backed up, but since you have imported it into SQL, that is your backup.
[code language="powershell"]
param(
    [string] $iisPath,
    [string] $table
)

$iisPath = Join-Path $iisPath "*.log"
$iisPath

# Only files that still have the archive bit set are candidates for import
$files = ls $iisPath | where { $_.Attributes -match "Archive" }

foreach ($file in $files) {
    $createTime = $file.CreationTime
    $nowTime = Get-Date
    "Processing file [$file] $createTime"

    if (($nowTime - $createTime).TotalMinutes -gt 70)
    {
        "file [$file] created more than 70 mins ago"

        # Import pages, SDK calls and handler requests; skip scripts and stylesheets
        & "logparser.exe" "SELECT * INTO $table FROM $file WHERE cs-uri-stem LIKE '%.aspx'" -i:iisw3c -o:SQL -server:SQLSERVER -database:webLogs -createTable:ON
        & "logparser.exe" "SELECT * INTO $table FROM $file WHERE cs-uri-stem LIKE '%.svc' AND NOT (cs-Method = 'GET' AND cs-uri-stem = '/XRMServices/2011/Discovery.svc')" -i:iisw3c -o:SQL -server:SQLSERVER -database:webLogs -createTable:ON
        & "logparser.exe" "SELECT * INTO $table FROM $file WHERE cs-uri-stem LIKE '%.ashx' AND NOT (cs-uri-stem LIKE '%.js.%' OR cs-uri-stem LIKE '%.css.%')" -i:iisw3c -o:SQL -server:SQLSERVER -database:webLogs -createTable:ON

        # Clear the archive bit so the file is not imported again on the next run
        $file.Attributes = 'Normal'
    }
    else
    {
        "file [$file] not created more than 70 mins ago $createTime"
    }
}
[/code]
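The script might be wired up as a scheduled task running at 15 minutes past each hour, something like this (the script path, log folder and table name are illustrative):

[code language="powershell"]
# Hypothetical scheduled-task action: paths and table name are examples only.
# Run at hh:15 so the previous hour's log has rolled over and is more than 70 minutes old.
powershell.exe -ExecutionPolicy Bypass -File "C:\Scripts\Import-IisLogs.ps1" `
    -iisPath "C:\inetpub\logs\LogFiles\W3SVC1" `
    -table "IisLogs"
[/code]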
The WHERE filters allow us to focus on the right detail, excluding, for example, image requests.

SQL To Process

You now have all the data in SQL. Any decent DBA or SQL programmer can write queries to slice and dice it by date or time, but the following tricks are useful. CRM injects detail into the csUriQuery: how much time was spent in SQL, who the user was and what SDK call was being made. For example:

+OnBeginRequest:07/15/2014-09:01:28.437+LogEntries:0+SqlCalls:0+SqlCallsMs:0+GC:1183+OnEndRequest:09:01:28.437

For SDK calls you see the call and the entity:

+OnBeginRequest:07/15/2014-09:01:25.161+SdkB(RetrieveMultiple:10116):09:01:25.161+SdkE:09:01:25.161+LogEntries:0+SDKSqlCalls:1+SDKSqlCallsMs:0+GC:1183+LogEntries:0+SqlCalls:0+SqlCallsMs:0+GC:1183+OnEndRequest:09:01:25.161
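Even before touching that injected detail, the standard W3C fields answer the first two questions on the list at the top. A rough sketch, assuming the data landed in a table called dbo.IisLogs (whatever you passed as -table), that Log Parser kept the W3C field names as column names, and that time-taken is in milliseconds:

[code language="powershell"]
# Sketch only: slow pages (> 5 seconds) and SDK calls per hour.
# dbo.IisLogs, SQLSERVER and the column names are assumptions to adjust to your import.
Import-Module SqlServer   # provides Invoke-Sqlcmd (SQLPS on older installs)

$query = @"
SELECT  DATEPART(hour, [time]) AS [Hour],
        SUM(CASE WHEN [cs-uri-stem] LIKE '%.aspx' THEN 1 ELSE 0 END) AS [Pages],
        SUM(CASE WHEN [cs-uri-stem] LIKE '%.aspx' AND [time-taken] > 5000 THEN 1 ELSE 0 END) AS [SlowPages],
        SUM(CASE WHEN [cs-uri-stem] LIKE '%.svc' THEN 1 ELSE 0 END) AS [SdkCalls]
FROM    dbo.IisLogs
GROUP BY DATEPART(hour, [time])
ORDER BY [Hour];
"@

Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database "webLogs" -Query $query
[/code]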

Now, I think that injected detail is very awesome. You can extract the user hash from the csUriQuery, for example the value in the string UserIdHash:-438120035. Counting the distinct hash values gives you the unique visitors (a sketch follows the dashboard example below). You can work out logins: the csUriQuery will start with + and the response will be a 302, which is a redirect to the ADFS server. You can do ad hoc processing; we looked at the number of Retrieves grouped by entity and found an issue with some code that was generating lots of unneeded retrieves on a certain entity. You can also work out which requests are part of dashboards, and which dashboard they belong to, by looking at the referrer. This lets you see how long slow dashboards take and how many times they are executed. Join this with the metadata schema and you get the dashboard name:

Dashboard,Hour,Times Executed,Avg Time
Sales,10,31,4
Sales,11,10,3

This shows that between 10:00 and 11:00 the Sales dashboard was executed 31 times and took an average of 4 seconds to complete each request.
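As a concrete example of the UserIdHash trick above, here is a sketch of unique visitors and logins per hour. The string handling assumes the hash value runs up to the next ‘+’ delimiter in csUriQuery, as in the samples shown earlier, so treat it as a starting point rather than a finished report:

[code language="powershell"]
# Sketch only: unique visitors per hour from the UserIdHash token CRM appends to cs-uri-query.
# dbo.IisLogs is an assumed table name; the parsing assumes the hash runs up to the next '+' delimiter.
Import-Module SqlServer   # provides Invoke-Sqlcmd

$uniqueVisitors = @"
;WITH hashed AS (
    SELECT  DATEPART(hour, [time]) AS [Hour],
            SUBSTRING([cs-uri-query],
                      CHARINDEX('UserIdHash:', [cs-uri-query]) + 11,
                      CHARINDEX('+', [cs-uri-query] + '+', CHARINDEX('UserIdHash:', [cs-uri-query]) + 11)
                        - (CHARINDEX('UserIdHash:', [cs-uri-query]) + 11)) AS UserIdHash
    FROM    dbo.IisLogs
    WHERE   [cs-uri-query] LIKE '%UserIdHash:%'
)
SELECT [Hour], COUNT(DISTINCT UserIdHash) AS [UniqueVisitors]
FROM   hashed
GROUP BY [Hour]
ORDER BY [Hour];
"@

Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database "webLogs" -Query $uniqueVisitors

# Logins per hour: requests answered with a 302 redirect to the ADFS server
$logins = @"
SELECT DATEPART(hour, [time]) AS [Hour], COUNT(*) AS [Logins]
FROM   dbo.IisLogs
WHERE  [sc-status] = 302
GROUP BY DATEPART(hour, [time])
ORDER BY [Hour];
"@

Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database "webLogs" -Query $logins
[/code]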

If you need to investigate a slow-running CRM system you will be surprised by the detail you can get from the IIS logs; trust the logs. If a user says the system is slow but the logs disagree, the cause is likely to be network or client-side JavaScript performance.
