## Visualizing S3D Data With SCA CmdLets & Graphviz

Visualizing data as a graph is a very effective way to “understand” the data. A graph can be thought of as nodes with edges that link related data. In this blog we’ll look at how to automate graph creation for a couple of S3D scenarios:

• Reporting plant versions

This graph relates S3D version with plants that are at the same version.

• Reporting IFC status for plants in a site

This shows the IFC status for plants serviced by an IFC Server.

A .doc file is provided below with a sample PowerShell script. Just download the file, rename it to .zip and extract the .ps1 file within.

### A Taste of Graphviz

Graphviz is an open-source graph visualization tool that helps you create such graphs using its DOT language to lay out the structure.

A simple layout, with a server node linked to two plant nodes, can be written in the DOT language as:

digraph graphname {
rankdir=TB
node [shape=box]
Server [label="Server-01"]
P1 [label="Plant-01"]
P2 [label="Plant-02"]

Server->P1 [label="V11"]
Server->P2 [label="V11 R1"]
}

In this case:

• digraph – Specifies a directional graph (most common)
• rankdir – Specifies the direction of layout: top to bottom (TB) or left to right (LR)
• label – Specifies the label for each node
• -> – Links a node to its related node

### Calling Graphviz from PowerShell

For Windows, there’s a zip package that can be downloaded from the Graphviz site. We’ll be using dot.exe from this package to construct graph bitmaps. So go ahead, download the zip and extract the exe to somewhere in your path. In my case, it’s under the current folder where I’m running the PowerShell ISE.

To run the above command in PowerShell, use this snippet:

@"
digraph graphname {
rankdir=TB;
node [shape=box]
Server [label="Server-01"]
P1 [label="Plant-01"]
P2 [label="Plant-02"]
Server->P1 [label="V11"]
Server->P2 [label="V11 R1"]
}
"@ | .\Graphviz\bin\dot.exe -Tpng -o .\Test.png



Highlight the lines above and right-click to choose the “Run Selection” option. This will execute the highlighted text and you should see a Test.png in your current folder. The -Tpng -o .\Test.png clause instructs dot.exe to output a “png” file and save it as Test.png.

### Using SCA CmdLets with Graphviz

Since SCA comes with a host of CmdLets that allow you to probe Site, Plant and IFC information, let’s see how we can use a couple of these in concert with Graphviz to output something useful for an S3D administrator.

A couple of scenarios one can think of are:

• Outputting a list of plants under a site with their versions
• Outputting the IFC status for a list of plants

#### Visualizing S3D Versions

Let’s look at the first one. SCA has a cmdlet, Get-SCAPlantInfo, which when invoked as:

Get-SCAPlantInfo <server> -AllPlants

will output a list of all plants on the server. Something like this:

So if we needed to output this information with the version as a starting node and the plant names as edges, it would look something like this in the DOT language:

digraph S {
rankdir=TB;
"09.01.16.00" ->"S3D"
"09.01.16.00"->"S3D_Copy"
}

To automate the creation of this grammar, we can use a PowerShell script that acts as a DSL for the DOT language, as detailed in a blog by Doug Finke here (and explained in his book Windows PowerShell for Developers – a great read).

Using the PowerShell script we can wrap the calls to create a new graph and add edges within it:

New-Graph S {
Get-SCAPlantInfo Local_SQL -AllPlants | ForEach {Add-Edge $_.Version $_.Plant}
} | .\Graphviz\bin\dot.exe -Tpng -o .\Plants.png

The resultant Plants.png is shown below. You can already see the usefulness of visualizing this data:

#### Visualizing IFC Status

Taking this a step further, Get-SCAPlantInfo has an -OnlyIFCStatus option that lets you output IFC information for a plant. Something like this:

If we want to run it against all the plants in a site and create a data graph, we would need to:

1. Gather all the plants on the server using Get-SCAPlantInfo -AllPlants
2. Filter the ones that we’re interested in, maybe by site or name
3. Call Get-SCAPlantInfo for each plant with -OnlyIFCStatus to get the IFC data
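Putting the three steps together with the same New-Graph/Add-Edge DSL used earlier, the pipeline can be sketched roughly like this. Note this is only a sketch: it assumes Doug Finke’s DSL is loaded, and the name filter in step 2 is illustrative, not something SCA mandates.

```powershell
# Sketch only - assumes the New-Graph/Add-Edge DSL is loaded and that
# Get-SCAPlantInfo behaves as described above; the Where filter is hypothetical
New-Graph IFC {
    Get-SCAPlantInfo Local_SQL -AllPlants |
        Where {$_.Plant -like "S3D*"} |                      # step 2: filter by name
        ForEach {
            $ifc = Get-SCAPlantInfo $_.Plant -OnlyIFCStatus  # step 3: get IFC data
            Add-Edge $_.Plant $ifc
        }
} | .\Graphviz\bin\dot.exe -Tpng -o .\IFCStatus.png
```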

And we get the resultant graph:

That should be all. You can take these concepts and use them with other CmdLets like Get-SCADbServerInfo (db & user info), Get-SCAComputerInfo, etc. Hopefully this will encourage you to use Graphviz and the SCA CmdLets to easily and efficiently visualize S3D data.

PowerShell Script

## Running SCA on remote machines with PowerShell 3.0

I recently had a requirement to start SCA automatically on remote machines. This was needed for our stress test where we normally have a bunch of servers taking part in the test with different roles: MSSQL Server, IIS Server, File Server, etc.

Now SCA, version 5.7.2 onwards, comes with a console app, SCARunner, that allows you to run SCA from the command line. That is fine if you are starting SCA from a batch file interactively, but it still doesn’t solve the issue of running it remotely. Luckily PowerShell, with its remoting and WMI capabilities, is geared for just this kind of task, and helped me solve this problem.

In this blog, I’ll show you one way of accomplishing this task.

##### First Step

The first step is to install SCA on a remote server and then register the server within SCA. I chose to use the default hostname as the registration name, since that makes it easier to use within the PowerShell script. In the image below, my machine name is SJOSHI.

##### Second Step

The next step is to test SCARunner on the remote machine. To do that, you need to open a DOS/Command window in the SCA installation folder, which is C:\Program Files (x86)\Intergraph\SCA.

To start SCARunner you need to specify these options:

-s – The registration name. In my case this would be SJOSHI.

-f – The folder where you want to save the files (the scan xml and the perfmon .blg file).

So for example, if I needed to start SCA on my machine, I would do:

SCARunner -s SJOSHI -f D:\Temp\Low

When invoked, SCARunner first runs a scan of the registered server and saves the scan xml. Then it creates a perfmon collector with the name of the machine and starts it.

To stop this collector, I need to do:

SCARunner -stop

And this will stop and remove the collector from perfmon.

##### Third Step

To be able to invoke this from PowerShell remotely, I need to first enable remoting on the server. This is done by running the command Enable-PSRemoting -Force on the server from a PowerShell admin prompt.
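It’s worth confirming that remoting actually responds before going further. Both cmdlets below are standard Windows PowerShell; SJOSHI is my server name, so substitute your own:

```powershell
# On the server, from an elevated PowerShell prompt:
Enable-PSRemoting -Force

# Then, from your workstation, verify that WinRM answers on the remote machine
Test-WSMan -ComputerName SJOSHI
```

If Test-WSMan returns the WSMan identity information instead of an error, the server is ready for the remote calls below.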

##### Fourth Step

So at this stage we have the server configured (SCA & PowerShell) and we know the options to run SCARunner. The remaining part is to work out the PowerShell script, so let’s tackle that now.

PowerShell, with its WMI Cmdlets, allows you to invoke methods against WMI classes. In our case, we need to start a process, SCARunner, so the Win32_Process class with its Create method looks like an ideal candidate.

To invoke a WMI method, we can use the Invoke-CimMethod Cmdlet which takes the class name, method name and an arguments dictionary. So let’s see how you can use it.

Say I want to open a file, Test.txt, in my D:\Temp\Low folder using this class. I can do this at the command line by typing notepad D:\Temp\Low\Test.txt

And the file does open up. To do the same in PowerShell, you need to type:

Invoke-CimMethod -ClassName Win32_Process -MethodName Create -Arguments @{CommandLine="notepad D:\Temp\Low\Test.txt"}

What’s with the -Arguments field? Well, Get-Help on Invoke-CimMethod shows it to be a dictionary (key-value type).

So it takes a key and a value. The key name is CommandLine, and the value is the path to the exe along with any arguments, if required. The key name is obtained from the Create method description as shown here.

So we can get Notepad to open via WMI. Now let’s try it against a remote machine. If you do a Get-Help Invoke-CimMethod in PowerShell, you will see that it takes a -ComputerName or -CimSession parameter.

So type the same command with an additional -ComputerName option followed by the name of a machine that you have for this test:

Invoke-CimMethod -ComputerName XYZ ..rest of the command as before.

As you can see from the image below, the remote process is created and shown under Task Manager. Do note, remote processes don’t show a UI for security reasons.

So let’s now try the SCARunner application against a remote machine on which SCA is installed (and the server registered). Everything will be the same as before except the CommandLine, which should now include the path to SCARunner along with the start-up options:

And you can see that it does start successfully which is pretty cool!

To stop it, you can just use the -stop option in the $cmdLine variable:

$cmdLine = "C:\Program Files (x86)\Intergraph\SCA\SCARunner.exe -stop"

In my setup, I use a CSV file to import a list of servers using the Import-Csv cmdlet, and start up SCA on those machines.
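As a rough sketch of that setup: the servers.csv file and its Name column here are my own convention, not something SCA requires, and the command line simply combines the pieces shown earlier.

```powershell
# servers.csv is assumed to look like:
#   Name
#   SJOSHI
#   SQLSRV01
$scaExe = "C:\Program Files (x86)\Intergraph\SCA\SCARunner.exe"
Import-Csv .\servers.csv | ForEach-Object {
    # Each server is registered in SCA under its host name, so -s reuses it
    Invoke-CimMethod -ComputerName $_.Name -ClassName Win32_Process -MethodName Create `
        -Arguments @{CommandLine = "$scaExe -s $($_.Name) -f D:\Temp\Low"}
}
```

Stopping all of them later is the same loop with the CommandLine swapped for "$scaExe -stop".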

Hope this motivates you to use PowerShell to automate, in a simple and efficient way, remote tasks that would otherwise be hard to accomplish.

## Aggregating Perfmon Data With PowerShell

If you have used PowerShell V2 (and above) for some time now, you would know that you can use the Import-Counter Cmdlet to read Perfmon data from a .blg file.

This works fine when you want to look at the data “as it is” and export it to a CSV file or format it for display. I recently ran into a scenario where I had to parse multiple perfmon files and then “look” at the data over different time intervals, for a few specific counters only.

PS >dir D:\Sunit\Blog\PerfmonFiles -Include *.blg -Recurse | Select Fullname

FullName

--------

D:\Sunit\Blog\PerfmonFiles\T1HW\DataCollector01.blg

D:\Sunit\Blog\PerfmonFiles\T1K2\DataCollector01.blg

D:\Sunit\Blog\PerfmonFiles\T2HW\DataCollector01.blg

Also, I needed to dump all the data into one CSV file so that I could use it for further analysis. So I wrote a function to do this; it’s included here in case it’s useful for someone else too.

The crux of the script is in this section where I calculate the averages depending on the interval:

$data = Import-Counter -Path $Path -Counter $counters
$d = $data | Where {$_.countersamples.status -eq 0}
$entries = @()
$runID = ($Path -split "\\")[-2] #Change this to your depth
Write-Debug "Key $runID"
for($i=1; $i -lt $d.Count; $i += $Interval)
{
    $UBound = $i + ($Interval-1)
    $entries += [pscustomobject] @{
        Timestamp = ([datetime]$d[$i].Timestamp).ToLongTimeString()
        CPUAvgProc = $d[$i..$UBound] | Select -ExpandProperty CounterSamples |
            Where {$_.Path -like $counters[0]} |
            measure -Property CookedValue -Average | select -ExpandProperty Average
        CPUAvgPriv = $d[$i..$UBound] | Select -ExpandProperty CounterSamples |
            Where {$_.Path -like $counters[1]} |
            measure -Property CookedValue -Average | select -ExpandProperty Average
        RunID = $runID
    }
}

The script takes pipeline input (from the dir cmd) and lets you specify the interval and a CSVFolder to dump individual files to. In my case the logging interval was 15 secs, so if I specified -Interval 4, it would average the data over every 4 samples and display it at one-minute intervals.

It also has a -Summary option that lets you see the logging interval and the counter paths in the perfmon file. Something like this:

To aggregate the data for multiple files and place all data in one file, you would do:

dir D:\Sunit\Blog\ParseBLGFiles -Include *.blg -Recurse | ParseBLGFile -Interval 4 -AllInOneFile

Here’s a run with interval specified as 4 (data is avg. for each min since logging interval is 15 sec in this case) and option to save all of the data in one csv file:

dir D:\Sunit\Blog\PerfmonFiles -Include *.blg -Recurse | ParseBLGFile -Interval 4 -AllInOneFile -CSVFolder D:\Sunit\Blog\PerfmonFiles | ft -AutoSize

And the subsequent CSV file, AllRuns.csv

Here’s the entire script (rename to .zip). If it proves helpful, please do let me know.

## Checking Ad-Hoc Query Plan Usage in SQL Server

As you may know, SQL Server will create a query plan for each batch it executes. The plan is then stored in the plan cache for future re-use. However, if your application uses lots of ad-hoc queries (like S3D filters, for example), you will end up with a lot of cached plans that are used only once and then just lie idle, taking up space and wasting memory.

So how do you check the usage of such plans? You can do it two ways:

• Run this query against your database
SELECT Count(*) Total, SUM(size_in_bytes)/1048576 MBs
FROM sys.dm_exec_cached_plans
WHERE cacheobjtype LIKE 'Compiled Plan%'
AND objtype = 'Adhoc' AND usecounts = 1

• Run a scan of your database server using SCA (System & Configuration Analyzer) 5.6.5 or later. The AdHoc counters are shown under MSSQL Info node as shown below:

So now you know the ad-hoc plan usage, but how do you limit its growth? Luckily, SQL Server 2008 added a configuration option, “optimize for ad hoc workloads”, that can be used to improve the efficiency of the plan cache. With this option set to 1, the SQL engine will store a small compiled plan stub in the plan cache when a batch is compiled for the first time, instead of a full compiled plan.

If you use SCA, you can see the current settings for this option under MSSQL Config Options node:

Alternatively you can run sp_configure to show advanced options:

Since it’s an advanced option, you need to reconfigure it with override. So to enable it, you would need to set the run_value to 1.
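For reference, the standard T-SQL to do this (documented sp_configure syntax) looks like:

```sql
-- 'optimize for ad hoc workloads' is an advanced option,
-- so advanced options must be made visible first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
```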

In some cases, like the one mentioned here, you may need to periodically monitor the cache and clear it manually using the command DBCC FREESYSTEMCACHE('SQL Plans'), or create an automated SQL Server job to do that.

To help with an automated solution, SCA now includes a job to clear the Adhoc cache:

The job is then invoked as per the schedule and shows up in the Job Manager view:

The job is only invoked if the cache size is greater than 500 MB.
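I don’t show the job definition itself here, but conceptually the check it performs can be sketched in T-SQL like this, reusing the DMV query from earlier (the exact logic inside the SCA job may differ):

```sql
-- Illustrative sketch: clear the ad-hoc plan cache only when it exceeds 500 MB
IF (SELECT SUM(size_in_bytes)/1048576
    FROM sys.dm_exec_cached_plans
    WHERE cacheobjtype LIKE 'Compiled Plan%' AND objtype = 'Adhoc') > 500
BEGIN
    DBCC FREESYSTEMCACHE('SQL Plans');
END
```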

Hopefully this gives you an idea of how to manage your “cached plans”…till next time.

## Updated – Automating & Charting SQLIO Using PowerShell

I have updated the PowerShell script, GraphSQLIO.ps1, to include the histogram data that appears in the output file. As you know, the data is something like this:

histogram:
ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+
%: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100

Now when you run the script, it will save this data too and the charts will show it like this:

The updated files are here (save, rename to .zip and extract): GraphSQLIO

Note: The script defaults to using MSChart control assembly that is delivered with .NET 4.0. In other words, it assumes that you are using PowerShell 3.0 and above. If you are using PowerShell 2.0, then make sure that you have:

• Installed the MSChart control from MSDN
• Commented out the line in the script (see below) that uses the .NET 4.0 assembly, and un-commented the line that uses the .NET 3.5 lib for the MSChart control

param
(
    [string]$OptionsFile = "TestSQLIO.cmd",
    [string]$ResultsFile = "SQLIOResults.xml",
    [switch]$OnlyCharts
)

Add-Type -AssemblyName System.Windows.Forms
#Use this on .NET 4.0, otherwise comment the line below
Add-Type -AssemblyName System.Windows.Forms.DataVisualization
#Use this on .NET 3.5
#Add-Type -AssemblyName ('System.Windows.Forms.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35')

#Script level variable to store Run entries
$Script:entries = @()

function ParseRunEntries
{
    $folder = Split-Path (Resolve-Path $OptionsFile) -Parent
    $lines = (Get-Content $OptionsFile) -split [Environment]::NewLine |
        Where {$_.Trim().Length -gt 0 -and $_.StartsWith("sqlio")} | foreach {$_.Trim()}
    try
    {
        foreach($line in $lines)
        {
            $options = $line -split "-" | foreach {$_.Trim()}
            $filename = [String]::Empty
            if($options.Length -eq 9)
            {
                $sqlObj = New-Object psobject
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "Options" -Value @()
                foreach($option in $options[1..$options.Length])
                {
                    $sqlObj.Options += "'-$option'" #Create parameter array of string type
                    $value = $option.Substring(1).Trim()
                    if($option -match "^k(w|W|R|r)")
                    {
                        $filename += $value.ToUpper()
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Operation" -Value $value
                    }
                    elseif($option -match "^s\d+")
                    {
                        $filename += $option
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Duration" -Value ([int]$value)
                    }
                    elseif($option -match "^f(random|sequential)")
                    {
                        $filename += $option
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Mode" -Value $value
                    }
                    elseif($option -match "^o\d+")
                    {
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Outstanding" -Value ([int]$value)
                    }
                    elseif($option -match "^b\d+")
                    {
                        $filename += $option
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Size" -Value ([int]$value)
                    }
                    elseif($option -match "-F.*\.txt")
                    {
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "ParamFile" -Value $value
                    }
                }
                $filename += ".txt"
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "OutputFile" -Value $filename
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "CaptureLatency" -Value "'-LS'"
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "Buffering" -Value "'-BN'"
                $Script:entries += $sqlObj
                $sqlObj = $null
            }
        }
    }
    catch
    {
        write "Error executing script: $Error"
    }
}

function RunEntries
{
    $validRuns = @()
    if($Matches) {$Matches.Clear()}
    foreach($obj in $Script:entries)
    {
        $sqlioPath = Resolve-Path sqlio.exe
        $results = Invoke-Expression -Command "$sqlioPath $($obj.Options)" | Out-String
        $results | Set-Content $obj.OutputFile
        #Add the results to each run object
        Add-Member -InputObject $obj -m NoteProperty -Name "Results" -Value $results
        if($results.Contains("histogram:"))
        {
            $lines = $obj.Results -split [Environment]::NewLine | Where {$_.Trim().Length -gt 0} | foreach {$_.Trim()}
            foreach($line in $lines)
            {
                $outputValue = ParseOutputValue $line
                if($outputValue -is [double])
                {
                    if($line -match "IOs/sec.*")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "IOsPerSec" -Value $outputValue
                    }
                    elseif($line -match "MBs.*")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "MBsPerSec" -Value $outputValue
                    }
                    elseif($line -match "Avg_Latency.*")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "AvgLatencyMs" -Value $outputValue
                    }
                    elseif($line -match "using (specified|current) size:")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "TestFilesize" -Value $outputValue
                    }
                }
                elseif($line -match "(?<th>\d+)\s+thread[s]?\s+(reading|writing).*file\s+(?<TestFile>[a-zA-Z]?[:\\]?.*)")
                {
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "Threads" -Value ([int]$Matches.th)
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "TestFile" -Value $Matches.TestFile.Trim()
                }
                elseif($line -match "^ms:\s+\d+")
                {
                    #ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+
                    $lineWithValues = $line -split {$_ -eq "m" -or $_ -eq "s" -or $_ -eq ":" -or $_ -eq "+"} |
                        where {$_.Trim().Length -gt 0} | foreach {$_.Trim()}
                    $values = $lineWithValues -split "\s+" | foreach {[int]$_}
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "LatencyValues" -Value $values
                }
                elseif($line -match "^%:\s+\d+")
                {
                    #%: 0 0 0 0 3 4 5 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 85
                    $lineLatency = $line -split {$_ -eq "%" -or $_ -eq ":"} |
                        where {$_.Trim().Length -gt 0} | foreach {$_.Trim()}
                    $percentValues = $lineLatency -split "\s+" | foreach {[int]$_}
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "LatencyPercent" -Value $percentValues
                }
            }
            $validRuns += $obj
        }
    }
    $validRuns | Export-Clixml -Path $ResultsFile
}

function ParseOutputValue($linevalue)
{
    $value = [String]::Empty
    if($linevalue -match "([^(ms:|%:\d+)].*:\s+)(?<no>\d+)")
    {
        $value = [double]$Matches.no
    }
    $value
}

function GetChartArea($chart, $curTicks)
{
    $chartArea = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
    $chartArea.AxisY.MajorGrid.LineColor = "Blue"
    #$chartArea.Area3DStyle = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea3DStyle
    #    -Property @{Enable3D=$true; IsRightAngleAxes=$false; Rotation=20; Inclination=20; PointDepth=100; PointGapDepth=200}
    $chartAreaName = "CA$curTicks"
    $chartArea.Name = $chartAreaName
    $chartArea.BackSecondaryColor = [System.Drawing.Color]::LightSteelBlue
    $chartArea.BackGradientStyle = [System.Windows.Forms.DataVisualization.Charting.GradientStyle]::DiagonalRight
    $chartArea.AxisX.Title = "Block Size"
    $chartArea.Area3DStyle.Enable3D = $false
    $chart.ChartAreas.Add($chartArea) | Out-Null
    $chartArea
}

function SetSeries($chart, $chartArea, $seriesName, $chartType="Column", $seriesType="Cylinder")
{
    $chart.Series.Add($seriesName) | Out-Null
    $chart.Series[$seriesName].ChartArea = $chartArea.Name
    $chart.Series[$seriesName].BorderWidth = 2
    $chart.Series[$seriesName].ChartType = $chartType
    $chart.Series[$seriesName].LabelForeColor = [System.Drawing.Color]::DarkGreen
    $chart.Series[$seriesName].LabelBackColor = [System.Drawing.Color]::LightGreen
    $chart.Series[$seriesName]["DrawingStyle"] = $seriesType
    $chart.Series[$seriesName]["PointWidth"] = "0.5"
}

function GenerateCharts
{
    if(-not (Test-Path $ResultsFile))
    {
        throw "Invalid file specified"
    }

    $Script:entries = Import-Clixml $ResultsFile
    $groupEntries = $Script:entries | Group-Object -Property Operation, Mode -AsHashTable -AsString

    $form = New-Object System.Windows.Forms.Form
    $form.Text = "SQLIO CHARTS"
    $form.Width = 700
    $form.Height = 700

    $tabHost = New-Object System.Windows.Forms.TabControl
    $tabHost.Dock = "Fill"
    $form.Controls.Add($tabHost)

    $runThreads = ($Script:entries | select -First 1).Threads
    $runOutstanding = ($Script:entries | select -First 1).Outstanding
    $runDuration = ($Script:entries | select -First 1).Duration
    $runFileSize = ($Script:entries | select -First 1).TestFilesize
    $runFilePath = ($Script:entries | select -First 1).TestFile
    $form.Text = "SQLIO-$runDuration sec run with $runThreads threads, $runOutstanding pending IOs & $runFileSize(MB) $runFilePath file"

    foreach($grpKey in $groupEntries.Keys)
    {
        #Create tab page to host group Chart
        $chartName = ($grpKey -replace ",", "-") -replace " ", ""
        $tabPage = New-Object System.Windows.Forms.TabPage -ArgumentList "Run-$($chartName)"

        #Create chart host
        $chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
        $chart.BackColor = [System.Drawing.Color]::Transparent
        $chart.Dock = "fill"

        if($grpKey -match "random")
        {
            $chart.Titles.Add("Note: IOPS & Avg.Latency") | Out-Null
        }
        else
        {
            $chart.Titles.Add("Note: MBs/sec & Avg.Latency") | Out-Null
        }
        $chart.Titles[0].Font = New-Object System.Drawing.Font("Arial", 11, [System.Drawing.FontStyle]::Bold)

        #Create chart areas
        $grpEntry = $groupEntries[$grpKey]
        AddChartAreas $chart $grpEntry

        #Add chart to tab page
        $tabPage.Controls.Add($chart)
        #Add tab page to tab control
        $tabHost.TabPages.Add($tabPage)
    }

    $form.Add_Shown({$form.Activate()})
    $form.ShowDialog() | Out-Null
}

function AddChartAreas($chart, $currentGroup)
{
    for($i=0; $i -lt 4; $i++)
    {
        $curTicks = (Get-Date).Ticks
        #Get a chartarea and add it to chart
        $chartArea = GetChartArea $chart $curTicks
        if($i -lt 3)
        {
            #Get and set a series
            $seriesName = "SE-$curTicks"

            SetSeries $chart $chartArea $seriesName
            $chart.Series[$seriesName].IsValueShownAsLabel = $true
            $chart.Series[$seriesName].IsXValueIndexed = $true
            #Add series data
            if($i -eq 0)
            {
                $chartArea.AxisY.Title = "Avg. Latency(ms)"
                $currentGroup | ForEach-Object {$chart.Series[$seriesName].Points.AddXY($_.Size, $_.AvgLatencyMs)} | Out-Null
            }
            elseif($i -eq 1)
            {
                $chartArea.AxisY.Title = "MBs/sec"
                $currentGroup | ForEach-Object {$chart.Series[$seriesName].Points.AddXY($_.Size, $_.MBsPerSec)} | Out-Null
            }
            elseif($i -eq 2)
            {
                $chartArea.AxisY.Title = "IOPS"
                $currentGroup | ForEach-Object {$chart.Series[$seriesName].Points.AddXY($_.Size, $_.IOsPerSec)} | Out-Null
            }
        }
        else
        {
            $chartArea.AxisY.Title = "Percentage"
            $chartArea.AxisX.Title = "Latency (ms)"
            $chartArea.AxisY.IsMarginVisible = $true
            foreach($grpEntry in $currentGroup)
            {
                $seriesName = "SE-" + (Get-Date).Ticks
                SetSeries $chart $chartArea $seriesName -seriesType "Default"

                for($j=0; $j -lt $grpEntry.LatencyPercent.Count; $j++)
                {
                    $chart.Series[$seriesName].Points.AddY($grpEntry.LatencyPercent[$j]) | Out-Null
                }
            }
        }
    }
}

Clear-Host
if($OnlyCharts)
{
    GenerateCharts
}
else
{
    ParseRunEntries
    RunEntries
    GenerateCharts
}
#$Script:entries

## Automating & Charting SQLIO Using PowerShell

If you haven’t used SQLIO, it’s a tool available from Microsoft to determine the I/O capacity of a given hardware configuration before deploying SQL Server in production. I won’t go into the details of using SQLIO since there are a bunch of articles on the net about how to configure and use it.

I’ve published the script & supporting files on GitHub:

https://github.com/sunitjoshi/GraphSQLIO

What I want to show you here is how to automate SQLIO runs and parse the subsequent output to easily create charts that can help in analyzing the output data.

Something like this:

The “Note” is basically to remind me that I need to look at IOPS & latency for random IOs, and MBs/sec & latency for sequential IOs.

My TestSQLIO.cmd file has this:

sqlio -kR -s300 -frandom -o8 -b8 -LS -BN -Fparam.txt

sqlio -kR -s300 -frandom -o8 -b64 -LS -BN -Fparam.txt

sqlio -kR -s300 -frandom -o8 -b512 -LS -BN -Fparam.txt

sqlio -kW -s300 -frandom -o8 -b8 -LS -BN -Fparam.txt

sqlio -kW -s300 -frandom -o8 -b64 -LS -BN -Fparam.txt

sqlio -kW -s300 -fsequential -o8 -b8 -LS -BN -Fparam.txt

sqlio -kW -s300 -fsequential -o8 -b64 -LS -BN -Fparam.txt

And my param.txt has this:

C:\sqltest.dat 4 0x0 32768

This runs the tests using 8, 64 and 512 block sizes and 8 pending IOs. These are the values that I’m using; you can surely expand this for your needs.

You can see that I don’t specify the output file in the runs above, since my PowerShell script concocts one for me. The PowerShell script runs TestSQLIO.cmd (the default input file) and outputs data to SQLIOResults.xml (the default output file).

I chose to create a PSObject in code that relates to an individual run, with properties like Operation, Mode, Duration, Size, etc., so that I can serialize the object collection using the Export-Clixml CmdLet. The reason I did this is that I didn’t want to worry later about all the other artifacts, like TestSQLIO.cmd and param.txt, to query information on the threads used, the duration the test ran for, the size of the test file, etc. If I dump everything into one xml file, then that’s all I need later. Again, this is just my preference and may or may not work for you.

With that said, this is how the resultant xml looks:

 <Obj RefId="0">
<TN RefId="0">
<T>System.Management.Automation.PSCustomObject</T>
<T>System.Object</T>
</TN>
<MS>
<Obj N="Options" RefId="1">
<TN RefId="1">
<T>System.Object[]</T>
<T>System.Array</T>
<T>System.Object</T>
</TN>
<LST>
<S>'-kR'</S>
<S>'-s10'</S>
<S>'-frandom'</S>
<S>'-o8'</S>
<S>'-b8'</S>
<S>'-LS'</S>
<S>'-BN'</S>
<S>'-Fparam.txt'</S>
</LST>
</Obj>
<S N="Operation">R</S>
<I32 N="Duration">10</I32>
<S N="Mode">random</S>
<I32 N="Outstanding">8</I32>
<I32 N="Size">8</I32>
<S N="OutputFile">Rs10frandomb8.txt</S>
<S N="CaptureLatency">'-LS'</S>
<S N="Buffering">'-BN'</S>
<Obj N="Results" RefId="2">
<TNRef RefId="1" />
<LST>
<S>sqlio v1.5.SG</S>
<S>using system counter for latency timings, 2530840 counts per second</S>
<S>parameter file used: param.txt</S>
<S>_x0009_using 8KB random IOs</S>
<S>_x0009_enabling multiple I/Os per thread with 8 outstanding</S>
<S>_x0009_buffering set to not use file nor disk caches (as is SQL Server)</S>
<S>using specified size: 1024 MB for file: C:\sqltest.dat</S>
<S>initialization done</S>
<S>CUMULATIVE DATA:</S>
<S>throughput metrics:</S>
<S>IOs/sec:   235.04</S>
<S>MBs/sec:     1.83</S>
<S>latency metrics:</S>
<S>Min_Latency(ms): 1</S>
<S>Avg_Latency(ms): 134</S>
<S>Max_Latency(ms): 972</S>
<S>histogram:</S>
<S>ms: 0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+</S>
<S>%:  0  0  0  0  0  0  0  3  1  0  0  1  2  0  1  1  0  1  1  1  1  1  1  0 85</S>
</LST>
</Obj>
<Db N="TestFilesize">1024</Db>
<Db N="IOsPerSec">235.04</Db>
<Db N="MBsPerSec">1.83</Db>
<Db N="AvgLatencyMs">134</Db>
</MS>
</Obj>

Also, once I have just this file, I can quickly “graph it” using the -OnlyCharts option and create tab-based charts as shown above.

I used a tabbed interface since I group my test runs by Operation (R/W) and Mode (sequential/random) and create a Hashtable:

Group-Object -Property Operation, Mode -AsHashTable -AsString

The Hashtable keys serve as individual tab pages that are added to the main tab control hosted in the form.

To create the charts, I do this:

• Create a form host with a tab-control that “fills” the form area (GenerateCharts function)
• Group the entries by Operation and Mode (GenerateCharts function)
• Cycle through the group keys by:
• First creating a Chart object (GenerateCharts function)
• Then creating a Chart Area and its Series for the current key’s values (AddChartAreas function)

That should be it!

## Testing S3D Workflows With Coded-UI Tests

#### What are Coded-UI Tests?

Coded-UI tests provide a mechanism to automatically execute and validate a test case or workflow that operates at the user-interface layer. These tests “drive” the application like a normal user would, with a keyboard and mouse.

The tests are created in Visual Studio Ultimate or Premium edition and have been supported since VS 2010. A good description of these tests is detailed here.

In this article we’ll look at creating a Coded-UI test for a common S3D task – Refreshing a session file. It will be a step-by-step walkthrough, so I would suggest having VS 2010+ and S3D ready so that you can follow along.

#### Phase 1

Start SmartPlant 3D and define your workspace using your filter. Then click on File->Save, and save your session file to a desired location. Here I’m just saving it to the Documents folder.

#### Phase 2

Start Visual Studio with the admin option and choose to create a Class Library project of type “Test”, then select “Coded UI Test Project” as the template to use.

At this stage you will be prompted with a dialog asking about how to create the UI test. As we will be recording actions, choose the “Record action” option.

This will start the “UIMap – Coded UI Test Builder”, which shows up as a small tool palette in the lower-right corner of the screen. This tool allows you to record actions like clicking buttons, typing text, etc., and to identify controls and their properties.

#### Phase 3

Now click the red record button; the tool will change the button to indicate that it’s recording actions. Do note that you should be careful to record only the actions that are absolutely required, and not pollute the recording with unneeded actions, gestures, etc.

Now click on Start (Windows Start) menu and then click on SmartPlant 3D link as shown below:

At this stage S3D will load up and the “New” template dialog will show up as shown below:

Move the mouse to the UI Builder and click on the first icon from left to pause the recording. Next click on “Recorded Actions” icon and you should see something like this:

If you see any extra recorded steps, click on them and then press the Delete key to remove them. Let’s go ahead and create a method for this set of actions and call it “LaunchS3D”.

#### Phase 4

Click on the “Generate Code” icon and type in the name, “LaunchS3D”, along with a  method description and then click on “Add and Generate” button:

Visual Studio will generate the action method and switch back to UI Builder so that you can start recording the next set of actions.

#### Phase 5

So at this stage we have a method to launch S3D and show the “New” template dialog. So now we need to click on the top-right “X” in this dialog to close it.

In other words, we need a way to identify the “X” control. The way you do this in a Coded-UI test is to “add” the “X” to its “UI Map”, so that VS can create a representation of that control in code.

So go ahead and move your mouse over the “X” to highlight it and then press CTRL + I and you should see this:

You can see that the UICloseButton control has been identified by VS and added to its UI Map. You can also see that UINewWindow is the parent of UICloseButton. So if we need to click UICloseButton, we need to make sure that it exists, or that its parent, UINewWindow, exists.

We can add such a check in VS by using an assert. Click on the UINewWindow->UINewTitle control and then click on the Exists property. Then click on “Add Assertion”, and you should have something like this:

Click on OK, then click on the “Generate Code” link and type in a name for the assert method, say “CheckNewTemplateWin”:

This will generate the code for our assertion, “CheckNewTemplateWin”.
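The generated assertion is essentially a one-liner along these lines (a sketch; the exact control name, UINewWindow here, depends on what the recorder added to your UI Map):

```csharp
// Sketch of a generated assertion method. Assert comes from
// Microsoft.VisualStudio.TestTools.UnitTesting.
public void CheckNewTemplateWin()
{
    // Fail the test early if the "New" template dialog never appeared
    Assert.IsTrue(this.UINewWindow.Exists, "New template dialog not found");
}
```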

Now click on the first button to start recording, then move the mouse over the new-template dialog and click on the Close “X” button.

Then click on “Recorded Steps” and you should see this:

Go ahead and click on “Generate Code” and then create a method called “CloseNewTemWin”.

So at the end of Phase 5, we have created methods to start S3D, check for the new-template dialog, and finally close the dialog.

#### Phase 6

Now, within S3D, press CTRL + O, which should bring up the “File Open” dialog. Then click on “Recorded Actions” in the UI Builder and you should see this:

In S3D, press ALT + N to move focus to the “File name” field and type in the path to your saved session file. Press “Enter” to start loading the session file.

At this stage, click on “Recorded Actions” in the UI Builder and you should have something like this. Again, if you see any extraneous actions in the recorder, go ahead and delete them:

After your session file loads up, go back to the UI Builder and pause the recording. Then click on “Generate Code” and create a method called “LoadSessionFile” for the set of recorded actions. Make sure to click on “Add and Generate” to have Visual Studio create the method within your project.
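The generated method replays the recorded keystrokes, roughly like this sketch (the control names and the session-file path are placeholders for whatever was captured during your recording):

```csharp
// Illustrative sketch only -- the keystrokes and path are recorded values.
public void LoadSessionFile()
{
    // CTRL+O brings up the File Open dialog
    Keyboard.SendKeys(this.UIS3DWindow, "o", ModifierKeys.Control);

    // ALT+N focuses the "File name" field; then type the path and press Enter
    Keyboard.SendKeys(this.UIFileOpenDialog.UIFileNameEdit,
        @"C:\Temp\MySession.ses{Enter}", ModifierKeys.None);
}
```

Note that the recorded path is a literal string, which is one reason recorded tests are brittle: move the session file and the test breaks.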

#### Phase 7

So let’s recap what we have now: a method to start S3D, one to check for the new-template dialog and close it, and lastly one to load the session file. What we need now is to record the actions to refresh the session file and then save it.

So start recording, click on the “Select Command” arrow in S3D, and then press the F5 key to start refreshing the file.

You should see this set of actions recorded:

Now switch back to S3D, after making sure that the recorder is still in the recording state, and click on the top-right “X” in S3D to close it. When the save-session-file dialog comes up, don’t dismiss it by clicking on the “Yes” button.

Click on the “Show Recorded Steps” icon in the UI Builder and you should see the following; again, delete any extraneous actions:

At this stage, let’s generate a method for this set of actions, namely clicking on the top-right “X” (close button) in the S3D window. So click on “Recorded Actions” first, then fill in the method name, and next click on “Add and Generate” to generate code for the method.

#### Phase 8

Now we need to code the method that clicks on the “Yes” button to save the refreshed session file, which means we first need to capture the “Yes” button in the VS UI Map. Like before, pause the recording, then hover your mouse over the “Yes” button and press CTRL + I to get something like this:

Since we want the “Yes” button to be visible before we click on it, let’s add an assertion for that check. Click on the “Exists” property and then click on “Add Assertion”.

Then close the UI Map, click on the “Generate Code” button, and generate the code for the assertion.

Now we will add the code for the click action on the “Yes” button. Make sure recording is on, and then click on the “Yes” button, which will cause S3D to save the session file and exit.

Now click on “Recorded Actions” and you should see this:

At this stage go ahead and click on “Generate Code”, give the method a name, and create it within your project:

This should complete our action recording for this use case. You can now close the UI Builder and switch to Visual Studio.

#### Phase 9

If you switch to Visual Studio, you should see something like this. The generated “TestMethod” lists all the asserts and methods that we created during this exercise.
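In outline, the generated test method simply calls the recorded actions and assertions in order, something like this sketch (the first four method names match the ones we created above; the Phase 7/8 names are placeholders for whatever you typed when generating those methods):

```csharp
[TestMethod]
public void CodedUITestMethod1()
{
    this.UIMap.LaunchS3D();            // Phase 3/4: start S3D
    this.UIMap.CheckNewTemplateWin();  // Phase 5: assert the "New" dialog exists
    this.UIMap.CloseNewTemWin();       // Phase 5: close the "New" dialog
    this.UIMap.LoadSessionFile();      // Phase 6: open the session file
    this.UIMap.RefreshAndCloseS3D();   // Phase 7: refresh, then close S3D (placeholder name)
    this.UIMap.AssertYesButtonExists();// Phase 8: assert the "Yes" button (placeholder name)
    this.UIMap.ClickYesToSave();       // Phase 8: save the session file and exit (placeholder name)
}
```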

The file CodedUITest1.cs, which contains these methods, is also highlighted in the Solution Explorer. There is another file, UIMap.uitest, in the Solution Explorer that contains the UI Map that we have worked with so far.

To see it, double-click on UIMap.uitest and you should see something like this:

On the left you can see all the actions and asserts and on the right you can see the UI Map that you worked with.

Go ahead and start S3DHost by double-clicking on the session file. Then press ALT + F4 to bring up the “Save Session File” dialog. At this stage, shift focus to Visual Studio, right-click on “UIYesButton”, and choose the menu item “Locate UI Control”.

You should see focus shift to S3DHost window with the “Yes” button highlighted as shown below:

So this confirms that our UI Map is correct and that we are able to locate the control we stored a reference to during our recording session.

#### Phase 10

Let’s now try to play back, or test, our recorded actions. Double-click on CodedUITest1.cs to bring the code window to the front. We now need to build the solution, so click on Build->Build Solution. Alternatively, you can press F6, the shortcut key to build your solution.

Once that’s done and you get a “Build succeeded” message in the status bar, click on the menu Test->Windows->Test Explorer to show the test window.

You should now see something like this:

You should now be able to either click on one of the “Run” menus or right-click on your test method and choose “Run Selected Tests” to invoke the actions that you recorded.

So go ahead and choose one of the options and you should see S3DHost start up and go through the recorded actions. If all the actions succeed, you should get a green check next to your test, as shown below.

To make sure that the actions really work, you can place some objects in S3D, run the tests, and finally open the session file to see if the new objects show up.

#### Testing From Command Line

You can use MSTest.exe or VSTest.Console.exe (for VS 2012 and later) to run Coded UI tests. These are located under the Visual Studio installation folder at the following path:

So go ahead and type cmd in the Windows Start dialog. Then hold the Shift key, right-click on cmd, and choose “Run as administrator”.

Navigate to the MSTest.exe folder location and then type MSTest /? to list the options available with the tool.

We will use: /testcontainer:path_to_our_dll /resultsfile:path_to_results_file
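Put together, an invocation looks like the sketch below. Both paths are examples only; substitute the location of your test DLL and where you want the .trx results file written.

```
REM Run from an elevated command prompt in the MSTest.exe folder.
REM Both paths below are placeholders -- use your own project and output paths.
MSTest /testcontainer:"C:\Projects\S3DUITests\bin\Debug\S3DUITests.dll" ^
       /resultsfile:"C:\Temp\S3DUITests.trx"
```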

The DLL is the assembly built by your project and resides in the Debug (or Release) folder. You can see that location by right-clicking the project in Visual Studio and choosing “Open Folder in File Explorer”.

Then you can navigate to the bin\Debug folder and copy the path to the directory.

Once you run the test, it will create the results file and print the end result to the console window, as shown below.

If you stayed with me and made it this far, then you should have a fairly good idea of the capabilities of Coded UI Test. Hopefully you can leverage its abilities to create automated tests for your functional-testing workflows.