
Archive for the ‘Databases’ Category

As you may know, SQL Server creates a query plan for each batch it executes. The plan is then stored in the plan cache for future re-use. However, if your application issues a lot of ad-hoc queries (S3D filters, for example), you will end up with many cached plans that are used only once and then sit idle, wasting memory.

So how do you check the usage of such plans? You can do it in two ways:

  • Run this query against your database:
SELECT COUNT(*) AS Total, SUM(size_in_bytes)/1048576 AS MBs
FROM sys.dm_exec_cached_plans
WHERE cacheobjtype LIKE 'Compiled Plan%'
  AND objtype = 'Adhoc' AND usecounts = 1

  • Run a scan of your database server using SCA (System & Configuration Analyzer) 5.6.5 or later; the ad-hoc counters are shown under the MSSQL Info node.

So now you know the ad-hoc plan usage, but how do you limit its growth? Luckily, SQL Server 2008 added a configuration option, "optimize for ad hoc workloads", that can be used to improve the efficiency of the plan cache. With this option set to 1, the SQL engine stores a small compiled-plan stub in the plan cache when a batch is compiled for the first time, instead of a full compiled plan.

If you use SCA, you can see the current setting for this option under the MSSQL Config Options node.

Alternatively, you can run sp_configure to show advanced options and inspect the current value.

Since it's an advanced option, you need to reconfigure with override after changing it. So to enable it, you would set the value to 1 and confirm that run_value reports 1.
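For reference, a minimal T-SQL sketch of enabling the option:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE WITH OVERRIDE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE WITH OVERRIDE;
-- Verify: run_value should now report 1
EXEC sp_configure 'optimize for ad hoc workloads';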


In some cases, like the one mentioned here, you may need to periodically monitor the cache and clear it manually using the command DBCC FREESYSTEMCACHE('SQL Plans'), or create an automated SQL Server job to do that.

To help with an automated solution, SCA now includes a job to clear the ad-hoc cache.

The job is then invoked per its schedule and shows up in the Job Manager view.

The job is only invoked if the cache size is greater than 500 MB.
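For reference, here is a minimal T-SQL sketch of such a conditional clear. This is my own approximation of the job logic, not SCA's actual job definition; it reuses the DMV query from the start of this post, with the 500 MB threshold mirroring the SCA default:

DECLARE @AdhocMB bigint;
SELECT @AdhocMB = SUM(CAST(size_in_bytes AS bigint))/1048576
FROM sys.dm_exec_cached_plans
WHERE cacheobjtype LIKE 'Compiled Plan%' AND objtype = 'Adhoc';

-- Clear the SQL plan cache only when the ad-hoc portion exceeds 500 MB
IF @AdhocMB > 500
    DBCC FREESYSTEMCACHE('SQL Plans');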


Hopefully this gives you an idea of how to manage your cached plans… till next time.

Read Full Post »

I have updated the PowerShell script, GraphSQLIO.ps1, to include the histogram data that appears in the SQLIO output file. As you know, that data looks something like this:

histogram:
ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+
%: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 100

Now when you run the script, it will save this data too, and the charts will include a latency-histogram area as well.

The updated files are here (save, rename to .zip, and extract): GraphSQLIO

Note: The script defaults to the MSChart control assembly that is delivered with .NET 4.0. In other words, it assumes that you are using PowerShell 3.0 or above. If you are using PowerShell 2.0, then make sure that you have:

  • Installed the MSChart control from MSDN
  • Commented out the line in the script (see below) that uses the .NET 4.0 assembly and un-commented the line that uses the .NET 3.5 MSChart assembly (a version-detection sketch follows this list)
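Alternatively, here is a small sketch (my own addition, not part of the script below) that picks the right assembly based on the running PowerShell version:

if ($PSVersionTable.PSVersion.Major -ge 3) {
    #PowerShell 3.0+ runs on .NET 4.0, so the short assembly name resolves
    Add-Type -AssemblyName System.Windows.Forms.DataVisualization
}
else {
    #PowerShell 2.0: load the .NET 3.5 MSChart assembly by its full name
    Add-Type -AssemblyName ('System.Windows.Forms.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35')
}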

Some folks had trouble downloading the file, so I have added the PowerShell script here in its entirety:

param
(
    [string]$OptionsFile = "TestSQLIO.cmd",
    [string]$ResultsFile = "SQLIOResults.xml",
    [switch]$OnlyCharts
)

Add-Type -AssemblyName System.Windows.Forms
#Use this on .NET 4.0, otherwise comment the line below
Add-Type -AssemblyName System.Windows.Forms.DataVisualization

#Use this on .NET 3.5
#Add-Type -AssemblyName ('System.Windows.Forms.DataVisualization, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35')

#Script-level variable to store Run entries
$Script:entries = @()

function ParseRunEntries
{
    $folder = Split-Path (Resolve-Path $OptionsFile) -Parent
    #Keep only the non-blank lines that invoke sqlio
    $lines = (Get-Content $OptionsFile) -split [Environment]::NewLine | Where {$_.Trim().Length -gt 0 -and $_.StartsWith("sqlio")} | foreach {$_.Trim()}

    try
    {
        foreach($line in $lines)
        {
            $options = $line -split "-" | foreach {$_.Trim()}
            $filename = [String]::Empty
            if($options.Length -eq 9)
            {
                $sqlObj = New-Object psobject
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "Options" -Value @()
                foreach($option in $options[1..$options.Length])
                {
                    $sqlObj.Options += "'-$option'" #Create parameter array of string type
                    $value = $option.Substring(1).Trim()
                    if($option -match "^k(w|W|R|r)") {
                        $filename += $value.ToUpper()
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Operation" -Value $value
                    }
                    elseif($option -match "^s\d+") {
                        $filename += $option
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Duration" -Value ([int]$value)
                    }
                    elseif($option -match "^f(random|sequential)") {
                        $filename += $option
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Mode" -Value $value
                    }
                    elseif($option -match "^o\d+") {
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Outstanding" -Value ([int]$value)
                    }
                    elseif($option -match "^b\d+") {
                        $filename += $option
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "Size" -Value ([int]$value)
                    }
                    elseif($option -match "^F.*\.txt") #options were split on "-", so match without the leading dash
                    {
                        Add-Member -InputObject $sqlObj -m NoteProperty -Name "ParamFile" -Value $value
                    }
                }
                $filename += ".txt"
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "OutputFile" -Value $filename
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "CaptureLatency" -Value "'-LS'"
                Add-Member -InputObject $sqlObj -m NoteProperty -Name "Buffering" -Value "'-BN'"
                $Script:entries += $sqlObj
                $sqlObj = $null
            }
        }
    }
    catch {
        write "Error executing script: $Error"
    }
}

function RunEntries
{
    $validRuns = @()
    if($Matches) { $Matches.Clear() }
    foreach($obj in $Script:entries)
    {
        $sqlioPath = Resolve-Path sqlio.exe
        $results = Invoke-Expression -Command "$sqlioPath $($obj.Options)" | Out-String
        $results | Set-Content $obj.OutputFile

        #Add the results to each run object
        Add-Member -InputObject $obj -m NoteProperty -Name "Results" -Value $results

        if($results.Contains("histogram:"))
        {
            $lines = $obj.Results -split [Environment]::NewLine | Where {$_.Trim().Length -gt 0} | foreach {$_.Trim()}
            foreach($line in $lines)
            {
                $outputValue = ParseOutputValue $line
                if($outputValue -is [double])
                {
                    if($line -match "IOs/sec.*")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "IOsPerSec" -Value $outputValue
                    }
                    elseif($line -match "MBs.*")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "MBsPerSec" -Value $outputValue
                    }
                    elseif($line -match "Avg_Latency.*")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "AvgLatencyMs" -Value $outputValue
                    }
                    elseif($line -match "using (specified|current) size:")
                    {
                        Add-Member -InputObject $obj -MemberType NoteProperty -Name "TestFilesize" -Value $outputValue
                    }
                }
                elseif($line -match "(?<th>\d+)\s+thread[s]?\s+(reading|writing).*file\s+(?<TestFile>[a-zA-Z]?[:\\]?.*)")
                {
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "Threads" -Value ([int]$Matches.th)
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "TestFile" -Value $Matches.TestFile.Trim()
                }
                elseif($line -match "^ms:\s+\d+")
                {
                    #ms: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+
                    $lineWithValues = $line -split {$_ -eq "m" -or $_ -eq "s" -or $_ -eq ":" -or $_ -eq "+"} | where {$_.Trim().Length -gt 0} | foreach {$_.Trim()}
                    $values = $lineWithValues -split "\s+" | foreach {[int]$_}
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "LatencyValues" -Value $values
                }
                elseif($line -match "^%:\s+\d+")
                {
                    #%: 0 0 0 0 3 4 5 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 85
                    $lineLatency = $line -split {$_ -eq "%" -or $_ -eq ":"} | where {$_.Trim().Length -gt 0} | foreach {$_.Trim()}
                    $percentValues = $lineLatency -split "\s+" | foreach {[int]$_}
                    Add-Member -InputObject $obj -MemberType NoteProperty -Name "LatencyPercent" -Value $percentValues
                }
            }
            $validRuns += $obj
        }
    }
    $validRuns | Export-Clixml -Path $ResultsFile
}

function ParseOutputValue($linevalue)
{
    $value = [String]::Empty
    if($linevalue -match "([^(ms:|%:\d+)].*:\s+)(?<no>\d+)")
    {
        $value = [double]$Matches.no
    }
    $value
}

function GetChartArea($chart, $curTicks)
{
    $chartArea = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
    $chartArea.AxisY.MajorGrid.LineColor = "Blue"
    #$chartArea.Area3DStyle = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea3DStyle `
    #    -Property @{Enable3D=$true; IsRightAngleAxes=$false; Rotation=20; Inclination=20; PointDepth=100; PointGapDepth=200}

    $chartAreaName = "CA$curTicks"
    $chartArea.Name = $chartAreaName
    $chartArea.BackSecondaryColor = [System.Drawing.Color]::LightSteelBlue
    $chartArea.BackGradientStyle = [System.Windows.Forms.DataVisualization.Charting.GradientStyle]::DiagonalRight
    $chartArea.AxisX.Title = "Block Size"
    $chartArea.Area3DStyle.Enable3D = $false
    $chart.ChartAreas.Add($chartArea) | Out-Null
    $chartArea
}

function SetSeries($chart, $chartArea, $seriesName, $chartType = "Column", $seriesType = "Cylinder")
{
    $chart.Series.Add($seriesName) | Out-Null
    $chart.Series[$seriesName].ChartArea = $chartArea.Name
    $chart.Series[$seriesName].BorderWidth = 2
    $chart.Series[$seriesName].ChartType = $chartType
    $chart.Series[$seriesName].LabelForeColor = [System.Drawing.Color]::DarkGreen
    $chart.Series[$seriesName].LabelBackColor = [System.Drawing.Color]::LightGreen
    $chart.Series[$seriesName]["DrawingStyle"] = $seriesType
    $chart.Series[$seriesName]["PointWidth"] = "0.5"
}

function GenerateCharts
{
    if(-not (Test-Path $ResultsFile))
    {
        throw "Invalid file specified"
    }

    $Script:entries = Import-Clixml $ResultsFile
    $groupEntries = $Script:entries | Group-Object -Property Operation, Mode -AsHashTable -AsString

    $form = New-Object System.Windows.Forms.Form
    $form.Text = "SQLIO CHARTS"
    $form.Width = 700
    $form.Height = 700

    $tabHost = New-Object System.Windows.Forms.TabControl
    $tabHost.Dock = "Fill"
    $form.Controls.Add($tabHost)
    $runThreads = ($Script:entries | select -First 1).Threads
    $runOutstanding = ($Script:entries | select -First 1).Outstanding
    $runDuration = ($Script:entries | select -First 1).Duration
    $runFileSize = ($Script:entries | select -First 1).TestFilesize
    $runFilePath = ($Script:entries | select -First 1).TestFile
    $form.Text = "SQLIO-$runDuration sec run with $runThreads threads, $runOutstanding pending IOs & $runFileSize(MB) $runFilePath file"

    foreach($grpKey in $groupEntries.Keys)
    {
        #Create tab page to host group Chart
        $chartName = ($grpKey -replace ",", "-") -replace " ", ""
        $tabPage = New-Object System.Windows.Forms.TabPage -ArgumentList "Run-$($chartName)"

        #Create chart host
        $chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
        $chart.BackColor = [System.Drawing.Color]::Transparent
        $chart.Dock = "Fill"

        if($grpKey -match "random")
        {
            $chart.Titles.Add("Note: IOPS & Avg.Latency") | Out-Null
        }
        else
        {
            $chart.Titles.Add("Note: MBs/sec & Avg.Latency") | Out-Null
        }

        $chart.Titles[0].Font = New-Object System.Drawing.Font("Arial", 11, [System.Drawing.FontStyle]::Bold)

        #Create chart areas
        $grpEntry = $groupEntries[$grpKey]
        AddChartAreas $chart $grpEntry

        #Add chart to tab page
        $tabPage.Controls.Add($chart)

        #Add tab page to tab control
        $tabHost.TabPages.Add($tabPage)
    }

    $form.Add_Shown({$form.Activate()})
    $form.ShowDialog() | Out-Null
}

function AddChartAreas($chart, $currentGroup)
{
    for($i = 0; $i -lt 4; $i++)
    {
        $curTicks = (Get-Date).Ticks
        #Get a chart area and add it to the chart
        $chartArea = GetChartArea $chart $curTicks

        if($i -lt 3)
        {
            #Get and set a series
            $seriesName = "SE-$curTicks"

            SetSeries $chart $chartArea $seriesName
            $chart.Series[$seriesName].IsValueShownAsLabel = $true
            $chart.Series[$seriesName].IsXValueIndexed = $true

            #Add series data
            if($i -eq 0)
            {
                $chartArea.AxisY.Title = "Avg. Latency(ms)"
                $currentGroup | ForEach-Object {$chart.Series[$seriesName].Points.AddXY($_.Size, $_.AvgLatencyMs)} | Out-Null
            }
            elseif($i -eq 1)
            {
                $chartArea.AxisY.Title = "MBs/sec"
                $currentGroup | ForEach-Object {$chart.Series[$seriesName].Points.AddXY($_.Size, $_.MBsPerSec)} | Out-Null
            }
            elseif($i -eq 2)
            {
                $chartArea.AxisY.Title = "IOPS"
                $currentGroup | ForEach-Object {$chart.Series[$seriesName].Points.AddXY($_.Size, $_.IOsPerSec)} | Out-Null
            }
        }
        else
        {
            $chartArea.AxisY.Title = "Percentage"
            $chartArea.AxisX.Title = "Latency (ms)"
            $chartArea.AxisY.IsMarginVisible = $true
            #Add series data - one histogram series per run in the group
            foreach($grpEntry in $currentGroup)
            {
                $seriesName = "SE-" + (Get-Date).Ticks
                SetSeries $chart $chartArea $seriesName -seriesType "Default"

                #Use a separate counter so the outer loop variable isn't clobbered
                for($j = 0; $j -lt $grpEntry.LatencyPercent.Count; $j++)
                {
                    $chart.Series[$seriesName].Points.AddY($grpEntry.LatencyPercent[$j]) | Out-Null
                }
            }
        }
    }
}

Clear-Host
if($OnlyCharts)
{
    GenerateCharts
}
else
{
    ParseRunEntries
    RunEntries
    GenerateCharts
}
#$Script:entries

Read Full Post »

If you haven't used SQLIO, it's a tool available from Microsoft to determine the I/O capacity of a given hardware configuration before deploying SQL Server in production. I won't go into the details of using SQLIO, since there are a bunch of articles on the net on how to go about using and configuring it.

I’ve published the script & supporting files on GitHub:

https://github.com/sunitjoshi/GraphSQLIO

What I want to show you here is how to automate SQLIO runs and parse the resulting output to easily create charts that help in analyzing the data.

The end result is a tabbed chart window, one tab per run group, with a "Note" title on each chart.

The note basically reminds me that I need to look at IOPS & latency for random IOs, and MBs/sec & latency for sequential IOs.

My TestSQLIO.cmd file has this:

sqlio -kR -s300 -frandom -o8 -b8 -LS -BN -Fparam.txt

sqlio -kR -s300 -frandom -o8 -b64 -LS -BN -Fparam.txt

sqlio -kR -s300 -frandom -o8 -b512 -LS -BN -Fparam.txt

sqlio -kW -s300 -frandom -o8 -b8 -LS -BN -Fparam.txt

sqlio -kW -s300 -frandom -o8 -b64 -LS -BN -Fparam.txt

sqlio -kW -s300 -fsequential -o8 -b8 -LS -BN -Fparam.txt

sqlio -kW -s300 -fsequential -o8 -b64 -LS -BN -Fparam.txt

And my param.txt has this:

C:\sqltest.dat 4 0x0 32768

This runs the tests using 8, 64, and 512 KB block sizes with 8 pending IOs. These are values that I'm using; you can surely expand them for your needs.

You can see that I don't specify an output file in the runs above, since my PowerShell script concocts one for me. The script runs TestSQLIO.cmd (the default input file) and outputs the parsed data to SQLIOResults.xml (the default results file).
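Typical invocations look like this (the parameter names come from the script's param block; the file names on the second line are placeholders):

.\GraphSQLIO.ps1                                                      #run TestSQLIO.cmd, write SQLIOResults.xml, show charts
.\GraphSQLIO.ps1 -OptionsFile MyRuns.cmd -ResultsFile MyResults.xml
.\GraphSQLIO.ps1 -OnlyCharts                                          #chart an existing results file, skipping the runs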

I chose to create a PSObject in code that represents an individual run, with properties like Operation, Mode, Duration, Size, etc., so that I can serialize the whole object collection using the Export-Clixml cmdlet. The reason I did this is that I didn't want to depend on the other artifacts, like TestSQLIO.cmd and param.txt, later on to query information such as the threads used, the duration the test was run for, the size of the test file, etc. If I dump everything into one XML file, then that's all I need later. Again, this is just my preference and may or may not work for you.
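A nice side effect is that the results can be queried later without re-running anything. A quick sketch, using property names from the XML shown below:

Import-Clixml SQLIOResults.xml |
    Select-Object Operation, Mode, Size, IOsPerSec, MBsPerSec, AvgLatencyMs |
    Format-Table -AutoSize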

With that said, this is how the resultant XML looks:

 <Obj RefId="0">
    <TN RefId="0">
      <T>System.Management.Automation.PSCustomObject</T>
      <T>System.Object</T>
    </TN>
    <MS>
      <Obj N="Options" RefId="1">
        <TN RefId="1">
          <T>System.Object[]</T>
          <T>System.Array</T>
          <T>System.Object</T>
        </TN>
        <LST>
          <S>'-kR'</S>
          <S>'-s10'</S>
          <S>'-frandom'</S>
          <S>'-o8'</S>
          <S>'-b8'</S>
          <S>'-LS'</S>
          <S>'-BN'</S>
          <S>'-Fparam.txt'</S>
        </LST>
      </Obj>
      <S N="Operation">R</S>
      <I32 N="Duration">10</I32>
      <S N="Mode">random</S>
      <I32 N="Outstanding">8</I32>
      <I32 N="Size">8</I32>
      <S N="OutputFile">Rs10frandomb8.txt</S>
      <S N="CaptureLatency">'-LS'</S>
      <S N="Buffering">'-BN'</S>
      <Obj N="Results" RefId="2">
        <TNRef RefId="1" />
        <LST>
          <S>sqlio v1.5.SG</S>
          <S>using system counter for latency timings, 2530840 counts per second</S>
          <S>parameter file used: param.txt</S>
          <S>_x0009_file C:\sqltest.dat with 4 threads (0-3) using mask 0x0 (0)</S>
          <S>4 threads reading for 10 secs from file C:\sqltest.dat</S>
          <S>_x0009_using 8KB random IOs</S>
          <S>_x0009_enabling multiple I/Os per thread with 8 outstanding</S>
          <S>_x0009_buffering set to not use file nor disk caches (as is SQL Server)</S>
          <S>using specified size: 1024 MB for file: C:\sqltest.dat</S>
          <S>initialization done</S>
          <S>CUMULATIVE DATA:</S>
          <S>throughput metrics:</S>
          <S>IOs/sec:   235.04</S>
          <S>MBs/sec:     1.83</S>
          <S>latency metrics:</S>
          <S>Min_Latency(ms): 1</S>
          <S>Avg_Latency(ms): 134</S>
          <S>Max_Latency(ms): 972</S>
          <S>histogram:</S>
          <S>ms: 0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+</S>
          <S>%:  0  0  0  0  0  0  0  3  1  0  0  1  2  0  1  1  0  1  1  1  1  1  1  0 85</S>
        </LST>
      </Obj>
      <I32 N="Threads">4</I32>
      <Db N="TestFilesize">1024</Db>
      <Db N="IOsPerSec">235.04</Db>
      <Db N="MBsPerSec">1.83</Db>
      <Db N="AvgLatencyMs">134</Db>
    </MS>
  </Obj>

Also, once I have just this file, I can quickly "graph it" using the -OnlyCharts option and create tab-based charts as shown above.

I used a tabbed interface since I group my test runs by Operation (R/W) and Mode (sequential/random) into a Hashtable:

Group-Object -Property Operation, Mode -AsHashTable -AsString

The Hashtable keys serve as the individual tab pages that get added to the main tab control hosted in the form.
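With the runs above, the keys come out as comma-joined strings; a quick sketch, assuming $entries holds the imported run objects:

$groups = $entries | Group-Object -Property Operation, Mode -AsHashTable -AsString
$groups.Keys    #e.g. "R, random", "W, random", "W, sequential" - one tab page per key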

To create the charts, I do this:

  • Create a form host with a tab control that "fills" the form area (GenerateCharts function)
  • Group the entries by Operation and Mode (GenerateCharts function)
  • Cycle through the group keys by:
    • First creating a Chart object (GenerateCharts function)
    • Then creating a Chart Area and its Series for the current key's values (AddChartAreas function)

That should be it!

Read Full Post »

As most Smart-3D admins know, regularly inspecting the Oracle alert log and the MSSQL database server log is an important part of daily admin tasks. Most critical errors, those related to failed database start-ups, initialization, failed login attempts, issues with capture/apply processes, etc., are written to these logs. Reading MSSQL logs is still simpler compared to Oracle, since the Oracle alert log resides deep in a directory structure that is hard to remember. Further, the path changes between 10g (under C:\ORACLE\PRODUCT\10.2.0\ADMIN\SP3DSMP1\BDUMP) and 11g (under C:\ORACLE\diag\rdbms\sp3dsmp7\sp3dsmp7\trace) for each instance, making it even harder to keep track of.

The SCA cmdlet Get-SCADbServerInfo tries to alleviate this issue a bit by providing a ReadLog() method that works for both Oracle (10g/11g) and MSSQL servers. The only input it needs is the SCA server name; it figures out the log path for the instance from the SCA server context. Plus, you can run the command remotely from a client machine, and as long as you have permissions on the Oracle server, the command will return the contents of the log file without you having to RDP to the server machine and browse to the deeply buried file to open it, which I think is pretty powerful and useful. Also, since the output is returned by a PowerShell cmdlet (Get-SCADbServerInfo), you can pipe it to other PowerShell commands like where, select, etc. and transform the data to your liking.

So let's go through the steps of using this command with both Oracle and MSSQL server registrations. I'll break the filtering into steps so you can see how I arrive at the end result:

Step-1: Finding your SCA Server Names:

The first step is to import the SCACommands module and then run the Get-SCAServers cmdlet to see your current server registrations. I'll be using two of them below: SMP7SQL (MSSQL) and SMP1_ORA (Oracle):


GetSCAServers

Step-2: Initializing Get-SCADbServerInfo:

Next, you initialize Get-SCADbServerInfo using an SCA server name as the input. So I would do the following to init the MSSQL server first:

  $db = Get-SCADbServerInfo SMP7SQL

Once you have done this, you can call the $db.ReadLog() method and pipe it to the Get-Member cmdlet to see the columns returned in the output. As you can see in the image below, "Message" is the field I'm interested in, and since it's of type String, it can be filtered using the -like or -match PS operators.

As I'm interested in getting any "failed" messages from the first 10 lines (ReadLog() output for MSSQL is sorted by DateTime in descending order), I would need to:

1. Get the first 10 lines: $db.ReadLog() | select -First 10

2. Filter for the "failed" string using the where cmdlet. The current item in PS is denoted by $_, and $_.Message gets me the Message property: $db.ReadLog() | select -First 10 | where {$_.Message -like "*failed*"}

3. Format the output to fit the screen and wrap any long lines. So my final command line is: $db.ReadLog() | select -First 10 | where {$_.Message -like "*failed*"} | ft -Auto -Wrap

Reading MSSQL Log

Moving on to Oracle, I first init the object for Oracle using:

$db = Get-SCADbServerInfo SMP1_ORA

I can also look at the number of lines in the alert log by piping the output to the measure command. The output from ReadLog() for Oracle is individual lines of text, and Oracle keeps appending messages to the alert log, so the latest messages are towards the end of the log. In this case, I want to look at the last 2500 lines of the alert log and filter the ones that start with an Oracle error. As you know, any Oracle error has the string "ORA-" in it, so that is what I'll use in my filter.

So my filter for Oracle is: $db.ReadLog() | select -Last 2500 | where {$_ -like "ORA-*"} | ft -AutoSize

Note how I use the PS cmdlets to help me "filter" down to what I need; that's where PS really shines:

Reading Oracle Alert Log

Sending Email Notification:

We will make this a bit more interesting and send an email of important messages using the built-in Send-MailMessage cmdlet. So this is what you will need to do:

1. Set up your SMTP server in a variable; I have set mine in $smtpServer in the PS window

2. Convert the output of the previous filters to a String using the Out-String cmdlet, since that is what the -Body parameter of Send-MailMessage accepts, and store it in a $messages variable

3. So my final filter string would be: $messages = $db.ReadLog() | where {$_.Message -like "starting*"} | ft -AutoSize | Out-String. A complete call might look like the sketch below.
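Putting it together, a minimal sketch of the send itself (the From/To addresses are placeholders):

$messages = $db.ReadLog() | where {$_.Message -like "starting*"} | ft -AutoSize | Out-String
Send-MailMessage -SmtpServer $smtpServer -From 'sca@mycompany.com' -To 'dba@mycompany.com' -Subject 'Database server log messages' -Body $messages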

Sending Email

Thus you can see that using SCA cmdlets in tandem with the PS built-in ones can help you come up with interesting ways to ease some of your admin pains. Hopefully this will pique your interest in exploring PS more on your own. Feel free to leave a comment if you have any questions.

Read Full Post »

As you all may know, MS added an entire new command and scripting language, PowerShell, as part of the base OS starting with Windows 7. Although it has been available for XP as a downloadable component, it has really been made into, what you would call, a first-class citizen with Windows 7: it's included with the OS and exposes 236 cmdlets that cover a vast gamut of functions, from querying services to running remote jobs.

I have been playing with PowerShell for some time now, and I have come to the conclusion that it's a powerful tool in an S3D admin's toolkit. The scripting language is simple to start with and is targeted at admins rather than developers, although you can create really complex scripts with it.

Here's a simple (in syntax, at least) but useful example that queries different servers for services whose names start with "MSSQL":

Get-Services
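The one-liner behind the screenshot would look something like this (the server names are placeholders):

Get-Service -ComputerName 'server1','server2' -Name 'MSSQL*' | Format-Table -AutoSize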

As you can see, in one line you are able to run a command, Get-Service, against multiple remote machines and have the output returned in a nicely formatted table, sized to fit the console window. I think this pretty much conveys the power and depth of the PS language and its supporting environment. Hopefully this piques your interest in knowing more about PowerShell and seeing how it may best fit your work environment. I won't go over the language and its syntax much, since MS sites have loads of information on it.

What I will go over now are the cmdlets that have been added to SCA (V5.3), which I think will surely benefit 3D admins. I'll go over in detail how to use the cmdlets, including configuration and options.

Step 1 – Find your PowerShell Modules Path

The SCA cmdlets are included as a PowerShell module that needs to be installed in your PS module search path. The easiest way to find this path is to query the Env:\PSModulePath variable as shown in the bitmap below:

Module Path
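For example, to list the module search paths one per line:

$env:PSModulePath -split ';'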

So you would create a folder called SCACommands underneath one of your module search paths and copy the downloaded files into that folder, as shown below:

SCACommands Folder

SCACommands Folder

Step 2 – Confirm Module Availability 

The next step is to make sure that the module is available to be loaded. You can do that by running Get-Module -ListAvailable; it should list SCACommands as one of the modules:

ListModule SCACommands

Step 3 – Import SCACommands & List Functions

The next step is to load the SCACommands module using Import-Module SCACommands. You can then check the functions exposed by the module by running Get-Command -Module SCACommands. If everything goes well, you should see the output below:

List Functions
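The whole sequence in one go:

Get-Module -ListAvailable           #SCACommands should appear in the list
Import-Module SCACommands
Get-Command -Module SCACommands     #list the functions the module exports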

Step 4 – Using Get-SCAServers

This cmdlet allows you to quickly list all the servers registered with SCA. The Name column is important here, since that is what the Get-SCADbServerInfo and Get-SCAPerfCounters cmdlets require as input. Get-SCAComputerInfo requires a hostname instead, so that you don't have to register all machines in SCA.

GetSCAServers

Step 5 – Using Get-SCAComputerInfo

This cmdlet lets you easily gather hardware and OS specs from a machine, including logged-on users. It requires a hostname as input, and the machine does not have to be registered with SCA.

GetSCAComputerInfo

You can use a command line like $comp = Get-SCAComputerInfo localhost to store the information in a local variable called $comp. You can then output useful information like hard-disk specs, video-card info, etc. as shown above. To query more than a single machine, use pipeline input, as in: 'machine1', 'machine2' | Get-SCAComputerInfo

Step 6 – Using Get-SCADbServerInfo

This cmdlet allows you to gather information about a database server, including version info, memory usage, connected users, Oracle patch info, etc. It accepts pipeline input, so you can query multiple machines if required. Again, it's best to store the output in a local variable so that you can view properties that are collections, like ConnectedUsers. If you need to view only users, you can use the -OnlyUsers switch, which causes the command to emit only the users logged on to the database server.
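A short sketch of typical usage (SMP7SQL is an SCA registration name as listed by Get-SCAServers; substitute one of yours):

$dbinfo = Get-SCADbServerInfo SMP7SQL
$dbinfo.ConnectedUsers                    #collection property - best viewed from a variable
Get-SCADbServerInfo SMP7SQL -OnlyUsers    #emit only the users logged on to the server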

Information on Oracle patches is sorted by the DateApplied field

Information on users connected to the MSSQL database

Step 7 – Get-SCAPerfCounters

Important OS performance counters can be viewed using this cmdlet. It also accepts an optional switch parameter, -GetDetailed, which when specified gathers information on expensive queries for MSSQL and hit ratios for Oracle.
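A usage sketch (assuming the server name is passed positionally, as with Get-SCADbServerInfo):

Get-SCAPerfCounters SMP7SQL                 #OS performance counters
Get-SCAPerfCounters SMP1_ORA -GetDetailed   #adds expensive queries (MSSQL) or hit ratios (Oracle)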

Oracle

MSSQL

I hope that these cmdlets will be a useful addition to a 3D admin's toolkit. It would be best to download and play with them, and as always, let me know if you would like me to go into more detail on their usage. In the next series, I'll go over cmdlets for monitoring 3D symbols, comparing shares, and checking catalog state. Till then, happy scripting!

Read Full Post »

If you have dealt with performance issues related to Oracle configuration, you surely know that setting the redo-log file size correctly is very important for controlling checkpoint activity. With earlier versions of Oracle, you had to run a bunch of scripts and monitor the database to arrive at some approximation of its size. From 10g onwards, however, Oracle has added the fast_start_mttr_target init parameter, which can help you correctly size the redo-log files. The V$INSTANCE_RECOVERY view gives you the suggested size via the OPTIMAL_LOGFILE_SIZE column.

So let's see how we can obtain this value using SCA. If you run a scan in SCA without first setting fast_start_mttr_target, you won't see the "Optimal LogFile Size" value under the Redo Stats node, as seen here:

SCA Scan

So first you need to make sure that the fast_start_mttr_target init parameter is set for your database. In my case, I'm going to set it to 5 minutes, as sketched below:
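A quick SQL sketch of the setting (300 seconds = 5 minutes; pick a target that suits your recovery requirements):

-- The parameter is specified in seconds
ALTER SYSTEM SET fast_start_mttr_target = 300 SCOPE = BOTH;

-- Oracle then publishes its suggestion (in MB) here:
SELECT OPTIMAL_LOGFILE_SIZE FROM V$INSTANCE_RECOVERY;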


Now when I run a scan in SCA, you can see that I have an additional row under Redo Stats named "Optimal LogFile Size", with a value that Oracle has populated based on current database activity:

Optimal LogFile Size

I can also look at the current logfile sizes and determine whether they need to be resized:
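The current sizes are available from the V$LOG view:

SELECT GROUP#, BYTES/1024/1024 AS SIZE_MB, STATUS FROM V$LOG;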


That's all there is to it. Hopefully this will help you easily figure out the correct redo-log file size for your Oracle database environment.

Read Full Post »

As some of you know, we released System & Configuration Analyzer (SCA) about a week back. For those who don't know about SCA: it's basically a .NET application that gathers detailed system and database information from a Windows server to help troubleshoot performance issues related to non-optimal settings and/or resource constraints. Now, the intent was to gather data from Windows OS servers, but since it comes loaded with OCI "zero-configuration" DLLs, some adventurous analysts used it to connect to and analyze an Oracle database hosted on Linux.

I found this pretty cool, so I thought I'd include it as a post. You can see the familiar UNIX ("/") path separators in the image below. I guess this does go with the adage: "Users can be more innovative than developers" 🙂

SCA (Linux Server)

I forgot to mention that to see only "Database" information in SCA, you need to set the "Perf Collector Type" to Database. In that case, SCA won't pull OS (Linux) information and will only collect database (Oracle) information.

SCA – Options for Linux Host

Read Full Post »
