

If you haven’t used SQLIO, it’s a tool from Microsoft for measuring the I/O capacity of a given hardware configuration before deploying SQL Server in production. I won’t go into the details of using SQLIO since there are a bunch of articles on the net on how to use and configure it.

I’ve published the script & supporting files on GitHub:

https://github.com/sunitjoshi/GraphSQLIO

What I want to show you here is how to automate SQLIO runs and parse the output to create charts that make it easy to analyze the results.

Something like this:

[Screenshot: a form with tab-based charts of the SQLIO results]

The “Note” on the form basically reminds me that I need to look at IOPS & latency for random I/Os and MBs/sec & latency for sequential I/Os.

My TestSQLIO.cmd file has this:

sqlio -kR -s300 -frandom -o8 -b8 -LS -BN -Fparam.txt
sqlio -kR -s300 -frandom -o8 -b64 -LS -BN -Fparam.txt
sqlio -kR -s300 -frandom -o8 -b512 -LS -BN -Fparam.txt
sqlio -kW -s300 -frandom -o8 -b8 -LS -BN -Fparam.txt
sqlio -kW -s300 -frandom -o8 -b64 -LS -BN -Fparam.txt
sqlio -kW -s300 -fsequential -o8 -b8 -LS -BN -Fparam.txt
sqlio -kW -s300 -fsequential -o8 -b64 -LS -BN -Fparam.txt

And my param.txt has this:

C:\sqltest.dat 4 0x0 32768

Each line in param.txt specifies the test file path, the number of threads, the affinity mask, and the test file size in MB. The commands run the tests using 8, 64, and 512 KB block sizes with 8 outstanding (pending) I/Os. These are just the values I’m using; you can surely expand them for your needs.

You can see that I don’t specify an output file in the runs above, since my PowerShell script concocts one for me from the run options (e.g. Rs10frandomb8.txt in the sample output below). The script runs TestSQLIO.cmd (the default input file) and writes the parsed data to SQLIOResults.xml (the default output file).

I chose to create a PSObject in code for each individual run, with properties like Operation, Mode, Duration, Size, etc., so that I can serialize the object collection using the Export-Clixml cmdlet. The reason I did this is that I didn’t want to have to go back to the other artifacts, like TestSQLIO.cmd and param.txt, later on to find out how many threads were used, how long the test ran, the size of the test file, and so on. If I dump everything into one XML file, then that’s all I need later. Again, this is just my preference and may or may not work for you.
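In case it helps, here’s a rough sketch of that idea. The property names mirror the XML sample below, but the parsing helper (Parse-SqlioRun) and its internals are mine, not the actual code from the repo, so treat it as illustrative only:

 # Sketch only: run each sqlio command line, capture its console output,
 # and wrap the run's options plus results in a PSCustomObject.
 # Parse-SqlioRun is a hypothetical helper - the real script in the repo differs.
 function Parse-SqlioRun {
     param([string]$CommandLine)

     $options = $CommandLine -split '\s+' | Where-Object { $_ -like '-*' }
     $results = cmd /c $CommandLine 2>&1     # capture sqlio's console output

     # Pull the summary numbers out of the captured text
     $iops   = ($results | Select-String 'IOs/sec:\s*([\d\.]+)'       | Select-Object -First 1).Matches[0].Groups[1].Value
     $mbps   = ($results | Select-String 'MBs/sec:\s*([\d\.]+)'       | Select-Object -First 1).Matches[0].Groups[1].Value
     $avgLat = ($results | Select-String 'Avg_Latency\(ms\):\s*(\d+)' | Select-Object -First 1).Matches[0].Groups[1].Value

     [PSCustomObject]@{
         Options      = $options
         Operation    = ($options -cmatch '^-k')[0] -replace '^-k'    # R or W
         Mode         = ($options -cmatch '^-f')[0] -replace '^-f'    # random or sequential
         Duration     = [int](($options -cmatch '^-s')[0] -replace '^-s')
         Outstanding  = [int](($options -cmatch '^-o')[0] -replace '^-o')
         Size         = [int](($options -cmatch '^-b')[0] -replace '^-b')
         Results      = $results
         IOsPerSec    = [double]$iops
         MBsPerSec    = [double]$mbps
         AvgLatencyMs = [double]$avgLat
     }
 }

 # Run every sqlio line in TestSQLIO.cmd and serialize the whole collection
 $runs = Get-Content .\TestSQLIO.cmd |
         Where-Object { $_ -match '^sqlio' } |
         ForEach-Object { Parse-SqlioRun $_ }
 $runs | Export-Clixml .\SQLIOResults.xml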

With that said, this is what the resulting XML looks like:

 <Obj RefId="0">
    <TN RefId="0">
      <T>System.Management.Automation.PSCustomObject</T>
      <T>System.Object</T>
    </TN>
    <MS>
      <Obj N="Options" RefId="1">
        <TN RefId="1">
          <T>System.Object[]</T>
          <T>System.Array</T>
          <T>System.Object</T>
        </TN>
        <LST>
          <S>'-kR'</S>
          <S>'-s10'</S>
          <S>'-frandom'</S>
          <S>'-o8'</S>
          <S>'-b8'</S>
          <S>'-LS'</S>
          <S>'-BN'</S>
          <S>'-Fparam.txt'</S>
        </LST>
      </Obj>
      <S N="Operation">R</S>
      <I32 N="Duration">10</I32>
      <S N="Mode">random</S>
      <I32 N="Outstanding">8</I32>
      <I32 N="Size">8</I32>
      <S N="OutputFile">Rs10frandomb8.txt</S>
      <S N="CaptureLatency">'-LS'</S>
      <S N="Buffering">'-BN'</S>
      <Obj N="Results" RefId="2">
        <TNRef RefId="1" />
        <LST>
          <S>sqlio v1.5.SG</S>
          <S>using system counter for latency timings, 2530840 counts per second</S>
          <S>parameter file used: param.txt</S>
          <S>_x0009_file C:\sqltest.dat with 4 threads (0-3) using mask 0x0 (0)</S>
          <S>4 threads reading for 10 secs from file C:\sqltest.dat</S>
          <S>_x0009_using 8KB random IOs</S>
          <S>_x0009_enabling multiple I/Os per thread with 8 outstanding</S>
          <S>_x0009_buffering set to not use file nor disk caches (as is SQL Server)</S>
          <S>using specified size: 1024 MB for file: C:\sqltest.dat</S>
          <S>initialization done</S>
          <S>CUMULATIVE DATA:</S>
          <S>throughput metrics:</S>
          <S>IOs/sec:   235.04</S>
          <S>MBs/sec:     1.83</S>
          <S>latency metrics:</S>
          <S>Min_Latency(ms): 1</S>
          <S>Avg_Latency(ms): 134</S>
          <S>Max_Latency(ms): 972</S>
          <S>histogram:</S>
          <S>ms: 0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24+</S>
          <S>%:  0  0  0  0  0  0  0  3  1  0  0  1  2  0  1  1  0  1  1  1  1  1  1  0 85</S>
        </LST>
      </Obj>
      <I32 N="Threads">4</I32>
      <Db N="TestFilesize">1024</Db>
      <Db N="IOsPerSec">235.04</Db>
      <Db N="MBsPerSec">1.83</Db>
      <Db N="AvgLatencyMs">134</Db>
    </MS>
  </Obj>

Also, once I have this file, I can quickly “graph it” using the -OnlyCharts option and create tab-based charts as shown above.
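For what it’s worth, a typical invocation might look something like the following. The script file name below is a placeholder of mine, and -OnlyCharts is the only parameter named here, so check the repo for the real names and any other parameters:

 # Placeholder script name - see the GitHub repo for the actual file
 .\GraphSQLIO.ps1                 # runs TestSQLIO.cmd (default) and writes SQLIOResults.xml (default)
 .\GraphSQLIO.ps1 -OnlyCharts     # skips the runs and charts an existing SQLIOResults.xml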

I used a tabbed interface since I group my test runs by Operation (R/W) and Mode (random/sequential) and create a Hashtable:

Group-Object -Property Operation, Mode -AsHashTable -AsString

The Hashtable keys serve as the individual tab pages that get added to the main tab control hosted on the form.
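Roughly, that part looks like this, assuming the XML produced earlier (the variable names are mine, not the repo’s):

 Add-Type -AssemblyName System.Windows.Forms

 # Load the serialized runs and group them by Operation + Mode;
 # the hashtable keys come back as strings like "R, random" or "W, sequential"
 $runs   = Import-Clixml .\SQLIOResults.xml
 $groups = $runs | Group-Object -Property Operation, Mode -AsHashTable -AsString

 # Tab control that will host one tab page (and chart) per group
 $tabControl = New-Object System.Windows.Forms.TabControl
 $tabControl.Dock = [System.Windows.Forms.DockStyle]::Fill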

To create the charts, I do this (a rough sketch follows the list):

  • Create a form host with a tab-control that “fills” the form area (GenerateCharts function)
  • Group the entries by Operation and Mode (GenerateCharts function)
  • Cycle through the group keys by:
    • First creating a Chart object (GenerateCharts function)
    • Then creating a Chart Area and its Series for the current key’s values (AddChartAreas function)
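Here’s a trimmed-down sketch of those last two steps, continuing from the grouping snippet above. The actual GenerateCharts/AddChartAreas functions in the repo do more (the note, titles, formatting), and the metric and chart-type choices here are just illustrative:

 Add-Type -AssemblyName System.Windows.Forms.DataVisualization

 function New-GroupChart {
     param([string]$Key, [object[]]$Runs)

     $chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
     $chart.Dock = [System.Windows.Forms.DockStyle]::Fill
     [void]$chart.Titles.Add($Key)    # chart title = group key, e.g. "R, random"

     $area = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea("Main")
     $chart.ChartAreas.Add($area)

     # One series per metric, plotted against block size (KB)
     foreach ($metric in 'IOsPerSec', 'MBsPerSec', 'AvgLatencyMs') {
         $series = New-Object System.Windows.Forms.DataVisualization.Charting.Series($metric)
         $series.ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Column
         foreach ($run in ($Runs | Sort-Object Size)) {
             [void]$series.Points.AddXY($run.Size, $run.$metric)
         }
         $chart.Series.Add($series)
     }
     $chart
 }

 # One tab page per group, each hosting its own chart, all on one form
 foreach ($key in $groups.Keys) {
     $page = New-Object System.Windows.Forms.TabPage($key)
     $page.Controls.Add((New-GroupChart -Key $key -Runs $groups[$key]))
     $tabControl.TabPages.Add($page)
 }

 $form = New-Object System.Windows.Forms.Form
 $form.Text   = "SQLIO Results"
 $form.Width  = 800
 $form.Height = 600
 $form.Controls.Add($tabControl)
 [void]$form.ShowDialog()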

That should be it!

 
