Michael J. Swart

January 23, 2014

Generating Concurrent Activity

Filed under: Miscellaneous SQL,SQL Scripts,SQLServerPedia Syndication,Technical Articles — Michael J. Swart @ 10:21 am
Testing Concurrency

I’ve discovered that DBAs use different methods to accomplish what amounts to the same thing: Generating concurrent activity on a server. I want to explore a number of methods I’ve seen. For each method, I want to call one particular procedure, many times at once, and often… Oh, and for free.

Why are we doing this?

  • For Testing: The whole point is to generate concurrent activity on the server. Testing with this kind of activity is one of the only ways to avoid resource contention issues.
  • For Demos: Concurrency issues are so common and so varied, that it’s not surprising how often we see demos that depend on this kind of generated database activity.
  • For Troubleshooting: This is my favorite reason. A concurrency issue can’t be fixed well unless it can be reproduced reliably on some test environment. That’s why the methods below have a spot on my troubleshooting toolbelt.

The whole idea is to get many workers active on SQL Server at once.

The exocomps were smart enough to exit on their own

For each method below, look out for the place where I specify these “parameters”

  • Number of virtual workers (50 in my example)
  • Number of iterations (for instance 10,000)
  • Connection string
  • Procedure to execute (s_DoSomething in my case)

One other thing to watch for is the overhead that this tool puts on the machine. Ideally, the method is lightweight enough to run from the same test environment that the SQL Server sits on. This means it’s best to handle iteration on SQL Server itself, which cuts down on the overhead of opening connections. So in most cases, instead of

exec dbo.s_DoSomething

I have

declare @i int = 0; while (@i < 10000) begin exec dbo.s_DoSomething; set @i+= 1; end
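Stripped of the SQL specifics, every method below has the same shape: fan out N workers, and have each worker iterate M times. Here is a minimal sketch of that pattern in Python, with a local stand-in function in place of s_DoSomething (the names and counts are illustrative, not part of any method below):

```python
import threading

WORKERS = 50          # number of virtual workers
ITERATIONS = 10_000   # iterations per worker

counter_lock = threading.Lock()
calls = 0

def do_something():
    # Stand-in for "exec dbo.s_DoSomething" -- here it just counts calls.
    global calls
    with counter_lock:
        calls += 1

def worker():
    # Each worker loops locally, mirroring the WHILE loop we send to SQL Server.
    for _ in range(ITERATIONS):
        do_something()

threads = [threading.Thread(target=worker) for _ in range(WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(calls)
```

Each method below is just a different way of standing up those 50 workers against a real SQL Server instance.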

Notepad + DOS Method

I adapted this method from a clever trick I saw once: Paul Randal giving a demo on tempdb contention. You can find a video of that demo by visiting this newsletter.
It’s a simple idea. You have two batch files; the first is called Run.bat:

@echo off
sqlcmd -S MYSERVER\MYINSTANCE -E -Q "set nocount on; declare @i int = 0; while (@i < 10000) begin exec tempdb.dbo.s_DoSomething; set @i+= 1; end" > NUL

and the second is called Run50.bat:

start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat
start run.bat

Click Run50.bat and boom: instant parallel activity. Now it is a little sloppy – it opens up 50 command windows – but it does the trick pretty nicely.

Using PowerShell

We should be able to do this in one line, right? Yep. Adapting a post from Kendra Little, I came up with this:

foreach($worker in 1..50) `
{ `
  Start-Job `
    -ScriptBlock `
      { `
        Import-Module sqlps; `
        Invoke-Sqlcmd `
          -Query "set nocount on; declare @i int = 0; while (@i < 10000) begin exec tempdb.dbo.s_DoSomething; set @i+= 1; end" `
          -ServerInstance "MYSERVER\MYINSTANCE" `
          | Out-Null `
      } `
}
Except that I don’t think I got it quite right. Whatever I changed makes this method unsuitable.

The script schedules a job 50 times, but it takes about a minute just to schedule them all. Once they’re scheduled, the jobs take time to start, and not enough of them run in parallel to generate the needed concurrent activity, so I give this method a “skip”. If you’re really comfortable with PowerShell, maybe you can get this to work faster. If you can, let me know.


C# Console Application

This is my preferred method. It started out as the program I used to test upsert concurrency at http://michaeljswart.com/go/upsert but a friend at work showed me that .net 4.5 has some nifty new asynchronous methods that make it look nicer, perform faster and weigh lighter.

using System;
using System.Linq;
using System.Data.SqlClient;
using System.Threading.Tasks;
namespace AConsoleApplication {
    class Program {
        static void Main( string[] args ) {
            var cs = new SqlConnectionStringBuilder();
            cs.DataSource = @"MYSERVER\MYINSTANCE";
            cs.InitialCatalog = "tempdb";
            cs.IntegratedSecurity = true;
            cs.AsynchronousProcessing = true;
            string connectionString = cs.ToString();
            Task[] tasks = Enumerable.Range( 0, 50 )
                .Select( i => RunCommand( connectionString ) )
                .ToArray();
            Task.WaitAll( tasks );
        }

        private static async Task RunCommand(
                string connectionString
            ) {
            string sql = @"s_DoSomething";
            SqlConnection conn = new SqlConnection( connectionString );
            conn.Open();
            SqlCommand cmd = new SqlCommand( sql, conn );
            cmd.CommandType = System.Data.CommandType.StoredProcedure;
            for( int i = 0; i < 10000; i++ ) {
                try {
                    await cmd.ExecuteNonQueryAsync();
                } catch( Exception ex ) {
                    Console.WriteLine( ex.Message );
                }
            }
            conn.Close();
        }
    }
}

HammerDB

HammerDB (originally HammerOra) is a free utility that allows users to run benchmarks against various environments. Although it was originally built for running benchmarks on Oracle, the utility now works on Windows and for SQL Server (hence the name change). I was first introduced to the utility via Kendra Little (again):

Follow these links to learn how to use the tool for its typical function, running benchmarks. Then, once you know how to do that, it’s a quick step to repurpose the tool for your own concurrent activity. For example, replace HammerDB’s generated script with this one:

package require tclodbc 2.5.1
database connect odbc "DRIVER=SQL Server Native Client 11.0;SERVER=MYSERVER\\MYINSTANCE;PORT=1433;TRUSTED_CONNECTION=YES";
odbc "set nocount on; declare @i int = 0; while (@i < 10000) begin exec tempdb.dbo.s_DoSomething; set @i+= 1; end"
odbc disconnect

It’s steady and lightweight and works really well.


SQLQueryStress

SQLQueryStress is a tool written by Adam Machanic that you can download and install for free.


It’s fairly idiot proof and avoids a lot of the complexity of the other methods. For my own purposes, I want the utility to be lightweight. So I …

  • Remember to turn off “Collect I/O Statistics”
  • Remember to turn off “Collect Time Statistics”
  • Set “Number of Iterations” to one
  • Iterate in the query window, i.e. modify the query to call s_DoSomething 10,000 times

My Ranking

  • C# console app (★★★★½): It performs fastest and uses the least resources. Also, because I’m biased, it gets a small bump by not suffering from NIH issues.
  • HammerDB (★★★): Very fast (2nd fastest), but the interface is clumsy and modifications need tclsh experience (yuck). It’s best to use this for its intended purpose, as a benchmark tool.
  • Notepad and DOS (★★★): Quick and dirty and really fast. However, it’s still pretty clumsy, and interrupting a test partway through is difficult.
  • PowerShell (★½): Yuck. I couldn’t get two runs that looked the same, and it was near impossible to separate PowerShell CPU and memory pressure from SQL Server pressure.
  • SQLQueryStress (★★★★): It does the job well. It was a little difficult to interrupt a test, and it takes a little care (see the settings above) to keep it lightweight.

Next Week

I’ll show some typical ways I define s_DoSomething.


  1. […] Generating Concurrent Activity – Michael J. Swart (Blog|Twitter) […]

    Pingback by (SFTW) SQL Server Links 24/01/14 • John Sansom — January 24, 2014 @ 7:00 am

  2. Okay, do you guys get the illustration? Is it clear that each of them are unreliable minions or servants that often misbehave?

    Comment by Michael J. Swart — January 24, 2014 @ 10:36 am

  3. Oh yeah, totally got it. In fact, I’m linking to it in our San Diego class in February. Great job, sir.

    Comment by Brent Ozar — January 24, 2014 @ 12:15 pm

  4. A very timely topic for me Michael. Thanks for pulling all this information together.

    I’ve been working with HammerDB and Jonathan Kehayias’ workload generator http://www.sqlskills.com/blogs/jonathan/the-adventureworks2008r2-books-online-random-workload-generator/ . Jonathan’s solution is nice because you can feed the powershell script any SQL script you want and it is easy to fire off multiple powershell sessions.

    Comment by Justin Randall — January 25, 2014 @ 8:22 am

  5. Hey Justin,

    It’s easy to fire off a handful of sessions, but how do you easily fire off 50 powershell sessions? or 500? When I evaluated powershell, I found that each session takes enough computer resources (memory/cpu) that I couldn’t be sure that what I was measuring wasn’t caused by the powershell framework itself.

    I think HammerDb and Jonathan’s solutions are good for benchmarks. Like when you’re testing and monitoring SQL Server but you’re agnostic about the app.

For my own goals, I want to test and debug a particular application’s concurrency (that means lots of concurrent threads). It’s those goals that I used as the basis for my rating.

    Comment by Michael J. Swart — January 27, 2014 @ 9:25 am

  6. Michael, great post! A few years ago I blogged about how I generated workloads remotely with PowerShell, which – if you have more than one machine available – eliminates the problem PowerShell has of becoming such a major part of the observer overhead when running locally:


    I also had a different post explaining how I also use PowerShell to collect perfmon counters:


    Now, I am the last guy you’ll see saying “Use PowerShell” in cases where another tool is more appropriate. But in this case I find it a bit easier to generate a bunch of jobs in PowerShell than to create a C# app that – with my skills – will likely take over the box. :-)


    Comment by Aaron Bertrand — January 29, 2014 @ 2:52 pm

  7. Good point Aaron,
    I think what you and I find easier are different. And that’s fine. I find C# an extremely handy tool.

My actual plan is to do this activity once. So when this tool is complete, I can click it once and it will launch several clients which call a server’s tempdb.dbo.s_DoSomething. Then s_DoSomething is where I put all the test functionality. s_DoSomething is next week’s post and I have a feeling it will be equally useful whether you like my preferred methods here or you stick with powershell. Stay tuned.

    On another note, I was surprised that your post was part of a TSQL Tuesday (#15) on the topic of automation. That’s what last month’s topic was as well. I think we have our first duplicate. See the whole list

    Comment by Michael J. Swart — January 29, 2014 @ 3:23 pm

  8. Someone forgot:

    ALTER TABLE dbo.TSQLTuesdays ADD CONSTRAINT uq_topic UNIQUE (Topic);

    Comment by Aaron Bertrand — January 29, 2014 @ 3:47 pm

  9. […] Part One: Generating Concurrent Activity […]

    Pingback by Troubleshooting Concurrency – A Demo - SQL Server - SQL Server - Toad World — February 13, 2014 @ 10:30 am

  10. Michael,

    A cleaner version of Run50.bat that will do the same job.

    @echo off
    for /L %%n in (1,1,50) do start run.bat

    Thanks for the post.

    Comment by Nelson G — February 13, 2014 @ 4:03 pm

  11. Da Doo Run Run Run
    Da Doo Run Run

    Thanks for the tip Nelson!

    Comment by Michael J. Swart — February 13, 2014 @ 4:15 pm

  12. I don’t want to claim that PowerShell is the best option here but, one way to make your PowerShell snippet much faster is to call sqlcmd instead of loading the SQLPS module. Loading the SQLPS module is very slow and, when you said that it took a minute to create 50 jobs, I was surprised. I never experienced that type of lag with the Start-Job Cmdlet. However, I can see loading SQLPS taking that long. Just my two cents…

    Comment by Derik Hammer — February 26, 2015 @ 3:36 pm

  13. That’s cool Derik.

    If I’m not using the SQLPS module, aren’t I just coding in a fancy version of DOS batch? :-)

    I continue to use my custom c# program as my go-to solution. It’s hard to consider (reconsider) any other solution when I’ve got something that works so well.

    In the future, if I come across any other database admin work where powershell is involved, I’ll try to remember that SQLPS’s performance needs some extra scrutiny.

    Comment by Michael J. Swart — February 26, 2015 @ 4:03 pm
