Blog

I’m curious. I like to know how things work. I have to read the user manual of everything I buy so I know all of its features. I like clicking all the buttons in applications I work with. However, SQL Server Management Studio (SSMS) has so many buttons that I haven’t gotten through all of them. At SQL Saturday #118 Wisconsin earlier this year, I was watching a presentation and the presenter clicked “Include Client Statistics”.

“What is this magic?!” I thought.

MSDN says it “Displays information about the query execution grouped into categories. When Include Client Statistics is selected from the Query menu, a Client Statistics window is displayed upon query execution. Statistics from successive query executions are listed along with the average values. Select Reset Client Statistics from the Query menu to reset the average.”

I mentioned it in last week’s webcast, “SSMS: More Than Meets The Eye”, but I wanted to give you more detail about it here.

This button is located on the SQL Editor toolbar.

Open a query editor window, click the Include Client Statistics button to toggle it on, write a query, and execute it. Your results will look like Figure 1.
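Any query will do for trying this out. Here’s a sketch using a hypothetical dbo.Orders table – substitute any table from your own database:

```sql
-- Turn on Include Client Statistics (Query menu), then execute.
-- The Client Statistics tab appears next to Results and Messages.
SELECT OrderID, CustomerID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '2012-01-01';
```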

Figure 1 – Client Statistics

Some really valuable information can be found here, such as the number of INSERT, DELETE, UPDATE, and SELECT statements – think of how useful that can be when tuning a complex stored procedure. You can see the number of rows affected or returned, and the number of transactions executed.
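To see those statement counts in action, you could run a small throwaway batch like this one and watch the “Number of INSERT, DELETE and UPDATE statements” and “Number of SELECT statements” rows (the counts below are what I’d expect, since DDL isn’t counted in those rows):

```sql
-- A throwaway batch: Client Statistics should report
-- 2 INSERT/DELETE/UPDATE statements and 1 SELECT statement.
CREATE TABLE #Demo (ID int, Amount money);

INSERT INTO #Demo (ID, Amount) VALUES (1, 10.00);
UPDATE #Demo SET Amount = 12.50 WHERE ID = 1;

SELECT ID, Amount FROM #Demo;

DROP TABLE #Demo;
```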

The network statistics let you see how much traffic is moving from your client to the server and back.

The time statistics tell you how much time was spent processing on the client versus how much time was spent waiting for the server. These figures are in milliseconds.

That’s useful when you run a query once, but its usefulness becomes more apparent when you are tuning a query. Statistics will be shown for the ten most recent executions, and they will be averaged, as shown in Figure 2. Thus, as you make changes to a query – perhaps changing the conditions in the WHERE clause, or comparing an aggregate AVG to a window function – the statistics will update, and you can compare the changes at a glance. You’ll even see green arrows for improved performance, and red arrows for worse performance.
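For instance, here are two hypothetical ways to get a per-customer average alongside order details (again assuming a dbo.Orders table). Run each a few times and compare the averaged statistics side by side:

```sql
-- Version 1: per-customer average via a GROUP BY derived table
SELECT o.CustomerID, o.OrderID, o.Amount, a.AvgAmount
FROM dbo.Orders AS o
JOIN (SELECT CustomerID, AVG(Amount) AS AvgAmount
      FROM dbo.Orders
      GROUP BY CustomerID) AS a
  ON a.CustomerID = o.CustomerID;

-- Version 2: the same result with a window function,
-- reading the table only once
SELECT CustomerID, OrderID, Amount,
       AVG(Amount) OVER (PARTITION BY CustomerID) AS AvgAmount
FROM dbo.Orders;
```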

Figure 2 – Client Statistics for multiple executions of a query

Client Statistics is not as in-depth a tool as, say, execution plans, which show you where the performance of your query can be improved. It does, however, give you access to information that you might otherwise need to keep track of in your head, or scribbled on a Post-It Note – like I used to do when writing report queries. It’s another useful tool in the SSMS toolbox!

  1. Do you have any idea how many Post-It Note trees I’ve killed over the years? This is pretty nifty.

  2. A quick question:

    Do the results take into account the changes on subsequent runs of the query where the data now resides in memory, or would you always expect to see a reduction in the amount of network traffic for the second (and subsequent) runs as it is no longer reading from disk?

    • You will not always see a reduction in “Bytes received from server” or “TDS packets received from server”, even if the exact same query is run multiple times.

    • @ross The Client Statistics have very little to do with SQL Server caching. Most of the Client Statistics functionality relates to the ‘big picture’ of the client-to-database request and the data being sent back to the client. Whether the data is in SQL Server’s cache or not does not matter for the TDS packets or for ‘Bytes received from server’.
      The only place where you see the effect of the data being cached or not is ‘Wait time on server replies’.

      Wait time on server replies – is the time in milliseconds between the last request packet which left the client and the first response packet which came back from the server to the client.

      So in this case, if you have the data in cache, this time will generally be shorter in consecutive runs. And if you clear the cache, then the time will be longer for the first execution, because the data needs to be read from disk and cached. (Use SSDs and the time difference will be insignificant. :) )
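      You can see this for yourself on a test server (never production) by flushing the caches between runs and watching ‘Wait time on server replies’ change:

      ```sql
      -- TEST SERVERS ONLY: write dirty pages, then flush the
      -- buffer pool and plan cache before re-running the query.
      CHECKPOINT;
      DBCC DROPCLEANBUFFERS;
      DBCC FREEPROCCACHE;
      ```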

      Hope this helps.

      Regards,
      Feodor

  3. This is a good tip. Keep writing please.

  4. Yes, this is a great post!
    I like using Client statistics when developing complex logic queries and procedures in SQL Server.

    The Client Statistics give some insight into what is going on beyond the SQL Server internals, i.e. TDS packets, network volumes transferred etc.

    Recently I wrote an article on the ‘big picture’ topic, which explains more in detail the full cycle of the workload which goes to and from SQL Server. The article can be read here: http://www.simple-talk.com/sql/database-administration/how-come-the-hourglass-why-database-applications-slow-down.-/

    There is one limitation of the Client Statistics in Management Studio, though. It only measures the workload of a query from Management Studio to the database engine. There is no way for them to measure the actual client statistics of an application. It would be so nice if there was a tool which could account and aggregate the TDS and network traffic in a similar way from the application perspective, don’t you think?

  5. Pingback: Something for the Weekend - SQL Server Links 06/12/12

  6. Jes,

    Great post and many thanks for this tip. I had not known about this feature before. It seems real useful and easy to use (one click!).

    You can copy and paste the Client Statistics tab results into other docs, Excel, etc. – but not with the column headings (like in the query results tab) – not a biggie.

    At first, I thought this might be a new SSMS feature in SQL 2012. Since then, I confirmed that the feature is present in SSMS for SQL 2008 R2, SQL 2008, and SQL 2005 as well. I also checked the Query Analyzer tool from SQL 2000 (predecessor of SSMS), and it has a simpler version of this feature (no tool bar icon – enable through the Query menu or keyboard shortcut, and only displays statistics for the most recent query run and the averages across runs versus displaying the history of multiple runs). So this feature has been around for quite a while! Who knew?

    I Googled for prior hits on the topic and found some – but if you didn’t know to look for the specific topic (the feature name of “Client Statistics”) or hadn’t stumbled across it, you probably wouldn’t know it was there. Another case where covering a topic again can be a very good thing – thanks a second time.

    Like Feodor, I think it would be great if there were similar tools / capabilities to track app level SQL client interactions. Another topic for another time.

    Thanks again,

    Scott R.

  7. Interesting. I have a lunch ‘n learn scheduled for later this week with our SQL developers and this is included as a demo. I use the ‘Bytes received from server’ column to demo why you should include SET NOCOUNT ON in your procedures and triggers.
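    Something like this hypothetical pair makes the point (dbo.Orders is a stand-in table) – run each half and compare ‘Bytes received from server’:

    ```sql
    -- Without NOCOUNT, every statement sends an
    -- "n rows affected" message back to the client.
    SET NOCOUNT OFF;
    UPDATE dbo.Orders SET Amount = Amount WHERE OrderID = 1;

    -- With it on, those messages are suppressed, which shows
    -- up as fewer bytes received from the server.
    SET NOCOUNT ON;
    UPDATE dbo.Orders SET Amount = Amount WHERE OrderID = 1;
    ```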

  8. I use this tool when trying to improve the performance of Stored Procs, but concentrate on the ‘Average’ column with at least 6 runs of the code. There are invariably small variations in each execution, so I find that averaging the results out gives a better idea of any performance changes (I won’t say ‘improvements’ – it doesn’t always work like that!). Paste screen shots of these results into a document to show the changes, along with execution plans if possible. Then you can show the work done should there be any questions relating to it.

  9. Pingback: Connecting to SQL Server – latency, part 1 « sqlconcept.com

