How to Log Wait Stats to Table with sp_BlitzFirst

When you’re analyzing SQL Server performance, one of the best places to start is wait stats. What’s your SQL Server waiting on?

sp_BlitzFirst makes it really easy to trend this data over time for free with all currently supported versions of SQL Server (2008+). Just set up an Agent job to run this every 15 minutes:

sp_BlitzFirst logging tables
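The job step looks something like this (a minimal sketch – the DBAtools database name, dbo schema, and table names here are just examples; sp_BlitzFirst's @Output* parameters let you pick any existing database and schema):

    EXEC dbo.sp_BlitzFirst
        @OutputDatabaseName = 'DBAtools',
        @OutputSchemaName = 'dbo',
        @OutputTableName = 'BlitzFirst',
        @OutputTableNameFileStats = 'BlitzFirst_FileStats',
        @OutputTableNamePerfmonStats = 'BlitzFirst_PerfmonStats',
        @OutputTableNameWaitStats = 'BlitzFirst_WaitStats';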

sp_BlitzFirst will create those tables if they don’t already exist. (The database & schema have to already exist, though.) It logs cumulative data in those tables just like the DMVs they’re sourced from:

  • sys.dm_os_wait_stats
  • sys.dm_os_performance_counters
  • sys.dm_io_virtual_file_stats

The tables are designed to be combined centrally from all of your servers, too – they include columns for ServerName and CheckDate (stored as datetimeoffset, so you can report on data from servers in different time zones).

But since that data is cumulative, it can be a little painful to query – so sp_BlitzFirst also creates a set of delta views for you. They’re named exactly the same as the table names you picked, but with _Deltas added at the end, like BlitzFirst_WaitStats_Deltas. These join the current sample with the prior sample to figure out how much time elapsed between samples, and how much wait time you had per minute during that sample.
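For example, to eyeball your top waits per sample, you can query the deltas view directly (a sketch – the exact column names like wait_time_minutes_per_minute are assumptions based on how the view is described above, so check the view definition on your server):

    SELECT TOP (20) ServerName, CheckDate, wait_type,
           wait_time_minutes_per_minute
    FROM dbo.BlitzFirst_WaitStats_Deltas
    ORDER BY CheckDate DESC, wait_time_minutes_per_minute DESC;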

Throw that into Microsoft’s free Power BI tool, and presto:

sp_BlitzFirst data in Power BI

From here, you can drill down into the other tables to see why the SQL Server was slow during that time range. Heck, armed with the other First Responder Kit scripts and their abilities to log data to table, you can even find out what queries were causing your high wait stats.

And it’s all totally free.

Wanna learn how? We’ll teach you in the very first module of our Expert Performance Tuning for 2016 & 2017 class (and yes, that includes those of you in our PASS Summit pre-con!)


29 Comments

  • Can we combine this with a CMS? I’m trying to figure out the best way to monitor several SQL servers, each with several databases – for a few things, and log it all to several tables. Ideally, I want to monitor things like configuration changes, perfmon stats, db and log file size, disk space remaining, wait stats and extended blocking/deadlocks.

    CMS would let you run queries against every server – but would that work with this, logging it all to a central table or set of tables? I’ve never set up CMS before – just reading about it before I got this post in my email, so it was fortuitous timing. Homelab project tonight! (Maybe you have some insight to head me down the right path) Thanks!

    • With a CMS, you’re just executing queries remotely on many servers. If you want to push the data to a central server, that gets a little trickier – but we’ll be talking about that in the class.

  • Markus Pöhler
    October 13, 2017 1:45 pm

    I love this.

  • How about having a cleanup parameter in the procedure itself? Logging all the data and never cleaning it up will create issues in the long run.

  • Tony Fountain
    October 13, 2017 3:20 pm

    Again, another awesome demonstration of useful features and tools by you guys! Thanks for what you provide to the SQL community!

  • Very useful article, thanks. Could you share the Power BI file?

  • This will be included in the pre-con? Will there be additional material covered in the Expert Performance Tuning class, or are they covering the same material?

    • Craig – the answers you seek are very carefully hidden in the post above, but I have faith in you. I believe you can read through to the last paragraph. 😉

  • Willem Leenen
    October 15, 2017 3:17 am

    This is awesome! Stroopwafels for you when we meet again.

  • I’m assuming that this is a recent post based on the date. However when you click on the link to the Expert Performance Tuning link it says “This product is currently out of stock and unavailable.”


  • Brent, please share your Power BI file. I’m a newbie in Power BI. 🙂

  • Question: how do you get cool enough to have a database named ¯\_(ツ)_/¯ on your machine? 🙂

    • I have entirely too much free time, hahaha.

    • CREATE DATABASE [¯\_(ツ)_/¯]
      CONTAINMENT = NONE
      ON PRIMARY
      ( NAME = N'garbage', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\DATA\garbage.mdf' , SIZE = 8192KB , FILEGROWTH = 65536KB )
      LOG ON
      ( NAME = N'garbage_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\DATA\garbage_log.ldf' , SIZE = 8192KB , FILEGROWTH = 65536KB )

  • Aron Pedersen
    October 24, 2017 3:37 pm

    Thank you, I love this. Any ETA for when the @OutputServerName will work for all our awesome scripts?
