When you’re analyzing SQL Server performance, one of the best places to start is wait stats. What’s your SQL Server waiting on?
sp_BlitzFirst makes it really easy to trend this data over time, for free, on all currently supported versions of SQL Server (2008 and newer). Just set up an Agent job to run this every 15 minutes:
EXEC sp_BlitzFirst
@OutputDatabaseName = 'DBAtools',
@OutputSchemaName = 'dbo',
@OutputTableName = 'BlitzFirst',
@OutputTableNameFileStats = 'BlitzFirst_FileStats',
@OutputTableNamePerfmonStats = 'BlitzFirst_PerfmonStats',
@OutputTableNameWaitStats = 'BlitzFirst_WaitStats';
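If you haven't scripted an Agent job before, here's a rough sketch using msdb's stock procedures. The job and schedule names are made up for illustration, and you may want different options on your server:

```sql
USE msdb;
GO
-- Create the job (name is just an example)
EXEC dbo.sp_add_job @job_name = N'Log sp_BlitzFirst to tables';

-- One step that calls sp_BlitzFirst with the output parameters from the post
EXEC dbo.sp_add_jobstep
    @job_name = N'Log sp_BlitzFirst to tables',
    @step_name = N'Run sp_BlitzFirst',
    @subsystem = N'TSQL',
    @command = N'EXEC sp_BlitzFirst
        @OutputDatabaseName = ''DBAtools'',
        @OutputSchemaName = ''dbo'',
        @OutputTableName = ''BlitzFirst'',
        @OutputTableNameFileStats = ''BlitzFirst_FileStats'',
        @OutputTableNamePerfmonStats = ''BlitzFirst_PerfmonStats'',
        @OutputTableNameWaitStats = ''BlitzFirst_WaitStats'';';

-- Run every 15 minutes, all day, every day
EXEC dbo.sp_add_jobschedule
    @job_name = N'Log sp_BlitzFirst to tables',
    @name = N'Every 15 minutes',
    @freq_type = 4,              -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,       -- interval unit: minutes
    @freq_subday_interval = 15;

-- Target the local server
EXEC dbo.sp_add_jobserver @job_name = N'Log sp_BlitzFirst to tables';
```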
sp_BlitzFirst will create those tables if they don’t already exist. (The database & schema have to already exist, though.) It logs cumulative data in those tables just like the DMVs they’re sourced from:
The tables are designed to be combined centrally from all of your servers, too – they include columns for ServerName and CheckDate (which is stored in datetimeoffset format, so you can report on data from servers in different time zones.)
But since that data is cumulative, it can be a little painful to query – so sp_BlitzFirst also creates a set of delta views for you. They’re named exactly the same as the table names you picked, but with _Deltas added at the end, like BlitzFirst_WaitStats_Deltas. These join the current sample with the prior sample to figure out how much time elapsed between samples, and how much wait time you had per minute during that sample.
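Once the views exist, reporting is a plain SELECT against them. A rough illustration, assuming you used the table names above (the wait and per-minute column names below are assumptions for the sketch; check the view definition on your server for the real ones):

```sql
-- Top waits across all servers over the last day, from the delta view.
-- Column names are illustrative; inspect the view for the actual ones.
SELECT TOP 20
       ServerName,
       CheckDate,
       wait_type,
       [WaitTimeMinutesPerMinute]   -- wait time accrued per minute of elapsed time
FROM DBAtools.dbo.BlitzFirst_WaitStats_Deltas
WHERE CheckDate > DATEADD(DAY, -1, SYSDATETIMEOFFSET())
ORDER BY [WaitTimeMinutesPerMinute] DESC;
```

Because CheckDate is a datetimeoffset, the DATEADD filter works correctly even when the rows came from servers in different time zones.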
Throw that into Microsoft’s free Power BI tool, and presto:
From here, you can drill down into the other tables to see why the SQL Server was slow during that time range. Heck, armed with the other First Responder Kit scripts and their ability to log data to tables, you can even find out what queries were causing your high wait stats.
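For example, sp_BlitzCache accepts similar output parameters, so you can log the plan cache's top resource consumers to a table on the same schedule. A hedged sketch (the table name here is my own choice, not one the post prescribes):

```sql
-- Log top plan-cache consumers alongside the wait stats samples,
-- so you can line up expensive queries with wait spikes by time.
EXEC sp_BlitzCache
    @OutputDatabaseName = 'DBAtools',
    @OutputSchemaName = 'dbo',
    @OutputTableName = 'BlitzCache';   -- illustrative table name
```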
And it’s all totally free.
Wanna learn how? We’ll teach you in the very first module of our Expert Performance Tuning for 2016 & 2017 class (and yes, that includes those of you in our PASS Summit pre-con!)
Can we combine this with a CMS? I’m trying to figure out the best way to monitor several SQL servers, each with several databases – for a few things, and log it all to several tables. Ideally, I want to monitor things like configuration changes, perfmon stats, db and log file size, disk space remaining, wait stats and extended blocking/deadlocks.
CMS would let you run queries against every server – but would that work with this, logging it all to a central table or set of tables? I’ve never set up CMS before – just reading about it before I got this post in my email, so it was fortuitous timing. Homelab project tonight! (Maybe you have some insight to head me down the right path) Thanks!
With a CMS, you’re just executing queries remotely on many servers. If you want to push the data to a central server, that gets a little trickier – but we’ll be talking about that in the class.
I love this.
Markus – thanks!
How about having a cleanup parameter in the procedure itself? Logging all that data and never cleaning it up will create issues in the long run.
Nimit – sure, that’s totally up to you to determine how much data you’d like to keep, and where you’d like to keep it.
Again, another awesome demonstration of useful features and tools by you guys! Thanks for what you provide to the SQL community!
Our pleasure, sir!
Very useful article, thanks. Could you share PowerBI file?
Konstantin – absolutely, that’s what we’re covering in class. See you there!
This will be included in the pre-con? Will there be additional material covered in the Expert Performance Tuning class, or are they covering the same material?
Craig – the answers you seek are very carefully hidden in the post above, but I have faith in you. I believe you can read through to the last paragraph. 😉
This is awesome! Stroopwafels for you when we meet again.
Willem – ooh, then it was all worth it! I love those things, heh.
I’m assuming that this is a recent post based on the date. However, when you click the Expert Performance Tuning link, it says “This product is currently out of stock and unavailable.”
Td – yep, unfortunately the sales on that have closed since the blog post. They’ll reopen soon.
Brent, please, share you Power BI file. Because I’m a newbie in Power BI. 🙂
Jiri – sure, read the post and there’s instructions on how to get the Power BI file. Thanks!
I read and found nothing. 🙂
Jiri – read the last sentence really carefully. Out loud, maybe. Thanks!
Mr. Ozar, I can’t seem to get the .pbix without attending the class or pre-con. If you could, please share it. Thank you.
AJ – we’re sharing it in the classes and pre-cons. You’re welcome to join in! See you there.
Question: how do you get cool enough to have a database named ¯\_(ツ)_/¯ on your machine? 🙂
I have entirely too much free time, hahaha.
CREATE DATABASE [¯\_(ツ)_/¯]
CONTAINMENT = NONE
ON PRIMARY
( NAME = N'garbage', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\DATA\garbage.mdf' , SIZE = 8192KB , FILEGROWTH = 65536KB )
LOG ON
( NAME = N'garbage_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\DATA\garbage_log.ldf' , SIZE = 8192KB , FILEGROWTH = 65536KB );
Thank you, I love this. Any ETA for when the @OutputServerName will work for all our awesome scripts?
As soon as you code it! That’s the beauty of open source. 😉