It’s Now Easier to Query sp_BlitzFirst’s Historical Tables

SQL Server
4 Comments

When you want to know why the server’s slow, you can ask Brent. sp_BlitzFirst® checks a lot of DMVs to find common causes of server slowness.

When you turn on @ExpertMode = 1, you can see additional information like file stats, wait stats, and Perfmon counters.

To log those to a table permanently, check out the @Output parameters. In this example, I’m writing all of the working tables to the DBAtools database:
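The original code sample didn't survive here, but a call along these lines shows the idea. The parameter names come from the sp_BlitzFirst documentation; the DBAtools database is assumed to already exist:

```sql
EXEC dbo.sp_BlitzFirst
    @OutputDatabaseName          = 'DBAtools',
    @OutputSchemaName            = 'dbo',
    @OutputTableName             = 'BlitzFirst',    -- main alert results
    @OutputTableNameFileStats    = 'FileStats',     -- file-level I/O stats
    @OutputTableNamePerfmonStats = 'PerfmonStats',  -- Perfmon counters
    @OutputTableNameWaitStats    = 'WaitStats';     -- wait statistics
```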

That creates the tables if they don’t already exist, and then adds the result sets each time it runs so you can track performance over time.

Why was the server slow yesterday afternoon?

When it’s time to query the data back out, you can add the @AsOf parameter with a date & time to see the main result set of alert results as of that moment in time:
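For example, to replay what was happening around 3 PM on a given day (the timestamp here is hypothetical, and the @Output parameters need to match whatever you used when logging):

```sql
EXEC dbo.sp_BlitzFirst
    @OutputDatabaseName = 'DBAtools',
    @OutputSchemaName   = 'dbo',
    @OutputTableName    = 'BlitzFirst',
    @AsOf               = '2017-06-14 15:00';
```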

The query goes 88mph, and then goes back in time

That returns results within 15 minutes of either side of your time so you can see if there were any problems leading up to (or after) that moment. It’s really useful when someone says the server was slow last night.

However, that only shows the basic result set of sp_BlitzFirst® – not the detailed file/Perfmon/wait stats results. For those, you’re best off querying the tables directly for trending purposes.

v20: You Can Query Your File/Perfmon/Wait Statistics Over Time, Too

When you use the @OutputTableNameFileStats parameter (or its PerfmonStats or WaitStats siblings), the latest version of sp_BlitzFirst® automatically creates a matching view, too, with a suffix of _Deltas. For example, if your results are in DBAtools.dbo.WaitStats, then sp_BlitzFirst automatically creates a view called WaitStats_Deltas. Querying that view gives you trending data over time because it automatically matches up consecutive samples for you to produce running averages.
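For instance, to eyeball yesterday afternoon's waits, you could query the view for a time window. This sketch assumes the view lives in DBAtools.dbo and filters on its CheckDate column; other column names can vary by version, so SELECT * is a safe starting point:

```sql
SELECT *
FROM DBAtools.dbo.WaitStats_Deltas
WHERE CheckDate >= '2017-06-14 13:00'
  AND CheckDate <  '2017-06-14 18:00'
ORDER BY CheckDate;
```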

This way, if you run sp_BlitzFirst® in an Agent job every 5 minutes, you can track your SQL Server’s statistics over time. You can query it back out with SSMS, or your favorite reporting tool.

You can grab sp_BlitzFirst® along with all our other scripts in our handy download pack. Enjoy!


4 Comments

  • So I’m checking index stats on a customer’s site; that’s what I do, I tune them. Missing and poor-performing indexes I can handle, but the holy grail for me is converting the index that won’t seek: queries scan the hell out of it, but it won’t seek.

    So far the closest I’ve come was to drop it and look for the missing index info, but that’s not always possible or even a good idea. How about a post on that sort of problem?

    Reply
    • Pete – we’ve actually got a whole entire video course on tuning indexes, plus another one on reviewing execution plans. It’s pretty far beyond what we could cover in a blog post, so click Training at the top of the site and you can check out what’s on offer.

      If you have a more specific question, we might be able to give you a more specific answer – but we just can’t teach you to tune queries and indexes inside a single blog post.

      Reply
  • Hello Brent,

    I would like to set a retention period so that only the last 14 days of data are logged to the DBAtools sp_BlitzFirst table, and then on the 15th day I expect day 1’s data to be deleted automatically. Can it be done? If yes, can you please help me?

    Reply
    • PK – if I were you, I’d set that up as an Agent job that runs just once every day, deleting data older than 15 days in that table. If you’re comfortable coding an Agent job to do that, then go ahead – otherwise, if you’d like help on doing that kind of thing, you may want to hit the readme and go to the Slack channel. Someone there may be willing to help you write that code. Thanks!
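      A minimal sketch of that daily cleanup step, assuming the output table is DBAtools.dbo.BlitzFirst and keeping 15 days of history based on its CheckDate column (adjust the names to match your setup):

      ```sql
      -- Run once a day from an Agent job: purge rows older than 15 days
      DELETE FROM DBAtools.dbo.BlitzFirst
      WHERE CheckDate < DATEADD(DAY, -15, SYSDATETIMEOFFSET());
      ```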

      Reply
