#TSQL2sday: How Much Plan Cache History Do You Have?

For this month’s T-SQL Tuesday, Bert Wagner asked, “What code have you written that you would hate to live without?”

I bet you think I’m going to say sp_Blitz, and I would indeed hate to live without that, but you already know about that one, dear reader. Lemme show you something else cool.

Before I run sp_BlitzCache to analyze which queries have been the most resource-intensive, I wanna know, “How much query plan history does SQL Server have available?” Looking at the top 10 plans in the cache doesn’t do me much good if:

  • The server restarted recently
  • Somebody ran DBCC FREEPROCCACHE
  • Somebody rebuilt a ton of indexes, flushing those tables’ plans out
  • Memory pressure forced SQL Server to flush plans out of the cache

I could go on and on – there are so many things that give SQL Server amnesia. So here’s how I check:
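
The real query is in the Gist linked at the end of this post – what follows is just a minimal sketch of the idea, bucketing cached plans by the hour they were compiled today, and by whole day for anything older:

```sql
/* Minimal sketch: how much plan cache history do we have?
   Today's plans get bucketed by hour; older plans get bucketed by whole day. */
DECLARE @today date = CONVERT(date, GETDATE());

SELECT  CONVERT(date, qs.creation_time) AS plan_created_date,
        CASE WHEN qs.creation_time >= @today
             THEN DATEPART(HOUR, qs.creation_time)
        END                             AS plan_created_hour,  /* NULL for prior days */
        COUNT(*)                        AS cached_plans
FROM sys.dm_exec_query_stats AS qs
GROUP BY CONVERT(date, qs.creation_time),
         CASE WHEN qs.creation_time >= @today
              THEN DATEPART(HOUR, qs.creation_time)
         END
ORDER BY plan_created_date DESC, plan_created_hour DESC;
```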

Here are the results on a server that just restarted:

(Screenshot: freshly restarted server)

That means:

  • We have 53 cached plans that were created in the 6AM hour today
  • We have 412 that were created in the 5AM hour

In that environment, sp_BlitzCache is gonna be much less useful. However, let’s look at data from June 27th for a stable server that’s been up for a long time – shown in Excel because we have a tool that automatically grabs query data for us and puts it in spreadsheets:

(Screenshot: plans by date and time)

The query breaks up results into hours for today, but then older than today, it just groups stuff into whole days. (After all, I don’t care what hour a query was compiled on two days ago.) On that server, I can be pretty confident when I query the plan cache – I’ve got a lot of history. There are still new queries coming in, and I may be facing some non-parameterized one-off issues, but it’s not that bad.

This one is a little more curious – it was collected on June 25th:

(Screenshot: something happened today in the 14:00 hour)

We do have valid query plans going all the way back to June 22 – meaning, no one has run FREEPROCCACHE – but there are only about 1,000 plans from prior to the current date (June 25). The vast majority of cached plans are from today, and furthermore, there were over ten thousand new plans generated in the 14:00 hour!

This might indicate that:

  • Someone rebuilt a ton of indexes around noon-13:00
  • We have an application that isn’t parameterizing its queries and is sending in a flood of single-use strings, so SQL Server is flushing them out of memory (and no, Optimize for Ad Hoc doesn’t fix that) – a quick check for this is sketched below
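
If you suspect the second one, here’s a rough sketch (not the Gist query) that totals up single-use plans by object type – a big pile of single-use Adhoc plans points at a non-parameterized workload:

```sql
/* Rough sketch: how much of the plan cache is single-use plans?
   Lots of single-use 'Adhoc' plans suggests a non-parameterized workload. */
SELECT  cp.objtype,
        COUNT(*)                                          AS total_plans,
        SUM(CASE WHEN cp.usecounts = 1 THEN 1 ELSE 0 END) AS single_use_plans,
        SUM(CAST(cp.size_in_bytes AS bigint)) / 1048576   AS cache_mb
FROM sys.dm_exec_cached_plans AS cp
GROUP BY cp.objtype
ORDER BY single_use_plans DESC;
```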

This query doesn’t give me the answers, mind you – it just helps me ask better questions, faster. Hope it helps you too! You’re free to use that for any purpose – here’s a GitHub Gist with the code & license.


Comments

  • Brent, the plan cache is all very fine, but what about Query Store? It holds even most (all?) flushed plans, and I’ve been surprised or horrified or impressed or something by how many versions of every darned statement it seems to hold.
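
    For comparison, a minimal sketch of the same kind of bucketing against Query Store (assuming Query Store is enabled on the database):

    ```sql
    /* Sketch: how far back does Query Store's plan history go? */
    SELECT  CONVERT(date, p.initial_compile_start_time) AS first_compiled_date,
            COUNT(*)                                    AS stored_plans
    FROM sys.query_store_plan AS p
    GROUP BY CONVERT(date, p.initial_compile_start_time)
    ORDER BY first_compiled_date DESC;
    ```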

  • danielle.paquette-harvey
    November 17, 2020 12:39 pm

    Wow! Discovered that every query in our software has this issue: “The .NET app calls .Parameters.Add without setting the parameter size”. Just instructed the developers to change the way they add parameters to queries, but it’s gonna take a long time before it goes into production… Hopefully my plan cache is gonna get better one day 😉

  • Hi Brent,
    in my scenario, over 50% of the plans were created in the last 2hrs…

    Work in progress on these two points:
    – The developers aren’t parameterizing their queries
    – The .NET app calls .Parameters.Add without setting the parameter size

    I also noticed many statistics being updated several times a day (50+ times). It’s auto-update stats, because no index rebuilds or stats updates are scheduled during the day – a quick way to check is sketched after the steps below.
    So what’s happening is:
    1- update statistics runs and updates stats
    2- some plans flushed out
    3- new plans created
    4- GOTO 1
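
    A minimal sketch of that check (assuming SQL Server 2012 SP1 or later for sys.dm_db_stats_properties):

    ```sql
    /* Sketch: which statistics were updated most recently, and how many row
       modifications have piled up since? Helps confirm auto-update stats churn. */
    SELECT  OBJECT_NAME(s.object_id) AS table_name,
            s.name                   AS stats_name,
            sp.last_updated,
            sp.rows,
            sp.modification_counter
    FROM sys.stats AS s
    CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
    WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
    ORDER BY sp.last_updated DESC;
    ```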

    Would you suggest any action to mitigate this situation?

    Reply
