tl;dr — Not necessarily.
One of our students asked me a great question: if you update statistics on every table in the database, is that effectively the same as dumping the procedure cache on the instance? Will every execution plan have to be recompiled the next time it runs? It spurred an interesting discussion about maintenance and recompilation.
Updating statistics can certainly cause recompilation. But based on my observations, it won’t necessarily cause every execution plan to recompile.
To test this out, I’m going to use a free tool that Jeremiah and I wrote, sp_BlitzTrace™, to trace activity in Extended Events and show me when a recompile occurs.
Our Test Setup: Users Table with No Nonclustered Indexes
I’m running a simple, non-optimized query against a restored copy of the StackOverflow database on SQL Server 2014. To keep things simple, I have left the dbo.Users table without any non-clustered indexes. The auto-create statistics option is on with the default synchronous setting. I cleared the procedure cache before I started my test.
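If you want to reproduce the setup, the pre-flight steps look something like this. (I'm assuming the restored database is simply named StackOverflow; adjust to match yours.)

```sql
-- Confirm auto-create statistics is on, and that auto-update
-- is using the default synchronous setting
SELECT name, is_auto_create_stats_on, is_auto_update_stats_async_on
FROM sys.databases
WHERE name = 'StackOverflow';

-- Clear the procedure cache so every query starts with a fresh compile.
-- Don't run this on a busy production server!
DBCC FREEPROCCACHE;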
sp_BlitzTrace™ starts an Extended Events trace against my own session. Then I run my (somewhat troubled) query and stop the trace. Then I read the results.
exec sp_BlitzTrace @Action='start', @TargetPath='S:\XEvents\Traces\', @SessionId=@@SPID, @TraceRecompiles=1;
SELECT DisplayName, Reputation
FROM dbo.Users
WHERE DisplayName LIKE 'Kendra%'
ORDER BY Reputation DESC;
exec sp_BlitzTrace @Action='stop';
exec sp_BlitzTrace @Action='read';
I Find the Statistic To Update and Get Out My Sledgehammer
I find the statistics on the table by running sp_helpstats. This shows that I have a column statistic on the DisplayName column. I copy its name to my clipboard. That's the statistic I'm going to update, because DisplayName is in the WHERE clause of my query.
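For reference, sp_helpstats takes the table name plus a scope, and passing 'ALL' returns column statistics as well as index statistics:

```sql
-- List all statistics on dbo.Users, including auto-created
-- column stats (their names start with _WA_Sys_)
EXEC sp_helpstats @objname = 'dbo.Users', @results = 'ALL';
```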
I update the statistics with FULLSCAN with this code:
UPDATE STATISTICS dbo.Users _WA_Sys_00000005_1D7B6025 WITH FULLSCAN;
I’m not a fan of updating all your stats with FULLSCAN. I just got out the sledgehammer to demonstrate whether it would cause a recompile.
… And I Don’t See A Recompile
After the statistics update, I run my test query block again (the same one above) with sp_BlitzTrace™. I do NOT see a row returned showing a recompile event:
I can run the query multiple times, and the trace never shows a recompile event.
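To double-check that the plan really survived, you can also peek at the plan cache directly. This query is just a sketch; adjust the text filter to match your own query:

```sql
-- Look for our cached plan and its use count.
-- If the plan had been invalidated, usecounts would reset
-- after the next execution instead of climbing.
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%Kendra%';
```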
Hmm. Maybe the stats didn’t actually update? We can check. I query sys.dm_db_stats_properties, and it shows that this statistic DID update:
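Here's roughly the query I used. Joining sys.stats to the sys.dm_db_stats_properties function gets you the last updated time and modification counter for every statistic on the table:

```sql
-- Check last_updated, rows sampled, and the modification counter
-- for every statistic on dbo.Users
SELECT s.name, sp.last_updated, sp.rows, sp.rows_sampled, sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.Users');
```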
… Let’s Fake It Out and Make a Modification
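The trick is to make SQL Server believe the data changed without actually changing it: insert rows inside a transaction, roll it back, then update the statistic again. The rollback undoes the data change, but the modification counter on the statistics still ticks up. A sketch of the pattern (the column list here is an assumption, pick columns that work for your copy of the table):

```sql
-- "Fake" a modification: the rollback undoes the inserts,
-- but the statistics' modification counter still increases
BEGIN TRAN;
INSERT dbo.Users (DisplayName, Reputation)
SELECT TOP (1000) DisplayName, Reputation
FROM dbo.Users;
ROLLBACK;

-- Now update the statistic again
UPDATE STATISTICS dbo.Users _WA_Sys_00000005_1D7B6025 WITH FULLSCAN;
```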
There’s Our Recompile!
Following the “fake insert” and statistics update, I run my query again and the Extended Events trace shows a sql_statement_recompile event. The cause is statistics change:
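Under the hood, sp_BlitzTrace™ is watching the sql_statement_recompile event. If you want to trace it yourself without the tool, a minimal Extended Events session looks something like this (the session name is my own choice, and you'd swap in your own session_id and file path):

```sql
CREATE EVENT SESSION [WatchRecompiles] ON SERVER
ADD EVENT sqlserver.sql_statement_recompile
(
    ACTION (sqlserver.sql_text, sqlserver.session_id)
    WHERE (sqlserver.session_id = 53)  -- filter to your own session id
)
ADD TARGET package0.event_file
    (SET filename = N'S:\XEvents\Traces\WatchRecompiles.xel');

ALTER EVENT SESSION [WatchRecompiles] ON SERVER STATE = START;
```

The recompile_cause column in the event data tells you why the recompile happened, such as a statistics change or a schema change.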
Finding: Statistics Update Alone Didn’t Cause a Recompile
SQL Server was smart enough to check if the data had changed. Updating statistics alone doesn’t always invalidate execution plans.
What Does This All Mean?
Mostly, I think it just means that SQL Server is pretty clever about when to do recompiles, at least in SQL Server 2014, since that’s where I ran this test. But don’t take this as an excuse to be lazy. If you’re using @UpdateStatistics='Y' in Ola Hallengren’s maintenance solution, I’m a fan of also setting @OnlyModifiedStatistics='Y': why even update a statistic if it looks like nothing has changed?
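For reference, that combination in Ola's IndexOptimize looks something like this (the database scope is just an example):

```sql
-- Update all statistics, but skip any statistic whose
-- modification counter shows no changes since the last update
EXECUTE dbo.IndexOptimize
    @Databases = 'USER_DATABASES',
    @UpdateStatistics = 'ALL',
    @OnlyModifiedStatistics = 'Y';
```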