We’ve already covered how to log activity using sp_WhoIsActive, but I wanted to share how I’ve seen it done in the wild. The difference between the two methods is that this version logs everything to the same table and is for continuous activity logging (scheduled job), whereas the other version uses a new table each time it logs activity and is for when you are working on a problem.
WHAT IS IT?
If you don’t know about sp_WhoIsActive already, you’ve been living under a rock. It’s just the most awesome, free SQL Server tool there ever was. Adam Machanic, the author of sp_WhoIsActive, says it “is a comprehensive DMV-based monitoring script, designed to tell you at a glance what processes are active on your SQL Server and what they’re up to.”
I’m a fan of using free tools that can help me do my job more effectively. I’ve used sp_WhoIsActive even when I’ve also had paid tools monitoring production.
You can download sp_WhoIsActive here.
Create a new job and plop the below code into the job step, modifying the first 3 variables as needed. The code will create the logging table if it doesn’t exist, create the clustered index if it doesn’t exist, log current activity, and purge older data based on the @retention variable.
How often should you collect activity? I think collecting sp_WhoIsActive data every 30-60 seconds is a good balance between logging enough activity to troubleshoot production problems and the storage needed to keep the data in a very busy environment.
SET NOCOUNT ON;

DECLARE @retention INT = 7,
    @destination_table VARCHAR(500) = 'WhoIsActive',
    @destination_database sysname = 'Crap',
    @schema VARCHAR(4000),
    @SQL NVARCHAR(4000),
    @parameters NVARCHAR(500),
    @exists BIT;

SET @destination_table = @destination_database + '.dbo.' + @destination_table;

--create the logging table
IF OBJECT_ID(@destination_table) IS NULL
BEGIN;
    EXEC dbo.sp_WhoIsActive @get_transaction_info = 1,
        @get_outer_command = 1,
        @get_plans = 1,
        @return_schema = 1,
        @schema = @schema OUTPUT;

    SET @schema = REPLACE(@schema, '<table_name>', @destination_table);

    EXEC ( @schema );
END;

--create index on collection_time
SET @SQL = 'USE ' + QUOTENAME(@destination_database)
    + '; IF NOT EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(@destination_table) AND name = N''cx_collection_time'') SET @exists = 0';

SET @parameters = N'@destination_table VARCHAR(500), @exists BIT OUTPUT';

EXEC sys.sp_executesql @SQL, @parameters, @destination_table = @destination_table, @exists = @exists OUTPUT;

IF @exists = 0
BEGIN;
    SET @SQL = 'CREATE CLUSTERED INDEX cx_collection_time ON ' + @destination_table + '(collection_time ASC)';
    EXEC ( @SQL );
END;

--collect activity into logging table
EXEC dbo.sp_WhoIsActive @get_transaction_info = 1,
    @get_outer_command = 1,
    @get_plans = 1,
    @destination_table = @destination_table;

--purge older data
SET @SQL = 'DELETE FROM ' + @destination_table + ' WHERE collection_time < DATEADD(day, -' + CAST(@retention AS VARCHAR(10)) + ', GETDATE());';

EXEC ( @SQL );
You could also break the DELETE out into its own job so that it runs once a day. If you do this, loop through the DELETEs in batches rather than all at once.
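A minimal sketch of such a batched purge, assuming the Crap.dbo.WhoIsActive table name used in the script above and the same 7-day retention:

```sql
-- Sketch of a batched daily purge. Assumes the logging table from the
-- job script above (Crap.dbo.WhoIsActive); adjust names and batch size
-- to taste.
DECLARE @retention INT = 7,
        @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000)
    FROM Crap.dbo.WhoIsActive
    WHERE collection_time < DATEADD(DAY, -@retention, GETDATE());

    SET @rows = @@ROWCOUNT; -- loop ends once a pass deletes nothing
END;
```

Deleting in small batches keeps each transaction short, which is friendlier to the transaction log and to anyone else querying the table.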
This is just a starter query. I generally am looking at a specific time period based on a complaint from a user that queries were running slow earlier in the day. I once used the WhoIsActive logging table to identify who was causing tempdb to grow absurdly big. By the time I got to it, tempdb was 500GB in size! I switched my query to include tempdb_allocations and tempdb_current plus ORDER BY tempdb_allocations DESC, and it was immediately clear a business user was querying ALL.THE.THINGS.
SELECT TOP 1000 CPU, reads, collection_time, start_time, [dd hh:mm:ss.mss] AS 'run duration', [program_name], login_name, database_name, session_id, blocking_session_id, wait_info, sql_text, *
FROM Crap.dbo.WhoIsActive
WHERE collection_time BETWEEN '2016-07-20 07:55:00.000' AND '2016-07-20 09:00:00.000'
AND login_name NOT IN ('DomainName\sqlservice')
--AND CAST(sql_text AS varchar(max)) LIKE '%some query%'
ORDER BY 1 DESC
I have my own version of this here that also includes pruning of old data, etc. It doesn’t use sp_WhoIsActive, but it uses the same underlying DMVs.
FYI: Your site looks odd when viewed from Chrome. The post occupies only 1/3 of the browser window, whereas the menu occupies 2/3 of it.
Thanks Tara – fixed now.
Tara, I’d like a way to be able to gather a list of all usernames **and** applications that attach to a database over time. This data can be found for historic connections via something like:
select * from
::fn_trace_gettable('D:\Apps\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Log\log_589.trc', 0) x
INNER JOIN sys.trace_events e ON eventclass = trace_event_id
…however, the SPID column within this corresponds to the process id on the connecting client machine, so to find out the corresponding application, you need to read the running application SPIDs from the desktop client, which is obviously not a particularly easy thing to do.
Are you aware of any existing implementations of this idea?
The only way I’ve been able to figure this out is by requiring that all applications specify the “Application Name” parameter in the connection string. If they don’t specify it, you are going to see something generic, which is not helpful.
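For example, in a typical ADO.NET-style connection string (the server, database, and application names here are made up):

```
Server=ProdSql01;Database=Payroll;Integrated Security=SSPI;Application Name=PayrollImportService;
```

Whatever value the application supplies there is what shows up as program_name in a trace and in sys.dm_exec_sessions; without it you get the driver's generic default.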
The SPID column in a trace is not the process id on the client machine. It is the process id of the connection within SQL Server. But maybe I am not understanding what you mean.
> The SPID column in a trace is not the process id on the client machine. It is the process id of the connection within SQL Server.
That would certainly make more sense, but I swear when I checked to confirm what I thought it was, that it was indeed the client spid – will have to double check that I guess.
Thanks for a great article, just one question
Why is the first column a varchar(8000) and not a time?
“Why is the first column a varchar(8000) and not a time?”
Because it’s a formatted column designed for using SSMS as a GUI. (Why 8000? Probably an error on my part.)
My recommendation for data capture is to use the @format_output = 0 switch, which outputs all of the numbers as actual numbers rather than text.
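Following that suggestion, the collection step in the job above would just add the switch. A sketch using the same options as the script in the post:

```sql
-- Adam's suggested switch for data capture: @format_output = 0 keeps
-- the numeric columns as actual numbers instead of text formatted for
-- display in SSMS.
EXEC dbo.sp_WhoIsActive @get_transaction_info = 1,
    @get_outer_command = 1,
    @get_plans = 1,
    @format_output = 0,
    @destination_table = @destination_table;
```

Note that the table schema generated by @return_schema changes along with @format_output, so pick one setting and stick with it for a given logging table.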
Great job, Tara!
Oh and why is it not the TIME data type? Well, two reasons: A) The proc in its current state is designed to be backward compatible to SQL Server 2005, and there was no TIME data type back then. And B) I don’t think I can fit the information into the TIME data type; what if a request has been running for over 24 hours?
Do you have a copy of sp_WhoIsActive with all the Azure-related stuff (msdb etc.) commented out? The last one you posted (sp_whoisactive_v11_112.sql) still has it in there. I’m at a new job and really want to get it working in Azure. I am trying to install it on the user database, not master.
Gail — for WhoIsActive support, head to the website.
Thank you but those ones are not for azure. I am unable to get the one that is for azure to work.
Right, but that’s where you should go for support — our blog comments aren’t a good place for that.
Thanks for understanding
Does anyone have a copy of sp_WhoIsActive with all the Azure-related stuff (msdb etc.) commented out? The last one Adam posted (sp_whoisactive_v11_112.sql) still has it in there. I’m at a new job and really want to get it working in Azure.
From the result of sp_whoisactive, is there a way to convert the physical reads to MB?
Sure, with your ol’ archenemy MATH, muhahaha. Pages are 8KB, so you can do some division to get the numbers you’re looking for.
Thanks… but how many pages does SQL Server read in one physical read? I mean, if the sp_whoisactive value of [physical_reads] is 12345, how many pages is that? I’d multiply it by the number of reads and then by 8KB… I think there is one parameter missing in this equation, unless each read is equivalent to one page, and I don’t think it is. Or is it?
One read is one page. 12345 reads is 12345*8kb. In MB that’s 96.44.
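In other words, the conversion is just page count times 8KB, then divide by 1024 to land in MB:

```sql
-- Each physical read is one 8KB page; divide by 1024.0 to get MB.
SELECT 12345 * 8 / 1024.0 AS physical_reads_mb; -- 96.4453125
```

The same arithmetic applies to logical reads and writes, since those counters are also in pages.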
There are a lot of &gt; &lt; in this, is this a browser thing? Which browser should I use? I’m using Chrome. Thanks
e.g. SET @schema = REPLACE(@schema, '<table_name>', @destination_table);
That’s weird – when I copy it out it says
SET @schema = REPLACE(@schema, '&lt;table_name&gt;', @destination_table);
Replace &lt; with < and &gt; with >.
Getting this error:
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near '<'.
Vishal — try now. Our old code plugin is kind of a pain sometimes.
While running it on SQL Server 2012, I'm getting the below error:
Invalid object name ‘sys.dm_exec_query_statistics_xml’.
Yep, it’s 2016+ only.
It’s funny how many times I’ve set these captures up over the years, and how many times I have to figure it all out again. Thanks for keeping the blog and posts around Brent!
Sebron – you’re welcome!
Hi Tara, thanks for sharing this tool. I’m not able to get rid of the error ”Column name ‘collection_time’ does not exist in the target table or view.” when the script runs, using sp_WhoIsActive version 11.32 on SQL Server 2012. I was wondering if you know a fix for this issue. Thanks
I have not seen that issue. I’ve got a client running version 11.32 and running the script in the blog post every minute. No issues with the job. Perhaps check that the table is getting created properly – sounds like you’re missing collection_time – though the script should work as is. Perhaps you changed the script a little? All that you need to change is the values for the variables – nothing else.
how to track DML changes using sp_whoisactive?
Gladys – you can’t do that, unfortunately. For tracking DML changes, you’re better off considering a trigger like this:
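A minimal sketch of the audit-trigger idea, where the table, columns, and trigger name are all hypothetical placeholders:

```sql
-- Hypothetical example: capture DML against dbo.Orders into an audit
-- table. The inserted/deleted pseudo-tables tell you what changed.
CREATE TRIGGER dbo.trg_Orders_Audit
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.Orders_Audit (OrderID, ChangeType, ChangedAt, ChangedBy)
    SELECT COALESCE(i.OrderID, d.OrderID),
           CASE WHEN i.OrderID IS NOT NULL AND d.OrderID IS NOT NULL THEN 'UPDATE'
                WHEN i.OrderID IS NOT NULL THEN 'INSERT'
                ELSE 'DELETE'
           END,
           SYSDATETIME(),
           SUSER_SNAME()
    FROM inserted AS i
    FULL OUTER JOIN deleted AS d ON i.OrderID = d.OrderID;
END;
```

On newer versions, built-in features like Change Data Capture or temporal tables can do this without hand-written triggers.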
Hey, it seems the columns in the second query are referenced incorrectly. In the script, start_time is incorrect since it should be starttime.
No, start_time is correct. I double checked it just now in the latest sp_WhoIsActive in case Adam changed the column name. It’s still start_time. Adam’s naming convention is lower case and underscores to separate words. I don’t see a single starttime in sp_WhoIsActive when I use CTRL+F.
I am able to log the stored procedure names in a table using WhoIsActive successfully, but I'm not able to log the parameter values along with the procedure names. Is that possible?
Nauman – for questions, head to a Q&A site like https://dba.stackexchange.com.
Each time I run this, it fails with errors in the job log – Executed as user: NT SERVICE\SQLSERVERAGENT. Warning: Null value is eliminated by an aggregate or other SET operation. [SQLSTATE 01003] (Message 8153) Column name or number of supplied values does not match table definition. [SQLSTATE 21S01] (Error 213) Warning: The join order has been enforced because a local join hint is used. [SQLSTATE 01000] (Error 8625) Warning: The join order has been enforced because a local join hint is used. [SQLSTATE 01000] (Error 8625). The step failed.
Can you please guide me on the following:
1. Structure of the table “WhoIsActive” with defined data types
2. Can the name “WhoIsActive” be reused in the statement on line 21 – SET @schema = REPLACE(@schema, '<table_name>', @destination_table);
3. If 2 is not possible, please give the table structure for the table to be created and used in line 21
Philip, I believe that error is coming from sp_WhoIsActive. Get a newer version of it if yours is outdated; otherwise, get a slightly older version if you are on the latest. For the structure of the table, just run the script and it’ll create the table. You can check the columns and data types yourself at that point. You don’t need to do anything with line 21 or any of the lines except lines 3 through 6, which you can change to whatever you want.
Thanks, Tara, for your quick response. I did not have WhoIsActive before now – today is my first experience with it. I have downloaded the original script and deployed the proc in the master database as specified by Adam Machanic. But that was the stored procedure, not the table, and I want to put the entire process into a SQL Agent job and collect data every 30 seconds. So if lines 3-6 are the only changes to make: my retention is 15 days in line 3, line 4 is WhoIsActive, and line 5 is master per sysname. I left line 21 alone as you advised, but the job failed and says the WhoIsActive object does not exist… It seems I have to create the table; it’s not created automatically as suggested. Or what am I still doing wrong?
Lines 14-23 are what create the table. See Adam’s documentation here for info on the @schema option to help you troubleshoot this: http://whoisactive.com/docs/25_capturing/. Last thing I’d suggest is to run lines 1-23 separately in a query window and see if the table gets created. If that doesn’t fix it, you’ll need to troubleshoot it further, as it’s very hard to troubleshoot through blog post comments.