Creating Tables and Stored Procedures in TempDB – Permanently

No, not #tables – actual tables. Here’s how:
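A minimal sketch of the three flavors (the table names here are placeholders, not the originals):

```sql
CREATE TABLE #LocalTemp (ID INT);            -- local temp table: mine, all mine
CREATE TABLE ##GlobalTemp (ID INT);          -- global temp table: visible to everyone
CREATE TABLE tempdb.dbo.UserSpace (ID INT);  -- user-space table created right in TempDB
```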

The first one disappears when my session is over, but the latter two persist until the SQL Server is restarted.

Why would you ever do the latter two? Say you need to share data between sessions, or between different applications, or staging tables for a data warehouse, or just faster tables that live on local SSDs in a cluster (as opposed to slower shared storage), or you wanna build a really crappy caching tier.

If you use global temp tables or user-space tables, though, you have to check for duplicates before creating your tables. Local temp tables are all yours – you can have a thousand users each with a local temp table of the exact same name.

Next up, the ever-so-slightly different magic of temporary stored procedures:
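Again, a sketch with placeholder names (CREATE PROC has to be the first statement in a batch, hence the GOs):

```sql
CREATE PROC #LocalTempProc AS SELECT 1;      -- local temporary stored procedure
GO
CREATE PROC ##GlobalTempProc AS SELECT 1;    -- global temporary stored procedure
GO
USE tempdb;
GO
CREATE PROC dbo.UserSpaceProc AS SELECT 1;   -- user-space proc created directly in TempDB
```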

Here, the first TWO disappear when my session is over, and only the latter one sticks around. Diabolical. So the ## temp stored proc doesn’t really help me here because I can never tell when the creator’s session is going to finish. (Not God. His session keeps right on going.)

So why would you ever create stored procedures – temporary or user – in TempDB? You might not have permissions in the user databases, just might not be technically allowed to change things, or maybe you’ve got monitoring queries that you want to hide, or you want to create procs temporarily to check parameter sniffing issues.

All of the above will disappear when the SQL Server is restarted – or will they? Not if you create them permanently in the model database, which is the source of TempDB’s creation when SQL Server restarts:
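A sketch of seeding model, with placeholder object names:

```sql
USE model;
GO
CREATE TABLE dbo.ImmortalTable (ID INT);
GO
CREATE PROC dbo.ImmortalProc AS SELECT 1;
GO
-- on the next restart, TempDB is rebuilt from model and contains both objects
```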

Why would you ever wanna do this? Well, say you need to make sure that, uh, in case … look, I’m just an idea man. Somebody, somewhere, is looking for a really bad idea. That’s what I’m here for.


37 Comments

  • Very Interesting! But…. as you said, “diabolical” !!

  • great post, I like it

  • I’m pretty sure global temp tables (##) disappear when the last session that references them closes/resets. At least if I create a global temp table in a query window in SSMS via

    CREATE TABLE ##tmp (ID int not null)

    if I close the query window and open a new one, the object is no longer queryable.

    You state that it exists until SQL Server restarts…

    • I was under this impression. I felt a cold hand clutch my heart until I realised that if this *wasn’t* the case, I’d have noticed the second time I tried to create ##a, ##b or ##x (yeah, I’m one of THOSE people).

      I want to play around with it now; I’d like to know how SQL Server keeps track of which SPIDs are attached to which ##tables.

    • Asked and answered; globals seem to disappear once the session that created them has disconnected and all locks are released. Playing around with sessions, transactions and locking hints supports this, and Technet sez:

      “If you create the global temporary table ##employees, any user in the database can work with this table. If no other user works with this table after you create it, the table is deleted when you disconnect. If another user works with the table after you create it, SQL Server deletes it after you disconnect and after all other sessions are no longer actively using it.”

      (I think the second and third instances of the word “user” are misnomers in this paragraph and “session” would be more accurate; also read “you disconnect” as “the session that created the table is disconnected”).

  • Ray Herring
    May 4, 2016 9:16 am

    I guess you could use the permanent table/procedure approach to set up raw data collection for monitoring. Aggregate the data and then ship it to a permanent repository.

    Of course if you put it in Model it will end up in Every new database not just tempDB.

  • Hahaha!

    I’ve written dozens of these posts and not hit publish (the latest one was going to be called “How to modify columns online with WRITEPAGE”)

    Well done.

  • Great article.

    We can create a table or a stored procedure in master database with a prefix of “sp_” to the name of the stored procedure or table and then they are available for use in all databases.

    Do you have a preference to creating permanent tables or stored procedures in master vs model database? Why? Or both ways are not preferred ways?


    • Mirza – generally I’d rather create things in user databases, not system databases, and then fully prefix them when I use them.

      • Don’t the sp_blitz et al. installation scripts install to master? 😉

        Agreed though; I can’t say for sure what happens either way but wouldn’t be surprised to find that nonstandard objects disappear from master when you upgrade the instance, maybe even when you apply a CU or SP.

  • Didn’t see you mention this and while most DBAs know this, some don’t… if you create the stored procedure in model it’ll exist not only in tempdb but in every newly created user database, since all of those are copied from model.

  • Yeah, the problem is that you will have those objects in every user database in addition to TempDB. I think what you want is this, right? 🙂

    That way it’s only in tempdb and added bonus for being extremely obscure.

    USE master;
    GO
    CREATE PROC dbo.usp_tempObjects
    AS
    BEGIN
        EXEC tempdb.sys.sp_executesql N'CREATE PROC dbo.usp_myTempWorker AS
            SELECT * FROM sys.databases;';
        EXEC tempdb.sys.sp_executesql N'CREATE TABLE dbo.myTempTable (ID INT IDENTITY(1,1), Stuffing VARCHAR(100));';
    END
    GO
    EXEC sys.sp_procoption @ProcName = 'usp_tempObjects', @OptionName = 'startup', @OptionValue = 'on';

  • If you want to use a pattern where each of many independent databases is identical in schema, such as when you want each client isolated into their own database, putting objects in model can be a quick shortcut for onboarding new clients: just run ‘CREATE DATABASE newclient’ rather than needing to do additional schema deployments.

    Of course people who do that also tend to be the ones asking what the max number of db per AG is too, but that’s a story for another day.

  • Yes, well, I personally find it annoying when I inherit an instance, create a new database, and find users and objects already in it – added to the model database by someone, for no good reason I can determine.

  • An old version of Great Plains (7.5) did the permanent table in TempDB thing. There are two tables that are used for tracking simultaneous user sessions. Every once in a while something gets stuck and we have to log everyone out and clear those tables to fix it (old team used to reboot SQL Server, until we learned the real issue). Not sure if newer versions or Dynamics GP still does this.

    I just love legacy systems…

  • I know a vendor whose upgrade path is to write the existing database into tempdb, install the new schema and then migrate the data back.

    They didn’t understand how that was a bad idea.

    Suggested the business find a replacement vendor ASAP.

    • Guh. That sounds like one of those things people do for pet programming projects just because they can, like Tetris in Excel via VBA or that T-SQL Minesweeper I played once. Possibly even an example of what you shouldn’t do that somehow made it into production because somebody got the wrong end of the stick.

  • Tim Jennings
    May 5, 2016 10:25 pm

    If there was something I wanted in TempDB but didn’t necessarily want showing up in every new database that might get created, I would set up an Agent job that would run on startup to execute a creation script for me against TempDB.
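    That Agent-job approach might look roughly like this (the job name and command are made up; @freq_type = 64 means the schedule fires when SQL Server Agent starts):

    ```sql
    USE msdb;
    GO
    EXEC dbo.sp_add_job         @job_name = N'Recreate TempDB objects';
    EXEC dbo.sp_add_jobstep     @job_name = N'Recreate TempDB objects',
                                @step_name = N'Create objects',
                                @subsystem = N'TSQL',
                                @database_name = N'tempdb',
                                @command = N'CREATE TABLE dbo.MyScratch (ID INT);';
    EXEC dbo.sp_add_jobschedule @job_name = N'Recreate TempDB objects',
                                @name = N'On Agent startup',
                                @freq_type = 64;  -- run when SQL Server Agent starts
    EXEC dbo.sp_add_jobserver   @job_name = N'Recreate TempDB objects';
    ```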

  • Holly Irick
    May 6, 2016 9:25 am

    It seems to me the best use of this information might be to check your security procedures to see that nobody is doing it! I’m really uncomfortable with the idea of invisible tables; it sure isn’t in my monitoring process.
    How about a startup procedure that deletes tables? Or a polling one that deletes tables not belonging to sessions?

  • BCD Featherstone
    May 6, 2016 10:22 am

    And another fine use for purposely creating objects in tempDB…”Securing” code (OMG)

    The cycle:
    User wants secret squirrel stuff from app.
    App creates object in tempDB that accesses only user database(s)
    App use the object.
    App blows off the object.

    No one can read the code in the user database, (unless they know to & how to monitor tempDB).

    Of course, this requires that the vendor app has elevated access (that I hate), but the woes of security incidents are not on the vendor’s plate.

    Yes, I have dealt with this in the wild many times. Whee!!!

  • Jonathan Shields
    May 29, 2016 2:05 pm

    Interesting idea. Seems a bit like “yeah, you can do it, but should you?” I guess maybe if you wanted to collect sp_WhoIsActive data on a schedule or something, you could use it, as long as the process managed the size of the collected data – would that be a good idea?

    • Jonathan – in order to know if something would be a good idea, you have to ask, “A good idea for what? What’s the problem you’re trying to solve?”

  • Jonathan Shields
    May 31, 2016 4:29 pm

    Agreed. I guess my reticence kind of comes from fear of lots of people littering up TempDB on one of my instances with rubbish they never removed, but that’s no reason not to do it, I guess, as long as the data is being managed.
    In my admittedly vague example, I was thinking it could be used to collect some useful data on active processes over time, to potentially help with tuning.

  • Hi, great post. How long have people been able to create user-space tables in TempDB? I have not seen people do that before.

    • By default, a user can only create local (#tablename) or global (##tablename) tables in tempdb. To create user-space tables, they would need CREATE TABLE permission in tempdb. I only know this offhand because I worked on a system where session data was stored in user-space tempdb tables (and had to be periodically cleaned out unless the instance was regularly restarted).

      To automatically grant CREATE TABLE in tempdb, either that needs to be done in model (as pointed out above, a very bad idea because every new database gets that) or in a SQL statement as part of a “startup” stored procedure. Kids, don’t try this at home!
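      A rough sketch of that startup-proc approach (the proc name and [AppUser] are hypothetical, and the user would also need to exist in tempdb after each rebuild):

      ```sql
      USE master;
      GO
      CREATE PROC dbo.usp_GrantTempdbCreate   -- hypothetical name
      AS
          EXEC tempdb.sys.sp_executesql
              N'GRANT CREATE TABLE TO [AppUser];             -- hypothetical user
                ALTER ROLE db_ddladmin ADD MEMBER [AppUser]; -- covers schema rights';
      GO
      EXEC sys.sp_procoption 'dbo.usp_GrantTempdbCreate', 'startup', 'on';
      ```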

  • I have flirted with using a similar warped approach for the following twisted reason, and after reading all this I’m not only more convinced it could work for this situation, but fear that I might actually go down this twisted path and really loathe myself later.

    The problem was a legacy client-developed system that used extensive custom logging, via a short series of stored procedures that logged into user tables in the source system as part of an ETL process. The client wanted to scale by using a 2016 Always On AG read-only secondary as the source database, but also to have the ability to run against the primary depending on the failover situation. The thought had been to shift the logging tables into TempDB, so the tables would always be write-enabled regardless of the node it ran on, primary or read-only secondary. Permanent logging could be attained via a standalone process that would scrape content from TempDB and record it in the current read-write primary, regardless of what node it ran on at any given moment. Insane, yes, but a scenario where you might do such a crazy thing… just because you could. Does that answer “should”?

    • If the logging is only singleton inserts, you’re probably better off putting the logging table in a user database that isn’t in the AG (we’ll call this table NonAGDB.dbo.Log), and then replacing the original logging table (inside the AG database, call it AGDB) with a view pointing to NonAGDB.dbo.Log. As long as all the replicas have a database named NonAGDB, the logging will still work. You can still back up that database, merge contents across servers with stuff like SSIS, etc, and it’ll buy you persistence without worrying about TempDB.
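      A rough sketch of that view swap (all object and column names here are hypothetical):

      ```sql
      USE NonAGDB;   -- a database outside the AG, present on every replica
      GO
      CREATE TABLE dbo.Log (
          LogID    INT IDENTITY(1,1) PRIMARY KEY,
          LoggedAt DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
          Msg      NVARCHAR(4000));
      GO
      USE AGDB;      -- the AG database; run this on the primary
      GO
      -- after dropping or renaming the original dbo.Log table:
      CREATE VIEW dbo.Log AS
          SELECT LogID, LoggedAt, Msg FROM NonAGDB.dbo.Log;
      ```

      Singleton INSERTs through the view land in NonAGDB.dbo.Log on whichever replica the session is connected to.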

  • Kevin Farlee
    April 10, 2019 9:26 am

    You do understand the impact goes beyond TempDB, right? Yes, model is used to re-create TempDB every time the instance restarts, that part lines up.
    But, model is also used to create any new database. That means if you create user objects in model, EVERY user database is going to have them as soon as you create them.
    That may be useful for some purposes of standardization etc. but don’t be surprised by it…

  • Don’t you find that thinking like a crazy person from time to time helps you figure out what was motivating the people who implemented some crazy solution you are now stuck supporting – for no good reason other than that people are relying on it being there now and have no clue why it is the way it is? It simply is just the way it is. I worked with another consultant a few years back – he was truly crazy – who also happened to be a consultant for the FBI as a profiler and worked on the Unabomber case. You sometimes need a crazy to understand another crazy. Don’t discount the value of a bad idea – just be careful to prevent yourself from committing them in public.

  • I’ve dealt with an ERP system for oil and gas that created semi-permanent ## stored procs. It was super dumb.

