Creating Tables and Stored Procedures in TempDB – Permanently

No, not #tables – actual tables. Here’s how:
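
Roughly like this, using made-up table names for a local temp table, a global temp table, and a plain user table created right in tempdb:

USE tempdb;
GO
CREATE TABLE #NotMuchOfATable (ID INT);    -- local temp table
CREATE TABLE ##NotMuchOfATable (ID INT);   -- global temp table
CREATE TABLE dbo.NotMuchOfATable (ID INT); -- plain user table, created in tempdb
GO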

The first one disappears when my session is over, but the latter two persist until the SQL Server is restarted.

Why would you ever do the latter two? Say you need to share data between sessions, or between different applications, or staging tables for a data warehouse, or just faster tables that live on local SSDs in a cluster (as opposed to slower shared storage), or you wanna build a really crappy caching tier.

If you use global temp tables or user-space tables, though, you have to check for duplicates before creating your tables. Local temp tables are just all yours, and you can have a thousand users with the exact same-name local temp tables.

Next up, the ever-so-slightly different magic of temporary stored procedures:
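
Another rough sketch with placeholder names, creating a local temp procedure, a global temp procedure, and a plain procedure in tempdb:

USE tempdb;
GO
CREATE PROC #TinyProc AS SELECT 1 AS Whatever;    -- local temporary stored procedure
GO
CREATE PROC ##TinyProc AS SELECT 1 AS Whatever;   -- global temporary stored procedure
GO
CREATE PROC dbo.TinyProc AS SELECT 1 AS Whatever; -- plain stored procedure, created in tempdb
GO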

Here, the first TWO disappear when my session is over, and only the latter one sticks around. Diabolical. So the ## temp stored proc doesn’t really help me here because I can never tell when the creator’s session is going to finish. (Not God. His session keeps right on going.)

So why would you ever create stored procedures – temporary or user – in TempDB? You might not have permissions in the user databases, just might not be technically allowed to change things, or maybe you’ve got monitoring queries that you want to hide, or you want to create procs temporarily to check parameter sniffing issues.

All of the above will disappear when the SQL Server is restarted – or will they? Not if you create them permanently in the model database, which is the source of TempDB’s creation when SQL Server restarts:
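
A rough sketch, again with placeholder names, putting a table and a proc in model so they get copied into TempDB at every restart:

USE model;
GO
CREATE TABLE dbo.NotMuchOfATable (ID INT);        -- shows up in tempdb after every restart
GO
CREATE PROC dbo.TinyProc AS SELECT 1 AS Whatever; -- ditto
GO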

Why would you ever wanna do this? Well, say you need to make sure that, uh, in case … look, I’m just an idea man. Somebody, somewhere, is looking for a really bad idea. That’s what I’m here for.

Want to learn more? Take my class.

My Fundamentals of TempDB class is for curious folks who want to learn:

  • What uses TempDB, and how that affects performance: not just temp tables and table variables, but also triggers, cursors, sorting indexes, workspace spills, the version store, and more
  • How to host TempDB depending on how you use it: understanding what goes in the log file, the data files, why the number of data files matters, and whether your particular workload should be on SAN, local storage, or ephemeral storage, and why
  • How to monitor it: after you understand what uses it and how to host it, you need to keep tabs on how its performance changes over time using DMVs, Perfmon counters, and the First Responder Kit scripts

Learn more and register now.


46 Comments

  • Very Interesting! But…. as you said, “diabolical” !!

  • great post, I like it

  • I’m pretty sure global temp tables (##) disappear when the last session that references them closes/resets. At least if I create a global temp table in a query window in SSMS via

    CREATE TABLE ##tmp (ID int not null)

    if I close the query window and open a new one, the object is no longer queryable.

    You state that it exists until SQL Server restarts…

    • I was under this impression. I felt a cold hand clutch my heart until I realised that if this *wasn’t* the case, I’d have noticed the second time I tried to create ##a, ##b or ##x (yeah, I’m one of THOSE people).

      I want to play around with it now; I’d like to know how SQL Server keeps track of which SPIDs are attached to which ##tables.

    • Asked and answered; globals seem to disappear once the session that created them has disconnected and all locks are released. Playing around with sessions, transactions and locking hints supports this, and Technet sez:

      “If you create the global temporary table ##employees, any user in the database can work with this table. If no other user works with this table after you create it, the table is deleted when you disconnect. If another user works with the table after you create it, SQL Server deletes it after you disconnect and after all other sessions are no longer actively using it.”

      (I think the second and third instances of the word “user” are misnomers in this paragraph and “session” would be more accurate; also read “you disconnect” as “the session that created the table is disconnected”).

  • Ray Herring
    May 4, 2016 9:16 am

    I guess you could use the permanent table/procedure approach to set up raw data collection for monitoring. Aggregate the data and then ship it to a permanent repository.

    Of course, if you put it in Model it will end up in every new database, not just tempDB.

  • Hahaha!

    I’ve written dozens of these posts and not hit publish (the latest one was going to be called “How to modify columns online with WRITEPAGE”)

    Well done.

  • Great article.

    We can create a table or a stored procedure in the master database with an “sp_” prefix on the name, and then it’s available for use in all databases.

    Do you have a preference between creating permanent tables or stored procedures in master vs. the model database? Why? Or are both not preferred?

    Thanks.

    • Mirza – generally I’d rather create things in user databases, not system databases, and then fully prefix them when I use them.

      • Don’t the sp_blitz et al. installation scripts install to master? 😉

        Agreed though; I can’t say for sure what happens either way but wouldn’t be surprised to find that nonstandard objects disappear from master when you upgrade the instance, maybe even when you apply a CU or SP.

  • Didn’t see you mention this and while most DBAs know this, some don’t… if you create the stored procedure in model it’ll exist not only in tempdb but in every newly created user database, since all of those are copied from model.

  • Yeah, the problem is that you will have those objects in every user database in addition to tempdb. I think what you want is this, right? 🙂

    That way it’s only in tempdb, with the added bonus of being extremely obscure.

    USE master;
    GO
    CREATE PROC dbo.usp_tempObjects
    AS
    EXEC tempdb.sys.sp_executesql N'CREATE PROC dbo.usp_myTempWorker AS
    SELECT * FROM sys.databases;
    ';
    EXEC tempdb.sys.sp_executesql N'CREATE TABLE dbo.myTempTable (ID INT IDENTITY(1,1), Stuffing VARCHAR(100));';
    GO

    EXEC sys.sp_procoption @ProcName = 'usp_tempObjects', @OptionName = 'startup', @OptionValue = 'on';

  • If you want to use a pattern where each of many independent databases is identical in schema, such as when you want each client isolated into their own database, putting objects in model can be a quick shortcut for onboarding new clients: just do ‘create database newclient’ rather than needing to do additional schema deployments.

    Of course people who do that also tend to be the ones asking what the max number of db per AG is too, but that’s a story for another day.

  • Yes, well, I personally find it annoying when I inherit an instance, create a new database, and find users and objects already in it that someone added to the model database, for no good reason I can determine.

  • An old version of Great Plains (7.5) did the permanent table in TempDB thing. There are two tables that are used for tracking simultaneous user sessions. Every once in a while something gets stuck and we have to log everyone out and clear those tables to fix it (old team used to reboot SQL Server, until we learned the real issue). Not sure if newer versions or Dynamics GP still does this.

    I just love legacy systems…

  • I know a vendor whose upgrade path is to write the existing database into tempdb, install the new schema and then migrate the data back.

    They didn’t understand how that was a bad idea.

    Suggested the business find a replacement vendor ASAP.

    • Guh. That sounds like one of those things people do for pet programming projects just because they can, like Tetris in Excel via VBA or that T-SQL Minesweeper I played once. Possibly even an example of what you shouldn’t do that somehow made it into production because somebody got the wrong end of the stick.

  • Tim Jennings
    May 5, 2016 10:25 pm

    If there were something I wanted in TempDB but didn’t necessarily want showing up in every new database that might get created, I would set up an Agent job that runs on startup to execute a creation script for me in TempDB.
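
    For example, something roughly like this (job, schedule, and table names are made up), using the msdb job procedures with a schedule of type 64, which fires when the Agent service starts:

    USE msdb;
    GO
    EXEC dbo.sp_add_job @job_name = N'Create tempdb objects';
    EXEC dbo.sp_add_jobstep @job_name = N'Create tempdb objects',
        @step_name = N'Create the objects',
        @subsystem = N'TSQL',
        @database_name = N'tempdb',
        @command = N'CREATE TABLE dbo.MyStagingTable (ID INT);';
    EXEC dbo.sp_add_schedule @schedule_name = N'Agent startup', @freq_type = 64; -- 64 = run when SQL Server Agent starts
    EXEC dbo.sp_attach_schedule @job_name = N'Create tempdb objects', @schedule_name = N'Agent startup';
    EXEC dbo.sp_add_jobserver @job_name = N'Create tempdb objects';
    GO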

  • Holly Irick
    May 6, 2016 9:25 am

    It seems to me the best use of this information might be to check your security procedures to see that nobody is doing it! I’m real uncomfortable with the idea of invisible tables; it sure isn’t in my monitoring process.
    How about a startup procedure that deletes tables? Or a polling one that deletes tables not belonging to sessions?

  • BCD Featherstone
    May 6, 2016 10:22 am

    And another fine use for purposely creating objects in tempDB…”Securing” code (OMG)

    The cycle:
    User wants secret squirrel stuff from app.
    App creates an object in tempDB that accesses only user database(s).
    App uses the object.
    App blows off the object.

    No one can read the code in the user database (unless they know to, and how to, monitor tempDB).

    Of course, this requires that the vendor app has elevated access (that I hate), but the woes of security incidents are not on the vendor’s plate.

    Yes, I have dealt with this in the wild many times. Whee!!!

  • Jonathan Shields
    May 29, 2016 2:05 pm

    Interesting idea. Seems a bit like “yeah, you can do it, but should you?” I guess maybe if you wanted to collect sp_WhoIsActive data on a schedule or something you could use it, as long as the process managed the size of the collected data – would that be a good idea?

    • Jonathan – in order to know if something would be a good idea, you have to ask, “A good idea for what? What’s the problem you’re trying to solve?”

  • Jonathan Shields
    May 31, 2016 4:29 pm

    Agreed. I guess my reticence kind of comes from fear of lots of people littering up tempdb on one of my instances with rubbish they never removed, but that’s no reason not to do it, I guess, as long as the data was being managed.
    In my admittedly vague example I was thinking it could be used to collect some useful data on active processes over time to potentially help with tuning.

  • Hi great post. How long have people been able to create user-space tables in TempDb? I have not seen people do that before.

    • By default, a user can only create local (#tablename) or global (##tablename) tables in tempdb. To create userspace tables, they would need CREATE TABLE in tempdb. I only know this offhand because I worked on a system where session data was stored in userspace tempdb tables (and had to be periodically cleaned out unless the instance was regularly restarted).

      To automatically grant CREATE TABLE in tempdb, either that needs to be done in model (as pointed out above, a very bad idea because every new database gets that) or in a SQL statement as part of a “startup” stored procedure. Kids, don’t try this at home!
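
      A rough sketch of the startup-proc flavor, just to illustrate ([AppUser] is a placeholder login, and it also needs ALTER on the schema to actually create tables there):

      USE master;
      GO
      CREATE PROC dbo.usp_GrantTempdbCreate
      AS
      EXEC tempdb.sys.sp_executesql N'CREATE USER [AppUser] FOR LOGIN [AppUser];';
      EXEC tempdb.sys.sp_executesql N'GRANT CREATE TABLE TO [AppUser];';
      EXEC tempdb.sys.sp_executesql N'GRANT ALTER ON SCHEMA::dbo TO [AppUser];';
      GO
      EXEC sys.sp_procoption @ProcName = 'usp_GrantTempdbCreate', @OptionName = 'startup', @OptionValue = 'on';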

  • I have flirted with using a similar warped approach for the following twisted reason, and after reading all this I’m not only more convinced it could work for this situation, but fear that I might actually go down this twisted path and really loathe myself later. The problem was a legacy client-developed system that used extensive custom logging via a short series of stored procedures that logged into user tables in the source system as part of an ETL process. The client wanted to scale by using a 2016 Always On AG secondary read-only source database, but also have the ability to run against the primary depending on the failover situation. The thought had been to shift the logging tables to reside in tempdb so the tables would always be write-enabled regardless of the node they were running on, primary or a read-only secondary. Permanent logging could be attained via a standalone process that would scrape content from tempdb and record it in the current read-write primary, regardless of what node it ran on at any given moment. Insane, yes, but a scenario where you might do such a crazy thing… just because you could. This might answer “should”?

    • If the logging is only singleton inserts, you’re probably better off putting the logging table in a user database that isn’t in the AG (we’ll call this table NonAGDB.dbo.Log), and then replacing the original logging table (inside the AG database, call it AGDB) with a view pointing to NonAGDB.dbo.Log. As long as all the replicas have a database named NonAGDB, the logging will still work. You can still back up that database, merge contents across servers with stuff like SSIS, etc, and it’ll buy you persistence without worrying about TempDB.
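
      Roughly like this (NonAGDB, AGDB, and dbo.Log come from the description above; the column names are just placeholders):

      USE NonAGDB;
      GO
      CREATE TABLE dbo.Log (
          LogID BIGINT IDENTITY(1,1) PRIMARY KEY,
          LoggedAt DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
          Message NVARCHAR(4000) NULL);
      GO
      USE AGDB;
      GO
      -- after dropping or renaming the original logging table in AGDB on the primary
      CREATE VIEW dbo.Log AS
          SELECT LogID, LoggedAt, Message FROM NonAGDB.dbo.Log;
      GO

      The logging procs keep writing to the same object name they always did, and the rows land in that replica’s local NonAGDB.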

  • Kevin Farlee
    April 10, 2019 9:26 am

    You do understand the impact goes beyond TempDB, right? Yes, model is used to re-create TempDB every time the instance restarts, that part lines up.
    But, model is also used to create any new database. That means if you create user objects in model, EVERY user database is going to have them as soon as you create them.
    That may be useful for some purposes of standardization etc. but don’t be surprised by it…

  • Don’t you find that thinking like a crazy person from time to time helps you figure out what was motivating the people who implemented some crazy solution you’re now stuck supporting, for no good reason other than that people rely on it being there now and have no clue why it is the way it is? It simply is just the way it is. I worked with another consultant a few years back – he was truly crazy – who also happened to be a consultant for the FBI as a profiler and worked on the Unabomber case. Sometimes you need a crazy to understand another crazy. Don’t discount the value of a bad idea – just be careful to prevent yourself from committing them in public.

  • I’ve dealt with an ERP system for oil and gas that created semi-permanent ## stored procs. It was super dumb.

  • Tell me if I’m crazy…I wrote a table-valued function that returns everything I want to know about a table (yes, I know about sp_help, but mine returns other things I use often like extended properties of the table which include a comment-style description of the table, author, creation date, modifications made, etc., columns and their data types in a readable format such as decimal(19,4) instead of Type = decimal; Length = 9; Prec = 19, Scale = 4, foreign keys for which the table is the “referencing table” instead of the “referenced table”, columns of the foreign keys, and lots more).

    Occasionally, I want to get the column and data type information for #temp tables that I’ve created (perhaps I’ve just SELECTed INTO a #temp table and then want to be able to copy/paste the definition of the columns quickly and easily to make it a permanent table).

    I was thinking that perhaps, this would be a good reason to create a copy of my user table-valued function directly in tempdb?

    Since it also lives in my user-created databases, I don’t care if it gets dropped when SQL Server restarts – it’s easy for me to re-create it only when I need it.

    • Akchavarria – for questions about your code, head to https://stackoverflow.com.

      • Thanks for replying @Brent Ozar! Sorry if my question was confusing…I wasn’t asking for help on any specific code. I was just wondering if you thought my situation was one where it would not be considered “wrong” to create a persistent user-defined object in tempdb. You implied that all instances of needing to create a persistent user-defined object in tempdb were a bad idea in your last few sentences of the post, so I just wanted to see if I was creating dumb problems for myself.

        Thanks again, I stumble across a lot of your blog posts and enjoy them very much!

  • It helps to be clear about what you mean by “Persistent” user-defined object in tempdb. Strictly speaking, nothing in tempdb is persistent past the next server restart, as TempDB is recreated. Also, if you’re in an HA configuration such as Availability Groups, you need to consider what happens when you fail over to another replica which has its own tempdb.
    As was mentioned earlier in the thread, the only way to make it truly persistent is to add the object(s) in Model, which has the implication that the object will show up in ALL new databases, not just tempdb.

    • Kevin – good to see you! Yep, I mention that in the post. You just have to read through all the way to the end. 😉

    • Thanks for your response Kevin! Yes, sorry, my use of “persistent” would be a misnomer here because we are talking about tempdb. I suppose I just meant “user-defined object”. In fact, I mention in my original question that I don’t care that it doesn’t persist in between SQL Server restarts, because it is not a function that is used in any production application – it is merely something that helps me develop (I can easily get table structure information for #temp tables).

      I think I must be really bad at explaining myself, sorry! My question is just “Is creating a user-defined object in tempdb that only exists for my development needs and can easily be re-created if blown away by a SQL Server restart a ‘bad idea’?”

    • @Kevin Farlee Also, I forgot to mention: thanks for pointing out the consideration about failovers! I don’t have any Availability Groups set up, so that’s not a concern. Also, because the function is only there when I need it for development needs, I wouldn’t care if it randomly disappears, because it isn’t used anywhere in production.

  • I searched and suddenly here I found a simple solution to one task given by a customer who wants a special thing: every day the database (normally running on server A) is restored from backup to server B. And the customer wants something I never understood why the hell he wants. He wants the time the restore started, and the time the restore ended. And he wants it in a table in that restored database. That is: I have a script that selects getdate into a variable in a batch file, performs a restore, and then creates a table (which of course is not on server A, so it is not in the restored db on server B) and puts the times from the variable and the current “getdate” in it (two columns: RestoreStart and RestoreEnd). Of course, the next day on the next restore I will lose this information, so the table will always contain only one row, because it can’t remember more than the last restore.
    Using a global table in TempDB helps me in that before dropping the database, I can “create table as select” in TempDB and after the restore, do the same in reverse direction and then insert another row. I was thinking of saving the table to some file in the temp folder of the OS, but (ab)using the TempDB seems like a better solution to me. It has only one minor drawback: server B must not crash during the restore, because I would obviously lose the whole table at that point. But I can probably live with it :).

