Memory Dangerously Low or Max Memory Too High
By default, SQL Server’s max memory is 2147483647 megabytes – a heck of a lot more than you actually have. Trivia time: that’s the maximum value for a signed 32-bit integer. SQL Server will just keep using more and more memory until there’s none left on the system.
If the operating system has no memory available, it will start using the page file instead of RAM. Using the page file in place of memory will result in poor system performance – operations that should be fast and in memory will read and write to disk constantly. You can learn more about low memory conditions in this Microsoft blog post.
In our Sysadmin’s Guide to Microsoft SQL Server Memory, we explain why SQL Server really does need as much memory as possible, and what it uses memory for. However, that doesn’t mean we should leave SQL Server’s max memory set to unlimited. If we starve the OS for memory, then Windows itself may not get the memory it needs.
We compare the ‘max server memory’ setting in sys.configurations to the amount of memory the server actually has (as seen in sys.dm_os_sys_memory).
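That comparison can be sketched as a query against the standard DMVs (a minimal sketch; your monitoring tool may do something fancier):

```sql
-- Compare configured max server memory to the physical memory on the box
SELECT c.value_in_use AS max_server_memory_mb,
       m.total_physical_memory_kb / 1024 AS physical_memory_mb
FROM sys.configurations AS c
CROSS JOIN sys.dm_os_sys_memory AS m
WHERE c.name = N'max server memory (MB)';
```

If max_server_memory_mb is way above physical_memory_mb, the setting is still at (or near) the unlimited default.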
We also check sys.dm_os_nodes to see if any NUMA nodes are reporting that memory is dangerously low, and can no longer create threads.
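That NUMA check looks something like this (a sketch; THREAD_RESOURCES_LOW is the node state SQL Server reports when a node can no longer create threads):

```sql
-- Flag NUMA nodes that report they can no longer create threads
SELECT node_id, memory_node_id, node_state_desc
FROM sys.dm_os_nodes
WHERE node_state_desc LIKE '%THREAD_RESOURCES_LOW%';
```

A healthy server returns zero rows here.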
Just Don’t Make Things Worse
A lot of folks are tempted to set Min Memory to the same value as Max Memory to make sure SQL Server always has plenty of memory.
But, per Microsoft:
If the same value is specified for both min server memory and max server memory, then once the memory allocated to the SQL Server Database Engine reaches that value, the SQL Server Database Engine stops dynamically freeing and acquiring memory for the buffer pool.
This can impact other caches and memory consuming operations.
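To see where both settings currently stand before you change anything, you can query sys.configurations directly (a sketch using the standard configuration names):

```sql
-- Show current min and max server memory settings, in MB
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN (N'min server memory (MB)', N'max server memory (MB)');
```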
Before You Fix It – Understand the Unit of Measure
Word to the wise: the max memory setting in SQL Server is set in megabytes, not gigabytes. So if you intend to tell SQL Server to use 20GB as its max memory setting, you’d set that to 20480 (the equivalent in megabytes). Misunderstanding this can leave SQL Server with too little memory to start up properly.
Good News, Standard Edition May Use More Memory Than You Think
Starting in SQL Server 2012, the 64GB memory limit for Standard Edition applies only to data cache, and other parts of SQL Server can use memory above that. Standard Edition’s data cache limit goes up to 128GB in SQL Server 2014. Read more about it here.
To Fix the Problem
1) Decide what to set your max server memory (MB) to. Our simple “starter” rule of thumb: leave 4GB or 10% of total memory free for the OS, whichever is larger, and adjust as needed from there.
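The starter rule of thumb can be sketched as a query (a sketch, assuming sys.dm_os_sys_memory is available on your version):

```sql
-- Suggested starting max server memory (MB): total memory minus the
-- larger of 4GB or 10% of total, which is left free for the OS
SELECT total_physical_memory_kb / 1024 AS physical_memory_mb,
       (total_physical_memory_kb / 1024)
       - CASE WHEN (total_physical_memory_kb / 1024) / 10 > 4096
              THEN (total_physical_memory_kb / 1024) / 10
              ELSE 4096
         END AS suggested_max_server_memory_mb
FROM sys.dm_os_sys_memory;
```

For example, on a 64GB server, 10% is 6.4GB (larger than 4GB), so this suggests roughly 58982MB.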
2) Execute the change. We’ve got detailed steps on how to execute your change below. This does not require restarting your SQL Server instance, but it will cause execution plans to recompile.
How to Set Max Server Memory (MB) in SQL Server
Choose your own adventure: use the GUI or T-SQL.
Option 1: Use the GUI
In SSMS, right-click the server name in Object Explorer, click Properties, then go to the Memory page and set the maximum server memory there.
Remember that 1GB is 1024MB, so calculate accordingly!
Option 2: Set Max Server Memory with T-SQL
If you’re comfortable with sp_configure and have it set to let you access advanced options, you can set it directly in T-SQL. Just replace 12345 below with the number of megabytes you want to use – and note the change doesn’t take effect until you run RECONFIGURE:
EXEC sys.sp_configure 'max server memory (MB)', 12345;
RECONFIGURE;
These suggestions make a few assumptions:
- You’re only running one SQL Server
- You’re not running SSIS, SSRS, SSAS, etc.
- You’re not using it to run any other applications on the server
We recommend isolating SQL workloads as much as possible. If you have to consider any of the listed items, memory settings become more complicated.
After this change, your server should be less vulnerable to paging to disk. If the system was already paging, then you should see improved performance. To double-check whether you’re paging to disk, open Task Manager, go to the Performance tab, and look at the free memory for Windows. If it’s under 200MB, you’re in danger of swapping to disk.
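If you’d rather check from inside SQL Server than from Task Manager, the OS’s own memory signal is exposed in sys.dm_os_sys_memory (a sketch using the standard columns):

```sql
-- Check available memory and the OS's low-memory signal
SELECT available_physical_memory_kb / 1024 AS available_memory_mb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;
```

A system_memory_state_desc of “Available physical memory is low” means Windows is feeling the squeeze.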