If you want to run the labs on your own (either during class, or later), you’re going to need:

  • A fast SQL Server
  • The right Stack Overflow database
  • An Agent job to rapidly restore the database between labs
  • SQLQueryStress

The setup is exactly the same for all of my Mastering classes, so as long as you’ve set up for any of ’em, you’re good to go. If this is your first class, let’s talk through each piece.

Building a SQL Server

In class, we use AWS EC2 VMs:

  • Mastering Index Tuning, Query Tuning, and Parameter Sniffing use an i3.xlarge: 4 cores, 30GB RAM, local NVMe SSD storage.
  • Mastering Server Tuning uses an i3.2xlarge: 8 cores, 61GB RAM, local NVMe storage. We use more cores & RAM here because we run heavier stress tests, since we’re doing server-level discussions.

Each server needs at least 300GB local SSD space to deal with the ~200GB Stack Overflow database and index creation space. It needs to be fast storage, too: for a rough idea, you’ll need to be able to restore the backup in 15 minutes or less. (I’m just using restore speed as one example here – you’ll be doing lots of storage-intensive tasks, like creating indexes on large tables.)

Before you install anything on your VM, run a test with CrystalDiskMark. You want at least 1,000 MB/sec for sequential reads and 400 MB/sec for sequential writes. Results for the AWS i3.xlarge we use in class are shown at right: if you can’t get the top line’s results to be at least that high, stop: you’re not going to be able to keep up in class.

If you want to use an Azure VM, use the L8s v2 instance type with 8 cores, 64GB RAM, and 1.8TB of local NVMe storage. It’s the smallest instance type Microsoft offers with NVMe storage, which is what you’ll need to keep up with the labs. Be aware that local NVMe storage erases itself whenever the VM is shut down.

Can you use faster hardware? Yep, and as long as you’re not caching the entire database in RAM, you’ll probably still have the same basic challenges that we tackle in all of the classes.

After building the server, install:

  • SQL Server 2017 or 2019 Developer Edition (Express Edition won’t work)
  • During the install, check the box for “Grant Perform Volume Maintenance Task” – this lets SQL Server restore databases way faster, which will be important during the class
  • Use the default collation during install, SQL_Latin1_General_CP1_CI_AS (don’t get fancy with binary collations)
  • Use at least 4 equally sized TempDB data files, each at least 10GB in size (more files are okay, but definitely use at least 4 to avoid PFS/SGAM contention during the labs)
  • The latest Cumulative Update
  • The most recent SQL Server Management Studio (and it really does have to be the most recent one)
  • The most recent First Responder Kit scripts in the master database – use Install-Core-Blitz-With-Query-Store.sql
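The SQL Server 2017/2019 installer lets you set the TempDB file count and sizes on the TempDB tab during setup, but if you skipped that, you can fix it afterward. Here’s a sketch – the D:\MSSQL\DATA path and the tempdev2–4 logical file names are assumptions, so adjust them to match your drive layout:

```sql
-- Grow the existing TempDB data file and add 3 more equally sized files.
-- Paths and logical names below are examples - adjust to your server.
USE master;
GO
ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, SIZE = 10GB, FILEGROWTH = 1GB);
ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev2, FILENAME = N'D:\MSSQL\DATA\tempdev2.ndf',
              SIZE = 10GB, FILEGROWTH = 1GB);
ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev3, FILENAME = N'D:\MSSQL\DATA\tempdev3.ndf',
              SIZE = 10GB, FILEGROWTH = 1GB);
ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev4, FILENAME = N'D:\MSSQL\DATA\tempdev4.ndf',
              SIZE = 10GB, FILEGROWTH = 1GB);
GO
```

Keeping the files equally sized matters: SQL Server’s proportional fill algorithm only spreads page allocations evenly (and avoids the PFS/SGAM hot spots) when the files are the same size.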

Getting the Stack Overflow database

In class, we use the 2018-06 version of the Stack Overflow database, but I’ve customized it to include a lot of indexes. Download it from the links below (only visible to logged-in Mastering class students):

That’s 4 files because striping a backup across 4 files generally makes the backup go faster (as do the restores). You’ll need all 4 of them in order to restore the database. If you’re working in a cloud VM, save these backup files on your VM’s permanent storage, not the local ephemeral storage (which is erased each time your VM shuts down).

Set Up a Restore Job

Don’t just restore it once manually: you’ll need to automate this process. During class, between each lab, students restore their database to this starting point. To do that, set up an Agent job to restore it. Here’s the script I use – just modify the file path locations for your backups. In my example below, I’m doing my restores from E:\MSSQL\BACKUP – you may need to modify your backup path and file names.
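As a rough sketch of what the restore step in that Agent job can look like – the StackOverflow database name, the numbered .bak file names, and the logical file names in the MOVE clauses are all assumptions here, so match them to your actual files:

```sql
-- Restore the 4-file striped backup. File names and paths are examples -
-- adjust to match the files you downloaded and your drive layout.
-- The MOVE clauses place the data/log files on your fast (ephemeral) drive.
RESTORE DATABASE StackOverflow
    FROM DISK = N'E:\MSSQL\BACKUP\StackOverflow_1.bak',
         DISK = N'E:\MSSQL\BACKUP\StackOverflow_2.bak',
         DISK = N'E:\MSSQL\BACKUP\StackOverflow_3.bak',
         DISK = N'E:\MSSQL\BACKUP\StackOverflow_4.bak'
    WITH REPLACE,                                       -- overwrite the existing copy
         MOVE N'StackOverflow'     TO N'D:\MSSQL\DATA\StackOverflow.mdf',
         MOVE N'StackOverflow_log' TO N'D:\MSSQL\DATA\StackOverflow_log.ldf',
         STATS = 10;                                    -- progress every 10%
GO
```

If you’re not sure of the logical file names for the MOVE clauses, run RESTORE FILELISTONLY against the backup files first and copy the names it reports.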

If you’re working in the cloud, put the backup files on your permanent storage, but restore the database onto your ephemeral (fast) storage. That way, whenever you start your SQL Server VM up for the first time in a while, you can load the data into the fast storage.

Note that mine also has a step to set Cost Threshold for Parallelism and MAXDOP back to normal defaults for the VM we’re working with in the cloud. I also set compatibility level to 140 (SQL Server 2017) – that’s just because I do most of my client work on 2016-2017, so I want the database to behave that way. If you do most of your work on a different version, you can use a different compat level. We’ll talk during the class about how some labs need specific compat levels.
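That extra job step might look something like this – the specific numbers are assumptions for an 8-core VM, so use whatever matches your hardware and preferred compat level:

```sql
-- Reset server-level settings to this VM's "normal" values after each lab.
-- The values shown are examples for an 8-core cloud VM - adjust to taste.
EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure N'cost threshold for parallelism', 50;
EXEC sys.sp_configure N'max degree of parallelism', 8;
RECONFIGURE;
GO
-- 140 = SQL Server 2017 behavior; pick the level you work with most.
ALTER DATABASE StackOverflow SET COMPATIBILITY_LEVEL = 140;
GO
```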

Run your restore job to get an idea of how long it’ll take. If it takes more than 15 minutes – say, 20-30 minutes – you’re going to have a tough time keeping up in class when you’re doing storage-intensive things like creating indexes, and you probably want to use a faster VM.

SQLQueryStress for load testing

SQLQueryStress is an easy, free, open source load testing utility. Download the latest version (zip), and extract it to a folder. You don’t have to run a setup or anything – it’s just a single .exe file that you can run whenever you want to do a load test.

Then, download my load test files and extract those to the same folder. Your folder will end up looking like this:

SQLQueryStress folder

And that’s it! See you in class!
