
Managing People Sucks, But You Should Try It Anyway

#SQLPass
15 Comments

I just got asked for advice on what it takes for geeks to rise up through the ranks of their company and become managers.

I’m probably the worst person to ask.  Check out my wacko career path:

  • 1992 – Went to college determined to learn economics and preach capitalism to the newly freed Russian people.
  • 1993 – Dropped out of college after 3 semesters. Economics bored the hell out of me.
  • 1994 – Went back to work in hotels (had experience from HS)
  • 1996 – Worked my way up the hotel chain, became hotel General Manager.
  • 1997 – Realized management is nothing more than hiring, training, disciplining and firing subordinates – no matter how much you make or how high you go.
  • 1998 – Switched to IT.
  • 1999 – Realized IT was nothing more than buying, installing, repairing and replacing servers.
  • 2000 – Switched to coding.
  • 2003 – Realized development was nothing more than learning languages, mastering them, and then picking a new one when yours was deprecated.
  • 2004 – Pushed into management because of my hotel experience, took it for the moolah
  • 2005 – Burned out on management again, switched to DBA.
  • 2006 – Turned down two offers to manage my team.  Never been happier.
  • 2008 – Switched to “expert,” where I blog, work with the community and work with Quest developers. I blogged about my job here.

The next step is still evolving – I think I’m going to end up working in social media somehow – but you can be damn sure my day job won’t be in management.

My advice: unless you really, passionately love interacting with people – not in chat rooms, but in meatspace – don’t get into management.  I’ve met so many fantastic geeks who thought (just like I did) that management could be learned via a checklist process and studying.  Management requires a whole lot of skills that are completely and wildly different than geekdom.  Your average Gap sales clerk has a better head start on IT management than the best geek.

Being a good manager boils down to one simple thing: motivating people you dislike to do things they dislike.

Your Employees Are The Whammies

Anybody can motivate people they like to do things they like.  That’s not management – that’s being a game show host.

Most people can even motivate people they dislike as long as they’re trying to get those people to do things they like.  You grit your teeth and bear it, and the people grit their teeth and bear you because they’re doing stuff they like.

Managers have to do this nasty task all the time.  You can’t always fire the people you hate, or let your people ignore the tasks they hate.  If you’ve got half a dozen employees, at any one time, at least one of them needs to do something they hate.  Whether it’s filing expense reports on time, improving their soft skills, or dealing with an abusive client, it just never stops.

I’ve learned that I’m no good at motivating people I dislike to do things they dislike, so I’ve adapted.  Rather than having a day job in management, I do some management in my spare time.  This lets me do community management, which consists of getting people I love to do things they love.  If somebody’s a jerk, I give ’em the finger and eject ’em from my community.  The resulting group of people gets to conquer tasks they love.

If you really wanna go for the management track, start by leading a volunteer community first.  It’s the easiest management there is.  If you like that and if you succeed – if people rally around you and your group accomplishes tough challenges – then you’ll stand a better chance of succeeding at tougher kinds of management, like leading employees.

Find your local PASS chapter today and get started in the volunteer community.


Links for 24 Hours of PASS #24hop

#SQLPass
3 Comments

Having problems getting in with your links?  Here are the links for the rest of the day:

If you don’t see video or can’t hear audio, close your web browsers and the Office LiveMeeting client, then go back in.  Some attendees have had to leave and come back in several times before video works successfully.

If you still can’t get in, check out Tom LaRock’s live stream.  You can watch Tom and listen to the current presentation.  In between sessions, Tom gives his feedback on the topic.  Come join us!

The meetings are being archived, and they’ll be available for replay after the PASS Summit in early November.


SQL Server 2008 R2 Frequently Asked Questions

38 Comments

Got questions about the new features in Microsoft SQL Server 2008 R2?  Can’t wait for R2’s release date?  Here are the answers I’ve been giving out most often to database administrators wondering about the changes in SQL Server 2008 R2 vs SQL Server 2008.

What will be the SQL Server 2008 R2 release date?

SQL Server 2008 R2 will be released in May 2010.  Despite it being released in the year 2010, it will not be called SQL Server 2010.

SQL Server 2008 R2 Utility Explorer

What’s the cost to upgrade to SQL 2008 R2?

If you’ve got Software Assurance for your SQL Server 2008 licensing, then you get all R2 updates included for free.  If you don’t have SA, then you’ll need to decide whether the new features in SQL Server 2008 R2 vs SQL Server 2008 will be worth the cost of upgrading, because you won’t get SQL 2008 R2 for free.  R2 is not considered a service pack.

What are the new features in SQL Server 2008 R2 vs SQL Server 2008?

From a very high level, here are the new features in R2 that weren’t in 2008:

  • Database Engine – not much new here.  There are some infrastructure introductions to support Data-Tier Applications in the future, but they’re not too useful as of the August CTP.  I blogged about it in a 3-part series starting with this article about How SQL 2008 R2 is Like Virtualization for Databases.  If you’ve heard the terms Utility Control Point, Utility Explorer, or SQL Server Utility, I’ve got your answers in that series.
  • Business Intelligence – there’s a ton of new functionality for self-service BI in SQL Server Analysis Services 2008 R2.  Excel 2010 will act as a client for SSAS.  You can learn more about it in this screenshot-packed review of what’s new in Project Gemini and this pivot table tutorial for Project Gemini.
  • Scale-Out Servers – if you need to scale a data warehouse beyond a single server, you’ll want to keep an eye on Project Madison.  It’s the result of Microsoft’s acquisition of DATAllegro.
  • T-SQL Enhancements – got nothin’ for you here.  No new commands, no new data types, no new stored procedure goodies.

Will I need Visual Studio 2010 to edit DAC Packs?

Data-tier application projects, new with SQL 2008 R2, can’t be edited with SQL Server Management Studio.  You’ll need the upcoming release of Visual Studio 2010 to create and edit .dacpac files.

Where can I download the SQL Server 2008 R2 CTP?

MSDN and Technet subscribers can download 2008 R2 Enterprise Edition now for free.  Check in the SQL Server 2008 section of the download site, or click here to download SQL Server 2008 R2 for free.  Keep in mind that this is a preview build, not a feature-complete beta.  It should not be used in production.  When databases are attached to a SQL 2008 R2 server, their version number is upgraded to 660, and these databases cannot be reattached to an older (SQL 2008) server.
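If you’re not sure whether a database has already been stamped with the new version number, a one-liner like this will tell you – the database name below is just a placeholder:

-- 'MyDatabase' is a placeholder - substitute your own database name.
SELECT DATABASEPROPERTYEX('MyDatabase', 'Version') AS internal_version;
-- On the R2 CTP this returns 660, and a 660 database can't be
-- reattached to a plain SQL Server 2008 instance.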

How will SQL Server 2008 R2 licensing work for virtual servers?

R2’s licensing has an ugly change for shops that use virtualization.  Right now, if you buy SQL Server 2008 Enterprise Edition by the CPU, you get unlimited virtualization rights.  If you’ve got a 4-socket virtual host and you buy 4 sockets of Enterprise Edition, you can run as many SQL Servers on that host as you want.  From Microsoft’s SQL Server 2008 Licensing Guide:

“For enterprise edition there is an added option: if all physical processors in a machine have been licensed, then you may run unlimited instances of SQL server 2008 in one physical and an unlimited number of virtual operating environments on that same machine.”

You may not be running SQL Server widely in virtualization environments yet, but ask yourself how many SQL Server 2000 and 2005 instances you’re running today.  SQL Server doesn’t just go away – the instances you install today will still be in production for years to come.  They might not be virtualized today, but they’re gonna be virtual years from now, and today’s licensing saves you a fortune.

Microsoft sees that coming, and they’re changing it in R2.  Review the SQL Server 2008 R2 Editions PDF and you’ll find that R2 Enterprise Edition doesn’t come with unlimited virtualization.  To get that feature, you have to spring for the new Datacenter Edition, which costs around $60k per CPU socket.

If I were a DBA with a budget to buy SQL Server licenses this year, I’d make that purchase now.  In May, Enterprise Edition’s price is going up, and it will come with fewer licensed features.  I’d buy it with Software Assurance anyway, so I’d get R2’s new features if I wanted them.  If I didn’t want those new features, I’d still have the flexibility of running unlimited SQL Server 2008 instances in virtualization.

Why isn’t the next version called SQL Server 2010?

Microsoft is following the precedent set with Windows Server 2003 R2, which added some features but wasn’t a groundbreaking change. SQL 2008 R2 does include some pretty cool stuff, but it doesn’t include major earth-shaking changes in the database engine itself.

Where can I read more about the new features and changes?

Here’s what I’ve written so far:


SQL Server Data Compression: It’s a Party!

58 Comments

When I was in high school, Dad and I lived with his mom, my Grandma Ozar, for a couple of years.  We took care of things around the house and made sure her coffee pot was always full.  She could really down that coffee – at least two pots a day.  (Looking back, if we could have reduced her caffeine consumption, she probably wouldn’t have needed so much Valium.)

I Can Quit Anytime I Want*

Grandma and some friends took a road trip to Las Vegas, and while they were gone, I threw an epic party.  We’re talking 30 gallon garbage cans filled with homemade Jungle Juice.  (I’m one of those reasons she couldn’t completely eliminate the Valium.) A couple hundred of my closest friends had a good old time.

A couple of my less-than-closest friends had a little too good of a time at my expense and started trashing the house.  They walked up the staircase smashing the picture frames of every family photo, then started to throw a couch off the second floor balcony.  My security guys (I’m telling you, it was that good of a party) carried them out before they got too carried away.

Cleaning Up After The Party

The next morning, the very-closest-friends did a fantastic job of getting things back to normal.  We replaced all the picture frame glass, got the dirt out of the sofa, vacuumed the place top to bottom, and finished the Jungle Juice.  I thought we’d hidden all our tracks, but we got busted by the tiniest of clues.

Someone had left beer bottle caps on top of door jambs all around the house.

Amazing!  Who thinks of this stuff?  You know they did it on purpose, too – they were just itching to get me into trouble.  One beer cap on one door jamb, I could understand, but all over the house?  Damn.

I was disinherited for that particular shindig.

I learned a valuable lesson: if you’re not absolutely sure you can clean up every trace of everybody else’s messes, you shouldn’t throw parties.  Sooner or later, somebody you can’t trust is going to show up at your party, and they’re going to do something that’ll get you in trouble.  Come to think of it, it’s just like being a DBA.

SQL Server 2008 Data Compression: No Inheritance Either

Microsoft SQL Server 2008’s Data Compression feature lets you compress objects – not just tables, but individual indexes.  Compression does cost a little extra CPU, but that overhead is usually more than offset by the reduction in IO.  Generally speaking, the database server is sitting around waiting on disk subsystems.  Adding a little CPU work while dramatically reducing IO needs results in faster query return times.  You need to test compression to see if it works well in your environment, because it may not work well in heavy-insert databases.
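If you want to kick the tires, here’s a rough sketch – the table and index names are made up, and the estimate proc is worth running first so you know whether compression is even worth the trouble for your data:

-- Estimate how much space PAGE compression would save.
-- dbo.SalesOrderDetail and IX_SalesOrderDetail_ProductID are made-up names.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'SalesOrderDetail',
    @index_id         = NULL,        -- NULL = estimate every index on the table
    @partition_number = NULL,
    @data_compression = 'PAGE';

-- Compress the table itself (the clustered index or heap):
ALTER TABLE dbo.SalesOrderDetail REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Compress one nonclustered index:
ALTER INDEX IX_SalesOrderDetail_ProductID ON dbo.SalesOrderDetail
    REBUILD WITH (DATA_COMPRESSION = PAGE);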

Today, though, I’m going to focus on the dark side of compression: a complete lack of inheritance.

When you compress tables and indexes, it’s a one-time action.  You’re only taking care of what exists today.  If someone (or even you) turns around and creates an index on that same table tomorrow, it won’t be compressed by default.  Whoever creates the index has to make sure that it’s compressed, and there’s nothing in SQL Server Management Studio that will hint to them that other parts of that same object are compressed.

To make matters worse, your development, QA and production environments might all have different compression settings, and you’d never notice it at a glance.  Compression is transparent to applications, so your developers won’t know why one environment performs much differently than another even though they have the same hardware, same indexes, same statistics.
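Here’s a quick sketch of a query that shows what’s actually compressed in the current database – run it in each environment and compare the results:

-- One row per table/index/partition, with its compression setting.
SELECT  s.name AS schema_name,
        o.name AS table_name,
        i.name AS index_name,
        p.partition_number,
        p.data_compression_desc      -- NONE, ROW, or PAGE
FROM sys.partitions p
JOIN sys.objects o ON o.object_id = p.object_id
JOIN sys.schemas s ON s.schema_id = o.schema_id
LEFT JOIN sys.indexes i ON i.object_id = p.object_id AND i.index_id = p.index_id
WHERE o.is_ms_shipped = 0
ORDER BY s.name, o.name, i.index_id, p.partition_number;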

Implementing compression is a multi-step process:

  • Figure out what objects you should compress
  • Plan to handle all of your environments (dev, QA, production)
  • Compress them during a low-activity window
  • Regularly patrol your environments checking for added objects that weren’t compressed
  • Keep your environments in sync

If you don’t stay on top of all of these, you’ll need Valium too.

* – Yes, about that picture.  No, it’s not mine.  Yes, it’s licensed with Creative Commons.  No, it wasn’t even the worst jungle juice picture I could find licensed with Creative Commons.  Yes, I too am amazed that people upload their party pictures to Flickr, let alone license them with Creative Commons.


Blog Better Week: Strunk & White’s Elements of Style

Blogging
14 Comments

This week I’m focusing on how you can improve your blog. So far I’ve talked about why you should schedule blog posts, how to write a product review, how to spice up your blog with pictures, and how to get people to find you online, and today I’m wrapping things up with a book recommendation.  My next series in a couple of weeks will focus on social networking basics.

Everything I know about style, I learned from two dead white guys.

Watching Project Runway is Not Enough

Strunk and White’s classic book The Elements of Style crams a lot into less than 100 pages, and it’s less than $15.  It covers basic rules of grammar, explains frequently misused words, and inspires readers to become better writers.

My first reaction upon reading the Approach to Style section was to revolt.  I don’t want my blog to read like a faceless newspaper article or an encyclopedia entry.  However, the more I dug into the book, the more I realized that it encouraged the reader to follow some simple guidelines in their efforts to build their own unique voice.  When the book’s authors put similar passages from Faulkner and Hemingway side by side to show how two writers convey the same concept, it just plain works.

It’s chock full of examples from famous books I know I should have read a long time ago.  Reading each snippet reinforces the notion that writing really is an art, and that every time we publish a blog entry, Strunk and White each turn another revolution in their graves.

The hilarious presenter and blogger Jimmy May (Blog | Twitter) first prompted me to buy this book, and I owe him a debt of gratitude.  I’d also like to thank Mrs. Weathersby, my senior year English teacher, and point out that I learned a valuable lesson in her class.  Although she was horrified to discover that I’d written John Szegda’s term paper for him, I did indeed learn that writing pays off.

I highly recommend The Elements of Style to any blogger who wants to improve their craft, and you can buy it from Amazon.


Blog Better Week: How to Write a Product Review

Blogging, Book Reviews
10 Comments

This week I’m focusing on how you can improve your blog.  So far I’ve discussed how to build your momentum by scheduling posts ahead of time, the basics of search engine optimization, and how to spice things up with pictures.  Tomorrow’s post will finish up the series with a book recommendation for bloggers of all skill levels.

Bloggers – product reviews are timeless content that will attract more readers, especially readers you’ve never had before. People search the web for product reviews, and if they like the quality of your review, they might stick around and read more articles.

Reviews for enterprise-quality products, like SQL Server tools, are really hard to find.  Some magazines task their writers with banging out quick reviews, but the writers may not have your level of IT experience.  They might look at features, get all excited, and not understand how the features would work in real life.

Two Burned Thumbs Up!

Here are a few tips on how to write a useful review:

Spend at least a month with the product, using it on a daily basis. If you find that the product solves a need, and that you get excited to use the product instead of doing things “the old way”, mention that in your review.  On the other hand, if you have to remind yourself that this product is installed, and if you find yourself cringing when you open it, then that needs to be reflected in your review.

Talk to somebody who’s used it for longer than a month. Ask the vendor for a few contact names of customers.  Granted, these customers have been hand-picked by the vendor because they love the product, but you can still get useful information from them.  Ask the user why they picked the product, how much they paid for it, and whether they’d buy it again.  Ask for at least one thing they don’t like about the product or the company.

Call for support. Even if you’re not having a problem, make one up.  My personal favorite: “Hi, I’m trying to run this on <insert really new or really old OS here>, and whenever I double-click on the desktop icon, nothing happens.”  See how the support experience goes.  If you’re connected directly with the person who wrote the code, that’s a good thing and a bad thing – it means they probably don’t have any customers.  If you’re connected directly with someone who doesn’t even understand the product, that’s just a bad thing.

Make a list of the product’s competitors and ask why they’re different. Don’t just ask the vendor whose product you’re reviewing, either, because those bozos all say the same thing: “Nobody competes with us!  We’re unique!”  Yeah, and so are snowflakes, but there’s a bunch of those too.  Tell your friends about the product, and odds are they’ll know a competitor.  Call or email the competing vendor and ask, “Hey, I’m evaluating Product X – why should I choose your product instead of Product X?”  They’ll be more than happy to tell you all the ways Product X bites the big one.

Introducing the Pinto Four Wheel Drive!

Don’t write the review until you can list at least three things you don’t like about the product. If you can’t find those three things, you haven’t used the product long enough.  For example, I’m drinking the Apple Kool-Aid, but even I can tell you three things I don’t like about a piece of Apple hardware within the first month of using it.  (My iPhone: the battery life sucks, it doesn’t fit in docks when it’s got a case on, and the browser keeps reloading pages whenever I switch windows.)

Send your review to the company before you post it. This isn’t an attempt to blackmail the company – rather, you’re asking them if there are any inaccuracies.  You might have misunderstood how the product worked, or maybe you got some details wrong.  Ask the company’s marketing department to double-check your work before it goes live.

DON’T write too many sponsored reviews. Companies like PayPerPost, ReviewMe and SponsoredReviews offer bloggers $5-$500 per blog post depending on the blog’s popularity and the number of words & links in the review.  Done right, it comes off as somewhat funny and doesn’t offend too many people, as this example post shows.  (It still left me with a slimy feeling, but it’s still the best-written one I’ve seen.)  Done wrong, it exudes spamminess, and readers will unsubscribe from your blog after just a couple of those posts.  I’ve personally avoided these because I’m not in blogging to make money, as I discussed in my series on How to Start a Blog.

Amazon Earnings Report Snippet

DO include an affiliate link to buy the product. Don’t be ashamed to make a little money off your work.  If you’re going to do a book review, sign up for the Amazon Affiliate program and generate an affiliate link for the book.  Use that link in your review article.  When someone clicks on your Amazon link to order the book, you get a 4% cut of the revenue.  Here’s the hilarious part: no matter what they buy, you get a cut, as evidenced in this screenshot of my Amazon earnings report.  If you’re reviewing something that’s not sold by Amazon, consider signing up for Commission Junction, which offers similar programs for other vendors like Newegg. I don’t make much off these programs – around $100/month – but it’s a nice perk.

Include product pictures, and link them to buy the item. People love clicking on pictures, even if they’re not scantily clad models.  Take advantage of that psychological impulse and use your aforementioned affiliate link so that when they click on the picture they go straight to the store to buy the item.  (If this kind of tip gets you all excited, check out Problogger’s series on how they made over $100k with the Amazon Associates program.)

If a company approaches you directly about writing a review, it’s not uncommon to ask for compensation in the form of their products.  For example, if a software company asks you to review their Widgetizer Pro, they’ll include a few free licenses too.  Consider using one for yourself and giving a couple away to your readers – maybe via a contest, choosing a random commenter on that blog post.  However, don’t be suckered into putting dozens of hours of your own time into researching the product, testing it, and writing the review, all in exchange for $250 of licensing.  Put a dollar amount on your time, estimate what it’ll take to do a fair job on the review, and ask the company to either compensate you as a consultant or ask for more freebies in exchange for the review.  If you don’t, then you probably won’t put the full amount of time into the review, and your readers will be able to tell you’re doing a puff piece.  Rule of thumb: if your knowledge of the product consists of reading the brochure, you’re not doing justice to your readers.

If you accept compensation, disclose that in the review. Just mention in the review that the gear was provided to you by Company X in exchange for the review.  If you were paid to write the review, things get more complicated – the FTC is considering holding bloggers liable for their sponsored reviews.

Finally, think twice before you body-slam a product. If a product just doesn’t float your boat, consider returning it to the company and not posting anything on your blog.  Maybe you weren’t the target audience after all, or maybe the product could get better in a version or two.  Send your feedback to the company so they can help improve their work.  If you roast a company’s product in your review, other companies will think twice before sending you a product for review.

This happens to me too – authors have sent me several books for review, but I didn’t end up posting the reviews on my blog.  I’ve forwarded my thoughts over, explained why I wasn’t such a big fan, but wished them the best of luck in their work.  No sense in burning bridges – you only have one online reputation.  Guard it carefully.

At the same time, if the product does something really boneheaded, like putting your production servers in danger, then you should probably identify that product before somebody makes a dumb investment.  After all, a lot of people bought Pintos….

Next Up: A Small Book That Makes a Big Difference In Your Blog Skills


Blog Better Week: Spice Things Up with Images

Blogging
6 Comments

This week I’m focusing on how you can improve your blog. So far I’ve talked about why you should schedule blog posts and the basics of search engine optimization.  Tomorrow’s post will cover how to write a product review on your blog.

Your writing is boring.

Bored Blog Reader
Your Blog Readers

I’ve read your blog.  I know you’re struggling to be funny, and I appreciate that.  I struggle with it too.  We can’t all come up with ways to weave detective stories into our SQL Server blog posts.

Cheat: add pictures to distract us from your writing.

Your writing can be dead serious and technically complex, but you can still liven things up with a romance novel cover, a drag queen, or a guy making fun of himself.  There are just two simple rules you have to follow when adding images to your blog.

Rule #1: Don’t Steal Pictures

I hate plagiarism.  I hate it when people pass off my writing as their own, and I know my photographer friends feel the same way about their creative work.

Fortunately, some photographers don’t mind sharing their work as long as it’s properly attributed and linked.  There’s plenty of places to get free images for your blog, and my favorite is Flickr’s advanced search.  Put in your search terms, and then scroll to the bottom of the page and check the box that says “Only search within Creative Commons-licensed content.”  These are images that you can reuse as long as you link back properly to the original image.  (When I’m doing presentations with Creative Commons images, I put the link on the bottom of the slide.)

When you have trouble finding good pictures, turn your search around.  Instead of searching for technical terms related to your article’s topic, search for words that describe the emotion you’re trying to convey, like funny, confusing, challenge, broken, etc.  Sort by “Interesting”, and Flickr will show you the images people love the most.  I’m always surprised by how many great photos I find this way, and how well they work in the blog entry even though they didn’t initially appear to have anything in common with my point.

Side note – I tend to reuse funny images when I find real gems, so I mark them as favorites in Flickr.  If you ever wonder what my next presentation will include, check out what photos I’ve bookmarked recently, and that’ll give you a peek into my brain.

Before you edit the file (change the orientation, crop it, add a funny lolcats-style caption) double-check the usage rights.  It’s on the right side of the photo page on Flickr where it says “Some Rights Reserved.”  Some photos allow Remix use, whereas some don’t allow modifications.

Rule #2: Don’t Steal Bandwidth

When you find a picture you want to use, right-click on it and save it to your computer.  Upload it to your blog, and make the photo be a hyperlink back to the original web page, not the image.  The web page has information about the author and links to their other photos.

Don’t just put an IMG SRC tag that points directly to the other person’s web server.  This uses too much of their bandwidth at no gain to them.  Savvy webmasters will figure out what you’re doing and take action.

Next Up: How to Write a Product Review On Your Blog


PASS Board of Directors Nominations Open

#SQLPass
0 Comments

I never attended a Professional Association for SQL Server (PASS) meeting until I went to my first nation-wide summit in Denver in 2007.  I lived in an area (South Florida) that didn’t have a PASS presence, and I was just going to the PASS Summit for the SQL Server training.

I thought the PASS Board of Directors was something that didn’t really matter to me.  When I got to the summit, I saw the election, and I thought, “What do these people do?  I’ve never heard of most of these candidates.”  I scanned over their bios and cast an uninformed vote for some folks who appeared well-qualified but whom I’d never heard of.

I thought I was stupid for not knowing these candidates.  I was wrong.

See, I dunno about you, but I want a professional organization that is actively building a sense of community, that is actively getting out there shaking hands and kissing babies, and that makes me feel like I’m wanted.  I want PASS to be out there in the face of every single DBA, asking them, “How can I make your job easier?  How can I make it easier for you to find your next job?  How can I help improve your skills?”

Steve Jones wrote a great blog entry about who should run for PASS Board of Directors, and I’d only add one thing:

If you haven’t heard of somebody, it’s not your fault – it’s theirs.

When I write up my list of endorsed candidates, that’s going to be my first qualification.  Somebody might be the greatest organizer on the planet, but if they can’t get their own personal name out there, how the hell are they going to get PASS’s name out there?  And yes, I know that PASS requires a lot of different skillsets, but in volunteer organizations these days, everybody has to be an evangelist.

SQL Server skills don’t matter for Board of Directors members.  I haven’t sat in the BoD meetings, but my guess is that Tom LaRock isn’t busting out SSMS to optimize stored procedures.  I’m pretty sure Andy Warren isn’t debating the merits of cursors versus set-based operations.  When choosing Board of Directors candidates, we need to focus on people who can market themselves and market their projects.

If you know somebody who’s a great evangelist, somebody who can convince people to band together and contribute to a common goal, then talk to them about running for the Board of Directors.  Here’s Steve’s article about what it takes, and here’s how to apply.

(And no, I’m not running, and I echo Grant Fritchey’s thoughts on the matter when he says, “My current position is completely in line with William Tecumseh Sherman; I will not accept if nominated and will not serve if elected. Not yet anyway.” I’d love to help the community, but my spare time’s spoken for this year.)


Blog Better Week: The Basics of SEO

Blogging
5 Comments

This week I’m focusing on how you can improve your blog. Yesterday I kicked things off with why you should schedule blog posts.  Tomorrow I’ll break some bad news to you – your blog is boring.

I blog to help other people.

I’m only successful if they can find me.

Avoid the hot dogs.
No, not every player is a winner.

Picture yourself walking through a carnival jam-packed with crazy idiots, insightful geniuses, and half-clothed hotties all vying for your attention.  They’re all screaming at the tops of their lungs, enticing you into their carnival tent to see their freak show.  Some are free, some cost money, and all are desperate to get you inside.  When you finally pick a tent and go in, you find that even inside the tent, while you’re trying to focus on what you went in for, there’s even more barkers inside trying to get you to go to another tent.

The Internet isn’t fair. The Internet IS a fair.

You might be crafting the smartest, most well-written blog out there, but you’re still out there, and you’re out there along with a gazillion other carnies.  Some of them are trying to make money off their booth, and they’re going to do whatever it takes – fair, slimy or otherwise – in order to get people in the door.

Your one weapon, the great equalizer, is the search engine.

Search engines want to hook their users up with the best content possible.  The better the search results, the more likely the readers are to come back to that same search engine again.  Search engines make money off ads, so they too are carnies hawking their wares.  They want to help you help them help their users. (Triple score! I just won a giant stuffed banana!)  Making your site easier for search engines is called Search Engine Optimization (SEO).

Step 1: Break your blog posts into sections.

This assumes that your posts are longer than a couple hundred words – and frankly, they need to be.  The less content you have, the less search engines have to work with.  Less isn’t more when it comes to blog posts.

For every couple/few paragraphs, throw in a header like you see here.  In WordPress, you should use Heading 3, because Headings 1 & 2 are already used for higher-level titles in your blog (like the blog title and post titles).  Use descriptive words in those headings, words that match the kinds of things people are searching for.  For examples, check out my entry last week about truncate_only, and look at what words I used.

At the same time, you have to walk the line between serving the search engines and serving readers.  If you just stuff your headings chock full o’ keywords, your blog won’t be enjoyable to read.  Make the headlines fun first, and SEO-friendly second.

Step 2: Specify summaries and keywords for your posts.

If you go to a search engine and search for SQLIO, here’s what shows up:

SQLIO Search Results

See the one-sentence explanations below each page title?  Read through those more closely.  Which ones look the most inviting to the readers?

In the battle for visitors, this is like your opening shot.  You only get a tiny amount of screen real estate, and every pixel counts.  Instead of making the search engine guess what your article is about or using a random sentence from your post, why not tell it exactly what to show?  You can!

If you’re using WordPress, install the All in One SEO Pack.  After installation, it’ll warn you that it must be configured.  Just go to the settings page and click the Enable radio button.  All done.  Then, when editing a blog post, you’ll have these fields at the bottom of the post:

All in One SEO Pack

Without the All in One SEO Pack plugin, WordPress tries to guess the post’s Title, Description and Keywords.  The plugin gives you fine-tuned control over exactly what your post is about.

I don’t set these fields on every blog post I write, because I don’t expect most of ’em to have long-lasting search engine power.  The more work I put into a post, though, the more likely I am to put some effort into the SEO fields so that more folks can find my work.

Step 3: Publish a sitemap.xml file.

If you’re using WordPress, all you have to do is install the Google XML Sitemaps Generator plugin.  There’s not much to configure, but set yourself a reminder in your favorite task management software to check the settings once a month.  I’ve had some problems when my file permissions changed and the plugin was no longer able to write the sitemap file.  If that happens, the plugin shows instructions on how to fix it, but unless you go into the plugin’s configuration page, you’ll never know there’s a problem.  It doesn’t send an email or alert you other than showing an error on the config page.

If you’re using a different kind of blog, I got nothin’ for ya.  It’s up to you to figure out how it works.

After publishing a sitemap.xml file, sign up for Google Webmaster Tools.  It’s a completely free service that gives you valuable statistics and diagnostics about your blog, your readers and your search engine optimization.  Once a month, I log into Webmaster Tools to find out if Google’s had any problems indexing my blog.

These steps take a little work, but I’ve found that over time they make a big difference.  Of course, if you’re lazy, there’s a solution for that too.

Enter the Snake Oil: Search Engine Optimization

Wherever there’s carnivals, there’s snake oil vendors: people selling mythical potions that will cure all ailments.  In this case, the ailment is your low search engine ranking, and there’s a long list of vendors who’d love to give you a hand with that problem.

If you pay someone $500 to do search engine optimization for you and boost your search rankings, how do you know if they succeeded?  They’ll tell you that it may take days or weeks for the search engines to notice the changes.  They’ll tell you that your mileage may vary, and that all sales are final, no refunds allowed.  You might even be working on your blog yourself at the same time, and it’s impossible to tell how much improvement was due to their efforts versus yours.

Some SEO vendors do a fantastic job, but they’re priced outside the reach of most bloggers (including me).  If you’re dead set on spending money on search engine optimization, check out the book Landing Page Optimization by Tim Ash.  It explains how to build out sections of your site specifically for search engine visitors and how to get them to stick around.

I’d recommend my favorite SEO company to you here, but then I’d lose my competitive advantage.  After all, the other carnies are reading this too…

WordPress SEO Tutorial Video from Google

Matt Cutts (Blog | Twitter) is a member of Google’s Webspam team.  He spoke at WordCamp San Francisco about how WordPress bloggers can make their web site more searchable.  It’s a 45-minute long video, so warm up that plate of bacon and get cozy.


Blog Better Week: Building Your Blogging Momentum

Blogging
13 Comments

This week I’m focusing on how you can improve your blog.  Enjoy!  Tomorrow’s post will show you how to get people into your carnival booth – I mean, web site.

The question bloggers ask me over and over is, “How do you blog so often?”

The answer is simple: stop clicking Publish and start clicking Schedule.

Blog Nirvana - Plenty of Scheduled Posts

Fail to Plan and You Plan to Fail

Blogging is no different than any other IT work; if you’re always doing things at the last possible moment, you’re going to do a crappy job.  If you start doing your work in advance before it’s due, then you’ll find yourself putting more and more polish into your work.  You’ll stop sweating bullets, stop stressing out over quality versus quantity, and stop approaching your blog with guilt.

I schedule my posts for publication on weekdays, often a week or more ahead of time.  I’ve gotten into the routine of scheduling posts on Mondays and Wednesdays, leaving myself Fridays for spontaneous stuff.  If I have an urgent flash of news that I just have to push out ASAP, I’ll write it up and then rotate out my next scheduled blog post to later in the line.  This post is a great example – I’m writing it on Monday, August 3rd for publication on Wednesday, August 19th, but if something comes up, I can reschedule this post later and later.  (Edit – sure enough, I pushed it back, and I had enough blog entries about blogging that I built a whole week of ’em.) It’s a timeless post – not good for eternity, but at least it can be published at any time without losing its impact.

Plus, when I schedule a blog post ahead of time and sleep on it, often I’ll return to it the next day and remember something I should have added.  I can take my time to refine the post rather than hitting Publish and cringing.

Write When You Can, Not When You Gotta

Scheduling posts ahead of time gives you a sudden flexibility.  When you feel creative, write, and write until you don’t feel creative anymore.  Write as many blog posts in a row that you’ve got time for, and then quit.

My best blogging time seems to be Saturday mornings.  I’ll pile up a list of blog ideas in my favorite task management tool, RememberTheMilk.com, and on Saturday morning I’ll pull up the list to see what strikes my fancy.  If I find the words coming easily to my fingers, then I’ll blog until I get constipation of the word processor.  Usually I can bang out 3-4 entries at once, which buys me two weeks of time.

Next Saturday, if I’m not feelin’ the love, I won’t feel guilty – because I’ve already got enough articles to tide me over.  Voila: stress-free blogging.

How to Get Started Scheduling Blog Posts

Brace yourself: just go cold turkey.

The next time you write a blog post, schedule it to appear a week from now – minimum.  Yes, you’re going to feel guilty.  Yes, you’re going to think that your readers will be horrified at your lack of blogging, but no, none of your readers will actually notice.

What they WILL notice is your sudden increase of quality from that point forward.

Isn’t it worth 7 days of silence for a lifetime of better blogging?

Next Up: The Basics of WordPress SEO (Search Engine Optimization)


How to BACKUP LOG WITH TRUNCATE_ONLY in SQL Server 2008, R2, 2012, 2014, 2016, 2017, 2019

BACKUP LOG WITH TRUNCATE_ONLY is a dangerous command: it empties out the contents of your SQL Server’s transaction log without really backing it up.  Database administrators sometimes run this command right before shrinking their log file with a DBCC SHRINKFILE command, thereby freeing up drive space.

Why ‘truncate_only’ is not a recognized backup option.

When you truncate transaction logs, you lose the ability to recover to a specific point in time. You shouldn’t be running this command except during extreme emergencies. Unfortunately, administrators started running it on a regularly scheduled basis, and then they got surprised when they couldn’t restore the way they wanted.

Microsoft recommends that instead of truncating logs, you switch to the simple recovery model.  That way you don’t generate logs you won’t be using, and you won’t incur performance impacts from repeatedly filling and truncating the logs.  You also remove the need to regularly back up the transaction log.  This has plenty of drawbacks – if something goes wrong with your database, your only option will be to restore the previous full backup.  You could lose hours – maybe even days – of data.

To stop people from shooting themselves in the foot, Microsoft removed this capability completely from SQL Server 2008.  If you try to use this command:
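-- MyDatabase is a placeholder database name:
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY;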

You get an error:
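Msg 155, Level 15, State 1, Line 1
'truncate_only' is not a recognized BACKUP option.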

The only official workaround in SQL Server 2008 and newer is to switch the database’s recovery model to simple as shown in Books Online.  This empties out the transaction log, thereby letting the DBA run a DBCC SHRINKFILE afterwards, then switch the recovery model back to full.
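In script form, the workaround looks something like this – the database and log file names are placeholders:

ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;   -- breaks the log backup chain
DBCC SHRINKFILE (MyDatabase_log, 1024);          -- target size in MB
ALTER DATABASE MyDatabase SET RECOVERY FULL;
-- Take a full (or differential) backup right away so log backups work again.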

That solution still suffers from most of the same problems as using TRUNCATE_ONLY – the database’s recoverability is compromised.  It’s just as bad of a solution, but unfortunately Microsoft can’t remove that workaround since we do need to put databases into simple recovery mode for other reasons.

How to BACKUP LOG WITH TRUNCATE_ONLY in SQL Server 2008

Don’t try what you’re about to see at home. We’re what you call experts.  We’ve got years of experience that keeps us safe.  Just like TRUNCATE_ONLY, this solution has a ton of drawbacks and compromises your recoverability.  This should only be done in cases where a log has grown out of control and must be erased or else the system may crash.  In any other situation, you should consider backing up the log with conventional means.

We can fake it by not writing our backup to a real device.  SQL Server lets us use the NUL: location as a backup target, so the following will do a log backup without actually saving the contents anywhere:
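-- MyDatabase is a placeholder name.  The backup is written to the NUL:
-- device and thrown away - nothing is actually saved anywhere.
BACKUP LOG MyDatabase TO DISK = 'NUL:';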

Remember, we’re still not fixing anything here: whatever caused the log file to grow in the first place can happen again and put us right back where we started.

Your data is not actually backed up. This is a giant problem.  Don’t leave this script lying around where a junior DBA might see it and reuse it for regular backups.  This should only be used to impress your friends with your useless knowledge of SQL Server, much like I’m doing here.

Other Common Questions About Transaction Log Backups

Q: Why shouldn’t I shrink log files?
A: When SQL Server needs to grow the log file back out, it’s a blocking operation. Everything in the database is put on hold while the log file grows out. We can avoid this for data files by using Instant File Initialization, but that doesn’t take effect for log files.

Q: But I had a one-time log file growth and I swear it’ll never need to grow that big again.
A: Okay, cool – go ahead and shrink the log file this once, but make sure you leave enough log file space for normal operations.

Q: How big should the transaction log be for normal operations?
A: I generally start at 25% of the data file size. If you plan on rebuilding your indexes, the log needs to be large enough to hold the size of your largest object, plus space for transactions that are happening during your index rebuilds. If your database is dominated by a single large object, then it might need to be bigger than 25%. Plus, if you’re using things like replication, mirroring, or AlwaysOn Availability Groups, you’ll need enough log space to hang on to transactions during replica downtime – until that replica comes back up and can download the rest of the transactions it missed during the outage.
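If you want a rough sanity check of where your databases stand against that rule of thumb, a sketch like this compares log size to data size (sys.master_files reports sizes in 8KB pages):

SELECT  d.name,
        SUM(CASE WHEN mf.type_desc = 'ROWS' THEN mf.size END) * 8 / 1024 AS data_mb,
        SUM(CASE WHEN mf.type_desc = 'LOG'  THEN mf.size END) * 8 / 1024 AS log_mb
FROM sys.master_files mf
JOIN sys.databases d ON d.database_id = mf.database_id
GROUP BY d.name
ORDER BY d.name;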

Q: How can I find out if Virtual Log Files (VLFs) are a problem for me?
A: Run our free sp_Blitz®, a health check stored procedure that catches databases with abnormal VLFs, bad growth configurations, and much more.
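sp_Blitz® isn’t the only way – if you just want a quick manual look at one database, the old undocumented DBCC LOGINFO command returns one row per VLF, so the row count is your VLF count:

-- Run in the database you care about; one result row per virtual log file.
-- Thousands of rows usually means the log grew in lots of tiny increments.
DBCC LOGINFO;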


Stop Shrinking Your Database Files. Seriously. Now.

I had sworn to myself that if I saw one more helpful article about how to shrink your SQL Server database files with DBCC SHRINKFILE or how to back up your log with TRUNCATE_ONLY, I was going to write a rant about it.

Epic Advice Fail

SQL Server Magazine just tweeted about their latest article, a reader-submitted solution on how to shrink your database files with ease.

AAAAAAAAAAARGH.

To make matters worse, this particular article is by a Microsoft DBA and doesn’t include a single word about the problems involved with shrinking your database files.

AAAAAAAAAAARGH.

Don’t shrink your database files just to free up drive space.  Stop.  It’s an unbelievably, disgustingly, repulsively bad idea.  Your disk drive space is for files, not for ornamentation.  You don’t get bonused based on the amount of free space on your drives.  Empty files don’t take longer to back up.  And so help me, if you find yourself shrinking databases so often that you have to automate it, you need to cut up your DBA card and reconsider your choice of career.

I’m not going to reinvent the wheel by telling you why.  Instead, I’m going to point to half a dozen posts explaining why this advice is just a flat out epic fail:

I feel bad going nuclear on this article, but I’m not just venting about the author.  This kind of advice shouldn’t clear any kind of SQL Server editorial team either.

Coming next month: “How to Reduce Your Backup Times with the Truncate Table Command!”

If you still think you want to shrink a database, check out how to shrink a database in 4 easy steps.

Learn More About Why Your SQL Server is Slow

sp_Blitz®: Free SQL Server Health Check – You’ve inherited a SQL Server from somebody, and you have no idea why it’s slow. sp_Blitz® gives you a prioritized list of health and performance issues, plus gives you URLs for more details about each issue.

Our Free 6-Month DBA Training Plan – Every Wednesday, you get an email with our favorite free SQL Server training resources. We start at backups and work our way up to performance tuning.

SQL Critical Care® – Don’t have time to learn the hard way? We’re here to help with our quick, easy process that gets to the root cause of your database health and performance pains. Contact us for a free 30-minute sales consultation.


We’re Moving to Chicago

30 Comments

Erika and I have moved a lot lately!

Chicago Skyline

In 2005, we left Houston, Texas and moved to Miami Beach, Florida so Erika could pursue her education as an air traffic controller.  In early 2008, when she finished her degree and got an offer from the FAA, we moved back to Houston.  Later that year, when she stopped working for the FAA, we decided to move up to Whitehall so she could see snow for the first time.

Amazingly, Erika actually liked the winter.  When we visited Chicago, she saw it as the perfect combination of big city life plus winters.  I’ve always liked Chicago, and I don’t mind winter in a city (as long as I don’t have to shovel driveways or scrape my car off).  We’d originally planned on moving back to Houston after Michigan because I wanted to plant some roots, but Chicago really called to us, and we could see ourselves spending several years there.

So in late September, we’ll be moving into a loft in downtown Chicago’s South Loop.  (I’m still learning the terminology – it’s a couple of blocks from the Mayor’s townhouse on Michigan, if you know that area.)  I’m really excited at getting back into a big city and checking out the museums, deep dish pizza, and the Chicago SQL Server user group scene.

But mostly the pizza and museums!


SQL Server 2008 R2: Into the Clouds

SQL Server
13 Comments

R2’s new Data-Tier Application (DAC) capabilities let the DBA manage databases as if they were more like virtual servers.  Today I’m going to talk about what that means short-term and long-term.

Short-Term: Nothing To See Here, Move Along

I don’t think most DBAs will see a .dacpac file in the wild for years:

  • It’s SQL Server 2008 R2 only. The new SQL Server Utility model can’t be used to manage older versions of SQL Server.  Whenever I poll enterprise DBAs, the majority of ’em are still on 2000 and 2005.
  • It’s only available in Enterprise Edition. Microsoft has only put the coolest new features in Enterprise Edition lately, and this is no exception.
  • They don’t develop themselves. Somebody has to develop the DACs before you can deploy ’em to your server.  Yes, you can reverse-engineer an existing database into a new Data-Tier Application, but…
  • Not all SQL Server objects are supported. Data-Tier Applications support only a subset of objects like tables, non-clustered indexes, views, functions, and stored procedures – and not just any stored procs, either.

To illustrate that last point, I tried extracting one of my favorite databases, a Twitter cache built using Tweet-SQL, and got the following errors:

DAC Pack Errors

Some of the errors included:

  • This object depends on dbo.tweet_usr_followers(Stored Procedure) that is not supported in a DAC.
  • Accessibility(SqlAssembly) – This object type is not supported in a DAC.
  • dbo.tweet_acc_archive(StoredProcedure) – This object type is not supported in a DAC.

The problem: extended stored procedures and encrypted stored procedures.

Ironically, these techniques are often used by third-party vendors to deliver their code – like Tweet-SQL, and like Quest, for that matter.  Yesterday, I blogged about why third-party vendors are a great use case for DACs, but now we can see why adoption in that user group won’t be quick.  Why would a third-party vendor deploy their database as a .dacpac if it can only be deployed on Enterprise Edition and none of the contents can be encrypted?

In fact, who would use this at all when it’s such a minor subset of SQL Server’s capabilities?

Flash Back to SQL Server 2008’s Release

At the PASS Summit in Seattle, I talked to a DBA who’d really had it with Microsoft.  He complained that 2008 didn’t have anything groundbreaking for production database administrators.  He’d loved SQL 2005’s introduction of SQL Server Management Studio, but since then, he hadn’t seen anything out of Microsoft to really make his job easier. PowerShell and Policy-Based Management were theoretical steps in the right direction, but not big enough.  (To date, I still don’t see widespread adoption of either, and I agree with his sentiments.)

More than making life easier for production DBAs, SQL Server 2008 started to install plumbing that would make life easier for BI users.  Much easier.  Much, much easier.  SQL Server 2008 R2 brings the faucets and shows just how easy it’s going to be.  To quote Microsoft’s R2 marketing page:

“Self-service analysis tools allow end-users to quickly create solutions from disparate data sources within a familiar Microsoft Office Excel user interface. By publishing these solutions in SharePoint Server, users can easily share them with others.”

As Microsoft delivers capabilities to end users, they’re focusing on self-service.

Not making it easier for administrators – making it self-service.  Making it easy enough that no dedicated administrator is required – in theory, at least.

The reality is that DBAs don’t buy SQL Server.  DBAs sell SQL Server.  They sell it to CIOs, developers, and BI users.  Microsoft’s approach with self-service BI in SQL Server 2008 R2 means that they’re selling directly to the BI users, empowering them to do their own work without getting approval from the DBA.

That DBA who complained about Microsoft not focusing on DBA tools isn’t going to be any happier with SQL Server 2008 R2.  The features available at release will be focused on BI professionals, and the new-plumbing features that will work long-term are focused at a different audience altogether.  When you read that SQL Server Utility features will make life easier for production DBAs, you need to read between the lines.

Long-Term: Look Up, The Sky Is Coming

The subset of supported features is eerily similar to SQL Azure, Microsoft’s cloud-based SQL Server offering.

Oh Noes!

Did you hear that just now?

All over the universe, DBAs cried out in anguish at the prospect of reading yet more about how the cloud is coming to steal their buckets.

Just like SQL Server 2008 R2 starts to deliver self-service BI, the next versions of SQL Server will probably focus on delivering self-service data storage.  The “self” could be:

  • Developers who just want to write applications that store data without the hassles of asking for DBA permission
  • Network admins who want to manage SQL Server the same way they manage the rest of their servers – as virtual servers, slicing up pools of resources
  • Project managers who want to buy a third-party application and don’t care about how the database works

Virtualization didn’t become popular by catering to the people who liked their own dedicated servers.  It caught on because it catered to the people who paid the bills and the people who had to manage all those dedicated servers.  Likewise, the DAC concept might catch on not by catering to DBAs, but by catering to people who never really liked DBAs.

These types of users will love the DAC concept, and furthermore, they might like the concept of just hosting their database in the cloud.  We hard-core DBAs look at Azure’s pricing model ($10/mo for 1GB, $100/mo for 10GB) as crazy high, but to project managers and network admins, that’s not too bad at all.  Compare it to the cost of a fully configured SQL Server Enterprise Edition box with licensing (because remember, DACs only work on Enterprise Edition) and the cost is downright sensible.

DBAs complain about security problems in the cloud, but the DAC concept appears to have conquered some of those limitations.  Security requirements are built into the DAC package, including logins and permissions, and non-secure multi-database elements (like extended stored procedures) just aren’t allowed in .dacpac files.

When I interviewed Tom Casey at the PASS Summit in 2008, he hinted at this by saying that SQL Server is already somewhat multi-tenant, and now I see where this is going.  Down the road, you can choose whether to deploy a DAC in your internal cloud (SQL Server resource pool managed by the SQL Server Utility Control Point) or to Microsoft’s cloud.  If your developers have already confined their application to using a subset of SQL Server’s functionality, then it’s no additional headaches for you.

The SQL Server Utility model won’t change the way you work this year, but ask your Windows administrators how their job has changed with the advent of virtualization.  Take heed of the lessons they learned, because your job will be changing next.

Related SQL Server 2008 R2 Posts:


SQL Server 2008 R2 Hands-On Lab

7 Comments

Wanna play with R2, but you don’t have the time to do it?

Yeah, I remember what it was like having a real job.

R2 Splash Screen

That’s where I come in.  It’s my job to learn the latest and greatest from our friends in Redmond, and I’ve already got my SQL Server 2008 R2 lab set up.

Each afternoon this week from 2:30 PM Eastern til 7:00 PM Eastern, you can drop in and watch me working with 2008 R2.  You can ask questions, and I’ll click on whatever you want.  It’s your chance to see what R2 (and Windows 7, for that matter) is like without actually installing it in your environment.

Wednesday’s LiveMeeting link is here, and for audio, dial 1-866-237-3252, passcode 945363.  We did audio over LiveMeeting (computer speakers) on Tuesday, but we had some attendees who couldn’t get the audio through their firewall, so I’m doing it over the phone today.

For future events, come back to this page and I’ll update the meeting link.

Keep in mind that the August CTP doesn’t include Office 2010 or any new SharePoint bits, so I can’t show the slice-and-dice stuff that’s coming in the next version of Excel.


SQL Server 2008 R2: The DAC Pack

20 Comments

R2 makes it easier for DBAs to move databases from server to server in much the same way virtualization admins move guest OSes between physical hosts.  In my last blog post about SQL Server 2008 R2, I explained that databases are becoming more like virtual servers, and today I’ll talk about what that means to DBAs.

R2: Bringing Sexy DAC

SQL Server 2008 R2 still has the same concept of databases, but it’s added a new level above databases called Data-Tier Applications, abbreviated DAC because the abbreviation DTA was already too widely known.  The DAC includes the database schema plus some server-level objects required to support the database, like logins.

The DAC does not include the data inside your database. For deployment best practices, you should have any necessary data (configuration tables, basic lookup tables) already scripted out as part of your deployment strategy. With the DAC approach, it makes sense to put these scripts inside the database as objects. For example, you might have a stored procedure called usp_deploy that populates all of the necessary configuration tables via insert statements.
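
Just to make that concrete, here’s a rough sketch of the idea.  The procedure and table names are mine, not anything R2 ships with:

    -- A minimal sketch of the usp_deploy idea: a hypothetical procedure, stored
    -- inside the database itself, that seeds configuration and lookup data after
    -- the DAC schema has been deployed.  Table and column names are made up.
    CREATE PROCEDURE dbo.usp_deploy
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Seed the configuration table only if it's empty
        IF NOT EXISTS (SELECT 1 FROM dbo.AppConfig)
            INSERT INTO dbo.AppConfig (SettingName, SettingValue)
            VALUES ('RetentionDays', '30'),
                   ('DefaultLanguage', 'en-US');

        -- Seed a basic lookup table the same way
        IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus)
            INSERT INTO dbo.OrderStatus (StatusCode, StatusName)
            VALUES (1, 'Open'), (2, 'Shipped'), (3, 'Closed');
    END;
    GO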

In R2’s SQL Server Management Studio, right-click on a database and click Tasks, Extract Data-Tier Application.  This starts a wizard that will reverse-engineer your database schema, figure out what makes it tick, and package it in a way that you can redeploy it on another server.  The information is saved in a file with a .dacpac extension, and if you try to open it with SQL Server Management Studio, you’ll hit a stumbling block:

Not So Fast

Microsoft’s taking an interesting approach here by drawing a line in the sand.  The first hint pops up in Books Online:

“A DAC can be authored and built using a SQL Server Data-tier Application project in Microsoft Visual Studio. Current plans are to introduce the SQL Server Data-tier Application project type in a future beta release of Visual Studio 2010.”

Henceforth:

  • SQL Server Management Studio is for production database administrators.
  • Visual Studio is for database developers.

What DACs Mean for Database Administrators

If you never had a change control process and your developers just implemented changes willy-nilly in production, then the DAC approach won’t change anything.  Your developers will do what they’ve always done.

If you’ve got change control processes in place, your developers probably hand you change scripts and tell you to implement them in production.  If you’re ambitious, you audit those scripts as a sanity check to make sure they’ll scale.
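
For context, a change script in that world is usually just plain T-SQL you can read before you run it, something like this hypothetical example:

    -- Hypothetical change script a developer hands to the DBA for review.
    -- The DBA sanity-checks it (does the new column need a default?  will the
    -- index help or hurt?) before it goes anywhere near production.
    ALTER TABLE dbo.Customers
        ADD PreferredContactMethod varchar(20) NULL;
    GO

    CREATE NONCLUSTERED INDEX IX_Customers_PreferredContactMethod
        ON dbo.Customers (PreferredContactMethod);
    GO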

In the future, your developers may be creating and updating their database schema, stored procedures, functions, etc. inside Visual Studio, packaging them into DAC Packs, and handing them to you.  In order for you to check their work, you’ll need to switch over into Visual Studio, or perhaps log onto their development SQL Servers to see the schema changes there.  This is another nail in the coffin of the power of the DBA.  From the NoSQL movement to the DBA-less cloud, DBAs need to be acutely aware of how things are changing.

This isn’t necessarily a bad thing; it’s worked great in the world of virtualization.  As a VMware sysadmin, I didn’t need to understand what each virtual server was doing, whether it conformed to best practices, or even what was running on it.  I managed them in large quantities with low overhead simply by moving things around based on the resources they needed.  If a server’s needs grew, I could move it to a larger VMware host or a less-active host.  I only purchased resources incrementally for the entire pool rather than micromanaging what each server needed.  I didn’t do as good a job as I would have by micromanaging each server’s configuration, but I was able to manage more servers with less manpower.  Everything’s a tradeoff.

What if you, as a production DBA, could manage more instances and more databases with less time?  What if, instead of looking at lines of T-SQL code, you were able to step back and see the bigger picture?  What if you treated every application as a sealed, hands-off third-party app?

Perfect DAC User: Third-Party App Vendors

At Quest Software, we build multi-tier applications that store data in SQL Server databases.  For example, Foglight Performance Analysis monitors the most offensive queries in your applications and gives you insight into which developer you should shoot first.

We have basic requirements for our repository databases, but when we hand the product off to the customer, we’re relying on the honor system.  We ask that you not deploy the repository with Auto-Close enabled, for example, but we have to trust that you know what that means and how to make sure it’s not enabled.  If we deploy our repository as a DAC, however, we can build in policies that check for things at deployment time.
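
To give you an idea of what that kind of check looks like, here’s the T-SQL equivalent.  The repository database name is hypothetical, and a real deployment policy would express this as a policy condition rather than a raw query:

    -- A minimal sketch of the kind of check a deployment-time policy could make:
    -- flag the (hypothetical) repository database if Auto-Close is turned on.
    SELECT name, is_auto_close_on
    FROM   sys.databases
    WHERE  name = 'QuestRepository'
      AND  is_auto_close_on = 1;
    -- If this returns a row, the deployment could be flagged or blocked.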

Another downside of storing our data in SQL Server is that you, the SQL Server DBA, tend to poke around.  I’ve worked with DBAs who wanted to extend Quest’s database to store more stuff, or they wanted to store their utility queries in Spotlight’s work database.  They say, “Hey, this QuestWorkDatabase is on all of my servers – I’ll just stash my queries in here and nobody will notice.”  Next thing you know, they’re storing tables with data in there too, and then it’s only a matter of time until they break our schema.

I would love to deliver our repository as a sealed, hands-off appliance database that the DBA couldn’t break.  Unfortunately, though, Books Online says:

“After deployment, the database is managed like any other database. Configuration of the database is done using common mechanisms such as the ALTER DATABASE Transact-SQL statement, the database management dialogs in Management Studio, or using the SQL Server Management Objects in the SQL Server PowerShell provider.”

The vendor in me says, “Damn! The DBAs can still hose up our schema.”

The DBA in me says, “Yay! I can still add indexes and fix bad code in vendor products!”
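
For example, nothing stops you from doing ordinary post-deployment tuning like this (the vendor database, table, and column names below are hypothetical):

    -- Post-deployment tuning on a hypothetical vendor database.  The DAC doesn't
    -- block ordinary T-SQL like this, for better or worse.
    USE QuestRepository;
    GO
    CREATE NONCLUSTERED INDEX IX_QueryStats_CollectionDate
        ON dbo.QueryStats (CollectionDate)
        INCLUDE (DatabaseName, TotalReads);
    GO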

Next Up: DACs and The Cloud

What DACs don’t support gives us a clue about where database technology is going, and I’ll talk about that in my next SQL Server 2008 R2 blog post.


Blog Quiz from Chris Shaw

2 Comments

Today’s quiz from Chris Shaw is a two-parter, and for once in a web quiz, the second part really is a quiz.

1. Do you feel that you have a reliable SAN solution? If so, what’s the secret?

Your SAN Admins

The key to getting a reliable SAN solution that no human being can mess up is to spend over a million bucks on it and buy an Enterprise-class SAN.  Those things are engineered like nobody’s business.  Once they’re set up, it’s hard for us meatbags to goof them up.

Spend less than that, and you get a midrange or entry-level SAN.  These SANs, if left to themselves, are pretty reliable.  The problem comes in when well-meaning mammals walk up and start tweaking dials or upgrading firmware.  The more changes you make to your SAN, the less reliable it gets.

I hardly ever see a stock Honda Civic at the side of the road with the hood up.  The only times I see a stranded Civic is when it’s loaded up with spoilers, tinted windows, custom rims, a fire extinguisher mounted to the A-pillar, and a series of add-on gauges rising up from the dash.  If you don’t want your SAN to look like something out of the Fast and the Furious, the secret is to lay off the nitrous oxide and stop tweaking settings.

2. Explain database mirroring in layman’s terms.

When asked to explain radios, Albert Einstein said:

“You see, wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Do you understand this? And radio operates exactly the same way: you send signals here, they receive them there. The only difference is that there is no cat.”

Database mirroring is exactly the same.  See?  Now, if you’ll excuse me, I’m itching to go play Einstein on the Beach by Counting Crows.  I always giggle when I hear it on the radio – it’s like Einstein’s meowing.

Who I’m Tagging

I’m taggin’ the funniest SQL guys I know – the old Bacon Bits and Bytes crew – to see if they’ll outfox my database mirroring analogy:


SQL Server 2008 R2: Think Virtualization

Virtualization
12 Comments

How are my servers doing?

How are my databases doing?

If my boss came in and asked which servers needed more horsepower or which ones could be consolidated together, how would I come up with an answer?

I’ve always found it funny that DBAs and sysadmins don’t have good answers to these questions.  We roll our own monitoring solutions, cobble things together from a bunch of parts, or fork over money to third party vendors to get a picture of how our environment’s doing.

Virtualization admins, on the other hand, have fantastic answers to these questions.  Take this screenshot from VMware vSphere’s management console:

Host Server Utilization Rates

The sysadmin can see a lot of relevant information at a glance: how utilized the host server CPUs are, how utilized the server memory is, each server’s status, and more.  Wouldn’t it be nice to get a simple report like that for SQL Server?  Check out this screenshot from the new SQL Server 2008 R2 CTP:

SQL Server 2008 R2 Utility Explorer

The first thing to notice is the two pie charts: Managed Instance Health and Data-tier Application Health.

Not servers and databases – instances and applications.  That’s your first hint that things are going to be different long-term.

Virtualization Changed Everything, But Took Time

Looking ahead, Microsoft wants us to start thinking of databases as being less connected to physical servers, and to think of our physical servers as a resource pool.  Imagine if databases were self-contained packages that could be moved from server to server – just like virtual servers can be moved from host to host today.

The virtualization push took years because so many pieces had to fall into place.  We had to figure out drivers, resource sharing, network segregation, and storage throughput, and even systems management had to be rethought from the ground up.  It took a long time to get right, but today virtualization is pushing into enterprises everywhere.

Today, SQL Server faces struggles not unlike those virtualization faced in its early days.  Several factors cloud the Utopian vision of moving databases around seamlessly:

  • Connection strings – our apps call for their data by a specific server name.  If we’re going to abstract servers away, then we need a way to find our data.
  • Logins – logins are set at the server level, yet are tied into databases.  If we move a database from one server to another, we have to make sure that the login exists on the new server, has the same password, and has the same level of access rights.  If the application frequently uses a specific login to call the TRUNCATE TABLE command, for example, we need to know it’ll have that same level of permissions on the new server (see the sketch after this list).
  • SSIS/DTS packages – these techniques can be used to pipe data in and out of our database servers, and they’re often tied in with local servers.
  • Scheduled jobs – as much as I fight developers who want to put jobs on their servers, reality is that I don’t always win.  (Schedulers belong in applications, not databases.)
  • Anal retentive DBAs – we know best, right?  We finely tune some of our applications so that the data lives on one set of disks, logs live on another set of disks, and maybe indexes or partitioned data live on yet another set of disks.  If we start shuffling databases around, we’re going to need to abandon that level of control.
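
Here’s a sketch of just the login half of that problem, using a hypothetical login and database.  The trick is recreating the login on the destination server with the same SID so the database user doesn’t end up orphaned:

    -- On the destination server: recreate the SQL login with the same SID it had
    -- on the source server.  Login name, password, and SID here are hypothetical.
    CREATE LOGIN AppLogin
        WITH PASSWORD = 'Str0ng!Passw0rd',
             SID = 0x9A8B7C6D5E4F30211234567890ABCDEF;
    GO

    -- Then confirm the moved database has no orphaned users:
    USE MyMovedDatabase;
    GO
    EXEC sp_change_users_login @Action = 'Report';
    GO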

Abstracting all of this stuff out of the database architecture isn’t going to be easy, but SQL Server 2008 R2 is starting to take the first step.

The SQL Server Utility: Virtualization for Databases

SQL Server 2008 R2 introduces the concept of the SQL Server Utility: a pool of resources (instances) that host applications (databases).

The Utility is managed by a Utility Control Point: a single SQL Server instance that gathers configuration and performance data.

All of this is visualized through SQL Server Management Studio’s Utility Explorer:

SSMS 2008R2 Utility Explorer

This dashboard shows some basic metrics about CPU use and storage use at both the instance level and the application level.

Down the road – years down the road – this might give DBAs the same fast reaction times that virtualization admins currently enjoy.  Got a database running out of space?  Move it to a server with more space.  Got an application burning up too much CPU power?  Slide it over to one that’s got the latest and greatest processors.

In order for this concept to work, though, we need to do more than just think differently about our databases; we’re going to need to deploy them differently.  SQL Server 2008 R2 introduces the concept of the Data-tier Application (DAC), a self-contained package with everything our applications need in order to store and process their data.  It’s not going to work for everyone, and in my next post, I’ll talk about what SQL Server 2008 R2’s DAC Pack means for DBAs.

Continue to Part 2: What the DAC Means for DBAs


Download the SQL Server 2008 R2 Enterprise CTP

7 Comments

Ladies and gentlemen, start your downloads.

The SQL Server 2008 R2 community preview is now available for downloading.  Read the Data Platform Insider blog entry for more details.

The download link in that blog entry didn’t work for me, but the ever-insightful Wesley Backelant pointed out that it’s available in MSDN/TechNet too.  In your product downloads, click on Servers, SQL Server 2008.  The R2 Enterprise CTP is cleverly hidden in there, with downloads for x86 (hmmm), x64 and Itanium listed.  There’s also an Express Edition set of downloads.

Keep in mind that these downloads don’t include any SharePoint or Office bits needed in order to preview some of R2’s new BI-focused features.

Side note – if you’re into SQL Server and you’re not following Wesley on Twitter, you should be.  He consistently posts news first and responds fast.  Yet another reason that Twitter pays off for DBAs!