Blog

Speaking at PASS Camp 2008 in Germany

#SQLPass
2 Comments

I love my job!

When a new version of SQL Server comes out, database administrators want to know what features will make an immediate difference in their lives.  They want a very fast recap of what they need to do first, what they need to plan for, and what to tell the rest of their staff.  They don’t have the time to build lots of SQL testbeds, play around with the features, discover problems and figure out the best way to implement new policies.

As a full time SQL Server expert for Quest, I do have that time – it’s my job.  It’s my job to dive into SQL Server, learn how to embrace the new features, and learn how to help DBAs do their jobs faster and better.

Plus, since I joined Quest a few months ago, I’ve had the privilege of working with some of the best SQL Server experts around. Our developers, support teams, QA teams and project managers have an absolutely jaw-dropping amount of expertise on SQL Server.

For example, I join a weekly conference call with our support teams to talk about really tough escalation cases.  I get genuinely excited at some of these cases, and when one of our customers pushes the limits of SQL Server, I’ve been known to say, “That’s really cool!”  Of course, our support teams then ask me if I’d like to take ownership of that ticket, and my answer is usually NO, but that’s only because I don’t have enough time in the day to do everything I want to do.

At PASS Camp 2008 in Dusseldorf, I’m doing a session about SQL Server 2008’s DBA-friendly features that have the biggest bang for the buck.  I’m going to concentrate on things that database administrators need to do right away after they install SQL Server – things that will make their lives easier and things that their managers will appreciate.  After a DBA installs SQL Server, we want the manager to say, “Ah, this new version is great!  How did we ever live without it?  Our DBAs rock!”

Making DBAs look good – I love my job.


Important SQL Server Data Services warning

0 Comments

An important email just came out from the Microsoft SSDS team about this week’s upgrade.  The second bullet point in the impact alert is especially important, and everybody needs to take note.  Here’s a screenshot:

Got that?  Good.  To be a good DBA, it’s important to be able to read Wingdings.


Tracking the Chicago-Mac race with GPS

2 Comments

A friend of mine is competing in the 100th annual Chicago-Mac race, a sailboat race up Lake Michigan from Chicago to Mackinac Island.  This year they gave out GPS tracking devices so you could follow your favorite boat’s progress during the race.

Now I’m not saying anybody cheated, but the route of the Hannah Frances is a little unorthodox:


The Wine Trials: a wine book for donut lovers

0 Comments

Last night, the author & editor of The Wine Trials held a release party at the Caroline Collective, the Houston coworking office where I lay my weary laptop.  Robin Goldstein and Alexis Herschkowitsch talked vino, signed books and raised glasses.  I liked them, and I liked the book, and I’ll tell ya why.

I’m just a regular guy.  When I go to a restaurant or a wine shop, I don’t want to dazzle anybody with a deep knowledge of wine.  I just want to spend a reasonable amount of money (say, two or three glasses should cost less than an entree) and drink something that tastes as good as what I’m eating.

I don’t have a very sophisticated palate (mmm, donuts) and I gotta think there are some wines that basically taste good to everybody – like, well, donuts.  You don’t need a sophisticated palate to like donuts – you just like ’em.  Granted, there are a few wackos who don’t like donuts, and there are a few of us who have graduated to beignets, but walk into any office meeting in America with a box of donuts and you’ll win friends and influence people.

So how do we find wines that are the equivalent of donuts – beverages with a wide, almost universal appeal and reasonable donut-style pricetags?  The Wine Trials took the approach of a large quantity of blind tastings: hundreds of people sampling wine from brown paper bags.  (Sounds a lot like downtown Houston, only with feedback forms for each wine.)  Their tastings covered wines in all price ranges, but they focused the book on the top 100 wines under $15.

My test for any review book is to open it up and read their opinions about something I personally have tried and know well.  Erika and I are on a champagne kick at the moment, and the Editor’s Pick in the book is Freixenet Cordon Negro Brut.  Sold – that’s our second favorite budget bubbly, and I can forgive them for not including our favorite (Francois Montand) because it’s nearly impossible to find.

You can buy The Wine Trials from Amazon.

In related news, today is Champagne Friday at Caroline, and in honor of Robin & Alexis, we’ll be serving Freixenet.  If you’re in the Houston downtown or museum district area, come join us for a glass.


Storage virtualization for SQL Server: friend or foe?

Storage
0 Comments

Storage virtualization is a really slick SAN technology that does for SANs what VMware did for servers: it abstracts away the underlying hardware to make management easier.  Multiple SANs can be swapped around back and forth behind the scenes without affecting any servers that store data on those SANs.

It’s nowhere near common yet – it’s somewhat like VMware was several years ago, not yet widespread in the datacenter – but it’s gaining traction, and it’s something that DBAs need to be aware of.  As a DBA, you need to know the risks and rewards so that if your SAN team wants to evaluate storage virtualization, you’ll be able to voice your opinion.

But why do any research?  Take it from me – research is hard, sweaty and painful.  Why not just repeat MY opinion and call it your own?  After all, my opinion is cool and it’s free, and you can read it online right now courtesy of Search SQL Server:

SearchSQLServer article by Brent Ozar – virtual database storage for SQL Server: friend or foe?


Coding Horror post about normalization

2 Comments

I’m a DBA, not a programmer, but I subscribe to Jeff Atwood’s excellent Coding Horror blog because it’s well-written, funny and teaches good lessons.  In his latest post, he talks about database normalization – when to do it and when to avoid it.

Pay particular attention to the links to HighScalability.com – they have great stories about scalability problems and lessons-learned from really big sites like YouTube and Twitter.  Read those stories and know ’em well, because that’s the easiest way to learn some really expensive lessons.


Software Development Meme

0 Comments

Jason Massie tagged me, so it’s my turn to answer the questionnaire….

How old were you when you first started programming?

Dad and Mom upgraded us from an Atari 2600 to a Commodore 64 when those came out, so I must have been around 9-10.  I don’t remember much from those early attempts at programming, but I remember being really frustrated that there was so much typing involved to copy the stuff from magazines into my own computer.

See, I have this mental problem where I don’t want to do something unless I can be really successful at it in a short period of time.  That problem is often defined as laziness, but I’m not lazy – I don’t mind working really hard, but I don’t want to STRUGGLE really hard.  I will work 20-hour days, but I wanna know that I’m actually achieving great things, not trying to accomplish something basic.

Typing long, boring lines into a computer, especially lines that I didn’t think up, definitely fell into the category of time wasters.  It’s one thing to pour your ideas out into a keyboard, but it’s another thing to transcribe somebody else’s ideas, character for character, and then try to hunt down your typos without a debugger.

(That same mental problem is what kept me out of sports!  Practice for weeks just to be able to shoot a free throw?  You’re out of your mind….)

How did you get started in programming?

Lemme tell you how I didn’t get started: I vividly remember Mom forcing me to do piano lessons, and Dad forcing me to do soccer.  Neither of those hobbies stuck in any way, shape or form.  I don’t remember how I got started in programming, but I remember poking and peeking around on my own, so I bet I ran across it in a magazine.  I was a voracious reader.

What was your first language?

BASIC.  (Makes me chuckle because future generations will respond to that question so differently.)  I think my second language, if you can call it that, would have been DOS batch files, though.  I really, really enjoyed MS-DOS.

What was the first real program you wrote?

To me, it’s not a real program until the first user signs on.  That’s when you find the real bugs, the real shortcomings.  My first real program was a help desk front end written in classic ASP.

Our company was growing by leaps and bounds, and McAfee wanted absurd amounts of money for more user licenses for their help desk software.  That software sucked – I mean, reeeeeally sucked – and I said to myself, I could write something better using the same SQL Server back end, save the company a lot of money, and the users would love me because it’d be so much easier to use.  Plus, if I had any ugly bugs, they could use the old Windows console version while I sorted my bugs out.  The company still uses that system today, and it’s handled over 100k help desk tickets.  Hooah!

That’s still my favorite rush in programming: walking past somebody’s computer and seeing my stuff on their screen as they interact with it.  I really love knowing that somebody could choose to use any piece of software out there, and they’re choosing to use mine.  That rocks.  It’s like being the cool kid in school.  (Only without the chicks.)

What languages have you used since you started programming?

Trying to think of these in the order of my career:

  • BASIC on a Commodore 64
  • DOS (yes, I consider batch files a language, especially when they’re hundreds of lines long)
  • VBscript
  • HTML
  • Classic ASP
  • Topspeed Clarion
  • T-SQL
  • Java (the language that made me decide to never learn another language again)

What was your first professional programming gig?

Telman (subsequently bought by UniFocus) hired me on the basis of personal relationships and my hospitality industry knowledge, and then sent me off to Clarion training.  I really liked Clarion, but I haven’t touched it in years.  Makes me want to go play with it now, come to think of it.  Clarion was a database-independent language: in theory, you could change your database back end with a couple of mouse clicks, recompile your program, and hook it up to a different database.

The problem is that when you’re using a particular database, you want to take advantage of the database-specific features that give you better performance or more capabilities.  If your code is generic enough, though, or if you’re willing to invest the time to debug it once, it does work, and I did manage to switch a few apps from Clarion’s proprietary flat files to Microsoft Access to SQL Server 7.0.

Telman was also my last professional programming gig.  Clarion was a dying language, so the company had to switch to a “new”, better-maintained language – either Java or .NET.  I saw the writing on the wall; .NET would have short-term staying power because it has the Microsoft marketing power behind it, but something else would come along in 5-10 years and knock it over.  I could spend a few years becoming really proficient in Java or .NET – but then have to relearn a new language within a decade.  Why bother?  The ANSI SQL language lasts forever, even across different vendors.

If you knew then what you know now, would you have started programming?

Yeah, because I think it makes me a better database administrator.  No way in hell would I go back to programming, though – I hate finding bugs in my own code.  Stored procs are easy enough to unit test and be pretty certain that they’re correct, whereas code that faces end users – that’s another problem entirely.  Users are crazy.  They click everywhere, they do things that don’t make sense, and they expect everything to work flawlessly.  That is seriously hard work.  I really respect good programmers.

If there is one thing you learned along the way that you would tell new developers, what would it be?

Languages come and go like fashion trends.  Don’t get stuck in a pair of baby blue bell bottom pants: choose a language based on how your resume will look 5 or 10 years from now, not based on what the cool kids are doing this week.

Really, really good programmers can pick up a new language in a heartbeat, but you may not be so lucky.  Spend your time focusing and getting really good at one language, and don’t listen to the siren song of whatever new language is sexy today.  Take database administration: learn to code ANSI SQL today, and you’ll still be using that same syntax in 20 years.  Learn the trendy new LINQ, and you may be relearning something else in a few years.  (I’m not sayin’, I’m just sayin’.)

What’s the most fun you’ve ever had … programming?

Embedding sounds in other people’s help desk tickets with hidden HTML code in the ticket notes.  I embedded the A-Team theme song in a note on somebody’s ticket, so whenever he pulled up his list of tickets, the A-Team song started playing, and he didn’t know why.  When I finally let out the secret, hoo boy, that triggered a flood of embedded sounds and graphics.

Who are you calling out?

Bert Scalzo because he talked me into this sweet job

Brian Knight because reading his stuff is almost as good as listening to his seminars

Conor Cunningham before he goes back to work for Microsoft

Jeremey Barrett because I bet he’s better at programming than he lets on

Linchi Shea because he’s a SAN genius

Rhonda Tipton because I’m seconding Jason’s call-out

Before me, the tag order was something like this: Jason Massie > Denis Gobo > Andy Leonard > Frank La Vigne > Pete Brown > Chad Campbell > Dan Rigsby > Michael Eaton > Sarah Dutkiewicz > Jeff Blankenburg > Josh Holmes > Larry Clarkin


AMD triple-core servers and SQL Server 2005

9 Comments

If you’re using AMD Phenom 3-core processors, SQL Server 2005 won’t install without going through some hoops first.  Microsoft just published a knowledge base article on SQL 2005 errors on 3 and 6 core servers.

I don’t usually blog about knowledge base articles, but I got such a laugh out of that one that I just had to share it.

Want to know about MS KB articles as soon as they’re published?  Subscribe to the Microsoft Knowledge Base RSS feed for SQL Server.  You DO use an RSS reader, don’t you?  Hint hint….


Transparent Data Encryption in SQL Server 2008

SQL Server
3 Comments

I played around with TDE a couple of weeks ago, and I was surprised by how difficult it is to implement.  I’d expected to be able to check a box, put in a password, and click OK, but it’s nowhere near that easy.  Restoring encrypted databases from one server to another can also give DBAs a nasty surprise when they least expect it.

I’ve heard several DBAs comment recently about how SQL Server Management Studio is targeted more at developers than it is database administrators, and I think 2008 will reinforce that perception.  Implementing TDE is a good example: there’s no wizard, there are no obvious steps, etc.  Right-click on a database and try enabling encryption, and there’s no obvious reason as to why the feature is disabled – the DBA has to dig through documentation to find out that a server certificate is required first.  Ugh.  TDE is a good first step towards secure data files, but any toddler will tell you that those first steps are always the toughest.
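
For the record, the hoops look roughly like this – a minimal sketch against SQL Server 2008, with made-up certificate, database, password, and path names:

```sql
-- Run in master: a database master key and a server certificate come first.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'SomeStrongPassword!1';   -- placeholder
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE server certificate';  -- placeholder name

-- Then, in the user database, create the encryption key and turn TDE on.
USE MyDatabase;  -- placeholder database name
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE MyDatabase SET ENCRYPTION ON;

-- Back up the certificate immediately; without it, the encrypted database
-- can't be restored on any other server.
BACKUP CERTIFICATE TDECert TO FILE = 'C:\TDECert.cer'
    WITH PRIVATE KEY (FILE = 'C:\TDECert.pvk',
                      ENCRYPTION BY PASSWORD = 'AnotherStrongPassword!1');
```

That last step is what saves you from the nasty restore surprise: restore the certificate on the target server first, and the encrypted database restores normally.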


Russia as seen from my hotel bathroom

5 Comments

I’m spending a few days in St. Petersburg, Russia to visit Quest’s office.  Sure, I have the regular touristy photos, like myself in front of old buildings:

But what I really like about overseas travel is that everything looks and feels different, and it forces me to look at the design and user interface for everything.

Take toilets, for example.  American toilets have the button on the side, and Russian toilets have the buttons on the top.  That’s right, buttons – more than one:

My first thought when I saw this in my hotel room was, “Ah, that’s interesting.  Now what would be the advantages or disadvantages of putting the button on the top?”  I guess you can’t leave things on top of the toilet if the button is on the top.  With American toilets, you can just pull the lid off, and the button remains on the side, so it’s probably easier to design the lid.  This particular toilet lid doesn’t come off at all, which is an advantage in hotels – tourists hide all kinds of stuff in toilets.  I know this from my years of working in hotels.

My second thought was, “Two buttons?  I’m screwed.”  I’ve been in Europe just enough to have seen bidets, and heard about toilets with built-in bidet attachments.  I half expected water to come shooting up and hit me in the eye, so I stood to the side when I tested the buttons.  Turns out the two buttons have two different amounts of water: regular, and Niagara Falls.  I think I could flush a small cat down this toilet.  Now that’s cool.

Are American toilets better than Russian toilets?  No, they’re just different.  The differences continue throughout the bathroom, like the hooks on the wall:

These have a fun, playful design language, very mid-century-modern.  These would look awesome in an American 1950’s ranch-style house, but I’ve never seen anything like them before.  I certainly wouldn’t expect to see them in a traditional hotel like ours where the paintings on the wall are framed with mock gold leaf.  The hotel is very conservative, and then these bathroom hooks say BAM!  Bathroom fixtures seem to be more modern-looking in Europe and Russia, and more traditional in America.  Dunno why.  Not better, just different.

But wait – there’s more.  Check out the toiletries in my room, but look closely and see if anything looks odd:

In the white box, there’s a nail file.  A nail file!  Wow.  I’ve never seen that before.  I’d be curious to know how often people need a nail file when they travel.  Different.

Plus, the labels are all in English.  Not Russian, just English.  Know why?  Because only Americans are dumb enough to travel without all of their stuff.  And I should know – last week I went to California without any socks, and this week I came to Russia without my electric razors.  Which brings me to this:

It takes a big man to admit he’s scared, and I’m a mighty big man, so I don’t mind telling you: razor blades scare the hell out of me.  There must have been some terrifying experience in my childhood involving razor blades, because I never touch them.  When I was old enough to shave, I bought an electric razor, and I’ve always used them ever since.  Normally, if I forget my razor when I travel, I go buy another one.  But here I am in Russia, and if I bought an electric razor, it’d have their electric plugs, and I wouldn’t be able to use it at home without an adapter.

I said to myself, “You’re so interested in things that are different – why not give it a try?”  After all, I’ve seen Queer Eye for the Straight Guy.  I know how the concept works.  So with shaking hands, I lathered up my face and used a sharp piece of metal to scrape hairs off my face.  (Can you tell this concept bothers me?)

And you know what?  I liked it.  It gives a really nice, close shave, and as long as you lather up with warm water and use a sharp razor, it doesn’t cut your face.

That’s why I like travel.  Different can be better.  I might even stick with razor blades.


Quickly replying to recruiter emails

1 Comment

In today’s hiring market, DBAs get a lot of emails from recruiters that say things like:

“I am curious to know if you would be interested in (or know someone who is interested in) relocating to Big Falls East for the position discussed below….”

The job description never includes the salary range, and it rarely includes information about the industry – two things that should be important to a job candidate.  (If you don’t care about the industry, you should start caring – imagine going to work for Ford Motors, an airline, or a mortgage company in today’s economy.)

To handle these emails fast, go into your email program and create a new signature.  The signature should include a default reply text that is really generic, like this:

“Thanks for the email.  If you can forward on information about the salary range and the industry, I’ll pass it along to my peers.  After being repeatedly burned (hey, buddy, check out this job – oh, sorry, I didn’t know it only paid $X) I have a policy against forwarding jobs without salary & industry information.  Thanks, and have a great day.”

Most email programs will let you quickly change your signature for one specific email.  That way, when recruiter emails come in, just hit reply, insert your recruiter signature, and hit send.

When the next email comes in (either with the right info, or without it) you can handle it the right way.  If it includes good information, then it’s a good recruiter, and you should pass it on to your buddies.  If it doesn’t include any more information than the first email, just hit delete, because the recruiting process is off to a bad start.

I know recruiters will disagree, but here’s the deal: if a recruiter is desperate enough to send a blind email to a stranger without a resume, asking that stranger to pack up their house and move to another state, then there’s a reason.  They’ve already exhausted the local candidates – either because they’re not paying enough, the company has a bad reputation locally, or they need a hard-to-find skill set.  Think about that before replying with your resume, because the first two reasons will come back to haunt you, and the third reason means you’re worth a lot of money.


Quest LiteSpeed isn’t right for everybody

2 Comments

Sean McCown at InfoWorld wrote a blog post today that said some negative things about Quest and LiteSpeed, and I had to respond. He’s switched away from using LiteSpeed, and he had an interesting set of reasons.

“I just don’t need the centralized repository in my lab, and I do enough demos that having native SQL syntax for my backups is worth something to me.”

Everybody has different needs.

Sean and I would totally agree that labs have a different set of needs than a typical enterprise or datacenter.

In fact, if I was just doing a lot of demos like Sean, and if I wanted native SQL syntax for my backups, I wouldn’t even use a third party product at all: I’d use SQL Server 2008’s new backup compression to make my life really easy. That way, you can stay purely native, and that makes for really easy demos.
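
Native 2008 backup compression really is that easy – it’s one keyword on the backup statement, or one server option (database name and path below are placeholders):

```sql
-- Compress a single backup explicitly...
BACKUP DATABASE MyDatabase
TO DISK = 'C:\Backups\MyDatabase.bak'
WITH COMPRESSION, INIT;

-- ...or make compression the server-wide default for all backups.
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;
```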

Quest LiteSpeed, on the other hand, isn’t targeted just at labs and demos. It’s targeted at DBAs who need enterprise-ready backup solutions, complex needs like log shipping, centralized reporting, log reading with undo & redo for transactions, and so on. Not everybody needs the advanced features that LiteSpeed provides, and Sean doesn’t. We would both agree that LiteSpeed is not the right product for his lab.

“Even knowing any of the devs is hard these days, and getting anyone to make any changes is next to impossible.”

At smaller companies, especially startups with a minimum of customers, developers can work directly with end users. For better or worse, Quest has a lot of customers, and I’d like to think that’s because Quest has fantastic products. (It’s a pretty good problem to have.) Our product managers coordinate feature requests across all of our enterprise customers, large and small, which helps our developers focus on what they need to do: develop great software.

Just this week, in fact, Quest Software hosted a SQL Server Customer Advisory Board at the offices in Aliso Viejo. We flew out a dozen customers for two days of meetings about our road maps, upcoming features, and asking for their input on where we go next. I wish we could bring every Quest customer out there for the same input, but on the other hand, Steve Ballmer isn’t flying me up to Seattle anytime soon to ask me what SQL Server needs to do next.

We’re in a feature freeze for our new LiteSpeed v5.0. The management tools have been rewritten from the ground up, and I’m really impressed. But there comes a point where you have to stop adding new features and just focus on testing the code, and that’s where we’re at now. Heck, I have a long list of things I want to add, but even I can’t sneak ’em in – and I work for Quest! I’m already excited about the features I’m trying to cram into v5.1.

“Now, with the team being more corporate….”

If your main criterion for a backup product is easy access to the guy who wrote it, then Quest LiteSpeed is not the product for you.

Just a couple of months ago when I was a DBA instead of a Quest employee, though, that was not my main criterion for software selection. My criteria included a global 24×7 support organization with experience across database technologies, a large QA team dedicated to testing mission-critical backup software on every possible platform, and a product with an unbelievably strong track record. The fact that Quest was Microsoft’s 2007 Global ISV Partner of the Year again this year is proof that sometimes bigger is better.


Developers: sometimes you just gotta trust the DBA

0 Comments

At this week’s Customer Advisory Board at Quest, I’ve had a great time sharing war stories with other database administrators.  One theme keeps popping up again and again: developers don’t trust DBAs and seem to refuse to take the DBA’s word for anything.  It seems that developers just want to keep learning things the hard way.

Case in point: a developer wants to loop through all string fields in every database in the shop looking for secure information.  They’re after things like social security numbers, credit card numbers, bank account numbers and so forth.  They want to check the contents of every char/varchar/nchar/nvarchar field in every table, look at the top X records, and check the string contents to see if it matches known text patterns.

In theory, this could be solved by a stored procedure.

In practice, a stored proc would be a bad idea.  SQL Server isn’t particularly good at string processing, and a query like this will instantly push the CPU to 100%, thereby severely slowing down other queries on the box.  Plus, large databases like SAP can have tens or hundreds of thousands of string fields, and storing the results will take gigs of memory and/or TempDB space on a large BI/BW data warehouse.
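
To get a feel for the scale before committing to that approach, a quick metadata query (it touches no table data, just the catalog) shows how many string columns the stored proc would have to crawl in a single database:

```sql
-- Count the candidate string columns in the current database.
SELECT DATA_TYPE, COUNT(*) AS column_count
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('char', 'varchar', 'nchar', 'nvarchar')
GROUP BY DATA_TYPE
ORDER BY column_count DESC;
```

On a big ERP database, those counts alone usually end the conversation.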

The DBA involved has repeatedly recommended against the stored proc approach, but the developer still wants to build one, saying he’s Googled for ways to do it and he’s getting some good leads.  At some point, the DBA will sit back and say, “You know what, go for it, because when your stored proc takes down a production box, I’ll be all too happy to explain whose code did it.”

Another case involved developers who want to store completely unstructured XML data inside the database as text fields, and want to parse through that XML data when searching millions of records for matching attributes.  The XML format will be extremely flexible, but when storing it as text, it’s just not going to perform.  The developers see that as a SQL Server performance problem, when in reality it’s a bad design that won’t scale.
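
To illustrate the difference (table, column, and attribute names here are invented for the example): XML stored as plain text forces a LIKE scan across every row, while SQL Server’s xml data type at least supports typed XQuery predicates and XML indexes:

```sql
-- XML stored as text: every attribute search is a full scan with LIKE.
SELECT DocID
FROM dbo.Docs
WHERE DocText LIKE '%CustomerID="42"%';

-- XML stored as the xml data type: typed queries the engine can reason about.
CREATE TABLE dbo.DocsTyped (DocID int PRIMARY KEY, Doc xml NOT NULL);

SELECT DocID
FROM dbo.DocsTyped
WHERE Doc.exist('/order[@CustomerID="42"]') = 1;
```

Even the typed version is no silver bullet across millions of rows, but it gives the optimizer something to work with; the text version gives it nothing.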

We spent some time tonight at the CAB talking about where the DBA draws that line – where he gives up trying to protect the developer and just gives the developer enough rope to hang themselves.  It’s an interesting challenge politically, but the moral of the story is this: developers out there, if your DBA keeps recommending that you do something, and all of a sudden one day he caves in and sees things your way, be careful.  You might be in for a rocky ride.


I have big hands

1 Comment

I’m staying at an oceanside hotel in Laguna Beach, California for a meeting with some of our customers.  When I say oceanside, I mean oceanside: there’s a huge wave break about a hundred feet from my balcony, just two stories down.  The waves are deafening.

It’s a really romantic setting, but I was still surprised to find these at my bedside:

“Grande for Increased Protection?”  Wow.  I mean, it’s one thing to have this kind of gear provided for you by the hotel, but it’s still another when they give you the big ones.

Or maybe they just gave the big ones to me.  Maybe the front desk clerk called down to housekeeping and said, “Look, I’ve seen this guy.  You gotta give him the big ones.  No playing around here.”

If you want to see the full details of the hotel amenities, click on the photo.  I wouldn’t want to show the whole thing on my normal public page without clicking on it.


SQL Server Partitioning: Not the Best Practices for Everything

Partitioning, SQL Server

When SQL 2005 was in beta, I clapped my hands and jumped for joy the first time I played with partitioning.  I’ve done a lot of data warehouse work, and partitioning lets the DBA break down massive tables into smaller, easy-to-manage chunks.  We can put heavily accessed parts of the table onto fast storage, and less-accessed data onto slower, cheaper storage.  The cool part is that it’s completely transparent to applications (as long as you don’t have to change primary & foreign keys): they don’t have to know the table is even partitioned.

We implemented it almost as soon as SQL 2005 came out, and I kept wondering why more people didn’t take advantage of this feature.  Sure, it’s not particularly intuitive, and there’s no GUI for it in SQL Server Management Studio, but it does work.

“Can partitioning help us?”

Fast forward to yesterday: I met with Mark Schmulen and David Lyman of NutshellMail.com.  It’s a slick solution for those of us who work behind corporate firewalls that prevent us from accessing personal email sites like Gmail and Yahoo, as well as personal networking sites like Facebook and Myspace.  When I worked for JPMorgan Chase, I was frustrated that I couldn’t get my personal email because I had a network of DBA buddies who all knew and used my personal email address.  We would ask each other for support from time to time, and I liked being able to help out.  At JPMC, though, the only way I could do that was contact all my friends and have them switch to using my corporate email – which I couldn’t check when I was outside of the office, so that was a lose-lose situation for me.  NutshellMail gets around that by checking your personal email accounts on a schedule you select, and then sending you digest emails to your work address.  Makes perfect sense, and I’d use it in a heartbeat – even if they weren’t using SQL Server for storage.  (Bonus!)

We talked scalability, and they’d heard partitioning may be a good answer for future growth.  I was pleasantly surprised to hear that the word about partitioning is getting out, and that was the first time I said no, partitioning isn’t the right answer for this one particular case.  Having talked through that, I wanted to touch on some of the reasons here.

Partitioned tables require Enterprise Edition.

Standard Edition can’t do partitioning, and Enterprise is a few times more expensive than Standard.  On a 2-CPU box, we’re talking about a $30-$50k difference in costs.  If we’re only using Enterprise for the partitioning, then there’s a better return on investment if we put that $30-$50k into faster storage instead.  This is especially relevant for servers with direct attached storage, even in a RAID 10 config.  I wouldn’t consider partitioning unless the server had at least 10-20 drives for the data array alone.

Partitioning best practices call for using both fast and slow storage.

In the case of a data warehouse, a sales table is loaded every night with the previous day’s sales data.  The most recent data is under heavy load because it’s constantly being accessed by both the ETL processes and the daily reports – because users really want to know what happened in the last few days to a month.  We have to keep years and years of sales data on hand for curious users who want to check trends, though, and we don’t want to keep all of that data on extremely fast (and extremely expensive) storage.  (Yes, partitioning gurus, I know there’s work involved in moving data from one array to another, but sometimes it’s worth it.)  If we partitioned by date range, we could keep the new data on fast storage, and shuffle the older data off to slower storage.
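That fast/slow split is exactly what a partition scheme expresses.  Here’s a minimal T-SQL sketch of date-range partitioning across storage tiers – all the names (pfSalesDate, psSalesDate, the SalesFast and SalesSlow filegroups, dbo.Sales) are made up for illustration, and the filegroups would have to exist in the database first:

```sql
-- Partition function: three boundary dates create four partitions.
-- RANGE RIGHT means each boundary value belongs to the partition on its right.
CREATE PARTITION FUNCTION pfSalesDate (datetime)
AS RANGE RIGHT FOR VALUES ('2006-01-01', '2007-01-01', '2008-01-01');

-- Partition scheme: map the two oldest partitions (pre-2006 and 2006)
-- to cheap, slow storage, and the two newest (2007 and 2008+) to fast storage.
CREATE PARTITION SCHEME psSalesDate
AS PARTITION pfSalesDate
TO (SalesSlow, SalesSlow, SalesFast, SalesFast);

-- Create the sales table directly on the partition scheme,
-- using the date column as the partitioning key.
CREATE TABLE dbo.Sales (
    SaleDate datetime NOT NULL,
    Amount   money    NOT NULL
) ON psSalesDate (SaleDate);
```

As each year ages out, you’d split in a new boundary and shuffle the older partition off to the slow filegroup – that’s the “work involved in moving data from one array to another” mentioned above.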

Partition functions and schemes are easy to design incorrectly.

Before implementing partitioning for the first time, either get personalized advice from a DBA who’s done it several times, or get an identical development system for repeated testing.  I had the luxury of an identical QA environment where I could repeatedly test different partitioning strategies and replay the end user load to see which strategy performed best.  Not everybody’s that lucky (come to think of it, I should have played the lotto more), and it’s a mistake to try it on a production system first.  When I first started using partitioning, I thought I knew exactly the right strategy for our data – and we’re talking about a schema I knew forwards and backwards.  I was wrong – really wrong – but at least I found the right answer before going into production.
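One classic way to get the function wrong is the RANGE LEFT versus RANGE RIGHT choice, which controls which side of the boundary the boundary value itself lands on.  A quick sketch (function names made up for illustration):

```sql
-- RANGE LEFT: the boundary value belongs to the partition on its LEFT.
-- A row stamped exactly '2008-01-01 00:00:00' lands in the partition
-- holding the 2007-and-earlier data -- often not what you intended.
CREATE PARTITION FUNCTION pfLeft (datetime)
AS RANGE LEFT FOR VALUES ('2008-01-01');

-- RANGE RIGHT: the boundary value belongs to the partition on its RIGHT.
-- '2008-01-01 00:00:00' lands in the 2008 partition -- usually the
-- intent when partitioning on dates.
CREATE PARTITION FUNCTION pfRight (datetime)
AS RANGE RIGHT FOR VALUES ('2008-01-01');
```

Subtleties like this are exactly why repeated testing against a realistic load beats designing the scheme on a whiteboard and going straight to production.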

In summary, outside of data warehouses, I like to think of partitioning as the nuclear bomb option.  When things are spiraling out of control faster than any other strategy can handle, partitioning works really well.  However, it’s expensive to implement (Enterprise Edition plus a SAN), and you don’t want to see it in the hands of people you don’t trust.

More SQL Server Table Partitioning Resources

Before you start designing a table partitioning strategy, or if you’re trying to troubleshoot why your partitioned tables aren’t working as fast as you expect, check out our SQL Server table partitioning resources page.


Caroline Collective: The Morning After

7 Comments


It’s the morning after the Caroline Collective grand opening party. The place was absolutely packed with interesting people, and that’s my definition of a success. For images of the evening, check out the Caroline Collective Flickr pool.

Seemed like everybody I talked to wanted to get in on the coworking action, and they had the same basic questions:

Q: When is Caroline open?

It doesn’t have an “official” schedule yet, just whenever a tenant shows up. The tenants have keys. I’m here on weekdays by 7-8am, and from then on there’s somebody here until pretty late. Join BrightKite.com – it’s like Twitter, but for physical locations – and you can get notified whenever one of the tenants checks in at 4820 Caroline. Sometimes we’re here but just forget to unlock the doors – send a tweet or a BrightKite note and it’ll rouse us.

Q: Is everybody in the web business?

No, that’s what I thought it was going to be too, but it turns out that’s not the case. The only web folks we have here (off the top of my head) are True Light Resources, and everybody else is all over the map. Caroline tenants include software developers, a TV producer, a photographer, a foundation, and of course, the ArtStorm crowd next door. It’s a really broad pool of people.

Q: What’s it take to join?

Anybody can show up anytime Caroline’s open and just sit down at an available desk. There’s free WiFi and free coffee. If you like what you see, there are a few membership levels. At the $300/mo level, you get your own desk, which is pretty cool because you can leave your stuff there – monitor, speakers, whatever.

Q: What’s it take to get an office?

Patience, grasshopper. Those went first – as of a couple days ago, they’re all taken. We’ve got plenty of desks available, though.

Q: Who were those two well-dressed hotties?

Those would be Ned Dodington (left) and Matthew Wettergreen, the founders of Caroline:

Matthew Wettergreen and Ned Dodington

Excellent photo by Ed Schipul, and excellent vision by Matt & Ned.  This place was absolutely awesome last night, and the energy was so positive.  Everybody seemed to “get” the whole coworking concept, and I’m excited at what’ll happen over the coming weeks.  Congratulations, guys!


Time for a career change

1 Comment

I know, I just told you guys that I got all settled in at Quest and I’m loving it and everything, but I just had the most unbelievable offer come in. It just came out of nowhere straight into my email, and what can I say – I just can’t pass it up. It’s perfect for my background.

A picture is worth a thousand words, so without further ado, here’s the email itself:

Look out, Zohan.  A former SQL Server DBA makes an even better hairstylist.  They said it themselves – I’m the perfect candidate.  Hooah!


SQL Server backups using SAN snapshots

4 Comments

One of my articles just got posted up at SearchSQLServer. I cover some of the pros and cons of backing up data warehouses with SAN snapshots.

I wrote that just after we got done with a NetApp deployment at Southern Wine. The funny part is that we deployed the QA environment pretty quickly, but months after I left, they still haven’t moved the production warehouse over to the NetApp. I should have borrowed that set of drives until they were ready to go live, heh.

Update as of 6/3 – I originally wrote that article for SearchSQLServer.com.  I didn’t see the article appear there for a while, and when it popped up on SSWUG, I assumed SearchSQLServer had transferred it over to SSWUG.  Turns out that’s not the case.  We’re working through the details of that now.

This isn’t the first time one of my articles has been “pirated” – used to happen to me when I wrote for HAL-PC – but this is the first time it’s happened with an article I actually got paid to write.  Next thing you know, I’m going to need an agent.  Or maybe not.