Caroline Collective grand opening coming June 7th

2 Comments

I moved my gear into my desk at Caroline Collective today – well, actually, it was more of a buy-and-move. Most of this stuff is a new copy of the gear I use at home.

The grand opening will be Saturday, June 7th from 7pm til 10pm. The invite reads as follows:

“We are (collectively) happy to announce the Grand Opening of the Caroline Collective, Houston’s first coworking venue! Please join us Saturday, June 7th from 7-10pm at the home of Caroline Collective for drinks, light refreshments and heavy celebration. The event will be co-hosted with ArtStorm featuring collage artist Patrick Turk and the esteemed tenants of Caroline. Your attendance is not only appreciated, it’s needed. After all, it’s about all of us. Tasty hops provided by Houston’s own Saint Arnold Brewery and delectable dishes by chef David Grossman.”

Caroline’s at 4820 Caroline Street, 77004 with easy access from the Wheeler stop on light rail. Parking is available. I’ll be there, so if you’re interested in meeting a strangely outgoing SQL Server guy from Quest Software, c’mon down.

This is also the same night as the Ladytron concert, for which I hold two tickets.  The concert starts at 9pm – the jury’s still out as to whether or not Erika and I will leave the CC grand opening for the concert.  If I’m debating tossing a pair of Ladytron tickets to stay here, then you know it’s going to be a good party.


Getting Into a Database Administrator Position

Professional Development
39 Comments

I got a question from Ron G asking how to go about changing positions from help desk to DBA.  Here are my thoughts:

Build on what you already know.

If you’re used to working on IBM AIX systems, for example, you’ll want to use some of that skillset by working with databases that run on AIX.  If you’re used to working on Windows computers (even in just a help desk environment), you’ll want to stay on Windows.  Don’t try to learn both an operating system and an application at the same time if you can avoid it – the faster you get up to speed on the database alone, the faster you’ll get paid.

Attend free webinars.

Find third party vendors that support the database you’re trying to learn, and check out their marketing webinars.  They’re in the business of helping database administrators learn and grow, and they conduct some great training sessions for free just to get their products in front of you.  I’ve done a couple SQL Server training webcasts for Quest Software that cover how to accomplish common DBA chores using the native tools versus how much faster it is with the Quest tools.  I don’t know about you, but I learn a lot faster when I’m listening to a real human being talk instead of reading dry text, and webcasts are much more fun.

Join the local database user group.

You’d be surprised how many cities have user groups for databases.  Go, and promptly close your mouth, hahaha.  Don’t try to contribute – just sit, watch, listen and learn.  People give presentations every month about database topics.  You’ll learn a little about databases, but more importantly, you’ll learn about the city’s market for the database you’re trying to learn.  Other people will get to know you, and down the road, you’ll find somebody who’s willing to show you the ropes.  (Everybody wants to hire junior DBAs.)

Volunteer after hours with your DBA.

Talk to the friendliest DBA at your company (or another company in the user group) and tell them you’re interested in learning more.  Tell them that you’re willing to show up after hours if they’re doing maintenance and watch & learn.  This isn’t going to be an easy sell – with telecommuting these days, a lot of maintenance is done remotely via VPN – but if you’re lucky, you’ll find a taker.  At Southern Wine, I had a relationship like this with a junior DBA: whenever I planned after hours maintenance, I’d email him to tell him when it’d take place.  If he wanted to join me, we’d meet up at the office that night and I’d explain each of the steps I was doing as I did it.  It slowed me down as a DBA, but the payoff came when I wanted to take vacations, because he was already familiar with more systems than he’d ordinarily come across.

Find local database software companies.

Companies all over the US build add-on software for your database platform of choice.  They build things like performance monitoring tools, backup software, database utilities, etc., and all of that software needs support.  They have a help desk, and they’d love to hire people who want to grow their database experience.  You’ll be able to make a quick career change, plus get into a position where you’re learning databases on the job.  You can find these companies by Googling for your database platform name plus tools or management, like “SQL Server management” or “SQL Server tools”.  Also check the magazines (yes, there are even database magazines!) and look at each of the advertisers to see where they’re located.  Call them and ask if they have an office in your city, because some of these companies are pretty big.  (Quest has over 3,000 employees all over the globe.)

Avoid consulting companies unless you know another employee there.

I know I’ll get email for this one, but here’s the deal: a lot of shady consulting companies are willing to throw anybody into a position just to make billable hours.  They pay you $X per hour, and they bill the client twice as much.  Presto, they’re making money off you, and they don’t care whether you know what you’re doing or not.  The client won’t find out right away because the consulting company won’t let them talk to you directly – they’ll manage all meetings via a project manager who does all the client interaction.  After a few months, when the client figures out that you don’t know what you’re doing, the consulting company can shuffle you off to another project.  You won’t learn much (there won’t be another DBA there to help you) and you’ll get demotivated.

Most importantly, be honest.

Don’t be afraid to say you don’t know the answer to something.  My official job title at Quest is “SQL Server Domain Expert”, and I get a big chuckle out of that.  Yesterday I met with two people for three hours (hi, Eyal and Melanie) and it would take two hands to count the number of times I said, “I don’t know the answer to that.”  Granted, my job puts me in the line of fire for some really tough technical questions, but you get the point.  Database administrators can’t know everything – today’s databases cover way too much functionality – and that’s okay.  Nobody expects you to know everything, but they’ll expect you to know where to find the right answers quickly.

More DBA Career Articles

  • Moving from Help Desk to DBA – a reader asked how to do it, and I gave a few ways to get started.
  • Development DBA or Production DBA? – job duties are different for these two DBA roles.  Developers become one kind of DBA, and network administrators or sysadmins become a different kind.  I explain why.
  • Recommended Books for DBAs – the books that should be on your shopping list.
  • Ask for a List of Servers – DBA candidates need to ask as many questions as they answer during the interview.
  • Are you a Junior or Senior DBA? – Sometimes it’s hard to tell, but I explain how to gauge DBA experience by the size of databases you’ve worked with.
  • So You Wanna Be a Rock & Roll Star – Part 1 and Part 2 – wanna know what it takes to have “SQL Server Expert” on your business card?  I explain.
  • Becoming a DBA – my list of articles about database administration as a career.

Benchmarking VM Fusion storage

Hardware
2 Comments

I use a Macbook Pro with VMware Fusion, which lets me run a bunch of virtual Windows machines for testing. I upgraded my internal drive to a quick 320GB one a few weeks ago, intending to run all my VMs off that internal drive, making it easier to do testing on airplanes or in coffee shops. Problem is, one of the things I test is Quest LiteSpeed, database backup software that does a lot of disk I/O, and during heavy backup/restore testing, I couldn’t do much multitasking because the internal laptop drive was being hit so hard.

I figure it’s a matter of time before I put a second hard drive in the Macbook Pro, but for now, I’ll stick with external drives.

I’ve got a bunch of USB drives, so I did some comparisons to see which ones got the fastest storage performance inside VMware. I wanted to find out how much performance I’d really gain by switching to an external drive, and whether it made a difference if I used FireWire or USB.

Keep in mind that this is not the native performance of each drive – this is the performance as seen inside a VMware Windows XP guest. I didn’t care about native performance, because that doesn’t do me any good. I’m only interested in the performance the VMware guests see, because that’s the only thing I’d use the external drives for.
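
If you want a rough sanity check of your own without firing up HD Tune, a simple sequential write/read timer gets you in the ballpark. This is just a sketch, not a benchmark – it does a single pass, and OS caching will skew the read number, so treat the results as relative comparisons between drives only:

```python
import os
import tempfile
import time

def sequential_throughput_mb_s(size_mb=64, block_kb=256, dir_path=None):
    """Time one sequential write (with fsync) and one sequential read of a
    scratch file, returning (write_mb_s, read_mb_s).  OS caching still skews
    the read number, so treat results as rough relative comparisons only."""
    block = b"\0" * (block_kb * 1024)
    blocks = (size_mb * 1024) // block_kb
    fd, path = tempfile.mkstemp(dir=dir_path)
    try:
        start = time.perf_counter()
        for _ in range(blocks):
            os.write(fd, block)
        os.fsync(fd)
        write_s = time.perf_counter() - start

        os.lseek(fd, 0, os.SEEK_SET)
        start = time.perf_counter()
        while os.read(fd, block_kb * 1024):
            pass
        read_s = time.perf_counter() - start
    finally:
        os.close(fd)
        os.remove(path)
    return size_mb / write_s, size_mb / read_s

write_speed, read_speed = sequential_throughput_mb_s(size_mb=16)
print(f"write: {write_speed:.1f} MB/sec, read: {read_speed:.1f} MB/sec")
```

Run it inside the guest with `dir_path` pointed at a scratch folder on each drive (internal, FireWire, USB) and compare the numbers the guest sees.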

Here are the benchmarking results from HD Tune:

Internal Western Digital 320GB 2.5″ WD3200BEVT:

WD3200BEVT

Big, bulky external Maxtor OneTouch III 3.5″ 500GB connected via FireWire 400:

External 500 via FireWire 400

Whoa. That’s only averaging 35.5 MB/sec, whereas the internal gave me 48.5 MB/sec. I’m surprised there.

External Maxtor OneTouch III 3.5″ 500GB connected via USB:

External 500 via USB

Not much difference between FireWire and USB as far as speed goes, but the USB connection used almost double the CPU power.

External USB enclosure with a Fujitsu MHY2120BH 2.5″ 120GB (the Macbook Pro’s original internal drive):

External 120 via USB

I’ve also got an external SATA RAID enclosure that does mirroring & striping, and I’ll test that later just out of curiosity, but there’s no way I’d put my virtual machines on there because it weighs more than my laptop and it makes a loud racket.

These results are not scientific – I just did one pass of testing on each drive. Your mileage may vary. Offer not valid in all fifty states. No purchase required to win. See participating locations for details.

My verdict: I’ll put the virtual machines on the 2.5″ external enclosure.  It’s tiny, doesn’t require external power, and it won’t be as fast as the internal drive – but at least it’ll let me multitask.


Performance troubleshooting flowchart

4 Comments

Junior DBAs, warm up your printers. The Microsoft SQL Customer Advisory Team, aka SQLCAT, has published a flowchart explaining how to troubleshoot performance issues with SQL Server. This is by far the best visual representation I’ve seen into how a senior DBA’s brain works:

SQL Customer Advisory Team’s Troubleshooting SQL Server 2005/2008 Performance and Scalability Flowchart

And just think – the rest of us had to learn this stuff the hard way – by reading manuals!


Signed my lease at the Caroline Collective

2 Comments

Yesterday afternoon, I stopped by the Caroline Collective and signed a lease on my very own desk.  Sounds odd to lease a desk, eh?  Especially sight unseen – there are no actual desks in the space, just a big empty room with concrete floors, white & blue walls, and fluorescent lights dangling from a low ceiling.

Matt & Ned nervously showed me their prototype wood desk, and I could tell they weren’t sure whether or not I was going to “get” it.  One of them said something about how the desk would be finished off, and I had to laugh.  Finishing anything off isn’t the point.  It’s not that coworking needs to be unpolished, but the finish on the desks doesn’t matter.  Although, I do have to confess that I cringed when I read Ned’s Twitter about Knoll furniture – I thought to myself, please, God, don’t go buying high-end office furniture this early in the game.  I love modern stuff like that, but damn, it’s expensive for a startup business.  Anyway, I was relieved to see the desks were inexpensive but well-crafted wood jobs instead.

I get the whole coworking thing, especially as somebody who telecommuted for five years, but I bet most people aren’t going to see it until the desks and the personalities go in.  The factor that makes coworking tick is the chemistry – the unique mix of people from different backgrounds and different companies (or no company at all), and the laid-back informal discussions that don’t come from meetings organized on a calendar.

When the people are in, when the artists are slinging paint, when the beer is in the fridge, when the desks are filled haphazardly with relics of different careers, that’s when people will get it, and it’ll happen like wildfire in a city like Houston.

And you’ll wish you’d signed a lease on a desk while they were still available.  Trust me.


VMware ESXi 5 on an Apple Mac Mini 2010 – It Works!

Hardware
78 Comments

Great news – Pedro Costa has a working solution!  Apple Mac Mini 2010, 2011, and 2012 models all boot a patched version of VMware ESXi 5.0:

VMware ESXi 5 Running on an Apple Mac Mini

I’d always wanted to get a small VMware vSphere 5 (ESXi) lab farm up and running, and I wanted to use Apple Mac Minis for their compactness and silence.  It has to be vSphere ESX or ESXi, not VMware Fusion or Parallels, because my clients all use ESXi and I wanted to be able to do things like VMotion and Storage VMotion.

These ISOs work for my Apple Mac Mini 2011 (5.1):

Download the ISO, burn it to CD, and boot from it.  The install goes flawlessly.  The USB keyboard works, video out (via HDMI!) works, and the onboard Ethernet wired network card works.  WiFi doesn’t, but that’s okay – I wouldn’t even run a lab off that.

Presto – my Ikea datacenter comes to life!

My Ikea Datacenter

That’s two Mac Minis running VMware ESXi 5, a cheap $250 Netgear NAS handling shared storage duties, and a few other pieces of unrelated tech gear.

Thanks, Pedro!


My rudimentary approach to software testing

0 Comments

I’m testing the new builds of LiteSpeed and Toad.  I’m not to the point where I’m using my L337 SQL skillz yet because I take a very basic ground-level approach to testing.

I go through an entire program and click on every menu item.

Everywhere that it lets me add something, I add it, and then I go back and edit that something to see if it looks the same as what I added.  Then I delete it.

That’s all.
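
That add/edit/delete pass boils down to a round-trip check you can express in a few lines. Here’s a sketch against a made-up in-memory store – the method names (`add`, `get`, `update`, `delete`) are hypothetical, standing in for whatever the app under test actually exposes:

```python
def crud_round_trip(store, item, edit):
    """Generic add -> read back -> edit -> read back -> delete check.
    The method names (add/get/update/delete) are hypothetical -- adapt
    them to whatever the app under test actually exposes."""
    key = store.add(item)
    assert store.get(key) == item, "read-back after add didn't match"
    store.update(key, edit)
    assert store.get(key) == edit, "read-back after edit didn't match"
    store.delete(key)
    assert store.get(key) is None, "item still present after delete"

class DictStore:
    """Minimal in-memory stand-in for the application under test."""
    def __init__(self):
        self._items, self._next_key = {}, 0
    def add(self, item):
        self._next_key += 1
        self._items[self._next_key] = item
        return self._next_key
    def get(self, key):
        return self._items.get(key)
    def update(self, key, item):
        self._items[key] = item
    def delete(self, key):
        del self._items[key]

crud_round_trip(DictStore(), {"name": "backup job"}, {"name": "backup job (edited)"})
```

The same loop works whether the “store” is a dialog box, a config screen, or a database table – the point is that everything you can add, you read back, edit, and delete.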

Sounds simple, but it takes days, and it finds a surprising number of bugs.  You would not believe the number of things that cause a program to crash just by clicking on them.  I’ve done these tests with developers in the same room, and they say things like, “Why on earth would you click there?”  Well, because you put a button there, and somebody’s going to click it.  If it’s not supposed to be clicked, get it off the screen.

In a week or two, I’ll get to the point where I’m testing advanced concepts inside the software, but for now, it’s Click City.

The other benefit of this approach is that you learn just about every piece of the software, every nook and cranny.  In the case of Toad, it’s mind-blowing.  I thought it was just a SQL Server development tool, and I don’t believe in unitaskers, but oh no – this is one serious multitasker.  I can see many Toad-centric how-to blog posts in my future.


Great SQL Server newsletter

3 Comments

When DBAs ask where to go for simple, straightforward SQL Server tips, I usually point them to SQL-Server-Performance.com.  That site puts out a fantastic daily newsletter with SQL notes, questions & answers, and general tips.

I’m not wild about daily email newsletters, but this one’s a gem.  It’s gotten even better since Peter Wardy started putting a personal editorial at the top of each issue.  Here’s the one from today:

“I was tasked to buy a mattress under blanket for a bed today and who would have thought that the process would be so complicated. I thought it would be as simple as choosing the correct size—king or queen. Little did I know that the decisions were endless; from summer and winter mattress protectors to underlay blankets with wool and magnetic touch zones and prices that ranged from $60 to over $800. When I think about the purchase decision of an under blanket it is very similar to an open source database.

When I went to the shop today all I wanted was an under blanket that would protect the mattress, keep me warm and provide some comfort. Whether that was a cotton one or a wool blend with a high pile did not worry me, I just wanted something that worked. When I choose a database it is the same as an under blanket, I want a database where I do not need to make decisions about which database engine I want to use. I want an engine that works no matter what my requirements rather than having to choose a different engine if I want transactions or a different engine again if I want Foreign Key constraints. I think one of the key benefits of SQL Server is that the behaviour is typically the same whether a single user is running Express Edition or a multinational corporation is running Enterprise Edition.” – Peter Wardy

The only things that would make this newsletter better are searchable online archives and an easier-to-find subscription link.  I can help out with that last one:

Subscribe to the SQL-Server-Performance.com Newsletter


Windows 2008 with Hyper-V on a Dell Latitude D830

Hardware, Virtualization
4 Comments

My new work laptop for Quest is a Dell Latitude D830 running Windows XP. I gave XP another shot, and after a day I’d switched back to my Macbook Pro. I needed to use some of the XP-only apps, so I virtualized the XP image and set about turning the Dell into my mobile datacenter.

Windows 2008 Standard installs fine, but doesn’t have drivers for:

  • Video card – NVidia Quadro 140M – go to support.dell.com and download the Vista 64-bit drivers
  • Wireless network card – still working on this one
  • Bluetooth – don’t install the Dell Vista drivers for this.  It doesn’t work.  I don’t need it anyway, but just thought I’d try it.

Hyper-V works well so far, though it’s no replacement for VMware Workstation or VMware ESX/VI.  I had a few minor problems, but nothing big:

  • Can’t easily copy a VM.  Have to export it, then import it again, and you have to be careful about how you handle folder paths.
  • Can’t make templates.  VMware handles this with Virtual Center, and I’m hoping Microsoft will come out with a similar add-on product to make it easier to quickly scale out multiple servers.
  • There’s some network strangeness when you use a laptop with both a wired connection and a WiFi connection.  It’s not obvious how to get the VMs to switch back and forth between the two.  I’ll probably need to write up a guide on this once I get the wireless adapter working, because I haven’t seen anything about it online yet.  With VMware, you can put the wired and wireless adapters on the same virtual switch, and the changes are transparent to the guests (or as Windows calls them, partitions).
  • Windows 2008 doesn’t have iSCSI target support built in.  I’d love to have this so that I could have a little farm on here and quickly switch drives inside the OSes, but no dice.  I’ve been looking at a couple of iSCSI target add-on software packages, but it’s not worth the $200-$300 price, especially when those licenses only support a few guests.

I’m loving Hyper-V so far, though.  It’s not as good as ESX, but it’s a heck of a strong product for a first version.


Bought a Wii – and then gave it away

1 Comment

On my trip to DC this week, I ran across a couple of Nintendo Wiis in stock at Target.  For those of you who don’t follow video games, the Nintendo Wii is a notoriously hard-to-find game system.  It’s been out for months, but you still can’t walk into a store and just buy one.  Prospective customers learn through the grapevine when new shipments of Wiis will be delivered to local stores, and then patiently wait outside before the store opens in the hope that they’ll snag one.  Some are bought just to make a profit – people resell them on Craigslist for up to $200 more than their $250 sticker price.

The Wii’s selling point is the motion-sensing remote control.  When playing the baseball game that comes with the Wii, you don’t push a series of buttons to swing when the baseball comes at you.  Instead, you simply hold the remote control like a bat, and swing it.  The better your swing, the better you hit the ball inside the game.  Same with tennis: hold the controller like a tennis racket and put topspin on the ball, and presto, the virtual ball spins.  It’s pretty impressive, and it’s playable by young and old alike.  As long as you can wave the controller around, you can play.

This simple, intuitive control system makes the Wii a smash hit at parties.  Anybody can grab the controller and play tennis, baseball, golf, whatever.  No skills required.  I’ve been wanting to throw myself a welcome-to-Houston party to get back in touch with our Texas friends, and I thought the Wii would be a great icebreaker.  I’m not into video games – I haven’t owned a console in years – but when I saw the Wii in stock, I had to get it.

Then I started thinking – why not take it to Emily’s house and show it to her 4-year-old stepson?  When we arrived, he had a friend over, and his friend recognized the Wii with wide eyes.  They got all excited, and I had a great time watching them jumping around the living room swinging at virtual baseballs, swinging virtual golf clubs, and rolling virtual bowling balls.  The other kid’s mom came over and went, “Wow, you have a Wii?  How’d you get it?”  She’d been dying to get one for her three kids.

It took me back to my childhood when one of our neighbors, Foster Cuthoff, had the coolest video game system on the block.  We’d all go over to his basement and play Contra for hours and hours.  Everybody thought it was so cool, and we would get all antsy over who would play next.

And then it happened.  My feeble mind connected the dots, and I realized I’m the wrong owner for that Wii.  It made so much more sense for Em to have it.  I’d only dabble with it every now and then at parties, because to me games aren’t really any fun unless they’re shared.  So, I gave it to Em.  Her stepson was so excited that Em says he’s still going around saying how much he loves me.

Awesome.  If that’s not success, I don’t know what is.  Granted, I didn’t save anybody’s life or anything, but it’s a hell of a good feeling.


Back from vacation, heading to my new job

2 Comments

My blog has been quiet the last few days because I’ve been in Washington DC visiting my Mom and my sister Emily.  I wanted to sneak in a quick visit between jobs.  I took the opportunity to redo both of their wireless networks and achieved a 0% success rate – I had to get two new wireless routers for different reasons.  As a result, I haven’t had much connectivity, so no blogs for you.

Today, I’m flying to California to start work at Quest.  I’ve got my laptop cellular card all rigged up, got my camera charged up, and got my game face on.  Expect mucho bloggo.


Meet Quest Software’s new database guy

5 Comments

I have a new job, and lemme just tell you, it feels like my whole career has prepared me for this one.

I spent several years at UniFocus, a hospitality software company that relied on SQL Server. I honed my SQL skills, but probably more importantly, I managed a small team of programmers. I worked with executives to find out what they wanted to build, and then I turned those visions into reality with the help of a fantastic team. I worked under tight budget and time constraints, and I knew firsthand the importance of getting software shipped in the best shape possible.

During that time, I became more and more interested in the art of industrial design. I love beautiful software and hardware, especially things that make my job as easy as possible. I bought a microwave that only had two buttons (add a minute and stop) simply because it had such an elegant user interface. When Erika and I visited New York City, the first museum I wanted to see was the Cooper-Hewitt National Design Museum. I really appreciate the art of good design, and working for a software company, I tried to implement good visual and workflow design wherever possible. I wanted to build stuff that people could use without a manual, stuff that would make people say, “Of course, how else could it possibly work?”

After UniFocus, I moved to Southern Wine & Spirits, a seven billion dollar wholesale & distribution company. I faced the database challenges of a large enterprise: dealing with armies of consultants, planning for disaster recovery, maintaining multi-terabyte data warehouses, implementing storage area networks and building whole infrastructures from the ground up. After Southern, I felt perfectly comfortable saying I was a Senior DBA, and not just because I’ve got some gray hair.

I’ve been mesmerized by the concept of building online communities ever since I started beta testing Flock in 2005. Flock didn’t just have a great design that I could appreciate; it also had this buzz, this exciting feeling, and the company’s employees actively participated in jump-starting their online community. This was the first time I’d seen a company fervently tending to its users, building and nurturing relationships in a way that worked with the Internet’s strong points. I “got it” in a way I can’t explain, and I was so interested that I continue to follow some of the Flock community creators to this day – Tara Hunt, Chris Messina and Will Pate – and they’ve gone on to do all kinds of exciting things.

On May 1st, I’ll leverage my SQL knowledge, my software business experience, my love of design and my community fetish when I start as a database domain expert for Quest Software.

I’m tasked with staying very technical, knowing SQL Server inside and out, knowing how Quest’s products fit in, helping to improve products and sharing my knowledge with the community.

I certainly have my work cut out for me: SQL Server expands in new directions with every release, and Quest has a big product line to fill SQL’s gaps. I’m a real believer in Quest’s ability to make a DBA more productive, more reliable and frankly, more valuable to their employer, and I haven’t even used half of their SQL Server products yet.

And probably my favorite responsibility is helping to support the SQL Server community. From the moment I got involved with Quest as a customer, they’ve proven to me time and again that they’re serious about nurturing the SQL Server community, helping their customers, and sharing the wealth of knowledge. I’m proud to be in a position where I’ll be able to contribute more to that community and represent a company that really believes in it. Just look at the list of Quest Experts like Kevin Kline and Bert Scalzo, and you can see why I’d be honored to go to work at Quest.

Working with a great line of SQL Server products, building communities around them, and interacting with impressive people – I truly have my own personal dream job.

Kinda makes you wanna break into song….

https://www.youtube.com/watch?v=V5BxymuiAxQ


Coworking Comes to Houston

1 Comment

The new Caroline Collective space is bringing coworking to Houston, and if my job hunt works out the way I hope this week, I’ll be one of the anchor tenants.

People who telecommute can feel isolated without the fun personal interaction with creative, dynamic people.  Coworking aims to solve that by offering desks and offices to people who would otherwise be parking their laptops at coffeeshops.  I was heartbroken when my favorite downtown coffeeshop, Kaveh Kanes, closed while I was living in Miami Beach, but it looks like the Caroline Collective will more than take its place.

I’ve already driven over to the building and checked it out from outside, but I’m excited to get over there this week and see the layout.  I’m jumping right in to grab one of the offices because I do a lot of conference calls, and I’ll probably want to stash a couple of servers over there.  Plus I just wanna decorate my office with a giant Penny Arcade mural.


Writing Better Queries by Joe Webb

0 Comments

Joe Webb gave a presentation on “Tips & tricks for writing better queries” for the Nashville PASS chapter, and it’s a great read.  Developers and DBAs should all read this for a clear, concise refresher.

A video of the presentation would be even better, but for the most part, the slide deck speaks for itself.  Every bullet point is well thought out.  For example, “Be mindful of widely varying parameter inputs” doesn’t really explain why, but believe me, he’s right.  Been there, done that, got the t-shirt.


Google App Engine: Amazon Web Services gets a competitor

Google Compute Engine
0 Comments

I’ve followed Amazon Web Services since its unveiling because I saw it as a really disruptive service that enabled startups to get off the ground for much less capital.

Now AWS finally has a competitor: the new Google App Engine, described in a blog entry over at the new Google App Engine Blog.  The feature list includes:

  • “Dynamic webserving, with full support of common web technologies” – for now, the only language they support is Python.
  • “Persistent storage (powered by Bigtable and GFS with queries, sorting, and transactions)” – Amazon came out of the gate without querying, sorting and transactions, and it hurt them initially.  Web apps need databases, period.  I don’t just say that because I’m a DBA; I say it because I’ve also written (crappy) web apps.  Google’s Bigtable is obviously proven, since they use it for web indexing, but there could be a learning curve for new developers.
  • “Automatic scaling and load balancing” – to me, this is one of the two biggest feature advantages Google has over Amazon.  Amazon’s EC2 will scale, but it’s manually done, and third party companies have built solutions just to handle the scaling & load balancing.  This is the expensive part of building a .com – the infrastructure.
  • “Google APIs for authenticating users and sending email” – whoa.  Sure, this is cool from a developer perspective, but think about it from a business perspective.  New web apps built on the Google App Engine will most likely use Google’s authentication, and at that point, people will be logging into the web app with Google usernames & passwords.  That means Google suddenly has deep hooks into every new web app, because the web app’s authors will be very reluctant to move their apps elsewhere.  The move would be immediately obvious, because end users would have to log in with different usernames and passwords.  Presto: lock-in.  It’s a brilliant move on Google’s part, because they can say they’re just offering a service and you don’t HAVE to use it.  Everybody will, and everybody will hope their awesome App Engine web app gets bought out by Google.
  • “Fully featured local development environment”

Humina humina humina – that’s a lot of power in five bullet points.

Amazon still has advantages because they’re more of a virtualization service provider than a shared web host provider.  With Amazon EC2, the developer can run servers around the clock with running processes, whereas the Google App Engine specifies that:

“Application code only runs in response to a web request, and must return response data within a few seconds. A request handler cannot spawn a sub-process or execute code after the response has been sent.”

That’s probably due to their automatic scaling and load balancing stuff – they spin up web apps on demand, and they don’t want to leave the web server running for hours on a bunch of different boxes.  Amazon has an edge here because developers can use background processes to churn through data or do queue-based stuff.  Speaking of queues, Amazon has more features than just EC2: they have Mechanical Turk, the Simple Queue Service, and now their Fulfillment Service.  They’re doing a lot more than just web hosting.
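
To make the difference concrete, here’s a sketch of the queue-based pattern Amazon’s model allows and a request-only model doesn’t: the web handler answers immediately and a long-running background worker drains the queue. This uses Python’s standard library `queue` and `threading` as stand-ins for a real queueing service and a real worker box – it’s not any actual AWS or Google API:

```python
import queue
import threading

work_queue = queue.Queue()   # stand-in for a real queueing service
results = []                 # records finished jobs, just for this example

def handle_request(payload):
    """Web handler: answer quickly and defer the heavy work to the queue."""
    work_queue.put(payload)
    return "202 Accepted"

def worker():
    """Long-running background process churning through queued jobs --
    the piece a request-only platform can't host."""
    while True:
        job = work_queue.get()
        if job is None:          # shutdown sentinel
            break
        results.append(job)      # pretend this is the expensive work
        work_queue.task_done()

t = threading.Thread(target=worker)
t.start()
handle_request({"resize": "photo.jpg"})
work_queue.join()                # wait until the worker finishes the job
work_queue.put(None)             # tell the worker to shut down
t.join()
```

On EC2 the worker loop just keeps running on a server you control; under a request-only model there’s nowhere for it to live once the response goes out.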

Building a pure web app?  Use Google App Engine.  Building a business with lots of moving parts?  Amazon Web Services still probably makes more sense.  Either way, it’s such an exciting time to be a developer.

The cost on all this: free for sites that host up to around 5 million page views per month.  Free.  Frrrrree.  It rolls off the tongue.


DBA Candidates: Ask For A List of Servers

Professional Development
4 Comments

When interviewing for a database administrator position, ask the manager for a list of database servers, the applications that rely on them, and who to contact when the server goes down.

(The manager may express concern about the secure nature of the list, and that’s fair – I would say the same thing.  If they don’t want to hand over the list, then just ask them to put their hands on it and answer the rest of the questions for you.)

Count the number of servers.  How many servers are missing contact names?  Ask if the contact names are up-to-date.  What’s the SLA for each server, and is it production, development or DR?

If there’s no document and the managers act as if creating something like this would require a Herculean amount of effort, then be aware that the new DBA will have no idea what she’s managing, what the expectations are for the job, or who to call when things break.

Ask if this list can be created by existing staff members before the start date.  Managers might say that this list sounds like a good thing for the DBA to create when she starts, but the DBA won’t have the contacts, resources or knowledge to build the list.  Danger!  Danger!  Secretarial work lies ahead!

An out-of-date list isn’t surprising if there hasn’t been much DBA turnover for a while.

A nonexistent list isn’t surprising either – but it’s a warning sign for the future DBA.

More DBA Career Articles

  • Moving from Help Desk to DBA – a reader asked how to do it, and I gave a few ways to get started.
  • Development DBA or Production DBA? – job duties are different for these two DBA roles.  Developers become one kind of DBA, and network administrators or sysadmins become a different kind.  I explain why.
  • Recommended Books for DBAs – the books that should be on your shopping list.
  • Ask for a List of Servers – DBA candidates need to ask as many questions as they answer during the interview.
  • Are you a Junior or Senior DBA? – Sometimes it’s hard to tell, but I explain how to gauge DBA experience by the size of databases you’ve worked with.
  • So You Wanna Be a Rock & Roll Star – Part 1 and Part 2 – wanna know what it takes to have “SQL Server Expert” on your business card?  I explain.
  • Becoming a DBA – my list of articles about database administration as a career.

SQL Server Consolidation: Plan for SQL Server 2008

0 Comments

When you’re building a consolidation plan, stop to think about when each database will be migrated to a newer version of SQL Server.

With SQL Server 2008 coming fast, some content owners will want to move their databases to 2008 as soon as it comes out in order to take advantage of things like the resource governor. But if you’ve consolidated multiple database servers onto one, and just one of those databases can’t be moved to 2008, then everybody has to wait!

This is especially important when dealing with third party solutions – applications written by other companies that may not aggressively support new versions of Microsoft SQL Server. Even now, in calendar year 2008, I deal with apps that still don’t support SQL Server 2005.

In any consolidation plan, try to include a “Next Step” section that addresses how quickly (or slowly) the newly consolidated databases will be moved to the next version of SQL Server.

Taken in the opposite direction, this can even be a selling point for SQL Server 2000 consolidation projects. Imagine this conversation with an application owner:

The DBA: “Bob, I’m going to consolidate your databases next month.”

Bob the Lazy Application Owner: “No.”

The DBA: “Actually, yes, I am. We’re taking all of the apps that still aren’t certified for SQL 2005, and we’re moving them to a single server. Your server’s an ancient single-core box that gives me nothing but heartache, and I’m tired of managing several of these boat anchors. I’m going to consolidate them all onto a new multi-core server that can handle all of our SQL 2000 instances, and I’ll have less management to do.”

Bob: “Will it save me time?”

The DBA: “It won’t let you play more golf during business hours, if that’s what you’re asking.”

Bob: “Then we’re not doing it.”

The DBA: “That’s a shame, seeing as your manager already signed the consolidation project charter.”

Bob: “What?!?”

The DBA: “Yep. I showed her that the total hardware maintenance on these five old 2000 servers is over $20,000 per year, especially now that they’re long discontinued. Plus this will free up four sets of SQL Server Licensing, so the consolidation project will actually pay for itself on day one.”

Bob: “Why wasn’t I told about this?”

The DBA: “Because you didn’t attend the meeting. I sent you an invite, but your secretary called me the day of the meeting and said you had an offsite meeting. I think it was the week of the PGA Tour event here in town, come to think of it.”

Bob: “Oh. Consolidation it is, then.”


DBA 101: It Takes Confidence

Professional Development
8 Comments

A good database administrator projects confidence – without being a jerk.

This positive yet firm attitude helps the DBA accomplish their job in tough situations:

  • A programmer insists their code is fine, and it’s a database problem that the DBA needs to fix
  • A manager asks the DBA how much to budget for server hardware next year
  • A project team says they want to run a mission-critical app on dilapidated hardware with no plan B

A wishy-washy DBA surrenders in those discussions and the database infrastructure suffers for it.

Managers – when you’re hiring a DBA, take a note of their attitude.  Are they selling you on their ideas, their background and their skills?  Is it a successful sale?  Remember that after they’re hired, they will have similarly small blocks of time in which to sell you on other ideas like database outage windows, purchasing strategies, and T-SQL problems.  A DBA has to be positive yet firm in their business relationships.

Candidates – do a shot of Jager before the interview.  Okay, maybe not, but act like you did.  And yes, I’m posting this because I recently conducted a phone interview with a candidate who was a total yes-man: he had the skills, but he lost his chance because both the manager and I saw that he’d get walked all over.

More DBA Career Articles

  • Gaining Experience for a DBA Job – a reader asked how to do it, and I gave a few ways to get started.
  • Development DBA or Production DBA? – job duties are different for these two DBA roles.  Developers become one kind of DBA, and network administrators or sysadmins become a different kind.  I explain why.
  • Recommended Books for DBAs – the books that should be on your shopping list.
  • Ask for a List of Servers – DBA candidates need to ask as many questions as they answer during the interview.
  • Are you a Junior or Senior DBA? – Sometimes it’s hard to tell, but I explain how to gauge DBA experience by the size of databases you’ve worked with.
  • So You Wanna Be a Rock & Roll Star – Part 1 and Part 2 – wanna know what it takes to have “SQL Server Expert” on your business card?  I explain.
  • Becoming a DBA – my list of articles about database administration as a career.

Submitted my abstracts for PASS 2008

#SQLPass
0 Comments

The Professional Association for SQL Server (PASS) group holds an annual summit for their members.  I attended last year, and blogged about some of my experiences:

While I was there, I remember thinking, “Man, I wish they had more storage information.  I can’t be the only guy here who’s pushing the limits with my SANs.”  The only storage session I got excited about was by Linchi Shea, and if you ever get the chance to talk to this guy, you need to do it.  I sincerely hope that he wins the lottery, because I want his job – exploring performance limits with database servers on SANs.

This year, when PASS sent out their request for speakers, I thought – why not me? I’ve probably got more storage and virtualization experience than most database administrators, since I’ve managed SANs and VMware farms, and I have a ton of lessons that I should be passing on to other DBAs.  I’ve got an upcoming article in SQL Server Magazine about database storage, I’ve written whitepapers for Quest and SearchSQLServer, and I’m good enough, smart enough, and doggone it, people like me.

So I submitted a few session topics that I could cover with disturbing detail and shocking first-hand knowledge:

  • Virtualized Storage Pros & Cons
  • Dealing with Virtual Database Servers
  • Finding your Storage Bottlenecks with SQLIO
  • SAN Tips for First-Time Users

All of these are things I’ve wanted to cover in the blog anyway (you should see my list of blog post ideas) so might as well cover ’em in person.

Plus, I’m really excited about this year’s summit.  It’s in Seattle (so more MS folks will attend), SQL Server 2008 is coming, Microsoft’s virtualization technologies are coming and it’s just a great time to be a DBA.

Which reminds me – if you’re a DBA in the Miami/Fort Lauderdale area, Southern Wine is still looking for my replacement.  Email me if you’re interested!  It’s a fantastic company to work for, and it’s a great opportunity to work with a lot of cool technologies like blades, database mirroring, SANs and .NET web apps.


SQL Server 2005 Setup Checklist Part 2: After The Install

SQL Server
98 Comments

I’ve covered what needs to happen before you install SQL Server – now, let’s talk about what to do immediately after the setup finishes.

Install not just the service packs, but also the cumulative updates.

Starting with SQL Server 2005’s Service Pack 2, Microsoft releases hotfixes in cumulative packs. These updates do more than just fix bugs: they improve how SQL Server performs. These updates are free performance benefits – and who doesn’t like that?

To find the latest service packs and cumulative updates, check out SQLServerUpdates.com. It’s got version numbers, build numbers, and download links for all versions of SQL Server in one place.

Double-check that Instant File Initialization is enabled.

Paul Randal wrote an excellent blog post on how to tell if instant initialization is enabled.  Follow the instructions in his post, and you’ll know for sure.  (While you’re there, subscribe to his blog – it’s chock full of SQL-y goodness.)
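If you want a rough idea of what that check looks like, here’s a hedged sketch of the trace flag approach Paul describes – run it as sysadmin on a test instance, since the trace flags affect the whole server.  If the error log shows zeroing messages for the log file only, instant file initialization is working; if the data file gets zeroed too, it isn’t.

```sql
-- Turn on trace flags that write file-zeroing activity to the error log
DBCC TRACEON (3004, 3605, -1);

-- Create a throwaway database, then search the current error log
CREATE DATABASE IFI_Test;
EXEC sp_readerrorlog 0, 1, N'Zeroing';

-- Clean up
DROP DATABASE IFI_Test;
DBCC TRACEOFF (3004, 3605, -1);
```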

Best Practice: Move TempDB to its own drive.

By default, the TempDB files are put on the same drive as the SQL Server binaries. Even if the user chooses a custom install, TempDB still goes on the same drive as the other data files, and that’s not a good idea either. Instead, the TempDB data files should be on their own dedicated drive.

Fix this by first moving TempDB to its own drive.  In this example, I put the data file on the T drive and the log file on the L drive.  (Be aware that the directory paths must already exist.)

use master
go
alter database tempdb modify file (name='tempdev', filename='T:\MSSQL\DATA\tempDB.MDF', size = 1mb)
go
alter database tempdb modify file (name='templog', filename='L:\MSSQL\LOGS\templog.LDF', size = 1mb)
go

I only set a 1mb file size because SQL Server does something tricky: even though we’re telling it to use a different drive letter, it will look for this amount of free space on the drive TempDB currently uses! If SQL Server was installed on the server’s C drive, for example, and we try to create a 10gb TempDB file on a T: drive, that SQL command will fail if there isn’t 10gb of free space on the C drive. Yep, it’s a bug – get over it.

After this code runs, restart the SQL Server. That will create the new TempDB file on the new drive. Manually delete the old TempDB file on the original drive, because SQL Server doesn’t delete that itself.

Now that TempDB is on the right drive, expand it to the full size you want, and then create additional TempDB files.  The current guidance from Paul Randal is to create a number of TempDB data files equal to 1/4 to 1/2 of your processor cores.  If you’ve got a quad-socket, quad-core box, that’s 16 cores, so you need 4 to 8 TempDB files.  Generally I start on the lower end unless I know the server will be under heavy TempDB pressure from its applications.

Here’s the code to create one additional TempDB data file – you can modify this for more files:

USE [master]
GO
ALTER DATABASE [tempdb] ADD FILE ( NAME = N'tempdev2', FILENAME = N'T:\MSSQL\DATA\tempdev2.ndf' , SIZE = 10GB , FILEGROWTH = 0)
GO

The data file creation should only take a couple of seconds – if it takes more than ten seconds, then instant file initialization isn’t configured correctly. We talked about this back in the pre-installation checklist, so go back and revisit that before you create the next TempDB file. Fix the security to allow for instant file initialization now – it has a huge performance impact on database growth.

Assuming that one file growth only took a couple of seconds, then go ahead and create the rest of the TempDB data files.

Notice that I don’t have filegrowth enabled.  You want to proactively create the TempDB files at their full sizes to avoid drive fragmentation.  If you have a dual-cpu quad-core server (8 cores total) and an 80-gb array for TempDB data, you would create eight 10gb files for TempDB.  That way, each file is contiguous, all laid out in one big chunk.  If you create them as smaller files and let them autogrow, then the disk will be fragmented all over the place because the files will be growing at random times.  Plus, you could end up with differently sized TempDB files if one of them happened to grow faster than the rest.  That’s why I pre-grow all of the TempDB files ahead of time and get them at exactly the right size.

Configure SQL Server memory for best practices.

Sounds easy, right? Go into SQL Server Management Studio, right-click on the server name and click Properties, go into Memory, and just configure it. There’s only a couple of fields – how hard could it be?

The minimum and maximum memory amounts are important, especially since we gave the SQL Server account the permission to lock its pages in memory. If other applications are running on this server, we need to specify how much memory we want SQL Server to take.

Ideally, no one would ever remote desktop into a SQL Server and run programs. Unfortunately, this happens, and we have to plan for it by leaving enough free memory for people to run things like SQL Server Management Studio. When I’m first building a server that isn’t running any other applications at all, I like to leave 10% of the memory free, or 4gb, whichever is larger. Then I monitor the free memory over the course of a month or two, and adjust it up or down during the next outage window.

If the server does multiple duties like act as a web server or application server, we have to be much more conservative with memory. Application owners never seem to know how much memory they’ll really use in production: SAP BW’s Netweaver, for example, tends to use anywhere from 10% to 50% of the memory on our production server, and it’s tough to predict. As a result, we have to leave the SQL Server’s memory allocation at just 50% of the available memory on the server.

I set the minimum server memory to 50% of the server’s total memory.  This will let SQL Server release memory if the server comes under memory pressure, like if someone remote desktops in and runs a very poorly written application.

The only way to know the right answer long term is to use Perfmon or a performance monitoring utility to watch the server’s free memory. I’ve written up a separate blog post on using Perfmon for SQL Server monitoring.
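As a sketch, here’s what those memory settings look like in T-SQL for a hypothetical 32gb server dedicated to SQL Server – the numbers are examples only, so pick yours based on the guidelines above:

```sql
-- Memory settings are advanced options, so expose them first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Example: 32gb box, leave 4gb free for the OS and remote desktop users
EXEC sp_configure 'max server memory (MB)', 28672;

-- Example: minimum at 50% of total memory, per the guideline above
EXEC sp_configure 'min server memory (MB)', 16384;
RECONFIGURE;
```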

Set the Default Database Path

Even if you chose this during setup, we need to revisit it because SQL Server puts both the data files and the log files in the same directory. In SSMS, right-click on the server name and click Database Settings. The paths for the data files and log files can be configured from there.

Tweak the model database.

This tip comes courtesy of reader John Langston.  Whenever a new database is created, SQL Server uses the “model” database as – well, as the model.  You can make changes to that database, and those changes will automatically happen to any new databases.  John writes:

I also like to go to model and change the recovery model from FULL since we use SIMPLE a lot, even in production and also change the datafile autogrowth setting from 1 MB.

Great tip!
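A minimal sketch of John’s tweaks in T-SQL, assuming the default logical file names for the model database (the autogrowth sizes are examples, not gospel):

```sql
-- New databases inherit their recovery model from model
ALTER DATABASE model SET RECOVERY SIMPLE;

-- Bump autogrowth up from the stingy 1 MB default
ALTER DATABASE model MODIFY FILE (NAME = N'modeldev', FILEGROWTH = 256MB);
ALTER DATABASE model MODIFY FILE (NAME = N'modellog', FILEGROWTH = 128MB);
```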

Configure Database Mail with public & private profiles.

Database Mail is a pure SMTP solution that beats the daylights out of SQL 2000’s SQLmail. It doesn’t require Outlook to be installed on the database server, doesn’t need any MAPI code, and works with any company email server that can be accessed via SMTP.

There’s plenty of sites on the web that explain how to configure Database Mail, but I want to address something: be aware that developers can use Database Mail for things that SQL Server shouldn’t be doing. For example, they may decide to use Database Mail to send out mass emails to your end users or customers. There’s nothing technically wrong with that, but it increases the load on the database server and it sends all outgoing email with the SQL Server’s Database Mail account.

At our shops, we use internal emails like (servername)@ourcompany.com to identify which server is sending the database mail. Those email addresses make sense to us because we just need to know where the alerts are coming from – we would never hit Reply to a server-generated email.

However, if developers use SQL Server to send out emails directly to customers, those customers will indeed reply. I had a nasty problem where a couple of developers decided to purge old customer accounts, and they used SQL Server’s Database Mail to broadcast an announcement to those users. The email read something like, “You haven’t used your account in 30 days, so we’re deleting it. Please contact us for questions.” Of course a lot of customers got aggravated and sent some nastygram replies, which arrived in the IT team’s inboxes, who had no idea what was going on. After some confusion, we were able to track down the guilty party, but those emails never should have gone out from the IT staff.

Bottom line: if you decide to use Database Mail (and you should), consider setting up separate private and public email profiles. The public email profile used by the developers should be sent from the developer management team’s group email address – that way, they can address any replies themselves.
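The plumbing for one account and profile looks roughly like this – the server and address names are made up, and the public profile for the developers would follow the same pattern with their group email address instead:

```sql
-- Create a Database Mail account identifying this server (example names)
EXEC msdb.dbo.sysmail_add_account_sp
    @account_name = N'DBA Alerts',
    @email_address = N'servername@ourcompany.com',
    @mailserver_name = N'smtp.ourcompany.com';

-- Create the private profile and attach the account to it
EXEC msdb.dbo.sysmail_add_profile_sp
    @profile_name = N'DBA Private Profile';

EXEC msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name = N'DBA Private Profile',
    @account_name = N'DBA Alerts',
    @sequence_number = 1;
```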

Configure SQL Server Agent’s failsafe operator.

After configuring Database Mail, create at least one SQL Server Agent operator. This operator’s email address should be a distribution list for the database administrator group. Even if the company only has one DBA, never use an individual person’s email address – use a distribution list instead. When the DBA goes on vacation or gets a job at another company (or heaven forbid, gets fired), it’s easier to add someone to a single distribution list instead of modifying operators on dozens or hundreds of servers.

Then right-click on the SQL Server Agent, configure the alerting system to use Database Mail, and set up that DBA group as the failsafe operator. That way if anything happens and SQL Server doesn’t know who to alert, it can alert the group.
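Creating that operator is a one-liner – the distribution list address here is an example:

```sql
-- Point the operator at a group distribution list, never an individual
EXEC msdb.dbo.sp_add_operator
    @name = N'Database Team',
    @enabled = 1,
    @email_address = N'dba-team@ourcompany.com';
```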

Create default alerts for severities 16 through 25.

SQL Server’s alerting system has the ability to notify operators whenever major things break inside the database. These include running out of space in log files, backup failures, failed logins and other things DBAs just need to be aware of. Don’t rely on this as your only SQL Server monitoring system, because it only sends alerts when it’s too late to take proactive action, but still, it’s better than nothing.

The below script will set up an alert for severity 16. Copy this and repeat the same thing for 17-25, but change ‘Database Team’ to be the name of your default operator. Notice that @delay_between_responses is set to 60 – that means if it sends out an alert, it won’t repeat that same alert for 60 seconds. This is useful because when a database runs out of drive space, for example, all hell will break loose for a minute or two, and you don’t want hundreds of emails and pages per minute.

USE [msdb]
GO
EXEC msdb.dbo.sp_add_alert @name=N'Severity 016',
@message_id=0,
@severity=16,
@enabled=1,
@delay_between_responses=60,
@include_event_description_in=1,
@job_id=N'00000000-0000-0000-0000-000000000000'
GO
EXEC msdb.dbo.sp_add_notification @alert_name=N'Severity 016', @operator_name=N'Database Team', @notification_method = 7
GO
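Rather than copying that script nine times, you could loop through the remaining severities – this sketch assumes the ‘Database Team’ operator already exists and follows the same naming convention as the script above:

```sql
-- Create alerts for severities 17 through 25 in one pass
DECLARE @sev INT, @name SYSNAME;
SET @sev = 17;
WHILE @sev <= 25
BEGIN
    -- Builds names like 'Severity 017' to match 'Severity 016' above
    SET @name = N'Severity 0' + CAST(@sev AS NVARCHAR(2));
    EXEC msdb.dbo.sp_add_alert @name = @name, @message_id = 0,
        @severity = @sev, @enabled = 1, @delay_between_responses = 60,
        @include_event_description_in = 1;
    EXEC msdb.dbo.sp_add_notification @alert_name = @name,
        @operator_name = N'Database Team', @notification_method = 7;
    SET @sev = @sev + 1;
END
```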


Set Up Maintenance Plans

This is where things start to get different on a shop-by-shop basis. Some places use native backups, some places use backup compression software, some places use index defrag software, and so on. I’ve written a lot of articles about my personal backup best practices, and one about why SQL native backups suck. (Hey, if I was politically correct, I’d be writing manuals instead of reviews.)

Benchmark It, and Find the Slowest Link

Before the server goes into production, load test and stress test it to find where you’ll need to improve performance down the road.  This is your one chance to really bang the daylights out of it without anybody complaining.

Last: the First Responder Kit

sp_Blitz, sp_BlitzCache, sp_BlitzIndex and more give you a SQL Server health check in a matter of seconds. They give you a prioritized list of health and performance issues, plus give you URLs for more details about each issue.