Blog

Quickly replying to recruiter emails

1 Comment

In today’s hiring market, DBAs get a lot of emails from recruiters that say things like:

“I am curious to know if you would be interested in (or know someone who is interested in) relocating to Big Falls East for the position discussed below….”

The job description never includes the salary range, and it rarely includes information about the industry – two things that should be important to a job candidate.  (If you don’t care about the industry, you should start caring – imagine going to work for Ford Motors, an airline, or a mortgage company in today’s economy.)

To handle these emails fast, go into your email program and create a new signature.  The signature should include a default reply text that is really generic, like this:

“Thanks for the email.  If you can forward on information about the salary range and the industry, I’ll pass it along to my peers.  After being repeatedly burned (hey, buddy, check out this job – oh, sorry, I didn’t know it only paid $X) I have a policy against forwarding jobs without salary & industry information.  Thanks, and have a great day.”

Most email programs will let you quickly change your signature for one specific email.  That way, when recruiter emails come in, just hit reply, insert your recruiter signature, and hit send.

When the next email comes in (either with the right info, or without it) you can handle it the right way.  If it includes good information, then it’s a good recruiter, and you should pass it on to your buddies.  If it doesn’t include any more information than the first email, just hit delete, because the recruiting process is off to a bad start.

I know recruiters will disagree, but here’s the deal: if a recruiter is desperate enough to send a blind email to a stranger without a resume, asking that stranger to pack up their house and move to another state, then there’s a reason.  They’ve already exhausted the local candidates – either because they’re not paying enough, the company has a bad reputation locally, or they need a hard-to-find skill set.  Think about that before replying with your resume, because the first two reasons will come back to haunt you, and the third reason means you’re worth a lot of money.


Quest LiteSpeed isn’t right for everybody

2 Comments

Sean McCown at Infoworld wrote a blog post today that said some negative things about Quest and LiteSpeed, and I had to respond. He’s switched away from using LiteSpeed, and he had an interesting set of reasons.

“I just don’t need the centralized repository in my lab, and I do enough demos that having native SQL syntax for my backups is worth something to me.”

Everybody has different needs.

Sean and I would totally agree that labs have a different set of needs than a typical enterprise or datacenter.

In fact, if I were just doing a lot of demos like Sean, and if I wanted native SQL syntax for my backups, I wouldn’t use a third party product at all: I’d use SQL Server 2008’s new backup compression to make my life really easy. That way you stay purely native, which makes for really easy demos.
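For the record, that demo-friendly native syntax is a single extra keyword. A minimal sketch (the database name and backup path here are invented for illustration):

```sql
-- SQL Server 2008 native backup compression: purely native T-SQL,
-- no third party extended stored procedures required.
BACKUP DATABASE AdventureWorks
TO DISK = N'C:\Backups\AdventureWorks.bak'
WITH COMPRESSION, INIT;
```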

Quest LiteSpeed, on the other hand, isn’t targeted just at labs and demos. It’s targeted at DBAs with enterprise-scale needs: log shipping, centralized reporting, log reading with undo & redo for transactions, and so on. Not everybody needs the advanced features that LiteSpeed provides, and Sean doesn’t. We would both agree that LiteSpeed is not the right product for his lab.

“Even knowing any of the devs is hard these days, and getting anyone to make any changes is next to impossible.”

At smaller companies, especially startups with a minimum of customers, developers can work directly with end users. For better or worse, Quest has a lot of customers, and I’d like to think that’s because Quest has fantastic products. (It’s a pretty good problem to have.) Our product managers coordinate feature requests across all of our enterprise customers, large and small, which helps our developers focus on what they need to do: develop great software.

Just this week, in fact, Quest Software hosted a SQL Server Customer Advisory Board at the offices in Aliso Viejo. We flew out a dozen customers for two days of meetings about our road maps and upcoming features, and asked for their input on where we go next. I wish we could bring every Quest customer out there for the same input, but on the other hand, Steve Ballmer isn’t flying me up to Seattle anytime soon to ask me what SQL Server needs to do next.

We’re in a feature freeze for our new LiteSpeed v5.0. The management tools have been rewritten from the ground up, and I’m really impressed. But there comes a point where you have to stop adding new features and just focus on testing the code, and that’s where we’re at now. Heck, I have a long list of things I want to add, but even I can’t sneak ’em in – and I work for Quest! I’m already excited about the features I’m trying to cram into v5.1.

“Now, with the team being more corporate….”

If your main criterion for a backup product is easy access to the guy who wrote it, then Quest LiteSpeed is not the product for you.

Just a couple of months ago, when I was a DBA instead of a Quest employee, though, that was not my main criterion for software selection. My criteria included a global 24×7 support organization with experience across database technologies, a large QA team dedicated to testing mission-critical backup software on every possible platform, and a product with an unbelievably strong track record. The fact that Quest was Microsoft’s 2007 Global ISV Partner of the Year again this year is proof that sometimes bigger is better.


Developers: sometimes you just gotta trust the DBA

0 Comments

At this week’s Customer Advisory Board at Quest, I’ve had a great time sharing war stories with other database administrators.  One theme keeps popping up again and again: developers don’t trust DBAs and seem to refuse to take the DBA’s word for anything.  It seems that developers just want to keep learning things the hard way.

Case in point: a developer wants to loop through all string fields in every database in the shop looking for sensitive information.  They’re after things like social security numbers, credit card numbers, bank account numbers and so forth.  They want to check the contents of every char/varchar/nchar/nvarchar field in every table, look at the top X records, and check the string contents to see if they match known text patterns.

In theory, this could be solved by a stored procedure.

In practice, a stored proc would be a bad idea.  SQL Server isn’t particularly good at string processing, and a query like this will instantly push the CPU to 100%, thereby severely slowing down other queries on the box.  Plus, large databases like SAP can have tens or hundreds of thousands of string fields, and storing the results will take gigs of memory and/or TempDB space on a large BI/BW data warehouse.

The DBA involved has repeatedly recommended against the stored proc approach, but the developer still wants to build one, saying he’s Googled for ways to do it and he’s getting some good leads.  At some point the DBA will just sit back and say, “You know what, go for it, because when your stored proc takes down a production box, I’ll be all too happy to explain whose code did it.”

Another case involved developers who want to store completely unstructured XML data inside the database as text fields, and want to parse through that XML data when searching millions of records for matching attributes.  The XML format will be extremely flexible, but when storing it as text, it’s just not going to perform.  The developers see that as a SQL Server performance problem, when in reality it’s a bad design that won’t scale.
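If the XML absolutely has to live in the database, the least the developers could do is use SQL Server 2005’s typed xml data type instead of raw text, which at least opens the door to XQuery and XML indexes. A hedged sketch with an invented schema (table, index, and attribute names are all illustrative):

```sql
-- Storing the document as xml instead of [n]varchar/text lets the
-- engine query it structurally rather than via string matching.
CREATE TABLE dbo.Orders (
    OrderID  int IDENTITY PRIMARY KEY,  -- clustered PK required for XML indexes
    OrderDoc xml NOT NULL
);

CREATE PRIMARY XML INDEX IX_Orders_OrderDoc ON dbo.Orders (OrderDoc);

-- Attribute search via XQuery rather than LIKE against millions of blobs:
SELECT OrderID
FROM dbo.Orders
WHERE OrderDoc.exist('/Order[@Status="Shipped"]') = 1;
```

That still doesn’t fix the underlying design problem, but it beats substring searches across millions of text fields.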

We spent some time tonight at the CAB talking about where the DBA draws that line – where he gives up trying to protect the developer and just gives the developer enough rope to hang himself.  It’s an interesting political challenge, but the moral of the story is this: developers out there, if your DBA keeps recommending against something you want to do, and then all of a sudden one day he caves in and sees things your way, be careful.  You might be in for a rocky ride.


I have big hands

1 Comment

I’m staying at an oceanside hotel in Laguna Beach, California for a meeting with some of our customers.  When I say oceanside, I mean oceanside: there’s a huge wave break about a hundred feet from my balcony, just two stories down.  The waves are deafening.

It’s a really romantic setting, but I was still surprised to find these at my bedside:

“Grande for Increased Protection?”  Wow.  I mean, it’s one thing to have this kind of gear provided for you by the hotel, but it’s still another when they give you the big ones.

Or maybe they just gave the big ones to me.  Maybe the front desk clerk called down to housekeeping and said, “Look, I’ve seen this guy.  You gotta give him the big ones.  No playing around here.”

If you want to see the full details of the hotel amenities, click on the photo.  I wouldn’t want to show the whole thing on my public page without a click-through.


SQL Server Partitioning: Not the Best Practices for Everything

Partitioning, SQL Server

When SQL 2005 was in beta, I clapped my hands and jumped for joy the first time I played with partitioning.  I’ve done a lot of data warehouse work, and partitioning lets the DBA break down massive tables into smaller, easy-to-manage chunks.  We can put heavily accessed parts of the table onto fast storage, and less-accessed data onto slower, cheaper storage.  The cool part is that it’s completely transparent to applications (as long as you don’t have to change primary & foreign keys): they don’t have to know the table is even partitioned.

We implemented it almost as soon as SQL 2005 came out, and I kept wondering why more people didn’t take advantage of this feature.  Sure, it’s not particularly intuitive, and there’s no GUI for it in SQL Server Management Studio, but it does work.

“Can partitioning help us?”

Fast forward to yesterday: I met with Mark Schmulen and David Lyman of NutshellMail.com.  It’s a slick solution for those of us who work behind corporate firewalls that prevent us from accessing personal email sites like Gmail and Yahoo, as well as personal networking sites like Facebook and Myspace.  When I worked for JPMorgan Chase, I was frustrated that I couldn’t get my personal email because I had a network of DBA buddies who all knew and used my personal email address.  We would ask each other for support from time to time, and I liked being able to help out.  At JPMC, though, the only way I could do that was to contact all my friends and have them switch to using my corporate email – which I couldn’t check when I was outside of the office, so that was a lose-lose situation for me.  NutshellMail gets around that by checking your personal email accounts on a schedule you select, and then sending digest emails to your work address.  Makes perfect sense, and I’d use it in a heartbeat – even if they weren’t using SQL Server for storage.  (Bonus!)

We talked scalability, and they’d heard partitioning might be a good answer for future growth.  I was pleasantly surprised to hear that the word about partitioning is getting out, but this was the first time I said no, partitioning isn’t the right answer for one particular case.  Having talked through that, I wanted to walk through some of the reasons here.

Partitioned tables require Enterprise Edition.

Standard Edition can’t do partitioning, and Enterprise is several times more expensive than Standard.  On a 2-CPU box, we’re talking about a $30-$50k difference in cost.  If we’re only using Enterprise for the partitioning, then there’s a better return on investment in putting that $30-$50k into faster storage instead.  This is especially relevant for servers with direct-attached storage, even in a RAID 10 config.  I wouldn’t consider partitioning unless the server had at least 10-20 drives for the data array alone.

Partitioning best practices mean using both fast and slow storage.

In the case of a data warehouse, a sales table is loaded every night with the previous day’s sales data.  The most recent data is under heavy load because it’s constantly being accessed by both the ETL processes and the daily reports – because users really want to know what happened in the last few days to a month.  We have to keep years and years of sales data on hand for curious users who want to check trends, though, and we don’t want to keep all of that data on extremely fast (and extremely expensive) storage.  (Yes, partitioning gurus, I know there’s work involved in moving data from one array to another, but sometimes it’s worth it.)  If we partitioned by date range, we could keep the new data on fast storage, and shuffle the older data off to slower storage.
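A minimal sketch of what that date-range setup looks like in DDL (the filegroup names, boundary dates, and table schema are all invented for illustration):

```sql
-- Two boundary dates with RANGE RIGHT yields three partitions:
-- (-inf, 2007), [2007, 2008), and [2008, +inf).
CREATE PARTITION FUNCTION pf_SalesDate (datetime)
AS RANGE RIGHT FOR VALUES ('2007-01-01', '2008-01-01');

-- Map the older ranges to the slow, cheap filegroup and the
-- current range to the fast, expensive one.
CREATE PARTITION SCHEME ps_SalesDate
AS PARTITION pf_SalesDate
TO (SlowStorage, SlowStorage, FastStorage);

-- The table is created on the scheme instead of a filegroup,
-- partitioned by its date column.
CREATE TABLE dbo.Sales (
    SaleDate datetime NOT NULL,
    Amount   money    NOT NULL
) ON ps_SalesDate (SaleDate);
```

Applications querying dbo.Sales never know the partitions exist; only the storage placement changes.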

Partition functions and schemes are easy to design incorrectly.

Before implementing partitioning for the first time, either get personalized advice from a DBA who’s done it several times, or get an identical development system for repeated testing.  I had the luxury of an identical QA environment where I could repeatedly test different partitioning strategies and replay the end user load to see which strategy performed best.  Not everybody’s that lucky (come to think of it, I should have played the lotto more) and it’s a mistake to try it on a production system first.  When I first started using it, I thought I knew exactly the right strategy for our data, and we’re talking about a schema I knew forwards and backwards.  I was wrong – really wrong – but at least I found the right answer before going into production.

In summary, outside of data warehouses, I like to think of partitioning as the nuclear bomb option.  When things are growing out of control way faster than you can handle with any other strategy, partitioning works really well.  However, it’s expensive to implement (Enterprise Edition plus a SAN) and you don’t want to see it in the hands of people you don’t trust.

More SQL Server Table Partitioning Resources

Before you start designing a table partitioning strategy, or if you’re trying to troubleshoot why your partitioned tables aren’t working as fast as you expect, check out our SQL Server table partitioning resources page.


Caroline Collective: The Morning After

7 Comments


It’s the morning after the Caroline Collective grand opening party. The place was absolutely packed with interesting people, and that’s my definition of a success. For images of the evening, check out the Caroline Collective Flickr pool.

Seemed like everybody I talked to wanted to get in on the coworking action, and they had the same basic questions:

Q: When is Caroline open?

It doesn’t have an “official” schedule yet, just whenever a tenant shows up. The tenants have keys. I’m here on weekdays by 7-8am, and from then on there’s somebody here until pretty late. Join BrightKite.com – it’s like Twitter, but for physical locations – and you can get notified whenever one of the tenants checks in at 4820 Caroline. Sometimes we’re here, but we just forget to unlock the doors – send a Twitter or a BrightKite note and it’ll rouse us.

Q: Is everybody in the web business?

No, that’s what I thought it was going to be too, but it turns out that’s not the case. The only web folks we have here (off the top of my head) are True Light Resources, and everybody else is all over the map. Caroline tenants include software developers, a TV producer, a photographer, a foundation, and of course, the ArtStorm crowd next door. It’s a really broad pool of people.

Q: What’s it take to join?

Anybody can show up anytime Caroline’s open and just sit down at an available desk. There’s free WiFi and free coffee. If you like what you see, there are a few membership levels. At the $300/mo level, you get your own desk, which is pretty cool because you can leave your stuff there – monitor, speakers, whatever.

Q: What’s it take to get an office?

Patience, grasshopper. Those went first – as of a couple days ago, they’re all taken. We’ve got plenty of desks available, though.

Q: Who were those two well-dressed hotties?

Those would be Ned Dodington (left) and Matthew Wettergreen, the founders of Caroline:

Matthew Wettergreen and Ned Dodington

Excellent photo by Ed Schipul, and excellent vision by Matt & Ned.  This place was absolutely awesome last night, and the energy was so positive.  Everybody seemed to “get” the whole coworking concept, and I’m excited at what’ll happen over the coming weeks.  Congratulations, guys!


Time for a career change

1 Comment

I know, I just told you guys that I got all settled in at Quest and I’m loving it and everything, but I just had the most unbelievable offer come in. It just came out of nowhere straight into my email, and what can I say – I just can’t pass it up. It’s perfect for my background.

A picture is worth a thousand words, so with no further ado, here’s the email itself:

Look out, Zohan.  A former SQL Server DBA makes an even better hairstylist.  They said it themselves – I’m the perfect candidate.  Hooah!


SQL Server backups using SAN snapshots

4 Comments

One of my articles just got posted up at SearchSQLServer. I cover some of the pros and cons of backing up data warehouses with SAN snapshots.

I wrote that just after we got done with a NetApp deployment at Southern Wine. The funny part is that we deployed the QA environment pretty quickly, but months after I left, they still haven’t moved the production warehouse over to the NetApp. I should have borrowed that set of drives until they were ready to go live, heh.

Update as of 6/3 – I originally wrote that article for SearchSQLServer.com.  I didn’t see the article appear there for a while, and when it popped up on SSWUG, I assumed SearchSQLServer had transferred it over to SSWUG.  Turns out that’s not the case.  We’re working through the details of that now.

This isn’t the first time one of my articles has been “pirated” – used to happen to me when I wrote for HAL-PC – but this is the first time it’s happened with an article I actually got paid to write.  Next thing you know, I’m going to need an agent.  Or maybe not.


Caroline Collective grand opening coming June 7th

2 Comments

I moved my gear into my desk at Caroline Collective today – well, actually, it was more of a buy-and-move. Most of this stuff is a new copy of the gear I use at home.

The grand opening will be Saturday, June 7th from 7pm til 10pm. The invite reads as follows:

“We are (collectively) happy to announce the Grand Opening of the Caroline Collective, Houston’s first coworking venue! Please join us Saturday, June 7th from 7-10pm at the home of Caroline Collective for drinks, light refreshments and heavy celebration. The event will be co-hosted with ArtStorm featuring collage artist Patrick Turk and the esteemed tenants of Caroline. Your attendance is not only appreciated, it’s needed. After all, it’s about all of us. Tasty hops provided by Houston’s own Saint Arnold Brewery and delectable dishes by chef David Grossman.”

Caroline’s at 4820 Caroline Street, 77004 with easy access from the Wheeler stop on light rail. Parking is available. I’ll be there, so if you’re interested in meeting a strangely outgoing SQL Server guy from Quest Software, c’mon down.

This is also the same night as the Ladytron concert, for which I hold two tickets.  The concert starts at 9pm – the jury’s still out as to whether or not Erika and I will leave the CC grand opening for the concert.  If I’m debating tossing a pair of Ladytron tickets to stay here, then you know it’s going to be a good party.


Getting Into a Database Administrator Position

Professional Development
39 Comments

I got a question from Ron G asking how to go about changing positions from help desk to DBA.  Here are my thoughts:

Build on what you already know.

If you’re used to working on IBM AIX systems, for example, you’ll want to use some of that skillset by working with databases that run on AIX.  If you’re used to working on Windows computers (even in just a help desk environment), you’ll want to stay on Windows.  Don’t try to learn both an operating system and an application at the same time if you can avoid it, because the faster you can get up to speed on just the database alone, the faster you’ll be able to get paid.

Attend free webinars.

Find third party vendors that support the database you’re trying to learn, and check out their marketing webinars.  They’re in the business of helping database administrators learn and grow, and they conduct some great training sessions for free just to get their products in front of you.  I’ve done a couple SQL Server training webcasts for Quest Software that cover how to accomplish common DBA chores using the native tools versus how much faster it is with the Quest tools.  I don’t know about you, but I learn a lot faster when I’m listening to a real human being talk instead of reading dry text, and webcasts are much more fun.

Join the local database user group.

You’d be surprised how many cities have user groups for databases.  Go, and promptly close your mouth, hahaha.  Don’t try to contribute, just sit, watch, listen and learn.  People will give presentations every month about database topics.  You’ll learn a little about databases, but more importantly, you’ll learn about the city’s market for the database you’re trying to learn.  Other people will get to know you, and down the road, you’ll find somebody who’s willing to show you the ropes.  (Everybody wants to hire junior DBAs.)

Volunteer after hours with your DBA.

Talk to the friendliest DBA at your company (or another company in the user group) and tell them you’re interested in learning more.  Tell them that you’re willing to show up after hours if they’re doing maintenance and watch & learn.  This isn’t going to be an easy sell – with telecommuting these days, a lot of maintenance is done remotely via VPN – but if you’re lucky, you’ll find a taker.  At Southern Wine, I had a relationship like this with a junior DBA: whenever I planned after hours maintenance, I’d email him to tell him when it’d take place.  If he wanted to join me, we’d meet up at the office that night and I’d explain each of the steps I was doing as I did it.  It slowed me down as a DBA, but the payoff came when I wanted to take vacations, because he was already familiar with more systems than he’d ordinarily come across.

Find local database software companies.

Companies all over the US build add-on software for your database platform of choice.  They build things like performance monitoring tools, backup software, database utilities, etc., and all of this software needs support.  They have a help desk, and they’d love to hire people who want to grow their database experience.  You’ll be able to make a quick career change, plus get into a position where you’re learning databases on the job.  You can find these companies by Googling for your database platform name plus tools or management, like “SQL Server management” or “SQL Server tools”.  Also check the magazines (yes, there are even database magazines!) and look at each of the advertisers to see where they’re located.  Call them and ask if they have an office in your city, because some of these companies are pretty big.  (Quest has over 3,000 employees all over the globe.)

Avoid consulting companies unless you know another employee there.

I know I’ll get email for this one, but here’s the deal: a lot of shady consulting companies are willing to throw anybody into a position just to make billable hours.  They pay you $X per hour, and they bill the client twice as much.  Presto, they’re making money off you, and they don’t care whether you know what you’re doing or not.  The client won’t find out right away because the consulting company won’t let them talk to you directly – they’ll manage all meetings via a project manager who does all the client interaction.  After a few months, when the client figures out that you don’t know what you’re doing, the consulting company can shuffle you off to another project.  You won’t learn much (there won’t be another DBA there to help you) and you’ll get demotivated.

Most importantly, be honest.

Don’t be afraid to say you don’t know the answer to something.  My official job title at Quest is “SQL Server Domain Expert”, and I get a big chuckle out of that.  Yesterday I met with two people for three hours (hi, Eyal and Melanie) and it would take two hands to count the number of times I said, “I don’t know the answer to that.”  Granted, my job puts me in the line of fire for some really tough technical questions, but you get the point.  Database administrators can’t know everything – today’s databases cover way too much functionality – and that’s okay.  Nobody expects you to know everything, but they’ll expect you to know where to find the right answers quickly.

More DBA Career Articles

  • Moving from Help Desk to DBA – a reader asked how to do it, and I gave a few ways to get started.
  • Development DBA or Production DBA? – job duties are different for these two DBA roles.  Developers become one kind of DBA, and network administrators or sysadmins become a different kind.  I explain why.
  • Recommended Books for DBAs – the books that should be on your shopping list.
  • Ask for a List of Servers – DBA candidates need to ask as many questions as they answer during the interview.
  • Are you a Junior or Senior DBA? – Sometimes it’s hard to tell, but I explain how to gauge DBA experience by the size of databases you’ve worked with.
  • So You Wanna Be a Rock & Roll Star – Part 1 and Part 2 – wanna know what it takes to have “SQL Server Expert” on your business card?  I explain.
  • Becoming a DBA – my list of articles about database administration as a career.

Benchmarking VM Fusion storage

Hardware
2 Comments

I use a Macbook Pro with VMware Fusion, which lets me run a bunch of virtual Windows machines for testing. I upgraded my internal drive to a quick 320GB one a few weeks ago, intending to run all my VMs off that internal drive to make it easier to do testing on airplanes or in coffee shops. The problem is, one of the things I test is Quest LiteSpeed, database backup software that does a lot of disk I/O, and during heavy backup/restore testing I couldn’t do much multitasking because the internal laptop drive was being hit so hard.

I figure it’s a matter of time before I put a second hard drive in the Macbook Pro, but for now, I’ll stick with external drives.

I’ve got a bunch of USB drives, so I did some comparisons to see which ones got the fastest storage performance inside VMware. I wanted to find out how much performance I’d really gain by switching to an external drive, and whether it made a difference if I used FireWire or USB.

Keep in mind that this is not the native performance of each drive – this is the performance as seen inside a VMware Windows XP guest. I didn’t care what the native performance is, because that doesn’t do me any good. I’m only interested in performance as seen by the VMware guests because that’s the only thing I’d use the external drives for.

Here’s the benchmarking results from HD Tune:

Internal Western Digital 320GB 2.5″ WD3200BEVT:

WD3200BEVT

Big, bulky external Maxtor OneTouch III 3.5″ 500GB connected via FireWire 400:

External 500 via FireWire 400

Whoa. That’s only averaging 35.5 MB/sec, whereas the internal gave me 48.5 MB/sec. I’m surprised there.

External Maxtor OneTouch III 3.5″ 500GB connected via USB:

External 500 via USB

Not much difference between FireWire and USB as far as speed goes, but the USB connection used almost double the CPU power.

External USB enclosure with a Fujitsu MHY2120BH 2.5″ 120GB (the Macbook Pro’s original internal drive):

External 120 via USB

I’ve also got an external SATA RAID enclosure that does mirroring & striping, and I’ll test that later just out of curiosity, but there’s no way I’d put my virtual machines on there because it weighs more than my laptop and it makes a loud racket.

These results are not scientific – I just did one pass of testing on each drive. Your mileage may vary. Offer not valid in all fifty states. No purchase required to win. See participating locations for details.

My verdict: I’ll put the virtual machines on the 2.5″ external enclosure.  It’s tiny, doesn’t require external power, and it won’t be as fast as the internal drive – but at least it’ll let me multitask.


Performance troubleshooting flowchart

4 Comments

Junior DBAs, warm up your printers. The Microsoft SQL Customer Advisory Team, aka SQLCAT, has published a flowchart explaining how to troubleshoot performance issues with SQL Server. This is by far the best visual representation I’ve seen of how a senior DBA’s brain works:

SQL Customer Advisory Team’s Troubleshooting SQL Server 2005/2008 Performance and Scalability Flowchart

And just think – the rest of us had to learn this stuff the hard way – by reading manuals!


Signed my lease at the Caroline Collective

2 Comments

Yesterday afternoon, I stopped by the Caroline Collective and signed a lease on my very own desk.  Sounds odd to lease a desk, eh?  Especially sight unseen – there are no actual desks in the space, just a big empty room with concrete floors, white & blue walls, and fluorescent lights dangling from a low ceiling.

Matt & Ned nervously showed me their prototype wood desk, and I could tell they weren’t sure whether or not I was going to “get” it.  One of them said something about how the desk would be finished off, and I had to laugh.  Finishing anything off isn’t the point.  It’s not that coworking needs to be unpolished, but the finish on the desks doesn’t matter.  Although, I do have to confess that I cringed when I read Ned’s Twitter about Knoll furniture – I thought to myself, please, God, don’t go buying high-end office furniture this early in the game.  I love modern stuff like that, but damn, it’s expensive for a startup business.  Anyway, I was relieved to see the desks were inexpensive but well-crafted wood jobs instead.

I get the whole coworking thing, especially as somebody who telecommuted for five years, but I bet most people aren’t going to see it until the desks and the personalities go in.  The factor that makes coworking tick is the chemistry – the unique mix of people from different backgrounds, different companies (or no company at all), the laid-back informal discussions that don’t come from meetings organized on a calendar.

When the people are in, when the artists are slinging paint, when the beer is in the fridge, when the desks are filled haphazardly with relics of different careers, that’s when people will get it, and it’ll happen like wildfire in a city like Houston.

And you’ll wish you’d have signed a lease on a desk while they were still available.  Trust me.


VMware ESXi 5 on an Apple Mac Mini 2010 – It Works!

Hardware
78 Comments

Great news – Pedro Costa has got a working solution!  Apple Mac Mini 2010, 2011, and 2012 models all boot a patched version of VMware ESXi 5.0:

VMware ESXi 5 Running on an Apple Mac Mini

I’d always wanted a small VMware vSphere 5 (ESXi) lab farm up and running, and I wanted to use Apple Mac Minis just for compactness and the silence.  It has to be vSphere ESX or ESXi, not VMware Fusion or Parallels, because my clients all use ESXi and I wanted to be able to do things like VMotion and Storage VMotion.

These ISOs work for my Apple Mac Mini 2011 (5.1):

Download the ISO, burn it to CD, and boot from it.  The install goes flawlessly.  The USB keyboard works, video out (via HDMI!) works, and the onboard Ethernet wired network card works.  WiFi doesn’t, but that’s okay – I wouldn’t even run a lab off that.

Presto – my Ikea datacenter comes to life!

My Ikea Datacenter

That’s two Mac Minis running VMware ESXi 5, a cheap $250 Netgear NAS handling the shared storage duties, and a few other pieces of unrelated tech gear.

Thanks, Pedro!


My rudimentary approach to software testing

0

I’m testing the new builds of LiteSpeed and Toad.  I’m not to the point where I’m using my L337 SQL skillz yet because I take a very basic ground-level approach to testing.

I go through an entire program and click on every menu item.
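In spirit, that pass is just a depth-first walk over the menu tree: fire every action, and note which ones blow up.  Here’s a minimal sketch in Python, assuming a hypothetical menu structure of labels mapped to callables or submenus (a real pass would go through whatever UI driver the app exposes):

```python
def click_everything(menu, path=()):
    """Depth-first walk of a menu tree: invoke every leaf action and
    record which ones raise instead of letting the 'app' crash.
    `menu` is a hypothetical nested dict of label -> callable-or-submenu."""
    crashes = []
    for label, item in menu.items():
        here = path + (label,)
        if callable(item):
            try:
                item()                      # "click" the menu item
            except Exception as exc:
                crashes.append((" > ".join(here), exc))
        else:                               # a submenu: recurse into it
            crashes.extend(click_everything(item, here))
    return crashes
```

Point it at a toy menu with one broken action, and it reports exactly one crash with the full menu path attached – which is the whole point: every clickable thing gets clicked.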

Everywhere that it lets me add something, I add it, and then I go back and edit that something to see if it looks the same as what I added.  Then I delete it.
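That add/edit/delete pass boils down to a round-trip check: whatever goes in must come back out unchanged, survive an edit, and then disappear cleanly.  A sketch of the idea, with a hypothetical in-memory store standing in for the program under test:

```python
class InMemoryStore:
    """Toy stand-in for the app under test (hypothetical interface)."""
    def __init__(self):
        self._rows, self._next_id = {}, 1

    def add(self, record):
        self._rows[self._next_id] = dict(record)
        self._next_id += 1
        return self._next_id - 1

    def get(self, row_id):
        return self._rows.get(row_id)

    def update(self, row_id, record):
        self._rows[row_id] = dict(record)

    def delete(self, row_id):
        del self._rows[row_id]

def round_trip_check(store, record):
    """Add, read back, edit, read back, delete -- the whole basic pass."""
    row_id = store.add(record)
    assert store.get(row_id) == record, "read-back differs from what was added"
    edited = dict(record, name="edited")
    store.update(row_id, edited)
    assert store.get(row_id) == edited, "edit didn't stick"
    store.delete(row_id)
    assert store.get(row_id) is None, "delete left the record behind"
```

Crude, sure – but a surprising number of add/edit screens fail exactly this check.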

That’s all.

Sounds simple, but it takes days, and it finds a surprising number of bugs.  You would not believe the number of things that cause a program to crash just by clicking on them.  I’ve done these tests with developers in the same room, and they say things like, “Why on earth would you click there?”  Well, because you put a button there, and somebody’s going to click it.  If it’s not supposed to be clicked, get it off the screen.

In a week or two, I’ll get to the point where I’m testing advanced concepts inside the software, but for now, it’s Click City.

The other benefit of this approach is that you learn just about every piece of the software, every nook and cranny.  In the case of Toad, it’s mind-blowing.  I thought it was just a SQL Server development tool, and I don’t believe in unitaskers, but oh no – this is one serious multitasker.  I can see many Toad-centric how-to blog posts in my future.


Great SQL Server newsletter

3 Comments

When DBAs ask where to go for simple, straightforward SQL Server tips, I usually point them to SQL-Server-Performance.com.  That site puts out a fantastic daily newsletter with SQL notes, questions & answers, and general tips.

I’m not wild about daily email newsletters, but this one’s a gem.  It’s gotten even better since Peter Wardy started putting a personal editorial at the top of each issue.  Here’s the one from today:

“I was tasked to buy a mattress under blanket for a bed today and who would have thought that the process would be so complicated. I thought it would be as simple as choosing the correct size—king or queen. Little did I know that the decisions were endless; from summer and winter mattress protectors to underlay blankets with wool and magnetic touch zones and prices that ranged from $60 to over $800. When I think about the purchase decision of an under blanket it is very similar to an open source database.

When I went to the shop today all I wanted was an under blanket that would protect the mattress, keep me warm and provide some comfort. Whether that was a cotton one or a wool blend with a high pile did not worry me, I just wanted something that worked. When I choose a database it is the same as an under blanket, I want a database where I do not need to make decisions about which database engine I want to use. I want an engine that works no matter what my requirements rather than having to choose a different engine if I want transactions or a different engine again if I want Foreign Key constraints. I think one of the key benefits of SQL Server is that the behaviour is typically the same whether a single user is running Express Edition or a multinational corporation is running Enterprise Edition.” – Peter Wardy

The only things that would make this newsletter better are searchable online archives and an easier-to-find subscription link.  I can help out with that last one:

Subscribe to the SQL-Server-Performance.com Newsletter


Windows 2008 with Hyper-V on a Dell Latitude D830

Hardware, Virtualization
4 Comments

My new work laptop for Quest is a Dell Latitude D830 running Windows XP.  I gave XP another shot, but after a day I’d switched back to my MacBook Pro.  I still needed some XP-only apps, so I virtualized the XP image and set about turning the Dell into my mobile datacenter.

Windows 2008 Standard installs fine, but doesn’t have drivers for:

  • Video card – NVidia Quadro 140M – go to support.dell.com and download the Vista 64-bit drivers
  • Wireless network card – still working on this one
  • Bluetooth – don’t install the Dell Vista drivers for this.  It doesn’t work.  I don’t need it anyway, but just thought I’d try it.

Hyper-V works great so far.  It’s no replacement for VMware Workstation or VMware ESX/VI.  I’ve had a few minor problems, but nothing big:

  • Can’t easily copy a VM.  Have to export it, then import it again, and you have to be careful about how you handle folder paths.
  • Can’t make templates.  VMware handles this with Virtual Center, and I’m hoping Microsoft will come out with a similar add-on product to make it easier to quickly scale out multiple servers.
  • There’s some network strangeness when you use a laptop with both a wired connection and a WiFi connection.  It’s not obvious how to get the VMs to switch back and forth between the two.  I’ll probably need to write up a guide on this once I get the wireless adapter working, because I haven’t seen anything about it online yet.  With VMware, you can put the wired and wireless adapters on the same virtual switch, and the changes are transparent to the guests (or as Windows calls them, partitions).
  • Windows 2008 doesn’t have iSCSI target support built in.  I’d love to have this so that I could run a little farm on here and quickly switch drives inside the OSes, but no dice.  I’ve been looking at a couple of iSCSI target add-on software packages, but it’s not worth the $200-$300 price, especially when those licenses only support a few guests.

I’m loving Hyper-V so far, though.  It’s not as good as ESX, but it’s a heck of a strong product for a first version.


Bought a Wii – and then gave it away

1 Comment

On my trip to DC this week, I ran across a couple of Nintendo Wiis in stock at Target.  For those of you who don’t follow video games, the Nintendo Wii is a notoriously hard-to-find game system.  It’s been out for months, but you still can’t walk into a store and just buy one.  Prospective customers learn through the grapevine when new shipments of Wiis will be delivered at local stores, and then patiently wait outside before the store opens in the hope that they’ll snag one.  Some are bought just to make a profit – people resell them on Craigslist for $0-$200 more than their $250 sticker price.

The Wii’s selling point is the motion-sensing remote control.  When playing the baseball game that comes with the Wii, you don’t push a series of buttons to swing when the baseball comes at you.  Instead, you simply hold the remote control like a bat, and swing it.  The better your swing, the better you hit the ball inside the game.  Same with tennis: hold the controller like a tennis racket and put topspin on the ball, and presto, the virtual ball spins.  It’s pretty impressive, and it’s playable by young and old alike.  As long as you can wave the controller around, you can play.

This simple, intuitive control system makes the Wii a smash hit at parties.  Anybody can grab the controller and play tennis, baseball, golf, whatever.  No skills required.  I’ve been wanting to throw myself a welcome-to-Houston party to get back in touch with our Texas friends, and I thought the Wii would be a great icebreaker.  I’m not into video games – I haven’t owned a console in years – but when I saw the Wii in stock, I had to get it.

Then I started thinking – why not take it to Emily’s house and show it to her 4-year-old stepson?  When we arrived, he had a friend over, and his friend recognized the Wii with wide eyes.  They got all excited, and I had a great time watching them jumping around the living room swinging at virtual baseballs, swinging virtual golf clubs, and rolling virtual bowling balls.  The other kid’s mom came over and went, “Wow, you have a Wii?  How’d you get it?”  She’d been dying to get one for her three kids.

It took me back to my childhood when one of our neighbors, Foster Cuthoff, had the coolest video game system on the block.  We’d all go over to his basement and play Contra for hours and hours.  Everybody thought it was so cool, and we would get all antsy over who would play next.

And then it happened.  My feeble mind connected the dots, and I realized I’m the wrong owner for that Wii.  It made so much more sense for Em to have it.  I’d only dabble with it every now and then at parties, because to me games aren’t really any fun unless they’re shared.  So, I gave it to Em.  Her stepson was so excited that Em says he’s still going around saying how much he loves me.

Awesome.  If that’s not success, I don’t know what is.  Granted, I didn’t save anybody’s life or anything, but it’s a hell of a good feeling.


Back from vacation, heading to my new job

2 Comments

My blog has been quiet the last few days because I’ve been in Washington DC visiting my Mom and my sister Emily.  I wanted to sneak in a quick visit between jobs.  I took the opportunity to redo both of their wireless networks and achieved a 0% success rate – I had to get two new wireless routers, for different reasons.  As a result, I haven’t had much connectivity, so no blogs for you.

Today, I’m flying to California to start work at Quest.  I’ve got my laptop cellular card all rigged up, got my camera charged up, and got my game face on.  Expect mucho bloggo.