These days, I find myself using my iPad more and more as my main computing device on the road. Some recent OS & app updates brought killer improvements that have let me leave my laptop at home more often. Here are my favorites these days:
Be social with the Twitter app – The official Twitter app runs in the background, and whenever you’re mentioned in a tweet or you get a direct message, a notification pops up. Just like a full-blown laptop, you can switch over to Twitter, reply, and then switch back over to your app, right where you left off. This was the biggest thing I was missing at conferences – I wanted to take notes or work on my blog, but get alerted when someone asked me a question on Twitter.
Give better presentations with Apple Keynote – This is the Apple equivalent of PowerPoint. It opens PowerPoint files, and when combined with the VGA dongle, it can display PowerPoint slides on a projector. Just like PowerPoint’s presenter view, you get a presenter view on your iPad too! There are two serious drawbacks. First, writing slide decks from scratch in Keynote on an iPad is hell, and second, while the VGA dongle is plugged in, you can’t plug in the charger. This isn’t a dealbreaker given the iPad’s excellent battery life (usually 8-10 hours), but it’s a bummer. At conferences, I spend most of my time running around like a SQLChicken with my head cut off, but I present for at least an hour a day. That would be the perfect time for me to plug in and recharge the iPad, but no dice.
Read everything with GoodReader – this file viewer on steroids lets you read PDFs, Office documents, and much more, plus syncs with a file server, cloud server, FTP server, and even Google Docs! I keep all my reference material with me on the road so I can answer questions faster.
Sync thoughts with Elements – I’m old school, and I keep all my notes in text files. I use Elements as a text editor on my iPad, and it automatically syncs all my text files across my phone, laptops, and iPads via Dropbox, a free cloud file share service. Elements does one thing, and it does it extremely well.
Manage my blog with WordPress – the best blogging platform has an app for writing posts, doing quick fixes on the road, and moderating & responding to comments. It works on both iPads and iPhones, by the way.
My favorite other apps for traveling – I use Night Stand HD as a beautiful alarm clock while my iPad or iPhone is plugged in (and it never locks the screen), the free TripIt to track all my travel plans in one place, and Ambiance as a white noise generator.
My favorite iPad cases – Ladies in the audience, I probably have more bags than you have purses. I looove bags, and I’m not ashamed to say it – although I’m a little ashamed to admit that I’ve got more than a dozen laptop or iPad bags in rotation right now. Here are my favorites:
- Small – iPad-only messenger bag – Booq Push – when I want to take only the iPad and nothing but the iPad. It’s extremely minimal, but the construction is just awesome. Top-notch quality. Leave your power charger behind, though, because the only things you can bring with this are a USB cable, headphones, a VGA dongle, and airplane tickets. Available at Amazon. (But not Prime as of this writing.)
- Medium – iPad shoulder bag – Booq Taipan XS – slightly larger with pockets to bring chargers, a small digital camera, and more little goodies. Available at Amazon. (But not Prime as of this writing.)
- Large – iPad, DSLR, and more – Tamrac Rally 5 – I recently picked up a more serious digital camera and a couple of lenses, so this bag has become my leisure-trip carryon. It houses my iPad, Lumix GF1, lenses, chargers, and whatnot in a smaller case than a typical laptop bag. Tamrac makes a whole series of laptop + camera bags, including the Rally 7 for 15″ laptops plus DSLRs and lenses.
I know what you’re thinking, though. “Brent, I don’t have an iPad, and I don’t want to hear any more about your cool toys.” Well, I’ve got good news – make that great news. I happen to know somebody who’s giving away ten iPads plus software to monitor your SQL Servers just in time for the holidays. Watch this space!
Jen McCown (the wife part of the husband-and-wife MidnightDBA team) declared yesterday to be the first Un-SQL Friday – a day where bloggers talk about anything other than SQL Server. The topic of the day was branding, and Jen’s post declared me a branding superstar. (Awk-ward.)
I couldn’t participate right away because I already had a post scheduled for Friday, and I try not to overwhelm you guys with blog posts. I watched the responses unfold, and I’m not going to name names, but I saw a heck of a lot of really, really bad advice – and nobody referenced a book.
Hello, people. Here comes the tough love.
In the land of the blind, the one-eyed man is king. In the land of SQL Server, the guy who’s only read a couple of books about it is considered a “branding superstar.” I’ll be the first to tell you that I suck at branding and marketing, and I’ll tell you that because when it comes to marketing, I don’t consider my competitors to be the SQL Server community. I consider my competitors to be the marketing community, because my real competitors are big consulting companies that can afford to hire full-time marketing departments.
When I quit my job in July and joined SQLskills, I stopped getting a paycheck. Period. I’m a consulting partner – I have to find clients, do work, and get the clients to pay us. I thought for sure that I’d be eating ramen noodles for six months while I got my pipeline together, but a funny thing happened on the way to the microwave. I got email after email that all started with the same few words: “I loved your blog/presentation/webcast on ___, and I was wondering if you could help us with…”
That’s the kind of customer that consulting companies love. The customer already knows who I am, they know what I do, and they choose to work with me. No cold calls, no taking the executives out to the golf course, no expensive dinners at steak houses – just instant relationships. I can only do this because I’ve built a brand over the last few years that companies know they can trust, and that they want to work with by choice. I could only do that because I learned from people who really understood branding and marketing, because I sure as heck don’t.
There are millions of people out there who do marketing for a living. There are thousands of people who are amazing marketers, and there are dozens of books about it at your local bookstore. Stop learning by trial and error and error and error – go pick up one of these books:
- Accidental Branding: How Ordinary People Build Extraordinary Brands – David Vinjamuri
- The 22 Immutable Laws of Branding – Al Ries and Laura Ries – you can break some of these laws and still become successful, but you’ll get a big head start if you play by the rules.
- The Whuffie Factor: Using the Power of Social Networks to Build Your Business – Tara Hunt
- Permission Marketing – Turning Strangers Into Friends, and Friends Into Customers – Seth Godin
- All Marketers Are Liars Tell Stories – Seth Godin
You wouldn’t take your car to the dry cleaners to find out what that rattling noise is. You wouldn’t go to your favorite restaurant to ask how to implement accounting software. Likewise, don’t take branding advice from people who don’t make a successful living doing branding. ‘Nuff said.
I’m angry when people steal my blog posts. I’m bummed when people give away pirated PDF copies of our book. But the very worst feeling for me is when someone steals my presentations.
I pour dozens of hours into each of my 45-60 minute sessions. For every single one, I have to:
- Write the story
- Build the slide deck
- Pick just the right images and properly attribute them under Creative Commons
- Take product screenshots or write demo code
- Build a resources page on my site
- Rehearse some more
- Give the presentation at the local level, figure out what works and what didn’t, tweak the slide deck to refine it, and then start the rehearsal process again
- Eventually build up to presenting it at the national level
Some national conferences require me to upload my slide deck ahead of time, and every time I do it, I cringe. I would love for every attendee to be able to use my slides as a set of notes for later reminders, but I’m terrified because of what plagiarists do with my slides. Yes, people actually re-give my presentations, and I’m not the only one who’s fallen victim. Here’s what you need to know before you submit an abstract to a conference:
Read the recording rights in the speaker contract. When you present at a major conference, you sign a speaker contract with that conference that gives the conference organizers certain rights. Most conferences record video or audio of your session, and you need to be aware of what happens with those recordings afterward. Some conferences like TechEd and SQLBits give the recordings away for free on the web, and some conferences like the PASS Summit sell the recordings. If you pour a lot of time into building a session, and then people can view it for free over the web, you’ll have a tough time selling that content yourself later. People will be less likely to pay for a pre-con session if they can get that same material for free online.
Find out if “recording” includes transcription posting rights. Some conferences have different ideas of what the word “recording” means. I’ve worked with one conference that decided to take transcriptions of my session and publish them as separate articles on their web site. That would have been okay with me if they’d taken the time to use a spell checker and to polish the articles, but here’s what the article ended up looking like:
“Perfmon is real important. As you can see here on the scren, when the Disck Reads metric is above this numbur, evertyhing is fine, but when it’s below this, pick your job up off the floor and…”
Since the article had my name on it, I was horrified, and I asked the conference to take it down immediately. I volunteered to clean the article up myself at no charge because I’d rather lose money and look good than ignore it and look stupid. Thankfully, they agreed to take it down period, but I learned my lesson – I won’t sign another speaker agreement that allows the conference to post transcriptions that I haven’t been able to approve ahead of time.
Find out if “recording” includes the slides. One conference’s contract includes the PowerPoint deck as part of the recording, then says the conference can do anything they want with the recording. You have to read between the lines to figure out that this includes the ability to do anything they want with your slide deck, including taking your identifying information out and letting other people re-present your presentation! Over the last couple of weeks, Adam Machanic, Gail Shaw, and I have been engaged in a struggle with a particular conference’s organizers. The conference took previously presented decks, gave them to other presenters, and let them re-present the material. I’m not going to name the conference because they’re working with us to improve how they handle ownership of presentations, thank goodness. The frustrating part is that I had to repeatedly explain to the conference organizers just how bad this would be for their image if it became public.
Ideally, know your rights even before you submit an abstract. Some conferences will honor your requests for changes, but only if you make that a part of your abstract submission. Having gone through the above messes, I know that certain conference organizers want me to say “This presentation and abstract is copyright Brent Ozar, and no rights are transferable to anyone else without my express written permission ahead of time.” I’m frustrated that I have to even tell them that, but it is what it is, and if I don’t include that in the abstract then I may have to back out of the conference later when the speaker contract arrives.
Decide when you’re willing to let presentations go to the public. For certain events, I give attendees my PowerPoint slides because I want them to re-deliver my presentations. I want my SQLCruise attendees to take their new-found knowledge back to the office, then give the presentations to their coworkers. Their boss will see the value gained by sending someone on the cruise, and then hopefully pay for the attendee to return each year. However, I make it really clear to the attendees that they’re not allowed to present these decks in public or to user groups. For other events, I build decks knowing full well that they’ve got an expiration date – I won’t ever be able to attract as many in-person viewers once the presentation is available on the web for free.
Share PDF copies of your presentation, not the PowerPoint original. When attendees and conferences want copies of your slide decks, don’t give them the PPTX files. Give them an exported PDF copy of the slides, which is good enough for people who need to verify that you’ve got content ready or want to take notes. When people insist on getting the PowerPoint slides, ask them why, and don’t be satisfied with brush-off answers. You’re the content owner – take control of your content.
Set up Google Alerts for your presentation titles and abstracts. It really bums me out to even have to type that, but the reality is that plagiarism isn’t going away. When you get a hit, politely approach the speaker and ask, “I see that you’re presenting on The Top 10 Ways to Make Microsoft Access Stop Sucking. That sounds a lot like a presentation I did last month, and I’d love to hear more about your ideas.” When a speaker truly didn’t know about your presentation, they’ll be excited to bounce ideas off you. On the other hand, if they were planning on plagiarizing your work, they’ll adjust their plans accordingly. If they don’t respond in a way that gives you the warm-and-fuzzies, email the conference organizers and give them a heads-up.
Finally, don’t use copyrighted material in your own presentations. Movie pictures, album covers, and other people’s photos are generally off-limits. Instead, use my post on Finding Free Pictures for Blog Posts and Presentations.
When I complain about plagiarism, I hear the same thing over and over from other bloggers: “Nobody ever plagiarizes me. I guess I’m not that important.”
Really? So you’ve been checking to see if your stuff has been copied?
Exactly – you haven’t. It’s time to find out how.
Watch Your Trackbacks and Incoming Links
Odds are your blog posts will include links back to your own site at some point, like when you refer to your other posts. The quickest way to stay on top of this is to glance at the “Incoming Links” module in your WordPress dashboard:
In that screenshot, I can see that Steve Jones, Tom LaRock, Stacia Misner, Ted Kreuger, and “unknown” have all linked to my site recently. By glancing at that list, I can see that most of those are completely okay, but the “unknown” one gives me pause, so I’d click on that to make sure it’s a legit blog. On a side note, you should always monitor these anyway, click on all of the links, and read what people are saying about you.
Another built-in WordPress tool is the list of pingbacks. When people copy your work verbatim and publish it, their blog may try to send a pingback link alerting you. Go into your Comments list and filter it by pings only:
In that screenshot, I can see that Sean Gallardy has linked to my SQL Server checklist. I would want to click on that link to make sure it’s not an exact word-for-word copy of my own checklist, or another one of my blog posts that happened to link to my own checklist.
Set Up Free Google Alerts
Even if the plagiarist is smart enough to disable pingbacks, they probably won’t strip the links out of your blog posts. To catch those, I set up Google Alerts for real-time notifications; whenever Google runs across the phrase “BrentOzar.com” anywhere on the web, they send me an email. I can tell at a glance if it’s a plagiarized post, a forum question pointing to one of my articles, or a blog comment. I’ve set up similar alerts for sites I manage, my name, companies I work for, and so on.
When I’ve built a blog post I’m particularly proud of, I even set up Google Alerts for key phrases in the post. For example, in my SQL Server 2008 DAC Pack blog post, I used the phrase “Bringing Sexy DAC.” I can be fairly certain that phrase will not come up often, and if it shows up on the intertubes, somebody’s stealin’ my work. That phrase is a little down the page, beyond the first paragraph, so it shouldn’t show up if someone’s only showing the first few sentences of my post (which would be okay.) I set up a Google Alert for that, and if anybody is automatically reposting my work, I get notified.
(Yes, I’ve deleted that Google Alert now because I know by saying this, I’m going to get a bunch of tweets saying “I’m Bringing Sexy DAC!” Heh. I love you people.)
Monitor Your Referrers
If you’re using web analytics tracking to see how (un)popular your site is, it probably has a screen to show which sites are linking to you. In my favorite free web analytics tool, Google Analytics, it’s under Traffic Sources, Referring Sites:
Because the plagiarist may not be popular yet, you need to go through ALL of the referring sites, not just the top ten. The more popular you get, the more painful this gets, but on the plus side, you get a warm, fuzzy feeling seeing everybody linking to you in a good way.
I go through this list looking for sites I don’t recognize, then I drill into the analytics to find out exactly where in the site they’re coming from, and I click on it. Hopefully it’s not an exact copy of one of my posts that links to another one of my posts.
Use Tynt.com to Tweak Copy/Paste
This has to be one of the coolest tools I’ve ever seen. The easiest way to understand how it works is to see it in action. Go to any page on BrentOzar.com, select some text, copy it, and then paste it into a text editor:
SHAZAM. It doesn’t get much more obvious than that. I used to use more polite wording, but after being repeatedly plagiarized, I’m going with the big guns now.
Tynt even gives you a slick dashboard to show where your content is being pasted:
Search Manually with Copyscape.com
Finally, every now and then I go searching for copies of my recent posts with Copyscape.com. I put in a URL to a recent post (30-60 days old), and Copyscape goes hunting for similar copies. Their logic is pretty fuzzy, and it gives me a lot more misses than hits, but when it hits, it hits big time. It catches plagiarists who are smart enough to disable trackbacks, strip out your links, and even futz with your wording to try to make it look different.
This is how I caught CrazySQL initially, and how I found that BugoSQL was trying to hide some of my posts in disguised PDF files.
It’s a lot of work catching these diabolical bastards, and it’s like a never-ending game of Whack-a-Mole. I have to keep playing because I make a living off my content – it’s my marketing tool to bring in new consulting customers. This is especially important to me now that I’ve become a full time consultant; I don’t get paid unless I’m working for a client. I’m not getting paid to write this, either, but I do it because I’m passionate about helping the community and helping bloggers protect their content.
“If you sit by the river long enough, you will see the body of your enemy float by.” – Proverb
“The tribe has spoken. It’s time for you to go.” Jeff Probst, Survivor
In Monday’s post about the latest plagiarist, I talked about how WordPress.com is wonderfully responsive to complaints of plagiarism. Whenever I find my work copied on a blog hosted at WordPress.com, I email the blogger first to talk them into taking it down. If they don’t respond or comply, I just file a DMCA request via email, and WordPress takes the offending pages down within a day or two. I don’t bother alerting the community if it’s just one or two posts, or if I’m the only guy who’s been ripped off. I’d rather just get the problem fixed, move past it, and let the plagiarist lick their wounds in private.
Unfortunately, not every host responds like WordPress.com, and the problem gets worse with self-hosted and international sites.
For the last year, I’ve been struggling with another site that’s copying my stuff:
Pardon my French, but I hate these bastards.
- They’re physically copy/pasting my articles. If you see the “Read more” link, that’s what happens when you copy/paste content off my web page thanks to Tynt.com, a great tool for bloggers that I’ll be discussing later this week. This jerk is just going to my page, highlighting what he likes, copying it, and pasting it into his own blog.
- They’re hotlinking to my images. The error message you see (it even includes BrentO!) resides on i.brentozar.com, my caching site, and I pay whenever someone views that image. Don’t get me wrong, it’s not expensive, but this lazy no-good bum is costing me money.
- They’re ripping off my stuff, Microsoft TechNet, MSSQLTips, and a bunch of you bloggers.
- They have no contact link to send email, and no personally identifying information.
- They’re using UmbraHosting.com, and that crappy hosting company doesn’t respond to DMCA requests or emails. They’re effectively condoning plagiarism.
A few days ago, I was ecstatic to see that the domain name had lapsed without being renewed, and the site went down. Gail Shaw and I were cheering that this bozo had finally bitten the dust, but like a zombie, he’s back from the dead.
So I give. I can’t get this site to die, and that’s where you come in. I can’t offer a bounty to whoever gets the plagiarism removed off that site, because that’ll be too tough to prove, so I’ll offer a reward instead.
When that site is no longer serving plagiarized material (either because it’s taken down altogether, or the plagiarized material is gone) I’ll donate $250 to the non-profit Electronic Frontier Foundation in the name of the SQL Server community.
C’mon, people. Show me what ya got. Ferret out the owner of this site, get in contact with the hosting company, and show off your ability to play Internet Detective.
Update Nov 18th – it’s down! I’m going to wait a few days to make sure they stay down, and then I’m ecstatic to make the $250 donation to the EFF.
Update December 4th – in the last couple of weeks it went up and came down a couple of times, but their web host has assured me they won’t let it go online again with copyrighted material intact. To celebrate, I’ve made the $250 donation to the EFF. Yay, rights protection! I got a swag package for the $250 donation, and I’ll give that away on my blog when I get it.
Every now and then I get enough content piled up about a subject that I decide to dedicate a whole week-long blog post series to it. The good news is that you’ll have plenty of original material to read this week, but the bad news is that you’ll also have copied material too. Welcome to Plagiarism Week!
When I was researching material for my FreeCon, I wanted to show bloggers how to optimize their site for search engines. One of the topics involved putting related images in each blog post; if you’ve got a story about SQL Server setup checklists, you should put screenshots in there and tag them appropriately. Search engines recognize that your post is more complete than others because it’s got eye candy.
To illustrate it, I went to Images.Google.com and searched for SQL Server setup checklists because I’m quite fond of my setup checklist post and it does well in search engines. The results look like this:
The third result is mine, and I recognized it immediately because the screenshot had my company’s SQL Server name in it (from the time I wrote the post). The majority of the images on the page do indeed relate to SQL Server, but some of them are surprising. For example, the baby’s face at the bottom left might seem odd, but it’s actually from the comment avatars on my blog post. The one that intrigued me most was result #4 – the MCITP logo. That’s a pretty high-ranking result for such a generic picture – the content must be fantastic! So I clicked on it:
Wow, that is indeed some good content. Of course, I might be a little biased, because it’s my checklist.
Compare the copied checklist screenshot above (no, I’m not going to link to that guy’s site) to my SQL Server setup checklist, and you’ll notice that he stripped out my introductory paragraphs where I talk about building these checklists through my years of experience. He didn’t just delete my personal text, he also went to the effort of deleting every link back to my site, including the images. He even merged my Part 1 and Part 2 pages together to avoid linking to me.
This isn’t a casual copy/paste job or an RSS tool – this is a hard-working plagiarist who had managed to circumvent every protection I’d built into the blog. He had ads on the checklist, and he’d managed to rise to the first page of Bing results:
I hadn’t caught this guy earlier because I use Google, and he doesn’t show up in Google’s results. No, I’m not saying Bing promotes plagiarism – I’m just saying it’s like the high school teacher who didn’t quite catch on that you copied your term paper from mine.
I followed the steps in my article What to Do When Someone Steals Your Blog Posts. I contacted the author via the email on his About Me page and LinkedIn profile, neither of which I’m going to link to here. When someone links to your web site, search engines believe you have a more credible web site, and I’m not about to give this guy any Google juice. When he didn’t respond to emails, I filed DMCA takedown notices with his web host, WordPress.com, which has always been extremely responsive for me. I love how WordPress protects the rights of authors whose content has been stolen:
I wasn’t his only victim. He stole multiple posts from Microsoft, too, like this one:
The bad news is that he may have stolen your content, too. Since he went to great lengths to disguise my content, I’m guessing he may have disguised yours too, so it’s time to spend some time reading his web site:
If you interact with this author, I have two requests. First, keep it civil – he’s a real guy somewhere with a real life and a real job. He made mistakes, and he’s about to learn from them, but it’s not like he killed anybody. I could have emailed my contacts at his employer, but I don’t want to ruin his life – I just wanted the plagiarism to stop. Keep the punishment in perspective. Second, don’t make racial comments – remember that the last big plagiarism scandal around here was a white guy from the US. It can happen anywhere to anyone.
SQL Server 2012 brings huge improvements for scaling out and high availability. To put these changes into perspective, let’s take a trip down memory lane first and look at the history of database mirroring.
SQL Server 2005 first introduced mirroring, although it wasn’t fully supported until Service Pack 1. In many ways, mirroring beat the pants off SQL Server’s traditional high availability and disaster recovery methods. Log shipping, clustering, and replication were known for their difficulties in implementation and management. With a few mouse clicks, database administrators could set up a secondary server (aka mirror) to constantly apply the same transactions that were applied to the production server. In synchronous mode, every transaction had to be committed on both servers before it completed, giving a whole new level of confidence that no transactions would be lost if the primary server suddenly died. In asynchronous mode, servers separated by hundreds or thousands of miles could be kept nearly in sync, with the secondary server a matter of seconds or minutes behind – better than no standby server at all.
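To make the two modes concrete, here’s a minimal sketch of the T-SQL involved – the server names, port, and database name are hypothetical examples, and in a real deployment you’d restore the database on the mirror WITH NORECOVERY and create endpoints first:

```sql
-- On the mirror server (after restoring SalesDB WITH NORECOVERY):
ALTER DATABASE SalesDB
    SET PARTNER = 'TCP://PrincipalServer.corp.local:5022';

-- Then on the principal server, to start the session:
ALTER DATABASE SalesDB
    SET PARTNER = 'TCP://MirrorServer.corp.local:5022';

-- Synchronous mode: transactions must harden on both partners before committing.
ALTER DATABASE SalesDB SET PARTNER SAFETY FULL;

-- Asynchronous mode: the mirror can lag by seconds or minutes.
ALTER DATABASE SalesDB SET PARTNER SAFETY OFF;
```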
SQL Server 2008 improved mirroring by compressing the data stream, thereby lowering the bandwidth requirements between the mirroring partners.
In one of the most underrated features of all time, Microsoft even used mirroring to recover from storage corruption. When the primary server detected a corrupt page on disk, it asked the mirror for its copy of the page, and automatically repaired the damage without any DBA intervention whatsoever. Automatic page repair doesn’t get nearly the press it deserves, just silently working away in the background saving the DBA’s bacon.
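If you want to see the repairs happening, one way (sketched here – column lists may vary by version) is to query the auto page repair DMV alongside the suspect pages table that SQL Server maintains:

```sql
-- Pages that database mirroring has attempted to repair automatically:
SELECT DB_NAME(database_id) AS database_name,
       file_id, page_id, error_type, page_status, modification_time
FROM sys.dm_db_mirroring_auto_page_repair;

-- The underlying log of damaged pages lives in msdb:
SELECT * FROM msdb.dbo.suspect_pages;
```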
Database Mirroring’s Drawbacks
While SQL Server was able to read the mirror’s copy of the data to accomplish page repairs, the rest of us weren’t given the ability to do anything helpful with the data. We couldn’t directly access the database. The best we could do was take a snapshot of that database and query the snapshot, but that snapshot was frozen in time – not terribly useful if we want to shed load from the production server. I wanted the ability to run read-only queries against the mirror for reporting purposes or for queries that could live with data a few minutes old. Some companies implemented a series of snapshots for end user access, but this was cumbersome to manage.
Unlike log shipping and replication, mirroring only allowed for two SQL Servers to be involved. We could either use mirroring for high availability inside the same datacenter, OR use it for disaster recovery with two servers in different datacenters, but not both. Due to this limitation, a common HA/DR scenario involved using a cluster for the production server (giving local high availability in the event of a server failure) combined with asynchronous mirroring to a remote site. This worked fairly well.
The next problem: database failovers are database-level events. DBAs can fail over one database from the principal to the secondary server, but can’t coordinate the failover of multiple databases simultaneously. In applications that required more than one database, this made automatic failover a non-option. We couldn’t risk letting SQL Server fail over just one database individually without failing over the rest as a group. Even if we tried to manage this manually, database mirroring sometimes still ran into problems when more than ten databases on the same server were mirrored.
Database mirroring didn’t protect objects outside of the database, such as SQL logins and agent jobs. SQL Server 2008 R2 introduced data-tier applications (DACs), a packaged set of objects that included everything necessary to support a given database application. I abhor DACs for a multitude of reasons, but if you were able to live with their drawbacks, you could more reliably fail over your entire application from datacenter to datacenter.
Enter AlwaysOn: New High Availability & Disaster Recovery
It’s like mirroring, but we get multiple mirrors for many more databases that we can fail over in groups, and we can shed load by querying the mirrors.
That might just be my favorite sentence that I’ve ever typed about a SQL Server feature.
I am the last guy to ever play Microsoft cheerleader – I routinely bash the bejeezus out of things like the DAC Packs, Access, and Windows Phone 7, so believe me when I say I’m genuinely excited about what’s going on here. I’m going to solve a lot of customer problems with mirroring 2.0, and it might be the one killer feature that drives Denali adoption. This is the part where I raise a big, big glass to the SQL Server product team. While I drink, check out the Denali HADR BooksOnline pages and read my thoughts about the specifics.
First off, we get up to four replicas – the artist formerly known as mirrors.
Denali also brings support for mirroring many more databases. We don’t have an exact number yet – we never really got one for 2005 either – but suffice it to say you can mirror more databases with confidence.
DBAs set up availability groups, each of which can have a number of databases. At failover time, we can fail over the entire availability group, thereby ensuring that multi-database applications are failed over correctly.
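Based on the Denali CTP Books Online syntax, setting one up looks roughly like this – every name here (the group, the databases, the replicas) is a hypothetical example, and this is a sketch rather than a complete script (you’d still join the secondaries and restore the databases on them):

```sql
-- One availability group containing two databases that fail over as a unit:
CREATE AVAILABILITY GROUP SalesAG
    FOR DATABASE SalesDB, SalesArchiveDB
    REPLICA ON
        'SQLPROD1' WITH (
            ENDPOINT_URL      = 'TCP://SQLPROD1.corp.local:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,   -- local HA partner
            FAILOVER_MODE     = AUTOMATIC),
        'SQLDR1' WITH (
            ENDPOINT_URL      = 'TCP://SQLDR1.corp.local:5022',
            AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,  -- remote DR replica
            FAILOVER_MODE     = MANUAL);
```

Because the FOR DATABASE clause takes a list, a failover moves every database in the group together – exactly the multi-database coordination that plain mirroring couldn’t do.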
Denali’s HADRON improvements change my stance on virtualization replication. For the last year, I preferred virtualization replication over database mirroring because it was easier to implement, manage, and fail over. Virtualization still wins if you want to manage all your application failovers on a single pane of glass – it’s easy to manage failovers for SQL Server, Oracle, application servers, file servers, and so on. However, the secondary servers don’t help to shed any load – they’re only activated in the event of a disaster.
AlwaysOn Isn’t Perfect
I need to be honest here and tell you that Denali threw out the baby with the bathwater. There’s going to be a lot of outcry because some of our favorite things about database mirroring, like extremely easy setup, are gone. Take a deep breath and read through this calmly, because I think if you see the big picture, you’ll think we’ve got a much smarter toddler.
AlwaysOn relies on Windows clustering. I know, I know – clustering has a bad reputation because for nearly a decade, it was a cringe-inducing installation followed by validation headaches. Some of my least favorite DBA memories involve misbehaving cluster support calls with finger-pointing between the hardware vendor, SAN vendor, OS vendor, and application vendor. This is different, though, because clusters no longer require shared storage or identical hardware; we can build a cluster with a Dell server in Miami, an HP server in Houston, and a virtual server in New York City, then mirror between them. Now is the right time for AlwaysOn to depend on clustering, because the teething problems are over and clustering is ready for its close-up. (One caveat: clustering requires Windows Server Enterprise Edition, but Microsoft hasn’t officially announced how licensing will work when Denali comes out.)
When you’ve got a clustering/mirroring combo with multiple partners involved, you want to know who’s keeping up and who’s falling behind. You’ll also want to audit the configurations. There’s an improved Availability Group dashboard in SQL Server Management Studio, but I’d argue that GUIs aren’t the answer here. For once, brace yourself – I would actually recommend PowerShell. I’ve given PowerShell the thumbs-down for years, but now I’m going to learn it. It’ll make HADRON management and auditing easier.
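Whether you drive it from PowerShell or straight T-SQL, the CTP exposes new DMVs we can script against for exactly this kind of health check — DMV and column names below are taken from the CTP and could change before RTM:

```sql
-- Sketch: one row per replica per availability group, showing which
-- server is the primary and whether each replica is keeping up.
SELECT  ag.name                          AS availability_group,
        ar.replica_server_name,
        rs.role_desc,                    -- PRIMARY / SECONDARY
        rs.synchronization_health_desc   -- HEALTHY / NOT_HEALTHY / etc.
FROM    sys.dm_hadr_availability_replica_states AS rs
JOIN    sys.availability_groups   AS ag ON ag.group_id   = rs.group_id
JOIN    sys.availability_replicas AS ar ON ar.replica_id = rs.replica_id
ORDER BY ag.name, rs.role_desc;
```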
Summing Up Denali AlwaysOn
There are a lot of challenges here, but as a consultant, I love this feature. It’s a feature built into the product that gives me new ways to handle scalability, high availability, and disaster recovery. There’s a lot of potential in the box, but the clustering requirements are going to scare off many less-experienced users. Folks like us (and you, dear reader, are in the “us” group) are going to be able to parachute in, implement this without spending much money, and have amazing results.
Over the next few months, I’ll be taking you along with me as I dig more into this feature. I plan to implement it in labs at several of my customers right away, and I’ll keep you posted on what we find. If it’s anywhere near as good as it looks, I’m going to be raising a lot of glasses to Microsoft.
If not, I’ll be pointing Diet Coke bottles at Building 35 until they fix the bugs, because this feature could rock.
Microsoft Atlanta is a new cloud-based SQL Server monitoring tool. Companies can deploy an agent on their SQL Server 2008 and 2008 R2 instances, and the agent will communicate up to Microsoft’s servers in the cloud (through a secure gateway also installed inside the company’s network). Microsoft aggregates the data, analyzes it looking for problems, and displays it back to IT users like database administrators through a web-based dashboard.
I’m not bashful about cheerleading for Microsoft products that rock, but I’m not feeling this one quite yet. Here are the issues I’m concerned about:
It only monitors SQL Server 2008 and newer. Most of my clients still have SQL Server 2005 (if not 2000) in the shop. They don’t want one tool to monitor SQL Server 2008 performance and another solution for SQL Server 2005. Frankly, they don’t even want to settle for one monitoring tool per database platform – they want one pane of glass to monitor SQL Server, SharePoint, IIS, app servers, and often other database platforms like Oracle or Sybase.
It sends diagnostic data to the cloud. Some of my clients can’t let their own developers get access to queries running on the production server, let alone send them offsite to servers that could be in any geographic location. In larger enterprises, privacy and compliance regulations fiercely guard where data can go. Atlanta could be sending your usernames, application names, and queries to the cloud – and don’t forget that queries can include things like customer names, personally identifying numbers like SSNs, or even credit cards.
It’s not real-time. When a user calls saying their query just timed out or produced an error, I need to see what’s happening on my servers right now. I don’t want to use one tool to troubleshoot what happened 90 seconds ago versus what’s happening now. I need live data, and I can’t wait for round trips to the cloud.
It feels like it’s targeted at small businesses that can’t afford a monitoring infrastructure or an in-house database administrator. They’ve got just one or two SQL Servers, and if they added a monitoring tool in-house, they would have to put it on the same SQL Servers they’re monitoring – thereby introducing a nasty little point of failure. Even if they added another SQL Server, they wouldn’t know how to read the results of most monitoring tools, and they see Microsoft Atlanta as the Easy Button. For these users, it seems to make sense at first glance, except there’s one little problem.
Small businesses with no SQL Server staff in-house will rarely find out that this product exists.
You know what I think? (Of course you do, because you read my blog, and you’re inside my head.) I think Microsoft should bundle Atlanta with every SQL Server as an option during the installation. Currently, the installation process asks if we’d like to send feature usage data back to Microsoft, and I get the feeling there’s a dashboard somewhere in Building 35 that proves only 14 people in the world have ever used Service Broker. That’s a good start, but let’s kick it up to the next level and ask if users want to get performance recommendations sent to them weekly, and enter an email address right then and there to subscribe to the alerts. SAP does something like this by running weekly tests on their databases at each client automatically, phoning those results home, and then sending the users a warning when something has gone awry in their environment.
It shouldn’t be an extra-pay feature – and I’m not saying that because I think we’re entitled to it. I just think very, very few people will pay extra for this as an add-on service, but it could be a really neat feature to entice businesses to upgrade to the latest versions of SQL Server. As a DBA, I’d love to forward my boss the weekly Microsoft email alerts:
- Me: “Looks like Bob in Accounting went wild and crazy with the joins again. Here’s a list of recommendations to improve his queries. Instead of us spending another $10k on drives like he asked, he needs to fix his queries. Do you want to take them into Accounting, or do you want me to?”
- Boss: “Wow, that Bob is a bozo. Hey, can we get these same recommendations for the HR server?”
- Me: “Nope, that’s SQL Server 1946. This is a new feature of SQL Server 2031. I can haz upgrade?”
- Boss: “Yes. Kthxbai.”
You can refresh this page every couple of minutes to see what’s happened. The most recent updates are at the bottom.
8:21AM – Today opened with Tina Turner singing “Simply the Best.” It’s either that or I drank entirely too much last night and Rushabh opened with a musical number, I’m not quite sure.
8:28 – Rushabh keeps talking about how we’ve touched a lot of people. I think I read about this in the harassment complaint. They’re aiming to touch more people – or maybe be touched by more people, I’m not quite sure – by webcasting the keynote live for free.
8:35 – Microsoft’s Mark Souza is playing doctor onstage to talk about the 400+ Microsoft development team members here to answer your questions.
8:39 – Talk to Microsoft folks, and you’re eligible to win all kinds of prizes including an XBox Kinect.
8:40 – “Everybody, look under your chair – if you’ve got an envelope, you just won a Dell Alienware laptop!” Dang, all I got was chewing gum. And of course you know I just felt under there without looking. Ewww. I won some used chewing gum.
8:42 – Rushabh’s introducing Ted Kummert to talk about some upcoming news. WOOHOO!
8:46 – Playing a video explaining the history of SQL Server. Best quote so far: “I figured if I could build an even halfway decent database, Microsoft could sell the heck out of it.” Oh, the jokes are just zipping around. This is a great video, though – really helps build a connection with the people building this tool, makes them personal and identifiable. This is one of the best marketing-ish videos I’ve seen. Nice work.
8:49 – Ted Kummert took the stage, and immediately started off by saying that a lot of the SQL Server people live & work in Seattle, and having the Summit here helps them send more people. Oooo. Okay, got the message there.
8:55 – SQL Server 2008 R2 Parallel Data Warehouse is now released to manufacturing. This is a little tricky, though – it’s only sold through manufacturers along with hardware. It isn’t available for download & testing on your own machine. It’s sold as a big honkin’ appliance that you just plug in and go like hell.
9:00 – Demoing Parallel Data Warehouse with a 100TB copy of the TPC-H benchmark. Eagle-eyed watchers will notice that this isn’t using SQL Server Management Studio – PDW has its own admin tool still.
9:01 – Running PowerPivot queries to suck a subset of the billions of rows – showing off the integration between PDW, Excel, PowerPivot, etc – talking about the whole-stack story.
9:04 – Here’s my thoughts on Parallel Data Warehouse from last year’s announcements and interviews. I like the idea, but people need to remember that it’s not managed with traditional SQL Server tools or techniques or staff. It’s a sealed box, and you can’t do anything with it – it’s manufacturer maintained.
9:05 – Yahoo’s onstage talking about their 12TB Analysis Services cube loading sixty 50GB files per hour, 1.2TB per day, 3.5 billion events per day. Less than 10 second average query times.
9:10 – Announcing Premier Mission Critical, Microsoft Critical Advantage Program for Parallel Data Warehouse. No details on it though.
9:11 – Bob Ward announced & demoed Microsoft Atlanta – I’ve got a separate post with my thoughts on Microsoft Atlanta. It’s a cloud-based SQL Server monitoring system that only works for SQL Server 2008 and R2 (and newer) that detects configuration problems.
9:18 – Cloud time! Kummert says Azure needs to be self-managed, have elastic scale, and be agile & familiar. It’s solving real business problems today. Next up: we’ve got Community Technology Previews announced originally at PDC for Web Admin, Reporting, and Data Sync.
9:24 – Available now – the Windows Azure Marketplace DataMarket, something like the iTunes Store for data sets – both commercial and freely available. They’re demoing how to add internet-accessible datasets from the DataMarket with, say, weather data, to enrich your reporting. On rainy days, you sell fewer bicycles. I’m a big fan of this.
9:32 – Ted’s encouraging the audience to go sign up and get engaged in the cloud.
9:33 – “What’s Next for SQL Server?” The next version is called SQL Server Denali for now as a code name, and I’ve got some quick information about SQL Server Denali. They’ll be demoing it later, plus handing it out to attendees after tomorrow’s keynote.
9:37 – The changes in SQL Server Integration Services will be huge, he says – big improvements in management and servers. R2 was a big release for reporting & analytics, plus Office 2010 focused on managed self-service analytics. Excel users were empowered to build BI applications, and PowerPivot was embedded into SharePoint.
9:39 – Project Crescent is a new web-based reporting system letting end users tell their own stories about the data. Amir Netz is demoing PowerPivot, then saying that’s good, but we need something bigger for enterprise datastores. The new BI Development Studio, running on top of Visual Studio 2010 Premium, hooks into the same column-oriented storage engine that PowerPivot used, but now you can use it on a server so you get centralized security and bigger horsepower.
9:49 – SQL Server Denali will have columnar indexes built into the database engine. Columnar indexes are what makes PowerPivot so insanely fast. More on this later.
9:52 – They’re demoing Project Crescent to build reports. Unfortunately, I gotta bail – off to set up for my session!
Microsoft just announced major changes to the Microsoft Certified Master of SQL Server program. I’m excited about the new direction of the program, although I’m a little biased – I helped with the program’s changes to reach the right audience.
The Original Microsoft Certified Master Program:
- $18,500 entry fee for 3 weeks of training (and possibly certification)
- MCITP: Database Developer 2008 and MCITP: Database Administrator 2008 required first
- You had to show up onsite in Seattle for 3 weeks straight
- Three written multiple-choice exams onsite during the training
- One six-hour final lab
This setup had a few problems. It was tremendously expensive – both in terms of the entry fee and the 3 weeks of downtime. It was hard to schedule 3 weeks off in a row even if you had the vacation or your company was willing to eat that expense. You might not have needed all 3 weeks of training, and you might not have been able to travel around the world to get it.
Bottom line – having the training and the certification as an all-in-one package just didn’t scale, and that’s evidenced by the fact that only a handful of us outside of Microsoft were able to pull it off. I know a lot of people who are more technically qualified than I am, but they couldn’t justify the Master program. So how could we get more qualified people to be Masters?
The New Microsoft Certified Master Program:
- MCITP: Database Developer 2008 and MCITP: Database Administrator 2008 still required
- Certification and training is totally separate
- Initial written multiple-choice exam – $500, and can be taken at Prometric testing centers around the world
- 6-hour lab exam – $2,000, and can be taken at select secure Prometric testing centers around the world
The two biggest barriers to entry – a huge initial price tag and 3 weeks of lost work time in Seattle – are now gone! It’s now easier to prove that you’re a Microsoft Certified Master of SQL Server. (Note that I did not say it’s easier to be one, because it’s still very, very, very tough.)
Since the training is now separate, it’s up to you how much training you want to get, and who you want to get it from. If you’re already an expert on some subject areas, maybe you’ll only get training on the areas where you’re weak. Perhaps you’ll choose to get the training in chunks – one week’s worth this year, one week’s worth the next year, and then make a run at the exam. I would advise against trying to pass the lab exam cold, because this is most definitely not the MCITP.
To help you get up to speed, Microsoft partnered with SQLskills to provide dozens of hours of video training online completely free. Bob Beauchemin, Kimberly Tripp, Paul Randal, and I recorded some of our best presentations on internals, CLR, storage, performance tuning, and more. You can watch our free Microsoft Certified Master online courses all in the comfort of your cubicle. (That link may not be live yet – check again later in the day.)
If you want more personal, interactive training, you’ll be able to get it from more places. I’d argue that you still want to get it from the most qualified, highest rated trainers around. You want to get it from people who aren’t just experts at training – you want your instructors to be hands-on consultants who live this work every single week. When we’re not training or helping the community, we’re consulting in some of the toughest environments around. Yes, of course I mean SQLskills – we’re offering a series of events around the United States in 2011 to give you Microsoft Certified Master approved training. We’re the only trainers that have been involved with the MCM program from the very first rotation, and we’re the best people to help you achieve the highest level of technical certification on SQL Server. You can check out our upcoming SQLskills Immersion Events here.
I really passionately believe in where the MCM program is going. There’s going to be more people recognized as Masters, and that’s a good thing, because there’s a lot of really qualified people out there. I’m not worried about the MCM becoming the next MCITP – a certification seen as having too low of value due to the braindump factor – because I was a contributor to the new MCM program. It’s a seriously high bar to pass, but I believe a lot of you can do it.