SQL Server 2008 R2’s new Data-Tier Application (DAC) capabilities give DBAs the ability to manage databases as if they were more like virtual servers. Today I’m going to talk about what that means short-term and long-term.
Short-Term: Nothing To See Here, Move Along
I don’t think most DBAs will see a .dacpac file in the wild for years:
- It’s SQL Server 2008 R2 only. The new SQL Server Utility model can’t be used to manage older versions of SQL Server. Whenever I poll enterprise DBAs, the majority of ’em are still on 2000 and 2005.
- It’s only available in Enterprise Edition. Microsoft has only put the coolest new features in Enterprise Edition lately, and this is no exception.
- DACs don’t develop themselves. Somebody has to develop the DACs before you can deploy ’em to your server. Yes, you can reverse-engineer an existing database into a new Data-Tier Application, but…
- Not all SQL Server objects are supported. Data-Tier Applications support only a subset of objects like tables, non-clustered indexes, views, functions, and stored procedures – and not just any stored procs, either.
To illustrate that last point, I tried extracting one of my favorite databases, a Twitter cache built using Tweet-SQL. Some of the errors included:
- This object depends on dbo.tweet_usr_followers(Stored Procedure) that is not supported in a DAC.
- Accessibility(SqlAssembly) – This object type is not supported in a DAC.
- dbo.tweet_acc_archive(StoredProcedure) – This object type is not supported in a DAC.
The problem: extended stored procedures and encrypted stored procedures.
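If you want to know ahead of time whether a database will hit these walls, you can poke at the catalog views before running the extraction wizard. This is just a rough pre-flight sketch I’d use – the catalog views are standard, but it’s nowhere near an official compatibility test:

```sql
-- Rough pre-flight check for common DAC extraction blockers.
-- Encrypted stored procedures:
SELECT name AS encrypted_proc
FROM sys.procedures
WHERE OBJECTPROPERTY(object_id, 'IsEncrypted') = 1;

-- Extended stored procedures registered in this database:
SELECT name AS extended_proc
FROM sys.objects
WHERE type = 'X';

-- User-defined CLR assemblies (also unsupported):
SELECT name AS clr_assembly
FROM sys.assemblies
WHERE is_user_defined = 1;
```

If all three queries come back empty, you’ve cleared the most common blockers – though that’s not the complete list of everything a DAC will reject.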
Ironically, these techniques are often used by third-party vendors to deliver their code – like Tweet-SQL, and like Quest, for that matter. Yesterday, I blogged about why third-party vendors are a great use case for DACs, but now we can see why adoption in that user group won’t be quick. Why would a third-party vendor deploy their database as a .dacpac if it can only be deployed on Enterprise Edition and none of the contents can be encrypted?
In fact, who would use this at all when it’s such a minor subset of SQL Server’s capabilities?
Flash Back to SQL Server 2008’s Release
At the PASS Summit in Seattle, I talked to a DBA who’d really had it with Microsoft. He complained that 2008 didn’t have anything groundbreaking for production database administrators. He’d loved SQL 2005’s introduction of SQL Server Management Studio, but since then, he hadn’t seen anything out of Microsoft to really make his job easier. PowerShell and Policy-Based Management were theoretical steps in the right direction, but not big enough. (To date, I still don’t see widespread adoption of either, and I agree with his sentiments.)
More than making life easier for production DBAs, SQL Server 2008 started to install plumbing that would make life easier for BI users. Much easier. Much, much easier. SQL Server 2008 R2 brings the faucets and shows just how easy it’s going to be. To quote Microsoft’s R2 marketing page:
“Self-service analysis tools allow end-users to quickly create solutions from disparate data sources within a familiar Microsoft Office Excel user interface. By publishing these solutions in SharePoint Server, users can easily share them with others.”
As Microsoft delivers capabilities to end users, they’re focusing on self-service.
Not making it easier for administrators – making it self-service. Making it easy enough that no dedicated administrator is required – in theory, at least.
The reality is that DBAs don’t buy SQL Server. DBAs sell SQL Server. They sell it to CIOs, developers, and BI users. Microsoft’s approach with self-service BI in SQL Server 2008 R2 means that they’re selling directly to the BI users, empowering them to do their own work without getting approval from the DBA.
That DBA who complained about Microsoft not focusing on DBA tools isn’t going to be any happier with SQL Server 2008 R2. The features available at release are focused on BI professionals, and the new plumbing that will matter long-term is aimed at a different audience altogether. When you read that SQL Server Utility features will make life easier for production DBAs, you need to read between the lines.
Long-Term: Look Up, The Sky Is Coming
The subset of supported features is eerily similar to SQL Azure, Microsoft’s cloud-based SQL Server offering.
Did you hear that just now?
All over the universe, DBAs cried out in anguish at the prospect of reading yet more about how the cloud is coming to steal their buckets.
Just like SQL Server 2008 R2 starts to deliver self-service BI, the next versions of SQL Server will probably focus on delivering self-service data storage. The “self” could be:
- Developers who just want to write applications that store data without the hassles of asking for DBA permission
- Network admins who want to manage SQL Server the same way they manage the rest of their servers – as virtual servers, slicing up pools of resources
- Project managers who want to buy a third-party application and don’t care about how the database works
Virtualization didn’t become popular by catering to the people who liked their own dedicated servers. It caught on because it catered to the people who paid the bills and the people who had to manage all those dedicated servers. Likewise, the DAC concept might catch on not by catering to DBAs, but by catering to people who never really liked DBAs.
These types of users will love the DAC concept, and furthermore, they might like the concept of just hosting their database in the cloud. We hard-core DBAs look at Azure’s pricing model ($10/mo for 1 GB, $100/mo for 10 GB) as crazy high, but to project managers and network admins, that’s not too bad at all. Compare it to the cost of a fully configured SQL Server Enterprise Edition box with licensing (because remember, DACs only work on Enterprise Edition) and the cost is downright sensible.
DBAs complain about security problems in the cloud, but the DAC concept appears to have conquered some of those limitations. Security requirements are built into the DAC package, including logins and permissions, and non-secure multi-database elements (like extended stored procedures) just aren’t allowed in .dacpac files.
When I interviewed Tom Casey at the PASS Summit in 2008, he hinted at this by saying that SQL Server is already somewhat multi-tenant, and now I see where this is going. Down the road, you can choose whether to deploy a DAC in your internal cloud (a SQL Server resource pool managed by the SQL Server Utility Control Point) or to Microsoft’s cloud. If your developers have already confined their application to using a subset of SQL Server’s functionality, then there are no additional headaches for you.
The SQL Server Utility model won’t change the way you work this year, but ask your Windows administrators how their job has changed with the advent of virtualization. Take heed of the lessons they learned, because your job will be changing next.
Brent, you harbinger of Doom, you?
Let’s not sell ourselves short here. We are not merely DBAs but are in fact multifaceted, highly versatile IT professionals with an unquenchable thirst for knowledge.
With new technology come new opportunities for those who are clued up and possess the knowledge.
I say, Bring it on!
HAHAHA, riiight. Well, I think some of us are multifaceted, highly versatile professionals – I love me some SAN and virtualization – but not all of us. If you play ostrich and bury your head in the sand, you’ll find a dwindling number of careers fitting the production DBA description over the coming decade.
I’m like you though – I’m really excited about this change. I think those of us who spent the time as production DBAs will be really well-equipped to understand how performance tuning works in virtual environments and in the cloud.
“I think those of us who spent the time as production DBAs will be really well-equipped to understand how performance tuning works in virtual environments and in the cloud.”
Absolutely! In my opinion, these new sleek tools and features empower end users through abstraction of the underlying complexities. As with all good mechanics, if you understand what’s going on under the hood, you can squeeze the best performance out of the engine – and so too will you be well placed to work with these new and exciting environments.
Yo BO. Correction to your blog: a dacpac is not limited to EE. A dacpac may be deployed to any edition of SQL Server. See you in a couple of weeks!
Hmm – just to clarify, when I try to connect instances to a Utility Control Point, it says that it must be Enterprise Edition (among other requirements). Is there a reason we would deploy a dacpac to something that wasn’t controlled by a UCP?
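For what it’s worth, you can see what an instance reports about itself with a couple of standard SERVERPROPERTY calls (nothing UCP-specific about these):

```sql
-- Check what an instance reports before trying to enroll it in a UCP.
SELECT SERVERPROPERTY('Edition')        AS edition,         -- e.g. 'Enterprise Edition (64-bit)'
       SERVERPROPERTY('ProductVersion') AS product_version; -- 10.50.x = SQL Server 2008 R2
```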
Firstly, thanks for very interesting posts about the DAC. As for the use of the DAC without a UCP, I think DACs could be used for smaller databases. For example, imagine that you have many small databases (e.g. small internet shops) – you could use a DAC for the upgrading process, etc.
I’m in the process of developing a SQL Server Dashboard (using SQL 2008 and SSRS 2008) for all our prod servers (around 22 globally) that needs to show us all sorts of things like CPU (various avg’s), disk space, failed jobs, jobs that are long running, blocks/locks, logshipping statuses … etc etc. Basically overall health.
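For example, the failed-jobs piece can come straight out of msdb. Here’s a rough sketch – the msdb tables and run_status codes are standard, but the 24-hour window is an arbitrary choice:

```sql
-- SQL Agent jobs that failed in roughly the last 24 hours.
-- run_status 0 = failed; step_id 0 = the job outcome row.
SELECT j.name AS job_name,
       h.run_date,   -- int, formatted YYYYMMDD
       h.run_time,   -- int, formatted HHMMSS
       h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE h.step_id = 0
  AND h.run_status = 0
  AND h.run_date >= CONVERT(int,
        CONVERT(char(8), DATEADD(day, -1, GETDATE()), 112));
```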
Where this UCP (and the 2005 reports dashboard) falls short is:
a) it’s not real-time enough,
b) it doesn’t give you an overview of all your boxes on one page,
c) it only shows CPU and disk space counters, and
d) what about all our 2000 and 2005 instances? Must they just be left out in the cold?
We run high volume stock trading systems where minutes can mean big money and so we need to be able to spot any issues immediately.
And on the DAC subject:
So it doesn’t do assemblies, it doesn’t do CLR objects or encrypted procs, it doesn’t do Service Broker objects, replication publications, jobs, or linked servers … so basically it’s no different from generating a script of the procs, views, tables, functions, and users of a database, which is very quick and easy from the task menu, then saving the script (or just backing up and restoring the DB).
Big fan of new tech, but come on – give us something new and useful, not just some shiny bells and whistles for the beginners.
Brad – interesting comments. Just out of curiosity – why build your own monitoring system when there are so many good ones out there, like Quest Spotlight? Grant Fritchey talked about this in a post too.
We’ve tested a few monitoring solutions like Spotlight, Red Gate’s SQL Response, and Idera’s Diagnostic Manager in parallel with our own “legacy” monitoring systems and, among other inadequacies, in almost every case we were alerted by our systems minutes before the off-the-shelf packages alerted on any issues.
Since we are already storing a lot of bits and pieces of counter-type data, it’s just a case of putting a pretty front-end onto it to replace the few webpages, email alerts, and net sends … which these guys are charging a fortune for.
… now I get to learn SSRS 2008. 🙂
Brad – hmm, we’re just going to have to agree to disagree on that one. I’ve never seen anyone’s monitoring system come close to Spotlight’s speed on monitoring – unless you’re absolutely hammering your system, testing it continuously, in which case the overhead is usually too high. Best of luck, and if you decide the work involved is too high, let me know. We’re definitely not “charging a fortune” – especially compared to building your own tool. Building your own tool is only free if your time isn’t valuable. 😉 I dunno about you, but I make *way* more than minimum wage. Good luck though!
haha – well then you’re just gonna have to come around and see it outperform Spotlight for yourself (and with an overhead that’s practically unnoticeable). 🙂
Don’t spend your “*way* more than minimum wage” all at once!
Hey Brent, I just clicked on the Tweet-SQL link in your blog post (because I was curious) and got a bit of a surprise! Friendly NSFW warning: don’t scroll down on that webpage, either.
Hahaha, great catch, fixed! Thanks.