Microsoft Atlanta: Cloud-Based SQL Server Monitoring


Microsoft Atlanta is a new cloud-based SQL Server monitoring tool.  Companies can deploy an agent on their SQL Server 2008 and 2008 R2 instances, and the agent communicates up to Microsoft’s servers in the cloud (through a secure gateway also installed inside the company’s network).  Microsoft aggregates the data, analyzes it for problems, and displays the results back to IT users like database administrators through a web-based dashboard.

I’m not bashful about cheerleading for Microsoft products that rock, but I’m not feeling this one quite yet.  Here are the issues I’m concerned about:

It only monitors SQL Server 2008 and newer. Most of my clients still have SQL Server 2005 (if not 2000) in the shop.  They don’t want one tool to monitor SQL Server 2008 performance and another solution for SQL Server 2005.  Frankly, they don’t even want to settle for one monitoring tool per database platform – they want one pane of glass to monitor SQL Server, SharePoint, IIS, app servers, and often other database platforms like Oracle or Sybase.

Microsoft Catlanta

It sends diagnostic data to the cloud. Some of my clients can’t let their own developers get access to queries running on the production server, let alone send them offsite to servers that could be in any geographic location.  In larger enterprises, privacy and compliance regulations fiercely guard where data can go.  Atlanta could be sending your usernames, application names, and queries to the cloud – and don’t forget that queries can include things like customer names, personally identifying numbers like SSNs, or even credit cards.
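To make the risk concrete: query text captured by a monitoring agent can carry literal values straight out of the application. If a shop did want to ship captured queries anywhere, it would need a scrubbing step first. Here’s a rough Python sketch of that idea — the patterns and the function are mine for illustration, not anything Atlanta actually does:

```python
import re

# Illustrative patterns only: SSN-shaped and card-number-shaped literals
# that can show up inside captured query text.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")  # 13-16 digits, optional separators

def redact(query_text: str) -> str:
    """Mask SSN- and card-shaped literals before query text leaves the network."""
    query_text = SSN.sub("XXX-XX-XXXX", query_text)
    query_text = CARD.sub("[CARD]", query_text)
    return query_text

captured = "SELECT * FROM Customers WHERE SSN = '123-45-6789'"
print(redact(captured))
# prints: SELECT * FROM Customers WHERE SSN = 'XXX-XX-XXXX'
```

Even a simple scrubber like this can’t catch everything (names, free-text columns), which is exactly why compliance teams balk at sending query text offsite at all.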

It’s not real-time. When a user calls saying their query just timed out or produced an error, I need to see what’s happening on my servers right now.  I don’t want to use one tool to troubleshoot what happened 90 seconds ago versus what’s happening now.  I need live data, and I can’t wait for round trips to the cloud.

It feels like it’s targeted at small businesses that can’t afford a monitoring infrastructure or an in-house database administrator.  They’ve got just one or two SQL Servers, and if they added a monitoring tool in-house, they would have to put it on the same SQL Servers they’re monitoring – thereby introducing a nasty little point of failure.  Even if they added another SQL Server, they wouldn’t know how to read the results of most monitoring tools, and they see Microsoft Atlanta as the Easy Button.  For these users, it seems to make sense at first glance, except there’s one little problem.

Small businesses with no SQL Server staff in-house will rarely find out that this product exists.

You know what I think?  (Of course you do, because you read my blog, and you’re inside my head.)  I think Microsoft should bundle Atlanta with every SQL Server as an option during the installation.  Currently, the installation process asks if we’d like to send feature usage data back to Microsoft, and I get the feeling there’s a dashboard somewhere in Building 35 that proves only 14 people in the world have ever used Service Broker.  That’s a good start, but let’s kick it up to the next level and ask if users want to get performance recommendations sent to them weekly, and enter an email address right then and there to subscribe to the alerts. SAP does something like this by running weekly tests on their databases at each client automatically, phoning those results home, and then sending the users a warning when something has gone awry in their environment.

I found ur performanz. Iz in the toilet.

It shouldn’t be an extra-pay feature – and I’m not saying that because I think we’re entitled to it.  I just think very, very few people will pay extra for this as an add-on service, but it could be a really neat feature to entice businesses to upgrade to the latest versions of SQL Server.  As a DBA, I’d love to forward my boss the weekly Microsoft email alerts:

  • Me: “Looks like Bob in Accounting went wild and crazy with the joins again.  Here’s a list of recommendations to improve his queries.  Instead of us spending another $10k on drives like he asked, he needs to fix his queries.  Do you want to take them into Accounting, or do you want me to?”
  • Boss: “Wow, that Bob is a bozo.  Hey, can we get these same recommendations for the HR server?”
  • Me: “Nope, that’s SQL Server 1946.  This is a new feature of SQL Server 2031.  I can haz upgrade?”
  • Boss: “Yes. Kthxbai.”

You can learn more and sign up for the beta free at MicrosoftAtlanta.com.


14 Comments

  • Steven Ormrod
    November 9, 2010 9:22 am

Good points. I would like it better if it came built in and supported more SQL Server versions, but it does seem like a good first pass…

I totally agree with all of your points, bar the Service Broker one.

    I’ve implemented it in 2 systems, and I know 2 people who’ve also used it locally, so it must be at least 40 people in total 😛

  • Wow, you had that one all queued up and ready to go. Good job! And I agree about the older versions. That was my first thought today.

I was really excited about this product until I found out that data gets pushed to the cloud. Major roadblock for my company.

  • More than 14 people have used Service Broker!!! 😉

  • Oh come on Brent…

Microsoft exposes everything that they are going to upload to the cloud with Atlanta in an XML file that you can crack open and inspect. They are only collecting pretty generic server and instance level configuration information, so I don’t really think security would be an issue for most companies. If you have any specific concerns about what they are collecting and uploading (after looking at the XML file), open a Connect item and make your case with Paul Mestemaker and Bob Ward. They are very open to constructive feedback.

    As far as not supporting down-level versions of SQL Server, they (and we) have to draw the line somewhere. Would you rather they spend their resources trying to support SQL Server 2000 in a tool like this or adding more functionality to the tool for relatively recent versions of SQL Server?

    If people are still running SQL Server 2000 in 2011, they have to either live without some of the improvements made over the last 11 years, or they have to keep pushing their bosses and employers and vendors to upgrade to a fully supported version of SQL Server.

I busted my ass pushing my company to move off of SQL Server 2000 back in early 2006, then to SQL Server 2008, and finally to SQL Server 2008 R2. It’s never easy, but it is part of a database professional’s job to make that case (IMHO).

    • Glenn – I hear where you’re coming from, but I disagree. The third party tool market is chock full of apps that support 2000 and 2005, don’t rely on the cloud, and work great. MS is reinventing a very old wheel and slapping the cloud label on it. The market won’t bite.

  • Hey Brent!

    I am a Program Manager on the Atlanta team. Thanks for checking out our beta! Customer feedback is really important to us, especially during a pre-release phase. I appreciate the critical feedback you’ve given about:
    • Not supporting down-level versions
    • Not generating real-time alerts
    • Having to send data to the cloud instead of having a completely on-premise solution.

    If you continue to use the service, please do not hesitate to contact us on any suggestions you might have. We need to understand the top deployment blockers for our customers.

Given that this is our first release and we are in beta, we realize that we cannot be all things to all people. I want to make sure you understand the intent of Atlanta, and I’d like to clear up any misconceptions you might have.

    All of those capabilities you mentioned that are missing in Atlanta are available in SCOM 2007 R2 today. My team is not trying to replicate SCOM or build “SCOM in the cloud”. We are focusing on:
    1. Providing proactive configuration-based knowledge from our experts in Microsoft CSS
    2. Being a simple solution that provides a strong connection between our products and support

    There has been some confusion as to what kind of data Atlanta collects and sends back to the cloud. In fact, you touched on that yourself in your blog post:
    “Some of my clients can’t let their own developers get access to queries running on the production server, let alone send them offsite to servers that could be in any geographic location.”
    “Atlanta could be sending your usernames, application names, and queries to the cloud”
    Atlanta does not look at SQL statements of any kind. We do not look at query text or sproc definitions. We solely look at configuration information (e.g. sys.configurations, sys.databases, driver versions, some registry keys).

Transparency is absolutely critical for a service like ours. All of the data that we capture is available in plain XML that you can inspect before uploading it to Microsoft. It is also archived on your gateway machine for 5 days by default after uploading to Microsoft, so you have an audit trail of what data was sent out of your environment. We want everybody to understand what we do collect and what we don’t collect. We want to collect only management data (e.g. configuration and error logs). We specifically do not want to collect any user data or customer workload data (e.g. any data from user tables like customer names, SSNs, or credit cards).

    I am currently at PASS and will try to find you to make sure that you have the resources you need to understand Atlanta. I have a demo at 1:30pm in the Product Pavilion on Wed 11/10. I will also be spending a lot of time at the SQL Server Clinic, so feel free to stop by so we can chat.

    Thanks again for checking out Atlanta.

    Paul A. Mestemaker II
    Program Manager
    Microsoft codename “Atlanta”

    • Paul – thanks for the post. Brace yourself, but…this is actually sounding worse. If you’re not collecting query data of any kind, now I’m really stumped at the value of this.

      In order to provide configuration-based knowledge from the experts in CSS, you don’t have to send data FROM the client TO the cloud. You can push those best practices TO the client FROM the cloud, and surface it in SSMS, thereby bypassing all of the data security issues at once. Why build a new UI in the cloud, knowing full well the huge security and compliance issues involved, when the end user already has a full-blown thick client for your app?

      In order to build a connection between your products and support, you need more than configuration data – you need activity like queries. What would the value of, say, SQLDIAG be if it didn’t include a trace? Dramatically less. And are you the only group that needs support data? Of course not – so why not build this functionality into SCOM and use it across multiple products at once? Why does every product need its own little siloed project that doesn’t work for any other products?

      Finally, this line *really* disappoints me:

      “Transparency is absolutely critical for a service like ours. All of the data that we capture is available in plain XML… We want everybody to understand what we do collect and what we don’t collect.”

      If this is an important thing to communicate to your customers, what’s the best way to inform them? Putting it in “plain XML” files, or putting it in the documentation and making it legally enforceable that these are the only things you’re collecting? Right now, the privacy agreement states something completely different:

      “This data may include the following types of data: computer name, DNS name, fully-qualified domain name, IP address, agent ID, software configuration and metadata, performance data, event logs, registry settings, file versions and driver information.”

      To me, “performance data” is entirely too broad. I routinely work with clients for performance health checks, and as a part of that, we look at queries. Microsoft could easily legally defend having query data under “performance data.”
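      Since the payload does sit on the gateway as plain XML before upload, a skeptical DBA could at least script an audit of it rather than eyeballing files. Here’s a quick Python sketch of the idea — the element names and patterns are made up, since Microsoft hasn’t published a schema, so treat it purely as an illustration:

      ```python
      import re
      import xml.etree.ElementTree as ET

      # Flag anything in the payload that looks like query text or an SSN.
      # Hypothetical heuristics -- extend with whatever your compliance team fears.
      SUSPICIOUS = re.compile(
          r"\b(SELECT|INSERT|UPDATE|DELETE)\b|\b\d{3}-\d{2}-\d{4}\b",
          re.IGNORECASE,
      )

      def audit(xml_text: str) -> list[str]:
          """Return the text of every element that trips the suspicious patterns."""
          root = ET.fromstring(xml_text)
          return [el.text for el in root.iter() if el.text and SUSPICIOUS.search(el.text)]

      # Made-up payload shape, just to show the audit in action.
      sample = """<payload>
        <setting name="max degree of parallelism">8</setting>
        <perf>SELECT * FROM Orders WHERE SSN = '123-45-6789'</perf>
      </payload>"""
      for hit in audit(sample):
          print("would be uploaded:", hit)
      ```

      But a script like that only works if the vendor’s legal language matches the files on disk — which brings us right back to the “performance data” problem above.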

  • I just want to add, it’s not just that it’s SQL 2008 and higher, it also requires Windows 2008 and higher. Kind of a burn for us still stuck with Windows 2003 systems.
