How We Architected Paste The Plan

If you’ve ever listened to Away From The Keyboard, a podcast that I co-host with Cecil Phillip, you’ll quickly discover that I love hearing stories about how products are created. Now I get to tell you the story of how Paste The Plan became real, and the architecture behind it.

In a world…

Here at Brent Ozar Unlimited, Brent likes to meet with us individually on a monthly basis to review the company performance and our personal goals. During one of these meetings, Brent casually mentioned that he wished there was a site that made it simple to share query plans. That got me thinking: I could whip up a site that did just that, but the hard part would be displaying the query plan graphically. As it turns out, that would be the easiest part, thanks to Justin Pealing’s HTML Query Plan tool. This brilliant piece of code takes a query plan in XML and transforms it into HTML. Best of all, it’s open source.

With the graphical query plan piece settled, I started to figure out the architecture of Paste The Plan. I gravitated toward the traditional Microsoft stack: ASP.NET MVC, Entity Framework, and SQL Server, held together with C#. I presented this to Brent, and his reply was, “What if we put this in the cloud? More specifically, Amazon Web Services (AWS). And what if we went Serverless?” To quote one of the greatest movies ever made, “That’s a horse of a different color.” It was an intriguing thought, but it would mean throwing out the entire Microsoft stack with the exception of SQL Server. We’re a SQL Server shop. Of course we would use SQL Server…right?

So I got to work investigating. What is this “Serverless” thing, and how does it work in AWS? As it turns out, “Serverless” code still runs on servers; you just aren’t the one managing them. Serverless really means that your application runs on a group of managed services. This excites some folks and horrifies others. Count me in the excited category.

So let’s go over the Paste The Plan layer by layer.


Paste the Plan’s Architecture

Presentation Layer

The presentation layer in a typical Serverless AWS web application is hosted in an S3 bucket that has static website hosting configured. That means we couldn’t use server-side web technologies like ASP.NET or PHP. Originally I set up Paste The Plan this way, but we wanted to integrate it with BrentOzar.com, which runs on WordPress, hosted by WPEngine. So we built a WordPress template so we didn’t have to deal with crazy iframes.

We also used CloudFront to serve some of the other JavaScript, CSS, and image resources. CloudFront is a content delivery network that puts static resources on servers all around the globe, so when a client requests content, CloudFront delivers it from the server that can deliver it the fastest.

Application Layer

The application layer is where all of the magic happens. When you press “submit” or view a query plan via a URL, the presentation layer calls the app layer to process that query plan or retrieve it from the data layer. In the AWS Serverless stack, you have only one choice for computational processing: Lambda. Lambda lets code run in the cloud without having to worry about those pesky servers. Lambda does a bunch of our heavy lifting: parsing query plan XML, transforming XML to HTML, pulling the SQL out of the XML, and saving data to the database. In case you’re curious, all of our Lambda code was written in Node.js.
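To make that concrete, here is a minimal sketch of what a Node.js Lambda handler in this spirit could look like. This is not the actual Paste The Plan code: the event shape, the callback-style handler, and the regex-based extraction (the real app does full XML parsing) are all illustrative assumptions.

```javascript
// Illustrative sketch only: not the real Paste The Plan handler.
// The event shape and regex "parsing" are assumptions for demonstration.
function handler(event, context, callback) {
  const planXml = (event && event.planXml) || '';

  // Reject anything that clearly isn't XML before doing real work.
  if (!planXml.trim().startsWith('<')) {
    return callback(null, { statusCode: 400, body: 'Expected query plan XML' });
  }

  // Pull the SQL text out of the plan's StatementText attribute.
  const matches = planXml.match(/StatementText="([^"]*)"/) || [];
  const sql = matches[1] || '';

  callback(null, { statusCode: 200, body: JSON.stringify({ sql: sql }) });
}

// In a deployed Lambda this function would be exported,
// e.g. exports.handler = handler;
```

The callback-style handler signature was the standard Node.js Lambda shape in 2016.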

Now we have a problem: Lambda can’t communicate directly with the presentation layer. This is where API Gateway comes in. API Gateway acts as the front door to Lambda, exposing a RESTful API to the presentation layer. Now we have a fully fleshed-out application layer that processes query plans and returns the output. All we need to do now is tie it all together with SQL Server, right? Yeah, about that…
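From the static site’s point of view, that front door is just an HTTPS endpoint. The endpoint and route below are invented purely for illustration; the real Paste The Plan URLs may be completely different.

```javascript
// Hypothetical API Gateway endpoint; the real URLs are not shown here.
const API_BASE = 'https://example.execute-api.us-east-1.amazonaws.com/prod';

// Build the URL the presentation layer would GET to fetch a saved plan.
function planUrl(planId) {
  return API_BASE + '/plans/' + encodeURIComponent(planId);
}

// Submitting a plan would be a POST to API_BASE + '/plans' with the XML
// in the request body; API Gateway routes both calls to Lambda.
```

Because the whole API is plain REST over HTTPS, the S3-hosted (or WordPress-hosted) front end needs nothing more exotic than an AJAX call to talk to it.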

Data Layer

Let’s take a look at the kind of data we need to store:

  • Id (string)
  • Date created (date)
  • XML hash (string)
  • Query plan XML (string)
  • Query plan SQL (string)
  • Query plan HTML (string)

At first glance this shouldn’t be a problem for SQL Server, but the query plan XML, SQL, and HTML strings could be large (we set the cap for the query plans at 2 MB). It’s not ideal, but we know SQL Server can handle it.

So here’s where the other side of working in the cloud comes in…cost. As of this posting (September 2016), the smallest reserved instance of SQL Server (db.t2.small; we don’t want Express Edition) costs $73.00 (USD) monthly. And that’s for Web Edition, not Standard Edition. For Standard Edition, the cost jumps to $623.42 monthly. So for an annual cost of $876.00, we could use SQL Server as the database for Paste The Plan.

But this is 2016, and there are alternatives in the cloud. AWS features a document/key-value database called DynamoDB. Take another look at what we’re trying to store: it’s not complex relational data, it’s just one table. DynamoDB seems like the perfect fit for the types of data Paste The Plan is going to generate. But what about the cost? I’m glad you asked. Amazon must have found an ancient rune stone that explained the formula for pricing NoSQL data hosting because it’s not simple. But let’s just say we needed 1,000,000 reads and 1,000,000 writes for Paste The Plan; that would cost us around $7.50 (USD) a month. Once a month, Brent is going to have to skip his daily Starbucks visit. (Brent says: bad news, Richie, it’s coming out of your paycheck.)

But storing all of that query plan text in DynamoDB didn’t sit well with me. Doing so would cost us more in reads, writes, throughput, and storage. That kind of text data doesn’t really belong in a database; it belongs in a file system. And wouldn’t you know it, Amazon has a bang-up file service in S3. So we decided to put the query plan XML, HTML, and SQL into a JSON object, compress it, and put it in S3. But here’s the kicker: S3 costs $0.04 (USD) per gigabyte.

It’s alive

So that’s how Paste The Plan was brought to life, and we learned a bunch along the way. Working in the cloud is trickier than on-premises, with lots of little details to consider. Almost everything in the cloud has a cost associated with it: storage, data transfer, computation, and memory all need to be taken into account when building cloud-based applications. We hope you enjoyed our little foray into the cloud.


22 Comments

  • Thanks for sharing this, Richie. I would have approached the solution the same way (.NET MVC, SQL Server, etc.), but it’s refreshing to see how you guys did it using an alternate approach. Very cool!

    Reply
    • Thanks! It’s a whole new world out there. Cloud has really changed the game. I was at the PDC in 2008 when Microsoft announced Azure. At that point the cloud seemed like a whole bunch of promises and concepts. Eight years later I can say it works, and works well. It does take a bit of time wrapping your head around the cloud concept and how the different services work, but once you get past that it is very powerful running an app in the cloud.

      Reply
  • I really appreciate the backstory about how you got to those decisions – it’s a really neat tool and it works astoundingly well. Nice work!

    Reply
  • Stephen Holland
    September 14, 2016 11:43 am

    Fantastic story. My company is looking more and more at cloud options, and the cost factor has been a tricky one. Your story of working through to the least expensive option was very educational.

    Reply
  • Jeff Roughgarden
    September 14, 2016 11:56 am

    Richie, thanks so much for sharing this. I bet I’m not the only one who’d like to know even more about your architectural decision process. For example, I’m not clear on the final cost of the AWS solution vs. the more conventional full Microsoft stack approach. A multi-page white paper on this topic would be well received.

    Reply
    • Jeff – we’ll definitely circle back after a month of hosting and talk finances. In the meantime, go ahead and do your own comparison on pricing and you can learn a lot there.

      Reply
  • This is an awesome account and a fun read. My partner and I migrated the company we work for to the Amazon cloud 3 years ago and love it! But we still have not really taken advantage of many of the managed services that would really require rearchitecting the production applications (too costly). I am going to share this with those that would do the work, though– I think they might be inspired.
    One question, though: How come you didn’t gravitate to the Microsoft Cloud first, since SQL Server is the bread-and-butter?

    Reply
  • Enjoyable and interesting read, even if it mostly seemed out-of-my-league from a skills perspective.

    When did you start this process? i.e., when did Brent casually mention his wish?
    The point of my question is that we’d all know how long this process took, since PTP was announced yesterday on 9/13/2016. Did it take 3 months? 6 months? The impression I got was that this was a relatively short development process versus other projects.

    Did you do ALL of the coding yourselves (aside from Justin Pealing’s HTML Query Plan tool)?
    What was/were the biggest challenge(s)? Did you (the plural you, as a team) know Node.js beforehand, or did you learn it completely from scratch? I’m assuming this was a team effort and that Richie didn’t do all of the coding.

    Reply
    • I went back and took a look at the GitHub commits. The first commit was on June 3rd, with the final, RTM commit on September 6th. So that’s about three months, but let’s break that down a bit. In the middle of those three months we had Dell DBA Days, and we also went to AWS architecture training. So subtract two weeks. And mind you, Paste The Plan wasn’t the only thing I was working on. It was the major thing, but not the only thing.

      And yes, I was the only one working on Paste The Plan. The team helped with testing and direction, but this was essentially a one-person effort. Before this project, I had never used Node.js. I had never used DynamoDB. I had never used Lambda. I had never used S3. I had never used Gulp, our JavaScript build tool. I had never written anything using WordPress or PHP. If that sounds like a bunch of “nevers,” you’d be right. What I did have is twenty years’ experience building software and database solutions. I had some great friends and an awesome community, dotNet Miami, that answered my questions about these newer tools. My Google-fu is also very strong. (Waves hands in a kung-fu like motion.) Now, I don’t say that to toot my own horn, well, maybe a little. But I think being able to quickly learn and use all of these technologies is a testament to the amount of information that’s out there. We’ve come a long way since I was a kid with a C-64 Programmer’s Reference Guide and the latest edition of Commodore magazine.

      One of the big deliverables of the project was to learn how to write software in the cloud. I think we accomplished that. Having that as one of the goals really took the pressure off and let me focus on learning the tools as opposed to getting it out the door. Thanks for the questions!

      Reply
      • Stellar response. Thank you for your hard (and impressive) work!

        Reply
        • To elaborate a little on one of the questions: was it a team effort? No, it was 100% Richie. Nobody else on the team does any coding whatsoever. There was a tiny amount of testing, but even that was pretty small. Nobody else had Github commits, and I was really hands-off with the design. “Here’s what the functionality needs to do – go.”

          We’re a really small company, and we try to hire people who know their subject area really well. It doesn’t make sense to take SQL Server experts like Erik and Tara and get them to write great code. (And I know my limitations well – my coding skills expired a decade ago.) We’re better off with the consultants & trainers doing consulting & training, and using that income to help pay for the tools we build, rather than having consultants & trainers build crappy code that will cost a ton to support and scale down the road.

          Reply
          • To add a little to this: I spent about a half hour parsing query plan XML in Python and ran screaming back to SSMS. It was not okay.

  • Apisak Srihamat
    September 14, 2016 9:04 pm

    Many thanks, Richie. I don’t think Brent should cut your paycheck.
    Instead, providing a particular service for free for a short period will lead to customer loyalty; we’ll be willing to pay once we find that it works very well. I believe Brent is smart enough to realize that 🙂

    Reply
    • Apisak – interesting. So how much would you pay for access to it?

      Reply
      • Apisak Srihamat
        September 18, 2016 9:13 pm

        I’m not sure yet.

        In terms of pricing, I think what matters more is the number of people willing to use it.

        There are two kinds of payment terms for this service that I can think of at the moment:
        1. People who love your work will rely on a company decision to pay for this kind of service. A company that finds the service helpful or useful will be willing to pay for it, and the way to make a company see it as useful is most likely to solve a critical problem for them. You’d need the support of people who love your work to present it nicely to a particular company, and I think they’d also need your team’s help to go through this process. As a simple price calculation: if 5,000 requests were submitted in the first year, you could take the project’s fixed cost per year, add your margin, and divide by 5,000 to get a simple price per request. The fixed cost for the first year is probably a bet that you will win back later.
        2. People who love your work may be willing to help you pay the project’s fixed cost for the first (and maybe the next) year as a bet. You can tell them that all of them will get some reward at the end of the year whether the project succeeds or not. If it succeeds, you can give them some profit; if it fails, you can give them a postcard with your drawing. The price isn’t specific in this case; any dollar amount per supporter should be acceptable.

        Reply
        • Apisak – interesting, but that’s not really how product pricing for a brand-new product works. You don’t look at the cost and then back into the price. Instead, you have to look at the value you’re creating, and whether users are willing to pay for that value. Whether PasteThePlan cost us $1k, $10k, or $100k per year to develop and host, that doesn’t really change the value that users get from it.

          So before you can say “just charge for it,” you have to be able to answer the question: how much would YOU actually pay for this, personally?

          My guess is that most readers wouldn’t pay anything (which is completely fine, and why it’s free.) Even if they were willing to pay for it, that just means competitors would enter the market faster and build the same thing, and undercut us. It’s tough for them to undercut us when we’re free. 😉

          Reply
          • Apisak Srihamat
            September 19, 2016 4:55 am

            Well, if you are first into the market, you could be the one who creates the rules.
            The price calculations I mentioned are just a few examples I can think of at the moment. Since you are going to create the rules, you can say I’m (or anyone is) wrong at any time. Regarding competitors, I think your team’s uniqueness will make it difficult for competitors to substitute for you. Are you unique?

            Anyway, I’ve only read the concept and think it should work; I haven’t tried it out yet, so I’m sorry that I don’t have a price in mind. Maybe later I can tell you. But… will I be the one who creates the rules, not you? 🙂

          • Apisak – you’re still not answering the really simple question, so it’s probably best that we call it quits here. Thanks for stopping by though!

  • Hi Richie
    I found this article really interesting. I’ve been fighting to cope with the idea of having stuff in the cloud since SaaS, PaaS, MaaS, and all the other AS-es, and never thought it would be this simple. Your article sheds some light on the cloud and how it could work for a lot of solutions. I reckon if more developers shared their experiences with “the cloud” outside of their closed developer communities, more people (DBAs, system engineers, etc.) would realize that the cloud is not the evil thing some of us may think it is. They might even come to appreciate its benefits.
    I long ago quit writing code and have focused on the performance tuning aspects of SQL Server the last few years. I can relate to Brent no longer having the “coding” abilities, and can definitely relate to Erik’s comment about running back to his desk to embrace SSMS. At some point along the road you decide what you can do best and stick to it.
    I really appreciate the efforts that the Brent Ozar team displays day in, day out for the community, and I always look forward to reading your articles.
    Thanks Richie for this awesome write-up.

    Reply
