Sure, you’d much rather work with SQL Server 2014, but what’s more prevalent out in the real world? At Dell DBA Days, I saw a chart that shocked me:
That chart shows the number of servers running Dell Spotlight Essentials, a free SQL Server monitoring tool. Note that SQL Server 2014 has 4% of the population, about 1/4 as much as SQL Server 2005.
But the terrifying part: for every 2 SQL Server 2014 instances, there’s one SQL Server 2000 instance still kicking around! That’s not exactly a rapid adoption rate.
I wonder why we aren’t upgrading our SQL Server 2000 instances? (And I’m not pointing the figure at you – it’s likely the business that’s making you keep it around.)
As much as I enjoy new and shiny things, I need to show this to my boss. SQL 2016 isn’t even out yet and my boss is already pushing us to move to it.
The day you’re stuck in an environment of SQL Server 2000 and 2005 is the day you’ll learn to appreciate your boss.
In my shop, we have two 2000 servers in PRODUCTION and one **6.5** in production. After *years* of begging, I’m happy to say that the successor to the 6.5-based app goes live next week – on SQL 2014! I feel like having a company-wide party.
There is much more business resistance to moving the manufacturing support apps running over SQL 2000 to 2014. My goal for the shop is to have 2008 and 2014 and nothing else, and as soon as I prove out that the 2008-attached apps will run on 2014, to get rid of that too. I won’t stand up new apps over anything older than 2014 SP1.
I have new senior leadership who supports this initiative and is pushing the business to currency. That’s rare.
OOF! 6.5? No way! That’s too painful.
Not to be *that guy*, but I think you meant “pointing the FINGER at you”.
All grammar aside, I think the problem stems from lack of time and environments for testing upgrades. I know that has been the difficulty at my last two shops. Even three jobs ago, when we had a decently representative dev and test environment (matched prod very closely), it was difficult to spin everything up on a new SQL version while keeping the original test servers up so regular dev and test work could continue. Most shops won’t halt all other development and deployment while IT converts the test environment to the latest version, tests it, overcomes any problems that may arise, and then performs the upgrade in prod.
Typically, if there is not a powerful business drive to upgrade it doesn’t get the time or attention. Our difficult job as database professionals is to quantify the risks and benefits of upgrading well enough that management gives it that focus and resource allowance. As always, the management mantra is “don’t ‘fix’ it if it ain’t broke, but we’re sure going to try to break it”. 😀
I had assumed that “point the figure” was a clever play on words with “Point the finger” ,since the primary exhibit of this article is a figure (figure g1, or whatever)
Joseph – bingo. You rock for catching that.
That thought occurred to me but I dismissed it because it would have been too convenient. Now I feel sheepish for having ever doubted your ability to play with words.
*bows and walks backwards from the room*
Hahaha, no, you’d usually be right. Most of my hero-a-derp around the keyboard results in typos rather than elegant turns of words, hahaha.
HAHAHA, and of course OS X El Capitan’s autocorrect made that hero-a-derp, which makes it even funnier.
Even your autocorrect mistakes have class.
Only you could turn a “herp” into a hero by accident.
The reason is money.
1. – a small, in-house development team can barely keep up with new feature requests, and it takes significant resources to migrate to and test a new database version, or even to move from VB 6 to VB.Net. And there are always migration issues (e.g., DTS to SSIS 2005, to SSIS 2012)
2. – licensing changes/issues – the change from server to core licensing is a significant cost to SMB. Just the license cost to upgrade one server from SQL 2008 to SQL 2014 would be the single largest acquisition cost item for our company in the entire year.
3. – Oh, also, the bottom line – if it isn’t broke don’t fix it. It’s hard to convince the business it’s broke when production continues to run well. 😉
Exactly. Switching one of our servers to the core licensing model will cost us $40,000 if we use a pretty standard dual CPU setup.
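For anyone pricing this out, the per-core arithmetic is simple enough to sketch. A minimal Python sketch, assuming a hypothetical per-core price (the real list price depends on edition and your licensing agreement); the minimum of four core licenses per physical processor is part of the 2012-and-later per-core model:

```python
def core_license_cost(sockets, cores_per_socket, price_per_core):
    """Per-core licensing sketch: every physical core needs a license,
    with a minimum of 4 core licenses per physical processor.
    (Licenses are actually sold in 2-core packs; assuming even totals here.)"""
    licensed_cores = sockets * max(cores_per_socket, 4)
    return licensed_cores * price_per_core

# A dual-socket, 8-cores-per-socket server at a hypothetical $2,500 per core:
print(core_license_cost(sockets=2, cores_per_socket=8, price_per_core=2500))  # 40000
```

The minimum-per-processor rule is what stings on older dual-socket boxes with few cores: even a 2-core-per-socket server gets billed for 8 core licenses.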
Number 3 hit the nail right on the head.
We are on 2012, which we upgraded from 2000 about two years ago…severe headache, as you can’t upgrade straight across. Had to do an intermediate step to 2005, then could do 2012. Reasons are pretty much the same as the comments from Ray. If it isn’t broken (and still supported), leave it alone.
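That intermediate-hop requirement can be sketched as a tiny path search over a simplified upgrade matrix. This is an illustration only, not Microsoft’s official support matrix — always check the documented upgrade paths for your exact versions and service packs:

```python
from collections import deque

# Simplified illustration: which source versions each release can
# upgrade from directly. (Real rules also depend on service pack levels.)
DIRECT_UPGRADE_SOURCES = {
    "2005": {"2000"},
    "2008": {"2000", "2005"},
    "2012": {"2005", "2008", "2008R2"},
    "2014": {"2005", "2008", "2008R2", "2012"},
}

def upgrade_path(source, target):
    """Breadth-first search for a shortest sequence of version hops."""
    queue = deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for version, sources in DIRECT_UPGRADE_SOURCES.items():
            if path[-1] in sources and version not in path:
                queue.append(path + [version])
    return None  # no supported route

print(upgrade_path("2000", "2012"))  # ['2000', '2005', '2012']
```

Which matches the commenter’s experience: 2000 can’t go straight to 2012, so you stage through 2005 (or 2008) first.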
We run 2008 R2 on one server and a passive backup on a second server. The cost to upgrade to 2014 was $28,000 because we would have to buy 8 x 2-core licenses (8 cores per server), since a SQL 2014 license no longer covers a passive backup server. And it also appears most of the value of 2014 is not performance but HA features, which we don’t need. We will simply wait until we need to run 2 servers in production.
In my experience, it has been old, mission-critical apps running on standalone boxes with SQL 2000 hosted on the same box, combined with the attitude ‘If it aint broke…’. It takes time to convince people. Once my CIO started making ‘out of support’ a common buzzword among the board and top managers, priorities finally started to change.
My guess is the 2% on SQL2000 is probably due to the difference between DTS and SSIS. The number of DTS packages I have seen at some companies is astonishing! – No one wants to convert hundreds of working DTS packages to SSIS.
For SQL2014, I think a lot of people are hesitant because of changes to the carnality estimator, and let’s face it, there were not a lot of compelling reasons in the database engine to move from 2008R2 to 2014
It will be interesting to see how fast people adopt 2016 which has lots of database engine changes like JSON support.
I want a “carnality” detector. Sounds naughty. 🙂
Maybe also a “carmality” detector. That could be yummy. 😉
Or a Calamity detector!!
YES! That would be an excellent invention, indeed.
It’s called a Pink Slip. If you get one you know there is a calamity.
I suspect that the users of Dell Spotlight Essentials (a free monitoring tool) are not representative of all SQL Servers in use.
Alan – interesting! Do you think people who use monitoring tools are more ahead of the curve, or farther behind?
Dell Spotlight is a great tool, but I would guess that users skew towards people who live in the land of no budget, given the plethora of really amazing paid tools that are out there.
The question is: how MUCH do they skew? My guess: not that much.
I know that we have customers who still use SQL Server 2000 and 2005.
Most of our customers are currently on 2008.
A few on 2012.
I don’t know any that are on 2014.
Joseph – so it sounds like these no-budget folks are even more advanced than you! 😉 Jokes aside, that’s been my experience too – people who have free alerting may tend to have lower budgets, but they also have lower restrictions on jumping versions. Enterprises with huge budgets tend to move more slowly.
Yep – we kick it old-school 😉
We’ve got one customer, in particular, who pays us an absolutely absurd amount of money for support to allow them to remain on a VB6 solution running against SQL Server 2000. Crazy. Hey, whatever works, I guess.
We still have almost a dozen SQL 2000 servers kicking around that we can’t seem to get rid of. 15 SQL 2005 servers. The numbers go up from there, luckily. The reason we can’t seem to get away from these is because the business is a firm believer in “If it ain’t broke, don’t fix it.” So no matter how much we want to pull the trigger on moving forward, they aren’t eager to mess with something that is working perfectly fine at the moment.
I have a feeling this could be common in many environments.
If MS wants fast adoption, they can’t be releasing a major version every 2 years, IMHO. It takes almost that long to get every server upgraded…We’re in a permanent upgrade loop at this point.
If MS wants fast adoption, they should not be cranking up the license prices like they do now.
I agree. Having a release cycle every two years hinders fast adoption.
Being in a 2 man DBA team running 160 Database servers world-wide and 1600 databases, I just don’t have the man-power to look into each new SQL Server version.
Last year we managed to have a management statement handed out to IT, informing everybody that we would adopt SQL Server 2012 as the new standard for SQL Server installations. And then “they” still goofed a new TFS installation by having SQL Server 2014 installed. Result: No support from the DBA team.
We still have one SQL 2000 running (phase-out in progress) and a few SQL Server 2005’s. The majority of installations are 2008 R2 and SQL Server 2012 on the rise.
And then there are the licensing issues…
If I had to guess I’d say that a number of the 2000 instances are cases where the company couldn’t justify the cost of upgrading an old legacy system that may or may not be around in a few years.
More interesting to me is the large number of 2008 R2 instances. We just finished upgrading a number of 2005 instances, and while we upgraded a few to 2012, most of them went to 2008 R2. And that’s with 2014 having been out for over a year. The reason? DTS. It was just too expensive (particularly in terms of time) to upgrade hundreds or even thousands of DTS packages to SSIS until forced to do so.
I am guessing though that adoption rates will continue to slow down. I think that with the speed of SQL Server releases companies will choose to skip more and more releases. Only upgrading every 2nd, 3rd or even 4th release.
I’m part of a large, global product that uses either SQL or Oracle as its back end. We help many customers. A few reasons I know of are the following:
1. We are an N-tier application, and our application server generates the SQL code for either Oracle or SQL. In this case, we must take into account the new language constructs that are adopted, where appropriate. This will cause us to regression test our application server runtime component.
2. SQL ain’t cheap. Our customers tend to try and squeeze as much life out of each version they can.
3. Our customers don’t understand SQL Server. So they don’t often realize a technical advantage for migrating.
4. Like #1 above, moving to a new SQL Server version must be supported. For us, that might also require that you upgrade our application too. Given the size of our application, that is complex and requires multiple consultants, a project plan, and a decent amount of time.
Hope this helps!
The reason is also time;
1. The users are all too busy to think about finding time for the planning, testing and downtime.
2. It was an awfully long time ago that the SQL 2000 or 2005 servers/databases were installed, and the original Database Owners have left and no one new wants to step up.
3. They are third party Databases and getting support from the supplier to migrate and re-configure the application is hard (they may have gone, been bought out or simply want you to buy something new)
… and there is also the need to upgrade the server as well.
And as Ray says, there is a lot of ‘if it aint broke’…. but I think it’s now time that we try and fix it anyway :0)
Our company product is a commercial app with a SQL bottom tier. We keep SQL 2000 through 2014 around because some of our customers do. As long as they’re paying us annual maintenance, we have to be able to provide support.
My strategy is to make sure that the database is migrated to a more current SQL platform when an application gets upgraded. There is usually a good reason to upgrade the application so that drives it really. Otherwise, there’s too much hassle involved in trying to get the application owners to upgrade their application or re-map it to a new server. It would be nice to get rid of the 2005 instances in our environment though. That chart doesn’t surprise me at all!
I believe our reasons are like most others, especially for a small shop
1.) Cost, not only for the software / license, but also to purchase a new server (s)
2.) Time to set up and test and make any changes to front end apps
3.) Everything runs fine right now – no major issues
4.) Some of our apps would require approval from other agencies and they are not fond of change.
We have a SQL Server 2000 Instance on a Windows Server 2003 cluster (32-bit). It’s a major pain point for me, but the business wants to stay on it because they have HUNDREDS of Access 97 based apps hitting it. One of which is a MAJOR application that would take a very significant effort to replace. When the CIO mentioned the budget and testing effort, they just curled up into a little ball.
When will it get replaced? When the SAN (which was put into service in approximately 2003) finally bites the dust. Could be any day, now. They don’t even want to budget the SAN’s replacement. The status quo seems great, until that status changes to, “Hey. Why can’t we…?” Then it’ll be another Herculean effort to get the business back online.
p.s. Yes. I said Access ’97.
It all makes COBOL look good, doesn’t it?
Yes, I’m serious.
When I worked in a mainframe shop in 1999-2002, we had code that hadn’t been recompiled since 1985. It just ran, and no one lost any sleep at all over the idea that IBM would ever stop fully supporting it.
I’m living that pain too. We have a SQL Server 2005 database which started life as a number of separate Access databases all lumped together into one big one, then the back end got scaled up into SQL Server. The Access portion had connections directly to the tables, no stored procedures, you get the picture.
I’ve spent the last five years modifying it to get us away from Access and onto .NET applications for the front end (to say nothing of doing proper backups and maintenance on the back side).
This past year we tried to get approval to go to 2014 and got rebuffed when we discovered it would be on the order of $27,000 for Standard edition on a single box.
Ray is the man with the answers.
Smaller companies are getting squeezed by costs. Only our larger clients with experience want SQL Server. The rest don’t care, they don’t want to pay the price. So we look at alternatives for a market segment Microsoft won’t support.
I was unaware sql 2008 was a thing… 2008r2 yes… but not 2008
If I were a betting man, I would put good money on the fact that the licensing costs alone keep a good majority of people on older versions. When Microsoft moved to a per-core based licensing scheme for 2012 and up, they really put the screws to its user base.
When we finally moved off of Access 2003 and 97 databases for production data and applications in 2012, the licensing costs were tough to take. We ended up spending around $25K for the licensing of two dual quad-core servers (one production and one DR) for SQL Server 2012 Standard Edition. That price tag was nearly double what the hardware cost us. That’s a tough pill to swallow when every penny of the bottom line counts to the shareholders. And don’t even get me started on Enterprise Edition – who has $150K for one set of licenses?
We’re looking at upgrading the hardware since it’s reaching its end of life, and now the thought of licensing the new hardware (which would have way more than just the 16 cores we’re currently licensed for) is making management rethink the definition of “end of life.”
So, how did they get the numbers? I know the product is free, but does it send info to the Dell Mothership with or without user intervention?
Jack – the whole product is based in the cloud. It sends measurement data to the cloud, and then you get analytics that were calculated up there. (Otherwise, with the paid versions of monitoring tools, you have to host your own analytics repository.)
Brent is correct, the free version that is fully hosted via SpotlightEssentials.com leverages a SQL Server Management Studio plugin that collects the data locally and periodically uploads it into the cloud. As a free cloud-hosted service, the customer does have to “opt in”; however, the free product will not be functional if the customer opts out.
Data can also be uploaded to SpotlightEssentials.com via the paid-for Spotlight on SQL Server Enterprise product. This is also an “opt-in” by the customer; however, the product will still be fully functional if the customer chooses not to, they will just be unable to take advantage of the cloud-supported functionality (i.e. health checks and mobile device support).
While I find this data very interesting as well, and I think the 20+ thousand servers that SpotlightEssentials knows about are a good data point about SQL Server deployments at large, think about all of the SQL Servers out there that no one cares enough about to monitor at all, whether via a free or paid tool. I group SQL Servers into three tiers. The top tier is servers that DBAs know about and are able to convince their bosses to spend money to support. These are, in my opinion, the most likely to be upgraded in a reasonable time frame, as the customer will not only want to take advantage of new MSFT functionality, but also need to guarantee MSFT continues to support them. The second tier is those servers that are relatively important to the business, but where it’s difficult to justify the cost of upgrading or tooling. The bottom tier is those that the customer/DBA may not even know about.
If anyone would like to take a look, and contribute to our data, we’d love you to!!!
I am down to my last 6 2005 instances which comprise a measly 8% of our current deployments. And I have firm timelines for ditching 3 of those even: yay!
Several things to consider here.
1: Many times corporations move at a snail’s pace. It takes ages to get approval to go forward with anything.
2: Testing an upgrade can take months. Let’s say you finally got your company to agree to upgrade your SQL Server to 2014. It may take a long time to get a game plan ready and test it through environments.
3: If it ain’t broke. I’ve seen a company that literally had an AS400 running some ancient SQL Server chugging along in a closet that they had accidentally walled over. Since they had no problems they just let it go.
Not saying for better or worse just giving some reasons why convincing your company to upgrade can be an uphill battle.
Although I will say I expected 2005 to be waaaaaay lower than it is.
I think it may be more to do with server age: older servers by definition are more likely to have extra stuff installed on them like monitoring tools. A long-standing server that hasn’t upgraded yet might be more likely to warrant monitoring, because people want to get every ounce out of it. It might say more about the use cases for the monitoring software than it does the takeup rate of SQL Server editions?
The bigger issue is that the software vendors are not certifying their software for the new versions – I’m sure there is a comment above this one that relates – but unfortunately the business first asks “Does the vendor support SQL 2014?”
At this point in my life, I work on the vendor side.
It’s a weird deadlock. As a vendor, we don’t want to upgrade until our customers are asking for it, because that’s cost and risk forcing the customer into something.
The customers, on the other hand, want to stick with something that we “certify”. We tell them “all of our test environments are currently yada yada.” Then they go out and buy that.
It’s a surprisingly intractable phenomenon.
I think it’s more like code creep, except this one is for servers. Most companies I have consulted have at least a few SQL 2000 servers lying around, some of them just there to support an application that just isn’t worth upgrading, the rest just lying around coz it ain’t broke yet.
I think no matter which version of SQL comes out, companies have a tendency to hoard servers, sometimes citing license costs, or just because upgrading the database would require upgrading the hardware and just too much paperwork.
We run only SQL 2008 R2 and above .. we need to get upgraded away from SQL 2008 R2 by 2019 … we are based in the financial sector and we need to stay PCI DSS certified which means staying on a product version that is supported by Microsoft so that we still receive hot fixes/security patches/service packs etc …
I have also read that when a vulnerability is discovered in a supported product (which should then receive a fix) hackers will then try this exploit in older versions that won’t be patched to see if the vulnerability exists there as well.
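For anyone doing the same compliance-driven audit, an instance’s build can be pulled with `SELECT SERVERPROPERTY('ProductVersion')`, and the major/minor numbers map back to release names. A minimal Python sketch of that mapping, covering only the versions in the chart (support end dates change over time, so check Microsoft’s lifecycle pages rather than hard-coding them):

```python
# Map a ProductVersion string (e.g. "10.50.6000.34") to a release name.
# 2008 and 2008 R2 share major version 10, distinguished by the minor number.
RELEASES = {
    ("8",): "SQL Server 2000",
    ("9",): "SQL Server 2005",
    ("10", "0"): "SQL Server 2008",
    ("10", "50"): "SQL Server 2008 R2",
    ("11",): "SQL Server 2012",
    ("12",): "SQL Server 2014",
}

def release_name(product_version):
    parts = product_version.split(".")
    # Try major+minor first (for the 2008 / 2008 R2 split), then major alone.
    return RELEASES.get((parts[0], parts[1]),
                        RELEASES.get((parts[0],), "unknown"))

print(release_name("10.50.6000.34"))  # SQL Server 2008 R2
print(release_name("12.0.4100.1"))    # SQL Server 2014
```

Running something like this across an inventory makes the “which boxes fall out of support next” conversation with auditors much shorter.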
We still have an old 2000 box knocking around. The application still works so the business area is unwilling to stump up the money for either a new server or for application vendor support for migration.
Then there is the 2005 Enterprise cluster. Costs of migration to 2014…. Whoaaaaa…. Might have to be a 2012 standard especially with 2014 requiring the passive node to be licensed too ?
Everything else we have is 2008r2…. Including a bunch of SQL express ….
We do use the free Dell spotlight product…. On the basis of the above, we could not justify the paid for version. It’s another tool to use along with sp_Blitz. It gives a useful visual indicator and a starting point for some performance issue investigation.
Money is tight.
That looks like an almost exact breakdown of a couple hundred servers I’m familiar with, except more heavily weighted to 2008 than 2008 R2. There are a few 6.5 servers stuck somewhere in manufacturing land; we don’t have direct access so it’s not our problem. There’s a few more 2000 servers but they’re used for one particular application which refuses to move.
Upgrading is made difficult by the enterprise structure (“Service Management”). Departments must charge each other for work done to justify their resources; and as the database team’s cost centre, it’s not like you can offer another team virtual money to spin up a project (the requisite business analyst, project manager, and support staff) to understand who is using an application and then assist us with moving it over to a new platform.
Senior management just kind of ignores it because it’s all virtual money. But middle management, the ones who can actually assign the people to do the work, take it very seriously and so refuse to cooperate. “But we shouldn’t have these old unsupported servers, it makes our life difficult (because we have to write scripts to accommodate old versions with no DMVs and no Outer Apply), and it’s not too difficult to move over…” falls on deaf ears.
Do you know what management does care about? Consolidation. Now they don’t care about any problems it might introduce (how exactly you’re going to diagnose performance problems, on virtualised machines, in always on groups running hundreds of databases, for multiple applications, with no budget for monitoring software). They just care about the number. You have 250 servers finely tuned between applications and split between dev/test/qa/prod? Take it down to 50. We don’t care how or why, we don’t care that everyone refuses to cooperate because there’s no money in it, or if it’s going to provide us any benefit whatsoever (in disk, CPU, memory, etc), just do it, because numbers look good.
If you hadn’t mentioned 6.5, I’d have wondered if you work where I do, especially when you started to talk about consolidation. I’ll bet the numbers management are looking at which drive it are the licensing agreements and only after someone realized how much money they’d have to spend to ‘true up’. Yet ironically while infrastructure tells everyone else they have to consolidate, it somehow doesn’t apply to themselves when they decide their tools shouldn’t have to share with anyone else or even each other.
The good news is that migrating from SQL 2005 to any of the newer versions is easy. Converting from SQL 7 to SQL 2000 was difficult and from SQL 2000 to SQL 2005 was unnecessarily difficult again. The shift from SQL 7 was hard because Microsoft abandoned many non-ASCII syntaxes. The shift from 2000 to 2005 was difficult because Microsoft stopped supporting INFORMATION_SCHEMA views on linked servers. They were again supported in 2008 and later versions. There were also many other pain points in going from 2000 to 2005. These were hugely expensive. The costs of developing and testing the conversions from 2005 to 2014 were less than 5% of the costs of the conversion to SQL 2000.
We are working hard to get everything to 2012. Then we will still be unsupported when 2016 comes out, just more currently so. A 2-year cycle gives no one a chance to breathe. Don’t need shiny and new, need stable.
Why don’t people upgrade faster? I’d say because admins and companies just don’t see any point in upgrading as long as the old version runs fine. Consider that before SQL 2008, MSSQL was frequently viewed as a second-class citizen and used to run only less important apps and internal web sites while the really important business-critical stuff ran on Oracle (well, in some companies this is true even today…).
The new features in SQL are mostly aimed at enterprise-level computing and bring little advantages for those small users. In such a case I find it difficult to push for an upgrade which would only cost money while bringing no recognizable business benefits. There is also a good chance the original vendor or developer of the app is no longer available and migration could just result in a dead app etc.
On the other hand, if a company’s business relies on something MSSQL-based, I’d bet they tend to be in the more up-to-date section of your pie chart.
That chart shocked me as well. What an ugly pie chart! Way too many slices and not sorted by size. Just awful. Ah well, once a BI guy… 😀
I second Koen’s comment on the pie chart, though I’d have expected the slices to be sorted by release order, but I wouldn’t call myself a data visualization expert either.
I’m surprised that the comments so far only mention OS versions once. In my world the DB and OS versions tend to be coupled – most recent versions of each as servers get replaced, along with a strong avoidance of in-place upgrades. So we’ve been able to make progress towards retiring sql 2000/2005 machines mostly as a side effect of the server admins trying to rid themselves of Win2003.
The exceptions I’ve seen usually involve 3rd party tool support. I have one vendor who just now started supporting database mirroring, and sql 2012, but not availability groups. For some vendors I think they may purposely only support up to the n-1 most recent version.
The push for PCI/DSS and other compliance rules such as SOX, HIPAA, etc. with their requirements to only run supported software is both a blessing and a curse. It helps push people to upgrade when they wouldn’t otherwise (because ‘it just works’, regardless of whether it’s supported anymore) but as someone else pointed out, now it’s a seemingly endless upgrade loop. The cloud does offer a way out of that upgrade loop, but then you have to make sure your use of it still complies with those rules.
you may not call yourself a data viz expert, but sorting on release date is indeed an excellent idea. As a bar chart it would have a great impact:
Most excellent, and from that we can see that 2008 is an outlier to the normal distribution one might expect.
Don’t forget, staying at lower levels cannot always happen. We are slowly being forced to higher versions due to the OS version upgrades. I hate the core licensing, the msoft sales rep should have been wearing a mask when he told us the prices.
We will stay at 2008R2 as long as we can, but I don’t think that will be too long.
We may be forced to look into cloud based solutions, if the costs dictate that avenue of enquiry.
2008R2 is like the Windows XP of all the SQL Server versions: steady as a rock and suitable for most customers… why change? I think 2012 will eventually fall behind too. The companies where I have been tend to migrate from 2008R2 to 2014 and skip 2012. Don’t see a lot of 2000’s anymore though
2008R2 was good indeed, although mostly a BI release if I remember correctly. I see a lot of clients on SQL 2012 (better SSIS, better MDS, better SharePoint integration for BI, Power View support in SharePoint, window functions and other nice stuff) and skipping 2014.
I worked for a company a few years ago that had 2000, 2005, and 2008r2. The reasoning behind the 2000 boxes was custom code that was previously built for a client that would require a major overhaul to upgrade. The client didn’t want to pay, so it was determined they would use the legacy code as long as they could.
I was able to get us to upgrade to Sql2012, but that was probably only 6 months before Sql2014 hit CTP, as I remember. Sql2014 and the upcoming Sql2016 still don’t have any significant features/improvements for my world. Microsoft seems to be focusing on adding features for the “Enterprise” version, and I live in “Standard” land, which is affordable with 4 vCPUs.
Assuming the current revision cadence, if Sql2018 still doesn’t address some of my “list”, I don’t see us having a reason to upgrade until Sql2020. We only made the jump up to Sql2012 because of the new windowing functions, try-catch functionality, and .Net CLR integrations.
I’m still waiting for stuff like:
* Performant UDFs and table variables (they’ve started to address it, but it’s not quite there yet; how about optimizations for a deterministic UDF, at least?)
* The ability to pass temp-tables and table-variables in and out of functions/sprocs
* The ability to declare a variable type based on a table column (like Oracle has had for decades)
* A syntax extension to make an INSERT be more like an UPDATE (e.g. like MySql’s “SET/SELECT” extensions do, which allow the programmer to specify the target column name next to the value that’s going into it, rather than in a completely different part of the SQL statement).
In full agreement with Granger. In most non-Fortune 500 dev shops there isn’t a need for many of the new features past 2008 R2. From a BI perspective 2012 is solid, and I haven’t seen any compelling features that would justify an upgrade.
Thanks for the useful stats. Do you also have stats on the popularity of Enterprise vs Standard Editions?
Great question! No, I don’t have that.