SQLChicken.com

SQL Server DBA Tips & Tricks


So Long and Thanks For All The Fish

Not to try and steal Satya Nadella’s thunder today at WPC but I’ve got some exciting news of my own to share. I’m excited to say that I’ll be fulfilling a lifelong dream and going to work for Microsoft!

Every Beginning Has an End

This news is kind of bittersweet, though, as I’ll be leaving Pragmatic Works after almost four amazing years. It has truly been an awesome experience. In the last four years I’ve had the opportunity to work with some of the best folks in the business and had opportunities I never would have dreamed of. Here’s a highlight list of some of the best parts:

  • The amazing people I’ve had the honor to work (and karaoke) with
  • Learning the ins and outs of SQL Server from some of the best in the world
  • Opportunities to write on major SQL Server book titles
  • Learning and teaching exciting new technology like Parallel Data Warehouse/APS
  • Developing and delivering exciting training content
  • Helping others through mentoring and teaching in the Pragmatic Works Foundation
  • Ability to shape and have input to software offerings
  • Company culture built around giving back to SQL Server Community
  • Literally scaling mountains

I’d like to give a special thanks to Brian Knight (Blog | Twitter), Adam Jorgensen (Blog | Twitter) and Bradley Ball (Blog | Twitter). My time at Pragmatic Works was great and I couldn’t have asked for better people to work for. Thank you guys for everything.

 

So What’s Next?

The good news is that in my new position I’m afforded the opportunity (and encouraged!) to continue being involved with the community. You’ll still see me presenting at SQLSaturday events, picking up the slack on my blogging and webinars, and hopefully presenting at future PASS Summits (buahaha, now I get to submit under the Microsoft call for speakers!) and other conferences.

I’ll also be delving into some other areas, mainly in the development space. You’ll be seeing some new content from me in areas such as BI development, Big Data, Cloud development, and Windows/Windows Phone development. Should be fun!

And like that…he was gone


T-SQL Tuesday #48 Roundup


A big thanks to everyone who participated in this month’s T-SQL Tuesday (link) blog party. This month’s topic was to give your thoughts on Cloud. Lots of interesting reads after the break.

Read More


T-SQL Tuesday: Head in the Clouds

This month’s T-SQL Tuesday is hosted by yours truly. Our topic this month is simply the Cloud. If you work in IT there’s approximately zero chance that you’ve managed to avoid this word in some respect. Has your manager asked you to look into what cloud solutions can do for you? Are you ahead of the curve and have taken it upon yourself to already start using and/or testing cloud solutions? This month I asked everyone to share their thoughts on the cloud.

Choices, Choices

When people talk about cloud solutions, there are a myriad of options they could be talking about. Since this is a SQL Server focused blog, I’m going to focus on offerings specific to that. More specifically, I’ll be talking about Microsoft’s cloud platform, Windows Azure, since that’s the platform I have experience with.

In regards to choices around SQL Server in the cloud, there are two routes you can take. The first is Windows Azure SQL Database (WASD), an offering known as Platform as a Service (PaaS). It gives developers a relational platform to develop against quickly and easily, without the hassle and worry of the administrative overhead that goes with standing up a full SQL Server instance. The drawback is that this option comes with certain limitations, which I’ll drill into in further detail below.

The second solution you’ll come across, and my personal favorite, is Windows Azure Virtual Machines. This offering is referred to as Infrastructure as a Service (IaaS). What this gives you is an on-demand, scalable compute infrastructure. In non-marketing speak it basically means you can spin up a virtual machine with SQL Server already installed, storage allocated, and customized number of CPUs and memory in minutes instead of waiting around for your IT department to go through its normal provisioning cycle. If it sounds like I’m advocating completely circumventing your company’s policies and going rogue, I’m not. More detailed thoughts on this offering below as well.

WASD: Hey DBAs, It’s Not For Us!

Ever since Azure came out and rolled out the various SQL Server offerings, I’ve been trying to wrap my head around this particular facet of the platform. From the day it came out (back when it was still called SQL Azure), all I could do was focus on its limitations and what it couldn’t do.

Some of those limitations have changed/increased over time, such as database sizes. At first the largest database you could create was 50GB. Now you can create databases up to 150GB in size, and you can shard your data out to get beyond that 150GB barrier if you need to. However, sharding data like that requires different coding techniques that your development team likely isn’t using today.

Additionally there are other restrictions, like requiring a clustered index on every table, which isn’t necessarily a bad thing. Since this database is in the cloud, another issue developers need to code for is network connectivity. Connectivity can (and will) drop on occasion, so it’s necessary to build retry logic for connections into your application. Finally, if you write a bad query that causes the transaction log to “blow out”, your connection gets throttled for a time. My reaction as a DBA was: with all these restrictions, why would anyone in their right mind want to use this?! And therein lies the crux of the issue: I’m a DBA…this isn’t a solution meant for me.
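To illustrate the clustered index requirement, here’s a minimal sketch (table and column names are hypothetical). As I understand it, WASD will happily run the CREATE for a heap, but it won’t accept inserts until the table has a clustered index, so it pays to design one in from the start:

```sql
-- Hypothetical table: in WASD, INSERTs against a table are rejected
-- until the table has a clustered index, so build it in up front.
CREATE TABLE dbo.Orders
(
    OrderID   int           NOT NULL,
    Customer  nvarchar(100) NOT NULL,
    OrderDate datetime2     NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID)
);
```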

It wasn’t until having some conversations with folks at this year’s PASS Summit that the whole use case for WASD, and my understanding of it, really clicked into place. After attending Conor Cunningham’s (Blog) pre-con on the Azure Data Platform, attending Grant Fritchey’s (@GFritchey | Blog) session, and having conversations with Lara Rubbelke (@sqlgal | blog) and Eli Weinstock-Herman (@Tarwn | blog), amongst others, I came to a realization about PaaS: it’s not meant for me, so I really shouldn’t be bothered by the fact that it can’t do X, Y, Z. Just because it has the SQL Server label on it doesn’t automatically mean I, the DBA, need to own it! “But Jorge, in my shop if it says SQL on it I end up supporting it anyways!” Well that’s okay, because with PaaS the administrative side of things is handled (for the most part) by Microsoft. That includes backups, hosting/provisioning of servers, and day-to-day administration of the infrastructure that supports the solution.

Long story short, this is a solution aimed at developers. They just want a relational data store to develop against without the headache of waiting for someone to provision them an environment, nothing more. Think this isn’t happening today with devs? Check out this Twitter conversation I had with Scott Hanselman (@shanselman | blog) recently:

Scott not only uses WASD for development purposes, he wasn’t even sure what I was talking about when I asked him if he used WASD; that’s how transparent they’ve made it for developers. The conversation started with my discovery that not all administrative pieces of WASD had been ported over from Silverlight to HTML5 yet. He didn’t know because, as a developer, that’s something he never had to deal with or care about. In the words of Martha Stewart, “that’s a good thing”.

OH NOES, CLOUD HAZ TAKEN MAH JOBZ!

Don’t worry, dear reader (totally ripped that from @sqlballs), not all is lost, and no, your job as a DBA isn’t going anywhere. If anything, your job stays intact and it’s going to evolve. Evolution is good; ask Charles Xavier. The rise of cloud technology not only cements your role in the company but will actually upgrade you a bit, as you evolve into more of an architect role. Still like staying close to the technology? It’s still there and not going anywhere. We still have our on-premises options. Not only that, we have pretty cool cloud options that are made to work hand-in-hand with our on-premises environments. Which brings me to my favorite option…

Windows Azure Virtual Machines FTW

I love virtual machines. I especially love Windows Azure Virtual Machines (WAVM). Not only do they keep my job intact, in that I’m still doing everything a DBA does today in administering and using full SQL Server inside an operating system, but they also make my job a hell of a lot easier in some respects.

One of the coolest things about WAVM is that Microsoft provides a nice selection of pre-built template virtual machines to choose from. SQL Server 2008 R2 + SP1 on Windows Server 2008 R2 + SP1? It’s there. SQL Server 2014 CTP 2 on Windows Server 2012 R2? Only a few clicks away. Not only that, you can fully customize these virtual machines’ resources, such as the number of CPUs, how much memory is allocated, and disk space. That last one is probably the best news anyone who has had to beg a SAN admin for disk space has ever heard. You also get the benefit of applying high availability options as well as backup protection options in a few clicks.

So if it’s just a virtual machine, just like you have today in your datacenter, what’s the big deal? Well, there are a few things. I just mentioned the self-service ability. Unless your enterprise has invested in a full-blown Private Cloud solution, you probably don’t have anything like that available to you. Today you’re putting in a request, or opening a ticket, outlining what you want and writing up justifications for it. Then you get to wait for the network team, SAN team, sysadmins and DBAs to all do their part in setting up the machine before finally turning it over to you.

Fantastic…What’s The Catch?

I know, I’m painting a rosy, unicorn-laden picture. The fact is there are certainly some things about WAVM you need to consider. First, it’s not connected to your network. Not a problem…maybe. There are ways to extend your network out to the cloud through Windows Azure Virtual Network. If you do extend your network out to Azure, you can also stand up a domain controller there so any virtual machines you spin up look and feel just like any other server on your corporate network.

Okay, then what about reliability? Each component of Azure offers its own SLA, which you can see here. As of the time of this article, the stated SLA for the virtual network is 99.9%, and for other cloud services (virtual machines in availability sets) it’s at least 99.95%. Do you get that sort of SLA at work today? You might. If so, compare what you’d pay for that level of reliability and service using Azure versus what your company paid to set up the infrastructure and staff to offer its current level of reliability.

What’s security like? Well, I’ll be blogging and presenting more on Azure security this coming year, but for the purposes of this post I’ll condense it: it’s as good as you make it, just like your current environment. Again, because we’re talking virtual machines, it’s all the same as what you’re doing today inside your data center. In fact, I would bet that most of you currently work in places where your company’s datacenter is actually located outside your company and hosted by someone else (e.g. colo sites). In these massive datacenters you have racks and racks of servers and equipment that are bought and paid for by customers of the host but are physically located side by side. Azure is also a co-located situation, but you have a little more dynamic control over where components of your solution are located.

Okay, so we have our virtual machines “hanging out” in public for anyone to get to then? Not exactly. The virtual networks you configure essentially have their tin foil hats on by default and are not open to the world. For the portions you do open up, you have to explicitly grant access through the firewalls in place. How about that data in storage? Again, how much do you secure it today? If you leave it unencrypted at rest in your data center today, then you’re potentially exposing it to theft there as well, so technically this risk exists in both worlds. In the end, with security, there comes a point where it’s simply a matter of trust. Trust Microsoft to secure their data centers. Trust yourself to do your job correctly and secure what needs to be secured. This last point brings me to my final epiphany about the cloud, thanks to Grant Fritchey/Tom LaRock (@SQLRockstar | blog) for this one…

The Cloud Forces You to Do It Right

This goes for both PaaS (especially) and IaaS. One of the best things I heard at Summit this year was Grant ranting on how WASD forces you to code correctly. Write code that blows out the transaction log and the service kills your session? Then write it correctly to avoid that. Network glitches can and will occur. Have you written retry connection logic into your application? I guarantee you will now.
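The shape of that retry logic can be sketched even in T-SQL, though in practice it usually lives in the application’s data access layer. This is a minimal sketch only; the procedure name and delay are hypothetical, and THROW assumes SQL Server 2012 or later:

```sql
-- Minimal retry sketch: catch the transient error, back off, try again.
-- dbo.usp_SaveOrder is a hypothetical unit of work.
DECLARE @retries int = 3;
WHILE @retries > 0
BEGIN
    BEGIN TRY
        EXEC dbo.usp_SaveOrder @OrderID = 42;  -- hypothetical call
        BREAK;                                 -- success, stop retrying
    END TRY
    BEGIN CATCH
        SET @retries -= 1;
        IF @retries = 0
            THROW;                -- out of retries, re-raise the error
        WAITFOR DELAY '00:00:05'; -- otherwise back off before the next attempt
    END CATCH
END
```

Real-world versions would also check the error number so only genuinely transient errors are retried.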

Like it or not we’re seeing a fundamental shift in how computing solutions are offered and used. We’re seeing a world of consumerization of IT (I hate myself for typing that marketing buzz phrase) where end users expect the freedom to pick and choose their solutions and don’t want to wait for the black hole that IT can be to respond to their needs. They will discover solutions like Azure, see how fast they can do stuff on their own, and potentially get themselves in a bind. Instead of coming off as the stodgy group that doesn’t want to help, embrace these solutions yourself and offer them up with guidance. In the end it’ll be a win-win for everyone.

How do you feel about this whole thing? If you didn’t write your own post this month I’d love to hear your thoughts in comments below.

 

 


Monday Morning Mistakes: SSIS Expressions Not Evaluating Correctly


SSIS Expressions

Expressions in SSIS are great. They allow you to create dynamic values for all sorts of stuff like variables, connection strings, properties for almost anything, etc. One huge issue that tends to trip up a lot of folks, especially those new to SSIS, is the evaluation of those expressions when using variables.

The Issue

You create an SSIS variable with an expression but at runtime the expression is not evaluating as expected. Instead the expression value is using the default/static value of the variable.

Quick Answer

Make sure the ‘EvaluateAsExpression’ property on the variable is set to True. Without it set to True, the variable will evaluate to the hard-coded default value instead of the expression.

Read More


Monday Morning Mistakes: Remote Connectivity to SQL Server

Inspired by common emails and questions I see, I figured I’d do a series of blog posts on common mistakes folks make with SQL Server called Monday Morning Mistakes (or #sqlM3 for short, since we all love quick hashtags these days). These are meant as quick fixes, nothing too comprehensive. Also, since I just made up a hashtag, feel free to share your own #sqlM3 tips on Twitter anytime! Without further ado…

Today’s quick issue: Can connect to SQL Server locally but can’t connect from other server or computer.

Quick answer: Remote connections (read: any connections that are not local) to SQL Server are disabled by default. This is the default behavior in SQL Server 2005 and higher. You have to manually enable the TCP/IP protocol for the instance to allow connectivity, and the change requires a service restart to take effect.
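Once you’ve enabled TCP/IP and restarted the service, a quick sanity check from a local session can confirm the protocol is actually in play. This is just a sketch of a well-known DMV query; run it from the connection you want to inspect:

```sql
-- Confirm which protocol the current session is using. After enabling
-- TCP/IP and restarting, you should see net_transport = 'TCP' along
-- with the address and port remote clients will need.
SELECT session_id, net_transport, local_net_address, local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```

A connection made through SQL Server Management Studio on the server itself may still show ‘Shared memory’, so test from a TCP connection if you want to see the port.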

Read More


Estimated Completion Time for Backups and Restores

I’m in the middle of a database migration and thought I’d quickly share a script I threw together to show the estimated time of completion for a database restore in progress on SQL Server. The script will show you the estimated time for database backups to complete as well.

Please don’t take this script as gospel; the best way to truly know how long restores will take is to actually perform a restore! Remember, folks:

Backups are worthless, restores are priceless

 

NOTE: Because this script uses DMVs, it will only work on SQL Server 2005 and higher.
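The core of a script like this is just a query against `sys.dm_exec_requests`, which exposes progress for backup and restore commands. A minimal sketch (column aliases are my own; the full script behind the Read More link has more polish):

```sql
-- percent_complete and estimated_completion_time (in milliseconds) are
-- populated in sys.dm_exec_requests for BACKUP and RESTORE commands.
SELECT r.session_id,
       r.command,
       r.percent_complete,
       r.estimated_completion_time / 60000 AS minutes_remaining,
       DATEADD(SECOND, r.estimated_completion_time / 1000, GETDATE())
           AS estimated_finish
FROM sys.dm_exec_requests AS r
WHERE r.command LIKE 'BACKUP%'
   OR r.command LIKE 'RESTORE%';
```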


SQL University: Virtualization Basics

This week we’re going to talk about a topic that has been gaining steam in the last few years and, as it has, has started impacting database administrators’ worlds more and more: virtualization. Why do I make this statement? Well, since the economy currently sucks, shops are finding ways to consolidate and make their dollars stretch a little further. Back in the day, when you had a new application you pretty much went out and bought yourself some new servers and went on your merry way. Now, when money’s tight, folks are a little less likely to go out and simply buy new equipment for each individual application. Not only is this option expensive, there are other factors to think about such as space (the data center may not have capacity for new servers), electricity and cooling.

Enter virtualization. Virtualization lets you tackle this server sprawl by buying a physical server, filling it with tons of your typical resources such as CPU, memory and drives, and carving that single piece of hardware into virtual servers that look/act/feel like independent servers. This week we’re going to cover some basics of virtualization and the stuff you need to know about if you’re going to be going that route in your shop.

Read More


SQL Server 2012: Business Intelligence Edition

Well this was quite the little surprise this morning. Microsoft announced a new edition to the SQL Server lineup for 2012 – Business Intelligence edition. In addition to the new edition (funnily enough, I don’t see Datacenter in that lineup), we also have a new licensing scheme for SQL Server. In SQL 2012 it looks like Microsoft is finally moving to the core-based licensing model. Ladies and gentlemen, start your grumbling! Okay, seriously, the new licensing scheme shouldn’t be that big of a shock to anyone. I think most of us have been expecting this for quite some time, as it only makes sense now that newer processors are coming with more and more cores.

As for the new edition of SQL Server, I think it’s an interesting move to say the least. As SQL Server adoption in the enterprise keeps going up, it kind of makes sense that they’d make a dedicated edition for the BI stack. The last few releases of SQL Server have been BI-feature heavy, and when you’re architecting your setup you should be standing up dedicated boxes (if possible) for the BI stack anyways. In my eyes this is a pretty smart move, although I’m sure some will disagree. With the separation of church and state (the Engine and BI) you now have a little more flexibility in your choices, especially regarding licensing.


 

So what does the new licensing change mean for you? Should you be worried? Well if you’re not sure how your licenses are currently distributed or what you have out in your enterprise deployed right now, I HIGHLY suggest you download and use the MAP Toolkit. This free tool will not only discover instances in your enterprise (not only SQL Server!) but it will give you some really great detailed information including usage information (this is a must-use tool if you’re considering consolidation), editions, number of cores, etc. Run it against your environment and then have a chat with your local Microsoft rep about how the new changes might affect your existing infrastructure.

What are your thoughts on the new changes? Like it? Hate it? Don’t care? Let me hear it in the comments.


Deploying SSIS Packages with BIxPress


If you’ve worked with SSIS for any amount of time, you may quickly come to find that the native way of deploying packages can be…cumbersome. The native tools, while helpful, aren’t exactly the most intuitive to setup or use. This post will show you how you can quickly and easily deploy and configure packages using BIxPress.

Old and Busted

Before I show you how to deploy packages with BIxPress, I should probably quickly explain how to deploy packages in SSIS using native methods. I won’t go into every single detail here; if you’re interested in doing it step-by-step, the built-in Help in Business Intelligence Development Studio (BIDS) has a complete walkthrough tutorial for you to check out. To access those tutorials simply press Ctrl+F1 from within BIDS (or click on the Help menu and select How Do I). From the ‘How Do I?’ list click Find Tutorials, then Integration Services, and then select your tutorial. The one I’m referring to in this post is Tutorial: Deploying Packages.

The condensed version of the tutorial is this: in order to deploy packages you have to go through a series of steps that aren’t exactly obvious from the interface. First, you have to manually enable the ability to deploy at all. You do this by going to the properties of the project, going to Deployment Utility, and setting the CreateDeploymentUtility option to True. Once you’ve done that, you have to build (or rebuild) the project for it to generate what is called a Deployment Manifest file. This file is saved to the file path configured for DeploymentOutputPath, in the same place where you set the Deployment Utility properties. This part alone reeks of user-interface fail to me, but I digress.

Once you’ve created your deployment manifest you’ll need to copy that manifest file out to a share on the target server. After you’ve copied it there, you double click it to launch the Package Installation Wizard. This wizard is pretty typical of Microsoft wizards and is pretty straightforward as far as walking you through your various options. For complete details on deploying using the wizard, refer to the tutorial in the Help. By the time you’re done with the wizard you’ll have deployed the package but your options for customization of deployment are limited.

New Hotness

After learning SSIS over the past year, one of the things BIxPress has absolutely spoiled me with is the ease of deploying packages. In BIDS just right-click on your package and select Deploy SSIS Package (BIxPress) from the context menu. This launches the BIxPress Package Deployment wizard. The first screen that comes up gives you a few really cool options, such as copying folder structures (if needed), deploying XML files for you if you used XML configurations, changing the location of those configuration files on your target server, and even changing the package-level protection from here. These options have made deployments a breeze for me, as on QA servers I had clients putting configs in D:\SSISConfigs and on production it was something different like E:\SSIS_Configs. Being able to quickly and easily change these options on the fly has saved me tons of headaches.

The next screen is the real meat of this feature. Here you can actually select more than just the one package you right-clicked initially for deployment. Additionally, you have lots of options for deploying to and from a server. You can deploy to/from your regular options of File System, SQL Server or SSIS Package Store, and here it’s all laid out for you for ease of use. Speaking of ease, ever wanted to deploy in the opposite direction (i.e. production to development)? Simply check the ‘Enable 2-way deployment’ box and you can deploy bi-directionally quickly and easily. Pretty slick, eh?

Once you check off the packages to deploy and select your deployment destination options, simply click the deploy button in the middle and it quickly deploys your packages. Once it is complete you get a summary of the deployment results which you can save for change management purposes. That’s it, you’re done!

If you want to try out BIxPress you can download a trial copy from the Pragmatic Works website.


We Are Community

Today I was planning on writing a summary post of PASS Summit experience but something happened last night that caused me to change up the queue for blog posts and quite frankly bothered me. Today I’d like to address a few things regarding the Community, behavior within it and just general thoughts about stuff. I apologize ahead of time for the word vomit you’re about to read.

So last night a certain individual began ranting to certain folks on Twitter about what he thought of the MVP Award and how it seems like they “hand it to anyone now” based on “printing out a card for after hours events”. This person (whom for the time being I’m simply refusing to mention) had an opinion, which is fine. When some others and I started reading this, we began defending the person he was talking about, which in this case was Jen McCown (Blog | Twitter) of MidnightDBA fame. What was funny to me was that this person didn’t seem to want to have a reasonable conversation; he seemed to have an almost personal grudge. Even in email format (yes, some of us tried to reason with him in private as well), he kept up the childish name-calling and outlandish behavior.

Now granted, if you don’t like the MVP program or who is awarded, that’s fine and dandy and you can let the folks at Microsoft know (Blog | Twitter | Facebook). Everyone is entitled to their opinion, but when you put your opinion in a public forum and others challenge you on your statements don’t whine about it and throw a fit. Don’t break down into childish attacks. Don’t start attacking everyone with ridiculously stupid statements and then claim people are attacking YOU. Yes, all of this happened and more last night. Missed the fun? This guy got a new hashtag generated for him aptly named #sqlidiot.

Another interesting point came up during our “conversations” with this guy: he made a statement about us whining and pitting Community vs Real Life. Let’s think about this for a second. The SQL Community is not exactly huge, and if you attended PASS Summit last week you get the sense that it’s more like a global family. Most of us know each other offline, and a lot of us who have never met find that when we finally do meet in person we know each other so well we actually FORGET the fact that we’ve never met! We celebrate our triumphs together. We share our pain together. We pray together. We lift each other up and support each other. Hell, we even officiate each other’s weddings! This is Community. This is Family. To think that our interactions are limited to a digital medium is both nearsighted and flat out wrong; it’s only a small part of a very large (and global) picture.

As witnessed last night, you can see how protective we are of each other. Notice I have not made mention of status at all. That’s because something like the MVP award, while cool, doesn’t mean you can’t or don’t belong in this family. From the person looking to write their first SQL query, to the professional speakers, to the folks writing the engine for the products we all know and love, we are all One. Yeah, it sounds a little over the top existential, but I truly feel that way about this Community. When someone goes on a public forum and starts tearing others down, for no apparent reason whatsoever, don’t be shocked when you have quite a few folks fighting back. You may be brave behind a keyboard, but I’d love for someone to try that nonsense at a SQLSaturday event or PASS Summit. It’s not Community vs Real Life; Community IS Real Life, and I will defend it, and the people that make it up, until the very end.

Finally there’s general conduct. We’re all entitled to our own opinions but how you express those opinions, especially in a public forum like Twitter, is critical. I can have a conversation with someone and not see eye to eye with them, that’s fine. Resorting to childish name-calling and tired/pathetic ‘your mother’ comebacks just makes you look like an absolute idiot and you lose any and all credibility you may have had to start with. Some people tend to forget the acronym PASS stands for the Professional Association for SQL Server, emphasis on professional. While we do tend to have our after hours and colorful fun, you’ll rarely see someone all-out break that professional decorum. That’s a matter of respect, for both yourself and the people you interact with. Think about the consequences of your actions, ESPECIALLY in a public forum. The Internet, as they say, is forever (and Google Bing Bingle has a long and easily searchable memory).

I know some of you followed along closely last night and even chimed in with this guy, some of you lurked, some are probably hearing about this for the first time. What are your thoughts?


PASS Summit Keynote Day 1 Highlights

CLOUD! BIG DATA! EXCEL! CLOUD! CLOUD! Okay, recap done. Not really…sort of. In all honesty, while delivered in a fairly terrible fashion, there were some pretty big announcements made in today’s keynote. First let’s start with the one a lot of folks have been waiting on….

Official Names Revealed

The release of SQL Server we’ve known as “Denali” for the last 12 months now has an official name: SQL Server 2012! I know, not exactly exciting, but at least it’s nice to have an official name. Also, since according to the Mayan calendar the world ends that year anyways, this is THE LAST VERSION OF SQL SERVER YOU’LL EVER NEED!!! In addition to Denali getting its name, we also got the official name for project “Crescent”, which is now officially known as Power View.

 

BIG Data on Windows/Azure

Worried about the NoSQL movement and how Microsoft would play in that space? No more worrying; now you get the best of both worlds with the announcement of Microsoft’s support for Hadoop on Windows and Windows Azure! This is actually pretty exciting even though, in this blogger’s humble opinion, this kind of scale doesn’t matter for 99% of the folks out there. With this announcement, however, Microsoft has made huge strides in making the Cloud more relevant for big businesses. Want a multi-terabyte system that scales? Windows Azure can handle that for you now. Want to handle that internally? Local options are also supported. Or create a hybrid solution; the possibilities are actually fairly cool here.

The other story that was sold is that you can use the Microsoft BI stack against your data in Hadoop. An example of this was shown by using PowerPivot to connect to Hadoop on Windows via the new ODBC connector. This connector will be available sometime in November as a CTP download. Speaking of connectors, Microsoft recently released connectors for PDW as well, so you can connect big data with big iron for those who need that kind of data firepower.

 

Project “Data Explorer”

They also showed off a new tool which allows you to explore and merge data from the Azure marketplace and various other data sources. They spent a good chunk of time demoing bringing together data from the Azure Marketplace, SQL Server and some other sources. Honestly, I started tuning out a bit here since the #sqlpass stream became “interesting” at that point.

 

The rest of the keynote consisted of a rather downplayed series of demos in Excel/PowerPivot/Power View. If you’d like, you can check out the keynotes yourself here.


Expiring Databases and Policy-Based Management

Today on Twitter my friend Jes “Run Forrest Run” Schultz Borland (Blog | Twitter) asked the Community: “How do you clean up your dev environments? Let DBs sit out there forever? Delete after X months? Other?” This seemed like an interesting issue to tackle, and, PBM freak that I am, I immediately had a light bulb moment for a policy. In this post I’ll show you a policy you can run against your databases (in dev or whatever environment suits you) that will tell you which databases are more than 30 days old. As an added bonus, I’ll also show you how to add a custom extended property to set a custom expiration date.
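To give a feel for what the policy evaluates under the hood, here’s a minimal T-SQL sketch of both pieces. The property name and date below are hypothetical stand-ins; the full policy version is in the rest of the post:

```sql
-- The core check, expressed as plain T-SQL: databases whose
-- create_date is more than 30 days in the past.
SELECT name, create_date
FROM sys.databases
WHERE create_date < DATEADD(DAY, -30, GETDATE());

-- A custom expiration tag added as a database-level extended property
-- (run from within the database you want to tag; name/value are examples).
EXEC sys.sp_addextendedproperty
     @name  = N'ExpirationDate',
     @value = N'2013-12-31';
```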

Read More


Small Business Hardware

[NOTE] My blog post scheduling-fu is weak, so this post didn’t go out Friday as planned. My apologies.

This is the final installment of our Small Business series. So far we’ve talked about how to get the software, and we’ve talked about the different SQL Server options available to you. Today we’re going to talk about what hardware you’ll need as a small business to set up your database environment for success.

Read More
