SQLChicken.com

SQL Server DBA Tips & Tricks


T-SQL Tuesday #48 Roundup


A big thanks to everyone who participated in this month’s T-SQL Tuesday (link) blog party. This month’s topic was to give your thoughts on the cloud. Lots of interesting reads after the break.


T-SQL Tuesday: Head in the Clouds

This month’s T-SQL Tuesday is hosted by yours truly. Our topic this month is simply the cloud. If you work in IT there’s approximately zero chance that you’ve managed to avoid this word in some respect. Has your manager asked you to look into what cloud solutions can do for you? Are you ahead of the curve and have already taken it upon yourself to start using and/or testing cloud solutions? This month I asked everyone to share their thoughts on the cloud.

Choices, Choices

When people talk about cloud solutions there are myriad options they could be talking about. Since this is a SQL Server focused blog, I’m going to focus on offerings specific to that. More specifically I’ll be talking about offerings from Microsoft’s cloud solution, Windows Azure, since that’s the platform I have experience with.

In regards to choices around SQL Server in the cloud, there are two routes you can take. The first is Windows Azure SQL Database (WASD), an offering known as Platform as a Service (PaaS). It gives developers a relational platform to develop against quickly and easily, without the hassle and worry of the administrative overhead that goes with standing up a full SQL Server instance. The drawback is that there are certain limitations around this option, but I’ll drill into those in further detail below.

The second solution you’ll come across, and my personal favorite, is Windows Azure Virtual Machines. This offering is referred to as Infrastructure as a Service (IaaS). What this gives you is an on-demand, scalable compute infrastructure. In non-marketing speak it basically means you can spin up a virtual machine with SQL Server already installed, storage allocated, and customized number of CPUs and memory in minutes instead of waiting around for your IT department to go through its normal provisioning cycle. If it sounds like I’m advocating completely circumventing your company’s policies and going rogue, I’m not. More detailed thoughts on this offering below as well.

WASD: Hey DBAs, It’s Not For Us!

Ever since Azure came out and Microsoft rolled out its various SQL Server offerings, I’ve been trying to wrap my head around this particular facet of the solution set. Ever since it came out (back when it was still called SQL Azure), all I could do was focus on its limitations and what it couldn’t do.

Some of those limitations have loosened over time, such as database sizes. At first the largest database you could create was 50GB. Now you can create databases up to 150GB in size, and you can shard your data out so you can get beyond that 150GB barrier if you need to. However, sharding data like that requires different coding techniques that your development team likely isn’t using today.

Additionally there are other restrictions, like requiring a clustered index on every table, which isn’t necessarily a bad thing. Since this database is in the cloud, another issue developers need to code for is network connectivity. Network connectivity can (and will) drop on occasion, so it’s necessary to code retry logic for connectivity into your application. Finally, if you write a bad query that causes the transaction log to “blow out”, your connection gets throttled for a time. For me, as a DBA, with all these restrictions, why would anyone in their right mind want to use this?! And therein lies the crux of the issue: I’m a DBA…this isn’t a solution meant for me.

It wasn’t until having some conversations with folks at this year’s PASS Summit that the whole use case, and my understanding, of WASD really clicked into place. After attending Conor Cunningham’s (Blog) pre-con on the Azure Data Platform, attending Grant Fritchey’s (@GFritchey | Blog) sessions, and having conversations with Lara Rubbelke (@sqlgal | blog) and Eli Weinstock-Herman (@Tarwn | blog), amongst others, I came to a realization about PaaS: it’s not meant for me, so I really shouldn’t be bothered by the fact that it can’t do X, Y, Z. Just because it has the SQL Server label on it doesn’t automatically mean I, the DBA, need to own it! “But Jorge, in my shop if it says SQL on it I end up supporting it anyways!” Well that’s okay, because with PaaS the administrative side of things is handled (for the most part) by Microsoft. These tasks include backups, hosting/provisioning of servers, and day-to-day administration of the infrastructure that supports the solution.

Long story short, this is a solution aimed at developers. They just want a relational data store to develop against without the headache of waiting for someone to provision them an environment, nothing more. Think this isn’t happening today with devs? Check out this Twitter conversation I had with Scott Hanselman (@shanselman | blog) recently:

Scott not only uses WASD for development purposes, he wasn’t even sure what I was talking about when I asked him if he used WASD; that’s how transparent they’ve made it for developers. The conversation came out of my discovery that not all administrative pieces of WASD had been ported over from Silverlight to HTML5 yet. He didn’t know because, as a developer, that’s something he never had to deal with or care about. In the words of Martha Stewart, “it’s a good thing”.

OH NOES, CLOUD HAZ TAKEN MAH JOBZ!

Don’t worry, dear reader (totally ripped that from @sqlballs), not all is lost, and no, your job as a DBA isn’t going anywhere. If anything, your job stays intact and it’s going to evolve. Evolution is good; ask Charles Xavier. If anything, the rise of cloud technology not only cements your role in the company but will actually upgrade you a bit, as you evolve into more of an architect role. Still like staying close to the technology? It’s still there and not going anywhere. We still have our on-premises options. Not only that, we have pretty cool cloud options that are made to work hand-in-hand with our on-premises environments. Which brings me to my favorite option…

Windows Azure Virtual Machines FTW

I love virtual machines. I especially love Windows Azure Virtual Machines (WAVM). Not only do they keep my job intact, in that I’m still doing everything a DBA does today in administering and using full SQL Server in an operating system, but they also make my job a hell of a lot easier in some respects.

One of the coolest things about WAVM is that Microsoft provides you with a nice selection of pre-built template virtual machines to choose from. SQL Server 2008 R2 + SP1 on Windows Server 2008 R2 + SP1? It’s there. SQL Server 2014 CTP 2 on Windows Server 2012 R2? Only a few clicks away. Not only that, you can fully customize these virtual machines’ resources, such as the number of CPUs, how much memory is allocated, and disk space. Self-service disk space is probably the best news anyone who has had to beg a SAN admin for storage has ever heard. You also get the benefit of applying high availability options as well as backup protection options in a few clicks.

So if it’s just a virtual machine, just like you have today in your datacenter, what’s the big deal? Well, there are a few things. I just mentioned that self-service ability. Unless your enterprise has invested in a full-blown private cloud solution, you probably don’t have anything like that available to you. Today you’re putting in a request, or opening a ticket, outlining what you want and writing up justifications for it. Then you get to wait for the network team, SAN team, sysadmins, and DBAs to all do their part in setting up the machine before finally turning it over to you.

Fantastic…What’s The Catch?

I know, I’m painting a rosy, unicorn-laden picture. Well, the fact is there are certainly some things about WAVM you need to consider. First, it’s not connected to your network. Not a problem…maybe. There are ways to extend your network out to the cloud through Windows Azure Virtual Network. If you extend your network out to Azure, you can also stand up a domain controller out there, so any virtual machines you spin up look and feel just like any other server on your corporate network.

Okay, then what about reliability? Each component of Azure offers its own SLA, which you can see here. As of the time of this article the stated SLA for the virtual network is 99.9%, and other cloud services (virtual machines with availability sets) are at least 99.95%. Do you get that sort of SLA at work today? You might. Now compare what you’d pay for that level of reliability and service using Azure versus what your company paid to set up the infrastructure and staff to offer its current level of reliability.

What’s security like? Well, I’ll be blogging and presenting more on Azure security this coming year, but for purposes of this post I’ll condense it: it’s as good as you make it. Just like your current environment. Again, because we’re talking virtual machines, it’s all the same as what you’re doing today inside your data center. In fact, I would bet that most of you currently work in places where your company’s datacenter is actually located outside your company and hosted by someone else (e.g. colo sites). In these massive datacenters you have racks and racks of servers and equipment that are bought and paid for by customers of the host but are physically located side by side. Azure is also a co-located situation, but you have a little more dynamic control over where components of your solution are located.

Okay, so we have our virtual machines “hanging out” in public for anyone to get to then? Not exactly. The virtual networks you configure, by default, essentially have their tin foil hats on and are not open to the world. For the portions you do open up, you have to explicitly grant access through the firewalls in place. How about that data in storage? Again, how much do you secure it today? If you leave it unencrypted, at rest, in your data center today then you’re potentially exposing it to theft as well, so technically this risk exists in both worlds. In the end, with security, there comes a point where it’s simply a matter of trust. Trust Microsoft to secure their data centers. Trust yourself to do your job correctly and secure what needs to be secured. This last point brings me to my final epiphany about the cloud, thanks to Grant Fritchey/Tom LaRock (@SQLRockstar | blog) for this one…

The Cloud Forces You to Do It Right

This goes for both PaaS (especially) and IaaS. One of the best things I heard at Summit this year was Grant ranting on how WASD forces you to code correctly. Write code that causes the log to blow out and it kills your session? Well, write it correctly to avoid that. Network glitches can and will occur. Have you written retry connection logic into your application? I guarantee you will now.
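Real connection-retry logic lives in your application’s data-access layer, but the shape of the pattern can be sketched in T-SQL using a known-transient, retryable error like a deadlock (error 1205). The stored procedure name below is hypothetical, just a stand-in for your actual workload (requires SQL Server 2012+ for THROW):

[code lang="sql"]
DECLARE @retries int = 0, @maxRetries int = 3;

WHILE @retries < @maxRetries
BEGIN
    BEGIN TRY
        EXEC dbo.usp_DoWork; -- hypothetical stand-in for your workload
        BREAK;               -- success, stop retrying
    END TRY
    BEGIN CATCH
        -- Only retry errors known to be transient (1205 = deadlock victim);
        -- re-throw anything else
        IF ERROR_NUMBER() <> 1205 THROW;
        SET @retries += 1;
        WAITFOR DELAY '00:00:05'; -- back off before the next attempt
    END CATCH
END
[/code]

The key design point is the same whether it’s here or in client code: retry only errors you know are transient, cap the attempts, and back off between them.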

Like it or not, we’re seeing a fundamental shift in how computing solutions are offered and used. We’re seeing the consumerization of IT (I hate myself for typing that marketing buzz phrase), where end users expect the freedom to pick and choose their solutions and don’t want to wait for the black hole that IT can be to respond to their needs. They will discover solutions like Azure, see how fast they can do stuff on their own, and potentially get themselves in a bind. Instead of coming off as the stodgy group that doesn’t want to help, embrace these solutions yourself and offer them up with guidance. In the end it’ll be a win-win for everyone.

How do you feel about this whole thing? If you didn’t write your own post this month I’d love to hear your thoughts in the comments below.


T-SQL Tuesday #48 – Cloud Atlas

TSQL2sDay Logo
Welcome to this month’s (November 2013) edition of T-SQL Tuesday. For those not familiar, this is a rotating blog party that was started by Adam Machanic (@AdamMachanic | blog) back in 2009. Want to catch up on all the fun to date? Check out this nice archive (link) put together by Steve Jones (@way0utwest | blog). Thank you Steve!!!

Cloud: What’s Your Take?

Cloud. It’s the juggernaut buzzword in IT for the last couple of years now. By now you’ve surely been exposed to some aspect of it: Azure Virtual Machines, Windows Azure SQL Databases, Amazon EC2, Rackspace, etc. At this point in the game the cloud solutions are fairly mature and constantly evolving to better serve their customer base.

This month’s topic is all about the cloud. What’s your take on it? Have you used it? If so, let’s hear your experiences. Haven’t used it? Let’s hear why not. Do you like/dislike recent changes made to cloud services? It’s clear skies for writing! So let’s hear it, folks: where do you stand with the cloud?

Rules

  • Your post must be published between 00:00 GMT Tuesday, November 12th, 2013, and 00:00 GMT Wednesday, November 13th, 2013.
  • Your post must contain the T-SQL Tuesday logo from above, and the image should link back to this blog post.
  • Trackbacks should work, but if you don’t see one, please link to your post in the comments section below so everyone can see your work.

For the Horde! (Read also: letting everyone know about TSQL2sDay)

  • Include a reference to T-SQL Tuesday in the title of your post.
  • Tweet about your post using the hash tag #TSQL2sDay.
  • Volunteer to host a future T-SQL Tuesday. Adam Machanic keeps the list.

SQL University Lecture Series: Women in Tech

This week SQL University is on Spring Break, but we’ve lined up some activities to help keep students busy (you know what they say about idle hands and whatnot!). In continuation of Women’s History Month, and coming right off the heels of the 24 Hours of PASS event, we’re proud to have the next talks in our Lecture Series this Friday at 1pm EST, featuring the ladies of WIT Week here at SQL University.

Our Live Lecture will be happening over at SQLLunch, and as always it’s free, so make sure you go register for the event. If you enjoyed reading all of the fantastic posts from WIT week you’ll love this event. The session is going to be a round table discussion about what WIT means to the panelists, as well as some of the issues they’ve faced and would like to address in the field. Audience participation is encouraged via Q&A in LiveMeeting, so come join the panel. See you at the SQLLunch!


24 Hours of PASS: Day 1

Yesterday was the first day of PASS’ 24 Hours of PASS event. For those not familiar, 24 Hours of PASS is an event that brings together 24 different presenters covering various SQL Server topics, ranging from performance tuning and internals to business intelligence and previews of the vNext of SQL Server. This month’s event is quite special: since March is Women’s History Month, PASS is celebrating it by having the event delivered entirely by women!

So far the event has been absolutely awesome, and the awesomeness continues today with the last 12 hours of the event, starting at 8am EST. If you missed yesterday, don’t fret: all of the sessions are being recorded and will be available on the PASS website within a month. Yesterday’s sessions went well; we had some sessions with over 750 attendees (or, as Tom LaRock refers to it, the Jetliner line)! There were a few surprises as well, such as Isabel de la Barra’s session, where we were treated to a presentation in Spanish (translated by moderator Jesus Gil). At first we thought it was going to be a big issue, but it turns out that over 300 attendees stuck around for the session, and feedback from the Twitter stream seemed positive.

Speaking of Twitter, if you wish to follow along with the event you can do so by following the event hash tag #24HOP. We are also using #sqlpass as well as #passwit to help promote and discuss the event. Day one is in the books and day two is looking to be fantastic as well; see you in the sessions!


Oh Yes It’s Ladies Ni…errr…Month

Making WIT awesome since 1993

It’s March, and this month sees a lot of celebration; amongst the parties are Mardi Gras and St. Patrick’s Day. We also celebrate women this month, as it is Women’s History Month! If you’ve spent any time in the SQL community you may have noticed that we have especially strong support for Women in Technology, and so this month there are some great things happening to help celebrate that fact.

I was going to do a rundown of all the cool WIT stuff going on this month but it looks like Wendy Pastrick (Blog | Twitter) beat me to the punch! Check out her post about all the events going on this month, including next week’s SQL University WIT week which will feature our second Live Lecture Series via SQLLunch.com. We’re having the ladies of SQLU WIT week hosting a WIT panel so make sure to join us for that!

Update: Also, for those on Twitter, check out my WIT list: http://twitter.com/#!/list/SQLChicken/women-in-technology

And since the title of this post probably got the song in your head, as a bonus, here’s the video: http://www.youtube.com/watch?v=J1oU9_hy3mA


Pragmatic Transition: Lighthouses and Shipwrecks

Sometimes your mistakes are the greatest lessons...

This is the next post in my series about transitioning from a DBA to a BI consultant for Pragmatic Works. This post is a particularly sensitive one, as it pertains to a lesson I had to learn the hard way. My hope is that by writing and publishing this, maybe you can spare yourself or someone else from making the same mistakes. This post is here to teach one thing: sometimes you’re a lighthouse, shining your light and showing people the way to safety. The lighthouse is steady and helps others through with a clear message and action. The other role is the shipwreck. Sometimes seeing the wrecks on the rocks warns others about what NOT to do in a given situation. Throughout your life you will probably play both roles many times. For me, in this particular situation, I’m playing the role of the shipwreck.

Before I begin, let me quickly set the stage for my current position in life. For the last few years I’ve been a SQL Server DBA in shops where I was pretty much the only one. Due to this, along with very lenient bosses, I was allowed to leverage social networking on a daily basis. If you follow me on Twitter then you know I tend to tweet more than any human being should. I’ve come to think of the network of fellow SQL professionals on Twitter as my extended DBA team. I would consume tons of knowledge via conversations, monitoring (and participating in) the #sqlhelp channel, reading blog posts, and checking out all the various webcasts and events. This was before taking on the role of a consultant.

As a consultant you have to remember one thing: you’re no longer on YOUR time, you’re on your CLIENT’S time. When someone hires you, the expectation is that you’re there to do a job and focus on that job. When you deviate from that, especially on a public platform like social networking sites, the perception is that you’re using up their time. And by using up their time, I mean wasting it. I may be working hard on whatever client work I’m doing while tweeting throughout the day, but the perception is that I’m not really working and my focus isn’t where it should be. Even if I scheduled every single tweet throughout the day the perception is still the same, and this is the key: perception is reality. That being the case, the “reality” I was broadcasting by tweeting all the time (as a consultant) was that I was not busy, not focused, and to some extent did not care about my client. While none of these are true, the fact is I should’ve been more cognizant of the perception I put out to the public, and for that I apologize to the community as a whole.

So now what do we do? Well, we move forward and learn! I now understand a little better what’s expected of me in my new role. The beauty of mistakes is that they give us a chance to learn from them. The important part of mistakes is that you DO learn from them and, most importantly, MOVE ON! Mistakes happen. Not only do they happen, they happen to everyone. What matters is how you deal with them and move forward. A really great example of a shipwreck-turned-lighthouse is a recent situation with Brent Ozar (Blog | Twitter) and his business partnership at SQLskills. You can read the saga here, here and here. Brent’s public dealing with his situation also helped inspire this post. He took what could have been perceived as a terrible situation and turned it around into a fantastic learning opportunity for anyone looking to pursue a similar partnership in the future. He turned a shipwreck into a lighthouse!

Just remember if you make a mistake that it’s okay. Stuff happens. It’s how we deal with those mistakes that matters in the end. How about you? Have you had a shipwreck/lighthouse moment? Share your stories in the comments!


BIxPress 3.0: DBAs Welcome!

Much like the USA Network here in the States welcomes characters, I’d like to formally let the world know that BIxPress also welcomes folks, and this time it’s looking at you DBAs out there!

You may be thinking, “But Jorge, the product is called BIxPress; why, as a DBA, would I give a flip about it?!” Glad you asked! I’ve recently made the transition from a DBA to a BI consultant, and as part of my process of learning the BI stack I decided to take a crack at creating an SSIS package that would take a bunch of video files from a conference, compare the file names to the actual session titles (the files came down named with their session codes, not names), and rename the files according to their formal session titles. If you’re interested in that, I’ll be posting another blog post soon detailing how I did it, and you’ll be able to download the package yourself and try it out!


SQL University: State of the Union

The new SQL University Logo

Well the new year is here and SQL University is back and better than ever! I just wanted to take a minute to bring everyone up to speed on what’s going on with SQLU.

First off, our last semester (Spring 2010) started rather late, which pushed the rest of the schedule back quite a bit. One of the unique facets of SQL University is having our coach, Tom LaRock (Blog | Twitter), posting on EVERY topic, EVERY week, which is quite the impressive feat! Since the last semester ran a few weeks late, it made a lot of extra work for him, so I wanted to give him ample time off; that is a TON of writing he’s doing, which I think we can all agree is pure awesomesauce. Tom has also undergone a job transition, as have I, so it’s been a little hectic on that front as well. Due to the schedule shifts, job changes, moves, and generally hectic life, we decided to skip the Fall semester for 2010, hence you’ll find it missing from the overall SQLU main page.

Another reason we went quiet for a while was that we were busy putting together another major project: SQL University – The Book! No, sorry, no movie deals in the works, but I think you guys will like this even better. What we’re doing is compiling all of this awesome material our professors have put together for you into an organized e-book companion! My hope is that we can get it formatted properly for distribution via Amazon’s Kindle store, but if that doesn’t work out we’ll probably just PDF it and let you guys go to town! As with the rest of this wonderful project, this is absolutely free to everyone and will be released as SQL University: Volume I, Freshman Year, which includes the first two semesters’ worth of blog content! This is taking a lot of time to put together, so bear with us as we get it worked on.

Another big change you may have noticed, and one of the most exciting parts about this new year, is our re-branding! Our new logo comes courtesy of the wonderful folks at Revealed Design Inc. (Facebook | Twitter); a big thanks to Aaron Nelson (Blog | Twitter) for hooking me up with them. This is a much cleaner design and look than my atrocious attempt at designing a blogger badge before. When you visit each professor’s site this time around you should see the new badges displayed.

Finally, the other huge addition this year is our partnership with SQLLunch.com to bring you the live lecture series. We had our first one, featuring Josef Richberg, during SSIS week. We’ll be bringing you more this semester with some big names, so stay tuned! The best way to keep up to date on all the latest news and additions to SQL University is to join our newsletter.

One More Thing…

If you’re enjoying SQL University and learning from all of these great folks in the SQL community, you’ll get a chance to experience all of this in person! This spring at SQLRally we’ll be hosting a Lightning Talk session featuring the professors of SQLU as well as some other special surprise guests. SQLRally runs from May 11-13 in Orlando, Florida and costs only $299, plus $199 for the optional pre-conference sessions. Hope to see you there!


Policy for Ad-hoc Workloads

During my presentation at SQLSaturday 62 in Tampa I was asked by an attendee about having a policy to check the ad-hoc optimization setting. At the time, since I was in a bit of a time crunch (and I couldn’t remember the exact facet to look under), I couldn’t properly demo how to check for it. In this post I’ll show you how to check for that specific setting. In a future post I’ll show you how to check many more settings.

Before we begin, I highly recommend you familiarize yourself with what exactly this setting changes and how it affects your SQL Server environment. Remember, this setting affects the entire instance, so all databases on it will be affected by the change. Read this great post by Bob Pusateri (Blog | Twitter) to get an understanding of what Optimize for Ad Hoc Workloads really does.
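If you just want to eyeball the current value before building a policy around it, a quick query against sys.configurations does the trick:

[code lang="sql"]
-- value_in_use = 1 means Optimize for Ad Hoc Workloads is enabled (0 is the default)
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = N'optimize for ad hoc workloads';
[/code]

The advantage of a policy over this one-off query, of course, is that you can evaluate it across every instance registered in your Central Management Server at once.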

Creating the Policy/Condition

  • In SQL Server Management Studio browse down to and expand your management node, expand the Policy-Based Management node, right-click the Policies folder and select New Policy.
  • Name your new policy and then from the Check Conditions drop down menu select New Condition.
  • Give your new condition a name and from the Facet drop down menu select the Server Configuration facet.
  • In the Expression editor, click the area below the Field column title and you will be presented with a drop-down of all the properties available for this facet. Select @OptimizeAdhocWorkloads.

Creating our new condition

  • Under the Value heading you will have two options: True or False. When you create a policy you’re establishing the condition you want, so for the purposes of this demonstration we want our servers to have this setting off (which is the default), so we’ll select FALSE. Click OK to create your condition and return to the new policy window.
  • Next we’ll select our Evaluation Mode. This policy, based on the facet and properties we’ve selected, offers us three options: On Demand, On Schedule, and On Change: Log Only. The last option, if enabled, will allow this policy to be active and log any changes made to this particular setting. One cool thing you can do with this is create alerts to automatically email you if this particular condition is violated. Check out Ken Simmons’ (Blog | Twitter) article on Configuring Alerts for Policy-Based Management to learn more. Leave the Evaluation Mode set to On Demand and click OK.

Now that we have our policy created simply right-click on it (located under your Policies folder) and select Evaluate to try it out!
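Once you’ve evaluated the policy a few times, you can also review past evaluation results from the Policy-Based Management system views in msdb. A quick sketch, assuming the policy name used in this post:

[code lang="sql"]
-- Most recent evaluation results for our policy
SELECT p.name, h.start_date, h.end_date, h.result
FROM msdb.dbo.syspolicy_policy_execution_history AS h
JOIN msdb.dbo.syspolicy_policies AS p
    ON p.policy_id = h.policy_id
WHERE p.name = N'Ad-hoc Workload Check'
ORDER BY h.start_date DESC;
[/code]

These same views are what reporting tools build on to show policy health across an environment over time.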

GUI? We Don’t Need No Stinking GUI!

In this post I walked you through creating this policy using the GUI, but if you prefer to script it out, you can do that too! Here is the T-SQL script that you can run, in lieu of walking through the SSMS screens, to create this particular policy:

[code lang="sql" wraplines="true"]
-- NOTE: the condition 'adhoc optimization check' referenced below must already
-- exist (created via the GUI steps above) before this script will succeed.

-- Create the object set that ties the policy to the Server Configuration facet
DECLARE @object_set_id int
EXEC msdb.dbo.sp_syspolicy_add_object_set @object_set_name=N'Ad-hoc Workload Check_ObjectSet', @facet=N'IServerConfigurationFacet', @object_set_id=@object_set_id OUTPUT
SELECT @object_set_id

-- Target the entire server
DECLARE @target_set_id int
EXEC msdb.dbo.sp_syspolicy_add_target_set @object_set_name=N'Ad-hoc Workload Check_ObjectSet', @type_skeleton=N'Server', @type=N'SERVER', @enabled=True, @target_set_id=@target_set_id OUTPUT
SELECT @target_set_id
GO

-- Create the policy itself (On Demand evaluation: @execution_mode=0)
DECLARE @policy_id int
EXEC msdb.dbo.sp_syspolicy_add_policy @name=N'Ad-hoc Workload Check', @condition_name=N'adhoc optimization check', @policy_category=N'', @description=N'This policy checks the server setting to see if Optimize for Ad-Hoc Workload is enabled. The default setting is disabled.', @help_text=N'To learn more about this policy check out Jorge Segarra''s blog post on this', @help_link=N'http://sqlchicken.com/2011/01/policy-for-ad-hoc-workloads/', @schedule_uid=N'00000000-0000-0000-0000-000000000000', @execution_mode=0, @is_enabled=False, @policy_id=@policy_id OUTPUT, @root_condition_name=N'', @object_set=N'Ad-hoc Workload Check_ObjectSet'
SELECT @policy_id
GO[/code]

Conclusion

Again, I can’t reiterate enough: do NOT blindly go changing settings on your servers without understanding the effects of your actions! Policy-Based Management is a very powerful and easy-to-use tool, but be sure to use it wisely! In a later post I will show you how to modify even more server-level settings and how to customize policies to check exactly the settings you want to audit.


EPM Framework and SQL 2008 R2

This weekend at SQLSaturday 62 in Tampa, I presented my policy-based management presentation. One of the cool things I cover is how policy-based management can be extended, utilizing Reporting Services and PowerShell, through the use of an amazing tool called the Enterprise Policy Management Framework, available on CodePlex.

The Enterprise Policy Management Framework, or EPMF, is completely free and was developed by the folks at Microsoft who created policy-based management. I absolutely love telling folks about this project because it really helps sell the idea of policy-based management’s application within an organization. What’s cool about this project is that the built-in reports make it easy to see the health state of your environment at a glance, and they let you drill down further into each report piece to find more granular information on policy states.

One caveat of EPMF is that, in order to run on SQL Server 2008, it requires SP1 Cumulative Update 3 or higher installed on your Central Management Server. This requirement is in place so that EPMF can properly handle policy evaluation on down-level systems (e.g. SQL Server 2000 and 2005). An interesting question was asked during the presentation: “Does EPMF support SQL Server 2008 R2 RTM (10.50.1600)?” The answer is YES, it does!

I tested this on my local install of SQL Server 2008 R2 at the RTM level and it works. Even though it works at RTM, I highly recommend you update your SQL Server 2008 R2 instance to at least Cumulative Update 3. I know, you’re thinking “but you just told me it works at RTM!” Yes, it does; however, the RTM edition of R2 came with quite a nasty little bug that wasn’t fixed until the CU3 patch. The bug, outlined in this Connect issue by Aaron Bertrand (Blog | Twitter), is that SSMS will not let you edit or create a job step after you’ve created the initial one. How does this affect you? When you set up EPMF you need to create a new scheduled job that executes the PowerShell script that evaluates the policies against your environment. This particular bug will stop you from editing or creating new job steps, which could severely hamper you when trying to fix things. There is a workaround, wherein you close and reopen SSMS to make the error disappear, but this can become quite cumbersome very quickly.

Policy-based management is an extremely powerful, easy-to-use feature of SQL Server 2008, and the EPM Framework extends its awesomeness even further. If you'd like to learn more about policy-based management, check out the webinars I've done over at Pragmatic Works (webinar link) or at SQLLunch (webinar link) on the topic.
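If you want to poke at policy results yourself, the policy metadata and evaluation history that EPMF reports on live in msdb system views. A rough sketch along these lines should show the most recent evaluations per policy (I'm going from memory on the column names, so verify them against your instance before relying on this):

```sql
-- Recent policy evaluation results from the msdb system views.
SELECT p.name AS policy_name,
       h.start_date,
       h.result
FROM msdb.dbo.syspolicy_policies AS p
JOIN msdb.dbo.syspolicy_policy_execution_history AS h
    ON h.policy_id = p.policy_id
ORDER BY h.start_date DESC;
```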

SQL University: Spring 2011

The new SQL University Logo

Welcome back students! We're very excited to start a new semester, and SQLU is back and better than ever! This semester we're lucky to have 7 MVPs, 7 Women in Tech (our most ever!), and a Microsoft Certified Master (MCM) in SQL Server presenting topics. In addition to our regular lessons we have the SQL Rockstar himself, Tom LaRock (Blog | Twitter), hosting weekly DBA Coaching lessons on his blog. Our staff is also hard at work putting together an e-book compilation of the first two semesters' worth of content, which we're calling SQL University Vol 1: Freshman Year. As soon as we finish putting it together we'll announce it via the mailing list and on Twitter (follow us @sqluniversity). What mailing list, you ask? If you want to make sure you get all the latest SQLU news and updates, please sign up for our mailing list here.


SQLSaturday #62: Tampa

We’re 10 short days away from SQLSaturday #62 and I just wanted to remind everyone about some of the amazing stuff that will be happening that weekend!

First off, we have an incredible deal on a pre-con we call the Day of Data: two all-day training options at the incredible price of $99 (after today, 1/5, the price jumps to $109)! For the DBAs we have Denny Cherry (Blog | Twitter) presenting Storage and Virtualization for the DBA. For the BI-focused we have Stacia Misner (Blog | Twitter) presenting a Day of BI. This price includes coffee, juice and donuts, lunch, and course materials. To register click here, and make sure to share this with co-workers and your boss! I guarantee the ROI on this training will be off the charts!

As for the main event we have an AMAZING lineup of speakers for this free (yes, I said FREE) training event. Check out the schedule (time/rooms subject to change):

| Time | Cafeteria | Room A | Room B | Room C |
|---|---|---|---|---|
| 8:30 - 9:30 | Introduction to SSIS | Efficient Datawarehouse Design | How SQL saved my Business Intelligence Platform | DBA Repository Update 2010 Using SSIS and SSRS |
| 9:45 - 10:45 | SSIS Cafeteria | DBA 101 | Developing Date and Role-Playing Dimensions | Implementing auditing in SQL Server |
| 11:00 - 12:00 | Cool Tricks to Pull from your SSIS Hat | Why I Use Stored Procedures | Introduction to PowerPivot for Excel | SQL Server Auditing 101 |
| 12:15 - 1:15 | Accelerating BI Development with BI xPress | | | |
| 1:30 - 2:30 | Do You Know the Data Flow? | Zen and the Art of Writing SQL Query | Indexing for performance | Reporting Services 2008 |
| 2:45 - 3:45 | SQL Smackdown: SSIS vs. PowerShell | Page and Row Compression: How, When, and Why | SQL Server 2008 R2 Parallel Data Warehouse | Revive the code: refactoring for performance |
| 4:00 - 5:00 | Iron Chef SQL Server | Troubleshooting with the SQL Server 2008 DC & MDW | Bad SQL | SSIS and SSRS Better Together |

| Time | Room D | Room E | Room F | Cantina |
|---|---|---|---|---|
| 8:30 - 9:30 | Why Learn PowerShell? | Policy-Based Management in a Nutshell | To click or to type, that is the question | |
| 9:45 - 10:45 | SQL Server PowerShell Extensions (SQLPSX) | Become a Bilingual DBA! Oracle for the SQL Server | Sql Server Service Broker – An Overview | |
| 11:00 - 12:00 | Windows PowerShell 2.0 Best Practices for DBA’s | Introduction to Transactional Replication | ITIL V3 for the Database Administrator | |
| 12:15 - 1:15 | Lunch is served | | | |
| 1:30 - 2:30 | You inherited a database Now What? | MDX 201 | Find Performance Problems by Reading the Waits | WIT Discussion |
| 2:45 - 3:45 | Where should I be encrypting my data | SQL Server Memory Deep Dive | Spatial Data in SQL 2008 and Bing | |
| 4:00 - 5:00 | DR Availability, You’re Wanted in the Recovery Room | SSIS Data Flow Buffer Breakdown | Creating a Metadata Mart w/ SSIS – Data | |

And I guess it’s worth mentioning I’ll be there presenting my Policy-Based Management in a Nutshell talk so if you come to the event swing by and say hi (even stay for my session if you’d like!). So grab yo kids, grab yo wife, grab yo coworkers and get to SQLSaturday cuz everyone’s learning up in there!*

*I apologize for the horrendous addition of an internet meme to my post
