ThankSQLgiving

The past twelve months have been fantastic for me, both personally and professionally.  Heck, I even finished remodeling my house.  I embraced each positive experience and fed it into the next one down the line.  I would never have had such a good year if it were not for the wonderful people I met along the way.

Thanksgiving is as apropos as can be.

PASS Summit

Last year, I was fortunate enough to go to the PASS Summit.  PASS, as it is colloquially known, is a worldwide conference of SQL professionals.  It is jam-packed with speakers, technical sessions, networking, parties, and hot tubs.  Although not necessarily all at the same time.

I attended tons of great sessions, many being given by the same bloggers I had been reading; one of whom was Trevor Barkhouse (blog | twitter).  On the meet and greet day, I met Wes Brown (blog | twitter), who runs the local CACTUSS group in Austin, and he mentioned the SQL Saturday conferences.

I left Seattle absolutely loving the city, ready to move there, and recharged about my career with SQL Server.  I have been working with technology for over fifteen years, but I have never experienced the camaraderie that I felt with the SQL community.

SQL Saturday

After learning about SQL Saturday, I started looking up when and where the next one was going to be.  SQL Saturday is an almost-free mini-PASS conference put on by your fellow DBAs in the field.  It is on a rotating schedule that moves from town to town, and even to other countries.

The first SQL Saturday I attended was in Houston, where I saw a session listed by Trevor Barkhouse.  I recognized his name from PASS, so I attended his session, asked a bunch of weird questions, and won a book at the end: SQL Server 2008 Internals and Troubleshooting, which I proceeded to read furiously…

After that, I attended the Dallas SQL Saturday and met up with Trevor and others again.  The SQL community was really beginning to feel like a family.

SQLskills

To say that it was a blessing to attend the SQLskills classes this year would be an understatement.  Paul Randal (blog | twitter), Kimberly Tripp (blog | twitter), Bob Beauchemin (blog | twitter),  and Jonathan Kehayias (blog | twitter) offer, hands down, the best training I have ever had.  Period.  Forget about SQL training, technical training, or any other training.  I have never been so challenged, felt so dumb, and felt so smart; all at the same time.

As a special treat, I was able to attend a class where Brent Ozar (blog | twitter) was one of the additional instructors.  His blog was one of the first I read when I was starting out as a DBA.

If you can only do one thing as a DBA next year, attend a SQLskills class.  You will come away with a sore brain, a list of projects to implement in your shop, and a ton of great scripts and demos to review later.

Grandma’s 99th Birthday & Summer Mummers

This summer, my grandmother turned ninety-nine.  So, my cousin organized a secret family reunion to surprise her.  This was a fantastic event, and we had people come out of the woodwork for it.  It was great to see her and other family members I hadn’t seen in a while.  It amazes me how lucid she still is; I hope some of her genes rub off on me.

While in Midland, I also got to check out Summer Mummers.  Summer Mummers is a gin-u-wine Wild West Vaudeville Show.  It is an incredible experience and well worth the drive through West Texas to see it.  In fact, people travel from other countries every year to see it.

If you love theatre, do yourself a favor and go see this show at least once in your life.  Once you do, you will wonder how you ever lived before.

SQL Saturday Austin

With some encouragement from Wes Brown (blog | twitter) and Trevor Barkhouse (blog | twitter), I went for broke and submitted a session for SQL Saturday #97 in Austin.  I did a trial run of my presentation at the local CACTUSS user group, and even though I went way over time, I got plenty of good feedback.  So I edited and cut my way down to 60 minutes, re-worked my demos, and went for it.

SQL Saturday Austin was like a class reunion for me.  I saw several of my friends from SQLskills class: Trevor Barkhouse (blog | twitter), Vicky Harp (twitter), and Rudy Rodarte (blog | twitter | youtube).  All of us as presenters!

By taking the leap of faith and becoming a speaker, I met tons of cool folks and made some good, new friends.  Doing this really charged my batteries and primed me for the next great chapter in my career development.

Employment

November brought on another change; after much consideration, I have decided to move on from Whole Foods Market.  During the past four and a half years, I have had a wonderful tenure.  I have made some great friends, worked on some great projects, and gained tons of valuable experience.  I feel very fortunate to have worked there, and am grateful for all the opportunities they have afforded me.  Between data center migrations, multi-instance clustering and consolidation, and some interesting disaster recovery situations, I feel I could not have had a better training ground as a Database Administrator.  And the location’s not bad, either.  It’s hard to beat 6th & Lamar; downtown Austin at its finest…

In a couple of weeks, I will begin the next chapter of my career in the healthcare industry with a global provider of orthotics and prosthetics.  I am excited to be working with an organization that does so much to improve the lives of people all around the world.  I will continue my work with clustering and Denali.  While I will no longer be downtown, the Domain is a pretty cool location, as well.  I’m sure I will find some cool places to explore…

SQLskills Mentoring

Last week, I pretty much won the lottery.  Around mid-week, Jonathan Kehayias (blog | twitter) reached out to tell me about a new mentoring program that SQLskills is starting and, oh by the way, would I be interested?  They say there is no such thing as a dumb question, but I’m not so sure…  🙂

Needless to say, I was interested; and floored, honored, humbled, etc.  I am truly touched that Jonathan and Paul reached out to me and are willing to take the time to guide me along my personal and professional goals for the coming year.  My hope is that I will be able to pay this forward to someone else down the line.

So my next mission is to sort out what exactly are my goals for two thousand and twelve.  I guess I’m not gonna worry about that Mayan Calendar thing any more.  But I do think December 20th, 2012 will be a good day to buy some stocks…  🙂

Everything’s Coming up Milhouse

When I look back upon this great year, it amazes me how things have developed.  If you don’t think your life is going as well as you would like, you DO have the power to change things.  It takes a lot of work and is not easy, but you can create your own luck and make the life that you would like to have.  You will not always get everything you want, but as Wayne Gretzky said, ‘You miss one hundred percent of the shots you don’t take.’

Just go for it.

Heaps of Heaps

Recently, I was looking into a stored procedure that was running poorly.  As part of my analysis, I examined the underlying tables and their indexes.  By using Kimberly Tripp’s (blog | twitter) rewrite of sp_helpindex, I was able to quickly determine that one of the tables was a heap.

Heaps

A Heap is a table without a Clustered Index.  This is bad for a few reasons:

  • When you update a Heap, rows can be moved and leave forwarded records behind, which adds extra I/O to reads (a quick check for these follows this list)
  • Every nonclustered index on a Heap must store an 8-byte RID (row identifier) as its row locator, rather than a clustering key
  • Insert performance can suffer, since SQL Server has to hunt through allocation pages for free space on every insert
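
If you suspect a Heap is suffering from forwarded records, sys.dm_db_index_physical_stats will report a forwarded_record_count when run in DETAILED mode.  Here is a minimal sketch; dbo.YourHeapTable is just a placeholder, and DETAILED mode reads every page of the table, so be mindful when pointing it at a large one.


-- check a heap for forwarded records (index_id 0 = the heap itself)
-- dbo.YourHeapTable is a placeholder; DETAILED mode reads every page
select OBJECT_NAME(ips.object_id) as 'Table Name',
	ips.forwarded_record_count,
	ips.avg_fragmentation_in_percent
from sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.YourHeapTable'), 0, null, 'DETAILED') as ips;
go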

How Many Heaps in a Pile

Like most things in nature, when there is one, there is another, and another, and another…

Knowing this, I set about to discover how many other tables in this database were heaps.

First, let’s look at our overall count of heaps in relation to the total table count.


-- percentage of heaps
declare @heapCount numeric;
declare @tableCount numeric;

select @heapCount = count(t.object_id)
	from sys.tables as t
	join sys.indexes as i
	on t.object_id = i.object_id
	where i.type_desc = 'HEAP';

select @tableCount = COUNT(*) from sys.tables;

select @heapCount as 'Heap Count',
	@tableCount as 'Table Count',
	CAST((@heapCount / @tableCount * 100) as numeric(5, 2)) as 'Percentage of Heaps';
go

Next, let’s enumerate all the tables that are heaps.


-- list all the heaps in the currently selected database
select OBJECT_NAME(t.object_id) as 'Table Name', i.type_desc
from sys.tables as t
join sys.indexes as i
on t.object_id = i.object_id
where i.type_desc = 'HEAP'
order by OBJECT_NAME(t.object_id);
go

Time to Make the Clusters

What I found was startling to say the least.  On this particular server, all three databases contain quite a few heaps.

Obviously, this is going to be where I start my tuning efforts.  After seeing how many heaps existed here, I think it would be wise to make this a standard check when first looking at any unfamiliar database.
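
To make that check easy to repeat, the heap count query can be run against every database on an instance.  Here is a rough sketch that leans on the undocumented sp_MSforeachdb procedure, so treat it as a convenience rather than a guarantee.


-- count the heaps in every database on the instance
-- sp_MSforeachdb is undocumented; results are best-effort
exec sp_MSforeachdb N'
use [?];
select DB_NAME() as ''Database Name'', count(*) as ''Heap Count''
from sys.tables as t
join sys.indexes as i
on t.object_id = i.object_id
where i.type_desc = ''HEAP'';';
go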

Clearing the Plan Cache

While at SQL Saturday #63 in Dallas, I got some questions about clearing the plan cache.  This evolved into an email discussion, which resulted in my writing up some example scripts.  Since I received some good feedback, I thought this would be a good opportunity to turn them into a blog post.

Warning: These scripts should almost never be run on a production system.  These are meant for debugging and testing on DEV.  If you do run these on PROD, be sure to understand the consequences.

I could see running these on PROD in a few limited cases.  For example, if you have a query that is stuck with a bad plan in cache, you can use these scripts to remove that individual plan.  Another possibility would be to clear out the adhoc plans, perhaps just after you have enabled the optimize for adhoc workloads setting.

Examining the Plan Cache

The Plan Cache is where SQL Server stores the Execution Plans for any queries or stored procedures it has run.  The purpose of this is to enable SQL Server to reuse Execution Plans for each subsequent run of the query or stored procedure.  This allows SQL Server to scale up as more requests begin to pile on.

To examine the Plan Cache you can start by querying the sys.dm_exec_cached_plans DMV.


-- look at the plan cache, list all plans
select * from sys.dm_exec_cached_plans;
go

This is not very useful by itself.  In order to make this query a bit more useful, we can link in either the Query Text or the Query Plan by using two dynamic management functions: sys.dm_exec_sql_text and sys.dm_exec_query_plan, respectively.


-- link in the sql text
select decp.*, dest.text
from sys.dm_exec_cached_plans decp
cross apply sys.dm_exec_sql_text(decp.plan_handle) dest;
go

-- link in the query plan
select decp.*, deqp.query_plan
from sys.dm_exec_cached_plans decp
cross apply sys.dm_exec_query_plan(decp.plan_handle) deqp;
go

And if you wish to see both the Query Text and the Query Plan together, you can link them both into the same query.


-- link in both
select decp.*, dest.text, deqp.query_plan
from sys.dm_exec_cached_plans decp
cross apply sys.dm_exec_sql_text(decp.plan_handle) dest
cross apply sys.dm_exec_query_plan(decp.plan_handle) deqp;
go

Clearing the Plan Cache

As I work through clearing the Plan Cache, I’m going to start with the broadest example, and then get more granular as we go along.  If you need to clear the entire Plan Cache, you can do so with a DBCC FREEPROCCACHE command.

I feel the need to issue another warning here, and reiterate that this is almost always a bad idea on a production system.


-- clears everything in the plan cache
dbcc freeproccache;
go

Clearing by Database

If clearing the entire Plan Cache seems a bit excessive, as well it should, then you can get a wee bit more focused by only clearing the plans for a particular database.  This can be achieved by using the DBCC FLUSHPROCINDB command.  This is an undocumented command in SQL 2008; however, you can read about it in the SQL 2005 documentation.

As with clearing the entire cache, this is probably not something you want to do with a production system unless you have a good reason.  You may have one database that is a special case, which needs to be tamed.


-- clear the plan cache for a particular database
declare @v_DBID smallint;

select @v_DBID = DB_ID('AdventureWorks2008R2');

dbcc flushprocindb (@v_DBID);
go

Clearing by Object Type

Next, let’s look at clearing the Plan Cache for a particular type of object.  By using DBCC FREESYSTEMCACHE we can choose to clear out the Adhoc Plans, Stored Procedure Plans, Views, or Extended Stored Procedures.

Most likely, the ones you will be interested in are Adhoc Plans and Stored Procedures.


-- clear plans for a particular type of object

-- clear sql plans: adhoc sql, prepared statements
dbcc freesystemcache('SQL Plans');
go

-- clear object plans: stored procedures, functions, triggers
dbcc freesystemcache('Object Plans');
go

-- clear bound trees: views, constraints, defaults
dbcc freesystemcache('Bound Trees');
go

-- clear extended stored procedures
dbcc freesystemcache('Extended Stored Procedures');
go

Like a broken record, I gotta remind you that it is a bad idea to do this on a production system.

Optimize for Adhoc Workloads

The primary reason you may wish to clear out the Adhoc Plans from your cache is if you have just enabled the SQL Server setting ‘optimize for ad hoc workloads’.  With this setting enabled, SQL Server stores only a small compiled plan stub the first time an ad hoc query runs, and caches the full Execution Plan only if the query is run again.  This helps keep Plan Cache Bloat under control.

You can enable this setting by using sp_configure.


-- show advanced options - enable all options to be seen / set
use [master]
go
exec sp_configure 'show advanced options', 1
go
reconfigure
go
reconfigure with override
go

-- optimize for ad hoc workloads
sp_configure 'optimize for ad hoc workloads', 1
go
reconfigure
go
reconfigure with override
go

-- verify change
sp_configure 'optimize for ad hoc workloads'
go

Once you have enabled this setting, any Single-Use Adhoc Plans that were already cached will remain in cache until either the SQL Server service is restarted, or you explicitly clear the Adhoc Plans by using the code example above.
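
If you want to see how much cache those single-use plans are actually occupying before you decide to clear them, a quick look at sys.dm_exec_cached_plans will tell you; this is a minimal sketch.


-- how many single-use adhoc plans are in cache, and how much memory they hold
select count(*) as 'Single-Use Adhoc Plans',
	SUM(CAST(size_in_bytes as bigint)) / 1024 / 1024 as 'Size (MB)'
from sys.dm_exec_cached_plans
where objtype = 'Adhoc'
	and usecounts = 1;
go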

Resource Governor

If you are using Resource Governor, then you have the option of clearing the Plan Cache for a particular Resource Pool.  Resource Governor allows you to create various Resource Pools and then designate a certain percentage of CPU or Memory to that pool.

In order to get a list of the Resource Pools that are defined on your system, you can query the sys.dm_resource_governor_resource_pools DMV.


-- list the resource governor pools
select distinct name as 'Resource Governor Pools'
from sys.dm_resource_governor_resource_pools;
go

Once you have identified the Resource Pool that needs to have its associated Cached Plans cleared, you can clear the plans by using the DBCC FREEPROCCACHE command.


-- clear the default resource governor pool
dbcc freeproccache ('default');
go

-- clear the internal resource governor pool
dbcc freeproccache ('internal');
go

-- clear a user-defined resource governor pool
dbcc freeproccache ('YOURRESOURCEGOVERNORPOOL');
go

Again, please do not do this on a production system unless you are certain this is what you need.

Clearing a Single Plan

If you have a particular query that is causing you problems, for example one that got compiled with a bad plan, then one option is to remove the offending plan from the cache, which will force a recompile the next time the query runs.  There is no guarantee, however, that the new plan will be any better than the old plan.

You may wish to capture the existing plan so that you can compare it with the new plan.  This way you can explore what the differences are, and focus your tuning efforts there.

To find the query plan for a particular query, we will need a search string: the SQL from your query.  We will place your query inside of mine so we can search the Plan Cache for it.  It is not necessary to use the entire text of the query; that would be a bit cumbersome.  All we need is enough of it to make it unique, or at least unique enough that we can find it in the results.


-- or it may be even better to search by the sql text
-- so that you only see the ones that interest you
-- also link in the query plan so that you can analyze it
select decp.usecounts, decp.size_in_bytes, dest.text, decp.plan_handle, deqp.query_plan
from sys.dm_exec_cached_plans decp
cross apply sys.dm_exec_sql_text(decp.plan_handle) dest
cross apply sys.dm_exec_query_plan(decp.plan_handle) deqp
where dest.text like '%PLACE YOUR SQL CODE SNIPPET HERE%';
go
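
If you want to hang on to the old plan for comparison, as mentioned above, you can save the search results into a table before clearing anything.  This is just a sketch; dbo.PlanCacheSnapshot is a made-up table name, and the plan_handle filter should be the one you found with the query above.


-- optional: stash the plan before removing it, so you can compare it with the new plan
-- dbo.PlanCacheSnapshot is an example name; substitute the plan_handle you found above
select GETDATE() as capture_time, decp.plan_handle, dest.text, deqp.query_plan
into dbo.PlanCacheSnapshot
from sys.dm_exec_cached_plans decp
cross apply sys.dm_exec_sql_text(decp.plan_handle) dest
cross apply sys.dm_exec_query_plan(decp.plan_handle) deqp
where decp.plan_handle = 0x0600010010450926B8E00C07000000000000000000000000;
go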

Once you have identified the offending plan, it is a simple matter to remove it from the cache.  Use the plan_handle with the DBCC FREEPROCCACHE command to clear it from the Plan Cache.


-- once you find the plan you wish to prune
-- place the plan_handle in the following statement
dbcc freeproccache (0x0600010010450926B8E00C07000000000000000000000000);
go

Of all the options, this is the one where I can most readily see a valid case for doing it on Production.  All the same, be careful and understand what you are doing.  It is important to clear only the Query Plan that is needed, and no more.

Conclusion

It is important to keep tabs on what is going on in your Plan Cache.  Being able to clear out all or part of your Plan Cache can be a powerful debugging technique.  Please bear in mind that this is not something that should be done on a Production server without careful consideration of the side effects.

However, playing with this in DEV can help you understand more of what is happening under the covers of SQL Server.