Friday, July 29, 2011
Are Your Database Statistics Fresh?
Thursday, July 14, 2011
Adventures with Denali CTP3–Part 1
One thing that surprised me was that the downloads for the databases included only the MDF (data) file – the log file was not included. After fiddling unsuccessfully with attaching it through the UI in Management Studio (no, I didn’t think of deleting the log file name from the file list in the UI – I’d assumed it was required, and didn’t realize that omitting the log file makes the attach behave like an ATTACH_REBUILD_LOG command), I finally decided it would be sensible to actually read the instructions. TechNet provided a very simple query to attach the database:
CREATE DATABASE AdventureWorks2008R2
ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\AdventureWorks2008R2_Data.mdf')
FOR ATTACH_REBUILD_LOG ;
Worked like a charm.
I modified the query and attached the AdventureWorksDWDenali database in a similar manner, then ran a few quick SELECT queries against various tables to see what they contained. I was pleasantly surprised to see that the OLAP database's DimDate table contained English, French and Spanish day and month names.
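For completeness, the modified statement looked something like this (the file name is my guess at what the CTP3 download uses, so adjust the path and name for your installation):

CREATE DATABASE AdventureWorksDWDenali
ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\AdventureWorksDWDenali_Data.mdf')
FOR ATTACH_REBUILD_LOG ;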
I then launched the BI Development Studio and opened the AdventureWorksDWMultidimensionalDenali project (provided in samples pages on Codeplex). I verified the connection information in the datasource and successfully deployed the cube.
If everyone knew how easy this was, I’d probably be out of a job.
Query to Pull Database File Information
I’m sure there are other ways to pull this data; however, in some environments your permissions may restrict you to this method.
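Here's a minimal sketch of the idea, assuming SQL 2005 or later so you can read sys.master_files (older servers would use sysaltfiles instead):

SELECT  DB_NAME(mf.database_id) AS DatabaseName,
        mf.name                 AS LogicalName,
        mf.type_desc            AS FileType,
        mf.physical_name        AS PhysicalPath,
        mf.size * 8 / 1024      AS SizeMB        -- size is stored in 8-KB pages
FROM    sys.master_files AS mf
ORDER BY DatabaseName, FileType ;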
Tuesday, July 12, 2011
SQL Recovery Mode Adjustment
Run this script on a dev box to set the recovery model to SIMPLE for all databases and avoid the above scenario. It works on SQL Server 2000, 2005 and 2008.
NOTE: it is recommended that PRODUCTION servers use FULL recovery mode rather than SIMPLE.
use master
go

DECLARE @Database sysname
DECLARE @cmd NVARCHAR(500)

DECLARE DatabaseCursor CURSOR FOR
SELECT name FROM master.dbo.sysdatabases
WHERE name NOT IN ('tempdb')   -- the recovery model cannot be changed for tempdb
ORDER BY 1

OPEN DatabaseCursor
FETCH NEXT FROM DatabaseCursor INTO @Database

WHILE @@FETCH_STATUS = 0
BEGIN
    -- QUOTENAME protects against database names containing spaces or odd characters
    SET @cmd = N'ALTER DATABASE ' + QUOTENAME(@Database) + N' SET RECOVERY SIMPLE'
    EXECUTE sp_executesql @statement = @cmd

    FETCH NEXT FROM DatabaseCursor INTO @Database
END

CLOSE DatabaseCursor
DEALLOCATE DatabaseCursor
For your homework, you can substitute a WHILE loop for the cursor.
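If you want a head start on that homework, here is one way it might look, a sketch only, using a table variable to hold the database list in place of the cursor:

DECLARE @Databases TABLE (name sysname)
INSERT INTO @Databases (name)
SELECT name FROM master.dbo.sysdatabases WHERE name NOT IN ('tempdb')

DECLARE @Database sysname, @cmd NVARCHAR(500)

WHILE EXISTS (SELECT 1 FROM @Databases)
BEGIN
    -- take the next database name, build and run the ALTER, then remove it from the list
    SELECT TOP 1 @Database = name FROM @Databases ORDER BY name
    SET @cmd = N'ALTER DATABASE ' + QUOTENAME(@Database) + N' SET RECOVERY SIMPLE'
    EXECUTE sp_executesql @statement = @cmd
    DELETE FROM @Databases WHERE name = @Database
END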
If your environment is running mostly SQL 2008 (or higher), please check out SQLChicken's article on setting up Policy Based Management to ensure that the dev boxes are all kept in the simple recovery model.
Monday, May 2, 2011
Meme Monday: I Got 99 SQL Problems And the Disk Ain’t One
1) Using default install settings for file growth
Despite numerous examples from live systems showing that those settings are not appropriate for our product's databases, and despite the "best practices" documentation that tells them otherwise, we frequently see new customers with all of their databases left at the default 10% growth setting.
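A quick way to spot the offenders is to check the growth settings directly. A sketch, assuming SQL 2005 or later so sys.master_files is available:

SELECT  DB_NAME(database_id) AS DatabaseName,
        name                 AS LogicalName,
        CASE WHEN is_percent_growth = 1
             THEN CAST(growth AS VARCHAR(10)) + '%'
             ELSE CAST(growth * 8 / 1024 AS VARCHAR(10)) + ' MB'
        END                  AS GrowthSetting
FROM    sys.master_files
WHERE   is_percent_growth = 1   -- percentage growth usually means the default was never changed
ORDER BY DatabaseName ;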
2) Bloated SQL error logs
Many times, when customers report issues and we're called in to examine what's happening with their SQL Server, we find that we can't open the SQL error logs: the server hasn't been restarted in a long time and the error log has grown so large that the UI can't open it in a timely manner. The simple fix, of course, is to set up a SQL Agent job that runs sp_cycle_errorlog periodically.
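The statement the job runs is a one-liner; the schedule itself is set up in SQL Server Agent:

-- closes the current error log and starts a new one; old logs roll off per the configured retention
EXEC sp_cycle_errorlog ;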
3) Not doing ANY index maintenance
Frequently, when I hear about SQL performance issues, I find that the customer has turned off the regular index maintenance jobs "because they take too long". Eventually this results in painfully out-of-date statistics, severely fragmented indexes and terrible performance.
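Even a bare-bones maintenance pass beats nothing. A sketch, with placeholder object names; a real job should check sys.dm_db_index_physical_stats first and decide between reorganize and rebuild:

-- rebuild one badly fragmented index (table and index names here are examples only)
ALTER INDEX IX_SalesOrderDetail_ProductID ON Sales.SalesOrderDetail REBUILD ;

-- refresh statistics across the current database
EXEC sp_updatestats ;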
4) Shrinking databases as "maintenance" to "free up disk space"
I try my best not to use profanity or to scream (loudly) when I see this enabled on customer servers. I just take a deep breath and forward the following links to the guilty party:
Paul Randal: Why You Should Not Shrink your Data Files
Brent Ozar: Stop Shrinking Your Database Files. Seriously. Now.
5) Developers "testing" code on production
Don't get me started....
6) Poor backup plans not even close to SLA requirements
A high-volume OLTP production database in full recovery, with one log backup a day at midnight and one full backup a day at 1 AM, yet their SLAs say they must completely recover from failure within one hour. They claim that because the SQL Server is clustered, they don't have to back up the databases more often. Really? Please don't call me when things go south.
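For a one-hour recovery SLA, something much closer to this is needed. A sketch only; the schedules live in SQL Server Agent, and the database name and paths are placeholders:

-- nightly full backup
BACKUP DATABASE SalesDB TO DISK = 'X:\Backups\SalesDB_full.bak' WITH INIT ;

-- transaction log backups every 15 to 30 minutes throughout the day
BACKUP LOG SalesDB TO DISK = 'X:\Backups\SalesDB_log.trn' ;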
7) No disaster recovery plan
... And an office in the middle of Tornado Alley. Again, please don't call me to resurrect your SQL Server when your data center gets destroyed by one of the 500+ tornadoes that went through town. You don't have a copy elsewhere, and I can't create something from nothing.
8) Letting idiots have access to the Server room
Believe me: I can't make this one up - it actually DID happen.
A particular person on the night cleaning crew entered a server room to vacuum it. Because the vacuum's cord was too short to allow him to vacuum the far side of the server room, he unplugged something to free up an outlet so he could vacuum the far corner of the server room. The "something" he unplugged was the main power for the SQL server, which knocked out the customer's website until someone entered the server room in the morning and noticed that the server was unplugged.
9) Not having automated monitoring on servers
You'd think this one was obvious, but I've been called late at night more times than I can count to hear that someone's server is "down", only to find that the reason the SQL Server crashed or the backups failed was a full disk. Automated disk monitoring has been around for over a decade, yet many of our customers don't have any, and I doubt their IT people check the servers every day, since they always seem surprised to discover that the disk has filled up completely.
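Even without a full monitoring product, a scheduled check like this can at least raise a flag before the disk fills. A sketch using the undocumented but long-available xp_fixeddrives; the 10 GB threshold is just an example:

-- capture free megabytes per drive and flag anything running low
CREATE TABLE #drives (drive CHAR(1), FreeMB INT)
INSERT INTO #drives EXEC master.dbo.xp_fixeddrives
SELECT drive, FreeMB FROM #drives WHERE FreeMB < 10240
DROP TABLE #drives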
After just thinking about those 9 items, it's time for a stress pill.
Monday, April 25, 2011
SQLRally - Birds of a Feather
He replied, "@LadyRuna if you're volunteering, you could host it :-D #sqlWIT #sqlRally"
So it looks like I'm hosting the WIT table at the Birds of a Feather event at SQL Rally. Please stop by and say hello if you're there. I've never run one of these before and I'll be making things up as I go along.
Thursday, April 21, 2011
Hooray for Wonderful Husbands
I was talking to my boss yesterday about my professional development plan for this year, and the subject of training came up. He said we only had enough money in the budget to send one person to a conference this year, and since I’d attended SQL PASS Summit last year, they were going to send my coworker this time – meaning I wasn’t eligible to attend any offsite training at all. He suggested I find some books to read to fulfill my professional development plan.
Needless to say, I was not at all thrilled, and mentioned it to my darling husband.
He, too, was unimpressed. He knew how important and useful attending SQL PASS Summit had been for me.
Over dinner, he casually asked me, “Do you think you can take time off from work the second week of May?”
I checked my calendar – I was not scheduled to be on call, so I could take time off.
I replied, “Sure. Why?”
He held my hand and said, “There’s this PASS SQLRally thing in Orlando that week. We have enough vacation money saved up to send you.”
(Cue romantic music and sappy romantic scene….)
Ok, get your mind out of the gutter… The final result is: I’m going to PASS SQLRally and even get to attend one of the Pre-con classes. Hooray for wonderful, amazingly loving husbands! I certainly picked the BEST husband in the world. See you at PASS SQLRally!