Friday, July 29, 2011

Are Your Database Statistics Fresh?


SQL Server maintains a large amount of metadata – statistics – describing the distribution of data in the tables and indexes of each database. Statistics become stale when they have not been updated regularly or when a large number of changes have occurred in the underlying data. As the statistics become less accurate, query times can increase dramatically.
In production systems, statistics should be updated during the usual maintenance window to keep this metadata fresh.
To see how fresh the statistics are for a single statistics object on a table (the Updated column in the header of the output shows the last update date), run:
DBCC SHOW_STATISTICS ('CCAuditSessionType', CCAuditSessionType_PK)

If you need to see the statistics dates for all user tables in the current database, run this instead:
select STATS_DATE(o.object_id, s.stats_id) as StatsDate,
       o.name as TableName,
       s.name as StatsName,
       auto_created, user_created, no_recompute
from sys.stats s
join sys.objects o on s.object_id = o.object_id
where o.type = 'U'
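A small variation on the same query (just a sketch) sorts the oldest statistics to the top so the stalest objects surface first; a NULL StatsDate generally means the statistics object has never been updated:
select STATS_DATE(o.object_id, s.stats_id) as StatsDate, o.name as TableName, s.name as StatsName
from sys.stats s
join sys.objects o on s.object_id = o.object_id
where o.type = 'U'
order by StatsDate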

To update statistics that are out of date, execute the command:
exec sp_updatestats
in each database on the server that needs its statistics updated.
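If only a handful of tables are stale, you can also target them individually rather than updating the whole database. A minimal sketch, assuming the table from the earlier example lives in the dbo schema (FULLSCAN is optional and slower, but produces the most accurate statistics):
UPDATE STATISTICS dbo.CCAuditSessionType WITH FULLSCAN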
Use the following query to generate a script to update the statistics on all databases
declare @db sysname
      , @dbID int
      , @sql varchar(max)

-- collect the user databases (database_id > 4 skips master, model, msdb, and tempdb)
create table #t
(DbName sysname, databaseID int)

insert #t (DbName, databaseID)
select [name], database_id
from sys.databases
where database_id > 4

select @dbID = MIN(databaseID)
from #t

-- walk through the databases and print a USE / sp_updatestats batch for each one
while @dbID is not null
BEGIN
   select @db = DbName
     from #t
    where databaseID = @dbID

   set @sql = 'Use [' + @db + ']' + CHAR(13) + 'go ' + CHAR(13)
   set @sql = @sql + 'exec sp_updatestats' + CHAR(13) + 'go '

   PRINT @sql

   select @dbID = min(databaseID)
     from #t
    where databaseID > @dbID
END

drop table #t
Copy and paste the printed output from the Messages tab into the query portion of a SQL Server Agent job step; this will ensure that the statistics are updated for all databases on a regular schedule. NOTE: the query above excludes the system databases.
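For reference, the printed output looks something like this for each database (the database name here is just illustrative):
Use [YourDatabase]
go
exec sp_updatestats
go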

Thursday, July 14, 2011

Adventures with Denali CTP3–Part 1

I usually only realize how slow downloads can be when I’m eager to begin working with the item being downloaded. The hour it took to download the AdventureWorks sample databases felt far longer than it actually was.
One thing that surprised me was that the downloads for the databases included only the MDF (data) file – the log file was not included. I fiddled unsuccessfully with attaching it through the UI in Management Studio (no, I didn’t think to delete the log file name from the file list in the UI – I’d assumed a log file was required and didn’t realize that omitting it causes the attach to be treated as an ATTACH_REBUILD_LOG operation). I finally decided that it would be sensible to actually read the instructions. TechNet provided a very simple query to attach the database:
CREATE DATABASE AdventureWorks2008R2
ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\AdventureWorks2008R2_Data.mdf')
FOR ATTACH_REBUILD_LOG;
Worked like a charm.
I modified the query and attached the AdventureWorksDWDenali database in a similar manner, then ran a few quick SELECT queries on various tables to see what they contained. I was pleasantly surprised to see that the data warehouse’s DimDate table contained English, French and Spanish day and month names.
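The modified attach looked roughly like this (a sketch – the exact data file name and path depend on where you placed the downloaded MDF):
CREATE DATABASE AdventureWorksDWDenali
ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\AdventureWorksDWDenali_Data.mdf')
FOR ATTACH_REBUILD_LOG;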
I then launched the BI Development Studio and opened the AdventureWorksDWMultidimensionalDenali project (provided on the samples pages on CodePlex). I verified the connection information in the data source and successfully deployed the cube.
If everyone knew how easy this was, I’d probably be out of a job.

Query to Pull Database File Information

This query lists the name, size and location of every file for all databases on an instance. It is handy for checking and documenting database server configuration and for confirming whether the server follows our recommended best practices.
set nocount on
declare @sql varchar(max), @sql2 varchar(max), @name sysname

-- one row per database file; sizes are converted from 8 KB pages to MB
if object_id('tempdb..#t') is not null drop table #t
create table #t (DbName sysname, LogicalName varchar(30), FileName varchar(100), sizeMB int, UsedSpaceMB int, growthMB int, is_percent_growth bit)

-- static portion of the per-database query
set @sql = ' substring(name,1,30) as LogicalName, substring(physical_name,1,75) as FileName,'
set @sql = @sql + ' size * 8192./1024/1024 as SizeMB, sum(a.total_pages * 8192./1024/1024) as UsedSpaceMB,'
set @sql = @sql + ' growth * 8192./1024/1024 as growthMB, is_percent_growth from '

-- loop over the user databases (database_id > 4 skips the system databases)
declare c cursor for
select name from sys.databases where database_id > 4
open c
fetch next from c into @name
while @@fetch_status = 0
begin
    set @sql2 = @sql + quotename(@name) + '.sys.database_files df left join '
    set @sql2 = @sql2 + quotename(@name) + '.sys.allocation_units a on df.data_space_id = a.data_space_id left join '
    set @sql2 = @sql2 + quotename(@name) + '.sys.partitions p on p.partition_id = a.container_id '
    set @sql2 = @sql2 + 'group by df.name, df.physical_name, growth, is_percent_growth, df.size'
    -- databases that are offline or otherwise inaccessible are skipped silently
    begin try exec ('insert #t select ''' + @name + ''' as DbName, ' + @sql2) end try begin catch end catch
    fetch next from c into @name
end
close c
deallocate c

select * from #t


I’m sure there are other ways to pull this data; however, in some environments your permissions may restrict you to this approach.
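For comparison, if your login is allowed to read sys.master_files, a much simpler version (a sketch – it reports size and growth for every database in one pass, but not used space) looks like this:
select db_name(mf.database_id) as DbName,
       mf.name as LogicalName,
       mf.physical_name as FileName,
       mf.size * 8192./1024/1024 as SizeMB,
       mf.growth * 8192./1024/1024 as growthMB,
       mf.is_percent_growth
from sys.master_files mf
where mf.database_id > 4
order by DbName, mf.file_id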

Tuesday, July 12, 2011

SQL Recovery Mode Adjustment

Often when new Development boxes are handed over to my group, the databases are set to the FULL recovery model so that the settings match production. Unfortunately, since the Dev boxes rarely have any backups running, the transaction logs eventually fill up the drive. When that happens, the databases can no longer accept new transactions and we are left with a (temporarily) non-functional box.


Run this script on a Dev box to set the recovery model to SIMPLE for all databases and avoid the above scenario. It works on SQL Server 2000, 2005 and 2008.


NOTE: it is recommended that PRODUCTION servers use the FULL recovery model rather than SIMPLE.


use master
go

DECLARE @Database VARCHAR(255)
DECLARE @cmd NVARCHAR(500)

DECLARE DatabaseCursor CURSOR FOR
SELECT name FROM master.dbo.sysdatabases
WHERE name NOT IN ('tempdb') -- cannot set recovery for tempdb
ORDER BY 1

OPEN DatabaseCursor

FETCH NEXT FROM DatabaseCursor INTO @Database
WHILE @@FETCH_STATUS = 0
BEGIN
     -- QUOTENAME protects against database names that contain spaces or other special characters
     SET @cmd = 'ALTER DATABASE ' + QUOTENAME(@Database) + ' SET RECOVERY SIMPLE'

     EXECUTE sp_executesql @statement = @cmd

     FETCH NEXT FROM DatabaseCursor INTO @Database
END

CLOSE DatabaseCursor
DEALLOCATE DatabaseCursor


For your homework, you can substitute a WHILE loop for the cursor.  
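One possible answer (a sketch, not the only way) replaces the cursor with a table variable and a WHILE loop:
DECLARE @dbs TABLE (name SYSNAME)
DECLARE @Database SYSNAME, @cmd NVARCHAR(500)

-- gather the database names, excluding tempdb
INSERT @dbs (name)
SELECT name FROM master.dbo.sysdatabases WHERE name NOT IN ('tempdb')

-- step through the names in order without a cursor
SELECT @Database = MIN(name) FROM @dbs
WHILE @Database IS NOT NULL
BEGIN
    SET @cmd = 'ALTER DATABASE ' + QUOTENAME(@Database) + ' SET RECOVERY SIMPLE'
    EXECUTE sp_executesql @statement = @cmd
    SELECT @Database = MIN(name) FROM @dbs WHERE name > @Database
END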
If your environment is running mostly SQL Server 2008 (or higher), please check out SQLChicken's article on setting up Policy-Based Management to ensure that the dev boxes are all set to the SIMPLE recovery model.
