Thursday, June 9, 2016

Better SQL Syntax to Test for Existence Before Dropping



For the longest time, I've been using code like this

IF EXISTS (
    SELECT 1
    FROM sys.objects
    WHERE object_id = OBJECT_ID(N'[dbo].[FakeProc]')
      AND type IN (N'P',N'PC')
    )
  DROP PROCEDURE [dbo].[FakeProc]
GO

IF EXISTS (
    SELECT [name]
    FROM sysobjects
    WHERE [name] = 'trFakeTrigger'
      AND type = 'TR'
    )
BEGIN
    DROP TRIGGER trFakeTrigger;
END
GO

to test for the existence of a Procedure or Trigger prior to dropping it. The challenge with this syntax was remembering the abbreviation used to identify each object type.
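
If you stick with the sys.objects approach, the type abbreviations don't actually need to be memorized; they can be looked up on the fly. A minimal sketch of such a lookup (run against any database):

-- List the object type abbreviations alongside their descriptions
-- (e.g. P = SQL stored procedure, TR = SQL trigger)
SELECT DISTINCT type, type_desc
FROM sys.objects
ORDER BY type;
GO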

Later on, I found that I could use

IF OBJECT_ID('FakeProc') IS NOT NULL
BEGIN
    DROP PROCEDURE FakeProc
END;
GO

IF OBJECT_ID('dbo.fn_FakeFunction') IS NOT NULL
BEGIN
    DROP FUNCTION dbo.fn_FakeFunction
END;
GO


This was certainly more memorable than the earlier version but was still not the best choice. I needed a template that could be easily copied multiple times and modified with minimal issues.
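
As an aside, OBJECT_ID also accepts an optional second argument that restricts the lookup to a specific object type, which tightens the check a bit. A sketch of that variation, using the same placeholder names as above:

-- The 'P' argument limits the match to stored procedures, so a table
-- or view that happens to share the name will not satisfy the check.
IF OBJECT_ID('dbo.FakeProc', 'P') IS NOT NULL
BEGIN
    DROP PROCEDURE dbo.FakeProc;
END;
GO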

I then discovered the most efficient trick: surround the DROP command with a TRY...CATCH block.

BEGIN TRY DROP TRIGGER FakeTrigger END TRY BEGIN CATCH END CATCH;

BEGIN TRY DROP TABLE #Test END TRY BEGIN CATCH END CATCH;

BEGIN TRY ALTER TABLE MyTable DROP CONSTRAINT FakeConstraint END TRY BEGIN CATCH END CATCH;

BEGIN TRY DROP PROCEDURE FakeProc END TRY BEGIN CATCH END CATCH;
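
The empty CATCH block is what makes this trick work: if the object doesn't exist, the DROP raises an error that the CATCH simply swallows, and the batch keeps going. Spread over multiple lines (FakeView is just another placeholder name), the pattern looks like this:

BEGIN TRY
    -- If dbo.FakeView does not exist, the error raised here is
    -- swallowed by the empty CATCH block and execution continues.
    DROP VIEW dbo.FakeView;
END TRY
BEGIN CATCH
END CATCH;
GO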

The best part is that MSDN has already written the code for a TRY-CATCH snippet. This snippet doesn't come built into SSMS, but the page provides clear and simple instructions for creating and importing it. Give it a try, and Happy Coding!



Tuesday, April 26, 2016

Power BI Experiments

I’ve recently discovered Power BI. Basically, it’s the child of Power Pivot and SQL Server Reporting Services, but without Kryptonite to slow it down. You can connect to all kinds of data sources – Excel, SQL, Hadoop, folders, and 55+ others – and then perform analyses of the data therein. Microsoft has even allowed developers to create and publish new “visualizations” (charts, graphs, maps, etc.) for users to adopt for their data analysis.

I decided to give it a try, and used my Wine Cellar Excel spreadsheet as the test data set so I could more easily recognize any trends that the Visualizations pointed out.

I first evaluated my collection by Country & Region by Quantity of Wine (in ml). The light blue circles correspond to “France”, and the pink is “Portugal”. Somehow Burgundy, France wound up near New York, and our Douro, Portugal (Port!!) sat near the Great Lakes. This I cannot explain. I had to edit “Rhone, France” to “Lyon, France” to move it from somewhere in Asia to France. I’m not sure how it was handling the mapping, but I guess it cannot handle Region names.

The only way I could get the mapping to look better was to look up the regions and set them to specific cities in each country. I guess that the mapping program cannot understand Regions (Bordeaux, Burgundy, Cotes D’Or, Willamette Valley, Douro Valley, etc.). Therefore, if one wishes to use Power BI to display data on maps, then the locations listed must be City names, and not Region names, nor a mix of Cities and Regions as I had in my Wine List spreadsheet.

I then tried an analysis of Quantity (in ml) by Year and Appellation. This provided the expected results: we own a LOT of Chateauneuf du Pape 2005 (pink in the graph below).
I also evaluated it with a pie chart to see how that turned out.

Again, it’s obvious what we like to stock in our wine cellar (light blue is all 2005, and the majority is Chateauneuf du Pape), but it also shows that Power BI makes it extremely simple to illustrate that fact through the visualizations it provides; I spent only a few minutes generating these charts from my data.

All in all, I’d say that Power BI is very useful for quick and efficient analysis of any dataset where a graph or chart would best illustrate the relationship of data elements within it.
