
Thursday, October 6, 2022

Resolved: Error Creating Storage Event Trigger in Azure Synapse

My client receives external files from a vendor and wants to ingest them into the Data Lake using the integration pipelines in Synapse (Synapse's version of Azure Data Factory). Since the exact time the vendor sends the files can vary greatly day-to-day, he requested that I create a Storage Event Trigger.

I quickly set up the trigger:


Set Type to "Storage events", then select the correct storage account and container from the list associated with the current Azure subscription. Use the "blob path begins with" setting to filter on the folder where the files land, and use the "blob path ends with" setting to filter on the file type (and perhaps the file name, if it's relatively consistent) so that only the right blobs invoke the trigger. Finally, set Event to "Blob created".
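For example, if the vendor drops CSV files into a folder called "incoming" (the folder name and file extension here are just illustrative), the filter settings might look like this:

```text
Blob path begins with: incoming/vendor-files/
Blob path ends with:   .csv
```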

Click Continue to go to the next page.

There will be a warning, "Make sure you have specific filters. Configuring filters that are too broad can match a large number of files created/deleted and may significantly impact your cost.", which reminds you to check that the filters on the previous page return only the desired set of files. Be sure that you have at least one qualifying file in the folder and that the Data Preview can find it. If not, go back to the previous page and adjust the "blob path begins with" and "blob path ends with" settings to correct the filtering.

Click Continue to go to the next page.

This final page asks for the pipeline parameters to use when the trigger is invoked. 
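The trigger exposes the path and name of the blob that fired it, so you can pass those straight through to the pipeline. The parameter names on the left are just examples - use whatever your pipeline defines:

```text
pFolderPath: @triggerBody().folderPath
pFileName:   @triggerBody().fileName
```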

Click Save to create the trigger. 



Once the trigger has been saved, publish the data factory.

So far so good. 

Then this popped up: 


The trigger needs to create and subscribe to an Event Grid event in order to be activated, and the error it returned was cryptic:

"The client '{GUID}' with object id '{GUID}' does not have authorization to perform action 'Microsoft.EventGrid/eventSubscriptions/write' over scope '/subscriptions/{GUID}/resourceGroups/{resourceGroup}/providers/Microsoft.Storage/storageAccounts/{StorageAccount}/providers/Microsoft.EventGrid/eventSubscriptions/{GUID}' or the scope is invalid. If access was recently granted, please refresh your credentials."

I tried numerous searches on how to get the authorization to perform action Microsoft.EventGrid/eventSubscriptions/write and kept hitting dead ends. 

Finally, I started poking around in the Subscription settings to see if something needed to be set in there. Under "Resource Providers", I found that Microsoft.Synapse, Microsoft.Storage, Microsoft.DataLakeStore and Microsoft.EventGrid were all registered. So that felt like a dead end. 

After a bit more muddling around, I entered "Failed to Subscribe" in the search and found my savior: Cathrine Wilhelmsen. She had experienced exactly the same issue and had the same difficulty I had locating information on how to resolve it. She even mentioned the same articles that I read in my attempts to figure out what to do! The only thing I had not done was visit the Microsoft Q&A thread about running event triggers in Synapse - probably because I stumbled upon her blog post first! Thank you, Cathrine!!

So what was the magic trick?


The Microsoft.DataFactory resource provider was not registered. 

I hadn't expected that, because we didn't have Azure Data Factory deployed in this subscription, but now we know that its resource provider is required for Synapse event triggers.

Once the Admin registered Microsoft.DataFactory, I was able to successfully publish the storage event trigger. 😀 
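If you run into the same error, you can also check and register the provider from the Azure CLI, assuming you have sufficient rights (Owner or Contributor) on the subscription:

```shell
# Check the current registration state of the provider
az provider show --namespace Microsoft.DataFactory --query registrationState -o tsv

# Register it if the state comes back as "NotRegistered"
az provider register --namespace Microsoft.DataFactory
```

Registration can take a few minutes; re-run the show command until it reports "Registered".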


Friday, November 12, 2021

Databricks and Synapse Study Guide

 



We ran out of time in my Q&A session at the PASS Data Community Summit and I wasn't able to answer the question, "Are there any books or materials that you recommend for Databricks and Synapse?". I promised to blog the list, so here it is. 

Synapse

  • Once you create an Azure Synapse workspace, you will see a link that sends you directly to the Microsoft documentation on using Synapse. The Microsoft documentation is good: it includes several tutorials and quickstarts to help you hit the ground running. 
  • Once you enter the main screen of Synapse, there's a link to the "Knowledge Center", which includes code samples for exploring data with Spark, querying data with SQL, and creating external tables using SQL. The Knowledge Center also has a gallery of database templates, sample datasets, sample notebooks, sample SQL scripts, and sample pipelines. Definitely check it out and try a few of them. 

Databricks

  • The official Databricks Documentation has the latest and best information on everything Databricks.
  • If you are using Azure Databricks, be sure to refer to Microsoft's Azure Databricks documentation because that covers Databricks coding specific to Azure instances including connecting to other Azure services and using Azure's version of the Databricks interface.
  • The official Apache Spark Documentation contains documentation for all versions of Spark, and has good code examples to help you determine how to use each function available in each language used in Spark. 
  • Databricks Academy - yes, they have their own training, which includes Azure- and AWS-specific training as well as general Databricks training applicable to any Databricks implementation. Some is free, and some is paid. 
  • Databricks offers periodic webinars. Definitely sign up for those and the Databricks "Data and AI Summit".  
  • Pop on over to YouTube and look up Bryan Cafferky. He has several good videos on Databricks.

Both Synapse AND Databricks

Yes, there are a few bits of documentation which apply to both Synapse and Databricks.
  • I highly recommend exploring Microsoft Learn. It has free training on everything in Azure and many of the training modules include labs which walk you through performing various activities in Azure using their Azure lab account so you don't have to pay for the experience of learning how to use certain features. 
  • Pragmatic Works still offers weekly free webinars on various topics including Azure, SQL Server, Power Apps, and more. Sign up on their list to be notified of new content, and explore their archive of past webinars.
  • If you're a fan of watching technical videos, "Advancing Analytics" has a series of YouTube videos covering Azure Synapse and another series covering Databricks. He posts new videos periodically, so keep checking back on his page.
  • 3 Cloud Solutions also offers periodic free webinars on various Azure topics. Sign up for notifications from them so you don't miss any, or explore their list of past webinars.


If I find any additional resources, I'll add them later, but that's all I have for now. 

Thursday, November 11, 2021

CI/CD with Azure Synapse Notebooks - Error Resolved

 


Some features of Azure Synapse are mysterious. Recently, I was working on deploying Azure Synapse artifacts from development to production using the "Synapse Workspace Deployment" extension in Azure DevOps and received an odd error: 

2021-11-10T21:20:14.8670075Z For artifact: AzureSQLQueryTool: Checkstatus: 202; status message: Accepted
2021-11-10T21:20:44.9656242Z For artifact: AzureSQLQueryTool: Checkstatus: 200; status message: OK
2021-11-10T21:20:44.9661205Z For artifact: AzureSQLQueryTool: Artifact Deployment status: Failed
2021-11-10T21:20:44.9673543Z Error during execution: Error: Failed to fetch the deployment status {"code":"400","message":"Failed Component = DataFactoryResourceProvider, ErrorCode = 400, Error = BadRequest "}
2021-11-10T21:20:44.9723399Z ##[error]Encountered with exception:Error: Failed to fetch the deployment status {"code":"400","message":"Failed Component = DataFactoryResourceProvider, ErrorCode = 400, Error = BadRequest "}
2021-11-10T21:20:44.9945300Z ##[section]Finishing: Synpase deployment task for workspace: myWorkspace_prod

The new items I had added to Synapse were several Spark notebooks for ingesting data. I had tested them individually and they all appeared to be working, yet Azure DevOps' CI/CD gave me an error when it attempted to deploy the release to production. I had followed the instructions provided by Microsoft to set up the CI/CD pipeline, yet it was failing.

I attempted to add override parameters for the notebooks - each notebook was linked to the Spark pool in dev, which was named "sp_dev". The production Spark pool was called "sp_prod", so with parameters overriding the pool's name it should work, right? 

No. Same error. 

After numerous other unsuccessful attempts at deployment, I deleted the production spark pool and recreated it with the same name as the dev spark pool. The notebooks deployed without a hitch. 

If you see the above error messages in your CI/CD logs and have Spark notebooks in your Synapse deployment, the fix is to give the Spark pools the same name in every environment.
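In other words, the convention that worked looks like this (the workspace and pool names are illustrative):

```text
Dev:  workspace myWorkspace_dev   -> Spark pool sparkpool01
Prod: workspace myWorkspace_prod  -> Spark pool sparkpool01   (same pool name as dev)
```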


Monday, June 21, 2021

Your Synapse Moment: Adding Synapse Workspace User

 Just a quick script for Azure Synapse. Let's say your Synapse workspace is named "dev-synapse-workspace", and you've already set up its Managed Identity.


To allow Synapse to actually interact with your SQL pool user databases, run the following script in each database (put your Synapse workspace name in place of the example name):

-- Create a database user for the workspace's managed identity
CREATE USER [dev-synapse-workspace] FROM EXTERNAL PROVIDER;

-- Add that user to the db_owner role
EXEC sp_addrolemember 'db_owner', 'dev-synapse-workspace';
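You can confirm that the user was created by checking the database's principal list:

```sql
-- Should return one row for the workspace's managed identity
SELECT dp.name, dp.type_desc
FROM sys.database_principals AS dp
WHERE dp.name = 'dev-synapse-workspace';
```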

After that, your pipelines should be able to easily access your Synapse SQL Pools. 

This same code works in an Azure SQL database if you need to access one from Azure Synapse. 

Happy coding!

Thursday, April 29, 2021

Where did my table valued functions go in Azure Synapse?

I recently discovered that you can create table-valued functions in Azure Synapse, but they are NOT listed in SSMS. Also, even if you know the function's name, you can't use sp_helptext to find out what it does.

So how can we find them or find out their definitions if we know someone created some functions? 

Here's the query you need to pull from system tables:


SELECT s.name AS schema_name
    ,o.name AS function_name
    ,m.definition
    ,o.type
FROM sys.sql_modules AS m
JOIN sys.objects AS o
    ON m.object_id = o.object_id
JOIN sys.schemas AS s
    ON o.schema_id = s.schema_id
WHERE o.type IN ('IF', 'FN'); -- IF = inline table-valued function, FN = scalar function
GO
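If you already know the function's name and just want its definition, you can query the same catalog view directly (the function name below is made up - substitute your own):

```sql
SELECT m.definition
FROM sys.sql_modules AS m
WHERE m.object_id = OBJECT_ID('dbo.fn_Example');
```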

Have fun with Azure Synapse!

