Friday, November 12, 2021

Databricks and Synapse Study Guide

 



We ran out of time in my Q&A session at the PASS Data Community Summit and I wasn't able to answer the question, "Are there any books or materials that you recommend for Databricks and Synapse?". I promised to blog the list, so here it is. 

Synapse

  • Once you create an Azure Synapse workspace, you will see a link that sends you directly to the Microsoft documentation on using Synapse. The Microsoft documentation is good. It includes several tutorials and quickstarts to help you hit the ground running. 
  • Once you enter the main screen of Synapse, there's a link to the "Knowledge Center", which includes code samples for exploring data with Spark, querying data with SQL, and creating external tables using SQL. The Knowledge Center also has a gallery of database templates, sample datasets, sample notebooks, sample SQL scripts, and sample pipelines. Definitely check it out and try a few of them. 

Databricks

  • The official Databricks Documentation has the latest and best information on everything Databricks.
  • If you are using Azure Databricks, be sure to refer to Microsoft's Azure Databricks documentation, because it covers Azure-specific Databricks topics, including connecting to other Azure services and using Azure's version of the Databricks interface.
  • The official Apache Spark Documentation covers all versions of Spark and has good code examples to help you learn how to use each function available in each of Spark's supported languages. 
  • Databricks Academy: yes, Databricks has its own training, which includes Azure- and AWS-specific courses as well as general Databricks training applicable to any Databricks implementation. Some of it is free, and some is paid. 
  • Databricks offers periodic webinars. Definitely sign up for those and for the Databricks "Data and AI Summit". 
  • Pop on over to YouTube and look up Bryan Cafferky. He has several good videos on Databricks.

Both Synapse AND Databricks

Yes, there are a few bits of documentation which apply to both Synapse and Databricks.
  • I highly recommend exploring Microsoft Learn. It has free training on everything in Azure, and many of the training modules include labs that walk you through performing various activities in Azure using a provided lab account, so you don't have to pay for the experience of learning how to use certain features. 
  • Pragmatic Works still offers weekly free webinars on various topics, including Azure, SQL Server, Power Apps, and more. Sign up on their list to be notified of new content, and explore their archive of past webinars.
  • If you're a fan of watching technical videos, the "Advancing Analytics" YouTube channel has one series covering Azure Synapse and another covering Databricks. New videos are posted periodically, so keep checking back on the channel.
  • 3 Cloud Solutions also offers periodic free webinars on various Azure topics. Sign up for notifications from them so you don't miss any, or explore their list of past webinars.


If I find any additional resources, I'll add them later, but that's all I have for now. 

Thursday, November 11, 2021

CI/CD with Azure Synapse Notebooks - Error Resolved

 


Some features of Azure Synapse are mysterious. Recently, I was working on deploying Azure Synapse artifacts from development to production using the "Synapse Workspace Deployment" extension in Azure DevOps and received an odd error: 

2021-11-10T21:20:14.8670075Z For artifact: AzureSQLQueryTool: Checkstatus: 202; status message: Accepted
2021-11-10T21:20:44.9656242Z For artifact: AzureSQLQueryTool: Checkstatus: 200; status message: OK
2021-11-10T21:20:44.9661205Z For artifact: AzureSQLQueryTool: Artifact Deployment status: Failed
2021-11-10T21:20:44.9673543Z Error during execution: Error: Failed to fetch the deployment status {"code":"400","message":"Failed Component = DataFactoryResourceProvider, ErrorCode = 400, Error = BadRequest "}
2021-11-10T21:20:44.9723399Z ##[error]Encountered with exception:Error: Failed to fetch the deployment status {"code":"400","message":"Failed Component = DataFactoryResourceProvider, ErrorCode = 400, Error = BadRequest "}
2021-11-10T21:20:44.9945300Z ##[section]Finishing: Synpase deployment task for workspace: myWorkspace_prod

The new items I had added to Synapse were several Spark notebooks for ingesting data. I had tested them individually, and they all appeared to be working, yet Azure DevOps' CI/CD gave me an error when it attempted to deploy the release to production. I had followed the instructions provided by Microsoft to set up the CI/CD pipeline, yet it was failing.

I attempted to add override parameters for the notebooks: each notebook was linked to the dev Spark pool, which was named "sp_dev", while the production Spark pool was called "sp_prod". With parameters overriding the pool's name, it should work, right? 

No. Same error. 

After numerous other unsuccessful deployment attempts, I deleted the production Spark pool and recreated it with the same name as the dev Spark pool. The notebooks then deployed without a hitch. 

If you see the above error messages in your CI/CD logs and have Spark notebooks in your Synapse deployment, the fix is to give the Spark pools the same name in every environment.
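
For reference, the working release step ended up looking roughly like the sketch below. This is an illustrative fragment only: the task and input names are from the "Synapse workspace deployment" extension as I recall them and may differ in your version, and the subscription, resource group, and file paths are hypothetical placeholders.

```yaml
# Sketch of an Azure DevOps release step using the "Synapse workspace
# deployment" extension; verify task/input names against your installed version.
steps:
- task: Synapse workspace deployment@2
  inputs:
    operation: 'deploy'
    TemplateFile: '$(System.DefaultWorkingDirectory)/_artifacts/TemplateForWorkspace.json'           # hypothetical path
    ParametersFile: '$(System.DefaultWorkingDirectory)/_artifacts/TemplateParametersForWorkspace.json' # hypothetical path
    azureSubscription: 'my-service-connection'   # hypothetical service connection name
    ResourceGroupName: 'rg-synapse-prod'         # hypothetical resource group
    TargetWorkspaceName: 'myWorkspace_prod'
    # No pool-name override is needed: the Spark pool is named "sp_dev" in
    # BOTH the dev and prod workspaces, so the notebooks' pool references
    # resolve as-is during deployment.
```

The key point is what's absent: no override parameter for the pool name, because the pools are named identically across environments.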


Friday, November 5, 2021

Data Community Fun: A visit with Paul Turley

 


A few weeks ago, Paul Turley, his wife Shirley, their two spunky dogs, and their anti-social cat pulled up at the "Letourneau Resort" for a 3-day visit. We spent some time working together on the patio while his dogs raced around my property. The cat hid in the Data Bus and only went out when forced to do so. We shared several meals and spent many hours working at our computers (he visited Thursday, Friday, and Saturday, and we both had to work Thursday and Friday). 



"Fancy" patio set-up. Just the computer, but we do have internet out there and the weather was perfect for sitting outside enjoying the fresh air while also getting a lot of work done. 

View from patio when you look up from computer. :) 

On the last day, Saturday, my boss at 3 Cloud, Kathi Vick, and her husband joined us. Paul has a blog called "Data On The Road", so he interviewed both of us before getting back on the road heading eastward. 


Guy loaned Paul his "director's clapper" for the filming; it was the award from the Sakuracon 2005 AMV competition and was signed by a bunch of the convention guests. The interviews took place on my patio couch, and the clapper was definitely a fun way to start the videos.


I got to put the "Arizona" sticker on their "Places We've Been" map attached to the side of the Data Bus.

In early 2020, Paul and I had planned on being co-presenters at the Phoenix SQL Saturday, so I made cloud shirts for both of us so we'd match. Unfortunately, COVID came along and the Phoenix SQL Saturday never happened. 

For the interview with me, Paul and I donned our matching shirts to demonstrate how cool it would have been if we had presented with matching cloud shirts. They look pretty spiffy, eh? And the cloud earrings are of course a necessary accessory for any cloud data professional!


The above is a screenshot of the video posted to Paul's blog. I recommend you go see the video on his page, and also check out the other interviews he did with a number of other wonderful people in the data community. He also has a schedule posted telling you where he'll be and when. If he's coming to your area, reach out to him and say, "hi". 
 
I can't wait to see the Data Bus again when it next passes through the Phoenix area. 

Resolved: Error Creating Storage Event Trigger in Azure Synapse

My client receives external files from a vendor and wants to ingest them into the Data Lake using the integration pipelines in Synapse (Syna...