Lift and Shift SSIS to Azure

Enterprise Data & Analytics’ data engineers are experts at lifting and shifting SSIS to Azure Data Factory SSIS Integration Runtime.

Our state-of-the-art DILM Suite tools, in the capable hands of our experienced data engineers, drastically reduce the time required to migrate and apply SSIS Catalog configuration artifacts – Literals, Catalog Environments and Catalog Environment Variables, References, and Reference Mappings – while simultaneously improving the quality of the migration.

Check out our Lift and Shift page to learn more!

:{>

Updates to the ADF Execute SSIS Package Activity

Last night I presented Moving Data with Azure Data Factory to a packed house at the Richmond Azure User Group. The crowd was engaged and asked awesome questions. I had a blast!

Surprise!

I rehearsed many of my demos before the presentation and even updated my slides. One thing I did not do was rehearse configuring an Execute SSIS Package activity. Why? I’ve built these demos live a bajillion times. I know what I’m doing…

When I clicked on the Settings tab I found new stuff – options I’d not previously encountered. Thankfully, I immediately recognized the purpose of these new configuration options – and I also immediately liked them. The demo worked and no one was the wiser (except the handful of folks with whom I shared this story after the meeting).

New Dropdowns for Folder, Project and Package

The new interface sports new dropdowns for selecting the Catalog Folder, Project, and Package to execute. I like this – it’s slick. I had deployed a project to my Azure-SSIS instance between the time I started this part of the demo and the time I wanted to configure the Execute SSIS Package activity. During deployment I created a new Catalog Folder which was not initially listed in the Folder dropdown. Clicking the Refresh button remedied this, though, and I was able to complete configuration rapidly.

Configuration Tabs

I cannot recall if the Connection Managers and Property Overrides tabs were part of the previous Execute SSIS Package activity interface. I don’t think so, but I could be wrong about that. Update: I verified these are new tabs by looking at screenshots from my June 2018 post titled ADF Execute SSIS Package Activity. The previous version had an Advanced tab. 

The SSIS package configuration tabs are SSIS Parameters, Connection Managers, and Property Overrides.

When your Azure-SSIS instance is running, you may use these tabs to update Parameter, Connection Manager Property, and SSIS Package Property values:

Warnings

It’s possible to configure the Execute SSIS Package activity when your Azure-SSIS Integration Runtime is not running, but you don’t get the nice dropdown pre-population – you have to revert to the previous method of entering the full SSIS Catalog path of the package you want to execute.

SSIS Catalog Browser To The Rescue!

If you find yourself in this predicament and would rather configure the Execute SSIS Package activity without waiting 20-30 minutes for the Azure-SSIS instance to spin up, you can use SSIS Catalog Browser – a free utility from DILM Suite – to connect to your Azure-SSIS instance:

SSIS Catalog Browser displays the Catalog path for an SSIS package (or Catalog Environment) when you select the artifact in the unified Catalog surface. Copy the package’s Catalog path displayed in the Status area and paste the value into the Package Path textbox in ADF:

Make sure the Manual Entries checkbox is checked.
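
If you’re entering the path by hand, the Execute SSIS Package activity expects a package path of the form <folder name>/<project name>/<package name>.dtsx – for example, Weather/LoadWeather/LoadWeatherData.dtsx (hypothetical names, of course).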

I like the warnings. Feedback is a good thing.

Once Azure-SSIS Is Running

When your Azure-SSIS instance is up and running, you may configure the path to your SSIS package using the dropdowns:

You can even configure the path to a Catalog Environment:
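
Catalog Environment paths follow a similar pattern: <folder name>/<environment name> – for example, Weather/Test (again, hypothetical names).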

Conclusion

I believe there are at least two lessons to take away from my experience:

  1. When presenting on Microsoft Azure topics, always check your demos to make certain nothing has changed; and 
  2. Microsoft Azure is evolving at a rapid rate – especially Azure Data Factory!

Want to Learn More About Azure Data Factory?

From me?

I am honored to announce Getting Started with Azure Data Factory – a course from Enterprise Data & Analytics!

The next delivery is 04 Mar 2019, 9:00 AM – 4:30 PM ET.

Azure Data Factory, or ADF, is an Azure PaaS (Platform-as-a-Service) that provides hybrid data integration at global scale. Use ADF to build fully managed ETL in the cloud – including SSIS. Join Andy Leonard – author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – as he demonstrates practical Azure Data Factory use cases.

In this course, you’ll learn:

  • The essentials of ADF
  • Developing, testing, scheduling, monitoring, and managing ADF pipelines
  • Lifting and shifting SSIS to ADF SSIS Integration Runtime (Azure-SSIS)
  • ADF design patterns
  • Data Integration Lifecycle Management (DILM) for the cloud and hybrid data integration scenarios

I hope to see you there!

PS – Join me for Expert SSIS Training!

Follow Andy Leonard’s SSIS Training page for more information.

Honored to Present Lift and Shift SSIS to ADF at #Azure DataFest Reston

I am honored to deliver Lift and Shift SSIS to ADF at the Azure DataFest in Reston, Virginia, 11 Oct 2018!

Abstract

Your enterprise wants to use the latest cool Azure Data Analytics tools but there’s one issue: All your data are belong to the servers on-premises. How do you get your enterprise data into the cloud?

In this session, SSIS author and trainer Andy Leonard discusses and demonstrates migrating SSIS to Azure Data Factory Integration Runtime.

Register today!

:{>

Presenting Moving Data with Azure Data Factory at SQL Saturday Charlotte!

I am honored to present Moving Data with Azure Data Factory at SQL Saturday 806 in Charlotte, NC, 20 Oct 2018.

This is the first time I am delivering this session. It still has that new presentation smell!

Abstract

Azure Data Factory – ADF – is a cloud data engineering solution. ADF version 2 sports a snappy web GUI (graphical user interface) and supports the SSIS Integration Runtime (IR) – or “SSIS in the Cloud.”

Attend this session to learn:
– How to build a “native ADF” pipeline;
– How to lift and shift SSIS to the Azure Data Factory Integration Runtime; and
– ADF Design Patterns to execute and monitor pipelines and packages.

I hope to see you there!

:{>

Using SSIS Framework Community Edition Webinar 20 Sep

Join me 20 Sep 2018 at noon ET for a free webinar titled Using SSIS Framework Community Edition!

Abstract

SSIS Framework Community Edition is free and open source. You may know you can use SSIS Framework Community Edition to execute a collection of SSIS packages using a call to a single stored procedure passing a single parameter. But did you know you can also use it to execute a collection of SSIS packages in Azure Data Factory SSIS Integration Runtime? You can!
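
Here’s a minimal sketch of what that single call might look like from C# – the stored procedure and parameter names below are hypothetical placeholders, not necessarily the names the Framework actually uses:

```csharp
using System.Data;
using System.Data.SqlClient;

class FrameworkDemo
{
    static void Main()
    {
        // Connect to the database hosting the SSIS Framework objects
        // (connection string is a placeholder).
        using (var connection = new SqlConnection(
            "Server=.;Database=SSISDB;Integrated Security=true;"))
        using (var command = new SqlCommand(
            "custom.execute_framework_application", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            // One parameter identifies the SSIS application –
            // the collection of packages – to execute.
            command.Parameters.AddWithValue("@ApplicationName", "Load AndyWeather");

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```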

In this free webinar, join SSIS author, BimlHero, consultant, trainer, and blogger Andy Leonard at noon EDT Thursday 20 Sep 2018 as he discusses and demonstrates SSIS Framework Community Edition – on-premises and in the cloud.

I hope to see you there!

Register today.

:{>

Introducing Azure Data Factory Design Patterns

I was honored to write an article titled Introducing Azure Data Factory Design Patterns featured in this month’s PASS Insights newsletter!

The article covers a couple of execution patterns:

  1. Execute Child Pipeline
  2. Execute Child SSIS Package

I demonstrate a cool SSIS Catalog Browser feature that helps ADF developers configure the Execute SSIS Package activity.

To see it in action, download SSIS Catalog Browser – it’s one of the free utilities available at DILM Suite. Connect to the instance of Azure SQL DB that hosts an Azure Data Factory SSIS Integration Runtime Catalog, select the SSIS Package you desire to execute using the Execute SSIS Package activity, and then copy the Catalog Path from the Catalog Browser status message:

Paste that value into the Package Path property of the Execute SSIS Package activity:

You can rinse and repeat – Catalog Browser surfaces Environment paths as well:

Enjoy the article!

If you have any questions about Azure Data Factory – or need help getting started – please reach out!

Learn more:
Attend my full-day pre-conference session titled Intelligent Data Integration at the PASS Summit 2018 on 5 Nov 2018.
Check out this 1-day course on Fundamentals of Azure Data Factory delivered in cooperation with Brent Ozar Unlimited 10 Dec 2018!

AndyWeather Internet of Things (IoT)

AndyWeather.com is a site I’ve maintained for about 10 years now. I use the site and related hardware, software, and services to test concepts and perform experiments.

I then apply that experience when delivering Internet of Things (IoT) solutions for Enterprise Data & Analytics customers and in SSIS and Biml training, such as my upcoming course titled Fundamentals of Azure Data Factory delivered in cooperation with Brent Ozar Unlimited.

It all started when GoDaddy created a DMZ for SQL Server databases. I found this functionality in 2008 and asked myself, “Self, how might we use this?”

Since That Time…

There have been two major iterations of AndyWeather. I use weather data collected during the first iteration for training purposes at SSIS Academy and when delivering training to Enterprise Data & Analytics customers.

AndyWeather v2

The setup of the second iteration is fairly straightforward:

  1. The Acurite Weather Station consists of an instrument pack plus a base station. The instruments collect weather measurements and transmit them to the base station.
  2. The base station is connected to an older e-Machine running Windows 7 Ultimate (32-bit) with 2GB of RAM.
  3. An Acurite application interfaces with the base station and stores the data locally in a single CSV file.
  4. I wrote a very simple C# console application named “abt” (an acronym for “Azure Blob Transfer”) to transfer the CSV file to Azure Blob Storage.
  5. An Azure Data Factory pipeline loads the CSV data into an Azure SQL DB staging table.
  6. The AndyWeather website reads the latest weather data from the Azure SQL DB staging table.
  7. I wrote another very simple C# application named “awt” (an acronym for “AndyWeather Tweets”) that tweets updates to the @AndyWeather Twitter account.

Acurite Weather Station

The latest iteration began in early 2018 when I purchased an updated package of instruments and a new base station made by Acurite. So far, I like this station a lot. It was less expensive than the previous station and appears more rugged (again, so far – time will tell).

I recently relocated the weather station to improve connectivity between the instruments and the base station. I recorded a Data Driven *DataPoint* about it:

(Pay no attention to the exploding pecans in the background…)

The e-Machine

I intentionally use an under-powered PC for the server. Why? I want to learn how the base station – and then everything downstream of the base station – responds to busy server conditions. This is Engineering 101 stuff and I’ve learned a lot:

I love this old machine!

Acurite Application

The Acurite people maintain an application for communicating with base stations:

The PC Connect application allows me to configure how and when weather data is collected from the base station – which collects measurements from the instruments. The application lets me configure the units-of-measure and file location – and I can even share my weather data with Weather Underground. How cool is that?

The Azure Blob Transfer Console Application

The Azure Blob Transfer (abt) application is a very simple console application written in C#. It picks up the CSV file containing weather data stored by the Acurite PC Connect application and writes the file to an Azure Blob Storage container:

The CSV file in Azure Blob Storage is overwritten each time abt successfully executes. You can download a copy of the abt solution here.
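
For context, here’s a minimal sketch of the kind of upload abt performs, using the WindowsAzure.Storage SDK that was current when I wrote the app – the account, container, and file names are placeholders, not necessarily what abt actually uses:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class AzureBlobTransfer
{
    static void Main()
    {
        // Parse the storage account connection string (placeholder shown here).
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");

        // Get a reference to the target container and blob.
        var container = account.CreateCloudBlobClient()
            .GetContainerReference("weatherdata");
        var blob = container.GetBlockBlobReference("weather.csv");

        // Upload the local CSV; this overwrites any existing blob
        // of the same name, matching the behavior described above.
        blob.UploadFromFile(@"C:\AcuriteData\weather.csv");
    }
}
```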

Azure Data Factory Pipeline

An Azure Data Factory (ADF) pipeline first truncates a staging table in an Azure SQL DB using a Stored Procedure activity, then copies the weather data from the CSV file in Azure Blob Storage into the staging table using a Copy Data activity:

At the time of this writing, ADF version 2 is current.

You can download the ARM template for the pipeline here.
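
If you ever want to trigger a pipeline like this programmatically – say, right after abt uploads a fresh CSV – a sketch like the following works with the Microsoft.Azure.Management.DataFactory SDK. All names and credentials here are placeholders; this is not necessarily how AndyWeather triggers its runs:

```csharp
using System;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.Rest;

class RunPipeline
{
    static void Main()
    {
        // Authenticate with an Azure AD service principal (placeholder values).
        var context = new AuthenticationContext(
            "https://login.microsoftonline.com/<tenantId>");
        var token = context.AcquireTokenAsync(
            "https://management.azure.com/",
            new ClientCredential("<appId>", "<appKey>")).Result;

        var client = new DataFactoryManagementClient(
            new TokenCredentials(token.AccessToken))
        { SubscriptionId = "<subscriptionId>" };

        // Trigger a run of the weather-staging pipeline (hypothetical name).
        var run = client.Pipelines.CreateRun(
            "<resourceGroup>", "<dataFactoryName>", "LoadWeatherStaging");

        Console.WriteLine("Run ID: " + run.RunId);
    }
}
```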

The AndyWeather Website

The AndyWeather website has been around since the days of the first iteration of AndyWeather – the one that stored data in a SQL Server instance hosted at GoDaddy’s DMZ. It’s fairly straightforward code, which helps it perform fairly well on desktops and mobile devices:

The biggest performance hit comes from executing the stored procedure against an Azure SQL DB, which can sometimes take 5-10 seconds to complete.

The AndyWeather Tweets Console Application

I snagged some C# code and a TwitterAPI class from a project called TweetSharp to help build the awt console application:

You can download a copy of the awt solution here.
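
A tweet from awt boils down to something like this – a sketch using TweetSharp’s API, with placeholder credentials and a made-up status message:

```csharp
using TweetSharp;

class AndyWeatherTweets
{
    static void Main()
    {
        // Authenticate with the Twitter API using the app's keys (placeholders).
        var service = new TwitterService("<consumerKey>", "<consumerSecret>");
        service.AuthenticateWith("<accessToken>", "<accessTokenSecret>");

        // Tweet the latest weather reading.
        service.SendTweet(new SendTweetOptions
        {
            Status = "72.4F, 45% humidity, winds 3mph – AndyWeather"
        });
    }
}
```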

The @AndyWeather Twitter Account

It makes me happy every time I see a tweet from @AndyWeather:

I tell people, “It’s just a dumb little app,” but I really had fun building it. I learned a bunch, too!

Conclusion

The AndyWeather IoT solution uses hybrid technology – on-premises instruments and servers combined with cloud services – to deliver weather data to a website and a Twitter account, accessible from desktops and mobile devices.

Just so you know, this isn’t everything I’ve built using the AndyWeather instruments. There’s a bunch more – some of which is still in the experimental phase. I’ll share more as time permits. But I want you all to know, I consider Azure a great big cyber-playground!

:{>

Announcing the Fundamentals of Azure Data Factory Course!

I am excited to announce a brand new course (it still has that new course smell) from Brent Ozar Unlimited and honored to deliver it! This one-day, live, online course is titled Fundamentals of Azure Data Factory and it’s designed to introduce you to Azure Data Factory (ADF).

There will be demos.
Live demos.
Lots of live demos!

Abstract

Azure Data Factory, or ADF, is an Azure PaaS (Platform-as-a-Service) that provides hybrid data integration at global scale. Use ADF to build fully managed ETL in the cloud – including SSIS. Join Andy Leonard – author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – as he demonstrates practical Azure Data Factory use cases.

In this course, you’ll learn:

  • The essentials of Azure Data Factory (ADF)
  • Developing, testing, scheduling, monitoring, and managing ADF pipelines
  • Lifting and shifting SSIS to ADF SSIS Integration Runtime (Azure-SSIS)
  • ADF design patterns
  • Data Integration Lifecycle Management (DILM) for the cloud and hybrid data integration scenarios

To know if you’re ready for this class, look for “yes” answers to these questions:

  • Do you want to learn more about cloud data integration in Azure Data Factory?
  • Is your enterprise planning to migrate some or all of its data, databases, or data warehouse(s) to the cloud?
  • Do you currently use SSIS?

The next delivery is scheduled for 10 Dec 2018. Register today!

I hope to see you there.

:{>