Want to Learn More About Azure Data Factory?

From me?

I am honored to announce Getting Started with Azure Data Factory – a course from Enterprise Data & Analytics!

The next delivery is 04 Mar 2019, 9:00 AM – 4:30 PM ET.

If you sign up by 31 Jan, you save money. Right now the course is on sale!

Azure Data Factory, or ADF, is an Azure PaaS (Platform-as-a-Service) that provides hybrid data integration at global scale. Use ADF to build fully managed ETL in the cloud – including SSIS. Join Andy Leonard – author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – as he demonstrates practical Azure Data Factory use cases.

In this course, you’ll learn:

  • The essentials of ADF
  • Developing, testing, scheduling, monitoring, and managing ADF pipelines
  • Lifting and shifting SSIS to ADF SSIS Integration Runtime (Azure-SSIS)
  • ADF design patterns
  • Data Integration Lifecycle Management (DILM) for the cloud and hybrid data integration scenarios

Did someone mention a sale?

Yep! The course is currently on sale until 31 Jan 2019!

I hope to see you there!

PS – Join me For Expert SSIS Training!

Honored to Present Lift and Shift SSIS to ADF at #Azure DataFest Reston

I am honored to deliver Lift and Shift SSIS to ADF at the Azure DataFest in Reston, Virginia, on 11 Oct 2018!

Abstract

Your enterprise wants to use the latest cool Azure Data Analytics tools but there’s one issue: All your data are belong to the servers on-premises. How do you get your enterprise data into the cloud?

In this session, SSIS author and trainer Andy Leonard discusses and demonstrates migrating SSIS to Azure Data Factory Integration Runtime.

Register today!

:{>

Presenting Moving Data with Azure Data Factory at SQL Saturday Charlotte!

I am honored to present Moving Data with Azure Data Factory at SQL Saturday 806 in Charlotte, NC, on 20 Oct 2018.

This is the first time I am delivering this session. It still has that new presentation smell!

Abstract

Azure Data Factory – ADF – is a cloud data engineering solution. ADF version 2 sports a snappy web GUI (graphical user interface) and supports the SSIS Integration Runtime (IR) – or “SSIS in the Cloud.”

Attend this session to learn:
– How to build a “native ADF” pipeline;
– How to lift and shift SSIS to the Azure Data Factory Integration Runtime; and
– ADF Design Patterns to execute and monitor pipelines and packages.

I hope to see you there!

:{>

Using SSIS Framework Community Edition Webinar 20 Sep

Join me on 20 Sep 2018 at noon ET for a free webinar titled Using SSIS Framework Community Edition!

Abstract

SSIS Framework Community Edition is free and open source. You may know you can use SSIS Framework Community Edition to execute a collection of SSIS packages using a call to a single stored procedure passing a single parameter. But did you know you can also use it to execute a collection of SSIS packages in Azure Data Factory SSIS Integration Runtime? You can!
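
To make that concrete, here is a minimal C# sketch of the “one stored procedure, one parameter” pattern. The procedure and parameter names below are illustrative stand-ins, not the actual SSIS Framework Community Edition object names:

```csharp
// Hedged sketch: execute a Framework Application (a named, ordered
// collection of SSIS packages) with a single stored procedure call.
// The procedure and parameter names are hypothetical.
using System.Data;
using System.Data.SqlClient;

class ExecuteFrameworkApplication
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=SSISDB;Integrated Security=true;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("custom.execute_application", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            // The single parameter names the collection of packages to run.
            command.Parameters.AddWithValue("@ApplicationName", "Load Weather Data");

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```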

In this free webinar, join SSIS author, BimlHero, consultant, trainer, and blogger Andy Leonard at noon EDT Thursday, 20 Sep 2018, as he discusses and demonstrates SSIS Framework Community Edition – on-premises and in the cloud.

I hope to see you there!

Register today.

:{>

Introducing Azure Data Factory Design Patterns

I was honored to write an article titled Introducing Azure Data Factory Design Patterns, featured in this month’s PASS Insights newsletter!

The article covers a couple of execution patterns:

  1. Execute Child Pipeline
  2. Execute Child SSIS Package
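
For a flavor of the first pattern, here is a hedged sketch of an Execute Pipeline activity in ADF v2 JSON – the activity and pipeline names are invented for illustration, and the article itself is the authoritative walkthrough:

```json
{
  "name": "Execute Child Pipeline",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": {
      "referenceName": "ChildPipeline",
      "type": "PipelineReference"
    },
    "waitOnCompletion": true
  }
}
```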

I demonstrate a cool SSIS Catalog Browser feature that helps ADF developers configure the Execute SSIS Package activity.

To see it in action, download SSIS Catalog Browser – it’s one of the free utilities available at DILM Suite. Connect to the instance of Azure SQL DB that hosts an Azure Data Factory SSIS Integration Runtime Catalog, select the SSIS Package you want to execute using the Execute SSIS Package activity, and then copy the Catalog Path from the Catalog Browser status message.

Paste that value into the Package Path property of the Execute SSIS Package activity.

You can rinse and repeat – Catalog Browser surfaces Environment paths as well.
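
As an illustration (the folder, project, and package names here are invented), the value you copy and paste follows the folder/project/package shape:

```
Weather/LoadWeatherData/LoadStaging.dtsx
```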

Enjoy the article!

If you have any questions about Azure Data Factory – or need help getting started – please reach out!

Learn more:
Attend my full-day pre-conference session titled Intelligent Data Integration at the PASS Summit 2018 on 5 Nov 2018.
Check out my one-day course Fundamentals of Azure Data Factory, delivered in cooperation with Brent Ozar Unlimited on 10 Dec 2018!

AndyWeather Internet of Things (IoT)

AndyWeather.com is a site I’ve maintained for about 10 years now. I use the site and related hardware, software, and services to test concepts and perform experiments.

I then apply that experience when delivering Internet of Things (IoT) solutions for Enterprise Data & Analytics customers, and in SSIS and Biml training – such as my upcoming course, Fundamentals of Azure Data Factory, delivered in cooperation with Brent Ozar Unlimited.

It all started when GoDaddy created a DMZ for SQL Server databases. I found this functionality in 2008 and asked myself, “Self, how might we use this?”

Since That Time…

There have been two major iterations of AndyWeather. I use weather data collected during the first iteration for training purposes at SSIS Academy and when delivering training to Enterprise Data & Analytics customers.

AndyWeather v2

The setup of the second iteration is fairly straightforward:

  1. The Acurite Weather Station consists of an instrument pack plus a base station. The instruments collect weather measurements and transmit them to the base station.
  2. The base station is connected to an older e-Machine running Windows 7 Ultimate (32-bit) with 2 GB of RAM.
  3. An Acurite application interfaces with the base station and stores the data locally in a single CSV file.
  4. A very simple C# console application I wrote, named “abt” (short for “Azure Blob Transfer”), transfers the CSV file to Azure Blob Storage.
  5. An Azure Data Factory pipeline loads the data from the CSV file into an Azure SQL DB staging table.
  6. The AndyWeather website reads the latest weather data from the Azure SQL DB staging table.
  7. Another very simple C# console application I wrote, named “awt” (short for “AndyWeather Tweets”), tweets updates to the @AndyWeather twitter account.

Acurite Weather Station

The latest iteration began in early 2018 when I purchased an updated package of instruments and a new base station made by Acurite. So far, I like this station a lot. It was less expensive than the previous station and appears more rugged (again, so far – time will tell).

I recently relocated the weather station to improve connectivity between the instruments and the base station. I recorded a Data Driven *DataPoint* about it.

(Pay no attention to the exploding pecans in the background…)

The e-Machine

I intentionally use an under-powered PC for the server. Why? I want to learn how the base station – and then everything downstream of the base station – responds to busy server conditions. This is Engineering 101 stuff and I’ve learned a lot.

I love this old machine!

Acurite Application

The Acurite people maintain an application for communicating with base stations.

The PC Connect application allows me to configure how and when weather data is collected from the base station – which collects measurements from the instruments. The application lets me configure the units-of-measure and file location – and I can even share my weather data with Weather Underground. How cool is that?

The Azure Blob Transfer Console Application

The Azure Blob Transfer (abt) application is a very simple console application written in C#. It picks up the CSV file containing weather data stored by the Acurite PC Connect application and writes the file to an Azure Blob Storage container.

The CSV file in Azure Blob Storage is overwritten each time abt successfully executes. You can download a copy of the abt solution here.
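
If you are curious what such a transfer looks like in code, here is a minimal sketch using the WindowsAzure.Storage SDK that was current at the time. The account, container, and file names are illustrative, and the actual abt solution (linked above) may differ:

```csharp
// Hedged sketch: upload (and overwrite) a weather CSV file in an
// Azure Blob Storage container. Names and paths are illustrative.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class AzureBlobTransfer
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");
        CloudBlobClient client = account.CreateCloudBlobClient();

        // The container that holds the single weather CSV file.
        CloudBlobContainer container = client.GetContainerReference("weather");
        container.CreateIfNotExists();

        // Uploading to the same blob name overwrites it on every run.
        CloudBlockBlob blob = container.GetBlockBlobReference("weather.csv");
        blob.UploadFromFile(@"C:\Acurite\weather.csv");
    }
}
```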

Azure Data Factory Pipeline

An Azure Data Factory (ADF) pipeline uses a Stored Procedure activity to call a stored procedure that truncates a staging table in an Azure SQL DB, followed by a Copy Data activity that copies the weather data from the CSV file in Azure Blob Storage into the staging table.

At the time of this writing, ADF version 2 is current.

You can download the ARM template for the pipeline here.
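
The ARM template is the authoritative definition; for a sense of the pipeline’s shape, here is a hedged ADF v2 JSON sketch of the two-activity pattern. All names – pipeline, activities, linked service, datasets, and stored procedure – are invented for illustration:

```json
{
  "name": "LoadWeatherStaging",
  "properties": {
    "activities": [
      {
        "name": "Truncate Staging Table",
        "type": "SqlServerStoredProcedure",
        "linkedServiceName": {
          "referenceName": "AndyWeatherAzureSqlDb",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "storedProcedureName": "[dbo].[TruncateWeatherStaging]"
        }
      },
      {
        "name": "Copy Weather CSV",
        "type": "Copy",
        "dependsOn": [
          {
            "activity": "Truncate Staging Table",
            "dependencyConditions": [ "Succeeded" ]
          }
        ],
        "inputs": [ { "referenceName": "WeatherCsvBlob", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "WeatherStaging", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```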

The AndyWeather Website

The AndyWeather website has been around since the days of the first iteration of AndyWeather – the one that stored data in a SQL Server instance hosted at GoDaddy’s DMZ. It’s fairly straightforward code, which helps it perform reasonably well on desktops and mobile devices.

The biggest performance hit comes from executing the stored procedure against an Azure SQL DB, which can sometimes take 5-10 seconds to complete.

The AndyWeather Tweets Console Application

I snagged some C# code and a TwitterAPI class from a project called TweetSharp to help build the awt console application.

You can download a copy of the awt solution here.
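
Here is a minimal sketch of what tweeting a reading with TweetSharp can look like. The keys, tokens, and status text are placeholders, and the actual awt solution (linked above) may differ:

```csharp
// Hedged sketch: post a weather update with TweetSharp.
// Consumer keys and access tokens are placeholders.
using TweetSharp;

class AndyWeatherTweets
{
    static void Main()
    {
        var service = new TwitterService("consumerKey", "consumerSecret");
        service.AuthenticateWith("accessToken", "accessTokenSecret");

        // awt composes the status text from the latest weather reading.
        service.SendTweet(new SendTweetOptions
        {
            Status = "72.4F, 45% humidity, wind 3 mph NW #AndyWeather"
        });
    }
}
```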

The @AndyWeather Twitter Account

It makes me happy every time I see a tweet from @AndyWeather.

I tell people, “It’s just a dumb little app,” but I really had fun building it. I learned a bunch, too!

Conclusion

The AndyWeather IoT solution uses hybrid technology – on-premises instruments and servers combined with cloud services – to deliver weather data to a website and a Twitter account, accessible from desktops and mobile devices.

Just so you know, this isn’t everything I’ve built using the AndyWeather instruments. There’s a bunch more – some of which is still in the experimental phase. I’ll share more as time permits. But I want you all to know, I consider Azure a great big cyber-playground!

:{>

Announcing the Fundamentals of Azure Data Factory Course!

I am excited to announce a brand new course (it still has that new course smell) from Brent Ozar Unlimited and honored to deliver it! This one-day, live, online course is titled Fundamentals of Azure Data Factory and it’s designed to introduce you to Azure Data Factory (ADF).

There will be demos.
Live demos.
Lots of live demos!

Abstract

Azure Data Factory, or ADF, is an Azure PaaS (Platform-as-a-Service) that provides hybrid data integration at global scale. Use ADF to build fully managed ETL in the cloud – including SSIS. Join Andy Leonard – author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – as he demonstrates practical Azure Data Factory use cases.

In this course, you’ll learn:

  • The essentials of Azure Data Factory (ADF)
  • Developing, testing, scheduling, monitoring, and managing ADF pipelines
  • Lifting and shifting SSIS to ADF SSIS Integration Runtime (Azure-SSIS)
  • ADF design patterns
  • Data Integration Lifecycle Management (DILM) for the cloud and hybrid data integration scenarios

To know if you’re ready for this class, look for “yes” answers to these questions:

  • Do you want to learn more about cloud data integration in Azure Data Factory?
  • Is your enterprise planning to migrate some or all of its data, databases, and data warehouses to the cloud?
  • Do you currently use SSIS?

The next delivery is scheduled for 10 Dec 2018. Register today!

I hope to see you there.

:{>

Honored to Present Lift and Shift SSIS to ADF at #Azure DataFest Atlanta

The #Azure DataFest Atlanta Schedule has been posted and I am honored to be selected to present Lift and Shift SSIS to ADF!

Check out this lineup of presenters and presentations. I describe this group as: “a bunch of smart people – plus me!”

Lift and Shift SSIS to ADF Abstract

Your enterprise wants to use the latest cool Azure Data Analytics tools but there’s one issue: All your data are belong to the servers on-premises. How do you get your enterprise data into the cloud?

In this session, SSIS author and trainer Andy Leonard discusses and demonstrates migrating SSIS to Azure Data Factory Integration Runtime.

Register today!

:{>