Catalog Browser v0.7.8.0

I’ve been making smaller, more incremental changes to SSIS Catalog Browser – a free utility from the Data Integration Lifecycle Management suite (DILM Suite).

You can use SSIS Catalog Browser to view SSIS Catalog contents on a unified surface. Catalog Browser works with SSIS Catalogs on-premises and with the Azure Data Factory SSIS Integration Runtime, or Azure-SSIS. It’s pretty cool, and the price ($0 USD) is right!

The latest change is a version check that offers to send you to the download page when an update is available. This change first appears in version 0.7.7.0. Version 0.7.8.0 includes a slightly better-formatted version-check message. As I said, smaller, more incremental changes.
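
For the curious, a version check like this needs only a few lines of C#. Here is a minimal sketch, assuming the latest version number is published somewhere as a plain-text string – the URL below is hypothetical, not Catalog Browser’s actual endpoint:

    using System;
    using System.Net.Http;
    using System.Reflection;
    using System.Threading.Tasks;

    static class VersionCheck
    {
        // Hypothetical URL – substitute wherever the latest version string is published.
        const string LatestVersionUrl = "https://example.com/catalog-browser/latest-version.txt";

        public static async Task<bool> UpdateAvailableAsync()
        {
            using (var http = new HttpClient())
            {
                // Fetch the latest published version as plain text, e.g. "0.7.8.0".
                var latestText = (await http.GetStringAsync(LatestVersionUrl)).Trim();
                var latest = Version.Parse(latestText);

                // Compare against the version compiled into the running assembly.
                var current = Assembly.GetExecutingAssembly().GetName().Version;
                return latest > current;
            }
        }
    }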

Enjoy!

Honored to Present Lift and Shift SSIS to ADF at #Azure DataFest Reston

I am honored to deliver Lift and Shift SSIS to ADF at the Azure DataFest in Reston, Virginia, 11 Oct 2018!

Abstract

Your enterprise wants to use the latest cool Azure Data Analytics tools but there’s one issue: All your data are belong to the servers on-premises. How do you get your enterprise data into the cloud?

In this session, SSIS author and trainer Andy Leonard discusses and demonstrates migrating SSIS to the Azure Data Factory SSIS Integration Runtime.

Register today!

:{>

Viewing SSIS Configurations Metadata in SSIS Catalog Browser

SSIS Catalog Browser is a pretty neat product. “How neat is it, Andy?” I’m glad you asked.

It’s free. That makes it difficult to beat the cost.

SSIS Catalog Browser is designed to surface all SSIS Catalog artifacts and properties in a single view. “What exactly does that mean, Andy?” You’re sharp. Let’s talk about why surfacing everything in a single view is important.

Before I go on, you may read what I’m about to write here and in the companion post and think, “Andy doesn’t like the Integration Services Catalogs node in SSMS.” That is not accurate. I do like the Integration Services Catalogs node in SSMS. It surfaces enough information for the primary target user of SSMS – the Database Administrator – to see what they need to see to do their job, without “cluttering up” their interface with stuff they rarely need to see and even more rarely change.

In the companion post I shared this image of the windows (and pages) you need to open in SSMS to view the configured execution-time value of a parameter that is mapped via reference:

That’s a lot of open windows.

So how does one view the same configuration metadata in SSIS Catalog Browser?

Under the Project node (LiftAndShift), we find a virtual folder that holds Project Parameters.

In Project Parameters, we find a reference mapping – indicated by underlined text – describing the mapping between the parameter (ProjectParameter) and the SSIS Catalog Environment Variable (StringParameter).

Expanding the reference mapping node surfaces References. There are two references named env1 and env2. Since references can reference SSIS Catalog Environments in other Catalog folders, the fully-qualified path to each SSIS Catalog environment is shown in folder/environment format.

Expanding each reference node surfaces the value of the SSIS Catalog Environment Variable in each SSIS Catalog Environment.
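
Under the hood, this metadata lives in SSISDB’s catalog views. Here is a minimal C# sketch that surfaces the same parameter-to-environment-variable mappings, assuming the standard SSISDB catalog views – the connection string is a placeholder, and the environment join is simplified (a production query would also resolve relative references through the project’s own folder):

    using System;
    using System.Data.SqlClient;

    class ValuesEverywhere
    {
        static void Main()
        {
            // Placeholder – point this at the instance hosting SSISDB.
            const string connStr = "Server=.;Database=SSISDB;Integrated Security=true;";

            // For each referenced (mapped) parameter, list every environment
            // reference and the environment variable value it resolves to.
            const string sql = @"
    SELECT p.name AS project_name,
           op.parameter_name,
           op.referenced_variable_name,
           r.environment_folder_name,
           r.environment_name,
           ev.value
    FROM catalog.object_parameters op
    JOIN catalog.projects p ON p.project_id = op.project_id
    JOIN catalog.environment_references r ON r.project_id = p.project_id
    JOIN catalog.environments e ON e.name = r.environment_name
    JOIN catalog.environment_variables ev ON ev.environment_id = e.environment_id
                                         AND ev.name = op.referenced_variable_name
    WHERE op.value_type = 'R';"; // 'R' = referenced via environment variable

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine($"{reader["parameter_name"]} --> " +
                            $"{reader["environment_folder_name"]}/{reader["environment_name"]}." +
                            $"{reader["referenced_variable_name"]} = {reader["value"]}");
                    }
                }
            }
        }
    }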

I call this feature Values Everywhere, and I like it. A lot.

Values Everywhere From the Project Reference Perspective

Values Everywhere is perspective-aware. From the perspective of an SSIS Project Parameter, Values Everywhere surfaces the reference mapping in the format parameter -> environment variable. In the Project References virtual folder, Values Everywhere surfaces the same relationship as environment variable -> parameter:

Values Everywhere perspectives follow intuition when surfacing reference mapping relationships. (Did I mention I like this feature? A lot?)

Conclusion

SSIS Catalog Browser provides a clean interface for enterprise Release Management and Configuration teams. And it’s free.

I can hear you thinking, “Why is Catalog Browser free, Andy?” I give away Catalog Browser to demonstrate the surfacing capabilities of SSIS Catalog Compare.

SSIS Catalog Compare

SSIS Catalog Compare not only surfaces two SSIS Catalogs side by side, it also lets you compare the contents of the Catalogs:

You can also script an entire SSIS Catalog, which produces T-SQL scripts and ISPAC files for every artifact in the SSIS Catalog (organized by instance and folder):

You can also deploy all artifacts contained in an SSIS Catalog Folder from one SSIS Catalog to another:

This functionality is an efficient method for Data Integration Lifecycle Management – or DevOps – with SSIS.

SSIS Catalog Compare even works with Azure Data Factory SSIS Integration Runtime, so you can use SSIS Catalog Compare to lift and shift SSIS from on-premises Catalogs to the cloud.

Presenting Moving Data with Azure Data Factory at SQL Saturday Charlotte!

I am honored to present Moving Data with Azure Data Factory at SQL Saturday 806 in Charlotte, NC, 20 Oct 2018.

This is the first time I am delivering this session. It still has that new presentation smell!

Abstract

Azure Data Factory – ADF – is a cloud data engineering solution. ADF version 2 sports a snappy web GUI (graphical user interface) and supports the SSIS Integration Runtime (IR) – or “SSIS in the Cloud.”

Attend this session to learn:
– How to build a “native ADF” pipeline;
– How to lift and shift SSIS to the Azure Data Factory SSIS Integration Runtime; and
– ADF Design Patterns to execute and monitor pipelines and packages.

I hope to see you there!

:{>

Introducing Azure Data Factory Design Patterns

I was honored to write an article titled Introducing Azure Data Factory Design Patterns featured in this month’s PASS Insights newsletter!

Introducing Azure Data Factory Design Patterns

The article covers a couple of execution patterns:

  1. Execute Child Pipeline
  2. Execute Child SSIS Package

I demonstrate a cool SSIS Catalog Browser feature that helps ADF developers configure the Execute SSIS Package activity.

To see it in action, download SSIS Catalog Browser – it’s one of the free utilities available at DILM Suite. Connect to the instance of Azure SQL DB that hosts an Azure Data Factory SSIS Integration Runtime Catalog, select the SSIS Package you want to execute using the Execute SSIS Package activity, and then copy the Catalog Path from the Catalog Browser status message:

Paste that value into the Package Path property of the Execute SSIS Package activity:

You can rinse and repeat – Catalog Browser surfaces Environment paths as well:
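
If you build the pipeline in code rather than in the web GUI, the same paths plug into the activity’s properties. Here is a minimal sketch, assuming the ADF .NET management SDK (Microsoft.Azure.Management.DataFactory) – the folder, project, package, environment, and integration runtime names are illustrative:

    using Microsoft.Azure.Management.DataFactory.Models;

    var execPackage = new ExecuteSSISPackageActivity
    {
        Name = "Execute SSIS Package",
        // Paste the Catalog Path surfaced by Catalog Browser here:
        PackageLocation = new SSISPackageLocation
        {
            PackagePath = "LiftAndShift/LiftAndShift/Package.dtsx"
        },
        // Catalog Browser surfaces Environment paths the same way:
        EnvironmentPath = "LiftAndShift/env1",
        ConnectVia = new IntegrationRuntimeReference("Azure-SSIS-IR")
    };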

Enjoy the article!

If you have any questions about Azure Data Factory – or need help getting started – please reach out!

Learn more:
Attend my full-day pre-conference session titled Intelligent Data Integration at the PASS Summit 2018 on 5 Nov 2018.
Check out this 1-day course on Fundamentals of Azure Data Factory, delivered in cooperation with Brent Ozar Unlimited 10 Dec 2018!

AndyWeather Internet of Things (IoT)

AndyWeather.com is a site I’ve maintained for about 10 years now. I use the site and related hardware, software, and services to test concepts and perform experiments.

I then apply that experience to delivering Internet of Things (IoT) solutions for Enterprise Data & Analytics customers and to SSIS and Biml training, such as my upcoming course titled Fundamentals of Azure Data Factory, delivered in cooperation with Brent Ozar Unlimited.

It all started when GoDaddy created a DMZ for SQL Server databases. I found this functionality in 2008 and asked myself, “Self, how might we use this?”

Since That Time…

There have been two major iterations of AndyWeather. I use weather data collected during the first iteration for training purposes at SSIS Academy and when delivering training to Enterprise Data & Analytics customers.

AndyWeather v2

The setup of the second iteration is fairly straightforward:

  1. The Acurite Weather Station consists of an instrument pack plus a base station. The instruments collect weather measurements and transmit them to the base station.
  2. The base station is connected to an older e-Machine running Windows 7 Ultimate (32-bit) on 2GB RAM.
  3. An Acurite application interfaces with the base station and the application stores data locally in a single CSV file.
  4. I wrote a very simple C# console application named “abt” (an acronym for “Azure Blob Transfer”) to transfer the CSV file to Azure Blob Storage.
  5. An Azure Data Factory pipeline loads the CSV data into an Azure SQL DB staging table.
  6. The AndyWeather website reads the latest weather data from the Azure SQL DB staging table.
  7. I wrote another very simple C# application named “awt” (an acronym for “AndyWeather Tweets”) that tweets updates to the @AndyWeather Twitter account.

Acurite Weather Station

The latest iteration began in early 2018 when I purchased an updated package of instruments and a new base station made by Acurite. So far, I like this station a lot. It was less expensive than the previous station and appears more rugged (again, so far – time will tell).

I recently relocated the weather station to improve connectivity between the instruments and the base station. I recorded a Data Driven *DataPoint* about it:

(Pay no attention to the exploding pecans in the background…)

The e-Machine

I intentionally use an under-powered PC for the server. Why? I want to learn how the base station – and then everything downstream of the base station – responds to busy server conditions. This is Engineering 101 stuff and I’ve learned a lot:

I love this old machine!

Acurite Application

The Acurite people maintain an application for communicating with base stations:

The PC Connect application allows me to configure how and when weather data is collected from the base station – which collects measurements from the instruments. The application lets me configure the units-of-measure and file location – and I can even share my weather data with Weather Underground. How cool is that?

The Azure Blob Transfer Console Application

The Azure Blob Transfer (abt) application is a very simple console application written in C#. It picks up the CSV file containing weather data stored by the Acurite PC Connect application and writes the file to an Azure Blob Storage container:

The CSV file in Azure Blob Storage is overwritten each time abt successfully executes. You can download a copy of the abt solution here.
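
The heart of abt fits in a handful of lines. Here is a minimal sketch of the pattern, assuming the WindowsAzure.Storage client library of that era – the connection string, file path, container, and blob names are placeholders:

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class Abt
    {
        static void Main()
        {
            // Placeholders – the real values live in abt's configuration.
            const string connStr = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>";
            const string localCsv = @"C:\Acurite\weather.csv";

            var account = CloudStorageAccount.Parse(connStr);
            var client = account.CreateCloudBlobClient();
            var container = client.GetContainerReference("weather");

            // Uploading to the same blob name overwrites it on every successful run.
            var blob = container.GetBlockBlobReference("weather.csv");
            blob.UploadFromFile(localCsv);
        }
    }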

Azure Data Factory Pipeline

An Azure Data Factory (ADF) pipeline first truncates a staging table in an Azure SQL DB by calling a stored procedure via a Stored Procedure activity, then uses a Copy Data activity to copy the weather data from the CSV file in Azure Blob Storage into the staging table:
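
In code terms, the shape of the pipeline looks something like this. A minimal sketch, assuming the ADF .NET management SDK (Microsoft.Azure.Management.DataFactory) – the activity, dataset, linked service, and stored procedure names are illustrative, not the real ones:

    using System.Collections.Generic;
    using Microsoft.Azure.Management.DataFactory.Models;

    var truncate = new SqlServerStoredProcedureActivity
    {
        Name = "Truncate Staging",
        StoredProcedureName = "dbo.TruncateWeatherStaging", // hypothetical procedure
        LinkedServiceName = new LinkedServiceReference("AzureSqlDb")
    };

    var copy = new CopyActivity
    {
        Name = "Copy Weather CSV",
        Inputs = new List<DatasetReference> { new DatasetReference("WeatherCsvBlob") },
        Outputs = new List<DatasetReference> { new DatasetReference("WeatherStaging") },
        Source = new BlobSource(),  // reads the CSV from Blob Storage
        Sink = new SqlSink(),       // writes to the Azure SQL DB staging table
        // The copy runs only after the truncate succeeds.
        DependsOn = new List<ActivityDependency>
        {
            new ActivityDependency
            {
                Activity = "Truncate Staging",
                DependencyConditions = new List<string> { "Succeeded" }
            }
        }
    };

    var pipeline = new PipelineResource
    {
        Activities = new List<Activity> { truncate, copy }
    };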

At the time of this writing, ADF version 2 is current.

You can download the ARM template for the pipeline here.

The AndyWeather Website

The AndyWeather website has been around since the days of the first iteration of AndyWeather – the one that stored data in a SQL Server instance hosted in GoDaddy’s DMZ. It’s fairly straightforward code, which helps it perform reasonably well on desktops and mobile devices:

The biggest performance hit comes from executing the stored procedure against an Azure SQL DB, which can sometimes take 5-10 seconds to complete.

The AndyWeather Tweets Console Application

I snagged some C# code and a TwitterAPI class from a project called TweetSharp to help build the awt console application:

You can download a copy of the awt solution here.
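
If you want to roll your own, the classic TweetSharp pattern is only a few lines. A minimal sketch, assuming TweetSharp’s TwitterService – the keys and status text are placeholders:

    using TweetSharp;

    class Awt
    {
        static void Main()
        {
            // Placeholder credentials from the Twitter developer portal.
            var service = new TwitterService("consumerKey", "consumerSecret");
            service.AuthenticateWith("accessToken", "accessTokenSecret");

            // Tweet the latest reading – the status text here is illustrative.
            service.SendTweet(new SendTweetOptions
            {
                Status = "72.4F, 45% humidity, wind 3 mph"
            });
        }
    }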

The @AndyWeather Twitter Account

It makes me happy every time I see a tweet from @AndyWeather:

I tell people, “It’s just a dumb little app,” but I really had fun building it. I learned a bunch, too!

Conclusion

The AndyWeather IoT solution uses hybrid technology – on-premises instruments and servers combined with cloud services – to deliver weather data to a website and a Twitter account, accessible from desktops and mobile devices.

Just so you know, this isn’t everything I’ve built using the AndyWeather instruments. There’s a bunch more – some of which is still in the experimental phase. I’ll share more as time permits. But I want you all to know, I consider Azure a great big cyber-playground!

:{>

Announcing the Fundamentals of Azure Data Factory Course!

I am excited to announce a brand new course (it still has that new course smell) from Brent Ozar Unlimited and honored to deliver it! This one-day, live, online course is titled Fundamentals of Azure Data Factory and it’s designed to introduce you to Azure Data Factory (ADF).

There will be demos.
Live demos.
Lots of live demos!

Abstract

Azure Data Factory, or ADF, is an Azure PaaS (Platform-as-a-Service) that provides hybrid data integration at global scale. Use ADF to build fully managed ETL in the cloud – including SSIS. Join Andy Leonard – author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – as he demonstrates practical Azure Data Factory use cases.

In this course, you’ll learn:

  • The essentials of Azure Data Factory (ADF)
  • Developing, testing, scheduling, monitoring, and managing ADF pipelines
  • Lifting and shifting SSIS to ADF SSIS Integration Runtime (Azure-SSIS)
  • ADF design patterns
  • Data Integration Lifecycle Management (DILM) for the cloud and hybrid data integration scenarios

To know if you’re ready for this class, look for “yes” answers to these questions:

  • Do you want to learn more about cloud data integration in Azure Data Factory?
  • Is your enterprise planning to migrate some or all of its data, databases, or data warehouse(s) to the cloud?
  • Do you currently use SSIS?

The next delivery is scheduled for 10 Dec 2018. Register today!

I hope to see you there.

:{>

PASS Summit 2018 Starts in 10 Weeks!

Can you believe the PASS Summit 2018 begins only 10 weeks from today (27 Aug 2018)? I confess, this is sneaking up on me fast!

I will be there. Will you?

Where Can You Find Andy at the PASS Summit 2018?

Precon!

On Monday 5 Nov 2018, I’m delivering a full-day pre-conference session titled Intelligent Data Integration with SSIS. I’m going to cover everything listed at that link, but there is an update about my precon content:

There will be Azure Data Factory content and demos!

Why this addition? Two reasons:

  1. My presentation titled Faster SSIS was selected. I usually include the three Faster SSIS demos in my precon. This time, you can just view the Faster SSIS session to see those demos.
  2. I may have something cool and new to share about Azure Data Factory that is currently under NDA! Stay tuned…

Enterprise Data & Analytics is Exhibiting!

That’s right, you can find me in the Exhibition Hall! Enterprise Data & Analytics is exhibiting at the PASS Summit 2018!

Have an SSIS or Biml or ADF question? Stop by our booth!
Want to grab a selfie with me or Nick? Stop by our booth!
Want me to autograph your book? Stop by our booth!
Need some consulting or training help? Stop by our booth!

I’m so excited about this – I can hardly wait. We’ll have more information in the coming weeks about the specific dates and times when I will be manning the booth.

Presenting Faster SSIS

At the time of this writing, the session schedule has not yet been published. PASS has published a general schedule. Keep checking for details!

Conclusion

I am looking forward to the PASS Summit 2018. I hope to see you there.

:{>

Lift and Shift an SSIS Catalog…

… to a different instance on-premises.
… to Azure Data Factory SSIS Integration Runtime.

This screenshot is from the SSIS Catalog Compare documentation. You can script an entire SSIS Catalog using SSIS Catalog Compare!

This awesome functionality can save you hours (days, weeks, even!) when starting a new project or updating an existing project.

This awesome functionality supports DevOps with SSIS.

Learn more about SSIS Catalog Compare.

View the recording of Use SSIS Catalog Compare to Lift and Shift SSIS to Azure Data Factory (registration required).