HowTo: Install SSIS Framework Community Edition

SSIS Framework Community Edition is available from DILM Suite (Data Integration Lifecycle Management Suite). SSIS Framework CE is not only free, it is open source.

To install SSIS Framework Community Edition, visit DILM Suite and click SSIS Framework Community Edition:

When the SSIS Framework Community Edition page opens, click the first link:

This link will take you to GitHub where you may download the code:

Unzip the file and open the PDF titled SSIS Framework Community Edition Documentation and User Guide:

The documentation includes instructions for creating an SSIS Catalog on-premises – or an instance of Azure-SSIS in Azure Data Factory:

The Documentation and User Guide next walks you through the process of upgrading the SSIS packages to the latest version of SSIS, followed by instructions for deploying the Framework project:

The next step is to execute T-SQL scripts that configure the project you just deployed and also create the SSIS Framework objects:
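As a hedged sketch of what this kind of configuration script does (the folder, project, environment, and parameter names below are placeholders, not the Framework's actual script, which is included in the download), configuring a deployed project via T-SQL typically uses the documented SSISDB catalog stored procedures:

```sql
/* Illustrative only: create a Catalog environment, add a variable,
   reference it from the deployed project, and map a project parameter
   to it. Names are placeholders, not the Framework's actual objects. */
USE SSISDB;
GO
EXEC catalog.create_environment
     @folder_name = N'Framework', @environment_name = N'Connections';

EXEC catalog.create_environment_variable
     @folder_name = N'Framework', @environment_name = N'Connections'
   , @variable_name = N'TargetServer', @data_type = N'String'
   , @sensitive = 0, @value = N'localhost';

DECLARE @reference_id bigint;
EXEC catalog.create_environment_reference
     @folder_name = N'Framework', @project_name = N'FrameworkProject'
   , @environment_name = N'Connections', @reference_type = N'R'
   , @reference_id = @reference_id OUTPUT;

EXEC catalog.set_object_parameter_value
     @object_type = 20  -- 20 = project-level parameter
   , @folder_name = N'Framework', @project_name = N'FrameworkProject'
   , @parameter_name = N'TargetServer'
   , @parameter_value = N'TargetServer'
   , @value_type = N'R'; -- 'R' = referenced (maps to the environment variable)
```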

The remainder of the document walks you through testing and customizing the SSIS Framework for your own needs. A customizing primer is included, in which I demonstrate how to extend Framework functionality.

Learn More

Click here to learn about enterprise SSIS Framework editions.
Click here to learn about SSIS Framework Browser, which is free.

New SSIS Catalog Browser Azure Properties

Thanks to some help from my friend, Sandy Winarko, SSIS Catalog Browser version 0.8.9.0 now includes Azure-SSIS Catalog Properties for Azure Worker Agents and Azure Status. When Azure-SSIS is stopped or starting, SSIS Catalog Browser reports an Azure Status of “Not Running”:

(click to enlarge)

Once even one Azure-SSIS worker agent starts, SSIS Catalog Browser reports an Azure Status of “Running” and surfaces the number of Azure Worker Agents currently running:

(click to enlarge)

Once all Azure-SSIS worker agents start, SSIS Catalog Browser surfaces the number of Azure Worker Agents currently running:

(click to enlarge)

SSIS Catalog Browser is one of the free utilities available for download from DILM Suite (Data Integration Lifecycle Management Suite).

The Recording for Loading Medical Data with SSIS is Available

Kent Bradshaw and I had a blast delivering Loading Medical Data with SSIS earlier today! If you missed the webinar, you also missed – perhaps more importantly – coupon codes to save on upcoming Enterprise Data & Analytics Training.

Enjoy the video!

We demonstrated a handful of (free!) DILM Suite (Data Integration Lifecycle Management Suite) utilities:

Join us next week for another free webinar: Enterprise SSIS Execution!

Free Webinar – Enterprise SSIS Execution

Join Kent Bradshaw and me as we present (another) free Enterprise Data & Analytics webinar Tuesday, 23 Apr 2019 at 12:00 PM EDT: Enterprise SSIS Execution.

Abstract

SQL Server Integration Services (SSIS) is a powerful enterprise data integration tool that ships free with Microsoft SQL Server. Join Andy Leonard – Microsoft Data Platform MVP, author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – and Kent Bradshaw – Database Administrator, Developer, and Data Scientist at Enterprise Data & Analytics – as they demonstrate several ways to execute enterprise SSIS.

Join this webinar and learn how to execute SSIS from:

  • SSDT (SQL Server Data Tools)
  • the Command Prompt
  • the SSIS Catalog
  • a metadata-driven SSIS Framework

Register today!

:{>

Regarding SSIS Frameworks, Part 1 – Execution

Why Use a Framework?

The first answer is, “Maybe you don’t need or want to use a framework.” If your enterprise data integration consists of a small number of SSIS packages, a framework could be an extra layer of metadata-management hassle that, frankly, you can live without. We will unpack this in a minute…

“You Don’t Need a Framework”

I know some really smart SSIS developers and consultants for whom this is the answer.
I know some less-experienced SSIS developers and consultants for whom this is the answer.
Some smart and less-experienced SSIS developers and consultants may change their minds once they gain at-scale experience and encounter some of the problems a framework solves.
Some will not.

If that last paragraph rubbed you the wrong way, I ask you to read the next one before closing this post:

One thing to consider: If you work with other data integration platforms – such as DataStage or Informatica – you will note these platforms include framework functionality built-in. Did the developers of these platforms include a bunch of unnecessary overhead in their products? No. They built in framework functionality because framework functionality is a solution for common data integration issues encountered at enterprise scale.

If your data integration consultant tells you that you do not need a framework, one of two things is true:
1. They are correct, you do not need a framework; or
2. They have not yet encountered the problems a framework solves, issues that only arise when one architects a data integration solution at scale.

– Andy, circa 2018

Data Integration Framework: Defined

A data integration framework manages three things:

  1. Execution
  2. Configuration
  3. Logging

This post focuses on…

Execution

If you read the paragraph above and thought, “I don’t need a framework for SSIS. I have a small number of SSIS packages in my enterprise,” I promised we would unpack that thought. You may have a small number of packages because you built one or more monolith(s). A monolith is one large package containing all the logic required to perform a data integration operation – such as staging from sources.

(click to enlarge)

The monolith shown above is from a (free!) webinar Kent Bradshaw and I deliver on 17 Apr 2019. It’s called Loading Medical Data with SSIS. We refactor this monolith into four smaller packages – one for each Sequence Container – and add a (Batch) Controller package to execute them in order. I can hear some of you thinking…

“Why Refactor, Andy?”

I’m glad you asked! Despite the fact that its name contains the name of a popular relational database engine (SQL Server), SQL Server Integration Services is a software development platform. If you search for software development best practices, you will find something called Separation of Concerns near the top of everyone’s list.

One component of separation of concerns is decoupling chunks of code into smaller modules of encapsulated functionality. Applied to SSIS, this means Monoliths must die:

A slide from the Expert SSIS training…

If your SSIS package has a dozen Data Flow Tasks and one fails, you have to dig through the logs – a little, not a lot; but it’s at 2:00 AM – to figure out what failed and why. You can cut down the “what failed” part by building SSIS packages that contain a single Data Flow Task per package.

If you took that advice, you are now the proud owner of a bunch of SSIS packages. How do you manage execution?

Solutions

There are a number of solutions. You could:

  1. Daisy-chain package execution by using an Execute Package Task at the end of each SSIS package Control Flow that starts the next SSIS package.
  2. Create a Batch Controller SSIS package that uses Execute Package Tasks to execute each package in the desired order and degree of parallelism.
  3. Delegate execution management to a scheduling solution (SQL Agent, etc.).
  4. Use an SSIS Framework.
  5. Some combination of the above.
  6. None of the above (there are other options…).

Daisy-Chain

Daisy-chaining package execution has some benefits:

  • Easy to interject a new SSIS package into the workflow: simply add the new package and update the preceding package’s Execute Package Task.

Daisy-chaining package execution has some drawbacks:

  • Adding a new package to a daisy-chained solution almost always requires deployment of two SSIS packages – the package before the new SSIS package (with a reconfigured Execute Package Task) along with the new SSIS package. The exception is a new first package. A new last package would also require the “old last package” be updated.

Batch Controller

Using a Batch Controller package has some benefits:

  • Relatively easy to interject a new SSIS package into the workflow. As with daisy-chain, add the new package and modify the Controller package by adding a new Execute Package Task to kick off the new package when desired.

Batch-controlling package execution has some drawbacks:

  • Adding a new package to a batch-controlled solution always requires deployment of two SSIS packages – the new SSIS package and the updated Controller SSIS package.

Scheduler Delegation

Depending on the scheduling utility in use, adding a package to the workflow can be really simple or horribly complex. I’ve seen both and I’ve also seen methods of automation that mitigate horribly-complex schedulers.

Use a Framework

I like metadata-driven SSIS frameworks because they’re metadata-driven. Why’s metadata-driven so important to me? To the production DBA or Operations people monitoring the systems in the middle of the night, SSIS package execution is just another batch process using server resources. Some DBAs and operations people comprehend SSIS really well, some do not. We can make life easier for both by surfacing as much metadata and operational logging – ETL instrumentation – as possible.

Well-architected metadata-driven frameworks reduce enterprise innovation friction by:

  • Reducing maintenance overhead
  • Supporting batched execution with discrete IDs
  • Allowing packages to live anywhere

Less Overhead

Adding an SSIS package to a metadata-driven framework is a relatively simple two-step process:

  1. Deploy the SSIS package (or project).
  2. Just add metadata.

A nice bonus? Metadata stored in tables can be readily available to both production DBAs and Operations personnel… or anyone, really, with permission to view said data.
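The “just add metadata” step might look something like the following. To be clear, the table and column names here are illustrative, not the Framework’s actual schema – the point is that adding a package to an existing workflow is a single row, with no package redeployments:

```sql
/* Hypothetical metadata insert: slot an already-deployed package into
   application 10 at execution-order step 40. Illustrative names only. */
INSERT INTO fwk.ApplicationPackages (ApplicationID, PackageID, ExecutionOrder)
VALUES (10, 42, 40);
```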

Batched Execution with Discrete IDs

An SSIS Catalog-integrated framework can overcome one of my pet peeves with using Batch Controllers. If you call packages using the Parent-Child design pattern implemented with the Execute Package Task, each child execution shares the same Execution / Operation ID with the parent package. While it’s mostly not a big deal, I feel the “All Executions” report is… misleading.

Using a Catalog-integrated framework gives me an Execution / Operation ID for each package executed – the parent and each child.
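This is possible because a Catalog-integrated framework starts each child package with its own call to the documented SSISDB stored procedures, so each package gets its own row in the All Executions report. A hedged sketch (folder, project, and package names are placeholders):

```sql
/* Each child package started this way receives a discrete Execution ID. */
DECLARE @execution_id bigint;

EXEC catalog.create_execution
     @folder_name = N'Load', @project_name = N'LoadMedicalData'
   , @package_name = N'StagePatients.dtsx'
   , @use32bitruntime = 0, @reference_id = NULL
   , @execution_id = @execution_id OUTPUT;

EXEC catalog.set_execution_parameter_value
     @execution_id, @object_type = 50       -- 50 = execution-level parameter
   , @parameter_name = N'SYNCHRONIZED'
   , @parameter_value = 1;                  -- wait for the package to complete

EXEC catalog.start_execution @execution_id;

SELECT @execution_id AS ExecutionId;        -- a distinct Operation ID per package
```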

“Dude, Where’s My Package?”

Ever try to configure an Execute Package Task to execute a package in another SSIS project? or Catalog folder? You cannot.* By default, the Execute Package Task in a Project Deployment Model SSIS project (also the default) cannot “see” SSIS packages that reside in other SSIS projects or which are deployed to other SSIS Catalog Folders.

“Why do I care, Andy?”

Excellent question. Another benefit of separation of concerns is it promotes code reuse. Imagine I have a package named ArchiveFile.dtsx that, you know, archives flat files once I’m done loading them. Suppose I want to use that highly-parameterized SSIS package in several orchestrations? Sure, I can Add-Existing-Package my way right out of this corner. Until…

What happens when I want to modify the packages? Or find a bug? This is way messier than simply being able to modify / fix the package, test it, and deploy it to a single location in Production where a bajillion workflows access it. Isn’t it?

It is.

Messy stinks. Code reuse is awesome. A metadata-driven framework can access SSIS packages that are deployed to any SSIS project in any SSIS Catalog folder on an instance. Again, it’s just metadata.

*I know a couple ways to “jack up” an Execute Package Task and make it call SSIS Packages that reside in other SSIS Projects or in other SSIS Catalog Folders. I think this is such a bad idea for so many reasons, I’m not even going to share how to do it. If you are here…

… Just use a framework.

SSIS Framework Community Edition

At DILM Suite, Kent Bradshaw and I give away an SSIS Framework that manages execution – for free! I kid you not. SSIS Framework Community Edition is not only free, it’s also open source.

Lift and Shift SSIS to Azure

Enterprise Data & Analytics’ data engineers are experts at lifting and shifting SSIS to Azure Data Factory SSIS Integration Runtime.

Our state-of-the-art DILM Suite tools in the capable hands of our experienced data engineers combine to drastically reduce the amount of time to manually migrate and apply SSIS Catalog configuration artifacts – Literals, Catalog Environments and Catalog Environment Variables, References, and Reference Mappings – while simultaneously improving the quality of the migration effort.

Check out our Lift and Shift page to learn more!

:{>

Mean Time to Identify Failure

While managing a team of 40 ETL developers, I wanted to track lots of metrics. Some of the things I wanted to track were technical, like SSIS package execution times. Some metrics were people-centric. 

Andy’s First Rule of Statistics states:

You can use statistics for anything about people, except people.

Andy – circa 2005

It was important to me to track how long it took the on-call person to identify the problem. I didn’t use the information to beat on-call people over the head. I used the information to measure the results of several experiments for displaying metadata about the failure.

Reports For The Win

You may be as shocked by this as I was; reports helped a lot more than I anticipated. Before I deployed the reports the Mean Time to Identify Failure was tracking just under 30 minutes. After deploying the reports, the mean time to identify failure fell to 5 minutes.

As I said, I was shocked. There were mitigating circumstances. The on-call team members were getting more familiar with the information SSIS produces when it logs an error. They were gaining experience, seeing similar errors more than once.

I accounted for growing familiarity by narrowing the time window I examined. The least-impressive metrics put the reduction at 18 minutes to 5 minutes.

Pictures…

(click to enlarge)

Before I built and deployed the dashboard for SSIS Application Instances (like the one pictured at the top of this post), on-call people would query custom-logging tables we built to monitor enterprise data integration. The queries to return Application Instance log data were stored where everyone could reach them. In fact, I used the same queries as sources for this report.

A funny thing happened when I deployed the reports. Each week, one or more on-call people would ping me and tell me how much they liked the reports. Even though the data was the same, the presentation was different. A picture with a little color goes a long way.

The image at the beginning of this section – the SSIS Framework Task Instance Report – is displayed when a user clicks the Failed link shown in the initial screenshot. This design received the most comments from the on-call team members. The most common comment was, “I click the Failed link and it takes me to details about the failure.” The reports were passing The 2:00 AM Test.

SSIS Framework Applications

If you’ve read this far and wondered, “Andy, what’s an SSIS Application?” – here’s the answer. An SSIS Application is a construct I came up with to describe a collection of SSIS Packages configured to execute in a specific order. An application is a way to group SSIS packages for execution. You can get a sense of how our frameworks work – especially the application execution functionality – by checking out the SSIS Framework Community Edition at DILM Suite (DILM == Data Integration Lifecycle Management).

(click to enlarge)

An Application Instance is an instance of execution of an SSIS Application. An Application Instance is made up of Package Instances. The relationship between applications and packages appears straightforward: an application is a collection of packages; parent-child; one-to-many. But it’s not quite that simple. Our SSIS Frameworks facilitate patterns that execute the same package multiple times, sometimes in parallel! We can also create packages that perform utility functions – such as ArchiveFile.dtsx – and call them from multiple applications. When you do the Boolean algebra, the relationship between applications and packages is many-to-many.
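A many-to-many relationship like this is classically resolved with a bridge table. The DDL below is illustrative only – it is not the Framework’s actual schema – but it shows the shape: one package can belong to many applications, and the bridge row carries the execution order:

```sql
/* Illustrative sketch of an application/package many-to-many design.
   Names and columns are hypothetical, not the Framework's real DDL. */
CREATE TABLE fwk.Applications
( ApplicationID   int IDENTITY(1,1) PRIMARY KEY
, ApplicationName nvarchar(256) NOT NULL UNIQUE );

CREATE TABLE fwk.Packages
( PackageID   int IDENTITY(1,1) PRIMARY KEY
, FolderName  nvarchar(128) NOT NULL   -- SSIS Catalog folder
, ProjectName nvarchar(128) NOT NULL   -- SSIS project
, PackageName nvarchar(260) NOT NULL );

CREATE TABLE fwk.ApplicationPackages   -- bridge: one package, many applications
( ApplicationID  int NOT NULL REFERENCES fwk.Applications(ApplicationID)
, PackageID      int NOT NULL REFERENCES fwk.Packages(PackageID)
, ExecutionOrder int NOT NULL
, PRIMARY KEY (ApplicationID, PackageID, ExecutionOrder) );
```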

Our SSIS Frameworks are SSIS Catalog-integrated. They even work with the SSIS Integration Runtime that’s part of Azure Data Factory, Azure-SSIS. 

Dashboards… Evolved

While the Reporting Services dashboard was neat when it was released back in the day, the cool kids now play with Power BI. At DILM Suite you will also find a free – albeit basic – Power BI dashboard that surfaces many of the same metrics using even better reporting technology. The Basic SSIS Catalog Dashboard in Power BI is free at DILM Suite.

I’ve not yet collected Mean Time to Identify Failure metrics using the Basic SSIS Catalog Dashboard in Power BI. Perhaps you can be the first.

Enjoy!

Packaging SSIS Catalog Deployments

I love the SSIS Catalog. It’s an elegant piece of data integration engineering and I cannot say enough positive things about it. Packaging SSIS Catalog deployments can be tricky, though.

The SSIS Catalog is a framework. Frameworks manage execution, configuration, and logging; and the SSIS Catalog handles each task with grace. Like I said, I love it!

But…

(You knew there was a “but…” coming, didn’t you?)

A Tale of Two Audiences

There are two audiences for the SSIS Catalog, two groups of consumers:

  1. Administrators
  2. Developers
  3. Release Managers

I listed three. Because…

Administrators

(click to enlarge)

SSIS is often administered by database administrators, or DBAs. I admire DBAs. It’s often a thankless job – more like a collection of jobs all rolled into one (and subsequently paid as if it’s one job…).

I believe the SSIS Catalog interface presented to DBAs in SQL Server Management Studio is sufficient.

My complaint is that the SSIS administrator has to expand a handful of nodes in Object Explorer, right-click to open the SSIS project configuration window, and then double-click each referenced SSIS Catalog environment just to determine which value is configured for use when an SSIS package is executed.

Click the screenshot above to see what I mean. Configuring SSIS Catalog deployments in SSMS is challenging. I find it… clunky. Once I understood all the windows, what they meant and how to configure an SSIS package and project deployed to the SSIS Catalog, this made sense. But – in my opinion – this interface works against comprehension.
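One alternative to all that clicking is querying the public SSISDB catalog views directly. The query below uses documented views to list every project parameter, whether it is configured with a literal or a reference, and which environment variable it references (a `value_type` of 'R' means referenced, 'V' means literal):

```sql
/* List configured parameter values for every project in the Catalog. */
USE SSISDB;
GO
SELECT f.name  AS FolderName
     , pr.name AS ProjectName
     , op.object_name                 -- project or package name
     , op.parameter_name
     , op.value_type                  -- 'V' = literal, 'R' = referenced
     , op.default_value
     , op.referenced_variable_name    -- populated when value_type = 'R'
FROM catalog.object_parameters AS op
JOIN catalog.projects          AS pr ON pr.project_id = op.project_id
JOIN catalog.folders           AS f  ON f.folder_id  = pr.folder_id
ORDER BY f.name, pr.name, op.object_name, op.parameter_name;
```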

Does this interface work, though? It certainly does. When I teach people how to use the SSIS Catalog, I show them how to use the Object Explorer interface provided in SSMS.

(click to enlarge)

I don’t stop there, however, because I built one solution to the problem. I call my solution SSIS Catalog Browser. If you click to enlarge this image you will note I am viewing the configuration of the same parameter displayed in the SSMS image above. I find this interface cleaner.

Do administrators still need to understand how to configure SSIS Catalog deployments and SSIS packages and projects deployed to the SSIS Catalog? You bet. There is no substitute for understanding. SSIS Catalog Browser surfaces the same metadata displayed in the SSMS Object Explorer. The only difference is Catalog Browser is easier to navigate – in my opinion.

Best of all, Catalog Browser is free.

Developers and Release Managers

SSIS developers and release managers (DevOps release teams) need more functionality. As I wrote in DILM Tiers for the SSIS Enterprise, an enterprise should have a minimum of four Data Integration Lifecycle Management (DILM) tiers to manage enterprise data integration with SSIS. Those tiers need to be:

  1. Development – an environment where SSIS developers build SSIS packages and projects. SSIS developers need permission / rights / roles to utterly destroy the database instances in Dev. If the SSIS developers lack this ability, you have “an environment named Development but not a Development environment.” There is a difference.
  2. Test or Integration – an environment where SSIS developers have permission / rights / roles to deploy, configure, execute, and view logs related to SSIS packages and projects.
  3. UAT or QA (User Acceptance Testing or Quality Assurance or any environment other than Production, Test, or Development) – an environment that mimics Production in security, permission / rights / roles. Developers may (or may not) have read access to logs, source, and destination data. SSIS administrators (DBAs or DevOps / Release teams) own this environment. The person performing the deployment to Production should perform the deployment to UAT / QA / Whatever because I do not want the Production deployment to be the very first time this person deploys and configures this SSIS project.
  4. Production.

I architect data integration environments in this manner to support DILM (Data Integration Lifecycle Management) with SSIS, as I wrote in Data Integration Lifecycle Management with SSIS.

Viewing the contents of an SSIS Catalog is not enough functionality to manage releases. Why, then, do I include developers? Because…

SSIS developers create the initial SSIS Catalog deployments in the DILM DevOps cycle.

I cannot overemphasize this point. Developers need an environment where they are free to fail to build SSIS. They aren’t free to succeed, in fact, unless and until they are free to fail.

Have you ever heard a developer state, “It works on my machine.”? Do you know why it works on their machine? Defaults. They coded it up using default values. The defaults have to work or the developer will not pass the code along to the next tier.

How does an SSIS developer know they’ve forgotten to parameterize values?
How do they figure this out?

It’s impossible to test for missing parameters in the Development environment.

The answer is: SSIS developers must deploy the SSIS project to another environment – an environment separate and distinct from the Development environment – to test for missing parameterization.

To review: SSIS developers need a Development environment (not merely an environment named Dev) and they need a different environment to which they can deploy, configure, execute, and monitor logs.
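That deployment step between tiers can itself be scripted. The `catalog.deploy_project` stored procedure is the documented way to deploy an .ispac by T-SQL; the path and names below are placeholders:

```sql
/* Deploy an .ispac to a target tier's Catalog folder by T-SQL.
   Path, folder, and project names are placeholders. */
USE SSISDB;
GO
DECLARE @project_binary varbinary(max) =
  (SELECT BulkColumn
   FROM OPENROWSET(BULK N'C:\Deploy\LoadMedicalData.ispac', SINGLE_BLOB) AS ispac);

DECLARE @operation_id bigint;
EXEC catalog.deploy_project
     @folder_name    = N'Test'
   , @project_name   = N'LoadMedicalData'
   , @project_stream = @project_binary
   , @operation_id   = @operation_id OUTPUT;
```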

Error Elimination

Having the SSIS developers build and script the SSIS Catalog deployments eliminates 80% of deployment configuration errors (according to Pareto…).

Having SSIS administrators practice deployment to UAT / QA / Whatever eliminates 80% of the remaining errors.

Math tells me an enterprise practicing DILM in this manner will experience a 4% deployment error rate: 20% of errors survive the first pass, 20% of those survive the second, and 0.2 × 0.2 = 0.04. (Want to knock that down to 0.8%? Add another tier.)

Packaging SSIS Deployment

I will not go into functionality missing from the SSMS Object Explorer Integration Services Catalogs node (nor the underlying .Net Framework assemblies). I will simply state that some functionality that I believe should be there is not there.

(click to enlarge)

I don’t stop there, however, because I built one solution to the problem. I call my solution SSIS Catalog Compare. If you click the image to enlarge it, you will see a treeview that surfaces SSIS Catalog artifacts in the same way as SSIS Catalog Browser (they share a codebase). You will also see the results of a comparison operation, and the user initiating the packaging of an SSIS folder deployment by scripting the folder and its contents.

The result is a file system folder that contains T-SQL script files and ISPAC files for each SSIS Catalog configuration artifact:

  • Folder
  • SSIS Project
  • Environment
  • Reference
  • Configured Literals

You can use SSIS Catalog Compare to lift and shift SSIS Catalog objects from any environment to any other environment – or from any DILM tier to any other DILM tier – provided you have proper access to said environments and tiers.

This includes lifting and shifting SSIS to the Azure Data Factory SSIS Integration Runtime, also known as Azure-SSIS.

Zip up the contents of this file system folder, attach it to a ticket, and let your enterprise DevOps process work for data integration.

Conclusion

The SSIS Catalog is a framework, and a pretty elegant framework at that. Some pieces are clunky and other pieces are missing.

DILM Suite helps.