Start Azure-SSIS, Execute Package, Stop Azure-SSIS

Last August, I had a blast at the Data Platform Summit 2018 in Bengaluru, India. One highlight of the trip was meeting and conversing with the Azure Data Factory SSIS Integration Runtime Product/Program Manager, Sandy Winarko.

Sandy and I swapped coder stories and talked about data integration design patterns, and I picked up a few new design patterns sitting in his session. This post is about one of the patterns I learned from Sandy.

Before I move on, though, you should really check out the Data Platform Summit! It is an awesome show – Asia’s largest Microsoft data conference.

Stop In The Name Of… Money

I love Azure! I love the technology and the velocity of development. The cloud is changing the world and Azure is changing the cloud – almost every day. One cool thing about Azure is cost savings. A lot of services are free or nearly free. That. Is. Awesome! But some services are not free. (That’s not a bad thing!)

Azure Data Factory’s SSIS Integration Runtime (aka Azure-SSIS IR) is not free. It costs money even when it’s just sitting there switched on with no SSIS packages executing. Depending on the size of the Azure-SSIS IR, one can rack up some serious charges for what amounts to, well, no work.

That’s why I was excited to learn about this pattern from Sandy. The pattern is made up of three parts:

  1. Start Azure-SSIS
  2. Execute an SSIS package
  3. Stop Azure-SSIS

Sandy, Nick Schonning, Douglas Laudenschlager, and Sreedhar Pelluru authored a post at Microsoft Docs titled How to start and stop Azure-SSIS Integration Runtime on a schedule. When I tried to follow the instructions I encountered one issue. Now, you may read that last sentence and think, “Andy, don’t you have a problem following instructions anyway?” Yes, but that’s beside the point… mostly.

Their article walks one through setting up an instance of Azure-SSIS. It then proceeds to describe how to build the start-execute-stop pattern Sandy described to me (and everyone who attended his presentation) in Bengaluru. Now this pattern has evolved since Sandy first described it. The first version used Azure Runbooks and Webhooks. I like this version much better. You can still use Runbooks and there’s more information in the How to start and stop Azure-SSIS Integration Runtime on a schedule article.

Before You Start

I’m going to share the part that gave me trouble first – it was security. In the article, Sandy and company write in Step 5 of the Create Your Pipelines section:

Assign the managed identity for your ADF a Contributor role to itself, so Web activities in its pipelines can call REST API to start/stop Azure-SSIS IRs provisioned in it.

I read that and proceeded to assign myself to the Contributor role of Azure Data Factory. That didn’t work. Not at all. And that’s not what the instructions said. The instructions said to “Assign the managed identity for your ADF a Contributor role to itself…” 

What’s a Managed Identity?

Well, I searched for the answer and found this spiffy article titled Azure Data Factory service identity that not only explained what an ADF Managed Identity was, it also told me where to find it.

See, when I browsed over to assign the ADF Contributor Role, all I saw in the available list of people / accounts I could assign was li’l ol’ me:

So that’s who I assigned to the Contributor Role: me. As I said, that did not work.

The article titled Azure Data Factory service identity took me right to the location of the ADF Service Identity Application ID, which was the value I needed: 

After I figured this out – with awesome and patient help from Sandy himself (thanks Sandy!) – I was in business.

Stop It!

I created a separate pipeline to stop Azure-SSIS. I run it every night at 11:00 PM ET because if I’m awake at that time (I usually rise around 4:00 AM ET), I am not being productive on my computer (if I am on it at all…). I do this in case I forget and leave Azure-SSIS running after a presentation, class, or client demo.

Following the instructions, I created a new pipeline and added a Web Activity. I configured the Web Activity according to the instructions. Mine looked like this:


The tricksy part (if there is one) is configuring the URL. There’s a shortcut for that. Click on the Monitor page of the ADF portal:

Click the Integration Runtimes tab at the top of the page, and then click the “View Integration Runtime Detail” link under the Actions column to bring up the snappy new mini-dashboard for Azure-SSIS:

Regardless of the state of Azure-SSIS, click the status link to open a functional pane that contains the Resource ID and a Copy button:

This copies a lot of what you will need to set up the call to stop Azure-SSIS (and also to start it): 

/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/integrationRuntimes/{integrationRuntimeName}

When you copy the Resource ID, you get all those missing values filled in for you. How neat is that? Prefix that with “https://management.azure.com” and add the suffix “/stop?api-version=2018-06-01”, and you are golden.
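
Assembled, the full URL for the Stop Web Activity looks like this (every placeholder is filled in by the copied Resource ID):

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/integrationRuntimes/{integrationRuntimeName}/stop?api-version=2018-06-01

Swap “/stop” for “/start” in the suffix and you have the URL for the Start Web Activity. Both Web Activities call the REST API using the POST method.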

Finish following the instructions in the Create Your Pipelines section, and then take it out for a test drive.

Clone

Things begin to pick up here partly because you’ve done a lot more work than you realize already, and partly because supper’s ready and I’ve been blogging for the past couple hours.

(For you, Dear Reader! I do it all for You!) And for me. In case I forget…

Click the ellipsis beside the pipeline and then click Clone to make a fresh copy of your pipeline (it still has that new pipeline smell!)

I name my new clone Start Run Stop.

The Stop AndyDemoSSIS Web Activity is in there already and just begging to be cloned.

Look how easy it is to clone an activity in ADF. I mean, could they make this any easier? I think not.

Once it’s cloned, it’s not much work to edit the configuration of the cloned Web Activity to start the AndyDemo Azure-SSIS instance:

Boom. That just happened.

“Run-Time!” (Spoken in the Voice of the Geico Camel)

Next up? The Execute SSIS Package Activity:

I’m not going to write much about how to configure the Execute SSIS Package Activity. You can learn more in one of these blog posts where I’ve already written about it. Supper’s waiting.

Conclusion

Thanks to Sandy and the team at Microsoft for the excellent documentation. This is a very cool Azure Data Factory Design Pattern!

Enjoy!

Mean Time to Identify Failure

While managing a team of 40 ETL developers, I wanted to track lots of metrics. Some of the things I wanted to track were technical, like SSIS package execution times. Some metrics were people-centric. 

Andy’s First Rule of Statistics states:

You can use statistics for anything about people, except people.

Andy – circa 2005

It was important to me to track how long it took the on-call person to identify the problem. I didn’t use the information to beat on-call people over the head. I used the information to measure the results of several experiments for displaying metadata about the failure.

Reports For The Win

You may be as shocked by this as I was: reports helped a lot more than I anticipated. Before I deployed the reports, Mean Time to Identify Failure was tracking just under 30 minutes. After deploying the reports, Mean Time to Identify Failure fell to 5 minutes.

As I said, I was shocked. There were mitigating circumstances. The on-call team members were getting more familiar with the information SSIS produces when it logs an error. They were gaining experience, seeing similar errors more than once.

I accounted for growing familiarity by narrowing the time window I examined. Even the least-impressive cut of the metrics put the reduction at 18 minutes down to 5.
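
If you’re curious how the metric was computed, here’s a minimal T-SQL sketch. The table and column names are illustrative – our actual custom-logging schema was more involved – but the shape of the calculation is the same:

-- Illustrative sketch only: assumes a custom logging table, dbo.FailureLog,
-- with FailureTime (when the package failed) and IdentifiedTime (when the
-- on-call person identified the cause of the failure).
SELECT AVG(DATEDIFF(MINUTE, FailureTime, IdentifiedTime)) AS MeanMinutesToIdentifyFailure
FROM dbo.FailureLog
WHERE FailureTime >= DATEADD(DAY, -30, SYSDATETIME());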

Pictures…


Before I built and deployed the dashboard for SSIS Application Instances (like the one pictured at the top of this post), on-call people would query custom-logging tables we built to monitor enterprise data integration. The queries to return Application Instance log data were stored where everyone could reach them. In fact, I used the same queries as sources for this report.

A funny thing happened when I deployed the reports. Each week, one or more on-call people would ping me and tell me how much they liked the reports. Even though the data was the same, the presentation was different. A picture with a little color goes a long way.

The image at the beginning of this section – the SSIS Framework Task Instance Report – is displayed when a user clicks the Failed link shown in the initial screenshot. This design received the most comments from the on-call team members. The most common comment was, “I click the Failed link and it takes me to details about the failure.” The reports were passing The 2:00 AM Test.

SSIS Framework Applications

If you’ve read this far and wondered, “Andy, what’s an SSIS Application?” – here’s the answer: an SSIS Application is a construct I came up with to describe a collection of SSIS Packages configured to execute in a specific order. An application is a way to group SSIS packages for execution. You can get a sense of how our frameworks work – especially the application execution functionality – by checking out the SSIS Framework Community Edition at DILM Suite (DILM == Data Integration Lifecycle Management).


An Application Instance is an instance of execution of an SSIS Application. An Application Instance is made up of Package Instances. The relationship between applications and packages appears straightforward: an application is a collection of packages; parent-child; one-to-many. But it’s not quite that simple. Our SSIS Frameworks facilitate patterns that execute the same package multiple times, sometimes in parallel! We can also create packages that perform utility functions – such as ArchiveFile.dtsx – and call them from multiple applications. When you work out the relationships, applications and packages are many-to-many.
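
Here is a minimal T-SQL sketch of that many-to-many shape. The table names are illustrative – they are not the actual SSIS Framework tables – but they show why a bridge table is needed:

-- Illustrative sketch; not the actual SSIS Framework schema.
CREATE TABLE dbo.Applications
( ApplicationID int IDENTITY(1,1) CONSTRAINT PK_Applications PRIMARY KEY
, ApplicationName nvarchar(130) NOT NULL
);

CREATE TABLE dbo.Packages
( PackageID int IDENTITY(1,1) CONSTRAINT PK_Packages PRIMARY KEY
, PackageName nvarchar(260) NOT NULL
);

-- The bridge table resolves the many-to-many relationship: one application
-- executes many packages (in ExecutionOrder), and one package – ArchiveFile.dtsx,
-- for example – may be referenced by many applications.
CREATE TABLE dbo.ApplicationPackages
( ApplicationPackageID int IDENTITY(1,1) CONSTRAINT PK_ApplicationPackages PRIMARY KEY
, ApplicationID int NOT NULL CONSTRAINT FK_ApplicationPackages_Applications
    REFERENCES dbo.Applications(ApplicationID)
, PackageID int NOT NULL CONSTRAINT FK_ApplicationPackages_Packages
    REFERENCES dbo.Packages(PackageID)
, ExecutionOrder int NOT NULL
);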

Our SSIS Frameworks are SSIS Catalog-integrated. They even work with the SSIS Integration Runtime that’s part of Azure Data Factory, Azure-SSIS. 

Dashboards… Evolved

While the Reporting Services dashboard was neat when it was released back in the day, the cool kids now play with Power BI. At DILM Suite you will also find the Basic SSIS Catalog Dashboard in Power BI – a free, albeit basic, dashboard that surfaces many of the same metrics using newer reporting technology.

I’ve not yet collected Mean Time to Identify Failure metrics using the Basic SSIS Catalog Dashboard in Power BI. Perhaps you can be the first.

Enjoy!

Updates to the ADF Execute SSIS Package Activity

Last night I presented Moving Data with Azure Data Factory to a packed house at the Richmond Azure User Group. The crowd was engaged and asked awesome questions. I had a blast!

Surprise!

I rehearsed many of my demos before the presentation and even updated my slides. One thing I did not do was rehearse configuring an Execute SSIS Package activity. Why? I’ve built these demos live a bajillion times. I know what I’m doing…

When I clicked on the Settings tab I found new stuff – options I’d not previously encountered. Thankfully, I immediately recognized the purpose of these new configuration options – and I also immediately liked them. The demo worked and no one was the wiser (except the handful of folks with whom I shared this story after the meeting).

New Dropdowns for Folder, Project and Package

The new interface sports new dropdowns for selecting the Catalog Folder, Project, and Package to execute. I like this – it’s slick. I had deployed a project to my Azure-SSIS instance between the time I started this part of the demo and the time I wanted to configure the Execute SSIS Package activity. During deployment I created a new Catalog Folder which was not initially listed in the Folder dropdown. Clicking the Refresh button remedied this, though, and I was able to complete configuration rapidly.

Configuration Tabs

I cannot recall if the Connection Managers and Property Overrides tabs were part of the previous Execute SSIS Package activity interface. I don’t think so, but I could be wrong about that. Update: I verified these are new tabs by looking at screenshots from my June 2018 post titled ADF Execute SSIS Package Activity. The previous version had an Advanced tab. 

The SSIS package configuration tabs are SSIS Parameters, Connection Managers, and Property Overrides.

When your Azure-SSIS instance is running, you may use these tabs to update Parameter, Connection Manager Property, and SSIS Package Property values:

Warnings

It’s possible to configure the Execute SSIS Package activity when your Azure-SSIS Integration Runtime is not running, but you don’t get the nice dropdown pre-population and you have to revert to the previous method of configuring the full SSIS Catalog path to the package you want to execute.

SSIS Catalog Browser To The Rescue!

If you find yourself in this predicament and would rather configure the Execute SSIS Package activity without waiting 20-30 minutes for the Azure-SSIS instance to spin up, you can use SSIS Catalog Browser – a free utility from DILM Suite – to connect to your Azure-SSIS instance:

SSIS Catalog Browser displays the Catalog path for an SSIS package (or Catalog Environment) when you select the artifact in the unified Catalog surface. Copy the package’s Catalog path displayed in the Status area and paste the value into the Package Path textbox in ADF:

Make sure the Manual Entries checkbox is checked.
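
For example – assuming an illustrative folder named Demo, a project named Loader, and a package named Load.dtsx – the manually-entered package path takes a form something like:

Demo/Loader/Load.dtsx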

I like the warnings. Feedback is a good thing.

Once Azure-SSIS Is Running

When your Azure-SSIS instance is up and running, you may configure the path to your SSIS package using the dropdowns:

You can even configure the path to a Catalog Environment:

Conclusion

I believe there are at least two lessons to take away from my experience:

  1. When presenting on Microsoft Azure topics, always check your demos to make certain nothing has changed; and 
  2. Microsoft Azure is evolving at a rapid rate – especially Azure Data Factory!

SSIS Catalog Compare Version 4 – Three Videos

I am excited to announce general availability of SSIS Catalog Compare version 4!

I made three short-ish (~2 minutes each) videos demonstrating some of the capabilities of the product. I really like the SCCPAC files and SCCPAC Deployment Wizard, which are only available in the Enterprise Edition.

Use SCCPAC Deployment Wizard to Lift and Shift SSIS Projects and Configurations Metadata to Azure-SSIS

Use SCCPAC Deployment Wizard to Manage SSIS DevOps

Use SSIS Catalog Compare Enterprise Edition and SCCPAC Deployment Wizard to Deploy from the QA to Production Catalog in Real-Time

Enjoy!

:{>

T-SQL Tuesday #111 – What is Your “Why?”

I enjoy math. I noticed a pattern learning math; perhaps you experienced something similar. I found arithmetic an exercise in memory. I have a good memory (well, I had a good memory…), so memorizing a bunch of rules was no big deal.

When I learned algebra, arithmetic made more sense. In addition to the memorized rules, I saw why the rules existed. I understood the rules better as a result.

This pattern held all through my math education. I understood algebra better once I learned geometry. I understood geometry better once I learned trigonometry. I understood trigonometry better once I learned single-variable calculus.

An Axiom (for me)

I notice a similar pattern applies to my career (or careers, as the case may be). I’ve served in many roles: 

  • Farm laborer
  • Musician
  • Stockyard laborer
  • Truck driver
  • Service technician
  • Soldier (part-time in the Virginia Army National Guard)
  • Electrician
  • Electrical engineer
  • Electronics technician
  • Manufacturing automation integrator
  • Software developer
  • Author
  • Data professional
  • Entrepreneur

The similar pattern manifests itself in this manner: I’ve enjoyed the position – and more success in the position – when I had a reason to do the work; some reason other than a paycheck. In some cases, I had multiple “why’s” beyond remuneration. For example, I joined the Virginia Army National Guard to learn electronics and serve my country – especially to protect everyone’s right to free speech guaranteed by the First Amendment. I may not agree with what people say, but I was (and still am) willing to fight and die to preserve the right of US citizens to say whatever they want.

As a result, I enjoyed serving in the National Guard (for the most part). I learned more. I learned better, I think, because I enjoyed serving.

Entrepreneurship

Entrepreneurship can be challenging. I believe one needs a “why” – or perhaps several “why’s” – to remain an entrepreneur. The “why” cannot simply be money. Money isn’t inconsequential, mind you, but I believe the best “why’s” are less tangible.

Passion plays a major role for me. When business isn’t going well or when business is going too well, a couple intangible “why’s” – passions for both entrepreneurship and the kind of work I am blessed to do – inspire me to keep a steady hand on the tiller.

Also, entrepreneurship affords more and different ways to serve people. Am I saying one must be an entrepreneur to serve others? Nope. Flexibility with time, though, facilitates opportunities that may not otherwise be possible – or as readily possible.

What is Your “Why?”

That’s the question this month: Why do you do what you do?

I look forward to your replies.

The Rules

T-SQL Tuesday has the following rules:

  1. Publish your contribution on Tuesday, 12 Feb 2019 between 00:00 UTC to 23:59 UTC.
  2. Include the T-SQL Tuesday Logo and have it link to this post.
  3. Please comment below with a link to your post.
  4. Tweet about your post using #tsql2sday.
  5. If you’d like to host in the future, contact Steve Jones.

Roundup

On or about 19 Feb 2019, I will publish a roundup post. (Until then, that’s a broken link…)

Enjoy!

:{>

Delivering Intelligent Data Integration through SSIS in Dallas 31 May 2019!

I am honored to announce I will be delivering Intelligent Data Integration through SSIS – a day-long SQL Saturday 841 – Dallas pre-conference session 31 May 2019!

Description:

What is Intelligent Data Integration? SSIS (SQL Server Integration Services) packages developed using tried and true design patterns, built to participate in a DevOps enterprise practicing DILM (Data Integration Lifecycle Management), produced using Biml (Business Intelligence Markup Language) and executed using an SSIS Framework.

Attend a day of training focused on intelligent data integration delivered by an experienced SSIS consultant who has also led an enterprise team of several ETL developers during multiple projects that spanned 2.5 years (and delivered).

Attendees will learn:

  • a holistic approach to data integration design.
  • a methodology for enterprise data integration that spans development through operational support.
  • how automation changes everything. Including data integration with SSIS.

Topics include:

  1. SSIS Design Patterns
  2. Executing SSIS in the Enterprise
  3. Custom SSIS Execution Frameworks
  4. DevOps and SSIS
  5. Biml, Biml Frameworks, and Tools

Prerequisites: Familiarity with SQL Server Integration Services (SSIS).

Continental Breakfast, Lunch and Afternoon Snacks are provided.

I hope to see you there!

:{>

Lift and Shift is Not a Strategy…

Lift and shift is a bridge.

If your enterprise is considering migrating enterprise data and data integration workloads from on-premises to the cloud, lifting and shifting is not the end of the story. Lifting and shifting is, rather, the mechanism – the conversion project – that positions your enterprise to leverage the economies of scale afforded by cloud technology.

Andy’s Lift and Shift FAQ

“Should We Lift and Shift?”

I hear this question often and my response is, “When it makes sense to do so, yes, you should lift and shift.” This raises the next question, which is…

“How Do We Know It’s Time to Lift and Shift?”

My engineer-y response to this question is, “You will know it’s time to lift and shift to the cloud when you want to leverage functionality available in the cloud that is not (or not yet) available in on-premises versions of the enterprise platform(s) in use in your enterprise.”

“What Are Some Advantages of Migrating to the Cloud?”

The biggest advantage of lifting and shifting enterprise data to the cloud is the ability to efficiently scale operations. By efficiently, I mean quickly and easily – especially when compared to the time and expense (don’t forget opportunity cost when calculating expense) to scale up systems on-premises.

The ability to scale up and scale down on demand is a huge advantage for some business models which experience “spike-y” demand for operations at different times of the year, quarter, or month. But even if that’s not the case, all data scales. It’s very handy to be able to connect to the Azure Portal and move a slider (as opposed to purchasing and provisioning more hardware…).

There’s a brand new (in my opinion) “knob” exposed by cloud-enabled efficient scaling. As I wrote in my post titled Time and Money in the Cloud:

Let’s say you pay $100 to incrementally load your data warehouse and the load takes 24 hours to execute at the scale you’ve selected in the cloud. Prior to thinking in DTUs, engineers and business people would think, “That’s just the way it is. If I want more or faster, I need to pay for more or faster.” But DTU math doesn’t quite work that way. Depending on your workload and DTU pricing at the time (FULL DISCLOSURE: DTU PRICING CHANGES REGULARLY!), you may be able to spend that same $100 on more compute capabilities and reduce the amount of time required to load the same data into the same data warehouse to minutes instead of hours…

The fact that the cost/performance curve can be altered in seconds instead of months meta-changes everything.
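
A simplified, made-up illustration (assuming cost scales roughly linearly with compute, which real-world pricing only approximates): if the selected scale bills at about $4.17 per hour, a 24-hour load costs about $100. Select ten times the compute at roughly ten times the hourly rate, and – if the workload parallelizes well – the same load completes in roughly 2.4 hours for roughly the same $100. Same money, 21.6 hours returned to the business.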

“Are There Disadvantages of Migrating to the Cloud?”

It depends. (You knew that was coming, didn’t you?)

Enterprise Data & Analytics helps enterprises migrate data, data integration, lifecycle management, and DevOps to the cloud. In some cases (~30%), the enterprises spend a little more money in the near-term. There are three reasons for this:

  1. When upgrading, it’s always a good idea to operate new systems in tandem with existing systems. In a lift and shift scenario, this means additional expenses for cloud operations while maintaining the expense of on-premises operations. As cloud operations are validated, on-premises operations are shut off, thereby reducing operating expenses. In truth, though, this dynamic (and expense) exists whether one is lifting and shifting to the cloud or simply upgrading systems on-premises.
  2. “Standing on the bridge” (more in a bit) can cost more than remaining either on-premises or lifting and shifting the entire enterprise workload to the cloud.
  3. Regulatory requirements – including privacy and regulations about which data is allowed to leave nation-states – will constrain many industries, especially government agencies and NGOs (non-governmental organizations) who interact heavily with government agencies.

Standing On The Bridge

One option we at Enterprise Data & Analytics consider when assisting enterprises in lift and shift engagements is something we call “standing on the bridge.” 

Standing on the bridge is present in each lift and shift project. It’s one strategy for implementing hybrid data management, which almost every enterprise in the cloud today has implemented. Hybrid means part of the enterprise data remains on-premises and part of the enterprise data is lifted and shifted to the cloud. 

Hybrid is implemented for a variety of reasons which include:

  • Mitigating regulatory concerns; and
  • As part of the normal progression of lifting and shifting enterprise data and data workloads to the cloud.

Standing on the bridge for too long is a situation to avoid. 

“How Do We Avoid Standing on the Bridge For Too Long?”

Planning. Planning is how an enterprise avoids standing on the bridge too long. Your enterprise wants advice from experienced professionals to shepherd the lift and shift operation. 

Enterprise Data & Analytics can help.

Helpful Tools

Enterprise Data & Analytics has been delivering, and writing and speaking about Data Integration Lifecycle Management for years. 

We’ve built helpful tools and utilities that are available at the DILM Suite. Most of the DILM Suite tools are free and some are even open source. 

Enterprise Data & Analytics Welcomes Shannon Lowder!

I am honored and thrilled to welcome Shannon Lowder (@shannonlowder | blog | LinkedIn) to the Enterprise Data & Analytics team!

Shannon is a data engineer, data scientist, BimlHero (though not listed on the page at the time of this writing), and shepherd of the Biml Interrogator open source project. If you use Biml to generate SSIS projects that load flat files, you need Biml Interrogator.

Shannon, Kent Bradshaw, and I are also co-authoring a book tentatively titled Frameworks. (Confession: Kent and Shannon are mostly done… I am slacking…)

Shannon brings a metric ton of experience to serve our awesome clients. He has years of experience in data analytics, serving recently in the role of enterprise Lead Data Scientist. Shannon’s experience spans supply chain management, manufacturing, finance, and insurance.

In addition to his impressive data skills, Shannon is an accomplished .Net developer with enterprise senior developer experience (check out Biml Interrogator for a sample of his coding prowess).

Shannon is a regular speaker at SQL Saturday events, presenting on topics that include Business Intelligence, Biml, and data integration automation. He is a gifted engineer with experience in framework design, data integration patterns, and Azure who possesses a knack for automation. Shannon is an avid “tinkerer” who enjoys learning. He has experience implementing Azure Machine Learning and applying AI to predictive analytics using sources classified as Big Data. Shannon is also a practitioner of data integration DevOps with SSIS. In other words, he fits right in with our team here at Enterprise Data & Analytics!

As Shannon writes on his LinkedIn profile:

I am a data guy with a passion for partnering with clients to solve their database and technology issues. Over my career, I’ve played all the roles: database developer, administrator, business intelligence developer, and architect, and now consultant. I’m the guy you call in when you have the impossible problem and everyone tells you it cannot be solved. I automate solutions in order to free your current staff to do the higher value tasks. I bring solutions outside of traditional relational database solutions, in order to find the shortest path between you and your goals.

As an accomplished Microsoft SQL data professional, recognized BimlHero, and practicing Data Scientist, I’m the resource you need to extract the most value from your data.

I’m humbled and thankful and excited to watch Enterprise Data & Analytics continue to (quietly) grow – adding cool people (another announcement is forthcoming) and service offerings like Data Concierge. It’s very cool to watch!

Welcome Shannon! I’m honored to work with you, my brother and friend.

For more information, please contact Enterprise Data & Analytics!

Grant SSIS Catalog Read-Only Permissions

The SSIS Catalog has built-in security to manage permissions. SSISDB – the database behind the SSIS Catalog – is “just a database” in many respects. When it comes to security, the SSIS Catalog surfaces an internal mechanism that provides granular permissions management. 
In this post I intend to demonstrate how to use SSIS Catalog security to provide read-only access to SSIS Catalog artifacts. But first…

Two Thoughts About SSISDB Roles

Thought 1: “Help us DBAs, you’re our only hope.” – Princess Leia (paraphrased)

If you work with SSIS you already know the Microsoft team of technical writers is comprised of artists, masters of the field of technical writing. I’m convinced a large part of the successful adoption of SSIS is due to these good people. You can see a sample of their outstanding artistry in the article titled Integration Services Roles (SSIS Service).

Two important roles in the SSIS Catalog are ssis_admin and ssis_logreader. According to the article linked above:

ssis_admin. This role provides full administrative access to the SSIS Catalog database.

ssis_logreader. This role provides permissions to access all of the views related to SSISDB operational logs.

Both ssis_admin and ssis_logreader are SQL Server database roles. As such, they are typically set and maintained by Database Administrators (DBAs).
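
Because they are database roles, membership is managed like any other SQL Server database role. A minimal sketch, assuming a database user named [AD\OnCallReader] (an illustrative name) already exists in SSISDB:

USE SSISDB;
GO
-- Assumes the database user [AD\OnCallReader] already exists in SSISDB,
-- and that this version of the SSIS Catalog ships the ssis_logreader role.
ALTER ROLE ssis_logreader ADD MEMBER [AD\OnCallReader];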

Thought 2: Although SSISDB is a SQL Server database, it’s more like an application coded in T-SQL.

One for-instance, for instance, is that the SSIS Catalog requires Windows authentication for most administrative activities. It took me a while to understand why Windows authentication is necessary. I now get it, but the explanation is for another post. This has implications, such as:

  • SQL Logins – even sysadmins such as sa – cannot deploy SSIS projects to an SSIS Catalog. Or execute SSIS packages.
    • One exception: an Azure-SSIS SSISDB database hosted on an instance of Azure SQL DB can perform SSIS Catalog administration using a SQL Login.

In sum, the SSIS Catalog is a database application that requires Windows authentication for administrative tasks.

Null Use Case: No Access

When a user has no access to SSIS Catalog artifacts, the SSMS Object Explorer Integration Services Catalogs node appears as shown here:

SSIS Catalog Browser presents a unified view of all SSIS Catalog artifacts (and it’s free!).

All SSIS Catalog-related products and utilities at DILM Suite respect SSIS Catalog security. Early versions of SSIS Catalog Compare allowed users to log in with SQL Server Login credentials and access SSIS Catalog artifacts that SSIS Catalog security would block. A couple years ago I refactored CatalogBase – the object beneath DILM Suite’s SSIS Catalog products and utilities that interacts with the SSIS Catalog – to respect SSIS Catalog security.

A user with no access will see an empty SSIS Catalog using SSIS Catalog Browser:

Grant Read-Only Access to Folders

In SSMS, right-click the SSIS Catalog Folder you wish to surface for a Windows authentication-based SQL Server login:

When the Folder Properties window displays, click the Permissions page. On the Permissions page, click the Browse button to select one or more Windows authentication logins:

The Browse All Principals dialog displays. Select one or more Windows authentication logins (Windows User type):

To assign read-only permission to the SSIS Catalog Folder, click the OK button on the Browse All Principals dialog and check the Grant checkbox for the Read permission in the Folder Properties window:

The user(s) you selected are now able to view the SSIS Catalog folder using the SSMS Object Explorer’s Integration Services Catalogs node:

Similarly, SSIS Catalog Browser will display the SSIS Catalog folder for the assigned user(s):
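
If you prefer to script the grant rather than click through SSMS, the Catalog surfaces a stored procedure for this. A minimal sketch, assuming an illustrative folder named Demo and the illustrative database user [AD\OnCallReader]:

USE SSISDB;
GO
DECLARE @folder_id bigint = (SELECT folder_id
                             FROM catalog.folders
                             WHERE name = N'Demo');
DECLARE @principal_id int = DATABASE_PRINCIPAL_ID(N'AD\OnCallReader');

-- object_type 1 = folder; permission_type 1 = READ.
EXEC catalog.grant_permission
  @object_type = 1
, @object_id = @folder_id
, @principal_id = @principal_id
, @permission_type = 1;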

“Where’s the Beef?”

I can hear you thinking, “Where are the projects and environments, Andy?” That is an excellent question. I’m glad you asked! They exist, of course, but we only granted the user Read permission for the SSIS Catalog Folder.

Grant Read-Only Access to Folder Artifacts

To see SSIS Projects and SSIS Catalog Environments, a user must be granted Read Objects permission on the SSIS Catalog Folder:

Now users can view SSIS Catalog folder contents using SSMS:

SSIS Catalog Browser users enjoy a similar view:
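
The equivalent T-SQL sketch for the Read Objects grant (same illustrative names as before):

USE SSISDB;
GO
DECLARE @folder_id bigint = (SELECT folder_id
                             FROM catalog.folders
                             WHERE name = N'Demo');
DECLARE @principal_id int = DATABASE_PRINCIPAL_ID(N'AD\OnCallReader');

-- object_type 1 = folder; permission_type 101 = READ_OBJECTS, which surfaces
-- the projects and environments stored inside the folder.
EXEC catalog.grant_permission
  @object_type = 1
, @object_id = @folder_id
, @principal_id = @principal_id
, @permission_type = 101;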

Conclusion

The SSIS Catalog surfaces a rich and powerful security model that may be leveraged by enterprise data integration DevOps teams using SSMS or SSIS Catalog Browser.

Want to learn more? From me? Join me for the next delivery of Expert SSIS – a 2-day, live, remote SSIS training event!