Intelligent Data Integration, SSIS Design Patterns, and Biml

On Friday, 04 Aug 2017, I have the privilege and honor of delivering a full-day precon titled Intelligent Data Integration: SSIS Design Patterns and Biml as part of SQL Saturday Louisville. If you’re interested in attending, you can learn more here.

“Isn’t This The Same Presentation You Delivered Before, Andy?”

Yes and no. It has the same title but…

I’ve focused on Biml presentations for the past two years. Over the past eighteen months I’ve built the DILM Suite. These facts intersect: my goal is to facilitate DevOps and Continuous Integration (CI) with SQL Server Integration Services (SSIS), and Biml plays an important role in that goal; namely, automated SSIS code generation. The DILM Suite development work has impacted my webinars and presentations – especially this precon. I delivered SSIS Design Patterns and Biml: A Day of Intelligent Data Integration once before, over a year ago in Atlanta. Since then I’ve delivered modules of that presentation in Enterprise Data & Analytics webinars. With each delivery the DILM Suite development work has informed and inspired changes to the content of the modules; the content has evolved, and the 04 Aug delivery will be different.
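To make “automated SSIS code generation” concrete, here is a minimal Biml sketch of a single-table load. The connection strings, table, and query are illustrative placeholders; in a metadata-driven solution, BimlScript (C# nuggets embedded in the Biml) would loop over table metadata and emit one of these packages per table, and BimlExpress compiles the result into .dtsx packages.

    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <Connections>
        <!-- Placeholder connection strings -->
        <OleDbConnection Name="Source" ConnectionString="Provider=SQLNCLI11;Server=.;Initial Catalog=SourceDb;Integrated Security=SSPI;" />
        <OleDbConnection Name="Target" ConnectionString="Provider=SQLNCLI11;Server=.;Initial Catalog=TargetDb;Integrated Security=SSPI;" />
      </Connections>
      <Packages>
        <Package Name="LoadCustomer" ConstraintMode="Linear">
          <Tasks>
            <Dataflow Name="DFT Load Customer">
              <Transformations>
                <!-- The source query and destination table are placeholders -->
                <OleDbSource Name="SRC Customer" ConnectionName="Source">
                  <DirectInput>SELECT CustomerId, CustomerName FROM dbo.Customer;</DirectInput>
                </OleDbSource>
                <OleDbDestination Name="DST Customer" ConnectionName="Target">
                  <ExternalTableOutput Table="dbo.Customer" />
                </OleDbDestination>
              </Transformations>
            </Dataflow>
          </Tasks>
        </Package>
      </Packages>
    </Biml>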

This evolution-of-content has happened to many of my Biml presentations. In some cases the updates are such that today’s version of the presentation is a radical departure from the first delivery. Why? I’m constantly learning. Writing the DILM Suite has intensified my learning. As I’ve shepherded this vision and watched it come to fruition, I’ve discovered new possibilities and more use cases.

“Mini-Cycles”

I catch a glimpse of what’s possible and develop until it’s reality. As I develop, the glimpse becomes more defined and I add and refine features in response. This “mini-cycle” continues until I reach a good stopping point with a solution, product, or utility. By then I’ve caught a glimpse of a solution to another problem and begin developing a different solution… and the mini-cycle repeats for this other solution, product, or utility.

With DILM Suite, I catch a glimpse of an Euler diagram (I think visually, in graphs) showing how two or more of the solutions, products, and utilities work together to facilitate more complex DevOps and SSIS CI scenarios. This started in early 2016, around the time I began releasing a handful of free utilities. There will be more free utilities, but at the time of this writing the DILM Suite includes five free tools.

The blog post titled An Example of Data Integration Lifecycle Management with SSIS, Part 4 provides a glimpse of how one might use four of these free tools together (everything except the Biml Express Metadata Framework, which hadn’t been released at that time). At the time of this writing, that glimpse is my latest “pinnacle.” The Euler diagrams in my mind, though, are already two pinnacles beyond that – and I’m working on a third. It’s likely the 04 Aug delivery of the Intelligent Data Integration: SSIS Design Patterns and Biml precon will contain material beyond these five free tools.

The delivery after 04 Aug will likely contain still more material. I’m continuously integrating my Continuous Integration and DevOps-for-SSIS thoughts, and then building tools and designing best practices and patterns to support the latest version of my vision.

I don’t expect to stop.

Ever.

“Is the Intelligent Data Integration: SSIS Design Patterns and Biml Precon Just One Big Commercial for the DILM Suite, Andy?”

Goodness no.

In the first part I’m going to share everything I know about using what’s-in-the-box to deliver enterprise-class data integration with SSIS – some of which Kent Bradshaw and I covered in the 3-part webinar series titled SSIS Academy: Using the SSIS Catalog (we stayed “in the box” for those three webinars). In the second part I’ll point out some gaps in the out-of-the-box (OOTB) solutions and demonstrate some ways to close them. Some (not all) of the solutions I demonstrate are free DILM Suite tools.
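To give a sense of what “in the box” means here: the SSIS Catalog ships with a T-SQL API in the SSISDB database. The sketch below starts a catalog-deployed package; the folder, project, and package names are illustrative placeholders.

    DECLARE @execution_id BIGINT;

    -- Create an execution for a package deployed to the SSIS Catalog.
    -- Folder, project, and package names are placeholders.
    EXEC SSISDB.catalog.create_execution
        @folder_name = N'MyFolder',
        @project_name = N'MyProject',
        @package_name = N'LoadCustomer.dtsx',
        @use32bitruntime = 0,
        @reference_id = NULL, -- optional environment reference
        @execution_id = @execution_id OUTPUT;

    -- Set the built-in LOGGING_LEVEL system parameter (1 = Basic).
    DECLARE @logging_level SMALLINT = 1;
    EXEC SSISDB.catalog.set_execution_parameter_value
        @execution_id,
        @object_type = 50, -- 50 = system (execution) parameter
        @parameter_name = N'LOGGING_LEVEL',
        @parameter_value = @logging_level;

    -- Start the execution.
    EXEC SSISDB.catalog.start_execution @execution_id;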

Conclusion

I hope to see you at SQL Saturday Louisville 05 Aug! If you’re interested in learning more about DevOps and Data Integration Lifecycle Management, I also hope to see you at the Intelligent Data Integration: SSIS Design Patterns and Biml precon on 04 Aug.

You might enjoy engaging Enterprise Data & Analytics consultants because we like helping teams do more with SSIS.

:{>

Learn More:
An Example of Data Integration Lifecycle Management with SSIS, Part 4
The Recordings for SSIS Academy: Using the SSIS Catalog are Available
Save Time and Improve SSIS Quality with Biml

Related Training:
IESSIS1: Immersion Event on Learning SQL Server Integration Services – Oct 2017, Chicago

Andy Leonard

andyleonard.blog

Christian, husband, dad, grandpa, Data Philosopher, Data Engineer, Azure Data Factory, SSIS guy, and farmer. I was cloud before cloud was cool. :{>

2 thoughts on “Intelligent Data Integration, SSIS Design Patterns, and Biml”

  1. Hi Andy,
    I don’t know if this is the right place to post this question, but I am trying to accomplish something that is not out of the box.
    I am trying to build a package where:
    1) The user enters the SQL.
    2) The source and destination databases are supplied through project parameters.
    3) The package loads the data into the destination table based on the SQL supplied by the user.
    4) The hard part is doing the mapping in the Script Component, which changes with the source SQL.
    5) I am able to prepare the SQL and the script using the Script Component, but I am not able to do the mapping in the Script Component because it changes with every SQL statement.
    Do you think this can be accomplished using the Script Component?
    I will appreciate any help from you.
    Thanks,
    Garry

  2. Hi Garry,
      "Pipelines" are created when an SSIS developer configures an adapter in a Data Flow Task. This happens when, for instance, you add an OLE DB Source adapter to a data flow and configure it to connect to a source database via an OLE DB Connection Manager. At that point, a data flow pipeline is created and configured to use the columns – by name and data type – specified in the OLE DB Source adapter.
      As far as I know (and I could be wrong), it isn’t possible to decouple pipelines and re-couple them dynamically.
      This is why changes to source or destination tables break data flows.
      There are other data integration tools that support "semantic rationalization," which is the problem you are trying to solve. Expressor sold their data integration tool to Qlik a few years ago, and it managed changes to underlying schemata. I’m not sure whether Qlik still sells Expressor; I couldn’t locate it on their site.
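      If your source query and destination table are reachable from the same SQL Server instance (an assumption – your setup may differ), one workaround worth testing is to skip the data flow pipeline entirely and let the database engine bind the columns at run time with INSERT…EXEC, run from an Execute SQL Task. A minimal sketch; the table name and query are placeholders:

          -- Server-side load that avoids data flow column mapping.
          -- dbo.DestinationTable and the query are placeholders; the
          -- result set must still match the destination's column count
          -- and types.
          DECLARE @UserSql NVARCHAR(MAX) = N'SELECT CustomerId, CustomerName FROM dbo.Customer;';

          -- The engine binds the result set to the destination columns
          -- at run time, for whatever SQL the user supplied.
          INSERT INTO dbo.DestinationTable
          EXEC sp_executesql @UserSql;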
    Hope this helps,
    Andy
