SSIS Design Pattern: Use Script Tasks for ETL Instrumentation

I consider scripting in SQL Server Integration Services (SSIS) my data integration “get out of jail free” card. Why? If I cannot find a way to accomplish some requirement using SSIS’s built-in objects – tasks, containers, Data Flow components, etc. – I can usually write .Net code to fulfill the requirement.

As stated in the SSIS Academy short course titled Troubleshooting SSIS, I almost always add to the Control Flow an SSIS Script Task that logs the initial state of parameters and variables when package execution starts. I find this information invaluable when troubleshooting failed SSIS executions.

Configuring a Script Task for ETL Instrumentation

Begin by adding an SSIS Script Task to the Control Flow:

Open the editor.
Click inside the ReadOnlyVariables property value textbox. An ellipsis is available:

Click the ellipsis to open the Select Variables window. Select the System::TaskName and System::PackageName variables:

Click the OK button to close the Select Variables window. Note the two variables we selected now appear in the ReadOnlyVariables value textbox:

Select the variables and copy them to the clipboard:

.Net Scripting

Click the Edit Script button to open the Visual Studio Tools for Applications (VSTA) .Net code editor. Scroll to public void Main() and select the text “TODO: Add your code here”:


Paste the contents of the clipboard:

I do this because mistyping variable names results in an error that’s not very intuitive. Also, .Net design-time validation has no way to test variable names for accuracy. It’s better to copy and paste than risk an error, in my opinion.
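To illustrate the risk, here is a hypothetical typo. Note the misspelled variable name compiles without complaint but fails when the package runs:

```csharp
// Hypothetical typo: "PackgeName" compiles cleanly because .Net cannot
// validate SSIS variable names at design time...
string packageName = Dts.Variables["System::PackgeName"].Value.ToString();
// ...but at runtime the Script Task fails with a collection error
// similar to: "The element cannot be found in a collection."
```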

Declaring a .Net Variable

Begin a .Net variable declaration by typing "string " (including the trailing space). Select the System::PackageName variable name from the comment and copy it to the clipboard:

Paste the clipboard contents after "string " as shown here:

The .Net string variable declaration appears as shown:

Note that System::PackageName is not a valid .Net variable name.
It is, however, a valid SSIS variable name, where “System” is the namespace and “PackageName” is the variable name.

Edit the .Net variable so it is named “packageName” as shown:

Initializing a .Net Variable

Our goal is to read the value of the SSIS variable System::PackageName into the .Net string variable named packageName. Assigning an initial value to a variable is known as initializing the variable.

To begin initializing the .Net string variable named packageName, add the “=” sign after the variable name (packageName). In C# (and similar languages), a single equal sign is an assignment operator.

Type “Dts.” – as shown next – and note IntelliSense presents a list of available methods and objects. Scroll to the object named “Variables” as shown:

You can append “Variables” to “Dts.” by double-clicking Variables or by pressing the Tab key when Variables is selected (as shown above). Add an opening bracket – [ – followed by a double-quote – ". Note the complementary closing bracket – ] – and double-quote – " – are added automagically. Plus, the cursor is in the perfect position to paste the contents of the clipboard (again):

Move to the end of this line of .Net code and type “.v”. IntelliSense kicks in – although my screen-capture software makes the context menu opaque – and the Value property is selected as shown:

This is important.

Next type “.t”. Notice the Value property is completed by IntelliSense – again, automagically – and IntelliSense also displays the only option available that begins with the letter “t” – ToString:

Complete the statement by typing “();”:
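When complete, this line of .Net code reads:

```csharp
string packageName = Dts.Variables["System::PackageName"].Value.ToString();
```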

Rinse, Repeat

Use the same .Net variable declaration and initialization to declare and initialize the taskName .Net string variable:

Next, declare a .Net string variable named subComponent and use the packageName and taskName .Net string variables to initialize the value of subComponent:
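Applying the same pattern, the resulting statements are:

```csharp
string taskName = Dts.Variables["System::TaskName"].Value.ToString();
string subComponent = packageName + "." + taskName;
```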

Exercise your .Net variable declaration and initialization skills even more by declaring two more variables:

  1. An int variable named informationCode and initialized to 1001
  2. A bool variable named fireAgain initialized to true:

Finally, declare a .Net string variable named description and initialize it with the string

"I am " + packageName
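In the finished script, these declarations read:

```csharp
int informationCode = 1001;
bool fireAgain = true;
string description = "I am " + packageName;
```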

ETL Instrumentation via Information Event Message

Configure and raise an Information event using the following .Net statement:

Dts.Events.FireInformation(informationCode, subComponent, description, "", 0, ref fireAgain);

Note the Dts.Events.FireInformation method takes six arguments:

  1. informationCode [int]
  2. subComponent [string]
  3. description [string]
  4. helpFile [string]
  5. helpContext [int]
  6. fireAgain [bool]

We use the .Net variables informationCode, subComponent, description, and fireAgain to supply four of the arguments. We supply literal values – "" and 0, respectively – to the helpFile and helpContext arguments.

When complete, our Main() method contains the following .Net code:

public void Main()
{
    // System::TaskName,System::PackageName
    string packageName = Dts.Variables["System::PackageName"].Value.ToString();
    string taskName = Dts.Variables["System::TaskName"].Value.ToString();
    string subComponent = packageName + "." + taskName;
    int informationCode = 1001;
    bool fireAgain = true;

    string description = "I am " + packageName;
    Dts.Events.FireInformation(informationCode, subComponent, description, "", 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}
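A variation I sometimes find useful – offered here as a sketch, not as part of the walkthrough above – logs every variable listed in the Script Task's ReadOnlyVariables property instead of hard-coding each name. It relies on the fact that the Dts.Variables collection contains only the variables locked by the task:

```csharp
public void Main()
{
    bool fireAgain = true;

    // Dts.Variables contains only the variables listed in the task's
    // ReadOnlyVariables and ReadWriteVariables properties.
    foreach (Microsoft.SqlServer.Dts.Runtime.Variable v in Dts.Variables)
    {
        string description = v.QualifiedName + " = "
            + (v.Value == null ? "(null)" : v.Value.ToString());
        Dts.Events.FireInformation(1001, "Instrumentation", description,
            "", 0, ref fireAgain);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
```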

Test It!

Close the VstaProjects code editor by closing the window, then close the Script Task Editor by clicking the OK button.

Execute the SSIS package in the SQL Server Data Tools (SSDT) debugger by pressing the F5 key. If all goes as planned, you should see a successful debug execution:

Click the Progress tab to view the Information message:



ETL Instrumentation is an important part of data integration development with SSIS. It increases development time (slightly) to add ETL Instrumentation such as this but it’s worth it. When troubleshooting an SSIS package at 2:00 AM some months after deployment, ETL Instrumentation will save time.

The additional time spent during development is the opposite of technical debt: it’s a technical investment.

Learn more!

Get live, online training – from me! Check out the Enterprise Data & Analytics Training page.

Check out Introduction to Troubleshooting SSIS – a free sample of SSIS Self-Help at SSIS Academy!

The Recording of Faster SSIS is Available

The recording for the free webinar titled Faster SSIS is available and may be viewed above. Thanks to everyone who attended!

Regarding “Bad” BLOBTempStoragePath Settings

Regarding the BLObs.dtsx demo SSIS package, Doug asked a really good question:

“Does the package abort if the [BLOBTempStoragePath] folder path is invalid or does it create it for you?”

The answer is: The package fails. The path is not created for us.


The errors are:

[OLE DB Source [46]] Error: Failed to retrieve long data for column "Demographics".
[OLE DB Source [46]] Error: There was an error with OLE DB Source.Outputs[OLE DB Source Output] on OLE DB Source. The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [46]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "OLE DB Source.Outputs[OLE DB Source Output]" failed because error code 0xC0209071 occurred, and the error row disposition on "OLE DB Source" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.


Enjoy the video!


Andy Answers: Format a Flat File Name

My Twitter DMs are open @AndyLeonard. Ask me a question about SSIS, ADF, or Biml. I may answer it with a blog post, a video, or both.

Today’s Andy Answer is an answer to an SSIS question received via @AndyLeonard direct message:

Using SSDT, I have 3 variables: one for the file path, one for date string, and one for file name. I’m trying to create the flat file destination with a variable so the date will be correct each day the file is generated. I have the right expressions for each variable, however, when the file is created it only shows the file name variable value. I configured the flat file connection manager’s ConnectionString property with an expression that concatenates these three variables. Can you help?


Yes. Yes I can help. I Am Here To Help.™

An Example to Demonstrate

Create a demo project:

Rename the SSIS package, if you’d like:

Add new SSIS Variables to the SSIS Package. 

I added four variables to start:

  • FlatFilePath [String] – defaulted to a path on my server.
  • FlatFileName [String] – a hard-coded base name for my file.
  • FlatFileExtension [String] – to hold my file extension (csv in this case).
  • FlatFileDateString [String] – the date formatted as I desire:

Formatting the FlatFileDateString SSIS variable is tricksy. I used the following SSIS Expression:

(DT_WSTR,4)YEAR(@[System::StartTime]) + (DT_WSTR,2)((MONTH(@[System::StartTime]) < 10) ? "0" + (DT_WSTR,2)MONTH(@[System::StartTime]) : (DT_WSTR,2)MONTH(@[System::StartTime])) + (DT_WSTR,2)((DAY(@[System::StartTime]) < 10) ? "0" + (DT_WSTR,2)DAY(@[System::StartTime]) : (DT_WSTR,2)DAY(@[System::StartTime]))
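An equivalent – and, in my opinion, easier-to-read – expression uses the RIGHT function to zero-pad the month and day. This is an alternative sketch, not the expression used in the demo:

```
(DT_WSTR,4)YEAR(@[System::StartTime])
+ RIGHT("0" + (DT_WSTR,2)MONTH(@[System::StartTime]), 2)
+ RIGHT("0" + (DT_WSTR,2)DAY(@[System::StartTime]), 2)
```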

Add a new SSIS Variable to contain the full path to the flat file. My variable is named FlatFileFullPath and the SSIS Expression used to build it is shown here:

My SSIS Variables, configured:

Note: the initial value for the FlatFileFullPath Expression is:

@[User::FlatFilePath] + "\\" + @[User::FlatFileName] + "_" + @[User::FlatFileDateString] + "." + @[User::FlatFileExtension]

Add a Data Flow Task. In the Data Flow Task, add an OLE DB Source adapter. Configure the source adapter to read some data from some sample database:

Create a test sample file in the directory you specified in the FlatFilePath SSIS Variable Value:

Populate the sample file with column names and one sample data row. Enclose the column names and data values in double-quotes:

Create a Flat File Connection Manager. Configure the Text Qualifier as double-quotes and make sure the “Column names in the first data row” checkbox is checked:

Verify Flat File Connection Manager configuration by viewing the Columns page:

Add a Flat File Destination adapter to the Data Flow Task and configure it to use the Flat File Connection Manager. Ensure the “Overwrite data in the file” checkbox is checked to make the SSIS package re-executable (within the same day, at least):

Select the Flat File Connection Manager and view its Properties. Click the ellipsis for the Expressions property to open the Property Expressions Editor for the Flat File Connection Manager. From the Property dropdown, select ConnectionString:

Drag the FlatFileFullPath variable to the Expression textbox. Click the Evaluate Expression button to examine and verify the value of the FlatFileFullPath SSIS variable: 

When you close the Expression Builder, the Property Expressions Editor should appear similar to:

Once the ConnectionString Expression override is configured, set the DelayValidation property for the Flat File Connection Manager to True. This setting delays validation of the connection manager until the task that uses it executes, which is helpful if the file does not (yet) exist:

A test execution in the SSIS debugger proves we’ve accomplished our goal:

One Change

I recommend (at least) one change to this package: Move the FlatFilePath SSIS Variable to a Package Parameter. Why? One may desire to change the target directory of the output file at some point and a parameter value is easy to override at runtime:

After converting the SSIS variable to a package parameter, update the Expression for the FlatFileFullPath SSIS variable to reflect the change:

The updated Expression is:

@[$Package::FlatFilePath] + "\\" + @[User::FlatFileName] + "_" + @[User::FlatFileDateString] + "." + @[User::FlatFileExtension]

A second test run lets me know we’re all good.


This is one answer for this question posed to me on Twitter.

Do you have an SSIS question? Ask! Andy Answers.


Delivering Intelligent Data Integration through SSIS in Dallas 31 May 2019!

I am honored to announce I will be delivering Intelligent Data Integration through SSIS – a day-long pre-conference session at SQL Saturday 841 in Dallas – on 31 May 2019!


What is Intelligent Data Integration? SSIS (SQL Server Integration Services) packages developed using tried and true design patterns, built to participate in a DevOps enterprise practicing DILM (Data Integration Lifecycle Management), produced using Biml (Business Intelligence Markup Language) and executed using an SSIS Framework.

Attend a day of training focused on intelligent data integration delivered by an experienced SSIS consultant who has also led an enterprise team of several ETL developers during multiple projects that spanned 2.5 years (and delivered).

Attendees will learn:

  • a holistic approach to data integration design.
  • a methodology for enterprise data integration that spans development through operational support.
  • how automation changes everything. Including data integration with SSIS.

Topics include:

  1. SSIS Design Patterns
  2. Executing SSIS in the Enterprise
  3. Custom SSIS Execution Frameworks
  4. DevOps and SSIS
  5. Biml, Biml Frameworks, and Tools

Prerequisites: Familiarity with SQL Server Integration Services (SSIS).

Continental Breakfast, Lunch and Afternoon Snacks are provided.

I hope to see you there!


Enterprise Data & Analytics Welcomes Shannon Lowder!

I am honored and thrilled to welcome Shannon Lowder (@shannonlowder | blog | LinkedIn) to the Enterprise Data & Analytics team!

Shannon is a data engineer, data scientist, BimlHero (though not listed on the page at the time of this writing), and shepherd of the Biml Interrogator open source project. If you use Biml to generate SSIS projects that load flat files, you need Biml Interrogator.

Shannon, Kent Bradshaw, and I are also co-authoring a book tentatively titled Frameworks. (Confession: Kent and Shannon are mostly done… I am slacking…)

Shannon brings a metric ton of experience to serve our awesome clients. He has years of experience in data analytics, serving recently in the role of enterprise Lead Data Scientist. Shannon’s experience spans supply chain management, manufacturing, finance, and insurance.

In addition to his impressive data skills, Shannon is an accomplished .Net developer with enterprise senior developer experience (check out Biml Interrogator for a sample of his coding prowess).

Shannon is a regular speaker at SQL Saturday events, presenting on topics that include Business Intelligence, Biml, and data integration automation. He is a gifted engineer with experience in framework design, data integration patterns, and Azure who possesses a knack for automation. Shannon is an avid “tinkerer” who enjoys learning. He has experience implementing Azure Machine Learning and applying AI to predictive analytics using sources classified Big Data. Shannon is also a practitioner of data integration DevOps with SSIS. In other words, he fits right in with our team here at Enterprise Data & Analytics!

As Shannon writes on his LinkedIn profile:

I am a data guy with a passion for partnering with clients to solve their database and technology issues. Over my career, I’ve played all the roles: database developer, administrator, business intelligence developer, and architect, and now consultant. I’m the guy you call in when you have the impossible problem and everyone tells you it cannot be solved. I automate solutions in order to free your current staff to do the higher value tasks. I bring solutions outside of traditional relational database solutions, in order to find the shortest path between you and your goals.

As an accomplished Microsoft SQL data professional, recognized BimlHero, and practicing Data Scientist, I’m the resource you need to extract the most value from your data.

I’m humbled and thankful and excited to watch Enterprise Data & Analytics continue to (quietly) grow – adding cool people (another announcement is forthcoming) and service offerings like Data Concierge. It’s very cool to watch!

Welcome Shannon! I’m honored to work with you, my brother and friend.

For more information, please contact Enterprise Data & Analytics!

Join me For Expert SSIS Training!

I’m honored to announce Expert SSIS – a course from Enterprise Data & Analytics!

The next delivery is 01-02 Apr 2019, 9:00 AM – 4:30 PM ET.

Data integration is the foundation of data science, business intelligence, and enterprise data warehousing. This instructor-led training class is specifically designed for SQL Server Integration Services (SSIS) professionals responsible for developing, deploying, and managing data integration at enterprise-scale.

You will learn to improve data integration with SSIS by:

  • Building faster data integration.
  • Making data integration execution more manageable.
  • Building data integration faster.


Topics include:

  1. SSIS Design Patterns for Performance – how to build SSIS packages that execute and load data faster by tuning SSIS data flows and implementing performance patterns.
  2. SSIS Deployment, Configuration, Execution, and Monitoring – the “Ops” part of DevOps with SSIS using out-of-the-box tools and the latest utilities.
  3. Automation – how to use Business Intelligence Markup Language (Biml) to improve SSIS quality and reduce development time.

I hope to see you there!

PS – Want to Learn More About Azure Data Factory?

Follow Andy Leonard’s SSIS Training page for more information.

Viewing SSIS Configurations Metadata in SSMS

Let’s take a look at an SSIS Project in SSMS:

Demo is the SSIS Catalog folder. Demo contains three SSIS projects named 0-Monolith, EmptySSISProject, and LiftAndShift.

If we expand the LiftAndShift SSIS project we see it contains a single SSIS package named Load Customer.dtsx.

The Demo folder contains two Catalog environments named env1 and env2.

These are the artifacts surfaced via the Integration Services Catalogs node in SSMS.

Before I go on, you may read what I’m about to write here and in the companion post and think, “Andy doesn’t like the Integration Services Catalogs node in SSMS.” That is not accurate. I do like the Integration Services Catalogs node in SSMS. It surfaces enough information for the primary target user of SSMS – the Database Administrator – to see what they need to see to do their job, without “cluttering up” their interface with stuff they rarely need to see and even more rarely change.

But Wait, There’s More

There are configurations that are not readily surfaced in the Integration Services Catalogs node in SSMS. To view the configurations metadata for the LiftAndShift SSIS project, right-click the project and click Configure:

The Configure window displays:

From this window I can see examples of the three sources of values for SSIS project and package parameters:

  1. Design-time defaults
  2. Literals
  3. Reference mappings

Design-Time Defaults

Design-time defaults are the default values entered by the SSIS developer when building the SSIS project. These values remain with the SSIS project as it is deployed. They are not changed unless and until another version of the SSIS project is deployed, one which contains different default values for parameters.

Design-time defaults are denoted by no text decoration.


Literals

There are actually two kinds of literal values:

  1. Configuration literals
  2. Execution literals

Configuration Literals

Configuration literals are stored in the SSIS Catalog. The values are “hard-coded” to override design-time defaults when the SSIS package is executed.

Configuration literals are denoted by bold text decoration.

When the SSIS project was first deployed to the SSIS Catalog, the value of ProjectLiteralParameter was the design-time default. We can see the design-time default value of ProjectLiteralParameter if we open the Configure window and click the ellipsis beside the Value cell for the parameter:

The design-time default value for ProjectLiteralParameter is “Project Default Value” shown circled in green above. The Configuration Literal value is “Project Literal Value” and is shown inside red and blue boxes in both the Set Parameter Value dialog and the Configure project window.
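Configuration literals can also be set outside SSMS by calling a stored procedure in the SSISDB catalog. The following T-SQL is a sketch – it assumes the Demo folder and LiftAndShift project from this post, and uses catalog.set_object_parameter_value with @value_type = 'V' (a literal value):

```sql
USE SSISDB;
GO

-- Set a literal (value_type 'V') override for a project parameter.
-- @object_type 20 = project-scoped parameter; 30 = package-scoped.
EXEC catalog.set_object_parameter_value
    @object_type     = 20,
    @folder_name     = N'Demo',
    @project_name    = N'LiftAndShift',
    @parameter_name  = N'ProjectLiteralParameter',
    @parameter_value = N'Project Literal Value',
    @value_type      = 'V';
```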

Execution Literals

Execution literals may be supplied in the Execute Package window. The Execute Package window is displayed when a user right-clicks an SSIS package and then clicks Execute:

Please note the Configuration literals – the values shown in the previous Configuration window with bold text decoration – are shown with no text decoration in the Execute Package window:


Why are the Configuration literals not identified as such in the Execute Package window? The reason is that these values can be overridden as Execution literals.

If I click the ellipsis beside the value cell for the parameter named ProjectLiteralParameter, I can type in a value – an execution literal value – that will override the value of the ProjectLiteralParameter when the SSIS package is executed. The text decoration – in the Execute Package window – for an Execution Literal is bold:

This is a little complex so please bear with me as we examine what just happened.

When the SSIS project was first deployed to the SSIS Catalog, the value of ProjectLiteralParameter was the design-time default. We saw this value in the image above, repeated here:

The design-time default value for ProjectLiteralParameter is “Project Default Value” shown circled in green above. The Configuration Literal value is “Project Literal Value” and is shown inside red and blue boxes in both the Set Parameter Value dialog and the Configure project window.

When we open the Execute window, the Configuration literal value – “Project Literal Value” – is shown with no text decoration (as shown earlier):


We can override the configured override for this execution of the SSIS package and this execution only by clicking the ellipsis to the right of the Value cell for ProjectLiteralParameter. The ellipsis opens the Edit Literal Value for Execution dialog into which we can enter an Execution literal value:

Please note: Execution literal values are not persisted in the SSIS Catalog. They are applied for the current execution – the execution triggered when the user clicks the OK button on the Execute Package window – and that execution only. If the user clicks the Cancel button and then re-opens the Execute Package window, all values revert to their Configured state.

Reference Mappings

Reference mappings are a mechanism for externalizing configurations in the SSIS Catalog. References are an elegant solution for configurations and release management. As with all flexible solutions, they are complex.

Reference mappings begin with References, and references begin with SSIS Catalog Environments (did I mention this was complex?).

Not discussed: Execution literals may also be used to override parameters that are reference-mapped. More on reference mappings in a bit…

SSIS Catalog Environments

An SSIS Catalog Environment is a container that holds SSIS Catalog Environment Variables. An SSIS Catalog Environment has the following configurable properties:

  • Name
  • Description (optional)

Each SSIS Catalog Environment Variable has the following configurable properties:

  • Name
  • Type
  • Description (optional)
  • Value
  • Sensitive


References

A reference is a relationship between an SSIS Catalog Environment and an SSIS project (or SSIS package) that is deployed to an SSIS Catalog:

References are configured in the Configuration window on the References page:

Reference Mappings

A reference mapping applies (“maps”) the value of an SSIS Catalog Environment Variable – accessed via a reference – to an SSIS Project (or SSIS package) parameter.

Reference mappings are denoted by underlined text decoration:

In the image above, StringParameter is the name of the SSIS Catalog Environment Variable mapped to the parameter named ProjectParameter. This means the value contained in the SSIS Catalog Environment Variable named StringParameter will be used to override the value of the parameter named ProjectParameter at execution time.

There are several components of a Reference Mapping configuration. If we start our description at the parameter, feel free to sing along to our own version of The Skeleton Song substituting the following:

  • The parameter is part of the SSIS project
  • The project references the SSIS Catalog Environment
  • The SSIS Catalog Environment contains the SSIS Catalog Environment Variable
  • The SSIS Catalog Environment Variable’s Value property overrides the parameter value at execution time.
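The chain above can also be scripted against the SSISDB catalog in T-SQL. This is a hedged sketch using the folder, project, environment, and variable names from this post (Demo, LiftAndShift, env1, StringParameter, ProjectParameter):

```sql
USE SSISDB;
GO

-- 1. The environment contains the environment variable.
EXEC catalog.create_environment_variable
    @folder_name      = N'Demo',
    @environment_name = N'env1',
    @variable_name    = N'StringParameter',
    @data_type        = N'String',
    @sensitive        = 0,
    @value            = N'Environment Value',
    @description      = N'';

-- 2. The project references the environment.
DECLARE @reference_id BIGINT;
EXEC catalog.create_environment_reference
    @folder_name      = N'Demo',
    @project_name     = N'LiftAndShift',
    @environment_name = N'env1',
    @reference_type   = 'R',  -- relative: environment lives in the project folder
    @reference_id     = @reference_id OUTPUT;

-- 3. Map the environment variable to the parameter (value_type 'R').
EXEC catalog.set_object_parameter_value
    @object_type     = 20,
    @folder_name     = N'Demo',
    @project_name    = N'LiftAndShift',
    @parameter_name  = N'ProjectParameter',
    @parameter_value = N'StringParameter',  -- name of the environment variable
    @value_type      = 'R';
```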

Not discussed: Multiple references and reference selection at execution time.

Viewing the Execution-Time Value

You can find the value that will be used at execution-time using SSMS.

View Design-Time Default and Configuration Literal Values

Design-time default and configuration literal values are available from the Configuration window:

Reference Mapping Values

You have to open a few windows – and a couple of pages on the same window (Configure) – to surface the execution-time value of a parameter that is mapped via reference:


It’s… complicated. And it’s even more complex if we include multiple references to different SSIS Catalog Environments.

This last image is why I wrote SSIS Catalog Browser.

Catalog Browser is free and I write about how to see these same SSIS Catalog artifacts in the companion post titled Viewing SSIS Configurations Metadata in SSIS Catalog Browser.

Using SSIS Framework Community Edition Webinar 20 Sep

Join me 20 Sep 2018 at noon ET for a free webinar titled Using SSIS Framework Community Edition!


SSIS Framework Community Edition is free and open source. You may know you can use SSIS Framework Community Edition to execute a collection of SSIS packages with a call to a single stored procedure, passing a single parameter. But did you know you can also use it to execute a collection of SSIS packages in the Azure Data Factory SSIS Integration Runtime? You can!

In this free webinar, Andy discusses and demonstrates SSIS Framework Community Edition – on-premises and in the cloud.

Join SSIS author, BimlHero, consultant, trainer, and blogger Andy Leonard at noon EDT Thursday 20 Sep 2018 as he demonstrates SSIS Framework Community Edition – on-premises and in the cloud.

I hope to see you there!

Register today.


Honored to Present Faster SSIS at SQL Saturday Boston 22 Sep 2018!

I am honored to present Faster SSIS at SQL Saturday Boston 22 Sep 2018!

Check out this schedule – there are a bunch of smart people presenting – plus me!


Ever wonder why SSIS runs so slow? Watch SSIS author Andy Leonard as he runs test loads using sample and real-world data and shows you how to tune SQL Server 2017 Integration Services (SSIS 2017) packages.

We’ll start by experimenting with SSIS design patterns to improve performance loading AdventureWorks data. We will implement different change detection patterns and compare execution performance for each. Then, we’ll explain a Data Flow Task’s bottleneck when loading binary large objects – or Blobs.

Finally, we’ll demonstrate a design pattern that uses a Script Component in a Data Flow to boost load performance to MySql, whether on-premises or in the cloud.
Prerequisites: None. Some SSIS and SQL knowledge will be helpful but is not required.

I hope to see you there! I’d love to meet you if you read this blog and attend – just walk up and introduce yourself, I’m the southern guy with the braided beard.


PASS Summit 2018 Starts in 10 Weeks!

Can you believe the PASS Summit 2018 begins only 10 weeks from today (27 Aug 2018)? I confess, this is sneaking up on me fast!

I will be there. Will you?

Where Can You Find Andy at the PASS Summit 2018?


Monday 5 Nov 2018, I’m delivering a full-day pre-conference session titled Intelligent Data Integration with SSIS. I’m going to cover everything listed at that link, but there is an update about my precon content:

There will be Azure Data Factory content and demos!

Why this addition? Two reasons:

  1. My presentation titled Faster SSIS was selected. I usually include the three Faster SSIS demos in my precon. This time, you can just view the Faster SSIS session to see those demos.
  2. I may have something cool and new to share about Azure Data Factory that is currently under NDA! Stay tuned…

Enterprise Data & Analytics is Exhibiting!

That’s right, you can find me in the Exhibition Hall! Enterprise Data & Analytics is exhibiting at the PASS Summit 2018!

Have an SSIS or Biml or ADF question? Stop by our booth!
Want to grab a selfie with me or Nick? Stop by our booth!
Want me to autograph your book? Stop by our booth!
Need some consulting or training help? Stop by our booth!

I’m so excited about this – I can hardly wait. We’ll have more information about specific dates and times when I will be manning the booth in coming weeks.

Presenting Faster SSIS

At the time of this writing, the session schedule has not yet been published. PASS has published a general schedule. Keep checking for details!


I am looking forward to the PASS Summit 2018. I hope to see you there.