T-SQL Tuesday #111 – What is Your “Why?”

I enjoy math. I noticed a pattern while learning math; perhaps you experienced something similar. I found arithmetic an exercise in memory. I have a good memory (well, I had a good memory…), so memorizing a bunch of rules was no big deal.

When I learned algebra, arithmetic made more sense. In addition to the memorized rules, I saw why the rules existed. I understood the rules better as a result.

This pattern held all through my math education. I understood algebra better once I learned geometry. I understood geometry better once I learned trigonometry. I understood trigonometry better once I learned single-variable calculus.

An Axiom (for me)

I notice a similar pattern applies to my career (or careers, as the case may be). I’ve served in many roles: 

  • Farm laborer
  • Musician
  • Stockyard laborer
  • Truck driver
  • Service technician
  • Soldier (part-time in the Virginia Army National Guard)
  • Electrician
  • Electrical engineer
  • Electronics technician
  • Manufacturing automation integrator
  • Software developer
  • Author
  • Data professional
  • Entrepreneur

The pattern manifests itself in this manner: I’ve enjoyed a position – and enjoyed more success in it – when I had a reason to do the work beyond a paycheck. In some cases, I had multiple “why’s” beyond remuneration. For example, I joined the Virginia Army National Guard to learn electronics and to serve my country – especially to protect everyone’s right to free speech guaranteed by the First Amendment. I may not agree with what people say, but I was (and still am) willing to fight and die to preserve the right of US citizens to say whatever they want.

As a result, I enjoyed serving in the National Guard (for the most part). I learned more. I learned better, I think, because I enjoyed serving.

Entrepreneurship

Entrepreneurship can be challenging. I believe one needs a “why” – or perhaps several “why’s” – to remain an entrepreneur. The “why” cannot simply be money. Money isn’t inconsequential, mind you, but I believe the best “why’s” are less tangible.

Passion plays a major role for me. When business isn’t going well or when business is going too well, a couple intangible “why’s” – passions for both entrepreneurship and the kind of work I am blessed to do – inspire me to keep a steady hand on the tiller.

Also, entrepreneurship affords more and different ways to serve people. Am I saying one must be an entrepreneur to serve others? Nope. Flexibility with time, though, facilitates opportunities that may not otherwise be possible, or as possible.

What is Your “Why?”

That’s the question this month: Why do you do what you do?

I look forward to your replies.

The Rules

T-SQL Tuesday has the following rules:

  1. Publish your contribution on Tuesday, 12 Feb 2019, between 00:00 UTC and 23:59 UTC.
  2. Include the T-SQL Tuesday Logo and have it link to this post.
  3. Please comment below with a link to your post.
  4. Tweet about your post using #tsql2sday.
  5. If you’d like to host in the future, contact Steve Jones.

Roundup

On or about 19 Feb 2019, I will publish a roundup post. (Until then, that’s a broken link…)

Enjoy!

:{>

Delivering Intelligent Data Integration through SSIS in Dallas 31 May 2019!

I am honored to announce I will be delivering Intelligent Data Integration through SSIS – a day-long pre-conference session at SQL Saturday 841 – Dallas on 31 May 2019!

Description:

What is Intelligent Data Integration? SSIS (SQL Server Integration Services) packages developed using tried and true design patterns, built to participate in a DevOps enterprise practicing DILM (Data Integration Lifecycle Management), produced using Biml (Business Intelligence Markup Language) and executed using an SSIS Framework.

Attend a day of training focused on intelligent data integration delivered by an experienced SSIS consultant who has also led an enterprise team of several ETL developers during multiple projects that spanned 2.5 years (and delivered).

Attendees will learn:

  • a holistic approach to data integration design.
  • a methodology for enterprise data integration that spans development through operational support.
  • how automation changes everything – including data integration with SSIS.

Topics include:

  1. SSIS Design Patterns
  2. Executing SSIS in the Enterprise
  3. Custom SSIS Execution Frameworks
  4. DevOps and SSIS
  5. Biml, Biml Frameworks, and Tools

Prerequisites: Familiarity with SQL Server Integration Services (SSIS).

Continental Breakfast, Lunch and Afternoon Snacks are provided.

I hope to see you there!

:{>

Lift and Shift is Not a Strategy…

Lift and shift is a bridge.

If your enterprise is considering migrating enterprise data and data integration workloads from on-premises to the cloud, lifting and shifting is not the end of the story. Lifting and shifting is, rather, the mechanism – the conversion project – that positions your enterprise to leverage the economies of scale afforded by cloud technology.

Andy’s Lift and Shift FAQ

“Should We Lift and Shift?”

I hear this question often and my response is, “When it makes sense to do so, yes, you should lift and shift.” That raises the next question, which is…

“How Do We Know It’s Time to Lift and Shift?”

My engineer-y response to this question is, “You will know it’s time to lift and shift to the cloud when you want to leverage functionality available in the cloud that is not (or not yet) available in on-premises versions of the enterprise platform(s) in use in your enterprise.”

“What Are Some Advantages of Migrating to the Cloud?”

The biggest advantage of lifting and shifting enterprise data to the cloud is the ability to efficiently scale operations. By efficiently, I mean quickly and easily – especially when compared to the time and expense (don’t forget opportunity cost when calculating expense) to scale up systems on-premises.

The ability to scale up and scale down on demand is a huge advantage for some business models which experience “spike-y” demand for operations at different times of the year, quarter, or month. But even if that’s not the case, all data scales. It’s very handy to be able to connect to the Azure Portal and move a slider (as opposed to purchasing and provisioning more hardware…).
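
For the record, the slider isn’t even required. Here’s a minimal sketch – assuming an Azure SQL Database named MyDataWarehouse (a hypothetical name) – of the same scaling operation in T-SQL:

-- Scale an Azure SQL Database to a larger service objective.
-- 'S3' is an example; available service objectives vary by region
-- and change over time - check current offerings before running.
ALTER DATABASE [MyDataWarehouse] MODIFY (SERVICE_OBJECTIVE = 'S3');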

There’s a brand new (in my opinion) “knob” exposed by cloud-enabled efficient scaling. As I wrote in my post titled Time and Money in the Cloud:

Let’s say you pay $100 to incrementally load your data warehouse and the load takes 24 hours to execute at the scale you’ve selected in the cloud. Prior to thinking in DTUs, engineers and business people would think, “That’s just the way it is. If I want more or faster, I need to pay for more or faster.” But DTU math doesn’t quite work that way. Depending on your workload and DTU pricing at the time (FULL DISCLOSURE: DTU PRICING CHANGES REGULARLY!), you may be able to spend that same $100 on more compute capabilities and reduce the amount of time required to load the same data into the same data warehouse to minutes instead of hours…

The fact that the cost/performance curve can be altered in seconds instead of months meta-changes everything.
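
To make the DTU math concrete, here’s a hypothetical back-of-the-napkin version (the rates are made up for illustration only):

$100 ÷ 24 hours ≈ $4.17/hour at the smaller scale
$100 ÷ 1 hour = $100/hour at roughly 24x the compute

Same $100 spend either way – but in the second case, the data warehouse is loaded and available 23 hours sooner.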

“Are There Disadvantages of Migrating to the Cloud?”

It depends. (You knew that was coming, didn’t you?)

Enterprise Data & Analytics helps enterprises migrate data, data integration, lifecycle management, and DevOps to the cloud. In some cases (~30%), the enterprises spend a little more money in the near-term. There are three reasons for this:

  1. When upgrading, it’s always a good idea to operate new systems in tandem with existing systems. In a lift and shift scenario, this means additional expenses for cloud operations while maintaining the expense of on-premises operations. As cloud operations are validated, on-premises operations are shut off, thereby reducing operating expenses. In truth, though, this dynamic (and expense) exists whether one is lifting and shifting to the cloud or simply upgrading systems on-premises.
  2. “Standing on the bridge” (more in a bit) can cost more than remaining either on-premises or lifting and shifting the entire enterprise workload to the cloud.
  3. Regulatory requirements – including privacy and regulations about which data is allowed to leave nation-states – will constrain many industries, especially government agencies and NGOs (non-governmental organizations) who interact heavily with government agencies.

Standing On The Bridge

One option we at Enterprise Data & Analytics consider when assisting enterprises in lift and shift engagements is something we call “standing on the bridge.” 

Standing on the bridge is present in each lift and shift project. It’s one strategy for implementing hybrid data management, which almost every enterprise in the cloud today has implemented. Hybrid means part of the enterprise data remains on-premises and part of the enterprise data is lifted and shifted to the cloud. 

Hybrid is implemented for a variety of reasons which include:

  • Mitigating regulatory concerns; and
  • As part of the normal progression of lifting and shifting enterprise data and data workloads to the cloud.

Standing on the bridge for too long is a situation to avoid. 

“How Do We Avoid Standing on the Bridge For Too Long?”

Planning. Planning is how an enterprise avoids standing on the bridge too long. Your enterprise wants advice from experienced professionals to shepherd the lift and shift operation. 

Enterprise Data & Analytics can help.

Helpful Tools

Enterprise Data & Analytics has been delivering – and writing and speaking about – Data Integration Lifecycle Management for years.

We’ve built helpful tools and utilities that are available at the DILM Suite. Most of the DILM Suite tools are free and some are even open source. 

Enterprise Data & Analytics Welcomes Shannon Lowder!

I am honored and thrilled to welcome Shannon Lowder (@shannonlowder | blog | LinkedIn) to the Enterprise Data & Analytics team!

Shannon is a data engineer, data scientist, BimlHero (though not listed on the page at the time of this writing), and shepherd of the Biml Interrogator open source project. If you use Biml to generate SSIS projects that load flat files, you need Biml Interrogator.

Shannon, Kent Bradshaw, and I are also co-authoring a book tentatively titled Frameworks. (Confession: Kent and Shannon are mostly done… I am slacking…)

Shannon brings a metric ton of experience to serve our awesome clients. He has years of experience in data analytics, serving recently in the role of enterprise Lead Data Scientist. Shannon’s experience spans supply chain management, manufacturing, finance, and insurance.

In addition to his impressive data skills, Shannon is an accomplished .Net developer with enterprise senior developer experience (check out Biml Interrogator for a sample of his coding prowess).

Shannon is a regular speaker at SQL Saturday events, presenting on topics that include Business Intelligence, Biml, and data integration automation. He is a gifted engineer with experience in framework design, data integration patterns, and Azure who possesses a knack for automation. Shannon is an avid “tinkerer” who enjoys learning. He has experience implementing Azure Machine Learning and applying AI to predictive analytics using sources classified as Big Data. Shannon is also a practitioner of data integration DevOps with SSIS. In other words, he fits right in with our team here at Enterprise Data & Analytics!

As Shannon writes on his LinkedIn profile:

I am a data guy with a passion for partnering with clients to solve their database and technology issues. Over my career, I’ve played all the roles: database developer, administrator, business intelligence developer, and architect, and now consultant. I’m the guy you call in when you have the impossible problem and everyone tells you it cannot be solved. I automate solutions in order to free your current staff to do the higher value tasks. I bring solutions outside of traditional relational database solutions, in order to find the shortest path between you and your goals.

As an accomplished Microsoft SQL data professional, recognized BimlHero, and practicing Data Scientist, I’m the resource you need to extract the most value from your data.

I’m humbled and thankful and excited to watch Enterprise Data & Analytics continue to (quietly) grow – adding cool people (another announcement is forthcoming) and service offerings like Data Concierge. It’s very cool to watch!

Welcome Shannon! I’m honored to work with you, my brother and friend.

For more information, please contact Enterprise Data & Analytics!

“Think of the Money You Saved”

A True Story of Value vs. Hourly Rate tells the (true) story of a client who did the math. Enterprise Data & Analytics – charging more per hour – was delivering more value than an offshore team charging less than half our rate. Granted, the lady who did the math is a math major. But still.

To quote the sage wisdom of Foghorn Leghorn, “Figures don’t lie.”

I Engineer

“My name is Andy and I’m an engineer.” In case you’ve never spent time around an engineer or overheard one of us interacting with others, we have a reputation for being… direct. (Direct is the nicest word I could think of to describe many of us. Brash, harsh, rude… these terms all come to mind.)

Not all, but many engineers do not prioritize emotion. Many engineers are solution-focused to the exclusion of, well, many things that may actually improve the solution – things that would almost certainly reduce the time required to achieve said solution.

Meetings With Potential Clients

My engineer-y skills shine when meeting with potential clients. By “shine” I mean I often say things that are not at all helpful to Enterprise Data & Analytics’ potential client and are most often not helpful to closing the sale. One such saying is…

“Think of the money you saved!”

When do I say, “Think of the money you saved!”? Usually when the potential client is describing a previous solution that did not solve the problem they were trying to solve. If I had a nickel for every time I’ve heard someone describe the factors that led to the decision to hire someone to fix the problem – and heard that cost per hour tipped the scales in the final decision – I would have a lot of nickels.

Tragedy

Hiring the lowest – or even a lower – hourly-rate firm is sometimes tragic… unless you’re hiring unskilled labor (and even that might be tragic).

My friend Nick is an executive corporate strategist. Nick is very good at his job. He has conversations with other executives to help them avoid this error. Nick’s logic? “You believe it’s a good idea to hire a cut-rate professional to engage on an enterprise technology project, but if your child needs a surgeon or you need a divorce lawyer, only the best will do?”

Nick has a point. When getting things done right the first time matters, we naturally hire the best available.

“Do ‘the Best’ Always Charge the Most?”

No. “The best” service providers do not always demand the highest rate. Sometimes the best salespeople demand the highest rate. They often get that rate because they excel at persuasion. That said, the market almost always bears a higher rate for professionals who are experienced and deliver value.

Why Do I Say, “Think of All the Money You Saved?”

My heart is breaking. I’m learning about a company that paid good money for a solution and got either nothing or not the solution they were looking for. The money they’ve already spent was either partially or completely wasted. Now they are willing to spend more to have the project fixed or done correctly, i.e. delivered. I’m being a little obvious – and not very helpful – by stating a lesson they’ve already learned…

The hard way.

Confession Time

Ok, I confess: I don’t actually say, “Think of all the money you saved.” At least not anymore. Why did I stop saying that? It’s not kind. It’s not empathetic. It doesn’t help. Saying that could cost us the gig, even. Nobody likes a jackhole.

I instead focus on helping. We can help most by delivering the solution. And we often deliver the solution for less money than they already spent – even though we charge more per hour.

The Pain Is Real

The tragedy is real, though. It happens. If you work through the math, it tracks with the old saying, “How come there’s never time to do things right but always time to do things over?”

Experience informs me there are situations when time is of the essence. Minimally-viable products (MVPs), for example, are extremely time-sensitive. Has Enterprise Data & Analytics delivered MVPs? Absolutely! We’ve even recommended – and delivered – minimally-viable products to clients as deliverables for the first sprint of major enterprise projects.

Conclusion

Please don’t be that company. Don’t hire a lower-rate firm if quality and time-to-market are factors in your enterprise deliverable. Hire the best, like you would if your child needed a surgeon. Why?

Deliver quality late, no one remembers.
Deliver junk on time, no one forgets.

Grant SSIS Catalog Read-Only Permissions

The SSIS Catalog has built-in security to manage permissions. SSISDB – the database behind the SSIS Catalog – is “just a database” in many respects. When it comes to security, the SSIS Catalog surfaces an internal mechanism that provides granular permissions management. 
In this post I intend to demonstrate how to use SSIS Catalog security to provide read-only access to SSIS Catalog artifacts. But first…

Two Thoughts About SSISDB Roles

Thought 1: “Help us DBAs, you’re our only hope.” – Princess Leia (paraphrased)

If you work with SSIS you already know the Microsoft team of technical writers is comprised of artists, masters of the field of technical writing. I’m convinced a large part of the successful adoption of SSIS is due to these good people. You can see a sample of their outstanding artistry in the article titled Integration Services Roles (SSIS Service).

Two important roles in the SSIS Catalog are ssis_admin and ssis_logreader. According to the article linked above:

ssis_admin. This role provides full administrative access to the SSIS Catalog database.

ssis_logreader. This role provides permissions to access all the views related to SSISDB operational logs.

Both ssis_admin and ssis_logreader are SQL Server database roles. As such, they are typically set and maintained by Database Administrators (DBAs).
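
As a sketch – assuming a Windows login named MyDomain\LogUser (a hypothetical name) that already exists on the instance – a DBA might grant log-reader access like this:

USE SSISDB;
GO
-- Map the (hypothetical) Windows login to a database user,
-- then add that user to the ssis_logreader role.
CREATE USER [MyDomain\LogUser] FOR LOGIN [MyDomain\LogUser];
ALTER ROLE ssis_logreader ADD MEMBER [MyDomain\LogUser];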

Thought 2: Although SSISDB is a SQL Server database, it’s more like an application coded in T-SQL.

One for-instance: the SSIS Catalog requires Windows authentication for most administrative activities. It took me a while to understand why Windows authentication is necessary. I now get it, but the explanation is for another post. This requirement has implications, such as:

  • SQL Logins – even sysadmins such as sa – cannot deploy SSIS projects to an SSIS Catalog or execute SSIS packages.
    • One exception: when SSISDB is hosted on an instance of Azure SQL DB (Azure-SSIS), SSIS Catalog administration can be performed using a SQL Login.

In sum, the SSIS Catalog is a database application that requires Windows authentication for administrative tasks.

Null Use Case: No Access

When a user has no access to SSIS Catalog artifacts, the SSMS Object Explorer Integration Services Catalogs node appears as shown here:

SSIS Catalog Browser presents a unified view of all SSIS Catalog artifacts (and it’s free!).

All SSIS Catalog-related products and utilities at DILM Suite respect SSIS Catalog security. Early versions of SSIS Catalog Compare allowed users to log in with SQL Server Login credentials and access SSIS Catalog artifacts that SSIS Catalog security would block. A couple of years ago I refactored CatalogBase – the object beneath DILM Suite’s SSIS Catalog products and utilities that interacts with the SSIS Catalog – to respect SSIS Catalog security.

A user with no access will see an empty SSIS Catalog using SSIS Catalog Browser:
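
One way to verify – a sketch that assumes the hypothetical MyDomain\LogUser user from earlier exists in SSISDB – is to query the catalog views, which filter rows based on SSIS Catalog permissions:

USE SSISDB;
GO
-- catalog.folders returns only folders the caller may Read.
EXECUTE AS USER = 'MyDomain\LogUser';
SELECT name FROM catalog.folders;  -- returns no rows for a no-access user
REVERT;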

Grant Read-Only Access to Folders

In SSMS, right-click the SSIS Catalog Folder you wish to surface for a Windows authentication-based SQL Server login:

When the Folder Properties window displays, click the Permissions page. On the Permissions page, click the Browse button to select one or more Windows authentication logins:

The Browse All Principals dialog displays. Select one or more Windows authentication logins (Windows User type):

To assign read-only permission to the SSIS Catalog Folder, click the OK button on the Browse All Principals dialog and check the Grant checkbox for the Read permission in the Folder Properties window:

The user(s) you selected are now able to view the SSIS Catalog folder using the SSMS Object Explorer’s Integration Services Catalogs node:

Similarly, SSIS Catalog Browser will display the SSIS Catalog folder for the assigned user(s):
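
The same grant can be scripted using the catalog.grant_permission stored procedure. A sketch, assuming a folder named Demo and the hypothetical user from earlier (run this as an SSIS Catalog administrator):

USE SSISDB;
GO
DECLARE @folder_id bigint = (SELECT folder_id
                             FROM catalog.folders
                             WHERE name = N'Demo');
DECLARE @principal_id int = DATABASE_PRINCIPAL_ID(N'MyDomain\LogUser');

-- object_type 1 = folder; permission_type 1 = Read
EXEC catalog.grant_permission
    @object_type = 1,
    @object_id = @folder_id,
    @principal_id = @principal_id,
    @permission_type = 1;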

“Where’s the Beef?”

I can hear you thinking, “Where are the projects and environments, Andy?” That is an excellent question. I’m glad you asked! They exist, of course, but we only granted the user Read permission for the SSIS Catalog Folder.

Grant Read-Only Access to Folder Artifacts

To see SSIS Projects and SSIS Catalog Environments, a user must be granted Read Objects permission on the SSIS Catalog Folder:

Now users can view SSIS Catalog folder contents using SSMS:

SSIS Catalog Browser users enjoy a similar view:
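
Scripted, the additional grant is the same catalog.grant_permission call with a different permission type (again a sketch using the hypothetical names from earlier):

USE SSISDB;
GO
DECLARE @folder_id bigint = (SELECT folder_id
                             FROM catalog.folders
                             WHERE name = N'Demo');
DECLARE @principal_id int = DATABASE_PRINCIPAL_ID(N'MyDomain\LogUser');

-- permission_type 101 = Read Objects: read the projects and
-- environments contained in the folder (object_type 1 = folder).
EXEC catalog.grant_permission
    @object_type = 1,
    @object_id = @folder_id,
    @principal_id = @principal_id,
    @permission_type = 101;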

Conclusion

The SSIS Catalog surfaces a rich and powerful security model that may be leveraged by enterprise data integration DevOps teams using SSMS or SSIS Catalog Browser.

Want to learn more? From me? Join me for the next delivery of Expert SSIS – a 2-day, live, remote SSIS training event! 

Join me For Expert SSIS Training!

I’m honored to announce Expert SSIS – a course from Enterprise Data & Analytics!

The next delivery is 01-02 Apr 2019, 9:00 AM – 4:30 PM ET.

Data integration is the foundation of data science, business intelligence, and enterprise data warehousing. This instructor-led training class is specifically designed for SQL Server Integration Services (SSIS) professionals responsible for developing, deploying, and managing data integration at enterprise-scale.

You will learn to improve data integration with SSIS by:

  • Building faster data integration.
  • Making data integration execution more manageable.
  • Building data integration faster.

Agenda

  1. SSIS Design Patterns for Performance – how to build SSIS packages that execute and load data faster by tuning SSIS data flows and implementing performance patterns.
  2. SSIS Deployment, Configuration, Execution, and Monitoring – the “Ops” part of DevOps with SSIS using out-of-the-box tools and the latest utilities.
  3. Automation – how to use Business Intelligence Markup Language (Biml) to improve SSIS quality and reduce development time.

I hope to see you there!

PS – Want to Learn More About Azure Data Factory?

Follow Andy Leonard’s SSIS Training page for more information.

The 2:00 AM Test

I’m working with a collection of clients these days providing data integration, cloud, and DevOps advice. I love architecture work! My clients love the results. Everyone wins.

Kent Bradshaw and I are tag-teaming with a couple clients who are converting data integration from other platforms to SSIS. In a recent meeting Kent mentioned “the 2:00 AM test.”

“What’s the 2:00 AM Test, Andy?”

I’m glad you asked! The 2:00 AM test is a question developers should ask themselves when designing solutions. The question?

“Will this make sense at 2:00 AM? When I have – or someone else has – been awakened by a job or application failure? Will I be able to look at this and intuit stuff about how this is supposed to work?”

Future You Will Thank You

The 2:00 AM test is important because later – at some point in the future, when you’ve long since stopped thinking about the stuff you were thinking about when you designed the solution – you (or someone) will be awakened at 2:00 AM by some failure. Even if it’s you, there’s a good chance you won’t remember all the nuances of the design. Even if you do, you may not remember why you designed things the way you did.

So…?

Be like Hansel and Gretel. Leave yourself some breadcrumbs. What kind of breadcrumbs?

Comments

Code comments are the best way to record what you are thinking at the time you are thinking it. Consider the lifetime of a software solution. It may take an hour, a day, a week, or a month to build, but it may be in production for years. Are you going to remember why you made each decision you made? Are you going to be able to recall – years later – why you zigged instead of zagging?

Call out anything not readily accessible, anything invisible on the initial view of the immediate coding surface. If you’re writing VB or C#, include a comment explaining – at a minimum – where to learn more about classes and objects not readily available. If you used NuGet to import stuff, will it kill you to copy and paste the command into a comment?
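
A sketch of the kind of breadcrumb I mean, in T-SQL (every name here is made up):

/*
  Load_DailySales
  Why: Finance needs sales aggregated before the 6:00 AM report run.
  Gotcha: the source system truncates and reloads between 1:00 and
          1:30 AM - that is why this job runs at 2:00 AM and why we
          retry once on failure.
  More: see the README in the repo root for the full load design.
*/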

In SSIS, if you’re using variables, parameters, or event handlers, add an annotation to the control flow or data flow design surface.

Leave yourself a note.

Practice Good Naming

My friend Joel has a t-shirt based on this tweet:

One of the two hard things is naming things. I suggest descriptive names for stuff. Which stuff? In VB or C#: variables, methods, and classes, for starters. For SSIS, I heartily recommend Jamie Thomson’s iconic post titled Suggested Best Practises and naming conventions.

Good naming conventions promote self-documenting code.
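
For instance – borrowing the spirit of Jamie’s conventions, with made-up names – compare:

Data Flow Task 1          (tells the 2:00 AM responder nothing)
DFT Load Customer Dim     (tells them what moves, and where)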

Small Chunks

Coders on all platforms develop a sense of a “good size.” An easy trap to fall into is attempting to “boil the ocean.” Break things up. Practice separation of concerns. Write decoupled code.

You will find decoupled, small, function-of-work code is simpler to test, easier to maintain, and promotes code reuse. 

Conclusion

Future you will thank you for incorporating these best practices in your coding, regardless of your software development platform.

Two ADF Presentations Next Week!

I am honored to deliver Moving Data with Azure Data Factory to two User Groups next week!

Abstract

Azure Data Factory – ADF – is a cloud data engineering solution. ADF version 2 sports a snappy web GUI (graphical user interface) and supports the SSIS Integration Runtime (IR) – or “SSIS in the Cloud.”

Attend this session to learn:

  • How to build a “native ADF” pipeline;
  • How to lift and shift SSIS to the Azure Data Factory Integration Runtime; and
  • ADF Design Patterns to execute and monitor pipelines and packages.

Dates and Locations

07 Jan 2019 – 6:00 PM ET

Columbus Georgia SQL Server Users Group
Meeting at the Columbus Public Library
3000 Macon Rd
Columbus, Georgia 31906

Register today!

08 Jan 2019 – 5:30 PM ET

Midlands PASS
Meeting at WeKnowIT
440 Knox Abbott Drive
Suite 424
Cayce, SC 29033

Register today!

If you live – or will be – nearby, I hope to see you!

:{>

PS – want more? Join me 04 Mar 2019 for a full-day of training! 
Learn more:
Getting Started with Azure Data Factory – Live, 1-Day Training