Join Me for Expert SSIS Training!

I’m honored to announce Expert SSIS – a course from Enterprise Data & Analytics!

The next delivery is 01-02 Apr 2019, 9:00 AM – 4:30 PM ET.

Data integration is the foundation of data science, business intelligence, and enterprise data warehousing. This instructor-led training class is specifically designed for SQL Server Integration Services (SSIS) professionals responsible for developing, deploying, and managing data integration at enterprise-scale.

You will learn to improve data integration with SSIS by:

  • Building faster data integration.
  • Making data integration execution more manageable.
  • Building data integration faster.

Agenda

  1. SSIS Design Patterns for Performance – how to build SSIS packages that execute and load data faster by tuning SSIS data flows and implementing performance patterns.
  2. SSIS Deployment, Configuration, Execution, and Monitoring – the “Ops” part of DevOps with SSIS using out-of-the-box tools and the latest utilities.
  3. Automation – how to use Business Intelligence Markup Language (Biml) to improve SSIS quality and reduce development time.

I hope to see you there!

PS – Want to Learn More About Azure Data Factory?

Follow Andy Leonard’s SSIS Training page for more information.

Azure DevOps, SSIS, and Git Part 0 – Getting Started

“My name is Andy and I am a huge fan of source control.” That said, I confess I have struggled with git for a couple years now. I recently had some “play time” – I love play time – and decided to play with git.

In this post I will demonstrate how to create an Azure DevOps account, configure it, create a new Azure DevOps Project, connect my Azure DevOps project to SQL Server Data Tools (SSDT), clone the Azure DevOps project to a local repository, create an SSIS project in the local repository, perform an initial checkin, update the local SSIS project, and check in the changes so they are stored in our Azure DevOps remote repository.

That’s a lot and I know you’re itching to get started walking through this demo. But first I’d like to share why I’m writing about Azure DevOps, Git, and SSIS – and recommend a book while I’m at it.

Why?

I have been listening to Can’t Hurt Me, an audio book by David Goggins and Adam Skolnick based on Goggins’ life. The audio book (and hard copy) contains profanity. I am not a fan of gratuitous profanity – profanity for the sake of profanity. In my opinion, that’s not what I’m hearing when I listen to Can’t Hurt Me.

I’m hearing a man communicate.

Goggins offers solid advice for, well… life. The audio book is better than the hard copy because the co-authors chat in between (and during) chapters about the experiences recorded in the book. (Shameless plug: You can get a free audio book from Audible and help out the Data Driven Podcast if you sign up using this link.)

One piece of advice Goggins offers is: face your challenges and fears. Overcome them. With a tip o’ the cap to Mr. Goggins, let’s “get after this.”

Git is a distributed version control system (VCS) that is free and open source. I can hear some of you thinking, “What does that have to do with SQL Server Integration Services (SSIS), Andy?” I’m glad you asked:

SSIS is software development.

Andy, circa 2006

Learning By Example

There are a few moving parts to using SSIS with Azure DevOps. In this post I’m going to share how to configure the parts and then get them moving. Cool?

Set Up an Azure DevOps Account

First, you need Azure DevOps. Browse to dev.azure.com and set up an account:

Before I dive into something like this, I prefer to count the cost. At the time of this writing Azure DevOps may be used for free:

See the Azure DevOps Pricing page for details (note: this may have changed since I wrote this post).

Once you have Azure DevOps up and running, create a project:

Clicking the Create project button opens the Create New Project window. Configure the project by selecting a version control engine and a work item process:

Please note TFS is an option for version control engine. Are you using TFS on-premises but thinking of migrating to the cloud? Here you go.

Click the Create button to proceed. In a minute or so, the Overview>Summary page for the project displays:

Connect to the Azure DevOps Project

The next step is to connect SSDT to the Azure DevOps project. Begin by clicking Team>Manage Connections:

Note: You may need to disconnect a current connection first:

You may need to add an account to Visual Studio’s connections:

If so, you are prompted for credentials from Azure:

Once connected to Azure DevOps via Azure security, you may select an account:

After connecting, you may select the project you created earlier:

Git works in a local repository which exists on the developer’s machine. The Clone button surfaces three options:

  1. Connect
  2. Clone
  3. Clone with Submodules

The default option for the highlighted button is “Clone.” That’s a clue: cloning is our next step. But for our introductory demonstration, we select Connect:

Note we cannot continue unless we clone the repository (that’s why it was shown by default!):

This reminds me of my younger son, Riley. Whenever I tell him something he doesn’t want to hear, Riley responds, “GOSH, Dad!” When I first saw this dialog, I said, “GOSH, Azure DevOps!” True story…

What Just Happened?

When we cloned the SSIS Test Azure DevOps git repository, we created a local, linked copy of the SSIS Test Azure DevOps git repository in the specified folder – in my case, the local folder was C:\Users\A. Ray Leonard\source\repos\SSIS Test:


We will get to the other folder and files in this directory later. Promise…
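
For the curious, here is a minimal sketch of the clone we just performed, expressed in .Net code using the LibGit2Sharp library. This is an illustration only (the post uses the Team Explorer UI); the remote URL and local path are hypothetical placeholders, and a private Azure DevOps repository would also require credentials, which are omitted here.

// A minimal sketch (illustration only): cloning an Azure DevOps git repository from .Net.
// The remote URL and local folder are hypothetical; credentials are omitted.
using LibGit2Sharp;

class CloneSketch
{
    static void Main()
    {
        // Creates the local, linked copy of the remote repository in the target folder.
        Repository.Clone(
            "https://dev.azure.com/yourorg/_git/SSIS-Test",  // hypothetical remote URL
            @"C:\Users\You\source\repos\SSIS Test");         // hypothetical local folder
    }
}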

The next step is to “Create a new project or solution” in this new local repository (it still has that new repository smell!):

Create an SSIS Project

Click the “Create a new project or solution” link in Team Explorer to create a project in our new local repository:

View Solution Explorer once the new project has been created. Note the small green “+” sign decorating the solution name:

The “+” sign indicates the solution has been added to source control. Again, we may view the solution in our local repository using Windows Explorer:

But when we check our Azure DevOps SSIS Test project, we see a message indicating the remote repository is empty:

“Dude! Where’s my code?”

Committing Code (Locally)

Click View>Team Explorer to open the Team Explorer dialog:

Once Team Explorer opens, click the Changes button:

“Ch-ch-ch-changes”

Enter a Commit Message:

Q: “How committed are you to this relationship?”

Click the Commit All button to commit the changes locally:

A: “Locally committed. You are locally committed to this relationship.”

What does locally committed mean? Your local repository has been updated but no changes have been transmitted to your remote repository…yet.
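
If it helps to see that distinction in code, here is a minimal sketch using the LibGit2Sharp library (an illustration on my part – the post itself uses the Team Explorer buttons). The commit updates only the local repository; the push is what transmits commits to the Azure DevOps remote. The path, identity, and credentials are hypothetical placeholders.

// A minimal sketch (illustration only): commit locally, then push to the remote.
using System;
using LibGit2Sharp;

class CommitAndPushSketch
{
    static void Main()
    {
        using (var repo = new Repository(@"C:\Users\You\source\repos\SSIS Test"))  // hypothetical path
        {
            // Stage and commit – this updates ONLY the local repository.
            Commands.Stage(repo, "*");
            var me = new Signature("Andy", "andy@example.com", DateTimeOffset.Now);  // hypothetical identity
            repo.Commit("Add SSIS project", me, me);

            // Push – this is the step that sends the commit to the remote repository.
            var pushOptions = new PushOptions
            {
                CredentialsProvider = (url, user, types) =>
                    new UsernamePasswordCredentials
                    {
                        Username = "you@example.com",        // hypothetical
                        Password = "personal-access-token"   // hypothetical PAT
                    }
            };
            repo.Network.Push(repo.Network.Remotes["origin"], "refs/heads/master", pushOptions);
        }
    }
}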

As the message above reads, “Sync to share your changes with the server.” Click the “Sync” link (hey, that rhymes!):

Syncing

Clicking the Sync link triggers a git “push”:


Team Explorer lets you know once the sync and push operations complete:

Now you can see files in your remote repository – up in Azure DevOps, finally!

There be files here. WOO HOO!

You can learn a lot more about interacting with your local and remote repository from the Pro Git book, available online for free:

Update, Commit, Sync (Push)

We now have a copy of our mostly-empty SSIS project stored in the cloud. w00t! Let’s make some changes to the project and push them to Azure DevOps.

I opt to rename the package:

I opt to add a Script Task for ETL Instrumentation, as described in the recent post SSIS Design Pattern: Use Script Tasks for ETL Instrumentation:

I Confess, This (Still) Confuses Me…

After a save, I add a message and commit the changes locally:

This is not the confusing part…

Once the changes are committed locally, I click the Sync button as before:

Still not the confusing part…

After clicking Sync, I need to Push the changes to the Azure DevOps remote repository:

THIS is the confusing part.

This confuses me. Why doesn’t Sync issue a git push, as it did with the initial sync?
I’d even be OK with the initial sync not pushing; I crave consistency.

Regardless, my updates now reside in the Azure DevOps remote repository:

Conclusion

In this post we configured Azure DevOps, added an Azure DevOps Project, connected the Azure DevOps project to SQL Server Data Tools (SSDT), cloned the Azure DevOps project locally, added an SSIS project to the local repository, performed an initial checkin, updated the local SSIS project, and checked in the changes.

Mission accomplished.

:{>

SSIS Design Pattern: Use Script Tasks for ETL Instrumentation

I consider scripting in SQL Server Integration Services (SSIS) my data integration “get out of jail free” card. Why? If I cannot find a way to accomplish some requirement using SSIS’s built-in objects – tasks, containers, Data Flow components, etc. – I can usually write .Net code to fulfill the requirement.

As stated in the SSIS Academy short course titled Troubleshooting SSIS, I almost always add to the Control Flow an SSIS Script Task that logs the initial state of parameters and variables when package execution starts. I find this information invaluable when troubleshooting failed SSIS executions.
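
To give a sense of where this post is headed, here is a minimal sketch of such a Script Task’s Main() method (a simplified illustration, not the exact code from the course). It loops over whatever variables are exposed to the task via its ReadOnlyVariables property and raises an Information event for each; the rest of this post builds a similar method step by step.

public void Main()
{
    bool fireAgain = true;

    // Log the initial value of every variable listed in this Script Task's
    // ReadOnlyVariables property (a simplified illustration).
    foreach (Variable v in Dts.Variables)
    {
        string description = v.QualifiedName + " = " + (v.Value ?? "(null)").ToString();
        Dts.Events.FireInformation(1001, "Log Initial State", description, "", 0, ref fireAgain);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}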

Configuring a Script Task for ETL Instrumentation

Begin by adding an SSIS Script Task to the Control Flow:

Open the editor.
Click inside the ReadOnlyVariables property value textbox. An ellipsis is available:

Click the ellipsis to open the Select Variables window. Select the System::TaskName and System::PackageName variables:

Click the OK button to close the Select Variables window. Note the two variables we selected now appear in the ReadOnlyVariables value textbox:

Select the variables and copy them to the clipboard:

.Net Scripting

Click the Edit Script button to open the Visual Studio Tools for Applications (VSTA) .Net code editor. Scroll to public void Main() and select the text “TODO: Add your code here”:


Paste the contents of the clipboard:

I do this because mistyping variable names results in an error that’s not very intuitive. Also, .Net design-time validation has no way to test variable names for accuracy. It’s better to copy and paste than risk an error, in my opinion.
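
To illustrate the risk (my example, not from the post), a statement like the following compiles cleanly but fails at runtime, because the misspelled name was never added to the Script Task’s variable collection:

// Compiles fine, but throws at runtime: "System::PakageName" (note the typo) is not
// among the variables exposed to this Script Task, so the collection lookup fails.
string packageName = Dts.Variables["System::PakageName"].Value.ToString();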

Declaring a .Net Variable

Begin a .Net variable declaration by typing “string “. Select the System::PackageName variable name from the comment and copy it to the clipboard:

Paste the clipboard contents after “string ” as shown here:

The .Net string variable declaration appears as shown:

Now, System::PackageName is not a valid .Net variable name.
System::PackageName is a valid SSIS variable name where “System” is the namespace and “PackageName” is the variable name.

Edit the .Net variable so it is named “packageName” as shown:

Initializing a .Net Variable

Our goal is to read the value of the SSIS variable System::PackageName into the .Net string variable named packageName. Assigning an initial value to a variable is known as initializing the variable.

To begin initializing the .Net string variable named packageName, add the “=” sign after the variable name (packageName). In C# (and similar languages), a single equal sign is an assignment operator.

Type “Dts.” – as shown next – and note IntelliSense presents a list of available methods and objects. Scroll to the object named “Variables” as shown:

You can append “Variables” to “Dts.” by double-clicking Variables or by pressing the Tab key when Variables is selected (as shown above). Add a beginning-bracket – [ – followed by a double-quote – “. Note the complementary closing bracket – ] – and double-quote – ” – are added automagically. Plus the cursor is in perfect position to paste the contents of clipboard (again):

Move to the end of this line of .Net code and type “.v”. IntelliSense kicks in – although my screen-capture software makes the context menu opaque – and the Value property is selected as shown:

This is important.

Next type “.t”. Notice the Value property is completed by IntelliSense – again, automagically – and IntelliSense also displays the only option available that begins with the letter “t” – ToString:

Complete the statement by typing “();”:

Rinse, Repeat

Use the same .Net variable declaration and initialization to declare and initialize the taskName .Net string variable:

Next, declare a .Net string variable named subComponent and use the packageName and taskName .Net string variables to initialize the value of subComponent:

Exercise your .Net variable declaration and initialization skills even more by declaring two more variables:

  1. An int variable named informationCode and initialized to 1001
  2. A bool variable named fireAgain initialized to true:

Finally, declare a .Net string variable named description and initialize it with the string

“I am ” + packageName

ETL Instrumentation via Information Event Message

Configure and raise an Information event using the following .Net statement:

Dts.Events.FireInformation(informationCode, subComponent, description, "", 0, ref fireAgain);

Click the screenshot below to enlarge the image. Note the Dts.Events.FireInformation method takes six arguments:

  1. informationCode [int]
  2. subComponent [string]
  3. description [string]
  4. helpFile [string]
  5. helpContext [int]
  6. fireAgain [bool]

We use the .Net variables informationCode, subComponent, description, and fireAgain to supply four of the arguments. We supply literal values – “” and 0, respectively – to the helpFile and helpContext arguments.

When complete, our Main() method contains the following .Net code:

public void Main()
{
    // System::TaskName,System::PackageName
    string packageName = Dts.Variables["System::PackageName"].Value.ToString();
    string taskName = Dts.Variables["System::TaskName"].Value.ToString();
    string subComponent = packageName + "." + taskName;
    int informationCode = 1001;
    bool fireAgain = true;

    string description = "I am " + packageName;
    Dts.Events.FireInformation(informationCode, subComponent, description, "", 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}

Test It!

Close the VstaProjects code editor by closing the window, then close the Script Task Editor by clicking the OK button.

Execute the SSIS package in the SQL Server Data Tools (SSDT) debugger by pressing the F5 key. If all goes as planned, you should see a successful debug execution:

Click the Progress tab to view the Information message:


Conclusion

ETL Instrumentation is an important part of data integration development with SSIS. It increases development time (slightly) to add ETL Instrumentation such as this but it’s worth it. When troubleshooting an SSIS package at 2:00 AM some months after deployment, ETL Instrumentation will save time.

The additional time spent during development is the opposite of technical debt; it’s a technical investment.

Learn more!

Get live, online training – from me! Check out the Enterprise Data & Analytics Training page.

Check out Introduction to Troubleshooting SSIS – a free sample of SSIS Self-Help at SSIS Academy!

Training Feedback

Enterprise Data & Analytics

Here’s what students are saying about the free Enterprise Data & Analytics webinars this month:

I loved the way you present and blend it with the demos Andy.

The advanced level of knowledge required kept the event moving at a frantic pace!

I always learn something from Mr. Leonard.

Andy’s training style is to-the-point and experience based, not a recitation of textbook process. He engages the audience with real-world experiences demonstrating useful application of the products he teaches. I find his insights invaluable.

Andy used SSDT demo mode for the exercises, showing exactly what I could expect along with navigation tips and shortcuts using SSDT. I also like the fact that Andy made a point to make sure we, as the audience, was paying attention to the important sections.

Presentation was straight to the point, useful, and with no unneeded fluff. Also, no PowerPoint presentation! Just real-world demos covering the topic being discussed.

Concise and easy to follow. Seems like it would be useful to students of varying skill levels.

I look forward to more of these, Andy. You are a terrific ambassador to the profession.

I’m always grateful when the experts are willing to freely share knowledge.

Yall make an old man blush!

I enjoy delivering training. I find a great way to learn a topic is to deliver training on that topic.

Stay tuned, there are more free webinars on the way! The best way to keep up is to subscribe to my mailing list.

:{>

The Recording of Faster SSIS is Available

The recording for the free webinar titled Faster SSIS is available and may be viewed above. Thanks to everyone who attended!

Regarding “Bad” BLOBTempStoragePath Settings

Regarding the BLObs.dtsx demo SSIS package, Doug asked a really good question:

“Does the package abort if the [BLOBTempStoragePath] folder path is invalid or does it create it for you?”

The answer is: The package fails. The path is not created for us.


The errors are:

[OLE DB Source [46]] Error: Failed to retrieve long data for column "Demographics".
[OLE DB Source [46]] Error: There was an error with OLE DB Source.Outputs[OLE DB Source Output] on OLE DB Source. The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [46]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "OLE DB Source.Outputs[OLE DB Source Output]" failed because error code 0xC0209071 occurred, and the error row disposition on "OLE DB Source" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.


Enjoy the video!

:{>

SSIS, Docker, and Windows Containers, Part 1 – Why?

In my first post in this series, SSIS, Docker, and Windows Containers, Part 0 – Getting Started, I covered:

  • Installing docker
  • Choosing a Container OS
  • Pulling a test image
  • Pulling a Windows Container with SQL Server installed
  • Interacting with your instantiated container

In this post I reveal my ultimate goal and then travel down two rabbit holes to explain:
  1. How I got here; and
  2. One way to get answers to questions no one wants to hear.

My Ultimate Goal

My ultimate goal is to use one or more Windows containers running SQL Server to test SSIS. I can hear some of you thinking…

“Why, Andy?”

My response? “Cuz.”
And, “I do what I want!”

Seriously, since I hopped the fence from developer to data I’ve dreamed of the day when I could practice lifecycle management with data-stuff like I used to practice lifecycle management with software development.

I recognize the obstacles. The greatest obstacle (in my humble opinion) is that software is mostly stateless these days (these days started with Object-Oriented Programming and include its descendants). Stateless development solves lots of engineering problems in lifecycle management, and by “solves a lot of engineering problems” I mean some engineering problems simply don’t exist, so lifecycle management for stateless stuff can simply ignore “a lot of engineering problems.”

A database, on the other hand, is all about that state. When it comes to managing lifecycle for a stateful platform – like a database – ACID gets dumped on many lifecycle management tools and solutions (see what I did there?).

Is it possible to manage a data-related lifecycle using stateless tools? Yes. But here there be obstacles. Let’s look at one use case:

Use Case 0

Your team releases an update that adds one column to one table. The column receives data. Someone realizes there’s a bug and determines a rollback is the solution. The web and middle-tier teams successfully roll back the release using functionality built into the release-management / DevOps enterprise tool.

What happens to the column in the table?

  • Is it dropped?
  • Is the data deleted?
  • Since this isn’t a post about data lifecycle management, let’s not even consider what happens if the data is considered personal-identifying-information or some tuple otherwise subject to privacy regulations.

You see the problem? The answer is always going to be, “It depends.”

This is the problem I am trying to solve, specifically for data integration.

I want to be able to easily deploy and rollback data integration deployments. But to accomplish this, I need more metadata (data about the data) and a bunch more automation. Why? My stuff has state – by definition.

“Not Supported”

If you have not yet figured this out, Microsoft is a software development shop. They’re people trying to use software to solve business problems for customers – just like you and me. Just like you and me, they have schedules and deadlines and competition and a need to generate a profit. In other words, Microsoft is a business.

When some redneck from Farmville, Virginia starts asking all sorts of out-of-scope questions, he sometimes hears, “That’s not supported, Andy,” in response. Ask me how I know.

So it helps to be stubborn, er, persistent. Persistence sometimes takes the form of thinking of another approach. When I began probing for ways to test SSIS in containers, my ultimate goal was to have the tests simulate Production as closely as possible. In 2019, Production SSIS means executing SSIS in the SSIS Catalog.

Is it possible to execute Production SSIS from the file system or the MSDB database in 2019? It is. I’ve not found a good use case for executing SSIS outside the Catalog, though – not in 2019. I am not trying to hurt anyone’s feelings, I promise. I’m just saying that the SSIS Catalog has been out for north of six years at the time of this writing. Plus: The only way to execute Azure-SSIS packages is via an SSIS Catalog in the cloud (again, at the time of this writing).

At Enterprise Data & Analytics we help clients migrate from file- / MSDB-based SSIS execution to the SSIS Catalog (both on-premises and Azure-SSIS). Some clients realize the benefits of SSIS Catalog-based execution and call us for help with training and consulting. Others ask for help performance-tuning SSIS, and we almost always recommend using the SSIS Catalog because SSIS packages almost always execute faster when executed from the Catalog. When do we not recommend using the SSIS Catalog? Some enterprises have designed their data environments in such a way that using the SSIS Catalog would require a major shift. The technical portion of the shift would be relatively straightforward, but – often – the cultural shift is more disruptive. Not all enterprises are willing to make such a cultural shift; to endure the disruption.

Getting Around “Not Supported”

When I started asking questions like, “Is it possible to run the SSIS Catalog in a Windows container?” the response was, “That’s not supported.”

Fine. What is supported? Running the SSIS Catalog on SQL Server installed on Windows Server Core. I was trying to solve many of the same problems in a container that would need to be solved running an SSIS Catalog on Core. It turned out many of these problems had indeed already been solved.

More Information

If you can make it to SQL Saturday #813 – Boston BI Edition on 30 Mar 2019, Liz Baron and Sreeja Pullagura are delivering an information-packed session titled SSIS DevOps with Azure Containers on this very topic. I’ve had the privilege and honor to work with Liz, Sreeja, and the awesome data engineering team at Teach For America – a worthy non-profit.

I predict Sreeja’s and Liz’s presentation is going to be a session that moves the needle for Data Integration Lifecycle Management (DILM).

My next several blog posts on this topic – sharing some of my efforts and what I’ve learned about executing SSIS using containers – will begin appearing 31 Mar 2019.

Shameless plug: I am delivering Intelligent Data Integration with SSIS – a full-day pre-conference training session – Friday, 29 Mar 2019.
Register here!
I hope to see you there!

:{>

“Andy, I Want More ____”

I am always learning.

I enjoy learning and this trait led me to technology in the first place. It turns out a love of learning is all but essential for technologists.

After I learn, I share what I’ve learned in blog posts, webinars, and via presentations at SQL Saturdays and events like the PASS Summit.

What Do You Want to Learn?

Drop me a comment and let me know what you’d like to learn! I may take your suggestion and run with it.

:{>

The Recording for Troubleshooting SSIS (Part 1) is Available

A fantastic crowd attended today’s (13 Mar 2019) free webinar titled Troubleshooting SSIS. If you attended you got the full show. If not, you may view the content at SSIS Academy for a nominal fee.

Enjoy this focused content from SSIS Self-Help, a service from Enterprise Data & Analytics designed to help folks quickly and affordably overcome SSIS challenges.

Preview Available

The first two segments of the presentation are available for free at the top of this post. In the full version I walk you through:

  • Design-time validation
  • Debugging most SSIS tasks
  • Debugging the Script Task
  • Debugging the SSIS Data Flow Task
    • Using Data Taps
  • Debugging the Script Component

Enjoy.

:{>

T-SQL Tuesday #112 – The Cookie Jar

This month, Shane O’Neill hosts T-SQL Tuesday. His topic is titled Dipping into the Cookie Jar and is inspired by David Goggins’ book titled Can’t Hurt Me.

Coincidence: I am currently listening to the Can’t Hurt Me audio book.

I made several recent mini-road-trips to Richmond to visit my Dad (more later) – a 75-mile trip (one-way) – and I listened to most of the first four chapters while making those trips.

I’ve been following Goggins for a while on Instagram, where he posts against motivation and about overcoming one’s perceived limitations. He is a practitioner of what I’ve described as “transmuting life’s poison,” the act of using bad things that have happened to you as fuel for achieving good (great, even) things in life. Transmuting the poison is a reference to Dune.

Goggins had a hard life. He was dealt a crappy hand. He did stuff to his own detriment. He pulls no punches about his situation or the source of his woes. What he did with those things, though – how he transmuted the poison – is nothing short of inspiring.

My Stuff

I’m not prepared to share everything now (and may never be), but here’s some stuff that I’ve endured that was self- and / or other-inflicted:

  • Been on welfare
  • Caused a crisis pregnancy
  • Been fired (multiple times)
  • Failed at business
  • Been financially broke (more than once)
  • Been divorced

The latest happened Saturday morning when my Dad passed away after a couple months of declining health. I am not so naive as to think this is as difficult as what others have faced; some for years, even. But it was not nothing and it has challenged me on many levels – some of which are absent from the list above.

He was in Richmond when he died.
He’s the reason I made so many trips to Richmond recently.

Cookies

Like Goggins, I used the darkness of those (and other) experiences to power through some tough times in life. What have I achieved by transmuting this poison?

  • Family – beyond what I thought possible or even dreamed; cool and intelligent children; an awesome, beautiful, cute, sexy, smokin’-hot wife.
  • Friends – real friends who lay down their lives for you.
  • Serving our Community – I am honored and humbled to be a member of our Community. I’m a redneck with an Associate’s Degree. I don’t deserve the honor our community bestows.
  • Being published – a kid who made almost straight D-minus’s in high school English.
  • Weaponized ADHD – using the compulsive part of obsessive/compulsive to work multiple days with little or no sleep to get the job done, using the obsessive part to make the obstacles quit before I do. I owe my brother from another mother, Frank LaVigne (franksworld | @Tableteer | DataDriven), credit for the term weaponized.
  • Military service – like Goggins I served in the military, although I did not complete training anywhere near as difficult as he did and I never experienced combat. Basic Training is not nothing, though, and many lessons from boot camp remain with me still. One thing Goggins says with which I wholeheartedly agree: “It’s all a mind thing.” Once you get your mind right, you are unstoppable and unbreakable.

When the going gets tough I remember these things, but this is not all and they are not even the…

Most Important

God is good, all the time. He doesn’t allow things to happen to me that I cannot bear. I confess, however, there have been times – some recently – when I have glanced toward Heaven and asked, “Are You sure You have the right Andy?”

At this time, with the normal stresses of life intermingling with some not-as-frequent stresses of life, I am thankful for Jesus carrying me and surrounding me with sisters and brothers who express their love and caring, and who sacrifice their time and talents to demonstrate their love.

For you see, God is the true “transmuter of poison.”

In Isaiah 61:2b-3, God speaks through the prophet:

“To comfort all who mourn,
To console those who mourn in Zion,
To give them beauty for ashes,
The oil of joy for mourning,
The garment of praise for the spirit of heaviness;
That they may be called trees of righteousness,
The planting of the Lord, that He may be glorified.”

Beauty for ashes.
Joy for mourning.
Praise for heaviness.

Dad is a Christian (present tense intended). This life is part of Christianity.
Though we had our differences, I miss Dad.
But I will see him again.

Peace.