One Way to Create a Parquet File from SQL Server Data

Yesterday I was coding away, deep into using C# to decompress and parse Excel XML so I could read data from a specific worksheet in a workbook on a server without Excel objects installed, when I saw the following on Twitter:

For those of you who do not already know…

I Don’t Switch Contexts Well

I wasn’t wrong.
I just wasn’t thinking… yet.

Plus (blogger tip), if you want your blog to help people, be on the lookout for statements like, “I can’t find this anywhere…” When you find one, blog about that topic.

I searched and found some promising Parquet SSIS components available from CData Software and passed that information along. I shared my inexperience in exporting to Parquet format and asked a few friends how they’d done it.

I thought: How many times have I demonstrated Azure Data Factory and clicked right past file format selection without giving Parquet a second thought? Too many times. It was time to change that.

The request in the tweet is threefold:

  1. Read data from SQL Server.
  2. Write the data to a Parquet file.
  3. Stay on-premises.

Steps 1 and 2

Steps 1 and 2 can be accomplished in Azure Data Factory without much hassle. The steps are:

  1. Create a pipeline.
  2. Add a Copy Data activity.
  3. Configure a SQL Server data source.
  4. Configure a Parquet sink.

I used a storage account configured for Azure Data Lake Storage as my target; you can find details on configuring ADLS and using Parquet with ADF in the Microsoft docs.
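For reference, the Copy Data activity in a pipeline's JSON looks roughly like this. The dataset names here are made up; `SqlServerSource`, `ParquetSink`, and `AzureBlobFSWriteSettings` are the type names ADF uses for a SQL Server source and an ADLS Gen2 Parquet sink:

```json
{
  "name": "CopyPersonToParquet",
  "type": "Copy",
  "inputs": [ { "referenceName": "SqlServerPersonTable", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "AdlsParquetPerson", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "SqlServerSource" },
    "sink": {
      "type": "ParquetSink",
      "storeSettings": { "type": "AzureBlobFSWriteSettings" }
    }
  }
}
```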

Step 3

Staying on-premises was the tricksy part of this request. I decided to create a C# (.Net Framework) console application to test the concepts. I also decided to use the Parquet.Net library.

I found some helpful C# in Darren Fuller’s post titled Extracting data from SQL into Parquet. I modified Darren’s code some and was able to export the Person.Person table from the AdventureWorks2019 database to a Parquet file using code adapted from his example.
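In spirit, the code looks something like this (a minimal sketch assuming the Parquet.Net 3.x API; the connection string, column list, and output path are placeholders, not the exact code from my project):

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;
using Parquet;            // NuGet: Parquet.Net (3.x API assumed)
using Parquet.Data;

class Program
{
    static void Main()
    {
        // Placeholders -- point these at your own server, database, and columns.
        var connectionString = "Server=.;Database=AdventureWorks2019;Integrated Security=true;";
        var query = "SELECT BusinessEntityID, FirstName, LastName FROM Person.Person;";

        var ids = new List<int>();
        var firstNames = new List<string>();
        var lastNames = new List<string>();

        // Read the source rows into per-column lists (Parquet is columnar).
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    ids.Add(reader.GetInt32(0));
                    firstNames.Add(reader.GetString(1));
                    lastNames.Add(reader.GetString(2));
                }
            }
        }

        // Build one Parquet DataColumn per source column, then a schema from the fields.
        var idColumn = new DataColumn(new DataField<int>("BusinessEntityID"), ids.ToArray());
        var firstColumn = new DataColumn(new DataField<string>("FirstName"), firstNames.ToArray());
        var lastColumn = new DataColumn(new DataField<string>("LastName"), lastNames.ToArray());
        var schema = new Schema(idColumn.Field, firstColumn.Field, lastColumn.Field);

        // Write all rows as a single row group.
        using (Stream fileStream = File.OpenWrite(@"C:\temp\person.parquet"))
        using (var writer = new ParquetWriter(schema, fileStream))
        using (ParquetRowGroupWriter rowGroup = writer.CreateRowGroup())
        {
            rowGroup.WriteColumn(idColumn);
            rowGroup.WriteColumn(firstColumn);
            rowGroup.WriteColumn(lastColumn);
        }

        Console.WriteLine($"Wrote {ids.Count} rows.");
    }
}
```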

You should be able to copy and paste this kind of code into a C# (.Net Framework) console app and have at it. The code can also be adapted for an SSIS Script Task.


I tested the output Parquet file by uploading it to Azure Blob Storage, creating a test target table, creating a test ADF pipeline, and then loading the data from the file into the table.

The load succeeded and a test query showed my data.
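Before the upload, you can also sanity-check the file locally with Parquet.Net's reader (again a sketch assuming the 3.x API, with a placeholder path):

```csharp
using System;
using System.IO;
using Parquet;            // NuGet: Parquet.Net (3.x API assumed)
using Parquet.Data;

class VerifyParquet
{
    static void Main()
    {
        using (Stream stream = File.OpenRead(@"C:\temp\person.parquet"))
        using (var reader = new ParquetReader(stream))
        {
            // Read the first (and in this case only) row group back into memory.
            DataColumn[] columns = reader.ReadEntireRowGroup();

            // Print each column name and its value count.
            foreach (DataColumn column in columns)
            {
                Console.WriteLine($"{column.Field.Name}: {column.Data.Length} values");
            }
        }
    }
}
```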

I hope this helps.



Andy Leonard

Christian, husband, dad, grandpa, Data Philosopher, Data Engineer, Azure Data Factory, SSIS, and Biml guy. I was cloud before cloud was cool. :{>

4 thoughts on “One Way to Create a Parquet File from SQL Server Data”

  1. We have used ‘external tables’ (aka PolyBase) to extract data into Parquet files on-prem. Once the external table (Parquet file) is defined, it’s just a simple INSERT INTO … a lot less coding required.
