Converting Data Format from CSV to Parquet
Application Scenarios
Parquet is a columnar storage format designed to simplify data analysis. It speeds up queries by allowing only the required columns to be read and calculated. In addition, Parquet supports efficient compression schemes, which maximizes storage efficiency on disk. Using DLI, you can easily convert data from CSV format to Parquet.
Solution Overview
Upload CSV data to an OBS bucket, convert the CSV data into Parquet data with DLI, and store the converted Parquet data in OBS.
Process
To use DLI to convert CSV data into Parquet data, perform the following steps:
Step 1: Creating and Uploading Data. Upload data to your OBS bucket.
Step 2: Using DLI to Convert CSV Data into Parquet Data. Import CSV data to DLI and convert it into Parquet data.
Solution Advantages
- The query performance is improved.
If you have text-based data files or tables in HDFS and use Spark SQL to query the data, converting the data to Parquet can improve query performance by about 30 times (or more in some cases), even accounting for the time consumed during the conversion.
- Storage is saved.
Parquet is built to support efficient compression schemes, which maximizes the storage efficiency on disks. With Parquet, the storage cost can be reduced by about 75%.
Resource Planning and Costs
| Resource | Description | Cost |
|---|---|---|
| OBS | You need to create an OBS bucket and upload data to OBS for data analysis using DLI. | You will be charged for the OBS resources you use. The actual fee depends on the size of the stored file, the number of user access requests, and the traffic volume. Estimate the fee based on your service requirements. |
| DLI | Before creating a SQL job, you need to purchase a queue. When using queue resources, you are billed based on the CU-hours (CUH) used by the queue. | If you purchase a pay-per-use queue, you are billed based on the number of CUHs used by the queue. Usage is billed by the hour, and partial hours are rounded up; for example, 58 minutes of usage is billed as one hour. CUH pay-per-use billing = Unit price x Number of CUs x Number of hours. |
Step 1: Creating and Uploading Data
- Create a CSV file. See test.csv in Figure 2. (A hypothetical sample of the file contents is sketched after these steps.)
- In the OBS management console, create a bucket, name it obs-csv-parquet, and upload the test.csv file to the bucket.
Figure 3 Uploading CSV data to OBS
- Create another bucket, name it obs-parquet-data, and use it to store the converted Parquet data.
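Figure 2 is not reproduced here. As a hypothetical example only, test.csv could contain five integer columns matching the test_csv_hw table created in Step 2, with no header row (the CREATE TABLE statement below does not set a header option):
1,2,3,4,5
6,7,8,9,10
11,12,13,14,15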
Step 2: Using DLI to Convert CSV Data into Parquet Data
- Go to the DLI console and click SQL Editor in the navigation pane.
- In the left pane of the SQL editor, click the Databases tab, create a database, and name it demo.
- In the SQL editing window, set Engine to spark, Queue to default, and Database to demo. Execute the following statement to create a table named test_csv_hw and import the data from the test.csv file in OBS:
create table test_csv_hw(id1 int, id2 int, id3 int, id4 int, id5 int) using csv options( path 'obs://obs-csv-parquet/test.csv' )
- In the SQL editing window, query data in the test_csv_hw table (a sample query is shown after the figure).
Figure 4 Querying data
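The original page does not list the query statement itself. A minimal query such as the following, assuming the table created above, returns the imported data shown in Figure 4:
select * from test_csv_hw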
- In the SQL editing window, create a table to store the OBS data in Parquet format and name the table test_parquet_hw:
create table `test_parquet_hw` (`id1` INT, `id2` INT, `id3` INT, `id4` INT, `id5` INT) using parquet options ( path 'obs://obs-parquet-data/' )
You do not need to specify a file because no Parquet file exists in this OBS bucket before the data is converted.
- In the SQL editing window, execute the following statement to convert the CSV data to Parquet format and store the data in the specified OBS folder:
insert into test_parquet_hw select * from test_csv_hw
- Check the result. OBS automatically creates a file in the obs-parquet-data bucket to store the converted data.
Figure 5 Parquet data saved in a file in OBS
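As an optional extra check (not part of the original steps), you can also query the Parquet table in the SQL editor to confirm that all rows were converted; for example:
select count(*) from test_parquet_hw
The count should match the number of rows in test_csv_hw.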