scan_csv().sink_parquet() is significantly slower than using collect().write_parquet(), and the resulting file sizes are different #20815

Open
ruoyu0088 opened this issue Jan 21, 2025 · 1 comment
Labels: A-io-parquet (Area: reading/writing Parquet files) · bug (Something isn't working) · needs triage (Awaiting prioritization by a maintainer) · performance (Performance issues or improvements) · python (Related to Python Polars)


ruoyu0088 commented Jan 21, 2025

Checks

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest version of Polars.

Reproducible example

import os
import polars as pl

def create_folder_and_write_csv_files(folder_path, num_files=5, num_rows=10):
    if not os.path.exists(folder_path):
        os.makedirs(folder_path)
    
    for i in range(1, num_files + 1):
        data = pl.DataFrame({
            "column1": range(1, num_rows + 1),
            "column2": [f"row_{j}" for j in range(1, num_rows + 1)],  # avoid shadowing the outer file-loop variable
        })
        
        file_path = os.path.join(folder_path, f"test_file_{i}.csv")
        data.write_csv(file_path)

create_folder_and_write_csv_files("data/test_folder", num_files=300, num_rows=1000)

df_test = pl.scan_csv('data/test_folder/*.csv')
%time df_test.sink_parquet('data/test.parquet', compression='uncompressed')
%time df_test.collect().write_parquet('data/test1.parquet', compression='uncompressed')
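To run the same comparison outside IPython (where the %time magic is not available), here is a minimal plain-Python timing sketch; the timed helper is illustrative and not part of the original report:

import time

def timed(label, fn):
    # Illustrative helper: run fn once and report wall-clock time.
    start = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - start:.3f} s")

timed("sink_parquet", lambda: df_test.sink_parquet("data/test.parquet", compression="uncompressed"))
timed("collect().write_parquet", lambda: df_test.collect().write_parquet("data/test1.parquet", compression="uncompressed"))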

Output

  1. Using scan_csv().sink_parquet() (slow):

     %time df_test.sink_parquet('data/test.parquet', compression='uncompressed')

     CPU times: total: 1.53 s
     Wall time: 1.19 s

  2. Using collect().write_parquet() (fast):

     %time df_test.collect().write_parquet('data/test1.parquet', compression='uncompressed')

     CPU times: total: 93.8 ms
     Wall time: 50 ms

Additionally, the resulting file sizes are different:

  • test.parquet: 6,449,222 bytes
  • test1.parquet: 789,349 bytes
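To see where the size gap comes from, a minimal diagnostic sketch using pyarrow (listed under installed versions) that compares the parquet metadata of the two files; whether the row-group layout fully explains the difference is an assumption, not something verified here:

import pyarrow.parquet as pq

# Compare basic parquet metadata of the two output files.
for path in ("data/test.parquet", "data/test1.parquet"):
    meta = pq.ParquetFile(path).metadata
    print(
        path,
        "rows:", meta.num_rows,
        "row groups:", meta.num_row_groups,
        "metadata size:", meta.serialized_size,
    )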

Issue description

I am trying to process multiple CSV files with Polars and save them as Parquet. scan_csv().sink_parquet() is much slower than collect().write_parquet(), and the resulting Parquet files differ significantly in size. The code I used is shown in the reproducible example above.

Expected behavior

  • scan_csv().sink_parquet() should deliver performance and file sizes comparable to collect().write_parquet() (a possible experiment is sketched below).
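As a hedged experiment rather than a confirmed workaround, one could check whether explicitly setting row_group_size on the sink narrows the file-size gap; the parameter exists on LazyFrame.sink_parquet, but the value below is an assumption:

import polars as pl

# Hypothetical experiment: force a larger row-group size on the streaming sink
# and compare the resulting file size against collect().write_parquet().
lf = pl.scan_csv("data/test_folder/*.csv")
lf.sink_parquet(
    "data/test_rg.parquet",          # assumed output path
    compression="uncompressed",
    row_group_size=512_000,          # assumed value; the default may differ
)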

Installed versions

--------Version info---------
Polars:              1.20.0
Index type:          UInt32
Platform:            Windows-10-10.0.26100-SP0
Python:              3.11.8 | packaged by conda-forge | (main, Feb 16 2024, 20:40:50) [MSC v.1937 64 bit (AMD64)]
LTS CPU:             False

----Optional dependencies----
adbc_driver_manager  <not installed>
altair               5.5.0
azure.identity       <not installed>
boto3                <not installed>
cloudpickle          3.1.0
connectorx           <not installed>
deltalake            0.24.0
fastexcel            0.12.0
fsspec               2024.10.0
gevent               24.11.1
google.auth          <not installed>
great_tables         <not installed>
matplotlib           3.9.3
nest_asyncio         1.6.0
numpy                1.26.4
openpyxl             <not installed>
pandas               2.2.3
pyarrow              18.0.0
pydantic             <not installed>
pyiceberg            <not installed>
sqlalchemy           2.0.36
torch                <not installed>
xlsx2csv             <not installed>
xlsxwriter           3.2.0
ph-ll-pp commented:

I am having the same issue with scan_parquet().sink_parquet().
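A minimal sketch of that parquet-to-parquet variant (file names are assumptions, not taken from the comment):

import polars as pl

# Write an intermediate parquet file, then compare the streaming sink
# against the eager collect/write path on the same lazy scan.
pl.scan_csv("data/test_folder/*.csv").collect().write_parquet("data/test_input.parquet")

lf = pl.scan_parquet("data/test_input.parquet")
lf.sink_parquet("data/test_sink_from_parquet.parquet")                 # streaming sink
lf.collect().write_parquet("data/test_collect_from_parquet.parquet")   # eager path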
