Data loading issues #142
Can you try to replace the line with the following snippet?

```python
import csv
import sys

max_int = sys.maxsize
while True:
    # Decrease the max_int value by a factor of 10
    # as long as the OverflowError occurs.
    try:
        csv.field_size_limit(max_int)
        break
    except OverflowError:
        max_int = int(max_int / 10)
```
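For context, and as my reading rather than something stated explicitly in this thread: `csv.field_size_limit()` stores its argument in a C long, and on Windows builds of Python a C long is only 32 bits, so `sys.maxsize` (2**63 - 1 on a 64-bit build) overflows it. The loop keeps dividing by 10 until the value fits; note that without the `/ 10` the loop never terminates, which would also explain a run that never finishes.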
The execution is taking too long; it has been running for more than 12 hours. Is there any solution?
Oh, that definitely shouldn't take that long (just a few seconds, I would guess). Did you try setting a fixed number instead?
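A minimal sketch of that fixed-number approach (the exact value is an assumption; 2**31 - 1 is the largest value a 32-bit C long can hold, so the call cannot overflow even on Windows):

```python
import csv

# 2**31 - 1 is LONG_MAX on platforms where a C long is 32 bits
# (e.g. Windows), so this call is safe everywhere and still allows
# fields of up to ~2 GB.
csv.field_size_limit(2**31 - 1)
```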
Well, the steps are a bit old, and I would like to rework them once I have more time.
It should work now; just pull the latest changes from the repo.
Can you please elaborate?
On what exactly?
The error hasn't been solved for me either, even after replacing the code you provided.
`OverflowError: Python int too large to convert to C long`

I am getting this error for both the train.csv and dev.csv files. What can I do to solve it?
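For what it's worth, this error means `csv.field_size_limit()` received a value larger than the platform's C long can hold (a C long is 32 bits on Windows builds of Python). If the decreasing loop above still fails, a one-shot alternative sketch, assuming only the standard `ctypes` module, is to query the platform's C long size directly:

```python
import csv
import ctypes

# A signed C long of n bytes holds at most 2**(8*n - 1) - 1, i.e.
# LONG_MAX for this platform (4 bytes on Windows, 8 on most Unixes).
long_max = 2 ** (8 * ctypes.sizeof(ctypes.c_long) - 1) - 1
csv.field_size_limit(long_max)
```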