For a project I am working on, uncompressed data files can be very large (easily 500 KB). I'd rather not load and hold the entire file in memory to parse it. Instead, I want to stream in characters as needed.
This is tricky to do in a generic way with a framework like this. PetitParser needs random access on the input data because of backtracking. However, depending on your grammar, you can split up the parser and run it incrementally over chunks of input data. An example is this event-based XML parser, which can efficiently process infinitely large streams of input.
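For illustration, here is a minimal sketch of that chunk-wise approach, assuming the Dart version of PetitParser. The grammar and the `parseChunks` helper are made up for this example, not part of the library: each record is a digit run terminated by a newline, complete records are parsed off the front of a buffer, and an incomplete tail waits for the next chunk.

```dart
import 'package:petitparser/petitparser.dart';

// Illustrative record grammar: a run of digits followed by a newline.
final record =
    (digit().plus().flatten() & char('\n')).map((values) => values[0] as String);

Future<void> parseChunks(Stream<String> chunks) async {
  var buffer = '';
  await for (final chunk in chunks) {
    buffer += chunk;
    // Greedily parse complete records off the front of the buffer. A failure
    // here typically means the next record is still incomplete, so we stop
    // and wait for more input.
    while (true) {
      final result = record.parse(buffer);
      if (result is Success) {
        print('record: ${result.value}');
        buffer = buffer.substring(result.position);
      } else {
        break;
      }
    }
  }
}

void main() async {
  // Feed the parser in small pieces, as if streaming from a large file.
  await parseChunks(Stream.fromIterable(['12', '3\n45\n6', '78\n']));
}
```

The key point is that only one record (plus any incomplete tail) is ever held in memory, while the record parser itself still gets the random access it needs for backtracking within that record.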