Thanks a lot for your educational and informative project. I tried quantizing the already provided TinyStories .bin checkpoints (llama2.c format) with export.py, but as far as I understood this isn't supported. export.py mentions llama2.c as a supported input format, but apparently it isn't actually handled. Would it be possible to add this feature?
I am relatively new to all this, so please don't mind if my point is incorrect. Thanks again.
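In case it helps, here is a rough, untested sketch of what such a loader could look like. I'm assuming the legacy .bin layout as written by legacy_export() (a 7 x int32 header, then fp32 weights grouped by tensor type across layers, with a negative vocab_size flagging a non-shared classifier), and that ModelArgs / Transformer from model.py can be reused as-is. The function name load_legacy_bin and some field/attribute names are my guesses, so please double-check against the actual source.

```python
# Hypothetical helper, not part of export.py today. Layout assumed from
# legacy_export() / run.c; verify field order and names before relying on it.
import struct
import numpy as np
import torch
from model import ModelArgs, Transformer

def load_legacy_bin(path):
    f = open(path, "rb")
    # assumed header: dim, hidden_dim, n_layers, n_heads, n_kv_heads, vocab_size, max_seq_len
    # (negative vocab_size => classifier weights are NOT shared with the embedding)
    dim, hidden_dim, n_layers, n_heads, n_kv_heads, vocab_size, max_seq_len = \
        struct.unpack("iiiiiii", f.read(7 * 4))
    shared_classifier = vocab_size > 0
    vocab_size = abs(vocab_size)

    def read(*shape):
        # read the next prod(shape) float32 values and reshape them
        n = int(np.prod(shape))
        return torch.from_numpy(np.fromfile(f, dtype=np.float32, count=n)).view(*shape)

    args = ModelArgs(dim=dim, n_layers=n_layers, n_heads=n_heads,
                     n_kv_heads=n_kv_heads, vocab_size=vocab_size,
                     hidden_dim=hidden_dim, max_seq_len=max_seq_len)
    model = Transformer(args)
    head_size = dim // n_heads

    # weights are assumed to be serialized grouped by tensor type, across all layers
    model.tok_embeddings.weight.data = read(vocab_size, dim)
    for l in model.layers: l.attention_norm.weight.data = read(dim)
    for l in model.layers: l.attention.wq.weight.data = read(n_heads * head_size, dim)
    for l in model.layers: l.attention.wk.weight.data = read(n_kv_heads * head_size, dim)
    for l in model.layers: l.attention.wv.weight.data = read(n_kv_heads * head_size, dim)
    for l in model.layers: l.attention.wo.weight.data = read(dim, n_heads * head_size)
    for l in model.layers: l.ffn_norm.weight.data = read(dim)
    for l in model.layers: l.feed_forward.w1.weight.data = read(hidden_dim, dim)
    for l in model.layers: l.feed_forward.w2.weight.data = read(dim, hidden_dim)
    for l in model.layers: l.feed_forward.w3.weight.data = read(hidden_dim, dim)
    model.norm.weight.data = read(dim)
    read(max_seq_len, head_size // 2)  # skip freq_cis_real
    read(max_seq_len, head_size // 2)  # skip freq_cis_imag
    if not shared_classifier:
        # replace the (tied) classifier weights with their own Parameter
        model.output.weight = torch.nn.Parameter(read(vocab_size, dim))
    f.close()
    model.eval()
    return model
```

Then, if I understand export.py correctly, something like model_export(load_legacy_bin("stories15M.bin"), "stories15M_q80.bin", version=2) could write out the int8-quantized version. Again, this is just a sketch of the idea, not a working patch.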
hafezmg48 changed the title from "add feature: export (quantize) from llama.c format" to "add feature: export (quantize) from Llama2.c format" on Mar 15, 2024