Releases · Blaizzy/mlx-vlm
v0.1.11
What's Changed
- Add chat mode to the CLI by @chigkim in #168
- Fix skip-vision predicate and add utils unit test (quantize and inputs) by @Blaizzy in #172
- Refactor topk to use mlx.core (DS-VL2) by @Blaizzy in #175 (see the sketch after this list)
- Fix trainer and Qwen2-VL by @Blaizzy in #179
- Pin latest mlx by @Blaizzy in #184
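On the topk refactor in #175: selecting the k largest routing scores and their indices can be done with `mlx.core` ops alone. A minimal sketch of that pattern, as a mixture-of-experts router would use it; the helper name, shapes, and example values are illustrative assumptions, not the DeepSeek-VL2 code:

```python
import mlx.core as mx

def topk_with_indices(scores: mx.array, k: int):
    # Hypothetical helper, not the library's API: return the k largest
    # values and their indices along the last axis.
    n = scores.shape[-1]
    # argpartition puts the k largest entries in the last k slots
    # (unsorted among themselves), which is all routing needs.
    idx = mx.argpartition(scores, kth=n - k, axis=-1)[..., -k:]
    vals = mx.take_along_axis(scores, idx, axis=-1)
    return vals, idx

# Example: route 4 tokens over 8 experts, keeping 2 experts per token.
scores = mx.random.uniform(shape=(4, 8))
vals, idx = topk_with_indices(scores, k=2)
print(vals.shape, idx.shape)  # (4, 2) (4, 2)
```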
Full Changelog: v0.1.10...v0.1.11
v0.1.10
v0.1.9
v0.1.8
v0.1.7
What's Changed
- Fix multi-image and 2x speed improvements (DS-VL2) by @Blaizzy in #157
- Refactor utils (model loading, inference, and output processing) by @Blaizzy in #161 (see the usage sketch after this list)
- Fix Llama-3.2-Vision (18x faster generation and 75% less memory usage) by @Blaizzy in #163
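After the utils refactor in #161, loading and inference go through the public `load`/`generate` helpers. A minimal usage sketch assuming the README-style API of this release line; the model repo, prompt, and argument order are assumptions, so check the README for your installed version:

```python
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/Qwen2-VL-2B-Instruct-4bit"  # example repo
model, processor = load(model_path)
config = load_config(model_path)

# Multi-image input (relevant to the DS-VL2 fix in #157) is a list of
# paths or URLs.
images = ["http://images.cocodataset.org/val2017/000000039769.jpg"]
prompt = apply_chat_template(
    processor, config, "Describe this image.", num_images=len(images)
)

output = generate(model, processor, prompt, images, verbose=False)
print(output)
```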
⚠️ Breaking Changes
This release introduces some breaking changes. If you encounter any issues, please open an issue or submit a PR.
Full Changelog: v0.1.6...v0.1.7
v0.1.6
v0.1.5
v0.1.4
v0.1.3
What's Changed
- Add lazy eval during conversion by @Blaizzy in #127 (see the first sketch after this list)
- Open tokenizer.json within a context manager by @neilmehta24 in #129 (see the second sketch after this list)
- Fix bugs in the chat UI by @terhechte in #96
- Fix broken stream generate for SmolVLM and others by @andimarafioti in #132
- Fix idefics3 by @Blaizzy in #133
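On lazy eval during conversion (#127): MLX builds arrays lazily, so staging the conversion work first and then materializing one tensor at a time with `mx.eval` keeps peak memory near the size of a single weight rather than the whole model. A minimal illustrative sketch; the weight names are made up and the float16 cast stands in for the real conversion work:

```python
import mlx.core as mx

# Made-up weights standing in for a checkpoint.
weights = {
    "vision_tower.proj": mx.zeros((1024, 1024)),
    "language_model.head": mx.zeros((4096, 4096)),
}

# Each cast only records a lazy graph node; nothing is computed here.
converted = {name: w.astype(mx.float16) for name, w in weights.items()}

# Materialize tensors one at a time instead of all at once, so peak
# memory stays close to the largest single weight.
for name, w in converted.items():
    mx.eval(w)
```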
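And on the tokenizer.json fix (#129): reading the file inside a `with` block guarantees the handle is closed even if parsing raises. A minimal sketch of the pattern; the path is just an example:

```python
import json

# The context manager closes the file whether or not json.load raises.
with open("tokenizer.json", encoding="utf-8") as f:
    tokenizer_config = json.load(f)
```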
New Contributors
- @neilmehta24 made their first contribution in #129
- @terhechte made their first contribution in #96
- @andimarafioti made their first contribution in #132
Full Changelog: v0.1.2...v0.1.3