update llama.cpp to 74d73dc version #34

Open · wants to merge 26 commits into master-release

Commits (26)
fd2c582
remove reference interface from extern C in qwen2audio examples
Nov 21, 2024
7589158
expose omni_context_params struct
Nov 21, 2024
809db95
upgrade to llama.cpp 74d73dc
TerryT9 Dec 2, 2024
a2c5305
merge from master
TerryT9 Dec 2, 2024
661b3f7
Merge pull request #31 from NexaAI/teliu/dev
zhiyuan8 Dec 2, 2024
0b15d2d
fix conflicts (#32)
TerryT9 Dec 2, 2024
71b563e
Merge branch 'weili/dev' of github.com:NexaAI/llama.cpp into weili/dev
Dec 3, 2024
97267e6
bug fix in common-nexa.cpp
liwiii Dec 3, 2024
be54cb0
bug fix
Dec 3, 2024
ca7e8ef
fix clip_n_patch() allocation size error for 81-series omni-vlm models
Dec 3, 2024
07c7ff3
Merge branch 'weili/dev' of github.com:NexaAI/llama.cpp into weili/dev
Dec 3, 2024
b86cded
remove iostream header
Dec 3, 2024
b2958b3
Merge pull request #33 from NexaAI/weili/dev
zhiyuan8 Dec 3, 2024
64a6001
update omni audio cmake
TerryT9 Dec 3, 2024
5962b50
Update omni-audio cmake content to make it static (#36)
TerryT9 Dec 10, 2024
1487d32
Add streaming for omnivlm (#39)
TerryT9 Jan 6, 2025
9201de2
[swift] add module omnivlm (#41)
TerryT9 Jan 13, 2025
524323f
add support for Deepseek-R1-Qwen distill model
Davidqian123 Jan 29, 2025
1b40302
update
Davidqian123 Jan 29, 2025
cc827bc
Merge pull request #43 from NexaAI/release
Davidqian123 Jan 29, 2025
a4ee5ca
implemented compilation time q4_0 group size variants - for cpu
zhycheng614 Jan 30, 2025
23649e5
WIP
zhycheng614 Jan 30, 2025
37b572e
fixed hardcode qk=128 bug
zhycheng614 Jan 30, 2025
6a085d2
Merge pull request #44 from NexaAI/perry/quant-groupsize
zhiyuan8 Jan 31, 2025
e39e2b2
updated readme
zhycheng614 Feb 1, 2025
673d35c
Merge pull request #45 from NexaAI/perry/quant-groupsize
zhycheng614 Feb 1, 2025

Files changed

.clang-tidy: deleted (0 additions, 24 deletions)
.devops/cloud-v-pipeline: deleted (0 additions, 22 deletions)
.devops/full-cuda.Dockerfile: deleted (0 additions, 33 deletions)
.devops/full-rocm.Dockerfile: deleted (0 additions, 50 deletions)
.devops/full.Dockerfile: deleted (0 additions, 25 deletions)
.devops/llama-cli-cann.Dockerfile: deleted (0 additions, 44 deletions)
.devops/llama-cli-cuda.Dockerfile: deleted (0 additions, 37 deletions)
.devops/llama-cli-intel.Dockerfile: deleted (0 additions, 28 deletions)
.devops/llama-cli-rocm.Dockerfile: deleted (0 additions, 45 deletions)
.devops/llama-cli-vulkan.Dockerfile: deleted (0 additions, 27 deletions)
.devops/llama-cli.Dockerfile: deleted (0 additions, 23 deletions)
