tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64:v0.0.1-70206b9cf111-b9564bf364e9 (public, latest)

Install from the command line
$ docker pull ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64:v0.0.1-70206b9cf111-b9564bf364e9
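
Once pulled, the image can be started like any other container. The flags below are a minimal sketch only: the Tenstorrent device path and the service port are assumptions not documented on this page, so check the tt-inference-server repository for the exact run command this image expects.

# Example only: --device path and -p port mapping are assumptions, not taken from this page.
$ docker run --rm -it \
    --device /dev/tenstorrent \
    -p 8000:8000 \
    ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64:v0.0.1-70206b9cf111-b9564bf364e9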

Recent tagged image versions

  • v0.0.1-70206b9cf111-b9564bf364e9 — published about 22 hours ago
    Digest: sha256:a5c90b076b10de76646f5e7605584e27b1e8400f10a8abdb3f6ad9c489576905
    Version downloads: 1

Details

  • Last published: 22 hours ago
  • Issues: 15
  • Total downloads: 1