
Assertion error with app demo #43

Open
KW-NJU opened this issue Oct 29, 2024 · 0 comments
KW-NJU commented Oct 29, 2024

Hi, when I was running:
python -m demo.app --resume_from_checkpoint chenjoya/videollm-online-8b-v1plus --attn_implementation sdpa

I got a strange error. When I try to ask a question on the web page, the Assistant's reply always gets stuck at some point and throws an error. The error in the terminal is:

Traceback (most recent call last):
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/queueing.py", line 706, in process_events
    response = await route_utils.call_process_api(
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/route_utils.py", line 323, in call_process_api
    output = await app.get_blocks().process_api(
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/blocks.py", line 2018, in process_api
    result = await self.call_function(
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/blocks.py", line 1579, in call_function
    prediction = await utils.async_iteration(iterator)
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/utils.py", line 691, in async_iteration
    return await anext(iterator)
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/utils.py", line 685, in __anext__
    return await anyio.to_thread.run_sync(
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 943, in run
    result = context.run(func, *args)
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/utils.py", line 668, in run_sync_iterator_async
    return next(iterator)
  File "/data/wj/anaconda3/envs/videollm/lib/python3.10/site-packages/gradio/utils.py", line 829, in gen_wrapper
    response = next(iterator)
  File "/data/wj/videollm-online/demo/app.py", line 92, in gr_liveinfer_queue_refresher_change
    query, response = liveinfer()
  File "/data/wj/videollm-online/demo/inference.py", line 129, in __call__
    query, response = self._call_for_response(video_time, query)
  File "/data/wj/videollm-online/demo/inference.py", line 48, in _call_for_response
    assert self.last_ids == 933, f'{self.last_ids} != 933' # HACK, 933 = ]\n
AssertionError: tensor([[0]], device='cuda:0') != 933

Could you please tell me what to do? Thanks a lot!
By the way, my cli.py is working fine.
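
In case it helps with debugging, here is a small sketch I used to double-check what the hard-coded token id 933 (the "]\n" in the assert) corresponds to for this checkpoint. This is only a diagnostic, not a fix, and it assumes the chenjoya/videollm-online-8b-v1plus repo ships a Hugging Face tokenizer that AutoTokenizer can load; otherwise the base Llama-3 tokenizer would be needed instead.

# Diagnostic sketch (assumption: the checkpoint provides a Hugging Face tokenizer).
# It only checks which token id(s) the string "]\n" encodes to, so we can see
# whether the hard-coded 933 in demo/inference.py matches this tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("chenjoya/videollm-online-8b-v1plus")

ids = tokenizer.encode("]\n", add_special_tokens=False)
print("token ids for ']\\n':", ids)                       # expected to contain 933 per the assert
print("decoded back:", repr(tokenizer.decode(ids)))
print("id 933 decodes to:", repr(tokenizer.decode([933])))
print("id 0 decodes to:", repr(tokenizer.decode([0])))    # what the assertion actually received

On my side the question is why self.last_ids ends up as tensor([[0]]) instead of the expected id during the web demo, while cli.py works.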

[Screenshots attached: 微信图片_20241030000919, 1730218147357]
