Can't use bge-reranker-large in CPU mode #56

Open
Abirdcfly opened this issue Apr 15, 2024 · 0 comments

kubectl logs -f -n kubeagi-system bge-reranker-large-worker-57fd46459-7r945
Defaulted container "runner" out of: runner, loader (init)
/usr/local/lib/python3.11/site-packages/langchain/chat_models/__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/embeddings/__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.embeddings import OpenAIEmbeddings`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/embeddings/__init__.py:29: LangChainDeprecationWarning: Importing embeddings from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.embeddings import AzureOpenAIEmbeddings`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/chat_models/__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import AzureChatOpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/chat_models/__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import BedrockChat`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/chat_models/__init__.py:31: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatVertexAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/llms/__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import AmazonAPIGateway`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/llms/__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import AzureOpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/llms/__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import Bedrock`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/llms/__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import OpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/llms/__init__.py:548: LangChainDeprecationWarning: Importing LLMs from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.llms import VertexAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
INFO:     Will watch for changes in these directories: ['/libs/cli']
INFO:     Uvicorn running on http://0.0.0.0:21002 (Press CTRL+C to quit)
INFO:     Started reloader process [1] using StatReload
INFO:     Started server process [30]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     10.244.0.72:34026 - "POST /api/v1/reranking HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 404, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 762, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 782, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
    raise e
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 40, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/libs/cli/kubeagi_cli/server.py", line 39, in reranking
    reranker = select_reranking(input.model_name_or_path)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/libs/cli/kubeagi_cli/serve/reranking.py", line 86, in select_reranking
    return BGEReranking(model_name_or_path)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/libs/cli/kubeagi_cli/serve/reranking.py", line 43, in __init__
    self._reranker = FlagReranker(
                     ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/FlagEmbedding/flag_models.py", line 131, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 825, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2048, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2287, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py", line 155, in __init__
    super().__init__(
  File "/usr/local/lib/python3.11/site-packages/transformers/tokenization_utils_fast.py", line 120, in __init__
    raise ValueError(
ValueError: Couldn't instantiate the backend tokenizer from one of:
(1) a `tokenizers` library serialization file,
(2) a slow tokenizer instance to convert or
(3) an equivalent slow tokenizer class to instantiate and convert.
You need to have sentencepiece installed to convert a slow tokenizer to a fast one.
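
Judging by the last frames, the request fails while building the tokenizer for the XLM-RoBERTa-based reranker, not because of CPU mode itself: `transformers` falls back to converting a slow SentencePiece tokenizer (likely because the copied model directory has no `tokenizer.json`) and the `sentencepiece` package is missing from the runner image. Below is a minimal check sketch that could be run inside the `runner` container; the model path comes from `RERANKING_MODEL_PATH` in the Pod spec below, and the assumption that `pip install sentencepiece` is possible in that image is just that, an assumption.

```python
# check_tokenizer.py -- hedged repro sketch, run inside the runner container, e.g.:
#   kubectl exec -n kubeagi-system <pod> -c runner -- python /tmp/check_tokenizer.py
# If this raises the same ValueError, `pip install sentencepiece` (or adding
# sentencepiece to the core-library-cli image) should let the tokenizer load.
from transformers import AutoTokenizer

MODEL_PATH = "/data/models/bge-reranker-large"  # value of RERANKING_MODEL_PATH

try:
    import sentencepiece  # noqa: F401
    print("sentencepiece is installed")
except ImportError:
    print("sentencepiece is NOT installed -- matches the traceback above")

# This is the call that fails in FlagReranker.__init__ (flag_models.py:131).
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
print("tokenizer loaded:", type(tokenizer).__name__)
```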

The Pod spec of the bge-reranker-large worker deployment:

apiVersion: v1
kind: Pod
metadata:
  creationTimestamp: "2024-04-15T09:15:16Z"
  generateName: bge-reranker-large-worker-57fd46459-
  labels:
    app.kubernetes.io/name: bge-reranker-large-worker
    arcadia.kubeagi.k8s.com.cn/worker: bge-reranker-large
    pod-template-hash: 57fd46459
  name: bge-reranker-large-worker-57fd46459-7r945
  namespace: kubeagi-system
  ownerReferences:
  - apiVersion: apps/v1
    blockOwnerDeletion: true
    controller: true
    kind: ReplicaSet
    name: bge-reranker-large-worker-57fd46459
    uid: d79b4d0d-8120-49bf-a849-0da692828d5a
  resourceVersion: "69504790"
  uid: 98835ab3-a479-4b0c-ac6d-cc63b8330e28
spec:
  containers:
  - command:
    - python
    - kubeagi_cli/cli.py
    - serve
    - --host
    - 0.0.0.0
    - --port
    - "21002"
    env:
    - name: RERANKING_MODEL_PATH
      value: /data/models/bge-reranker-large
    image: kubeagi/core-library-cli:v0.0.1
    imagePullPolicy: IfNotPresent
    name: runner
    ports:
    - containerPort: 21002
      name: http
      protocol: TCP
    resources:
      limits:
        cpu: "1"
        memory: 16Gi
        nvidia.com/gpu: "0"
      requests:
        cpu: "1"
        memory: 16Gi
        nvidia.com/gpu: "0"
    terminationMessagePath: /dev/termination-log
    terminationMessagePolicy: File
    volumeMounts:
    - mountPath: /data/models
      name: models
    - mountPath: /var/run/secrets/kubernetes.io/serviceaccount
      name: kube-api-access-qd2ns
      readOnly: true
  dnsPolicy: ClusterFirst
  enableServiceLinks: true
  initContainers:
  - command:
    - /bin/bash
    - -c
    - |-
      echo "Load models"
      mc alias set oss $MINIO_ENDPOINT $MINIO_ACCESS_KEY $MINIO_SECRET_KEY --insecure
      mc --insecure cp -r oss/$MINIO_MODEL_BUCKET/model/$MINIO_MODEL_NAME /data/models
    env:
    - name: MINIO_ENDPOINT
      value: http://arcadia-minio.kubeagi-system.svc.cluster.local:9000
    - name: MINIO_ACCESS_KEY
      value: admin
    - name: MINIO_SECRET_KEY
      value: Passw0rd!
    - name: MINIO_MODEL_BUCKET
      value: kubeagi-system
    - name: MINIO_MODEL_NAME
      value: bge-reranker-large
    image: kubeagi/minio-mc:RELEASE.2023-01-28T20-29-38Z
    imagePullPolicy: IfNotPresent
    name: loader
    resources: {}
    terminationMessagePath: /dev/termination-log
    terminationMessagePolicy: File
    volumeMounts:
    - mountPath: /data/models
      name: models
    - mountPath: /var/run/secrets/kubernetes.io/serviceaccount
      name: kube-api-access-qd2ns
      readOnly: true
  nodeName: kubeagi-control-plane
  preemptionPolicy: PreemptLowerPriority
  priority: 0
  restartPolicy: Always
  schedulerName: default-scheduler
  securityContext: {}
  serviceAccount: default
  serviceAccountName: default
  terminationGracePeriodSeconds: 30
  tolerations:
  - effect: NoExecute
    key: node.kubernetes.io/not-ready
    operator: Exists
    tolerationSeconds: 300
  - effect: NoExecute
    key: node.kubernetes.io/unreachable
    operator: Exists
    tolerationSeconds: 300
  volumes:
  - emptyDir: {}
    name: models
  - name: kube-api-access-qd2ns
    projected:
      defaultMode: 420
      sources:
      - serviceAccountToken:
          expirationSeconds: 3607
          path: token
      - configMap:
          items:
          - key: ca.crt
            path: ca.crt
          name: kube-root-ca.crt
      - downwardAPI:
          items:
          - fieldRef:
              apiVersion: v1
              fieldPath: metadata.namespace
            path: namespace
status:
  conditions:
  - lastProbeTime: null
    lastTransitionTime: "2024-04-15T09:15:45Z"
    status: "True"
    type: Initialized
  - lastProbeTime: null
    lastTransitionTime: "2024-04-15T09:15:46Z"
    status: "True"
    type: Ready
  - lastProbeTime: null
    lastTransitionTime: "2024-04-15T09:15:46Z"
    status: "True"
    type: ContainersReady
  - lastProbeTime: null
    lastTransitionTime: "2024-04-15T09:15:16Z"
    status: "True"
    type: PodScheduled
  containerStatuses:
  - containerID: containerd://717f1771e38ce1f70fb0e31c1fe332dd382a2826328f53fd08546340f5206012
    image: docker.io/kubeagi/core-library-cli:v0.0.1
    imageID: docker.io/kubeagi/core-library-cli@sha256:7d830b613cfdcbf34c2550d9b746f344a1ed7804f3bcf72fc4f4691be5180f6e
    lastState: {}
    name: runner
    ready: true
    restartCount: 0
    started: true
    state:
      running:
        startedAt: "2024-04-15T09:15:46Z"
  hostIP: 172.18.0.2
  initContainerStatuses:
  - containerID: containerd://561f9037f576a125bae0a473a647d59b4f1d3e2304f9ccef6cae2fa0a2a09d1d
    image: docker.io/kubeagi/minio-mc:RELEASE.2023-01-28T20-29-38Z
    imageID: docker.io/kubeagi/minio-mc@sha256:729b3d128487fd83b4788aae24672a812836f7c4aa8687c40045fe72bee64d98
    lastState: {}
    name: loader
    ready: true
    restartCount: 0
    state:
      terminated:
        containerID: containerd://561f9037f576a125bae0a473a647d59b4f1d3e2304f9ccef6cae2fa0a2a09d1d
        exitCode: 0
        finishedAt: "2024-04-15T09:15:39Z"
        reason: Completed
        startedAt: "2024-04-15T09:15:16Z"
  phase: Running
  podIP: 10.244.0.75
  podIPs:
  - ip: 10.244.0.75
  qosClass: Burstable
  startTime: "2024-04-15T09:15:16Z"
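
For reference, the Pod requests no GPU (`nvidia.com/gpu: "0"`), so the worker is expected to run the reranker on CPU. Once the tokenizer issue above is resolved, a CPU-only load via FlagEmbedding should look roughly like the sketch below; `use_fp16=False` and the example query/passage pair are illustrative choices, not taken from the kubeagi code.

```python
from FlagEmbedding import FlagReranker

# CPU-only load: fp16 is mainly a GPU optimization, so it is left off here.
reranker = FlagReranker("/data/models/bge-reranker-large", use_fp16=False)

# compute_score takes a [query, passage] pair and returns a relevance score.
score = reranker.compute_score(
    ["what is kubeagi?", "KubeAGI is a Kubernetes-native framework for AI applications."]
)
print(score)
```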