Commit

fix: formatting issues

ajithvcoder committed Jan 24, 2025
1 parent 1f29c36 commit 93080f2
Showing 3 changed files with 6 additions and 6 deletions.
4 changes: 3 additions & 1 deletion adalflow/adalflow/components/model_client/bedrock_client.py
@@ -302,7 +302,9 @@ def convert_inputs_to_api_kwargs(
api_kwargs = self._validate_and_process_config_keys(api_kwargs)

# Separate inference config and additional model request fields
-        api_kwargs, inference_config, additional_model_request_fields = self._separate_parameters(api_kwargs)
+        api_kwargs, inference_config, additional_model_request_fields = (
+            self._separate_parameters(api_kwargs)
+        )

api_kwargs["messages"] = [
{"role": "user", "content": [{"text": input}]},
6 changes: 3 additions & 3 deletions docs/source/integrations/aws_bedrock.rst
@@ -11,7 +11,7 @@ AWS Bedrock API Client
Getting Credentials
-------------------

-You need to have an AWS account and an access key and secret key to use AWS Bedrock services. Moreover, the account associated with the access key must have 
+You need to have an AWS account and an access key and secret key to use AWS Bedrock services. Moreover, the account associated with the access key must have
the necessary permissions to access Bedrock services. Refer to the `AWS documentation <https://docs.aws.amazon.com/singlesignon/latest/userguide/howtogetcredentials.html>`_ for more information on obtaining credentials.

Enabling Foundation Models
@@ -32,9 +32,9 @@ Steps for enabling model access:
Note:

1. Avoid enabling high-cost models to prevent accidental high charges due to incorrect usage.
-2. As of Nov 2024, a cost-effective option is the Llama-3.2 1B model, with model ID: ``meta.llama3-2-1b-instruct-v1:0`` in the ``us-east-1`` region. 
+2. As of Nov 2024, a cost-effective option is the Llama-3.2 1B model, with model ID: ``meta.llama3-2-1b-instruct-v1:0`` in the ``us-east-1`` region.
3. AWS tags certain models with `inferenceTypesSupported` = `INFERENCE_PROFILE` and in UI it might appear with a tooltip as `This model can only be used through an inference profile.` In such cases you may need to use the Model ARN: ``arn:aws:bedrock:us-east-1:306093656765:inference-profile/us.meta.llama3-2-1b-instruct-v1:0`` in the model ID field when using Adalflow.
-4. Ensure (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME) or AWS_DEFAULT_PROFILE is set in the ``.env`` file. Mention exact key names in ``.env`` file for example access key id is ``AWS_ACCESS_KEY_ID`` 
+4. Ensure (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME) or AWS_DEFAULT_PROFILE is set in the ``.env`` file. Mention exact key names in ``.env`` file for example access key id is ``AWS_ACCESS_KEY_ID``

.. code-block:: python
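Note 4 in the docs diff says the Bedrock client expects those exact key names from a ``.env`` file. As a minimal, hypothetical sketch (placeholder values, and a hand-rolled KEY=VALUE loader standing in for a dotenv-style helper):

```python
import os
import tempfile

# Hypothetical .env contents: the key names are the ones the docs
# require; the values are placeholders, not real credentials.
env_text = (
    "AWS_ACCESS_KEY_ID=AKIA_EXAMPLE\n"
    "AWS_SECRET_ACCESS_KEY=example-secret\n"
    "AWS_REGION_NAME=us-east-1\n"
)

# Write the example .env to a temporary file.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write(env_text)
    env_path = f.name

# Parse KEY=VALUE lines the way a typical dotenv loader would,
# exporting each pair into the process environment.
with open(env_path) as f:
    for line in f:
        line = line.strip()
        if line and not line.startswith("#"):
            key, _, value = line.partition("=")
            os.environ[key] = value

print(os.environ["AWS_REGION_NAME"])  # -> us-east-1
```

In the tutorial file changed below, this loading is done by ``setup_env`` imported from ``adalflow.utils``.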
2 changes: 0 additions & 2 deletions tutorials/bedrock_client_simple_qa.py
@@ -1,5 +1,3 @@
-import os
-
from adalflow.components.model_client import BedrockAPIClient
from adalflow.core.types import ModelType
from adalflow.utils import setup_env
