Add SageMaker as an LLM provider (#1947)
* Add SageMaker as an LLM provider

* Removed unnecessary constants; updated docs to align with bootstrap naming convention

---------

Co-authored-by: Brandon Hancock (bhancock_ai) <[email protected]>
bobbywlindsey and bhancockio authored Jan 22, 2025
1 parent a836f46 commit e27a150
Showing 1 changed file with 18 additions and 0 deletions.
docs/concepts/llms.mdx: 18 additions & 0 deletions
@@ -243,6 +243,9 @@ There are three ways to configure LLMs in CrewAI. Choose the method that best fits
# llm: bedrock/amazon.titan-text-express-v1
# llm: bedrock/meta.llama2-70b-chat-v1

# Amazon SageMaker Models - Enterprise-grade
# llm: sagemaker/<my-endpoint>

# Mistral Models - Open source alternative
# llm: mistral/mistral-large-latest
# llm: mistral/mistral-medium-latest
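The sagemaker/<my-endpoint> option added above can be used the same way as the other providers, by handing the provider/endpoint string straight to an agent. A minimal sketch, assuming a SageMaker inference endpoint is already deployed; <my-endpoint> stays a placeholder and the agent fields are illustrative:

```python Code
from crewai import Agent

# Assumption: a SageMaker inference endpoint is deployed and reachable;
# "<my-endpoint>" is a placeholder for its name.
agent = Agent(
    role="Research Analyst",                      # illustrative fields
    goal="Summarize model evaluation results",
    backstory="An analyst backed by a SageMaker-hosted model.",
    llm="sagemaker/<my-endpoint>",                # provider/endpoint string from the list above
)
```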
@@ -506,6 +509,21 @@ Learn how to get the most out of your LLM configuration:
)
```
</Accordion>

<Accordion title="Amazon SageMaker">
```python Code
AWS_ACCESS_KEY_ID=<your-access-key>
AWS_SECRET_ACCESS_KEY=<your-secret-key>
AWS_DEFAULT_REGION=<your-region>
```

Example usage:
```python Code
llm = LLM(
    model="sagemaker/<my-endpoint>"
)
```
</Accordion>
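
For finer control, the LLM object from the snippet above can be configured and passed to an agent explicitly. A sketch under assumptions: credentials come from the AWS_* variables shown earlier, the temperature value and agent fields are illustrative, and <my-endpoint> remains a placeholder for a deployed endpoint:

```python Code
from crewai import Agent, LLM

# Credentials are read from AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY,
# and AWS_DEFAULT_REGION set in the environment (see above).
sagemaker_llm = LLM(
    model="sagemaker/<my-endpoint>",  # placeholder endpoint name
    temperature=0.7,                  # optional generation parameter; illustrative value
)

agent = Agent(
    role="Report Writer",                                   # illustrative fields
    goal="Draft summaries using the SageMaker-hosted model",
    backstory="A writer that relies on a private SageMaker endpoint.",
    llm=sagemaker_llm,
)
```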

<Accordion title="Mistral">
```python Code
