Releases: jackmpcollins/magentic

v0.11.0

25 Nov 18:54

What's Changed

Full Changelog: v0.10.0...v0.11.0

v0.10.0

15 Nov 05:28

What's Changed

Full Changelog: v0.9.1...v0.10.0

v0.9.1

07 Nov 05:04

v0.9.0

06 Nov 05:08

What's Changed

Full Changelog: v0.8.0...v0.9.0


Example of using the LiteLLM backend

from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Talk to me! ",
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str:
    ...


say_hello()

See the Backend/LLM Configuration section of the README for how to set the LiteLLM backend as the default.
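As a sketch of what that configuration might look like, the environment-variable names below are an assumption based on the naming convention the release notes use elsewhere (e.g. MAGENTIC_OPENAI_MAX_TOKENS); check the README's Backend/LLM Configuration section for the authoritative names.

```python
import os

# ASSUMED variable names -- verify against the README before relying on them.
os.environ["MAGENTIC_BACKEND"] = "litellm"              # select the LiteLLM backend globally
os.environ["MAGENTIC_LITELLM_MODEL"] = "ollama/llama2"  # default model for that backend
```

With these set before importing magentic, prompt-functions that don't specify a model explicitly would fall back to the LiteLLM backend.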

v0.8.0

02 Nov 06:41

What's Changed

Full Changelog: v0.7.2...v0.8.0


Allow the chat_model/LLM to be set using a context manager. This makes it easy to reuse the same prompt-function with different models, and also gives a neater way to set the model dynamically.

from magentic import OpenaiChatModel, prompt


@prompt("Say hello")
def say_hello() -> str:
    ...


@prompt(
    "Say hello",
    model=OpenaiChatModel("gpt-4", temperature=1),
)
def say_hello_gpt4() -> str:
    ...


say_hello()  # Uses env vars or default settings

with OpenaiChatModel("gpt-3.5-turbo"):
    say_hello()  # Uses gpt-3.5-turbo due to context manager
    say_hello_gpt4()  # Uses gpt-4 with temperature=1 because explicitly configured
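The precedence above (explicit model beats context manager beats defaults) can be sketched in plain Python with `contextvars`. This is a minimal illustration of the pattern, not magentic's actual implementation; the `Model` and `active_model` names are made up for the sketch.

```python
from contextvars import ContextVar
from typing import Optional

# Holds the model set by the innermost active context manager, if any.
_current_model: ContextVar[Optional[str]] = ContextVar("current_model", default=None)


class Model:
    """Toy stand-in for a chat model that can act as a context manager."""

    def __init__(self, name: str):
        self.name = name

    def __enter__(self):
        self._token = _current_model.set(self.name)
        return self

    def __exit__(self, *exc):
        _current_model.reset(self._token)


def active_model(explicit: Optional[str] = None, default: str = "default-model") -> str:
    """Resolve the model: explicit argument > context manager > default."""
    if explicit is not None:
        return explicit
    ctx = _current_model.get()
    return ctx if ctx is not None else default
```

Using `contextvars` rather than a plain global means the override is also safe under asyncio, where concurrent tasks each see their own context.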

v0.7.2

14 Oct 20:58

What's Changed

Full Changelog: v0.7.1...v0.7.2


Allow setting the max_tokens param in OpenaiChatModel. The default value for this can also be set using the environment variable MAGENTIC_OPENAI_MAX_TOKENS.

Example

from magentic import prompt
from magentic.chat_model.openai_chat_model import OpenaiChatModel

@prompt("Hello, how are you?", model=OpenaiChatModel(max_tokens=3))
def test() -> str:
    ...

test()
# 'Hello! I'

v0.7.1

08 Oct 01:09

What's Changed

New Contributors

Full Changelog: v0.7.0...v0.7.1

v0.7.0

02 Oct 07:21

What's Changed

Full Changelog: v0.6.0...v0.7.0


Chat Prompting

The @chatprompt decorator works just like @prompt but allows you to pass chat messages as a template rather than a single text prompt. This can be used to provide a system message, or for few-shot prompting where you provide example responses to guide the model's output. Format fields denoted by curly braces {example} will be filled in across all messages; use the escape_braces function to prevent a string from being treated as a template.

from magentic import chatprompt, AssistantMessage, SystemMessage, UserMessage
from magentic.chatprompt import escape_braces

from pydantic import BaseModel


class Quote(BaseModel):
    quote: str
    character: str


@chatprompt(
    SystemMessage("You are a movie buff."),
    UserMessage("What is your favorite quote from Harry Potter?"),
    AssistantMessage(
        Quote(
            quote="It does not do to dwell on dreams and forget to live.",
            character="Albus Dumbledore",
        )
    ),
    UserMessage("What is your favorite quote from {movie}?"),
)
def get_movie_quote(movie: str) -> Quote:
    ...


get_movie_quote("Iron Man")
# Quote(quote='I am Iron Man.', character='Tony Stark')
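The example imports escape_braces but doesn't exercise it. Its effect rests on Python's str.format convention that doubled braces are emitted literally; that mechanism can be sketched without magentic (the `escape` helper below is a hypothetical stand-in, not magentic's implementation):

```python
def escape(text: str) -> str:
    # Double each brace so str.format leaves it untouched
    return text.replace("{", "{{").replace("}", "}}")


# Without escaping, str.format would try to treat {"key": value} as a format field.
template = escape('A JSON object looks like {"key": value}') + " in {movie}"
```

Here `template.format(movie="Iron Man")` fills only the intentional `{movie}` field and renders the escaped braces literally.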

v0.6.0

25 Sep 06:04

What's Changed

New Contributors

Full Changelog: v0.5.0...v0.6.0

v0.5.0

15 Sep 08:59

What's Changed

Full Changelog: v0.4.1...v0.5.0


from magentic import prompt_chain


async def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }


@prompt_chain(
    template="What's the weather like in {city}?",
    functions=[get_current_weather],
)
async def describe_weather(city: str) -> str:
    ...


output = await describe_weather("Boston")  # inside an async context; use asyncio.run(...) in a plain script