Pharo-OllamaAPI

Compatible with Pharo 12 and Moose.

This is a simple Pharo API for calling the Ollama API.

You must first install Ollama on your computer (see https://ollama.com).
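
As a quick sanity check (a small sketch using Pharo's Zinc HTTP client, not part of this library), you can verify that the local server is reachable; by default Ollama listens on port 11434 and its root endpoint answers with the text 'Ollama is running'

"ping the default local Ollama endpoint"
answer := ZnClient new get: 'http://localhost:11434/'.
Transcript crShow: answer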

Examples

Generate code with CodeLlama

ollama := OllamaAPI new.
ollama model: OCodeLlamaModel new.
ollama model tag: '7b-code'. "code-infilling variant of CodeLlama"
ollama temperature: 0.1.
ollama num_predict: 30. "maximum number of tokens to generate"
ollama top_p: 0.5.

"<PRE>, <SUF> and <MID> are CodeLlama's fill-in-the-middle markers:
the model generates the code that belongs between the prefix and the suffix"
ollama query: '<PRE><body>
    <!-- here a table -->
    <SUF>
</body><MID>'

Generate a code comment with CodeLlama

ollama := OllamaAPI new.
ollama model: OCodeLlamaModel new.
ollama model tag: '7b'.
ollama temperature: 0.5.
ollama num_predict: 75.
ollama top_p: 0.5.

ollama query: 'Write a comment that explains this function

<yourcode>'
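
To comment an existing Pharo method, one option (an illustrative sketch; the chosen method is arbitrary and not part of this library's examples) is to splice the method's source into the prompt, since a CompiledMethod answers its source via #sourceCode

"sketch: build the prompt from a real method's source code"
ollama := OllamaAPI new.
ollama model: OCodeLlamaModel new.
ollama model tag: '7b'.
prompt := 'Write a comment that explains this function

' , (OrderedCollection >> #add:) sourceCode.
ollama query: prompt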

Use the stream API

[ollama := OllamaAPI new.
ollama model: OCodeLlamaModel new.
ollama model tag: '7b'.
ollama temperature: 0.5.
ollama num_predict: 100.
ollama top_p: 0.5.
ollama stream: true.

"in streaming mode, each JSON chunk carries a 'response' fragment and a 'done' flag"
answer := ollama query: 'Hello world'.
reader := NeoJSONReader on: (ZnCharacterReadStream on: answer).
[ reader atEnd ] whileFalse: [
	| val |
	val := reader next.
	Transcript crShow: (val at: #response).
	(val at: #done) ifTrue: [ answer close ] ]] forkAt: Processor lowIOPriority
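
A small variant of the example above (same assumptions about what #query: answers in streaming mode) collects the streamed fragments into a single String instead of printing each one to the Transcript

[out := WriteStream on: String new.
ollama := OllamaAPI new.
ollama model: OCodeLlamaModel new.
ollama model tag: '7b'.
ollama stream: true.

answer := ollama query: 'Hello world'.
reader := NeoJSONReader on: (ZnCharacterReadStream on: answer).
[ reader atEnd ] whileFalse: [
	| val |
	val := reader next.
	out nextPutAll: (val at: #response).
	(val at: #done) ifTrue: [ answer close ] ].
"the full answer is now available as one String"
Transcript crShow: out contents] forkAt: Processor lowIOPriority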

Installation

Metacello new
  githubUser: 'Evref-BL' project: 'Pharo-OllamaAPI' commitish: 'main' path: 'src';
  baseline: 'PharoOllama';
  load

As a dependency

spec
  baseline: 'PharoOllama'
  with: [ spec repository: 'github://Evref-BL/Pharo-OllamaAPI:main/src' ]
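
For context, this is where the snippet above typically lives: a sketch of the baseline method of a hypothetical BaselineOfMyProject class (the class and package names are placeholders, not part of this repository)

baseline: spec
	<baseline>
	spec for: #common do: [
		"declare the dependency on Pharo-OllamaAPI"
		spec
			baseline: 'PharoOllama'
			with: [ spec repository: 'github://Evref-BL/Pharo-OllamaAPI:main/src' ].
		"hypothetical package that requires it"
		spec
			package: 'MyProject'
			with: [ spec requires: #('PharoOllama') ] ]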
