Releases: devoxx/DevoxxGenieIDEAPlugin

v0.2.13

16 Aug 10:35
aff779c
  • Feat #209: Upgraded to LangChain4j 0.33.0
  • Fix #211: Class initialization must not depend on services
  • Feat #213: Show input/output tokens and cost per request in the footer of the response
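The per-request cost shown in the footer is, in essence, a weighted sum of input and output tokens. A minimal sketch of that calculation, assuming illustrative per-million-token prices (not the plugin's actual rate table):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_1m: float, output_price_per_1m: float) -> float:
    """Cost in USD, given token counts and prices per one million tokens."""
    return (input_tokens * input_price_per_1m
            + output_tokens * output_price_per_1m) / 1_000_000

# Illustrative prices only (USD per 1M tokens); real values depend on the provider and model.
cost = request_cost(input_tokens=12_000, output_tokens=800,
                    input_price_per_1m=0.15, output_price_per_1m=0.60)
print(f"${cost:.6f}")  # → $0.002280
```

Output tokens are typically priced higher than input tokens, which is why the footer reports both counts separately.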

v0.2.12

14 Aug 07:03
  • Fix #203: Google WebSearch is broken
  • Feat #199: Show execution time of prompt enhancement
  • Fix: Token, Cost and Context Window Settings page mapping correction
  • Fix #202: Update the Gradle IntelliJ build file so the plugin can be installed on other IntelliJ products

v0.2.10

05 Aug 07:38
0f0512b
  • Fix #184 - Input panel gets a bigger min/preferred height
  • Feat #186 - Support for a local llama.cpp HTTP server
  • Feat #191 - Add Google model: gemini-1.5-pro-exp-0801
  • Fix #181 - Last selected LLM provider was not persisted anymore; fixed by @mydeveloperplanet
  • Feat #181 - Support for multiple projects with different LLM providers & language models
  • Fix #190 - Scroll the output panel to the bottom when new output is added

v0.2.9

26 Jul 17:15

Fix #183: Allows a remote Ollama instance to be used.
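Using a remote instance only requires a reachable base URL (Ollama listens on port 11434 by default). A minimal sketch, with a hypothetical host address, that checks a remote instance through Ollama's standard /api/tags endpoint:

```python
import json
import urllib.request

def ollama_tags_url(base_url: str) -> str:
    # The plugin settings only need the base URL, e.g. http://192.168.1.50:11434
    return base_url.rstrip("/") + "/api/tags"

def list_remote_models(base_url: str, timeout: float = 5.0) -> list[str]:
    # /api/tags lists the models installed on that Ollama instance.
    with urllib.request.urlopen(ollama_tags_url(base_url), timeout=timeout) as resp:
        return [m["name"] for m in json.load(resp).get("models", [])]
```

Note that a remote Ollama server must be started with OLLAMA_HOST bound to a non-loopback interface to accept outside connections; by default it only listens on 127.0.0.1.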

v0.2.8

24 Jul 18:09
71613e8

Support for Exo, which lets you run a local LLM cluster with Llama 3.1 (8B, 70B and 405B) on your Apple Silicon computers.

v0.2.7

23 Jul 17:52
  • Show the context window for downloaded Ollama models
  • Also allow token calculation + "Add full project" for Ollama models 🔥

v0.2.6

22 Jul 09:48
  • Renamed the Gemini LLM provider to Google
  • Increased the Gemini Pro 1.5 context window to 2M
  • Sort LLM providers and model names alphabetically in the combo box
  • Refactored LLM cost calculation

v0.2.5

18 Jul 21:56
803ed14

Feat #171: Support OpenAI GPT-4o mini
Fix #170: Fixed LMStudio communication

v0.2.4

05 Jul 14:17

Feat #164: Include all attached files in response output reference
Feat #166: Improve code inclusion for chat context

v0.2.3

05 Jul 07:11
b40c494

Feat #148: Create custom commands
Feat #157: Calculate tokens for a directory
Fix #153: Use the "Copy Project" settings when using "Add Directory to Context Window"
Feat #159: Introduce a variable TokenCalculator based on the selected LLM provider
Feat #161: Move predefined commands to custom commands
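A rough illustration of what per-directory token counting involves: walking the tree and summing per-file estimates. The characters-per-token heuristic below is an assumption for illustration only; a provider-specific TokenCalculator uses the provider's actual tokenizer and returns different, exact counts.

```python
import os

def estimate_tokens(text: str, chars_per_token: int = 4) -> int:
    # Heuristic: roughly 4 characters per token for English-like text.
    # Provider tokenizers (OpenAI, Google, etc.) will disagree on exact counts.
    if not text:
        return 0
    return max(1, len(text) // chars_per_token)

def estimate_directory_tokens(root: str, extensions=(".java", ".kt", ".md")) -> int:
    # Sum estimates over all matching files under root, recursively.
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(extensions):
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    total += estimate_tokens(f.read())
    return total
```

Making the calculator vary with the selected LLM provider matters because the same directory can fit in one model's context window but overflow another's.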