Replies: 6 comments 8 replies
-
Dampfinchen suggested making it work without Docker and on Windows soon.
-
Don't you specify the model in the config? Which model do you select in the web UI settings? If you don't select any, the backend stops working. I tried putting the model I have in the config, but it didn't work; other, non-local models have no problems.
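For reference, a hypothetical config.toml fragment for pointing the backend at a local model (the key name LLM_MODEL and the model identifier are assumptions, not confirmed in this thread; they must match whatever your local server actually serves):

```toml
# Hypothetical example — LLM_MODEL as the key name and "local-model" are placeholders
LLM_MODEL="local-model"
LLM_BASE_URL="http://localhost:1234/v1/"
LLM_API_KEY="lm-studio"
```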
-
Hi Flavio, would you help me install it on Windows? I didn't manage to do it; could you please post a short guide?
-
Will be writing up a PowerShell script later today to get this running. Would love feedback on testing it when done. I got OpenDevin up and working with OpenAI on Windows 11 via PowerShell, just installing Python 3.12 and the Node.js LTS release (npm).

One thing I still need to look at is temporarily adding the environment variables from the toml file; for now I was going to set them for the agent and frontend with $env:ENV_VAR_NAME before bringing the frontend and backend up manually. There is also an issue with changing the execution policy, since pnpm is unsigned on Windows, and you also need Windows admin rights to run corepack enable.

Here are the basic steps I did; the PowerShell script will be more detailed (a hedged sketch of these steps follows below):
- Before the git clone of the repo
- In the OpenDevin folder, in opendevin_env
- To run the frontend
- To run the backend with OpenAI
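A minimal PowerShell sketch of those steps, under explicit assumptions: the repo URL, the dependency install command, the frontend start script, the uvicorn module path, and the port are guesses rather than values taken from this thread; adjust them to whatever the project's Makefile and README actually use.

```powershell
# --- Before the git clone of the repo (one-time; corepack enable needs an elevated shell) ---
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser   # pnpm scripts are unsigned on Windows
corepack enable                                                        # requires Windows admin rights
git clone https://github.com/OpenDevin/OpenDevin.git                   # repo URL assumed
Set-Location OpenDevin

# --- In the OpenDevin folder, in opendevin_env (a Python virtual environment) ---
python -m venv opendevin_env
.\opendevin_env\Scripts\Activate.ps1
pip install -r requirements.txt              # or whatever the project's documented install step is

# Temporarily set the variables from config.toml for this session only
$env:LLM_API_KEY  = "sk-..."                 # placeholder key; names mirror config.toml keys used elsewhere in this thread
$env:WORKSPACE_DIR = ".\workspace"

# --- To run the frontend ---
Set-Location frontend
pnpm install
pnpm run start                               # assumed start script name

# --- To run the backend with OpenAI (in a second PowerShell window) ---
Set-Location ..
uvicorn opendevin.server.listen:app --port 3000   # module path and port are assumptions
```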
-
Can the Docker installation method work with PowerShell instead of WSL? Docker is accessible through Chocolatey already.
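As a rough sketch of that route (the image name and ports below are placeholders, not the project's actual Docker invocation):

```powershell
# Hypothetical: install Docker Desktop via Chocolatey (elevated PowerShell).
# Note that Docker Desktop on Windows still runs containers through the WSL 2
# (or Hyper-V) backend, even though you interact with it purely from PowerShell.
choco install docker-desktop -y

# Placeholder image name and ports — substitute the values from the project's README.
$image = "opendevin/opendevin:placeholder"
docker run -it --rm -p 3000:3000 -v "${PWD}\workspace:/workspace" $image
```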
-
Original thread issue #637
Summary
I was able to have a running version of OpenDevin on Windows 10 (not WSL) before the change to the Makefile. I switched to WSL since I thought I had no other choice, but whatever I did ended up like issue #561 (ws://localhost:3001/ws).

Chocolatey, the Windows package manager that comes with the Node.js installer, has a "make" package (https://community.chocolatey.org/packages/make), so I tried to set it up directly on Windows 10 (which would be better than WSL imho). "uvloop" does not work on Windows, but I read that fastapi works without it there, so I removed the dependency, ran "make build", and was able to run "make start-backend" + "make start-frontend" without errors. I tried to run it with my old LM Studio config and got it to work.
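A rough PowerShell sketch of that sequence (the repo URL and the exact location of the uvloop dependency are assumptions; the make targets are the ones named above):

```powershell
# Hypothetical reproduction of the above on plain Windows 10 (no WSL).
choco install make -y                                   # GNU make via Chocolatey (elevated PowerShell)

git clone https://github.com/OpenDevin/OpenDevin.git    # repo URL assumed
Set-Location OpenDevin

# Remove the "uvloop" entry by hand from wherever the project declares it
# (uvloop does not support Windows; fastapi/uvicorn run without it).

make build
make start-backend     # run in one PowerShell window
make start-frontend    # run in a second PowerShell window
```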
LM Studio:
```
[2024-04-03 01:24:08.701] [INFO] [LM STUDIO SERVER] Supported endpoints:
[2024-04-03 01:24:08.701] [INFO] [LM STUDIO SERVER] -> GET http://localhost:1234/v1/models
[2024-04-03 01:24:08.702] [INFO] [LM STUDIO SERVER] -> POST http://localhost:1234/v1/chat/completions
[2024-04-03 01:24:08.702] [INFO] [LM STUDIO SERVER] -> POST http://localhost:1234/v1/completions
```
config.toml:
```toml
LLM_BASE_URL="http://localhost:1234/v1/"
LLM_API_KEY="lm-studio"
LLM_EMBEDDING_MODEL="local"
WORKSPACE_DIR="./workspace"
```
It interacts with the console too.
Motivation
Using Windows 10 instead of WSL and full integration of LM Studio
Technical Design
Windows 10 + "choco install make" + LM Studio