
LLM Language Tool 🌐

A lightweight, browser-based text processing tool powered by Large Language Models (LLMs). Process, translate, and analyze text directly in your browser without any backend or server dependencies.

Click here to see an example translation of this blog post to French.

🚀 Try it out!

Visit the public instance hosted on Cloudflare Pages:

https://langtool.pluja.dev

Add your API key for any OpenAI-compatible API (Ollama, ppq.ai, nano-gpt.com, llama.cpp, OpenAI, ...).

Screenshot of the User Interface

✨ Features

  • Text Processing

    • Process text directly from input or fetch from URLs
    • Translation between multiple languages
    • Text summarization and correction
    • Translation explanations and summaries
  • Privacy & Flexibility

    • 100% client-side processing
    • Configure your own LLM endpoints and API keys
    • Data Privacy & Security:
      • All operations occur client-side - your API keys and queries never leave your browser
      • Queries are sent directly to your preferred LLM provider (supports self-hosted models with OpenAI-compatible backends; a request sketch follows this feature list)
      • Sharing functionality uses PocketJSON (optional, can be self-hosted)
  • Sharing

    • Share your results with a clean, distraction-free view
      • Generate shareable URLs for your processed content
      • Recipients see a simplified interface without tooling
      • Easy switch between simple view and full application
      • Base64-encoded content for sharing
    • Share configurations across devices via URL
      • Share language preferences and API configurations
      • Export/Import settings using a URL
      • Base64-encoded settings carried in the URL (encoded, not encrypted; share only with people you trust)
  • Developer Friendly

    • Pure JavaScript and HTML implementation
    • Zero runtime dependencies
    • Modern UI with Tailwind CSS
    • URL parameter support for automation
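
The privacy model above comes down to a single pattern: the browser itself talks to the model provider, so the API key and the text never pass through any intermediate server. Below is a minimal sketch of that pattern against a standard OpenAI-compatible chat completions endpoint; the function name and configuration fields are illustrative assumptions, not the project's actual code.

    // Minimal sketch (not the project's actual code) of a fully client-side request
    // to an OpenAI-compatible chat completions endpoint. `endpoint` is assumed to
    // already include the API base path (e.g. https://api.openai.com/v1).
    async function translateClientSide(text, targetLanguage, { endpoint, apiKey, model }) {
      const response = await fetch(`${endpoint}/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": `Bearer ${apiKey}`,
        },
        body: JSON.stringify({
          model,
          messages: [
            { role: "system", content: `Translate the user's text into ${targetLanguage}.` },
            { role: "user", content: text },
          ],
        }),
      });
      if (!response.ok) throw new Error(`LLM request failed: ${response.status}`);
      const data = await response.json();
      return data.choices[0].message.content; // key and text never touch any other server
    }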

Self-Hosting / Local Development

  1. Clone the repository:

    git clone https://github.com/pluja/llm-language-tool.git
  2. Navigate to the project directory:

    cd llm-language-tool
  3. Start a local server:

    # Using Python (recommended)
    python3 -m http.server 8000
    
    # Or using Docker
    docker run --rm -p 8000:8000 -v $(pwd):/srv --workdir /srv python:3-alpine python -m http.server 8000 --bind 0.0.0.0
    
    # Or using Docker Compose
    docker compose up -d
  4. Open your browser and visit http://localhost:8000

🔧 Configuration

The tool supports various configuration options through the UI:

  • LLM endpoint configuration
  • API key management
  • Language model selection
  • Language preferences
  • UI customization

Sharing Settings

You can easily share your configuration with other devices or users:

  1. Click the Settings button (⚙️)
  2. Click "Share Settings" in the modal
  3. A URL containing your encoded settings will be copied to your clipboard
  4. Share this URL or save it for later use
  5. When opened, the URL will automatically import all settings you configured

The shared configuration includes:

  • API endpoint
  • API key
  • Model ID
  • Custom language list
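
Under the hood this is a Base64 round-trip of the settings object carried in a URL parameter. The sketch below illustrates the idea, assuming the config parameter documented under URL Parameters; the helper names and the exact settings shape are hypothetical, not the project's actual code.

    // Hypothetical sketch: pack settings into a shareable URL and read them back.
    // Only the `config` parameter name comes from the documented URL parameters.
    function buildSettingsShareUrl(settings) {
      // btoa handles Latin-1 only; non-ASCII values (e.g. custom language names)
      // would need a UTF-8-safe encoder. Base64 is encoding, not encryption.
      const encoded = btoa(JSON.stringify(settings));
      return `https://langtool.pluja.dev?config=${encodeURIComponent(encoded)}`;
    }

    function readSettingsFromUrl() {
      const encoded = new URLSearchParams(window.location.search).get("config");
      return encoded ? JSON.parse(atob(encoded)) : null;
    }

Because the settings include your API key, anyone with the URL can decode it; treat a settings link like the key itself.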

Sharing Results

Share your processed content with others:

  1. Process any text using the available tools
  2. Click the "Share" button on the result
  3. A URL will be copied to your clipboard
  4. Recipients will see a clean, simplified view of the content
  5. They can access the full application via the "View Full App" button
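
The result view follows the same principle, carrying the Base64-encoded content in the URL fragment. A minimal sketch, assuming the #<base64-content>?share format listed under URL Parameters below; the helper name is hypothetical:

    // Hypothetical sketch of reading a shared result from the URL fragment.
    function readSharedResult() {
      const hash = window.location.hash.slice(1);      // drop the leading "#"
      if (!hash.endsWith("?share")) return null;       // not a share link
      const encoded = hash.slice(0, -"?share".length);
      return atob(encoded);                            // Base64-decoded result text
    }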

🔍 URL Parameters

Automate tasks using URL parameters:

https://langtool.pluja.dev?content=https://example.com&task=translate&output=spanish

Supported parameters:

  • content: Text or URL to process
  • task: Operation to perform (translate or summarize)
  • output: Target language (only used with the translate task)
  • config: Base64 encoded configuration (automatically generated via Share Settings)
  • #<base64-content>?share: Share view with encoded content (automatically generated via Share button)
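
For scripts or bookmarks, such URLs can be assembled with standard URL utilities instead of by hand. A short sketch using only the parameters listed above; URLSearchParams takes care of the encoding:

    const params = new URLSearchParams({
      content: "https://example.com",  // text or URL to process
      task: "translate",               // "translate" or "summarize"
      output: "spanish",               // target language (translate task only)
    });
    console.log(`https://langtool.pluja.dev?${params.toString()}`);
    // -> https://langtool.pluja.dev?content=https%3A%2F%2Fexample.com&task=translate&output=spanish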

🛠️ Technical Stack

  • Frontend: Vanilla JavaScript and HTML
  • Styling: Tailwind CSS with plugins

👥 Contributing

Contributions are welcome! This project was initially created with the help of LLMs and has plenty of room for improvement and new features.

To contribute:

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Open a Pull Request

💡 Inspiration

This project draws inspiration from:

📄 License

MIT
