Transmart - Automate your i18n localization with AI.
English | 简体中文
Transmart is an open-source developer tool that uses ChatGPT to automate i18n translation. Given a base language and a list of target languages, it generates all the i18n locale files for you.
It consists of two parts: the CLI and the Core. Core is the Node.js implementation of Transmart, while the CLI is a command-line tool that wraps Core. In most cases you only need the CLI.
This project is under active development. PRs are welcome; you can reach me on Twitter.
- Supports large files; no need to worry about the 4096-token limit
- Supports all languages that Intl.DisplayNames can display and ChatGPT can process
- Supports overriding AI-translated values
- Supports i18next
- Supports vue-i18n
- Supports Chrome.i18n
- Supports Glob namespace matching
- Supports customizing the OpenAI model and API endpoint
- Supports custom locale file structure
- Supports iOS
- Supports Android
Transmart requires Node.js version 13 or higher.
To install Transmart, run:
```bash
npm install @transmart/cli -D
# or
yarn add @transmart/cli
```
First, create a transmart.config.js file in the root of your project (or any other file format that cosmiconfig can pick up).
transmart.config.js
```js
module.exports = {
  baseLocale: 'en',
  locales: ['fr', 'ja', 'de', 'zh-CN'],
  localePath: 'public/locales',
  openAIApiKey: 'your-own-openai-api-key',
  overrides: {
    'zh-CN': {
      common: {
        create_app: 'Create my Application',
      },
    },
  },
}
```
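With the configuration above, and assuming the default i18next-style layout where each namespace is a separate JSON file under its locale folder, the locale directory would look roughly like this (file names are illustrative):

```
public/locales/
├── en/          # base locale, maintained by you
│   └── common.json
├── fr/          # generated by Transmart
│   └── common.json
├── ja/
│   └── common.json
├── de/
│   └── common.json
└── zh-CN/
    └── common.json
```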
All Options Reference
Add a transmart command to your npm scripts in package.json:
```json
{
  "scripts": {
    "translate": "transmart"
  }
}
```
Then run:

```bash
npm run translate
```

Or run it directly with npx:

```bash
npx transmart
```
If you are not satisfied with the result of an AI translation, use the overrides option to partially overwrite the generated JSON.
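For instance, with the overrides block from the configuration above, the generated zh-CN/common.json keeps the AI translations for every other key but uses your value for create_app (the second key below is purely illustrative):

```json
{
  "create_app": "Create my Application",
  "welcome_message": "…AI-translated value…"
}
```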
🎉🎉 Enjoy i18n
| Name | Type | Description | Required |
| --- | --- | --- | --- |
| baseLocale | `string` | The language that Transmart uses as the translation reference. | Yes |
| locales | `string[]` | All languages that need to be translated. | Yes |
| localePath | `string` | Where your locale files are stored. | Yes |
| openAIApiKey | `string` | Your OpenAI API key. | Yes |
| context | `string` | Extra context to give the model for a more accurate translation. | No |
| openAIApiModel | `string` | OpenAI API model. Defaults to `gpt-3.5-turbo-16k-0613`. | No |
| overrides | `Record<string, Record<string, Record<string, any>>>` | Overwrites parts of the generated JSON when you are not satisfied with the AI translation (locale → namespace → key: value). | No |
| namespaceGlob | `string \| string[]` | Glob for the namespace(s) to process; useful to include or exclude certain files. Learn more about glob. | No |
| openAIApiUrl | `string` | Optional base URL of the OpenAI API; useful with a proxy. | No |
| openAIApiUrlPath | `string` | Optional URL path of the OpenAI API endpoint; useful with a proxy. | No |
| modelContextLimit | `number` | Optional maximum context window the model supports. For example, `gpt-4-32k` has a 32768-token context. Defaults to 4096 (gpt-3.5-turbo). | No |
| modelContextSplit | `number` | Optional ratio of input tokens to output tokens. For example, if the input language is English and the output is Spanish, 1 input token may produce 2 output tokens; in that case set this to `1/2`. Defaults to `1/1`. | No |
| systemPromptTemplate | `function` | (For advanced usage) Custom prompt template. See `translate.ts` for the default prompt. | No |
| additionalReqBodyParams | `any` | (For advanced usage) Extra parameters passed into the request body. Useful with a self-hosted model when you want to customize model parameters; see the llama.cpp server example. | No |
| singleFileMode | `boolean` | Uses a single file per locale for all namespaces. For example, if you have a single `en.json` and want it translated to `zh.json`, set `singleFileMode` to `true`. In this mode the namespace is ignored and set to `app`. | No |
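As a rough sketch, a configuration that exercises several of the optional fields above could look like the following. Every value is illustrative, and the proxy endpoint and glob patterns are assumptions rather than recommended settings:

```js
// transmart.config.js
module.exports = {
  baseLocale: 'en',
  locales: ['fr', 'ja', 'de'],
  localePath: 'public/locales',
  openAIApiKey: process.env.OPENAI_API_KEY,

  // Give the model extra context for more accurate translations
  context: 'Transmart is a developer tool; keep technical terms in English.',

  // Use a larger-context model and tell Transmart about its limits
  openAIApiModel: 'gpt-4-32k',
  modelContextLimit: 32768,
  // Expect roughly 2 output tokens per input token
  modelContextSplit: 1 / 2,

  // Only process namespaces matching these patterns (semantics follow the glob library)
  namespaceGlob: ['common*', 'home*'],

  // Route requests through a proxy or an OpenAI-compatible endpoint
  openAIApiUrl: 'https://my-openai-proxy.example.com',
  openAIApiUrlPath: '/v1/chat/completions',
}
```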
To contribute to Transmart, refer to contributing.md.