A secure, open-source long-term memory solution for GPT Store builders and AI developers. Just mention @rememberall
in your custom GPT to unlock persistent memory across conversations.
- Install from the GPT Store
- Add the API schema at `./api/openapi.yaml` to your custom GPT's configuration
- Your GPT now has persistent memory!
- Store important information from conversations
- Retrieve relevant memories when needed
- Integrate seamlessly with the conversation flow
- Manage memories through an intuitive dashboard
```text
// Example usage in your GPT's system prompt
When the user mentions past conversations, use @rememberall to recall context:

User: "What did we discuss about authentication last week?"
Assistant: Let me check @rememberall
Assistant: According to our previous discussion, we implemented JWT-based auth...
```
- 🔒 Privacy-First: Self-host your memory store
- 🚀 Vector-Based: Efficient semantic search
- 🔗 Easy Integration: Simple REST API
- 📦 Open Source: Customize and extend as needed
- Deploy using Docker Compose:

```bash
git clone https://github.com/yourusername/rememberall.git
cd rememberall/deploy
docker-compose up -d
```
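The compose file shipped in `deploy/` is the source of truth; as a rough sketch only, a self-hosted setup might look something like this (service names, image tags, ports, and the vector store choice are illustrative assumptions, not the project's actual configuration):

```yaml
# Illustrative sketch -- the real configuration ships in deploy/docker-compose.yml
services:
  api:
    image: rememberall/api:latest   # assumed image name
    ports:
      - "8080:8080"
    environment:
      - JWT_SECRET=change-me        # used to sign the Bearer tokens below
  vector-store:
    image: qdrant/qdrant:latest     # assumed vector database
    volumes:
      - ./data:/qdrant/storage      # persist memories on the host
```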
- Available API Endpoints:

```http
# List or search memories
GET /memories?search=query&limit=10&offset=0

# Create a new memory
POST /memory
{
  "memory": "Your memory text here"
}
```
- Authentication:
  - All endpoints require Bearer token authentication
  - Include the token in request headers:

```http
Authorization: Bearer your-jwt-token
```
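Putting the endpoints and authentication together, here is a minimal Python sketch of a client that builds authenticated requests (the base URL and token are placeholders, not values the project defines):

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://your-rememberall-host"  # placeholder host
TOKEN = "your-jwt-token"                    # placeholder JWT

def _request(method, path, params=None, body=None):
    """Build an authenticated request for the Rememberall API."""
    url = BASE_URL + path
    if params:
        url += "?" + urllib.parse.urlencode(params)
    data = json.dumps(body).encode() if body is not None else None
    return urllib.request.Request(url, data=data, method=method, headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    })

# GET /memories?search=query&limit=10&offset=0
search_req = _request("GET", "/memories",
                      params={"search": "authentication", "limit": 10, "offset": 0})

# POST /memory with a JSON body
create_req = _request("POST", "/memory", body={"memory": "Your memory text here"})
# Pass either request to urllib.request.urlopen(...) to send it.
```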
Get Memories:

```json
{
  "success": true,
  "memories": [
    {
      "id": "mem_123",
      "memory": "Discussion about authentication systems"
    }
  ]
}
```

Create Memory:

```json
{
  "success": true,
  "memory": {
    "id": "mem_124",
    "memory": "New project requirements discussion"
  }
}
```
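A GPT typically needs search results flattened into plain text before quoting them back to the user. A small helper, assuming only the `success`/`memories` response shape shown above:

```python
def format_memories(response):
    """Flatten a GET /memories response into a bullet list for the GPT to quote."""
    if not response.get("success") or not response.get("memories"):
        return "No memories found."
    return "\n".join(f"- {m['memory']}" for m in response["memories"])

# Using the example response from above:
example = {
    "success": True,
    "memories": [
        {"id": "mem_123", "memory": "Discussion about authentication systems"},
    ],
}
print(format_memories(example))  # -> - Discussion about authentication systems
```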
- End-to-end encryption
- Self-hosted vector storage
- Fine-grained access control
- GDPR-compliant data handling