feat: initial AI Station setup with Chainlit, Qdrant, PostgreSQL and RAG

DFFM-maker 2025-12-26 17:51:59 +01:00
parent a375a2f76d
commit 11541934a7
1 changed file with 77 additions and 0 deletions

README.md Normal file

@@ -0,0 +1,77 @@
# AI Station - Multi-User AI Hub
A Dockerized AI platform with RAG (Retrieval-Augmented Generation) for family and professional use.
## Tech Stack
- **Frontend/UI**: Chainlit 1.3.2
- **Vector DB**: Qdrant
- **Database**: PostgreSQL 15
- **AI Engine**: Ollama (qwen2.5-coder:7b) on an RTX A1000
- **Reverse Proxy**: Nginx Proxy Manager
- **SSL**: Wildcard *.dffm.it
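A compose file tying these services together might look like the sketch below. This is a hypothetical reconstruction, not the repository's actual `docker-compose.yml`: service names, image tags, credentials, and environment variables are all assumptions.

```yaml
# Hypothetical docker-compose.yml sketch for the stack above.
# All names, tags, and credentials are assumptions, not the repo's real config.
services:
  chainlit-app:
    build: .
    ports:
      - "8000:8000"
    environment:
      OLLAMA_BASE_URL: http://192.168.1.243:11434   # remote GPU host
      QDRANT_URL: http://qdrant:6333
      DATABASE_URL: postgresql://ai:change-me@postgres:5432/ai_station
    depends_on:
      - postgres
      - qdrant

  postgres:
    image: postgres:15
    environment:
      POSTGRES_USER: ai
      POSTGRES_PASSWORD: change-me
      POSTGRES_DB: ai_station
    volumes:
      - pgdata:/var/lib/postgresql/data

  qdrant:
    image: qdrant/qdrant
    volumes:
      - qdrant_data:/qdrant/storage

volumes:
  pgdata:
  qdrant_data:
```

Note that Ollama runs on a separate GPU host (AI-GPU), so it is reached over the network rather than defined as a service here.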
## Architecture
```text
Internet → pfSense (192.168.1.254)
Nginx Proxy (192.168.1.252) → https://ai.dffm.it
AI-SRV (192.168.1.244:8000) → Docker Compose
  ├── Chainlit App
  ├── PostgreSQL
  └── Qdrant
AI-GPU (192.168.1.243:11434) → Ollama + RTX A1000
```
## Quick Start
```shell
# Clone the repository
git clone https://github.com/TUO_USERNAME/ai-station.git
cd ai-station

# Configure environment variables
cp .env.example .env
nano .env

# Start the stack
docker compose up -d

# Check the logs
docker compose logs -f chainlit-app
```
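The `.env.example` file itself is not shown in this commit; a plausible sketch follows, where every variable name and value is an assumption rather than the repository's actual contents.

```shell
# Hypothetical .env sketch — variable names and defaults are assumptions.
OLLAMA_BASE_URL=http://192.168.1.243:11434
OLLAMA_MODEL=qwen2.5-coder:7b
QDRANT_URL=http://qdrant:6333
POSTGRES_USER=ai
POSTGRES_PASSWORD=change-me
POSTGRES_DB=ai_station
CHAINLIT_PORT=8000
```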
## Access
- **Local**: http://192.168.1.244:8000
- **Remote**: https://ai.dffm.it
## Current Features
✅ AI chat with streaming responses
✅ RAG with .txt document upload
✅ Automatic indexing in Qdrant
✅ WebSocket support
✅ Remote SSL access
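The .txt upload → Qdrant indexing flow implies a chunking step before documents are embedded. A minimal self-contained sketch of such a chunker is below; the function name, chunk size, and overlap are illustrative assumptions, not the app's actual code.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks for embedding and indexing.

    Hypothetical parameters: the real app's chunk size and overlap
    are not specified in this README.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be greater than overlap")
    chunks = []
    step = chunk_size - overlap  # advance less than chunk_size to overlap chunks
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break  # last chunk reached the end of the text
    return chunks
```

Each resulting chunk would then be embedded (e.g. via Ollama) and upserted into a Qdrant collection; overlapping chunks reduce the chance that a relevant sentence is split across a retrieval boundary.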
## Roadmap
- [ ] PDF support for tax documents
- [ ] Multi-user OAuth2
- [ ] Custom UIs per profile (business/engineering/architecture/admin)
- [ ] Google Gemini integration
- [ ] Conversation persistence
## Requirements
- Docker & Docker Compose
- 8 GB RAM minimum (16 GB recommended)
- Remote Ollama server with a GPU
## License
MIT