AI Station Server, latest commit `2010c4e84c`: feat: PDF RAG support with glm-4.6:cloud integration (2025-12-26 21:52:00 +01:00)

- PyMuPDF PDF text extraction
- Qdrant vector store with semantic search
- glm-4.6:cloud model for intelligent document analysis
- Chainlit UI with streaming responses
- PDF chunking system with overlap
- Document indexing and retrieval
| File | Last commit | Date |
| --- | --- | --- |
| .chainlit | First save after the restore | 2025-12-25 15:54:33 +01:00 |
| .venv | First save after the restore | 2025-12-25 15:54:33 +01:00 |
| `__pycache__` | stage1 | 2025-12-26 16:19:12 +01:00 |
| workspaces/admin | modify app.py | 2025-12-25 19:00:13 +01:00 |
| .gitignore | postgres version connect | 2025-12-26 13:14:27 +01:00 |
| Dockerfile | pre OAuth2 | 2025-12-26 17:48:51 +01:00 |
| Marimo_Multi-User_Hub.md | First save after the restore | 2025-12-25 15:54:33 +01:00 |
| PROMPT_V2.md | First save after the restore | 2025-12-25 15:54:33 +01:00 |
| README.md | feat: initial AI Station setup with Chainlit, Qdrant, PostgreSQL and RAG | 2025-12-26 17:51:59 +01:00 |
| SPEC.md | First save after the restore | 2025-12-25 15:54:33 +01:00 |
| app-final.py | feat: PDF RAG support with glm-4.6:cloud integration | 2025-12-26 21:52:00 +01:00 |
| app.py | feat: PDF RAG support with glm-4.6:cloud integration | 2025-12-26 21:52:00 +01:00 |
| chainlit.md | First save after the restore | 2025-12-25 15:54:33 +01:00 |
| debugchainlit-app.txt | feat: PDF support with RAG integration | 2025-12-26 20:38:43 +01:00 |
| debugchanlit-app.txt | feat: PDF support with RAG integration | 2025-12-26 20:38:43 +01:00 |
| docker-compose.yml | pre OAuth2 | 2025-12-26 17:48:51 +01:00 |
| docker.logs | pre OAuth2 | 2025-12-26 17:48:51 +01:00 |
| dockerignore | pre OAuth2 | 2025-12-26 17:48:51 +01:00 |
| error.log | Fix: RAG implementation and connection fix | 2025-12-26 08:45:40 +01:00 |
| init_db.py | pre OAuth2 | 2025-12-26 17:48:51 +01:00 |
| requirements.txt | feat: PDF RAG support with glm-4.6:cloud integration | 2025-12-26 21:52:00 +01:00 |

README.md


# AI Station - Multi-User AI Hub

Dockerized AI platform with RAG (Retrieval-Augmented Generation) for family and professional use.

## Tech Stack

- Frontend/UI: Chainlit 1.3.2
- Vector DB: Qdrant
- Database: PostgreSQL 15
- AI Engine: Ollama (qwen2.5-coder:7b) on an RTX A1000
- Reverse Proxy: Nginx Proxy Manager
- SSL: Wildcard *.dffm.it
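The services above are wired together with Docker Compose. The sketch below shows what the repo's `docker-compose.yml` might roughly contain; service names, ports, volumes, and credentials here are illustrative assumptions, not the actual file:

```yaml
services:
  chainlit-app:
    build: .
    ports:
      - "8000:8000"            # exposed to Nginx Proxy Manager
    environment:
      OLLAMA_HOST: http://192.168.1.243:11434   # remote GPU server
    depends_on:
      - postgres
      - qdrant

  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: changeme   # set via .env in practice
    volumes:
      - pgdata:/var/lib/postgresql/data

  qdrant:
    image: qdrant/qdrant
    volumes:
      - qdrant_storage:/qdrant/storage

volumes:
  pgdata:
  qdrant_storage:
```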

## Architecture

```text
Internet → pfSense (192.168.1.254)
        ↓
Nginx Proxy (192.168.1.252) → https://ai.dffm.it
        ↓
AI-SRV (192.168.1.244:8000) → Docker Compose
        ├── Chainlit App
        ├── PostgreSQL
        └── Qdrant
        ↓
AI-GPU (192.168.1.243:11434) → Ollama + RTX A1000
```
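In this layout, the Chainlit app on AI-SRV reaches Ollama on AI-GPU over plain HTTP. A minimal sketch of that call path against Ollama's `/api/generate` endpoint, using only the standard library (the helper names are illustrative; host and model come from this README):

```python
import json
import urllib.request

# AI-GPU host from the architecture diagram above
OLLAMA_URL = "http://192.168.1.243:11434/api/generate"


def build_request(prompt: str, model: str = "qwen2.5-coder:7b",
                  stream: bool = True) -> dict:
    """Assemble the JSON payload the generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": stream}


def generate(prompt: str):
    """Stream completion tokens from Ollama.

    With stream=True, each line of the HTTP response body is a separate
    JSON object carrying a "response" fragment and a "done" flag.
    """
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            chunk = json.loads(line)
            yield chunk.get("response", "")
            if chunk.get("done"):
                break
```

In the actual app, each yielded fragment would be forwarded to Chainlit's streaming UI as it arrives.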

## Quick Start

```bash
# Clone the repository
git clone https://github.com/TUO_USERNAME/ai-station.git
cd ai-station

# Configure environment variables
cp .env.example .env
nano .env

# Start the stack
docker compose up -d

# Check the logs
docker compose logs -f chainlit-app
```

## Access

## Current Features

- AI chat with streaming responses
- RAG with .txt document upload
- Automatic indexing into Qdrant
- WebSocket support
- Remote SSL access
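The upload path splits each document into overlapping chunks before indexing into Qdrant, so a sentence cut at a chunk boundary still appears whole in at least one chunk. A minimal sketch of the chunking step (the function name and default sizes are illustrative, not taken from the repo):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size character chunks that overlap.

    Each chunk starts (chunk_size - overlap) characters after the
    previous one, so the last `overlap` characters are repeated.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks


# Example: 10 characters, chunks of 4 with overlap 2
print(chunk_text("abcdefghij", chunk_size=4, overlap=2))
# → ['abcd', 'cdef', 'efgh', 'ghij', 'ij']
```

Each chunk would then be embedded and upserted into a Qdrant collection, which is what powers the semantic search at query time.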

## Roadmap

- PDF support for tax documents
- Multi-user OAuth2
- Custom UIs per profile (business/engineering/architecture/admin)
- Google Gemini integration
- Conversation persistence

## Requirements

- Docker & Docker Compose
- 8 GB RAM minimum (16 GB recommended)
- Remote Ollama server with GPU

## License

MIT