AI installation playbook for internal use
DGXP (DGX Playbook) by Armin Fimberger
WEBSERVICE Setup
LLM Setup (OpenWebUI, Ollama, RAG)
Version/date: v0.7, 12.2.2026
Goals
This system is a locally operated Retrieval-Augmented Generation (RAG) platform designed for controlled, source-based AI-assisted analysis.
It combines vector-based document search with large language models to generate answers that are strictly derived from indexed source material.
The system emphasizes traceability, deterministic search logic, and a clear separation between legal/normative sources and internal policy documents.
In exceptional cases, when no local source is found, the system may generate an answer from its own model knowledge, but it must clearly label that answer as such. An unsourced answer is intended only as a pointer and must be verified manually; if it proves correct, the relevant standard can then be imported into the system.
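The labeling rule above could be sketched as a small wrapper around answer generation. This is a minimal sketch, not the actual OpenWebUI mechanism; the function name and the label text are assumptions for illustration.

```python
# Sketch: label answers that are not backed by retrieved local sources.
# The label text is an illustrative assumption, not a fixed convention.

UNSOURCED_LABEL = "[UNSOURCED - model knowledge only, verify manually]"

def label_answer(answer: str, retrieved_chunks: list[str]) -> str:
    """Prefix the answer with a warning label when no local source was found."""
    if not retrieved_chunks:
        return f"{UNSOURCED_LABEL}\n{answer}"
    return answer
```

A downstream reviewer can then filter on the label to find answers that still need a standard imported into the index.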
The system should include all European and German standards that are relevant to data protection law.
The system does not send data to third parties or to an external cloud at any time.
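The separation between legal/normative sources and internal policy documents could be reflected at retrieval time by re-weighting similarity scores per source class. The weights and the chunk structure below are illustrative assumptions, not values taken from this playbook.

```python
# Sketch: boost legal/normative sources over internal policy documents
# when ranking retrieved chunks. Weights are illustrative assumptions.

SOURCE_WEIGHTS = {
    "law": 1.0,      # European/German standards, normative texts
    "policy": 0.8,   # internal policy documents
}

def rank_chunks(chunks: list[dict]) -> list[dict]:
    """Sort chunks by similarity score scaled by a per-class weight.

    Each chunk is assumed to look like:
    {"text": "...", "score": 0.91, "class": "law"}
    """
    return sorted(
        chunks,
        key=lambda c: c["score"] * SOURCE_WEIGHTS.get(c["class"], 0.5),
        reverse=True,
    )
```

With these weights, a legal chunk scoring 0.85 outranks a policy chunk scoring 0.90, which keeps normative sources ahead of internal guidance in borderline cases.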
Goal of this playbook
- Installation of the web interface/server
- Installation of a container that can receive requests and documents via a web interface
- Separation into distinct modules to simplify later distribution
- Demonstrating vectorization of documents
- Demonstrating the challenge of splitting documents and the resulting chunk size
- Demonstrating the weighting of laws
- Demonstrating citation examples based on internal guidelines
- Demonstrating clear rules for the AI when generating responses
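The chunking challenge listed above can be sketched as a simple overlapping splitter that runs before vectorization. The chunk size and overlap values are illustrative assumptions and would need tuning against the indexed documents.

```python
# Sketch: split a document into overlapping chunks before vectorization.
# CHUNK_SIZE and OVERLAP are illustrative assumptions, not tuned values.

CHUNK_SIZE = 500   # characters per chunk
OVERLAP = 100      # characters shared between consecutive chunks

def chunk_text(text: str, size: int = CHUNK_SIZE, overlap: int = OVERLAP) -> list[str]:
    """Return overlapping character-based chunks covering the whole text."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]
```

The overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk; larger chunks improve context but dilute the vector, which is exactly the trade-off this playbook demonstrates.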
No performance benchmarks or speed comparisons are carried out at this stage. In a subsequent step, it should also be possible to upload guidelines whose content can be checked for compliance by the RAG system.