Speaker: Nils Seipel
Notes: Christian Krippes
Nils provided his slides. Make sure to look into them, because not all graphics and code examples are included in these notes. The slides are in German.
Game-changer: LLaMA (Large Language Model Meta AI)
This Ars Technica article describes the development and background story in more detail.
For code examples, please take a look at the slides in the PDF document. Running an LLM locally is as easy as:
pip install llama-cpp-python
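A minimal sketch of local inference with llama-cpp-python; the model path is an assumption (you have to download a GGUF model file yourself), and the function falls back to a stub when no model or library is available:

```python
def run_local(prompt: str) -> str:
    """Run a prompt against a local model via llama-cpp-python.

    Falls back to a stub string if the library or the (hypothetical)
    model file ./models/model.gguf is not available.
    """
    try:
        from llama_cpp import Llama  # installed via: pip install llama-cpp-python
        llm = Llama(model_path="./models/model.gguf", verbose=False)
        out = llm(prompt, max_tokens=32)
        # The completion text sits in the first choice of the response dict.
        return out["choices"][0]["text"]
    except Exception:
        return "<no local model available>"

print(run_local("Hello"))
```

Everything runs on your own machine; no data leaves it, which is exactly what made llama.cpp-style local inference a game-changer.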
It is important to look up the prompt format (the "talking" behaviour) the model was trained on. The model expects input in exactly that format; otherwise, you'll likely get weird results.
For example, a model might have been trained with the following prompt format:
PROMPT: Blablabla ASSISTANT:
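Wrapping user input in the expected template can be sketched as a small helper (the PROMPT/ASSISTANT tags follow the example above; real models each define their own tags):

```python
def build_prompt(user_input: str) -> str:
    """Wrap user input in the PROMPT/ASSISTANT template the model expects."""
    return f"PROMPT: {user_input} ASSISTANT:"

print(build_prompt("Blablabla"))
# → PROMPT: Blablabla ASSISTANT:
```

The model then continues the text after "ASSISTANT:", which is why a mismatched template produces the weird results mentioned above.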
Your own data sources can be queried via chatbot. This is possible through so-called "embeddings". See https://github.com/Appointat/Chat-with-Document-s-using-ChatGPT-API-and-Text-Embedding
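The core idea of embeddings can be sketched with toy vectors: each document chunk and the user's question are mapped to vectors, and the chunk closest to the question (by cosine similarity) is stuffed into the prompt as context. The vectors below are made up for illustration; in practice an embedding model produces them:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical pre-computed embeddings for two document chunks.
chunks = {
    "Chunk about cats": [0.9, 0.1, 0.0],
    "Chunk about llamas": [0.1, 0.8, 0.3],
}
query_vec = [0.2, 0.7, 0.4]  # hypothetical embedding of the user's question

# Retrieve the chunk most similar to the question; it would then be
# prepended to the prompt so the chatbot can answer from your own data.
best = max(chunks, key=lambda name: cosine(chunks[name], query_vec))
print(best)
# → Chunk about llamas
```

The linked repository does the same thing at scale, using the ChatGPT API and a real text-embedding model instead of hand-written vectors.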