LLM: RAG-streamlit-llamaindex-ollama


Source: https://lightning.ai/maxidiazbattan/studios/rag-streamlit-llamaindex-ollama?tab=overview

Install the supporting packages:

sudo su
apt install curl net-tools

Download and install Ollama:

curl -fsSL https://ollama.com/install.sh | sh

As a regular user, start the Ollama server, then pull the model in a separate terminal:

ollama serve
ollama pull llama3
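
The pulled llama3 model will later be accessed by the Python app through LlamaIndex's Ollama integration, which talks to the local server at http://localhost:11434 by default. A minimal connectivity check, assuming the llama-index-llms-ollama package is installed in the virtual environment created below, could look like this:

# Quick sanity check: request a completion from the local Ollama server via LlamaIndex.
# Assumption: llama-index-llms-ollama is installed (it is not part of the steps above).
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama3", request_timeout=120.0)  # default endpoint: http://localhost:11434
print(llm.complete("Say hello in one short sentence."))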

Create and activate a Python virtual environment:

sudo apt install python3.12-venv
python3 -m venv venv
source venv/bin/activate


Download app.py & requirements.txt from

https://lightning.ai/maxidiazbattan/studios/rag-streamlit-llamaindex-ollama?tab=files&layout=column&path=cloudspaces%2F01hwfjrdss66hkje94vb7enf15&y=2&x=0

and copy-paste the contents into the files app.py and requirements.txt.
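
The downloaded app.py is the Streamlit front end that wires LlamaIndex to the local Ollama model. The exact file from the Lightning AI studio may differ, but a minimal sketch of such an app (assuming a local data directory of documents to index and a HuggingFace embedding model; both names are illustrative) looks roughly like this:

# Minimal sketch of a Streamlit + LlamaIndex + Ollama RAG app.
# This is an illustration, not the actual app.py from the studio linked above.
import streamlit as st
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Use the local Ollama server for generation and a local model for embeddings.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

st.title("RAG with Streamlit, LlamaIndex, and Ollama")

@st.cache_resource
def build_index():
    # Index the documents placed in ./data (hypothetical folder name).
    docs = SimpleDirectoryReader("data").load_data()
    return VectorStoreIndex.from_documents(docs)

query_engine = build_index().as_query_engine()

question = st.text_input("Ask a question about your documents")
if question:
    st.write(str(query_engine.query(question)))

After saving both files, install the dependencies from requirements.txt and start the app with Streamlit: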


pip install -r requirements.txt
streamlit run app.py


Open a browser and go to the following address (replace ipaddress with the server's IP address; Streamlit listens on port 8501 by default):

http://ipaddress:8501



References

Interesting Links