LLM: RAG-streamlit-llamaindex-ollama
Source: https://lightning.ai/maxidiazbattan/studios/rag-streamlit-llamaindex-ollama?tab=overview
Install supporting packages
sudo su
apt install curl net-tools
Download & install Ollama
curl -fsSL https://ollama.com/install.sh | sh
As a regular user, run the server & download the model
ollama serve
ollama pull llama3
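Before continuing, it can help to confirm that the server and the llama3 model actually respond. The snippet below is only a quick sanity check, not part of the original guide; it assumes Ollama is listening on its default port 11434 and that the requests library is available.

# check_ollama.py - sanity check (assumption: Ollama on its default port 11434)
import requests

r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Reply with one short sentence.", "stream": False},
    timeout=120,
)
r.raise_for_status()
# With "stream": False the server returns one JSON object; the generated text is in "response".
print(r.json()["response"])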
Activate a virtual environment
sudo apt install python3.12-venv
python3 -m venv venv
source venv/bin/activate
Download app.py & requirements.txt from
https://lightning.ai/maxidiazbattan/studios/rag-streamlit-llamaindex-ollama?tab=files&layout=column&path=cloudspaces%2F01hwfjrdss66hkje94vb7enf15&y=2&x=0
Copy-paste the contents into the files app.py & requirements.txt (a rough idea of what such an app contains is sketched below).
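The app.py from the Lightning AI studio above is the authoritative version. The following is only a minimal sketch of a comparable Streamlit + LlamaIndex + Ollama RAG app, assuming a recent llama-index release with the llama-index-llms-ollama and llama-index-embeddings-huggingface integrations installed and documents placed in a local data/ folder; both are assumptions, not taken from the original files.

# sketch_app.py - minimal RAG chat sketch (NOT the original app.py from the studio)
# Assumptions: llama-index >= 0.10 with the Ollama LLM and HuggingFace embedding
# integrations installed, Ollama serving llama3 locally, documents in ./data
import streamlit as st
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Point LlamaIndex at the local Ollama server and a local embedding model.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

@st.cache_resource(show_spinner=False)
def load_index():
    # Read every file in ./data and build an in-memory vector index once per session.
    documents = SimpleDirectoryReader("data").load_data()
    return VectorStoreIndex.from_documents(documents)

st.title("RAG with Streamlit, LlamaIndex and Ollama")

index = load_index()
query_engine = index.as_query_engine()

question = st.chat_input("Ask a question about your documents")
if question:
    with st.chat_message("user"):
        st.write(question)
    with st.chat_message("assistant"):
        response = query_engine.query(question)
        st.write(str(response))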
pip install -r requirements.txt
streamlit run app.py
Browse to
http://ipaddress:8501