LLM: RAG contoh
Example: https://www.youtube.com/watch?v=7VAs22LC7WE
python3 -m venv venv
source venv/bin/activate
mkdir pdf
mkdir db
pip3 install langchain_community
pip3 install langchain-text-splitters
pip3 install fastembed
pip3 install pdfplumber
pip3 install chromadb
Edit apps.py
from langchain_community.llms import Ollama

llm = Ollama(model="llama3")
responds = llm.invoke("Why the sky is blue?")
print(responds)
Run
python3 apps.py
Flask
pip3 install flask
Edit apps.py
from flask import Flask
from langchain_community.llms import Ollama

app = Flask(__name__)

llm = Ollama(model="llama3")
responds = llm.invoke("Why the sky is blue?")
print(responds)
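The snippet above only prints an answer at startup. To actually serve the model over HTTP, the Ollama call can be wrapped in a Flask route. The following is a minimal sketch; the route name /ask, the JSON field query, and port 8080 are illustrative choices, not taken from this page or the repository linked below.

from flask import Flask, request, jsonify
from langchain_community.llms import Ollama

app = Flask(__name__)
llm = Ollama(model="llama3")

# Illustrative endpoint: POST a JSON body {"query": "..."} and get the model's answer back
@app.route("/ask", methods=["POST"])
def ask():
    question = request.json.get("query", "")
    answer = llm.invoke(question)
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)

Run it with python3 apps.py and test it with, for example, curl -X POST http://localhost:8080/ask -H "Content-Type: application/json" -d '{"query": "Why is the sky blue?"}'.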
More details can be found at
https://github.com/ThomasJay/RAG/blob/main/app.py
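That app.py ties together the other packages installed above (pdfplumber, fastembed, chromadb). As a rough orientation, here is a minimal sketch of that retrieval side, assuming a PDF has been copied into the pdf/ directory created earlier; the file name, chunk sizes, and prompt wording are illustrative, not taken from the repository.

# Minimal RAG sketch using the packages installed above.
# The PDF file name, chunk sizes, and prompt wording are illustrative only.
from langchain_community.document_loaders import PDFPlumberLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.embeddings.fastembed import FastEmbedEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_community.llms import Ollama

# 1. Load a PDF from the pdf/ folder and split it into chunks
docs = PDFPlumberLoader("pdf/contoh.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1024, chunk_overlap=80).split_documents(docs)

# 2. Embed the chunks with fastembed and store them in Chroma (persisted in db/)
vector_store = Chroma.from_documents(
    documents=chunks,
    embedding=FastEmbedEmbeddings(),
    persist_directory="./db",
)

# 3. Retrieve the chunks most similar to the question
question = "Why is the sky blue?"
retrieved = vector_store.as_retriever(search_kwargs={"k": 3}).invoke(question)

# 4. Ask llama3 via Ollama, with the retrieved text as context
context = "\n\n".join(d.page_content for d in retrieved)
llm = Ollama(model="llama3")
print(llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}"))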