LLM: RAG contoh
Example: https://www.youtube.com/watch?v=7VAs22LC7WE
python3 -m venv venv
source venv/bin/activate

mkdir pdf
mkdir db

pip3 install langchain_community
pip3 install langchain-text-splitters
pip3 install fastembed
pip3 install pdfplumber
pip3 install chromadb
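The examples below also assume that Ollama is installed and running locally, and that the llama3 model has already been downloaded, for example with:

ollama pull llama3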
Edit apps.py
from langchain_community.llms import Ollama

# Connect to the locally running Ollama server and use the llama3 model
llm = Ollama(model="llama3")

# Send a prompt and print the model's reply
response = llm.invoke("Why is the sky blue?")
print(response)
Run
python3 apps.py
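The extra packages installed above (pdfplumber, langchain-text-splitters, fastembed, chromadb) and the pdf/ and db/ directories are not used by this basic example; they are for the RAG part: loading PDFs, splitting them into chunks, embedding the chunks, and storing them in a Chroma vector database. Below is a minimal sketch of how they could be wired together, assuming a PDF named sample.pdf has been placed in pdf/; the chunk sizes and the k value are chosen purely for illustration.

from langchain_community.document_loaders import PDFPlumberLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.embeddings import FastEmbedEmbeddings
from langchain_community.vectorstores import Chroma

# Load a PDF from the pdf/ directory (the file name is only an example)
docs = PDFPlumberLoader("pdf/sample.pdf").load()

# Split the pages into overlapping chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=1024, chunk_overlap=80)
chunks = splitter.split_documents(docs)

# Embed the chunks with FastEmbed and persist the Chroma index in db/
vector_store = Chroma.from_documents(
    documents=chunks,
    embedding=FastEmbedEmbeddings(),
    persist_directory="db",
)

# Quick check: print the chunks most similar to a question
for doc in vector_store.similarity_search("Why is the sky blue?", k=3):
    print(doc.page_content[:200])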
===Flask===
pip3 install flask
Edit apps.py
from flask import Flask
from langchain_community.llms import Ollama

# Create the Flask application
app = Flask(__name__)

# Connect to the local Ollama server
llm = Ollama(model="llama3")

response = llm.invoke("Why is the sky blue?")
print(response)
More details can be found at
https://github.com/ThomasJay/RAG/blob/main/app.py
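As a rough sketch of how the pieces above could be combined into a small RAG service, loosely in the spirit of the linked app.py: the /ask route, port 8080, the query JSON field, and the prompt template below are illustrative assumptions, not taken from that repository.

from flask import Flask, request, jsonify
from langchain_community.llms import Ollama
from langchain_community.embeddings import FastEmbedEmbeddings
from langchain_community.vectorstores import Chroma

app = Flask(__name__)
llm = Ollama(model="llama3")

# Reopen the Chroma index that was persisted in db/ during ingestion
vector_store = Chroma(persist_directory="db", embedding_function=FastEmbedEmbeddings())

@app.route("/ask", methods=["POST"])
def ask():
    # Expect a JSON body such as {"query": "Why is the sky blue?"}
    query = request.get_json().get("query", "")

    # Retrieve the most relevant chunks and use them as context for the model
    docs = vector_store.similarity_search(query, k=3)
    context = "\n\n".join(doc.page_content for doc in docs)
    prompt = f"Answer the question using only this context:\n{context}\n\nQuestion: {query}"

    answer = llm.invoke(prompt)
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)

The endpoint can then be tested with, for example:

curl -X POST http://localhost:8080/ask -H "Content-Type: application/json" -d '{"query": "Why is the sky blue?"}'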