LLM: RAG contoh

Example: https://www.youtube.com/watch?v=7VAs22LC7WE


python3 -m venv venv
source venv/bin/activate
mkdir pdf
mkdir db
pip3 install langchain_community
pip3 install langchain-text-splitters
pip3 install fastembed
pip3 install pdfplumber
pip3 install chromadb
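
These packages cover the ingestion side of RAG: pdfplumber reads the PDFs placed in pdf/, langchain-text-splitters chunks the pages, fastembed turns the chunks into embeddings, and chromadb persists the vectors in db/. A minimal indexing sketch, assuming a placeholder file pdf/example.pdf:

from langchain_community.document_loaders import PDFPlumberLoader
from langchain_community.embeddings.fastembed import FastEmbedEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Load one PDF from the pdf/ folder (example.pdf is a placeholder name)
docs = PDFPlumberLoader("pdf/example.pdf").load()

# Split the pages into overlapping chunks so retrieval returns small passages
chunks = RecursiveCharacterTextSplitter(chunk_size=1024, chunk_overlap=80).split_documents(docs)

# Embed the chunks with FastEmbed and persist the vectors to the db/ folder
Chroma.from_documents(documents=chunks, embedding=FastEmbedEmbeddings(), persist_directory="db")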

Edit apps.py

from langchain_community.llms import Ollama

# Ask the local llama3 model a test question through Ollama
llm = Ollama(model="llama3")
response = llm.invoke("Why is the sky blue?")
print(response)

Run

python3 apps.py


Flask

pip3 install flask

Edit apps.py

from flask import Flask
from langchain_community.llms import Ollama

# Flask exposes the Flask class (capital F); the app object is created from it
app = Flask(__name__)

llm = Ollama(model="llama3")
response = llm.invoke("Why is the sky blue?")
print(response)
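
The script above still only prints one answer at startup; to serve answers over HTTP, the app needs a route and a call to app.run(). A minimal sketch, where the /ask endpoint and the "query" JSON field are illustrative assumptions rather than part of the original example:

from flask import Flask, jsonify, request
from langchain_community.llms import Ollama

app = Flask(__name__)
llm = Ollama(model="llama3")

@app.route("/ask", methods=["POST"])
def ask():
    # Expect a JSON body such as {"query": "Why is the sky blue?"} (assumed field name)
    query = request.get_json().get("query", "")
    answer = llm.invoke(query)
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)

Once running, it can be tested with:

curl -X POST http://localhost:8080/ask -H "Content-Type: application/json" -d '{"query": "Why is the sky blue?"}'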

More details can be found at

https://github.com/ThomasJay/RAG/blob/main/app.py
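
For orientation, a minimal sketch of the query side, assuming the db/ index built with the packages above; the question string and the k=3 setting are placeholders, and the linked app.py covers this in full detail:

from langchain_community.embeddings.fastembed import FastEmbedEmbeddings
from langchain_community.llms import Ollama
from langchain_community.vectorstores import Chroma

# Reopen the Chroma index persisted in db/ using the same embedding model
vector_store = Chroma(persist_directory="db", embedding_function=FastEmbedEmbeddings())
retriever = vector_store.as_retriever(search_kwargs={"k": 3})

# Retrieve the most relevant chunks and let llama3 answer using only that context
question = "What is the PDF about?"
context = "\n\n".join(doc.page_content for doc in retriever.invoke(question))
answer = Ollama(model="llama3").invoke(
    "Answer the question using only this context:\n" + context + "\n\nQuestion: " + question)
print(answer)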

References

* https://www.youtube.com/watch?v=7VAs22LC7WE