This repository is a fork of the original project that adds support for Hugging Face models, so you are not limited to the OpenAI API.
You can find a step-by-step video tutorial on building this application on YouTube.
This is a Python application that allows you to load a PDF and ask questions about it in natural language. The application uses an LLM to generate responses about your PDF. The LLM will not answer questions unrelated to the document.
The application reads the PDF and splits the text into smaller chunks that can then be fed into an LLM. It uses OpenAI embeddings to create vector representations of the chunks. The application then finds the chunks that are semantically similar to the user's question and feeds those chunks to the LLM to generate a response.
The application uses Streamlit to create the GUI and Langchain to deal with the LLM.
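The retrieval flow described above can be sketched in plain Python. This is a simplified illustration, not the app's actual code: it uses a toy bag-of-words similarity in place of OpenAI embeddings, and the function names (`split_into_chunks`, `top_chunks`) are hypothetical.

```python
# Minimal sketch of the chunk-and-retrieve flow described above.
# The real app uses OpenAI embeddings via Langchain; here a word-count
# vector stands in for an embedding so the example is self-contained.
from collections import Counter
import math

def split_into_chunks(text, chunk_size=200):
    """Split text into fixed-size character chunks."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def embed(text):
    """Toy 'embedding': a word-count vector (stand-in for a real embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_chunks(question, chunks, k=2):
    """Return the k chunks most similar to the question."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

In the application itself, Langchain handles the splitting, embedding, and similarity search, and the top-scoring chunks are passed to the LLM as context for the answer.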
To use the application, run the app.py file with the Streamlit CLI (after installing Streamlit):
streamlit run app.py