Pizzence is an AI-powered chatbot built with Streamlit and LangChain, designed to answer customer questions based on real restaurant reviews. Whether it's gluten-free options, service quality, or favorite dishes—Pizzence delivers quick, relevant answers powered by LLaMA3 via Ollama.
- 💬 Chat with real customer reviews
- ⚡ Fast, relevant, and to-the-point responses
- 🧠 Uses local vector search (ChromaDB) for context
- 🌐 Simple Streamlit web UI
- Python
- Streamlit for frontend
- LangChain for prompt orchestration
- Ollama (LLaMA3.2) as the LLM
- ChromaDB for vector database
- OllamaEmbeddings for review embedding
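Before OllamaEmbeddings can do anything, each row of the reviews CSV has to be turned into a single text chunk. A minimal standard-library sketch of that preparation step (the `Title` and `Review` column names are assumptions here, not confirmed by the dataset — adjust them to match the actual header of `realistic_restaurant_reviews.csv`):

```python
import csv
import io

def reviews_to_documents(csv_text: str) -> list[str]:
    """Turn each review row into one text chunk ready for embedding.

    'Title' and 'Review' are assumed column names; change them to
    match the real realistic_restaurant_reviews.csv header.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [f"{row['Title']}: {row['Review']}" for row in reader]
```

Each resulting string would then be embedded via OllamaEmbeddings and stored in ChromaDB.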
```
pizzence/
│
├── chrome_langchain_db/              # ChromaDB vector store (auto-created)
├── .gitignore                        # Git ignore rules
├── .python-version                   # Python version file (optional)
├── main.py                           # Streamlit chatbot app
├── vector.py                         # Review embedding and retriever setup
├── realistic_restaurant_reviews.csv  # Customer reviews dataset
├── pyproject.toml                    # Project dependencies and config
├── uv.lock                           # Package lock file (if using uv/rye)
└── README.md                         # You’re here!
```
```bash
git clone https://github.com/ChAbdulWahhab/pizzence.git
cd pizzence
```

```bash
python -m venv venv
venv\Scripts\activate       # Windows
source venv/bin/activate    # macOS/Linux
```

If you're using uv:

```bash
uv pip install -r requirements.txt
```

Otherwise:

```bash
pip install -r requirements.txt
```

Run the app:

```bash
streamlit run main.py
```

- Loads and embeds the reviews using OllamaEmbeddings.
- Saves them to a local ChromaDB vector store.
- When a user asks a question, relevant reviews are retrieved.
- These are used as context for the LLaMA3 model to generate an answer.
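The retrieval step above boils down to nearest-neighbour search over embedding vectors. A toy pure-Python sketch of the idea using hand-made 2-D vectors (the real app delegates all of this to OllamaEmbeddings and ChromaDB — this is only a conceptual illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, store, k=2):
    """Return the k review texts whose vectors are closest to the query.

    `store` is a list of (text, vector) pairs — a stand-in for what
    ChromaDB does internally.
    """
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]
```

In the actual app the query vector comes from embedding the user's question, and the top-k review texts become the context passed to the model.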
- Do they offer gluten-free pizzas?
- How is the staff behavior according to reviews?
- What’s the most appreciated item on the menu?
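For questions like these, the retrieved reviews get stitched into the model's prompt as context. A hypothetical `build_prompt` helper sketches the shape of that step (the actual template in `main.py` may be worded differently):

```python
def build_prompt(question: str, reviews: list[str]) -> str:
    """Assemble a review-grounded prompt.

    The wording is illustrative only, not main.py's real template.
    """
    context = "\n".join(f"- {r}" for r in reviews)
    return (
        "You are a helpful assistant for a pizza restaurant.\n"
        "Answer the question using only these customer reviews:\n"
        f"{context}\n\n"
        f"Question: {question}"
    )
```

The resulting string is what gets sent to LLaMA3 via Ollama, which keeps answers anchored to what customers actually wrote.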
Open an issue or contribute on GitHub.
Enjoy chatting with your reviews – powered by Pizzence 🍕