
AI Chatbot with Personalization and Web Interface

A conversational AI chatbot built with Flask and LangChain that supports personalization, weather queries, time queries, and persistent memory.

Features

  • Conversational AI powered by Groq's Llama 3.3 70B model
  • User personalization and context memory
  • Real-time weather information
  • Current time queries
  • Custom web interface

Bonus Features

  • Persistent storage of user information (Beta)

Tech Stack

  • Backend: Python 3.10+ with Flask
  • Frontend: HTML5, CSS, JavaScript
  • AI Model: Groq Llama 3.3 70B
  • Storage: JSON-based local storage (see the sketch after this list)
  • APIs: Weather API integration
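
A minimal sketch of how a JSON-based user store can work is shown below. The file name personal_info.json matches the repository layout, but the helper names and data shape are assumptions, not the project's actual code; see the source for the real implementation.

import json
import os

STORE_PATH = "personal_info.json"  # file present in the repo; the structure below is assumed

def load_user_info(user_id):
    """Return the saved profile for a user, or an empty dict if none exists."""
    if not os.path.exists(STORE_PATH):
        return {}
    with open(STORE_PATH, "r", encoding="utf-8") as f:
        data = json.load(f)
    return data.get(user_id, {})

def save_user_info(user_id, info):
    """Merge new facts (e.g. name, location) into the user's stored profile."""
    data = {}
    if os.path.exists(STORE_PATH):
        with open(STORE_PATH, "r", encoding="utf-8") as f:
            data = json.load(f)
    data.setdefault(user_id, {}).update(info)
    with open(STORE_PATH, "w", encoding="utf-8") as f:
        json.dump(data, f, indent=2)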

Prerequisites

  • Python 3.10 or higher
  • Groq API key
  • Weather API key

Installation

  1. Clone the repository:
git clone <repository-url>
cd <repository-name>
  2. Create and activate a virtual environment:
  • For Linux/Mac:
python3 -m venv venv
source venv/bin/activate
  • For Windows (untested; I don't have a Windows machine):
python -m venv venv
venv\Scripts\activate
  3. Install dependencies:
pip install -r requirements.txt
  4. Create a .env file from the template:
cp .env.example .env
  5. Configure the environment variables in .env:

GROQ_API_KEY=<your-groq-api-key>
WEATHER_API_KEY=<your-weather-api-key>
ENVIRONMENT=dev
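
The app reads these values at startup. The snippet below is only a sketch of that loading step, assuming python-dotenv is used; the variable names are the ones from the .env template above.

import os
from dotenv import load_dotenv  # assumes python-dotenv is listed in requirements.txt

load_dotenv()  # read key=value pairs from .env into the process environment

GROQ_API_KEY = os.getenv("GROQ_API_KEY")
WEATHER_API_KEY = os.getenv("WEATHER_API_KEY")
ENVIRONMENT = os.getenv("ENVIRONMENT", "dev")

if not GROQ_API_KEY or not WEATHER_API_KEY:
    raise RuntimeError("GROQ_API_KEY and WEATHER_API_KEY must be set in .env")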

Running the Application

Local Development

python web.py

The application will be available at http://localhost:5000

Production Deployment

gunicorn --bind :$PORT --workers=1 web:app
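
Both commands expect a module-level Flask app object in web.py (gunicorn imports it as web:app). The sketch below shows roughly what such an entrypoint looks like; the /chat route name and payload shape are assumptions, not the project's actual API.

from flask import Flask, render_template, request

app = Flask(__name__)  # the object gunicorn loads as web:app

@app.route("/")
def index():
    # Serve the chat UI from templates/index.html
    return render_template("index.html")

# Hypothetical chat endpoint; the real route and payload may differ.
@app.route("/chat", methods=["POST"])
def chat():
    payload = request.get_json()
    user_id = payload.get("user_id")
    message = payload.get("message")
    # Hook into chatbot.py here; the real function name may differ.
    # reply = chatbot.respond(user_id, message)
    return {"reply": f"(echo for {user_id}) {message}"}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000, debug=True)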

Usage

  1. Open the web interface in your browser
  2. Enter a user ID to start chatting (the chatbot will remember your information)
  3. Type messages in the chat input field

Example Queries

  • Basic conversation:
    • "Hello, how are you?"
  • Personal information:
    • "My name is Prip"
    • "I live in Moscow"
  • Weather queries (see the sketch after this list):
    • "What's the weather like in Raipur?"
    • "How's the weather today?" (uses saved location)
  • Time queries:
    • "What time is it?"
    • "Tell me the current time"
  • Context memory:
    • "Can you tell me more about that?"
    • "Remind me of my name"

⭐ Example Conversations with the chatbot: Conversation examples here

Project Structure

Command used to generate the tree on Linux: tree Conversationl-AI-ChatBot-Vizares -L 2

Conversationl-AI-ChatBot
├── chatbot.py
├── config.py
├── personal_info.json
├── Procfile
├── __pycache__
│   <cache-files>
├── Readme.MD
├── requirements.txt
├── static
│   └── css
├── task.MD
├── templates
│   └── index.html
├── venv
│   <venv-files>
└── web.py

Deployment ⚠️⚠️⚠️

This section comes last because the project does not work fully on the deployment server.

The project is deployed on Koyeb, but a few things do not work as expected because of the limitations of Koyeb's free tier:

  • Persistent storage of user information (sometimes it works, sometimes it doesn't)
  • Time queries (they return the time of the server where the code is running, not the user's local time)
  • Slow responses from the weather API

Deployment Link: ChatBot

Configuration

Modify config.py to adjust:

  • Model parameters
  • Conversation settings
  • System prompts
  • Error messages
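
For illustration only, config.py might group these settings roughly as below; all names and values here are assumptions rather than the file's actual contents.

# Hypothetical layout of config.py; check the file itself for the real settings.
MODEL_NAME = "llama-3.3-70b-versatile"  # Groq model id used by the bot (assumed)
TEMPERATURE = 0.7                       # sampling temperature for responses
MAX_HISTORY_MESSAGES = 20               # how much conversation context to keep

SYSTEM_PROMPT = (
    "You are a helpful assistant. Remember personal details the user shares "
    "(name, location) and use them to personalize replies."
)

ERROR_MESSAGE = "Sorry, something went wrong. Please try again."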