Chat with a local Ollama Llama2 model through an intuitive UI

⚙️ Tech Stack:

  1. PySide6: https://doc.qt.io/qtforpython-6/
  2. Qt Resource System: https://doc.qt.io/qt-5/resources.html
  3. CSS for styling: https://developer.mozilla.org/en-US/docs/Web/CSS
  4. PyInstaller for building the Apps: https://pyinstaller.org/en/stable/
  5. Ollama Python API for communicating with AI models locally: https://ollama.com/ https://github.com/ollama/ollama-python

What is PyQt?

PyQt is a modern UI framework for creating desktop and mobile apps. It supports cross-platform development and lets you build all of the application logic in Python itself. Database connections are also supported, so PyQt apps can perform CRUD operations directly.

PySide is the more modern counterpart of PyQt, and its LGPL license is friendlier for producing commercial applications than PyQt's GPL/commercial licensing. Qt itself is available for both Python and C++.

What is Ollama?

Ollama provides a way to run AI models locally on your own machine, giving users more privacy. It supports many LLMs, such as Llama2 and Gemma, and you can interact with the models directly through a terminal or through the API it provides.
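
As a rough sketch of what talking to that API from Python looks like (this assumes the `ollama` package is installed, an Ollama server is running locally, and the llama2 model has been pulled; `build_messages` and `chat_once` are illustrative helpers, not part of the library):

```python
def build_messages(history, prompt):
    """Append the new user prompt to the prior chat history,
    using the role/content dicts the Ollama chat API expects."""
    return history + [{"role": "user", "content": prompt}]

def chat_once(prompt, history=None):
    """Send one prompt to a local Llama2 model and return the reply text."""
    # Imported here so the pure helper above can be used even
    # without the ollama package or a running server.
    import ollama
    response = ollama.chat(model="llama2",
                           messages=build_messages(history or [], prompt))
    return response["message"]["content"]

if __name__ == "__main__":
    print(chat_once("Why is the sky blue?"))
```

Passing `stream=True` to `ollama.chat` instead yields the reply incrementally, which is what makes a token-by-token chat UI possible.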

What did I build?

I built a desktop application using PySide6 to chat with the Llama2 model of the Ollama family. It involved many challenges, like styling with an external stylesheet file, learning a totally new layout system, and exporting to an executable file without breaking anything.
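
The export step uses PyInstaller; a build command along these lines is typical (the app name, entry-point file, and stylesheet path here are placeholders for this sketch):

```shell
# Bundle the app into a single windowed executable (no console window).
# main.py and style.qss are placeholder names; on Windows the
# --add-data source/destination separator is ";" rather than ":".
pyinstaller --onefile --windowed --name ollama-chat \
    --add-data "style.qss:." \
    main.py
```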

Many UI solutions have been built for Ollama, but none existed in the timeframe when I ideated this app. The other UI solutions can be found at https://github.com/ollama/ollama

Basic Concepts in PySide6

  1. QApplication: the main object that contains the one or more windows that comprise the application
  2. Window: each separate instance of the application, e.g. you can have multiple instances of Google Chrome open at the same time
  3. QWidget: each distinct part inside a window, e.g. a sidebar or a navbar
  4. Layouts: how different widgets should be arranged inside one another