Chat with a Local Ollama Llama2 Model Through an Intuitive UI
PyQt is a modern UI framework for creating desktop and mobile apps. It supports cross-platform development, letting you build all the functionality in Python itself, and it also provides database connectivity for performing CRUD operations.
PySide is the more modern counterpart of PyQt and comes with a more permissive license for producing commercial applications. Qt itself is available for both Python and C++.
Ollama provides a way to run AI models locally on your own machine, giving users more privacy. It supports many LLMs, such as Llama2 and Gemma, and lets you interact with models directly through a terminal or through the API it provides.
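As a rough sketch of what talking to that API looks like, the snippet below builds a request for Ollama's local `/api/generate` endpoint (which by default listens on `localhost:11434`) and reads back the model's reply. The `ask` helper and its parameter names are my own illustration, not part of Ollama itself, and it assumes `ollama serve` is already running with the `llama2` model pulled.

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server (assumption: default port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama2"):
    """Build the JSON payload the /api/generate endpoint expects.

    stream=False asks Ollama to return the full reply in one response
    instead of streaming it token by token.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def ask(prompt, model="llama2"):
    """Send a prompt to the local Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running locally with the llama2 model available.
    print(ask("Why is the sky blue?"))
```

A desktop UI like the one described below would call something like `ask` from a background thread so the window stays responsive while the model generates.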
I built a desktop application using PySide6 to chat with the Llama2 model served by Ollama. It involved many challenges, like styling with an external file, learning an entirely new layout system, and exporting to an executable file without breaking anything.
Many UI solutions have since been built for Ollama, but none existed in the timeframe when I ideated this app. The other UI solutions can be found at https://github.com/ollama/ollama