ElBruno.OllamaMonitor


A tiny Windows system tray tool to monitor your local Ollama runtime.

Quick visual feedback about your Ollama status, resource usage, and models—right from your Windows system tray.

What It Does

ElBruno.OllamaMonitor sits in your Windows system tray and tells you, at a glance:

- Whether Ollama is reachable
- Which model (if any) is currently loaded or running
- Current resource usage
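Under the hood, a tray monitor like this can get everything it needs from Ollama's local HTTP API. As a rough sketch (not this tool's actual implementation), the loaded-model summary can be derived from the JSON that `GET /api/ps` returns on Ollama's default port:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

OLLAMA = "http://localhost:11434"  # Ollama's default local endpoint


def summarize(ps_json: dict) -> str:
    """Turn an /api/ps response into a one-line status summary."""
    models = ps_json.get("models", [])
    if not models:
        return "Ollama is running, no model loaded"
    names = ", ".join(m.get("name", "?") for m in models)
    return f"Loaded: {names}"


def check_status() -> str:
    """Poll the local Ollama endpoint; report 'not reachable' on failure."""
    try:
        with urlopen(f"{OLLAMA}/api/ps", timeout=2) as resp:
            return summarize(json.load(resp))
    except (URLError, OSError):
        return "Ollama is not reachable"
```

The `summarize` helper is pure, so it works on any `/api/ps`-shaped payload even without a running Ollama instance.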

Demo

ElBruno.OllamaMonitor demo

Installation

dotnet tool install --global ElBruno.OllamaMonitor

Then launch anytime:

ollamamon

From Source

git clone https://github.com/elbruno/ElBruno.OllamaMonitor.git
cd ElBruno.OllamaMonitor
dotnet build src/ElBruno.OllamaMonitor/
dotnet run --project src/ElBruno.OllamaMonitor/

Quick Start

1. Launch the app:

   ollamamon

   The app starts minimized to the tray. Click the icon to open the details window or mini monitor.

2. Check your status: look at the tray icon color; it tells you Ollama's status at a glance.

3. Configure (optional): see the Configuration Guide for endpoint, refresh rate, and threshold settings.

System Tray Status

The tray icon color tells you the status at a glance:

| Color | Meaning |
| --- | --- |
| ⚪ Gray | Ollama is not reachable |
| 🟢 Green | Ollama is running, no model loaded |
| 🔵 Blue | A model is currently loaded |
| 🟠 Orange | A model is running, or resource usage is high |
| 🔴 Red | Error or Ollama unavailable |

Click the icon to open the full details window for diagnostics, or open the mini monitor from the tray menu to keep resource usage visible on top of other windows.
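The color mapping above is easy to express as a small decision function. Here is a hedged sketch of that logic; the real tool may order the checks differently or define "high resource usage" against its configured thresholds:

```python
def tray_color(reachable: bool, error: bool = False,
               loaded: bool = False, running: bool = False,
               high_usage: bool = False) -> str:
    """Map monitor observations to the tray icon colors described above."""
    if error:
        return "red"      # error or Ollama unavailable
    if not reachable:
        return "gray"     # Ollama is not reachable
    if running or high_usage:
        return "orange"   # a model is running, or resource usage is high
    if loaded:
        return "blue"     # a model is loaded but idle
    return "green"        # Ollama is up, nothing loaded
```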

Features

- Color-coded tray icon showing Ollama status at a glance
- Details window with diagnostics
- Always-on-top mini monitor for resource usage
- Configurable endpoint, refresh rate, and thresholds

Requirements

- Windows (the app lives in the system tray)
- .NET SDK (to install and run the tool via dotnet tool install)
- Ollama installed and running locally

Optional:

- A remote Ollama endpoint to monitor instead of localhost (see the Configuration Guide)

Configuration

See Configuration Guide for detailed setup, CLI commands, custom thresholds, and advanced options like remote Ollama monitoring.
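The exact configuration format lives in the Configuration Guide. Purely as a hypothetical illustration (every key name below is invented, not taken from the tool), the kinds of settings it covers might look like:

```json
{
  "endpoint": "http://localhost:11434",
  "refreshSeconds": 5,
  "thresholds": { "memoryPercent": 80, "cpuPercent": 90 }
}
```

Consult the Configuration Guide for the real setting names and defaults.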


License

This project is licensed under the MIT License — see LICENSE for details.

Support

Found a bug or have a feature request? Open an issue on GitHub.

Questions about Ollama? Check the Ollama documentation.

About the Author

Made with ❤️ by Bruno Capuano (ElBruno)