# ElBruno.OllamaMonitor

A tiny Windows system tray tool that monitors your local Ollama runtime, giving you quick visual feedback on Ollama's status, resource usage, and loaded models.
## What It Does
ElBruno.OllamaMonitor sits in your Windows system tray and tells you:
- Is Ollama running? A glance at the tray icon shows you the status.
- Is a model loaded? See what’s currently active.
- How much CPU, RAM, and GPU is it using? Real-time resource metrics from the Ollama process.
- Any errors? Get instant visual feedback if something’s wrong.
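Checks like these map onto Ollama's local HTTP API. For example, assuming Ollama's default endpoint at `http://localhost:11434`, you can answer the first two questions yourself from a shell:

```bash
# Is Ollama running? The version endpoint responds if the server is up.
curl -s http://localhost:11434/api/version

# Is a model loaded? /api/ps lists the models currently loaded in memory.
curl -s http://localhost:11434/api/ps
```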
Perfect for:
- Local AI developers who need quick visibility into Ollama
- Demo presenters who want to see resource impact in real time
- Anyone running large models locally who’s curious about the overhead
## Demo

## Installation
### Via NuGet (Recommended)
```bash
dotnet tool install --global ElBruno.OllamaMonitor
```
Then launch it anytime from a terminal. The command name below is an assumption; verify the actual name with `dotnet tool list --global`:
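```bash
# Assumed command name; confirm with: dotnet tool list --global
ollamamonitor
```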
### From Source
```bash
git clone https://github.com/elbruno/ElBruno.OllamaMonitor.git
cd ElBruno.OllamaMonitor
dotnet build src/ElBruno.OllamaMonitor/
dotnet run --project src/ElBruno.OllamaMonitor/
```
## Quick Start
- Launch the app: The app starts minimized to the tray. Click the icon to open the details window or mini monitor.
- Check your status: Look at the tray icon color; it tells you Ollama's status at a glance.
- Configure (optional): See the Configuration Guide for endpoint, refresh rate, and threshold settings.
## System Tray Status
The tray icon color tells you the status at a glance:
| Color | Meaning |
| --- | --- |
| 🟤 Gray | Ollama is not reachable |
| 🟢 Green | Ollama is running, no model loaded |
| 🔵 Blue | A model is currently loaded |
| 🟠 Orange | A model is running or high resource usage |
| 🔴 Red | Error or Ollama unavailable |
Click the icon to open the full details window for diagnostics, or open the mini monitor from the tray menu to keep resource usage visible on top of other windows.
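As a rough illustration of how these buckets relate to the API (an approximation for understanding, not the app's actual logic; it assumes the default endpoint):

```bash
# Rough approximation of the icon buckets (not the app's actual logic);
# assumes Ollama's default endpoint at http://localhost:11434.
if ! curl -sf http://localhost:11434/api/version > /dev/null; then
  echo "gray/red: Ollama is not reachable"
elif [ "$(curl -s http://localhost:11434/api/ps | grep -c '"name"')" -eq 0 ]; then
  echo "green: Ollama is running, no model loaded"
else
  echo "blue/orange: a model is currently loaded"
fi
```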
## Features
- ✅ System Tray Integration — Runs in the background, always visible
- ✅ Visual Status Indicators — Color-coded icons for quick status checks
- ✅ Standard Details Window — A normal Windows window with minimize/close behavior that keeps the app in the tray when closed
- ✅ Mini Monitor Window — A semi-transparent always-on-top compact view for CPU, RAM, GPU, and model status
- ✅ Local Configuration — Customize endpoint, refresh rate, thresholds
- ✅ CLI Commands — Fully scriptable configuration
- ✅ GPU Metrics — Best-effort NVIDIA GPU tracking if `nvidia-smi` is available (see the example after this list)
- ✅ Copy to Clipboard — Quickly share diagnostics
- ✅ Manual Refresh — Force an immediate check
- ✅ Open Ollama URL — Quick link to the Ollama API
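The GPU feature depends on `nvidia-smi` being on the PATH. To see the kind of data that source exposes, you can run a standard query yourself (this illustrates the data source, not necessarily the app's exact invocation):

```bash
# GPU utilization and memory via nvidia-smi (requires NVIDIA drivers).
nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total --format=csv,noheader,nounits
```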
## Requirements
- Windows 10 / Windows 11 with the .NET 10 runtime (download from dotnet.microsoft.com)
- Ollama running locally (download from ollama.ai)
- .NET 10 SDK to build from source
Optional:
- nvidia-smi (NVIDIA GPU drivers) for GPU metrics
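To confirm the prerequisites from a terminal:

```bash
# Check that a .NET 10 runtime is installed
dotnet --list-runtimes

# Check that Ollama is installed and on the PATH
ollama --version

# Optional: check that nvidia-smi is available for GPU metrics
nvidia-smi
```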
## Configuration
See the Configuration Guide for detailed setup, CLI commands, custom thresholds, and advanced options like remote Ollama monitoring.
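The actual CLI syntax lives in the Configuration Guide; the lines below are only a hypothetical sketch of the kinds of settings exposed (endpoint, refresh rate, thresholds), with an assumed command name and invented flags:

```bash
# Hypothetical syntax; see the Configuration Guide for the real commands.
ollamamonitor config set endpoint http://localhost:11434
ollamamonitor config set refresh-seconds 5
```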
## License
This project is licensed under the MIT License — see LICENSE for details.
## Support
Found a bug or have a feature request? Open an issue on GitHub.
Questions about Ollama? Check the Ollama documentation.
## About the Author
Made with ❤️ by Bruno Capuano (ElBruno)