📢 Just shipped: ElBruno.OllamaMonitor – a tiny Windows tray app to monitor your local Ollama runtime.
No dashboards. No bloat. Just a status icon that tells you whether Ollama is running, whether a model is loaded, and how much CPU/GPU it's using.
Glance at your system tray. That's it.
Perfect for:
✅ Local AI developers who need quick visibility
✅ Demo presenters showing real-time resource impact
✅ Anyone running Ollama and curious about overhead
The tray icon changes color to match Ollama's current state.
Built on .NET 10. Fully configurable. CLI support. Open source.
Install: dotnet tool install --global ElBruno.OllamaMonitor
GitHub: [Link to repo]
#Ollama #LocalAI #dotNET #Windows #OpenSource #Developer #AITools #Productivity
⭐ Star it on GitHub
🐛 Report issues or suggest features
🤝 Contribute to Phase 2 (historical charts, MVVM, multi-instance support)
📢 New: ElBruno.OllamaMonitor – See your Ollama status in the Windows tray.
Quick visual feedback: Is Ollama running? Is a model loaded? How much CPU/GPU is it using?
One glance. That's all you need.
👉 dotnet tool install --global ElBruno.OllamaMonitor
#Ollama #LocalAI #dotNET #Windows
📢 Shipped: ElBruno.OllamaMonitor v0.1.0
A .NET 10 system tray app for monitoring local Ollama workloads on Windows.
Features:
• Real-time status icon (5-state FSM)
• Floating details window (CPU, RAM, disk, GPU metrics)
• CLI configuration
• Best-effort NVIDIA GPU tracking
• Copy diagnostics to clipboard
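For readers curious what a 5-state status FSM might look like under the hood, here is a minimal sketch of how probe results could map to tray states. The state names, the `classify` helper, and the CPU threshold are illustrative assumptions, not the app's actual implementation:

```python
from enum import Enum

class OllamaState(Enum):
    """Hypothetical tray states; the real app's five states may differ."""
    NOT_RUNNING = "not running"    # runtime unreachable
    IDLE = "idle"                  # running, no model loaded
    MODEL_LOADED = "model loaded"  # model resident, low load
    BUSY = "busy"                  # model resident, high load
    ERROR = "error"                # probe failed unexpectedly

def classify(reachable: bool, models_loaded: int, cpu_percent: float,
             probe_error: bool = False) -> OllamaState:
    """Map one round of probe results to a tray-icon state."""
    if probe_error:
        return OllamaState.ERROR
    if not reachable:
        return OllamaState.NOT_RUNNING
    if models_loaded == 0:
        return OllamaState.IDLE
    if cpu_percent >= 50.0:  # assumed threshold for the "busy" color
        return OllamaState.BUSY
    return OllamaState.MODEL_LOADED
```

A tray app would run this on a timer and swap the icon whenever the returned state changes, e.g. `classify(True, 1, 72.0)` yields `BUSY`.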
Built for developers. Minimal footprint. Full documentation.
GitHub: [Link to repo]
dotnet tool install --global ElBruno.OllamaMonitor
#OpenSource #dotNET #Ollama #Windows #LocalAI