ElBruno.OllamaMonitor is a .NET WPF desktop application for Windows with a system tray interface, a standard details window, and a compact always-on-top mini monitor.

The codebase has two execution tracks: a CLI mode for configuration commands and the tray application itself. Project layout:
```
src/ElBruno.OllamaMonitor/
├── Cli/
│   ├── CliCommand.cs              # Command model
│   ├── CliCommandKind.cs          # Command type enum
│   ├── CliCommandParser.cs        # Parse args -> CliCommand
│   └── CliCommandRunner.cs        # Execute commands (help, config, reset, etc.)
├── Configuration/
│   ├── AppSettings.cs             # Settings data model (JSON-serializable)
│   └── AppSettingsService.cs      # Load/save settings from disk
├── Diagnostics/
│   ├── StatusFormatter.cs         # Format snapshots as readable text
│   ├── ClipboardService.cs        # Copy diagnostics to clipboard
│   └── DiagnosticsLogService.cs   # Event logging
├── Helpers/
│   ├── ProcessLauncher.cs         # Launch URLs / external commands
│   └── ...
├── Interop/
│   └── ...                        # P/Invoke / Windows interop
├── Models/
│   ├── OllamaMonitorState.cs      # State enum (Gray, Green, Blue, Orange, Red)
│   ├── OllamaMonitorSnapshot.cs   # Aggregated status snapshot
│   ├── OllamaModelSnapshot.cs     # Model info
│   └── ResourceSnapshot.cs        # CPU, RAM, GPU metrics
├── Ollama/
│   ├── OllamaClient.cs            # HTTP client for Ollama API
│   ├── OllamaStatusService.cs     # Aggregate Ollama state
│   ├── OllamaStatus.cs            # API response models
│   └── OllamaModelInfo.cs         # Model details
├── Services/
│   ├── ProcessMetricsService.cs   # CPU, RAM from Ollama process
│   ├── NvidiaSmiMetricsService.cs # GPU metrics via nvidia-smi
│   ├── TrayIconService.cs         # System tray lifecycle
│   ├── TrayStatusMapper.cs        # OllamaMonitorState -> icon color
│   └── TrayMenuBuilder.cs         # Context menu construction
├── ViewModels/
│   └── MainWindowViewModel.cs     # UI state, refresh logic
├── App.xaml / App.xaml.cs         # WPF Application entry point
├── MainWindow.xaml / MainWindow.xaml.cs  # Standard details window
└── MiniMonitorWindow.xaml / .cs   # Compact always-on-top monitor
```
```
App.OnStartup()
├─ Parse CLI args (CliCommandParser)
├─ If args = ["--help", "config", "config set", etc.]
│  └─ Run CLI command (CliCommandRunner) → exit
└─ Else
   └─ LaunchTrayApplication()

LaunchTrayApplication()
├─ Load settings (AppSettingsService)
├─ Create HttpClient (singleton)
├─ Create service stack:
│  ├─ OllamaClient
│  ├─ ProcessMetricsService
│  ├─ NvidiaSmiMetricsService
│  └─ OllamaStatusService (aggregates all three)
├─ Create MainWindow, MiniMonitorWindow, and MainWindowViewModel
├─ Create TrayIconService
├─ Start DispatcherTimer (refresh loop)
└─ Show window or minimize to tray (based on settings)
```
Every N seconds (default 2, configurable):

```
DispatcherTimer.Tick
└─ MainWindowViewModel.RefreshAsync()
   └─ OllamaStatusService.GetStatusAsync()
      ├─ OllamaClient.GetVersionAsync()
      ├─ OllamaClient.GetTagsAsync()
      ├─ OllamaClient.GetProcessesAsync()
      ├─ ProcessMetricsService.GetMetricsAsync()
      └─ NvidiaSmiMetricsService.GetGpuMetricsAsync()
   └─ Determine OllamaMonitorState (Gray/Green/Blue/Orange/Red)
   └─ Update UI bindings
   └─ Update tray icon color (TrayStatusMapper)
```
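The refresh loop above can be sketched as a plain polling loop. This is a simplified, non-WPF stand-in: the real app drives the loop from a `DispatcherTimer` on the UI thread, and `refreshAsync` here stands in for `MainWindowViewModel.RefreshAsync`.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static class RefreshLoop
{
    // Polls on a fixed interval until cancelled, mirroring the
    // DispatcherTimer.Tick -> RefreshAsync chain shown above.
    public static async Task RunAsync(Func<Task> refreshAsync, TimeSpan interval, CancellationToken ct)
    {
        using var timer = new PeriodicTimer(interval);
        try
        {
            while (await timer.WaitForNextTickAsync(ct))
                await refreshAsync();
        }
        catch (OperationCanceledException)
        {
            // Normal shutdown path: the app cancels the token on exit.
        }
    }
}
```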
Determines the tray icon color:

```csharp
public enum OllamaMonitorState
{
    NotReachable, // Gray   — API unreachable
    Running,      // Green  — API reachable, no model
    ModelLoaded,  // Blue   — Model loaded, low usage
    HighUsage,    // Orange — Model running, high usage
    Error         // Red    — Unexpected error
}
```
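The mapping from metrics to a state can be sketched as follows. This is a hypothetical simplification: the real decision lives in `OllamaStatusService`, and the threshold defaults come from the settings described below.

```csharp
using System;

enum OllamaMonitorState { NotReachable, Running, ModelLoaded, HighUsage, Error }

static class StateMapper
{
    // Hypothetical simplification of the state decision. In the real app the
    // high* thresholds are read from AppSettings rather than defaulted here.
    public static OllamaMonitorState Determine(
        bool apiReachable, bool modelLoaded,
        double cpuPercent, double gpuPercent, double memoryGb,
        double highCpuPercent = 80, double highGpuPercent = 85, double highMemoryGb = 16)
    {
        if (!apiReachable) return OllamaMonitorState.NotReachable; // Gray
        if (!modelLoaded) return OllamaMonitorState.Running;       // Green
        if (cpuPercent >= highCpuPercent
            || gpuPercent >= highGpuPercent
            || memoryGb >= highMemoryGb)
            return OllamaMonitorState.HighUsage;                   // Orange
        return OllamaMonitorState.ModelLoaded;                     // Blue
    }
}
```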
The states escalate from `NotReachable` through `Running`, `ModelLoaded`, and `HighUsage`; `Error` overrides the other states.

Settings are stored as JSON at:
```
%LOCALAPPDATA%\ElBruno\OllamaMonitor\settings.json
```

Editable via:

```
ollamamon config set <key> <value>
```

Key settings:
| Key | Type | Default | Purpose |
|---|---|---|---|
| `endpoint` | string | `http://localhost:11434` | Ollama API endpoint |
| `refreshIntervalSeconds` | int | 2 | Polling interval |
| `startMinimizedToTray` | bool | true | Hide the windows on startup |
| `showFloatingWindowOnStart` | bool | false | Show the details window on startup |
| `enableGpuMetrics` | bool | true | Include GPU metrics |
| `enableDiskMetrics` | bool | true | Include disk I/O metrics |
| `highCpuThresholdPercent` | double | 80 | CPU% to trigger HighUsage state |
| `highMemoryThresholdGb` | double | 16 | RAM GB to trigger HighUsage state |
| `highGpuThresholdPercent` | double | 85 | GPU% to trigger HighUsage state |
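A `settings.json` matching the defaults above might look like this (key names are taken from the table; the exact serialization casing on disk is an assumption):

```json
{
  "endpoint": "http://localhost:11434",
  "refreshIntervalSeconds": 2,
  "startMinimizedToTray": true,
  "showFloatingWindowOnStart": false,
  "enableGpuMetrics": true,
  "enableDiskMetrics": true,
  "highCpuThresholdPercent": 80,
  "highMemoryThresholdGb": 16,
  "highGpuThresholdPercent": 85
}
```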
Aggregates Ollama API state, process metrics, and GPU metrics into a single `OllamaMonitorSnapshot`.

```csharp
Task<OllamaMonitorSnapshot> GetStatusAsync(CancellationToken cancellationToken)
```

Returns a snapshot combining API reachability, loaded models, resource metrics, and the derived `OllamaMonitorState`.
Polls the Ollama process for CPU and memory usage using `System.Diagnostics.Process`.

```csharp
Task<ResourceSnapshot?> GetMetricsAsync(CancellationToken cancellationToken)
```

Returns CPU%, RAM (MB), and disk I/O if available.
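A minimal sketch of this kind of process sampling, assuming CPU% is derived from the growth of `TotalProcessorTime` over a short window (the real service targets the `ollama` process; the method name and tuple shape here are illustrative, not the actual API):

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

static class ProcessMetricsSketch
{
    // Samples CPU% and RAM for a named process. CPU% is the delta in
    // TotalProcessorTime over the window, normalized by core count;
    // WorkingSet64 approximates RAM in use.
    public static async Task<(double CpuPercent, double RamMb)?> SampleAsync(
        string processName, TimeSpan window)
    {
        var proc = Process.GetProcessesByName(processName).FirstOrDefault();
        if (proc is null) return null; // process not running

        var before = proc.TotalProcessorTime;
        await Task.Delay(window);
        proc.Refresh();
        var after = proc.TotalProcessorTime;

        var cpuPercent = (after - before).TotalMilliseconds
                         / (window.TotalMilliseconds * Environment.ProcessorCount) * 100.0;
        var ramMb = proc.WorkingSet64 / (1024.0 * 1024.0);
        return (cpuPercent, ramMb);
    }
}
```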
Best-effort GPU metrics via the `nvidia-smi` CLI tool.

```csharp
Task<GpuMetrics?> GetGpuMetricsAsync(CancellationToken cancellationToken)
```

Returns GPU utilization % and VRAM used/total if available. Fails gracefully if `nvidia-smi` is not found.
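A best-effort `nvidia-smi` query might be shaped like this. The flags are standard `nvidia-smi` options; the single-GPU, one-line CSV parsing and the method name are assumptions for illustration.

```csharp
using System;
using System.Diagnostics;

static class GpuQuerySketch
{
    // Queries GPU utilization and VRAM via nvidia-smi. Returns null when
    // nvidia-smi is missing or fails, so the monitor degrades gracefully.
    public static (double UtilPercent, double VramUsedMb, double VramTotalMb)? Query()
    {
        try
        {
            var psi = new ProcessStartInfo("nvidia-smi",
                "--query-gpu=utilization.gpu,memory.used,memory.total --format=csv,noheader,nounits")
            {
                RedirectStandardOutput = true,
                UseShellExecute = false,
                CreateNoWindow = true,
            };
            using var proc = Process.Start(psi);
            var line = proc?.StandardOutput.ReadLine();
            proc?.WaitForExit(2000);
            if (string.IsNullOrWhiteSpace(line)) return null;

            var parts = line.Split(',', StringSplitOptions.TrimEntries);
            return (double.Parse(parts[0]), double.Parse(parts[1]), double.Parse(parts[2]));
        }
        catch
        {
            return null; // nvidia-smi not found or query failed: no crash
        }
    }
}
```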
Manages the system tray lifecycle, context menu, and state-driven icon updates keyed on `OllamaMonitorState`.

Binds the UI to `OllamaMonitorSnapshot`. Handles UI state and the periodic refresh logic.
All commands are parsed and executed by CliCommandRunner:

| Command | Effect |
|---|---|
| `ollamamon` | Launch tray app |
| `ollamamon --help` | Show help text |
| `ollamamon config` | Print current settings |
| `ollamamon config set endpoint <url>` | Change Ollama endpoint |
| `ollamamon config set refresh-interval <seconds>` | Change polling interval |
| `ollamamon config reset` | Reset to defaults |
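The commands in the table could be parsed roughly as below. This is a hypothetical reduction of `CliCommandParser` (the enum member names and fallback-to-help behavior are assumptions; the real parser may differ):

```csharp
using System;

enum CliCommandKind { Launch, Help, ShowConfig, SetConfig, ResetConfig }

record CliCommand(CliCommandKind Kind, string? Key = null, string? Value = null);

static class CliParserSketch
{
    // Maps raw args to a command model; unrecognized input falls back to Help.
    public static CliCommand Parse(string[] args) => args switch
    {
        []                                    => new(CliCommandKind.Launch),
        ["--help"]                            => new(CliCommandKind.Help),
        ["config"]                            => new(CliCommandKind.ShowConfig),
        ["config", "set", var key, var value] => new(CliCommandKind.SetConfig, key, value),
        ["config", "reset"]                   => new(CliCommandKind.ResetConfig),
        _                                     => new(CliCommandKind.Help),
    };
}
```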
If the Ollama API is unreachable, the app reports the `NotReachable` state and does not crash.

The project is packaged as a .NET global tool:
```xml
<PackAsTool>true</PackAsTool>
<ToolCommandName>ollamamon</ToolCommandName>
```
Install via:

```
dotnet tool install --global ElBruno.OllamaMonitor
```
This places the executable on the user's PATH and creates the `ollamamon` command.
Verified in this phase:

- `dotnet build` succeeds
- `ollamamon config` works
- `ollamamon --help` works

Next Phase (Phase 2): MVVM framework, logging framework, unit tests, settings UI dialog, historical charts.