ElBruno.OllamaMonitor stores configuration in a JSON file that you can edit directly or modify via CLI commands.
```
%LOCALAPPDATA%\ElBruno\OllamaMonitor\settings.json
```

On Windows, `%LOCALAPPDATA%` typically expands to `C:\Users\<YourUsername>\AppData\Local`, so the full path is:

```
C:\Users\<YourUsername>\AppData\Local\ElBruno\OllamaMonitor\settings.json
```

The file is created automatically with default values on first run.
```json
{
  "endpoint": "http://localhost:11434",
  "refreshIntervalSeconds": 2,
  "startMinimizedToTray": true,
  "showFloatingWindowOnStart": false,
  "enableGpuMetrics": true,
  "enableDiskMetrics": true,
  "highCpuThresholdPercent": 80,
  "highMemoryThresholdGb": 16,
  "highGpuThresholdPercent": 85
}
```
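To read a single value out of the settings file, any JSON tool works; here is a minimal sketch using Python (assumed to be on PATH as `python3`), with the defaults piped inline for illustration:

```shell
# Extract one field from the settings JSON (defaults piped inline for illustration)
echo '{"endpoint": "http://localhost:11434", "refreshIntervalSeconds": 2}' \
  | python3 -c "import json, sys; print(json.load(sys.stdin)['endpoint'])"
# → http://localhost:11434
```

On a real install you would redirect the settings file into the pipeline instead of the inline `echo`.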
### endpoint

Type: string
Default: http://localhost:11434

The HTTP endpoint where Ollama is running. Modify this if Ollama listens on a non-default port or runs on a different machine.
Examples:
"endpoint": "http://localhost:11434" // Local default
"endpoint": "http://192.168.1.100:11434" // Remote machine
"endpoint": "http://ollama.local:11434" // DNS name
### refreshIntervalSeconds

Type: int (1 or greater)
Default: 2
How often (in seconds) the app polls the Ollama API and system metrics.
Examples:
"refreshIntervalSeconds": 1 // Frequent polling
"refreshIntervalSeconds": 5 // Moderate polling
"refreshIntervalSeconds": 10 // Low frequency
### startMinimizedToTray

Type: bool
Default: true
Whether the app starts minimized to the system tray (hidden from desktop).
- `true`: App launches hidden; use the tray icon to open the details window or mini monitor
- `false`: App launches with the details window visible

### showFloatingWindowOnStart

Type: bool
Default: false

Whether to show the standard details window automatically when the app starts.
See also: `startMinimizedToTray`.

### enableGpuMetrics

Type: bool
Default: true
Whether to attempt to collect NVIDIA GPU metrics.
- Requires `nvidia-smi` to be available on PATH
- Set to `false` to skip GPU polling entirely

### enableDiskMetrics

Type: bool
Default: true
Whether to collect disk read/write metrics for the Ollama process.
- Set to `false` to disable disk metric collection

### highCpuThresholdPercent

Type: double (0–100)
Default: 80
The CPU usage percentage threshold that triggers the Orange tray icon state (HighUsage).
When Ollama’s CPU usage exceeds this threshold and a model is loaded, the tray icon turns orange to indicate active, resource-intensive work.
Examples:
"highCpuThresholdPercent": 50 // Aggressive threshold
"highCpuThresholdPercent": 80 // Moderate (default)
"highCpuThresholdPercent": 95 // Conservative
### highMemoryThresholdGb

Type: double
Default: 16
The RAM usage (in GB) threshold that triggers the Orange tray icon state.
When Ollama’s memory usage exceeds this threshold, the tray icon turns orange.
Examples:
"highMemoryThresholdGb": 4 // Tight constraint
"highMemoryThresholdGb": 16 // Moderate (default)
"highMemoryThresholdGb": 32 // Permissive
### highGpuThresholdPercent

Type: double (0–100)
Default: 85
The GPU usage percentage threshold that triggers the Orange state.
When GPU utilization exceeds this threshold, the tray icon turns orange.
Examples:
"highGpuThresholdPercent": 70 // Aggressive
"highGpuThresholdPercent": 85 // Moderate (default)
"highGpuThresholdPercent": 99 // Conservative
In Phase 1, you can use `ollamamon config` commands to manage the endpoint and refresh interval:
```shell
# View current configuration
ollamamon config

# Change the Ollama endpoint
ollamamon config set endpoint http://192.168.1.100:11434

# Change refresh interval to 5 seconds
ollamamon config set refresh-interval 5

# Reset to default settings
ollamamon config reset
```
Note: To change thresholds, GPU metrics, or disk metrics settings, use Option 2 (direct file edit) below.
1. Open `%LOCALAPPDATA%\ElBruno\OllamaMonitor\settings.json`
2. Edit the JSON values
3. Save the file
Example:
```json
{
  "endpoint": "http://192.168.1.50:11434",
  "refreshIntervalSeconds": 3,
  "startMinimizedToTray": true,
  "showFloatingWindowOnStart": false,
  "enableGpuMetrics": true,
  "enableDiskMetrics": true,
  "highCpuThresholdPercent": 75,
  "highMemoryThresholdGb": 12,
  "highGpuThresholdPercent": 80
}
```
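A stray comma or missing quote in a hand-edited file makes the JSON unparseable. Before saving, you can sanity-check the syntax with Python's built-in `json.tool` module (a sketch, assuming Python is on PATH; a fragment is piped inline here in place of the real file):

```shell
# Valid JSON is echoed back pretty-printed; invalid JSON prints an error and exits non-zero
echo '{"endpoint": "http://192.168.1.50:11434", "refreshIntervalSeconds": 3}' \
  | python3 -m json.tool
```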
The tray icon color reflects the current state, which is partly determined by your threshold settings:
| State | Color | Trigger | CPU/RAM/GPU |
|---|---|---|---|
| NotReachable | Gray | Ollama API unreachable | — |
| Running | Green | API reachable, no model | Low |
| ModelLoaded | Blue | Model loaded | Low |
| HighUsage | Orange | Model running | Exceeds threshold |
| Error | Red | Unexpected error | — |
The thresholds you set control when the app transitions from ModelLoaded to HighUsage.
If the settings file is corrupted, the app will log an error and attempt to use defaults. To reset:
```shell
ollamamon config reset
```
This recreates the file with default values.
Configuration changes require an app restart: exit the app, then run `ollamamon` again to launch a fresh instance.

If you see "Endpoint unreachable" in the floating window:

- Verify Ollama is running: `ollama serve`
- Check the configured endpoint: `ollamamon config`
- Test connectivity: `curl http://localhost:11434/api/version` (or your configured endpoint)

GPU metrics may show as unavailable. This is normal if:

- `nvidia-smi` is not installed or not on PATH

To enable GPU metrics:
```shell
ollamamon config set gpu-metrics true
```
Then verify `nvidia-smi` is available:

```shell
nvidia-smi
```
If `nvidia-smi` is not found, install NVIDIA drivers from nvidia.com.
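To check whether the GPU path will work at all, you can ask `nvidia-smi` for a minimal CSV report; the query flags below are standard `nvidia-smi` options, and the fallback message covers machines where the tool is missing:

```shell
# Print GPU name and utilization if nvidia-smi works; otherwise report that it is missing
nvidia-smi --query-gpu=name,utilization.gpu --format=csv 2>/dev/null \
  || echo "nvidia-smi not available"
```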
To balance responsiveness against polling overhead, raise the refresh interval:

```shell
ollamamon config set refresh-interval 10
```
To monitor Ollama running on another machine:
```shell
ollamamon config set endpoint http://<remote-ip>:11434
```

Then run `ollamamon`.

Note: Remote monitoring is best-effort in Phase 1. For production use, consider Phase 2 features.
Next Steps: