Symptom: Running ollamamon does nothing, or the app crashes immediately.
Solutions:
dotnet --version
If not installed, download .NET 10 from dotnet.microsoft.com.
%LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\
Look for error messages.
Run from source to see console output:
dotnet run --project src/ElBruno.OllamaMonitor/ 2>&1
Reinstall the tool:
dotnet tool uninstall --global ElBruno.OllamaMonitor
dotnet tool install --global ElBruno.OllamaMonitor
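If you want to script the .NET check, here is a small Python sketch (not part of the app); it assumes `dotnet --version` prints a plain version string like `10.0.100`:

```python
import re
import subprocess

def dotnet_major_version(version_text: str) -> int:
    """Parse `dotnet --version` output (e.g. '10.0.100') into its major version."""
    match = re.match(r"(\d+)\.", version_text.strip())
    if not match:
        raise ValueError(f"Unrecognized version string: {version_text!r}")
    return int(match.group(1))

def check_dotnet(required_major: int = 10) -> bool:
    """Return True if the installed .NET SDK meets the required major version."""
    try:
        out = subprocess.run(["dotnet", "--version"],
                             capture_output=True, text=True, check=True)
    except (FileNotFoundError, subprocess.CalledProcessError):
        return False  # dotnet missing or broken
    return dotnet_major_version(out.stdout) >= required_major
```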
Symptom: App runs but no icon in system tray.
Solutions:
Check Task Manager for an ElBruno.OllamaMonitor process. If it is running but no icon appears, check the system tray overflow area (the ^ arrow next to the clock).
Check Windows Event Viewer (eventvwr.msc) for application errors.

Symptom: The tray icon is always gray, even though Ollama is running.
Solutions:
curl http://localhost:11434/api/version
If this fails, Ollama is not running. Start it:
ollama serve
ollamamon config
Look for the endpoint value. Verify it matches where Ollama is running.
ollamamon config set endpoint http://<remote-ip>:11434
# Close it from the tray menu (Exit)
# Then launch again:
ollamamon
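The reachability check above can also be scripted. This is an illustrative Python sketch (not part of the app), assuming the default endpoint http://localhost:11434:

```python
import urllib.request
import urllib.error

def is_ollama_up(endpoint: str = "http://localhost:11434",
                 timeout: float = 2.0) -> bool:
    """Return True if the Ollama /api/version endpoint answers within the timeout."""
    try:
        with urllib.request.urlopen(f"{endpoint}/api/version",
                                    timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: treat as "not running".
        return False
```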
Symptom: Tray icon doesn’t change even when you load/unload models or change resource usage.
Solutions:
ollamamon config
Look for refreshIntervalSeconds. Try lowering it so the app polls more often:
ollamamon config set refresh-interval 1
Restart the app: exit it from the tray menu, then run ollamamon again.
Check %LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\ and look for warnings or errors.

Symptom: You click the tray icon but the floating details window doesn’t appear.
Solutions:
ollamamon config
Look for showFloatingWindowOnStart. Try:
ollamamon config set show-floating-window true
Symptom: The floating window shows “GPU: N/A” even though you have an NVIDIA GPU.
Solutions:
ollamamon config
Look for enableGpuMetrics. If it’s false, enable it:
ollamamon config set enable-gpu-metrics true
nvidia-smi
If not found, install or update NVIDIA drivers from nvidia.com.
nvidia-smi --query-gpu=name,utilization.gpu,memory.used,memory.total --format=csv,noheader,nounits
If this command fails, the issue is with your NVIDIA drivers, not the app.
The app expects nvidia-smi on PATH. If it’s installed somewhere else, add its directory to PATH.

Symptom: App shows errors about invalid JSON or settings.
Solution:
Reset to defaults:
ollamamon config reset
This deletes and recreates the settings file with default values.
If you want to manually edit, open:
%LOCALAPPDATA%\ElBruno\OllamaMonitor\settings.json
Verify it’s valid JSON. Use a JSON validator at jsonlint.com if unsure.
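If you prefer to script the check, this Python sketch shows the validate-or-reset idea. The DEFAULTS values here are assumptions for illustration; the app's real defaults may differ:

```python
import json
from pathlib import Path

# Assumed defaults for illustration; the app's real defaults may differ.
DEFAULTS = {"endpoint": "http://localhost:11434", "refreshIntervalSeconds": 2}

def load_settings(path: Path) -> dict:
    """Load settings.json, falling back to defaults if missing or invalid."""
    try:
        data = json.loads(path.read_text(encoding="utf-8"))
        if not isinstance(data, dict):
            raise ValueError("settings root must be a JSON object")
    except (FileNotFoundError, ValueError):
        # Same effect as `ollamamon config reset`: start from defaults.
        return dict(DEFAULTS)
    return {**DEFAULTS, **data}  # fill any missing keys with defaults
```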
Symptom: Running ollamamon config set endpoint ... returns an error or shows nothing.
Solutions:
# Correct:
ollamamon config set endpoint http://localhost:11434
ollamamon config set refresh-interval 2
# Incorrect (missing arguments):
ollamamon config set endpoint
ollamamon config set
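The rule above (a key and a value are both required) can be sketched as a small validator. The key list is taken from the settings mentioned in this guide and may not match the app's actual CLI exactly:

```python
# Keys mentioned in this guide; the real CLI's list may differ.
VALID_KEYS = {"endpoint", "refresh-interval", "show-floating-window",
              "enable-gpu-metrics", "enable-disk-metrics"}

def validate_config_set(args):
    """Return an error message for a `config set` invocation, or None if OK."""
    if len(args) != 2:
        return "usage: ollamamon config set <key> <value>"
    key, _value = args
    if key not in VALID_KEYS:
        return f"unknown setting: {key}"
    return None
```

A real CLI would also validate the value's type per key; this only checks the shape of the invocation.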
Check that the setting key is valid (for example endpoint or refresh-interval).
Inspect the settings directory and current configuration:
cd %LOCALAPPDATA%\ElBruno\OllamaMonitor
ollamamon config
Check %LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\ for error details.

Symptom: The app is using 20%+ CPU even when idle.
Solutions:
Increase the refresh interval:
ollamamon config set refresh-interval 10
Higher interval = less frequent polling = lower CPU.
ollamamon config set enable-gpu-metrics false
GPU polling can be expensive; try disabling it temporarily.
Also try disabling disk metrics:
ollamamon config set enable-disk-metrics false
Check if Ollama process itself is using CPU:
Open Task Manager and look at the ollama process. If it’s using high CPU, it’s not an issue with this app.
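The math behind the interval advice is simple; a tiny illustrative sketch:

```python
def polls_per_minute(interval_seconds: float) -> float:
    """Polling frequency implied by the refresh interval."""
    if interval_seconds <= 0:
        raise ValueError("interval must be positive")
    return 60.0 / interval_seconds

# Raising the interval from 2s to 10s cuts polling (and the CPU spent on it) 5x:
# polls_per_minute(2) == 30.0, polls_per_minute(10) == 6.0
```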
Symptom: The app works for a while, then suddenly closes.
Solutions:
Check logs immediately after crash:
Open %LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\ and look for the last entries. They should indicate the error.
ollamamon config set refresh-interval 5
If a bug only triggers during fast polling, reducing frequency can help.
Disable GPU metrics temporarily to rule out the GPU poller:
ollamamon config set enable-gpu-metrics false
Check Windows Event Viewer (eventvwr.msc) for crash details.
Reinstall the tool:
dotnet tool uninstall --global ElBruno.OllamaMonitor
dotnet tool install --global ElBruno.OllamaMonitor
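To grab the tail of the newest log quickly after a crash, a throwaway Python sketch like this can help (it assumes the log files use a .log extension):

```python
from pathlib import Path

def tail_newest_log(log_dir: Path, lines: int = 20) -> list:
    """Return the last `lines` lines of the most recently modified .log file."""
    logs = sorted(log_dir.glob("*.log"), key=lambda p: p.stat().st_mtime)
    if not logs:
        return []  # no log files found
    text = logs[-1].read_text(encoding="utf-8", errors="replace")
    return text.splitlines()[-lines:]
```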
Symptom: Tray icon flashes red or you see “Timeout” in the floating window.
Solutions:
Check that Ollama responds promptly:
curl http://localhost:11434/api/version
Review the configured endpoint:
ollamamon config
If you build from source, you can raise the HTTP timeout. In App.xaml.cs, find:
_httpClient.Timeout = TimeSpan.FromSeconds(5);
Increase the value (e.g., to 10 seconds) if your network is slow.
If Ollama runs on a remote machine, check network latency:
ping <ollama-ip>

Symptom: When installing, you see a license or validation error.
Solution:
Ensure you’re using .NET 10:
dotnet --version
Then try installing again:
dotnet tool install --global ElBruno.OllamaMonitor
If still failing, check your NuGet configuration:
dotnet nuget list source
Check the logs at %LOCALAPPDATA%\ElBruno\OllamaMonitor\logs\.
For more detail, edit the app code to add more WriteInfo() calls in DiagnosticsLogService.
# Test connectivity
curl http://localhost:11434/api/version
# Get loaded models
curl http://localhost:11434/api/tags
# Get running processes
curl http://localhost:11434/api/ps
If any of these fail, Ollama isn’t responding. Restart it and try again.
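To check the responses programmatically, you can parse the JSON that /api/tags returns. The sample body below is abridged but matches the documented models/name shape:

```python
import json

def model_names(tags_json: str) -> list:
    """Extract model names from an Ollama /api/tags response body."""
    payload = json.loads(tags_json)
    return [m["name"] for m in payload.get("models", [])]

# Abridged example of an /api/tags response body:
sample = '{"models": [{"name": "llama3.2:latest"}, {"name": "phi4:latest"}]}'
```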
# List GPU info
nvidia-smi
# Test with same format as app
nvidia-smi --query-gpu=name,utilization.gpu,memory.used,memory.total --format=csv,noheader,nounits
If this fails, update your NVIDIA drivers.
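If you want to consume that CSV output in a script, here is a Python sketch of the parsing (field order as in the command above; splitting from the right keeps GPU names with spaces intact):

```python
def parse_gpu_csv(line: str) -> dict:
    """Parse one line of `nvidia-smi --query-gpu=name,utilization.gpu,
    memory.used,memory.total --format=csv,noheader,nounits` output."""
    # rsplit from the right: the three numeric fields never contain commas.
    name, util, used, total = [f.strip() for f in line.rsplit(",", 3)]
    return {
        "name": name,
        "utilization_pct": int(util),
        "memory_used_mib": int(used),
        "memory_total_mib": int(total),
    }
```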
Questions? See the FAQ in README or Development Guide.