# Getting Started
Alpaka Desktop is a native Linux desktop client for Ollama. It connects directly to your local (or remote) Ollama instance and provides a full chat UI with streaming, model management, and advanced generation controls.
## Prerequisites
Before launching Alpaka Desktop, you need Ollama installed and running:
```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Start the service
ollama serve

# Pull your first model
ollama pull llama3.2
```

Ollama listens on `http://localhost:11434` by default. Alpaka Desktop connects there automatically on first launch.
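If you want to confirm that endpoint is live before opening the app, you can query Ollama's HTTP API directly. A minimal sketch using Ollama's standard `/api/version` endpoint:

```bash
# Quick sanity check: does Ollama answer on the default port?
if curl -fsS --max-time 2 http://localhost:11434/api/version >/dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not running; start it with: ollama serve"
fi
```

Alpaka Desktop talks to this same endpoint, so if the check passes, the host indicator in the app should show green.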
## Installation
See the landing page for all installation options (AppImage, APT, AUR, source build).
## First Launch
- Open Alpaka Desktop. The app icon appears in your system tray.
- Check the host connection. A green dot next to the host name in the top bar means Ollama is reachable. If you see a red dot, go to Settings → Hosts and verify the URL (`http://localhost:11434`).
- Select a model. Click the model selector in the top bar and choose a model you have pulled. If the list is empty, go to Models → Library to pull one.
- Start chatting. Type a message and press Enter to send. Tokens stream in real time.
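The chat endpoint behind those steps can also be exercised from the command line. A hedged sketch using Ollama's standard `/api/chat` route, with `llama3.2` as an example model name:

```bash
# Request body for one chat turn (llama3.2 is just an example model)
payload='{"model": "llama3.2", "messages": [{"role": "user", "content": "Say hello in one sentence."}]}'

# By default Ollama streams the reply as newline-delimited JSON objects,
# which is what the chat area renders token by token
curl -s http://localhost:11434/api/chat -d "$payload" || echo "Ollama is not reachable"
```

Adding `"stream": false` to the body returns a single final JSON object instead of a token stream.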
## Interface Overview
*(Screenshot: the main window, showing the sidebar with date-grouped conversation history, the top bar with New Chat, Models, and Settings buttons plus the model selector (`qwen3-vl:4b`), and the message input at the bottom.)*
- Sidebar — conversation history, organized by date. Click a conversation to open it. Use `Ctrl+/` to toggle.
- Top bar — model selector, host indicator, and settings button.
- Chat area — messages render Markdown, code blocks, math (LaTeX), and thinking blocks.
- Input — `Enter` sends, `Shift+Enter` inserts a newline.
## What's Next
- Chat features — streaming, thinking blocks, attachments, export
- Model management — pull, tag, create custom models
- Settings — understand the three-layer settings system
