Getting Started (v0.4.0)

Welcome! This guide will get you up and running with Penguin v0.4.0, featuring advanced conversation management, real-time streaming, and comprehensive checkpoint capabilities.


1. Install

Core Installation

uv tool install penguin-ai # recommended: installs the main CLI/TUI entrypoints

# Alternative: plain pip still works
pip install penguin-ai # Python package + CLI/TUI + web runtime

Feature-Rich Installation

pip install "penguin-ai[all]" # Optional extras for memory/browser/legacy UI features

Installation Options

  • pip install "penguin-ai[web]" – compatibility alias; base install already includes the web runtime
  • pip install "penguin-ai[tui]" – compatibility alias; base install already includes the OpenCode-style TUI launcher
  • pip install "penguin-ai[legacy_tui]" – installs the older Textual prototype / experimental UI
  • pip install "penguin-ai[minimal]" – lightweight library-only installation
  • pip install "penguin-ai[dev]" – development dependencies for contributors

What's Included

Core Features:

  • Interactive TUI launcher via penguin, ptui, or penguin-tui
  • Headless/scriptable CLI via penguin-cli
  • Python API for programmatic use
  • Advanced conversation management with checkpoints
  • Real-time streaming with event-driven architecture
  • SQLite-backed project and task management
  • Enhanced token management with category-based budgets
  • Runtime model switching capabilities

2. Verify Installation

penguin --version # prints version string
penguin --help # shows the default launcher options
ptui --help # direct TUI alias
penguin-cli --help # headless/scriptable CLI entrypoint

Check Configuration

penguin-cli config check # validates required keys are present
penguin config debug # prints extended diagnostic info
penguin config edit # open the config file in your editor

3. First Steps

Enhanced Interactive Chat

penguin # launches the default terminal UI
ptui # same TUI, explicit alias
penguin-tui # same TUI, explicit alias matching the split runtime naming

Use penguin-cli for non-interactive automation and scripts:

penguin-cli -p "Help me debug this Python function" # one-off prompt
penguin-cli chat # explicit synonym for interactive chat
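The headless entrypoint above can also be driven from your own scripts. A minimal sketch using subprocess (the build_prompt_cmd and ask_penguin helpers are illustrative, not part of the Penguin API):

```python
import shutil
import subprocess

def build_prompt_cmd(prompt: str) -> list[str]:
    """Build the one-off prompt invocation shown above."""
    return ["penguin-cli", "-p", prompt]

def ask_penguin(prompt: str) -> str:
    """Run the prompt headlessly and return stdout (requires penguin-ai on PATH)."""
    if shutil.which("penguin-cli") is None:
        raise RuntimeError("penguin-cli not found; install with: pip install penguin-ai")
    result = subprocess.run(build_prompt_cmd(prompt), capture_output=True, text=True, check=True)
    return result.stdout
```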

Model Management

Model subcommands are not yet exposed via CLI. Use defaults in config for now.

Advanced Project & Task Management

# Create a project
penguin-cli project create "AI Assistant Demo" -d "Demonstrating new features"

# List projects (note the ID from the table)
penguin-cli project list

# Create a task for that project (replace <PROJECT_ID>)
penguin-cli project task create <PROJECT_ID> "Implement streaming chat"

# View tasks (optionally filtered by project)
penguin-cli project task list [<PROJECT_ID>]

Enhanced Web Interface (optional)

pip install "penguin-ai[web]" # compatibility alias; base install already includes this runtime
penguin-web # Web/API server at http://127.0.0.1:9000
# API docs at http://127.0.0.1:9000/api/docs

TUI Runtime Notes

The terminal UI now lives in the OpenCode-derived penguin-tui/ workspace inside this repository.

  • In a source checkout, Penguin prefers local penguin-tui/packages/opencode sources.
  • Outside a source checkout, it can bootstrap a cached sidecar binary under ~/.cache/penguin/tui.
  • Advanced overrides: PENGUIN_OPENCODE_DIR, PENGUIN_TUI_BIN_PATH, and PENGUIN_TUI_RELEASE_URL.

Programmatic API with Streaming

import asyncio
from penguin.web.app import PenguinAPI

async def main():
    # Initialize with enhanced features
    api = PenguinAPI()

    # Chat with streaming and checkpointing
    response = await api.chat(
        "Help me refactor this code",
        streaming=True,
        conversation_id="my-session"
    )

    # Create checkpoints programmatically
    checkpoint_id = await api.core.create_checkpoint(
        name="Before refactoring",
        description="Saving state before major changes"
    )

asyncio.run(main())

FastAPI Integration

from fastapi import FastAPI
from penguin.web.app import create_app

# Embed Penguin in your FastAPI application
app = create_app()
# Penguin API is now available at /api/v1/

4. Enhanced Configuration

Environment Variables

# API Keys
ANTHROPIC_API_KEY=your_key_here
OPENAI_API_KEY=your_key_here
OPENROUTER_API_KEY=your_key_here

# Model Configuration
PENGUIN_DEFAULT_MODEL=openai/gpt-5
PENGUIN_CLIENT_PREFERENCE=native # native, litellm, or openrouter
PENGUIN_STREAMING_ENABLED=true

# Performance Settings
PENGUIN_FAST_STARTUP=true
PENGUIN_MAX_TOKENS=400000
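When reading these variables from your own scripts, note that flags like PENGUIN_STREAMING_ENABLED arrive as plain strings. A small sketch of interpreting them (the env_flag helper is illustrative, not part of Penguin):

```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret a 'true'/'false'-style environment variable as a boolean."""
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes", "on")
```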

Execution Root (Project vs Workspace)

Penguin separates where it edits code from where it stores assistant state:

  • Project root: your current repository (CWD/git root) for file edits, shell commands, diffs, and analysis.
  • Workspace root: Penguin’s own data (conversations, notes, logs, memory) at WORKSPACE_PATH.

Control which root file tools operate on:

  • CLI flag: --root project|workspace (applies for the current run)
  • Env var: PENGUIN_WRITE_ROOT=project|workspace (overrides config)
  • Config default: defaults.write_root: project (set to workspace to sandbox writes)
  • Fallback default: project
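The precedence above can be read as a small resolver: CLI flag first, then the environment variable, then the config default, then the hard-coded fallback. A sketch (resolve_write_root is illustrative only, not a Penguin function):

```python
import os

def resolve_write_root(cli_flag=None, config_default=None):
    """Resolve the active execution root using the documented precedence:
    --root flag > PENGUIN_WRITE_ROOT env var > defaults.write_root > 'project'."""
    for value in (cli_flag, os.environ.get("PENGUIN_WRITE_ROOT"), config_default):
        if value in ("project", "workspace"):
            return value
    return "project"
```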

On startup the CLI prints the active execution root, e.g.

Execution root: project (/path/to/your/repo)

Set workspace as the default once:

penguin config set defaults.write_root workspace

Advanced Configuration File

Create config.yml for comprehensive settings:

model:
  default: anthropic/claude-3-5-sonnet
  client_preference: openrouter
  streaming_enabled: true
  max_context_window_tokens: 200000

performance:
  fast_startup: true
  diagnostics_enabled: true

checkpoints:
  enabled: true
  frequency: 1
  retention:
    keep_all_hours: 24
    max_age_days: 30

memory:
  indexing_enabled: true
  providers:
    - chroma
    - lance
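The two retention knobs can be read as: keep every checkpoint newer than keep_all_hours, and discard anything older than max_age_days. A sketch of that rule (illustrative; Penguin's actual retention policy may apply additional thinning between those bounds):

```python
from datetime import datetime, timedelta

def should_keep(created_at, now, keep_all_hours=24, max_age_days=30):
    """Illustrative retention check mirroring the config above."""
    age = now - created_at
    if age <= timedelta(hours=keep_all_hours):
        return True  # inside the keep-everything window
    return age <= timedelta(days=max_age_days)  # older checkpoints survive until max age
```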

Next Steps


Troubleshooting

Common Issues:

  1. Command not found: Ensure you installed with pip install penguin-ai, not the minimal option
  2. Web interface not starting: Base install already includes the runtime; pip install "penguin-ai[web]" is only a compatibility alias. Then check that port 9000 is free.
  3. API errors: Verify API keys and use penguin config debug for configuration issues
  4. Streaming not working: Ensure your model supports streaming and check client preference settings
  5. Checkpoint errors: Check disk space and file permissions for the workspace directory
  6. Performance issues: Try fast_startup=true in config or use penguin profile / penguin perf-test for profiling
  7. TUI bootstrap problems: If sidecar download or source auto-detection fails, set PENGUIN_OPENCODE_DIR or PENGUIN_TUI_BIN_PATH explicitly.

Getting Help

  • GitHub Issues: Report bugs and feature requests
  • Documentation: Check the System Documentation for advanced configuration
  • Performance: Use penguin profile or penguin perf-test for system performance insights
  • Configuration: Run penguin config check or penguin config debug to inspect your settings. You can also review your config.yml with penguin config edit

Quick Diagnostic Commands:

penguin --version # Check version
penguin --help # CLI options
penguin profile # Profile startup and save a report
penguin perf-test # Benchmark startup performance
penguin config debug # Extended config + environment diagnostics