rockfish chat

Start an AI-powered chat server for conversational network security analysis.

Overview

The chat server provides a web-based conversational interface for querying network security data using natural language. It integrates with MCP for live data queries and supports pluggable LLM backends.

Usage

rockfish chat [OPTIONS]

LLM Modes

Mode            Description
slm (default)   Local small language model via Ollama
cloud           Cloud LLM (OpenAI, Anthropic)
hybrid          Try the local SLM first, fall back to cloud

# Local SLM mode
rockfish chat --mode slm

# Cloud mode
rockfish chat --mode cloud

# Hybrid mode
rockfish chat --mode hybrid

Data Modes

Mode             Description
cache (default)  Query local pre-filtered Parquet files
store            Query cold storage via MCP
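
The two data modes map to invocations like the following. All flags are taken from the options table below; the paths and endpoint values are only the documented defaults, shown for illustration:

```shell
# Cache mode (default): read pre-filtered Parquet files from local disk
rockfish chat --data-mode cache --data-dir ./output --sensor sensor

# Store mode: query cold storage through the MCP server
rockfish chat --data-mode store --mcp-endpoint http://localhost:3000
```

Cache mode needs no running MCP server, which makes it the simpler choice for offline analysis of already-exported data.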

Options

Option           Default                                     Description
-c, --config     —                                           Chat configuration file (chat.yaml)
--host           127.0.0.1                                   HTTP server host
--port           8082                                        HTTP server port
--mode           slm                                         LLM mode: slm, cloud, or hybrid
--data-mode      cache                                       Data mode: cache or store
--mcp-endpoint   http://localhost:3000                       MCP server endpoint
--slm-endpoint   http://localhost:8081/v1/chat/completions   Local SLM endpoint
--data-dir       ./output                                    Data directory for cache mode
--sensor         sensor                                      Sensor name for cache mode
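
The schema of the `--config` file is not documented here. A hypothetical chat.yaml that mirrors the CLI defaults might look like this (every key name below is an assumption, not a confirmed schema):

```yaml
# Hypothetical chat.yaml — key names are assumptions mirroring the CLI flags
host: 127.0.0.1
port: 8082
mode: slm            # slm | cloud | hybrid
data_mode: cache     # cache | store
mcp_endpoint: http://localhost:3000
slm_endpoint: http://localhost:8081/v1/chat/completions
data_dir: ./output
sensor: sensor
```

If a key conflicts with a flag given on the command line, the usual convention is for the flag to win; verify this against the actual implementation.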

Features

  • Session management with state persistence
  • Security guardrails and response caching
  • MCP integration for live data queries
  • Natural language to SQL translation
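
Once the server is running, a client can exercise the natural-language interface over HTTP. The endpoint path and JSON fields below are hypothetical placeholders (the actual API is not documented in this section); the sketch only illustrates the flow of sending a question and receiving an answer backed by the SQL translation layer:

```shell
# Hypothetical request — the /api/chat path and payload fields are assumptions
curl -s http://127.0.0.1:8082/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"session_id": "demo", "message": "Which hosts produced the most DNS traffic in the last hour?"}'
```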

Examples

# Start chat server with local SLM
rockfish chat --mode slm \
  --data-dir /data/rockfish --sensor prod-01

# Start with MCP integration
rockfish chat --mode cloud \
  --mcp-endpoint http://localhost:3000 \
  --host 0.0.0.0 --port 8082