Start an AI-powered chat server for conversational network security analysis.
The chat server provides a web-based conversational interface for querying
network security data using natural language. It integrates with MCP for
live data queries and supports pluggable LLM backends.
```shell
rockfish chat [OPTIONS]
```
| Mode | Description |
| --- | --- |
| `slm` (default) | Local small language model via Ollama |
| `cloud` | Cloud LLM (OpenAI, Anthropic) |
| `hybrid` | Try the local SLM first, fall back to the cloud |
```shell
# Local SLM mode
rockfish chat --mode slm

# Cloud mode
rockfish chat --mode cloud

# Hybrid mode
rockfish chat --mode hybrid
```
| Mode | Description |
| --- | --- |
| `cache` (default) | Query local pre-filtered Parquet files |
| `store` | Query via MCP cold storage |
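Either data mode combines with the options documented below; for example, using only the flags listed in this reference:

```shell
# Cache mode (the default): read pre-filtered Parquet files from disk
rockfish chat --data-mode cache --data-dir ./output --sensor sensor

# Store mode: route queries through the MCP server's cold storage
rockfish chat --data-mode store --mcp-endpoint http://localhost:3000
```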
| Option | Default | Description |
| --- | --- | --- |
| `-c, --config` | — | Chat configuration file (`chat.yaml`) |
| `--host` | `127.0.0.1` | HTTP server host |
| `--port` | `8082` | HTTP server port |
| `--mode` | `slm` | LLM mode: `slm`, `cloud`, or `hybrid` |
| `--data-mode` | `cache` | Data mode: `cache` or `store` |
| `--mcp-endpoint` | `http://localhost:3000` | MCP server endpoint |
| `--slm-endpoint` | `http://localhost:8081/v1/chat/completions` | Local SLM endpoint |
| `--data-dir` | `./output` | Data directory for cache mode |
| `--sensor` | `sensor` | Sensor name for cache mode |
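A `chat.yaml` passed via `-c, --config` might mirror the CLI defaults above. The key names in this sketch are illustrative assumptions, not a documented schema:

```yaml
# Hypothetical chat.yaml — key names are assumed, not documented
host: 127.0.0.1
port: 8082
mode: slm            # slm | cloud | hybrid
data_mode: cache     # cache | store
mcp_endpoint: http://localhost:3000
slm_endpoint: http://localhost:8081/v1/chat/completions
data_dir: ./output
sensor: sensor
```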
- Session management with state persistence
- Security guardrails and response caching
- MCP integration for live data queries
- Natural language to SQL translation
```shell
# Start chat server with local SLM
rockfish chat --mode slm \
  --data-dir /data/rockfish --sensor prod-01

# Start with MCP integration
rockfish chat --mode cloud \
  --mcp-endpoint http://localhost:3000 \
  --host 0.0.0.0 --port 8082
```