Interactive REPL

The TRI CLI includes a full interactive REPL (Read-Eval-Print Loop) for continuous chat, code generation, and SWE agent operations without restarting the binary.

Starting the REPL

# Via zig build
zig build tri

# Or via the binary directly
./zig-out/bin/tri

When launched without arguments, TRI enters interactive mode:

═══════════════════════════════════════════════════════
TRI CLI v3.0.0 — Trinity Unified CLI
100% Local AI | Code | Chat | SWE | Swarm
phi^2 + 1/phi^2 = 3 = TRINITY
═══════════════════════════════════════════════════════

Mode: Chat | Language: Zig | Verbose: off
Type a message or use /command

tri>

REPL Commands

All REPL commands are prefixed with /. Typing text without a prefix sends it as a message in the current mode.
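The slash-prefix dispatch rule can be sketched as follows (a toy illustration; the function and state names are assumptions for this sketch, not TRI internals):

```python
# Sketch of the slash-command dispatch rule: "/..." is a command,
# anything else is a message sent in the current mode.
def dispatch(line, state):
    line = line.strip()
    if line.startswith("/"):
        command = line.split()[0][1:]  # e.g. "/code" -> "code"
        return ("command", command)
    return ("message", state.get("mode", "Chat"), line)
```

For example, `dispatch("/code", {})` yields a command tuple, while plain text is routed as a message in the active mode.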

Mode Switching

| Command | Mode | Description |
| --- | --- | --- |
| /chat | Chat | General-purpose AI conversation |
| /code | Code | Code generation from natural language prompts |
| /fix | BugFix | Detect and suggest fixes for bugs |
| /explain | Explain | Explain code structure or concepts |
| /test | Test | Generate comprehensive test suites |
| /doc | Document | Generate API documentation |
| /refactor | Refactor | Suggest refactoring improvements |
| /reason | Reason | Chain-of-thought reasoning mode |

Example:

tri> /code
Mode switched to: Code Generation

tri> Write a function that computes Fibonacci numbers
[AI generates Fibonacci function in current language]

Language Switching

| Command | Language | Description |
| --- | --- | --- |
| /zig | Zig | Set output language to Zig (default) |
| /python | Python | Set output language to Python |
| /rust | Rust | Set output language to Rust |
| /js | JavaScript | Set output language to JavaScript |
| /javascript | JavaScript | Alias for /js |

The language setting affects code generation, test generation, and documentation output.

Example:

tri> /python
Language switched to: Python

tri> /code
tri> Implement a binary search tree
[AI generates Python BST implementation]

Utility Commands

| Command | Description |
| --- | --- |
| /stats | Display session statistics |
| /verbose | Toggle verbose output on/off |
| /help | Show REPL command reference |
| /quit | Exit the REPL |
| /exit | Exit the REPL (alias for /quit) |
| /q | Exit the REPL (alias for /quit) |

Session Statistics

The /stats command displays accumulated metrics for the current session:

tri> /stats

Session Statistics
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
SWE Requests: 12
Chat Queries: 34
TVC Hits: 8
TVC Misses: 26
LLM Calls: 18
Vision Calls: 2
STT Calls: 1
Context Messages: 47
Summarizations: 3

Tracked metrics:

| Metric | Description |
| --- | --- |
| SWE Requests | Total fix/explain/test/doc/refactor/reason invocations |
| Chat Queries | Total chat messages sent |
| TVC Hits | Responses served from TVC corpus cache |
| TVC Misses | Responses requiring fresh LLM generation |
| LLM Calls | Total API calls to language models |
| Vision Calls | Image analysis requests (via --image) |
| STT Calls | Speech-to-text invocations (via --voice) |
| Context Messages | Messages in sliding context window |
| Summarizations | Context window overflow summarizations |

State Management

The REPL maintains a CLIState struct across the session:

| Field | Default | Description |
| --- | --- | --- |
| Mode | Chat | Current operating mode |
| Language | Zig | Current output language |
| Verbose | Off | Detailed output toggle |
| Stream | Off | Streaming output toggle |
| TVC Corpus | Loaded | 10,000-entry ternary vector cache |

TVC Corpus Persistence

The TVC (Ternary Vector Corpus) is loaded at startup and saved on exit:

  • File: trinity_chat.tvc
  • Capacity: 10,000 entries
  • Vector dimension: 1,000 trits
  • Threshold: φ⁻¹ ≈ 0.618

When you chat, responses are cached in TVC. Subsequent similar queries return cached responses instantly.
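A threshold-based similarity cache of this kind can be sketched as follows (a toy illustration only; the embedding scheme, class, and method names here are assumptions, and the real TVC uses 1,000-trit vectors persisted to trinity_chat.tvc):

```python
PHI_INV = 0.618  # golden-ratio threshold, phi^-1

def ternary_embed(text, dim=16):
    """Hash words into a small ternary (-1/0/+1) vector (toy scheme)."""
    vec = [0] * dim
    for word in text.lower().split():
        h = sum(ord(c) for c in word)
        vec[h % dim] += 1 if h % 2 == 0 else -1
    return [max(-1, min(1, v)) for v in vec]

def similarity(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

class TVCache:
    def __init__(self, capacity=10_000, threshold=PHI_INV):
        self.entries = []  # list of (vector, response) pairs
        self.capacity = capacity
        self.threshold = threshold
        self.hits = self.misses = 0

    def lookup(self, query):
        """Return a cached response if a stored query is similar enough."""
        qv = ternary_embed(query)
        for vec, response in self.entries:
            if similarity(qv, vec) >= self.threshold:
                self.hits += 1
                return response
        self.misses += 1
        return None

    def store(self, query, response):
        if len(self.entries) < self.capacity:
            self.entries.append((ternary_embed(query), response))
```

A query whose vector clears the φ⁻¹ similarity threshold against a stored entry is answered from the cache (a TVC hit); anything below it falls through to the LLM (a miss).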

Multi-Modal Support

The REPL supports multi-modal input through the chat command flags:

# In REPL mode, use chat with flags:
tri> chat --image photo.jpg "What's in this image?"
tri> chat --voice recording.wav "Transcribe this"
tri> chat --stream "Explain quantum computing"

| Flag | Description |
| --- | --- |
| --stream | Enable streaming (typing effect) output |
| --image <path> | Attach image for vision analysis |
| --voice <path> | Attach audio for speech-to-text |

Provider Configuration

The REPL auto-detects available API keys from environment variables:

| Variable | Provider | Priority |
| --- | --- | --- |
| GROQ_API_KEY | Groq | 1 (fastest) |
| ANTHROPIC_API_KEY | Claude | 2 |
| OPENAI_API_KEY | OpenAI | 3 |

If no API key is set, TRI falls back to the local GGUF model:

  • Default path: models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf
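The detection order can be sketched as a simple first-match scan over the documented priority list (an illustrative sketch; the function name is an assumption, not TRI's actual API):

```python
import os

# Providers in documented priority order: Groq first, then Claude, then OpenAI.
PROVIDERS = [
    ("GROQ_API_KEY", "Groq"),
    ("ANTHROPIC_API_KEY", "Claude"),
    ("OPENAI_API_KEY", "OpenAI"),
]

def detect_provider(env=None):
    """Return the first provider whose API key is set, else the local model."""
    env = os.environ if env is None else env
    for var, name in PROVIDERS:
        if env.get(var):
            return name
    return "local GGUF (models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf)"
```

With both GROQ_API_KEY and OPENAI_API_KEY set, Groq wins because it appears first in the priority list.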

Tips

  • Quick mode switch + query: Type /code then your prompt on the next line
  • Direct chat: Just type any text without / to send in current mode
  • History: The REPL maintains a sliding context window of 20 messages
  • Context overflow: When context exceeds the window, older messages are summarized automatically
  • Exit: Use /quit, /exit, /q, or Ctrl+C
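The sliding-window-with-summarization behavior in the tips above can be sketched as follows (a toy illustration; the class name and the drop-half summarization strategy are assumptions, not TRI's actual implementation):

```python
# Sketch of a sliding context window: when the message count exceeds the
# limit, the older half is collapsed into a summary and dropped.
class ContextWindow:
    def __init__(self, limit=20):
        self.limit = limit
        self.messages = []
        self.summary = None
        self.summarizations = 0  # mirrors the /stats "Summarizations" counter

    def add(self, message):
        self.messages.append(message)
        if len(self.messages) > self.limit:
            half = self.messages[: self.limit // 2]
            self.summary = f"[summary of {len(half)} earlier messages]"
            self.messages = self.messages[self.limit // 2 :]
            self.summarizations += 1
```

Each overflow pass shrinks the live window and increments the summarization count, which is what /stats reports as Summarizations.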

See Also