SDK API

High-level developer API for Hyperdimensional Computing applications.

Module: src/sdk.zig

Overview

The Trinity SDK provides a simplified, intuitive interface for working with Vector Symbolic Architecture (VSA) operations. While the low-level VSA API (src/vsa.zig) operates directly on HybridBigInt structures, the SDK wraps these operations into user-friendly types:

| Level | Module | Purpose |
|---|---|---|
| Low-Level | vsa.zig | Direct VSA operations on HybridBigInt |
| High-Level | sdk.zig | Developer-friendly wrappers with semantics |

SDK Components

The SDK consists of six main types:

  • Hypervector - Main abstraction for VSA operations
  • Codebook - Symbol-to-vector mapping with encoding/decoding
  • AssociativeMemory - Key-value storage using binding
  • SequenceEncoder - Ordered data representation via permutation
  • GraphEncoder - Relational triple encoding (subject-predicate-object)
  • Classifier - Simple HDC-based machine learning

Hypervector

The primary type for all VSA operations. Wraps HybridBigInt with labeled semantics and intuitive methods.

Construction

init(dim: usize) → Hypervector

Creates a zero hypervector with specified dimension.

var hv = Hypervector.init(1000);

random(dim: usize, seed: u64) → Hypervector

Creates a random hypervector (for atomic symbols).

var hv = Hypervector.random(1000, 42);

randomLabeled(dim: usize, seed: u64, label: []const u8) → Hypervector

Creates a random hypervector with a label for debugging.

var cat = Hypervector.randomLabeled(1000, 42, "cat");

fromRaw(raw: HybridBigInt) → Hypervector

Wraps an existing HybridBigInt into a Hypervector.

var raw = vsa.randomVector(1000, 42);
var hv = Hypervector.fromRaw(raw);

Accessors

getDimension() → usize

Returns the number of trits in the hypervector.

const dim = hv.getDimension(); // 1000

get(index: usize) → Trit

Returns the trit value at position (-1, 0, or +1).

const trit = hv.get(5); // Returns: -1, 0, or +1

set(index: usize, value: Trit) → void

Sets the trit at a specific position.

hv.set(5, 1); // Set position 5 to +1

VSA Operations

bind(other: *Hypervector) → Hypervector

Creates an associative binding between two hypervectors.

Properties:

  • Self-inverse: bind(A, A) = all +1
  • Unbind reverses: unbind(bind(A, B), B) ≈ A

var associated = key.bind(&value);
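These two properties can be verified with a small NumPy sketch. This is illustrative math only, not the Trinity API: for dense bipolar vectors (all trits ±1, no zeros), bind reduces to elementwise multiplication.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 1000

# Dense bipolar vectors (trits restricted to +1/-1) keep the algebra simple.
a = rng.choice([-1, 1], size=dim)
b = rng.choice([-1, 1], size=dim)

bound = a * b  # bind = elementwise multiplication

# Self-inverse: binding a vector with itself gives all +1
assert np.all(a * a == 1)

# Unbind recovers the other operand exactly in the noiseless case
recovered = bound * b  # unbind(bind(a, b), b)
assert np.all(recovered == a)

# The bound vector is dissimilar to both of its operands
cos = np.dot(bound, a) / dim
print(f"similarity(bound, a) = {cos:.3f}")  # near 0
```

Note the last line: binding produces a vector quasi-orthogonal to both inputs, which is why bound pairs do not interfere with their components when stored together.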

unbind(key: *Hypervector) → Hypervector

Retrieves a hypervector from a binding.

var recovered = bound.unbind(&key);

bundle(other: *Hypervector) → Hypervector

Combines two hypervectors via majority voting (superposition).

var combined = a.bundle(&b);
// combined is similar to both a and b

bundle3(b: *Hypervector, c: *Hypervector) → Hypervector

Combines three hypervectors via majority voting.

var combined = a.bundle3(&b, &c);
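Majority voting can be sketched in NumPy to show why a bundle stays similar to each input (illustrative only, not the Trinity API; dense bipolar vectors assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 1000
a = rng.choice([-1, 1], size=dim)
b = rng.choice([-1, 1], size=dim)
c = rng.choice([-1, 1], size=dim)

# Majority vote per position; with three bipolar inputs no ties occur.
combined = np.sign(a + b + c)

# The bundle stays similar to each input (expected ~0.5 for three inputs)
for name, v in [("a", a), ("b", b), ("c", c)]:
    print(name, round(float(np.dot(combined, v)) / dim, 2))
```

Each printed similarity hovers around 0.5: the majority agrees with any single input at 3 out of 4 positions on average, so the superposition remains recognizably close to all of its components.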

permute(k: usize) → Hypervector

Cyclic shift by k positions (for sequence encoding).

var shifted = hv.permute(3); // Shift right by 3

inversePermute(k: usize) → Hypervector

Inverse cyclic shift.

var restored = shifted.inversePermute(3);

Similarity Measures

similarity(other: *Hypervector) → f64

Cosine similarity in range [-1, 1].

  • 1.0 = identical
  • 0.0 = orthogonal
  • -1.0 = opposite

const sim = a.similarity(&b);
if (sim > 0.8) {
    // Highly similar
}
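The three anchor points of the scale are easy to demonstrate with plain cosine similarity in NumPy (an illustrative re-implementation, not the SDK function):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 1000
a = rng.choice([-1, 1], size=dim)
b = rng.choice([-1, 1], size=dim)

def cosine(x, y):
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

print(round(cosine(a, a), 3))   # 1.0: identical
print(round(cosine(a, -a), 3))  # -1.0: opposite
print(round(cosine(a, b), 3))   # near 0: random vectors are quasi-orthogonal
```

The third line is the key property of high-dimensional spaces: two independently drawn random vectors are almost certainly near-orthogonal, which is what makes similarity a reliable signal of relatedness.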

hammingDistance(other: *Hypervector) → usize

Count of differing trit positions.

const dist = a.hammingDistance(&b);

hammingSimilarity(other: *Hypervector) → f64

Normalized Hamming similarity in [0, 1].

const sim = a.hammingSimilarity(&b);

dotSimilarity(other: *Hypervector) → f64

Dot product similarity.

const sim = a.dotSimilarity(&b);

Utility Methods

countNonZero() → usize

Returns the count of non-zero trits.

const active = hv.countNonZero();

density() → f64

Ratio of non-zero trits [0, 1].

const density = hv.density();
// density = countNonZero() / getDimension()

clone() → Hypervector

Creates a deep copy of the hypervector.

var copy = hv.clone();

negate() → Hypervector

Negates all trits (-1 → +1, +1 → -1, 0 → 0).

var inverted = hv.negate();

Codebook

Maps symbols (strings) to hypervectors for encoding and decoding.

Initialization

init(allocator: Allocator, dimension: usize) → Codebook

Creates a new codebook with specified dimension.

var gpa = std.heap.GeneralPurposeAllocator(.{}){};
const allocator = gpa.allocator();
var codebook = Codebook.init(allocator, 1000);
defer codebook.deinit();

Encoding

encode(symbol: []const u8) → !*Hypervector

Gets or creates a hypervector for a symbol. Uses deterministic hashing.

const cat_hv = try codebook.encode("cat");
const dog_hv = try codebook.encode("dog");

Decoding

decode(query: *Hypervector) → ?[]const u8

Finds the symbol most similar to the query hypervector.

const symbol = codebook.decode(&query);
if (symbol) |s| {
    std.debug.print("Matched: {s}\n", .{s});
}

decodeWithThreshold(query: *Hypervector, threshold: f64) → ?[]const u8

Decodes with minimum similarity threshold.

const symbol = codebook.decodeWithThreshold(&query, 0.7);
// Returns null if no symbol has similarity >= 0.7

findSimilar(query: *Hypervector, threshold: f64, results: *ArrayList(SimilarityResult)) → !void

Finds all symbols above the similarity threshold.

var results = std.ArrayList(SimilarityResult).init(allocator);
try codebook.findSimilar(&query, 0.5, &results);

for (results.items) |result| {
    std.debug.print("{s}: {d:.2}\n", .{ result.symbol, result.similarity });
}

Utility

count() → usize

Returns the number of symbols in the codebook.

const size = codebook.count();

AssociativeMemory

Key-value storage using binding operations. Stores multiple associations in a single hypervector.

Initialization

init(dimension: usize) → AssociativeMemory

Creates an empty associative memory.

var memory = AssociativeMemory.init(1000);

Operations

store(key: *Hypervector, value: *Hypervector) → void

Stores a key-value association.

var key = Hypervector.random(1000, 1);
var value = Hypervector.random(1000, 2);
memory.store(&key, &value);

Internal operation:

memory = bundle(memory, bind(key, value))

retrieve(key: *Hypervector) → Hypervector

Retrieves a value by key (returns noisy approximation).

var retrieved = memory.retrieve(&key);
const similarity = retrieved.similarity(&value);
// Typically > 0.2 for successful retrieval
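Why the retrieved value is "noisy" can be seen in a NumPy sketch of the store/retrieve cycle (illustrative math only, not the Trinity API; dense bipolar vectors assumed). Several key-value bindings are bundled into one vector; unbinding with a key recovers its value plus crosstalk from the other pairs:

```python
import numpy as np

rng = np.random.default_rng(3)
dim = 1000

def rand_hv():
    return rng.choice([-1, 1], size=dim)

keys = [rand_hv() for _ in range(5)]
values = [rand_hv() for _ in range(5)]

# memory = sign of the sum of all key*value bindings
# (with an odd number of items no ties occur)
memory = np.sign(sum(k * v for k, v in zip(keys, values)))

# retrieve(key) = unbind, yielding a noisy copy of the stored value
noisy = memory * keys[0]
sim = float(np.dot(noisy, values[0])) / dim
print(f"similarity to stored value: {sim:.2f}")  # well above chance (~0)
```

The other four pairs act as random noise on the retrieved vector, so the similarity is far below 1.0 but still well above the ~0 expected for an unrelated vector, enough for a codebook lookup to identify the correct value.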

contains(key: *Hypervector, threshold: f64) → bool

Checks if a key exists in memory.

if (memory.contains(&key, 0.1)) {
    // Key exists
}

clear() → void

Clears all stored associations.

memory.clear();

count() → usize

Returns the number of stored items.

const count = memory.count();

SequenceEncoder

Encodes ordered sequences using permutation. Each position in the sequence is shifted by its index.

Initialization

init(dimension: usize) → SequenceEncoder

Creates a sequence encoder.

var encoder = SequenceEncoder.init(1000);

Operations

encode(items: []Hypervector) → Hypervector

Encodes a sequence of hypervectors.

Formula:

result = items[0] + permute(items[1], 1) + permute(items[2], 2) + ...

var items = [_]Hypervector{
    Hypervector.random(1000, 1),
    Hypervector.random(1000, 2),
    Hypervector.random(1000, 3),
};
var sequence = encoder.encode(&items);

probe(sequence: *Hypervector, candidate: *Hypervector, position: usize) → f64

Tests similarity of a candidate at a specific position.

const sim = encoder.probe(&sequence, &candidate, 2);
if (sim > 0.6) {
    // Candidate likely at position 2
}

findPosition(sequence: *Hypervector, candidate: *Hypervector, max_length: usize, threshold: f64) → ?usize

Finds the position of a candidate in the sequence.

if (encoder.findPosition(&sequence, &candidate, 10, 0.5)) |pos| {
    std.debug.print("Found at position: {}\n", .{pos});
}
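The encode/probe mechanism can be sketched in NumPy (illustrative only, not the Trinity API; dense bipolar vectors and cyclic shift via np.roll assumed). Probing shifts the candidate by the queried position and measures similarity against the bundled sequence:

```python
import numpy as np

rng = np.random.default_rng(5)
dim = 1000
words = [rng.choice([-1, 1], size=dim) for _ in range(4)]

# encode: bundle each item cyclically shifted by its index
sequence = np.sign(sum(np.roll(w, i) for i, w in enumerate(words)))
sequence[sequence == 0] = 1  # break majority-vote ties deterministically

def probe(seq, candidate, pos):
    return float(np.dot(seq, np.roll(candidate, pos))) / dim

# The true position scores highest; wrong positions score near 0
print([round(probe(sequence, words[3], p), 2) for p in range(4)])
```

Only the correct shift realigns the candidate with its stored copy; every other shift produces a quasi-orthogonal comparison, which is what findPosition exploits when scanning positions up to max_length.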

GraphEncoder

Encodes relational triples (subject-predicate-object) using role vectors.

Initialization

init(dim: usize) → GraphEncoder

Creates a graph encoder with random role vectors.

var encoder = GraphEncoder.init(1000);

Internal role vectors:

  • role_subject - random(seed=0x5B)
  • role_predicate - random(seed=0x9D)
  • role_object - random(seed=0x0B)

Operations

encodeTriple(subject: *Hypervector, predicate: *Hypervector, object: *Hypervector) → Hypervector

Encodes a triple (S, P, O).

Formula:

triple = bind(role_s, S) + bind(role_p, P) + bind(role_o, O)

var s = Hypervector.randomLabeled(1000, 1, "Paris");
var p = Hypervector.randomLabeled(1000, 2, "capital_of");
var o = Hypervector.randomLabeled(1000, 3, "France");

var triple = encoder.encodeTriple(&s, &p, &o);

querySubject(triple: *Hypervector) → Hypervector

Extracts the subject from a triple.

var subject = encoder.querySubject(&triple);
const similarity = subject.similarity(&s); // Should be high

queryPredicate(triple: *Hypervector) → Hypervector

Extracts the predicate from a triple.

var predicate = encoder.queryPredicate(&triple);

queryObject(triple: *Hypervector) → Hypervector

Extracts the object from a triple.

var object = encoder.queryObject(&triple);

Classifier

Simple Hyperdimensional Computing classifier using prototype learning.

Initialization

init(allocator: Allocator, dimension: usize) → Classifier

Creates a new classifier.

var classifier = Classifier.init(allocator, 1000);
defer classifier.deinit();

Training

train(class_name: []const u8, sample: *Hypervector) → !void

Adds a sample to a class. Bundles with existing class vector.

var sample1 = Hypervector.random(1000, 1);
var sample2 = Hypervector.random(1000, 2);

try classifier.train("positive", &sample1);
try classifier.train("positive", &sample2);

Prediction

predict(sample: *Hypervector) → ?[]const u8

Returns the most similar class name.

var query = Hypervector.random(1000, 3); // note: `test` is a reserved keyword in Zig
if (classifier.predict(&query)) |class| {
    std.debug.print("Predicted: {s}\n", .{class});
}

predictWithConfidence

predictWithConfidence(sample: *Hypervector) → struct { class: ?[]const u8, confidence: f64 }

Returns prediction with similarity score.

const result = classifier.predictWithConfidence(&query);
if (result.class) |class| {
    std.debug.print("Class: {s}, Confidence: {d:.2}\n", .{ class, result.confidence });
}

Utility

classCount() → usize

Returns the number of classes.

const num_classes = classifier.classCount();
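Prototype learning, the scheme this classifier is described as using, can be sketched in a few lines of NumPy (illustrative only, not the Trinity API; dense bipolar vectors assumed). Each class vector is the bundle of its training samples, and prediction is nearest-prototype by similarity:

```python
import numpy as np

rng = np.random.default_rng(11)
dim = 1000

def rand_hv():
    return rng.choice([-1, 1], size=dim)

# Prototype learning: a class vector is the bundle of its training samples
pos_samples = [rand_hv() for _ in range(5)]
neg_samples = [rand_hv() for _ in range(5)]
prototypes = {
    "positive": np.sign(sum(pos_samples)),  # odd sample count: no ties
    "negative": np.sign(sum(neg_samples)),
}

def predict(sample):
    sims = {c: float(np.dot(p, sample)) for c, p in prototypes.items()}
    return max(sims, key=sims.get)

print(predict(pos_samples[0]))  # "positive"
```

A training sample stays similar to its own class prototype (it contributed to the bundle) and quasi-orthogonal to the other prototype, so the argmax reliably picks the right class.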

Usage Examples

Example 1: Symbolic Reasoning

const std = @import("std");
const sdk = @import("trinity").sdk;

pub fn main() !void {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const allocator = gpa.allocator();

    // Create codebook for symbols
    var codebook = sdk.Codebook.init(allocator, 1000);
    defer codebook.deinit();

    // Encode entities
    const cat = try codebook.encode("cat");
    const mammal = try codebook.encode("mammal");
    const animal = try codebook.encode("animal");

    // Create associative memory
    var memory = sdk.AssociativeMemory.init(1000);

    // Store facts: cat → mammal, mammal → animal
    memory.store(cat, mammal);
    memory.store(mammal, animal);

    // Query: what is cat related to?
    var retrieved = memory.retrieve(cat);
    if (codebook.decode(&retrieved)) |decoded| {
        std.debug.print("cat is a {s}\n", .{decoded}); // "mammal"
    }
}

Example 2: Sequence Processing

const std = @import("std");
const sdk = @import("trinity").sdk;

pub fn main() !void {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const allocator = gpa.allocator();

    var codebook = sdk.Codebook.init(allocator, 1000);
    defer codebook.deinit();

    // Encode words
    const the = try codebook.encode("the");
    const quick = try codebook.encode("quick");
    const brown = try codebook.encode("brown");
    const fox = try codebook.encode("fox");

    // Create sequence encoder
    var encoder = sdk.SequenceEncoder.init(1000);

    // Encode sentence
    var words = [_]sdk.Hypervector{ the.*, quick.*, brown.*, fox.* };
    var sentence = encoder.encode(&words);

    // Probe for words
    const pos_fox = encoder.findPosition(&sentence, fox, 10, 0.5);
    std.debug.print("'fox' at position: {?}\n", .{pos_fox}); // 3
}

Example 3: Knowledge Graph

const std = @import("std");
const sdk = @import("trinity").sdk;

pub fn main() !void {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const allocator = gpa.allocator();

    var codebook = sdk.Codebook.init(allocator, 1000);
    defer codebook.deinit();

    // Encode entities and relations
    const paris = try codebook.encode("Paris");
    const france = try codebook.encode("France");
    const capital = try codebook.encode("capital_of");

    // Create graph encoder
    var graph = sdk.GraphEncoder.init(1000);

    // Encode triple: Paris → capital_of → France
    var triple = graph.encodeTriple(paris, capital, france);

    // Query: what is Paris the capital of?
    var object = graph.queryObject(&triple);
    if (codebook.decode(&object)) |country| {
        std.debug.print("Paris is capital of: {s}\n", .{country}); // "France"
    }

    // Query: what is the capital of France?
    var subject = graph.querySubject(&triple);
    if (codebook.decode(&subject)) |city| {
        std.debug.print("Capital of France: {s}\n", .{city}); // "Paris"
    }
}

Example 4: Text Classification

const std = @import("std");
const sdk = @import("trinity").sdk;

pub fn main() !void {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const allocator = gpa.allocator();

    var classifier = sdk.Classifier.init(allocator, 1000);
    defer classifier.deinit();

    var codebook = sdk.Codebook.init(allocator, 1000);
    defer codebook.deinit();

    // Training data
    const positive_words = [_][]const u8{ "good", "great", "excellent", "amazing" };
    const negative_words = [_][]const u8{ "bad", "terrible", "awful", "poor" };

    // Train positive class
    for (positive_words) |word| {
        const hv = try codebook.encode(word);
        try classifier.train("positive", hv);
    }

    // Train negative class
    for (negative_words) |word| {
        const hv = try codebook.encode(word);
        try classifier.train("negative", hv);
    }

    // Classify new text (encode already returns *Hypervector)
    const test_word = try codebook.encode("excellent");
    const result = classifier.predictWithConfidence(test_word);

    std.debug.print("Class: {s}, Confidence: {d:.2}\n", .{
        result.class orelse "unknown",
        result.confidence,
    }); // "positive", high confidence
}

Best Practices

1. Choose Appropriate Dimensions

| Use Case | Recommended Dimension |
|---|---|
| Symbolic reasoning | 512 - 1000 |
| Sequence encoding | 1000 - 2000 |
| Graph encoding | 1000+ |
| Text classification | 2000+ |

Higher dimensions improve capacity and noise resistance but increase memory/compute.

2. Similarity Thresholds

Guidelines for similarity thresholds:

| Threshold | Interpretation |
|---|---|
| > 0.8 | Strong match (identical or near-identical) |
| 0.6 - 0.8 | Good match (likely correct) |
| 0.4 - 0.6 | Weak match (ambiguous) |
| < 0.4 | Poor match (noise or unrelated) |

Adjust based on your application's tolerance for false positives/negatives.

3. Associative Memory Capacity

As a rule of thumb, associative memory can reliably store:

  • ~10 items with high accuracy (> 0.8 similarity)
  • ~50 items with moderate accuracy (> 0.5 similarity)

Beyond this, use sharding or hierarchical memory.
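The capacity falloff behind this rule of thumb can be measured with the same NumPy bundling sketch used above (illustrative math, not the Trinity API; dense bipolar vectors assumed). Retrieval similarity decays roughly as 1/sqrt(n) with the number of stored pairs:

```python
import numpy as np

rng = np.random.default_rng(13)
dim = 1000

def retrieval_similarity(n_items):
    # Bundle n key*value bindings into one vector, then retrieve the first value.
    keys = rng.choice([-1, 1], size=(n_items, dim))
    vals = rng.choice([-1, 1], size=(n_items, dim))
    memory = np.sign((keys * vals).sum(axis=0))
    memory[memory == 0] = 1  # break majority-vote ties deterministically
    noisy = memory * keys[0]
    return float(np.dot(noisy, vals[0])) / dim

for n in (5, 20, 50, 200):
    print(n, round(retrieval_similarity(n), 2))
```

As the item count grows, the crosstalk from other pairs dominates and the retrieved value drifts toward chance-level similarity, which is why sharding or hierarchical memory becomes necessary beyond a few dozen items.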

4. Sequence Position Limits

SequenceEncoder.findPosition() searches up to max_length. For practical purposes:

  • Keep sequences under 20 items for best results
  • Longer sequences require higher dimensions

5. Classifier Training

For best classification accuracy:

  • Use balanced training data (a similar number of samples per class)
  • Train with at least 5-10 samples per class
  • More diverse samples → better generalization

Performance Considerations

Memory Usage

| Type | Memory per item (dim=1000) |
|---|---|
| Hypervector | ~1.5 KB (packed) |
| Codebook entry | ~1.5 KB + string overhead |
| AssociativeMemory | ~1.5 KB (single bundled vector) |

Computation Complexity

| Operation | Complexity | Notes |
|---|---|---|
| bind/unbind | O(n) | n = dimension |
| bundle/bundle3 | O(n) | Majority voting |
| similarity | O(n) | Dot product |
| Codebook.encode | O(n) | Hash + random generation |
| Codebook.decode | O(k·n) | k = symbol count |
| SequenceEncoder.encode | O(m·n) | m = sequence length |
| GraphEncoder.query* | O(n) | Single unbind |

Optimization Tips

  1. Reuse hypervectors - Cloning is cheaper than regenerating
  2. Batch operations - Bundle multiple items at once
  3. Use packed mode - HybridBigInt.pack() when not computing
  4. Limit codebook size - Decoding is O(k·n); prefer smaller dictionaries
  5. Cache similarities - Avoid recomputing expensive comparisons

Integration Examples

With Low-Level VSA API

const sdk = @import("trinity").sdk;
const vsa = @import("trinity").vsa;

// Can convert between SDK and VSA
var hv = sdk.Hypervector.random(1000, 42);

// Access raw HybridBigInt
const raw = hv.data; // HybridBigInt

// Use low-level operations
const processed = vsa.permute(&raw, 3);

// Wrap back in SDK
var result = sdk.Hypervector.fromRaw(processed);

With Custom Allocators

const std = @import("std");
const sdk = @import("trinity").sdk;

pub fn main() !void {
    // Use an arena allocator for temporary operations
    var arena = std.heap.ArenaAllocator.init(std.heap.page_allocator);
    defer arena.deinit();

    const allocator = arena.allocator();

    var codebook = sdk.Codebook.init(allocator, 1000);
    // ... use codebook

    // Everything is freed at once when the arena is deinitialized
}

Error Handling

const sdk = @import("trinity").sdk;

pub fn classifyWord(word: []const u8) ![]const u8 {
    // Returns an error on allocation failure
    const hv = try codebook.encode(word);

    // Returns null if no match is found (encode already returns *Hypervector)
    const decoded = codebook.decodeWithThreshold(hv, 0.7)
        orelse return error.NoMatch;

    return decoded;
}

See Also