# Memory Model
MetaScript uses ORC (Optimized Reference Counting) for the C backend, providing deterministic memory management without garbage collection pauses.
## Why ORC?
Traditional garbage collectors have unpredictable pauses. Reference counting is predictable but slow. ORC combines the best of both:
| Approach | Predictable | Fast | No Pauses |
|---|---|---|---|
| Tracing GC | - | Yes | - |
| Naive RC | Yes | - | Yes |
| ORC | Yes | Yes | Yes |
ORC achieves ~6-8% overhead compared to manual memory management, while being completely automatic.
## How ORC Works

### Basic Reference Counting
Every object tracks how many references point to it:
```
const user = new User("Alice") // refcount = 1
const copy = user              // refcount = 2
// copy goes out of scope      // refcount = 1
// user goes out of scope      // refcount = 0, freed
```
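The lifecycle above can be made explicit with a toy counter in plain TypeScript. This is purely illustrative — the `Ref`, `retain`, and `release` names are invented here; ORC inserts the equivalent operations for you at compile time:

```typescript
// Toy reference counter making the lifecycle explicit.
// Hypothetical sketch only: ORC does this automatically.
let freed = false

class Ref<T> {
  refcount = 0
  constructor(public value: T) {}
}

function retain<T>(ref: Ref<T>): Ref<T> {
  ref.refcount++ // a new reference now points at the object
  return ref
}

function release<T>(ref: Ref<T>): void {
  ref.refcount-- // a reference went out of scope
  if (ref.refcount === 0) {
    freed = true // ORC would free the memory at this point
  }
}

const user = retain(new Ref({ name: "Alice" })) // refcount = 1
const copy = retain(user)                       // refcount = 2
const afterCopy = user.refcount
release(copy)                                   // refcount = 1
release(user)                                   // refcount = 0, freed
```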
### Cycle Detection

ORC handles cycles that break naive reference counting:
```
class Node {
  next: Node | null = null
}

const a = new Node()
const b = new Node()
a.next = b
b.next = a // Cycle! Naive RC would leak
// ORC detects and collects cycles automatically
```
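One common way to collect such cycles is trial deletion: tentatively subtract every reference that originates inside a suspected group, and whatever drops to zero refs is kept alive only by the cycle itself. The sketch below shows the core idea in TypeScript; it is deliberately simplified (a production collector also restores counts for objects still reachable from live roots, and the source doesn't specify ORC's exact algorithm):

```typescript
// Simplified trial-deletion cycle detection (illustrative only).
interface Obj {
  refcount: number
  children: Obj[]
}

// Returns the objects in `candidates` kept alive only by references
// from other candidates, i.e. dead cycles.
function collectCycles(candidates: Obj[]): Obj[] {
  // Trial deletion: count every reference that comes from inside the group.
  const internal = new Map<Obj, number>()
  for (const obj of candidates) {
    for (const child of obj.children) {
      if (candidates.includes(child)) {
        internal.set(child, (internal.get(child) ?? 0) + 1)
      }
    }
  }
  // An object with no references left after trial deletion is garbage.
  return candidates.filter((o) => o.refcount - (internal.get(o) ?? 0) === 0)
}

// a <-> b form a cycle with no external references: collectable.
const a: Obj = { refcount: 1, children: [] }
const b: Obj = { refcount: 1, children: [] }
a.children.push(b)
b.children.push(a)

// c points at itself, but something external also references it: must survive.
const c: Obj = { refcount: 2, children: [] }
c.children.push(c)

const garbage = collectCycles([a, b, c])
```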
### Move Semantics

When possible, ORC moves values instead of copying:
```
function processUser(user: User): void {
  // user is moved here, not copied
  console.log(user.name)
}

const user = new User("Alice")
processUser(user) // user is moved
// user is no longer valid here
```
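MetaScript performs moves automatically; to see what a move means operationally, here is a hand-rolled TypeScript model (the `Owned` wrapper and `move` method are invented for illustration): transferring ownership invalidates the source binding instead of copying the value or touching a refcount.

```typescript
// Model of move semantics: ownership transfers, the source is invalidated.
class Owned<T> {
  private value: T | null
  constructor(value: T) {
    this.value = value
  }

  // Transfers ownership of the value; the wrapper becomes empty.
  move(): T {
    if (this.value === null) throw new Error("use after move")
    const v = this.value
    this.value = null // invalidate the source instead of copying
    return v
  }

  get moved(): boolean {
    return this.value === null
  }
}

const user = new Owned({ name: "Alice" })
const inner = user.move()   // ownership transferred, nothing copied
const wasMoved = user.moved // true: `user` is no longer valid
```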
### Memory Zones

For performance-critical code, use memory zones:
```
import { Zone } from "@metascript/core"

Zone.scoped((zone) => {
  // All allocations use the zone
  const users = zone.alloc(User, 1000)
  for (const user of users) {
    process(user)
  }
  // Everything freed at once when zone exits
})
```
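A zone is essentially an arena: the zone owns every allocation and releases them all in one step when the scope exits, even if the body throws. A toy TypeScript model of that contract (note this sketch's `alloc` takes a factory function, unlike the real `zone.alloc(User, 1000)` API above):

```typescript
// Toy zone/arena allocator: bulk ownership, bulk release.
class Zone {
  private objects: object[] = []

  alloc<T extends object>(make: () => T, count: number): T[] {
    const out: T[] = []
    for (let i = 0; i < count; i++) {
      const obj = make()
      this.objects.push(obj) // the zone, not the caller, owns the object
      out.push(obj)
    }
    return out
  }

  release(): void {
    this.objects.length = 0 // one bulk free, no per-object bookkeeping
  }

  get live(): number {
    return this.objects.length
  }

  static scoped<R>(body: (zone: Zone) => R): R {
    const zone = new Zone()
    try {
      return body(zone)
    } finally {
      zone.release() // everything is freed here, even on exceptions
    }
  }
}

let captured: Zone | undefined

const liveInside = Zone.scoped((zone) => {
  captured = zone
  zone.alloc(() => ({ name: "user" }), 1000)
  return zone.live // 1000 objects alive while the zone is open
})
```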
## Backend-Specific Behavior

### C Backend (ORC)
- Deterministic deallocation
- No GC pauses
- 6-8% overhead
- Best for: serverless, CLI tools
### JavaScript Backend
- Uses V8/SpiderMonkey GC
- Familiar JS memory model
- Interop with existing JS code
- Best for: browsers, Node.js
### Erlang Backend
- Per-process heap
- Process death = instant cleanup
- No stop-the-world GC
- Best for: distributed systems
## Performance Tips

### Avoid Unnecessary Allocations
```
// Bad: allocates a new array on every iteration
for (let i = 0; i < 1000; i++) {
  const result = items.map(x => x * 2)
}

// Good: reuse a single buffer
const buffer: number[] = []
for (let i = 0; i < 1000; i++) {
  buffer.length = 0
  for (const item of items) {
    buffer.push(item * 2)
  }
}
```

### Use Stack Allocation
Small, fixed-size values are stack-allocated:
```
// Stack allocated (fast)
const point = { x: 1, y: 2 }

// Heap allocated
const user = new User("Alice")
```

### Prefer Immutability
Immutable data enables optimizations:
```
@derive(Clone)
class Config {
  readonly host: string
  readonly port: number
}

// Clone is optimized - may share memory
const newConfig = config.clone()
```
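To see why immutability makes cloning cheap: if a value can never change, a "clone" may safely be the very same object, since no caller can ever observe a difference. A TypeScript sketch of that optimization (hypothetical — the source doesn't specify what `@derive(Clone)` actually generates):

```typescript
// Sketch: a deeply immutable value can be cloned by sharing itself.
// Hypothetical illustration of the optimization, not ORC's real codegen.
class Config {
  constructor(
    readonly host: string,
    readonly port: number,
  ) {
    Object.freeze(this) // enforce immutability at runtime as well
  }

  clone(): Config {
    return this // zero-cost: no allocation, no copying
  }
}

const config = new Config("localhost", 8080)
const newConfig = config.clone()
const shared = newConfig === config // the clone shares memory
```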
## Debugging Memory

### Leak Detection
```sh
# Run with leak detection
msc run --leak-check src/main.ms
```

### Memory Profiling
```sh
# Generate memory profile
msc run --memory-profile src/main.ms

# Analyze results
msc analyze memory-profile.json
```

### Reference Counting Stats
```
import { memory } from "@metascript/core"

console.log(memory.stats())
// { allocations: 1234, frees: 1230, live: 4, peak: 100 }
```

## Next Steps
- Learn about Three Runtimes to understand deployment options
- Explore Compile-Time Power for zero-cost abstractions
- Read the Performance Guide for optimization tips