Build Operations
This document defines build orchestration, cleaning, and diagnostic operations for Morphir workspaces.
Overview
Build operations coordinate compilation across multiple projects:
- Dependency-ordered builds: Compile projects in correct order
- Parallel compilation: Build independent projects concurrently
- Incremental builds: Only rebuild what changed
- Diagnostic aggregation: Unified error reporting across workspace
Build Pipeline
┌─────────────┐ ┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ Resolve │───►│ Order │───►│ Compile │───►│ Report │
│ Dependencies│ │ Projects │ │ (Parallel)│ │ Diagnostics │
└─────────────┘ └─────────────┘ └─────────────┘ └─────────────┘
Types
BuildResult
/// Result of a workspace build
pub type BuildResult {
BuildResult(
/// Overall success/failure
success: Bool,
/// Per-project results
projects: List(ProjectBuildResult),
/// Total build duration (milliseconds)
duration_ms: Int,
)
}
/// Result for a single project
pub type ProjectBuildResult {
ProjectBuildResult(
/// Project name
name: PackagePath,
/// Build status
status: BuildStatus,
/// Compiled distribution (if successful)
distribution: Option(Distribution),
/// Compilation diagnostics
diagnostics: List(Diagnostic),
/// Build duration (milliseconds)
duration_ms: Int,
)
}
/// Build status for a project
pub type BuildStatus {
/// Build succeeded with no issues
Ok
/// Build succeeded with warnings
Partial
/// Build failed
Failed
/// Build was skipped (up to date)
Skipped
}
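How the top-level success flag relates to the individual project results is worth pinning down. A minimal sketch of that aggregation, assuming the types above and that Partial and Skipped do not count as failures; the overall_success helper is illustrative, not part of the API:

import gleam/list

/// Illustrative only: a build succeeds when no project reports Failed.
pub fn overall_success(projects: List(ProjectBuildResult)) -> Bool {
  list.all(projects, fn(project) {
    case project.status {
      Failed -> False
      _ -> True
    }
  })
}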
Diagnostic
/// Compilation diagnostic
pub type Diagnostic {
Diagnostic(
/// Severity level
severity: Severity,
/// Error/warning code
code: String,
/// Human-readable message
message: String,
/// Source location
location: Option(SourceLocation),
/// Suggested fixes
hints: List(String),
)
}
pub type Severity {
Error
Warning
Info
Hint
}
pub type SourceLocation {
SourceLocation(
/// File path (relative to workspace)
file: String,
/// Start line (1-indexed)
start_line: Int,
/// Start column (1-indexed)
start_col: Int,
/// End line
end_line: Int,
/// End column
end_col: Int,
)
}
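Tools usually need to turn a Diagnostic into a single line of output. A minimal sketch of one such renderer, assuming the types above; the render_diagnostic helper and the file:line:col format are illustrative choices, not mandated by this spec:

import gleam/int
import gleam/option.{None, Some}

/// Illustrative only: render a diagnostic as "file:line:col severity[code]: message".
pub fn render_diagnostic(diagnostic: Diagnostic) -> String {
  let severity = case diagnostic.severity {
    Error -> "error"
    Warning -> "warning"
    Info -> "info"
    Hint -> "hint"
  }
  let location = case diagnostic.location {
    Some(loc) ->
      loc.file <> ":" <> int.to_string(loc.start_line) <> ":" <> int.to_string(loc.start_col) <> " "
    None -> ""
  }
  location <> severity <> "[" <> diagnostic.code <> "]: " <> diagnostic.message
}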
Operations
Build All
Builds all projects in the workspace.
Behavior
- Resolve all dependencies
- Compute build order (topological sort)
- Build projects in parallel where possible
- Aggregate diagnostics
- Return combined result
Build Order
Projects are built in dependency order:
Level 0 (no deps): core
Level 1 (→ core): domain, utils
Level 2 (→ domain): api, cli
Projects at the same level can be built in parallel.
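A minimal sketch of how those levels can be derived once dependencies are resolved, assuming a map from each project to its direct dependencies; the build_levels and next_level helpers are illustrative only:

import gleam/dict.{type Dict}
import gleam/list

/// Illustrative only: group projects into levels; every project in a level
/// depends only on projects from earlier levels, so a level can build in parallel.
pub fn build_levels(deps: Dict(String, List(String))) -> List(List(String)) {
  next_level(deps, [])
}

fn next_level(
  remaining: Dict(String, List(String)),
  built: List(String),
) -> List(List(String)) {
  // A project is ready once all of its direct dependencies are built.
  let ready =
    remaining
    |> dict.filter(fn(_name, project_deps) {
      list.all(project_deps, fn(dep) { list.contains(built, dep) })
    })
    |> dict.keys
  case ready {
    // Empty when all projects are grouped, or when the remaining projects
    // form a cycle; a real implementation would report the cycle as an error.
    [] -> []
    _ -> {
      let rest = list.fold(ready, remaining, dict.delete)
      [ready, ..next_level(rest, list.append(built, ready))]
    }
  }
}

Each returned level maps directly onto the parallel compilation described above: projects within a level have no dependency edges between them.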
WIT Interface
/// Build all projects in workspace
build-all: func() -> result<list<tuple<package-path, distribution>>, workspace-error>;
JSON-RPC
Request:
{
"method": "workspace/buildAll",
"params": {}
}
Response:
{
"result": {
"success": true,
"projects": [
{
"name": "my-org/core",
"status": "ok",
"distribution": { "..." },
"diagnostics": [],
"durationMs": 523
},
{
"name": "my-org/domain",
"status": "partial",
"distribution": { "..." },
"diagnostics": [
{
"severity": "warning",
"code": "W001",
"message": "Unused import: List.Extra",
"location": {
"file": "packages/domain/src/User.elm",
"startLine": 5,
"startCol": 1,
"endLine": 5,
"endCol": 25
}
}
],
"durationMs": 1247
}
],
"durationMs": 1823
}
}
CLI
morphir build # Build all
morphir build --parallel 4 # Limit parallelism
morphir build --project my-org/api # Build single project
Clean
Removes build artifacts and caches.
Behavior
- If project specified: clean that project only
- If no project: clean entire workspace
- Remove .morphir-dist/ directories
- Optionally remove dependency cache
WIT Interface
/// Clean build artifacts
clean: func(
/// Specific project, or all if none
project: option<package-path>,
) -> result<_, workspace-error>;
JSON-RPC
Request (single project):
{
"method": "workspace/clean",
"params": {
"project": "my-org/domain"
}
}
Request (entire workspace):
{
"method": "workspace/clean",
"params": {}
}
Request (include dependency cache):
{
"method": "workspace/clean",
"params": {
"includeDeps": true
}
}
CLI
morphir clean # Clean all
morphir clean my-org/domain # Clean one project
morphir clean --deps # Also clean dependency cache
Get Diagnostics
Returns all current diagnostics across the workspace.
WIT Interface
/// Get workspace-wide diagnostics
get-diagnostics: func() -> list<tuple<package-path, list<diagnostic>>>;
JSON-RPC
Request:
{
"method": "workspace/getDiagnostics",
"params": {}
}
Response:
{
"result": {
"my-org/core": [],
"my-org/domain": [
{
"severity": "error",
"code": "E001",
"message": "Type mismatch: expected Int, got String",
"location": {
"file": "packages/domain/src/Order.elm",
"startLine": 42,
"startCol": 15,
"endLine": 42,
"endCol": 28
},
"hints": ["Try using String.toInt to convert"]
}
],
"my-org/api": [
{
"severity": "warning",
"code": "W002",
"message": "Function 'oldHelper' is deprecated",
"location": { "..." }
}
]
}
}
Incremental Builds
Change Detection
Changes are detected via:
- File modification time: Compare against last build
- Content hash: SHA-256 of source files
- Dependency changes: Rebuild if dependency was rebuilt
Build Cache
.morphir/
└── cache/
├── my-org/
│ └── domain/
│ ├── manifest.json # Build metadata
│ ├── source-hash # Hash of all sources
│ └── ir-cache/ # Cached IR fragments
Manifest Format
{
"version": "1.0.0",
"lastBuild": "2026-01-16T12:00:00Z",
"sourceHash": "sha256:abc123...",
"files": {
"src/User.elm": {
"hash": "sha256:def456...",
"lastModified": "2026-01-16T11:30:00Z"
}
},
"dependencies": {
"my-org/core": "sha256:ghi789..."
}
}
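A minimal sketch of the up-to-date check this manifest enables, assuming the files section has been reduced to a path-to-hash map and dependency hashes are compared separately; the is_up_to_date helper is illustrative, not part of the API:

import gleam/dict.{type Dict}
import gleam/list

/// Illustrative only: a project can be skipped when its source set is
/// unchanged and none of its dependencies were rebuilt.
pub fn is_up_to_date(
  current: Dict(String, String),
  recorded: Dict(String, String),
  deps_changed: Bool,
) -> Bool {
  // Adding or removing a file changes the count; editing one changes its hash.
  let same_count = dict.size(current) == dict.size(recorded)
  let same_hashes =
    dict.to_list(current)
    |> list.all(fn(entry) {
      let #(path, hash) = entry
      dict.get(recorded, path) == Ok(hash)
    })
  same_count && same_hashes && !deps_changed
}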
Parallel Execution
Strategy
Sequential (dependencies):  core ──► domain ──► api

Parallel (independent):     core ──┬──► domain ──► api
                                   └──► utils ─────►┘
Configuration
# morphir.toml
[build]
parallel = true # Enable parallel builds
max-workers = 4 # Maximum parallel compilations
fail-fast = false # Continue on errors (or stop immediately)
Error Recovery
Partial Builds
When a project fails, dependent projects can still attempt to build using the last successful IR:
core (ok) ──► domain (FAILED) ──► api (uses cached domain IR)
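A minimal sketch of that fallback, assuming the last successful distribution is still available from the build cache; the distribution_for_dependents helper is illustrative only:

import gleam/option.{type Option, None, Some}

/// Illustrative only: dependents compile against the fresh distribution when
/// the build succeeded, otherwise against the cached one (if any).
pub fn distribution_for_dependents(
  result: ProjectBuildResult,
  last_successful: Option(Distribution),
) -> Option(Distribution) {
  case result.status, result.distribution {
    Failed, _ -> last_successful
    _, Some(dist) -> Some(dist)
    _, None -> last_successful
  }
}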
Diagnostic-Only Mode
Build without generating artifacts (fast validation):
morphir build --check-only
Build Events
workspace/onBuildStarted
{
"method": "workspace/onBuildStarted",
"params": {
"projects": ["my-org/core", "my-org/domain"],
"incremental": true
}
}
workspace/onBuildProgress
{
"method": "workspace/onBuildProgress",
"params": {
"project": "my-org/domain",
"phase": "compiling",
"progress": 0.45,
"currentFile": "src/Domain/User.elm"
}
}
workspace/onBuildComplete
{
"method": "workspace/onBuildComplete",
"params": {
"success": true,
"projects": ["my-org/core", "my-org/domain"],
"durationMs": 2341,
"diagnosticCount": {
"errors": 0,
"warnings": 3
}
}
}
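The diagnosticCount object is an aggregate over all per-project diagnostics. A minimal sketch of that tally, reusing the types defined earlier; the diagnostic_count helper is illustrative only:

import gleam/list

/// Illustrative only: count errors and warnings across every project result.
pub fn diagnostic_count(projects: List(ProjectBuildResult)) -> #(Int, Int) {
  list.fold(projects, #(0, 0), fn(counts, project) {
    list.fold(project.diagnostics, counts, fn(acc, diagnostic) {
      let #(errors, warnings) = acc
      case diagnostic.severity {
        Error -> #(errors + 1, warnings)
        Warning -> #(errors, warnings + 1)
        _ -> acc
      }
    })
  })
}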
Streaming Builds
Large projects benefit from streaming compilation where results are produced incrementally rather than in one shot. This enables:
- Early feedback: Errors appear as soon as they're discovered
- Progressive output: Generated artifacts stream as modules complete
- Memory efficiency: Don't hold entire project in memory
- Interruptibility: Cancel long builds without losing partial progress
Streaming Model
┌─────────────┐ ┌─────────────┐ ┌─────────────┐
│ Source │────►│ Compile │────►│ Stream │
│ Files │ │ Module │ │ Results │
└─────────────┘ └─────────────┘ └─────────────┘
│ │ │
│ ▼ ▼
│ ┌─────────────┐ ┌─────────────┐
└───────────►│ Compile │─────►│ Stream │
│ Module │ │ Results │
└─────────────┘ └─────────────┘
Module-Level Streaming
Compilation streams results at the module level:
{
"jsonrpc": "2.0",
"method": "build/moduleCompiled",
"params": {
"project": "my-org/domain",
"module": ["Domain", "User"],
"status": "ok",
"ir": { "...module IR..." },
"diagnostics": []
}
}
Streaming Build Request
Request:
{
"jsonrpc": "2.0",
"id": "build-001",
"method": "workspace/buildStreaming",
"params": {
"projects": ["my-org/domain"],
"streaming": {
"granularity": "module",
"includeIR": true,
"includeDiagnostics": true
}
}
}
Stream of Notifications:
{ "method": "build/started", "params": { "project": "my-org/domain", "modules": 12 } }
{ "method": "build/moduleCompiled", "params": { "module": ["Domain", "Types"], "status": "ok", "ir": {...} } }
{ "method": "build/moduleCompiled", "params": { "module": ["Domain", "User"], "status": "ok", "ir": {...} } }
{ "method": "build/moduleCompiled", "params": { "module": ["Domain", "Order"], "status": "partial", "diagnostics": [...] } }
...
{ "method": "build/completed", "params": { "success": true, "modulesCompiled": 12 } }
Final Response:
{
"jsonrpc": "2.0",
"id": "build-001",
"result": { "success": true, "modulesCompiled": 12, "durationMs": 3421 }
}
CLI Streaming Output
morphir build --stream
Output:
Building my-org/domain (12 modules)
✓ Domain.Types [42ms]
✓ Domain.User [38ms]
⚠ Domain.Order [51ms] (2 warnings)
✓ Domain.Product [29ms]
...
✓ Domain.Api [67ms]
Build complete: 12 modules in 3.4s (2 warnings)
Incremental Module Compilation
For watch mode and IDE integration, individual modules can be recompiled:
Request:
{
"jsonrpc": "2.0",
"id": "compile-001",
"method": "compile/module",
"params": {
"project": "my-org/domain",
"module": ["Domain", "User"],
"source": "module Domain.User exposing (..)\n\nimport Domain.Types...",
"existingIR": { "...previous module IR for merge..." }
}
}
Dependency-Aware Streaming
When a module changes, dependent modules are recompiled in order:
User.elm changed
└─► Recompile Domain.User
└─► Recompile Domain.Api (depends on User)
└─► Recompile Domain.Service (depends on Api)
Each recompilation streams its result immediately:
{ "method": "build/moduleCompiled", "params": { "module": ["Domain", "User"], "trigger": "source-change" } }
{ "method": "build/moduleCompiled", "params": { "module": ["Domain", "Api"], "trigger": "dependency-change" } }
{ "method": "build/moduleCompiled", "params": { "module": ["Domain", "Service"], "trigger": "dependency-change" } }
Code Generation
Code generation transforms compiled IR into target language code using backend extensions.
Target Selection
The --target flag selects which backend to use for code generation:
# Generate Spark/Scala code
morphir codegen --target spark
# Generate TypeScript code
morphir codegen --target typescript
# Generate multiple targets
morphir codegen --target spark --target typescript
# List available targets
morphir codegen --list-targets
Built-in Targets:
| Target | Flag | Output | Notes |
|---|---|---|---|
| Spark | --target spark | Scala (Spark DataFrame API) | Default for data pipelines |
| Scala | --target scala | Pure Scala | General-purpose |
| TypeScript | --target typescript | TypeScript | Web/Node.js |
| JSON Schema | --target json-schema | JSON Schema | Type definitions only |
Extension Targets:
WASM-based backend extensions register additional targets:
# morphir.toml
[extensions]
codegen-flink = { path = "./extensions/flink-codegen.wasm" }
# Registers --target flink
# Use extension-provided target
morphir codegen --target flink
Target Configuration
Targets can be configured in morphir.toml:
[codegen.spark]
spark_version = "3.5"
scala_version = "2.13"
output_dir = "src/main/scala"
[codegen.typescript]
module_system = "esm"
output_dir = "src/generated"
Or via CLI flags:
morphir codegen --target spark --option spark_version=3.5 --output src/main/scala
Automatic Target Association
Projects can configure default targets that run automatically on build:
# morphir.toml
[project]
name = "my-org/domain"
# Default targets for this project
[codegen]
targets = ["spark", "typescript"]
# Target-specific configuration
[codegen.spark]
output_dir = "src/main/scala"
[codegen.typescript]
output_dir = "src/generated/ts"
With this configuration, morphir build automatically generates code for all configured targets:
# Builds IR and generates code for spark and typescript
morphir build
# Skip codegen
morphir build --no-codegen
# Override targets
morphir build --codegen-targets spark
Module-Level Target Association
Associate specific modules with specific targets:
# morphir.toml
[codegen]
# Default targets for all modules
targets = ["spark"]
# Override for specific modules
[codegen.modules."Domain.Api"]
targets = ["typescript", "json-schema"]
[codegen.modules."Domain.Internal"]
targets = [] # No codegen for internal modules
Pattern-Based Target Association
Use glob patterns for target association:
[codegen]
targets = ["spark"]
# All Api modules get TypeScript
[[codegen.rules]]
pattern = "**/Api/**"
targets = ["typescript"]
# Test modules don't get codegen
[[codegen.rules]]
pattern = "**/Test/**"
targets = []
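A minimal sketch of how these rules could be applied per module, assuming rules are evaluated in declaration order and the first match wins; the targets_for helper is illustrative, and the predicate in each rule stands in for real glob matching of the pattern field:

import gleam/list

/// Illustrative only: a rule pairs a match predicate with the targets it
/// assigns; unmatched modules fall back to the [codegen] defaults.
pub fn targets_for(
  module_path: String,
  default_targets: List(String),
  rules: List(#(fn(String) -> Bool, List(String))),
) -> List(String) {
  let matching =
    list.find(rules, fn(rule) {
      let #(matches, _targets) = rule
      matches(module_path)
    })
  case matching {
    Ok(#(_matches, targets)) -> targets
    Error(_) -> default_targets
  }
}

With the rules above, a module under Test matches the second rule, receives an empty target list, and is skipped by codegen.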
Extension-Declared Targets
Extensions can declare which targets they provide and their capabilities:
// In extension's WIT interface
world spark-codegen {
export codegen-target {
// Target metadata
name: func() -> string; // "spark"
description: func() -> string; // "Apache Spark DataFrame API"
file-extension: func() -> string; // ".scala"
// Capability flags
supports-streaming: func() -> bool;
supports-incremental: func() -> bool;
}
export morphir:extension/codegen;
}
Extension Registration:
# morphir.toml
[extensions]
codegen-spark = { path = "./extensions/spark-codegen.wasm" }
# Extension automatically registers its target
# Now --target spark is available
Querying Available Targets:
{
"jsonrpc": "2.0",
"id": "targets-001",
"method": "codegen/listTargets",
"params": {}
}
Response:
{
"jsonrpc": "2.0",
"id": "targets-001",
"result": {
"targets": [
{
"name": "spark",
"description": "Apache Spark DataFrame API",
"fileExtension": ".scala",
"source": "builtin",
"supportsStreaming": true,
"supportsIncremental": true
},
{
"name": "flink",
"description": "Apache Flink DataStream API",
"fileExtension": ".scala",
"source": "extension:codegen-flink",
"supportsStreaming": true,
"supportsIncremental": false
}
]
}
}
JSON-RPC Method
{
"jsonrpc": "2.0",
"id": "codegen-001",
"method": "codegen/generate",
"params": {
"project": "my-org/domain",
"target": "spark",
"options": {
"spark_version": "3.5",
"output_dir": "src/main/scala"
}
}
}
Response:
{
"jsonrpc": "2.0",
"id": "codegen-001",
"result": {
"target": "spark",
"filesGenerated": 24,
"outputDir": "src/main/scala"
}
}
Streaming Code Generation
Code generation also supports streaming to avoid generating all output at once.
Streaming Codegen Request
Request:
{
"jsonrpc": "2.0",
"id": "codegen-001",
"method": "codegen/generateStreaming",
"params": {
"project": "my-org/domain",
"target": "spark",
"streaming": {
"granularity": "module",
"writeImmediately": true
}
}
}
Stream of Notifications:
{ "method": "codegen/started", "params": { "target": "spark", "modules": 12 } }
{ "method": "codegen/moduleGenerated", "params": { "module": ["Domain", "Types"], "files": ["Types.scala"] } }
{ "method": "codegen/moduleGenerated", "params": { "module": ["Domain", "User"], "files": ["User.scala", "UserCodecs.scala"] } }
...
{ "method": "codegen/completed", "params": { "filesGenerated": 24 } }
Incremental Codegen
Only regenerate code for changed modules:
morphir codegen --target spark --incremental
Behavior:
- Compare module IR hashes against last codegen
- Regenerate only changed modules
- Stream generated files as they're produced
Codegen Manifest
Track what was generated for incremental updates:
{
"target": "spark",
"generatedAt": "2026-01-16T12:00:00Z",
"modules": {
"Domain.User": {
"irHash": "sha256:abc123...",
"files": [
{ "path": "src/main/scala/domain/User.scala", "hash": "sha256:def456..." }
]
}
}
}
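A minimal sketch of the selection this manifest enables, assuming per-module IR hashes have been computed for the current build; the modules_to_regenerate helper is illustrative only:

import gleam/dict.{type Dict}
import gleam/list

/// Illustrative only: regenerate a module when it is new or its IR hash no
/// longer matches the hash recorded in the codegen manifest.
pub fn modules_to_regenerate(
  current_ir_hashes: Dict(String, String),
  manifest_ir_hashes: Dict(String, String),
) -> List(String) {
  dict.to_list(current_ir_hashes)
  |> list.filter(fn(entry) {
    let #(module_name, ir_hash) = entry
    dict.get(manifest_ir_hashes, module_name) != Ok(ir_hash)
  })
  |> list.map(fn(entry) { entry.0 })
}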
Parallel Codegen
Generate code for independent modules in parallel:
# morphir.toml
[codegen]
parallel = true
max-workers = 4
streaming = true # Enable streaming output
Ad-Hoc Compilation
For quick experimentation and integration workloads, Morphir supports compiling code without a full project setup. This is useful for:
- Quick prototyping: Test ideas without creating a project
- Shell pipelines: Integrate with Unix-style workflows
- CI validation: Check snippets in automated tests
- Code generation testing: Validate codegen output
Snippet Compilation
Compile source code from stdin or inline. The input language must be specified (or inferred from file extension):
# From stdin (language required)
echo 'module Example exposing (add)
add a b = a + b' | morphir compile --lang elm -
# From a single file (language inferred from .elm extension)
morphir compile snippet.elm
# Explicit language
morphir compile --lang elm snippet.elm
# Multiple files
morphir compile types.elm logic.elm
Supported Languages:
| Language | Flag | Extensions | Notes |
|---|---|---|---|
| Elm | --lang elm | .elm | Default frontend |
| Morphir DSL | --lang morphir-dsl | .morphir, .mdsl | Native DSL |
| (Extensions) | --lang <name> | Per extension | Via WASM frontends |
JSON-RPC Method:
{
"jsonrpc": "2.0",
"id": "snippet-001",
"method": "compile/snippet",
"params": {
"language": "elm",
"source": "module Example exposing (add)\nadd a b = a + b",
"options": {
"moduleName": "Example",
"packageName": "adhoc"
}
}
}
Expression Evaluation
Compile or evaluate standalone expressions:
# Type-check an expression
morphir check --expr "\\x -> x + 1"
# Evaluate an expression
morphir eval "List.map (\\x -> x * 2) [1, 2, 3]"
JSON-RPC Method:
{
"jsonrpc": "2.0",
"id": "expr-001",
"method": "compile/expression",
"params": {
"expression": "\\x -> x + 1",
"context": {
"imports": []
}
}
}
Ad-Hoc with Dependencies
Specify dependencies inline for snippets that need external packages:
morphir compile snippet.elm --with-dep morphir/sdk --with-dep morphir/json
JSON-RPC:
{
"jsonrpc": "2.0",
"id": "snippet-002",
"method": "compile/snippet",
"params": {
"language": "elm",
"source": "module Example exposing (..)\nimport Json.Decode...",
"options": {
"dependencies": [
{ "name": "morphir/sdk" },
{ "name": "morphir/json", "version": "1.0.0" }
]
}
}
}
Pipeline Compilation
Chain compilation and codegen in Unix pipelines:
# Compile then generate
cat snippet.elm | morphir compile --lang elm - | morphir codegen --target spark -
# Compile multiple, stream codegen
morphir compile "src/*.elm" --stream | morphir codegen --target typescript --stream -
Fragment Compilation
For IDE integration, compile a code fragment within an existing module context:
{
"jsonrpc": "2.0",
"id": "frag-001",
"method": "compile/fragment",
"params": {
"language": "elm",
"fragment": "\\x -> x + 1",
"context": {
"modulePath": ["Domain", "User"],
"imports": ["Domain.Types"],
"localBindings": {
"currentUser": "User"
}
}
}
}
This enables features like:
- Hover type information
- Autocomplete with context
- Inline error checking
See CLI Interaction for complete CLI documentation.
Best Practices
- Use Incremental Builds: Avoid clean unless necessary
- Parallelize: Enable parallel builds for faster compilation
- Fail Fast in CI: Use --fail-fast in CI to stop on the first error
- Cache Dependencies: Keep the dependency cache for faster rebuilds
- Check Before Push: Run morphir build --check-only before committing
- Stream Large Builds: Use --stream for projects with many modules
- Incremental Codegen: Use --incremental to only regenerate changed modules