# The Big Picture
## Why jinflow exists

Organizations sit on data they don’t understand. Hospital supply chains leak revenue. Ski resorts can’t reconcile ticket sales with lift rides. Freight forwarders lose visibility between checkpoints. Law firms bill hours they can’t trace.
The data exists. The questions exist. What’s missing is a systematic way to go from “something looks wrong” to “here’s why, and here’s what to do about it.”
jinflow is that system.
## The idea

Declare what you want to detect. The engine does the rest.
You write a signal: “find cases where billing events are missing.” You write a thesis: “is this a systematic billing gap?” You write a verdict: “the billing trigger only fires on inpatient discharges.”
Each declaration compiles to SQL. Each SQL model runs against your data. Each result feeds the next layer. The output is a knowledge store — a single DuckDB file containing findings, verdicts, explanations, and expert knowledge. Browsable in the Explorer.
No custom code. No ad-hoc queries. No dashboards that answer one question and raise three more. A declarative pipeline that builds understanding.
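As a concrete sketch, a signal declaration might look like this. The keys shown (`signal`, `entity`, `detect`, `severity`, `emits`) are illustrative only, not jinflow’s actual schema:

```yaml
# Hypothetical signal declaration — field names are illustrative, not jinflow's real schema.
signal: missing_billing_events
entity: case            # the Gold entity the signal runs against
detect: >
  Inpatient cases discharged more than 30 days ago
  with no associated billing event.
severity: high
emits: finding          # each matching row becomes a Finding
```

The engine would compile a declaration like this to SQL and run it against the Gold entity; the actual shape of the YAML is defined by the pack.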
## The analytical pyramid

Each layer builds on the one below:

- Signals detect patterns.
- Findings are the atomic evidence signals emit.
- Perspectives aggregate findings into entity-level scores.
- Theses evaluate whether a pattern is systematic.
- Verdicts propose a machine-computed root cause.
- On top — coming with Sense 15 — Observations are a named person’s signed strategic claim about reality, and Explanations are the signed causal theory behind them, composed of Contributing Factors that can adopt a Verdict’s machine-proposed explanation.

Alongside the pyramid, Subject Matter (grouped by Dossier) captures expert knowledge about the domain itself — reusable across every tenant, never tenant-specific.
## The two hands of jinflow

jinflow is ambidextrous: two hands always working at the same time, never either/or. The P-hand (Pipeline) builds the data foundation, and the T-hand (Talk) drives the analytical intelligence. Both evolve continuously; neither blocks the other. The Entity + Contract is the stable interface where the two hands meet.
P-hand (Pipeline) is data engineering: ingest, validate, govern. The output is clean, source-system-agnostic Entities. An engineer can improve source adapters, fix validation rules, add new entities — without touching a single signal or thesis.
T-hand (Talk) is analytical work: declare what to detect, evaluate whether it’s real, explain why. A consultant can add signals, refine theses, capture Subject Matter — without touching the pipeline.
`jinflow make` rebuilds both in one pass. `jinflow evolve` assists both — whether you’re debugging a data quality issue or drafting a new verdict. The Entity is the handoff: the P-hand guarantees its shape and quality, and the T-hand trusts that guarantee.
| | P-hand · Pipeline | T-hand · Talk |
|---|---|---|
| Who | Data engineer | Analyst, consultant, domain expert |
| Thinks in | SQL, schemas, data quality | Questions, theses, evidence |
| Writes | dbt models, source-system macros | Signals, theses, verdicts, Subject Matter |
| Ends at / starts from | Entity (Gold) | Entity (Gold) |
| Tools | `jinflow make`, dbt CLI | Explorer, `jinflow evolve`, YAML editor |
Both hands reach for the same artifacts — Entity, Signal, Subject Matter, and the rest. The next section names and places each of those artifacts.
## Two worlds: Subject Matter Expertise and Analytical Activities

Every jinflow artifact belongs to one of two distinct worlds. The Subject Matter Expertise world holds what experts know about the domain itself — stable, codified, reusable across every tenant of a pack. The Analytical Activities world holds what unfolds against real tenant data — declarations, machine output, and (soon) signed human claims. The dbt pipeline (Bronze → Silver → Gold) is shared substrate serving both worlds but belonging to neither.
Subject Matter Expertise — what’s codified. Experts capture what they know about the domain itself: system quirks, process workarounds, mapping decisions. A Subject Matter entry is either a Statement (an observation about how the domain works) or a Check (a SQL-executable assertion). Dossiers group related Subject Matters into narratives. This layer is pack-sourced and single-sourced — tenants reference it; they never copy or fork it. “OPALE splits cases at midnight” is true wherever OPALE runs, so it belongs in the pack.
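A Subject Matter entry might be captured along these lines. This is a sketch only; the keys shown are hypothetical, not jinflow’s actual schema:

```yaml
# Hypothetical Subject Matter entry — keys are illustrative, not jinflow's real schema.
subject_matter: opale_midnight_split
kind: statement                 # statement | check
dossier: opale_case_handling
claim: >
  OPALE splits cases at midnight, so one clinical stay
  can appear as two case rows.
# A 'check' variant would additionally carry a SQL-executable assertion.
```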
Analytical Activities — what unfolds. The pack ships a catalog of templates — Signal, Perspective, Thesis, Verdict. A tenant imports a template and then owns it: once it hits real tenant data, it diverges and evolves locally. Imported Signals run on Gold entities and emit machine output — Findings, Perspective scores, Thesis status, Verdict output. On top, the coming signed human layer (Sense 15): an Observation is a named person’s signed strategic claim about this tenant’s reality, Validation is its empirical proof from machine output, and Explanation is the signed causal theory — composed of Contributing Factors that may adopt a Verdict as a starting point.
The pack contains much more than these two worlds — dbt models, macros, extractors, contracts, tenant skeletons. Those are infrastructure; they serve both worlds but sit in neither.
| Artifact | World | Pack role | Tenant role |
|---|---|---|---|
| Subject Matter, Statement, Check, Dossier | SME | single source of truth | referenced, never copied |
| Signal, Perspective, Thesis, Verdict | Activities | catalog of templates | imported, owned, evolves |
| Finding, Perspective score, Thesis status, Verdict output | Activities | — | always (machine-computed) |
| Observation, Validation, Explanation, Contributing Factor | Activities | — | always (signed human · coming soon) |
See The Two Worlds for the full IS / IS NOT for each term and the migration consequences.
## The core loop

Make compiles your declarations and builds the knowledge store. Explore lets you browse findings, test theses, and review evidence in a web UI. Evolve connects you to Claude AI for deeper analysis. Each cycle deepens understanding.
## Design principles

Declarative, not procedural. You declare what to detect. The engine decides how to compute it. Today the declarations are YAML compiled to SQL. Tomorrow it could be a different surface — the principle stays.
No silent filtering. Invalid data is flagged, not dropped. Every row carries an `is_valid` flag and an `invalid_reason`. Gold only shows valid rows, but Silver preserves everything. Nothing disappears without a trace.
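The flag-don’t-drop pattern is plain SQL at heart. Here is a minimal, self-contained illustration using Python’s built-in sqlite3; jinflow itself uses DuckDB and dbt, and the table and column names here are hypothetical:

```python
import sqlite3

# Hypothetical Silver-layer validation: every row is kept and flagged,
# never filtered out. Gold is then a view over the valid subset.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bronze_cases (case_id TEXT, discharge_date TEXT)")
con.executemany(
    "INSERT INTO bronze_cases VALUES (?, ?)",
    [("C1", "2024-03-01"), ("C2", None), ("C3", "2024-03-07")],
)

# Silver: preserve every row, attach is_valid and invalid_reason.
con.execute("""
    CREATE VIEW silver_cases AS
    SELECT
        case_id,
        discharge_date,
        CASE WHEN discharge_date IS NULL THEN 0 ELSE 1 END AS is_valid,
        CASE WHEN discharge_date IS NULL
             THEN 'missing discharge_date' END AS invalid_reason
    FROM bronze_cases
""")

# Gold: only valid rows — but nothing was deleted upstream.
con.execute("""
    CREATE VIEW gold_cases AS
    SELECT case_id, discharge_date FROM silver_cases WHERE is_valid = 1
""")

silver_count = con.execute("SELECT COUNT(*) FROM silver_cases").fetchone()[0]
gold_count = con.execute("SELECT COUNT(*) FROM gold_cases").fetchone()[0]
print(silver_count, gold_count)  # Silver keeps all 3 rows; Gold shows the 2 valid ones
```

The invalid row (`C2`) never disappears: it sits in Silver with its `invalid_reason`, queryable at any time, while Gold consumers see only clean data.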
Quality is queryable. Data quality isn’t a side report — it’s a first-class dbt model. You can query quality metrics the same way you query the data itself.
Knowledge as data. Expert knowledge (Subject Matter) lives in the same pipeline as findings. It’s attributed, versioned, scoped, and optionally testable. The difference between data and knowledge is the why.
Multi-tenant by design. Each tenant is an isolated DuckDB schema. Tenants share the analytical framework but never see each other’s data. The AI is scoped to exactly one tenant per session.
Domain packs power the analytics. Each pack bundles signal / thesis / verdict / perspective templates, Subject Matter, Dossiers, contracts, and source-system adapters for a specific industry. The engine is domain-agnostic — packs bring the domain expertise. Reference packs include nuMetrix (healthcare), Millesime (winemaking), Alptrack (ski resorts), InterLogic (logistics), and more.
## Who is jinflow for?

| Role | Focus | What they do |
|---|---|---|
| Data engineer | P-hand (Pipeline) | Maintains the medallion pipeline. Adds source-system adapters. Delivers clean Entities. |
| Consultant | T-hand (Talk) | Builds signals and theses for clients. Packages expertise into domain packs. |
| Analyst | T-hand (Talk) | Explores findings in the Explorer. Captures expert knowledge as Subject Matter. |
| Leadership | Consume the results | Reviews executive summaries, confirmed theses, and verdict root causes. |
## What makes jinflow different

From BI tools: BI dashboards answer predefined questions. jinflow discovers questions you didn’t know to ask — and explains why the answers matter.
From data quality tools: Quality tools flag bad data. jinflow detects patterns (signals), evaluates whether they’re systematic (theses), explains why they’re happening (verdicts), and recommends what to do.
From custom analytics: Custom code is powerful but fragile. jinflow provides a declarative framework where the analytical logic is versioned, compiled, and reproducible — not buried in notebooks or scripts.
## Next steps

- Your First Build — build a signal and explore findings in 15 minutes
- Glossary — 54 terms explained
- Domain Packs — see what jinflow looks like across 4 industries