Commit: s/Lacunae/Lacunas/
Latin is fun...until it isn't.

Fixes #12
sam boyer committed Jan 13, 2022
1 parent 3ad9c6b commit 0590ecf
Showing 15 changed files with 57 additions and 57 deletions.
2 changes: 1 addition & 1 deletion FAQ.md
@@ -61,6 +61,6 @@ You don't.

Semantic versioning [explicitly](https://semver.org/#spec-item-9) grants prereleases an exception to its compatibility semantics. This makes each contiguous series of prereleases a subsequence where "anything goes."

Thema takes the stance that it is preferable to _never_ suspend version number-implied guarantees, and instead lean hard into the system of lenses, translations, and lacunae. In other words, it's fine to experiment and make breaking changes within your Thema, so long as you write lenses and lacunae that can lead your users' objects to solid ground.
Thema takes the stance that it is preferable to _never_ suspend version number-implied guarantees, and instead lean hard into the system of lenses, translations, and lacunas. In other words, it's fine to experiment and make breaking changes within your Thema, so long as you write lenses and lacunas that can lead your users' objects to solid ground.

Support for indicating a maturity level on individual schema may be added in the future. But it would have no bearing on core Thema guarantees. Instead, maturity would be an opaque string, used purely for signalling between humans: "we're really not sure about this yet; future lenses for translating from this schema may be sloppy!"
12 changes: 6 additions & 6 deletions docs/authoring.md
@@ -279,15 +279,15 @@ Applied to some concrete JSON (with a `version` field implicitly added to the sc

The output is valid, but less than ideal. Are we just going to have `-1` values littered all over our instances of `Ship.secondfield`? When would those get cleaned up? Does choosing `-1` as a placeholder grant special semantics to that particular value in perpetuity?

These questions bring us to the last part of thema: Lacunae.
These questions bring us to the last part of thema: Lacunas.

## Emitting a lacuna

Thema's professed guarantee - all prior valid instances of schema will be translatable to all future valid instances of schema - sounds lovely. But the `secondfield` case shows it to be a pressure vessel, fit to burst when requirements evolve in just _slightly_ the wrong way. Other schema systems are similar - they "burst" when folks make breaking changes to attain the semantics they want. And it's nice that thema pushes this out further with the ability to encode translations in lenses. But eventually, it'll still burst, and folks will pick their desired semantics over thema's rules - just like they do today.

To prevent this outcome, what we really need is a pressure release valve. Which is where lacunae come in.
To prevent this outcome, what we really need is a pressure release valve. Which is where lacunas come in.

Lacunae represent a gap or flaw in a lens translation. As a lineage author, you add a lacuna to your lens when the translation results in a message that, while syntactically valid (it conforms to schema), has problematic semantics. Lacunae are accumulated during translation, and returned alongside the translated instance itself.
Lacunas represent a gap or flaw in a lens translation. As a lineage author, you add a lacuna to your lens when the translation results in a message that, while syntactically valid (it conforms to schema), has problematic semantics. Lacunas are accumulated during translation, and returned alongside the translated instance itself.

Thema defines a limited set of lacuna types that correspond to different types of flaws. (This area is under active development.) For our case, we should emit a `Placeholder` lacuna.

@@ -319,7 +319,7 @@ lin: seqs: [
firstfield: from.firstfield
secondfield: -1
}
lacunae: [
lacunas: [
thema.#Lacuna & {
targetFields: [{
path: "secondfield"
@@ -358,7 +358,7 @@ Basic inputs and outputs:
"firstfield": "foobar",
"secondfield": -1
},
"lacunae": [
"lacunas": [
{
"sourceFields": [],
"targetFields": [
@@ -375,7 +375,7 @@ Basic inputs and outputs:
}
```

Encapsulating translation flaws as lacunae relieves pressure on the schemas and translation. Schemas need not carry extraneous, legacy fields to reflect translation flaws, and lacunae can disambiguate for the calling program between translations with flaws, and those without. In this case, we can imagine `secondfield` is actually a serial identifier/foreign key, and the calling program can be constructed to look for a `Placeholder` lacuna on `secondfield`, then replace that `-1` with a correct value derived from another source.
Encapsulating translation flaws as lacunas relieves pressure on the schemas and translation. Schemas need not carry extraneous, legacy fields to reflect translation flaws, and lacunas can disambiguate for the calling program between translations with flaws, and those without. In this case, we can imagine `secondfield` is actually a serial identifier/foreign key, and the calling program can be constructed to look for a `Placeholder` lacuna on `secondfield`, then replace that `-1` with a correct value derived from another source.
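
To make that last point concrete, here is a rough Go sketch of a caller repairing the placeholder after translation. Only `thema.TranslationLacunas` and its `AsList()` method are taken from the Go bindings in this repository; the `TargetFields`/`Path` field names, the Go `Ship` struct, and `lookupRealID` are assumptions modeled on the JSON output above.

```go
package shipapp

import "github.com/grafana/thema"

// Ship mirrors the tutorial schema; the Go field names are assumptions.
type Ship struct {
	Firstfield  string `json:"firstfield"`
	Secondfield int64  `json:"secondfield"`
}

// lookupRealID is a hypothetical stand-in for deriving the real identifier
// from some other source.
func lookupRealID(firstfield string) int64 {
	return 42
}

// repairPlaceholders replaces the -1 placeholder wherever the translation
// reported a lacuna targeting secondfield.
func repairPlaceholders(ship *Ship, lac thema.TranslationLacunas) {
	for _, l := range lac.AsList() {
		for _, tf := range l.TargetFields { // assumed field name, mirroring "targetFields" in the JSON above
			if tf.Path == "secondfield" { // assumed field name, mirroring "path" in the JSON above
				ship.Secondfield = lookupRealID(ship.Firstfield)
			}
		}
	}
}
```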

Knowing when to emit a lacuna, and which type to emit, is nontrivial. The set of lacuna types and precise rules for when and how to use them appropriately are under active development. We will, in future, provide documentation specific to each lacuna type. In the meantime, the [exemplars directory](https://github.com/grafana/thema/tree/main/exemplars) contains a number of examples of lacuna use.

2 changes: 1 addition & 1 deletion docs/go-usage.md
@@ -319,7 +319,7 @@ type Ship struct {

// JSONToShip converts a byte slice of JSON data containing a single instance of
// ship valid against any schema
func JSONToShip(data []byte) (*Ship, thema.TranslationLacunae, error) {
func JSONToShip(data []byte) (*Ship, thema.TranslationLacunas, error) {
ship, lac, err := jshipk.Converge(data)
if err != nil {
return nil, nil, err
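As a minimal sketch of using the converter above (assuming the tutorial's `Ship` type, `JSONToShip`, and the standard `log` package; the decision to merely log each lacuna is an illustrative choice, not part of the tutorial):

```go
func loadShip(rawJSON []byte) (*Ship, error) {
	ship, lac, err := JSONToShip(rawJSON)
	if err != nil {
		return nil, err
	}
	if lac != nil {
		for _, l := range lac.AsList() {
			// A real program might error out, or mutate ship, instead of just logging.
			log.Printf("translation lacuna: %s", l.Message)
		}
	}
	return ship, nil
}
```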
6 changes: 3 additions & 3 deletions docs/overview.md
@@ -58,13 +58,13 @@ This pattern begins with a three-step process, typically executed at the program

1. Receive some input data and `ValidateAny()` to confirm it is an instance, and of what schema version
2. `Translate()` the instance to the schema version the program is currently designed to work with
3. Decide what to do with any lacunae emitted from translation - for example: ignore, log, error out, mutate the translated instance
3. Decide what to do with any lacunas emitted from translation - for example: ignore, log, error out, mutate the translated instance

This animation illustrates a program performing these first two steps across varying numbers of the schemas from the example above:

![Validate and Translate](validate-and-translate.gif) TODO fixup the graffle, make the gif

Once this process is complete, the program can continue (or terminate based on observed lacunae) to perform useful behavior based on the input, now known to be both a) valid and b) represented in the form of the schema version against which the program has been written. Versioning and translation has been encapsulated at the program boundary, and the rest of the program can safely pretend that only the one version of the schema exists.
Once this process is complete, the program can continue (or terminate based on observed lacunas) to perform useful behavior based on the input, now known to be both a) valid and b) represented in the form of the schema version against which the program has been written. Versioning and translation has been encapsulated at the program boundary, and the rest of the program can safely pretend that only the one version of the schema exists.
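
A rough Go sketch of this boundary pattern, built around the `InputKernel.Converge` signature from `kernel/input.go` in this commit; kernel construction is elided, and `doUsefulWork`, the logging choice, and the `kernel`/`log` imports are assumptions.

```go
// handleInput validates and translates raw input at the program boundary,
// then hands a single-version object to the rest of the program.
func handleInput(k kernel.InputKernel, data []byte) error {
	obj, lac, err := k.Converge(data) // steps 1 and 2: validate, then translate to the kernel's configured version
	if err != nil {
		return err // not a valid instance of any schema in the lineage
	}
	if lac != nil {
		for _, l := range lac.AsList() {
			// step 3: decide what to do with emitted lacunas; this sketch just logs them
			log.Printf("lacuna emitted during translation: %s", l.Message)
		}
	}
	doUsefulWork(obj) // hypothetical; obj now reflects the one schema version this program targets
	return nil
}
```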

Deeper exploration and concrete examples are available in the [tutorial on using Thema from Go](go-usage.md).

@@ -81,4 +81,4 @@ Thema's guarantee arises from the combination of smaller, machine-checkable cons
Thema will be considered a mature, stable project when all the intended invariants are machine-checked. Even when this milestone is reached, however, certain caveats will remain:

* Programs need to have the most updated version of the lineage, in case they receive inputs that are valid, but against a schema they are not yet aware of. This implies a publishing and distribution model for lineages is necessary, as well as an append-only immutability requirement. See [Publishing (TODO)](publishing.md).
* Even with lenses and lacunae, some data semantics will result in practical limits on surpriseless translation of message intent. As the Project Cambria folks [note](https://www.inkandswitch.com/cambria/#findings), a practical breaking point will eventually be reached. Thema is not a magical silver bullet.
* Even with lenses and lacunas, some data semantics will result in practical limits on surpriseless translation of message intent. As the Project Cambria folks [note](https://www.inkandswitch.com/cambria/#findings), a practical breaking point will eventually be reached. Thema is not a magical silver bullet.
2 changes: 1 addition & 1 deletion docs/ship.cue
@@ -27,7 +27,7 @@ lin: seqs: [
firstfield: from.firstfield
secondfield: -1
}
lacunae: [
lacunas: [
thema.#Lacuna & {
targetFields: [{
path: "secondfield"
6 changes: 3 additions & 3 deletions exemplars/defaultchange.cue
@@ -33,8 +33,8 @@ defaultchange: {
aunion: from.anion
}
}
lacunae: [
// FIXME really feels like these lacunae should be able to be autogenerated, at least for simple cases?
lacunas: [
// FIXME really feels like these lacunas should be able to be autogenerated, at least for simple cases?
if from.aunion == "foo" {
thema.#Lacuna & {
sourceFields: [{
@@ -63,7 +63,7 @@ defaultchange: {
aunion: from.anion
}
}
lacunae: [
lacunas: [
if from.aunion == "foo" {
thema.#Lacuna & {
sourceFields: [{
4 changes: 2 additions & 2 deletions exemplars/narrowing.cue
@@ -32,7 +32,7 @@ narrowing: {
properbool: from.boolish
}
}
lacunae: [
lacunas: [
if ((from.boolish & string) != _|_) && ((from.boolish & ("true" | "false")) == _|_) {
thema.#Lacuna & {
sourceFields: [{
@@ -58,7 +58,7 @@ narrowing: {
// Preserving precise original form is a non-goal of thema in general.
boolish: from.properbool
}
lacunae: []
lacunas: []
}
}
]
4 changes: 2 additions & 2 deletions exemplars/rename.cue
@@ -30,7 +30,7 @@ rename: {
after: from.before
unchanged: from.unchanged
}
lacunae: []
lacunas: []
}
lens: reverse: {
to: seqs[0].schemas[0]
@@ -40,7 +40,7 @@ rename: {
before: from.after
unchanged: from.unchanged
}
lacunae: []
lacunas: []
}
}
]
8 changes: 4 additions & 4 deletions kernel/input.go
@@ -34,7 +34,7 @@ type InputKernelConfig struct {
// To is the schema version on which all operations will converge.
To thema.SyntacticVersion

// TODO add something for interrupting/mediating translation vis-a-vis accumulated lacunae
// TODO add something for interrupting/mediating translation vis-a-vis accumulated lacunas
}

// An InputKernel accepts all the valid inputs for a given lineage, converges
@@ -48,7 +48,7 @@ type InputKernel struct {
load DataLoader
lin thema.Lineage
to thema.SyntacticVersion
// TODO add something for interrupting/mediating translation vis-a-vis accumulated lacunae
// TODO add something for interrupting/mediating translation vis-a-vis accumulated lacunas
}

// NewInputKernel constructs an input kernel.
@@ -104,7 +104,7 @@ func NewInputKernel(cfg InputKernelConfig) (InputKernel, error) {
}

// Converge runs input data through the full kernel process: validate, translate to a
// fixed version, return transformed instance along with any emitted lacunae.
// fixed version, return transformed instance along with any emitted lacunas.
//
// Valid formats for the input data are determined by the DataLoader func with which
// the kernel was constructed. Invalid data will result in an error.
@@ -115,7 +115,7 @@ func NewInputKernel(cfg InputKernelConfig) (InputKernel, error) {
// which the kernel was constructed.
//
// It is safe to call Converge from multiple goroutines.
func (k InputKernel) Converge(data []byte) (interface{}, thema.TranslationLacunae, error) {
func (k InputKernel) Converge(data []byte) (interface{}, thema.TranslationLacunas, error) {
if !k.init {
panic("kernel not initialized")
}
10 changes: 5 additions & 5 deletions lacuna.cue
@@ -11,7 +11,7 @@ package thema
// A lacuna may be unconditional (the gap exists for all possible instances
// being translated between the schema pair) or conditional (the gap only exists
// when certain values appear in the instance being translated between schema).
// However, the conditionality of lacunae is expected to be expressed at the
// However, the conditionality of lacunas is expected to be expressed at the
// level of the lens, and determines whether a particular lacuna object is
// created; the production of a lacuna object as the output of a specific
// translation indicates the lacuna applies to that specific translation.
@@ -45,7 +45,7 @@ package thema

#LacunaTypes: [N=string]: #LacunaType & { name: N }
#LacunaTypes: {
// Placeholder lacunae indicate that a field in the target instance has
// Placeholder lacunas indicate that a field in the target instance has
// been filled with a placeholder value.
//
// Use Placeholder when introducing a new required field that lacks a default,
@@ -59,7 +59,7 @@
id: 1
}

// DroppedField lacunae indicate that field(s) in the source instance were
// DroppedField lacunas indicate that field(s) in the source instance were
// dropped in a manner that potentially lost some of their contained semantics.
//
// When a lens drops multiple fields, prefer to create one DroppedField
@@ -70,7 +70,7 @@
id: 2
}

// LossyFieldMapping lacunae indicate that no clear mapping existed from the
// LossyFieldMapping lacunas indicate that no clear mapping existed from the
// source field value to the intended semantics of any valid target field
// value.
//
@@ -80,7 +80,7 @@
id: 3
}

// ChangedDefault lacunae indicate that the source field value was the
// ChangedDefault lacunas indicate that the source field value was the
// schema-specified default, and the default changed in the target field,
// and the value in the instance was changed as well.
//
14 changes: 7 additions & 7 deletions lacuna.go
@@ -1,14 +1,14 @@
package thema

// TranslationLacunae defines common patterns for unary and composite lineages
// in the lacunae their translations emit.
type TranslationLacunae interface {
// TranslationLacunas defines common patterns for unary and composite lineages
// in the lacunas their translations emit.
type TranslationLacunas interface {
AsList() []Lacuna
}

type flatLacunae []Lacuna
type flatLacunas []Lacuna

func (fl flatLacunae) AsList() []Lacuna {
func (fl flatLacunas) AsList() []Lacuna {
return fl
}

@@ -23,7 +23,7 @@ func (fl flatLacunae) AsList() []Lacuna {
// A lacuna may be unconditional (the gap exists for all possible instances
// being translated between the schema pair) or conditional (the gap only exists
// when certain values appear in the instance being translated between schema).
// However, the conditionality of lacunae is expected to be expressed at the
// However, the conditionality of lacunas is expected to be expressed at the
// level of the lens, and determines whether a particular lacuna object is
// created; the production of a lacuna object as the output of the translation
// of a particular instance indicates the lacuna applies to that specific
Expand All @@ -42,7 +42,7 @@ type Lacuna struct {
Message string `json:"message"`
}

// LacunaType assigns numeric identifiers to different classes of Lacunae.
// LacunaType assigns numeric identifiers to different classes of Lacunas.
//
// FIXME this is a terrible way of doing this and needs to change
type LacunaType uint16
4 changes: 2 additions & 2 deletions lineage.cue
@@ -54,14 +54,14 @@ import (
to: descendant
from: ancestor
rel: descendant
lacunae: [...#Lacuna]
lacunas: [...#Lacuna]
translated: to & rel
}
reverse: {
to: ancestor
from: descendant
rel: ancestor
lacunae: [...#Lacuna]
lacunas: [...#Lacuna]
translated: to & rel
}
}
2 changes: 1 addition & 1 deletion load/testdata/testmod/ship.cue
@@ -27,7 +27,7 @@ lin: seqs: [
firstfield: from.firstfield
secondfield: -1
}
lacunae: [
lacunas: [
thema.#Lacuna & {
targetFields: [{
path: "secondfield"
16 changes: 8 additions & 8 deletions translate.cue
@@ -8,7 +8,7 @@ import "list"
// continuing until the target schema version is reached.
//
// The out values are the instance in final translated form, the schema versions
// at which the translation started and ended, and any lacunae emitted during
// at which the translation started and ended, and any lacunas emitted during
// translation.
//
// TODO functionize
@@ -20,9 +20,9 @@
// Shape of output
out: {
linst: #LinkedInstance
lacunae: [...{
lacunas: [...{
v: #SyntacticVersion
lacunae: [...#Lacuna]
lacunas: [...#Lacuna]
}]
}

@@ -37,12 +37,12 @@
_#step: {
inst: inlinst.lin.joinSchema
v: #SyntacticVersion
lacunae: [...#Lacuna]
lacunas: [...#Lacuna]
}

// The accumulator holds the results of each translation step.
accum: list.Repeat([_#step], len(schemarange)+1)
accum: [{ inst: inlinst.inst, v: VF, lacunae: [] }, for i, vsch in schemarange {
accum: [{ inst: inlinst.inst, v: VF, lacunas: [] }, for i, vsch in schemarange {
let lasti = accum[i]
v: vsch.v

@@ -56,7 +56,7 @@
// with incomplete CUE structures
// inst: lasti.inst & (#Pick & { lin: inlin, v: vsch.v }).out
inst: lasti.inst & inlin.seqs[vsch.v[0]].schemas[vsch.v[1]]
lacunae: []
lacunas: []
}
if vsch.v[0] > lasti.v[0] {
// Crossing sequences. Translate via the explicit lens.
@@ -65,7 +65,7 @@
// last translation
let _lens = { from: lasti.inst } & inlin.seqs[vsch.v[0]].lens.forward
inst: _lens.translated
lacunae: _lens.lacunae
lacunas: _lens.lacunas
}
}]

@@ -75,7 +75,7 @@
v: accum[len(accum)-1].v
lin: inlin
}
lacunae: [for step in accum if len(step.lacunae) > 0 { v: step.v, lacunae: step.lacunae }]
lacunas: [for step in accum if len(step.lacunas) > 0 { v: step.v, lacunas: step.lacunas }]
}
}

