diff --git a/README.md b/README.md
index 8a7ea31e..c94859ac 100644
--- a/README.md
+++ b/README.md
@@ -47,7 +47,7 @@ See [python README](https://github.com/emer/leabra/blob/master/python/README.md)
 
 * Go uses `interfaces` to represent abstract collections of functionality (i.e., sets of methods). The `emer` package provides a set of interfaces for each structural level (e.g., `emer.Layer` etc) -- any given specific layer must implement all of these methods, and the structural containers (e.g., the list of layers in a network) are lists of these interfaces. An interface is implicitly a *pointer* to an actual concrete object that implements the interface.
 
-* To allow for specialized algorithms to extend the basic Leabra algorithm functionality, we have additional algorithm-specific interfaces in `leabra/leabra/leabra.go`, called `LeabraNetwork`, `LeabraLayer`, and `LeabraPrjn` -- all functions should go through this interface so that the final actual function called can be either the default version defined on `leabra.Layer` or a more specialized type (e.g., for simulating the PFC, hippocampus, BG etc). This is what it looks like for example:
+* To allow for specialized algorithms to extend the basic Leabra algorithm functionality, we have additional algorithm-specific interfaces in `leabra/leabra/leabra.go`, called `LeabraNetwork`, `LeabraLayer`, and `LeabraPath` -- all functions should go through these interfaces so that the final actual function called can be either the default version defined on `leabra.Layer` or a more specialized type (e.g., for simulating the PFC, hippocampus, BG etc). This is what it looks like, for example:
 
 ```Go
 func (nt *Network) InitActs() {
@@ -62,7 +62,7 @@ func (nt *Network) InitActs() {
 
 * The emer interfaces are designed to support generic access to network state, e.g., for the 3D network viewer, but specifically avoid anything algorithmic. Thus, they allow viewing of any kind of network, including PyTorch backprop nets in the eTorch package.
 
-* There are 3 main levels of structure: `Network`, `Layer` and `Prjn` (projection). The Network calls methods on its Layers, and Layers iterate over both `Neuron` data structures (which have only a minimal set of methods) and the `Prjn`s, to implement the relevant computations. The `Prjn` fully manages everything about a projection of connectivity between two layers, including the full list of `Syanpse` elements in the connection. There is no "ConGroup" or "ConState" level as was used in C++, which greatly simplifies many things. The Layer also has a set of `Pool` elements, one for each level at which inhibition is computed (there is always one for the Layer, and then optionally one for each Sub-Pool of units (*Pool* is the new simpler term for "Unit Group" from C++ emergent).
+* There are 3 main levels of structure: `Network`, `Layer` and `Path` (pathway). The Network calls methods on its Layers, and Layers iterate over both `Neuron` data structures (which have only a minimal set of methods) and the `Path`s, to implement the relevant computations. The `Path` fully manages everything about a pathway of connectivity between two layers, including the full list of `Synapse` elements in the connection. There is no "ConGroup" or "ConState" level as was used in C++, which greatly simplifies many things. The Layer also has a set of `Pool` elements, one for each level at which inhibition is computed: there is always one for the Layer, and then optionally one for each Sub-Pool of units (*Pool* is the new simpler term for "Unit Group" from C++ emergent).
 
 * Layers have a `Shape` property, using the `tensor.Shape` type (see `table` package), which specifies their n-dimensional (tensor) shape. Standard layers are expected to use a 2D Y*X shape (note: dimension order is now outer-to-inner or *RowMajor*), and a 4D shape then enables `Pools` ("unit groups") as hypercolumn-like structures within a layer that can have their own local level of inhibition, and are also used extensively for organizing patterns of connectivity.
 
@@ -78,7 +78,7 @@ Here are some of the additional supporting packages, organized by overall functi
 
 * [netview](netview) provides the `NetView` interactive 3D network viewer, implemented in the GoGi 3D framework.
 
-* [prjn](prjn) is a separate package for defining patterns of connectivity between layers (i.e., the `ProjectionSpec`s from C++ emergent). This is done using a fully independent structure that *only* knows about the shapes of the two layers, and it returns a fully general bitmap representation of the pattern of connectivity between them. The `leabra.Prjn` code then uses these patterns to do all the nitty-gritty of connecting up neurons. This makes the projection code *much* simpler compared to the ProjectionSpec in C++ emergent, which was involved in both creating the pattern and also all the complexity of setting up the actual connections themselves. This should be the *last* time any of those projection patterns need to be written (having re-written this code too many times in the C++ version as the details of memory allocations changed).
+* [path](path) is a separate package for defining patterns of connectivity between layers (i.e., the `ProjectionSpec`s from C++ emergent). This is done using a fully independent structure that *only* knows about the shapes of the two layers, and it returns a fully general bitmap representation of the pattern of connectivity between them. The `leabra.Path` code then uses these patterns to do all the nitty-gritty of connecting up neurons. This makes the pathway code *much* simpler compared to the ProjectionSpec in C++ emergent, which was involved in both creating the pattern and also all the complexity of setting up the actual connections themselves. This should be the *last* time any of those pathway patterns need to be written (having re-written this code too many times in the C++ version as the details of memory allocations changed).
 
 * [relpos](relpos) provides relative positioning of layers (right of, above, etc).
 
diff --git a/actrf/README.md b/actrf/README.md
index 77d8d243..d22e2fdd 100644
--- a/actrf/README.md
+++ b/actrf/README.md
@@ -2,7 +2,7 @@ Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/actrf)
 
 Package actrf provides activation-based receptive field computation, otherwise known as *reverse correlation* or *spike-triggered averaging*. It simply computes the activation weighted average of other *source* patterns of activation -- i.e., sum(act * src) / sum(src) which then shows you the patterns of source activity for which a given unit was active.
 
-The RF's are computed and stored in 4D tensors, where the outer 2D are the 2D projection of the activation tensor (e.g., the activations of units in a layer), and the inner 2D are the 2D projection of the source tensor.
+The RF's are computed and stored in 4D tensors, where the outer 2D are the 2D projection of the activation tensor (e.g., the activations of units in a layer), and the inner 2D are the 2D projection of the source tensor.
 
 This results in a nice standard RF plot that can be visualized in a tensor grid view.
 
diff --git a/actrf/actrf.go b/actrf/actrf.go
index e4c67b38..72010364 100644
--- a/actrf/actrf.go
+++ b/actrf/actrf.go
@@ -56,8 +56,8 @@ func (af *RF) Init(name string, act, src tensor.Tensor) {
 // does nothing if shape is already correct.
 // return shape ints
 func (af *RF) InitShape(act, src tensor.Tensor) []int {
-    aNy, aNx, _, _ := tensor.Prjn2DShape(act.Shape(), false)
-    sNy, sNx, _, _ := tensor.Prjn2DShape(src.Shape(), false)
+    aNy, aNx, _, _ := tensor.Projection2DShape(act.Shape(), false)
+    sNy, sNx, _, _ := tensor.Projection2DShape(src.Shape(), false)
     oshp := []int{aNy, aNx, sNy, sNx}
     if tensor.EqualInts(af.RF.Shp.Sizes, oshp) {
         return oshp
@@ -102,14 +102,14 @@ func (af *RF) Add(act, src tensor.Tensor, thr float32) {
     aNy, aNx, sNy, sNx := shp[0], shp[1], shp[2], shp[3]
     for sy := 0; sy < sNy; sy++ {
         for sx := 0; sx < sNx; sx++ {
-            tv := float32(tensor.Prjn2DValue(src, false, sy, sx))
+            tv := float32(tensor.Projection2DValue(src, false, sy, sx))
             if tv < thr {
                 continue
             }
             af.SumSrc.AddScalar([]int{sy, sx}, float64(tv))
             for ay := 0; ay < aNy; ay++ {
                 for ax := 0; ax < aNx; ax++ {
-                    av := float32(tensor.Prjn2DValue(act, false, ay, ax))
+                    av := float32(tensor.Projection2DValue(act, false, ay, ax))
                     af.SumProd.AddScalar([]int{ay, ax, sy, sx}, float64(av*tv))
                 }
             }
diff --git a/actrf/doc.go b/actrf/doc.go
index 11f7bf12..13a972bb 100644
--- a/actrf/doc.go
+++ b/actrf/doc.go
@@ -10,8 +10,8 @@ which then shows you the patterns of source activity for which a given
 unit was active.
 
 The RF's are computed and stored in 4D tensors, where the outer 2D are the
-2D projection of the activation tensor (e.g., the activations of units in
-a layer), and the inner 2D are the 2D projection of the source tensor.
+2D projection of the activation tensor (e.g., the activations of units in
+a layer), and the inner 2D are the 2D projection of the source tensor.
 
 This results in a nice standard RF plot that can be visualized in a tensor
 grid view.
 
diff --git a/actrf/running.go b/actrf/running.go
index 2388f08e..69f903c0 100644
--- a/actrf/running.go
+++ b/actrf/running.go
@@ -9,21 +9,21 @@ import "cogentcore.org/core/tensor"
 // RunningAvg computes a running-average activation-based receptive field
 // for activities act relative to source activations src (the thing we're projecting rf onto)
 // accumulating into output out, with time constant tau.
-// act and src are projected into a 2D space (tensor.Prjn2D* methods), and
+// act and src are projected into a 2D space (tensor.Projection2D* methods), and
 // resulting out is 4D of act outer and src inner.
 func RunningAvg(out *tensor.Float32, act, src tensor.Tensor, tau float32) {
     dt := 1 / tau
     cdt := 1 - dt
-    aNy, aNx, _, _ := tensor.Prjn2DShape(act.Shape(), false)
-    tNy, tNx, _, _ := tensor.Prjn2DShape(src.Shape(), false)
+    aNy, aNx, _, _ := tensor.Projection2DShape(act.Shape(), false)
+    tNy, tNx, _, _ := tensor.Projection2DShape(src.Shape(), false)
     oshp := []int{aNy, aNx, tNy, tNx}
     out.SetShape(oshp, "ActY", "ActX", "SrcY", "SrcX")
     for ay := 0; ay < aNy; ay++ {
         for ax := 0; ax < aNx; ax++ {
-            av := float32(tensor.Prjn2DValue(act, false, ay, ax))
+            av := float32(tensor.Projection2DValue(act, false, ay, ax))
             for ty := 0; ty < tNy; ty++ {
                 for tx := 0; tx < tNx; tx++ {
-                    tv := float32(tensor.Prjn2DValue(src, false, ty, tx))
+                    tv := float32(tensor.Projection2DValue(src, false, ty, tx))
                     oi := []int{ay, ax, ty, tx}
                     oo := out.Shape().Offset(oi)
                     ov := out.Values[oo]
diff --git a/decoder/softmax.go b/decoder/softmax.go
index 3efbaf0d..5d4ed9ea 100644
--- a/decoder/softmax.go
+++ b/decoder/softmax.go
@@ -242,7 +242,7 @@ type softMaxForSerialization struct {
     Weights []float32 `json:"weights"`
 }
 
-// Save saves the decoder weights to given file path.
+// Save saves the decoder weights to given file path.
 // If path ends in .gz, it will be gzipped.
 func (sm *SoftMax) Save(path string) error {
     file, err := os.Create(path)
@@ -265,7 +265,7 @@ func (sm *SoftMax) Save(path string) error {
     return encoder.Encode(softMaxForSerialization{Weights: sm.Weights.Values})
 }
 
-// Load loads the decoder weights from given file path.
+// Load loads the decoder weights from given file path.
 // If the shape of the decoder does not match the shape of the saved weights,
 // an error will be returned.
 func (sm *SoftMax) Load(path string) error {
diff --git a/doc.go b/doc.go
index f50647d3..45fc8fff 100644
--- a/doc.go
+++ b/doc.go
@@ -10,7 +10,7 @@ This top-level of the repository has no functional code -- everything is organiz
 into the following sub-repositories:
 
 * emer: defines the primary structural interfaces for emergent, at the level of
-Network, Layer, and Prjn (projection). These contain no algorithm-specific code
+Network, Layer, and Path (pathway). These contain no algorithm-specific code
 and are only about the overall structure of a network, sufficient to support
 general purpose tools such as the 3D NetView. It also houses widely used support
 classes used in algorithm-specific code, including things like MinMax and AvgMax, and also the
@@ -23,14 +23,14 @@ and easier support for making permuted random lists, etc.
 
 * netview provides the NetView interactive 3D network viewer, implemented in
 the Cogent Core 3D framework.
 
-* prjn is a separate package for defining patterns of connectivity between layers
+* path is a separate package for defining patterns of connectivity between layers
 (i.e., the ProjectionSpecs from C++ emergent). This is done using a fully independent
 structure that *only* knows about the shapes of the two layers, and it returns a fully general
-bitmap representation of the pattern of connectivity between them. The leabra.Prjn code
+bitmap representation of the pattern of connectivity between them. The leabra.Path code
 then uses these patterns to do all the nitty-gritty of connecting up neurons.
-This makes the projection code *much* simpler compared to the ProjectionSpec in C++ emergent,
+This makes the pathway code *much* simpler compared to the ProjectionSpec in C++ emergent,
 which was involved in both creating the pattern and also all the complexity of setting up the
-actual connections themselves. This should be the *last* time any of those projection patterns
+actual connections themselves. This should be the *last* time any of those pathway patterns
 need to be written (having re-written this code too many times in the C++ version as the
 details of memory allocations changed).
 
diff --git a/econfig/README.md b/econfig/README.md
index 3122fa08..fc885f1c 100644
--- a/econfig/README.md
+++ b/econfig/README.md
@@ -26,7 +26,7 @@ Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/econfig)
 
 * Is a replacement for `ecmd` and includes the helper methods for saving log files etc.
 
-* A `map[string]any` type can be used for deferred raw params to be applied later (`Network`, `Env` etc). Example: `Network = {'.PFCLayer:Layer.Inhib.Layer.Gi' = '2.4', '#VSPatchPrjn:Prjn.Learn.LRate' = '0.01'}` where the key expression contains the [params](../params) selector : path to variable.
+* A `map[string]any` type can be used for deferred raw params to be applied later (`Network`, `Env` etc). Example: `Network = {'.PFCLayer:Layer.Inhib.Layer.Gi' = '2.4', '#VSPatchPath:Path.Learn.LRate' = '0.01'}` where the key expression contains the [params](../params) selector : path to variable.
 
 * Supports full set of `Open` (file), `OpenFS` (takes fs.FS arg, e.g., for embedded), `Read` (bytes) methods for loading config files. The overall `Config()` version uses `OpenWithIncludes` which processes includes -- others are just for single files. Also supports `Write` and `Save` methods for saving from current state.
 
@@ -47,7 +47,7 @@ Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/econfig)
 
 ```toml
 [Params.Network]
     "#Output:Layer.Inhib.Layer.Gi" = 0.7
-    "Prjn:Prjn.Learn.LRate.Base" = 0.05
+    "Path:Path.Learn.LRate.Base" = 0.05
 ```
 
 * Field tag `default:"value"`, used in the [Cogent Core](https://cogentcore.org/core) GUI, sets the initial default value and is shown for the `-h` or `--help` usage info.
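To make the renamed selector keys above concrete, here is a minimal sketch of how a deferred `#Name` key such as `#VSPatchPath:Path.Learn.LRate` could be split into its selector and variable path and applied with the `emer.SetFloatParam` helper that appears later in this diff. The `applyDeferredParam` function is hypothetical and not part of the library; only the `#Name` selector form is handled here:

```Go
package main

import (
	"fmt"
	"strings"

	"github.com/emer/emergent/v2/emer"
)

// applyDeferredParam is a hypothetical helper: it splits a deferred param
// key of the form "#Name:Type.Path.To.Var" at the ':' separator and applies
// the value via emer.SetFloatParam. The .Class selector form would instead
// require the full params styling machinery.
func applyDeferredParam(net emer.Network, key string, val float32) error {
	sel, path, ok := strings.Cut(key, ":") // e.g., "#VSPatchPath", "Path.Learn.LRate"
	if !ok {
		return fmt.Errorf("key %q missing ':' selector separator", key)
	}
	name := strings.TrimPrefix(sel, "#")   // specific object name, e.g., "VSPatchPath"
	typ := strings.SplitN(path, ".", 2)[0] // leading path element: "Path" or "Layer"
	return emer.SetFloatParam(net, name, typ, path, val)
}
```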
diff --git a/econfig/econfig_test.go b/econfig/econfig_test.go
index f054849e..0747c851 100644
--- a/econfig/econfig_test.go
+++ b/econfig/econfig_test.go
@@ -145,7 +145,7 @@ func TestArgs(t *testing.T) {
     cfg := &TestConfig{}
     SetFromDefaults(cfg)
     // note: cannot use "-Includes=testcfg.toml",
-    args := []string{"-save-wts", "-nogui", "-no-epoch-log", "--NoRunLog", "--runs=5", "--run", "1", "--TAG", "nice", "--PatParams.Sparseness=0.1", "--Network", "{'.PFCLayer:Layer.Inhib.Gi' = '2.4', '#VSPatchPrjn:Prjn.Learn.LRate' = '0.01'}", "-Enum=TestValue2", "-Slice=[3.2, 2.4, 1.9]", "leftover1", "leftover2"}
+    args := []string{"-save-wts", "-nogui", "-no-epoch-log", "--NoRunLog", "--runs=5", "--run", "1", "--TAG", "nice", "--PatParams.Sparseness=0.1", "--Network", "{'.PFCLayer:Layer.Inhib.Gi' = '2.4', '#VSPatchPath:Path.Learn.LRate' = '0.01'}", "-Enum=TestValue2", "-Slice=[3.2, 2.4, 1.9]", "leftover1", "leftover2"}
     allArgs := make(map[string]reflect.Value)
     FieldArgNames(cfg, allArgs)
     leftovers, err := ParseArgs(cfg, args, allArgs, true)
diff --git a/econfig/enum.go b/econfig/enum.go
index 03edc4f7..91282786 100644
--- a/econfig/enum.go
+++ b/econfig/enum.go
@@ -10,7 +10,7 @@ package econfig
 type TestEnum int32 //enums:enum
 
 // note: we need to add the Layer extension to avoid naming
-// conflicts between layer, projection and other things.
+// conflicts between layer, pathway and other things.
 const (
     TestValue1 TestEnum = iota
 
diff --git a/elog/README.md b/elog/README.md
index 02862576..0ca1187f 100644
--- a/elog/README.md
+++ b/elog/README.md
@@ -261,7 +261,7 @@ Iterate over layers of interest (use `LayersByClass` function). It is *essential
 }
 ```
 
-Here's how to log a projection variable:
+Here's how to log a pathway variable:
 
 ```Go
     ss.Logs.AddItem(&elog.Item{
@@ -271,7 +271,7 @@ Here's how to log a projection variable:
         Range:  minmax.F32{Max: 1},
         Write: elog.WriteMap{
             etime.Scope(etime.Train, etime.Trial): func(ctx *elog.Context) {
-                ffpj := cly.RecvPrjn(0).(*axon.Prjn)
+                ffpj := cly.RecvPath(0).(*axon.Path)
                 ctx.SetFloat32(ffpj.GScale.AvgMax)
             }, etime.Scope(etime.AllModes, etime.Epoch): func(ctx *elog.Context) {
                 ctx.SetAgg(ctx.Mode, etime.Trial, stats.Mean)
diff --git a/emer/README.md b/emer/README.md
index c4cb061f..23509b00 100644
--- a/emer/README.md
+++ b/emer/README.md
@@ -2,7 +2,7 @@ Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/v2/emer)
 
 Package emer provides minimal interfaces for the basic structural elements of neural networks including:
-* emer.Network, emer.Layer, emer.Unit, emer.Prjn (projection that interconnects layers)
+* emer.Network, emer.Layer, emer.Unit, emer.Path (pathway that interconnects layers)
 
 These interfaces are intended to be just sufficient to support visualization and generic
 analysis kinds of functions, but explicitly avoid exposing ANY of the algorithmic aspects,
diff --git a/emer/doc.go b/emer/doc.go
index cf6aa043..e186db5a 100644
--- a/emer/doc.go
+++ b/emer/doc.go
@@ -5,7 +5,7 @@
 /*
 Package emer provides minimal interfaces for the basic structural elements of neural networks including:
-* emer.Network, emer.Layer, emer.Unit, emer.Prjn (projection that interconnects layers)
+* emer.Network, emer.Layer, emer.Unit, emer.Path (pathway that interconnects layers)
 
 These interfaces are intended to be just sufficient to support visualization and generic
 analysis kinds of functions, but explicitly avoid exposing ANY of the algorithmic aspects,
diff --git a/emer/enumgen.go b/emer/enumgen.go
index 95fa2a46..4230ba2e 100644
--- a/emer/enumgen.go
+++ b/emer/enumgen.go
@@ -49,43 +49,43 @@ func (i *LayerType) UnmarshalText(text []byte) error {
     return enums.UnmarshalText(i, text, "LayerType")
 }
 
-var _PrjnTypeValues = []PrjnType{0, 1, 2, 3}
+var _PathTypeValues = []PathType{0, 1, 2, 3}
 
-// PrjnTypeN is the highest valid value for type PrjnType, plus one.
-const PrjnTypeN PrjnType = 4
+// PathTypeN is the highest valid value for type PathType, plus one.
+const PathTypeN PathType = 4
 
-var _PrjnTypeValueMap = map[string]PrjnType{`Forward`: 0, `Back`: 1, `Lateral`: 2, `Inhib`: 3}
+var _PathTypeValueMap = map[string]PathType{`Forward`: 0, `Back`: 1, `Lateral`: 2, `Inhib`: 3}
 
-var _PrjnTypeDescMap = map[PrjnType]string{0: `Forward is a feedforward, bottom-up projection from sensory inputs to higher layers`, 1: `Back is a feedback, top-down projection from higher layers back to lower layers`, 2: `Lateral is a lateral projection within the same layer / area`, 3: `Inhib is an inhibitory projection that drives inhibitory synaptic inputs instead of excitatory`}
+var _PathTypeDescMap = map[PathType]string{0: `Forward is a feedforward, bottom-up pathway from sensory inputs to higher layers`, 1: `Back is a feedback, top-down pathway from higher layers back to lower layers`, 2: `Lateral is a lateral pathway within the same layer / area`, 3: `Inhib is an inhibitory pathway that drives inhibitory synaptic inputs instead of excitatory`}
 
-var _PrjnTypeMap = map[PrjnType]string{0: `Forward`, 1: `Back`, 2: `Lateral`, 3: `Inhib`}
+var _PathTypeMap = map[PathType]string{0: `Forward`, 1: `Back`, 2: `Lateral`, 3: `Inhib`}
 
-// String returns the string representation of this PrjnType value.
-func (i PrjnType) String() string { return enums.String(i, _PrjnTypeMap) }
+// String returns the string representation of this PathType value.
+func (i PathType) String() string { return enums.String(i, _PathTypeMap) }
 
-// SetString sets the PrjnType value from its string representation,
+// SetString sets the PathType value from its string representation,
 // and returns an error if the string is invalid.
-func (i *PrjnType) SetString(s string) error {
-    return enums.SetString(i, s, _PrjnTypeValueMap, "PrjnType")
+func (i *PathType) SetString(s string) error {
+    return enums.SetString(i, s, _PathTypeValueMap, "PathType")
 }
 
-// Int64 returns the PrjnType value as an int64.
-func (i PrjnType) Int64() int64 { return int64(i) }
+// Int64 returns the PathType value as an int64.
+func (i PathType) Int64() int64 { return int64(i) }
 
-// SetInt64 sets the PrjnType value from an int64.
-func (i *PrjnType) SetInt64(in int64) { *i = PrjnType(in) }
+// SetInt64 sets the PathType value from an int64.
+func (i *PathType) SetInt64(in int64) { *i = PathType(in) }
 
-// Desc returns the description of the PrjnType value.
-func (i PrjnType) Desc() string { return enums.Desc(i, _PrjnTypeDescMap) }
+// Desc returns the description of the PathType value.
+func (i PathType) Desc() string { return enums.Desc(i, _PathTypeDescMap) }
 
-// PrjnTypeValues returns all possible values for the type PrjnType.
-func PrjnTypeValues() []PrjnType { return _PrjnTypeValues }
+// PathTypeValues returns all possible values for the type PathType.
+func PathTypeValues() []PathType { return _PathTypeValues }
 
-// Values returns all possible values for the type PrjnType.
-func (i PrjnType) Values() []enums.Enum { return enums.Values(_PrjnTypeValues) }
+// Values returns all possible values for the type PathType.
+func (i PathType) Values() []enums.Enum { return enums.Values(_PathTypeValues) }
 
 // MarshalText implements the [encoding.TextMarshaler] interface.
-func (i PrjnType) MarshalText() ([]byte, error) { return []byte(i.String()), nil }
+func (i PathType) MarshalText() ([]byte, error) { return []byte(i.String()), nil }
 
 // UnmarshalText implements the [encoding.TextUnmarshaler] interface.
-func (i *PrjnType) UnmarshalText(text []byte) error { return enums.UnmarshalText(i, text, "PrjnType") }
+func (i *PathType) UnmarshalText(text []byte) error { return enums.UnmarshalText(i, text, "PathType") }
diff --git a/emer/layer.go b/emer/layer.go
index ce19eb04..1423fcac 100644
--- a/emer/layer.go
+++ b/emer/layer.go
@@ -45,7 +45,7 @@ type Layer interface {
     IsOff() bool
 
     // SetOff sets the "off" (lesioned) status of layer. Also sets the Off state of all
-    // projections from this layer to other layers.
+    // pathways from this layer to other layers.
     SetOff(off bool)
 
     // Shape returns the organization of units in the layer, in terms of an array of dimensions.
@@ -193,62 +193,62 @@ type Layer interface {
     // di is a data parallel index di, for networks capable of processing input patterns in parallel.
     UnitValue(varNm string, idx []int, di int) float32
 
-    // NRecvPrjns returns the number of receiving projections
-    NRecvPrjns() int
+    // NRecvPaths returns the number of receiving pathways
+    NRecvPaths() int
 
-    // RecvPrjn returns a specific receiving projection
-    RecvPrjn(idx int) Prjn
+    // RecvPath returns a specific receiving pathway
+    RecvPath(idx int) Path
 
-    // NSendPrjns returns the number of sending projections
-    NSendPrjns() int
+    // NSendPaths returns the number of sending pathways
+    NSendPaths() int
 
-    // SendPrjn returns a specific sending projection
-    SendPrjn(idx int) Prjn
+    // SendPath returns a specific sending pathway
+    SendPath(idx int) Path
 
-    // SendNameTry looks for a projection connected to this layer whose sender layer has a given name
-    SendNameTry(sender string) (Prjn, error)
+    // SendNameTry looks for a pathway connected to this layer whose sender layer has a given name
+    SendNameTry(sender string) (Path, error)
 
-    // SendNameTypeTry looks for a projection connected to this layer whose sender layer has a given name and type
-    SendNameTypeTry(sender, typ string) (Prjn, error)
+    // SendNameTypeTry looks for a pathway connected to this layer whose sender layer has a given name and type
+    SendNameTypeTry(sender, typ string) (Path, error)
 
-    // RecvNameTry looks for a projection connected to this layer whose receiver layer has a given name
-    RecvNameTry(recv string) (Prjn, error)
+    // RecvNameTry looks for a pathway connected to this layer whose receiver layer has a given name
+    RecvNameTry(recv string) (Path, error)
 
-    // RecvNameTypeTry looks for a projection connected to this layer whose receiver layer has a given name and type
-    RecvNameTypeTry(recv, typ string) (Prjn, error)
+    // RecvNameTypeTry looks for a pathway connected to this layer whose receiver layer has a given name and type
+    RecvNameTypeTry(recv, typ string) (Path, error)
 
-    // RecvPrjnValues fills in values of given synapse variable name,
-    // for projection from given sending layer and neuron 1D index,
+    // RecvPathValues fills in values of given synapse variable name,
+    // for pathway from given sending layer and neuron 1D index,
     // for all receiving neurons in this layer,
     // into given float32 slice (only resized if not big enough).
-    // prjnType is the string representation of the prjn type -- used if non-empty,
-    // useful when there are multiple projections between two layers.
+    // pathType is the string representation of the path type -- used if non-empty,
+    // useful when there are multiple pathways between two layers.
     // Returns error on invalid var name.
     // If the receiving neuron is not connected to the given sending layer or neuron
     // then the value is set to math32.NaN().
-    // Returns error on invalid var name or lack of recv prjn (vals always set to nan on prjn err).
-    RecvPrjnValues(vals *[]float32, varNm string, sendLay Layer, sendIndex1D int, prjnType string) error
+    // Returns error on invalid var name or lack of recv path (vals always set to nan on path err).
+    RecvPathValues(vals *[]float32, varNm string, sendLay Layer, sendIndex1D int, pathType string) error
 
-    // SendPrjnValues fills in values of given synapse variable name,
-    // for projection into given receiving layer and neuron 1D index,
+    // SendPathValues fills in values of given synapse variable name,
+    // for pathway into given receiving layer and neuron 1D index,
     // for all sending neurons in this layer,
     // into given float32 slice (only resized if not big enough).
-    // prjnType is the string representation of the prjn type -- used if non-empty,
-    // useful when there are multiple projections between two layers.
+    // pathType is the string representation of the path type -- used if non-empty,
+    // useful when there are multiple pathways between two layers.
     // Returns error on invalid var name.
     // If the sending neuron is not connected to the given receiving layer or neuron
     // then the value is set to math32.NaN().
-    // Returns error on invalid var name or lack of recv prjn (vals always set to nan on prjn err).
-    SendPrjnValues(vals *[]float32, varNm string, recvLay Layer, recvIndex1D int, prjnType string) error
+    // Returns error on invalid var name or lack of recv path (vals always set to nan on path err).
+    SendPathValues(vals *[]float32, varNm string, recvLay Layer, recvIndex1D int, pathType string) error
 
-    // Defaults sets default parameter values for all Layer and recv projection parameters
+    // Defaults sets default parameter values for all Layer and recv pathway parameters
     Defaults()
 
-    // UpdateParams() updates parameter values for all Layer and recv projection parameters,
+    // UpdateParams() updates parameter values for all Layer and recv pathway parameters,
     // based on any other params that might have changed.
     UpdateParams()
 
-    // ApplyParams applies given parameter style Sheet to this layer and its recv projections.
+    // ApplyParams applies given parameter style Sheet to this layer and its recv pathways.
     // Calls UpdateParams on anything set to ensure derived parameters are all updated.
     // If setMsg is true, then a message is printed to confirm each parameter that is set.
     // it always prints a message if a parameter fails to be set.
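As a quick illustration of the renamed accessor methods in the hunk above, here is a minimal sketch (not from the repository; the `listRecvPaths` helper is hypothetical) that walks a layer's receiving pathways through the `emer.Layer` interface:

```Go
package main

import (
	"fmt"

	"github.com/emer/emergent/v2/emer"
)

// listRecvPaths walks a layer's receiving pathways using only the renamed
// interface methods (NRecvPaths, RecvPath, SendLay, RecvLay, PathTypeName)
// and prints one line per pathway.
func listRecvPaths(ly emer.Layer) {
	for pi := 0; pi < ly.NRecvPaths(); pi++ {
		pj := ly.RecvPath(pi)
		fmt.Printf("%s -> %s (%s)\n",
			pj.SendLay().Name(), pj.RecvLay().Name(), pj.PathTypeName())
	}
}
```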
@@ -280,7 +280,7 @@ type Layer interface {
     // SetWts sets the weights for this layer from weights.Layer decoded values
     SetWts(lw *weights.Layer) error
 
-    // Build constructs the layer and projection state based on the layer shapes
+    // Build constructs the layer and pathway state based on the layer shapes
     // and patterns of interconnectivity
     Build() error
 
@@ -386,42 +386,42 @@ const (
 // we keep these here to make it easier for other packages to implement the emer.Layer interface
 // by just calling these methods
 
-func SendNameTry(l Layer, sender string) (Prjn, error) {
-    for pi := 0; pi < l.NRecvPrjns(); pi++ {
-        pj := l.RecvPrjn(pi)
+func SendNameTry(l Layer, sender string) (Path, error) {
+    for pi := 0; pi < l.NRecvPaths(); pi++ {
+        pj := l.RecvPath(pi)
         if pj.SendLay().Name() == sender {
             return pj, nil
         }
     }
-    return nil, fmt.Errorf("sending layer: %v not found in list of projections", sender)
+    return nil, fmt.Errorf("sending layer: %v not found in list of pathways", sender)
 }
 
-func RecvNameTry(l Layer, recv string) (Prjn, error) {
-    for pi := 0; pi < l.NSendPrjns(); pi++ {
-        pj := l.SendPrjn(pi)
+func RecvNameTry(l Layer, recv string) (Path, error) {
+    for pi := 0; pi < l.NSendPaths(); pi++ {
+        pj := l.SendPath(pi)
         if pj.RecvLay().Name() == recv {
             return pj, nil
         }
     }
-    return nil, fmt.Errorf("receiving layer: %v not found in list of projections", recv)
+    return nil, fmt.Errorf("receiving layer: %v not found in list of pathways", recv)
 }
 
-func SendNameTypeTry(l Layer, sender, typ string) (Prjn, error) {
-    for pi := 0; pi < l.NRecvPrjns(); pi++ {
-        pj := l.RecvPrjn(pi)
-        if pj.SendLay().Name() == sender && pj.PrjnTypeName() == typ {
+func SendNameTypeTry(l Layer, sender, typ string) (Path, error) {
+    for pi := 0; pi < l.NRecvPaths(); pi++ {
+        pj := l.RecvPath(pi)
+        if pj.SendLay().Name() == sender && pj.PathTypeName() == typ {
             return pj, nil
         }
     }
-    return nil, fmt.Errorf("sending layer: %v not found in list of projections", sender)
+    return nil, fmt.Errorf("sending layer: %v not found in list of pathways", sender)
 }
 
-func RecvNameTypeTry(l Layer, recv, typ string) (Prjn, error) {
-    for pi := 0; pi < l.NSendPrjns(); pi++ {
-        pj := l.SendPrjn(pi)
-        if pj.RecvLay().Name() == recv && pj.PrjnTypeName() == typ {
+func RecvNameTypeTry(l Layer, recv, typ string) (Path, error) {
+    for pi := 0; pi < l.NSendPaths(); pi++ {
+        pj := l.SendPath(pi)
+        if pj.RecvLay().Name() == recv && pj.PathTypeName() == typ {
             return pj, nil
         }
     }
-    return nil, fmt.Errorf("receiving layer: %v, type: %v not found in list of projections", recv, typ)
+    return nil, fmt.Errorf("receiving layer: %v, type: %v not found in list of pathways", recv, typ)
 }
diff --git a/emer/netparams.go b/emer/netparams.go
index d7d37929..d522efe4 100644
--- a/emer/netparams.go
+++ b/emer/netparams.go
@@ -33,7 +33,7 @@ type NetParams struct {
     // the network to apply parameters to
     Network Network `view:"-"`
 
-    // list of hyper parameters compiled from the network parameters, using the layers and projections from the network, so that the same styling logic as for regular parameters can be used
+    // list of hyper parameters compiled from the network parameters, using the layers and pathways from the network, so that the same styling logic as for regular parameters can be used
     NetHypers params.Flex `view:"-"`
 
     // print out messages for each parameter that is set
diff --git a/emer/network.go b/emer/network.go
index c8780abf..0aede233 100644
--- a/emer/network.go
+++ b/emer/network.go
@@ -43,9 +43,9 @@ type Network interface {
     // Layer names must be unique and a map is used so this is a fast operation
     LayerByNameTry(name string) (Layer, error)
 
-    // PrjnByNameTry returns prjn of given name, returns error if not found.
-    // Prjn names are SendToRecv, and are looked up by parsing the name
-    PrjnByNameTry(name string) (Prjn, error)
+    // PathByNameTry returns path of given name, returns error if not found.
+    // Path names are SendToRecv, and are looked up by parsing the name
+    PathByNameTry(name string) (Path, error)
 
     // Defaults sets default parameter values for everything in the Network
     Defaults()
@@ -54,7 +54,7 @@ type Network interface {
     // based on any other params that might have changed.
     UpdateParams()
 
-    // ApplyParams applies given parameter style Sheet to layers and prjns in this network.
+    // ApplyParams applies given parameter style Sheet to layers and paths in this network.
     // Calls UpdateParams on anything set to ensure derived parameters are all updated.
     // If setMsg is true, then a message is printed to confirm each parameter that is set.
     // it always prints a message if a parameter fails to be set.
@@ -72,9 +72,9 @@ type Network interface {
     // of the most important layer-level params (specific to each algorithm).
     KeyLayerParams() string
 
-    // KeyPrjnParams returns a listing for all Recv projections in the network,
-    // of the most important projection-level params (specific to each algorithm).
-    KeyPrjnParams() string
+    // KeyPathParams returns a listing for all Recv pathways in the network,
+    // of the most important pathway-level params (specific to each algorithm).
+    KeyPathParams() string
 
     // UnitVarNames returns a list of variable names available on the units in this network.
     // This list determines what is shown in the NetView (and the order of vars list).
@@ -97,7 +97,7 @@ type Network interface {
     // SynVarNames returns the names of all the variables on the synapses in this network.
     // This list determines what is shown in the NetView (and the order of vars list).
-    // Not all projections need to support all variables, but must safely return math32.NaN() for
+    // Not all pathways need to support all variables, but must safely return math32.NaN() for
     // unsupported ones.
     // This is typically a global list so do not modify!
     SynVarNames() []string
diff --git a/emer/params.go b/emer/params.go
index 16312289..0e7c0de4 100644
--- a/emer/params.go
+++ b/emer/params.go
@@ -31,7 +31,7 @@ type Params struct {
     // map of objects to apply parameters to -- the key is the name of the Sheet for each object, e.g.,
     Objects map[string]any `view:"-" Network", "Sim" are typically used"`
 
-    // list of hyper parameters compiled from the network parameters, using the layers and projections from the network, so that the same styling logic as for regular parameters can be used
+    // list of hyper parameters compiled from the network parameters, using the layers and pathways from the network, so that the same styling logic as for regular parameters can be used
     NetHypers params.Flex `view:"-"`
 
     // print out messages for each parameter that is set
@@ -286,7 +286,7 @@ func (pr *Params) SetObjectSet(objName, setName string) error {
 }
 
 // NetworkHyperParams returns the compiled hyper parameters from given Sheet
-// for each layer and projection in the network -- applies the standard css
+// for each layer and pathway in the network -- applies the standard css
 // styling logic for the hyper parameters.
 func NetworkHyperParams(net Network, sheet *params.Sheet) params.Flex {
     hypers := params.Flex{}
@@ -297,15 +297,15 @@ func NetworkHyperParams(net Network, sheet *params.Sheet) params.Flex {
         // typ := ly.Type().String()
         hypers[nm] = &params.FlexVal{Nm: nm, Type: "Layer", Cls: ly.Class(), Obj: params.Hypers{}}
     }
-    // separate projections
+    // separate pathways
     for li := 0; li < nl; li++ {
         ly := net.Layer(li)
-        np := ly.NRecvPrjns()
+        np := ly.NRecvPaths()
         for pi := 0; pi < np; pi++ {
-            pj := ly.RecvPrjn(pi)
+            pj := ly.RecvPath(pi)
             nm := pj.Name()
             // typ := pj.Type().String()
-            hypers[nm] = &params.FlexVal{Nm: nm, Type: "Prjn", Cls: pj.Class(), Obj: params.Hypers{}}
+            hypers[nm] = &params.FlexVal{Nm: nm, Type: "Path", Cls: pj.Class(), Obj: params.Hypers{}}
         }
     }
     for nm, vl := range hypers {
@@ -319,8 +319,8 @@ func NetworkHyperParams(net Network, sheet *params.Sheet) params.Flex {
     return hypers
 }
 
-// SetFloatParam sets given float32 param value to layer or projection
-// (typ = Layer or Prjn) of given name, at given path (which can start
+// SetFloatParam sets given float32 param value to layer or pathway
+// (typ = Layer or Path) of given name, at given path (which can start
 // with the typ name).
 // Returns an error (and logs it automatically) for any failure.
 func SetFloatParam(net Network, name, typ, path string, val float32) error {
@@ -338,8 +338,8 @@ func SetFloatParam(net Network, name, typ, path string, val float32) error {
             slog.Error(err.Error())
             return err
         }
-    case "Prjn":
-        pj, err := net.PrjnByNameTry(name)
+    case "Path":
+        pj, err := net.PathByNameTry(name)
         if err != nil {
             slog.Error(err.Error())
             return err
diff --git a/emer/prjn.go b/emer/path.go
similarity index 61%
rename from emer/prjn.go
rename to emer/path.go
index d29d7b09..60233b19 100644
--- a/emer/prjn.go
+++ b/emer/path.go
@@ -10,62 +10,59 @@ import (
 
     "cogentcore.org/core/base/reflectx"
     "github.com/emer/emergent/v2/params"
-    "github.com/emer/emergent/v2/prjn"
+    "github.com/emer/emergent/v2/paths"
     "github.com/emer/emergent/v2/weights"
 )
 
-// Prjn defines the basic interface for a projection which connects two layers.
+// Path defines the basic interface for a pathway which connects two layers.
 // Name is set automatically to: SendLay().Name() + "To" + RecvLay().Name()
-type Prjn interface {
+type Path interface {
     params.Styler // TypeName, Name, and Class methods for parameter styling
 
-    // Init MUST be called to initialize the prjn's pointer to itself as an emer.Prjn
+    // Init MUST be called to initialize the path's pointer to itself as an emer.Path
     // which enables the proper interface methods to be called.
-    Init(prjn Prjn)
+    Init(path Path)
 
-    // SendLay returns the sending layer for this projection
+    // SendLay returns the sending layer for this pathway
     SendLay() Layer
 
-    // RecvLay returns the receiving layer for this projection
+    // RecvLay returns the receiving layer for this pathway
     RecvLay() Layer
 
     // Pattern returns the pattern of connectivity for interconnecting the layers
-    Pattern() prjn.Pattern
+    Pattern() paths.Pattern
 
     // SetPattern sets the pattern of connectivity for interconnecting the layers.
-    // Returns Prjn so it can be chained to set other properties too
-    SetPattern(pat prjn.Pattern) Prjn
+    // Returns Path so it can be chained to set other properties too
+    SetPattern(pat paths.Pattern) Path
 
-    // Type returns the functional type of projection according to PrjnType (extensible in
+    // Type returns the functional type of pathway according to PathType (extensible in
     // more specialized algorithms)
-    Type() PrjnType
+    Type() PathType
 
-    // SetType sets the functional type of projection according to PrjnType
-    // Returns Prjn so it can be chained to set other properties too
-    SetType(typ PrjnType) Prjn
+    // SetType sets the functional type of pathway according to PathType
+    // Returns Path so it can be chained to set other properties too
+    SetType(typ PathType) Path
 
-    // PrjnTypeName returns the string rep of functional type of projection
-    // according to PrjnType (extensible in more specialized algorithms, by
+    // PathTypeName returns the string rep of functional type of pathway
+    // according to PathType (extensible in more specialized algorithms, by
     // redefining this method as needed).
-    PrjnTypeName() string
+    PathTypeName() string
 
-    // Connect sets the basic connection parameters for this projection (send, recv, pattern, and type)
-    // Connect(send, recv Layer, pat prjn.Pattern, typ PrjnType)
-
-    // AddClass adds a CSS-style class name(s) for this prjn,
+    // AddClass adds a CSS-style class name(s) for this path,
     // ensuring that it is not a duplicate, and properly space separated.
-    // Returns Prjn so it can be chained to set other properties too
-    AddClass(cls ...string) Prjn
+    // Returns Path so it can be chained to set other properties too
+    AddClass(cls ...string) Path
 
     // Label satisfies the core.Labeler interface for getting the name of objects generically
     Label() string
 
-    // IsOff returns true if projection or either send or recv layer has been turned Off.
+    // IsOff returns true if pathway or either send or recv layer has been turned Off.
     // Useful for experimentation
     IsOff() bool
 
-    // SetOff sets the projection Off status (i.e., lesioned). Careful: Layer.SetOff(true) will
-    // reactivate that layer's projections, so projection-level lesioning should always be called
+    // SetOff sets the pathway Off status (i.e., lesioned). Careful: Layer.SetOff(true) will
+    // reactivate that layer's pathways, so pathway-level lesioning should always be called
     // after layer-level lesioning.
     SetOff(off bool)
 
@@ -90,15 +87,15 @@ type Prjn interface {
     SynIndex(sidx, ridx int) int
 
     // SynVarIndex returns the index of given variable within the synapse,
-    // according to *this prjn's* SynVarNames() list (using a map to lookup index),
+    // according to *this path's* SynVarNames() list (using a map to lookup index),
     // or -1 and error message if not found.
     SynVarIndex(varNm string) (int, error)
 
     // SynVarNum returns the number of synapse-level variables
-    // for this prjn. This is needed for extending indexes in derived types.
+    // for this path. This is needed for extending indexes in derived types.
     SynVarNum() int
 
-    // Syn1DNum returns the number of synapses for this prjn as a 1D array.
+    // Syn1DNum returns the number of synapses for this path as a 1D array.
     // This is the max idx for SynVal1D and the number of vals set by SynValues.
     Syn1DNum() int
 
@@ -126,14 +123,14 @@ type Prjn interface {
     // Returns error for access errors.
     SetSynValue(varNm string, sidx, ridx int, val float32) error
 
-    // Defaults sets default parameter values for all Prjn parameters
+    // Defaults sets default parameter values for all Path parameters
     Defaults()
 
-    // UpdateParams() updates parameter values for all Prjn parameters,
+    // UpdateParams() updates parameter values for all Path parameters,
     // based on any other params that might have changed.
     UpdateParams()
 
-    // ApplyParams applies given parameter style Sheet to this projection.
+    // ApplyParams applies given parameter style Sheet to this pathway.
     // Calls UpdateParams if anything set to ensure derived parameters are all updated.
     // If setMsg is true, then a message is printed to confirm each parameter that is set.
     // it always prints a message if a parameter fails to be set.
@@ -151,29 +148,29 @@ type Prjn interface {
     // AllParams returns a listing of all parameters in the Projection
     AllParams() string
 
-    // WriteWtsJSON writes the weights from this projection from the receiver-side perspective
+    // WriteWtsJSON writes the weights from this pathway from the receiver-side perspective
     // in a JSON text format. We build in the indentation logic to make it much faster and
     // more efficient.
     WriteWtsJSON(w io.Writer, depth int)
 
-    // ReadWtsJSON reads the weights from this projection from the receiver-side perspective
-    // in a JSON text format. This is for a set of weights that were saved *for one prjn only*
+    // ReadWtsJSON reads the weights from this pathway from the receiver-side perspective
+    // in a JSON text format. This is for a set of weights that were saved *for one path only*
     // and is not used for the network-level ReadWtsJSON, which reads into a separate
     // structure -- see SetWts method.
     ReadWtsJSON(r io.Reader) error
 
-    // SetWts sets the weights for this projection from weights.Prjn decoded values
-    SetWts(pw *weights.Prjn) error
+    // SetWts sets the weights for this pathway from weights.Path decoded values
+    SetWts(pw *weights.Path) error
 
-    // Build constructs the full connectivity among the layers as specified in this projection.
+    // Build constructs the full connectivity among the layers as specified in this pathway.
     Build() error
 }
 
-// Prjns is a slice of projections
-type Prjns []Prjn
+// Paths is a slice of pathways
+type Paths []Path
 
 // ElemLabel satisfies the core.SliceLabeler interface to provide labels for slice elements
-func (pl *Prjns) ElemLabel(idx int) string {
+func (pl *Paths) ElemLabel(idx int) string {
     if len(*pl) == 0 {
         return "(empty)"
     }
@@ -187,13 +184,13 @@ func (pl *Prjns) ElemLabel(idx int) string {
     return pj.Name()
 }
 
-// Add adds a projection to the list
-func (pl *Prjns) Add(p Prjn) {
+// Add adds a pathway to the list
+func (pl *Paths) Add(p Path) {
     (*pl) = append(*pl, p)
 }
 
-// Send finds the projection with given send layer
-func (pl *Prjns) Send(send Layer) (Prjn, bool) {
+// Send finds the pathway with given send layer
+func (pl *Paths) Send(send Layer) (Path, bool) {
     for _, pj := range *pl {
         if pj.SendLay() == send {
             return pj, true
@@ -202,8 +199,8 @@ func (pl *Prjns) Send(send Layer) (Prjn, bool) {
     return nil, false
 }
 
-// Recv finds the projection with given recv layer
-func (pl *Prjns) Recv(recv Layer) (Prjn, bool) {
+// Recv finds the pathway with given recv layer
+func (pl *Paths) Recv(recv Layer) (Path, bool) {
     for _, pj := range *pl {
         if pj.RecvLay() == recv {
             return pj, true
@@ -212,88 +209,88 @@ func (pl *Prjns) Recv(recv Layer) (Prjn, bool) {
     return nil, false
 }
 
-// SendName finds the projection with given send layer name, nil if not found
+// SendName finds the pathway with given send layer name, nil if not found
 // see Try version for error checking.
-func (pl *Prjns) SendName(sender string) Prjn {
+func (pl *Paths) SendName(sender string) Path {
     pj, _ := pl.SendNameTry(sender)
     return pj
 }
 
-// RecvName finds the projection with given recv layer name, nil if not found
+// RecvName finds the pathway with given recv layer name, nil if not found
 // see Try version for error checking.
-func (pl *Prjns) RecvName(recv string) Prjn {
+func (pl *Paths) RecvName(recv string) Path {
     pj, _ := pl.RecvNameTry(recv)
     return pj
 }
 
-// SendNameTry finds the projection with given send layer name.
+// SendNameTry finds the pathway with given send layer name.
 // returns error message if not found
-func (pl *Prjns) SendNameTry(sender string) (Prjn, error) {
+func (pl *Paths) SendNameTry(sender string) (Path, error) {
     for _, pj := range *pl {
         if pj.SendLay().Name() == sender {
             return pj, nil
         }
     }
-    return nil, fmt.Errorf("sending layer: %v not found in list of projections", sender)
+    return nil, fmt.Errorf("sending layer: %v not found in list of pathways", sender)
 }
 
-// SendNameTypeTry finds the projection with given send layer name and Type string.
+// SendNameTypeTry finds the pathway with given send layer name and Type string.
 // returns error message if not found.
-func (pl *Prjns) SendNameTypeTry(sender, typ string) (Prjn, error) {
+func (pl *Paths) SendNameTypeTry(sender, typ string) (Path, error) {
     for _, pj := range *pl {
         if pj.SendLay().Name() == sender {
-            tstr := pj.PrjnTypeName()
+            tstr := pj.PathTypeName()
             if tstr == typ {
                 return pj, nil
             }
         }
     }
-    return nil, fmt.Errorf("sending layer: %v, type: %v not found in list of projections", sender, typ)
+    return nil, fmt.Errorf("sending layer: %v, type: %v not found in list of pathways", sender, typ)
 }
 
-// RecvNameTry finds the projection with given recv layer name.
+// RecvNameTry finds the pathway with given recv layer name.
 // returns error message if not found
-func (pl *Prjns) RecvNameTry(recv string) (Prjn, error) {
+func (pl *Paths) RecvNameTry(recv string) (Path, error) {
     for _, pj := range *pl {
         if pj.RecvLay().Name() == recv {
             return pj, nil
         }
     }
-    return nil, fmt.Errorf("receiving layer: %v not found in list of projections", recv)
+    return nil, fmt.Errorf("receiving layer: %v not found in list of pathways", recv)
 }
 
-// RecvNameTypeTry finds the projection with given recv layer name and Type string.
+// RecvNameTypeTry finds the pathway with given recv layer name and Type string.
 // returns error message if not found.
-func (pl *Prjns) RecvNameTypeTry(recv, typ string) (Prjn, error) {
+func (pl *Paths) RecvNameTypeTry(recv, typ string) (Path, error) {
     for _, pj := range *pl {
         if pj.RecvLay().Name() == recv {
-            tstr := pj.PrjnTypeName()
+            tstr := pj.PathTypeName()
             if tstr == typ {
                 return pj, nil
             }
         }
     }
-    return nil, fmt.Errorf("receiving layer: %v, type: %v not found in list of projections", recv, typ)
+    return nil, fmt.Errorf("receiving layer: %v, type: %v not found in list of pathways", recv, typ)
 }
 
 //////////////////////////////////////////////////////////////////////////////////////
-//  PrjnType
+//  PathType
 
-// PrjnType is the type of the projection (extensible for more specialized algorithms).
+// PathType is the type of the pathway (extensible for more specialized algorithms).
 // Class parameter styles automatically key off of these types.
-type PrjnType int32 //enums:enum
+type PathType int32 //enums:enum
 
-// The projection types
+// The pathway types
 const (
-    // Forward is a feedforward, bottom-up projection from sensory inputs to higher layers
-    Forward PrjnType = iota
+    // Forward is a feedforward, bottom-up pathway from sensory inputs to higher layers
+    Forward PathType = iota
 
-    // Back is a feedback, top-down projection from higher layers back to lower layers
+    // Back is a feedback, top-down pathway from higher layers back to lower layers
     Back
 
-    // Lateral is a lateral projection within the same layer / area
+    // Lateral is a lateral pathway within the same layer / area
     Lateral
 
-    // Inhib is an inhibitory projection that drives inhibitory synaptic inputs instead of excitatory
+    // Inhib is an inhibitory pathway that drives inhibitory synaptic inputs instead of excitatory
     Inhib
 )
diff --git a/emer/typegen.go b/emer/typegen.go
index e361b765..b9c7db1f 100644
--- a/emer/typegen.go
+++ b/emer/typegen.go
@@ -14,7 +14,7 @@ var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.LayerT
 
 var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.LayNames", IDName: "lay-names", Doc: "LayNames is a list of layer names.\nHas convenience methods for adding, validating."})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.NetParams", IDName: "net-params", Doc: "NetParams handles standard parameters for a Network only\n(use econfig and a Config struct for other configuration params)\nAssumes a Set named \"Base\" has the base-level parameters, which are\nalways applied first, followed optionally by additional Set(s)\nthat can have different parameters to try.", Fields: []types.Field{{Name: "Params", Doc: "full collection of param sets to use"}, {Name: "ExtraSheets", Doc: "optional additional sheets of parameters to apply after Base -- can use multiple names separated by spaces (don't put spaces in Sheet names!)"}, {Name: "Tag", Doc: "optional additional tag to add to file names, logs to identify params / run config"}, {Name: "Network", Doc: "the network to apply parameters to"}, {Name: "NetHypers", Doc: "list of hyper parameters compiled from the network parameters, using the layers and projections from the network, so that the same styling logic as for regular parameters can be used"}, {Name: "SetMsg", Doc: "print out messages for each parameter that is set"}}})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.NetParams", IDName: "net-params", Doc: "NetParams handles standard parameters for a Network only\n(use econfig and a Config struct for other configuration params)\nAssumes a Set named \"Base\" has the base-level parameters, which are\nalways applied first, followed optionally by additional Set(s)\nthat can have different parameters to try.", Fields: []types.Field{{Name: "Params", Doc: "full collection of param sets to use"}, {Name: "ExtraSheets", Doc: "optional additional sheets of parameters to apply after Base -- can use multiple names separated by spaces (don't put spaces in Sheet names!)"}, {Name: "Tag", Doc: "optional additional tag to add to file names, logs to identify params / run config"}, {Name: "Network", Doc: "the network to apply parameters to"}, {Name: "NetHypers", Doc: "list of hyper parameters compiled from the network parameters, using the layers and pathways from the network, so that the same styling logic as for regular parameters can be used"}, {Name: "SetMsg", Doc: "print out messages for each parameter that is set"}}})
 
 var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.LaySize", IDName: "lay-size", Doc: "LaySize contains parameters for size of layers", Fields: []types.Field{{Name: "Y", Doc: "Y (vertical) size of layer -- in units for 2D, or number of pools (outer dimension) for 4D layer"}, {Name: "X", Doc: "X (horizontal) size of layer -- in units for 2D, or number of pools (outer dimension) for 4D layer"}, {Name: "PoolY", Doc: "Y (vertical) size of each pool in units, only for 4D layers (inner dimension)"}, {Name: "PoolX", Doc: "X (horizontal) size of each pool in units, only for 4D layers (inner dimension)"}}})
 
@@ -22,10 +22,10 @@ var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.NetSiz
 
 var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Network", IDName: "network", Doc: "Network defines the basic interface for a neural network, used for managing the structural\nelements of a network, and for visualization, I/O, etc"})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Params", IDName: "params", Doc: "Params handles standard parameters for a Network and other objects.\nAssumes a Set named \"Base\" has the base-level parameters, which are\nalways applied first, followed optionally by additional Set(s)\nthat can have different parameters to try.", Fields: []types.Field{{Name: "Params", Doc: "full collection of param sets to use"}, {Name: "ExtraSets", Doc: "optional additional set(s) of parameters to apply after Base -- can use multiple names separated by spaces (don't put spaces in Set names!)"}, {Name: "Tag", Doc: "optional additional tag to add to file names, logs to identify params / run config"}, {Name: "Objects", Doc: "map of objects to apply parameters to -- the key is the name of the Sheet for each object, e.g.,"}, {Name: "NetHypers", Doc: "list of hyper parameters compiled from the network parameters, using the layers and projections from the network, so that the same styling logic as for regular parameters can be used"}, {Name: "SetMsg", Doc: "print out messages for each parameter that is set"}}})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Params", IDName: "params", Doc: "Params handles standard parameters for a Network and other objects.\nAssumes a Set named \"Base\" has the base-level parameters, which are\nalways applied first, followed optionally by additional Set(s)\nthat can have different parameters to try.", Fields: []types.Field{{Name: "Params", Doc: "full collection of param sets to use"}, {Name: "ExtraSets", Doc: "optional additional set(s) of parameters to apply after Base -- can use multiple names separated by spaces (don't put spaces in Set names!)"}, {Name: "Tag", Doc: "optional additional tag to add to file names, logs to identify params / run config"}, {Name: "Objects", Doc: "map of objects to apply parameters to -- the key is the name of the Sheet for each object, e.g.,"}, {Name: "NetHypers", Doc: "list of hyper parameters compiled from the network parameters, using the layers and pathways from the network, so that the same styling logic as for regular parameters can be used"}, {Name: "SetMsg", Doc: "print out messages for each parameter that is set"}}})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Prjn", IDName: "prjn", Doc: "Prjn defines the basic interface for a projection which connects two layers.\nName is set automatically to: SendLay().Name() + \"To\" + RecvLay().Name()"})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Path", IDName: "path", Doc: "Path defines the basic interface for a pathway which connects two layers.\nName is set automatically to: SendLay().Name() + \"To\" + RecvLay().Name()"})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Prjns", IDName: "prjns", Doc: "Prjns is a slice of projections"})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.Paths", IDName: "paths", Doc: "Paths is a slice of pathways"})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.PrjnType", IDName: "prjn-type", Doc: "PrjnType is the type of the projection (extensible for more specialized algorithms).\nClass parameter styles automatically key off of these types."})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/emer.PathType", IDName: "path-type", Doc: "PathType is the type of the pathway (extensible for more specialized algorithms).\nClass parameter styles automatically key off of these types."})
diff --git a/netparams/README.md b/netparams/README.md
index 013d18a5..48716aa8 100644
--- a/netparams/README.md
+++ b/netparams/README.md
@@ -19,7 +19,7 @@ Sets {
     },
     Sel: ".Back" {
         Params: {
-            "Prjn.PrjnScale.Rel": "0.2",
+            "Path.PathScale.Rel": "0.2",
         ...
         }
     }
@@ -44,9 +44,9 @@ Each `params.Sheet` consists of a collection of params.Sel elements which finall
 
 * `#Name` = a specific named object.
 
-The order of application within a given Sheet is also critical -- typically put the most general Type params first, then `.Class`, then the most specific `#Name` cases, to achieve within a given Sheet the same logic of establishing Base params for all types and then more specific overrides for special cases (e.g., an overall learning rate that appplies across all projections, but maybe a faster or slower one for a .Class or specific #Name'd projection).
+The order of application within a given Sheet is also critical -- typically put the most general Type params first, then `.Class`, then the most specific `#Name` cases, to achieve within a given Sheet the same logic of establishing Base params for all types and then more specific overrides for special cases (e.g., an overall learning rate that applies across all pathways, but maybe a faster or slower one for a .Class or specific #Name'd pathway). -There is a params.Styler interface with methods that any Go type can implement to provide these different labels. The emer.Network, .Layer, and .Prjn interfaces each implement this interface. +There is a params.Styler interface with methods that any Go type can implement to provide these different labels. The emer.Network, .Layer, and .Path interfaces each implement this interface. Parameter values are stored as strings, which can represent any value. diff --git a/netparams/doc.go b/netparams/doc.go index 548cd868..56f2add8 100644 --- a/netparams/doc.go +++ b/netparams/doc.go @@ -49,11 +49,11 @@ The order of application within a given Sheet is also critical -- typically put the most general Type params first, then .Class, then the most specific #Name cases, to achieve within a given Sheet the same logic of establishing Base params for all types and then more specific overrides for special cases (e.g., an overall -learning rate that appplies across all projections, but maybe a faster or slower -one for a .Class or specific #Name'd projection). +learning rate that applies across all pathways, but maybe a faster or slower +one for a .Class or specific #Name'd pathway). There is a params.Styler interface with methods that any Go type can implement -to provide these different labels. The emer.Network, .Layer, and .Prjn interfaces +to provide these different labels. The emer.Network, .Layer, and .Path interfaces each implement this interface.
Otherwise, the Apply method will just directly apply params to a given struct diff --git a/netparams/netparams_test.go b/netparams/netparams_test.go index 8ece99df..60021273 100644 --- a/netparams/netparams_test.go +++ b/netparams/netparams_test.go @@ -15,11 +15,11 @@ import ( var paramSets = Sets{ "Base": { - {Sel: "Prjn", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", + {Sel: "Path", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", Params: params.Params{ - "Prjn.Learn.Norm.On": "true", - "Prjn.Learn.Momentum.On": "true", - "Prjn.Learn.WtBal.On": "false", + "Path.Learn.Norm.On": "true", + "Path.Learn.Momentum.On": "true", + "Path.Learn.WtBal.On": "false", }}, {Sel: "Layer", Desc: "using default 1.8 inhib for all of network -- can explore", Params: params.Params{ @@ -33,9 +33,9 @@ var paramSets = Sets{ Params: params.Params{ "Layer.Inhib.Layer.Gi": "1.4", }}, - {Sel: ".Back", Desc: "top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates", + {Sel: ".Back", Desc: "top-down back-pathways MUST have lower relative weight scale, otherwise network hallucinates", Params: params.Params{ - "Prjn.WtScale.Rel": "0.2", + "Path.WtScale.Rel": "0.2", }}, }, "DefaultInhib": { @@ -45,27 +45,27 @@ var paramSets = Sets{ }}, }, "NoMomentum": { - {Sel: "Prjn", Desc: "no norm or momentum", + {Sel: "Path", Desc: "no norm or momentum", Params: params.Params{ - "Prjn.Learn.Norm.On": "false", - "Prjn.Learn.Momentum.On": "false", + "Path.Learn.Norm.On": "false", + "Path.Learn.Momentum.On": "false", }}, }, "WtBalOn": { - {Sel: "Prjn", Desc: "weight bal on", + {Sel: "Path", Desc: "weight bal on", Params: params.Params{ - "Prjn.Learn.WtBal.On": "true", + "Path.Learn.WtBal.On": "true", }}, }, } var trgCode = `netparams.Sets{ "Base": { - {Sel: "Prjn", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", + {Sel: "Path", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", Params: params.Params{ - "Prjn.Learn.Norm.On": "true", - "Prjn.Learn.Momentum.On": "true", - "Prjn.Learn.WtBal.On": "false", + "Path.Learn.Norm.On": "true", + "Path.Learn.Momentum.On": "true", + "Path.Learn.WtBal.On": "false", }}, {Sel: "Layer", Desc: "using default 1.8 inhib for all of network -- can explore", Params: params.Params{ @@ -79,9 +79,9 @@ var trgCode = `netparams.Sets{ Params: params.Params{ "Layer.Inhib.Layer.Gi": "1.4", }}, - {Sel: ".Back", Desc: "top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates", + {Sel: ".Back", Desc: "top-down back-pathways MUST have lower relative weight scale, otherwise network hallucinates", Params: params.Params{ - "Prjn.WtScale.Rel": "0.2", + "Path.WtScale.Rel": "0.2", }}, }, "DefaultInhib": { @@ -91,16 +91,16 @@ var trgCode = `netparams.Sets{ }}, }, "NoMomentum": { - {Sel: "Prjn", Desc: "no norm or momentum", + {Sel: "Path", Desc: "no norm or momentum", Params: params.Params{ - "Prjn.Learn.Norm.On": "false", - "Prjn.Learn.Momentum.On": "false", + "Path.Learn.Norm.On": "false", + "Path.Learn.Momentum.On": "false", }}, }, "WtBalOn": { - {Sel: "Prjn", Desc: "weight bal on", + {Sel: "Path", Desc: "weight bal on", Params: params.Params{ - "Prjn.Learn.WtBal.On": "true", + "Path.Learn.WtBal.On": "true", }}, }, } @@ -120,7 +120,7 @@ func TestParamSetsWriteGo(t *testing.T) { } func TestParamSetsSet(t *testing.T) { - cval, err := paramSets.ParamValue("Base", "Prjn", "Prjn.Learn.WtBal.On") + cval, 
err := paramSets.ParamValue("Base", "Path", "Path.Learn.WtBal.On") if err != nil { t.Error(err) } @@ -128,30 +128,30 @@ func TestParamSetsSet(t *testing.T) { if cval != "false" { t.Errorf("value should have been false: %s\n", cval) } - err = paramSets.SetString("Base", "Prjn", "Prjn.Learn.WtBal.On", "true") + err = paramSets.SetString("Base", "Path", "Path.Learn.WtBal.On", "true") if err != nil { t.Error(err) } - cval, err = paramSets.ParamValue("Base", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Path", "Path.Learn.WtBal.On") // fmt.Printf("new value: %s\n", cval) if cval != "true" { t.Errorf("value should have been true: %s\n", cval) } - err = paramSets.SetFloat("Base", "Prjn", "Prjn.Learn.WtBal.On", 5.1) + err = paramSets.SetFloat("Base", "Path", "Path.Learn.WtBal.On", 5.1) if err != nil { t.Error(err) } - cval, err = paramSets.ParamValue("Base", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Path", "Path.Learn.WtBal.On") // fmt.Printf("new value: %s\n", cval) if cval != "5.1" { t.Errorf("value should have been 5.1: %s\n", cval) } - cval, err = paramSets.ParamValue("Basre", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Basre", "Path", "Path.Learn.WtBal.On") if err == nil { t.Errorf("Should have had an error") } // fmt.Printf("error: %s\n", err) - cval, err = paramSets.ParamValue("Base", "Prjns", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Paths", "Path.Learn.WtBal.On") if err == nil { t.Errorf("Should have had an error") } diff --git a/netview/data.go b/netview/data.go index 42dff3f2..9caf05bb 100644 --- a/netview/data.go +++ b/netview/data.go @@ -21,42 +21,42 @@ type LayData struct { // the full data, in that order Data []float32 - // receiving projection data -- shared with SendPrjns - RecvPrjns []*PrjnData + // receiving pathway data -- shared with SendPaths + RecvPaths []*PathData - // sending projection data -- shared with RecvPrjns - SendPrjns []*PrjnData + // sending pathway data -- shared with RecvPaths + SendPaths []*PathData } -// AllocSendPrjns allocates Sending projections for given layer. +// AllocSendPaths allocates Sending pathways for given layer. // does nothing if already allocated. 
-func (ld *LayData) AllocSendPrjns(ly emer.Layer) { - nsp := ly.NSendPrjns() - if len(ld.SendPrjns) == nsp { - for si := 0; si < ly.NSendPrjns(); si++ { - pj := ly.SendPrjn(si) - spd := ld.SendPrjns[si] - spd.Prjn = pj +func (ld *LayData) AllocSendPaths(ly emer.Layer) { + nsp := ly.NSendPaths() + if len(ld.SendPaths) == nsp { + for si := 0; si < ly.NSendPaths(); si++ { + pj := ly.SendPath(si) + spd := ld.SendPaths[si] + spd.Path = pj } return } - ld.SendPrjns = make([]*PrjnData, nsp) - for si := 0; si < ly.NSendPrjns(); si++ { - pj := ly.SendPrjn(si) - pd := &PrjnData{Send: pj.SendLay().Name(), Recv: pj.RecvLay().Name(), Prjn: pj} - ld.SendPrjns[si] = pd + ld.SendPaths = make([]*PathData, nsp) + for si := 0; si < ly.NSendPaths(); si++ { + pj := ly.SendPath(si) + pd := &PathData{Send: pj.SendLay().Name(), Recv: pj.RecvLay().Name(), Path: pj} + ld.SendPaths[si] = pd pd.Alloc() } } -// FreePrjns nils prjn data -- for NoSynDat -func (ld *LayData) FreePrjns() { - ld.RecvPrjns = nil - ld.SendPrjns = nil +// FreePaths nils path data -- for NoSynDat +func (ld *LayData) FreePaths() { + ld.RecvPaths = nil + ld.SendPaths = nil } -// PrjnData holds display state for a projection -type PrjnData struct { +// PathData holds display state for a pathway +type PathData struct { // name of sending layer Send string @@ -64,8 +64,8 @@ type PrjnData struct { // name of recv layer Recv string - // source projection - Prjn emer.Prjn + // source pathway + Path emer.Path // synaptic data, by variable in SynVars and number of data points SynData []float32 @@ -73,8 +73,8 @@ type PrjnData struct { // Alloc allocates SynData to hold number of variables * nsyn synapses. // If already has capacity, nothing happens. -func (pd *PrjnData) Alloc() { - pj := pd.Prjn +func (pd *PathData) Alloc() { + pj := pd.Path nvar := pj.SynVarNum() nsyn := pj.Syn1DNum() nt := nvar * nsyn @@ -85,10 +85,10 @@ func (pd *PrjnData) Alloc() { } } -// RecordData records synaptic data from given prjn. +// RecordData records synaptic data from given path. // must use sender or recv based depending on natural ordering.
-func (pd *PrjnData) RecordData(nd *NetData) { - pj := pd.Prjn +func (pd *PathData) RecordData(nd *NetData) { + pj := pd.Path vnms := pj.SynVarNames() nvar := pj.SynVarNum() nsyn := pj.Syn1DNum() diff --git a/netview/events.go b/netview/events.go index 940821e5..ae001fe1 100644 --- a/netview/events.go +++ b/netview/events.go @@ -72,8 +72,8 @@ func (sw *Scene) MouseDownEvent(e events.Event) { return } nv := sw.NetView - nv.Data.PrjnUnIndex = unIndex - nv.Data.PrjnLay = lay.Name() + nv.Data.PathUnIndex = unIndex + nv.Data.PathLay = lay.Name() nv.UpdateView() e.SetHandled() } diff --git a/netview/netdata.go b/netview/netdata.go index 840ab3a9..cf3aed3f 100644 --- a/netview/netdata.go +++ b/netview/netdata.go @@ -36,14 +36,14 @@ type NetData struct { //types:add // copied from Params -- do not record synapse level data -- turn this on for very large networks where recording the entire synaptic state would be prohibitive NoSynData bool - // name of the layer with unit for viewing projections (connection / synapse-level values) - PrjnLay string + // name of the layer with unit for viewing pathways (connection / synapse-level values) + PathLay string - // 1D index of unit within PrjnLay for for viewing projections - PrjnUnIndex int + // 1D index of unit within PathLay for viewing pathways + PathUnIndex int - // copied from NetView Params: if non-empty, this is the type projection to show when there are multiple projections from the same layer -- e.g., Inhib, Lateral, Forward, etc - PrjnType string `edit:"-"` + // copied from NetView Params: if non-empty, this is the type of pathway to show when there are multiple pathways from the same layer -- e.g., Inhib, Lateral, Forward, etc + PathType string `edit:"-"` // the list of unit variables saved UnVars []string @@ -149,23 +149,23 @@ makeData: ld := &LayData{LayName: nm, NUnits: lay.Shape().Len()} nd.LayData[nm] = ld if nd.NoSynData { - ld.FreePrjns() + ld.FreePaths() } else { - ld.AllocSendPrjns(lay) + ld.AllocSendPaths(lay) } } if !nd.NoSynData { for li := 0; li < nlay; li++ { rlay := nd.Net.Layer(li) rld := nd.LayData[rlay.Name()] - rld.RecvPrjns = make([]*PrjnData, rlay.NRecvPrjns()) - for ri := 0; ri < rlay.NRecvPrjns(); ri++ { - rpj := rlay.RecvPrjn(ri) + rld.RecvPaths = make([]*PathData, rlay.NRecvPaths()) + for ri := 0; ri < rlay.NRecvPaths(); ri++ { + rpj := rlay.RecvPath(ri) slay := rpj.SendLay() sld := nd.LayData[slay.Name()] - for _, spj := range sld.SendPrjns { - if spj.Prjn == rpj { - rld.RecvPrjns[ri] = spj // link + for _, spj := range sld.SendPaths { + if spj.Path == rpj { + rld.RecvPaths[ri] = spj // link } } } @@ -176,9 +176,9 @@ makeData: lay := nd.Net.Layer(li) ld := nd.LayData[lay.Name()] if nd.NoSynData { - ld.FreePrjns() + ld.FreePaths() } else { - ld.AllocSendPrjns(lay) + ld.AllocSendPaths(lay) } } } @@ -347,8 +347,8 @@ func (nd *NetData) RecordSyns() { lay := nd.Net.Layer(li) laynm := lay.Name() ld := nd.LayData[laynm] - for si := 0; si < lay.NSendPrjns(); si++ { - spd := ld.SendPrjns[si] + for si := 0; si < lay.NSendPaths(); si++ { + spd := ld.SendPaths[si] spd.RecordData(nd) } } @@ -436,21 +436,21 @@ func (nd *NetData) UnitValueIndex(laynm string, vnm string, uidx1d int, ridx int } // RecvUnitValue returns the value for given layer, variable name, unit index, -// for receiving projection variable, based on recorded synaptic projection data. +// for receiving pathway variable, based on recorded synaptic pathway data. // Returns false if value unavailable for any reason (including values recorded as NaN).
func (nd *NetData) RecvUnitValue(laynm string, vnm string, uidx1d int) (float32, bool) { ld, ok := nd.LayData[laynm] - if nd.NoSynData || !ok || nd.PrjnLay == "" { + if nd.NoSynData || !ok || nd.PathLay == "" { return 0, false } - recvLay := nd.Net.LayerByName(nd.PrjnLay) + recvLay := nd.Net.LayerByName(nd.PathLay) if recvLay == nil { return 0, false } - var pj emer.Prjn + var pj emer.Path var err error - if nd.PrjnType != "" { - pj, err = recvLay.SendNameTypeTry(laynm, nd.PrjnType) + if nd.PathType != "" { + pj, err = recvLay.SendNameTypeTry(laynm, nd.PathType) if pj == nil { pj, err = recvLay.SendNameTry(laynm) } @@ -460,9 +460,9 @@ func (nd *NetData) RecvUnitValue(laynm string, vnm string, uidx1d int) (float32, if pj == nil { return 0, false } - var spd *PrjnData - for _, pd := range ld.SendPrjns { - if pd.Prjn == pj { + var spd *PathData + for _, pd := range ld.SendPaths { + if pd.Path == pj { spd = pd break } @@ -474,7 +474,7 @@ func (nd *NetData) RecvUnitValue(laynm string, vnm string, uidx1d int) (float32, if err != nil { return 0, false } - synIndex := pj.SynIndex(uidx1d, nd.PrjnUnIndex) + synIndex := pj.SynIndex(uidx1d, nd.PathUnIndex) if synIndex < 0 { return 0, false } @@ -484,21 +484,21 @@ } // SendUnitValue returns the value for given layer, variable name, unit index, -// for sending projection variable, based on recorded synaptic projection data. +// for sending pathway variable, based on recorded synaptic pathway data. // Returns false if value unavailable for any reason (including values recorded as NaN). func (nd *NetData) SendUnitValue(laynm string, vnm string, uidx1d int) (float32, bool) { ld, ok := nd.LayData[laynm] - if nd.NoSynData || !ok || nd.PrjnLay == "" { + if nd.NoSynData || !ok || nd.PathLay == "" { return 0, false } - sendLay := nd.Net.LayerByName(nd.PrjnLay) + sendLay := nd.Net.LayerByName(nd.PathLay) if sendLay == nil { return 0, false } - var pj emer.Prjn + var pj emer.Path var err error - if nd.PrjnType != "" { - pj, err = sendLay.RecvNameTypeTry(laynm, nd.PrjnType) + if nd.PathType != "" { + pj, err = sendLay.RecvNameTypeTry(laynm, nd.PathType) if pj == nil { pj, err = sendLay.RecvNameTry(laynm) } @@ -508,9 +508,9 @@ func (nd *NetData) SendUnitValue(laynm string, vnm string, uidx1d int) (float32, if pj == nil { return 0, false } - var rpd *PrjnData - for _, pd := range ld.RecvPrjns { - if pd.Prjn == pj { + var rpd *PathData - for _, pd := range ld.RecvPaths { + if pd.Path == pj { rpd = pd break } @@ -522,7 +522,7 @@ func (nd *NetData) SendUnitValue(laynm string, vnm string, uidx1d int) (float32, if err != nil { return 0, false } - synIndex := pj.SynIndex(nd.PrjnUnIndex, uidx1d) + synIndex := pj.SynIndex(nd.PathUnIndex, uidx1d) if synIndex < 0 { return 0, false } @@ -626,12 +626,12 @@ func (nd *NetData) WriteJSON(w io.Writer) error { // Useful for replaying detailed trace for units of interest.
func (nv *NetView) PlotSelectedUnit() (*table.Table, *plotview.PlotView) { //types:add nd := &nv.Data - if nd.PrjnLay == "" || nd.PrjnUnIndex < 0 { + if nd.PathLay == "" || nd.PathUnIndex < 0 { fmt.Printf("NetView:PlotSelectedUnit -- no unit selected\n") return nil, nil } - selnm := nd.PrjnLay + fmt.Sprintf("[%d]", nd.PrjnUnIndex) + selnm := nd.PathLay + fmt.Sprintf("[%d]", nd.PathUnIndex) b := core.NewBody("netview-selectedunit").SetTitle("NetView SelectedUnit Plot: " + selnm) plt := plotview.NewPlotView(b) @@ -663,18 +663,18 @@ func (nv *NetView) PlotSelectedUnit() (*table.Table, *plotview.PlotView) { //typ // SelectedUnitTable returns a table with all of the data for the // currently-selected unit, and data parallel index. func (nd *NetData) SelectedUnitTable(di int) *table.Table { - if nd.PrjnLay == "" || nd.PrjnUnIndex < 0 { + if nd.PathLay == "" || nd.PathUnIndex < 0 { fmt.Printf("NetView:SelectedUnitTable -- no unit selected\n") return nil } - ld, ok := nd.LayData[nd.PrjnLay] + ld, ok := nd.LayData[nd.PathLay] if !ok { fmt.Printf("NetView:SelectedUnitTable -- layer name incorrect\n") return nil } - selnm := nd.PrjnLay + fmt.Sprintf("[%d]", nd.PrjnUnIndex) + selnm := nd.PathLay + fmt.Sprintf("[%d]", nd.PathUnIndex) dt := &table.Table{} dt.SetMetaData("name", "NetView: "+selnm) @@ -685,7 +685,7 @@ func (nd *NetData) SelectedUnitTable(di int) *table.Table { vlen := len(nd.UnVars) nu := ld.NUnits nvu := vlen * nd.MaxData * nu - uidx1d := nd.PrjnUnIndex + uidx1d := nd.PathUnIndex dt.AddIntColumn("Rec") for _, vnm := range nd.UnVars { diff --git a/netview/netview.go b/netview/netview.go index 5084840e..5a105cf6 100644 --- a/netview/netview.go +++ b/netview/netview.go @@ -151,7 +151,7 @@ func (nv *NetView) Record(counters string, rastCtr int) { if counters != "" { nv.LastCtrs = counters } - nv.Data.PrjnType = nv.Params.PrjnType + nv.Data.PathType = nv.Params.PathType nv.Data.Record(nv.LastCtrs, rastCtr, nv.Params.Raster.Max) nv.RecTrackLatest() // if we make a new record, then user expectation is to track latest.. } @@ -446,7 +446,7 @@ func (nv *NetView) RecTrackLatest() bool { return true } -// NetVarsList returns the list of layer and prjn variables for given network. +// NetVarsList returns the list of layer and path variables for given network. // layEven ensures that the number of layer variables is an even number if true // (used for display but not storage). 
func (nv *NetView) NetVarsList(net emer.Network, layEven bool) (nvars, synvars []string) { @@ -489,13 +489,13 @@ func (nv *NetView) VarsListUpdate() { } unprops := nv.Net.UnitVarProps() - prjnprops := nv.Net.SynVarProps() + pathprops := nv.Net.SynVarProps() for _, nm := range nv.Vars { vp := &VarParams{Var: nm} vp.Defaults() var vtag string if strings.HasPrefix(nm, "r.") || strings.HasPrefix(nm, "s.") { - vtag = prjnprops[nm[2:]] + vtag = pathprops[nm[2:]] } else { vtag = unprops[nm] } @@ -584,13 +584,13 @@ func (nv *NetView) VarsConfig() { return } unprops := nv.Net.UnitVarProps() - prjnprops := nv.Net.SynVarProps() + pathprops := nv.Net.SynVarProps() for _, vn := range nv.Vars { vn := vn vb := core.NewButton(vl).SetText(vn).SetType(core.ButtonAction) pstr := "" if strings.HasPrefix(vn, "r.") || strings.HasPrefix(vn, "s.") { - pstr = prjnprops[vn[2:]] + pstr = pathprops[vn[2:]] } else { pstr = unprops[vn] } @@ -771,7 +771,7 @@ func (nv *NetView) UnitValColor(lay emer.Layer, idx1d int, raw float32, hasval b } if !hasval { scaled = 0 - if lay.Name() == nv.Data.PrjnLay && idx1d == nv.Data.PrjnUnIndex { + if lay.Name() == nv.Data.PathLay && idx1d == nv.Data.PathUnIndex { clr = color.RGBA{0x20, 0x80, 0x20, 0x80} } else { clr = NilColor @@ -872,7 +872,7 @@ func (nv *NetView) ConfigToolbar(tb *core.Toolbar) { views.NewFuncButton(m, nv.ShowNonDefaultParams).SetIcon(icons.Info) views.NewFuncButton(m, nv.ShowAllParams).SetIcon(icons.Info) views.NewFuncButton(m, nv.ShowKeyLayerParams).SetIcon(icons.Info) - views.NewFuncButton(m, nv.ShowKeyPrjnParams).SetIcon(icons.Info) + views.NewFuncButton(m, nv.ShowKeyPathParams).SetIcon(icons.Info) }) core.NewButton(tb).SetText("Net Data").SetIcon(icons.Save).SetMenu(func(m *core.Scene) { views.NewFuncButton(m, nv.Data.SaveJSON).SetText("Save Net Data").SetIcon(icons.Save) @@ -1195,10 +1195,10 @@ func (nv *NetView) ShowKeyLayerParams() string { //types:add return nds } -// ShowKeyPrjnParams shows a dialog with a listing for all Recv projections in the network, -// of the most important projection-level params (specific to each algorithm) -func (nv *NetView) ShowKeyPrjnParams() string { //types:add - nds := nv.Net.KeyPrjnParams() - texteditor.TextDialog(nv, "Key Prjn Params: "+nv.Nm, nds) +// ShowKeyPathParams shows a dialog with a listing for all Recv pathways in the network, +// of the most important pathway-level params (specific to each algorithm) +func (nv *NetView) ShowKeyPathParams() string { //types:add + nds := nv.Net.KeyPathParams() + texteditor.TextDialog(nv, "Key Path Params: "+nv.Nm, nds) return nds } diff --git a/netview/params.go b/netview/params.go index f0c5f87f..f8bb708e 100644 --- a/netview/params.go +++ b/netview/params.go @@ -56,8 +56,8 @@ type Params struct { //types:add // do not record synapse level data -- turn this on for very large networks where recording the entire synaptic state would be prohibitive NoSynData bool - // if non-empty, this is the type projection to show when there are multiple projections from the same layer -- e.g., Inhib, Lateral, Forward, etc - PrjnType string + // if non-empty, this is the type of pathway to show when there are multiple pathways from the same layer -- e.g., Inhib, Lateral, Forward, etc + PathType string // maximum number of records to store to enable rewinding through prior states MaxRecs int `min:"1"` diff --git a/netview/typegen.go b/netview/typegen.go index 7ea495fc..7a473be0 100644 --- a/netview/typegen.go +++ b/netview/typegen.go @@ -13,9 +13,9 @@ import ( "cogentcore.org/core/xyz/xyzview" ) -var
_ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.LayData", IDName: "lay-data", Doc: "LayData maintains a record of all the data for a given layer", Fields: []types.Field{{Name: "LayName", Doc: "the layer name"}, {Name: "NUnits", Doc: "cached number of units"}, {Name: "Data", Doc: "the full data, in that order"}, {Name: "RecvPrjns", Doc: "receiving projection data -- shared with SendPrjns"}, {Name: "SendPrjns", Doc: "sending projection data -- shared with RecvPrjns"}}}) +var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.LayData", IDName: "lay-data", Doc: "LayData maintains a record of all the data for a given layer", Fields: []types.Field{{Name: "LayName", Doc: "the layer name"}, {Name: "NUnits", Doc: "cached number of units"}, {Name: "Data", Doc: "the full data, in that order"}, {Name: "RecvPaths", Doc: "receiving pathway data -- shared with SendPaths"}, {Name: "SendPaths", Doc: "sending pathway data -- shared with RecvPaths"}}}) -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.PrjnData", IDName: "prjn-data", Doc: "PrjnData holds display state for a projection", Fields: []types.Field{{Name: "Send", Doc: "name of sending layer"}, {Name: "Recv", Doc: "name of recv layer"}, {Name: "Prjn", Doc: "source projection"}, {Name: "SynData", Doc: "synaptic data, by variable in SynVars and number of data points"}}}) +var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.PathData", IDName: "path-data", Doc: "PathData holds display state for a pathway", Fields: []types.Field{{Name: "Send", Doc: "name of sending layer"}, {Name: "Recv", Doc: "name of recv layer"}, {Name: "Path", Doc: "source pathway"}, {Name: "SynData", Doc: "synaptic data, by variable in SynVars and number of data points"}}}) // SceneType is the [types.Type] for [Scene] var SceneType = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.Scene", IDName: "scene", Doc: "Scene is a Widget for managing the 3D Scene of the NetView", Embeds: []types.Field{{Name: "Scene"}}, Fields: []types.Field{{Name: "NetView"}}, Instance: &Scene{}}) @@ -103,10 +103,10 @@ func (t *LayName) SetMat(v xyz.Material) *LayName { t.Mat = v; return t } // SetText sets the [LayName.Text] func (t *LayName) SetText(v string) *LayName { t.Text = v; return t } -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.NetData", IDName: "net-data", Doc: "NetData maintains a record of all the network data that has been displayed\nup to a given maximum number of records (updates), using efficient ring index logic\nwith no copying to store in fixed-sized buffers.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Methods: []types.Method{{Name: "OpenJSON", Doc: "OpenJSON opens colors from a JSON-formatted file.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}, Returns: []string{"error"}}, {Name: "SaveJSON", Doc: "SaveJSON saves colors to a JSON-formatted file.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}, Returns: []string{"error"}}}, Fields: []types.Field{{Name: "Net", Doc: "the network that we're viewing"}, {Name: "NoSynData", Doc: "copied from Params -- do not record synapse level data -- turn this on for very large networks where recording the entire synaptic state would be prohibitive"}, {Name: "PrjnLay", Doc: "name of the layer with unit for viewing projections (connection / synapse-level values)"}, {Name: 
"PrjnUnIndex", Doc: "1D index of unit within PrjnLay for for viewing projections"}, {Name: "PrjnType", Doc: "copied from NetView Params: if non-empty, this is the type projection to show when there are multiple projections from the same layer -- e.g., Inhib, Lateral, Forward, etc"}, {Name: "UnVars", Doc: "the list of unit variables saved"}, {Name: "UnVarIndexes", Doc: "index of each variable in the Vars slice"}, {Name: "SynVars", Doc: "the list of synaptic variables saved"}, {Name: "SynVarIndexes", Doc: "index of synaptic variable in the SynVars slice"}, {Name: "Ring", Doc: "the circular ring index -- Max here is max number of values to store, Len is number stored, and Index(Len-1) is the most recent one, etc"}, {Name: "MaxData", Doc: "max data parallel data per unit"}, {Name: "LayData", Doc: "the layer data -- map keyed by layer name"}, {Name: "UnMinPer", Doc: "unit var min values for each Ring.Max * variable"}, {Name: "UnMaxPer", Doc: "unit var max values for each Ring.Max * variable"}, {Name: "UnMinVar", Doc: "min values for unit variables"}, {Name: "UnMaxVar", Doc: "max values for unit variables"}, {Name: "SynMinVar", Doc: "min values for syn variables"}, {Name: "SynMaxVar", Doc: "max values for syn variables"}, {Name: "Counters", Doc: "counter strings"}, {Name: "RasterCtrs", Doc: "raster counter values"}, {Name: "RasterMap", Doc: "map of raster counter values to record numbers"}, {Name: "RastCtr", Doc: "dummy raster counter when passed a -1 -- increments and wraps around"}}}) +var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.NetData", IDName: "net-data", Doc: "NetData maintains a record of all the network data that has been displayed\nup to a given maximum number of records (updates), using efficient ring index logic\nwith no copying to store in fixed-sized buffers.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Methods: []types.Method{{Name: "OpenJSON", Doc: "OpenJSON opens colors from a JSON-formatted file.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}, Returns: []string{"error"}}, {Name: "SaveJSON", Doc: "SaveJSON saves colors to a JSON-formatted file.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}, Returns: []string{"error"}}}, Fields: []types.Field{{Name: "Net", Doc: "the network that we're viewing"}, {Name: "NoSynData", Doc: "copied from Params -- do not record synapse level data -- turn this on for very large networks where recording the entire synaptic state would be prohibitive"}, {Name: "PathLay", Doc: "name of the layer with unit for viewing pathways (connection / synapse-level values)"}, {Name: "PathUnIndex", Doc: "1D index of unit within PathLay for for viewing pathways"}, {Name: "PathType", Doc: "copied from NetView Params: if non-empty, this is the type pathway to show when there are multiple pathways from the same layer -- e.g., Inhib, Lateral, Forward, etc"}, {Name: "UnVars", Doc: "the list of unit variables saved"}, {Name: "UnVarIndexes", Doc: "index of each variable in the Vars slice"}, {Name: "SynVars", Doc: "the list of synaptic variables saved"}, {Name: "SynVarIndexes", Doc: "index of synaptic variable in the SynVars slice"}, {Name: "Ring", Doc: "the circular ring index -- Max here is max number of values to store, Len is number stored, and Index(Len-1) is the most recent one, etc"}, {Name: "MaxData", Doc: "max data parallel data per unit"}, {Name: "LayData", Doc: "the layer data -- map keyed by layer name"}, {Name: 
"UnMinPer", Doc: "unit var min values for each Ring.Max * variable"}, {Name: "UnMaxPer", Doc: "unit var max values for each Ring.Max * variable"}, {Name: "UnMinVar", Doc: "min values for unit variables"}, {Name: "UnMaxVar", Doc: "max values for unit variables"}, {Name: "SynMinVar", Doc: "min values for syn variables"}, {Name: "SynMaxVar", Doc: "max values for syn variables"}, {Name: "Counters", Doc: "counter strings"}, {Name: "RasterCtrs", Doc: "raster counter values"}, {Name: "RasterMap", Doc: "map of raster counter values to record numbers"}, {Name: "RastCtr", Doc: "dummy raster counter when passed a -1 -- increments and wraps around"}}}) // NetViewType is the [types.Type] for [NetView] -var NetViewType = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.NetView", IDName: "net-view", Doc: "NetView is a Cogent Core Widget that provides a 3D network view using the Cogent Core gi3d\n3D framework.", Methods: []types.Method{{Name: "PlotSelectedUnit", Doc: "PlotSelectedUnit opens a window with a plot of all the data for the\ncurrently selected unit.\nUseful for replaying detailed trace for units of interest.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"Table", "PlotView"}}, {Name: "Current", Doc: "Current records the current state of the network, including synaptic values,\nand updates the display. Use this when switching to NetView tab after network\nhas been running while viewing another tab, because the network state\nis typically not recored then.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}}, {Name: "SaveWeights", Doc: "SaveWeights saves the network weights -- when called with views.CallMethod\nit will auto-prompt for filename", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}}, {Name: "OpenWeights", Doc: "OpenWeights opens the network weights -- when called with views.CallMethod\nit will auto-prompt for filename", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}}, {Name: "ShowNonDefaultParams", Doc: "ShowNonDefaultParams shows a dialog of all the parameters that\nare not at their default values in the network. 
Useful for setting params.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}, {Name: "ShowAllParams", Doc: "ShowAllParams shows a dialog of all the parameters in the network.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}, {Name: "ShowKeyLayerParams", Doc: "ShowKeyLayerParams shows a dialog with a listing for all layers in the network,\nof the most important layer-level params (specific to each algorithm)", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}, {Name: "ShowKeyPrjnParams", Doc: "ShowKeyPrjnParams shows a dialog with a listing for all Recv projections in the network,\nof the most important projection-level params (specific to each algorithm)", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}}, Embeds: []types.Field{{Name: "Layout"}}, Fields: []types.Field{{Name: "Net", Doc: "the network that we're viewing"}, {Name: "Var", Doc: "current variable that we're viewing"}, {Name: "Di", Doc: "current data parallel index di, for networks capable of processing input patterns in parallel."}, {Name: "Vars", Doc: "the list of variables to view"}, {Name: "SynVars", Doc: "list of synaptic variables"}, {Name: "SynVarsMap", Doc: "map of synaptic variable names to index"}, {Name: "VarParams", Doc: "parameters for the list of variables to view"}, {Name: "CurVarParams", Doc: "current var params -- only valid during Update of display"}, {Name: "Params", Doc: "parameters controlling how the view is rendered"}, {Name: "ColorMap", Doc: "color map for mapping values to colors -- set by name in Params"}, {Name: "ColorMapVal", Doc: "color map value representing ColorMap"}, {Name: "RecNo", Doc: "record number to display -- use -1 to always track latest, otherwise in range"}, {Name: "LastCtrs", Doc: "last non-empty counters string provided -- re-used if no new one"}, {Name: "Data", Doc: "contains all the network data with history"}, {Name: "DataMu", Doc: "mutex on data access"}}, Instance: &NetView{}}) +var NetViewType = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.NetView", IDName: "net-view", Doc: "NetView is a Cogent Core Widget that provides a 3D network view using the Cogent Core gi3d\n3D framework.", Methods: []types.Method{{Name: "PlotSelectedUnit", Doc: "PlotSelectedUnit opens a window with a plot of all the data for the\ncurrently selected unit.\nUseful for replaying detailed trace for units of interest.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"Table", "PlotView"}}, {Name: "Current", Doc: "Current records the current state of the network, including synaptic values,\nand updates the display. 
Use this when switching to NetView tab after network\nhas been running while viewing another tab, because the network state\nis typically not recorded then.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}}, {Name: "SaveWeights", Doc: "SaveWeights saves the network weights -- when called with views.CallMethod\nit will auto-prompt for filename", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}}, {Name: "OpenWeights", Doc: "OpenWeights opens the network weights -- when called with views.CallMethod\nit will auto-prompt for filename", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Args: []string{"filename"}}, {Name: "ShowNonDefaultParams", Doc: "ShowNonDefaultParams shows a dialog of all the parameters that\nare not at their default values in the network. Useful for setting params.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}, {Name: "ShowAllParams", Doc: "ShowAllParams shows a dialog of all the parameters in the network.", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}, {Name: "ShowKeyLayerParams", Doc: "ShowKeyLayerParams shows a dialog with a listing for all layers in the network,\nof the most important layer-level params (specific to each algorithm)", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}, {Name: "ShowKeyPathParams", Doc: "ShowKeyPathParams shows a dialog with a listing for all Recv pathways in the network,\nof the most important pathway-level params (specific to each algorithm)", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Returns: []string{"string"}}}, Embeds: []types.Field{{Name: "Layout"}}, Fields: []types.Field{{Name: "Net", Doc: "the network that we're viewing"}, {Name: "Var", Doc: "current variable that we're viewing"}, {Name: "Di", Doc: "current data parallel index di, for networks capable of processing input patterns in parallel."}, {Name: "Vars", Doc: "the list of variables to view"}, {Name: "SynVars", Doc: "list of synaptic variables"}, {Name: "SynVarsMap", Doc: "map of synaptic variable names to index"}, {Name: "VarParams", Doc: "parameters for the list of variables to view"}, {Name: "CurVarParams", Doc: "current var params -- only valid during Update of display"}, {Name: "Params", Doc: "parameters controlling how the view is rendered"}, {Name: "ColorMap", Doc: "color map for mapping values to colors -- set by name in Params"}, {Name: "ColorMapVal", Doc: "color map value representing ColorMap"}, {Name: "RecNo", Doc: "record number to display -- use -1 to always track latest, otherwise in range"}, {Name: "LastCtrs", Doc: "last non-empty counters string provided -- re-used if no new one"}, {Name: "Data", Doc: "contains all the network data with history"}, {Name: "DataMu", Doc: "mutex on data access"}}, Instance: &NetView{}}) // NewNetView adds a new [NetView] with the given name to the given parent: // NetView is a Cogent Core Widget that provides a 3D network view using the Cogent Core gi3d // 3D framework. @@ -178,7 +178,7 @@ func (t *NetView) SetTooltip(v string) *NetView { t.Tooltip = v; return t } var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.RasterParams", IDName: "raster-params", Doc: "RasterParams holds parameters controlling the raster plot view", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Fields: []types.Field{{Name: "On", Doc: "if true, show a raster plot over time, otherwise
units"}, {Name: "XAxis", Doc: "if true, the raster counter (time) is plotted across the X axis -- otherwise the Z depth axis"}, {Name: "Max", Doc: "maximum count for the counter defining the raster plot"}, {Name: "UnitSize", Doc: "size of a single unit, where 1 = full width and no space.. 1 default"}, {Name: "UnitHeight", Doc: "height multiplier for units, where 1 = full height.. 0.2 default"}}}) -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.Params", IDName: "params", Doc: "Params holds parameters controlling how the view is rendered", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Fields: []types.Field{{Name: "Raster", Doc: "raster plot parameters"}, {Name: "NoSynData", Doc: "do not record synapse level data -- turn this on for very large networks where recording the entire synaptic state would be prohibitive"}, {Name: "PrjnType", Doc: "if non-empty, this is the type projection to show when there are multiple projections from the same layer -- e.g., Inhib, Lateral, Forward, etc"}, {Name: "MaxRecs", Doc: "maximum number of records to store to enable rewinding through prior states"}, {Name: "NVarCols", Doc: "number of variable columns"}, {Name: "UnitSize", Doc: "size of a single unit, where 1 = full width and no space.. .9 default"}, {Name: "LayNmSize", Doc: "size of the layer name labels -- entire network view is unit sized"}, {Name: "ColorMap", Doc: "name of color map to use"}, {Name: "ZeroAlpha", Doc: "opacity (0-1) of zero values -- greater magnitude values become increasingly opaque on either side of this minimum"}, {Name: "NetView", Doc: "our netview, for update method"}, {Name: "NFastSteps", Doc: "the number of records to jump for fast forward/backward"}}}) +var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.Params", IDName: "params", Doc: "Params holds parameters controlling how the view is rendered", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Fields: []types.Field{{Name: "Raster", Doc: "raster plot parameters"}, {Name: "NoSynData", Doc: "do not record synapse level data -- turn this on for very large networks where recording the entire synaptic state would be prohibitive"}, {Name: "PathType", Doc: "if non-empty, this is the type pathway to show when there are multiple pathways from the same layer -- e.g., Inhib, Lateral, Forward, etc"}, {Name: "MaxRecs", Doc: "maximum number of records to store to enable rewinding through prior states"}, {Name: "NVarCols", Doc: "number of variable columns"}, {Name: "UnitSize", Doc: "size of a single unit, where 1 = full width and no space.. 
.9 default"}, {Name: "LayNmSize", Doc: "size of the layer name labels -- entire network view is unit sized"}, {Name: "ColorMap", Doc: "name of color map to use"}, {Name: "ZeroAlpha", Doc: "opacity (0-1) of zero values -- greater magnitude values become increasingly opaque on either side of this minimum"}, {Name: "NetView", Doc: "our netview, for update method"}, {Name: "NFastSteps", Doc: "the number of records to jump for fast forward/backward"}}}) var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/netview.VarParams", IDName: "var-params", Doc: "VarParams holds parameters for display of each variable", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Fields: []types.Field{{Name: "Var", Doc: "name of the variable"}, {Name: "ZeroCtr", Doc: "keep Min - Max centered around 0, and use negative heights for units -- else use full min-max range for height (no negative heights)"}, {Name: "Range", Doc: "range to display"}, {Name: "MinMax", Doc: "if not using fixed range, this is the actual range of data"}}}) diff --git a/params/README.md b/params/README.md index 90008f7f..7ac4e21b 100644 --- a/params/README.md +++ b/params/README.md @@ -21,7 +21,7 @@ Sets { }, Sel: ".Back" { Params: { - "Prjn.PrjnScale.Rel": "0.2", + "Path.PathScale.Rel": "0.2", ... } } @@ -56,9 +56,9 @@ Each `params.Sheet` consists of a collection of params.Sel elements which actual * `#Name` = a specific named object. -The order of application within a given Sheet is also critical -- typically put the most general Type params first, then .Class, then the most specific #Name cases, to achieve within a given Sheet the same logic of establishing Base params for all types and then more specific overrides for special cases (e.g., an overall learning rate that appplies across all projections, but maybe a faster or slower one for a .Class or specific #Name'd projection). +The order of application within a given Sheet is also critical -- typically put the most general Type params first, then .Class, then the most specific #Name cases, to achieve within a given Sheet the same logic of establishing Base params for all types and then more specific overrides for special cases (e.g., an overall learning rate that appplies across all pathways, but maybe a faster or slower one for a .Class or specific #Name'd pathway). -There is a params.Styler interface with methods that any Go type can implement to provide these different labels. The emer.Network, .Layer, and .Prjn interfaces each implement this interface. +There is a params.Styler interface with methods that any Go type can implement to provide these different labels. The emer.Network, .Layer, and .Path interfaces each implement this interface. Otherwise, the Apply method will just directly apply params to a given struct type if it does not implement the Styler interface. diff --git a/params/apply.go b/params/apply.go index a610663a..fbf2b17f 100644 --- a/params/apply.go +++ b/params/apply.go @@ -248,7 +248,7 @@ func (ps *Sheet) Apply(obj any, setMsg bool) (bool, error) { // SelMatchReset resets the Sel.NMatch counter used to find cases where no Sel // matched any target objects. Call at start of application process, which // may be at an outer-loop of Apply calls (e.g., for a Network, Apply is called -// for each Layer and Prjn), so this must be called separately. +// for each Layer and Path), so this must be called separately. // See SelNoMatchWarn for warning call at end. 
func (ps *Sheet) SelMatchReset(setName string) { for _, sl := range *ps { diff --git a/params/doc.go b/params/doc.go index a849529a..b3548cc6 100644 --- a/params/doc.go +++ b/params/doc.go @@ -49,11 +49,11 @@ The order of application within a given Sheet is also critical -- typically put the most general Type params first, then .Class, then the most specific #Name cases, to achieve within a given Sheet the same logic of establishing Base params for all types and then more specific overrides for special cases (e.g., an overall -learning rate that appplies across all projections, but maybe a faster or slower -one for a .Class or specific #Name'd projection). +learning rate that applies across all pathways, but maybe a faster or slower +one for a .Class or specific #Name'd pathway). There is a params.Styler interface with methods that any Go type can implement -to provide these different labels. The emer.Network, .Layer, and .Prjn interfaces +to provide these different labels. The emer.Network, .Layer, and .Path interfaces each implement this interface. Otherwise, the Apply method will just directly apply params to a given struct diff --git a/params/params.go b/params/params.go index 58e96147..962ec85d 100644 --- a/params/params.go +++ b/params/params.go @@ -13,8 +13,8 @@ import ( // Params is a name-value map for parameter values that can be applied // to any numeric type in any object. -// The name must be a dot-separated path to a specific parameter, e.g., Prjn.Learn.Lrate -// The first part of the path is the overall target object type, e.g., "Prjn" or "Layer", +// The name must be a dot-separated path to a specific parameter, e.g., Path.Learn.Lrate +// The first part of the path is the overall target object type, e.g., "Path" or "Layer", // which is used for determining if the parameter applies to a given object type.
// // All of the params in one map must apply to the same target type because diff --git a/params/params_test.go b/params/params_test.go index ce07bc8f..f48c34d4 100644 --- a/params/params_test.go +++ b/params/params_test.go @@ -14,11 +14,11 @@ import ( var paramSets = Sets{ "Base": {Desc: "these are the best params", Sheets: Sheets{ "Network": &Sheet{ - {Sel: "Prjn", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", + {Sel: "Path", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", Params: Params{ - "Prjn.Learn.Norm.On": "true", - "Prjn.Learn.Momentum.On": "true", - "Prjn.Learn.WtBal.On": "false", + "Path.Learn.Norm.On": "true", + "Path.Learn.Momentum.On": "true", + "Path.Learn.WtBal.On": "false", }}, {Sel: "Layer", Desc: "using default 1.8 inhib for all of network -- can explore", Params: Params{ @@ -32,9 +32,9 @@ var paramSets = Sets{ Params: Params{ "Layer.Inhib.Layer.Gi": "1.4", }}, - {Sel: ".Back", Desc: "top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates", + {Sel: ".Back", Desc: "top-down back-pathways MUST have lower relative weight scale, otherwise network hallucinates", Params: Params{ - "Prjn.WtScale.Rel": "0.2", + "Path.WtScale.Rel": "0.2", }}, }, "Sim": &Sheet{ // sim params apply to sim object @@ -62,18 +62,18 @@ var paramSets = Sets{ }}, "NoMomentum": {Desc: "no momentum or normalization", Sheets: Sheets{ "Network": &Sheet{ - {Sel: "Prjn", Desc: "no norm or momentum", + {Sel: "Path", Desc: "no norm or momentum", Params: Params{ - "Prjn.Learn.Norm.On": "false", - "Prjn.Learn.Momentum.On": "false", + "Path.Learn.Norm.On": "false", + "Path.Learn.Momentum.On": "false", }}, }, }}, "WtBalOn": {Desc: "try with weight bal on", Sheets: Sheets{ "Network": &Sheet{ - {Sel: "Prjn", Desc: "weight bal on", + {Sel: "Path", Desc: "weight bal on", Params: Params{ - "Prjn.Learn.WtBal.On": "true", + "Path.Learn.WtBal.On": "true", }}, }, }}, @@ -82,11 +82,11 @@ var trgCode = `params.Sets{ {Desc: "these are the best params", Sheets: params.Sheets{ "Network": &params.Sheet{ - {Sel: "Prjn", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", + {Sel: "Path", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", Params: params.Params{ - "Prjn.Learn.Momentum.On": "true", - "Prjn.Learn.Norm.On": "true", - "Prjn.Learn.WtBal.On": "false", + "Path.Learn.Momentum.On": "true", + "Path.Learn.Norm.On": "true", + "Path.Learn.WtBal.On": "false", }}, {Sel: "Layer", Desc: "using default 1.8 inhib for all of network -- can explore", Params: params.Params{ @@ -98,9 +98,9 @@ var trgCode = `params.Sets{ Params: params.Params{ "Layer.Inhib.Layer.Gi": "1.4", }}, - {Sel: ".Back", Desc: "top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates", + {Sel: ".Back", Desc: "top-down back-pathways MUST have lower relative weight scale, otherwise network hallucinates", Params: params.Params{ - "Prjn.WtScale.Rel": "0.2", + "Path.WtScale.Rel": "0.2", }}, }, "Sim": &params.Sheet{ @@ -128,18 +128,18 @@ var trgCode = `params.Sets{ }}, {Desc: "no momentum or normalization", Sheets: params.Sheets{ "Network": &params.Sheet{ - {Sel: "Prjn", Desc: "no norm or momentum", + {Sel: "Path", Desc: "no norm or momentum", Params: params.Params{ - "Prjn.Learn.Momentum.On": "false", - "Prjn.Learn.Norm.On": "false", + "Path.Learn.Momentum.On": "false", + "Path.Learn.Norm.On": "false", }}, }, }}, {Desc: "try with weight
bal on", Sheets: params.Sheets{ "Network": ¶ms.Sheet{ - {Sel: "Prjn", Desc: "weight bal on", + {Sel: "Path", Desc: "weight bal on", Params: params.Params{ - "Prjn.Learn.WtBal.On": "true", + "Path.Learn.WtBal.On": "true", }}, }, }}, @@ -160,7 +160,7 @@ func TestParamSetsWriteGo(t *testing.T) { } func TestParamSetsSet(t *testing.T) { - cval, err := paramSets.ParamValue("Base", "Network", "Prjn", "Prjn.Learn.WtBal.On") + cval, err := paramSets.ParamValue("Base", "Network", "Path", "Path.Learn.WtBal.On") if err != nil { t.Error(err) } @@ -168,34 +168,34 @@ func TestParamSetsSet(t *testing.T) { if cval != "false" { t.Errorf("value should have been false: %s\n", cval) } - err = paramSets.SetString("Base", "Network", "Prjn", "Prjn.Learn.WtBal.On", "true") + err = paramSets.SetString("Base", "Network", "Path", "Path.Learn.WtBal.On", "true") if err != nil { t.Error(err) } - cval, err = paramSets.ParamValue("Base", "Network", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Network", "Path", "Path.Learn.WtBal.On") // fmt.Printf("new value: %s\n", cval) if cval != "true" { t.Errorf("value should have been true: %s\n", cval) } - err = paramSets.SetFloat("Base", "Network", "Prjn", "Prjn.Learn.WtBal.On", 5.1) + err = paramSets.SetFloat("Base", "Network", "Path", "Path.Learn.WtBal.On", 5.1) if err != nil { t.Error(err) } - cval, err = paramSets.ParamValue("Base", "Network", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Network", "Path", "Path.Learn.WtBal.On") // fmt.Printf("new value: %s\n", cval) if cval != "5.1" { t.Errorf("value should have been 5.1: %s\n", cval) } - cval, err = paramSets.ParamValue("Basre", "Network2", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Basre", "Network2", "Path", "Path.Learn.WtBal.On") if err == nil { t.Errorf("Should have had an error") } // fmt.Printf("error: %s\n", err) - cval, err = paramSets.ParamValue("Base", "Network2", "Prjn", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Network2", "Path", "Path.Learn.WtBal.On") if err == nil { t.Errorf("Should have had an error") } - cval, err = paramSets.ParamValue("Base", "Network", "Prjns", "Prjn.Learn.WtBal.On") + cval, err = paramSets.ParamValue("Base", "Network", "Paths", "Path.Learn.WtBal.On") if err == nil { t.Errorf("Should have had an error") } diff --git a/params/search.go b/params/search.go index 25e8694d..b97fd4d6 100644 --- a/params/search.go +++ b/params/search.go @@ -10,7 +10,7 @@ type SearchValues struct { // name of object with the parameter Name string - // type of object with the parameter. This is a Base type name (e.g., Layer, Prjn), + // type of object with the parameter. This is a Base type name (e.g., Layer, Path), // that is at the start of the path in Network params. Type string diff --git a/params/styler.go b/params/styler.go index 4d14e648..adf2091b 100644 --- a/params/styler.go +++ b/params/styler.go @@ -15,7 +15,7 @@ type Styler interface { // TypeName returns the name of this type. CSS Sel selector with no prefix // operates on type name. This type is used *in addition* to the actual // Go type name of the object, and is a kind of type-category (e.g., Layer - // or Prjn in emergent network objects) + // or Path in emergent network objects) TypeName() string // Class returns the space-separated list of class selectors (tags). 
diff --git a/params/tweak_test.go b/params/tweak_test.go index d6d4837e..075796b4 100644 --- a/params/tweak_test.go +++ b/params/tweak_test.go @@ -14,14 +14,14 @@ import ( var tweakSets = Sets{ "Base": {Desc: "these are the best params", Sheets: Sheets{ "Network": &Sheet{ - {Sel: "Prjn", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", + {Sel: "Path", Desc: "norm and momentum on works better, but wt bal is not better for smaller nets", Params: Params{ - "Prjn.Learn.LRate": "0.02", - "Prjn.Learn.Momentum": "0.9", + "Path.Learn.LRate": "0.02", + "Path.Learn.Momentum": "0.9", }, Hypers: Hypers{ - "Prjn.Learn.LRate": {"Tweak": "log"}, - "Prjn.Learn.Momentum": {"Tweak": "incr"}, + "Path.Learn.LRate": {"Tweak": "log"}, + "Path.Learn.Momentum": {"Tweak": "incr"}, }}, {Sel: "Layer", Desc: "using default 1.8 inhib for all of network -- can explore", Params: Params{ @@ -37,12 +37,12 @@ var tweakSets = Sets{ Hypers: Hypers{ "Layer.Inhib.Layer.Gi": {"Tweak": "incr"}, }}, - {Sel: ".Back", Desc: "top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates", + {Sel: ".Back", Desc: "top-down back-pathways MUST have lower relative weight scale, otherwise network hallucinates", Params: Params{ - "Prjn.WtScale.Rel": "0.2", + "Path.WtScale.Rel": "0.2", }, Hypers: Hypers{ - "Prjn.WtScale.Rel": {"Tweak": "log"}, + "Path.WtScale.Rel": {"Tweak": "log"}, }}, }, }}, @@ -73,7 +73,7 @@ func TestTweak(t *testing.T) { } } -var trgSearch = `[{"Param":"Layer.Inhib.Layer.Gi","Sel":{"Sel":"#Hidden","Desc":"output definitely needs lower inhib -- true for smaller layers in general","Params":{"Layer.Inhib.Layer.Gi":"1.4"},"Hypers":{"Layer.Inhib.Layer.Gi":{"Tweak":"incr"}}},"Search":[{"Name":"Hidden","Type":"Layer","Path":"Layer.Inhib.Layer.Gi","Start":1.4,"Values":[1.3,1.5]}]},{"Param":"Prjn.WtScale.Rel","Sel":{"Sel":".Back","Desc":"top-down back-projections MUST have lower relative weight scale, otherwise network hallucinates","Params":{"Prjn.WtScale.Rel":"0.2"},"Hypers":{"Prjn.WtScale.Rel":{"Tweak":"log"}}},"Search":[{"Name":"HiddenToInput","Type":"Prjn","Path":"Prjn.WtScale.Rel","Start":0.2,"Values":[0.1,0.5]}]},{"Param":"Layer.Inhib.Layer.Gi","Sel":{"Sel":"Layer","Desc":"using default 1.8 inhib for all of network -- can explore","Params":{"Layer.Inhib.Layer.Gi":"1.8"},"Hypers":{"Layer.Inhib.Layer.Gi":{"Tweak":"[1.75, 1.85]"}}},"Search":[{"Name":"Input","Type":"Layer","Path":"Layer.Inhib.Layer.Gi","Start":1.8,"Values":[1.75,1.85]}]},{"Param":"Prjn.Learn.LRate","Sel":{"Sel":"Prjn","Desc":"norm and momentum on works better, but wt bal is not better for smaller nets","Params":{"Prjn.Learn.LRate":"0.02","Prjn.Learn.Momentum":"0.9"},"Hypers":{"Prjn.Learn.LRate":{"Tweak":"log"},"Prjn.Learn.Momentum":{"Tweak":"incr"}}},"Search":[{"Name":"HiddenToInput","Type":"Prjn","Path":"Prjn.Learn.LRate","Start":0.02,"Values":[0.01,0.05]},{"Name":"InputToHidden","Type":"Prjn","Path":"Prjn.Learn.LRate","Start":0.02,"Values":[0.01,0.05]}]},{"Param":"Prjn.Learn.Momentum","Sel":{"Sel":"Prjn","Desc":"norm and momentum on works better, but wt bal is not better for smaller nets","Params":{"Prjn.Learn.LRate":"0.02","Prjn.Learn.Momentum":"0.9"},"Hypers":{"Prjn.Learn.LRate":{"Tweak":"log"},"Prjn.Learn.Momentum":{"Tweak":"incr"}}},"Search":[{"Name":"HiddenToInput","Type":"Prjn","Path":"Prjn.Learn.Momentum","Start":0.9,"Values":[0.8,1]},{"Name":"InputToHidden","Type":"Prjn","Path":"Prjn.Learn.Momentum","Start":0.9,"Values":[0.8,1]}]}] +var trgSearch = 
`[{"Param":"Layer.Inhib.Layer.Gi","Sel":{"Sel":"#Hidden","Desc":"output definitely needs lower inhib -- true for smaller layers in general","Params":{"Layer.Inhib.Layer.Gi":"1.4"},"Hypers":{"Layer.Inhib.Layer.Gi":{"Tweak":"incr"}}},"Search":[{"Name":"Hidden","Type":"Layer","Path":"Layer.Inhib.Layer.Gi","Start":1.4,"Values":[1.3,1.5]}]},{"Param":"Path.WtScale.Rel","Sel":{"Sel":".Back","Desc":"top-down back-pathways MUST have lower relative weight scale, otherwise network hallucinates","Params":{"Path.WtScale.Rel":"0.2"},"Hypers":{"Path.WtScale.Rel":{"Tweak":"log"}}},"Search":[{"Name":"HiddenToInput","Type":"Path","Path":"Path.WtScale.Rel","Start":0.2,"Values":[0.1,0.5]}]},{"Param":"Layer.Inhib.Layer.Gi","Sel":{"Sel":"Layer","Desc":"using default 1.8 inhib for all of network -- can explore","Params":{"Layer.Inhib.Layer.Gi":"1.8"},"Hypers":{"Layer.Inhib.Layer.Gi":{"Tweak":"[1.75, 1.85]"}}},"Search":[{"Name":"Input","Type":"Layer","Path":"Layer.Inhib.Layer.Gi","Start":1.8,"Values":[1.75,1.85]}]},{"Param":"Path.Learn.LRate","Sel":{"Sel":"Path","Desc":"norm and momentum on works better, but wt bal is not better for smaller nets","Params":{"Path.Learn.LRate":"0.02","Path.Learn.Momentum":"0.9"},"Hypers":{"Path.Learn.LRate":{"Tweak":"log"},"Path.Learn.Momentum":{"Tweak":"incr"}}},"Search":[{"Name":"HiddenToInput","Type":"Path","Path":"Path.Learn.LRate","Start":0.02,"Values":[0.01,0.05]},{"Name":"InputToHidden","Type":"Path","Path":"Path.Learn.LRate","Start":0.02,"Values":[0.01,0.05]}]},{"Param":"Path.Learn.Momentum","Sel":{"Sel":"Path","Desc":"norm and momentum on works better, but wt bal is not better for smaller nets","Params":{"Path.Learn.LRate":"0.02","Path.Learn.Momentum":"0.9"},"Hypers":{"Path.Learn.LRate":{"Tweak":"log"},"Path.Learn.Momentum":{"Tweak":"incr"}}},"Search":[{"Name":"HiddenToInput","Type":"Path","Path":"Path.Learn.Momentum","Start":0.9,"Values":[0.8,1]},{"Name":"InputToHidden","Type":"Path","Path":"Path.Learn.Momentum","Start":0.9,"Values":[0.8,1]}]}] ` func TestTweakHypers(t *testing.T) { @@ -81,8 +81,8 @@ func TestTweakHypers(t *testing.T) { hypers.Init([]FlexVal{ FlexVal{Nm: "Input", Type: "Layer", Cls: "Input", Obj: Hypers{}}, FlexVal{Nm: "Hidden", Type: "Layer", Cls: "Hidden", Obj: Hypers{}}, - FlexVal{Nm: "InputToHidden", Type: "Prjn", Cls: "Forward", Obj: Hypers{}}, - FlexVal{Nm: "HiddenToInput", Type: "Prjn", Cls: "Back", Obj: Hypers{}}, + FlexVal{Nm: "InputToHidden", Type: "Path", Cls: "Forward", Obj: Hypers{}}, + FlexVal{Nm: "HiddenToInput", Type: "Path", Cls: "Back", Obj: Hypers{}}, }) basenet := tweakSets.SetByName("Base").Sheets["Network"] hypers.ApplySheet(basenet, false) diff --git a/params/typegen.go b/params/typegen.go index 22ac38fa..12545bb9 100644 --- a/params/typegen.go +++ b/params/typegen.go @@ -18,7 +18,7 @@ var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Hype var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Hypers", IDName: "hypers", Doc: "Hypers is a parallel structure to Params which stores information relevant\nto hyperparameter search as well as the values.\nUse the key \"Val\" for the default value. This is equivalant to the value in\nParams. 
\"Min\" and \"Max\" guid the range, and \"Sigma\" describes a Gaussian."}) -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Params", IDName: "params", Doc: "Params is a name-value map for parameter values that can be applied\nto any numeric type in any object.\nThe name must be a dot-separated path to a specific parameter, e.g., Prjn.Learn.Lrate\nThe first part of the path is the overall target object type, e.g., \"Prjn\" or \"Layer\",\nwhich is used for determining if the parameter applies to a given object type.\n\nAll of the params in one map must apply to the same target type because\nonly the first item in the map (which could be any due to order randomization)\nis used for checking the type of the target. Also, they all fall within the same\nSel selector scope which is used to determine what specific objects to apply the\nparameters to."}) +var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Params", IDName: "params", Doc: "Params is a name-value map for parameter values that can be applied\nto any numeric type in any object.\nThe name must be a dot-separated path to a specific parameter, e.g., Path.Learn.Lrate\nThe first part of the path is the overall target object type, e.g., \"Path\" or \"Layer\",\nwhich is used for determining if the parameter applies to a given object type.\n\nAll of the params in one map must apply to the same target type because\nonly the first item in the map (which could be any due to order randomization)\nis used for checking the type of the target. Also, they all fall within the same\nSel selector scope which is used to determine what specific objects to apply the\nparameters to."}) var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Sel", IDName: "sel", Doc: "params.Sel specifies a selector for the scope of application of a set of\nparameters, using standard css selector syntax (. prefix = class, # prefix = name,\nand no prefix = type)", Directives: []types.Directive{{Tool: "types", Directive: "add"}}, Fields: []types.Field{{Name: "Sel", Doc: "selector for what to apply the parameters to, using standard css selector syntax: .Example applies to anything with a Class tag of 'Example', #Example applies to anything with a Name of 'Example', and Example with no prefix applies to anything of type 'Example'"}, {Name: "Desc", Doc: "description of these parameter values -- what effect do they have? what range was explored? it is valuable to record this information as you explore the params."}, {Name: "Params", Doc: "parameter values to apply to whatever matches the selector"}, {Name: "Hypers", Doc: "Put your hyperparams here"}, {Name: "NMatch", Doc: "number of times this selector matched a target during the last Apply process -- a warning is issued for any that remain at 0 -- see Sheet SelMatchReset and SelNoMatchWarn methods"}, {Name: "SetName", Doc: "name of current Set being applied"}}}) @@ -30,7 +30,7 @@ var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Set" var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Sets", IDName: "sets", Doc: "Sets is a collection of Set's that can be chosen among\ndepending on different desired configurations etc. 
Thus, each Set\nrepresents a collection of different possible specific configurations,\nand different such configurations can be chosen by name to apply as desired."})
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.SearchValues", IDName: "search-values", Doc: "SearchValues is a list of parameter values to search for one parameter\non a given object (specified by Name), for float-valued params.", Fields: []types.Field{{Name: "Name", Doc: "name of object with the parameter"}, {Name: "Type", Doc: "type of object with the parameter. This is a Base type name (e.g., Layer, Prjn),\nthat is at the start of the path in Network params."}, {Name: "Path", Doc: "path to the parameter within the object"}, {Name: "Start", Doc: "starting value, e.g., for restoring after searching\nbefore moving on to another parameter, for grid search."}, {Name: "Values", Doc: "values of the parameter to search"}}})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.SearchValues", IDName: "search-values", Doc: "SearchValues is a list of parameter values to search for one parameter\non a given object (specified by Name), for float-valued params.", Fields: []types.Field{{Name: "Name", Doc: "name of object with the parameter"}, {Name: "Type", Doc: "type of object with the parameter. This is a Base type name (e.g., Layer, Path),\nthat is at the start of the path in Network params."}, {Name: "Path", Doc: "path to the parameter within the object"}, {Name: "Start", Doc: "starting value, e.g., for restoring after searching\nbefore moving on to another parameter, for grid search."}, {Name: "Values", Doc: "values of the parameter to search"}}})
var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/params.Styler", IDName: "styler", Doc: "The params.Styler interface exposes TypeName, Class, and Name methods\nthat allow the params.Sel CSS-style selection specifier to determine\nwhether a given parameter applies.\nAdding Set versions of Name and Class methods is a good idea but not\nneeded for this interface, so they are not included here."})
diff --git a/paths/README.md b/paths/README.md new file mode 100644 index 00000000..7a35490e --- /dev/null +++ b/paths/README.md @@ -0,0 +1,22 @@
+Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/paths)
+
+See the [Wiki Paths](https://github.com/emer/emergent/wiki/Paths) page for detailed docs.
+
+Package `paths` is a separate package for defining `Pattern`s of pathways between layers. This is done using a fully independent structure that *only* knows about the shapes of the two layers, and it returns a fully general bitmap representation of the pattern of connectivity between them.
+
+The algorithm-specific code then uses these patterns to do all the nitty-gritty of connecting up neurons.
+
+This makes the path code *much* simpler. This should be the *last* time any of those pathway patterns need to be written (having re-written this code too many times in the C++ version as the details of memory allocations changed).
+
+A Pattern maintains nothing about a specific pathway -- it only has the parameters that are applied in creating a new pattern of connectivity, so it can be shared among any number of pathways that need the same connectivity parameters.
+
+All Pattern types have a `New<Name>` function, where `<Name>` is the type name, that creates a new instance of the given pattern initialized with default values.
+
+Individual Pattern types may have a Defaults() method to initialize default values, but it is not mandatory.
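+
+For example, a minimal sketch of creating a pattern and generating its connectivity (the `Connect` signature is as shown in this package's sources; `tensor.NewShape` is assumed here as the shape constructor from `cogentcore.org/core/tensor`):
+
+```Go
+send := tensor.NewShape(4, 5) // sending layer shape, Y x X (assumed constructor)
+recv := tensor.NewShape(4, 5) // receiving layer shape
+
+circ := paths.NewCircle() // `New<Name>` constructor, initialized with default values
+circ.Radius = 2
+circ.Wrap = true
+
+// Connect needs only the two shapes: it returns the number of connections
+// per sending and receiving unit, plus the full connectivity bitmap, which
+// the algorithm-specific Path code uses to make the actual synapses.
+sendn, recvn, cons := circ.Connect(send, recv, false)
+_, _, _ = sendn, recvn, cons // handed off to the Path in real code
+```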
+
+# Topographic Weights
+
+Some paths (e.g., Circle, PoolTile) support the generation of topographic weight patterns that can be used to set initial weights, or per-synapse scaling factors. The `Pattern` interface does not define any standard for how this is done, as there are various possible approaches. Circle defines a method with a standard signature that can be called for each point in the pattern, while PoolTile has a lot more overhead per point, so it is more efficient for it to generate the whole set of weights into a tensor, which can then be used.
+
+It is recommended to have some kind of positive flag(s) for enabling the use of TopoWts -- the standard weight initialization methods, e.g., leabra.Network.InitWts, can then automatically do the correct thing for each type of standard path -- custom ones outside of this standard set would need custom code.
+
diff --git a/prjn/circle.go b/paths/circle.go similarity index 87% rename from prjn/circle.go rename to paths/circle.go index 346c2be2..80dd4a08 100644 --- a/prjn/circle.go +++ b/paths/circle.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "cogentcore.org/core/math32" @@ -35,7 +35,7 @@ type Circle struct { // if true, connectivity wraps around edges Wrap bool - // if true, this prjn should set gaussian topographic weights, according to following parameters + // if true, this path should set gaussian topographic weights, according to following parameters TopoWts bool // gaussian sigma (width) as a proportion of the radius of the circle @@ -44,7 +44,7 @@ type Circle struct { // maximum weight value for GaussWts function -- multiplies values MaxWt float32 - // if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself + // if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself SelfCon bool } @@ -68,8 +68,8 @@ func (cr *Circle) Name() string { func (cr *Circle) Connect(send, recv *tensor.Shape, same bool) (sendn, recvn *tensor.Int32, cons *tensor.Bits) { sendn, recvn, cons = NewTensors(send, recv) - sNy, sNx, _, _ := tensor.Prjn2DShape(send, false) - rNy, rNx, _, _ := tensor.Prjn2DShape(recv, false) + sNy, sNx, _, _ := tensor.Projection2DShape(send, false) + rNy, rNx, _, _ := tensor.Projection2DShape(recv, false) rnv := recvn.Values snv := sendn.Values @@ -98,8 +98,8 @@ } d := int(math32.Round(sp.DistanceTo(sctr))) if d <= cr.Radius { - ri := tensor.Prjn2DIndex(recv, false, ry, rx) - si := tensor.Prjn2DIndex(send, false, sy, sx) + ri := tensor.Projection2DIndex(recv, false, ry, rx) + si := tensor.Projection2DIndex(send, false, sy, sx) off := ri*sNtot + si if !cr.SelfCon && same && ri == si { continue @@ -117,10 +117,10 @@ // GaussWts returns gaussian weight value for given unit indexes in // given send and recv layers according to Gaussian Sigma and MaxWt.
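+// For example (a sketch -- SetWtsFunc, named in the comment below, is a hook
+// on the algorithm-specific Path type, not in this package; pt stands for a
+// hypothetical Path variable):
+//
+//	circ := NewCircle()
+//	circ.TopoWts = true
+//	pt.SetWtsFunc(circ.GaussWts)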
-// Can be used for a Prjn.SetScalesFunc or SetWtsFunc +// Can be used for a Path.SetScalesFunc or SetWtsFunc func (cr *Circle) GaussWts(si, ri int, send, recv *tensor.Shape) float32 { - sNy, sNx, _, _ := tensor.Prjn2DShape(send, false) - rNy, rNx, _, _ := tensor.Prjn2DShape(recv, false) + sNy, sNx, _, _ := tensor.Projection2DShape(send, false) + rNy, rNx, _, _ := tensor.Projection2DShape(recv, false) ry := ri / rNx // todo: this is not right for 4d! rx := ri % rNx
diff --git a/prjn/doc.go b/paths/doc.go similarity index 73% rename from prjn/doc.go rename to paths/doc.go index c41f467c..0cb933ca 100644 --- a/prjn/doc.go +++ b/paths/doc.go @@ -3,23 +3,23 @@ // license that can be found in the LICENSE file. /* -Package prjn is a separate package for defining patterns of connectivity between layers +Package paths is a separate package for defining patterns of connectivity between layers (i.e., the ProjectionSpecs from C++ emergent). This is done using a fully independent structure that *only* knows about the shapes of the two layers, and it returns a fully general bitmap representation of the pattern of connectivity between them. -The algorithm-specific leabra.Prjn code then uses these patterns to do all the nitty-gritty +The algorithm-specific leabra.Path code then uses these patterns to do all the nitty-gritty of connecting up neurons. -This makes the projection code *much* simpler compared to the ProjectionSpec in C++ emergent, +This makes the pathway code *much* simpler compared to the ProjectionSpec in C++ emergent, which was involved in both creating the pattern and also all the complexity of setting up the -actual connections themselves. This should be the *last* time any of those projection patterns +actual connections themselves. This should be the *last* time any of those pathway patterns need to be written (having re-written this code too many times in the C++ version as the details of memory allocations changed). -A Pattern maintains nothing about a specific projection -- it only has the parameters that +A Pattern maintains nothing about a specific pathway -- it only has the parameters that are applied in creating a new pattern of connectivity, so it can be shared among any number -of projections that need the same connectivity parameters. +of pathways that need the same connectivity parameters. All Pattern types have a New<Name> function, where <Name> is the type name, that creates a new instance of given pattern initialized with default values. @@ -27,4 +27,4 @@ instance of given pattern initialized with default values. Individual Pattern types may have a Defaults() method to initialize default values, but it is not mandatory. */ -package prjn +package paths
diff --git a/prjn/full.go b/paths/full.go similarity index 88% rename from prjn/full.go rename to paths/full.go index bbb6eb81..d01de5a3 100644 --- a/prjn/full.go +++ b/paths/full.go @@ -2,14 +2,14 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file.
-package prjn +package paths import "cogentcore.org/core/tensor" // Full implements full all-to-all pattern of connectivity between two layers type Full struct { - // if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself + // if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself SelfCon bool } diff --git a/prjn/onetoone.go b/paths/onetoone.go similarity index 98% rename from prjn/onetoone.go rename to paths/onetoone.go index 99ba4f98..ca396e74 100644 --- a/prjn/onetoone.go +++ b/paths/onetoone.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import "cogentcore.org/core/tensor" diff --git a/prjn/pattern.go b/paths/pattern.go similarity index 99% rename from prjn/pattern.go rename to paths/pattern.go index c7d31dda..de324159 100644 --- a/prjn/pattern.go +++ b/paths/pattern.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths //go:generate core generate -add-types diff --git a/prjn/poolonetoone.go b/paths/poolonetoone.go similarity index 99% rename from prjn/poolonetoone.go rename to paths/poolonetoone.go index 6f7cd417..e372ed2e 100644 --- a/prjn/poolonetoone.go +++ b/paths/poolonetoone.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import "cogentcore.org/core/tensor" diff --git a/prjn/poolrect.go b/paths/poolrect.go similarity index 97% rename from prjn/poolrect.go rename to paths/poolrect.go index 10a692fa..9064a040 100644 --- a/prjn/poolrect.go +++ b/paths/poolrect.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "cogentcore.org/core/math32" @@ -35,7 +35,7 @@ type PoolRect struct { // if true, connectivity wraps around all edges if it would otherwise go off the edge -- if false, then edges are clipped Wrap bool - // if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself + // if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself SelfCon bool // starting pool position in receiving layer -- if > 0 then pools below this starting point remain unconnected diff --git a/prjn/poolsameunit.go b/paths/poolsameunit.go similarity index 97% rename from prjn/poolsameunit.go rename to paths/poolsameunit.go index 26dbe2bb..147e32d5 100644 --- a/prjn/poolsameunit.go +++ b/paths/poolsameunit.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import "cogentcore.org/core/tensor" @@ -16,7 +16,7 @@ import "cogentcore.org/core/tensor" // If neither is 4D, then it is equivalent to OneToOne. 
type PoolSameUnit struct { - // if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself + // if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself SelfCon bool } diff --git a/prjn/pooltile.go b/paths/pooltile.go similarity index 99% rename from prjn/pooltile.go rename to paths/pooltile.go index 17b79185..51227801 100644 --- a/prjn/pooltile.go +++ b/paths/pooltile.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "fmt" @@ -27,7 +27,7 @@ import ( // must specifically apply these to the receptive fields. type PoolTile struct { - // reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-projection or topo prjn when sending layer is larger than recv + // reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-pathway or topo path when sending layer is larger than recv Recip bool // size of receptive field tile, in terms of pools on the sending layer diff --git a/prjn/pooltilesub.go b/paths/pooltilesub.go similarity index 99% rename from prjn/pooltilesub.go rename to paths/pooltilesub.go index 967cf452..71da2279 100644 --- a/prjn/pooltilesub.go +++ b/paths/pooltilesub.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "fmt" @@ -29,7 +29,7 @@ import ( // must specifically apply these to the receptive fields. type PoolTileSub struct { - // reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-projection or topo prjn when sending layer is larger than recv + // reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-pathway or topo path when sending layer is larger than recv Recip bool // size of receptive field tile, in terms of pools on the sending layer diff --git a/prjn/poolunifrnd.go b/paths/poolunifrnd.go similarity index 99% rename from prjn/poolunifrnd.go rename to paths/poolunifrnd.go index 0fae7348..7ab2a12b 100644 --- a/prjn/poolunifrnd.go +++ b/paths/poolunifrnd.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "math" diff --git a/prjn/prjn_test.go b/paths/prjn_test.go similarity index 99% rename from prjn/prjn_test.go rename to paths/prjn_test.go index 6817618f..593761d3 100644 --- a/prjn/prjn_test.go +++ b/paths/prjn_test.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "testing" diff --git a/prjn/rect.go b/paths/rect.go similarity index 89% rename from prjn/rect.go rename to paths/rect.go index a4842962..7e439815 100644 --- a/prjn/rect.go +++ b/paths/rect.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "cogentcore.org/core/math32" @@ -14,7 +14,7 @@ import ( // Rect implements a rectangular pattern of connectivity between two layers // where the lower-left corner moves in proportion to receiver position with offset // and multiplier factors (with wrap-around optionally). -// 4D layers are automatically flattened to 2D for this projection. 
+// 4D layers are automatically flattened to 2D for this pathway. type Rect struct { // size of rectangle in sending layer that each receiving unit receives from @@ -35,7 +35,7 @@ type Rect struct { // if true, connectivity wraps around all edges if it would otherwise go off the edge -- if false, then edges are clipped Wrap bool - // if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself + // if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself SelfCon bool // make the reciprocal of the specified connections -- i.e., symmetric for swapping recv and send @@ -77,8 +77,8 @@ func (cr *Rect) Connect(send, recv *tensor.Shape, same bool) (sendn, recvn *tens return cr.ConnectRecip(send, recv, same) } sendn, recvn, cons = NewTensors(send, recv) - sNy, sNx, _, _ := tensor.Prjn2DShape(send, false) - rNy, rNx, _, _ := tensor.Prjn2DShape(recv, false) + sNy, sNx, _, _ := tensor.Projection2DShape(send, false) + rNy, rNx, _, _ := tensor.Projection2DShape(recv, false) rnv := recvn.Values snv := sendn.Values @@ -109,7 +109,7 @@ func (cr *Rect) Connect(send, recv *tensor.Shape, same bool) (sendn, recvn *tens for ry := cr.RecvStart.Y; ry < rNyEff+cr.RecvStart.Y; ry++ { for rx := cr.RecvStart.X; rx < rNxEff+cr.RecvStart.X; rx++ { - ri := tensor.Prjn2DIndex(recv, false, ry, rx) + ri := tensor.Projection2DIndex(recv, false, ry, rx) sst := cr.Start if cr.RoundScale { sst.X += int(math32.Round(float32(rx-cr.RecvStart.X) * sc.X)) @@ -128,7 +128,7 @@ func (cr *Rect) Connect(send, recv *tensor.Shape, same bool) (sendn, recvn *tens if clipx { continue } - si := tensor.Prjn2DIndex(send, false, sy, sx) + si := tensor.Projection2DIndex(send, false, sy, sx) off := ri*sNtot + si if !cr.SelfCon && same && ri == si { continue @@ -145,8 +145,8 @@ func (cr *Rect) Connect(send, recv *tensor.Shape, same bool) (sendn, recvn *tens func (cr *Rect) ConnectRecip(send, recv *tensor.Shape, same bool) (sendn, recvn *tensor.Int32, cons *tensor.Bits) { sendn, recvn, cons = NewTensors(send, recv) - sNy, sNx, _, _ := tensor.Prjn2DShape(recv, false) // swapped! - rNy, rNx, _, _ := tensor.Prjn2DShape(send, false) + sNy, sNx, _, _ := tensor.Projection2DShape(recv, false) // swapped! + rNy, rNx, _, _ := tensor.Projection2DShape(send, false) rnv := recvn.Values snv := sendn.Values @@ -177,7 +177,7 @@ func (cr *Rect) ConnectRecip(send, recv *tensor.Shape, same bool) (sendn, recvn for ry := cr.RecvStart.Y; ry < rNyEff+cr.RecvStart.Y; ry++ { for rx := cr.RecvStart.X; rx < rNxEff+cr.RecvStart.X; rx++ { - ri := tensor.Prjn2DIndex(send, false, ry, rx) + ri := tensor.Projection2DIndex(send, false, ry, rx) sst := cr.Start if cr.RoundScale { sst.X += int(math32.Round(float32(rx-cr.RecvStart.X) * sc.X)) @@ -196,7 +196,7 @@ func (cr *Rect) ConnectRecip(send, recv *tensor.Shape, same bool) (sendn, recvn if clipx { continue } - si := tensor.Prjn2DIndex(recv, false, sy, sx) + si := tensor.Projection2DIndex(recv, false, sy, sx) off := si*sNtot + ri if !cr.SelfCon && same && ri == si { continue diff --git a/paths/typegen.go b/paths/typegen.go new file mode 100644 index 00000000..8c67e907 --- /dev/null +++ b/paths/typegen.go @@ -0,0 +1,35 @@ +// Code generated by "core generate -add-types"; DO NOT EDIT. 
+
+package paths
+
+import (
+	"cogentcore.org/core/types"
+)
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.Circle", IDName: "circle", Doc: "Circle implements a circular pattern of connectivity between two layers\nwhere the center moves in proportion to receiver position with offset\nand multiplier factors, and a given radius is used (with wrap-around\noptionally). A corresponding Gaussian bump of TopoWts is available as well.\nMakes for a good center-surround connectivity pattern.\n4D layers are automatically flattened to 2D for this connection.", Fields: []types.Field{{Name: "Radius", Doc: "radius of the circle, in units from center in sending layer"}, {Name: "Start", Doc: "starting offset in sending layer, for computing the corresponding sending center relative to given recv unit position"}, {Name: "Scale", Doc: "scaling to apply to receiving unit position to compute sending center as function of recv unit position"}, {Name: "AutoScale", Doc: "auto-scale sending center positions as function of relative sizes of send and recv layers -- if Start is positive then assumes it is a border, subtracted from sending size"}, {Name: "Wrap", Doc: "if true, connectivity wraps around edges"}, {Name: "TopoWts", Doc: "if true, this path should set gaussian topographic weights, according to following parameters"}, {Name: "Sigma", Doc: "gaussian sigma (width) as a proportion of the radius of the circle"}, {Name: "MaxWt", Doc: "maximum weight value for GaussWts function -- multiplies values"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.Full", IDName: "full", Doc: "Full implements full all-to-all pattern of connectivity between two layers", Fields: []types.Field{{Name: "SelfCon", Doc: "if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.OneToOne", IDName: "one-to-one", Doc: "OneToOne implements point-to-point one-to-one pattern of connectivity between two layers", Fields: []types.Field{{Name: "NCons", Doc: "number of recv connections to make (0 for entire size of recv layer)"}, {Name: "SendStart", Doc: "starting unit index for sending connections"}, {Name: "RecvStart", Doc: "starting unit index for recv connections"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.Pattern", IDName: "pattern", Doc: "Pattern defines a pattern of connectivity between two layers.\nThe pattern is stored efficiently using a bitslice tensor of binary values indicating\npresence or absence of connection between two items.\nA receiver-based organization is generally assumed but connectivity can go either way."})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.PoolOneToOne", IDName: "pool-one-to-one", Doc: "PoolOneToOne implements one-to-one connectivity between pools within layers.\nPools are the outer-most two dimensions of a 4D layer shape.\nIf either layer does not have pools, then if the number of individual\nunits matches the number of pools in the other layer, those are connected one-to-one\notherwise each pool connects to the entire set of other units.\nIf neither is 4D, then it is equivalent to OneToOne.", Fields: []types.Field{{Name: "NPools", Doc: "number of recv pools to connect (0 for entire number of pools in recv layer)"}, {Name: "SendStart", Doc: "starting pool index for sending connections"}, {Name: "RecvStart", Doc: "starting pool index for recv connections"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.PoolRect", IDName: "pool-rect", Doc: "PoolRect implements a rectangular pattern of connectivity between\ntwo 4D layers, in terms of their pool-level shapes,\nwhere the lower-left corner moves in proportion to receiver\npool position with offset and multiplier factors (with wrap-around optionally).", Fields: []types.Field{{Name: "Size", Doc: "size of rectangle (of pools) in sending layer that each receiving unit receives from"}, {Name: "Start", Doc: "starting pool offset in sending layer, for computing the corresponding sending lower-left corner relative to given recv pool position"}, {Name: "Scale", Doc: "scaling to apply to receiving pool position to compute corresponding position in sending layer of the lower-left corner of rectangle"}, {Name: "AutoScale", Doc: "auto-set the Scale as function of the relative pool sizes of send and recv layers (e.g., if sending layer is 2x larger than receiving, Scale = 2)"}, {Name: "RoundScale", Doc: "if true, use Round when applying scaling factor -- otherwise uses Floor which makes Scale work like a grouping factor -- e.g., .25 will effectively group 4 recv pools with same send position"}, {Name: "Wrap", Doc: "if true, connectivity wraps around all edges if it would otherwise go off the edge -- if false, then edges are clipped"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself"}, {Name: "RecvStart", Doc: "starting pool position in receiving layer -- if > 0 then pools below this starting point remain unconnected"}, {Name: "RecvN", Doc: "number of pools in receiving layer to connect -- if 0 then all (remaining after RecvStart) are connected -- otherwise if < remaining then those beyond this point remain unconnected"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.PoolSameUnit", IDName: "pool-same-unit", Doc: "PoolSameUnit connects a given unit to the unit at the same index\nacross all the pools in a layer.\nPools are the outer-most two dimensions of a 4D layer shape.\nThis is most sensible when pools have same numbers of units in send and recv.\nThis is typically used for lateral topography-inducing connectivity\nand can also serve to reduce a pooled layer down to a single pool.\nThe logic works if either layer does not have pools.\nIf neither is 4D, then it is equivalent to OneToOne.", Fields: []types.Field{{Name: "SelfCon", Doc: "if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.PoolTile", IDName: "pool-tile", Doc: "PoolTile implements tiled 2D connectivity between pools within layers, where\na 2D rectangular receptive field (defined over pools, not units) is tiled\nacross the sending layer pools, with specified level of overlap.\nPools are the outer-most two dimensions of a 4D layer shape.\n2D layers are assumed to have 1x1 pool.\nThis is a standard form of convolutional connectivity, where pools are\nthe filters and the outer dims are locations filtered.\nVarious initial weight / scaling patterns are also available -- code\nmust specifically apply these to the receptive fields.", Fields: []types.Field{{Name: "Recip", Doc: "reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-pathway or topo path when sending layer is larger than recv"}, {Name: "Size", Doc: "size of receptive field tile, in terms of pools on the sending layer"}, {Name: "Skip", Doc: "how many pools to skip in tiling over sending layer -- typically 1/2 of Size"}, {Name: "Start", Doc: "starting pool offset for lower-left corner of first receptive field in sending layer"}, {Name: "Wrap", Doc: "if true, pool coordinates wrap around sending shape -- otherwise truncated at edges, which can lead to asymmetries in connectivity etc"}, {Name: "GaussFull", Doc: "gaussian topographic weights / scaling parameters for full receptive field width. multiplies any other factors present"}, {Name: "GaussInPool", Doc: "gaussian topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for gaussian) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present"}, {Name: "SigFull", Doc: "sigmoidal topographic weights / scaling parameters for full receptive field width. left / bottom half have increasing sigmoids, and second half decrease. Multiplies any other factors present (only used if Gauss versions are not On!)"}, {Name: "SigInPool", Doc: "sigmoidal topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for sigmoid) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present (only used if Gauss versions are not On!). left / bottom half have increasing sigmoids, and second half decrease."}, {Name: "TopoRange", Doc: "min..max range of topographic weight values to generate"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.GaussTopo", IDName: "gauss-topo", Doc: "GaussTopo has parameters for Gaussian topographic weights or scaling factors", Fields: []types.Field{{Name: "On", Doc: "use gaussian topographic weights / scaling values"}, {Name: "Sigma", Doc: "gaussian sigma (width) in normalized units where entire distance across relevant dimension is 1.0 -- typical useful values range from .3 to 1.5, with .6 default"}, {Name: "Wrap", Doc: "wrap the gaussian around on other sides of the receptive field, with the closest distance being used -- this removes strict topography but ensures a more uniform distribution of weight values so edge units don't have weaker overall weights"}, {Name: "CtrMove", Doc: "proportion to move gaussian center relative to the position of the receiving unit within its pool: 1.0 = centers span the entire range of the receptive field. Typically want to use 1.0 for Wrap = true, and 0.8 for false"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.SigmoidTopo", IDName: "sigmoid-topo", Doc: "SigmoidTopo has parameters for sigmoidal topographic weights or scaling factors", Fields: []types.Field{{Name: "On", Doc: "use sigmoidal topographic weights / scaling values"}, {Name: "Gain", Doc: "gain of sigmoid that determines steepness of curve, in normalized units where entire distance across relevant dimension is 1.0 -- typical useful values range from 0.01 to 0.1"}, {Name: "CtrMove", Doc: "proportion to move sigmoid center relative to the position of the receiving unit within its pool: 1.0 = centers span the entire range of the receptive field. Typically want to use 1.0 for Wrap = true, and 0.8 for false"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.PoolTileSub", IDName: "pool-tile-sub", Doc: "PoolTileSub implements tiled 2D connectivity between pools within layers, where\na 2D rectangular receptive field (defined over pools, not units) is tiled\nacross the sending layer pools, with specified level of overlap.\nPools are the outer-most two dimensions of a 4D layer shape.\nSub version has sub-pools within each pool to encourage more independent\nrepresentations.\n2D layers are assumed to have 1x1 pool.\nThis is a standard form of convolutional connectivity, where pools are\nthe filters and the outer dims are locations filtered.\nVarious initial weight / scaling patterns are also available -- code\nmust specifically apply these to the receptive fields.", Fields: []types.Field{{Name: "Recip", Doc: "reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-pathway or topo path when sending layer is larger than recv"}, {Name: "Size", Doc: "size of receptive field tile, in terms of pools on the sending layer"}, {Name: "Skip", Doc: "how many pools to skip in tiling over sending layer -- typically 1/2 of Size"}, {Name: "Start", Doc: "starting pool offset for lower-left corner of first receptive field in sending layer"}, {Name: "Subs", Doc: "number of sub-pools within each pool"}, {Name: "SendSubs", Doc: "sending layer has sub-pools"}, {Name: "Wrap", Doc: "if true, pool coordinates wrap around sending shape -- otherwise truncated at edges, which can lead to asymmetries in connectivity etc"}, {Name: "GaussFull", Doc: "gaussian topographic weights / scaling parameters for full receptive field width. multiplies any other factors present"}, {Name: "GaussInPool", Doc: "gaussian topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for gaussian) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present"}, {Name: "SigFull", Doc: "sigmoidal topographic weights / scaling parameters for full receptive field width. left / bottom half have increasing sigmoids, and second half decrease. Multiplies any other factors present (only used if Gauss versions are not On!)"}, {Name: "SigInPool", Doc: "sigmoidal topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for sigmoid) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present (only used if Gauss versions are not On!). left / bottom half have increasing sigmoids, and second half decrease."}, {Name: "TopoRange", Doc: "min..max range of topographic weight values to generate"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.PoolUnifRnd", IDName: "pool-unif-rnd", Doc: "PoolUnifRnd implements random pattern of connectivity between pools within layers.\nPools are the outer-most two dimensions of a 4D layer shape.\nIf either layer does not have pools, PoolUnifRnd works as UnifRnd does.\nIf probability of connection (PCon) is 1, PoolUnifRnd works as PoolOneToOne does.", Embeds: []types.Field{{Name: "PoolOneToOne"}, {Name: "UnifRnd"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.Rect", IDName: "rect", Doc: "Rect implements a rectangular pattern of connectivity between two layers\nwhere the lower-left corner moves in proportion to receiver position with offset\nand multiplier factors (with wrap-around optionally).\n4D layers are automatically flattened to 2D for this pathway.", Fields: []types.Field{{Name: "Size", Doc: "size of rectangle in sending layer that each receiving unit receives from"}, {Name: "Start", Doc: "starting offset in sending layer, for computing the corresponding sending lower-left corner relative to given recv unit position"}, {Name: "Scale", Doc: "scaling to apply to receiving unit position to compute corresponding position in sending layer of the lower-left corner of rectangle"}, {Name: "AutoScale", Doc: "auto-set the Scale as function of the relative sizes of send and recv layers (e.g., if sending layer is 2x larger than receiving, Scale = 2)"}, {Name: "RoundScale", Doc: "if true, use Round when applying scaling factor -- otherwise uses Floor which makes Scale work like a grouping factor -- e.g., .25 will effectively group 4 recv units with same send position"}, {Name: "Wrap", Doc: "if true, connectivity wraps around all edges if it would otherwise go off the edge -- if false, then edges are clipped"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself"}, {Name: "Recip", Doc: "make the reciprocal of the specified connections -- i.e., symmetric for swapping recv and send"}, {Name: "RecvStart", Doc: "starting position in receiving layer -- if > 0 then units below this starting point remain unconnected"}, {Name: "RecvN", Doc: "number of units in receiving layer to connect -- if 0 then all (remaining after RecvStart) are connected -- otherwise if < remaining then those beyond this point remain unconnected"}}})
+
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/paths.UnifRnd", IDName: "unif-rnd", Doc: "UnifRnd implements uniform random pattern of connectivity between two layers\nusing a permuted (shuffled) list for without-replacement randomness,\nand maintains its own local random number source and seed\nwhich are initialized if Rand == nil -- usually best to keep this\nspecific to each instance of a pathway so it is fully reproducible\nand doesn't interfere with other random number streams.", Fields: []types.Field{{Name: "PCon", Doc: "probability of connection (0-1)"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself"}, {Name: "Recip", Doc: "reciprocal connectivity: if true, switch the sending and receiving layers to create a symmetric top-down pathway -- ESSENTIAL to use same RndSeed between two paths to ensure symmetry"}, {Name: "Rand",
Doc: "random number source -- is created with its own separate source if nil"}, {Name: "RndSeed", Doc: "the current random seed -- will be initialized to a new random number from the global random stream when Rand is created."}}}) diff --git a/prjn/unifrnd.go b/paths/unifrnd.go similarity index 94% rename from prjn/unifrnd.go rename to paths/unifrnd.go index 4d15ad26..d59a9f73 100644 --- a/prjn/unifrnd.go +++ b/paths/unifrnd.go @@ -2,7 +2,7 @@ // Use of this source code is governed by a BSD-style // license that can be found in the LICENSE file. -package prjn +package paths import ( "math" @@ -17,17 +17,17 @@ import ( // using a permuted (shuffled) list for without-replacement randomness, // and maintains its own local random number source and seed // which are initialized if Rand == nil -- usually best to keep this -// specific to each instance of a projection so it is fully reproducible +// specific to each instance of a pathway so it is fully reproducible // and doesn't interfere with other random number streams. type UnifRnd struct { // probability of connection (0-1) PCon float32 `min:"0" max:"1"` - // if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself + // if true, and connecting layer to itself (self pathway), then make a self-connection from unit to itself SelfCon bool - // reciprocal connectivity: if true, switch the sending and receiving layers to create a symmetric top-down projection -- ESSENTIAL to use same RndSeed between two prjns to ensure symmetry + // reciprocal connectivity: if true, switch the sending and receiving layers to create a symmetric top-down pathway -- ESSENTIAL to use same RndSeed between two paths to ensure symmetry Recip bool // random number source -- is created with its own separate source if nil diff --git a/prjn/README.md b/prjn/README.md deleted file mode 100644 index 92267daa..00000000 --- a/prjn/README.md +++ /dev/null @@ -1,22 +0,0 @@ -Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/prjn) - -See [Wiki Params](https://github.com/emer/emergent/wiki/Prjns) page for detailed docs. - -Package prjn is a separate package for defining `Pattern`s of connectivity between layers (i.e., the ProjectionSpecs from C++ emergent). This is done using a fully independent structure that *only* knows about the shapes of the two layers, and it returns a fully general bitmap representation of the pattern of connectivity between them. - -The algorithm-specific leabra.Prjn code then uses these patterns to do all the nitty-gritty of connecting up neurons. - -This makes the projection code *much* simpler compared to the ProjectionSpec in C++ emergent, which was involved in both creating the pattern and also all the complexity of setting up the actual connections themselves. This should be the *last* time any of those projection patterns need to be written (having re-written this code too many times in the C++ version as the details of memory allocations changed). - -A Pattern maintains nothing about a specific projection -- it only has the parameters that are applied in creating a new pattern of connectivity, so it can be shared among any number of projections that need the same connectivity parameters. - -All Patttern types have a New where is the type name, that creates a new instance of given pattern initialized with default values. - -Individual Pattern types may have a Defaults() method to initialize default values, but it is not mandatory. 
- -# Topographic Weights - -Some projections (e.g., Circle, PoolTile) support the generation of topographic weight patterns that can be used to set initial weights, or per-synapse scaling factors. The `Pattern` interface does not define any standard for how this done, as there are various possible approaches. Circle defines a method with a standard signature that can be called for each point in the pattern, while PoolTile has a lot more overhead per point and is thus more efficient to generate the whole set of weights to tensor, which can then be used. - -It is recommended to have some kind of positive flag(s) for enabling the use of TopoWts -- the standard weight initialization methods, e.g., leabra.Network.InitWts, can then automatically do the correct thing for each type of standard projection -- custom ones outside of this standard set would need custom code.. - diff --git a/prjn/typegen.go b/prjn/typegen.go deleted file mode 100644 index 36821538..00000000 --- a/prjn/typegen.go +++ /dev/null @@ -1,35 +0,0 @@ -// Code generated by "core generate -add-types"; DO NOT EDIT. - -package prjn - -import ( - "cogentcore.org/core/types" -) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.Circle", IDName: "circle", Doc: "Circle implements a circular pattern of connectivity between two layers\nwhere the center moves in proportion to receiver position with offset\nand multiplier factors, and a given radius is used (with wrap-around\noptionally). A corresponding Gaussian bump of TopoWts is available as well.\nMakes for a good center-surround connectivity pattern.\n4D layers are automatically flattened to 2D for this connection.", Fields: []types.Field{{Name: "Radius", Doc: "radius of the circle, in units from center in sending layer"}, {Name: "Start", Doc: "starting offset in sending layer, for computing the corresponding sending center relative to given recv unit position"}, {Name: "Scale", Doc: "scaling to apply to receiving unit position to compute sending center as function of recv unit position"}, {Name: "AutoScale", Doc: "auto-scale sending center positions as function of relative sizes of send and recv layers -- if Start is positive then assumes it is a border, subtracted from sending size"}, {Name: "Wrap", Doc: "if true, connectivity wraps around edges"}, {Name: "TopoWts", Doc: "if true, this prjn should set gaussian topographic weights, according to following parameters"}, {Name: "Sigma", Doc: "gaussian sigma (width) as a proportion of the radius of the circle"}, {Name: "MaxWt", Doc: "maximum weight value for GaussWts function -- multiplies values"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself"}}}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.Full", IDName: "full", Doc: "Full implements full all-to-all pattern of connectivity between two layers", Fields: []types.Field{{Name: "SelfCon", Doc: "if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself"}}}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.OneToOne", IDName: "one-to-one", Doc: "OneToOne implements point-to-point one-to-one pattern of connectivity between two layers", Fields: []types.Field{{Name: "NCons", Doc: "number of recv connections to make (0 for entire size of recv layer)"}, {Name: "SendStart", Doc: "starting unit index for sending connections"}, {Name: "RecvStart", Doc: "starting unit index 
for recv connections"}}}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.Pattern", IDName: "pattern", Doc: "Pattern defines a pattern of connectivity between two layers.\nThe pattern is stored efficiently using a bitslice tensor of binary values indicating\npresence or absence of connection between two items.\nA receiver-based organization is generally assumed but connectivity can go either way."}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.PoolOneToOne", IDName: "pool-one-to-one", Doc: "PoolOneToOne implements one-to-one connectivity between pools within layers.\nPools are the outer-most two dimensions of a 4D layer shape.\nIf either layer does not have pools, then if the number of individual\nunits matches the number of pools in the other layer, those are connected one-to-one\notherwise each pool connects to the entire set of other units.\nIf neither is 4D, then it is equivalent to OneToOne.", Fields: []types.Field{{Name: "NPools", Doc: "number of recv pools to connect (0 for entire number of pools in recv layer)"}, {Name: "SendStart", Doc: "starting pool index for sending connections"}, {Name: "RecvStart", Doc: "starting pool index for recv connections"}}}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.PoolRect", IDName: "pool-rect", Doc: "PoolRect implements a rectangular pattern of connectivity between\ntwo 4D layers, in terms of their pool-level shapes,\nwhere the lower-left corner moves in proportion to receiver\npool position with offset and multiplier factors (with wrap-around optionally).", Fields: []types.Field{{Name: "Size", Doc: "size of rectangle (of pools) in sending layer that each receiving unit receives from"}, {Name: "Start", Doc: "starting pool offset in sending layer, for computing the corresponding sending lower-left corner relative to given recv pool position"}, {Name: "Scale", Doc: "scaling to apply to receiving pool osition to compute corresponding position in sending layer of the lower-left corner of rectangle"}, {Name: "AutoScale", Doc: "auto-set the Scale as function of the relative pool sizes of send and recv layers (e.g., if sending layer is 2x larger than receiving, Scale = 2)"}, {Name: "RoundScale", Doc: "if true, use Round when applying scaling factor -- otherwise uses Floor which makes Scale work like a grouping factor -- e.g., .25 will effectively group 4 recv pools with same send position"}, {Name: "Wrap", Doc: "if true, connectivity wraps around all edges if it would otherwise go off the edge -- if false, then edges are clipped"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself"}, {Name: "RecvStart", Doc: "starting pool position in receiving layer -- if > 0 then pools below this starting point remain unconnected"}, {Name: "RecvN", Doc: "number of pools in receiving layer to connect -- if 0 then all (remaining after RecvStart) are connected -- otherwise if < remaining then those beyond this point remain unconnected"}}}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.PoolSameUnit", IDName: "pool-same-unit", Doc: "PoolSameUnit connects a given unit to the unit at the same index\nacross all the pools in a layer.\nPools are the outer-most two dimensions of a 4D layer shape.\nThis is most sensible when pools have same numbers of units in send and recv.\nThis is typically used for lateral topography-inducing connectivity\nand can also serve to 
reduce a pooled layer down to a single pool.\nThe logic works if either layer does not have pools.\nIf neither is 4D, then it is equivalent to OneToOne.", Fields: []types.Field{{Name: "SelfCon", Doc: "if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself"}}}) - -var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.PoolTile", IDName: "pool-tile", Doc: "PoolTile implements tiled 2D connectivity between pools within layers, where\na 2D rectangular receptive field (defined over pools, not units) is tiled\nacross the sending layer pools, with specified level of overlap.\nPools are the outer-most two dimensions of a 4D layer shape.\n2D layers are assumed to have 1x1 pool.\nThis is a standard form of convolutional connectivity, where pools are\nthe filters and the outer dims are locations filtered.\nVarious initial weight / scaling patterns are also available -- code\nmust specifically apply these to the receptive fields.", Fields: []types.Field{{Name: "Recip", Doc: "reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-projection or topo prjn when sending layer is larger than recv"}, {Name: "Size", Doc: "size of receptive field tile, in terms of pools on the sending layer"}, {Name: "Skip", Doc: "how many pools to skip in tiling over sending layer -- typically 1/2 of Size"}, {Name: "Start", Doc: "starting pool offset for lower-left corner of first receptive field in sending layer"}, {Name: "Wrap", Doc: "if true, pool coordinates wrap around sending shape -- otherwise truncated at edges, which can lead to assymmetries in connectivity etc"}, {Name: "GaussFull", Doc: "gaussian topographic weights / scaling parameters for full receptive field width. multiplies any other factors present"}, {Name: "GaussInPool", Doc: "gaussian topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for gaussian) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present"}, {Name: "SigFull", Doc: "sigmoidal topographic weights / scaling parameters for full receptive field width. left / bottom half have increasing sigmoids, and second half decrease. Multiplies any other factors present (only used if Gauss versions are not On!)"}, {Name: "SigInPool", Doc: "sigmoidal topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for sigmoid) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present (only used if Gauss versions are not On!). 
left / bottom half have increasing sigmoids, and second half decrease."}, {Name: "TopoRange", Doc: "min..max range of topographic weight values to generate"}}})
-
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.GaussTopo", IDName: "gauss-topo", Doc: "GaussTopo has parameters for Gaussian topographic weights or scaling factors", Fields: []types.Field{{Name: "On", Doc: "use gaussian topographic weights / scaling values"}, {Name: "Sigma", Doc: "gaussian sigma (width) in normalized units where entire distance across relevant dimension is 1.0 -- typical useful values range from .3 to 1.5, with .6 default"}, {Name: "Wrap", Doc: "wrap the gaussian around on other sides of the receptive field, with the closest distance being used -- this removes strict topography but ensures a more uniform distribution of weight values so edge units don't have weaker overall weights"}, {Name: "CtrMove", Doc: "proportion to move gaussian center relative to the position of the receiving unit within its pool: 1.0 = centers span the entire range of the receptive field. Typically want to use 1.0 for Wrap = true, and 0.8 for false"}}})
-
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.SigmoidTopo", IDName: "sigmoid-topo", Doc: "SigmoidTopo has parameters for Gaussian topographic weights or scaling factors", Fields: []types.Field{{Name: "On", Doc: "use gaussian topographic weights / scaling values"}, {Name: "Gain", Doc: "gain of sigmoid that determines steepness of curve, in normalized units where entire distance across relevant dimension is 1.0 -- typical useful values range from 0.01 to 0.1"}, {Name: "CtrMove", Doc: "proportion to move gaussian center relative to the position of the receiving unit within its pool: 1.0 = centers span the entire range of the receptive field. Typically want to use 1.0 for Wrap = true, and 0.8 for false"}}})
-
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.PoolTileSub", IDName: "pool-tile-sub", Doc: "PoolTileSub implements tiled 2D connectivity between pools within layers, where\na 2D rectangular receptive field (defined over pools, not units) is tiled\nacross the sending layer pools, with specified level of overlap.\nPools are the outer-most two dimensions of a 4D layer shape.\nSub version has sub-pools within each pool to encourage more independent\nrepresentations.\n2D layers are assumed to have 1x1 pool.\nThis is a standard form of convolutional connectivity, where pools are\nthe filters and the outer dims are locations filtered.\nVarious initial weight / scaling patterns are also available -- code\nmust specifically apply these to the receptive fields.", Fields: []types.Field{{Name: "Recip", Doc: "reciprocal topographic connectivity -- logic runs with recv <-> send -- produces symmetric back-projection or topo prjn when sending layer is larger than recv"}, {Name: "Size", Doc: "size of receptive field tile, in terms of pools on the sending layer"}, {Name: "Skip", Doc: "how many pools to skip in tiling over sending layer -- typically 1/2 of Size"}, {Name: "Start", Doc: "starting pool offset for lower-left corner of first receptive field in sending layer"}, {Name: "Subs", Doc: "number of sub-pools within each pool"}, {Name: "SendSubs", Doc: "sending layer has sub-pools"}, {Name: "Wrap", Doc: "if true, pool coordinates wrap around sending shape -- otherwise truncated at edges, which can lead to assymmetries in connectivity etc"}, {Name: "GaussFull", Doc: "gaussian topographic weights / scaling parameters for full receptive field width. multiplies any other factors present"}, {Name: "GaussInPool", Doc: "gaussian topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for gaussian) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present"}, {Name: "SigFull", Doc: "sigmoidal topographic weights / scaling parameters for full receptive field width. left / bottom half have increasing sigmoids, and second half decrease. Multiplies any other factors present (only used if Gauss versions are not On!)"}, {Name: "SigInPool", Doc: "sigmoidal topographic weights / scaling parameters within individual sending pools (i.e., unit positions within their parent pool drive distance for sigmoid) -- this helps organize / differentiate units more within pools, not just across entire receptive field. multiplies any other factors present (only used if Gauss versions are not On!). left / bottom half have increasing sigmoids, and second half decrease."}, {Name: "TopoRange", Doc: "min..max range of topographic weight values to generate"}}})
-
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.PoolUnifRnd", IDName: "pool-unif-rnd", Doc: "PoolUnifRnd implements random pattern of connectivity between pools within layers.\nPools are the outer-most two dimensions of a 4D layer shape.\nIf either layer does not have pools, PoolUnifRnd works as UnifRnd does.\nIf probability of connection (PCon) is 1, PoolUnifRnd works as PoolOnetoOne does.", Embeds: []types.Field{{Name: "PoolOneToOne"}, {Name: "UnifRnd"}}})
-
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.Rect", IDName: "rect", Doc: "Rect implements a rectangular pattern of connectivity between two layers\nwhere the lower-left corner moves in proportion to receiver position with offset\nand multiplier factors (with wrap-around optionally).\n4D layers are automatically flattened to 2D for this projection.", Fields: []types.Field{{Name: "Size", Doc: "size of rectangle in sending layer that each receiving unit receives from"}, {Name: "Start", Doc: "starting offset in sending layer, for computing the corresponding sending lower-left corner relative to given recv unit position"}, {Name: "Scale", Doc: "scaling to apply to receiving unit position to compute corresponding position in sending layer of the lower-left corner of rectangle"}, {Name: "AutoScale", Doc: "auto-set the Scale as function of the relative sizes of send and recv layers (e.g., if sending layer is 2x larger than receiving, Scale = 2)"}, {Name: "RoundScale", Doc: "if true, use Round when applying scaling factor -- otherwise uses Floor which makes Scale work like a grouping factor -- e.g., .25 will effectively group 4 recv units with same send position"}, {Name: "Wrap", Doc: "if true, connectivity wraps around all edges if it would otherwise go off the edge -- if false, then edges are clipped"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself"}, {Name: "Recip", Doc: "make the reciprocal of the specified connections -- i.e., symmetric for swapping recv and send"}, {Name: "RecvStart", Doc: "starting position in receiving layer -- if > 0 then units below this starting point remain unconnected"}, {Name: "RecvN", Doc: "number of units in receiving layer to connect -- if 0 then all (remaining after RecvStart) are connected -- otherwise if < remaining then those beyond this point remain unconnected"}}})
-
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/prjn.UnifRnd", IDName: "unif-rnd", Doc: "UnifRnd implements uniform random pattern of connectivity between two layers\nusing a permuted (shuffled) list for without-replacement randomness,\nand maintains its own local random number source and seed\nwhich are initialized if Rand == nil -- usually best to keep this\nspecific to each instance of a projection so it is fully reproducible\nand doesn't interfere with other random number streams.", Fields: []types.Field{{Name: "PCon", Doc: "probability of connection (0-1)"}, {Name: "SelfCon", Doc: "if true, and connecting layer to itself (self projection), then make a self-connection from unit to itself"}, {Name: "Recip", Doc: "reciprocal connectivity: if true, switch the sending and receiving layers to create a symmetric top-down projection -- ESSENTIAL to use same RndSeed between two prjns to ensure symmetry"}, {Name: "Rand", Doc: "random number source -- is created with its own separate source if nil"}, {Name: "RndSeed", Doc: "the current random seed -- will be initialized to a new random number from the global random stream when Rand is created."}}})
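The Rect docs above describe a positioning rule worth making concrete: the lower-left corner of each receiving unit's sending window is its own position scaled by Scale, plus the Start offset, wrapped or clipped at the edges. The following stand-alone sketch illustrates that rule under those documented assumptions only; the `sendStart` helper is hypothetical and is not part of the actual `path` package implementation.

```Go
package main

import (
	"fmt"
	"math"
)

// sendStart sketches the documented Rect rule: lower-left corner of the
// sending window = Floor(recvPos * scale) + start, with optional wrap-around.
// The clipping branch here is a simplification of the real edge handling.
func sendStart(recvPos int, scale float64, start, sendN int, wrap bool) int {
	p := int(math.Floor(float64(recvPos)*scale)) + start
	if wrap {
		return ((p % sendN) + sendN) % sendN // wrap around the edge
	}
	if p < 0 {
		p = 0
	}
	if p > sendN-1 {
		p = sendN - 1
	}
	return p
}

func main() {
	// Scale = .25 groups 4 recv units onto the same send position,
	// matching the Floor grouping behavior noted in the RoundScale doc.
	for r := 0; r < 8; r++ {
		fmt.Println(r, "->", sendStart(r, 0.25, 0, 16, true))
	}
}
```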
"Rand", Doc: "random number source -- is created with its own separate source if nil"}, {Name: "RndSeed", Doc: "the current random seed -- will be initialized to a new random number from the global random stream when Rand is created."}}}) diff --git a/timer/README.md b/timer/README.md deleted file mode 100644 index 867ca5fb..00000000 --- a/timer/README.md +++ /dev/null @@ -1,5 +0,0 @@ -Docs: [GoDoc](https://pkg.go.dev/github.com/emer/emergent/timer) - -Package timer provides a simple wall-clock duration timer based on standard time. Accumulates total and average over multiple Start / Stop intervals. - - diff --git a/timer/timer.go b/timer/timer.go deleted file mode 100644 index f99983f7..00000000 --- a/timer/timer.go +++ /dev/null @@ -1,85 +0,0 @@ -// Copyright (c) 2019, The Emergent Authors. All rights reserved. -// Use of this source code is governed by a BSD-style -// license that can be found in the LICENSE file. - -// Package timer provides a simple wall-clock duration timer based on standard -// time. Accumulates total and average over multiple Start / Stop intervals. -package timer - -//go:generate core generate -add-types - -import "time" - -// Time manages the timer accumulated time and count -type Time struct { - - // the most recent starting time - St time.Time - - // the total accumulated time - Total time.Duration - - // the number of start/stops - N int -} - -// Reset resets the overall accumulated Total and N counters and start time to zero -func (t *Time) Reset() { - t.St = time.Time{} - t.Total = 0 - t.N = 0 -} - -// Start starts the timer -func (t *Time) Start() { - t.St = time.Now() -} - -// ResetStart reset then start the timer -func (t *Time) ResetStart() { - t.Reset() - t.Start() -} - -// Stop stops the timer and accumulates the latest start - stop interval, and also returns it -func (t *Time) Stop() time.Duration { - if t.St.IsZero() { - t.Total = 0 - t.N = 0 - return 0 - } - iv := time.Now().Sub(t.St) - t.Total += iv - t.N++ - return iv -} - -// Avg returns the average start / stop interval (assumes each was measuring the same thing). -func (t *Time) Avg() time.Duration { - if t.N == 0 { - return 0 - } - return t.Total / time.Duration(t.N) -} - -// AvgSecs returns the average start / stop interval (assumes each was measuring the same thing) -// as a float64 of seconds -func (t *Time) AvgSecs() float64 { - if t.N == 0 { - return 0 - } - return float64(t.Total) / (float64(t.N) * float64(time.Second)) -} - -// AvgMSecs returns the average start / stop interval as a float64 of milliseconds -func (t *Time) AvgMSecs() float64 { - if t.N == 0 { - return 0 - } - return float64(t.Total) / (float64(t.N) * float64(time.Millisecond)) -} - -// TotalSecs returns the total start / stop intervals as a float64 of seconds. -func (t *Time) TotalSecs() float64 { - return float64(t.Total) / float64(time.Second) -} diff --git a/timer/typegen.go b/timer/typegen.go deleted file mode 100644 index b3902dc9..00000000 --- a/timer/typegen.go +++ /dev/null @@ -1,9 +0,0 @@ -// Code generated by "core generate -add-types"; DO NOT EDIT. 
diff --git a/weights/cpp.go b/weights/cpp.go
index 44df86ae..0072ab2f 100644
--- a/weights/cpp.go
+++ b/weights/cpp.go
@@ -19,7 +19,7 @@ func NetReadCpp(r io.Reader) (*Network, error) {
 	nw := &Network{}
 	var (
 		lw *Layer
-		pw *Prjn
+		pw *Path
 		rw *Recv
 		ri int
 		pi int
@@ -74,10 +74,10 @@ func NetReadCpp(r io.Reader) (*Network, error) {
 				errlist = append(errlist, err)
 			}
 			fm := strings.TrimPrefix(css[1], "From:")
-			if len(lw.Prjns) < pi+1 {
-				lw.Prjns = append(lw.Prjns, Prjn{From: fm})
+			if len(lw.Paths) < pi+1 {
+				lw.Paths = append(lw.Paths, Path{From: fm})
 			}
-			pw = &lw.Prjns[pi]
+			pw = &lw.Paths[pi]
 			continue
 		case strings.HasPrefix(bs, "")
diff --git a/weights/json.go b/weights/json.go
index ba16c93c..41003e71 100644
--- a/weights/json.go
+++ b/weights/json.go
@@ -42,9 +42,9 @@ func LayReadJSON(r io.Reader) (*Layer, error) {
 	return lw, nil
 }
 
-// PrjnReadJSON reads weights for prjn in a JSON format into Prjn structure
-func PrjnReadJSON(r io.Reader) (*Prjn, error) {
-	pw := &Prjn{}
+// PathReadJSON reads weights for path in a JSON format into Path structure
+func PathReadJSON(r io.Reader) (*Path, error) {
+	pw := &Path{}
 	dec := json.NewDecoder(r)
 	err := dec.Decode(pw) // this is way to do it on reader instead of bytes
 	if err == io.EOF {
diff --git a/weights/json_test.go b/weights/json_test.go
index fb7137ae..3a01f94d 100644
--- a/weights/json_test.go
+++ b/weights/json_test.go
@@ -30,8 +30,8 @@ func TestSaveWts(t *testing.T) {
 		un[i] = rand.Float32()
 	}
 	l1.Units["TrgAvg"] = un
-	l1.Prjns = make([]Prjn, 1)
-	pj := &l1.Prjns[0]
+	l1.Paths = make([]Path, 1)
+	pj := &l1.Paths[0]
 	pj.From = "Input"
 	pj.SetMetaData("GScale", "0.333")
 	pj.Rs = make([]Recv, 3)
diff --git a/weights/typegen.go b/weights/typegen.go
index 372eebf2..e4c98f9a 100644
--- a/weights/typegen.go
+++ b/weights/typegen.go
@@ -8,8 +8,8 @@ import (
 var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Network", IDName: "network", Doc: "Network is temp structure for holding decoded weights", Directives: []types.Directive{{Tool: "go", Directive: "generate", Args: []string{"core", "generate", "-add-types"}}}, Fields: []types.Field{{Name: "Network"}, {Name: "MetaData"}, {Name: "Layers"}}})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Layer", IDName: "layer", Doc: "Layer is temp structure for holding decoded weights, one for each layer", Fields: []types.Field{{Name: "Layer"}, {Name: "MetaData"}, {Name: "Units"}, {Name: "Prjns"}}})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Layer", IDName: "layer", Doc: "Layer is temp structure for holding decoded weights, one for each layer", Fields: []types.Field{{Name: "Layer"}, {Name: "MetaData"}, {Name: "Units"}, {Name: "Paths"}}})
 
-var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Prjn", IDName: "prjn", Doc: "Prjn is temp structure for holding decoded weights, one for each projection", Fields: []types.Field{{Name: "From"}, {Name: "MetaData"}, {Name: "MetaValues"}, {Name: "Rs"}}})
+var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Path", IDName: "path", Doc: "Path is temp structure for holding decoded weights, one for each pathway", Fields: []types.Field{{Name: "From"}, {Name: "MetaData"}, {Name: "MetaValues"}, {Name: "Rs"}}})
 
 var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Recv", IDName: "recv", Doc: "Recv is temp structure for holding decoded weights, one for each recv unit", Fields: []types.Field{{Name: "Ri"}, {Name: "N"}, {Name: "Si"}, {Name: "Wt"}, {Name: "Wt1"}, {Name: "Wt2"}}})
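A usage sketch for the renamed decoder: PathReadJSON fills a Path from a JSON stream, with keys matching the Path and Recv struct fields shown in this diff. The import path is assumed to be the v2 module path, and the JSON literal is illustrative only.

```Go
package main

import (
	"fmt"
	"strings"

	"github.com/emer/emergent/v2/weights" // assumed v2 module path
)

func main() {
	// One pathway from "Input" with a single receiving unit that has two
	// synapses; keys follow the Path / Recv struct fields (From, Rs, Ri,
	// N, Si, Wt), which carry no custom json tags.
	js := `{"From": "Input", "Rs": [{"Ri": 0, "N": 2, "Si": [0, 1], "Wt": [0.5, 0.25]}]}`
	pw, err := weights.PathReadJSON(strings.NewReader(js))
	if err != nil {
		fmt.Println("decode error:", err)
		return
	}
	fmt.Println(pw.From, pw.Rs[0].Wt) // Input [0.5 0.25]
}
```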
"Path is temp structure for holding decoded weights, one for each pathway", Fields: []types.Field{{Name: "From"}, {Name: "MetaData"}, {Name: "MetaValues"}, {Name: "Rs"}}}) var _ = types.AddType(&types.Type{Name: "github.com/emer/emergent/v2/weights.Recv", IDName: "recv", Doc: "Recv is temp structure for holding decoded weights, one for each recv unit", Fields: []types.Field{{Name: "Ri"}, {Name: "N"}, {Name: "Si"}, {Name: "Wt"}, {Name: "Wt1"}, {Name: "Wt2"}}}) diff --git a/weights/wts.go b/weights/wts.go index 9f03dd8f..0dbcab81 100644 --- a/weights/wts.go +++ b/weights/wts.go @@ -25,7 +25,7 @@ type Layer struct { Layer string MetaData map[string]string // for optional layer-level params, metadata such as ActMAvg, ActPAvg Units map[string][]float32 // for unit-level adapting parameters - Prjns []Prjn // receiving projections + Paths []Path // receiving pathways } func (ly *Layer) SetMetaData(key, val string) { @@ -35,15 +35,15 @@ func (ly *Layer) SetMetaData(key, val string) { ly.MetaData[key] = val } -// Prjn is temp structure for holding decoded weights, one for each projection -type Prjn struct { +// Path is temp structure for holding decoded weights, one for each pathway +type Path struct { From string - MetaData map[string]string // used for optional prjn-level params, metadata such as GScale - MetaValues map[string][]float32 // optional values at the projection level + MetaData map[string]string // used for optional path-level params, metadata such as GScale + MetaValues map[string][]float32 // optional values at the pathway level Rs []Recv } -func (pj *Prjn) SetMetaData(key, val string) { +func (pj *Path) SetMetaData(key, val string) { if pj.MetaData == nil { pj.MetaData = make(map[string]string) }