From 56bc5aa305b458d82fc308895c1c793cf8c3f639 Mon Sep 17 00:00:00 2001
From: David Ray
Date: Sat, 10 Dec 2016 06:26:03 -0600
Subject: [PATCH 01/52] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index e3f91cac..ed39c90b 100644
--- a/README.md
+++ b/README.md
@@ -75,7 +75,7 @@ See the blog: [Join the Cogmission](http://www.cogmission.ai)
 
 | Core Algorithm | NuPIC Date |HTM.Java Date | Latest NuPIC SHA | Latest HTM.Java SHA | Status|
 | --------------- |:-------------:|:------------:|:----------------:|:-------------------:|:-----:|
-| SpatialPooler | 2016-10-05 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/7e77ecba4ffdd4991cfd87972de6211101e6661e)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| Sync'd*
+| SpatialPooler | 2016-10-05 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/7e77ecba4ffdd4991cfd87972de6211101e6661e)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [Pending NuPIC #3411*](https://github.com/numenta/nupic/pull/3411)
 | TemporalMemory | 2016-09-23 | 2016-10-13 |[commit](https://github.com/numenta/nupic/commit/1036f25e7223471d72cebc536d6734f78d37b6c7)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| Sync'd*
 
 \* May be one of: "Sync'd" or "Behind". "Behind" expresses a temporary lapse in synchronization while devs are implementing new changes.
From 1f6bcdc25285812615f516416b677b736994f69e Mon Sep 17 00:00:00 2001
From: David Ray
Date: Fri, 23 Dec 2016 12:42:15 -0600
Subject: [PATCH 02/52] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index ed39c90b..67a74b79 100644
--- a/README.md
+++ b/README.md
@@ -75,7 +75,7 @@ See the blog: [Join the Cogmission](http://www.cogmission.ai)
 
 | Core Algorithm | NuPIC Date |HTM.Java Date | Latest NuPIC SHA | Latest HTM.Java SHA | Status|
 | --------------- |:-------------:|:------------:|:----------------:|:-------------------:|:-----:|
-| SpatialPooler | 2016-10-05 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/7e77ecba4ffdd4991cfd87972de6211101e6661e)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [Pending NuPIC #3411*](https://github.com/numenta/nupic/pull/3411)
+| SpatialPooler | 2016-10-05 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/5c3edead9526d3b5fb6a4f37ad9d38cdcf32f5ff)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [*Behind NuPIC Merge #3411](https://github.com/numenta/nupic/pull/3411)
 | TemporalMemory | 2016-09-23 | 2016-10-13 |[commit](https://github.com/numenta/nupic/commit/1036f25e7223471d72cebc536d6734f78d37b6c7)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| Sync'd*
 
 \* May be one of: "Sync'd" or "Behind". "Behind" expresses a temporary lapse in synchronization while devs are implementing new changes.
From f299748a5aa3e6f8b75f6ad9d7fdc3aa8686235a Mon Sep 17 00:00:00 2001
From: David Ray
Date: Fri, 23 Dec 2016 12:43:46 -0600
Subject: [PATCH 03/52] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 67a74b79..61bdf54a 100644
--- a/README.md
+++ b/README.md
@@ -75,7 +75,7 @@ See the blog: [Join the Cogmission](http://www.cogmission.ai)
 
 | Core Algorithm | NuPIC Date |HTM.Java Date | Latest NuPIC SHA | Latest HTM.Java SHA | Status|
 | --------------- |:-------------:|:------------:|:----------------:|:-------------------:|:-----:|
-| SpatialPooler | 2016-10-05 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/5c3edead9526d3b5fb6a4f37ad9d38cdcf32f5ff)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [*Behind NuPIC Merge #3411](https://github.com/numenta/nupic/pull/3411)
+| SpatialPooler | 2016-12-11 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/5c3edead9526d3b5fb6a4f37ad9d38cdcf32f5ff)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [*Behind NuPIC Merge #3411](https://github.com/numenta/nupic/pull/3411)
 | TemporalMemory | 2016-09-23 | 2016-10-13 |[commit](https://github.com/numenta/nupic/commit/1036f25e7223471d72cebc536d6734f78d37b6c7)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| Sync'd*
 
 \* May be one of: "Sync'd" or "Behind". "Behind" expresses a temporary lapse in synchronization while devs are implementing new changes.
From df92d0df30708e0af80861714c606112e1903502 Mon Sep 17 00:00:00 2001
From: David Ray
Date: Wed, 8 Feb 2017 09:51:08 -0600
Subject: [PATCH 04/52] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 61bdf54a..8b6acabe 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-# ![Numenta Logo](http://numenta.org/images/numenta-icon128.png)
+# ![Numenta Logo](http://metaware.us/nupic/1039191.png)
 
 # htm.java
From 52bc9f99068587ad45a9c8a916cadc189eedaba3 Mon Sep 17 00:00:00 2001 From: David Ray Date: Wed, 8 Feb 2017 09:55:14 -0600 Subject: [PATCH 05/52] Update README.md --- README.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index 8b6acabe..0e838b73 100644 --- a/README.md +++ b/README.md @@ -3,7 +3,9 @@
-[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai) [![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join +[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai) + +[![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/numenta/htm.java?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) 
[![OpenHub](https://www.openhub.net/p/htm-java/widgets/project_thin_badge.gif)](https://www.openhub.net/p/htm-java)
From 2280ce7d4da1a96cb3b9a49e99f151dc3d1fea27 Mon Sep 17 00:00:00 2001 From: David Ray Date: Wed, 8 Feb 2017 10:05:52 -0600 Subject: [PATCH 06/52] Update README.md --- README.md | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/README.md b/README.md index 0e838b73..ff9688ae 100644 --- a/README.md +++ b/README.md @@ -3,9 +3,7 @@
-[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai) - -[![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join +[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai)[![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/numenta/htm.java?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) 
[![OpenHub](https://www.openhub.net/p/htm-java/widgets/project_thin_badge.gif)](https://www.openhub.net/p/htm-java)
From 55662fc59e635976f4adfd290bf9d8c221cd8d26 Mon Sep 17 00:00:00 2001 From: David Ray Date: Wed, 8 Feb 2017 10:07:47 -0600 Subject: [PATCH 07/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index ff9688ae..8b6acabe 100644 --- a/README.md +++ b/README.md @@ -3,7 +3,7 @@
-[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai)[![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join +[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai) [![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/numenta/htm.java?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) 
[![OpenHub](https://www.openhub.net/p/htm-java/widgets/project_thin_badge.gif)](https://www.openhub.net/p/htm-java)
From 3a58776fd03b3b302982799d20a6afe113fc4af5 Mon Sep 17 00:00:00 2001
From: David Ray
Date: Wed, 8 Feb 2017 11:30:45 -0600
Subject: [PATCH 08/52] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 8b6acabe..3a73cf68 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-# ![Numenta Logo](http://metaware.us/nupic/1039191.png)
+# NuPIC Logo
 
 # htm.java
From 6c32139d479ed359c7076de553b7d9478c384506 Mon Sep 17 00:00:00 2001 From: Matthew Taylor Date: Tue, 14 Feb 2017 06:46:32 -0800 Subject: [PATCH 09/52] Updated location of Java docs in readme --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 3a73cf68..96ec64fe 100644 --- a/README.md +++ b/README.md @@ -51,7 +51,7 @@ _**NOTE: Minimum JavaSE version is 8**_ For a more detailed discussion of htm.java see:
* [htm.java Wiki](https://github.com/numenta/htm.java/wiki) -* [Java Docs](http://numenta.org/docs/htm.java/) +* [Java Docs](http://numenta.github.io/htm.java/) See the [Test Coverage Reports](https://coveralls.io/jobs/4164658) - For more information on where you can contribute! Extend the tests and get your name in bright lights! From 75407f8fd57f9df1682492a5ec667b14e249fbf4 Mon Sep 17 00:00:00 2001 From: Hopding Date: Sun, 19 Feb 2017 12:32:04 -0600 Subject: [PATCH 10/52] preliminary stage --- .../java/org/numenta/nupic/Parameters.java | 5 +- .../nupic/algorithms/CLAClassifier.java | 2 +- .../numenta/nupic/algorithms/Classifier.java | 15 +++++ .../nupic/algorithms/SDRClassifier.java | 2 +- .../java/org/numenta/nupic/network/Layer.java | 61 +++++++++++++------ .../org/numenta/nupic/network/Region.java | 3 + 6 files changed, 65 insertions(+), 23 deletions(-) create mode 100644 src/main/java/org/numenta/nupic/algorithms/Classifier.java diff --git a/src/main/java/org/numenta/nupic/Parameters.java b/src/main/java/org/numenta/nupic/Parameters.java index ba623a86..001f8160 100644 --- a/src/main/java/org/numenta/nupic/Parameters.java +++ b/src/main/java/org/numenta/nupic/Parameters.java @@ -32,6 +32,7 @@ import java.util.Random; import java.util.Set; +import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.model.Cell; @@ -417,8 +418,8 @@ public static enum KEY { // Network Layer indicator for auto classifier generation AUTO_CLASSIFY("hasClassifiers", Boolean.class), - - + INFERRED_FIELDS("inferredFields", Map.class), // Map Classification compute(int recordNum, + Map classification, + int[] patternNZ, + boolean learn, + boolean infer); +} diff --git a/src/main/java/org/numenta/nupic/algorithms/SDRClassifier.java b/src/main/java/org/numenta/nupic/algorithms/SDRClassifier.java index 57a94887..d36cbabd 100644 --- 
a/src/main/java/org/numenta/nupic/algorithms/SDRClassifier.java +++ b/src/main/java/org/numenta/nupic/algorithms/SDRClassifier.java @@ -96,7 +96,7 @@ * @author David Ray * @author Andrew Dillon */ -public class SDRClassifier implements Persistable { +public class SDRClassifier implements Persistable, Classifier { private static final long serialVersionUID = 1L; int verbosity = 0; diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index e8fa3fb2..9a79745e 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -1694,25 +1694,36 @@ private void doEncoderBucketMapping(Inference inference, Map enc // Store the encoding int[] encoding = inference.getEncoding(); - for(EncoderTuple t : encoderTuples) { - String name = t.getName(); - Encoder e = t.getEncoder(); - - int bucketIdx = -1; - Object o = encoderInputMap.get(name); - if(DateTime.class.isAssignableFrom(o.getClass())) { - bucketIdx = ((DateEncoder)e).getBucketIndices((DateTime)o)[0]; - } else if(Number.class.isAssignableFrom(o.getClass())) { - bucketIdx = e.getBucketIndices((double)o)[0]; - } else { - bucketIdx = e.getBucketIndices((String)o)[0]; - } - - int offset = t.getOffset(); - int[] tempArray = new int[e.getWidth()]; - System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); - - inference.getClassifierInput().put(name, new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, name, o, bucketIdx, tempArray)); + // TODO: 21: Looks like this is where the classifierInput(s) are set. + // Should probably change this so that instead of adding a mapping for + // each encoder, it adds a mapping for each field specified by the user + // in the new Parameters KEY. 
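Patch 10 above extracts a common `Classifier` interface so that `CLAClassifier` and `SDRClassifier` can be swapped behind one `compute(...)` entry point. The following is a minimal, self-contained sketch of that shape; the `Classification` result type and `StubClassifier` below are simplified stand-ins for illustration, not htm.java's actual classes.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for htm.java's Classification<T> result object.
class Classification {
    final Map<Integer, Double> bucketLikelihoods = new HashMap<>();
}

// Sketch of the common interface the patch introduces: both CLA and SDR
// classifiers expose the same compute(...) signature.
interface Classifier {
    Classification compute(int recordNum, Map<String, Object> classification,
                           int[] patternNZ, boolean learn, boolean infer);
}

// Trivial implementation used only to demonstrate call-site polymorphism.
class StubClassifier implements Classifier {
    @Override
    public Classification compute(int recordNum, Map<String, Object> classification,
                                  int[] patternNZ, boolean learn, boolean infer) {
        Classification result = new Classification();
        // Pretend the first active bit predicts its own bucket with certainty.
        if (infer && patternNZ.length > 0) {
            result.bucketLikelihoods.put(patternNZ[0], 1.0);
        }
        return result;
    }
}

public class ClassifierSketch {
    public static void main(String[] args) {
        Classifier c = new StubClassifier(); // could equally be CLA- or SDR-backed
        Map<String, Object> input = new HashMap<>();
        input.put("bucketIdx", 4);
        Classification out = c.compute(0, input, new int[] {4, 9, 13}, true, true);
        System.out.println(out.bucketLikelihoods.get(4));
    }
}
```

In htm.java itself, `compute()` receives the record number, a map describing the actual input value and bucket index, and the active-bit indices (`patternNZ`); the stub only illustrates the polymorphism the interface enables.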
+// for(EncoderTuple t : encoderTuples) { +// String name = t.getName(); +// Encoder e = t.getEncoder(); +// +// int bucketIdx = -1; +// Object o = encoderInputMap.get(name); +// if(DateTime.class.isAssignableFrom(o.getClass())) { +// bucketIdx = ((DateEncoder)e).getBucketIndices((DateTime)o)[0]; +// } else if(Number.class.isAssignableFrom(o.getClass())) { +// bucketIdx = e.getBucketIndices((double)o)[0]; +// } else { +// bucketIdx = e.getBucketIndices((String)o)[0]; +// } +// +// int offset = t.getOffset(); +// int[] tempArray = new int[e.getWidth()]; +// System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); +// +// inference.getClassifierInput().put(name, new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, name, o, bucketIdx, tempArray)); +// } + + // Get fields user wants encoded from Parameters + Map inferredFields = (Map)params.get(KEY.INFERRED_FIELDS); + for(Map.Entry entry : inferredFields.entrySet()) { + String name = entry.getKey(); + EncoderTuple encoderTuple = encoderTuples.get } } @@ -1908,7 +1919,15 @@ private void clearSubscriberObserverLists() { * @param encoder * @return */ + // TODO: 21: This creates two parallel arrays, one of encoder's names, and + // the other of each encoder's classifier. Returns a NamedTuple making use of + // these arrays easier. NamedTuple makeClassifiers(MultiEncoder encoder) { + // TODO: 21: Should be able to inspect new Parameters KEY(s) in here + // TODO: 21: to adjust types of Classifiers that are created/used. + // Looks like the passed-in MultiEncoder is a single wrapper containing + // multiple encoders; one for each field. Right now, a classifier is + // created for each of those encoders. 
However, instead of do String[] names = new String[encoder.getEncoders(encoder).size()]; CLAClassifier[] ca = new CLAClassifier[names.length]; int i = 0; @@ -2321,6 +2340,10 @@ public Object get(Object o) { @Override public ManualInput call(ManualInput t1) { + // TODO: 21: Should only need to change this code to use the + // new Classifier interface. But will need to change what is + // returned by t1.getClassifierInput() to only pay attention + // to fields being classifier based on new Parameters KEY Map ci = t1.getClassifierInput(); int recordNum = getRecordNum(); for(String key : ci.keySet()) { diff --git a/src/main/java/org/numenta/nupic/network/Region.java b/src/main/java/org/numenta/nupic/network/Region.java index f01d2246..797e19b1 100644 --- a/src/main/java/org/numenta/nupic/network/Region.java +++ b/src/main/java/org/numenta/nupic/network/Region.java @@ -457,6 +457,9 @@ Region connect(Region inputRegion) { @Override public void onError(Throwable e) { e.printStackTrace(); } @SuppressWarnings("unchecked") @Override public void onNext(Inference i) { + // TODO: 21: This is where classifierInput is set. Need to change + // it to respect only fields user has specified for classification + // with the new Parameters KEY. 
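The TODOs above sketch the goal of restricting classifier input to user-specified fields, which patch 11 implements by looking up each inferred field's encoder tuple with a stream filter and copying that encoder's slice out of the concatenated encoding. Here is a self-contained sketch of that lookup-and-slice idea; the field names, bit widths, and the `inferredFields` map are hypothetical examples, and `findFirst()` is used in place of the patch's `collect(Collectors.toList()).get(0)`.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

public class InferredFieldsSketch {

    // Minimal stand-in for htm.java's EncoderTuple: a named slice
    // (offset + width) of the concatenated encoder output.
    static final class EncoderSlice {
        final String name; final int offset; final int width;
        EncoderSlice(String name, int offset, int width) {
            this.name = name; this.offset = offset; this.width = width;
        }
    }

    public static void main(String[] args) {
        // Hypothetical 10-bit concatenated encoding of two fields.
        int[] encoding = {0, 1, 1, 0, 0, 0, 0, 1, 1, 0};
        List<EncoderSlice> slices = Arrays.asList(
                new EncoderSlice("timestamp", 0, 4),
                new EncoderSlice("consumption", 4, 6));

        // Analogue of the new KEY.INFERRED_FIELDS parameter: only the
        // fields named here are handed to a classifier.
        Map<String, String> inferredFields = new LinkedHashMap<>();
        inferredFields.put("consumption", "SDRClassifier");

        for (String fieldName : inferredFields.keySet()) {
            // The per-field lookup the patch performs with a stream filter;
            // findFirst() replaces collect(Collectors.toList()).get(0).
            Optional<EncoderSlice> match = slices.stream()
                    .filter(s -> s.name.equals(fieldName))
                    .findFirst();
            EncoderSlice slice = match.orElseThrow(
                    () -> new IllegalStateException("No encoder for " + fieldName));

            // Copy just this field's bits, as doEncoderBucketMapping() does
            // with System.arraycopy over the encoder's offset and width.
            int[] fieldBits = new int[slice.width];
            System.arraycopy(encoding, slice.offset, fieldBits, 0, fieldBits.length);
            System.out.println(fieldName + " -> " + Arrays.toString(fieldBits));
        }
    }
}
```

Fields absent from the map are simply never visited, which is the whole point of the parameter: classification cost is paid only for fields the user asked to infer.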
localInf.sdr(i.getSDR()).recordNum(i.getRecordNum()).classifierInput(i.getClassifierInput()).layerInput(i.getSDR()); if(i.getSDR().length > 0) { ((Layer)tail).compute(localInf); From 67620cc49b298cd179661c4c6b36b88ce34ccba2 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sun, 19 Feb 2017 14:00:04 -0600 Subject: [PATCH 11/52] Updated doEncoderBucketMapping() method to use INFERRED_FIELDS param --- .../java/org/numenta/nupic/network/Layer.java | 51 +++++++++++++++---- 1 file changed, 40 insertions(+), 11 deletions(-) diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index 9a79745e..1577f7b3 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -31,6 +31,7 @@ import java.util.Map; import java.util.Set; import java.util.concurrent.ConcurrentLinkedQueue; +import java.util.stream.Collectors; import org.joda.time.DateTime; import org.numenta.nupic.FieldMetaType; @@ -231,7 +232,7 @@ public class Layer implements Persistable { private boolean hasGenericProcess; /** - * List of {@link Encoders} used when storing bucket information see + * List of {@link Encoder}s used when storing bucket information see * {@link #doEncoderBucketMapping(Inference, Map)} */ private List encoderTuples; @@ -1048,7 +1049,7 @@ public void start() { /** * Restarts this {@code Layer} * - * {@link #restart()} is to be called after a call to {@link #halt()}, to begin + * {@link #restart} is to be called after a call to {@link #halt()}, to begin * processing again. The {@link Network} will continue from where it previously * left off after the last call to halt(). * @@ -1180,7 +1181,7 @@ public Set getPredictiveCells() { } /** - * Returns the previous predictive {@link Cells} + * Returns the previous predictive {@link Cell}s * * @return the binary vector representing the current prediction. */ @@ -1472,7 +1473,7 @@ void notifyError(Exception e) { *

*

* If any algorithms are repeated then {@link Inference}s will - * NOT be shared between layers. {@link Regions} + * NOT be shared between layers. {@link Region}s * NEVER share {@link Inference}s *

* @@ -1657,7 +1658,7 @@ private Observable resolveObservableSequence(T t) { /** * Executes the check point logic, handles the return of the serialized byte array - * by delegating the call to {@link rx.Observer#onNext(byte[])} of all the currently queued + * by delegating the call to {@link rx.Observer#onNext}(byte[]) of all the currently queued * Observers; then clears the list of Observers. */ private void doCheckPoint() { @@ -1721,9 +1722,38 @@ private void doEncoderBucketMapping(Inference inference, Map enc // Get fields user wants encoded from Parameters Map inferredFields = (Map)params.get(KEY.INFERRED_FIELDS); + + // Store a NamedTuple for each of those fields for(Map.Entry entry : inferredFields.entrySet()) { - String name = entry.getKey(); - EncoderTuple encoderTuple = encoderTuples.get + String fieldName = entry.getKey(); // Name of encoder input field + EncoderTuple encoderTuple = encoderTuples.stream() // Get the EncoderTuple for this input field + .filter(e -> e.getName().equals(fieldName)) + .collect(Collectors.toList()) + .get(0); + Encoder e = encoderTuple.getEncoder(); + + int bucketIdx = -1; + Object o = encoderInputMap.get(name); + if(DateTime.class.isAssignableFrom(o.getClass())) { + bucketIdx = ((DateEncoder)e).getBucketIndices((DateTime)o)[0]; + } else if(Number.class.isAssignableFrom(o.getClass())) { + bucketIdx = e.getBucketIndices((double)o)[0]; + } else { + bucketIdx = e.getBucketIndices((String)o)[0]; + } + + int offset = encoderTuple.getOffset(); + int[] tempArray = new int[e.getWidth()]; + System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); + + inference.getClassifierInput().put( + name, + new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, + name, + o, + bucketIdx, + tempArray + )); } } @@ -1809,9 +1839,9 @@ private Observable fillInOrderedSequence(Observable o) /** * Called internally to create a subscription on behalf of the specified - * {@link LayerObserver} + * Layer {@link Observer} * - 
* @param sub the LayerObserver (subscriber). + * @param sub the Layer Observer (subscriber). * @return */ private Subscription createSubscription(final Observer sub) { @@ -2033,8 +2063,7 @@ public void run() { * that stores the state of this {@code Network} while keeping the Network up and running. * The Network will be stored at the pre-configured location (in binary form only, not JSON). * - * @param network the {@link Network} to check point. - * @return the {@link CheckPointOp} operator + * @return the {@link CheckPointOp} operator */ @SuppressWarnings("unchecked") CheckPointOp getCheckPointOperator() { From 6e9bc600a2ccbd5d5b94e7f768e6ff1b6a33e508 Mon Sep 17 00:00:00 2001 From: cogmission Date: Tue, 28 Feb 2017 02:58:35 -0600 Subject: [PATCH 12/52] Remove tabs from file, neaten appearance --- .../numenta/nupic/encoders/ScalarEncoder.java | 1198 ++++++++--------- 1 file changed, 599 insertions(+), 599 deletions(-) diff --git a/src/main/java/org/numenta/nupic/encoders/ScalarEncoder.java b/src/main/java/org/numenta/nupic/encoders/ScalarEncoder.java index b62d9ed7..7e29e842 100644 --- a/src/main/java/org/numenta/nupic/encoders/ScalarEncoder.java +++ b/src/main/java/org/numenta/nupic/encoders/ScalarEncoder.java @@ -159,266 +159,266 @@ */ public class ScalarEncoder extends Encoder { - private static final long serialVersionUID = 1L; - - private static final Logger LOGGER = LoggerFactory.getLogger(ScalarEncoder.class); - - /** - * Constructs a new {@code ScalarEncoder} - */ - ScalarEncoder() {} - - /** - * Returns a builder for building ScalarEncoders. 
- * This builder may be reused to produce multiple builders - * - * @return a {@code ScalarEncoder.Builder} - */ - public static Encoder.Builder builder() { - return new ScalarEncoder.Builder(); - } - - /** - * Returns true if the underlying encoder works on deltas - */ - @Override - public boolean isDelta() { - return false; - } - - /** - * w -- number of bits to set in output + private static final long serialVersionUID = 1L; + + private static final Logger LOGGER = LoggerFactory.getLogger(ScalarEncoder.class); + + /** + * Constructs a new {@code ScalarEncoder} + */ + ScalarEncoder() {} + + /** + * Returns a builder for building ScalarEncoders. + * This builder may be reused to produce multiple builders + * + * @return a {@code ScalarEncoder.Builder} + */ + public static Encoder.Builder builder() { + return new ScalarEncoder.Builder(); + } + + /** + * Returns true if the underlying encoder works on deltas + */ + @Override + public boolean isDelta() { + return false; + } + + /** + * w -- number of bits to set in output * minval -- minimum input value * maxval -- maximum input value (input is strictly less if periodic == True) - * + * * Exactly one of n, radius, resolution must be set. "0" is a special * value that means "not set". 
- * + * * n -- number of bits in the representation (must be > w) * radius -- inputs separated by more than, or equal to this distance will have non-overlapping * representations * resolution -- inputs separated by more than, or equal to this distance will have different * representations - * + * * name -- an optional string which will become part of the description - * + * * clipInput -- if true, non-periodic inputs smaller than minval or greater * than maxval will be clipped to minval/maxval - * + * * forced -- if true, skip some safety checks (for compatibility reasons), default false - */ - public void init() { - if(getW() % 2 == 0) { - throw new IllegalStateException( - "W must be an odd number (to eliminate centering difficulty)"); - } - - setHalfWidth((getW() - 1) / 2); - - // For non-periodic inputs, padding is the number of bits "outside" the range, - // on each side. I.e. the representation of minval is centered on some bit, and - // there are "padding" bits to the left of that centered bit; similarly with - // bits to the right of the center bit of maxval - setPadding(isPeriodic() ? 0 : getHalfWidth()); - - if(!Double.isNaN(getMinVal()) && !Double.isNaN(getMaxVal())) { - if(getMinVal() >= getMaxVal()) { - throw new IllegalStateException("maxVal must be > minVal"); - } - setRangeInternal(getMaxVal() - getMinVal()); - } - - // There are three different ways of thinking about the representation. Handle - // each case here. 
- initEncoder(getW(), getMinVal(), getMaxVal(), getN(), getRadius(), getResolution()); - - //nInternal represents the output area excluding the possible padding on each side - setNInternal(getN() - 2 * getPadding()); - - if(getName() == null) { - if((getMinVal() % ((int)getMinVal())) > 0 || - (getMaxVal() % ((int)getMaxVal())) > 0) { - setName("[" + getMinVal() + ":" + getMaxVal() + "]"); - }else{ - setName("[" + (int)getMinVal() + ":" + (int)getMaxVal() + "]"); - } - } - - //Checks for likely mistakes in encoder settings - if(!isForced()) { - checkReasonableSettings(); - } + */ + public void init() { + if(getW() % 2 == 0) { + throw new IllegalStateException( + "W must be an odd number (to eliminate centering difficulty)"); + } + + setHalfWidth((getW() - 1) / 2); + + // For non-periodic inputs, padding is the number of bits "outside" the range, + // on each side. I.e. the representation of minval is centered on some bit, and + // there are "padding" bits to the left of that centered bit; similarly with + // bits to the right of the center bit of maxval + setPadding(isPeriodic() ? 0 : getHalfWidth()); + + if(!Double.isNaN(getMinVal()) && !Double.isNaN(getMaxVal())) { + if(getMinVal() >= getMaxVal()) { + throw new IllegalStateException("maxVal must be > minVal"); + } + setRangeInternal(getMaxVal() - getMinVal()); + } + + // There are three different ways of thinking about the representation. Handle + // each case here. 
+ initEncoder(getW(), getMinVal(), getMaxVal(), getN(), getRadius(), getResolution()); + + //nInternal represents the output area excluding the possible padding on each side + setNInternal(getN() - 2 * getPadding()); + + if(getName() == null) { + if((getMinVal() % ((int)getMinVal())) > 0 || + (getMaxVal() % ((int)getMaxVal())) > 0) { + setName("[" + getMinVal() + ":" + getMaxVal() + "]"); + }else{ + setName("[" + (int)getMinVal() + ":" + (int)getMaxVal() + "]"); + } + } + + //Checks for likely mistakes in encoder settings + if(!isForced()) { + checkReasonableSettings(); + } description.add(new Tuple((name = getName()).equals("None") ? "[" + (int)getMinVal() + ":" + (int)getMaxVal() + "]" : name, 0)); - } + } - /** - * There are three different ways of thinking about the representation. + /** + * There are three different ways of thinking about the representation. * Handle each case here. * - * @param c - * @param minVal - * @param maxVal - * @param n - * @param radius - * @param resolution - */ - public void initEncoder(int w, double minVal, double maxVal, int n, double radius, double resolution) { - if(n != 0) { - if(!Double.isNaN(minVal) && !Double.isNaN(maxVal)) { - if(!isPeriodic()) { - setResolution(getRangeInternal() / (getN() - getW())); - }else{ - setResolution(getRangeInternal() / getN()); - } - - setRadius(getW() * getResolution()); - - if(isPeriodic()) { - setRange(getRangeInternal()); - }else{ - setRange(getRangeInternal() + getResolution()); - } - } - }else{ - if(radius != 0) { - setResolution(getRadius() / w); - }else if(resolution != 0) { - setRadius(getResolution() * w); - }else{ - throw new IllegalStateException( - "One of n, radius, resolution must be specified for a ScalarEncoder"); - } - - if(isPeriodic()) { - setRange(getRangeInternal()); - }else{ - setRange(getRangeInternal() + getResolution()); - } - - double nFloat = w * (getRange() / getRadius()) + 2 * getPadding(); - setN((int)Math.ceil(nFloat)); - } - } - - /** - * Return the bit offset 
of the first bit to be set in the encoder output. + * @param c + * @param minVal + * @param maxVal + * @param n + * @param radius + * @param resolution + */ + public void initEncoder(int w, double minVal, double maxVal, int n, double radius, double resolution) { + if(n != 0) { + if(!Double.isNaN(minVal) && !Double.isNaN(maxVal)) { + if(!isPeriodic()) { + setResolution(getRangeInternal() / (getN() - getW())); + }else{ + setResolution(getRangeInternal() / getN()); + } + + setRadius(getW() * getResolution()); + + if(isPeriodic()) { + setRange(getRangeInternal()); + }else{ + setRange(getRangeInternal() + getResolution()); + } + } + }else{ + if(radius != 0) { + setResolution(getRadius() / w); + }else if(resolution != 0) { + setRadius(getResolution() * w); + }else{ + throw new IllegalStateException( + "One of n, radius, resolution must be specified for a ScalarEncoder"); + } + + if(isPeriodic()) { + setRange(getRangeInternal()); + }else{ + setRange(getRangeInternal() + getResolution()); + } + + double nFloat = w * (getRange() / getRadius()) + 2 * getPadding(); + setN((int)Math.ceil(nFloat)); + } + } + + /** + * Return the bit offset of the first bit to be set in the encoder output. * For periodic encoders, this can be a negative number when the encoded output * wraps around. 
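`initEncoder` above resolves whichever of `n`, `radius`, `resolution` the user supplied into the other two. A minimal sketch of the arithmetic for the case where `n` is given (names illustrative, not HTM.Java API):

```java
// Sketch of initEncoder's arithmetic when n is supplied.
// Names are illustrative, not part of HTM.Java.
public class EncoderResolutionSketch {
    public static double resolution(double minVal, double maxVal,
                                    int n, int w, boolean periodic) {
        double rangeInternal = maxVal - minVal;
        // n - w bins for non-periodic input; n bins when the input wraps around.
        return periodic ? rangeInternal / n : rangeInternal / (n - w);
    }

    public static double radius(int w, double resolution) {
        return w * resolution; // the w active bits span one radius of input space
    }

    public static void main(String[] args) {
        // 0..100 over 121 bits with 21 active bits: resolution 1.0, radius 21.0
        double res = resolution(0, 100, 121, 21, false);
        System.out.println(res);             // 1.0
        System.out.println(radius(21, res)); // 21.0
    }
}
```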
* - * @param c the memory - * @param input the input data - * @return an encoded array - */ - public Integer getFirstOnBit(double input) { - if(Double.isNaN(input)) { - return null; - }else{ - if(input < getMinVal()) { - if(clipInput() && !isPeriodic()) { - if(LOGGER.isTraceEnabled()) { - LOGGER.info("Clipped input " + getName() + "=" + input + " to minval " + getMinVal()); - } - input = getMinVal(); - }else{ - throw new IllegalStateException("input (" + input +") less than range (" + - getMinVal() + " - " + getMaxVal() + ")"); - } - } - } - - if(isPeriodic()) { - if(input >= getMaxVal()) { - throw new IllegalStateException("input (" + input +") greater than periodic range (" + - getMinVal() + " - " + getMaxVal() + ")"); - } - }else{ - if(input > getMaxVal()) { - if(clipInput()) { - if(LOGGER.isTraceEnabled()) { - LOGGER.info("Clipped input " + getName() + "=" + input + " to maxval " + getMaxVal()); - } - input = getMaxVal(); - }else{ - throw new IllegalStateException("input (" + input +") greater than periodic range (" + - getMinVal() + " - " + getMaxVal() + ")"); - } - } - } - - int centerbin; - if(isPeriodic()) { - centerbin = ((int)((input - getMinVal()) * getNInternal() / getRange())) + getPadding(); - }else{ - centerbin = ((int)(((input - getMinVal()) + getResolution()/2) / getResolution())) + getPadding(); - } - - return centerbin - getHalfWidth(); - } - - /** - * Check if the settings are reasonable for the SpatialPooler to work - * @param c - */ - public void checkReasonableSettings() { - if(getW() < 21) { - throw new IllegalStateException( - "Number of bits in the SDR (%d) must be greater than 2, and recommended >= 21 (use forced=True to override)"); - } - } - - /** - * {@inheritDoc} - */ - @Override - public Set getDecoderOutputFieldTypes() { - return new LinkedHashSet<>(Arrays.asList(FieldMetaType.FLOAT, FieldMetaType.INTEGER)); - } - - /** - * Should return the output width, in bits. 
- */ - @Override - public int getWidth() { - return getN(); - } - - /** - * {@inheritDoc} - * NO-OP - */ - @Override - public int[] getBucketIndices(String input) { return null; } - - /** - * Returns the bucket indices. - * - * @param input - */ - @Override - public int[] getBucketIndices(double input) { - int minbin = getFirstOnBit(input); - - //For periodic encoders, the bucket index is the index of the center bit - int bucketIdx; - if(isPeriodic()) { - bucketIdx = minbin + getHalfWidth(); - if(bucketIdx < 0) { - bucketIdx += getN(); - } - }else{//for non-periodic encoders, the bucket index is the index of the left bit - bucketIdx = minbin; - } - - return new int[] { bucketIdx }; - } - - /** - * Encodes inputData and puts the encoded value into the output array, + * @param c the memory + * @param input the input data + * @return an encoded array + */ + public Integer getFirstOnBit(double input) { + if(Double.isNaN(input)) { + return null; + }else{ + if(input < getMinVal()) { + if(clipInput() && !isPeriodic()) { + if(LOGGER.isTraceEnabled()) { + LOGGER.info("Clipped input " + getName() + "=" + input + " to minval " + getMinVal()); + } + input = getMinVal(); + }else{ + throw new IllegalStateException("input (" + input +") less than range (" + + getMinVal() + " - " + getMaxVal() + ")"); + } + } + } + + if(isPeriodic()) { + if(input >= getMaxVal()) { + throw new IllegalStateException("input (" + input +") greater than periodic range (" + + getMinVal() + " - " + getMaxVal() + ")"); + } + }else{ + if(input > getMaxVal()) { + if(clipInput()) { + if(LOGGER.isTraceEnabled()) { + LOGGER.info("Clipped input " + getName() + "=" + input + " to maxval " + getMaxVal()); + } + input = getMaxVal(); + }else{ + throw new IllegalStateException("input (" + input +") greater than periodic range (" + + getMinVal() + " - " + getMaxVal() + ")"); + } + } + } + + int centerbin; + if(isPeriodic()) { + centerbin = ((int)((input - getMinVal()) * getNInternal() / getRange())) + getPadding(); + 
}else{ + centerbin = ((int)(((input - getMinVal()) + getResolution()/2) / getResolution())) + getPadding(); + } + + return centerbin - getHalfWidth(); + } + + /** + * Check if the settings are reasonable for the SpatialPooler to work + * @param c + */ + public void checkReasonableSettings() { + if(getW() < 21) { + throw new IllegalStateException( + "Number of bits in the SDR (%d) must be greater than 2, and recommended >= 21 (use forced=True to override)"); + } + } + + /** + * {@inheritDoc} + */ + @Override + public Set getDecoderOutputFieldTypes() { + return new LinkedHashSet<>(Arrays.asList(FieldMetaType.FLOAT, FieldMetaType.INTEGER)); + } + + /** + * Should return the output width, in bits. + */ + @Override + public int getWidth() { + return getN(); + } + + /** + * {@inheritDoc} + * NO-OP + */ + @Override + public int[] getBucketIndices(String input) { return null; } + + /** + * Returns the bucket indices. + * + * @param input + */ + @Override + public int[] getBucketIndices(double input) { + int minbin = getFirstOnBit(input); + + //For periodic encoders, the bucket index is the index of the center bit + int bucketIdx; + if(isPeriodic()) { + bucketIdx = minbin + getHalfWidth(); + if(bucketIdx < 0) { + bucketIdx += getN(); + } + }else{//for non-periodic encoders, the bucket index is the index of the left bit + bucketIdx = minbin; + } + + return new int[] { bucketIdx }; + } + + /** + * Encodes inputData and puts the encoded value into the output array, * which is a 1-D array of length returned by {@link Connections#getW()}. - * + * * Note: The output array is reused, so clear it before updating it. - * @param inputData Data to encode. This should be validated by the encoder. - * @param output 1-D array of same length returned by {@link Connections#getW()} + * @param inputData Data to encode. This should be validated by the encoder. 
+ * @param output 1-D array of same length returned by {@link Connections#getW()} */ @Override public void encodeIntoArray(Double input, int[] output) { @@ -426,7 +426,7 @@ public void encodeIntoArray(Double input, int[] output) { Arrays.fill(output, 0); return; } - + Integer bucketVal = getFirstOnBit(input); if(bucketVal != null) { int bucketIdx = bucketVal; @@ -446,391 +446,391 @@ public void encodeIntoArray(Double input, int[] output) { minbin = 0; } } - + ArrayUtils.setIndexesTo(output, ArrayUtils.range(minbin, maxbin + 1), 1); } - + // Added guard against immense string concatenation if(LOGGER.isTraceEnabled()) { LOGGER.trace(""); LOGGER.trace("input: " + input); LOGGER.trace("range: " + getMinVal() + " - " + getMaxVal()); LOGGER.trace("n:" + getN() + "w:" + getW() + "resolution:" + getResolution() + - "radius:" + getRadius() + "periodic:" + isPeriodic()); + "radius:" + getRadius() + "periodic:" + isPeriodic()); LOGGER.trace("output: " + Arrays.toString(output)); LOGGER.trace("input desc: " + decode(output, "")); } } - /** - * Returns a {@link DecodeResult} which is a tuple of range names - * and lists of {@link RangeLists} in the first entry, and a list - * of descriptions for each range in the second entry. - * - * @param encoded the encoded bit vector - * @param parentFieldName the field the vector corresponds with - * @return - */ - @Override - public DecodeResult decode(int[] encoded, String parentFieldName) { - // For now, we simply assume any top-down output greater than 0 - // is ON. Eventually, we will probably want to incorporate the strength - // of each top-down output. 
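The non-periodic path through `getFirstOnBit` and `encodeIntoArray` above (pick a center bin, subtract halfWidth, set `w` consecutive bits) can be sketched end-to-end. All names are illustrative, not HTM.Java API:

```java
import java.util.Arrays;

// End-to-end sketch of the non-periodic encoding path above.
// Names are illustrative, not part of HTM.Java.
public class ScalarEncodeSketch {
    public static int firstOnBit(double input, double minVal,
                                 double resolution, int padding, int halfWidth) {
        int centerbin = (int) (((input - minVal) + resolution / 2) / resolution) + padding;
        return centerbin - halfWidth;
    }

    public static int[] encode(double input, double minVal, double resolution,
                               int n, int w) {
        int halfWidth = (w - 1) / 2;
        int padding = halfWidth; // non-periodic: halfWidth bits on each side
        int[] output = new int[n];
        int minbin = firstOnBit(input, minVal, resolution, padding, halfWidth);
        Arrays.fill(output, minbin, minbin + w, 1); // set the w active bits
        return output;
    }

    public static void main(String[] args) {
        // minVal=0, resolution=1, n=121, w=21: input 50 lights bits 50..70
        int[] out = encode(50, 0, 1.0, 121, 21);
        System.out.println(Arrays.stream(out).sum()); // 21
    }
}
```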
- if(encoded == null || encoded.length < 1) { - return null; - } - int[] tmpOutput = Arrays.copyOf(encoded, encoded.length); - - // ------------------------------------------------------------------------ - // First, assume the input pool is not sampled 100%, and fill in the - // "holes" in the encoded representation (which are likely to be present - // if this is a coincidence that was learned by the SP). - - // Search for portions of the output that have "holes" - int maxZerosInARow = getHalfWidth(); - for(int i = 0;i < maxZerosInARow;i++) { - int[] searchStr = new int[i + 3]; - Arrays.fill(searchStr, 1); - ArrayUtils.setRangeTo(searchStr, 1, -1, 0); - int subLen = searchStr.length; - - // Does this search string appear in the output? - if(isPeriodic()) { - for(int j = 0;j < getN();j++) { - int[] outputIndices = ArrayUtils.range(j, j + subLen); - outputIndices = ArrayUtils.modulo(outputIndices, getN()); - if(Arrays.equals(searchStr, ArrayUtils.sub(tmpOutput, outputIndices))) { - ArrayUtils.setIndexesTo(tmpOutput, outputIndices, 1); - } - } - }else{ - for(int j = 0;j < getN() - subLen + 1;j++) { - if(Arrays.equals(searchStr, ArrayUtils.sub(tmpOutput, ArrayUtils.range(j, j + subLen)))) { - ArrayUtils.setRangeTo(tmpOutput, j, j + subLen, 1); - } - } - } - } - - LOGGER.trace("raw output:" + Arrays.toString( - ArrayUtils.sub(encoded, ArrayUtils.range(0, getN())))); - LOGGER.trace("filtered output:" + Arrays.toString(tmpOutput)); - - // ------------------------------------------------------------------------ - // Find each run of 1's. 
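The run-of-1's scan that `decode` performs above can be sketched in isolation: walk the bit array and collect `(startIdx, runLength)` pairs, extending the current run while the indices stay contiguous. Illustrative only, not HTM.Java API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of decode()'s run-finding loop: collect (startIdx, runLength)
// pairs from a binary array. Illustrative only, not part of HTM.Java.
public class RunFinderSketch {
    public static List<int[]> findRuns(int[] bits) {
        List<int[]> runs = new ArrayList<>();
        int[] run = null;
        for (int i = 0; i < bits.length; i++) {
            if (bits[i] > 0) {
                if (run != null && i == run[0] + run[1]) {
                    run[1]++;                 // extend the current run
                } else {
                    run = new int[] { i, 1 }; // start a new run
                    runs.add(run);
                }
            }
        }
        return runs;
    }

    public static void main(String[] args) {
        for (int[] r : findRuns(new int[] { 0, 1, 1, 1, 0, 1 })) {
            System.out.println(r[0] + ":" + r[1]); // 1:3 then 5:1
        }
    }
}
```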
- int[] nz = ArrayUtils.where(tmpOutput, new Condition.Adapter() { - @Override - public boolean eval(int n) { - return n > 0; - } - }); - List runs = new ArrayList(); //will be tuples of (startIdx, runLength) - Arrays.sort(nz); - int[] run = new int[] { nz[0], 1 }; - int i = 1; - while(i < nz.length) { - if(nz[i] == run[0] + run[1]) { - run[1] += 1; - }else{ - runs.add(new Tuple(run[0], run[1])); - run = new int[] { nz[i], 1 }; - } - i += 1; - } - runs.add(new Tuple(run[0], run[1])); - - // If we have a periodic encoder, merge the first and last run if they - // both go all the way to the edges - if(isPeriodic() && runs.size() > 1) { - int l = runs.size() - 1; - if(((Integer)runs.get(0).get(0)) == 0 && ((Integer)runs.get(l).get(0)) + ((Integer)runs.get(l).get(1)) == getN()) { - runs.set(l, new Tuple((Integer)runs.get(l).get(0), - ((Integer)runs.get(l).get(1)) + ((Integer)runs.get(0).get(1)) )); - runs = runs.subList(1, runs.size()); - } - } - - // ------------------------------------------------------------------------ - // Now, for each group of 1's, determine the "left" and "right" edges, where - // the "left" edge is inset by halfwidth and the "right" edge is inset by - // halfwidth. - // For a group of width w or less, the "left" and "right" edge are both at - // the center position of the group. - int left = 0; - int right = 0; - List ranges = new ArrayList(); - for(Tuple tupleRun : runs) { - int start = (Integer)tupleRun.get(0); - int runLen = (Integer)tupleRun.get(1); - if(runLen <= getW()) { - left = right = start + runLen / 2; - }else{ - left = start + getHalfWidth(); - right = start + runLen - 1 - getHalfWidth(); - } - - double inMin, inMax; - // Convert to input space. 
- if(!isPeriodic()) { - inMin = (left - getPadding()) * getResolution() + getMinVal(); - inMax = (right - getPadding()) * getResolution() + getMinVal(); - }else{ - inMin = (left - getPadding()) * getRange() / getNInternal() + getMinVal(); - inMax = (right - getPadding()) * getRange() / getNInternal() + getMinVal(); - } - // Handle wrap-around if periodic - if(isPeriodic()) { - if(inMin >= getMaxVal()) { - inMin -= getRange(); - inMax -= getRange(); - } - } - - // Clip low end - if(inMin < getMinVal()) { - inMin = getMinVal(); - } - if(inMax < getMinVal()) { - inMax = getMinVal(); - } - - // If we have a periodic encoder, and the max is past the edge, break into - // 2 separate ranges - if(isPeriodic() && inMax >= getMaxVal()) { - ranges.add(new MinMax(inMin, getMaxVal())); - ranges.add(new MinMax(getMinVal(), inMax - getRange())); - }else{ - if(inMax > getMaxVal()) { - inMax = getMaxVal(); - } - if(inMin > getMaxVal()) { - inMin = getMaxVal(); - } - ranges.add(new MinMax(inMin, inMax)); - } - } - - String desc = generateRangeDescription(ranges); - String fieldName; - // Return result - if(parentFieldName != null && !parentFieldName.isEmpty()) { - fieldName = String.format("%s.%s", parentFieldName, getName()); - }else{ - fieldName = getName(); - } - - RangeList inner = new RangeList(ranges, desc); - Map fieldsDict = new HashMap(); - fieldsDict.put(fieldName, inner); - - return new DecodeResult(fieldsDict, Arrays.asList(fieldName)); - } - - /** - * Generate description from a text description of the ranges - * - * @param ranges A list of {@link MinMax}es. 
- */ - public String generateRangeDescription(List ranges) { - StringBuilder desc = new StringBuilder(); - int numRanges = ranges.size(); - for(int i = 0;i < numRanges;i++) { - if(ranges.get(i).min() != ranges.get(i).max()) { - desc.append(String.format("%.2f-%.2f", ranges.get(i).min(), ranges.get(i).max())); - }else{ - desc.append(String.format("%.2f", ranges.get(i).min())); - } - if(i < numRanges - 1) { - desc.append(", "); - } - } - return desc.toString(); - } - - /** - * Return the internal topDownMapping matrix used for handling the + /** + * Returns a {@link DecodeResult} which is a tuple of range names + * and lists of {@link RangeLists} in the first entry, and a list + * of descriptions for each range in the second entry. + * + * @param encoded the encoded bit vector + * @param parentFieldName the field the vector corresponds with + * @return + */ + @Override + public DecodeResult decode(int[] encoded, String parentFieldName) { + // For now, we simply assume any top-down output greater than 0 + // is ON. Eventually, we will probably want to incorporate the strength + // of each top-down output. + if(encoded == null || encoded.length < 1) { + return null; + } + int[] tmpOutput = Arrays.copyOf(encoded, encoded.length); + + // ------------------------------------------------------------------------ + // First, assume the input pool is not sampled 100%, and fill in the + // "holes" in the encoded representation (which are likely to be present + // if this is a coincidence that was learned by the SP). + + // Search for portions of the output that have "holes" + int maxZerosInARow = getHalfWidth(); + for(int i = 0;i < maxZerosInARow;i++) { + int[] searchStr = new int[i + 3]; + Arrays.fill(searchStr, 1); + ArrayUtils.setRangeTo(searchStr, 1, -1, 0); + int subLen = searchStr.length; + + // Does this search string appear in the output? 
+ if(isPeriodic()) { + for(int j = 0;j < getN();j++) { + int[] outputIndices = ArrayUtils.range(j, j + subLen); + outputIndices = ArrayUtils.modulo(outputIndices, getN()); + if(Arrays.equals(searchStr, ArrayUtils.sub(tmpOutput, outputIndices))) { + ArrayUtils.setIndexesTo(tmpOutput, outputIndices, 1); + } + } + }else{ + for(int j = 0;j < getN() - subLen + 1;j++) { + if(Arrays.equals(searchStr, ArrayUtils.sub(tmpOutput, ArrayUtils.range(j, j + subLen)))) { + ArrayUtils.setRangeTo(tmpOutput, j, j + subLen, 1); + } + } + } + } + + LOGGER.trace("raw output:" + Arrays.toString( + ArrayUtils.sub(encoded, ArrayUtils.range(0, getN())))); + LOGGER.trace("filtered output:" + Arrays.toString(tmpOutput)); + + // ------------------------------------------------------------------------ + // Find each run of 1's. + int[] nz = ArrayUtils.where(tmpOutput, new Condition.Adapter() { + @Override + public boolean eval(int n) { + return n > 0; + } + }); + List runs = new ArrayList(); //will be tuples of (startIdx, runLength) + Arrays.sort(nz); + int[] run = new int[] { nz[0], 1 }; + int i = 1; + while(i < nz.length) { + if(nz[i] == run[0] + run[1]) { + run[1] += 1; + }else{ + runs.add(new Tuple(run[0], run[1])); + run = new int[] { nz[i], 1 }; + } + i += 1; + } + runs.add(new Tuple(run[0], run[1])); + + // If we have a periodic encoder, merge the first and last run if they + // both go all the way to the edges + if(isPeriodic() && runs.size() > 1) { + int l = runs.size() - 1; + if(((Integer)runs.get(0).get(0)) == 0 && ((Integer)runs.get(l).get(0)) + ((Integer)runs.get(l).get(1)) == getN()) { + runs.set(l, new Tuple((Integer)runs.get(l).get(0), + ((Integer)runs.get(l).get(1)) + ((Integer)runs.get(0).get(1)) )); + runs = runs.subList(1, runs.size()); + } + } + + // ------------------------------------------------------------------------ + // Now, for each group of 1's, determine the "left" and "right" edges, where + // the "left" edge is inset by halfwidth and the "right" edge is inset by 
+ // halfwidth. + // For a group of width w or less, the "left" and "right" edge are both at + // the center position of the group. + int left = 0; + int right = 0; + List ranges = new ArrayList(); + for(Tuple tupleRun : runs) { + int start = (Integer)tupleRun.get(0); + int runLen = (Integer)tupleRun.get(1); + if(runLen <= getW()) { + left = right = start + runLen / 2; + }else{ + left = start + getHalfWidth(); + right = start + runLen - 1 - getHalfWidth(); + } + + double inMin, inMax; + // Convert to input space. + if(!isPeriodic()) { + inMin = (left - getPadding()) * getResolution() + getMinVal(); + inMax = (right - getPadding()) * getResolution() + getMinVal(); + }else{ + inMin = (left - getPadding()) * getRange() / getNInternal() + getMinVal(); + inMax = (right - getPadding()) * getRange() / getNInternal() + getMinVal(); + } + // Handle wrap-around if periodic + if(isPeriodic()) { + if(inMin >= getMaxVal()) { + inMin -= getRange(); + inMax -= getRange(); + } + } + + // Clip low end + if(inMin < getMinVal()) { + inMin = getMinVal(); + } + if(inMax < getMinVal()) { + inMax = getMinVal(); + } + + // If we have a periodic encoder, and the max is past the edge, break into + // 2 separate ranges + if(isPeriodic() && inMax >= getMaxVal()) { + ranges.add(new MinMax(inMin, getMaxVal())); + ranges.add(new MinMax(getMinVal(), inMax - getRange())); + }else{ + if(inMax > getMaxVal()) { + inMax = getMaxVal(); + } + if(inMin > getMaxVal()) { + inMin = getMaxVal(); + } + ranges.add(new MinMax(inMin, inMax)); + } + } + + String desc = generateRangeDescription(ranges); + String fieldName; + // Return result + if(parentFieldName != null && !parentFieldName.isEmpty()) { + fieldName = String.format("%s.%s", parentFieldName, getName()); + }else{ + fieldName = getName(); + } + + RangeList inner = new RangeList(ranges, desc); + Map fieldsDict = new HashMap(); + fieldsDict.put(fieldName, inner); + + return new DecodeResult(fieldsDict, Arrays.asList(fieldName)); + } + + /** + * Generate 
description from a text description of the ranges + * + * @param ranges A list of {@link MinMax}es. + */ + public String generateRangeDescription(List ranges) { + StringBuilder desc = new StringBuilder(); + int numRanges = ranges.size(); + for(int i = 0;i < numRanges;i++) { + if(ranges.get(i).min() != ranges.get(i).max()) { + desc.append(String.format("%.2f-%.2f", ranges.get(i).min(), ranges.get(i).max())); + }else{ + desc.append(String.format("%.2f", ranges.get(i).min())); + } + if(i < numRanges - 1) { + desc.append(", "); + } + } + return desc.toString(); + } + + /** + * Return the internal topDownMapping matrix used for handling the * bucketInfo() and topDownCompute() methods. This is a matrix, one row per * category (bucket) where each row contains the encoded output for that * category. * - * @param c the connections memory - * @return the internal topDownMapping - */ - public SparseObjectMatrix getTopDownMapping() { - - if(topDownMapping == null) { - //The input scalar value corresponding to each possible output encoding - if(isPeriodic()) { - setTopDownValues( - ArrayUtils.arange(getMinVal() + getResolution() / 2.0, - getMaxVal(), getResolution())); - }else{ - //Number of values is (max-min)/resolutions - setTopDownValues( - ArrayUtils.arange(getMinVal(), getMaxVal() + getResolution() / 2.0, - getResolution())); - } - } - - //Each row represents an encoded output pattern - int numCategories = getTopDownValues().length; - SparseObjectMatrix topDownMapping; - setTopDownMapping( - topDownMapping = new SparseObjectMatrix( - new int[] { numCategories })); - - double[] topDownValues = getTopDownValues(); - int[] outputSpace = new int[getN()]; - double minVal = getMinVal(); - double maxVal = getMaxVal(); - for(int i = 0;i < numCategories;i++) { - double value = topDownValues[i]; - value = Math.max(value, minVal); - value = Math.min(value, maxVal); - encodeIntoArray(value, outputSpace); - topDownMapping.set(i, Arrays.copyOf(outputSpace, outputSpace.length)); - } - - 
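The formatting rule in `generateRangeDescription` above ("min-max" to two decimals per range, collapsed to a single value when min equals max, comma-separated) can be sketched as follows; `Locale.US` is pinned here for deterministic output, which the original (default-locale) code does not do. Names are illustrative:

```java
import java.util.Locale;

// Sketch of generateRangeDescription's formatting rule. Illustrative only.
public class RangeDescSketch {
    public static String describe(double[][] ranges) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < ranges.length; i++) {
            double min = ranges[i][0], max = ranges[i][1];
            if (min != max) {
                sb.append(String.format(Locale.US, "%.2f-%.2f", min, max));
            } else {
                sb.append(String.format(Locale.US, "%.2f", min)); // degenerate range
            }
            if (i < ranges.length - 1) sb.append(", ");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(describe(new double[][] { { 1.5, 3.25 }, { 7, 7 } }));
        // 1.50-3.25, 7.00
    }
}
```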
return topDownMapping; - } - - /** - * {@inheritDoc} - * - * @param the input value, in this case a double - * @return a list of one input double - */ - @Override - public TDoubleList getScalars(S d) { - TDoubleList retVal = new TDoubleArrayList(); - retVal.add((Double)d); - return retVal; - } - - /** - * Returns a list of items, one for each bucket defined by this encoder. + * @param c the connections memory + * @return the internal topDownMapping + */ + public SparseObjectMatrix getTopDownMapping() { + + if(topDownMapping == null) { + //The input scalar value corresponding to each possible output encoding + if(isPeriodic()) { + setTopDownValues( + ArrayUtils.arange(getMinVal() + getResolution() / 2.0, + getMaxVal(), getResolution())); + }else{ + //Number of values is (max-min)/resolutions + setTopDownValues( + ArrayUtils.arange(getMinVal(), getMaxVal() + getResolution() / 2.0, + getResolution())); + } + } + + //Each row represents an encoded output pattern + int numCategories = getTopDownValues().length; + SparseObjectMatrix topDownMapping; + setTopDownMapping( + topDownMapping = new SparseObjectMatrix( + new int[] { numCategories })); + + double[] topDownValues = getTopDownValues(); + int[] outputSpace = new int[getN()]; + double minVal = getMinVal(); + double maxVal = getMaxVal(); + for(int i = 0;i < numCategories;i++) { + double value = topDownValues[i]; + value = Math.max(value, minVal); + value = Math.min(value, maxVal); + encodeIntoArray(value, outputSpace); + topDownMapping.set(i, Arrays.copyOf(outputSpace, outputSpace.length)); + } + + return topDownMapping; + } + + /** + * {@inheritDoc} + * + * @param the input value, in this case a double + * @return a list of one input double + */ + @Override + public TDoubleList getScalars(S d) { + TDoubleList retVal = new TDoubleArrayList(); + retVal.add((Double)d); + return retVal; + } + + /** + * Returns a list of items, one for each bucket defined by this encoder. 
* Each item is the value assigned to that bucket, this is the same as the * EncoderResult.value that would be returned by getBucketInfo() for that * bucket and is in the same format as the input that would be passed to * encode(). - * + * * This call is faster than calling getBucketInfo() on each bucket individually * if all you need are the bucket values. - * - * @param returnType class type parameter so that this method can return encoder + * + * @param returnType class type parameter so that this method can return encoder * specific value types * * @return list of items, each item representing the bucket value for that * bucket. - */ - @SuppressWarnings("unchecked") - @Override - public List getBucketValues(Class t) { - if(bucketValues == null) { - SparseObjectMatrix topDownMapping = getTopDownMapping(); - int numBuckets = topDownMapping.getMaxIndex() + 1; - bucketValues = new ArrayList(); - for(int i = 0;i < numBuckets;i++) { - ((List)bucketValues).add((Double)getBucketInfo(new int[] { i }).get(0).get(1)); - } - } - return (List)bucketValues; - } - - /** - * {@inheritDoc} - */ - @Override - public List getBucketInfo(int[] buckets) { - SparseObjectMatrix topDownMapping = getTopDownMapping(); - - //The "category" is simply the bucket index - int category = buckets[0]; - int[] encoding = topDownMapping.getObject(category); - - //Which input value does this correspond to? - double inputVal; - if(isPeriodic()) { - inputVal = getMinVal() + getResolution() / 2 + category * getResolution(); - }else{ - inputVal = getMinVal() + category * getResolution(); - } - - return Arrays.asList(new Encoding(inputVal, inputVal, encoding)); - } - - /** - * {@inheritDoc} - */ - @Override - public List topDownCompute(int[] encoded) { - //Get/generate the topDown mapping table - SparseObjectMatrix topDownMapping = getTopDownMapping(); - - // See which "category" we match the closest. 
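`topDownCompute` above matches the encoded vector against each bucket's stored pattern via `rightVecProd` and takes the argmax. A minimal sketch of that matching step, with the matrix flattened to a plain 2-D array (illustrative only, not HTM.Java API):

```java
// Sketch of topDownCompute's matching step: score each bucket's stored
// pattern by its overlap (dot product) with the encoded vector, then
// pick the best-scoring bucket. Illustrative only, not part of HTM.Java.
public class TopDownMatchSketch {
    public static int bestBucket(int[][] bucketPatterns, int[] encoded) {
        int best = -1;
        int bestScore = Integer.MIN_VALUE;
        for (int row = 0; row < bucketPatterns.length; row++) {
            int score = 0;
            for (int col = 0; col < encoded.length; col++) {
                score += bucketPatterns[row][col] * encoded[col]; // overlap count
            }
            if (score > bestScore) { // strict '>' keeps the first best match
                bestScore = score;
                best = row;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        int[][] patterns = { { 1, 1, 0, 0 }, { 0, 1, 1, 0 }, { 0, 0, 1, 1 } };
        System.out.println(bestBucket(patterns, new int[] { 0, 0, 1, 1 })); // 2
    }
}
```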
- int category = ArrayUtils.argmax(rightVecProd(topDownMapping, encoded)); - - return getBucketInfo(new int[]{category}); - } - - /** - * Returns a list of {@link Tuple}s which in this case is a list of - * key value parameter values for this {@code ScalarEncoder} - * - * @return a list of {@link Tuple}s - */ - public List dict() { - List l = new ArrayList(); - l.add(new Tuple("maxval", getMaxVal())); - l.add(new Tuple("bucketValues", getBucketValues(Double.class))); - l.add(new Tuple("nInternal", getNInternal())); - l.add(new Tuple("name", getName())); - l.add(new Tuple("minval", getMinVal())); - l.add(new Tuple("topDownValues", Arrays.toString(getTopDownValues()))); - l.add(new Tuple("clipInput", clipInput())); - l.add(new Tuple("n", getN())); - l.add(new Tuple("padding", getPadding())); - l.add(new Tuple("range", getRange())); - l.add(new Tuple("periodic", isPeriodic())); - l.add(new Tuple("radius", getRadius())); - l.add(new Tuple("w", getW())); - l.add(new Tuple("topDownMappingM", getTopDownMapping())); - l.add(new Tuple("halfwidth", getHalfWidth())); - l.add(new Tuple("resolution", getResolution())); - l.add(new Tuple("rangeInternal", getRangeInternal())); - - return l; - } - - /** - * Returns a {@link EncoderBuilder} for constructing {@link ScalarEncoder}s - * - * The base class architecture is put together in such a way where boilerplate - * initialization can be kept to a minimum for implementing subclasses, while avoiding - * the mistake-proneness of extremely long argument lists. - * - * @see ScalarEncoder.Builder#setStuff(int) - */ - public static class Builder extends Encoder.Builder { - private Builder() {} - - @Override - public ScalarEncoder build() { - //Must be instantiated so that super class can initialize - //boilerplate variables. 
- encoder = new ScalarEncoder(); - - //Call super class here - super.build(); - - //////////////////////////////////////////////////////// - // Implementing classes would do setting of specific // - // vars here together with any sanity checking // - //////////////////////////////////////////////////////// - - ((ScalarEncoder)encoder).init(); - - return (ScalarEncoder)encoder; - } - } + */ + @SuppressWarnings("unchecked") + @Override + public List getBucketValues(Class t) { + if(bucketValues == null) { + SparseObjectMatrix topDownMapping = getTopDownMapping(); + int numBuckets = topDownMapping.getMaxIndex() + 1; + bucketValues = new ArrayList(); + for(int i = 0;i < numBuckets;i++) { + ((List)bucketValues).add((Double)getBucketInfo(new int[] { i }).get(0).get(1)); + } + } + return (List)bucketValues; + } + + /** + * {@inheritDoc} + */ + @Override + public List getBucketInfo(int[] buckets) { + SparseObjectMatrix topDownMapping = getTopDownMapping(); + + //The "category" is simply the bucket index + int category = buckets[0]; + int[] encoding = topDownMapping.getObject(category); + + //Which input value does this correspond to? + double inputVal; + if(isPeriodic()) { + inputVal = getMinVal() + getResolution() / 2 + category * getResolution(); + }else{ + inputVal = getMinVal() + category * getResolution(); + } + + return Arrays.asList(new Encoding(inputVal, inputVal, encoding)); + } + + /** + * {@inheritDoc} + */ + @Override + public List topDownCompute(int[] encoded) { + //Get/generate the topDown mapping table + SparseObjectMatrix topDownMapping = getTopDownMapping(); + + // See which "category" we match the closest. 
+ int category = ArrayUtils.argmax(rightVecProd(topDownMapping, encoded)); + + return getBucketInfo(new int[]{category}); + } + + /** + * Returns a list of {@link Tuple}s which in this case is a list of + * key value parameter values for this {@code ScalarEncoder} + * + * @return a list of {@link Tuple}s + */ + public List dict() { + List l = new ArrayList(); + l.add(new Tuple("maxval", getMaxVal())); + l.add(new Tuple("bucketValues", getBucketValues(Double.class))); + l.add(new Tuple("nInternal", getNInternal())); + l.add(new Tuple("name", getName())); + l.add(new Tuple("minval", getMinVal())); + l.add(new Tuple("topDownValues", Arrays.toString(getTopDownValues()))); + l.add(new Tuple("clipInput", clipInput())); + l.add(new Tuple("n", getN())); + l.add(new Tuple("padding", getPadding())); + l.add(new Tuple("range", getRange())); + l.add(new Tuple("periodic", isPeriodic())); + l.add(new Tuple("radius", getRadius())); + l.add(new Tuple("w", getW())); + l.add(new Tuple("topDownMappingM", getTopDownMapping())); + l.add(new Tuple("halfwidth", getHalfWidth())); + l.add(new Tuple("resolution", getResolution())); + l.add(new Tuple("rangeInternal", getRangeInternal())); + + return l; + } + + /** + * Returns a {@link EncoderBuilder} for constructing {@link ScalarEncoder}s + * + * The base class architecture is put together in such a way where boilerplate + * initialization can be kept to a minimum for implementing subclasses, while avoiding + * the mistake-proneness of extremely long argument lists. + * + * @see ScalarEncoder.Builder#setStuff(int) + */ + public static class Builder extends Encoder.Builder { + private Builder() {} + + @Override + public ScalarEncoder build() { + //Must be instantiated so that super class can initialize + //boilerplate variables. 
+ encoder = new ScalarEncoder(); + + //Call super class here + super.build(); + + //////////////////////////////////////////////////////// + // Implementing classes would do setting of specific // + // vars here together with any sanity checking // + //////////////////////////////////////////////////////// + + ((ScalarEncoder)encoder).init(); + + return (ScalarEncoder)encoder; + } + } } From 932b4d6cafc1dfbf656adfa4cbb0ef6c7e07220e Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sun, 5 Mar 2017 12:06:38 -0600 Subject: [PATCH 13/52] Added KEY.INFERRED_FIELDS and modified Layer to utilize that new parameters --- .../java/org/numenta/nupic/Parameters.java | 21 +-- .../java/org/numenta/nupic/network/Layer.java | 122 ++++++++++-------- .../numenta/nupic/network/ManualInput.java | 12 +- 3 files changed, 90 insertions(+), 65 deletions(-) diff --git a/src/main/java/org/numenta/nupic/Parameters.java b/src/main/java/org/numenta/nupic/Parameters.java index 001f8160..ad852e2a 100644 --- a/src/main/java/org/numenta/nupic/Parameters.java +++ b/src/main/java/org/numenta/nupic/Parameters.java @@ -23,19 +23,13 @@ package org.numenta.nupic; import java.io.IOException; -import java.util.Arrays; -import java.util.Collections; -import java.util.EnumMap; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.Random; -import java.util.Set; +import java.util.*; import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.model.Cell; +import org.numenta.nupic.model.Segment; import org.numenta.nupic.model.Column; import org.numenta.nupic.model.ComputeCycle; import org.numenta.nupic.model.DistalDendrite; @@ -69,6 +63,7 @@ public class Parameters implements Persistable { private static final Map DEFAULTS_TEMPORAL; private static final Map DEFAULTS_SPATIAL; private static final Map DEFAULTS_ENCODER; + private static final Map 
DEFAULTS_CLASSIFIER; static { @@ -141,6 +136,12 @@ public class Parameters implements Persistable { DEFAULTS_ENCODER = Collections.unmodifiableMap(defaultEncoderParams); defaultParams.putAll(DEFAULTS_ENCODER); + /////////// Classifier Parameters /////////// + Map defaultClassifierParams = new ParametersMap(); + defaultClassifierParams.put(KEY.INFERRED_FIELDS, new HashMap>()); + DEFAULTS_CLASSIFIER = Collections.unmodifiableMap(defaultClassifierParams); + defaultParams.putAll(DEFAULTS_CLASSIFIER); + DEFAULTS_ALL = Collections.unmodifiableMap(defaultParams); } @@ -418,7 +419,9 @@ public static enum KEY { // Network Layer indicator for auto classifier generation AUTO_CLASSIFY("hasClassifiers", Boolean.class), - INFERRED_FIELDS("inferredFields", Map.class), // Map // How many bits to use if encoding the respective date fields. // e.g. Tuple(bits to use:int, radius:double) diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index 1577f7b3..96e0ed28 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -37,11 +37,7 @@ import org.numenta.nupic.FieldMetaType; import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; -import org.numenta.nupic.algorithms.Anomaly; -import org.numenta.nupic.algorithms.CLAClassifier; -import org.numenta.nupic.algorithms.Classification; -import org.numenta.nupic.algorithms.SpatialPooler; -import org.numenta.nupic.algorithms.TemporalMemory; +import org.numenta.nupic.algorithms.*; import org.numenta.nupic.encoders.DateEncoder; import org.numenta.nupic.encoders.Encoder; import org.numenta.nupic.encoders.EncoderTuple; @@ -400,7 +396,7 @@ public Layer(Parameters params, MultiEncoder e, SpatialPooler sp, TemporalMemory (encoder == null ? "" : "MultiEncoder,"), (spatialPooler == null ? "" : "SpatialPooler,"), (temporalMemory == null ? "" : "TemporalMemory,"), - (autoCreateClassifiers == null ? 
"" : "Auto creating CLAClassifiers for each input field."), + (autoCreateClassifiers == null ? "" : "Auto creating Classifiers for each input field."), (anomalyComputer == null ? "" : "Anomaly")); } } @@ -1699,38 +1695,9 @@ private void doEncoderBucketMapping(Inference inference, Map enc // Should probably change this so that instead of adding a mapping for // each encoder, it adds a mapping for each field specified by the user // in the new Parameters KEY. -// for(EncoderTuple t : encoderTuples) { -// String name = t.getName(); -// Encoder e = t.getEncoder(); -// -// int bucketIdx = -1; -// Object o = encoderInputMap.get(name); -// if(DateTime.class.isAssignableFrom(o.getClass())) { -// bucketIdx = ((DateEncoder)e).getBucketIndices((DateTime)o)[0]; -// } else if(Number.class.isAssignableFrom(o.getClass())) { -// bucketIdx = e.getBucketIndices((double)o)[0]; -// } else { -// bucketIdx = e.getBucketIndices((String)o)[0]; -// } -// -// int offset = t.getOffset(); -// int[] tempArray = new int[e.getWidth()]; -// System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); -// -// inference.getClassifierInput().put(name, new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, name, o, bucketIdx, tempArray)); -// } - - // Get fields user wants encoded from Parameters - Map inferredFields = (Map)params.get(KEY.INFERRED_FIELDS); - - // Store a NamedTuple for each of those fields - for(Map.Entry entry : inferredFields.entrySet()) { - String fieldName = entry.getKey(); // Name of encoder input field - EncoderTuple encoderTuple = encoderTuples.stream() // Get the EncoderTuple for this input field - .filter(e -> e.getName().equals(fieldName)) - .collect(Collectors.toList()) - .get(0); - Encoder e = encoderTuple.getEncoder(); + for(EncoderTuple t : encoderTuples) { + String name = t.getName(); + Encoder e = t.getEncoder(); int bucketIdx = -1; Object o = encoderInputMap.get(name); @@ -1742,19 +1709,53 @@ private void 
doEncoderBucketMapping(Inference inference, Map enc bucketIdx = e.getBucketIndices((String)o)[0]; } - int offset = encoderTuple.getOffset(); + int offset = t.getOffset(); int[] tempArray = new int[e.getWidth()]; System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); - inference.getClassifierInput().put( - name, - new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, - name, - o, - bucketIdx, - tempArray - )); + inference.getClassifierInput().put(name, new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, name, o, bucketIdx, tempArray)); } + +// // Get fields user wants encoded from Parameters +// Map inferredFields = (Map)params.get(KEY.INFERRED_FIELDS); +// if(inferredFields == null) { +// LOGGER.info("KEY.INFERRED_FIELDS is null, no fields will be classified."); +// return; +// } +// +// // Store a NamedTuple for each of those fields +// for(Map.Entry entry : inferredFields.entrySet()) { +// String fieldName = entry.getKey(); // Name of encoder input field +// EncoderTuple encoderTuple = encoderTuples.stream() // Get the EncoderTuple for this input field +// .filter(e -> e.getName().equals(fieldName)) +// .collect(Collectors.toList()) +// .get(0); +// Encoder e = encoderTuple.getEncoder(); +// +// int bucketIdx = -1; +// Object o = encoderInputMap.get(fieldName); +// if(DateTime.class.isAssignableFrom(o.getClass())) { +// bucketIdx = ((DateEncoder)e).getBucketIndices((DateTime)o)[0]; +// } else if(Number.class.isAssignableFrom(o.getClass())) { +// bucketIdx = e.getBucketIndices((double)o)[0]; +// } else { +// bucketIdx = e.getBucketIndices((String)o)[0]; +// } +// +// int offset = encoderTuple.getOffset(); +// int[] tempArray = new int[e.getWidth()]; +// System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); +// +// inference.getClassifierInput().put( +// fieldName, +// new NamedTuple( +// new String[] { "name", "inputValue", "bucketIdx", "encoding" }, +// fieldName, +// o, +// bucketIdx, 
+// tempArray +// )); +// } } /** @@ -1958,12 +1959,26 @@ NamedTuple makeClassifiers(MultiEncoder encoder) { // Looks like the passed-in MultiEncoder is a single wrapper containing // multiple encoders; one for each field. Right now, a classifier is // created for each of those encoders. However, instead of do + Map inferredFields = (Map>) params.get(KEY.INFERRED_FIELDS); String[] names = new String[encoder.getEncoders(encoder).size()]; - CLAClassifier[] ca = new CLAClassifier[names.length]; + Classifier[] ca = new Classifier[names.length]; int i = 0; for(EncoderTuple et : encoder.getEncoders(encoder)) { names[i] = et.getName(); - ca[i] = new CLAClassifier(); + Object fieldClassifier = inferredFields.get(et.getName()); + if(fieldClassifier == CLAClassifier.class) { + LOGGER.info("Classifying \"" + et.getName() + "\" input field with CLAClassifier"); + ca[i] = new CLAClassifier(); + } else if(fieldClassifier == SDRClassifier.class) { + LOGGER.info("Classifying \"" + et.getName() + "\" input field with SDRClassifier"); + ca[i] = new SDRClassifier(); + } else { + if(fieldClassifier != null) + LOGGER.warn("Invalid Classifier class token, \"" + fieldClassifier + + "\", specified for, \"" + et.getName() + "\", input field. 
" + + "Valid class tokens are CLAClassifier.class and SDRClassifier.class"); + LOGGER.info("Not classifying \"" + et.getName() + "\" input field"); + } i++; } return new NamedTuple(names, (Object[])ca); @@ -2380,10 +2395,13 @@ public ManualInput call(ManualInput t1) { bucketIdx = inputs.get("bucketIdx"); actValue = inputs.get("inputValue"); - CLAClassifier c = (CLAClassifier)t1.getClassifiers().get(key); - Classification result = c.compute(recordNum, inputMap, t1.getSDR(), isLearn, true); + Classifier c = (Classifier)t1.getClassifiers().get(key); - t1.recordNum(recordNum).storeClassification((String)inputs.get("name"), result); + // c will be null if no classifier was specifying for this field in KEY.INFERRED_FIELDS map + if(c != null) { + Classification result = c.compute(recordNum, inputMap, t1.getSDR(), isLearn, true); + t1.recordNum(recordNum).storeClassification((String)inputs.get("name"), result); + } } return t1; diff --git a/src/main/java/org/numenta/nupic/network/ManualInput.java b/src/main/java/org/numenta/nupic/network/ManualInput.java index 5fd2b9a3..0a1e336c 100644 --- a/src/main/java/org/numenta/nupic/network/ManualInput.java +++ b/src/main/java/org/numenta/nupic/network/ManualInput.java @@ -27,7 +27,7 @@ import java.util.Map; import java.util.Set; -import org.numenta.nupic.algorithms.CLAClassifier; +import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.algorithms.Classification; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; @@ -191,7 +191,9 @@ public ManualInput customObject(Object o) { /** *

- * Returns the {@link Map} used as input into the {@link CLAClassifier} + * Returns the {@link Map} used as input into the field's {@link Classifier} + * (it is only actually used as input if a Classifier type has specified for + * the field). * * This mapping contains the name of the field being classified mapped * to a {@link NamedTuple} containing: @@ -237,7 +239,7 @@ public ManualInput classifiers(NamedTuple tuple) { /** * Returns a {@link NamedTuple} keyed to the input field - * names, whose values are the {@link CLAClassifier} used + * names, whose values are the {@link Classifier} used * to track the classification of a particular field */ @Override @@ -341,10 +343,12 @@ ManualInput copy() { * Returns the most recent {@link Classification} * * @param fieldName - * @return + * @return the most recent {@link Classification}, or null if none exists. */ @Override public Classification getClassification(String fieldName) { + if(classification == null) + return null; return classification.get(fieldName); } From 12188aaef740d9eb4111c49a70e21c85840c0d64 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sun, 5 Mar 2017 12:10:10 -0600 Subject: [PATCH 14/52] Removed unneeded comments --- .../java/org/numenta/nupic/network/Layer.java | 69 +++---------------- 1 file changed, 10 insertions(+), 59 deletions(-) diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index 96e0ed28..57537a3c 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -1691,10 +1691,6 @@ private void doEncoderBucketMapping(Inference inference, Map enc // Store the encoding int[] encoding = inference.getEncoding(); - // TODO: 21: Looks like this is where the classifierInput(s) are set. - // Should probably change this so that instead of adding a mapping for - // each encoder, it adds a mapping for each field specified by the user - // in the new Parameters KEY. 
for(EncoderTuple t : encoderTuples) { String name = t.getName(); Encoder e = t.getEncoder(); @@ -1713,49 +1709,16 @@ private void doEncoderBucketMapping(Inference inference, Map enc int[] tempArray = new int[e.getWidth()]; System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); - inference.getClassifierInput().put(name, new NamedTuple(new String[] { "name", "inputValue", "bucketIdx", "encoding" }, name, o, bucketIdx, tempArray)); - } - -// // Get fields user wants encoded from Parameters -// Map inferredFields = (Map)params.get(KEY.INFERRED_FIELDS); -// if(inferredFields == null) { -// LOGGER.info("KEY.INFERRED_FIELDS is null, no fields will be classified."); -// return; -// } -// -// // Store a NamedTuple for each of those fields -// for(Map.Entry entry : inferredFields.entrySet()) { -// String fieldName = entry.getKey(); // Name of encoder input field -// EncoderTuple encoderTuple = encoderTuples.stream() // Get the EncoderTuple for this input field -// .filter(e -> e.getName().equals(fieldName)) -// .collect(Collectors.toList()) -// .get(0); -// Encoder e = encoderTuple.getEncoder(); -// -// int bucketIdx = -1; -// Object o = encoderInputMap.get(fieldName); -// if(DateTime.class.isAssignableFrom(o.getClass())) { -// bucketIdx = ((DateEncoder)e).getBucketIndices((DateTime)o)[0]; -// } else if(Number.class.isAssignableFrom(o.getClass())) { -// bucketIdx = e.getBucketIndices((double)o)[0]; -// } else { -// bucketIdx = e.getBucketIndices((String)o)[0]; -// } -// -// int offset = encoderTuple.getOffset(); -// int[] tempArray = new int[e.getWidth()]; -// System.arraycopy(encoding, offset, tempArray, 0, tempArray.length); -// -// inference.getClassifierInput().put( -// fieldName, -// new NamedTuple( -// new String[] { "name", "inputValue", "bucketIdx", "encoding" }, -// fieldName, -// o, -// bucketIdx, -// tempArray -// )); -// } + inference.getClassifierInput().put( + name, + new NamedTuple( + new String[] { "name", "inputValue", "bucketIdx", "encoding" }, 
+ name, + o, + bucketIdx, + tempArray + )); + } } /** @@ -1950,15 +1913,7 @@ private void clearSubscriberObserverLists() { * @param encoder * @return */ - // TODO: 21: This creates two parallel arrays, one of encoder's names, and - // the other of each encoder's classifier. Returns a NamedTuple making use of - // these arrays easier. NamedTuple makeClassifiers(MultiEncoder encoder) { - // TODO: 21: Should be able to inspect new Parameters KEY(s) in here - // TODO: 21: to adjust types of Classifiers that are created/used. - // Looks like the passed-in MultiEncoder is a single wrapper containing - // multiple encoders; one for each field. Right now, a classifier is - // created for each of those encoders. However, instead of do Map inferredFields = (Map>) params.get(KEY.INFERRED_FIELDS); String[] names = new String[encoder.getEncoders(encoder).size()]; Classifier[] ca = new Classifier[names.length]; @@ -2384,10 +2339,6 @@ public Object get(Object o) { @Override public ManualInput call(ManualInput t1) { - // TODO: 21: Should only need to change this code to use the - // new Classifier interface. 
But will need to change what is - returned by t1.getClassifierInput() to only pay attention - // to fields being classifier based on new Parameters KEY Map ci = t1.getClassifierInput(); int recordNum = getRecordNum(); for(String key : ci.keySet()) { From 085466aa6fbd11402d8d66945f7218cc3d6e24b5 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sun, 5 Mar 2017 17:16:38 -0600 Subject: [PATCH 15/52] Exception is now thrown if KEY.AUTO_CLASSIFY == true, but KEY.INFERRED_FIELDS is null or empty --- .../java/org/numenta/nupic/Parameters.java | 7 ------- .../java/org/numenta/nupic/network/Layer.java | 20 +++++++++++++------ 2 files changed, 14 insertions(+), 13 deletions(-) diff --git a/src/main/java/org/numenta/nupic/Parameters.java b/src/main/java/org/numenta/nupic/Parameters.java index ad852e2a..21b642e5 100644 --- a/src/main/java/org/numenta/nupic/Parameters.java +++ b/src/main/java/org/numenta/nupic/Parameters.java @@ -63,7 +63,6 @@ public class Parameters implements Persistable { private static final Map DEFAULTS_TEMPORAL; private static final Map DEFAULTS_SPATIAL; private static final Map DEFAULTS_ENCODER; - private static final Map DEFAULTS_CLASSIFIER; static { @@ -136,12 +135,6 @@ public class Parameters implements Persistable { DEFAULTS_ENCODER = Collections.unmodifiableMap(defaultEncoderParams); defaultParams.putAll(DEFAULTS_ENCODER); - /////////// Classifier Parameters /////////// - Map defaultClassifierParams = new ParametersMap(); - defaultClassifierParams.put(KEY.INFERRED_FIELDS, new HashMap>()); - DEFAULTS_CLASSIFIER = Collections.unmodifiableMap(defaultClassifierParams); - defaultParams.putAll(DEFAULTS_CLASSIFIER); - DEFAULTS_ALL = Collections.unmodifiableMap(defaultParams); } diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index 57537a3c..dfb18926 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -31,7 +31,6
@@ import java.util.Map; import java.util.Set; import java.util.concurrent.ConcurrentLinkedQueue; -import java.util.stream.Collectors; import org.joda.time.DateTime; import org.numenta.nupic.FieldMetaType; @@ -1915,6 +1914,13 @@ private void clearSubscriberObserverLists() { */ NamedTuple makeClassifiers(MultiEncoder encoder) { Map inferredFields = (Map>) params.get(KEY.INFERRED_FIELDS); + if(inferredFields == null || inferredFields.entrySet().size() == 0) { + throw new IllegalStateException( + "KEY.AUTO_CLASSIFY has been set to \"true\", but KEY.INFERRED_FIELDS is null or\n\t" + + "empty. Must specify desired Classifier for at least one input field in\n\t" + + "KEY.INFERRED_FIELDS or set KEY.AUTO_CLASSIFY to \"false\"." + ); + } String[] names = new String[encoder.getEncoders(encoder).size()]; Classifier[] ca = new Classifier[names.length]; int i = 0; @@ -1927,11 +1933,13 @@ NamedTuple makeClassifiers(MultiEncoder encoder) { } else if(fieldClassifier == SDRClassifier.class) { LOGGER.info("Classifying \"" + et.getName() + "\" input field with SDRClassifier"); ca[i] = new SDRClassifier(); - } else { - if(fieldClassifier != null) - LOGGER.warn("Invalid Classifier class token, \"" + fieldClassifier + - "\", specified for, \"" + et.getName() + "\", input field. 
" + - "Valid class tokens are CLAClassifier.class and SDRClassifier.class"); + } else if(fieldClassifier != null) { + throw new IllegalStateException( + "Invalid Classifier class token, \"" + fieldClassifier + "\",\n\t" + + "specified for, \"" + et.getName() + "\", input field.\n\t" + + "Valid class tokens are CLAClassifier.class and SDRClassifier.class" + ); + } else { // fieldClassifier is null LOGGER.info("Not classifying \"" + et.getName() + "\" input field"); } i++; From 4b8ecf7f3a152eda8c71371a04993c6ff0ee3ed1 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sun, 5 Mar 2017 20:17:33 -0600 Subject: [PATCH 16/52] Added tests for makeClassifier method --- .../org/numenta/nupic/network/LayerTest.java | 117 +++++++++++++++++- 1 file changed, 112 insertions(+), 5 deletions(-) diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index 313aadfe..bf438ff2 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -41,13 +41,12 @@ import org.junit.Test; import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; -import org.numenta.nupic.algorithms.Anomaly; +import org.numenta.nupic.algorithms.*; import org.numenta.nupic.algorithms.Anomaly.Mode; -import org.numenta.nupic.algorithms.CLAClassifier; -import org.numenta.nupic.algorithms.SpatialPooler; -import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.datagen.ResourceLocator; import org.numenta.nupic.encoders.MultiEncoder; +import org.numenta.nupic.encoders.RandomDistributedScalarEncoder; +import org.numenta.nupic.encoders.ScalarEncoder; import org.numenta.nupic.model.Connections; import org.numenta.nupic.model.SDR; import org.numenta.nupic.network.Layer.FunctionFactory; @@ -59,14 +58,17 @@ import org.numenta.nupic.network.sensor.SensorParams; import org.numenta.nupic.network.sensor.SensorParams.Keys; import 
org.numenta.nupic.util.MersenneTwister; +import org.numenta.nupic.util.NamedTuple; import org.numenta.nupic.util.UniversalRandom; +import org.openjdk.jmh.annotations.Param; import rx.Observable; import rx.Observer; import rx.Subscriber; import rx.functions.Func1; import rx.observers.TestObserver; import rx.subjects.PublishSubject; +import sun.plugin.dom.exception.InvalidStateException; /** * Tests the "heart and soul" of the Network API @@ -1770,5 +1772,110 @@ public void testStringToInferenceTransformer() { // Received a record yet. assertEquals("[42]", (Arrays.toString((int[])ff.inference.getSDR()))); } - + + @Test + public void testMakeClassifiers() { + // Setup Parameters + Parameters p = Parameters.getAllDefaultParameters(); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("field1", CLAClassifier.class); + inferredFieldsMap.put("field2", SDRClassifier.class); + inferredFieldsMap.put("field3", null); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + + // Create MultiEncoder and add the fields' encoders to it + MultiEncoder me = MultiEncoder.builder().name("").build(); + me.addEncoder( + "field1", + RandomDistributedScalarEncoder.builder().resolution(1).build() + ); + me.addEncoder( + "field2", + RandomDistributedScalarEncoder.builder().resolution(1).build() + ); + me.addEncoder( + "field3", + RandomDistributedScalarEncoder.builder().resolution(1).build() + ); + + // Create a Layer with Parameters and MultiEncoder + Layer> l = new Layer<>( + p, + me, + new SpatialPooler(), + new TemporalMemory(), + true, + null + ); + + // Make sure the makeClassifiers() method matches each + // field to the specified Classifier type + NamedTuple nt = l.makeClassifiers(l.getEncoder()); + assertEquals(nt.get("field1").getClass(), CLAClassifier.class); + assertEquals(nt.get("field2").getClass(), SDRClassifier.class); + assertEquals(nt.get("field3"), null); + } + + @Test + public void TestMakeClassifiersWithNoInferredFieldsKey() { + // Setup Parameters + 
Parameters p = Parameters.getAllDefaultParameters(); + + // Create MultiEncoder + MultiEncoder me = MultiEncoder.builder().name("").build(); + + // Create a Layer with Parameters and MultiEncoder + Layer> l = new Layer<>( + p, + me, + new SpatialPooler(), + new TemporalMemory(), + true, + null + ); + + // Make sure the makeClassifiers() method throws exception due to + // absence of KEY.INFERRED_FIELDS in the Parameters object + try { + NamedTuple nt = l.makeClassifiers(l.getEncoder()); + } catch (IllegalStateException e) { + assertTrue(e.getMessage().contains("KEY.INFERRED_FIELDS")); + assertTrue(e.getMessage().contains("null")); + assertTrue(e.getMessage().contains("empty")); + } + } + + @Test + public void TestMakeClassifiersWithInvalidInferredFieldsKey() { + // Setup Parameters + Parameters p = Parameters.getAllDefaultParameters(); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("field1", Classifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + + // Create MultiEncoder and add the fields' encoders to it + MultiEncoder me = MultiEncoder.builder().name("").build(); + me.addEncoder( + "field1", + RandomDistributedScalarEncoder.builder().resolution(1).build() + ); + + // Create a Layer with Parameters and MultiEncoder + Layer> l = new Layer<>( + p, + me, + new SpatialPooler(), + new TemporalMemory(), + true, + null + ); + + // Make sure the makeClassifiers() method throws exception due to + // absence of KEY.INFERRED_FIELDS in the Parameters object + try { + NamedTuple nt = l.makeClassifiers(l.getEncoder()); + } catch (IllegalStateException e) { + assertTrue(e.getMessage().contains("Invalid Classifier class token")); + } + } } From d91a5489b65d03773363c8163b03ff733339d7ce Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Mon, 6 Mar 2017 10:36:58 -0600 Subject: [PATCH 17/52] updated existing tests to get them passing again --- .../org/numenta/nupic/network/LayerTest.java | 53 +++++++++++++++++-- 1 file changed, 48 insertions(+), 5 
deletions(-) diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index bf438ff2..a705897f 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -223,6 +223,10 @@ public void testGetAllValues() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new UniversalRandom(42)); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + MultiEncoder me = MultiEncoder.builder().name("").build(); Layer> l = new Layer<>(p, me, new SpatialPooler(), new TemporalMemory(), Boolean.TRUE, null); @@ -261,7 +265,9 @@ public void onNext(Inference i) { @Test public void testResetMethod() { Parameters p = NetworkTestHarness.getParameters().copy(); - Layer l = Network.createLayer("l1", p).add(new TemporalMemory()); + Layer l = Network.createLayer("l1", p) + .alterParameter(KEY.AUTO_CLASSIFY, false) + .add(new TemporalMemory()); try { l.reset(); assertTrue(l.hasTemporalMemory()); @@ -309,7 +315,7 @@ public void testHalt() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + p.set(KEY.AUTO_CLASSIFY, false); HTMSensor htmSensor = (HTMSensor)sensor; @@ -352,7 +358,7 @@ public void testReset() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + p.set(KEY.AUTO_CLASSIFY, false); HTMSensor htmSensor = (HTMSensor)sensor; @@ -392,7 +398,7 @@ public void testSequenceChangeReset() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, 
new MersenneTwister(42)); - p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + p.set(KEY.AUTO_CLASSIFY, false); HTMSensor htmSensor = (HTMSensor)sensor; @@ -437,6 +443,10 @@ public void testLayerWithObservableInput() { p.set(KEY.RANDOM, new MersenneTwister(42)); p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("consumption", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + HTMSensor> htmSensor = (HTMSensor>)sensor; Network n = Network.create("test network", p) @@ -740,6 +750,10 @@ public void testBasicSetupEncoder_AUTO_MODE() { p.set(KEY.RANDOM, new UniversalRandom(42)); p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("consumption", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + HTMSensor htmSensor = (HTMSensor)sensor; Network n = Network.create("test network", p); @@ -868,6 +882,10 @@ public void testBasicSetup_SpatialPooler_AUTO_MODE() { p.set(KEY.RANDOM, new UniversalRandom(42)); p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("consumption", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + HTMSensor htmSensor = (HTMSensor)sensor; Network n = Network.create("test network", p); @@ -1075,6 +1093,10 @@ public void testBasicClassifierSetup() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new UniversalRandom(42)); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + MultiEncoder me = MultiEncoder.builder().name("").build(); Layer> l = new Layer<>(p, me, new SpatialPooler(), new TemporalMemory(), Boolean.TRUE, null); TestObserver tester; @@ -1113,6 +1135,10 @@ public void testMoreComplexSpatialPoolerPriming() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new 
MersenneTwister(42)); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.SP_PRIMER_DELAY, PRIME_COUNT); MultiEncoder me = MultiEncoder.builder().name("").build(); @@ -1160,6 +1186,10 @@ public void test2ndAndSubsequentSubscribersPossible() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.SP_PRIMER_DELAY, PRIME_COUNT); MultiEncoder me = MultiEncoder.builder().name("").build(); @@ -1236,6 +1266,10 @@ public void testGetAllPredictions() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.SP_PRIMER_DELAY, PRIME_COUNT); final int cellsPerColumn = (int)p.get(KEY.CELLS_PER_COLUMN); @@ -1370,6 +1404,10 @@ public void testObservableRetrieval() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + MultiEncoder me = MultiEncoder.builder().name("").build(); final Layer> l = new Layer<>(p, me, new SpatialPooler(), new TemporalMemory(), Boolean.TRUE, null); @@ -1429,6 +1467,11 @@ public void testFullLayerFluentAssembly() { params.put(KEY_MODE, Mode.PURE); params.put(KEY_WINDOW_SIZE, 3); params.put(KEY_USE_MOVING_AVG, true); + + Map> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put("consumption", CLAClassifier.class); + p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + Anomaly anomalyComputer = 
Anomaly.create(params); Layer l = Network.createLayer("TestLayer", p) @@ -1451,7 +1494,7 @@ public void onNext(Inference i) { if(flowReceived) return; // No need to set this value multiple times flowReceived = i.getClassifiers().size() == 2 && - i.getClassifiers().get("timestamp") != null && + i.getClassifiers().get("timestamp") == null && i.getClassifiers().get("consumption") != null; } }); From 36649408e3edfabc72920c1f465cb04baeaedfd3 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Mon, 6 Mar 2017 10:44:00 -0600 Subject: [PATCH 18/52] Removed errant imports --- src/test/java/org/numenta/nupic/network/LayerTest.java | 2 -- 1 file changed, 2 deletions(-) diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index a705897f..2caa34e9 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -61,14 +61,12 @@ import org.numenta.nupic.util.NamedTuple; import org.numenta.nupic.util.UniversalRandom; -import org.openjdk.jmh.annotations.Param; import rx.Observable; import rx.Observer; import rx.Subscriber; import rx.functions.Func1; import rx.observers.TestObserver; import rx.subjects.PublishSubject; -import sun.plugin.dom.exception.InvalidStateException; /** * Tests the "heart and soul" of the Network API From 650ba0e6a511ff498f55c3208c77b8c4d412c35d Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Mon, 6 Mar 2017 11:56:42 -0600 Subject: [PATCH 19/52] Updated tests to pass "./gradlew check" task --- .../org/numenta/nupic/network/LayerTest.java | 56 ++++--------------- .../numenta/nupic/network/NetworkTest.java | 27 +++++++-- .../nupic/network/NetworkTestHarness.java | 14 +++++ .../nupic/network/PersistenceAPITest.java | 12 +++- .../org/numenta/nupic/network/RegionTest.java | 9 ++- .../serialize/HTMObjectInputOutputTest.java | 3 + 6 files changed, 67 insertions(+), 54 deletions(-) diff --git 
a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index 2caa34e9..02d79fb3 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -60,6 +60,7 @@ import org.numenta.nupic.util.MersenneTwister; import org.numenta.nupic.util.NamedTuple; import org.numenta.nupic.util.UniversalRandom; +import static org.numenta.nupic.network.NetworkTestHarness.*; import rx.Observable; import rx.Observer; @@ -220,10 +221,7 @@ public void testGetAllValues() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new UniversalRandom(42)); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); MultiEncoder me = MultiEncoder.builder().name("").build(); Layer> l = new Layer<>(p, me, new SpatialPooler(), new TemporalMemory(), Boolean.TRUE, null); @@ -440,10 +438,7 @@ public void testLayerWithObservableInput() { p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("consumption", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); HTMSensor> htmSensor = (HTMSensor>)sensor; @@ -747,10 +742,7 @@ public void testBasicSetupEncoder_AUTO_MODE() { p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new UniversalRandom(42)); p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("consumption", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, 
inferredFieldsMap); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); HTMSensor htmSensor = (HTMSensor)sensor; @@ -879,10 +871,7 @@ public void testBasicSetup_SpatialPooler_AUTO_MODE() { p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new UniversalRandom(42)); p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("consumption", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); HTMSensor htmSensor = (HTMSensor)sensor; @@ -1090,10 +1079,7 @@ public void testBasicClassifierSetup() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new UniversalRandom(42)); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); MultiEncoder me = MultiEncoder.builder().name("").build(); Layer> l = new Layer<>(p, me, new SpatialPooler(), new TemporalMemory(), Boolean.TRUE, null); @@ -1132,11 +1118,7 @@ public void testMoreComplexSpatialPoolerPriming() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); p.set(KEY.SP_PRIMER_DELAY, PRIME_COUNT); MultiEncoder me = MultiEncoder.builder().name("").build(); @@ -1183,11 +1165,7 @@ public void test2ndAndSubsequentSubscribersPossible() { Parameters p = NetworkTestHarness.getParameters().copy(); p = 
p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); p.set(KEY.SP_PRIMER_DELAY, PRIME_COUNT); MultiEncoder me = MultiEncoder.builder().name("").build(); @@ -1263,11 +1241,7 @@ public void testGetAllPredictions() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); p.set(KEY.SP_PRIMER_DELAY, PRIME_COUNT); final int cellsPerColumn = (int)p.get(KEY.CELLS_PER_COLUMN); @@ -1401,11 +1375,7 @@ public void testObservableRetrieval() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("dayOfWeek", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); MultiEncoder me = MultiEncoder.builder().name("").build(); final Layer> l = new Layer<>(p, me, new SpatialPooler(), new TemporalMemory(), Boolean.TRUE, null); @@ -1465,11 +1435,7 @@ public void testFullLayerFluentAssembly() { params.put(KEY_MODE, Mode.PURE); params.put(KEY_WINDOW_SIZE, 3); params.put(KEY_USE_MOVING_AVG, true); - - Map> inferredFieldsMap = new HashMap<>(); - inferredFieldsMap.put("consumption", CLAClassifier.class); - p.set(KEY.INFERRED_FIELDS, inferredFieldsMap); - + 
p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Anomaly anomalyComputer = Anomaly.create(params); Layer l = Network.createLayer("TestLayer", p) diff --git a/src/test/java/org/numenta/nupic/network/NetworkTest.java b/src/test/java/org/numenta/nupic/network/NetworkTest.java index 40880cda..be2fc516 100644 --- a/src/test/java/org/numenta/nupic/network/NetworkTest.java +++ b/src/test/java/org/numenta/nupic/network/NetworkTest.java @@ -42,6 +42,7 @@ import org.numenta.nupic.Parameters.KEY; import org.numenta.nupic.algorithms.Anomaly; import org.numenta.nupic.algorithms.Anomaly.Mode; +import org.numenta.nupic.algorithms.CLAClassifier; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.datagen.ResourceLocator; @@ -56,6 +57,7 @@ import org.numenta.nupic.network.sensor.SensorParams.Keys; import org.numenta.nupic.util.FastRandom; import org.numenta.nupic.util.MersenneTwister; +import static org.numenta.nupic.network.NetworkTestHarness.*; import rx.Observer; import rx.Subscriber; @@ -209,7 +211,8 @@ public void testBasicNetworkHaltGetsOnComplete() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); + // Create a Network Network network = Network.create("test network", p) .add(Network.createRegion("r1") @@ -270,7 +273,8 @@ public void testBasicNetworkHalt_ThenRestart() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); + // Create a Network Network network = Network.create("test network", p) .add(Network.createRegion("r1") @@ -490,7 +494,8 @@ public void 
testBasicNetworkRunAWhileThenHalt() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); + // Create a Network Network network = Network.create("test network", p) .add(Network.createRegion("r1") @@ -555,6 +560,7 @@ public void testRegionHierarchies() { p.setPotentialRadius(16); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Network network = Network.create("test network", p) .add(Network.createRegion("r1") @@ -694,6 +700,7 @@ public void testNetworkComputeWithNoSensor() { p.set(KEY.MAX_BOOST, 1.0); p.set(KEY.DUTY_CYCLE_PERIOD, 7); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map params = new HashMap<>(); params.put(KEY_MODE, Mode.PURE); @@ -771,6 +778,7 @@ public void testSynchronousBlockingComputeCall() { p.set(KEY.MAX_BOOST, 10.0); p.set(KEY.DUTY_CYCLE_PERIOD, 7); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map params = new HashMap<>(); params.put(KEY_MODE, Mode.PURE); @@ -819,6 +827,7 @@ public void testThreadedStartFlagging() { p.set(KEY.MAX_BOOST, 10.0); p.set(KEY.DUTY_CYCLE_PERIOD, 7); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Map params = new HashMap<>(); params.put(KEY_MODE, Mode.PURE); @@ -847,6 +856,7 @@ public void testThreadedStartFlagging() { ////////////////////////////////////////////////////// p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); + p.set(KEY.INFERRED_FIELDS, 
getInferredFieldsMap("consumption", CLAClassifier.class)); n = Network.create("test network", p) .add(Network.createRegion("r1") .add(Network.createLayer("1", p) @@ -869,6 +879,7 @@ public void testThreadedStartFlagging() { try { p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); n = Network.create("test network", p) .add(Network.createRegion("r1") .add(Network.createLayer("1", p) @@ -965,6 +976,7 @@ public void testObservableWithCoordinateEncoder_NEGATIVE() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getGeospatialTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("", CLAClassifier.class)); HTMSensor> htmSensor = (HTMSensor>)sensor; @@ -1050,7 +1062,8 @@ public void testCalculateInputWidth_NoPrevLayer_UpstreamRegion_with_TM() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); + Network network = Network.create("test network", p) .add(Network.createRegion("r1") .add(Network.createLayer("2", p) @@ -1078,7 +1091,8 @@ public void testCalculateInputWidth_NoPrevLayer_UpstreamRegion_without_TM() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); + Network network = Network.create("test network", p) .add(Network.createRegion("r1") .add(Network.createLayer("2", p) @@ -1278,7 +1292,8 @@ private Network getLoadedDayOfWeekNetwork() { Parameters p = NetworkTestHarness.getParameters().copy(); p = 
p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); + Sensor> sensor = Sensor.create( ObservableSensor::create, SensorParams.create(Keys::obs, new Object[] {"name", PublisherSupplier.builder() diff --git a/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java b/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java index 37bf31c1..fb3e833a 100644 --- a/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java +++ b/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java @@ -26,6 +26,8 @@ import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; +import org.numenta.nupic.algorithms.CLAClassifier; +import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.encoders.Encoder; import org.numenta.nupic.util.Tuple; @@ -206,6 +208,18 @@ public static Parameters getDayDemoTestEncoderParams() { return p; } + + /** + * @return a Map that can be used as the value for a Parameters + * object's KEY.INFERRED_FIELDS key, to classify the specified + * field with the specified Classifier type. + */ + public static Map<String, Class<? extends Classifier>> getInferredFieldsMap( + String field, Class<? extends Classifier> classifier) { + Map<String, Class<? extends Classifier>> inferredFieldsMap = new HashMap<>(); + inferredFieldsMap.put(field, classifier); + return inferredFieldsMap; + } /** * Returns the default parameters used for the "dayOfWeek" encoder and algorithms. 
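The `getInferredFieldsMap` helper added to `NetworkTestHarness` above collapses the repeated three-line `HashMap` construction in the tests into a single call. A minimal, self-contained sketch of the same pattern is shown below; it uses local stand-ins for htm.java's `Classifier` and `CLAClassifier` types so it compiles without the library on the classpath:

```java
import java.util.HashMap;
import java.util.Map;

class InferredFieldsSketch {
    // Local stand-ins for org.numenta.nupic.algorithms.Classifier / CLAClassifier.
    interface Classifier { }
    static class CLAClassifier implements Classifier { }

    // Mirrors NetworkTestHarness.getInferredFieldsMap(field, classifier):
    // one input-field name mapped to the Classifier class token that should infer it.
    static Map<String, Class<? extends Classifier>> getInferredFieldsMap(
            String field, Class<? extends Classifier> classifier) {
        Map<String, Class<? extends Classifier>> inferredFieldsMap = new HashMap<>();
        inferredFieldsMap.put(field, classifier);
        return inferredFieldsMap;
    }

    public static void main(String[] args) {
        Map<String, Class<? extends Classifier>> m =
                getInferredFieldsMap("consumption", CLAClassifier.class);
        System.out.println(m.size());                 // 1
        System.out.println(m.containsKey("consumption")); // true
    }
}
```

In the patched tests this single call becomes the value for the `KEY.INFERRED_FIELDS` parameter, e.g. `p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class))`.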
diff --git a/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java b/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java index 4dc7855b..b33c3588 100644 --- a/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java +++ b/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java @@ -30,6 +30,7 @@ import static org.numenta.nupic.algorithms.Anomaly.KEY_MODE; import static org.numenta.nupic.algorithms.Anomaly.KEY_USE_MOVING_AVG; import static org.numenta.nupic.algorithms.Anomaly.KEY_WINDOW_SIZE; +import static org.numenta.nupic.network.NetworkTestHarness.*; import java.io.File; import java.math.BigDecimal; @@ -676,6 +677,7 @@ public void testSerializeCLAClassifier() { public void testSerializeLayer() { Parameters p = NetworkTestHarness.getParameters().copy(); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map> settings = NetworkTestHarness.setupMap( null, // map 8, // n @@ -1684,6 +1686,7 @@ private Network getLoadedDayOfWeekStreamHierarchy() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Layer l2 = null; Network network = Network.create("test network", p) @@ -1711,6 +1714,7 @@ private Network getLoadedDayOfWeekNetwork() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Sensor> sensor = Sensor.create( ObservableSensor::create, SensorParams.create(Keys::obs, new Object[] {"name", @@ -1734,6 +1738,7 @@ private Network getLoadedHotGymHierarchy() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getNetworkDemoTestEncoderParams()); 
p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Network network = Network.create("test network", p) .add(Network.createRegion("r1") @@ -1759,6 +1764,7 @@ private Network getLoadedHotGymNetwork() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Sensor> sensor = Sensor.create( ObservableSensor::create, SensorParams.create(Keys::obs, new Object[] {"name", @@ -1782,7 +1788,8 @@ private Network getLoadedHotGymSynchronousNetwork() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); + Network network = Network.create("test network", p).add(Network.createRegion("r1") .add(Network.createLayer("1", p) .alterParameter(KEY.AUTO_CLASSIFY, true) @@ -1797,6 +1804,7 @@ private Network getLoadedHotGymNetwork_FileSensor() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Object[] n = { "some name", ResourceLocator.path("rec-center-hourly.csv") }; HTMSensor sensor = (HTMSensor)Sensor.create( @@ -1824,6 +1832,7 @@ private Network createAndRunTestSpatialPoolerNetwork(int start, int runTo) { Parameters p = NetworkTestHarness.getParameters().copy(); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map> settings = NetworkTestHarness.setupMap( null, // map @@ -1911,6 +1920,7 @@ private Network createAndRunTestTemporalMemoryNetwork() { 
ObservableSensor::create, SensorParams.create(Keys::obs, new Object[] {"name", manual})); Parameters p = getParameters(); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map> settings = NetworkTestHarness.setupMap( null, // map diff --git a/src/test/java/org/numenta/nupic/network/RegionTest.java b/src/test/java/org/numenta/nupic/network/RegionTest.java index 40bc814b..5bad07b2 100644 --- a/src/test/java/org/numenta/nupic/network/RegionTest.java +++ b/src/test/java/org/numenta/nupic/network/RegionTest.java @@ -38,6 +38,7 @@ import org.numenta.nupic.Parameters.KEY; import org.numenta.nupic.algorithms.Anomaly; import org.numenta.nupic.algorithms.Anomaly.Mode; +import org.numenta.nupic.algorithms.CLAClassifier; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.datagen.ResourceLocator; @@ -47,6 +48,7 @@ import org.numenta.nupic.network.sensor.SensorParams; import org.numenta.nupic.network.sensor.SensorParams.Keys; import org.numenta.nupic.util.MersenneTwister; +import static org.numenta.nupic.network.NetworkTestHarness.*; import rx.Observer; import rx.Subscriber; @@ -202,6 +204,7 @@ public void testHalt() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map params = new HashMap<>(); params.put(KEY_MODE, Mode.PURE); @@ -326,7 +329,8 @@ public void testMultiLayerAssemblyNoSensor() { p.set(KEY.MAX_BOOST, 10.0); p.set(KEY.DUTY_CYCLE_PERIOD, 7); p.set(KEY.RANDOM, new MersenneTwister(42)); - + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); + Map params = new HashMap<>(); params.put(KEY_MODE, Mode.PURE); @@ -405,7 +409,7 @@ public void testIsLearn() { Network n = Network.create("test network", p) .add(Network.createRegion("r1") 
.add(Network.createLayer("1", p) - .alterParameter(KEY.AUTO_CLASSIFY, Boolean.TRUE)) + .alterParameter(KEY.AUTO_CLASSIFY, false)) .add(Network.createLayer("2", p) .add(Anomaly.create(params))) .add(Network.createLayer("3", p) @@ -446,6 +450,7 @@ public void test2LayerAssemblyWithSensor() { Parameters p = NetworkTestHarness.getParameters(); p = p.union(NetworkTestHarness.getDayDemoTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Network n = Network.create("test network", p) .add(Network.createRegion("r1") diff --git a/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java b/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java index 5739e6e2..e04e7f24 100644 --- a/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java +++ b/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java @@ -11,6 +11,7 @@ import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; import org.numenta.nupic.algorithms.Anomaly; +import org.numenta.nupic.algorithms.CLAClassifier; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.network.Network; @@ -22,6 +23,7 @@ import org.numenta.nupic.network.sensor.SensorParams; import org.numenta.nupic.network.sensor.SensorParams.Keys; import org.numenta.nupic.util.FastRandom; +import static org.numenta.nupic.network.NetworkTestHarness.*; public class HTMObjectInputOutputTest { @@ -58,6 +60,7 @@ private Network getLoadedHotGymNetwork() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new FastRandom(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); Sensor> sensor = Sensor.create( ObservableSensor::create, SensorParams.create(Keys::obs, new Object[] {"name", From 
e518422de8214f19804ab8a73290cb5aef647f82 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Tue, 7 Mar 2017 09:29:20 -0600 Subject: [PATCH 20/52] Implemented changes requested in review --- src/main/java/org/numenta/nupic/Parameters.java | 10 ++++++++-- src/main/java/org/numenta/nupic/network/Layer.java | 13 ++++++++++--- src/main/java/org/numenta/nupic/network/Region.java | 3 --- .../java/org/numenta/nupic/network/LayerTest.java | 9 ++++++--- .../java/org/numenta/nupic/network/RegionTest.java | 5 +++-- 5 files changed, 27 insertions(+), 13 deletions(-) diff --git a/src/main/java/org/numenta/nupic/Parameters.java b/src/main/java/org/numenta/nupic/Parameters.java index 21b642e5..9e54b285 100644 --- a/src/main/java/org/numenta/nupic/Parameters.java +++ b/src/main/java/org/numenta/nupic/Parameters.java @@ -23,9 +23,15 @@ package org.numenta.nupic; import java.io.IOException; -import java.util.*; +import java.util.Collections; +import java.util.HashMap; +import java.util.Map; +import java.util.List; +import java.util.Random; +import java.util.Set; +import java.util.EnumMap; +import java.util.Arrays; -import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.algorithms.SpatialPooler; import org.numenta.nupic.algorithms.TemporalMemory; import org.numenta.nupic.model.Cell; diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index dfb18926..e88bcc58 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -36,7 +36,13 @@ import org.numenta.nupic.FieldMetaType; import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; -import org.numenta.nupic.algorithms.*; +import org.numenta.nupic.algorithms.Classification; +import org.numenta.nupic.algorithms.TemporalMemory; +import org.numenta.nupic.algorithms.SpatialPooler; +import org.numenta.nupic.algorithms.Anomaly; +import org.numenta.nupic.algorithms.Classifier; 
+import org.numenta.nupic.algorithms.SDRClassifier; +import org.numenta.nupic.algorithms.CLAClassifier; import org.numenta.nupic.encoders.DateEncoder; import org.numenta.nupic.encoders.Encoder; import org.numenta.nupic.encoders.EncoderTuple; @@ -1918,7 +1924,8 @@ NamedTuple makeClassifiers(MultiEncoder encoder) { throw new IllegalStateException( "KEY.AUTO_CLASSIFY has been set to \"true\", but KEY.INFERRED_FIELDS is null or\n\t" + "empty. Must specify desired Classifier for at least one input field in\n\t" + - "KEY.INFERRED_FIELDS or set KEY.AUTO_CLASSIFY to \"false\"." + "KEY.INFERRED_FIELDS or set KEY.AUTO_CLASSIFY to \"false\" (which is its default\n\t" + + "value in Parameters)." ); } String[] names = new String[encoder.getEncoders(encoder).size()]; @@ -2356,7 +2363,7 @@ public ManualInput call(ManualInput t1) { Classifier c = (Classifier)t1.getClassifiers().get(key); - // c will be null if no classifier was specifying for this field in KEY.INFERRED_FIELDS map + // c will be null if no classifier was specified for this field in KEY.INFERRED_FIELDS map if(c != null) { Classification result = c.compute(recordNum, inputMap, t1.getSDR(), isLearn, true); t1.recordNum(recordNum).storeClassification((String)inputs.get("name"), result); diff --git a/src/main/java/org/numenta/nupic/network/Region.java b/src/main/java/org/numenta/nupic/network/Region.java index 797e19b1..f01d2246 100644 --- a/src/main/java/org/numenta/nupic/network/Region.java +++ b/src/main/java/org/numenta/nupic/network/Region.java @@ -457,9 +457,6 @@ Region connect(Region inputRegion) { @Override public void onError(Throwable e) { e.printStackTrace(); } @SuppressWarnings("unchecked") @Override public void onNext(Inference i) { - // TODO: 21: This is where classifierInput is set. Need to change - // it to respect only fields user has specified for classification - // with the new Parameters KEY. 
localInf.sdr(i.getSDR()).recordNum(i.getRecordNum()).classifierInput(i.getClassifierInput()).layerInput(i.getSDR()); if(i.getSDR().length > 0) { ((Layer)tail).compute(localInf); diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index 02d79fb3..9ec91703 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -311,7 +311,8 @@ public void testHalt() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - p.set(KEY.AUTO_CLASSIFY, false); + p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); HTMSensor htmSensor = (HTMSensor)sensor; @@ -354,7 +355,8 @@ public void testReset() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - p.set(KEY.AUTO_CLASSIFY, false); + p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); HTMSensor htmSensor = (HTMSensor)sensor; @@ -394,7 +396,8 @@ public void testSequenceChangeReset() { Parameters p = NetworkTestHarness.getParameters().copy(); p = p.union(NetworkTestHarness.getHotGymTestEncoderParams()); p.set(KEY.RANDOM, new MersenneTwister(42)); - p.set(KEY.AUTO_CLASSIFY, false); + p.set(KEY.AUTO_CLASSIFY, Boolean.TRUE); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("consumption", CLAClassifier.class)); HTMSensor htmSensor = (HTMSensor)sensor; diff --git a/src/test/java/org/numenta/nupic/network/RegionTest.java b/src/test/java/org/numenta/nupic/network/RegionTest.java index 5bad07b2..5804737c 100644 --- a/src/test/java/org/numenta/nupic/network/RegionTest.java +++ 
b/src/test/java/org/numenta/nupic/network/RegionTest.java @@ -48,7 +48,7 @@ import org.numenta.nupic.network.sensor.SensorParams; import org.numenta.nupic.network.sensor.SensorParams.Keys; import org.numenta.nupic.util.MersenneTwister; -import static org.numenta.nupic.network.NetworkTestHarness.*; +import static org.numenta.nupic.network.NetworkTestHarness.getInferredFieldsMap; import rx.Observer; import rx.Subscriber; @@ -403,13 +403,14 @@ public void testIsLearn() { p.set(KEY.MAX_BOOST, 10.0); p.set(KEY.DUTY_CYCLE_PERIOD, 7); p.set(KEY.RANDOM, new MersenneTwister(42)); + p.set(KEY.INFERRED_FIELDS, getInferredFieldsMap("dayOfWeek", CLAClassifier.class)); Map params = new HashMap<>(); params.put(KEY_MODE, Mode.PURE); Network n = Network.create("test network", p) .add(Network.createRegion("r1") .add(Network.createLayer("1", p) - .alterParameter(KEY.AUTO_CLASSIFY, false)) + .alterParameter(KEY.AUTO_CLASSIFY, Boolean.TRUE)) .add(Network.createLayer("2", p) .add(Anomaly.create(params))) .add(Network.createLayer("3", p) From 4cb2fafabb5878441fd96da9da4b1c486dd05c67 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Tue, 7 Mar 2017 09:34:54 -0600 Subject: [PATCH 21/52] Converted wildcard import to explicit --- src/test/java/org/numenta/nupic/network/LayerTest.java | 9 ++++++--- 1 file changed, 6 insertions(+), 3 deletions(-) diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index 9ec91703..72deddbb 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -37,16 +37,19 @@ import java.util.List; import java.util.Map; import java.util.stream.Stream; - import org.junit.Test; import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; -import org.numenta.nupic.algorithms.*; +import org.numenta.nupic.algorithms.Anomaly; +import org.numenta.nupic.algorithms.SpatialPooler; +import 
org.numenta.nupic.algorithms.TemporalMemory; +import org.numenta.nupic.algorithms.CLAClassifier; +import org.numenta.nupic.algorithms.SDRClassifier; +import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.algorithms.Anomaly.Mode; import org.numenta.nupic.datagen.ResourceLocator; import org.numenta.nupic.encoders.MultiEncoder; import org.numenta.nupic.encoders.RandomDistributedScalarEncoder; -import org.numenta.nupic.encoders.ScalarEncoder; import org.numenta.nupic.model.Connections; import org.numenta.nupic.model.SDR; import org.numenta.nupic.network.Layer.FunctionFactory; From a58515a1637f788d23fda1bc9bcbb0c07c7f4039 Mon Sep 17 00:00:00 2001 From: David Ray Date: Tue, 7 Mar 2017 10:17:42 -0600 Subject: [PATCH 22/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 96ec64fe..08ee1824 100644 --- a/README.md +++ b/README.md @@ -91,7 +91,7 @@ The primary goal of this library development is to provide a Java version of NuP By working closely with Numenta and receiving their enthusiastic support and guidance, it is intended that this library be maintained as a viable Java language alternative to Numenta's C++ and Python offerings. 
However it must be understood that "official" support is (for the time being) currently limited to community resources such as the maintainers of this library and Numenta Forums / Message Lists and IRC: - * [NuPIC Community](http://numenta.org/index.html#community) + * [NuPIC Community](http://numenta.org/) * [New HTM Forum](http://discourse.numenta.org) *** From f44cc5a0566bc9365090b5a22b217a88579cfcf1 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Tue, 7 Mar 2017 10:55:13 -0600 Subject: [PATCH 23/52] Relocated static imports to top of list --- src/test/java/org/numenta/nupic/network/LayerTest.java | 3 ++- src/test/java/org/numenta/nupic/network/RegionTest.java | 2 +- .../org/numenta/nupic/serialize/HTMObjectInputOutputTest.java | 3 ++- 3 files changed, 5 insertions(+), 3 deletions(-) diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index 72deddbb..a8278983 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -21,6 +21,8 @@ */ package org.numenta.nupic.network; +import static org.numenta.nupic.network.NetworkTestHarness.getInferredFieldsMap; + import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertFalse; import static org.junit.Assert.assertNotNull; @@ -63,7 +65,6 @@ import org.numenta.nupic.util.MersenneTwister; import org.numenta.nupic.util.NamedTuple; import org.numenta.nupic.util.UniversalRandom; -import static org.numenta.nupic.network.NetworkTestHarness.*; import rx.Observable; import rx.Observer; diff --git a/src/test/java/org/numenta/nupic/network/RegionTest.java b/src/test/java/org/numenta/nupic/network/RegionTest.java index 5804737c..3e514c89 100644 --- a/src/test/java/org/numenta/nupic/network/RegionTest.java +++ b/src/test/java/org/numenta/nupic/network/RegionTest.java @@ -21,6 +21,7 @@ */ package org.numenta.nupic.network; +import static 
org.numenta.nupic.network.NetworkTestHarness.getInferredFieldsMap; import static org.junit.Assert.assertEquals; import static org.junit.Assert.assertFalse; @@ -48,7 +49,6 @@ import org.numenta.nupic.network.sensor.SensorParams; import org.numenta.nupic.network.sensor.SensorParams.Keys; import org.numenta.nupic.util.MersenneTwister; -import static org.numenta.nupic.network.NetworkTestHarness.getInferredFieldsMap; import rx.Observer; import rx.Subscriber; diff --git a/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java b/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java index e04e7f24..386a107e 100644 --- a/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java +++ b/src/test/java/org/numenta/nupic/serialize/HTMObjectInputOutputTest.java @@ -1,5 +1,7 @@ package org.numenta.nupic.serialize; +import static org.numenta.nupic.network.NetworkTestHarness.getInferredFieldsMap; + import static org.junit.Assert.assertNotNull; import static org.junit.Assert.assertTrue; import static org.junit.Assert.fail; @@ -23,7 +25,6 @@ import org.numenta.nupic.network.sensor.SensorParams; import org.numenta.nupic.network.sensor.SensorParams.Keys; import org.numenta.nupic.util.FastRandom; -import static org.numenta.nupic.network.NetworkTestHarness.*; public class HTMObjectInputOutputTest { From a3c62b76e8d2b87da90a69f9ebd3eeb643669cee Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Wed, 8 Mar 2017 09:24:14 -0600 Subject: [PATCH 24/52] Implemented changes requested in review --- src/main/java/org/numenta/nupic/network/Layer.java | 3 ++- src/test/java/org/numenta/nupic/network/LayerTest.java | 2 +- .../java/org/numenta/nupic/network/NetworkTestHarness.java | 1 - 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index e88bcc58..56a26deb 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ 
b/src/main/java/org/numenta/nupic/network/Layer.java @@ -1918,8 +1918,9 @@ private void clearSubscriberObserverLists() { * @param encoder * @return */ + @SuppressWarnings("unchecked") NamedTuple makeClassifiers(MultiEncoder encoder) { - Map inferredFields = (Map>) params.get(KEY.INFERRED_FIELDS); + Map> inferredFields = (Map>) params.get(KEY.INFERRED_FIELDS); if(inferredFields == null || inferredFields.entrySet().size() == 0) { throw new IllegalStateException( "KEY.AUTO_CLASSIFY has been set to \"true\", but KEY.INFERRED_FIELDS is null or\n\t" + diff --git a/src/test/java/org/numenta/nupic/network/LayerTest.java b/src/test/java/org/numenta/nupic/network/LayerTest.java index a8278983..92118baa 100644 --- a/src/test/java/org/numenta/nupic/network/LayerTest.java +++ b/src/test/java/org/numenta/nupic/network/LayerTest.java @@ -1851,7 +1851,7 @@ public void TestMakeClassifiersWithNoInferredFieldsKey() { // Make sure the makeClassifiers() method throws exception due to // absence of KEY.INFERRED_FIELDS in the Parameters object try { - NamedTuple nt = l.makeClassifiers(l.getEncoder()); + l.makeClassifiers(l.getEncoder()); } catch (IllegalStateException e) { assertTrue(e.getMessage().contains("KEY.INFERRED_FIELDS")); assertTrue(e.getMessage().contains("null")); diff --git a/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java b/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java index fb3e833a..c350e413 100644 --- a/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java +++ b/src/test/java/org/numenta/nupic/network/NetworkTestHarness.java @@ -26,7 +26,6 @@ import org.numenta.nupic.Parameters; import org.numenta.nupic.Parameters.KEY; -import org.numenta.nupic.algorithms.CLAClassifier; import org.numenta.nupic.algorithms.Classifier; import org.numenta.nupic.encoders.Encoder; import org.numenta.nupic.util.Tuple; From 77889c9eb9fb989bd5d400b3e089996815180c1b Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sat, 11 Mar 2017 13:41:44 -0600 
Subject: [PATCH 25/52] Changed use of "==" operator to Class.isAssignableFrom method --- .../java/org/numenta/nupic/network/Layer.java | 15 +++++++++------ 1 file changed, 9 insertions(+), 6 deletions(-) diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index 56a26deb..13e47b60 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -1934,21 +1934,24 @@ NamedTuple makeClassifiers(MultiEncoder encoder) { int i = 0; for(EncoderTuple et : encoder.getEncoders(encoder)) { names[i] = et.getName(); - Object fieldClassifier = inferredFields.get(et.getName()); - if(fieldClassifier == CLAClassifier.class) { + Class fieldClassifier = inferredFields.get(et.getName()); + if(fieldClassifier == null) { + LOGGER.info("Not classifying \"" + et.getName() + "\" input field"); + } + else if(CLAClassifier.class.isAssignableFrom(fieldClassifier)) { LOGGER.info("Classifying \"" + et.getName() + "\" input field with CLAClassifier"); ca[i] = new CLAClassifier(); - } else if(fieldClassifier == SDRClassifier.class) { + } + else if(SDRClassifier.class.isAssignableFrom(fieldClassifier)) { LOGGER.info("Classifying \"" + et.getName() + "\" input field with SDRClassifier"); ca[i] = new SDRClassifier(); - } else if(fieldClassifier != null) { + } + else { throw new IllegalStateException( "Invalid Classifier class token, \"" + fieldClassifier + "\",\n\t" + "specified for, \"" + et.getName() + "\", input field.\n\t" + "Valid class tokens are CLAClassifier.class and SDRClassifier.class" ); - } else { // fieldClassifier is null - LOGGER.info("Not classifying \"" + et.getName() + "\" input field"); } i++; } From 93cb8d20fa087e6c2d3d1acdc2d3164b1b2d0571 Mon Sep 17 00:00:00 2001 From: Andrew Dillon Date: Sun, 26 Mar 2017 11:16:54 -0500 Subject: [PATCH 26/52] Added unit test for Synapse object equals() method --- .../org/numenta/nupic/model/SynapseTest.java | 67 
+++++++++++++++++++
 1 file changed, 67 insertions(+)
 create mode 100644 src/test/java/org/numenta/nupic/model/SynapseTest.java

diff --git a/src/test/java/org/numenta/nupic/model/SynapseTest.java b/src/test/java/org/numenta/nupic/model/SynapseTest.java
new file mode 100644
index 00000000..cc9a0b26
--- /dev/null
+++ b/src/test/java/org/numenta/nupic/model/SynapseTest.java
@@ -0,0 +1,67 @@
+package org.numenta.nupic.model;
+
+import org.junit.Test;
+import static org.junit.Assert.*;
+
+public class SynapseTest {
+    @Test
+    public void testSynapseEquality() {
+        // Make stuff we need to perform the tests
+        Column column = new Column(1, 0);
+        Cell cell1 = new Cell(column, 0);
+        Cell cell2 = new Cell(column, 1);
+        DistalDendrite segment1 = new DistalDendrite(cell1, 0, 0, 0);
+        DistalDendrite segment2 = new DistalDendrite(cell1, 1, 1, 1);
+
+        // These are the Synapse objects we will use for the tests
+        Synapse synapse1 = new Synapse();
+        Synapse synapse2 = new Synapse();
+
+        /* ----- These are the equality tests: ----- */
+        // synapse1 should equal itself
+        assertTrue(synapse1.equals(synapse1));
+
+        // synapse1 should not equal null
+        assertFalse(synapse1.equals(null));
+
+        // synapse1 should not equal a non-Synapse object
+        assertFalse(synapse1.equals("This is not a Synapse object"));
+
+        // synapse1 should not equal synapse2 because synapse2's
+        // inputIndex != synapse1's inputIndex
+        synapse1.setPresynapticCell(cell1);
+        assertFalse(synapse1.equals(synapse2));
+
+        // synapse1 should not equal synapse2 because synapse1's
+        // segment is null, but synapse2's segment is not null
+        synapse2 = new Synapse(cell1, segment1, 0, 0);
+        assertFalse(synapse1.equals(synapse2));
+
+        // synapse1 should not equal synapse2 because synapse1's
+        // segment != synapse2's segment
+        synapse1 = new Synapse(cell1, segment2, 0, 0);
+        assertFalse(synapse1.equals(synapse2));
+
+        // synapse1 should not equal synapse2 because synapse1's
+        // sourceCell is null, but synapse2's sourceCell is not null
+        synapse1.setPresynapticCell(null);
+        assertFalse(synapse1.equals(synapse2));
+
+        // synapse1 should not equal synapse2 because synapse1's
+        // sourceCell != synapse2's sourceCell
+        synapse1.setPresynapticCell(cell2);
+        assertFalse(synapse1.equals(synapse2));
+
+        // synapse1 should not equal synapse 2 because synapse1's
+        // synapseIndex != synapse2's synapseIndex
+        synapse1 = new Synapse(cell1, segment1, 0, 0);
+        synapse2 = new Synapse(cell1, segment1, 1, 0);
+        assertFalse(synapse1.equals(synapse2));
+
+        // synapse1 should equal synapse2 because all of their
+        // relevant properties are equal
+        synapse1 = new Synapse(cell1, segment1, 0, 0);
+        synapse2 = new Synapse(cell1, segment1, 0, 0);
+        assertTrue(synapse1.equals(synapse2));
+    }
+}

From ed46d0e71f83df2f5a26e17c8696708ccb9aca44 Mon Sep 17 00:00:00 2001
From: Andrew Dillon
Date: Sun, 26 Mar 2017 11:23:20 -0500
Subject: [PATCH 27/52] Added check for equality of Synapse object's permanence values to their equals() method

---
 src/main/java/org/numenta/nupic/model/Synapse.java     | 2 ++
 src/test/java/org/numenta/nupic/model/SynapseTest.java | 8 +++++++-
 2 files changed, 9 insertions(+), 1 deletion(-)

diff --git a/src/main/java/org/numenta/nupic/model/Synapse.java b/src/main/java/org/numenta/nupic/model/Synapse.java
index 99a5cfbf..811517ca 100644
--- a/src/main/java/org/numenta/nupic/model/Synapse.java
+++ b/src/main/java/org/numenta/nupic/model/Synapse.java
@@ -239,6 +239,8 @@ public boolean equals(Object obj) {
             return false;
         if(synapseIndex != other.synapseIndex)
             return false;
+        if(permanence != other.permanence)
+            return false;
         return true;
     }
 }

diff --git a/src/test/java/org/numenta/nupic/model/SynapseTest.java b/src/test/java/org/numenta/nupic/model/SynapseTest.java
index cc9a0b26..36b724cb 100644
--- a/src/test/java/org/numenta/nupic/model/SynapseTest.java
+++ b/src/test/java/org/numenta/nupic/model/SynapseTest.java
@@ -52,12 +52,18 @@ public void testSynapseEquality() {
         synapse1.setPresynapticCell(cell2);
assertFalse(synapse1.equals(synapse2)); - // synapse1 should not equal synapse 2 because synapse1's + // synapse1 should not equal synapse2 because synapse1's // synapseIndex != synapse2's synapseIndex synapse1 = new Synapse(cell1, segment1, 0, 0); synapse2 = new Synapse(cell1, segment1, 1, 0); assertFalse(synapse1.equals(synapse2)); + // synapse1 should not equal synapse2 because synapse1's + // permanence != synapse2's permanence + synapse1 = new Synapse(cell1, segment1, 0, 0); + synapse2 = new Synapse(cell1, segment1, 0, 1); + assertFalse(synapse1.equals(synapse2)); + // synapse1 should equal synapse2 because all of their // relevant properties are equal synapse1 = new Synapse(cell1, segment1, 0, 0); From 280cbd71575591f6d8ef8ee0588416277558cbc8 Mon Sep 17 00:00:00 2001 From: cogmission Date: Wed, 5 Apr 2017 02:29:39 -0500 Subject: [PATCH 28/52] Update build files for new release --- build.gradle | 4 ++-- pom.xml | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/build.gradle b/build.gradle index 0072fb6d..42faedf1 100644 --- a/build.gradle +++ b/build.gradle @@ -4,7 +4,7 @@ apply plugin: 'eclipse' apply plugin: 'signing' group = 'org.numenta' -version = '0.6.11' +version = '0.6.12' archivesBaseName = 'htm.java' sourceCompatibility = 1.8 @@ -12,7 +12,7 @@ targetCompatibility = 1.8 jar { manifest { - attributes 'Implementation-Title': 'htm.java', 'Implementation-Version': '0.6.11' + attributes 'Implementation-Title': 'htm.java', 'Implementation-Version': '0.6.12' } } diff --git a/pom.xml b/pom.xml index cfc74f59..712c95a1 100644 --- a/pom.xml +++ b/pom.xml @@ -4,7 +4,7 @@ org.numenta htm.java - 0.6.11 + 0.6.12 htm.java The Java version of Numenta's HTM technology From cd053694f5a34a013e1a1ea9b644df0aeb2f9692 Mon Sep 17 00:00:00 2001 From: David Ray Date: Wed, 5 Apr 2017 03:51:16 -0500 Subject: [PATCH 29/52] Update README.md --- README.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 
08ee1824..6b072e48 100644 --- a/README.md +++ b/README.md @@ -26,6 +26,7 @@ _**NOTE: Minimum JavaSE version is 8**_
## Recent News Items... +* New Feature Release! [v0.6.12-alpha](https://github.com/numenta/htm.java/releases/tag/v0.6.12-alpha) Network API Allows Multi-field inference! (04/04/2017) * HTM.Java Release v0.6.11-alpha to tag sync state with NuPIC (10/16/2016) * [HTM.Java Receives new TemporalMemory](https://discourse.numenta.org/t/htm-java-now-in-sync-with-nupic/1510) - HTM.Java now fully in sync!! (10/13/2016) * [HTM.Java Receives new SpatialPooler](https://github.com/numenta/htm.java/pull/486) - Fully Updated! (10/06/2016) @@ -161,7 +162,7 @@ Maven: org.numenta htm.java - 0.6.11 + 0.6.12 ``` @@ -171,7 +172,7 @@ How to get the latest SNAPSHOT build: (None yet for newest build...) org.numenta htm.java - 0.6.12-SNAPSHOT + 0.6.13-SNAPSHOT You also may need to include a repositories entry: From e1b7e1f043ca6da9eefea23cc03c716c7926d535 Mon Sep 17 00:00:00 2001 From: David Ray Date: Wed, 5 Apr 2017 04:31:17 -0500 Subject: [PATCH 30/52] Update CHANGELOG.md --- CHANGELOG.md | 21 ++++++++++++++++++++- 1 file changed, 20 insertions(+), 1 deletion(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index ef77bf99..63648f60 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -13,7 +13,26 @@ This project adheres to [Semantic Versioning](http://semver.org/). 
*** -## Unreleased [0.6.10-SNAPSHOT] +## Unreleased [0.6.13-SNAPSHOT] +#### Removed +#### Added +#### Changed +#### Fixed + +*** + +## Unreleased [0.6.12] +#### Removed +#### Added +* [[PR #511](https://github.com/numenta/htm.java/pull/511)] SDRClassifier Network API Integration +* [[PR #511](https://github.com/numenta/htm.java/pull/511)] Added new Classifier.java interface +* [[PR #511](https://github.com/numenta/htm.java/pull/511)] Added Tests for new integration +#### Changed +#### Fixed + +*** + +## Unreleased [0.6.10] #### Removed #### Added #### Changed From 523b9c4b8d407570fb4592e847e6a56e3f32de90 Mon Sep 17 00:00:00 2001 From: David Ray Date: Thu, 6 Apr 2017 02:00:23 -0500 Subject: [PATCH 31/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 6b072e48..8b0bad96 100644 --- a/README.md +++ b/README.md @@ -44,7 +44,7 @@ _**NOTE: Minimum JavaSE version is 8**_ * See a glimpse of htm.java's history and read about significant events in its development. -### View the [Change Log](https://github.com/numenta/htm.java/blob/master/CHANGELOG.md) (Updated! 2016-10-13) +### View the [Change Log](https://github.com/numenta/htm.java/blob/master/CHANGELOG.md) (Updated! 2017-04-05) * Change log itemizes the release history. 
* Contains an **"Unreleased" section** which lists changes coming in the next release as they're being worked on - (should help users keep in touch with the current evolution of htm.java) From fb80bbe47fc55e744d6e1bb6c221c759e9704e24 Mon Sep 17 00:00:00 2001 From: David Ray Date: Thu, 6 Apr 2017 02:02:39 -0500 Subject: [PATCH 32/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 8b0bad96..5acb9c5a 100644 --- a/README.md +++ b/README.md @@ -30,7 +30,7 @@ _**NOTE: Minimum JavaSE version is 8**_ * HTM.Java Release v0.6.11-alpha to tag sync state with NuPIC (10/16/2016) * [HTM.Java Receives new TemporalMemory](https://discourse.numenta.org/t/htm-java-now-in-sync-with-nupic/1510) - HTM.Java now fully in sync!! (10/13/2016) * [HTM.Java Receives new SpatialPooler](https://github.com/numenta/htm.java/pull/486) - Fully Updated! (10/06/2016) -* HTM.Java Reaches 100% NuPIC Compatibility and operation within NAB will be offered (soon)! (09/29/2016) +* HTM.Java Reaches 100% NuPIC Compatibility and operation within NAB will be offered (soon)! (09/29/2016) * [HTM.java Receives New SDRClassifier!](https://github.com/numenta/htm.java/blob/master/src/main/java/org/numenta/nupic/algorithms/SDRClassifier.java) (07/26/2016) * [HTM.java Status Report](https://discourse.numenta.org/t/htm-java-status-report/645) (05/29/2016) * [HTM.java Examples Repo Updated!](https://github.com/numenta/htm.java-examples) Includes use of the [New Cortical.io API!](https://github.com/cortical-io/retina-api-java-sdk) (05/18/2016) From b8857db37a83e85b4455ba58ce89f9118f2954d6 Mon Sep 17 00:00:00 2001 From: David Ray Date: Thu, 6 Apr 2017 13:31:55 -0500 Subject: [PATCH 33/52] Update README.md --- README.md | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/README.md b/README.md index 5acb9c5a..4cd75bb3 100644 --- a/README.md +++ b/README.md @@ -3,8 +3,7 @@
-[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai) [![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://badges.gitter.im/Join -Chat.svg)](https://gitter.im/numenta/htm.java?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) [![OpenHub](https://www.openhub.net/p/htm-java/widgets/project_thin_badge.gif)](https://www.openhub.net/p/htm-java) +[![htm.java awesomeness](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](http://cogmission.ai) [![AGI Probability](https://img.shields.io/badge/AGI%20Probability-97%25-blue.svg)](http://numenta.com/#hero) [![Coolness Factor](https://img.shields.io/badge/Coolness%20Factor-100%25-blue.svg)](https://github.com/numenta/htm.java-examples) [![Build Status](https://travis-ci.org/numenta/htm.java.png?branch=master)](https://travis-ci.org/numenta/htm.java) [![Coverage Status](https://coveralls.io/repos/numenta/htm.java/badge.svg?branch=master&service=github)](https://coveralls.io/github/numenta/htm.java?branch=master) [![Maven 
Central](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java/badge.svg)](https://maven-badges.herokuapp.com/maven-central/org.numenta/htm.java) [![][license img]][license] [![docs-badge][]][docs] [![Gitter](https://img.shields.io/badge/gitter-join_chat-green.svg?style=flat)](https://gitter.im/numenta/htm.java?utm_source=badge) [![OpenHub](https://www.openhub.net/p/htm-java/widgets/project_thin_badge.gif)](https://www.openhub.net/p/htm-java)
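[Editor's note] The `makeClassifiers()` change in PATCH 25 above replaces raw `==` comparisons of classifier `Class` tokens with `Class.isAssignableFrom`, so subclasses of the token types are also accepted. The following is a minimal, self-contained sketch of that dispatch pattern; the classifier classes here are simplified stand-ins, not the actual htm.java types.

```java
// Simplified stand-ins for the classifier types (NOT the htm.java classes).
class Classifier {}
class CLAClassifier extends Classifier {}
class SDRClassifier extends Classifier {}

public class TokenDispatch {
    // Mirrors the PATCH 25 logic: dispatch on a Class token using
    // isAssignableFrom, which also matches subclasses of the token type,
    // unlike a raw "==" comparison against the Class object itself.
    static Classifier forToken(Class<?> token) {
        if (token == null) {
            return null; // field is not classified
        } else if (CLAClassifier.class.isAssignableFrom(token)) {
            return new CLAClassifier();
        } else if (SDRClassifier.class.isAssignableFrom(token)) {
            return new SDRClassifier();
        }
        throw new IllegalStateException("Invalid classifier token: " + token);
    }

    public static void main(String[] args) {
        // prints CLAClassifier
        System.out.println(forToken(CLAClassifier.class).getClass().getSimpleName());
    }
}
```

Note the ordering: the null check must come first, since `isAssignableFrom` throws a `NullPointerException` on a null argument.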
From 3234f6ea6bfe49d753a3c9c2a48bdc67e515da29 Mon Sep 17 00:00:00 2001 From: David Ray Date: Thu, 6 Apr 2017 13:35:03 -0500 Subject: [PATCH 34/52] Update README.md --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index 4cd75bb3..bf05c6d8 100644 --- a/README.md +++ b/README.md @@ -25,6 +25,7 @@ _**NOTE: Minimum JavaSE version is 8**_
## Recent News Items... +* Updated HTM.Java Examples! [Now in sync with latest release (v0.6.12-alpha)](https://github.com/numenta/htm.java-examples) (See the executable Jars!) (04/06/2017) * New Feature Release! [v0.6.12-alpha](https://github.com/numenta/htm.java/releases/tag/v0.6.12-alpha) Network API Allows Multi-field inference! (04/04/2017) * HTM.Java Release v0.6.11-alpha to tag sync state with NuPIC (10/16/2016) * [HTM.Java Receives new TemporalMemory](https://discourse.numenta.org/t/htm-java-now-in-sync-with-nupic/1510) - HTM.Java now fully in sync!! (10/13/2016) From df0d653eac2553bae780ec045bacc1e8a50799ee Mon Sep 17 00:00:00 2001 From: David Ray Date: Mon, 17 Apr 2017 01:55:30 -0500 Subject: [PATCH 35/52] Update Gradle Example --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index bf05c6d8..bc75a3f6 100644 --- a/README.md +++ b/README.md @@ -152,7 +152,7 @@ gradle -Pskipbench check # Executes the tests w/o running the benchmarks Gradle: dependencies { - compile group: 'org.numenta', name: 'htm.java', version:'0.6.11' + compile group: 'org.numenta', name: 'htm.java', version:'0.6.12' } ``` From b4f2b466cea5562ad51180a9936db2b257d11dcc Mon Sep 17 00:00:00 2001 From: cogmission Date: Fri, 12 May 2017 08:09:58 -0500 Subject: [PATCH 36/52] Fix for mismatch between ScalarEncoder signature type of Double and the decode result of Booleans in FieldMetaType --- src/main/java/org/numenta/nupic/FieldMetaType.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/main/java/org/numenta/nupic/FieldMetaType.java b/src/main/java/org/numenta/nupic/FieldMetaType.java index 3bdf1708..b6b6f3ff 100644 --- a/src/main/java/org/numenta/nupic/FieldMetaType.java +++ b/src/main/java/org/numenta/nupic/FieldMetaType.java @@ -96,7 +96,7 @@ public T decodeType(String input, Encoder enc) { case LIST : case STRING : return (T)input; case DATETIME : return (T)((DateEncoder)enc).parse(input); - case BOOLEAN : return 
(T)(Boolean.valueOf(input) == true ? new Integer(1) : new Integer(0)); + case BOOLEAN : return (T)(Boolean.valueOf(input) == true ? new Double(1) : new Double(0)); case COORD : case GEO : { String[] parts = input.split("[\\s]*\\;[\\s]*"); From e97e01d23646182f8e9afd2fbcb3e1cad8b613fb Mon Sep 17 00:00:00 2001 From: cogmission Date: Fri, 12 May 2017 08:52:18 -0500 Subject: [PATCH 37/52] Added a new Test class and test to test the FieldMetaType decoding --- .../org/numenta/nupic/FieldMetaTypeTest.java | 73 +++++++++++++++++++ 1 file changed, 73 insertions(+) create mode 100644 src/test/java/org/numenta/nupic/FieldMetaTypeTest.java diff --git a/src/test/java/org/numenta/nupic/FieldMetaTypeTest.java b/src/test/java/org/numenta/nupic/FieldMetaTypeTest.java new file mode 100644 index 00000000..03df2a22 --- /dev/null +++ b/src/test/java/org/numenta/nupic/FieldMetaTypeTest.java @@ -0,0 +1,73 @@ +package org.numenta.nupic; + +import static org.junit.Assert.assertEquals; +import static org.junit.Assert.assertTrue; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; + +import org.joda.time.DateTime; +import org.junit.Test; +import org.numenta.nupic.encoders.DateEncoder; +import org.numenta.nupic.util.Tuple; + + +public class FieldMetaTypeTest { + + /** + * Test decoding of 9 out of 10 types of {@link FieldMetaType}s. 
+ */ + @Test + public void testDecodeType() { + List linput = new ArrayList<>(); + linput.add("test"); + String loutput = FieldMetaType.LIST.decodeType(linput.toString(), null); + assertEquals(linput.toString(), loutput); + + String catinput = "TestString"; + String output = FieldMetaType.STRING.decodeType(catinput, null); + assertEquals(catinput, output); + + String binput = "true"; + Double boutput = FieldMetaType.BOOLEAN.decodeType(binput, null); + assertEquals(new Double(1), boutput); + + String ginput = "100;200;5"; + Tuple texpected = new Tuple(100.0, 200.0, 5.0); + Tuple tupleOut = FieldMetaType.GEO.decodeType(ginput, null); + assertEquals(texpected, tupleOut); + + String intinput = "1337"; + double intoutput = FieldMetaType.INTEGER.decodeType(intinput, null); + assertEquals(1337.0d, intoutput, 0); + + intoutput = FieldMetaType.FLOAT.decodeType(intinput, null); + assertEquals(1337.0d, intoutput, 0); + + // DARR = Dense Array + int[] dainput = { 1, 0, 1, 0 }; + int[] daoutput = FieldMetaType.DARR.decodeType(Arrays.toString(dainput), null); + assertTrue(Arrays.equals(dainput, daoutput)); + + // SARR = Sparse Array + int[] sainput = { 0, 2 }; + int[] saoutput = FieldMetaType.SARR.decodeType(Arrays.toString(sainput), null); + assertTrue(Arrays.equals(sainput, saoutput)); + + DateTime comparison = new DateTime(2010, 11, 4, 13, 55, 01); + String compareString = "2010-11-04 13:55:01"; + // 3 bits for season, 1 bit for day of week, 3 for weekend, 5 for time of day + // use of forced is not recommended, used here for readability. 
+ DateEncoder.Builder builder = DateEncoder.builder(); + builder.formatPattern("yyyy-MM-dd HH:mm:ss"); + + DateEncoder de = builder.season(3) + .dayOfWeek(1) + .weekend(3) + .timeOfDay(5).build(); + + DateTime dateOutput = FieldMetaType.DATETIME.decodeType(compareString, de); + assertEquals(comparison, dateOutput); + } +} From a712eef980685a9f6aef191ba42dde6338e995a0 Mon Sep 17 00:00:00 2001 From: cogmission Date: Fri, 12 May 2017 09:28:15 -0500 Subject: [PATCH 38/52] Update build files for new release (v0.6.13-alpha) --- build.gradle | 4 ++-- pom.xml | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/build.gradle b/build.gradle index 42faedf1..05c16194 100644 --- a/build.gradle +++ b/build.gradle @@ -4,7 +4,7 @@ apply plugin: 'eclipse' apply plugin: 'signing' group = 'org.numenta' -version = '0.6.12' +version = '0.6.13' archivesBaseName = 'htm.java' sourceCompatibility = 1.8 @@ -12,7 +12,7 @@ targetCompatibility = 1.8 jar { manifest { - attributes 'Implementation-Title': 'htm.java', 'Implementation-Version': '0.6.12' + attributes 'Implementation-Title': 'htm.java', 'Implementation-Version': '0.6.13' } } diff --git a/pom.xml b/pom.xml index 712c95a1..f72cef9e 100644 --- a/pom.xml +++ b/pom.xml @@ -4,7 +4,7 @@ org.numenta htm.java - 0.6.12 + 0.6.13 htm.java The Java version of Numenta's HTM technology From 9f3b3240f08139254fb1a678f3b77dfdd2878797 Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 12 May 2017 09:58:11 -0500 Subject: [PATCH 39/52] Update CHANGELOG.md --- CHANGELOG.md | 16 +++++++++++++--- 1 file changed, 13 insertions(+), 3 deletions(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index 63648f60..22885992 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -13,15 +13,25 @@ This project adheres to [Semantic Versioning](http://semver.org/). *** -## Unreleased [0.6.13-SNAPSHOT] +## Unreleased [0.6.14-SNAPSHOT] #### Removed #### Added #### Changed +#### Fixed + +*** + +## [v0.6.13-alpha] Hot Fix! 
+#### Removed +#### Added +[[PR #518](https://github.com/numenta/htm.java/pull/518)] Added FieldMetaTypeTest.java +#### Changed #### Fixed +[[PR #518](https://github.com/numenta/htm.java/pull/518)] Fixed "bool" type decode bug *** -## Unreleased [0.6.12] +## [v0.6.12-alpha] #### Removed #### Added * [[PR #511](https://github.com/numenta/htm.java/pull/511)] SDRClassifier Network API Integration @@ -32,7 +42,7 @@ This project adheres to [Semantic Versioning](http://semver.org/). *** -## Unreleased [0.6.10] +## [v0.6.10-alpha] #### Removed #### Added #### Changed From dd9992350edcfdd091a22a91e6b9038e6a38b7e9 Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 12 May 2017 10:03:09 -0500 Subject: [PATCH 40/52] Update README.md --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index bc75a3f6..8f29e059 100644 --- a/README.md +++ b/README.md @@ -25,6 +25,7 @@ _**NOTE: Minimum JavaSE version is 8**_
## Recent News Items... +* New Hot Fix Release [v0.6.13-alpha](https://github.com/numenta/htm.java/releases) (05/12/2017) * Updated HTM.Java Examples! [Now in sync with latest release (v0.6.12-alpha)](https://github.com/numenta/htm.java-examples) (See the executable Jars!) (04/06/2017) * New Feature Release! [v0.6.12-alpha](https://github.com/numenta/htm.java/releases/tag/v0.6.12-alpha) Network API Allows Multi-field inference! (04/04/2017) * HTM.Java Release v0.6.11-alpha to tag sync state with NuPIC (10/16/2016) From c7ef9d1ea6564c08cbdfe7611bf3cef40d68b192 Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 2 Jun 2017 13:32:49 -0500 Subject: [PATCH 41/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 8f29e059..a8f7c3cd 100644 --- a/README.md +++ b/README.md @@ -134,7 +134,7 @@ gradle -Pskipbench check # Executes the tests w/o running the benchmarks **Linux Gradle Issues?** [see the wiki here.](https://github.com/numenta/htm.java/wiki/Gradle---JAVA_HOME-Issue-Resolution) -**A [wiki](https://github.com/numenta/htm.java/wiki/Build-Instructions) with full build instructions for building HTM.java is available here:** [Build Instructions](https://github.com/numenta/htm.java/wiki/Build-Instructions) (This includes a no-java/gradle-guaranteed-build using the provided [Docker File](https://github.com/numenta/htm.java/blob/master/Dockerfile) as a reference build.) 
+**A [wiki](https://github.com/numenta/htm.java/wiki/Build-Instructions) with full build instructions for building HTM.java is available here:** [Build Instructions](https://github.com/numenta/htm.java/wiki/Build-Instructions) *** ## For Developers: Usage & Project Integration From 24d9f6752eebfceea67a88a918d00f6547b3b07c Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 2 Jun 2017 13:44:33 -0500 Subject: [PATCH 42/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index a8f7c3cd..84a6eaea 100644 --- a/README.md +++ b/README.md @@ -128,7 +128,7 @@ gradle check # Executes the tests and runs the benchmarks --or-- -gradle -Pskipbench check # Executes the tests w/o running the benchmarks +gradle -Pskipbench check # Executes the tests w/o running the benchmarks (Faster! **Recommended**) ``` **Note:** Info on installing **gradle** can be found on the wiki (look at #3.) [here](https://github.com/numenta/htm.java/wiki/Eclipse-Setup-Tips) From a56344ee879f56acb5119693211fa6534adb8e7c Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 2 Jun 2017 13:50:58 -0500 Subject: [PATCH 43/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 84a6eaea..8449daf8 100644 --- a/README.md +++ b/README.md @@ -78,7 +78,7 @@ See the blog: [Join the Cogmission](http://www.cogmission.ai) | Core Algorithm | NuPIC Date |HTM.Java Date | Latest NuPIC SHA | Latest HTM.Java SHA | Status| | --------------- |:-------------:|:------------:|:----------------:|:-------------------:|:-----:| | SpatialPooler | 2016-12-11 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/5c3edead9526d3b5fb6a4f37ad9d38cdcf32f5ff)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [*Behind NuPIC Merge #3411](https://github.com/numenta/nupic/pull/3411) -| TemporalMemory | 2016-09-23 | 
2016-10-13 |[commit](https://github.com/numenta/nupic/commit/1036f25e7223471d72cebc536d6734f78d37b6c7)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| Sync'd* +| TemporalMemory | 2016-09-23 | 2016-10-13 |[commit](https://github.com/numenta/nupic/commit/b1f35fe15a1cbed689d1173cfcecddfab781baab)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| [*Behind NuPIC Merge #3654](https://github.com/numenta/nupic/pull/3654) \* May be one of: "Sync'd" or "Behind". "Behind" expresses a temporary lapse in synchronization while devs are implementing new changes. From 5f0f0a8e1c352b9290c6e111d039160473b5fa71 Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 2 Jun 2017 13:51:36 -0500 Subject: [PATCH 44/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 8449daf8..666b5896 100644 --- a/README.md +++ b/README.md @@ -78,7 +78,7 @@ See the blog: [Join the Cogmission](http://www.cogmission.ai) | Core Algorithm | NuPIC Date |HTM.Java Date | Latest NuPIC SHA | Latest HTM.Java SHA | Status| | --------------- |:-------------:|:------------:|:----------------:|:-------------------:|:-----:| | SpatialPooler | 2016-12-11 | 2016-10-07 |[commit](https://github.com/numenta/nupic/commit/5c3edead9526d3b5fb6a4f37ad9d38cdcf32f5ff)|[commit](https://github.com/numenta/htm.java/commit/2cdcee1fcc5f6c18c2c48b4b553c49879c1256bb#diff-22f96ea06fd0c2b3593c755cbccf0a8b)| [*Behind NuPIC Merge #3411](https://github.com/numenta/nupic/pull/3411) -| TemporalMemory | 2016-09-23 | 2016-10-13 |[commit](https://github.com/numenta/nupic/commit/b1f35fe15a1cbed689d1173cfcecddfab781baab)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| [*Behind NuPIC Merge #3654](https://github.com/numenta/nupic/pull/3654) +| TemporalMemory | 2017-06-02 | 2016-10-13 
|[commit](https://github.com/numenta/nupic/commit/b1f35fe15a1cbed689d1173cfcecddfab781baab)|[commit](https://github.com/numenta/htm.java/commit/7f4d8f2e2c910dd662909442546516e36adfc7cc)| [*Behind NuPIC Merge #3654](https://github.com/numenta/nupic/pull/3654) \* May be one of: "Sync'd" or "Behind". "Behind" expresses a temporary lapse in synchronization while devs are implementing new changes. From 3c229baf7cca42a8d1139eac2ffcd51003e05867 Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 2 Jun 2017 13:54:24 -0500 Subject: [PATCH 45/52] Update README.md --- README.md | 7 +------ 1 file changed, 1 insertion(+), 6 deletions(-) diff --git a/README.md b/README.md index 666b5896..513a0ecb 100644 --- a/README.md +++ b/README.md @@ -25,6 +25,7 @@ _**NOTE: Minimum JavaSE version is 8**_
## Recent News Items... +* Updated Sync Report Table Here in README (06/02/2017) * New Hot Fix Release [v0.6.13-alpha](https://github.com/numenta/htm.java/releases) (05/12/2017) * Updated HTM.Java Examples! [Now in sync with latest release (v0.6.12-alpha)](https://github.com/numenta/htm.java-examples) (See the executable Jars!) (04/06/2017) * New Feature Release! [v0.6.12-alpha](https://github.com/numenta/htm.java/releases/tag/v0.6.12-alpha) Network API Allows Multi-field inference! (04/04/2017) @@ -33,12 +34,6 @@ _**NOTE: Minimum JavaSE version is 8**_ * [HTM.Java Receives new SpatialPooler](https://github.com/numenta/htm.java/pull/486) - Fully Updated! (10/06/2016) * HTM.Java Reaches 100% NuPIC Compatibility and operation within NAB will be offered (soon)! (09/29/2016) * [HTM.java Receives New SDRClassifier!](https://github.com/numenta/htm.java/blob/master/src/main/java/org/numenta/nupic/algorithms/SDRClassifier.java) (07/26/2016) -* [HTM.java Status Report](https://discourse.numenta.org/t/htm-java-status-report/645) (05/29/2016) -* [HTM.java Examples Repo Updated!](https://github.com/numenta/htm.java-examples) Includes use of the [New Cortical.io API!](https://github.com/cortical-io/retina-api-java-sdk) (05/18/2016) -* [New HTM.java Forum Site](http://discourse.numenta.org/c/htm-java) found on the new [HTM Forum](http://discourse.numenta.org) (05/10/2016) -* [HTM.java Recieves new Persistence API](https://github.com/numenta/htm.java/wiki/Saving-Your-Network:-The-Persistence-API) (04/14/2016) -* HTM.java Recieves [Docker Reference-Build Implementation](https://github.com/numenta/htm.java/wiki/Build-Instructions#reference-build-environment) (03/26/2016) -* **HTM.java Becomes Build-able With OpenJDK** (03/26/2016) ### [News Archives...](https://github.com/numenta/htm.java/wiki/News-Archives...) 
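[Editor's note] PATCH 27 above extends `Synapse.equals()` with a permanence check written as a raw `!=` on the primitive. The sketch below mirrors that `equals()` shape with a simplified stand-in class (not the real htm.java `Synapse`); it uses `Double.compare` instead, the more conventional form, which also treats NaN consistently.

```java
import java.util.Objects;

// Simplified stand-in for htm.java's Synapse (NOT the real class), showing
// the equals() shape that PATCH 27 extends with a permanence comparison.
public class SynapseSketch {
    final Object segment;    // may be null, mirroring the null checks in Synapse
    final int synapseIndex;
    final double permanence;

    public SynapseSketch(Object segment, int synapseIndex, double permanence) {
        this.segment = segment;
        this.synapseIndex = synapseIndex;
        this.permanence = permanence;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) return true;
        if (obj == null || getClass() != obj.getClass()) return false;
        SynapseSketch other = (SynapseSketch) obj;
        if (!Objects.equals(segment, other.segment)) return false;
        if (synapseIndex != other.synapseIndex) return false;
        // PATCH 27 compares the primitive with a raw "!=";
        // Double.compare handles NaN consistently as well.
        return Double.compare(permanence, other.permanence) == 0;
    }

    @Override
    public int hashCode() {
        // Any field added to equals() should also feed hashCode(),
        // or equal objects may land in different hash buckets.
        return Objects.hash(segment, synapseIndex, permanence);
    }

    public static void main(String[] args) {
        SynapseSketch a = new SynapseSketch("seg", 0, 0.5);
        SynapseSketch b = new SynapseSketch("seg", 0, 0.5);
        System.out.println(a.equals(b)); // true: all compared fields match
    }
}
```

When a field joins `equals()`, updating `hashCode()` in the same patch keeps the `Object` contract intact for hash-based collections.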
From 89464dff49d96c7283f1fa9a54ca712392f9ac5d Mon Sep 17 00:00:00 2001 From: David Ray Date: Fri, 2 Jun 2017 13:55:50 -0500 Subject: [PATCH 46/52] Update README.md --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 513a0ecb..306b3dbc 100644 --- a/README.md +++ b/README.md @@ -25,7 +25,7 @@ _**NOTE: Minimum JavaSE version is 8**_
## Recent News Items... -* Updated Sync Report Table Here in README (06/02/2017) +* Updated [Sync Report Table](https://github.com/numenta/htm.java/blob/master/README.md#versioning) Here in README (06/02/2017) * New Hot Fix Release [v0.6.13-alpha](https://github.com/numenta/htm.java/releases) (05/12/2017) * Updated HTM.Java Examples! [Now in sync with latest release (v0.6.12-alpha)](https://github.com/numenta/htm.java-examples) (See the executable Jars!) (04/06/2017) * New Feature Release! [v0.6.12-alpha](https://github.com/numenta/htm.java/releases/tag/v0.6.12-alpha) Network API Allows Multi-field inference! (04/04/2017) From 8bd1acbfec82e48a86e7189cdef5c39ed431e6f8 Mon Sep 17 00:00:00 2001 From: James Weakley Date: Fri, 25 Aug 2017 13:11:08 +1000 Subject: [PATCH 47/52] #522 Use line separator value from System --- src/test/java/org/numenta/nupic/util/UniversalRandomTest.java | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/test/java/org/numenta/nupic/util/UniversalRandomTest.java b/src/test/java/org/numenta/nupic/util/UniversalRandomTest.java index 9a02a2b4..873e20e0 100644 --- a/src/test/java/org/numenta/nupic/util/UniversalRandomTest.java +++ b/src/test/java/org/numenta/nupic/util/UniversalRandomTest.java @@ -107,7 +107,7 @@ public void testMain() { System.setOut(out); String output = baos.toString(); - String[] lines = output.split("\n"); + String[] lines = output.split(System.lineSeparator()); Arrays.stream(lines).forEach(System.out::println); From b054261ff23ac7cf3634e766f4823f0fd2dd6eb7 Mon Sep 17 00:00:00 2001 From: James Weakley Date: Fri, 25 Aug 2017 13:15:13 +1000 Subject: [PATCH 48/52] #522 Don't delete HTMNetworkTest directory after test run --- src/test/java/org/numenta/nupic/network/PersistenceAPITest.java | 2 -- 1 file changed, 2 deletions(-) diff --git a/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java b/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java index b33c3588..97bc9a82 100644 --- 
a/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java +++ b/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java @@ -110,8 +110,6 @@ public static void cleanUp() { catch(Exception io) { throw new RuntimeException(io); } } ); - - Files.delete(serialDir.toPath()); } }catch(Exception e) { e.printStackTrace(); From 7c6e17046bb3ed4f957b46e3644cdd57b88d792d Mon Sep 17 00:00:00 2001 From: James Weakley Date: Sat, 26 Aug 2017 21:33:07 +1000 Subject: [PATCH 49/52] Fixes #524 Notify observers of uncaught exceptions in layer thread, this will fail a test case when checkObserver is being called. In the affected test case, force UTC timezone for a more consistent test experience. --- src/main/java/org/numenta/nupic/network/Layer.java | 14 ++++++++++++-- .../numenta/nupic/network/PersistenceAPITest.java | 8 ++++++++ 2 files changed, 20 insertions(+), 2 deletions(-) diff --git a/src/main/java/org/numenta/nupic/network/Layer.java b/src/main/java/org/numenta/nupic/network/Layer.java index 13e47b60..aa6aeddd 100644 --- a/src/main/java/org/numenta/nupic/network/Layer.java +++ b/src/main/java/org/numenta/nupic/network/Layer.java @@ -21,6 +21,7 @@ */ package org.numenta.nupic.network; +import java.lang.Thread.UncaughtExceptionHandler; import java.math.BigDecimal; import java.math.MathContext; import java.util.ArrayList; @@ -2011,7 +2012,7 @@ protected int[] temporalInput(int[] input, ManualInput mi) { * Starts this {@code Layer}'s thread */ protected void startLayerThread() { - (LAYER_THREAD = new Thread("Sensor Layer [" + getName() + "] Thread") { + LAYER_THREAD = new Thread("Sensor Layer [" + getName() + "] Thread") { @SuppressWarnings("unchecked") public void run() { @@ -2044,7 +2045,16 @@ public void run() { } }); } - }).start(); + }; + + LAYER_THREAD.setUncaughtExceptionHandler(new UncaughtExceptionHandler() { + + @Override + public void uncaughtException(Thread t, Throwable e) { + notifyError(new RuntimeException("Unhandled Exception in 
"+LAYER_THREAD.getName(),e)); + } + }); + LAYER_THREAD.start(); } /** diff --git a/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java b/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java index b33c3588..7f6b2278 100644 --- a/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java +++ b/src/test/java/org/numenta/nupic/network/PersistenceAPITest.java @@ -49,6 +49,7 @@ import java.util.stream.Stream; import org.junit.AfterClass; +import org.junit.BeforeClass; import org.junit.Test; import org.numenta.nupic.FieldMetaType; import org.numenta.nupic.Parameters; @@ -97,6 +98,13 @@ public class PersistenceAPITest extends ObservableTestBase { /** Printer to visualize DayOfWeek printouts - SET TO TRUE FOR PRINTOUT */ private BiFunction dayOfWeekPrintout = createDayOfWeekInferencePrintout(false); + + @BeforeClass + public static void beforeClass(){ + // Sample data contains datetimes that are invalid in some timezones due to DST. + // If UTC is forced, then test runs should yield the same result regardless of timezone + System.setProperty("user.timezone", "UTC"); + } @AfterClass public static void cleanUp() { From 8fc6b596461a879fdf3e8936833c9a972d858b57 Mon Sep 17 00:00:00 2001 From: David Ray Date: Thu, 14 Sep 2017 04:25:27 -0500 Subject: [PATCH 50/52] Update CHANGELOG.md --- CHANGELOG.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index 22885992..6b65f690 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -24,10 +24,10 @@ This project adheres to [Semantic Versioning](http://semver.org/). ## [v0.6.13-alpha] Hot Fix! 
#### Removed #### Added -[[PR #518](https://github.com/numenta/htm.java/pull/518)] Added FieldMetaTypeTest.java +* [[PR #518](https://github.com/numenta/htm.java/pull/518)] Added FieldMetaTypeTest.java #### Changed #### Fixed -[[PR #518](https://github.com/numenta/htm.java/pull/518)] Fixed "bool" type decode bug +* [[PR #518](https://github.com/numenta/htm.java/pull/518)] Fixed "bool" type decode bug *** From 7c455ae051ff3478acdc4c944db41ca6f220d0a5 Mon Sep 17 00:00:00 2001 From: cogmission Date: Tue, 19 Feb 2019 09:02:45 -0600 Subject: [PATCH 51/52] Added parsing for windows case --- src/main/java/org/numenta/nupic/network/sensor/FileSensor.java | 1 + 1 file changed, 1 insertion(+) diff --git a/src/main/java/org/numenta/nupic/network/sensor/FileSensor.java b/src/main/java/org/numenta/nupic/network/sensor/FileSensor.java index 219940e3..69b39ad0 100644 --- a/src/main/java/org/numenta/nupic/network/sensor/FileSensor.java +++ b/src/main/java/org/numenta/nupic/network/sensor/FileSensor.java @@ -155,6 +155,7 @@ public static Stream getJarEntryStream(String path) { JarFile jar = new JarFile(parts[0]); String innerPath = parts[1]; innerPath = innerPath.startsWith("!") ? innerPath.substring(1) : innerPath; + innerPath = innerPath.startsWith("\\") ? 
innerPath.substring(1) : innerPath; InputStream inStream = jar.getInputStream(jar.getEntry(innerPath)); BufferedReader br = new BufferedReader(new InputStreamReader(inStream)); retVal = br.lines().onClose(() -> { From 886f7cf8dd566b0b1309131910cb4f3291d59ed5 Mon Sep 17 00:00:00 2001 From: cogmission Date: Wed, 15 Sep 2021 21:44:59 -0500 Subject: [PATCH 52/52] Updated to test permissions --- .gitignore | 3 +++ README.md | 1 + 2 files changed, 4 insertions(+) diff --git a/.gitignore b/.gitignore index eed22cdd..bded69f6 100644 --- a/.gitignore +++ b/.gitignore @@ -1,3 +1,6 @@ +/out +/out/* +*out *.DS_Store /.DS_Store /bin diff --git a/README.md b/README.md index 306b3dbc..bcfcbf1a 100644 --- a/README.md +++ b/README.md @@ -25,6 +25,7 @@ _**NOTE: Minimum JavaSE version is 8**_
## Recent News Items... +* Updated README.md to reflect new site image source (09/15/2021) * Updated [Sync Report Table](https://github.com/numenta/htm.java/blob/master/README.md#versioning) Here in README (06/02/2017) * New Hot Fix Release [v0.6.13-alpha](https://github.com/numenta/htm.java/releases) (05/12/2017) * Updated HTM.Java Examples! [Now in sync with latest release (v0.6.12-alpha)](https://github.com/numenta/htm.java-examples) (See the executable Jars!) (04/06/2017)
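
Patch 47 above replaces `output.split("\n")` with `output.split(System.lineSeparator())` in `UniversalRandomTest`. The reason is that `println()` terminates each line with the platform separator (`\r\n` on Windows), so splitting captured stdout on a hard-coded `"\n"` leaves stray `'\r'` characters in the parsed lines. A minimal self-contained sketch of the idea (class and variable names here are illustrative, not the test's actual code):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;

public class LineSeparatorDemo {
    public static void main(String[] args) {
        // Capture printed output the same way the test captures System.out.
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        PrintStream capture = new PrintStream(baos);
        capture.println("line one");
        capture.println("line two");
        capture.flush();

        // println() appends the platform separator, so split on exactly that;
        // trailing empty strings are dropped by split, leaving clean lines
        // on every OS.
        String[] lines = baos.toString().split(System.lineSeparator());
        System.out.println(lines.length);               // 2
        System.out.println(lines[0].equals("line one")); // true
    }
}
```

On Linux and macOS both splits happen to agree; the fix matters on Windows, where `split("\n")` would yield `"line one\r"` instead of `"line one"`.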
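
Patch 49 above stops building the layer thread inline and instead assigns it to `LAYER_THREAD`, installs an `UncaughtExceptionHandler`, and only then calls `start()`, so that an exception escaping the thread's `run()` is forwarded to observers via `notifyError(...)` rather than silently killing the thread. A minimal sketch of that pattern, with `notifyError` stood in for by a plain print (the handler wiring is from the patch; everything else here is illustrative):

```java
public class HandlerDemo {
    public static void main(String[] args) throws InterruptedException {
        // Build the thread first, as the patch does, instead of chaining
        // .start() onto the anonymous-class expression.
        Thread worker = new Thread("Sensor Layer [demo] Thread") {
            public void run() {
                throw new IllegalStateException("boom");
            }
        };

        // In Layer.java the handler calls notifyError(new RuntimeException(...)),
        // pushing the failure into the observer error channel; here we just print.
        worker.setUncaughtExceptionHandler((t, e) ->
            System.out.println("Unhandled Exception in " + t.getName() + ": " + e.getMessage()));

        worker.start();
        worker.join();
        // prints: Unhandled Exception in Sensor Layer [demo] Thread: boom
    }
}
```

Without the handler, the JVM's default behavior is to dump a stack trace to stderr and let the thread die, which is exactly the "checkObserver never sees the failure" symptom the patch message describes for issue #524.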