Numenta Platform for Intelligent Computing is an implementation of Hierarchical Temporal Memory (HTM), a theory of intelligence based strictly on the neuroscience of the neocortex.

Overview

NuPIC

Numenta Platform for Intelligent Computing

The Numenta Platform for Intelligent Computing (NuPIC) is a machine intelligence platform that implements the HTM learning algorithms. HTM is a detailed computational theory of the neocortex. At the core of HTM are time-based continuous learning algorithms that store and recall spatial and temporal patterns. NuPIC is suited to a variety of problems, particularly anomaly detection and prediction of streaming data sources. For more information, see numenta.org or the NuPIC Forum.

For usage guides, quick starts, and API documentation, see http://nupic.docs.numenta.org/.
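
As a quick illustration of the streaming workflow NuPIC targets, below is a minimal sketch using the Online Prediction Framework (OPF). It assumes a 1.0.x-style install where the module is named model_factory (older releases used modelfactory), a TemporalAnomaly-style model_params dictionary (for example adapted from the hotgym sample or produced by a swarm), and illustrative field names "timestamp" and "value".

# Minimal OPF sketch: prediction and anomaly detection on a streaming source.
# model_params, stream, "timestamp", and "value" are placeholders, not shipped defaults.
from nupic.frameworks.opf.model_factory import ModelFactory

model = ModelFactory.create(model_params)
model.enableInference({"predictedField": "value"})

for record in stream:  # e.g. dicts with "timestamp" and "value" keys
    result = model.run(record)
    # 1-step-ahead prediction and anomaly score (TemporalAnomaly models only)
    prediction = result.inferences["multiStepBestPredictions"][1]
    anomaly_score = result.inferences["anomalyScore"]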

This project is in Maintenance Mode

We plan to do minor releases only, and limit changes in NuPIC and NuPIC Core to:

  • Fixing critical bugs.
  • Features needed to support ongoing research.

Installing NuPIC

NuPIC binaries are available for:

  • Linux x86 64bit
  • OS X 10.9
  • OS X 10.10
  • Windows 64bit

Dependencies

The following dependencies are required to install NuPIC on all operating systems.

Additional OS X requirements:

Install

Run the following to install NuPIC:

pip install nupic

Test

# From the root of the repo:
py.test tests/unit

Having problems?

  • You may need to use the --user flag for the commands above to install in a non-system location (depends on your environment). Alternatively, you can execute the pip commands with sudo (not recommended).
  • You may need to add the --use-wheel option if you have an older pip version (wheels are now the default binary package format for pip).

For any other installation issues, please search our forums (and post questions there). You can report bugs at https://github.com/numenta/nupic/issues.

Live Community Chat: Gitter

Installing NuPIC From Source

To install from local source code, run from the repository root:

pip install .

Use the optional -e argument for a developer install.

If you want to build the dependent nupic.bindings from source, build and install nupic.core before installing nupic (otherwise a PyPI release of nupic.bindings will be installed).

Comments
  • Pip installable

    With this PR, NuPIC becomes a pip-installable package.

    Other improvements with this PR:

    • $PYTHONPATH was discarded, as a user can use the install or develop options from setuptools.
    • $NTAX_DEVELOPER_BUILD was also discarded for the same reason.

    Please note that $NTA can also be discarded, since everything can now be done in a single location (distutils/setuptools experts will know what I mean). This is possible because distutils allows installing the NuPIC library in different places (PYTHONPATH, HOME, etc.).

    To get a better idea, I suggest reading the instructions in the README first.

    Fixes #809 Fixes #602 Fixes #603 Fixes #604

    Replaces https://github.com/numenta/nupic/pull/880

    type:enhancement type:build 
    opened by david-ragazzi 175
  • Integration with nupic.core API

    These changes allow CMake to build nupic.core and link its output (libnupic.core.a) to the nupic subprojects.

    This depends on https://github.com/numenta/nupic.core/pull/47 (at least it depends on the SHA).

    type:build status:ready core extraction 
    opened by david-ragazzi 131
  • Visualize, plot OPF experiment results

    I'd like to visualize the results .csv from running an OPF experiment. It should be relatively easy, but perhaps some of you have already written a nice utility for that? NAB, @rhyolight?


    WORKING branch: https://github.com/breznak/nupic/tree/plot_results

    TODO:

    • [x] move or copy the script from NAB to NUPIC/scripts/visualization/
    • [x] Offline & Security
      • ~~plot.ly~~ publishes all data online, I think this is unacceptable for general use for us
      • "our script" is pretty nice, runs in browser on localhost
        • [x] I suggest downloading the rendering scripts for offline use
    • [x] add example data for easy plotting test + doc
      • [x] OPF results data - opf_results/*.csv
      • [x] NAB (small) data - data_file* results_file* data/ results/
    • [x] eval which plotting lib to use, currently it's DyGraph https://github.com/numenta/NAB/blob/master/nab_visualizer.html#L4-L7 -- @jefffohl liked D3.js for being more flexible, but decided to stick with DyGraph so far as it fits the current needs of plotting graphs.
    • [x] extend to plot OPF data, not only NAB (subset of OPF) data (Help wanted, I can explain what to do with OPF data, provide parser script)
      • [x] plot single file
        The current JS code from NAB spends a lot of effort parsing a directory structure in order to plot all the *.csv files there. For OPF/NuPIC we don't need that (although it wouldn't hurt to keep the functionality for NAB).
        What we need is the ability to plot a single file provided as an argument (python plot.py ./opf_results/DefaultTask.csv); see the standalone plotting sketch after this list.
      • [x] merge data and results
        NAB uses separate folders for data & results; in an OPF results file, all of these are together.
    • [ ] improve the Plot page interface (Help wanted)
      • [x] plot data, add a checkbox option, the OPF field is actual
      • [x] plot anomaly, add a checkbox option, the OPF field is anomalyScore
      • [x] plot predicted field, add a checkbox option, the OPF field is multistepBestPrediction.1
        • [x] optional, add option to plot other multistep (>1) prediction fields
        • [x] opt, add textfield to enter specific label that should be plotted, maybe can fit ^^^
        • checkboxes generated dynamically for all CSV labels
        • [ ] preselect some values -- impossible to preselect/know data - but anomalyScore & multiStepBestPredictions.actual make sense
        • [ ] opt, plot labeled anomalies (not present in OPF, could be just a specific field)
      • [x] UI: merge data & results windows?
      • [x] UI: anomalyScore [0..1.0] is not noticeable with "raw data" plotted; FIX by rescaling (to, say, 90% of the max of the data)?
      • [ ] UI: smooth zoom out. Currently you can smoothly zoom in by selecting the area of interest, and zoom out by mouse-click to the original size. Can we zoom out iteratively as well (mouse wheel/a scroll-bar, ...)?
      • [ ] UI: add highlight menu: check-boxes fields (ex. anomalyScore), threshold (0.9), (optionally: below/over), (opt.: color) and highlight the section where the field is over the threshold.
      • [ ] Consider an AND operator for the 2 above statements, can be used as evaluation as follows: see https://github.com/breznak/nupic/pull/16
        (anomalyScore, >0.9) AND (annotation, >0.9), green # correctly detected anomaly
        (anomalyScore, <=0.9) AND (annotation, >0.9), red # missed
        (anomalyScore, >0.9) AND (annotation, <=0.9), yellow # false positive
    • [ ] UI: On higher zoom-out levels (~1 week) it is not possible to see time (fine-grained) on x-axis interactively. (I think best solution would be if the timestamp field would show up in the "fields div on the left", as interactive values are shown there.) see https://github.com/breznak/nupic/pull/20
    • [x] opt, FIX plotting of NAB Results, currently only data seem to work. (Help wanted)
      If the above succeeds, maybe NAB should switch to using OPF format for the output (data+results) ( @subutai ?)
    • [ ] Extending OPF: issues not directly for this PR, related to OPF; @rhyolight ?
      • [ ] anomalyScore has a string type in OPF (maybe because it's None at the 1st step)
      • [ ] new anomalyAnnotations OPF field, to mark human annotations, could be useful in NAB
      • [ ] Plotting non-numeric inputs. Currently impossible, but each encoder has a scalarValue member, can we expose that (add a field XX.scalar for each input field to OPF file) somehow?
    • [x] Bugs
      • [x] selecting a non-OPF csv crashes the web app, can't render after reselecting a correct OPF file (w/o server restart) @jefffohl
      • [x] fields with too long name (eg metrics) get shortened in the UI, but it's not possible to see the whole name. Maybe a "context label" on mouse hover?
      • [x] "generic CSV" support (NAB,..)
        currently some fields are hard-coded (eg for "scaled anomaly score") and the rendering fails if the fields are not present. But a nice feature is the dynamic menu for plotting numeric fields, it would be nice if only the "scaled" function failed (and its checkbox is greyed-out) instead of all plot failing.
        • [x] additionally, if the "scaled" functionality could be written more generic (an array for labels to scale) so we could scale both anomaly & likelihood score, or spiketrain (0/1) data,... ?
      • [x] /low priority/ Does not render in Firefox (nor did the original NAB code, but the DyGraph examples work fine in a recent FF)
    • [ ] New Features
      • [ ] Online plotting
        would require: a) the model sends the updates to the server; b) a network in/out region; c) refreshing the plot of the (updated) file every second or so..
    • [ ] PR ready code
      • [ ] finish main Readme, add images, maybe a wiki
      • [ ] mention the "2 rows skipped" feature/problem
      • [x] add comments/docs to functions
      • [x] avoid hard-coded values, allow "settings" at code-level
      • [ ] should we add test-case for this code?
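
    For anyone who just wants a quick standalone plot in the meantime, here is a minimal alternative sketch using pandas and matplotlib rather than the browser-based dygraphs tool discussed above. The file path, the "timestamp"/"value" field names, and the assumption that the results CSV carries two extra header rows (types and flags) after the field-name row are all illustrative and may need adjusting.

      # Alternative sketch (pandas + matplotlib, not the dygraphs tool above).
      # Field names and the two skipped header rows are assumptions.
      import pandas as pd
      import matplotlib.pyplot as plt

      df = pd.read_csv("opf_results/DefaultTask.csv", skiprows=[1, 2],
                       parse_dates=["timestamp"])

      fig, ax = plt.subplots()
      ax.plot(df["timestamp"], df["value"], label="actual")
      if "multiStepBestPredictions.1" in df.columns:
          ax.plot(df["timestamp"], df["multiStepBestPredictions.1"], label="predicted (1 step)")
      if "anomalyScore" in df.columns:
          # rescale the [0, 1] anomaly score so it is visible next to the raw data
          ax.plot(df["timestamp"], df["anomalyScore"].astype(float) * df["value"].max(),
                  label="anomalyScore (scaled)")
      ax.legend()
      plt.show()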

    UPDATES:

    • 4/11/2015 - some refactoring, configurable options, improved "generic CSV" handling, documentation
    • 3/11/2015 - Merged Jeff's PR with Bugfixes & overhauled usability! :clap: "Zoom view", and much more!
    • 27/10/2015 - Merged Jeff's work enabling OPF files plotting, improving UI
    • 24/10/2015 - Updated with @jefffohl 's "Initial commit" work, plots NAB data & results, improved checkboxes for plots
    type:enhancement newbie status:help wanted subject:opf 
    opened by breznak 127
  • Pythonic build: Switch from cmake to distutils extensions for nupic installation

    Fixes https://github.com/numenta/nupic/issues/1573.

    This PR aims to help make NuPIC "pip installable". To do that, it tries to keep the build/install process as "pythonic" as possible by using: a) native support from distutils/setuptools to build all C++ extensions (https://github.com/numenta/nupic/issues/1573); b) Python scripts to perform jobs that were previously performed by C++ executables (https://github.com/numenta/nupic/issues/1576).

    The motivation for (b) is that it is very tricky to install all the current C++ executables and libraries using approach (a) alone. Some of these libraries (like the bindings) inevitably have to be built as extensions; however, C++ extensions like htmtest and compare_nupic_core_version can be replaced with Python scripts. Unlike the bindings, these two extensions complicate the process because they have to be installed into the source repository (the bin folder, to be exact) rather than into the nupic package install folder like the other modules.

    Once all extensions can be built in this natural way, we can remove CMakeLists.txt from the NuPIC Python repository. CMake then becomes essential only for building the NuPIC Core library; since that is an optional job, we can bypass CMake/Git during the pip process, decreasing the number of build/install steps and dependencies.
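
    For readers less familiar with approach (a), here is a minimal sketch of what building a C++ extension natively through setuptools looks like; the module name, source path, and flags are illustrative, not the actual NuPIC build configuration.

      # Illustrative setup.py fragment: a C++ extension built natively by
      # setuptools/distutils. Names and paths are placeholders.
      from setuptools import setup, Extension

      example_ext = Extension(
          "nupic.bindings._example",              # hypothetical module name
          sources=["src/bindings/example.cpp"],   # hypothetical source file
          extra_compile_args=["-std=c++11"],
      )

      setup(name="nupic", ext_modules=[example_ext])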

    I am still working on the pythonic version of HtmTest, so it is normal for some tests to fail, because Travis is still referencing its C++ version.


    TODO:

    • [X] Figure out the ARCHFLAGS="-arch x86_64" requirement on darwin64. Update: there is no way around it; distutils takes the flags Python was built with, so we cannot set this variable inside setup.py.
    • [X] Update supporting documentation (README, wikis) with any build changes. Update: README updated with ARCHFLAGS instructions.
    • [X] Simplify this PR by moving the bindings refactoring to a new PR. Solved by https://github.com/numenta/nupic/pull/1610
    • [X] Simplify this PR by moving the Travis scripts, CMakeLists.txt, and README update (replacing make commands with python instructions and removing custom_targets) to a new PR. Solved by https://github.com/numenta/nupic/pull/1622
    • [X] Simplify this PR by moving HtmTest to nupic.core in a new PR. Solved by https://github.com/numenta/nupic/pull/1676 and numenta/nupic.core#282
    • [X] python setup.py --help & python setup.py --help-commands should print help, not compile
    opened by david-ragazzi 120
  • Document the proposed directory structure for nupic.core and nupic

    As a part of the nupic.core extraction, we need to create some documentation of the proposed directory structure after the extraction is complete. This should include both repositories, as well as details about where the nupic.core dependency exists within nupic.

    type:docs core extraction 
    opened by rhyolight 73
  • Problem running Python tests

    Hi all,

    I followed the instructions for installing Nupic on Ubuntu (https://github.com/numenta/nupic/wiki/Installing-NuPIC-on-Ubuntu) and ran into a couple of issues which I resolved by setting up the PYTHONPATH, NUPIC, and NTA environment variables.

    However, I eventually got stuck when running the Python tests and here is what I'm getting:

    tests/unit/py2/nupic/engine/network_test.py:32: NetworkTest.testErrorHandling FAILED
    
    ======================================================================= FAILURES =======================================================================
    ____________________________________________________________ NetworkTest.testErrorHandling _____________________________________________________________
    self = <tests.unit.py2.nupic.engine.network_test.NetworkTest testMethod=testErrorHandling>
    
        def testErrorHandling(self):
          n = engine.Network()
    
          # Test trying to add non-existent node
          with self.assertRaises(Exception) as cm:
            n.addRegion('r', 'py.NonExistingNode', '')
    
          self.assertEqual(cm.exception.message,
                           "Matching Python module for " +
                           "py.NonExistingNode not found.")
    
          # Test failure during import
          with self.assertRaises(Exception) as cm:
            n.addRegion('r', 'py.UnimportableNode', '')
    
          self.assertEqual(str(cm.exception),
            'invalid syntax (UnimportableNode.py, line 5)')
    
          # Test failure in the __init__() method
          with self.assertRaises(Exception) as cm:
            n.addRegion('r', 'py.TestNode', '{ failInInit: 1 }')
    
          self.assertEqual(str(cm.exception),
            'TestNode.__init__() Failing on purpose as requested')
    
          # Test failure inside the compute() method
          with self.assertRaises(Exception) as cm:
            r = n.addRegion('r', 'py.TestNode', '{ failInCompute: 1 }')
            r.dimensions = engine.Dimensions([4, 4])
            n.initialize()
            n.run(1)
    
          self.assertEqual(str(cm.exception),
    >       'TestNode.compute() Failing on purpose as requested')
    E     AssertionError: "Wrong number of arguments for overloaded function 'new_Dimensions'.\n  Possible C/C++ prototypes are:\n    nta::Dimensions()\n    nta::Dimensions(std::vector< size_t,std::allocator< size_t > >)\n    nta::Dimensions(size_t)\n    nta::Dimensions(size_t,size_t)\n    nta::Dimensions(size_t,size_t,size_t)\n" != 'TestNode.compute() Failing on purpose as requested'
    
    tests/unit/py2/nupic/engine/network_test.py:65: AssertionError
    ------------------------------------------------------------------- Captured stdout --------------------------------------------------------------------
    ERROR:  Matching Python module for py.NonExistingNode not found. [/home/mariam/nupic/nta/engine/RegionImplFactory.cpp line 450]
    ERROR:  Could not get valid spec for Region: py.UnimportableNode [/home/mariam/nupic/nta/engine/RegionImplFactory.cpp line 446]
    {'singleNodeOnly': False, 'inputs': {'bottomUpIn': {'count': 0, 'requireSplitterMap': True, 'description': 'Primary input for the node', 'isDefaultInput': True, 'dataType': 'Real64', 'required': True, 'regionLevel': False}}, 'commands': {}, 'description': 'The node spec of the NuPIC 2 Python TestNode', 'parameters': {'failInCompute': {'count': 1, 'description': 'For testing failure in compute()', 'dataType': 'Int32', 'accessMode': 'ReadWrite', 'defaultValue': '0', 'constraints': ''}, 'stringParam': {'count': 0, 'description': 'String parameter', 'dataType': 'Byte', 'accessMode': 'ReadWrite', 'defaultValue': 'nodespec value', 'constraints': ''}, 'real32Param': {'count': 1, 'description': 'Real32 scalar parameter', 'dataType': 'Real32', 'accessMode': 'ReadWrite', 'defaultValue': '32.1', 'constraints': ''}, 'failInInit': {'count': 1, 'description': 'For testing failure in __init__()', 'dataType': 'Int32', 'accessMode': 'ReadWrite', 'defaultValue': '0', 'constraints': ''}, 'uint32Param': {'count': 1, 'description': 'UInt32 scalar parameter', 'dataType': 'UInt32', 'accessMode': 'ReadWrite', 'defaultValue': '33', 'constraints': ''}, 'int32Param': {'count': 1, 'description': 'Int32 scalar parameter', 'dataType': 'Int32', 'accessMode': 'ReadWrite', 'defaultValue': '32', 'constraints': ''}, 'uint64Param': {'count': 1, 'description': 'UInt64 scalar parameter', 'dataType': 'UInt64', 'accessMode': 'ReadWrite', 'defaultValue': '65', 'constraints': ''}, 'int64arrayParam': {'count': 0, 'description': 'Int64 array parameter', 'dataType': 'Int64', 'accessMode': 'ReadWrite', 'defaultValue': '', 'constraints': ''}, 'int64Param': {'count': 1, 'description': 'Int64 scalar parameter', 'dataType': 'Int64', 'accessMode': 'ReadWrite', 'defaultValue': '64', 'constraints': ''}, 'real64Param': {'count': 1, 'description': 'Real64 scalar parameter', 'dataType': 'Real64', 'accessMode': 'ReadWrite', 'defaultValue': '64.1', 'constraints': ''}, 'real32arrayParam': {'count': 0, 'description': 'Real32 array parameter', 'dataType': 'Real32', 'accessMode': 'ReadWrite', 'defaultValue': '', 'constraints': ''}}, 'outputs': {'bottomUpOut': {'count': 0, 'dataType': 'Real64', 'isDefaultOutput': True, 'regionLevel': False, 'description': 'Primary output for the node'}}}
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    =================================================== 1 failed, 153 passed, 6 skipped in 22.18 seconds ==
    

    Was wondering if anybody has a solution for this issue. My Python version is 2.7; could that be the source of the problem? In other words, does NuPIC strictly require Python 2.6, or is it fine with 2.7 too?

    type:bug type:tests 
    opened by mariamr 66
  • Remove git submodule in favor of CMake commands

    Fixes #798. Fixes #634.

    Removed the nta git submodule and replaced it with CMake commands to manually clone, fetch, and check out the nupic.core repository. This also allows overriding the defaults on the command line via NUPIC_CORE_REMOTE, for the remote repository URL, and NUPIC_CORE_COMMITISH, for the specific SHA.

    There are several advantages to this approach:

    1. No git submodules (nupic git repository is no longer explicitly coupled to git://github.com/numenta/nupic.core.git)
    2. ~~CMakeLists.txt~~ .nupic_modules is the ground truth for nupic.core version dependency
    3. User can override NUPIC_CORE_REMOTE and NUPIC_CORE_COMMITISH at-will, if they are not using the official nupic.core repository.
    4. Developer can make changes in nta/ and push to remote repository of their choosing
    type:build status:ready core extraction type:development 
    opened by oxtopus 64
  • Failing database connection for ~/examples/swarm/test_db.py

    Hi,

    I am new to nupic. Just installed nupic on my mac.

    Before cloning, I created a folder called "nupic" in my user folder and then cloned the nupic source code from GitHub. So cloning created another "nupic" folder under my "nupic" folder.

    Hence I set the environment variable $NUPIC to /users/username/nupic/nupic.

    I am able to import nupic successfully, but unable to run the two commands below.

    1. python ~/examples/swarm/test_db.py

    I am getting the error below:

      File "~/examples/swarm/test_db.py", line 23, in
        from nupic.support.configuration import Configuration
      ImportError: No module named support.configuration

    Not sure why this is happening, although there is a "support" folder under $NUPIC/nupic.

    2. Python 2.7.10 (default, Jul 13 2015, 12:05:58) [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin
       Type "help", "copyright", "credits" or "license" for more information.

         import nupic
         from nupic.swarming import permutations_runner
         Traceback (most recent call last):
           File "", line 1, in
         ImportError: No module named swarming

    I am getting this error. Please help me with both of the errors.

    opened by viveklife2003 55
  • Anomaly modular

    1. Turn anomaly.py into a class for better reuse; move related logic from clamodel.py here.
    2. Nice speedup for Python (use numpy.sum()).
    3. New features/"implementation" for anomaly computation - parametrized, default off, so no change from the current behavior.
       3.1. New cumulative anomaly (sliding window).
       3.2. @subutai Please have a look if you like this approach; it's prepared for integration with your likelihood anomaly code.
    4. Added tests.
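
    For context, here's a rough sketch (not this PR's code) of the raw anomaly computation such a class wraps: the score is the fraction of currently active columns that were not predicted, which numpy can compute without a Python loop.

      # Illustrative only; mirrors the usual raw anomaly score definition.
      import numpy as np

      def compute_raw_anomaly(active_columns, prev_predicted_columns):
          """1.0 = nothing predicted correctly, 0.0 = everything predicted."""
          active = np.asarray(active_columns)
          if active.size == 0:
              return 0.0
          hits = np.in1d(active, np.asarray(prev_predicted_columns)).sum()
          return 1.0 - float(hits) / active.size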

    status:ready subject:anomaly 
    opened by breznak 53
  • [RFC] SpatialPooler: extract boosting to a separate file, "interface"

    Addresses boosting in the SpatialPooler and tries to put all the boosting-related logic into one class. The Boosting instance is then used as a member of the SP. Its only change and improvement is that it hides all the update*DutyCycles* logic from the SP.
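
    As a rough illustration (not this PR's code) of the duty-cycle logic such a Boosting class would hide from the SP, a global boost-factor update can be sketched as below; the exponential form follows the SP's boosting rule as I understand it, and the names are illustrative.

      # Toy sketch: columns firing less often than the target density get
      # boosted exponentially; boost_strength = 0 disables boosting entirely.
      import numpy as np

      def update_boost_factors_global(active_duty_cycles, boost_strength, target_density):
          duty = np.asarray(active_duty_cycles, dtype=float)
          return np.exp((target_density - duty) * boost_strength)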

    TODO:

    • [ ] looking for help with CAPNP serialization for Boosting (and SP) https://github.com/numenta/nupic/pull/2620#issuecomment-149392750

    Fixes: #824

    question status:in progress status:help wanted 
    opened by breznak 44
  • Added wrapAround parameter to SP

    Added a wrapAround parameter to the SP so the behavior of _mapPotential can be chosen by the user.

    Also made a relatively small change in the way _mapPotential maps inputs to columns when potentialRadius is smaller than the largest input dimension. See images below.
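
    A toy 1-D illustration (not the SP implementation) of what the wrapAround choice changes when mapping a column's potential pool over the input space:

      # With wrap_around the potential pool wraps past the input edges; without
      # it, indices are clipped at the borders. Purely illustrative.
      import numpy as np

      def potential_indices(center, radius, input_size, wrap_around):
          idx = np.arange(center - radius, center + radius + 1)
          if wrap_around:
              return np.unique(idx % input_size)
          return np.unique(np.clip(idx, 0, input_size - 1))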

    Fixes #1211 Fixes #1216 Fixes https://github.com/numenta/nupic/issues/884

    @chetan51 please review

    type:enhancement subject:algorithms 
    opened by baroobob 43
  • fix(sec): upgrade asteval to 0.9.23

    What happened?

    There is 1 security vulnerability found in asteval 0.9.1.

    What did I do?

    Upgrade asteval from 0.9.1 to 0.9.23 for vulnerability fix

    What did you expect to happen?

    Ideally, no insecure libs should be used.

    The specification of the pull request

    PR Specification from OSCS Signed-off-by:pen4[email protected]

    opened by pen4 0
  • Dependency Issue

    Please upgrade to Python 3.6+. Most of the libraries are obsolete right now or will be in the near future.

      File "/var/folders/_f/pbf0vr692pndzc449s346v0w0000gn/T/easy_install-l8i_KS/setuptools_scm-6.4.2/setup.py", line 31, in scm_version
        'Operating System :: Microsoft :: Windows',
    RuntimeError: support for python < 3.6 has been removed in setuptools_scm>=6.0.0
    ----------------------------------------
    

    ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

    opened by albertsalgueda 0
  • pip install nupic throws error

    Hi!

    I was trying to get acquainted with the nupic library, but it fails at the very first step, the "pip install nupic" command.

    Running this in a Google Colab notebook (and locally), I get the following error log:

      Using cached https://files.pythonhosted.org/packages/86/e8/f31a7c9a3b471f6125abc8addd37c17760e00affb45d6db088394a360086/unittest2-0.5.1.tar.gz
      Added unittest2==0.5.1 from https://files.pythonhosted.org/packages/86/e8/f31a7c9a3b471f6125abc8addd37c17760e00affb45d6db088394a360086/unittest2-0.5.1.tar.gz#sha256=aa5de8cdf654d843379c97bd1ee240e86356d3355a97b147a6f3f4d149247a71 (from nupic) to build tracker '/tmp/pip-req-tracker-f2clgsrs'
      Running setup.py (path:/tmp/pip-install-1j0aqyyp/unittest2/setup.py) egg_info for package unittest2
      Running command python setup.py egg_info
      Traceback (most recent call last):
        File "", line 1, in
        File "/tmp/pip-install-1j0aqyyp/unittest2/setup.py", line 12, in
          from unittest2 import version as VERSION
        File "/tmp/pip-install-1j0aqyyp/unittest2/unittest2/init.py", line 40, in
          from unittest2.collector import collector
        File "/tmp/pip-install-1j0aqyyp/unittest2/unittest2/collector.py", line 3, in
          from unittest2.loader import defaultTestLoader
        File "/tmp/pip-install-1j0aqyyp/unittest2/unittest2/loader.py", line 92
          except Exception, e:
                          ^
      SyntaxError: invalid syntax
      Cleaning up...
      Removing source in /tmp/pip-install-1j0aqyyp/nupic
      Removing source in /tmp/pip-install-1j0aqyyp/asteval
      Removing source in /tmp/pip-install-1j0aqyyp/mock
      Removing source in /tmp/pip-install-1j0aqyyp/ordereddict
      Removing source in /tmp/pip-install-1j0aqyyp/psutil
      Removing source in /tmp/pip-install-1j0aqyyp/pytest
      Removing source in /tmp/pip-install-1j0aqyyp/pytest-cov
      Removing source in /tmp/pip-install-1j0aqyyp/pytest-xdist
      Removing source in /tmp/pip-install-1j0aqyyp/python-dateutil
      Removing source in /tmp/pip-install-1j0aqyyp/PyYAML
      Removing source in /tmp/pip-install-1j0aqyyp/unittest2
      Removed unittest2==0.5.1 from https://files.pythonhosted.org/packages/86/e8/f31a7c9a3b471f6125abc8addd37c17760e00affb45d6db088394a360086/unittest2-0.5.1.tar.gz#sha256=aa5de8cdf654d843379c97bd1ee240e86356d3355a97b147a6f3f4d149247a71 (from nupic) from build tracker '/tmp/pip-req-tracker-f2clgsrs'
      Removed build tracker '/tmp/pip-req-tracker-f2clgsrs'
      ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

    Exception information:

      Traceback (most recent call last):
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/cli/base_command.py", line 153, in _main
          status = self.run(options, args)
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/commands/install.py", line 382, in run
          resolver.resolve(requirement_set)
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/legacy_resolve.py", line 201, in resolve
          self._resolve_one(requirement_set, req)
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/legacy_resolve.py", line 365, in _resolve_one
          abstract_dist = self._get_abstract_dist_for(req_to_install)
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/legacy_resolve.py", line 313, in _get_abstract_dist_for
          req, self.session, self.finder, self.require_hashes
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/operations/prepare.py", line 224, in prepare_linked_requirement
          req, self.req_tracker, finder, self.build_isolation,
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/operations/prepare.py", line 49, in _get_prepared_distribution
          abstract_dist.prepare_distribution_metadata(finder, build_isolation)
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/distributions/source/legacy.py", line 39, in prepare_distribution_metadata
          self.req.prepare_metadata()
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/req/req_install.py", line 563, in prepare_metadata
          self.metadata_directory = metadata_generator(self)
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/operations/generate_metadata.py", line 124, in _generate_metadata_legacy
          command_desc='python setup.py egg_info',
        File "/usr/local/lib/python3.7/dist-packages/pip/_internal/utils/subprocess.py", line 242, in call_subprocess
          raise InstallationError(exc_msg)
      pip._internal.exceptions.InstallationError: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

    Is there anything that can be done about it? Honestly, if I can't even install it, what use is it to me. But maybe I am missing something simple. Any help would be much appreciated!

    Regards, A.

    opened by arb-git 1
  • A question about model.run()

    Hi, I have a question about model.run(). If I have a time series t0,d0; t1,d1; t2,d2; ... when I execute model.run(d0), will the result be the prediction of d0 or d1?
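
    For reference, a minimal sketch of how this is usually read back, assuming a model configured for 1-step-ahead inference (as in the Overview sketch above) with "value" as the predicted field; the names are illustrative. The inference keyed by 1 in multiStepBestPredictions is the model's prediction for the next record in the stream.

      # Illustrative only: after running record (t0, d0), the inference keyed
      # by 1 is the model's prediction one step ahead, i.e. for d1.
      result = model.run({"timestamp": t0, "value": d0})
      predicted_d1 = result.inferences["multiStepBestPredictions"][1]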

    opened by qiezikuangge 0
  • Bump pyyaml from 3.10 to 5.4

    Bumps pyyaml from 3.10 to 5.4.

    Changelog

    Sourced from pyyaml's changelog.

    5.4 (2021-01-19)

    5.3.1 (2020-03-18)

    • yaml/pyyaml#386 -- Prevents arbitrary code execution during python/object/new constructor

    5.3 (2020-01-06)

    5.2 (2019-12-02)

    • Repair incompatibilities introduced with 5.1. The default Loader was changed, but several methods like add_constructor still used the old default:
      yaml/pyyaml#279 -- A more flexible fix for custom tag constructors
      yaml/pyyaml#287 -- Change default loader for yaml.add_constructor
      yaml/pyyaml#305 -- Change default loader for add_implicit_resolver, add_path_resolver
    • Make FullLoader safer by removing python/object/apply from the default FullLoader yaml/pyyaml#347 -- Move constructor for object/apply to UnsafeConstructor
    • Fix bug introduced in 5.1 where quoting went wrong on systems with sys.maxunicode <= 0xffff yaml/pyyaml#276 -- Fix logic for quoting special characters
    • Other PRs: yaml/pyyaml#280 -- Update CHANGES for 5.1

    5.1.2 (2019-07-30)

    • Re-release of 5.1 with regenerated Cython sources to build properly for Python 3.8b2+

    ... (truncated)

    Commits
    • 58d0cb7 5.4 release
    • a60f7a1 Fix compatibility with Jython
    • ee98abd Run CI on PR base branch changes
    • ddf2033 constructor.timezone: _copy & deepcopy
    • fc914d5 Avoid repeatedly appending to yaml_implicit_resolvers
    • a001f27 Fix for CVE-2020-14343
    • fe15062 Add 3.9 to appveyor file for completeness sake
    • 1e1c7fb Add a newline character to end of pyproject.toml
    • 0b6b7d6 Start sentences and phrases for capital letters
    • c976915 Shell code improvements
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies 
    opened by dependabot[bot] 0
Releases(1.0.5)
  • 1.0.5(Jun 1, 2018)

    • d81e128c6 NUP-2519: Update nupic.core version
    • 3371e4ef9 NUP-2519: Upgrade pycapnp to 0.6.3
    • 11a13c018 NUP-2518: Remove obsolete region initialization parameters from custom region example
    • 67724debf "pip install --use-wheel" was deprecated. See https://pip.pypa.io/en/stable/news/#deprecations-and-removals
    • 08daff4a3 Fix softmax overflow
    Source code(tar.gz)
    Source code(zip)
  • 1.0.4(Apr 12, 2018)

    • 682fd2e66c6d45912cd9c518106740143eff9fd9 : NUP-2506: Add test to all Serializable subclasses and fix related issues (#3826)
    • a7ab556a64e57064a8febbcf2ee341a1b1ff18ae : NUP-2506: fix traversal limit (#3823)
    • 54e1ffead7a8cedd9a2dab08bb02c7e5e3536bf3 : Added holidays parameter to date encoder (#3822)
    • 9bb7705ebd428e73f6efc180e77f7f127ce67c4d : version lock 'sphinx-autobuild' dependency 'tornado' (#3815)
    • 13c02de82c6bc52afc364fd1fdce88c5fa1aa92c : Update some legacy code examples. (#3814)
    • 33052e10dbe1030929223fc7e54d2d7c8b8a1ced : Fix metric spec schema bug (#3812)
    • 40e216915a172901b5dbe34932543fae68aa3631 : Fix lack of logging in run_swarm.py (#3809)
    • d8c740486198a6764b18c73bffe6005988435ac7 : NUP-2487: Update category prediction example
    • 38e40266a7ad2744fce904a02faa383676950e1e : Issue #1380: Update SP parameter validation test checking array dimensions
    • e470860e962db70a2b16c82d1647f10b3e985c42 : numenta/nupic.core#1380: Fix SP tests with correct dtype values
    • 38c9c7e1d7b161d9a4b2378cdae3652af704a481 : Add example for infer as well
    • 94e5f62e669dcbd55268c07b6aa30f391135ab4f : Include example to make isSparse parameter easier to understand
    • ffd1457037a52cb63a7be0cf004e886f3909f505 : Update KNN classifier documentation to make the input pattern requirements clear in both learn and infer
    • 5ecae91017c5f4f68c944a3f5b5d79d1276e2c59 : Changed distribution keyword, casted some attributes to float, removed setting list (#3784)
    • 41e5a6aefc649b08dbe948ba3e1db46f5aaaa603 : Issue #3783: Fixes test to compute pass the probation period (#3786)
    • 7e5f587ecd039520a53a4aa4608eb7ce577654f1 : Updating to XCode 8.3
    • 1aea72abde4457878a16288d6786ffb088f69164 : Update name of nyc_taxi.csv to nycTaxi.csv (#3776)
    Source code(tar.gz)
    Source code(zip)
  • 1.0.3(Sep 13, 2017)

  • 1.0.2(Aug 22, 2017)

  • 1.0.1(Aug 8, 2017)

  • 1.0.0(Jul 7, 2017)

    • Improved exception handling in /swarm/test_db.py (#3738)
    • DEVOPS-362 Remove unnecessary install script
    • DEVOPS-362 test removing dependency on install script entirely
    • DEVOPS-362 update install script
    • DEVOPS-362 Add initial version of missing install script
    • Added serialization guide to API docs. (#3737)
    • Put conditional around capnp for Windows
    • Complete new serialization in SpatialPooler
    • Catch nupic.core reference
    • NUP-2342: consolidate read/write into a single context
    • NUP-2342: Update examples to use capnp serialization
    • NUP-2341: Use capnp serialization for SDRClassifierDiff
    • NUP-2351: Remove TODOs from HTM Prediction Model test and fix bugs exposed by this test
    • NUP-2351: Add serialization to KNNAnomalyClassifierRegion
    • NUP-2351: Fix KNNClassifier serialization
    • NUP-2349 Implemented testCapnpWriteRead test for PreviousValueModel OPF class. Implemented PreviousValueModel.getProtoType. Return instance from PreviousValueModel.read.
    • NUP-2349 Implemented capnp serialization of PreviousValueModel
    • Put capnp import checks in place for Windows
    • Add serialization tests for TMRegion
    • NUP-2463 Serialize inferenceArgs, learningEnabled, and inferenceEnabled in opf Model.
    • Add support for different TM types in TMRegion serialization
    • Added Serializable to API docs, and inheritance links
    • Fixed Next ID value in comment in model.capnp
    • NUP-2464 Integrated ModelProto support into opf TwoGramModel.
    • Fixed input to SP in docs algo example (#3708)
    • NUP-2464 Serialize numPredictions and inferenceType via ModelProto member of HTMPredictionModelProto.
    • Added Serializable to all classes with a capnp write function (#3710)
    • Safe import of capnp for moving average proto
    • getSchema returns prototype
    • Remove unused _readArray
    • Rely on pycapnp/numpy native conversions in write/read
    • Add capnp conditionals for Windows
    • NUP-2351: Use dict directly instead of creating capnp message
    • Fixed Serializable extensions
    • Fix CPP breakages from changes
    • NUP-2351: Add capnp serialization to KNNClassifierRegion
    • Fix everything up to get serialization tests working with capnp serialization for BacktrackingTM
    • Added getSchema to MovingAverage
    • Added Serializable to all classes with a capnp write function
    • Finished up first pass implementation of BacktrackingTM serialization
    • NUP-2350: capnp serialization for TwoGramModel
    • NUP-2449 Completed implementation of HTMPredictionModel serialization tests.
    • NUP-2463 Implemented test (disabled) to demonstrate the bug "Predicted field and __inferenceEnabled are not serialized by HTMPredictionModel.write"
    • OPF Guide cleanup and link fixes (#3700)
    • NUP-2355 Add new serialization to TestRegion
    • remove SVMClassifierNode (#3697)
    • handle scalar values in the sdr classifier region
    • NUP-2346: Add serialization to knn_classifier
    • NUP-2458 Fixed and enabled SDRClassifierTest.testWriteReadNoComputeBeforeSerializing
    • NUP-2458 Implemented testWriteReadNoComputeBeforeSerializing in sdr_classifier_test.py that reproduces the "deque index out of bounds", but disabled the test, since it fails in a different way after the fix, most likely unrelated to the fix, which needs to be debugged
    • NUP-2398 Refactor test comparing different configurations
    • NUP-2458 Prevent index out of bounds when saving patternNZHistory after fewer than _maxSteps input records have been processed.
    • NUP-2458 Moved HTMPredictionModel serialization test to integration/opf
    • NUP-2449 Implement simple serialization/deserialization tests. This exposed a number of problems that need to be fixed before we can make further progress.
    • update sdr classifier doc
    Source code(tar.gz)
    Source code(zip)
  • 0.8.0(Jun 8, 2017)

    • Document ExperimentDescriptionAPI (#3679)
    • Update nupic.math API docs (#3677)
    • SP docs cleanup (#3671)
    • Allow multiple classifications for each record to SDRClassifier (#3669)
    • Updated BacktrackingTMCPP compute parameter name (#3667)
    • Fix HTMPredictionModel prediction using SDRClassifier (#3665)
    • Remove CLAClassifier (#3665)
    • Add capnp serialization to TMRegion (#3657)
    Source code(tar.gz)
    Source code(zip)
  • 0.7.0(Jun 2, 2017)

    0.7.0

    WARNING: This release contains breaking changes described in https://discourse.numenta.org/t/warning-0-7-0-breaking-changes/2200

    • Stop calling the backtracking_tm tests "tm tests" (#3650)
    • Update hierarchy demo to fix regression
    • Clean up BacktrackingTM's public API (#3645)
    • Make region file names snake_case (part 7) (#3640)
    • Removed references to obsolete tm_py_fast shim (#3639)
    • Updated OPF Metric API docs (#3638)
    • updated init.py to include missing encoders (#3487)
    • Fixed anomaly likelihood doc problems. (#3629)
    • Updates swarming, some region code to snake_case (part 6) (#3627)
    • Fixed OPF util helpers module names. (#3625)
    • Complete RST docs for nupic.support (#3624)
    • Deleted nupic.support.features* (unused) (#3622)
    • Removed nupic.support.exceptions (unused) (#3620)
    • Proper snake_case for nupic.support (part 5) (#3618)
    • Snake case nupic.encoders (part 4) (#3614)
    • Moved opf_helpers module to helpers (#3610)
    • Removes unused code from nupic.support (#3616)
    • Applying snake_case module name standards (PART 3) (#3611)
    • Fixed support initLogging docstring params
    • Finished OPF utils and env docs
    • Documented OPF Basic Env
    • Documented OPF ENv
    • Documenting OPF Task Driver
    • Documenting OPF experiment runner
    • Removed OPF utils PredictionElement (#3604)
    • Partial doc of experiment description api
    • NUP-2429 Add .gitignore with first_order_0.csv to prevent accidental commits of this generated file.
    • Documented cluster_params canned model config
    • Documented OPF model exceptions
    • Finished doccing opf_utils
    • Documenting OPF utils
    • Removed predictedField from HTMPredictionModel constructor (#3600)
    • NUP-2420 Renamed tm_shim.py to BacktrackingTM_shim.py
    • Removes inputRef / bookmark params from appendRecord (#3597)
    • Documented nupic.data (#3593)
    • OPF Model docstrings (#3589)
    • Remove obsolete nupic.research.bindings check
    • Removed unimplemented abstract methods (#3596)
    • Removed WeatherJoiner code from old example (#3595)
    • Updated snakecase opf_utils in RST docs (#3585)
    • Renamed tm_ccp test so it runs
    • Moved research tm_cpp_test.py back into nupic.research
    • Removed base.Encoder.formatBits() (#3582)
    • Replace dump() with define str in Encoders. Issue #1518 (#3559)
    • Complete encoder docstrings (#3579)
    • Removed nupic.research, moved contents to nupic.algorithms
    • move zip logic into 'build_script'
    • Add support for artifacts deployed to S3 named according to sha
    • Snake case module names PART 2 (#3561)
    • Remove old examples Part 2 (#3562)
    • NUP-2401: Check for prediction results discrepancies (#3558)
    • NUP-2397: rename TP* to TM* (#3555)
    • NUP-2405: quick-start guide for the Network API (#3557)
    • Snake case module names PART 1 (#3550)
    • NUP-2394: network API code example (#3520)
    • Remove old examples Part 1 (#3551)
    • Docs: InferenceShifter,ModelResult,SensorInput,InferenceType (#3549)
    • CLAModel name changed to HTMPredictionModel (#3516)
    • Updating FileRecordStream docstrings (#3545)
    • Fieldmeta docstrings (#3541)
    • Update KNNClassifier docstrings (#3535)
    • SDRClassifier docs, default config docs
    • Updates anomaly docstrings (#3537)
    • [NUP-2399] Added style guides to new guide (#3528)
    • NUP-2396 Allow SensorRegion to pass actValue and bucketIdx to SDRClassifierRegion
    • Added anomaly detection guide (#3521)
    • NUP-2389 Upgrade nupic.bindings dependency to 0.6.1 which has the requisite changes.
    • name change tpParams/tmEnable => tmParams/tmEnable (#3514)
    • NUP-2391: packages to document & progress tracking (#3517)
    • Quick Start
    • NUP-2389 Remove calls to Region::purgeInputLinkBufferHeads. Since we only support delay=0 in CLA models, we no longer need purgeInputLinkBufferHeads, because the new Link::compute logic in nupic.core now performs direct copy from src to dest for links with delay of 0.
    • Disable flatline hack in anomaly likelihood
    Source code(tar.gz)
    Source code(zip)
  • 0.6.0(Mar 30, 2017)

    0.6.0

    • Touch init even if model params dir exists
    • Auto-add init.py when model parms created
    • Shift code from otherwise unused nupic.engine.common_networks to example where it's used. Includes bugfix renaming rawAnomalyScore to anomalyScore
    • Explicitly import and use engine_internal in lieu of engine to avoid confusion, create nupic.engine.OS and nupic.engine.Timer by assignment rather than subclass
    • Change SparsePassThroughEncoder dtype error to ValueError
    • Fix for an unrelated change that resulted in numpy arrays being used in cpp implementation
    • Give better message for bad dtype to SparsePassThroughEncoder
    • Add test for passing float values for radius
    • Adds api docs for coordinate encoders
    • Cleanup CoordinateEncoder
    • Remove svm, cells4 tests that are moved to nupic.core.
    • Added missing anomaly stuff, fixed requirements
    • Moved sphinx deps out of requirements.txt
    • Fix hotgym_regression_test.py to make it work with nupic.core PR 1236.
    • Skip test when capnp is not available, such as windows as well as address feedback from Scott
    • Serialization base python class analogous to nupic.core Serializable c++ class
    • Adds a demo Jupyter notebook, useful for demonstrating usage of visualization framework and as an entrypoint for tinkering with different network topologies
    • Speed up SpatialPooler read method.
    • Rename normalProbability to tailProbability.
    • Use IterableCollection from engine_internal
    • Call Region.purgeInputLinkBufferHeads after compute() calls in CLAModel to integrate with the new delayed link implementation from nupic.core.
    • rename maxBoost to boostStrength in hotgym example
    • Disable backward compatibility serialization test
    • remove minPctActiveDutyCycle parameter form SP compatability test
    • update expected result in hotgym, result change due to different rounding rules in boosting
    • eliminate minPctActiveDutyCycle from spatial pooler
    • Rename maxBoost to BoostStrength
    • Stop changing the overlaps to do tie-breaking
    • Stop trying to get identical boost factors between py and cpp
    • set maxBoost in expdescriptionapi
    • update sp_overlap_test to use global inhibition
    • slight simplification of boostFactor calculation
    • Implement update boost factors local and global
    • Avoid floating point differences with C++ SpatialPooler
    • run C++ SP in spatial_pooler_boost_tests
    • update spatial pooler boost test
    • update boosting rules for spatial pooler
    • fix bug in setPotential
    • modified SP boosting rule
    Source code(tar.gz)
    Source code(zip)
  • 0.5.7(Nov 28, 2016)

    • Remove tests moved to nupic.core and update version to latest bindings release.
    • Update hello_tm.py
    • Removed linux and gcc from Travis build matrix
    • Makes anomaly_likelihood.py compliant to Python3
    • Update env vars and paths to simplify the AV configuration and installation.
    • Cleanup references to nupic.bindings and old CI code for manually fetching nupic.bindings since it should be found on PyPI without doing anything special.
    Source code(tar.gz)
    Source code(zip)
  • 0.5.6(Oct 20, 2016)

    • Since manylinux nupic.bindings wheel 0.4.10 has now been released to PyPi, we no longer need to install nupic.bindings from S3.
    • fix logic in _getColumnNeighborhood
    • Bugfix in flatIdx reuse after a segment is destroyed
    • Change private _burstColumn class method signature to accept a cellsForColumn argument in lieu of a cellsPerColumn argument. Move the calculation that otherwise depends on cellsPerColumn into the instance method.
    • TM: Support extensibility by using traditional methods
    • Update expected error for topology changes
    • Update expected hotgym result for topology changes
    • Adds RELEASE.md with documentation for releasing NuPIC.
    • Match nupic.core's SP neighborhood ordering.
    • Update inhibition comments and docstrings.
    • Introduce mechanism by which already-installed pre-release versions of nupic.bindings are ignored during installation
    • Assign self.connections value from self.connectionsFactory() rather than direct usage of Connections constructor. Allows better extensibility should the user want to change some aspect of the creation of the connections instance in a subclass
    • Removed obsolete directory src/nupic/bindings/
    • Remove the notion of "destroyed" Segments / Synapses
    • Enable proper subclassing by converting staticmethods that referenced TemporalMemory to classmethods that reference their class.
    • Fixup TemporalMemory.write() to handle columnDimensions as tuples.
    • Initialize columnDimensions as a tuple in test to reflect common convention. This forces the TemporalMemoryTest.testWriteRead test to fail in its current state.
    • Store "numActivePotentialSynapses". No more "SegmentOverlap".
    • Add a lot more scenarios to the TM perf benchmark
    • Moved audiostream example to htm-community
    • Safer "addToWinners" value. Play nicely with surgical boosting.
    • Bugfix: With no stimulus threshold, still break ties when overlaps=0
    • Clean up trailing whitespace and tabs
    • Properly apply the stimulus threshold
    • Add test for new "learn on predicted segments" behavior
    • Split compute into activateCells and activateDendrites
    • Grow synapses in predicted columns, not just bursting columns
    • Removed bundled get-pip.py and instead fetch version copy from S3
    • Removed .nupic_modules and now rely on versioned release of nupic.bindings on PyPI
    • averagingWindow size updated to improve HTM scores for RES-296
    • Build system updates for Bamboo (Linux), Travis (OS X), and AppVeyor (Windows)
    • Added nyc taxi example for anomaly detection
    Source code(tar.gz)
    Source code(zip)
  • 0.5.5(Aug 17, 2016)

    • Renamed a misclassed class name from ConnectionsTest to GroupByTest
    • not _ is => is not and fixes groupby comment and passes integration tests
    • overhaul to groupby, now 10% faster than current implementation
    • NUP-2299 Install specific versions of pip, setuptools, and wheel.
    • NUP-2299 Added platform-conditional dependency on pycapnp==0.5.8 using PEP-508.
    • lazy group_by and changes to GroupByGenerator
    • perf improvement to segment comparison in compute activity
    • 100% increase in speed
    • small perf changes
    • demonstrate that compatibility test works with predictedSegmentDec not 0.0
    • fixes subtle bug in numSegments that caused integration tests to fail
    • fixes bug where minIdx could be passed as a float rather than an int
    • skip serialization test if capnp is not installed
    • lints and updates comments in group_by.py and group_by_tests.py
    • gets same results as c++ temporal memory after group_by changes
    • ports group_by tests and they pass
    • adds groupByN utility function for use in TM
    • all connections tests written and passing, moved some stuff around and added missing function to connections
    • started porting new connections tests and minor changes to connections.py
    • improves permanence >= testing in computeActivity
    • confirmed python implementation is same as cpp version. Needs better perf now
    • adds back AnomalyRegion and Anomaly class in anomaly.py and related tests
    • fixes bug in growSynapses, almost exactly the same
    • Updated core SHA and default SDR classifier implementation
    • Updated SDRClassifier factory and region to handle cpp
    • changed input name from value to metricValue
    • updates variables names in anomaly_likelihood.py and AnomalyLikelihoodRegion
    • adds new connections methods
    • create new methods for creating/destroying synapses/segments
    • continues change of connections datastructures
    • move raw anomaly calculation back to nupic.algorithms.anomaly
    • Finished swarming/hypersearch separation
    • Moved base hypersearch classes to hypersearch
    • Moved experimentutils to nupic.swarming
    • Updated SDR classifier internals
    • calculate raw anomaly score in KNNAnomalyClassifier
    • removes anomaly.py dependency in network_api_demo.py
    • changes how TPRegion computes prevPredictdColumns and updates clamodel
    • Install pip from local copy, other simplifications
    • Fixup PYTHONPATH to properly include previously-defined PYTHONPATH
    • adds pseudocode to core functions
    • continues implementation of AnomalyLikelihoodRegion
    • Limit tests to unit after ovverriding pytest args on cli
    • DEVOPS-85 OS X build infrastructure for Bamboo CI environment
    • replaces segmentCMP with lambda and updates docstrings
    • uses arrays instead of dicts in computeActivity
    • Corrections to examples in tm_high_order.py
    • incorporates binary search into the algorithm where applicable
    • remove outdated nab unit tests
    • use Q function
    • Corrections to examples in tm_high_order.py
    • change to column generator
    • Added tm_high_order.py to show examples of the temporal memory.
    • Fixed conversion bug in SDRClassifier serialization
    • Fixed patternNZ proto writing.
    • Slight fix for pattern history handling in sdr classifier
    • Small fix on SDR classifier
    • Better fix for #3172, using the initialize() function and checking if _sdrClassifier is set
    • Updated learning rate for SDR classifier + slight changes to the error ranges in OPF test
    • Updated hotgym test with actual value and implemented first fix for OPF test
    • Updated tests and examples with SDR classifier
    • Finished updating examples with SDR classifier.
    • Updated hotgym and general anomaly examples with SDR classifier.
    • Updates pycapnp to 0.5.8
    • test_db-fixes avoids printing user password in plaintext
    • test_db-fixes updates database and table name
    • Corrections made to the spatial pooler tutorial.
    • changes maxBoost default value to 1.0
    • fixes connection tests and prints config file used in test_db.py
    • Moved back overlap accessors test for spatial_pooler from API tests to unit tests.
    • Added tutorial script for the spatial pooler. Modified README file accordingly.
    • Moved the unit test for SP overlap accessors to API tests.
    Source code(tar.gz)
    Source code(zip)
  • 0.5.4(Jun 2, 2016)

    • Added overlap accessors to spatial_pooler.py plus unit tests. (Code style corrected)
    • Updated VERSION in Spatial Pooler and added backward compatibility in setstate()
    • Added members overlaps and boostedOverlaps to SpatialPooler class.
    • Addition of overlaps and boostedOverlaps members to SpatialPooler class plus unit tests.
    • Added docs for return type in RDSE internal func.
    • tm_cpp with tuned parameters
    • RES-215 Changes to add params for new TM subclass for NAB
    • Remove main function from SDRClassifierRegion
    • remove unused methods from SDRClassifierRegion
    • Add simple end-to-end integration test for SDRClassifierRegion
    • use string split instead of eval to parse strings
    • correct inconsistent error msg in sdr_classifier_factory.py
    • Fix readWrite test of SDR classifier
    • Add SDRClassifier Region to pyRegions
    • Initial implementation of SDRClassifier Region
    • implement SDR classifier factory
    • Add capnp proto for SDR classifier region
    • Add default value for SDR classifier implementation in nupic-default.xml
    Source code(tar.gz)
    Source code(zip)
  • 0.5.3(May 16, 2016)

    • Default DATETIME columns to NULL in ClientJobsDAO for compatibility across mysql versions. As of mysql 5.7.8, values of 0 are not allowed for DATETIME columns, and CURRENT_TIMESTAMP is semantically inappropriate for those columns.
    • Suppress this optional dependency on matplotlib without logging, because python logging implicitly adds the StreamHandler to root logger when calling logging.debug, etc., which may undermine an application's logging configuration
    • Bugfix: Write the 'actualValues' to the output, don't reassign the output
    • Fixed Username Regex in ClientJobsDAO
    • cleaned up region a bit to make it compliant with numenta's coding guidelines.
    Source code(tar.gz)
    Source code(zip)
  • 0.5.2(Apr 15, 2016)

  • 0.5.1(Mar 28, 2016)

    • Improves SDR classifier and tests
    • Modify the continuous online learning test
    • Add 3 tests on multiple item prediction
    • Fix test_pFormatArray
    • Implement SDR classifier in NuPIC
    • Make the 'arrayTypes' list more informative
    • Add getParameter/setParameter support for Bool and BoolArray
    • Improved anomaly params (from NAB)
    • Added minSparsity option
    • Get the encoder's outputWidth via parameter
    • Use nupic.core encoders from nupic via the Network API
    • Fix bugs and inconsistencies in the custom region demo
    • Adds BINDINGS_VERSION envvar to wheel filename (for iterative builds)
    Source code(tar.gz)
    Source code(zip)
  • 0.5.0(Feb 15, 2016)

    • Removes references to FastTemporalMemory.
    • Lower TM epsilon threshold for compatibility.
    • Add documentation for the Monitor Mixins
    • Removed FastTemporalMemory from nupic
    • Update temporal memory compatibility test to use C++ TM.
    • Sort segments before iterating for compatibility with C++
    • Sort unpredictedActiveColumns before iterating for compatibility with C++
    Source code(tar.gz)
    Source code(zip)
  • 0.4.5(Jan 27, 2016)

  • 0.4.3(Jan 26, 2016)

  • 0.4.2(Jan 25, 2016)

  • 0.4.1(Jan 25, 2016)

  • 0.4.0(Jan 25, 2016)

    • Updated hello_tm.py to use accessors
    • Updated TP_shim.py to use accessors. Updated columnForCell and _validateCell in FastTemporalMemory to conform to their docstrings, which is needed for the change to TP_shim.py
    • Updated temporal memory monitor mixin to use accessors
    • Updated temporal_memory_test.py to use accessor methods.
    • Added accessors to temporal_memory.py (see the sketch after this release entry)
    • Change temporalImp to tm_py for both networks and add comment about it being a temporary value until C++ TM is implemented
    • Refactored to remove common code between network_checkpoint_test.py and temporal_memory_compatibility_test.py
    • Use named constants from nupic.data.fieldmeta in aggregator module instead of naked constants.
    • Fix AttributeError: 'TPShim' object has no attribute 'topDownCompute'
    • Support more parameters in TPShim
    • Serialize remaining fields in CLAModel using capnproto
    • Enforce pyproj==1.9.3 in requirements.txt
    • Use FastCLAClassifier read class method instead of instance method
    • Have CLAClassifierFactory.read take just the proto object
    • Add capnp serialization to CLAClassifierRegion
    • Add capnp serialization to SPRegion
    Source code(tar.gz)
    Source code(zip)
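
The accessor-related items above correspond roughly to the following sketch; the module path and method names are assumptions based on this release line's Python TemporalMemory:

```python
# Hedged sketch: drive the Python TemporalMemory with a set of active
# columns and read its state back through accessors rather than internals.
from nupic.research.temporal_memory import TemporalMemory  # assumed module path

tm = TemporalMemory(columnDimensions=(64,), cellsPerColumn=4)

activeColumns = {3, 17, 41}      # indices of the currently active columns
tm.compute(activeColumns, learn=True)

print(tm.getActiveCells())       # cells that became active this timestep
print(tm.getPredictiveCells())   # cells in the predictive state for the next timestep
```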
  • 0.3.6(Dec 18, 2015)

  • 0.3.5(Oct 8, 2015)

    • Raise explicit exception if user passes non-str path
    • SP: simplify local inhibition
    • SP: adapt tests, sort winning columns output
    • SP: simplify active columns assignment
    • SP: simplify global inhibition
    • Renamed file to hello_tm.py and updated comments
    Source code(tar.gz)
    Source code(zip)
  • 0.3.4(Sep 30, 2015)

    • Added src/nupic/frameworks/opf/common_models/cluster_params.py and supporting files from numenta-apps htmengine (see the usage sketch after this release entry). A separate numenta-apps PR will remove this code from htmengine.
    • fixes #2592
    • fix for #2265
    • fix for bug #2265
    • Fixup Dockerfile to install nupic.bindings, and other cleanup
    • Adding C++ compiler requirement to README.
    • Fix for test failure
    • Fixed stream definition reference error.
    • Reduce default reestimation period.
    • Remove greedy reestimation of distribution
    • Pointing README to proper bindings version.
    • Continuing work on 0.3.4.dev0.
    • removing a test that depends on nupic.vision
    • PCA_Node test: some fixes, WIP
    • formatting
    • test for PCANode region
    • remove Pillow from requirements.txt as it was used for vision only
    • fix merge mistake in csv file
    • move test from PCANode to nupic.vision unittest
    Source code(tar.gz)
    Source code(zip)
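
The cluster_params.py addition above is typically used to build canned anomaly-model parameters for the OPF. The function and key names below are taken from the change log and common usage, and should be read as assumptions:

```python
# Hedged sketch: build the canned anomaly-detection model params brought
# over from numenta-apps htmengine, then create an OPF model from them.
from nupic.frameworks.opf.common_models.cluster_params import (
    getScalarMetricWithTimeOfDayAnomalyParams)
from nupic.frameworks.opf.modelfactory import ModelFactory  # assumed module path

params = getScalarMetricWithTimeOfDayAnomalyParams(
    metricData=[0],   # placeholder sample; only seeds the encoder ranges
    minVal=0.0,
    maxVal=100.0)

model = ModelFactory.create(modelConfig=params["modelConfig"])
model.enableInference(params["inferenceArgs"])
```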
  • 0.3.3(Sep 22, 2015)

    • Include additional file types in MANIFEST.in, consistent with setup.py
    • Pattern and Sequence machines using nupic::Random
    • Wrap sparse matrix implementations with cortical column-centric semantics as a way to abstract away the underlying implementation
    • Re-enable testHotgymRegression
    Source code(tar.gz)
    Source code(zip)
  • 0.3.2(Sep 21, 2015)

    • Update to nupic.bindings version with fix for platform differences
    • Rename nupic directory to src/nupic
    • Updated S3 URL to nupic.bindings for Linux install
    • Fix paths for data files in an integration test
    • Fix issue with storing temporary file in wrong location in integration test
    Source code(tar.gz)
    Source code(zip)
  • 0.3.1(Sep 17, 2015)

    • Specify nupic.bindings version to match commit sha (0.2).
    • Use logging.debug for emitting the message about not being able to import matplotlib; we log it at debug level to avoid polluting the logs of apps and services that don't care about plotting.
    • Add Dockerfile ready to perform swarming.
    • Removes PCANode
    • Updated Linux binary install instructions.
    Source code(tar.gz)
    Source code(zip)
  • 0.3.0(Sep 3, 2015)

  • 0.2.12(Sep 3, 2015)

    • Implemented unit tests for the new features in AnomalyLikelihood class.
    • Convert AnomalyLikelihood._historicalScores to a user-configurable sliding window, instead of accumulating all of the incoming data points; this significantly improved performance. Added AnomalyLikelihood.forceModelRefresh() method (see the sketch after this release entry).
    • Update nupic.core to include backwards compatibility fix for RandomImpl.
    • Uninstall pycapnp to avoid running tests that utilize the functionality and currently fail with Duplicate ID error.
    • Makes pycapnp and corresponding serialization optional. If pycapnp is not installed then the corresponding serialization tests will be skipped.
    • Add Multiple Prediction Test for NegLL Metric
    • Add test for NegLL Error Metric
    • Fix Orphan Decay Bug in temporal memory test
    • Change decreasing overlaps test for coordinate encoder to not require a strict decrease (staying the same is ok).
    • Allow specifying MonitoredTemporalMemory as TM implementation through OPF
    • include bucket likelihood and classifier input in clamodel
    • update metrics managers to pass model results to metrics
    • Introduced a computeFlag to prevent double computation in the event that customCompute() is called at the same time as compute()
    • Added numRecords param for consistency with the newly added infer method in FastCLAClassifier
    • Check whether the classifier has a maxCategoryCount attribute; if not, set it to resolve backward compatibility issues
    • Renamed numCategories to maxCategoryCount to be consistent between the KNN and CLA classifiers
    • Made a new experimentutils file containing InferenceElement, InferenceType, and ModelResult duplicates, which we will want to change in the future
    Source code(tar.gz)
    Source code(zip)
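
A sketch of the sliding-window AnomalyLikelihood usage described above; the historicWindowSize parameter name is an assumption inferred from the change log, while forceModelRefresh() is named there directly:

```python
# Hedged sketch: the historical scores now live in a bounded sliding window
# instead of growing without limit.
import datetime
from nupic.algorithms.anomaly_likelihood import AnomalyLikelihood

likelihoodHelper = AnomalyLikelihood(historicWindowSize=8640)  # parameter name assumed

# value is the raw metric reading; anomalyScore is the raw score from the model.
likelihood = likelihoodHelper.anomalyProbability(value=42.0,
                                                 anomalyScore=0.25,
                                                 timestamp=datetime.datetime.utcnow())
print(likelihood)

likelihoodHelper.forceModelRefresh()  # re-estimate the distribution on demand
```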