Because Blerpl learns multiple simultaneous causes, it can do something NuPIC cannot: it can learn grammar from text. Here’s what I wrote in my notebook on the day I began to figure out how to do that.

Let’s say you are uttering the sentence “The pen is blue.” At the moment you are pronouncing the word “pen”, several causes are active at once:
- One cause is the sentence structure: “The noun is adjective”, and not some other structure.
- Another cause is that the noun is “pen”, not “book” or some other noun.
- Yet another cause is your position in the sentence structure: you’re up to the noun slot.
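To make the factored-causes idea concrete, here is a minimal sketch in Python. The names (`structure`, `noun`, `position`) and the template-expansion helper are my own illustration of the decomposition described above, not Blerpl’s actual representation or API.

```python
# Hypothetical sketch: the three causes that are simultaneously active
# while the word "pen" is being pronounced. Each cause varies
# independently of the others.
causes_during_pen = {
    "structure": "The NOUN is ADJECTIVE",  # which sentence template is in use
    "noun": "pen",                         # which noun fills the NOUN slot
    "position": "NOUN",                    # where we are within the template
}

def render(structure: str, noun: str, adjective: str) -> str:
    """Expand a sentence template by filling in its slots."""
    return structure.replace("NOUN", noun).replace("ADJECTIVE", adjective)

print(render(causes_during_pen["structure"], "pen", "blue"))
# The pen is blue
```

The point of the decomposition is that swapping any one cause (say, “pen” for “book”) changes the output sentence without touching the other two.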
Something else very valuable happens when the network learns multiple causes. At the instant the word “is” is being pronounced, neither the noun cause nor the adjective cause is relevant, so they can take on “don’t care” states. This leads to the natural emergence of temporal slowness: causes can spend much of their time being irrelevant to what is happening right now. The don’t care states need not be stored, which is key to how Blerpl works as a lossless compression algorithm.
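A small sketch of the “don’t care” idea, again with illustrative names of my own choosing: at each time step we store only the causes that are relevant to the current word, and the omitted entries are the don’t-cares that cost no storage.

```python
# Hypothetical sketch of "don't care" states for "The pen is blue".
# Each dict holds only the cause states that are relevant at that step;
# any cause absent from a step is a don't-care and is simply not stored.
timeline = [
    {"position": "DET"},                        # "The":  only position matters
    {"position": "NOUN", "noun": "pen"},        # "pen":  the noun cause is relevant
    {"position": "VERB"},                       # "is":   noun and adjective are don't-cares
    {"position": "ADJ", "adjective": "blue"},   # "blue": the adjective cause is relevant
]

def stored_entries(timeline: list[dict]) -> int:
    """Count how many cause states must actually be stored."""
    return sum(len(step) for step in timeline)

dense = len(timeline) * 3        # storing all 3 causes at every step
sparse = stored_entries(timeline)
print(dense, sparse)
# 12 6
```

Here half the cause states are don’t-cares, so the sparse encoding stores half as many entries as the dense one would, without losing any information needed to reconstruct the sentence.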