Unsupervised Hierarchical Temporal Abstraction by Simultaneously Learning Expectations and Representations
Katherine Metcalf, David Leake
Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence
Main track. Pages 3144-3150.
https://doi.org/10.24963/ijcai.2019/436
This paper presents ENHAnCE, an algorithm that simultaneously learns a predictive model of the input stream and generates representations of the concepts being observed. Following cognitively inspired models of event segmentation, ENHAnCE uses expectation violations to identify boundaries between temporally extended patterns. It applies its expectation-driven process at multiple levels of temporal granularity to produce a hierarchy of predictive models that enable it to identify concepts at multiple levels of temporal abstraction. Evaluations show that the temporal abstraction hierarchies generated by ENHAnCE closely match hand-coded hierarchies for the test data streams. Given language data streams, ENHAnCE learns a hierarchy of predictive models that capture basic units of both spoken and written language: phonemes, syllables, morphemes, lexemes, and words.
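The core idea of expectation-driven segmentation can be illustrated with a minimal sketch. This is not the ENHAnCE algorithm itself (the paper's predictive models and hierarchy are more elaborate); it assumes a toy online bigram predictor and the hypothetical name `segment_by_surprise`, placing a boundary whenever the incoming symbol violates the model's expectation:

```python
# Minimal sketch of expectation-violation segmentation (an illustrative
# assumption, not the actual ENHAnCE algorithm): an online bigram model
# predicts the next symbol and a boundary is placed wherever the
# prediction fails, i.e. the expectation is violated.
from collections import defaultdict, Counter

def segment_by_surprise(stream, threshold=0.5):
    """Split `stream` at positions where the estimated probability of
    the incoming symbol falls below `threshold`."""
    counts = defaultdict(Counter)   # counts[prev][nxt] = observed frequency
    segments, current = [], [stream[0]]
    for prev, nxt in zip(stream, stream[1:]):
        total = sum(counts[prev].values())
        prob = counts[prev][nxt] / total if total else 0.0
        if prob < threshold:        # expectation violated -> segment boundary
            segments.append("".join(current))
            current = []
        current.append(nxt)
        counts[prev][nxt] += 1      # update the predictive model online
    segments.append("".join(current))
    return segments
```

On a repetitive stream such as `"abcabcabc"`, early transitions are surprising and produce boundaries, but once the pattern has been observed the predictor stops segmenting, so recurring patterns coalesce into longer units. Stacking such segmenters, with each level consuming the segments emitted below it, gives the flavor of the paper's multi-level temporal abstraction hierarchy.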
Keywords:
Machine Learning: Online Learning
Machine Learning: Reinforcement Learning
Machine Learning: Time-series; Data Streams
Humans and AI: Cognitive Modeling