CREF TALK, May 10th, 2023
Title: Innovation preparedness in self-adaptive probabilistic neural networks
Speaker: Alessandro Londei (Sony CSL - Rome)

Can imagination drive a better learning framework? Does exploring the space of possibilities lead to an improved representation of the knowledge acquired by training? Although these questions are challenging and a definitive answer is still far away both in the human sciences and in artificial cognitive systems, several authors have stressed the importance of exploring the space of possibilities to appropriately describe the incremental evolution of biological and cultural systems. This talk introduces a method called Dreaming Learning, inspired by dreaming in evolved animals, which allows a recurrent neural network to explore the space of possibilities efficiently. In particular, owing to the mechanisms driving the machine's explorative activity, the spanned space can be related to the concept of the Adjacent Possible (AP) introduced by S. Kauffman. This relationship allows the Dreaming Learning technique to be applied to the learning of Pólya urn sequences, one of the most appropriate AP models in statistical modeling. Dreaming Learning significantly improves the reconstruction of the statistical features of the non-stationary Pólya reference system, in particular the estimation of the innovation rate, which vanishes under traditional learning algorithms. Moreover, the exploration/exploitation dichotomy at the heart of Dreaming Learning highlights the importance of prefiguring the future in order to improve knowledge of the past and, at the same time, anticipate the information carried by upcoming novelties in a general learning framework.
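For readers unfamiliar with the reference system mentioned above, the following is a minimal sketch of a Pólya urn with innovation ("triggering"), a standard way to model Adjacent Possible dynamics: each draw reinforces the drawn color, and drawing a never-seen color adds brand-new colors to the urn, expanding the space of possibilities. The parameter names `rho` (reinforcement) and `nu` (novelties triggered per innovation) and their values are illustrative assumptions, not taken from the talk.

```python
import random

def polya_urn_with_triggering(n_draws, rho=2, nu=1, seed=0):
    """Simulate a Polya urn sequence with innovation.

    Every draw puts rho extra copies of the drawn color back in the urn
    (exploitation); the first time a color is drawn, nu + 1 entirely new
    colors are added (exploration of the adjacent possible).
    """
    rng = random.Random(seed)
    urn = [0]            # start with a single color
    next_color = 1       # label for the next never-seen color
    seen = set()
    sequence = []
    for _ in range(n_draws):
        c = rng.choice(urn)
        sequence.append(c)
        urn.extend([c] * rho)              # reinforcement
        if c not in seen:                  # novelty event
            seen.add(c)
            urn.extend(range(next_color, next_color + nu + 1))
            next_color += nu + 1
    return sequence

seq = polya_urn_with_triggering(10_000)
print(len(set(seq)))   # number of distinct colors grows sublinearly (Heaps-like)
```

In such a non-stationary sequence the rate at which new colors appear (the innovation rate) never settles to zero, which is precisely the statistical feature the abstract says traditional learning algorithms fail to capture.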