
dc.contributor.author: Felsberg, Michael
dc.contributor.author: Wiklund, Johan
dc.contributor.author: Jonsson, Erik
dc.contributor.author: Moe, Anders
dc.contributor.author: Granlund, Gösta
dc.date.accessioned: 2019-10-24T03:55:06Z
dc.date.available: 2019-10-24T03:55:06Z
dc.date.created: 2017-01-05 00:50
dc.date.issued: 2006
dc.identifier: oai:DiVA.org:liu-54326
dc.identifier: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-54326
dc.identifier.uri: http://hdl.handle.net/20.500.12424/824893
dc.description.abstract: One major goal of the COSPAL project is to develop an artificial cognitive system architecture with the capability of exploratory learning. Exploratory learning is a strategy that allows the system to apply generalization on a conceptual level, resulting in an extension of its competences. Whereas classical learning methods aim at the best possible generalization, i.e., concluding from a number of samples of a problem class to the problem class itself, exploration aims at applying acquired competences to a new problem class. Incremental or online learning is an inherent requirement for exploratory learning. Exploratory learning requires new theoretical tools and new algorithms. In the COSPAL project, we mainly investigate reinforcement-type learning methods for exploratory learning, and in this paper we focus on their algorithmic aspects. Learning is performed in terms of four nested loops: the outermost loop reflects the user-reinforcement feedback loop, the two intermediate loops switch between different solution modes at the symbolic and sub-symbolic levels, respectively, and the innermost loop executes the acquired competences in terms of perception-action cycles. We present a system diagram that explains this process in more detail. We discuss the learning strategy in terms of learning scenarios provided by the user. This interaction between the user ('teacher') and the system is a major difference from most existing systems, where the system designer places his world model into the system. We believe that this is the key to extendable, robust system behavior and to successful interaction between humans and artificial cognitive systems. We furthermore address the issue of bootstrapping the system and, in particular, the visual recognition module. We give some more in-depth details about our recognition method and how feedback from higher levels is implemented. The described system is, however, work in progress, and no final results are available yet. The preliminary results achieved so far clearly point towards a successful proof of the architecture concept.
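The four nested loops named in the abstract can be sketched as plain control flow. The sketch below is purely illustrative: every function and variable name is invented here (the record gives no implementation details of the COSPAL system), and the two intermediate mode-switching loops are collapsed into a single iteration over a symbolic and a sub-symbolic mode.

```python
def act(percept):
    """Placeholder perception-action step: echo the percept as an action."""
    return f"act({percept})"

def run_system(scenarios, max_cycles=3):
    """Hypothetical sketch of the four nested loops from the abstract:
    user-feedback loop -> solution-mode switching -> perception-action cycles.
    """
    log = []
    for scenario in scenarios:                       # outermost: user reinforcement feedback
        for mode in ("symbolic", "sub-symbolic"):    # intermediate: solution-mode switching
            for cycle in range(max_cycles):          # innermost: perception-action cycles
                percept = f"{scenario}:{mode}:{cycle}"
                log.append(act(percept))
    return log
```

For one scenario and two cycles per mode, `run_system(["pick"], max_cycles=2)` produces four logged actions, one per perception-action cycle.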
dc.format.medium: application/pdf
dc.language.iso: eng
dc.publisher: Linköpings universitet, Bildbehandling
dc.publisher: Linköpings universitet, Tekniska högskolan
dc.publisher: Linköping : Linköping University Electronic Press
dc.relation.ispartof: LiTH-ISY-R, 1400-3902 ; 2738
dc.rights: info:eu-repo/semantics/openAccess
dc.subject: COSPAL project
dc.subject: TECHNOLOGY
dc.subject: TEKNIKVETENSKAP
dc.title: Exploratory Learning Structure in Artificial Cognitive Systems
dc.type: Report
ge.collectioncode: OAIDATA
ge.dataimportlabel: OAI metadata object
ge.identifier.legacy: globethics:10420967
ge.identifier.permalink: https://www.globethics.net/gel/10420967
ge.lastmodificationdate: 2017-01-05 00:50
ge.lastmodificationuser: admin@pointsoftware.ch (import)
ge.submissions: 0
ge.oai.exportid: 148934
ge.oai.repositoryid: 90
ge.oai.setname: Report
ge.oai.setname: Science
ge.oai.setspec: liu
ge.oai.setspec: report
ge.oai.streamid: 2
ge.setname: GlobeEthicsLib
ge.setspec: globeethicslib
ge.link: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-54326


