
RESEARCH PRODUCT

On the impact of forgetting on learning machines

Rūsiņš Freivalds, Carl Smith, Efim Kinber

subject

Theoretical computer science; Active learning (machine learning); Computer science; Semi-supervised learning; Mutual recursion; Artificial intelligence; Instance-based learning; Hierarchy; Forgetting; Kolmogorov complexity; Learnability; Algorithmic learning theory; Online machine learning; Inductive reasoning; Pumping lemma for regular languages; Computational learning theory; Hardware and Architecture; Control and Systems Engineering; Sequence learning; Software; Cognitive psychology; Information Systems

description

People tend not to have perfect memories when it comes to learning, or to anything else for that matter. Most formal studies of learning, however, assume a perfect memory. Some earlier approaches restricted only the number of items that could be retained. We introduce a complexity-theoretic accounting of memory utilization by learning machines. In our new model, memory is measured in bits as a function of the size of the input. There is a hierarchy of learnability based on increasing memory allotment. The lower bound results are proved using an unusual combination of pumping and mutual recursion theorem arguments. For technical reasons, it was necessary to consider two types of memory: long term and short term.
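To give a concrete feel for the kind of accounting the abstract describes, the sketch below is a minimal, hypothetical illustration (not taken from the paper): a learner identifies a threshold function from labelled examples while we charge, in bits, only the long-term state it carries between examples; short-term working memory used inside a single step is not charged. The class, the learner, and all names are assumptions made purely for illustration; the paper's actual model and its pumping/mutual-recursion lower-bound arguments are substantially more involved.

```python
"""
Illustrative sketch (not from the paper) of memory-limited learning.

The learner identifies a threshold function f_k(x) = 1 iff x >= k from
labelled examples, and we count in bits the long-term state it retains
between examples, mirroring the long-term/short-term split mentioned in
the abstract.  Everything here is a hypothetical example.
"""


def bits(value: int) -> int:
    """Bits needed to store a non-negative integer (at least one bit)."""
    return max(1, value.bit_length())


class ThresholdLearner:
    """Retains only the tightest interval [lo, hi] that can contain the threshold."""

    def __init__(self, domain_max: int):
        self.lo = 0                # one more than the largest x seen with label 0
        self.hi = domain_max       # smallest x seen with label 1

    def observe(self, x: int, label: int) -> None:
        # Short-term work: a comparison and an update; nothing else is retained.
        if label == 1:
            self.hi = min(self.hi, x)
        else:
            self.lo = max(self.lo, x + 1)

    def hypothesis(self) -> int:
        # Current guess for the threshold k.
        return self.hi

    def long_term_bits(self) -> int:
        # Long-term memory is just the two interval endpoints, i.e.
        # O(log domain_max) bits, no matter how many examples were seen.
        return bits(self.lo) + bits(self.hi)


if __name__ == "__main__":
    k, domain_max = 37, 1000
    learner = ThresholdLearner(domain_max)
    for x in [10, 500, 36, 37, 200, 40]:
        learner.observe(x, int(x >= k))
        print(f"after x={x:4d}: guess={learner.hypothesis():4d}, "
              f"long-term memory={learner.long_term_bits()} bits")
```

The point of the toy example is only that the charged quantity is a bit count that grows (or stays bounded) as a function of the input, rather than a count of stored items, which is the distinction the abstract draws against earlier item-limited models.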

https://doi.org/10.1145/227683.227685