
associative_memory's Introduction

Associative Memories

This paper was written for an assignment in the Machine Learning course given at Uppsala University in the spring of 2016.

Public domain

This paper and any original content of this repository is hereby released into the public domain.

associative_memory's People

Contributors

chilinot, chrisking2020, mewmew


associative_memory's Issues

meta: Licensing of work

@Chilinot and @chrisking2020, would you be alright with releasing our work into the public domain? Another alternative might be to use a Creative Commons license.

Personally, I'd be very happy to release the work into the public domain.

Cheers

  • Robin

meta: Identify Aim and Objectives

This issue is meant to track discussions related to identifying the aim and objective of the project.

From ML course information 2016.pdf:

Survey some sub-field of machine learning, i.e. search for scientific papers on a subject,
summarize them, identify open and closed problems and define the 'state of the art',

TODO: Olle pointed out that we should define what we mean by associative memories, to make the aim and objective clearer. This should be easier once we finish the preliminary literature review (tracked by issue #1).

The primary aim of the project is to identify open and closed problems related to machine learning approaches to achieve associative memories. A secondary aim of the project is to compare the 'state of the art' machine learning approaches to human capabilities for associative memories, as understood by neuroscience, psychology and other research disciplines.

To achieve the primary aim of the project, the following objectives have been identified.

  1. foo
  2. bar
  3. TODO

To achieve the secondary aim of the project, the following objectives have been identified.

  1. baz
  2. qux
  3. TODO

We still need to discuss and figure out the details regarding what we wish to achieve with this project. Once we know what we want to achieve, it is easy to break this down into a set of objectives which would help us reach this aim. This discussion will be fueled by whatever we may uncover from the preliminary literature review, the starting point of which is tracked in #1.

meta: Project Proposal

Write a project proposal.

Should roughly contain the following sections.

  • Project Aim and Objectives (see #2)
  • Project Deliverables
    • A meta-study report
    • A presentation
  • Project Constraints
    • What is the scope and outline of the project?
  • Project Approach
    • Brief discussion on how we will do the project; i.e. meta-study.
  • Starting Point of Research (see #1)
  • Project Plan (see #3)

review: Anderson, James A. "A simple neural network generating an interactive memory." Mathematical Biosciences 14.3 (1972): 197-220

A heavy, in-depth article. I have not yet read it in its entirety, but I don't think we need a full review of it in the report as of now.

Short summary:
The article is a very in-depth analysis and discussion of how an interactive memory, or what we would today call a "linear associative memory", would be constructed using artificial neural networks. It even includes sections where experiments were performed on monkeys in order to understand how neurons interact with each other.
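In modern terms, the mechanism the article analyzes can be sketched as a Hebbian outer-product memory. The following is a minimal hypothetical illustration, not Anderson's exact formulation: associations are stored by summing outer products of value and key vectors, and recall is a single matrix-vector product.

```python
import numpy as np

# Minimal sketch of a linear associative memory (hypothetical illustration).
# Associations are stored by summing Hebbian outer products; recall is a
# single matrix-vector product. With orthonormal keys, recall is exact.
keys = np.eye(3)                   # three orthonormal key vectors
values = np.array([[1.0, 0.0],    # value paired with key 0
                   [0.0, 1.0],    # value paired with key 1
                   [1.0, 1.0]])   # value paired with key 2

# Learning: W = sum_i outer(value_i, key_i)
W = sum(np.outer(v, k) for v, k in zip(values, keys))

# Recall: project a key through the weight matrix.
recalled = W @ keys[1]
print(recalled)  # recovers the value stored for key 1
```

With non-orthogonal keys the recalled vector picks up cross-talk from the other stored pairs, which is one of the capacity issues the associative-memory literature revolves around.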

report: Related Work, sub-task 4

As this is a meta-study, the related work, or background research, section will be the main content of the paper. Therefore, this section has been split into several sub-tasks, to be tracked by meeting deadlines.

For each of these sub-tasks a set of research papers (see #1) will have been read, reviewed, critically evaluated, and summarized in the report.

TODO: Add specific research papers to be read, critically evaluated and summarized by each project member for this sub-task.

  • ROBIN: Arel, Itamar, Derek C. Rose, and Thomas P. Karnowski. "Deep machine learning-a new frontier in artificial intelligence research [research frontier]." Computational Intelligence Magazine, IEEE 5.4 (2010): 13-18. (issue #20)
  • ROBIN: Barto, Andrew G., Richard S. Sutton, and Peter S. Brouwer. "Associative search network: A reinforcement learning associative memory." Biological cybernetics 40.3 (1981): 201-211. (issue #19)
  • ROBIN: Palm, Günther. "On associative memory." Biological cybernetics 36.1 (1980): 19-31. (issue #21)

report: Related Work, sub-task 2

As this is a meta-study, the related work, or background research, section will be the main content of the paper. Therefore, this section has been split into several sub-tasks, to be tracked by meeting deadlines.

For each of these sub-tasks a set of research papers (see #1) will have been read, reviewed, critically evaluated, and summarized in the report.

TODO: Add specific research papers to be read, critically evaluated and summarized by each project member for this sub-task.

  • ROBIN: Lansner, Anders. "Associative memory models: from the cell-assembly theory to biophysically detailed cortex simulations." Trends in neurosciences 32.3 (2009): 178-186. (issue #23)
  • ROBIN: Kohonen, T., et al. "A principle of neural associative memory." Neuroscience 2.6 (1977): 1065-1076. (issue #15)

meta: Project Plan

We should try to have a rough project plan figured out before the first meeting.

Which are the major tasks within the project?

  • Search for relevant research (see #1)
  • Preliminary literature review, get a feel for the field (see #4)
  • Discuss the aim and objectives of the project (see issue #2)
  • Write a project proposal (see #5)
  • Conduct a literature review (see #6)
  • Write the project report (see #7)
  • Prepare for the presentation (see #8)
  • ...

If you've got any ideas on what to add to the project plan, or how to structure and track each task, feel free to discuss.

meta: Project Report

This is a meta-issue for the project report, tracking its progress. The specific sections have dedicated issues tracking their progress. Once all sections have been completed, and we have proof-read the report, this issue may be closed.

  • Abstract (#53)
  • Introduction (see #38)
    • Aim and Objectives (#39)
    • Hebbian Learning (#40)
  • Models of Associative Memory (#41)
    • Hopfield Network (#42)
    • Boltzmann Machine (#43)
    • Memory Resistor (#44)
  • Current Capabilities (#45)
    • Hopfield Network (#46)
    • Boltzmann Machine (#47)
    • Memory Resistor (#48)
  • Future Potential (#49)
    • Intelligent Systems (#50)
    • Energy Efficiency (#51)
  • Conclusion (#52)

meta: Meeting 2

Notes from the startup meeting (meeting 1). Things to do before next meeting.

  • Which areas to focus on (see issue #25)
  • Preliminary plan, how to divide the course work (see issue #3)

Suggested paper which merges the fields of machine learning and psychology.

  • Perceptrons (McCulloch & Pitts, the original paper)

meta: Models of Associative Memory

Identify key models of associative memory.

  • Hopfield networks
  • Boltzmann machines
  • Memristor (artificial synapses)

Other key models, not included in the report.

  • Bidirectional associative memory
  • Autoassociative memory
  • Brain-states-in-a-box

From other fields (such as psychology).

  • Transderivational search (adding description in comment below)
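For a concrete point of reference, the first model on the list can be sketched in a few lines. This is a minimal Hopfield-network illustration under our own simplifying assumptions (bipolar ±1 patterns, synchronous sign-threshold updates), not code from any of the reviewed papers:

```python
import numpy as np

# Minimal Hopfield network sketch (assumes bipolar +/-1 patterns).
patterns = np.array([[ 1,  1,  1, -1, -1, -1],
                     [ 1, -1,  1, -1,  1, -1]])

# Hebbian storage: sum of outer products, zero diagonal (no self-connections).
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Iterate the synchronous sign-threshold update until the state settles."""
    state = state.copy()
    for _ in range(steps):
        new = np.sign(W @ state).astype(int)
        new[new == 0] = 1              # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# A stored pattern corrupted in one position is cleaned up on recall.
noisy = np.array([-1, 1, 1, -1, -1, -1])  # first pattern with bit 0 flipped
print(recall(noisy))                       # recovers the first stored pattern
```

A Boltzmann machine can be seen as a stochastic counterpart of this update rule, which is part of why the two models are often discussed together.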

review: Preliminary Literature Review

Skim through the main research literature identified in #1. This will help us get a feel for what has been done in the field, a brief intuition for how it has evolved, and hopefully help us formulate the aim and objectives of the project (see #2).

review: Literature Review

Now we conduct the full literature review, trying to objectively outline what has been done within the field, compare the results of different approaches, and critically evaluate what has been written. Do the authors clearly state the assumptions their work builds upon, and whether the work generalizes or is only capable of solving specialized problems? This is particularly important, as the term associative memory may be vague, and different authors may try to solve different problems, making their results difficult to compare objectively.

report: Related Work, sub-task 1

As this is a meta-study, the related work, or background research, section will be the main content of the paper. Therefore, this section has been split into several sub-tasks, to be tracked by meeting deadlines.

For each of these sub-tasks a set of research papers (see #1) will have been read, reviewed, critically evaluated, and summarized in the report.

TODO: Add specific research papers to be read, critically evaluated and summarized by each project member for this sub-task.

  • ROBIN: Hinton, Geoffrey E., and James A. Anderson. Parallel Models of Associative Memory: Updated Edition. Psychology press, 2014. (issue #16)
  • ROBIN: Pershin, Yuriy V., and Massimiliano Di Ventra. "Experimental demonstration of associative memory with memristive neural networks." Neural Networks 23.7 (2010): 881-886. (issue #22)

meta: Presentation

This issue tracks everything related to preparations for the presentation.

There are a number of topics we may wish to discuss.

How do we want to present the findings of our meta-study? In what format?

  • Slides
    • How can we make this interesting?
  • Video
    • This is what is believed to happen within the brain when associative memories are learned, reinforced, and recalled from a neurological point of view.
    • This is what is believed to happen within the brain when associative memories are learned, reinforced, and recalled from a cognitive psychology point of view.
    • Analogously, these are different models for achieving associative memory with an Associatron, with a time-delay network, with deep machine learning, and with a fully connected neural network.
  • Role-play: A panel discussion between a psychologist, a neuroscientist and a machine learning expert.

Any other ideas? There will be many presentations given during the day, so let's spice it up a little!

Notes about the presentation:

  • 15 minutes including questions
  • Little time
    • Focus on explaining ideas and general solutions
    • Skip details
    • Don't explain things which should be known from the course

report: Related Work, sub-task 3

As this is a meta-study, the related work, or background research, section will be the main content of the paper. Therefore, this section has been split into several sub-tasks, to be tracked by meeting deadlines.

For each of these sub-tasks a set of research papers (see #1) will have been read, reviewed, critically evaluated, and summarized in the report.

TODO: Add specific research papers to be read, critically evaluated and summarized by each project member for this sub-task.

  • ROBIN: Bohland, Jason W., and Ali A. Minai. "Efficient associative memory using small-world architecture." Neurocomputing 38 (2001): 489-496. (issue #17)
  • ROBIN: Nakano, Kaoru. "Associatron-a model of associative memory." Systems, Man and Cybernetics, IEEE Transactions on 3 (1972): 380-388. (issue #18)

report: Disposition

Skeleton of the report.

  • Title
  • Abstract
  • (Glossary)
  • Introduction/Background (prior work, required to understand the solution)
  • Body (several sections)
  • Related work (if not part of introduction) (related work not necessary to understand the solution)
  • Discussion (subjective)/Conclusions and future work
  • References

Title

Short, catchy, true. 2 seconds to grab attention.

Abstract

  • Written last
  • Sell your idea
  • 100-200 words
  • Four parts/sentences
    • What's the problem
    • How did you solve it
    • What are the results
    • Conclusions (what it means for the future)
  • Make sure the abstract stands on its own
    • No references
    • Avoid acronyms

Introduction

  • Describe the problem
    • May include related work, but ...
  • State your contributions
    • Perhaps as a bulleted list
  • SKIP disposition section. Avoid "the rest of the paper is structured as follows", use forward references instead.
  • Use references for each new introduced concept.

Body

  • Meta-study: Existing methods -> Comparison -> Suggestions
  • Do it top-down; intuition first, details later
  • Choose the most direct route to the idea!
    • The way you came up with the idea is usually not interesting
  • The introduction makes claims
  • The body provides evidence
  • Evidence can be analysis and comparison, theorems, measures, case studies ...
  • Imagine a reader who wants to repeat your experiments. Is your information enough to do that? Be rigorous.

Related work

Prior work is a subset of related work. Prior work may be introduced in the introduction.

  • Part of introduction, or after the body.
  • Give credit!
    • Giving credit to someone else does not take away from your own.

Conclusion

  • Summarize your contributions
    • Be honest, mention weaknesses.
  • Conclusions from the results
  • Implications for the future
  • No new information in this section!

References

  • Always refer to the literature when
    • You claim things for which there is no evidence in this paper
    • You first introduce an established concept
  • Use Vancouver style [1]
    • Don't write "In [42], Foo says bar". References should be invisible: write "Foo Bar describes blah blah [42]."
    • Prefer alphabetical sorting by the authors' surnames, rather than reference order.
  • Avoid web references.
    • If you must use web references, date them in the list.

review: Starting Point of Research

Identify a rough starting point for the meta-study research.

Which papers are key within this field?

  • ROBIN: Kohonen, T., et al. "A principle of neural associative memory." Neuroscience 2.6 (1977): 1065-1076. (issue #15)
  • Kohonen, Teuvo. Self-organization and associative memory. Vol. 8. Springer Science & Business Media, 2012.
  • Kohonen, Teuvo. Associative memory: A system-theoretical approach. Vol. 17. Springer Science & Business Media, 2012.
  • ROBIN: Hinton, Geoffrey E., and James A. Anderson. Parallel Models of Associative Memory: Updated Edition. Psychology press, 2014. (issue #16)
  • LUCAS: McEliece, Robert J., et al. "The capacity of the Hopfield associative memory." Information Theory, IEEE Transactions on 33.4 (1987): 461-482.
  • ROBIN: Bohland, Jason W., and Ali A. Minai. "Efficient associative memory using small-world architecture." Neurocomputing 38 (2001): 489-496. (issue #17)
  • ROBIN: Nugent, Michael Alexander, and Timothy Wesley Molter. "AHaH Computing–From Metastable Switches to Attractors to Machine Learning." PloS one 9.2 (2014): e85175.
  • LUCAS: Giles, C. Lee, and Tom Maxwell. "Learning, invariance, and generalization in high-order neural networks." Applied optics 26.23 (1987): 4972-4978.
  • ROBIN: Nakano, Kaoru. "Associatron-a model of associative memory." Systems, Man and Cybernetics, IEEE Transactions on 3 (1972): 380-388. (issue #18)
  • Cao, Jinde, and Qiankun Song. "Stability in Cohen–Grossberg-type bidirectional associative memory neural networks with time-varying delays." Nonlinearity 19.7 (2006): 1601.
  • WENTING: Hopfield, John J. "Neural networks and physical systems with emergent collective computational abilities." Proceedings of the national academy of sciences 79.8 (1982): 2554-2558.
  • Brown, Martin, and Christopher John Harris. "Neurofuzzy adaptive modelling and control." (1994).
  • WENTING: Psaltis, Demetri, and Nabil Farhat. "Optical information processing based on an associative-memory model of neural nets with thresholding and feedback." Optics Letters 10.2 (1985): 98-100.
  • LUCAS: Carpenter, Gail A. "Neural network models for pattern recognition and associative memory." Neural networks 2.4 (1989): 243-257.
  • ROBIN: Barto, Andrew G., Richard S. Sutton, and Peter S. Brouwer. "Associative search network: A reinforcement learning associative memory." Biological cybernetics 40.3 (1981): 201-211. (issue #19)
  • Raaijmakers, Jeroen GW, and Richard M. Shiffrin. "SAM: A theory of probabilistic search of associative memory." The psychology of learning and motivation: Advances in research and theory 14 (1981): 207-262.
  • Liao, Xiaofeng, and Juebang Yu. "Qualitative analysis of Bi‐directional Associative Memory with time delay." International Journal of Circuit Theory and Applications 26.3 (1998): 219-229.
  • WENTING: Yoshizawa, Shuji, Masahiko Morita, and Shun-Ichi Amari. "Capacity of associative memory using a nonmonotonic neuron model." Neural Networks 6.2 (1993): 167-176.
  • Huang, Gao, et al. "Deep networks with stochastic depth." (2016).
  • ...

Have other meta-studies been done within this field? Which papers did they refer to?

  • ROBIN: Arel, Itamar, Derek C. Rose, and Thomas P. Karnowski. "Deep machine learning-a new frontier in artificial intelligence research [research frontier]." Computational Intelligence Magazine, IEEE 5.4 (2010): 13-18. (issue #20)
  • ROBIN: Palm, Günther. "On associative memory." Biological cybernetics 36.1 (1980): 19-31. (issue #21)
  • WENTING: Kan, Irene P., et al. "Implicit memory for novel associations between pictures: effects of stimulus unitization and aging." Memory & cognition 39.5 (2011): 778-790.
  • ...

Interesting papers from other disciplines. What does neuroscience have to say about associative memories?

  • Fanselow, Michael S., and Andrew M. Poulos. "The neuroscience of mammalian associative learning." Annu. Rev. Psychol. 56 (2005): 207-234.
  • Gabrieli, John DE. "Cognitive neuroscience of human memory." Annual review of psychology 49.1 (1998): 87-115.
  • Reijmers, Leon G., et al. "Localization of a stable neural correlate of associative memory." Science 317.5842 (2007): 1230-1233.
  • Hasselmo, Michael E., et al. "A model of the hippocampus combining self-organization and associative memory function." Advances in neural information processing systems (1995): 77-84.
  • Lytton, William W., and Peter Lipton. "Can the hippocampus tell time? The temporo‐septal engram shift model." Neuroreport 10.11 (1999): 2301-2306.
  • Biological Aspects of learning, memory formation and ontogeny of the CNS. 1977.
  • Rahmann, Hinrich, and Mathilde Rahmann. The neurobiological basis of memory and behavior. Springer Science & Business Media, 2012.
  • Amari, Shun-Ichi, and Kenjiro Maginu. "Statistical neurodynamics of associative memory." Neural Networks 1.1 (1988): 63-73.
  • ROBIN: Pershin, Yuriy V., and Massimiliano Di Ventra. "Experimental demonstration of associative memory with memristive neural networks." Neural Networks 23.7 (2010): 881-886. (issue #22)
  • Wang, DeLiang, Joachim Buhmann, and Christoph von der Malsburg. "Pattern segmentation in associative memory." Neural Computation 2.1 (1990): 94-106.
  • Nakazawa, Kazu, et al. "Requirement for hippocampal CA3 NMDA receptors in associative memory recall." Science 297.5579 (2002): 211-218.
  • Ranganath, Charan, et al. "Inferior temporal, prefrontal, and hippocampal contributions to visual working memory maintenance and associative memory retrieval." The Journal of Neuroscience 24.16 (2004): 3917-3925.
  • Levy, William B., and Oswald Steward. "Synapses as associative memory elements in the hippocampal formation." Brain research 175.2 (1979): 233-245.
  • ROBIN: Lansner, Anders. "Associative memory models: from the cell-assembly theory to biophysically detailed cortex simulations." Trends in neurosciences 32.3 (2009): 178-186. (issue #23)
  • Doyère, Valérie, and Serge Laroche. "Linear relationship between the maintenance of hippocampal long‐term potentiation and retention of an associative memory." Hippocampus 2.1 (1992): 39-48.
  • Staresina, Bernhard P., and Lila Davachi. "Object unitization and associative memory formation are supported by distinct brain regions." The Journal of Neuroscience 30.29 (2010): 9890-9897.
  • LUCAS: Amari, Shun-Ichi. "Characteristics of sparsely encoded associative memory." Neural Networks 2.6 (1989): 451-457.
  • Gibson, William G., and John Robinson. "Statistical analysis of the dynamics of a sparse associative memory." Neural Networks 5.4 (1992): 645-661.
  • Maren, Stephen. "Synaptic mechanisms of associative memory in the amygdala." Neuron 47.6 (2005): 783-786.
  • ...

What about psychology (cognitive psychology, biological psychology, ...)?

  • McClelland, James L., Bruce L. McNaughton, and Randall C. O'Reilly. "Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory." Psychological review 102.3 (1995): 419.
  • Anderson, John R., and Gordon H. Bower. Human associative memory. Psychology press, 2014.
  • Srull, Thomas K., Meryl Lichtenstein, and Myron Rothbart. "Associative storage and retrieval processes in person memory." Journal of Experimental Psychology: Learning, Memory, and Cognition 11.2 (1985): 316.
  • ...

Other fields

  • Paek, Eung G., and Demetri Psaltis. "Optical associative memory using Fourier transform holograms." Optical Engineering 26.5 (1987): 265428-265428.
  • ...

Please add to this list as you find more resources that may be a good starting point for the meta-study research.
