Search results for "Hopfield"

Showing 9 of 9 documents

On the Computational Complexity of Binary and Analog Symmetric Hopfield Nets

2000

We investigate the computational properties of finite binary- and analog-state discrete-time symmetric Hopfield nets. For binary networks, we obtain a simulation of convergent asymmetric networks by symmetric networks with only a linear increase in network size and computation time. Then we analyze the convergence time of Hopfield nets in terms of the length of their bit representations. Here we construct an analog symmetric network whose convergence time exceeds the convergence time of any binary Hopfield net with the same representation length. Further, we prove that the MIN ENERGY problem for analog Hopfield nets is NP-hard and provide a polynomial time approximation algorithm for this p…
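
As a concrete reference point, here is a minimal sketch (not taken from the paper) of the discrete-time binary symmetric Hopfield model the abstract refers to. It assumes bipolar states, a symmetric zero-diagonal weight matrix W, and thresholds theta; asynchronous updates never increase the quadratic energy that the MIN ENERGY problem asks to minimize.

    import numpy as np

    def energy(W, theta, s):
        # E(s) = -(1/2) s^T W s + theta^T s; W symmetric, zero diagonal assumed
        return -0.5 * s @ W @ s + theta @ s

    def run_async(W, theta, s, max_sweeps=100):
        # Sweep the units asynchronously until a fixed point
        # (a local minimum of the energy) is reached.
        for _ in range(max_sweeps):
            changed = False
            for i in range(len(s)):
                new_si = 1 if W[i] @ s - theta[i] >= 0 else -1
                if new_si != s[i]:
                    s[i], changed = new_si, True
            if not changed:
                return s
        return s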

Keywords: Computational complexity theory, Cognitive Neuroscience, Computation, Binary number, Hopfield network, Turing machine, Recurrent neural network, Arts and Humanities (miscellaneous), Convergence (routing), Time complexity, Algorithm, Mathematics, Neural Computation

A Survey of Continuous-Time Computation Theory

1997

Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuous-time computation. However, while special-case algorithms and devices are being developed, relatively little work exists on the general theory of continuous-time models of computation. In this paper, we survey the existing models and results in this area, and point to some of the open research questions.

Keywords: Discrete mathematics, Theoretical computer science, Computability, Computation, Model of computation, neural computation, Turing machine, Models of neural computation, Computable function, Open research, Theory of computation, Hopfield network, cellular automaton, differential analyzer, Mathematics

Mapping discounted and undiscounted Markov Decision Problems onto Hopfield neural networks

1995

This paper presents a framework for mapping value-iteration and related successive approximation methods for Markov Decision Problems onto Hopfield neural networks, for both the discounted and undiscounted versions with finite state and action spaces. We analyse the asymptotic behaviour of the control sets and give some estimates of the convergence rate of the value-iteration scheme. We relate the convergence properties to an energy function, which is the key point in mapping Markov Decision Problems onto Hopfield networks. Finally, an application from queueing systems in communication networks is considered, and the results of computer simulation of Hopfield netwo…
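
For orientation, the value-iteration scheme discussed here can be sketched as plain dynamic programming; the array shapes and the discount value below are illustrative assumptions, and the Hopfield mapping itself is not reproduced.

    import numpy as np

    def value_iteration(P, R, gamma=0.9, tol=1e-8):
        # P[a, s, s']: transition probabilities; R[a, s]: expected rewards
        V = np.zeros(P.shape[1])
        while True:
            Q = R + gamma * (P @ V)        # Q[a, s] = R[a, s] + gamma * E[V(s')]
            V_new = Q.max(axis=0)          # greedy improvement over actions
            if np.max(np.abs(V_new - V)) < tol:
                return V_new, Q.argmax(axis=0)   # optimal values, greedy policy
            V = V_new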

Keywords: Hopfield network, Mathematical optimization, Queueing theory, Artificial neural network, Rate of convergence, Markov chain, Computer science, Convergence (routing), Function (mathematics), Decision problem

Exponential Transients in Continuous-Time Symmetric Hopfield Nets

2001

We establish a fundamental result in the theory of continuous-time neural computation by showing that so-called continuous-time symmetric Hopfield nets, whose asymptotic convergence is always guaranteed by the existence of a Liapunov function, may in the worst case possess a transient period that is exponential in the network size. The result stands in contrast to, e.g., the use of such network models in combinatorial optimization applications.
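
A rough sketch of the dynamics in question, under the commonly assumed form dx/dt = -x + W tanh(x) with symmetric W, which is the form that admits a Liapunov function; only the simulation loop is shown, not the paper's exponential-transient construction.

    import numpy as np

    def simulate(W, x0, dt=0.01, steps=10_000):
        # Forward-Euler integration of dx/dt = -x + W @ tanh(x).
        x = x0.astype(float).copy()
        for _ in range(steps):
            x += dt * (-x + W @ np.tanh(x))
        return x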

Keywords: Lyapunov function, Hopfield nets, stability, neural networks, Exponential function, Hopfield network, Models of neural computation, Recurrent neural network, Convergence (routing), Applied mathematics, Combinatorial optimization, dynamical systems, Algorithm, Mathematics, Network model

A Neural Network Primer

1994

Neural networks are composed of basic units somewhat analogous to neurons. These units are linked to each other by connections whose strength is modifiable as a result of a learning process or algorithm. Each of these units integrates independently (in parallel) the information provided by its synapses in order to evaluate its state of activation. The unit response is then a linear or nonlinear function of its activation. Linear algebra concepts are used, in general, to analyze linear units, with eigenvectors and eigenvalues being the core concepts involved. This analysis makes clear the strong similarity between linear neural networks and the general linear model developed by statisticia…
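
As a toy illustration of the unit described above (not code from the primer): the unit forms its activation as a weighted sum of its synaptic inputs and responds with either a linear or a nonlinear function of it; the logistic choice below is an assumption.

    import numpy as np

    def unit_response(weights, inputs, nonlinear=True):
        activation = weights @ inputs                 # integrate synaptic inputs
        if nonlinear:
            return 1.0 / (1.0 + np.exp(-activation))  # logistic response
        return activation                             # linear response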

Keywords: Radial basis function network, Theoretical computer science, Ecology, Liquid state machine, Computer science, Time delay neural network, Applied Mathematics, Activation function, General Medicine, Topology, Agricultural and Biological Sciences (miscellaneous), Hopfield network, Recurrent neural network, Multilayer perceptron, Types of artificial neural networks, Journal of Biological Systems

Immune networks: multitasking capabilities near saturation

2013

Pattern-diluted associative networks were introduced recently as models for the immune system, with nodes representing T-lymphocytes and stored patterns representing signalling protocols between T- and B-lymphocytes. It was shown earlier that in the regime of extreme pattern dilution, a system with $N_T$ T-lymphocytes can manage a number $N_B = \mathcal{O}(N_T^\delta)$ of B-lymphocytes simultaneously, with $\delta < 1$. Here we study this model in the extensive load regime $N_B = \alpha N_T$, with also a high degree of pattern dilution, in agreement with immunological findings. We use graph theory and statistical mechanical analysis based on replica methods to show that in the finite-connectivit…
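
To make the parallel-retrieval idea concrete, here is a hedged toy sketch: dilution zeroes most entries of each stored pattern, so Hebbian couplings allow several patterns to be retrieved at once. All sizes and the dilution level are illustrative assumptions, and the synchronous zero-temperature dynamics below is not the paper's replica analysis.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P, dilution = 200, 4, 0.8               # nodes, patterns, zeroed fraction
    xi = rng.choice([-1, 1], size=(P, N))
    xi *= (rng.random((P, N)) > dilution)      # diluted patterns, mostly zeros
    W = (xi.T @ xi) / N                        # Hebbian couplings
    np.fill_diagonal(W, 0)

    # Start near a mixture of all patterns, then iterate the deterministic map.
    s = np.sign(xi.sum(axis=0)) + (xi.sum(axis=0) == 0)
    for _ in range(50):
        s = np.where(W @ s >= 0, 1, -1)

    # Overlap with each pattern, normalized by its number of nonzero entries.
    overlaps = xi @ s / np.maximum((xi != 0).sum(axis=1), 1)
    print(overlaps)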

Keywords: Statistics and Probability, Immune Network, Statistical Mechanics, Hopfield Model, Parallel Retrieval, Quantitative Biology::Tissues and Organs, Phase (waves), FOS: Physical sciences, General Physics and Astronomy, Interference (wave propagation), Topology, Quantitative Biology::Cell Behavior, Cell Behavior (q-bio.CB), Physics - Biological Physics, Finite set, Mathematical Physics, Connectivity, Associative property, Physics, Degree (graph theory), Replica, Statistical and Nonlinear Physics, Graph theory, Disordered Systems and Neural Networks (cond-mat.dis-nn), Condensed Matter - Disordered Systems and Neural Networks, Biological Physics (physics.bio-ph), FOS: Biological sciences, Modeling and Simulation, Quantitative Biology - Cell Behavior, Journal of Physics A: Mathematical and Theoretical

Immune networks: Multi-tasking capabilities at medium load

2013

Associative network models featuring multi-tasking properties have been introduced recently and studied in the low load regime, where the number $P$ of simultaneously retrievable patterns scales with the number $N$ of nodes as $P\sim \log N$. In addition to their relevance in artificial intelligence, these models are increasingly important in immunology, where stored patterns represent strategies to fight pathogens and nodes represent lymphocyte clones. They allow us to understand the crucial ability of the immune system to respond simultaneously to multiple distinct antigen invasions. Here we develop further the statistical mechanical analysis of such systems, by studying the medium load r…

Keywords: Statistics and Probability, Modularity (networks), Theoretical computer science, Degree (graph theory), Associative network, Computer science, General Physics and Astronomy, FOS: Physical sciences, Statistical and Nonlinear Physics, Disordered Systems and Neural Networks (cond-mat.dis-nn), Condensed Matter - Disordered Systems and Neural Networks, Modeling and Simulation, FOS: Biological sciences, Cell Behavior (q-bio.CB), Human multitasking, Quantitative Biology - Cell Behavior, Relevance (information retrieval), Cluster analysis, Immune Network, Statistical Mechanics, Hopfield model, Parallel Retrieval, Mathematical Physics

Are Neural Networks Imitations of Mind?

2015

Artificial neural networks are often understood as a good way to imitate the mind through the web-like structure of neurons in the brain, but the very high complexity of the human brain prevents neural networks from being considered good models of the human mind; they are, however, good devices for parallel computation. The difference between feed-forward and feedback neural networks is introduced; the Hopfield network and the multi-layer Perceptron are discussed. Within a very weak isomorphism (not a similitude) between the brain and neural networks, an artificial form of short-term memory and of acknowledgement, realized in Elman neural networks, is proposed.

Keywords: Structure (mathematical logic), Artificial neural network, Quantitative Biology::Neurons and Cognition, Computer science, Computation, Computer Science::Neural and Evolutionary Computation, Acknowledgement, Short-term memory, Recurrent network, Brain, Feed-forward network, Settore M-FIL/02 - Logica E Filosofia Della Scienza, Perceptron, Mind, Similitude, Hopfield network, Artificial intelligence, Data mining

Some Afterthoughts on Hopfield Networks

1999

In the present paper we investigate four relatively independent issues, which complete our knowledge regarding the computational aspects of the popular Hopfield nets. In Section 2 of the paper, the computational equivalence of convergent asymmetric and Hopfield nets is shown with respect to network size. In Section 3, the convergence time of Hopfield nets is analyzed in terms of bit representations. In Section 4, a polynomial-time approximation algorithm for the minimum energy problem is shown. In Section 5, the Turing universality of analog Hopfield nets is studied.

Keywords: Theory of Computation (Computation by Abstract Devices), Quantitative Biology::Neurons and Cognition, Computer science, Parallel algorithm, Hopfield nets, Approximation algorithm, Section (fiber bundle), Hopfield network, Algorithm, Time complexity, Equivalence (measure theory), Energy (signal processing)