The phase diagram of the square lattice bilayer Hubbard model: a variational Monte Carlo study
(2014)
We investigate the phase diagram of the square lattice bilayer Hubbard model at half-filling with the variational Monte Carlo method, for both the magnetic and the paramagnetic case, as a function of the interlayer hopping and the on-site Coulomb repulsion U. With this study we resolve some discrepancies in previous calculations based on dynamical mean-field theory, and we are able to determine the nature of the phase transitions between metal, Mott insulator, and band insulator. In the magnetic case we find only two phases: an antiferromagnetic Mott insulator at small interlayer hopping for any value of U, and a band insulator at large interlayer hopping. At large U values we approach the Heisenberg limit. The paramagnetic phase diagram shows, at small interlayer hopping, a metal-to-Mott-insulator transition at moderate U values and a Mott-to-band-insulator transition at larger U values. We also observe a re-entrant Mott-insulator-to-metal transition and a metal-to-band-insulator transition for increasing interlayer hopping in an intermediate range of U. Finally, we discuss the phase diagrams obtained in relation to findings from previous studies based on different many-body approaches.
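The abstract does not spell out the variational Monte Carlo procedure itself, so a minimal sketch may help: Metropolis sampling of |ψ|² for a trial wavefunction, averaging the local energy. The system below is a toy 1D harmonic oscillator with a Gaussian trial wavefunction ψ(x) = exp(-a x²), chosen purely for illustration (it is an assumption; the paper's calculation is for the bilayer Hubbard model with far more elaborate trial states).

```python
import math
import random

def local_energy(x, a):
    # E_L = (-1/2) psi''/psi + V(x) for psi = exp(-a x^2), V(x) = x^2 / 2
    return a + x * x * (0.5 - 2.0 * a * a)

def vmc_energy(a, n_steps=20000, step=1.0, seed=1):
    """Metropolis sampling of |psi|^2, averaging E_L after burn-in."""
    rng = random.Random(seed)
    x = 0.0
    total, count = 0.0, 0
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # acceptance ratio |psi(x_new)|^2 / |psi(x)|^2 = exp(-2a (x_new^2 - x^2))
        if rng.random() < math.exp(-2.0 * a * (x_new ** 2 - x ** 2)):
            x = x_new
        if i > n_steps // 5:  # discard the first fifth as burn-in
            total += local_energy(x, a)
            count += 1
    return total / count
```

At the optimal variational parameter a = 0.5 the local energy is constant (0.5, the exact ground-state energy), so the estimator has zero variance; away from the optimum the sampled average lies above it, which is the variational principle the method exploits.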
Generating functionals may guide the evolution of a dynamical system and constitute a possible route for handling the complexity of neural networks, as relevant for computational intelligence. We propose and explore a new objective function, which allows us to obtain plasticity rules for the afferent synaptic weights. The adaptation rules are Hebbian and self-limiting, and result from the minimization of the Fisher information with respect to the synaptic flux. We perform a series of simulations examining the behavior of the new learning rules under various circumstances. The vector of synaptic weights aligns with the principal direction of input activities whenever one is present. A linear discrimination is performed when there are two or more principal directions; directions with bimodal firing-rate distributions, characterized by a negative excess kurtosis, are preferred. We find robust performance, and full homeostatic adaptation of the synaptic weights results as a by-product of the synaptic-flux minimization. This self-limiting behavior allows for stable online learning over arbitrary durations. The neuron acquires new information when the statistics of the input activities are changed at a certain point of the simulation, showing, however, a distinct resilience against unlearning previously acquired knowledge. Learning is fast when starting from randomly drawn synaptic weights and substantially slower when the synaptic weights are already fully adapted.
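The Fisher-information rule itself is not given in the abstract, so the sketch below uses the classic Oja rule as a stand-in: it is likewise Hebbian and self-limiting, and its weight vector likewise aligns with the principal direction of the inputs. Everything here (the two-dimensional inputs, the learning rate, the input statistics) is an assumption for illustration, not the paper's objective-function-derived rule.

```python
import random

def oja_step(w, x, eta=0.01):
    # Hebbian term eta*y*x, plus the self-limiting decay -eta*y^2*w
    # that keeps the weight norm bounded near 1 (Oja's rule).
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

def train(n_steps=5000, seed=0):
    """Online learning on inputs whose dominant variance lies along (1, 1)."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in range(2)]
    for _ in range(n_steps):
        s = rng.gauss(0.0, 1.0)   # strong common component along (1, 1)
        n = rng.gauss(0.0, 0.1)   # weak component along (1, -1)
        w = oja_step(w, [s + n, s - n])
    return w
```

After training, the weight vector points (up to sign) along the principal direction (1, 1)/√2 with norm close to 1, illustrating both the alignment and the homeostatic self-limiting described in the abstract.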