1. Fernando Brandao

Quantum Information Theory with Applications

We will cover the basics of quantum information theory. We will then apply these tools to problems in condensed matter physics (such as proving area laws for entanglement) and in statistical mechanics/thermodynamics (such as understanding equilibration in quantum dynamics).

Lecture 1: fundamentals of quantum information theory
Lecture 2: applications to condensed matter physics
Lecture 3: applications to statistical mechanics/thermodynamics

 

  2. Matthew Headrick

Entanglement entropy, quantum field theory, and holography

We will give an overview of the concept of entanglement entropy (EE) and its applications in quantum field theories, including holographic ones. We will start with a brief review of the concept of entropy in quantum information theory, its properties, and its relation to entanglement. We will then discuss EE in field theories, starting with simple spin systems and proceeding through the toric code to relativistic and conformal field theories. Specific examples as well as general properties will be discussed. The goal is to explain both how to calculate EEs and why they are useful. Finally, after a very brief introduction to holographic dualities, we will present the Ryu-Takayanagi EE formula and give examples of its applications and implications, with the goal of conveying how holography elegantly encodes entanglement in geometry.
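
As a concrete anchor for the quantum-information review, here is a minimal numerical sketch (assuming only numpy; the example is illustrative, not taken from the lectures) of the basic computation: the entanglement entropy of subsystem A of a bipartite pure state, obtained from the Schmidt coefficients.

    import numpy as np

    def entanglement_entropy(psi, dim_a, dim_b):
        # Reshape |psi> on A x B into a dim_a x dim_b matrix; its singular
        # values are the Schmidt coefficients, whose squares are the
        # eigenvalues of the reduced density matrix rho_A.
        m = psi.reshape(dim_a, dim_b)
        p = np.linalg.svd(m, compute_uv=False) ** 2
        p = p[p > 1e-12]                  # drop numerical zeros
        return -np.sum(p * np.log2(p))    # S(rho_A) in bits

    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
    print(entanglement_entropy(bell, 2, 2))             # 1.0: maximal for a qubit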

References:

  1. H. Casini, M. Huerta, "Entanglement entropy in free quantum field theory",  arXiv:0905.2562 [hep-th]
  2. P. Calabrese, J. Cardy, "Entanglement entropy and conformal field theory", arXiv:0905.4013 [cond-mat.stat-mech]
  3. M. Rangamani, T. Takayanagi, "Holographic entanglement entropy", arXiv:1609.01287 [hep-th]

 

  3. Florent Krzakala

From information theory to learning via statistical physics

  1. Introduction: statistical learning, Bayes' rule, estimators, and statistical physics
  2. Phase transition in learning: easy, hard and impossible phases
  3. From statistical physics to statistical inference

 

  4. Ben Machta

Lecture 1: Why do simple models work?  Partial answers from information geometry.

Physics is filled with emergent models that cannot be easily motivated by reference to their component parts (1). In biology and other areas of science there are multiparameter models that are almost certainly incomplete, overcomplete, and otherwise microscopically wrong. Nevertheless, both types of models can be predictive. Why do these models work? In Mark's lectures he noted that the latter class of models is sloppy, with just a few 'relevant' parameter combinations determining behavior. Here I will give arguments for when sloppiness should be expected. I will first argue that sloppiness arises naturally when the data constrains itself, that is, when some data points are effectively interpolations of others (2). I will then show that microscopic models are often sloppy when probed with coarsened observables, making connections to renormalization-group arguments from physics (3). We can then discuss the relationship between these arguments.

  1. Anderson, P. W. (1972). More is different. Science 177(4047), 393-396.
  2. Transtrum, M. K., Machta, B. B., & Sethna, J. P. (2010). Why are nonlinear fits to data so challenging? Physical Review Letters 104(6), 060201.
  3. Machta, B. B., Chachra, R., Transtrum, M. K., & Sethna, J. P. (2013). Parameter space compression underlies emergent theories and predictive models. Science 342(6158), 604-607.
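
To make sloppiness concrete, here is a toy calculation (my own, with invented sample times and rates, assuming numpy): build the least-squares Fisher information J^T J for a sum-of-exponentials model and inspect its eigenvalues, which typically span many decades.

    import numpy as np

    t = np.linspace(0, 5, 40)                # hypothetical sample times
    theta = np.array([0.5, 1.0, 2.0])        # decay rates

    def model(th):
        return sum(np.exp(-k * t) for k in th)

    def jacobian(th, h=1e-6):
        cols = []
        for i in range(len(th)):
            d = np.zeros_like(th)
            d[i] = h
            cols.append((model(th + d) - model(th - d)) / (2 * h))
        return np.array(cols).T

    J = jacobian(theta)
    print(np.linalg.eigvalsh(J.T @ J))       # eigenvalues spread over decades

The resulting hierarchy of eigenvalues is the "sloppy" spectrum: only a few stiff combinations of decay rates are constrained by the data.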

*Lecture 2 will be in two parts, the first a bit longer.

Lecture 2a: Why are simpler models better? Bayesian priors and Occam's razor.

Why do we prefer simpler models? Are they more predictive, or merely more aesthetically pleasing? I will present a Bayesian approach to model selection, first reviewing the concept of an 'uninformative' prior, with particular attention to Jeffreys' prior (1), which has an elegant interpretation in information geometry. However, Jeffreys' prior implicitly assumes that data is plentiful, sufficient to constrain all parameter values to a small range, an assumption that is sure to fail in a typical sloppy model probed with finite data. I will present a prior that maximizes the mutual information between parameters and data; in the large-data limit it is called a reference prior (2) and approaches Jeffreys', but in the finite-data regime it is typically discrete, with weight on a finite number of delta functions that tend to sit on the boundaries of the model manifold.

  1. Jeffreys, H. (1946). An invariant form for the prior probability in estimation problems. In Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences (Vol. 186, No. 1007, pp. 453-461). The Royal Society.
  2. Berger, J. O., Bernardo, J. M., & Sun, D. (2009). The formal definition of reference priors. The Annals of Statistics, 905-938.
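
The standard textbook case, worked numerically (my example, assuming numpy, not taken from refs. 1-2): for a Bernoulli(p) model the Fisher information is I(p) = 1/(p(1-p)), so Jeffreys' prior, proportional to sqrt(I(p)), is the Beta(1/2, 1/2) density and already piles weight near the boundaries p = 0 and p = 1.

    import numpy as np

    p = np.linspace(0.01, 0.99, 99)
    jeffreys = np.sqrt(1.0 / (p * (1 - p)))   # sqrt of Fisher information
    jeffreys /= np.trapz(jeffreys, p)         # normalize numerically
    print(p[np.argmax(jeffreys)],             # density peaks at the edges...
          p[np.argmin(jeffreys)])             # ...and is smallest at p = 1/2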

Lecture 2b: From the Jarzynski equality to linear response.

In preparation for Lecture 3, I will give a brief, simplified derivation of the Jarzynski equality (1), through the lens of bounding the dissipation of a protocol, with some discussion of the experiments that have verified it (2). I will then use the Jarzynski equality to derive the fluctuation-dissipation theorem.

  1. Jarzynski, C. (1997). Nonequilibrium equality for free energy differences. Physical Review Letters 78(14), 2690.
  2. Liphardt, J., Dumont, S., Smith, S. B., Tinoco, I., & Bustamante, C. (2002). Equilibrium information from nonequilibrium measurements in an experimental test of Jarzynski's equality. Science 296(5574), 1832-1835.
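
A quick numerical sanity check of <exp(-beta W)> = exp(-beta dF) (a sketch with invented parameters, assuming numpy): an overdamped particle in a harmonic trap whose stiffness is ramped from k_i to k_f in finite time, for which dF = (1/(2 beta)) ln(k_f/k_i) is known exactly.

    import numpy as np

    rng = np.random.default_rng(0)
    beta, k_i, k_f = 1.0, 1.0, 4.0
    n_traj, n_steps, dt = 200_000, 400, 0.005

    x = rng.normal(0.0, 1.0 / np.sqrt(beta * k_i), n_traj)  # equilibrium start
    w = np.zeros(n_traj)
    dk = (k_f - k_i) / n_steps
    for s in range(n_steps):
        k = k_i + (k_f - k_i) * s / n_steps
        w += 0.5 * x**2 * dk              # work done by raising the stiffness
        x += -k * x * dt + np.sqrt(2 * dt / beta) * rng.normal(size=n_traj)

    dF = 0.5 / beta * np.log(k_f / k_i)
    print(np.exp(-beta * dF), np.mean(np.exp(-beta * w)))   # should agree

Even though the mean work exceeds dF (the finite-time protocol dissipates), the exponential average recovers the equilibrium free energy difference.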

Lecture 3:  Geometric bounds for dissipation.

I will demonstrate two bounds on dissipation, each of which has a geometric interpretation. The first (1) bounds the entropy production of a protocol that must be carried out in finite time, and quantifies the energetic cost of leaving the quasi-static regime. The second (2) applies even in the quasi-static regime, but is sub-extensive, becoming unimportant in the thermodynamic (large-N) limit.

  1. Sivak, D. A., & Crooks, G. E. (2012). Thermodynamic metrics and optimal paths. Physical Review Letters 108(19), 190602.
  2. Machta, B. B. (2015). Dissipation Bound for Thermodynamic Control. Physical Review Letters 115(26), 260603.
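
In the linear-response language of ref. 1 the first bound takes a compact form (a sketch in my notation, not the lecturer's): the mean excess work of a protocol \lambda(t) of duration \tau is

    \[
      \langle W_{\mathrm{ex}} \rangle \;\simeq\; \int_0^\tau \dot\lambda^i \, \zeta_{ij}(\lambda) \, \dot\lambda^j \, dt ,
      \qquad
      \zeta_{ij}(\lambda) \;=\; \beta \int_0^\infty \langle \delta X_i(t) \, \delta X_j(0) \rangle_\lambda \, dt ,
    \]

and the Cauchy-Schwarz inequality then gives \langle W_{\mathrm{ex}} \rangle \ge \mathcal{L}^2 / \tau, where \mathcal{L} = \int_0^\tau ( \dot\lambda^i \zeta_{ij} \dot\lambda^j )^{1/2} dt is the thermodynamic length of the path. Optimal finite-time protocols are constant-speed geodesics of the friction metric \zeta_{ij}, which is what gives these bounds their geometric flavor.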

 

  5. Juan MR Parrondo

Thermodynamics of Information

Soon after the discovery of the second law of thermodynamics, Maxwell illustrated its probabilistic nature with a gedanken experiment, now known as Maxwell's demon. He argued that if an intelligent being (a demon) had information about the velocities and positions of the particles in a gas, then the demon could transfer the fast, hot particles from a cold reservoir to a hot one, in apparent violation of the second law. Maxwell's demon reveals a fundamental relationship between entropy and information: information is a thermodynamic resource, and consequently manipulating information carries a thermodynamic cost.

This short course will review the main aspects of information thermodynamics. We will discuss the history of Maxwell's demon and how the second law should be modified to incorporate information. We will then introduce more recent developments, such as fluctuation theorems and reversibility for feedback processes, and the design of optimal Maxwell demons. The basic idea of our approach is that information is stored in slow degrees of freedom that are out of equilibrium. From this viewpoint, information thermodynamics is the study of a certain class of non-equilibrium states, and information is generated by breaking ergodicity. Finally, we will address the question of the physical nature of information, confronting this approach with alternative definitions of information based on instantaneous correlations.
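
The modified second law mentioned above has a compact standard form (the Sagawa-Ueda inequality; see refs. 2 and 3 below; the notation here is mine). For an isothermal process with feedback, in which the measurement establishes mutual information I between system and memory,

    \[
      \langle W_{\mathrm{ext}} \rangle \;\le\; -\Delta F + k_B T \, I ,
    \]

so a Szilard engine (one bit, I = \ln 2, \Delta F = 0) extracts at most k_B T \ln 2 per cycle, exactly the Landauer cost \langle W_{\mathrm{erase}} \rangle \ge k_B T \ln 2 of resetting the demon's memory, and the second law is restored over the full cycle.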

Session 1

  1. A bit of history: Maxwell's demon, Szilard's engine, Bennett's solution.
  2. Basic concepts: Shannon information, mutual information, relative entropy, non-equilibrium free energy.
  3. Information and the second law.

Session 2

  1. Fluctuation theorems for feedback systems.
  2. Reversibility: optimal Maxwell demons, optimal feedback.
  3. Thermodynamic cost of measurement and erasure.

Session 3

  1. Creating information: symmetry breaking.
  2. Maxwell demons in phase space and microcanonical Szilard engines.
  3. Information flows.

References:

  1. H.S. Leff and A.F. Rex. Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing (Institute of Physics, 2003).
  2. J.M.R. Parrondo, J.M. Horowitz and T. Sagawa. Thermodynamics of information. Nature Physics 11, 131-139 (2015).
  3. T. Sagawa and M. Ueda. Minimal Energy Cost for Thermodynamic Information Processing: Measurement and Information Erasure. Physical Review Letters 102, 250602 (2009).
  4. J.M. Horowitz and S. Vaikuntanathan. Non-equilibrium detailed fluctuation theorem for repeated discrete feedback. Physical Review E 82, 061120 (2010)
  5. J.M. Horowitz and J.M.R. Parrondo. Optimizing non-ergodic feedback engines. Acta Physica Polonica B 44, 803-814 (2013).
  6. J.M. Horowitz, T. Sagawa and J.M.R. Parrondo. Imitating Chemical Motors with Optimal Information Motor. Physical Review Letters 111, 010602 (2013).
  7. J.M.R. Parrondo. The Szilard engine revisited: Entropy, macroscopic randomness, and symmetry breaking phase transitions. Chaos 11, 725-733 (2001).
  8. É. Roldán, I.A. Martínez, J.M.R. Parrondo and D. Petrov. Universal features in the energetics of symmetry breaking. Nature Physics 10, 457-461 (2014).
  9. R. Marathe and J.M.R. Parrondo. Cooling classical particles with a microcanonical Szilard engine. Physical Review Letters 104, 245704 (2010).
  10. J.M.R. Parrondo and L. Granger. Maxwell demons in phase space. European Physical Journal - Special Topics 224, 865-875 (2015).
  11. J.M. Horowitz and M. Esposito. Thermodynamics with Continuous Information Flow. Physical Review X 4, 031015 (2014).

 

  6. Apoorva Patel

Information Theory Meets Quantum Physics: The magic of wave dynamics

The scope of information theory can be expanded by generalising the notion of a message from a "sequence of letters" to a "collection of building blocks". A variety of computational frameworks can be constructed by the choice of the building blocks, their collections and their processing. The choices are constrained by the laws of physics, and the selection can be optimised depending on what is available and what is to be accomplished. This description will be illustrated by many examples from our experience. Quantum theory combines features of particle and wave dynamics. Its basic building block is the qubit, which is much more powerful than the bit. Simple tasks that demonstrate this power will be explained.
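
One of the simplest such tasks is superdense coding, in which a single qubit, helped by a previously shared Bell pair, carries two classical bits. A minimal numpy sketch (my own illustration, not necessarily the example used in the lecture):

    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0, 1], [1, 0]])      # bit flip
    Z = np.array([[1, 0], [0, -1]])     # phase flip

    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)    # shared (|00> + |11>)/sqrt(2)

    # Alice encodes two bits by acting on her qubit alone.
    encodings = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

    # Bob measures both qubits in the Bell basis; each outcome is one message.
    bell_basis = {
        (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),
        (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),
        (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),
        (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),
    }

    for bits, U in encodings.items():
        state = np.kron(U, I2) @ bell             # Alice's local operation
        decoded = max(bell_basis, key=lambda b: abs(bell_basis[b] @ state))
        print(bits, "->", decoded)                # all four messages decode perfectly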

 

  7. Mark Transtrum

Lecture 1: Sloppiness and Parameter Identifiability, Information Geometry, and the Role of Experimental Design

Abstract:
I discuss the historical development of the concepts of parameter identifiability and sloppiness and the relationship between them. I introduce the Fisher Information Matrix (FIM) and its role in quantifying identifiability and sloppiness. By interpreting the FIM as a Riemannian metric on the model parameter space, I introduce information geometry and discuss implications for sloppiness and parameter identifiability. Finally, I discuss the potential role of experimental design for improving the identifiability of model parameters.
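
A small worked example of design for identifiability (a toy of my own, assuming numpy; the model, candidate times, and unit noise are invented): for y(t) = a exp(-r t), choose the pair of measurement times that maximizes log det of the FIM, the D-optimality criterion.

    import numpy as np
    from itertools import combinations

    a, r = 1.0, 1.0                          # nominal parameter values
    candidates = np.linspace(0.1, 5.0, 25)   # candidate measurement times

    def fim(times):
        # Jacobian of y = a*exp(-r*t) with respect to (a, r), unit noise
        J = np.stack([np.exp(-r * times),
                      -a * times * np.exp(-r * times)], axis=1)
        return J.T @ J

    best = max(combinations(candidates, 2),
               key=lambda ts: np.linalg.slogdet(fim(np.array(ts)))[1])
    print(best)   # well-separated times identify both a and r best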

References:

Parameter Identifiability References:
Rothenberg, Thomas J. "Identification in parametric models." Econometrica: Journal of the Econometric Society (1971): 577-591.
Cobelli, Claudio, and J. J. DiStefano. "Parameter and structural identifiability concepts and ambiguities: a critical review and analysis." American Journal of Physiology-Regulatory, Integrative and Comparative Physiology 239.1 (1980): R7-R24.
Raue, Andreas, et al. "Structural and practical identifiability analysis of partially observed dynamical models by exploiting the profile likelihood." Bioinformatics 25.15 (2009): 1923-1929.

Sloppy Model References:
Brown, Kevin S., et al. "The statistical mechanics of complex signaling networks: nerve growth factor signaling." Physical biology 1.3 (2004): 184.
Waterfall, Joshua J., et al. "Sloppy-model universality class and the Vandermonde matrix." Physical review letters 97.15 (2006): 150601.
Gutenkunst, Ryan N., et al. "Universally sloppy parameter sensitivities in systems biology models." PLoS Comput Biol 3.10 (2007): e189.
Daniels, Bryan C., et al. "Sloppiness, robustness, and evolvability in systems biology." Current opinion in biotechnology 19.4 (2008): 389-395.
Chachra, Ricky, Mark K. Transtrum, and James P. Sethna. "Structural susceptibility and separation of time scales in the van der Pol oscillator." Physical Review E 86.2 (2012): 026712.
Machta, Benjamin B., et al. "Parameter space compression underlies emergent theories and predictive models." Science 342.6158 (2013): 604-607.
Transtrum, Mark K., et al. "Perspective: Sloppiness and emergent theories in physics, biology, and beyond." The Journal of chemical physics 143.1 (2015): 010901.

Information Geometry References:
Bates, Douglas M., and Donald G. Watts. "Relative curvature measures of nonlinearity." Journal of the Royal Statistical Society. Series B (Methodological) (1980): 1-25.
Campbell, L. Lorne. "The relation between information theory and the differential geometry approach to statistics." Information Sciences 35.3 (1985): 199-210.
Bates, Douglas M., and Donald G. Watts. Nonlinear regression analysis and its applications. Wiley, 1988.
Murray, Michael K., and John W. Rice. Differential geometry and statistics. Vol. 48. CRC Press, 1993.
Amari, Shun-ichi, and Hiroshi Nagaoka. Methods of information geometry. Vol. 191. American Mathematical Soc., 2007.
Transtrum, Mark K., Benjamin B. Machta, and James P. Sethna. "Why are nonlinear fits to data so challenging?." Physical review letters 104.6 (2010): 060201.
Transtrum, Mark K., Benjamin B. Machta, and James P. Sethna. "Geometry of nonlinear least squares with applications to sloppy models and optimization." Physical Review E 83.3 (2011): 036701.
Mannakee, Brian K., et al. "Sloppiness and the geometry of parameter space." Uncertainty in Biology. Springer International Publishing, 2016. 271-299.

Experimental Design References:
Casey, Fergal P., et al. "Optimal experimental design in an epidermal growth factor receptor signalling and down-regulation model." IET systems biology 1.3 (2007): 190-202.
Apgar, Joshua F., et al. "Sloppy models, parameter uncertainty, and the role of experimental design." Molecular BioSystems 6.10 (2010): 1890-1900.
Chachra, Ricky, Mark K. Transtrum, and James P. Sethna. "Comment on “Sloppy models, parameter uncertainty, and the role of experimental design”." Molecular BioSystems 7.8 (2011): 2522-2522.
Transtrum, Mark K., and Peng Qiu. "Optimal experiment selection for parameter estimation in biological differential equation models." BMC bioinformatics 13.1 (2012): 1.
Tönsing, Christian, Jens Timmer, and Clemens Kreutz. "Cause and cure of sloppiness in ordinary differential equation models." Physical Review E 90.2 (2014): 023303.
White, Andrew, et al. "The limitations of model-based experimental design and parameter estimation in sloppy systems." arXiv preprint arXiv:1602.05135 (2016).


Lecture 2: Computational Differential Geometry, Optimization Algorithms, and the Manifold Boundary Approximation Method

Abstract:
I briefly introduce/review key concepts from differential geometry and techniques for implementing them numerically. I discuss several algorithms that were motivated by insights from information geometry: the relative offset orthogonality convergence criterion, the natural gradient, Riemannian MCMC sampling, and the geodesic Levenberg-Marquardt algorithm. Finally, I introduce the Manifold Boundary Approximation Method (MBAM) as a general parameter-reduction method for overly parameterized models.
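
For concreteness, a single natural-gradient step in code (a schematic sketch with a toy Gaussian example of my own, not an implementation from the references):

    import numpy as np

    def natural_gradient_step(theta, grad_loglik, fisher, lr=0.1, damping=1e-6):
        # Steepest ascent measured in the FIM metric: theta += lr * F^{-1} grad.
        F = fisher(theta) + damping * np.eye(len(theta))
        return theta + lr * np.linalg.solve(F, grad_loglik(theta))

    # Toy usage: estimating a Gaussian mean with known variance sigma2.
    # Here F = 1/sigma2 and grad = (xbar - mu)/sigma2, so the natural step
    # moves a fraction lr of the way toward xbar regardless of sigma2.
    sigma2, xbar = 2.0, 1.5
    print(natural_gradient_step(np.array([0.0]),
                                lambda m: (xbar - m) / sigma2,
                                lambda m: np.eye(1) / sigma2))

The invariance of the step under reparameterization (here, under rescaling by sigma2) is exactly what makes the natural gradient attractive.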

References:

Computational Differential Geometry References:
Transtrum, Mark K., Benjamin B. Machta, and James P. Sethna. "Geometry of nonlinear least squares with applications to sloppy models and optimization." Physical Review E 83.3 (2011): 036701.

Optimization Algorithms References:
Bates, Douglas M., and Donald G. Watts. "A Relative Offset Orthogonality Convergence Criterion for Nonlinear Least Squares." Technometrics 23.2 (1981): 179-183.
Amari, Shun-Ichi. "Natural gradient works efficiently in learning." Neural computation 10.2 (1998): 251-276.
Gutenkunst, Ryan Nicholas. Sloppiness, modeling, and evolution in biochemical networks. Diss. Cornell University, 2008.
Girolami, Mark, and Ben Calderhead. "Riemann manifold Langevin and Hamiltonian Monte Carlo methods." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 73.2 (2011): 123-214.
Transtrum, Mark K., Benjamin B. Machta, and James P. Sethna. "Why are nonlinear fits to data so challenging?." Physical review letters 104.6 (2010): 060201.
Transtrum, Mark K., and James P. Sethna. "Improvements to the Levenberg-Marquardt algorithm for nonlinear least-squares minimization." arXiv preprint arXiv:1201.5885 (2012).
 

MBAM References:
Transtrum, Mark K., and Peng Qiu. "Model reduction by manifold boundaries." Physical review letters 113.9 (2014): 098701.
Transtrum, Mark K., and Peng Qiu. "Bridging Mechanistic and Phenomenological Models of Complex Biological Systems." PLoS Comput Biol 12.5 (2016): e1004915.


Lecture 3: Applications of MBAM, Information Topology

Abstract:
I continue the previous discussions about the Manifold Boundary Approximation Method (MBAM) by discussing recent applications in statistical mechanics, control theory, systems biology, nuclear physics, power systems, and neuroscience. I discuss the relationship of MBAM to other well-known model reduction methods such as balanced truncation, singular perturbation, renormalization group, and other more recent proposals. I speculate about the potential application of MBAM to very large models. I show how MBAM suggests a topological interpretation of modeling and discuss some potential applications of information topology for model reduction, model construction, and model interpretation. Finally, I return to the original concept of parameter identifiability and reinterpret it in terms of the metrization of the underlying topological model space.
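
As a highly simplified sketch of the MBAM recipe on a toy model (my own construction, assuming numpy and scipy; not the authors' code): fit a two-exponential model, take the sloppiest eigendirection of the FIM, and integrate the model-manifold geodesic in that direction until a boundary is approached.

    import numpy as np
    from scipy.integrate import solve_ivp

    t_obs = np.linspace(0.5, 4.0, 12)

    def y(th):
        # toy model: two decaying exponentials, parameterized by log rates
        return np.exp(-np.exp(th[0]) * t_obs) + np.exp(-np.exp(th[1]) * t_obs)

    def jac(th, h=1e-6):
        return np.stack([(y(th + h * e) - y(th - h * e)) / (2 * h)
                         for e in np.eye(2)], axis=1)

    def geodesic(_, state):
        th, v = state[:2], state[2:]
        J = jac(th)
        h = 1e-4   # second directional derivative of y along v
        A = (y(th + h * v) - 2.0 * y(th) + y(th - h * v)) / h**2
        g = J.T @ J + 1e-10 * np.eye(2)   # FIM metric, lightly regularized
        return np.concatenate([v, -np.linalg.solve(g, J.T @ A)])

    th0 = np.array([0.0, 0.1])
    w, V = np.linalg.eigh(jac(th0).T @ jac(th0))
    sol = solve_ivp(geodesic, (0.0, 2.0), np.concatenate([th0, V[:, 0]]),
                    max_step=0.02)        # launch along the sloppiest direction
    print(sol.y[:2, -1])                  # parameters run off toward a boundary

At the boundary the limiting model has fewer parameters (here, a single effective exponential), and that limit is the reduced model MBAM returns.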

References:

Applications of MBAM:
Transtrum, Mark K., and Peng Qiu. "Model reduction by manifold boundaries." Physical review letters 113.9 (2014): 098701.
Paré, Philip E., et al. "A Unified View of Balanced Truncation and Singular Perturbation Approximations." 2015 American Control Conference (ACC). IEEE, 2015.
Transtrum, Mark K., and Peng Qiu. "Bridging Mechanistic and Phenomenological Models of Complex Biological Systems." PLoS Comput Biol 12.5 (2016): e1004915.
Nikšić, Tamara, and Dario Vretenar. "“Sloppy” nuclear energy density functionals: Effective model reduction." Physical Review C 94.2 (2016): 024333.
Transtrum, Mark K., Andrija T. Saric, and A. M. Stankovic. "Measurement-Directed Reduction of Dynamic Models in Power Systems." IEEE Transactions on Power Systems.
Maiwald, Tim, et al. "Driving the Model to Its Limit: Profile Likelihood Based Model Reduction." PloS one 11.9 (2016): e0162366.
Raman, Dhruva V., James Anderson, and Antonis Papachristodoulou. "Delineating Parameter Unidentifiabilities in Complex Models." arXiv preprint arXiv:1607.07705 (2016).

Information Topology:
Transtrum, Mark K. "Manifold boundaries give" gray-box" approximations of complex models." arXiv preprint arXiv:1605.08705 (2016).
Transtrum, Mark K., Gus Hart, and Peng Qiu. "Information topology identifies emergent model classes." arXiv preprint arXiv:1409.6203 (2014).


 

  8. S. Vaikuntanathan

Information processing and thermodynamics in biophysical control systems

Abstract:
Biological systems constantly sense stimuli from the environment, process and convey this information robustly over large length and time scales, and adapt efficiently to changing conditions. The fidelity with which information is processed by biochemical networks is surprising given that it is achieved in conditions in which the influence of random thermal fluctuations is very high. This fidelity of information processing and signaling is crucial to the survival and well-being of the organism, and uncovering the mechanisms cells use to ensure robust information processing and organization remains a central problem in systems biology and biophysics. The broad focus of my lectures will be the microscopic information-processing mechanisms that allow biological systems, evolving in noisy stochastic environments, to behave in a controlled manner. The lectures will demonstrate how far-from-equilibrium fluctuations can enhance the robustness of information-processing mechanisms. While the material will be presented in the context of biophysical and biochemical processes, the basic principles covered can be applied more broadly in other physical and chemical contexts.
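
A back-of-envelope illustration of the dissipation/accuracy theme (the numbers are mine; see ref. 3 below for the real analysis): with an energy gap Delta between right and wrong substrates, equilibrium discrimination gives an error fraction of about exp(-Delta/kT), and each driven kinetic-proofreading stage multiplies the error by another such factor, at a free-energy cost.

    import numpy as np

    delta_over_kT = 4.0
    f = np.exp(-delta_over_kT)       # per-stage discrimination factor
    for n_stages in range(3):
        print(n_stages, "proofreading stage(s): error ~", f ** (n_stages + 1))
    # 0 stages ~ 1.8e-2, 1 stage ~ 3.4e-4, 2 stages ~ 6.1e-6; each extra
    # stage burns free energy: the speed-dissipation-error tradeoff above.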

References:


  1. http://www.pnas.org/content/102/29/10040 (Physical limits of biochemical signaling)
  2. http://www.nature.com/nphys/journal/v8/n5/full/nphys2276.html (The energy-speed-accuracy tradeoff in sensory adaptation)
  3. http://www.pnas.org/content/109/30/12034.abstract (Speed, dissipation, and error in kinetic proofreading).

 

  9. Lenka Zdeborova

Journey through statistical physics of constraint satisfaction and inference

Lecture 1: Random graph coloring. Belief propagation (a minimal sketch follows after this list).
Lecture 2: Analysis of phase transitions and algorithmic consequences.
Lecture 3: Planted coloring, stochastic block model, computational phase transitions, spectral methods.
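
A minimal implementation of the belief-propagation update for q-coloring on a sparse random graph (a toy sketch of the standard equations, assuming numpy; not the lecturer's code). The message node i sends to neighbor j is mu_{i->j}(c), proportional to the product over k in di\j of (1 - mu_{k->i}(c)).

    import numpy as np

    rng = np.random.default_rng(1)
    n, q, c_avg = 60, 3, 3.0
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if rng.random() < c_avg / n]       # Erdos-Renyi graph G(n, c/n)
    neigh = {i: [] for i in range(n)}
    for i, j in edges:
        neigh[i].append(j)
        neigh[j].append(i)

    mu = {}                                     # mu[(i, j)]: message i -> j
    for i, j in edges:
        mu[(i, j)] = rng.dirichlet(np.ones(q))
        mu[(j, i)] = rng.dirichlet(np.ones(q))

    for _ in range(200):                        # iterate to a fixed point
        for (i, j) in list(mu):
            m = np.ones(q)
            for k in neigh[i]:
                if k != j:
                    m *= 1.0 - mu[(k, i)]       # neighbor k must avoid color c
            mu[(i, j)] = m / (m.sum() + 1e-12)

    b = np.ones(q)                              # BP marginal of node 0
    for k in neigh[0]:
        b *= 1.0 - mu[(k, 0)]
    print(b / (b.sum() + 1e-12))                # near-uniform at low degree

At low average degree BP converges to the near-uniform "paramagnetic" fixed point; the phase transitions of Lecture 2 show up as changes in the nature and stability of these fixed points.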

References:

  1. Phase transitions in the coloring of random graphs, http://journals.aps.org/pre/abstract/10.1103/PhysRevE.76.031131
  2. Hiding Quiet Solutions in Random Constraint Satisfaction Problems, http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.102.238701
  3. Asymptotic analysis of the stochastic block model for modular networks and its algorithmic applications, http://journals.aps.org/pre/abstract/10.1103/PhysRevE.84.066106
  4. Spectral redemption in clustering sparse networks, http://www.pnas.org/content/110/52/20935.short
  5. Statistical physics of inference: Thresholds and algorithms, https://arxiv.org/abs/1511.02476 (review, mainly relevant to Florent's lectures)