One of the most frequently used scientific words is “entropy”. The reason is that it bridges two major scientific domains: physics and information theory. Its origin goes back to the beginnings of physics (thermodynamics), but since Shannon it has also become central to information theory. This conference is an opportunity to bring researchers from these two communities together and create a synergy. The main topics and sessions of the conference cover:

• Physics: classical and quantum thermodynamics
• Statistical physics and Bayesian computation
• Geometrical science of information, topology and metrics
• Maximum entropy principle and inference
• Kullback and Bayes or information theory and Bayesian inference
• Entropy in action (applications)

Inter-disciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments as well as new applications of entropy and information theory.

All accepted papers will be published in the conference proceedings. The authors of a selection of invited and contributed talks presented during the conference will be invited to submit an extended version of their paper for a special issue of the open-access journal Entropy.

Other events on the same topic:

CIRM seminar TGSI’17 “Topological & Geometrical Structures of Information”:

http://forum.cs-dc.org/uploads/files/1484556914411-poster-tgsi2017.pdf

and MDPI Entropy Book on “Differential Geometrical Theory of Statistics” on some of these topics:

http://www.mdpi.com/books/pdfdownload/book/313/1

Thanks!

Questions about information theory have always cropped up in my studies of physics:

Does the Nyquist frequency/rate relate to the Cauchy–Schwarz inequality and Heisenberg uncertainty?
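(The bandwidth limit the question alludes to is easy to exhibit numerically: a sinusoid above half the sampling rate produces exactly the same samples as a lower-frequency alias. A minimal sketch, assuming NumPy; the 100 Hz sampling rate and 70 Hz test tone are illustrative choices, not anything from the conference material.)

```python
import numpy as np

fs = 100.0                   # sampling rate in Hz; Nyquist frequency is fs/2 = 50 Hz
t = np.arange(0, 1, 1 / fs)  # one second of samples, t_n = n / fs

f_high = 70.0                # above the Nyquist frequency
f_alias = fs - f_high        # folds back to 30 Hz

x_high = np.sin(2 * np.pi * f_high * t)
x_alias = np.sin(2 * np.pi * f_alias * t)

# At these sample points the 70 Hz sine is indistinguishable from a
# sign-flipped 30 Hz sine: sin(2*pi*0.7*n) == -sin(2*pi*0.3*n) for integer n.
print(np.allclose(x_high, -x_alias))  # prints True
```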

Does the Gibbs phenomenon model any physical phenomena involving other abrupt discontinuities, like black holes (and Hawking radiation) or evanescent-wave tunneling?
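(For readers unfamiliar with the effect behind this question: the Fourier partial sums of a square wave overshoot near the jump by about 9% of the jump height, and the overshoot does not shrink as more terms are added. A minimal numerical sketch, assuming NumPy; the ±1 square wave and the term counts are illustrative.)

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Fourier partial sum of a +/-1 square wave: (4/pi) * sum_k sin((2k+1)x)/(2k+1)."""
    s = np.zeros_like(x)
    for k in range(n_terms):
        n = 2 * k + 1
        s += np.sin(n * x) / n
    return 4.0 / np.pi * s

# Sample densely near the jump at x = 0, where the overshoot peak sits.
x = np.linspace(1e-4, np.pi / 2, 200_000)
for n_terms in (10, 100, 1000):
    print(n_terms, square_wave_partial_sum(x, n_terms).max())

# The maximum does not decay toward the wave's value 1; it approaches the
# Wilbraham-Gibbs value (2/pi) * Si(pi) ~= 1.1789 regardless of n_terms.
```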

Does the Bayesian vs. frequentist debate extend to wave–particle duality (in, say, Young’s double-slit experiment)?