Information, computation, and logic are defining concepts of the modern era. Shannon laid the foundations of information theory, demonstrating that problems of communication and compression can be precisely modeled, formulated, and analyzed. Turing formalized computation, defined as the transformation of information by means of algorithms. Gödel established the modern foundations of logic, thereby laying the groundwork for modern computer science and the science of information.
Shannon's original focus was on data recovery in compression and communication, but information is not merely communicated: it is also acquired, represented, inferred, processed, aggregated, managed, valued, secured, and computed. Computational information concerns those properties of information that can be feasibly extracted. The mere existence of an object is of limited utility if no reasonable algorithm can provably generate it. Infeasibility may arise for a number of reasons: the desired information may be computationally hard to extract; it may be distributed geographically and not locally extractable; or it may be encoded in (quantum) physical ways that prevent full extraction. In contrast to the classical theory of information, where precise quantitative limits can be established in most cases, information in the computational setting is not well understood quantitatively, with exponential gaps between the upper and lower bounds on the amount of feasibly extractable information.
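To illustrate how sharp the classical limits are (a standard textbook fact, added here only as an illustration), Shannon's source coding theorem determines the optimal rate of lossless compression exactly: for a discrete memoryless source $X$ with distribution $p$, the minimal expected code length per symbol $L^{*}$ satisfies
\[
  H(X) = -\sum_{x} p(x)\log_2 p(x), \qquad H(X) \le L^{*} < H(X) + 1,
\]
so the answer is pinned down to within a single bit. No comparably tight bounds are known once one asks how much information can be feasibly extracted.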
We must add logic to this paradigm. At its most basic, logic is the study of consequence. The core intuition for including logic in a theory of information is that an informational state can be characterized by the range of possibilities or configurations compatible with the information available at that state. Logic can restrict this range of possibilities, thereby directly shaping information itself. Furthermore, logic's ``unusual effectiveness in computer science'', from descriptive complexity to type theory (including Voevodsky's univalence axiom) to reasoning about knowledge, closes the loop from logic to information to computation. Understanding how to harness this effectiveness in order to deepen the connections to a theory of information remains very much an open question.
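To make the possibility-elimination view concrete (an illustrative toy example, not taken from the workshop material): if an agent's informational state is modeled as a finite set $W$ of configurations compatible with what the agent knows, then learning that a formula $\varphi$ holds shrinks this set to $W_{\varphi} \subseteq W$, and under a uniform measure the information gained is
\[
  \log_2 \frac{|W|}{|W_{\varphi}|} \text{ bits}.
\]
A logical consequence of what is already known eliminates no configurations and thus carries zero new information, while a formula that rules out half of $W$ carries exactly one bit.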
There are plenty of questions, and very few satisfying answers:
Note: All travel arrangements must be made by participants themselves.
Marek Zaionc (Jagiellonian University, Poland) and Wojtek Szpankowski (Purdue, USA)
14:00 - 15:00 Peter Shor (Massachusetts Institute of Technology, USA)
"Quantum Computing"
15:00 - 16:00 Vijay Vazirani (University of California, Irvine, USA)
"Cardinal-Utility Matching Markets: From Tractability to Intractability ... and Back!"
16:00 - 17:00 Andrew Barron (Yale, USA)
"Log concave coupling for sampling from neural net posterior distributions"
17:00 - 17:30 Coffee Break
17:30 - 18:00 Ten-minute walk to Collegium Maius
Wojciech Szpankowski (Purdue, USA)
"Structural, Temporal, and Semantic Information"
9:00 - 10:00 Venkat Anantharam (Berkeley, USA)
"Distributed information"
10:00 - 11:00 Alex Gray (IBM, USA)
"Neuro-Symbolic AI: The Empirical and Theoretical Advantage of Logic-like Learning Models"
11:00 - 12:00 Shuki Bruck (Caltech, USA)
"The Evolution of Information Systems: From Life to Artificial Intelligence"
13:30 - 14:30 David Ellerman (University of Ljubljana, Slovenia)
"New Foundations for Information Theory: Logical Entropy and Shannon Entropy"
14:30 - 15:30 Venkat Guruswami (Berkeley, USA)
"Polymorphism minions and promise constraint satisfaction: A gateway between information and computation"
15:30 - 16:00 Coffee Break
16:00 - 17:00 Piotr Sankowski (Warsaw University, Poland)
"Analyzing the Influence of Language Model-Generated Responses in Mitigating Hate Speech"
17:00 - 18:00 Alon Orlitsky (University of California, San Diego, USA)
"Robust learning from untrusted sources"
9:00 - 10:00 Negar Kiyavash (EPFL, Switzerland)
"Causal identification: state of the art and challenges"
10:00 - 11:00 Pawel Idziak (Jagiellonian University, Poland)
"Complexity of solving equations - kith and kin"
11:00 - 12:00 Joachim Buhmann (ETH, Switzerland)
"Why is it difficult to recover a planted sub-hypergraph?"
14:00 - 15:00 Jean-Claude Belfiore (Huawei, France)
"Building a theory of semantic information: spaces of information"
15:00 - 16:00 Avi Wigderson (Institute for Advanced Study, Princeton, USA)
"The Value of Errors in Proofs"
16:00 - 17:00 Alberto Luigi Sangiovanni-Vincentelli (Berkeley, USA)
"Platform and Contract-based design: two complementary system design platforms"