Shannon's information theory seeks fundamental limits on storage and communication. In the 1970s, the information theory community began investigating rates of convergence to these limits, initiating the precise analysis of their second-order asymptotics. However, we shall argue that information theory needs to be reinvigorated to meet today's challenges in biology, economics, modern communication, and knowledge extraction, with a focus on the elements of time, space, structure, and context.
We observe that these elements often contribute to the second-order terms of these fundamental limits. In this talk, we concentrate on delay and structure by studying their impact on several algorithmic and combinatorial constructions. We first discuss a precise analysis of the minimax redundancy, in which delay is represented by the sequence length.
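As background for the quantity just named, the worst-case (maximal) minimax redundancy of a class $\mathcal{S}$ of sources is commonly defined through Shtarkov's sum; this is the standard textbook definition, offered here only as orientation, since the talk may analyze a variant:

```latex
R_n^*(\mathcal{S})
  \;=\; \min_{Q}\,\max_{x_1^n}\,\sup_{P \in \mathcal{S}}
        \log \frac{P(x_1^n)}{Q(x_1^n)}
  \;=\; \log \sum_{x_1^n}\, \sup_{P \in \mathcal{S}} P(x_1^n).
```

Here the delay enters through the sequence length $n$: the redundancy measures the extra bits, beyond the entropy, that any universal code $Q$ must pay in the worst case over sequences of length $n$.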
Then we cover highlights of the theory of Markov types, which unveils some elegant combinatorics.
Next we turn our attention to the structural compression of graphical objects. These results are obtained using analytic methods from the analysis of algorithms. We conclude with some challenges for future research. Recently, the National Science Foundation established the first Science and Technology Center for Science of Information (CSoI) to address these challenges and to develop tools that move beyond our current understanding of information.