Search Results
Tail processes and tail measures: An approach via Palm calculus
Using an intrinsic approach, we study some properties of random fields which appear as tail fields of regularly varying stationary random fields. The...
Information, Novelty, and Surprise in Brain Theory
In biological research, it is common to assume that each organ of an organism serves a definite purpose. The purpose of the brain seems to be the...
Information Theory on Lattices of Covers
Classical information theory considers the information...
Prerequisites from Logic and Probability Theory
This chapter lays the probabilistic groundwork for the rest of the book. We introduce standard probability theory. We call the elements A of the...
Introduction
The main purpose of this book is to extend classical information theory to incorporate the subjective element of interestingness, novelty, or...
Novelty, Information and Surprise of Repertoires
This chapter finally contains the definition of novelty, information, and surprise for arbitrary covers and in particular for repertoires and some...
Entropy in Physics
The term entropy was created in statistical mechanics; it is closely connected to information, and it is this connection that is the theme of this...
Three Orderings on Repertoires
This idea leads to two almost equally reasonable definitions of an ordering of repertoires, which we will call ≤1 and ≤2. We will analyze these two...
Order- and Lattice-Structures
In this part, we condense the new mathematical ideas and structures that have been introduced so far into a mathematical theory, which can be put...
Improbability and Novelty of Descriptions
In this chapter, we define the information of an event A ∈ Σ, or in our terminology, the novelty of a proposition A as −log2 p(A). We further define...
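The entry above defines the novelty of a proposition A with probability p(A) as −log2 p(A). A minimal sketch of that formula in Python (the probabilities below are made-up illustration values, not from the book):

```python
import math

def novelty(p: float) -> float:
    """Novelty of a proposition with probability p, in bits: -log2 p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# Rarer propositions carry more novelty.
print(novelty(0.5))    # 1.0 bit
print(novelty(0.125))  # 3.0 bits
```

For p = 1 the novelty is 0: a certain proposition tells us nothing new.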
Stationary Processes and Their Information Rate
This chapter briefly introduces the necessary concepts from the theory of stochastic processes (see for example Lamperti 1977; Doob 1953) that are...
Conditioning, Mutual Information, and Information Gain
In this chapter, we discuss the extension of three concepts of classical information theory, namely, conditional information, transinformation (also...
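The entry above mentions transinformation, the classical name for mutual information I(X;Y) = Σ p(x,y) log2 [p(x,y) / (p(x)p(y))]. A minimal sketch of that standard formula, with a made-up joint distribution:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, from a dict {(x, y): p(x, y)} of joint probabilities."""
    # Marginals p(x) and p(y) from the joint distribution.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly correlated fair bits share 1 bit of information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```

For independent variables every term vanishes, so I(X;Y) = 0, matching the intuition that the output then tells us nothing about the input.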
Local spatial log-Gaussian Cox processes for seismic data
In this paper, we propose the use of advanced and flexible statistical models to describe the spatial displacement of earthquake data. The paper aims...
Repertoires and Descriptions
This chapter introduces the notion of a cover or repertoire and its proper descriptions. Based on the new idea of relating covers and descriptions,...
Novelty, Information and Surprise
This revised edition offers an approach to information theory that is more general than the classical approach of Shannon. Classically, information...
How to Transmit Information Reliably with Unreliable Elements (Shannon’s Theorem)
Shannon’s theorem is one of the most important results in the foundation of information theory (Shannon & Weaver, 1949). It says...
Conditional and Subjective Novelty and Information
This chapter introduces some more involved versions of the concepts of novelty and information, such as subjective and conditional novelty.
On Guessing and Coding
This chapter introduces the Huffman code (Huffman 1952). The ideas of coding and optimizing average codeword length are essential for understanding...
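The entry above names the Huffman (1952) code. A minimal sketch of the standard construction, repeatedly merging the two least probable subtrees (the symbol frequencies are toy values chosen for illustration):

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a Huffman code {symbol: bitstring} from {symbol: frequency}."""
    tiebreak = count()  # keeps heap entries comparable when weights tie
    heap = [(w, next(tiebreak), {s: ""}) for s, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Merge the two lightest subtrees, prefixing their codewords.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print({s: len(b) for s, b in code.items()})  # codeword lengths 1, 2, 3, 3
```

For these dyadic frequencies the average codeword length equals the entropy (1.75 bits), the best any uniquely decodable code can achieve.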
Information Transmission
This chapter introduces the concept of a transition probability and the problem of guessing the input of an information channel from its output. It...