Abstract
Information theory deals with the mathematical properties of communication models, which are usually defined in terms of concepts like channel, source, information, entropy, capacity, code, and which satisfy certain conditions and axioms.
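Of the concepts the abstract lists, entropy is the most easily made concrete. As an illustrative sketch added here (not part of the chapter itself), the Shannon entropy of a finite probability distribution can be computed as follows:

```python
import math

def entropy(probs):
    """Shannon entropy H(p1, ..., pn) = -sum(pi * log2(pi)),
    with the usual convention that 0 * log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin toss
print(entropy([1.0]))        # 0.0: a certain outcome carries no information
print(entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes
```

The axiomatic treatments cited in the bibliography below (Khinchin, Fadeev, Rényi) characterize exactly this function from qualitative conditions.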
Bibliography
Aczél, J.: 1966, Lectures on Functional Equations and Their Applications, Academic Press, New York.
Aczél, J., Pickert, G., and Radó, F.: 1960, ‘Nomogramme, Gewebe, und Quasigruppen’, Mathematica (Cluj) 2, 5-24.
Adler, R. L., Konheim, A. G., and McAndrew, M. H.: 1965, ‘Topological Entropy’, Trans. Amer. Math. Soc. 114, 309-319.
Bar-Hillel, Y. and Carnap, R.: 1953, ‘Semantic Information’, British Journal for the Philosophy of Science 4, 147-157.
Belis, M. and Guiasu, S.: 1967, ‘A Quantitative-Qualitative Measure of Information in Cybernetic Systems’, IEEE Transactions on Information Theory IT-14, 593-594.
De Fériet, J. K. and Forte, B.: 1967, ‘Information et Probabilité’, C.R. Acad. Sci. Paris 265A, 110-114.
De Finetti, B.: 1937, ‘La Prévision: Ses Lois Logiques, Ses Sources Subjectives’, Ann. Inst. Poincaré 7, 1-68. English translation in Studies in Subjective Probability (ed. by H. E. Kyburg, Jr. and H. E. Smokler), Wiley, New York, 1964, pp. 93-158.
Domotor, Z.: 1969, ‘Probabilistic Relational Structures and Their Applications’, Technical Report No. 144, May 14, Institute for Mathematical Studies in the Social Sciences, Stanford University, Stanford, Calif.
Erdös, P.: 1946, ‘On the Distribution Function of Additive Functions’, Ann. Math. 47, 1-20.
Fadeev, D. K.: 1956, ‘On the Concept of Entropy of a Finite Probability Scheme’, Uspekhi Mat. Nauk 11, 227-231. (In Russian.)
Forte, B. and Pintacuda, N.: 1968a, ‘Sull’Informazione Associata alle Esperienze Incomplete’, Annali di Matematica Pura ed Applicata 80, 215-234.
Forte, B. and Pintacuda, N.: 1968b, ‘Information Fournie par une Expérience’, C.R. Acad. Sci. Paris 266A, 242-245.
Hintikka, K. J.: 1968, ‘The Varieties of Information and Scientific Explanation’, in Logic, Methodology and Philosophy of Science, III (ed. by B. van Rootselaar and J. F. Staal), North-Holland Publishing Company, Amsterdam, pp. 151-171.
Hintikka, K. J. and Pietarinen, J.: 1966, ‘Semantic Information and Inductive Logic’, in Aspects of Inductive Logic (ed. by K. J. Hintikka and P. Suppes), North-Holland Publishing Company, Amsterdam, pp. 81-97.
Ingarden, R. S.: 1963, ‘A Simplified Axiomatic Definition of Information’, Bull. Acad. Polon. Sci. 11, 209-212.
Ingarden, R. S.: 1965, ‘Simplified Axioms for Information Without Probability’, Roczniki Polskiego Tow. Matematycznego 9, 273-282.
Ingarden, R. S. and Urbanik, K.: 1962, ‘Information Without Probability’, Colloquium Mathematicum 9, 131-150.
Kendall, D. G.: 1964, ‘Functional Equations in Information Theory’, Zeitschr. für Wahrsch. und Verw. Geb. 2, 225-229.
Khinchin, A. I.: 1957, Der Begriff der Entropie in der Wahrscheinlichkeitsrechnung (Arbeiten zur Informationstheorie, I: Mathematische Forschungsberichte), VEB, Berlin.
Kolmogorov, A. N.: 1965, ‘Three Approaches to the Definition of the Concept “Quantity of Information”’, Problemy Peredaci Informacii 1, 3-11.
Kolmogorov, A. N.: 1967, ‘Logical Basis for Information Theory and Probability Theory’, IEEE Transactions on Information Theory IT-14, 662-664.
Kraft, C. H., Pratt, J. W., and Seidenberg, A.: 1959, ‘Intuitive Probability on Finite Sets’, Ann. Math. Stat. 30, 408-419.
Lee, P. M.: 1964, ‘On the Axioms of Information Theory’, Ann. Math. Stat. 35, 415-418.
Luce, R. D.: 1967, ‘Sufficient Conditions for the Existence of a Finitely Additive Probability Measure’, Ann. Math. Stat. 38, 780-786.
Luce, R. D. and Tukey, J. W.: 1964, ‘Simultaneous Conjoint Measurement: A New Type of Fundamental Measurement’, Journal of Mathematical Psychology 1, 1-27.
Maeda, S.: 1960, ‘On Relatively Semi-orthocomplemented Lattices’, J. Sci. Hiroshima Univ. 24, 155-161.
Maeda, S.: 1963, ‘A Lattice Theoretic Treatment of Stochastic Independence’, J. Sci. Hiroshima Univ. 27, 1-5.
Marczewski, E.: 1958, ‘A General Scheme of the Notions of Independence in Mathematics’, Bull. de l’Acad. Polonaise des Sci. 6, 731-736.
Onicescu, O.: 1966, ‘Énergie Informationnelle’, C.R. Acad. Sci. Paris 263A, 841-863.
Raiffa, H., Pratt, J. W., and Schlaifer, R.: 1964, ‘The Foundations of Decision Under Uncertainty: An Elementary Exposition’, J. Amer. Stat. Assoc. 59, 353-375.
Rényi, A.: 1961, ‘On Measures of Entropy and Information’, in Proc. of the Fourth Berkeley Symp. on Math. Stat. and Probability (ed. by J. Neyman), vol. I, University of California Press, Berkeley, Calif., pp. 547-561.
Savage, L. J.: 1954, The Foundations of Statistics, Wiley, New York.
Scott, D.: 1964a, ‘Linear Inequalities and Measures on Boolean Algebras’, unpublished paper.
Scott, D.: 1964b, ‘Measurement Structures and Linear Inequalities’, Journal of Mathematical Psychology 1, 233-247.
Skornyakov, L. A.: 1964, Complemented Modular Lattices and Regular Rings, Oliver and Boyd, London.
Suppes, P.: 1961, ‘Behavioristic Foundations of Utility’, Econometrica 29, 186-202.
Suppes, P. and Zinnes, J.: 1963, ‘Basic Measurement Theory’, in Handbook of Mathematical Psychology (ed. by R. D. Luce, R. R. Bush, and E. Galanter), vol. I, Wiley, New York, pp. 1-76.
Tverberg, H.: 1958, ‘A New Derivation of Information Functions’, Math. Scand. 6, 297-298.
Varma, S. and Nath, P.: 1967, ‘Information Theory - A Survey’, J. Math. Sci. 2, 75-109.
Von Neumann, J.: 1960, Continuous Geometry, Princeton University Press, Princeton, N.J.
Weiss, P.: 1968, ‘Subjektive Unsicherheit und Subjektive Information’, Kybernetik 5, 77-82.
References
The symbol ∘ denotes functional composition; that is, (I ∘ P)(A) = I(P(A)) for any A in the domain of P. In general, we shall use the standard notation f: M → N for a function f which maps the set M into the set N. Complicated situations will be represented by diagrams in the well-known way.
For simplicity we shall keep the same notation, even though we are now working with the algebra 𝔄 and not with P.
For typographical simplicity we use the same symbol that was used in Section 1.2 for a different ordering.
Re denotes the set of real numbers.
FQCP-structure - finite qualitative conditional probability structure (see Domotor, 1969).
𝔄 denotes the smallest Boolean algebra containing P.
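The composition notation I ∘ P explained in the notes above can be sketched in Python. The probability assignment P and the information function I below are illustrative stand-ins, not the chapter's formal structures:

```python
import math

def compose(f, g):
    """Functional composition: (f ∘ g)(x) = f(g(x))."""
    return lambda x: f(g(x))

# Hypothetical probability assignment P on a toy two-event space.
def P(event):
    return {"A": 0.5, "B": 0.25}[event]

# Shannon's information measure I(p) = -log2(p), used here only
# to make the composition concrete.
def I(p):
    return -math.log2(p)

info = compose(I, P)          # info(A) = (I ∘ P)(A) = I(P(A))
print(info("A"))              # I(0.5)  = 1.0
print(info("B"))              # I(0.25) = 2.0
```

Defining `compose` once lets the information assigned to an event be written as a single map, which is exactly the convenience the I ∘ P notation provides in the text.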
Copyright information
© 1970 D. Reidel Publishing Company, Dordrecht-Holland
Cite this chapter
Domotor, Z. (1970). Qualitative Information and Entropy Structures. In: Hintikka, J., Suppes, P. (eds) Information and Inference. Synthese Library, vol 28. Springer, Dordrecht. https://doi.org/10.1007/978-94-010-3296-4_6
Print ISBN: 978-94-010-3298-8
Online ISBN: 978-94-010-3296-4