Qualitative Information and Entropy Structures

  • Chapter
Information and Inference

Part of the book series: Synthese Library (SYLI, volume 28)

Abstract

Information theory deals with the mathematical properties of communication models, which are usually defined in terms of concepts such as channel, source, information, entropy, capacity, and code, which satisfy certain conditions and axioms.
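
The chapter develops these notions axiomatically and qualitatively rather than numerically. Purely as a numerical point of reference, the sketch below is illustrative and not taken from the chapter (the function name shannon_entropy and the two example schemes are assumptions of this sketch): it computes the Shannon entropy H = -Σ pᵢ log₂ pᵢ of a finite probability scheme and compares two schemes by it.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a finite probability scheme, in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two illustrative finite schemes over four outcomes.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # 2.0 bits: the uniform scheme is maximally uncertain
print(shannon_entropy(skewed))   # about 1.357 bits: qualitatively less entropic than uniform
```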

Bibliography

  • Aczél, J.: 1966, Lectures on Functional Equations and Their Applications, Academic Press, New York.

  • Aczél, J., Pickert, G., and Radó, F.: 1960, ‘Nomogramme, Gewebe und Quasigruppen’, Mathematica (Cluj) 2, 5-24.

  • Adler, R. L., Konheim, A. G., and McAndrew, M. H.: 1965, ‘Topological Entropy’, Trans. Amer. Math. Soc. 114, 309-319.

  • Bar-Hillel, Y. and Carnap, R.: 1953, ‘Semantic Information’, British Journal for the Philosophy of Science 4, 147-157.

  • Belis, M. and Guiasu, S.: 1967, ‘A Quantitative-Qualitative Measure of Information in Cybernetic Systems’, IEEE Transactions on Information Theory IT-14, 593-594.

  • De Fériet, J. K. and Forte, B.: 1967, ‘Information et Probabilité’, C.R. Acad. Sci. Paris 265A, 110-114.

  • De Finetti, B.: 1937, ‘La Prévision: Ses Lois Logiques, Ses Sources Subjectives’, Ann. Inst. Henri Poincaré 7, 1-68. English translation in Studies in Subjective Probability (ed. by H. E. Kyburg, Jr. and H. E. Smokler), Wiley, New York, 1964, pp. 93-158.

  • Domotor, Z.: 1969, ‘Probabilistic Relational Structures and Their Applications’, Technical Report No. 144, May 14, Institute for Mathematical Studies in the Social Sciences, Stanford University, Stanford, Calif.

  • Erdös, P.: 1946, ‘On the Distribution Function of Additive Functions’, Ann. Math. 47, 1-20.

  • Fadeev, D. K.: 1956, ‘On the Concept of Entropy of a Finite Probability Scheme’, Uspekhi Mat. Nauk 11, 227-231 (in Russian).

  • Forte, B. and Pintacuda, N.: 1968a, ‘Sull’Informazione Associata alle Esperienze Incomplete’, Annali di Matematica Pura ed Applicata 80, 215-234.

  • Forte, B. and Pintacuda, N.: 1968b, ‘Information Fournie par une Expérience’, C.R. Acad. Sci. Paris 266A, 242-245.

  • Hintikka, K. J.: 1968, ‘The Varieties of Information and Scientific Explanation’, in Logic, Methodology and Philosophy of Science III (ed. by B. van Rootselaar and J. F. Staal), North-Holland Publishing Company, Amsterdam, pp. 151-171.

  • Hintikka, K. J. and Pietarinen, J.: 1966, ‘Semantic Information and Inductive Logic’, in Aspects of Inductive Logic (ed. by K. J. Hintikka and P. Suppes), North-Holland Publishing Company, Amsterdam, pp. 81-97.

  • Ingarden, R. S.: 1963, ‘A Simplified Axiomatic Definition of Information’, Bull. Acad. Polon. Sci. 11, 209-212.

  • Ingarden, R. S.: 1965, ‘Simplified Axioms for Information Without Probability’, Roczniki Polskiego Towarzystwa Matematycznego 9, 273-282.

  • Ingarden, R. S. and Urbanik, K.: 1962, ‘Information Without Probability’, Colloquium Mathematicum 9, 131-150.

  • Kendall, D. G.: 1964, ‘Functional Equations in Information Theory’, Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete 2, 225-229.

  • Khinchin, A. I.: 1957, Der Begriff der Entropie in der Wahrscheinlichkeitsrechnung (Arbeiten zur Informationstheorie I: Mathematische Forschungsberichte), VEB Deutscher Verlag der Wissenschaften, Berlin.

  • Kolmogorov, A. N.: 1965, ‘Three Approaches to the Definition of the Concept “Quantity of Information”’, Problemy Peredaci Informacii 1, 3-11.

  • Kolmogorov, A. N.: 1967, ‘Logical Basis for Information Theory and Probability Theory’, IEEE Transactions on Information Theory IT-14, 662-664.

  • Kraft, C. H., Pratt, J. W., and Seidenberg, A.: 1959, ‘Intuitive Probability on Finite Sets’, Ann. Math. Stat. 30, 408-419.

  • Lee, P. M.: 1964, ‘On the Axioms of Information Theory’, Ann. Math. Stat. 35, 415-418.

  • Luce, R. D.: 1967, ‘Sufficient Conditions for the Existence of a Finitely Additive Probability Measure’, Ann. Math. Stat. 38, 780-786.

  • Luce, R. D. and Tukey, J. W.: 1964, ‘Simultaneous Conjoint Measurement: A New Type of Fundamental Measurement’, Journal of Mathematical Psychology 1, 1-27.

  • Maeda, S.: 1960, ‘On Relatively Semi-orthocomplemented Lattices’, J. Sci. Hiroshima Univ. 24, 155-161.

  • Maeda, S.: 1963, ‘A Lattice Theoretic Treatment of Stochastic Independence’, J. Sci. Hiroshima Univ. 27, 1-5.

  • Marczewski, E.: 1958, ‘A General Scheme of the Notions of Independence in Mathematics’, Bull. de l’Acad. Polonaise des Sci. 6, 731-736.

  • Onicescu, O.: 1966, ‘Énergie Informationnelle’, C.R. Acad. Sci. Paris 263A, 841-863.

  • Raiffa, H., Pratt, J. W., and Schlaifer, R.: 1964, ‘The Foundations of Decision Under Uncertainty: An Elementary Exposition’, J. Amer. Stat. Assoc. 59, 353-375.

  • Rényi, A.: 1961, ‘On Measures of Entropy and Information’, in Proc. of the Fourth Berkeley Symp. on Math. Stat. and Probability (ed. by J. Neyman), vol. I, University of California Press, Berkeley, Calif., pp. 547-561.

  • Savage, L. J.: 1954, The Foundations of Statistics, Wiley, New York.

  • Scott, D.: 1964a, ‘Linear Inequalities and Measures on Boolean Algebras’, unpublished paper.

  • Scott, D.: 1964b, ‘Measurement Structures and Linear Inequalities’, Journal of Mathematical Psychology 1, 233-247.

  • Skornyakov, L. A.: 1964, Complemented Modular Lattices and Regular Rings, Oliver and Boyd, London.

  • Suppes, P.: 1961, ‘Behavioristic Foundations of Utility’, Econometrica 29, 186-202.

  • Suppes, P. and Zinnes, J.: 1963, ‘Basic Measurement Theory’, in Handbook of Mathematical Psychology (ed. by R. D. Luce, R. R. Bush, and E. Galanter), vol. I, Wiley, New York, pp. 1-76.

  • Tverberg, H.: 1958, ‘A New Derivation of the Information Function’, Math. Scand. 6, 297-298.

  • Varma, S. and Nath, P.: 1967, ‘Information Theory - A Survey’, J. Math. Sci. 2, 75-109.

  • Von Neumann, J.: 1960, Continuous Geometry, Princeton University Press, Princeton, N.J.

  • Weiss, P.: 1968, ‘Subjektive Unsicherheit und Subjektive Information’, Kybernetik 5, 77-82.

References

  1. The symbol ∘ denotes functional composition; that is, (I ∘ P)(A) = I(P(A)) for any A in the domain of P. In general, we shall use the standard notation f: M → N, or an arrow from M to N labelled f, for a function f which maps the set M into the set N. Complicated situations will be represented by diagrams in the usual way. (A small numerical sketch of this composition is given after these notes.)

  2. For simplicity we shall keep the same notation, even though we are now working with the algebra 𝔄 and not with P.

  3. For typographical simplicity we use the same symbol that was used in Section 1.2 for a different ordering.

  4. Re denotes the set of real numbers.

  5. FQCP-structure = finite qualitative conditional probability structure (see Domotor, 1969).

  6. 𝔄(P) denotes the smallest Boolean algebra containing P.
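
The composition described in note 1 can be made concrete with a toy example. The sketch below is illustrative only and not from the chapter: the events A1-A3, their probabilities, and the helper names P, I, and I_after_P are assumptions of this sketch. P assigns probabilities to a few events, I is the usual -log₂ information function on probability values, and I ∘ P evaluates I(P(A)).

```python
import math

# Hypothetical probability assignment P on three events (illustrative only).
P = {"A1": 0.5, "A2": 0.25, "A3": 0.25}

def I(p):
    """Information, in bits, associated with a probability value p."""
    return -math.log2(p)

def I_after_P(event):
    """The composition (I o P)(A) = I(P(A)): apply P first, then I to the result."""
    return I(P[event])

print(I_after_P("A2"))  # I(P(A2)) = -log2(0.25) = 2.0 bits
```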

Copyright information

© 1970 D. Reidel Publishing Company, Dordrecht-Holland

About this chapter

Cite this chapter

Domotor, Z. (1970). Qualitative Information and Entropy Structures. In: Hintikka, J., Suppes, P. (eds) Information and Inference. Synthese Library, vol 28. Springer, Dordrecht. https://doi.org/10.1007/978-94-010-3296-4_6

  • DOI: https://doi.org/10.1007/978-94-010-3296-4_6

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-3298-8

  • Online ISBN: 978-94-010-3296-4
