Upper Bounds on Fourier Entropy

  • Conference paper
Computing and Combinatorics (COCOON 2015)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9198)


Abstract

Given a function \(f : \{0,1\}^n \rightarrow \mathbb{R}\), its Fourier Entropy is defined to be \(-\sum_S \widehat{f}^2(S) \log \widehat{f}^2(S)\), where \(\widehat{f}\) denotes the Fourier transform of f. This quantity arises in a number of applications, especially in the study of Boolean functions. An outstanding open question is a conjecture of Friedgut and Kalai (1996), called the Fourier Entropy-Influence (FEI) Conjecture, asserting that the Fourier Entropy of any Boolean function f is bounded above, up to a constant factor, by the total influence (= average sensitivity) of f.
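
For concreteness, the conjecture is commonly written as follows (a standard formulation using the Fourier-analytic expression for total influence; the notation here is ours, not quoted from the paper):

\[ -\sum_{S \subseteq [n]} \widehat{f}^2(S) \log \widehat{f}^2(S) \;\le\; C \cdot \mathrm{Inf}[f] \;=\; C \sum_{S \subseteq [n]} |S| \, \widehat{f}^2(S) \]

for every Boolean function \(f : \{0,1\}^n \rightarrow \{-1,1\}\) and some universal constant \(C > 0\). By Parseval's identity \(\sum_S \widehat{f}^2(S) = 1\), so the left-hand side is the Shannon entropy of the probability distribution formed by the squared Fourier coefficients.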

In this paper we give several upper bounds on the Fourier Entropy of Boolean as well as real-valued functions. We first give upper bounds on the Fourier Entropy of Boolean functions in terms of several complexity measures that are known to be bigger than the influence. These complexity measures include, among others, the logarithm of the number of leaves and the average depth of a parity decision tree. We then show that for the class of Linear Threshold Functions (LTFs), the Fourier Entropy is at most \(O(\sqrt{n})\). It is known that the average sensitivity for the class of LTFs is bounded by \(\Theta(\sqrt{n})\). We also establish a bound of \(O_d(n^{1-\frac{1}{4d+6}})\) for general degree-\(d\) polynomial threshold functions. Our proof is based on a new upper bound on the derivative of noise sensitivity. Next we proceed to show that the FEI Conjecture holds for read-once formulas that use \(\mathsf{AND}\), \(\mathsf{OR}\), \(\mathsf{XOR}\), and \(\mathsf{NOT}\) gates. The last result is independent of a recent result due to O’Donnell and Tan [14] for read-once formulas with arbitrary gates of bounded fan-in, but our proof is completely elementary and very different from theirs. Finally, we give a general bound involving the first and second moments of sensitivities of a function (average sensitivity being the first moment), which holds for real-valued functions as well.
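
To make the quantities in the abstract concrete, here is a minimal brute-force sketch (not from the paper; the helper names and the MAJ_3 example are illustrative) that computes the Fourier coefficients of a small Boolean function and compares its Fourier entropy with its total influence:

```python
# A minimal sketch, not from the paper: brute-force Fourier analysis of a small
# Boolean function over {0,1}^n. The helper names (fourier_coefficients, maj3, ...)
# are illustrative assumptions, not identifiers from the paper.
from itertools import product
from math import log2


def fourier_coefficients(f, n):
    """Return {frozenset S: hat_f(S)} for f : {0,1}^n -> {+1, -1}."""
    points = list(product((0, 1), repeat=n))
    coeffs = {}
    for mask in range(1 << n):
        S = frozenset(i for i in range(n) if (mask >> i) & 1)
        # hat_f(S) = 2^{-n} * sum_x f(x) * (-1)^{sum_{i in S} x_i}
        coeffs[S] = sum(f(x) * (-1) ** sum(x[i] for i in S) for x in points) / len(points)
    return coeffs


def fourier_entropy(coeffs):
    # By Parseval, the squared coefficients of a +/-1-valued function sum to 1,
    # so they form a probability distribution whose Shannon entropy we return.
    return -sum(c * c * log2(c * c) for c in coeffs.values() if c != 0.0)


def total_influence(coeffs):
    # Total influence (= average sensitivity) equals sum_S |S| * hat_f(S)^2.
    return sum(len(S) * c * c for S, c in coeffs.items())


if __name__ == "__main__":
    maj3 = lambda x: 1 if sum(x) >= 2 else -1  # MAJ_3, a simple linear threshold function
    coeffs = fourier_coefficients(maj3, 3)
    print("Fourier entropy of MAJ_3:", fourier_entropy(coeffs))   # 2.0
    print("Total influence of MAJ_3:", total_influence(coeffs))   # 1.5
```

For MAJ_3 this prints a Fourier entropy of 2.0 against a total influence of 1.5, so the entropy-to-influence ratio in this small example is 4/3.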


References

  1. Bourgain, J., Kalai, G.: Influences of variables and threshold intervals under group symmetries. Geometric and Functional Analysis 7(3), 438–461 (1997)

  2. Chakraborty, S., Kulkarni, R., Lokam, S.V., Saurabh, N.: Upper bounds on Fourier entropy. Tech. rep., Electronic Colloquium on Computational Complexity (ECCC), TR13-052 (2013). http://eccc.hpi-web.de/report/2013/052

  3. Das, B., Pal, M., Visavaliya, V.: The entropy influence conjecture revisited. Tech. rep. (2011). arXiv:1110.4301

  4. Diakonikolas, I., Raghavendra, P., Servedio, R., Tan, L.: Average sensitivity and noise sensitivity of polynomial threshold functions. SIAM Journal on Computing 43(1), 231–253 (2014)

  5. Friedgut, E., Kalai, G.: Every monotone graph property has a sharp threshold. Proceedings of the American Mathematical Society 124(10), 2993–3002 (1996)

  6. Gopalan, P., Kalai, A.T., Klivans, A.R.: Agnostically learning decision trees. In: Proceedings of the 40th Annual ACM Symposium on Theory of Computing, STOC 2008, pp. 527–536 (2008)

  7. Harsha, P., Klivans, A., Meka, R.: Bounding the sensitivity of polynomial threshold functions. Theory of Computing 10(1), 1–26 (2014)

  8. Kahn, J., Kalai, G., Linial, N.: The influence of variables on Boolean functions. In: Proceedings of the 29th Annual IEEE Symposium on Foundations of Computer Science, pp. 68–80 (1988)

  9. Keller, N., Mossel, E., Schlank, T.: A note on the entropy/influence conjecture. Tech. rep. (2011). arXiv:1105.2651

  10. Klivans, A., Lee, H., Wan, A.: Mansour’s conjecture is true for random DNF formulas. In: Proceedings of the 23rd Conference on Learning Theory, pp. 368–380 (2010)

  11. Mansour, Y.: An \(O(n^{\log \log n})\) learning algorithm for DNF under the uniform distribution. Journal of Computer and System Sciences 50(3), 543–550 (1995)

  12. O’Donnell, R.: Computational applications of noise sensitivity. Ph.D. thesis, MIT (2003)

  13. O’Donnell, R.: Analysis of Boolean Functions. Cambridge University Press (2014)

  14. O’Donnell, R., Tan, L.Y.: A composition theorem for the Fourier entropy-influence conjecture. In: Proceedings of Automata, Languages and Programming - 40th International Colloquium (2013)

  15. O’Donnell, R., Wright, J., Zhou, Y.: The Fourier entropy-influence conjecture for certain classes of Boolean functions. In: Proceedings of Automata, Languages and Programming - 38th International Colloquium, pp. 330–341 (2011)

  16. Wan, A., Wright, J., Wu, C.: Decision trees, protocols and the entropy-influence conjecture. In: Innovations in Theoretical Computer Science, ITCS 2014, pp. 67–80 (2014)

  17. de Wolf, R.: A brief introduction to Fourier analysis on the Boolean cube. Theory of Computing, Graduate Surveys 1, 1–20 (2008)


Author information


Corresponding author

Correspondence to Nitin Saurabh.



Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Chakraborty, S., Kulkarni, R., Lokam, S.V., Saurabh, N. (2015). Upper Bounds on Fourier Entropy. In: Xu, D., Du, D., Du, D. (eds) Computing and Combinatorics. COCOON 2015. Lecture Notes in Computer Science, vol 9198. Springer, Cham. https://doi.org/10.1007/978-3-319-21398-9_60


  • DOI: https://doi.org/10.1007/978-3-319-21398-9_60

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-21397-2

  • Online ISBN: 978-3-319-21398-9

  • eBook Packages: Computer Science, Computer Science (R0)
