Deep Learning

Introduction to Data Science

Abstract

In this chapter, we introduce the main concepts of Neural Networks and Deep Learning. We review the mathematical foundations and implement the classical Multilayer Perceptron and Convolutional Neural Networks with Keras on top of TensorFlow. We show how to construct an image classification model, bridging theory with hands-on experience. Furthermore, to ensure effective model training and prevent overfitting, we introduce various techniques, such as regularization, dropout, and data augmentation.

Notes

  1.

    In the context of neural networks, a hidden layer is a layer of artificial neurons or nodes that sits between the input layer and the output layer. It is called “hidden” because it does not directly interact with the external data or the final output of the network. Instead, the hidden layer(s) are responsible for processing and transforming the input data before passing it to the output layer.

  2.

    A minibatch is a subset of the training dataset that is used during the training of machine learning models, including neural networks. Instead of using the entire training dataset in a single step (which is known as batch or full-batch training), training is typically performed in smaller, manageable batches.

  3.

    In deep learning, an epoch is one complete pass through the entire training dataset during the training phase of a neural network. During an epoch, the algorithm processes every training example once; with full-batch training this amounts to a single update of the model’s parameters (weights and biases), whereas with minibatch training it amounts to one update per minibatch.

  4.

    https://www.tensorflow.org/.

  5.

    https://keras.io/.

  6.

    https://pytorch.org/.

  7.

    https://colab.research.google.com/.
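The hidden layer described in note 1 can be sketched as a plain NumPy forward pass. This is an illustrative toy network, not code from the chapter; the layer sizes, random weights, and ReLU activation are arbitrary choices for the sketch:

```python
import numpy as np

def relu(z):
    """Elementwise rectified linear activation."""
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)

# Toy network: 4 inputs -> 3 hidden units -> 2 outputs.
W1 = rng.normal(size=(4, 3))   # input-to-hidden weights
b1 = np.zeros(3)
W2 = rng.normal(size=(3, 2))   # hidden-to-output weights
b2 = np.zeros(2)

x = rng.normal(size=(5, 4))    # batch of 5 input vectors

# The hidden activations h are internal: they are neither the raw
# input nor the network's final output, hence "hidden".
h = relu(x @ W1 + b1)
y = h @ W2 + b2                # output-layer pre-activations

print(h.shape, y.shape)        # (5, 3) (5, 2)
```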
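Minibatch iteration (note 2) amounts to shuffling the sample indices and slicing them into fixed-size chunks. A minimal NumPy sketch, with an illustrative helper name and toy data (not from the chapter):

```python
import numpy as np

def iterate_minibatches(X, y, batch_size, rng):
    """Shuffle the dataset once, then yield it in consecutive minibatches."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

rng = np.random.default_rng(0)
X = np.arange(20, dtype=float).reshape(10, 2)   # 10 samples, 2 features
y = np.arange(10)                               # toy labels

batches = list(iterate_minibatches(X, y, batch_size=4, rng=rng))
print(len(batches))     # 3 minibatches: sizes 4, 4, 2
```

The last minibatch is smaller when the dataset size is not a multiple of the batch size; every sample is still visited exactly once per pass.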
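The relation between epochs, minibatches, and parameter updates (notes 2 and 3) reduces to simple arithmetic; the dataset and batch sizes below are illustrative, not taken from the chapter:

```python
import math

n_samples, batch_size, n_epochs = 60000, 128, 10  # illustrative values

# One parameter update per minibatch, one pass over all samples per epoch.
updates_per_epoch = math.ceil(n_samples / batch_size)
total_updates = n_epochs * updates_per_epoch

print(updates_per_epoch, total_updates)  # 469 4690
```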


Acknowledgements

This chapter was co-written by Jordi Vitrià and Santi Seguí.

Author information

Corresponding author

Correspondence to Laura Igual.


Copyright information

© 2024 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Igual, L., Seguí, S. (2024). Deep Learning. In: Introduction to Data Science. Undergraduate Topics in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-031-48956-3_11

  • DOI: https://doi.org/10.1007/978-3-031-48956-3_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-48955-6

  • Online ISBN: 978-3-031-48956-3

  • eBook Packages: Computer Science, Computer Science (R0)
