Information Theory and Machine Learning

The recent successes of machine learning, especially in systems based on deep neural networks, have encouraged further research activity and raised a new set of challenges in understanding and designing complex machine learning algorithms. New applications require learning algorithms to be distributed, produce transferable results, use computational resources efficiently, converge quickly in online settings, offer performance guarantees, satisfy fairness or privacy constraints, incorporate domain knowledge about model structure, and more. A new wave of developments in statistical learning theory and information theory has set out to address these challenges. This Special Issue, "Machine Learning and Information Theory", aims to collect recent results in this direction, reflecting a diverse spectrum of visions and efforts to extend conventional theories and develop analysis tools for these complex machine learning systems.

This book is included in DOAB.

Downloads

This work has been downloaded 37 times via unglue.it ebook links.
  1. 37 downloads: PDF (CC BY) at Unglue.it.

Keywords

  • analytical error probability
  • anomaly detection
  • artificial general intelligence
  • atypicality
  • closed-loop transcription
  • component overlap
  • deep neural network
  • distributed and federated learning
  • empirical risk
  • entropy
  • fairness
  • feature extraction
  • finite state machines
  • generalization error
  • HGR maximal correlation
  • hidden Markov models
  • History of engineering & technology
  • independence criterion
  • independent and non-identically distributed features
  • information criteria
  • information theoretic learning
  • information theory
  • information-theoretic bounds
  • interpretability
  • K-means clustering
  • Lempel–Ziv algorithm
  • linear discriminative representation
  • local information geometry
  • long short-term memory
  • lossless compression
  • merging mixture components
  • meta-learning
  • minimax game
  • minimum error entropy
  • model compression
  • model-based clustering
  • overfitting
  • pattern dictionary
  • population risk
  • rate distortion theory
  • rate reduction
  • recurrent neural networks
  • reservoir computers
  • separation criterion
  • spiking neural network
  • supervised classification
  • Technology, engineering, agriculture
  • Technology: general issues
  • time series prediction
  • vector quantization

Links

DOI: 10.3390/books978-3-0365-5308-5
