About the Book
Machine Learning Foundations, published by Pearson under the Addison-Wesley imprint, is a comprehensive textbook series that provides students, educators, and practitioners with a deep yet accessible foundation in modern machine learning. The series blends theoretical rigor with practical implementations and real-world insight, offering numerous worked examples, Python implementations, end-of-chapter exercises, and visual illustrations throughout.
The series is organized into three volumes:
- Volume I: Supervised Learning — Introduces essential concepts such as generalization, bias–variance tradeoff, model capacity, regularization, and the machine learning workflow. It covers core supervised learning methods including linear and logistic regression, k-nearest neighbors, naive Bayes, decision trees, ensemble methods, and support vector machines, along with their mathematical foundations, practical usage, and implementation using popular Python libraries such as scikit-learn, xgboost, and nltk.
- Volume II: Unsupervised and Deep Learning — Focuses on learning from unlabeled data, including clustering, dimensionality reduction, density estimation, anomaly detection, and semi-supervised learning. It also presents a rigorous and practical treatment of deep learning, covering architectures such as MLPs, CNNs, RNNs, autoencoders, transformers, and GNNs, with applications in computer vision, natural language processing, time series forecasting, and recommender systems.
- Volume III: Advanced Topics — Explores cutting-edge areas including deep reinforcement learning, generative AI, and large language models. The volume also dives deeper into machine learning theory through topics such as statistical learning theory, generalization bounds, and limitations of existing algorithms. The final chapters address practical concerns including model deployment, concept drift, MLOps, interpretability, fairness, and adversarial robustness, and offer a forward-looking perspective on the future of the field.
Purchase
The book is now available for pre-order on Amazon and will be officially released in February 2026.
Buy on Amazon
Sample Chapter
Curious to explore the book before committing? Download the full chapter on Logistic Regression — a cornerstone of modern machine learning. This chapter covers the theoretical foundations, geometric interpretation, cost functions, optimization techniques, and real-world applications of logistic regression, with plenty of diagrams, code snippets, and insights to bridge theory and practice.
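As a taste of the optimization material the chapter covers, here is a minimal NumPy sketch of logistic regression trained by batch gradient descent on the cross-entropy loss. The toy data, learning rate, and helper names below are illustrative only and are not taken from the book:

```python
import numpy as np

def sigmoid(z):
    """Logistic function mapping real scores to probabilities in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iters=1000):
    """Batch gradient descent on the binary cross-entropy loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(n_iters):
        p = sigmoid(X @ w + b)       # predicted probabilities
        grad_w = X.T @ (p - y) / n   # gradient of the loss w.r.t. w
        grad_b = np.mean(p - y)      # gradient of the loss w.r.t. b
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy 1-D, linearly separable data (illustrative)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = fit_logistic(X, y)
preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
```

The same model is typically fit in practice with scikit-learn's `LogisticRegression`, which uses more sophisticated solvers than plain gradient descent; the hand-rolled version above just makes the cost-function and update-rule mechanics explicit.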
Code Examples
You can find all notebooks, code examples, and supplementary materials for the book on GitHub:
📂 View GitHub Repository
Solutions
Solutions for selected exercises from Volume I: Supervised Learning are freely available here:
📄 Download Volume I Selected Solutions (PDF)
Lecture Slides
Slides are available as downloadable PDFs:
- Chapter 1: Introduction to Machine Learning
- Chapter 2: Supervised Machine Learning
- Chapter 3: Introduction to Scikit-Learn
- Chapter 4: Linear Regression
- Chapter 5: Logistic Regression
- Chapter 6: K-Nearest Neighbors
- Chapter 7: Naive Bayes
- Chapter 8: Decision Trees
- Chapter 9: Ensemble Methods
- Chapter 10: Gradient Boosting Libraries
- Chapter 11: Support Vector Machines
- Appendix A: Linear Algebra
- Appendix C: Probability Theory
- Appendix E: Optimization
Citation
Please use the following BibTeX entry when citing the book:
@book{yehoshua2026ml,
  title={Machine Learning Foundations, Volume 1: Supervised Learning},
  author={Roi Yehoshua},
  year={2026},
  publisher={Pearson},
  isbn={978-0135337868}
}