
This book provides a comprehensive foundation in the mathematical tools essential for modern data science and machine learning. It blends core subjects such as linear algebra, calculus, probability, statistics, optimization, and numerical methods with real-world applications. Readers explore matrix operations, eigenvalues, and dimensionality reduction techniques like PCA and t-SNE. Optimization is covered through gradient-based methods and regularization strategies. Probability theory, Bayes' theorem, and statistical inference form the basis for modeling uncertainty. Information theory concepts such as entropy, cross-entropy, and KL divergence are applied to learning and feature selection. Efficient computational methods are introduced using Python/NumPy implementations. Advanced topics include graph theory for network analysis and stochastic models such as Markov chains and ARIMA for time series forecasting. The book bridges theory and practice, offering step-by-step problem-solving, coding exercises, and a deep understanding of the mathematical backbone driving AI and data science.
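To give a flavour of the kind of Python/NumPy material the description refers to, here is a minimal PCA sketch (not an excerpt from the book; the data and the choice of two components are purely illustrative): centre the data, eigendecompose the covariance matrix, and project onto the leading eigenvectors.

```python
import numpy as np

# Toy data: 100 samples, 5 features (hypothetical values for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Centre the data, then eigendecompose the covariance matrix
X_centred = X - X.mean(axis=0)
cov = np.cov(X_centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort components by descending eigenvalue and project onto the top two
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:2]]
X_reduced = X_centred @ components

print(X_reduced.shape)  # (100, 2)
```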