The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics)

FREE US shipping

Rent: 35 days, from $11.45 USD (FREE shipping on rental returns)

Buy: from $25.50

Book details

ISBN-13: 9780387848570
ISBN-10: 0387848576
Edition: 2nd
Authors: Trevor Hastie, Robert Tibshirani, Jerome Friedman
Publication date: 2009
Publisher: Springer
Format: Hardcover, 767 pages

Summary

The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics) (ISBN-13: 9780387848570, ISBN-10: 0387848576), written by Trevor Hastie, Robert Tibshirani, and Jerome Friedman, was published by Springer in 2009. With an overall rating of 3.5 stars, it is a notable title among AI & Machine Learning books (Data Mining, Databases & Big Data, Bioinformatics, Biological Sciences, Computer Science). You can easily purchase or rent The Elements of Statistical Learning, Second Edition (Hardcover, Used) from BooksRun, along with many other new and used AI & Machine Learning books and textbooks. And if you're looking to sell your copy, our current buyback offer is $25.80.

Description

This book describes the important ideas in a variety of fields such as medicine, biology, finance, and marketing in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of colour graphics. It is a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting (the first comprehensive treatment of this topic in any book).

This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorisation, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.
