Statistical Learning with Sparsity (Chapman & Hall/CRC Monographs on Statistics and Applied Probability)

ISBN-13: 9781498712163
ISBN-10: 1498712169
Edition: 1
Authors: Trevor Hastie, Robert Tibshirani, Martin Wainwright
Publication date: 2015
Publisher: Routledge
Format: Hardcover 367 pages
FREE US shipping
Rent: 35 days, from $26.97 USD (free shipping on rental returns)
Buy: from $30.23


Summary

Statistical Learning with Sparsity (Chapman & Hall/CRC Monographs on Statistics and Applied Probability) (ISBN-13: 9781498712163, ISBN-10: 1498712169), by Trevor Hastie, Robert Tibshirani, and Martin Wainwright, was published by Routledge in 2015. With an overall rating of 4.0 stars, it is a notable title among Statistics and Mathematics books. You can purchase or rent a used hardcover copy of Statistical Learning with Sparsity from BooksRun, along with many other new and used statistics books and textbooks. And if you're looking to sell your copy, our current buyback offer is $22.64.

Description

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.
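To make the title method concrete: the lasso referred to here is the standard criterion that adds an ℓ1 penalty to the least-squares objective (this formula is the textbook definition, not a quote from this listing):

```latex
\underset{\beta_0,\,\beta}{\text{minimize}}\;
\frac{1}{2N}\sum_{i=1}^{N}\bigl(y_i - \beta_0 - x_i^{\top}\beta\bigr)^2
\;+\; \lambda \lVert \beta \rVert_1
```

The ℓ1 penalty shrinks coefficients and sets many of them exactly to zero, which is what produces a sparse, interpretable model.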

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.
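The coordinate descent algorithm mentioned above can be sketched in a few lines: cycle over the coefficients, and update each one via soft-thresholding of its partial residual correlation. This is an illustrative sketch (the synthetic data and the `lasso_cd` / `soft_threshold` names are assumptions for the example), not the book's own code:

```python
import numpy as np

def soft_threshold(z, gamma):
    # Soft-thresholding operator: closed-form solution of the 1-d lasso.
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso:
    minimize (1/(2n)) ||y - X b||^2 + lam * ||b||_1.
    Assumes the columns of X are standardized (mean 0, variance 1)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j's contribution removed.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j / n
            beta[j] = soft_threshold(rho, lam)
    return beta

# Demo on synthetic data (assumed setup): only 2 of 10 features matter.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
X = (X - X.mean(axis=0)) / X.std(axis=0)
beta_true = np.zeros(10)
beta_true[:2] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.standard_normal(50)
beta_hat = lasso_cd(X, y, lam=0.5)
```

With a moderate penalty, most of the estimated coefficients come out exactly zero, illustrating why sparse fits are easy to interpret.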

In this age of big data, the number of features measured on a person or object can be large, and may even exceed the number of observations. This book shows how the sparsity assumption makes such problems tractable, allowing useful and reproducible patterns to be extracted from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
