
Log-Linear Models, Extensions, and Applications (Neural Information Processing series)

ISBN-13: 9780262039505
ISBN-10: 0262039508
Edition: Illustrated
Authors: Li Deng, Aleksandr Aravkin, Anna Choromanska, Georg Heigold, Tony Jebara
Publication date: 2018
Publisher: The MIT Press
Format: Hardcover 214 pages

Summary

Log-Linear Models, Extensions, and Applications (Neural Information Processing series) (ISBN-13: 9780262039505; ISBN-10: 0262039508), written by Li Deng, Aleksandr Aravkin, Anna Choromanska, Georg Heigold, and Tony Jebara, was published by The MIT Press in 2018. With an overall rating of 4.3 stars, it is a notable title among AI & Machine Learning (Computer Science) books. You can purchase or rent the hardcover of Log-Linear Models, Extensions, and Applications from BooksRun, along with many other new and used AI & Machine Learning books and textbooks. And if you're looking to sell your copy, our current buyback offer is $0.30.

Description

Advances in training models with log-linear structures, with topics including variable selection, the geometry of neural nets, and applications.

Log-linear models play a key role in modern big data and machine learning applications. From simple binary classification models through partition functions, conditional random fields, and neural nets, log-linear structure is closely tied to performance in certain applications and shapes the fitting techniques used to train models. This volume presents recent advances in training models with log-linear structures, covering the underlying geometry, optimization techniques, and multiple applications. The first chapter shows readers the inner workings of machine learning, offering insight into the geometry of log-linear and neural net models. The remaining chapters range from introductory material to optimization techniques to involved use cases. The book, which grew out of a NIPS workshop, is suitable for graduate students doing research in machine learning, particularly deep learning, variable selection, and applications to speech recognition. The contributors come from both academia and industry, allowing readers to view the field from both perspectives.
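As a flavor of the log-linear structure the description refers to, here is a minimal NumPy sketch (not taken from the book) of a conditional log-linear classifier, where p(y | x) is proportional to exp(w_y · x) and the denominator is the partition function; the weights and inputs below are made-up illustrative values:

```python
import numpy as np

def log_linear_probs(W, x):
    """Class probabilities of a log-linear model:
    p(y | x) = exp(w_y . x) / sum_y' exp(w_y' . x)."""
    scores = W @ x            # one linear score per class
    scores = scores - scores.max()  # shift for numerical stability
    exps = np.exp(scores)
    return exps / exps.sum()  # divide by the partition function

# Toy example: 3 classes, 2 features (weights are hypothetical)
W = np.array([[ 1.0, -0.5],
              [ 0.2,  0.8],
              [-1.0,  0.3]])
x = np.array([0.5, 1.5])
p = log_linear_probs(W, x)   # a valid distribution over the 3 classes
```

Training such a model typically means maximizing the log of these probabilities over labeled data, which is where the optimization techniques surveyed in the volume come in.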

Contributors
Aleksandr Aravkin, Avishy Carmi, Guillermo A. Cecchi, Anna Choromanska, Li Deng, Xinwei Deng, Jean Honorio, Tony Jebara, Huijing Jiang, Dimitri Kanevsky, Brian Kingsbury, Fabrice Lambert, Aurélie C. Lozano, Daniel Moskovich, Yuriy S. Polyakov, Bhuvana Ramabhadran, Irina Rish, Dimitris Samaras, Tara N. Sainath, Hagen Soltau, Serge F. Timashev, Ewout van den Berg
