Product

Composing Fisher Kernels from Deep Neural Models: A Practitioner's Approach: SpringerBriefs in Computer Science

Authors: Tayyaba Azim, Sarah Ahmed
Language: English | Paperback – 5 Sep 2018
This book shows machine learning enthusiasts and practitioners how to get the best of both worlds by deriving Fisher kernels from deep learning models. In addition, it shares insight on how to store and retrieve high-dimensional Fisher vectors using feature selection and compression techniques. Feature selection and feature compression are two of the most popular off-the-shelf methods for reducing the memory footprint of high-dimensional data, making it suitable for large-scale visual retrieval and classification. Kernel methods long remained the de facto standard for solving large-scale object classification tasks with low-level features, until the revival of deep models in 2006, and made a comeback with improved Fisher vectors in 2010. Their supremacy, however, has been continually challenged by successive generations of deep models, now considered the state of the art for many machine learning and computer vision tasks. Although the two research paradigms differ significantly, the excellent performance of Fisher kernels on the ImageNet large-scale object classification dataset has caught the attention of numerous kernel practitioners, and many have drawn parallels between the two frameworks to improve empirical performance on benchmark classification tasks. Exploring concrete examples on different datasets, the book compares the computational and statistical aspects of different dimensionality reduction approaches and identifies metrics that show which approach is superior for Fisher vector encodings. It also provides references to some of the most useful resources that can give practitioners and machine learning enthusiasts a quick start on learning and implementing a variety of deep learning models and kernel functions.
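As a rough, hypothetical illustration of the recipe the book builds on (not code from the book): for a probabilistic model p(x | θ), the Fisher vector of a sample is the gradient of the log-likelihood with respect to the parameters, and a practical Fisher kernel is the inner product of two such score vectors. The sketch below stands in a toy diagonal-Gaussian model for the deep models treated in the book; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def fisher_score(x, mu, sigma):
    """Score vector: gradient of log N(x; mu, diag(sigma^2)) w.r.t. mu and sigma."""
    d_mu = (x - mu) / sigma**2
    d_sigma = ((x - mu)**2 - sigma**2) / sigma**3
    return np.concatenate([d_mu, d_sigma])   # the Fisher vector of x

def fisher_kernel(x, y, mu, sigma):
    """Practical Fisher kernel: dot product of score vectors
    (identity approximation of the inverse Fisher information)."""
    return fisher_score(x, mu, sigma) @ fisher_score(y, mu, sigma)

rng = np.random.default_rng(0)
mu, sigma = np.zeros(4), np.ones(4)           # toy model parameters (made up)
x, y = rng.normal(size=4), rng.normal(size=4)
print(fisher_kernel(x, y, mu, sigma))
```

Swapping the toy Gaussian for the likelihood of a trained deep model (and, ideally, normalising by the Fisher information) gives the kind of encodings the book derives and compresses.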

From the series SpringerBriefs in Computer Science

Price: 286.18 lei

Old price: 357.72 lei
-20% New

Express points: 429

Estimated price in other currencies:
54.76€ · 57.83$ · 45.57£

Print-on-demand title

Economy delivery: 07–13 January 2025

Order line: 021 569.72.76

Specifications

ISBN-13: 9783319985237
ISBN-10: 331998523X
Pages: 70
Illustrations: XIII, 59 p. 6 illus., 5 illus. in color.
Dimensions: 155 x 235 mm
Edition: 1st ed. 2018
Publisher: Springer International Publishing
Collection: Springer
Series: SpringerBriefs in Computer Science

Place of publication: Cham, Switzerland

Table of Contents

Chapter 1. Kernel Based Learning: A Pragmatic Approach in the Face of New Challenges.
Chapter 2. Fundamentals of Fisher Kernels.
Chapter 3. Training Deep Models and Deriving Fisher Kernels: A Step Wise Approach.
Chapter 4. Large Scale Image Retrieval and Its Challenges.
Chapter 5. Open Source Knowledge Base for Machine Learning Practitioners.

Reviews

 

Biographical note

Dr. Tayyaba Azim is an Assistant Professor at the Center for Information Technology, Institute of Management Sciences, Peshawar, Pakistan. 

Sarah Ahmed is a research student enrolled in the Master of Computer Science program at the Institute of Management Sciences, Peshawar, Pakistan.
She received her Bachelor's degree in Computer Science from Edwardes College, Peshawar, Pakistan. Her areas of interest include Machine Learning, Computer Vision, and Data Science. Her current research centers on feature compression and selection approaches for Fisher vectors derived from deep neural models. Her research paper "Compression techniques for Deep Fisher Vectors" was awarded the best paper in the area of applications at the ICPRAM 2017 conference.

Features

- Presents a step-by-step approach to deriving a kernel from any probabilistic model belonging to the family of deep networks
- Demonstrates the use of feature compression and selection techniques for reducing the dimensionality of Fisher vectors (see the sketch after this list)
- Reviews efficient algorithms for large-scale image retrieval and classification systems, including concrete examples on different datasets
- Provides programming resources to help machine learning practitioners develop scalable solutions around novel ideas
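As a minimal sketch of the compression theme, assuming NumPy and scikit-learn are available (this is not the book's code, and the array sizes are made up): PCA is one standard off-the-shelf way to shrink high-dimensional Fisher vectors before large-scale retrieval or classification, alongside the selection approaches the book compares.

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy stand-in for high-dimensional Fisher vectors (e.g., one per image).
fisher_vectors = np.random.default_rng(1).normal(size=(500, 4096))

# Compress the 4096-D encodings down to 128-D before indexing/classification.
compressed = PCA(n_components=128).fit_transform(fisher_vectors)
print(compressed.shape)  # (500, 128)
```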