
Learn PySpark: Build Python-based Machine Learning and Deep Learning Models

Author: Pramod Singh
Language: English | Paperback – 7 Sep 2019
Leverage machine and deep learning models to build applications on real-time data using PySpark. This book is perfect for those who want to learn to use this language to perform exploratory data analysis and solve an array of business challenges.

You'll start by reviewing PySpark fundamentals, such as Spark's core architecture, and see how to use PySpark for big data processing tasks such as data ingestion, cleaning, and transformation. This is followed by building workflows for analyzing streaming data using PySpark and a comparison of various streaming platforms.
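As a taste of what that data-processing coverage looks like in practice, here is a minimal PySpark sketch (not taken from the book) that ingests a CSV file, cleans it, and applies a simple transformation; the file path and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data_processing_sketch").getOrCreate()

# Ingest: read a CSV file with a header and an inferred schema (hypothetical path)
df = spark.read.csv("data/customers.csv", header=True, inferSchema=True)

# Clean: drop rows with missing values and remove duplicate records
df_clean = df.dropna().dropDuplicates()

# Transform: derive a revenue column and aggregate it per customer (hypothetical columns)
df_out = (
    df_clean
    .withColumn("revenue", F.col("price") * F.col("quantity"))
    .groupBy("customer_id")
    .agg(F.sum("revenue").alias("total_revenue"))
)

df_out.show(5)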

You'll then see how to schedule different Spark jobs using Airflow with PySpark and examine how to tune machine and deep learning models for real-time predictions. The book concludes with a discussion of GraphFrames and performing network analysis using graph algorithms in PySpark. All the code presented in the book is available in Python scripts on GitHub.
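To illustrate the kind of Airflow scheduling mentioned above, here is a minimal sketch (not from the book) of an Airflow 2.x DAG that submits a PySpark script via spark-submit; the DAG id, schedule, and script path are hypothetical.

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_pyspark_job",        # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit a PySpark script to Spark via spark-submit (hypothetical path)
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit --master local[*] /opt/jobs/process_data.py",
    )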

What You'll Learn
  • Develop pipelines for streaming data processing using PySpark 
  • Build machine learning and deep learning models using PySpark's latest offerings
  • Perform graph analytics using PySpark (a brief sketch follows this list)
  • Create sequence embeddings from text data
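As a hint of what that graph analytics work looks like, here is a minimal GraphFrames sketch (not from the book) that builds a small graph from vertex and edge DataFrames and runs PageRank; it assumes the separate graphframes package is available on the Spark classpath, and the sample data is made up.

from pyspark.sql import SparkSession
from graphframes import GraphFrame

spark = SparkSession.builder.appName("graph_sketch").getOrCreate()

# Vertices need an "id" column; edges need "src" and "dst" columns (sample data)
vertices = spark.createDataFrame(
    [("a", "Alice"), ("b", "Bob"), ("c", "Carol")], ["id", "name"]
)
edges = spark.createDataFrame(
    [("a", "b", "follows"), ("b", "c", "follows"), ("c", "a", "follows")],
    ["src", "dst", "relationship"],
)

g = GraphFrame(vertices, edges)

# Run PageRank as one example of a graph algorithm
results = g.pageRank(resetProbability=0.15, maxIter=10)
results.vertices.select("id", "pagerank").show()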
Who This Book Is For

Data scientists and machine learning and deep learning engineers who want to learn and use PySpark for real-time analysis of streaming data.

Price: 249.09 lei

Old price: 311.37 lei
-20% New

Express points: 374

Estimated price in other currencies:
47.67€ · 49.52$ · 39.60£

Book available

Economy delivery: 13-27 January 25

Phone orders: 021 569.72.76

Specifications

ISBN-13: 9781484249604
ISBN-10: 1484249607
Pages: 295
Illustrations: XVIII, 210 p. 187 illus., 32 illus. in color.
Dimensions: 155 x 235 x 10 mm
Weight: 0.33 kg
Edition: 1st ed.
Publisher: Apress
Series: Apress
Place of publication: Berkeley, CA, United States

Contents

Chapter 1: Introduction to PySpark
Chapter 2: Data Processing
Chapter 3: Spark Structured Streaming
Chapter 4: Airflow
Chapter 5: Machine Learning Library (MLlib)
Chapter 6: Supervised Machine Learning
Chapter 7: Unsupervised Machine Learning
Chapter 8: Deep Learning Using PySpark


About the Author

Pramod Singh is currently a Manager (Data Science) at Publicis Sapient, working as the data science lead on a project with Mercedes-Benz. He has spent the last nine years working on multiple data projects at SapientRazorfish, Infosys, and Tally, and has applied traditional and advanced machine learning and deep learning techniques in multiple projects using R, Python, Spark, and TensorFlow. Pramod is also a regular speaker at major conferences in India and abroad and is currently authoring several books on deep learning and AI techniques. He regularly conducts data science meetups at SapientRazorfish and presents webinars on machine learning and artificial intelligence. He lives in Bangalore with his wife and two-year-old son. In his spare time, he enjoys coding, reading, and watching football.


Back Cover Text

Leverage machine and deep learning models to build applications on real-time data using PySpark. This book is perfect for those who want to learn to use this language to perform exploratory data analysis and solve an array of business challenges.

You'll start by reviewing PySpark fundamentals, such as Spark's core architecture, and see how to use PySpark for big data processing tasks such as data ingestion, cleaning, and transformation. This is followed by building workflows for analyzing streaming data using PySpark and a comparison of various streaming platforms.

You'll then see how to schedule different Spark jobs using Airflow with PySpark and examine how to tune machine and deep learning models for real-time predictions. The book concludes with a discussion of GraphFrames and performing network analysis using graph algorithms in PySpark. All the code presented in the book is available in Python scripts on GitHub.

Features

  • Covers the entire range of PySpark's offerings, from streaming to graph analytics
  • Builds standardized workflows for pre-processing and for machine learning and deep learning models on big data sets
  • Discusses how to schedule different Spark jobs using Airflow