Artificial Neural Networks and Machine Learning – ICANN 2021: 30th International Conference on Artificial Neural Networks, Bratislava, Slovakia, September 14–17, 2021, Proceedings, Part II: Lecture Notes in Computer Science, cartea 12892

Edited by Igor Farkaš, Paolo Masulli, Sebastian Otte, Stefan Wermter
English, Paperback – 11 Sep 2021
The five-volume set LNCS 12891, LNCS 12892, LNCS 12893, LNCS 12894 and LNCS 12895 constitutes the proceedings of the 30th International Conference on Artificial Neural Networks, ICANN 2021, held in Bratislava, Slovakia, in September 2021.* The 265 full papers presented in these proceedings were carefully reviewed and selected from 496 submissions and are organized in five volumes.


In this volume, the papers focus on topics such as computer vision and object detection, convolutional neural networks and kernel methods, deep learning and optimization, distributed and continual learning, explainable methods, few-shot learning and generative adversarial networks.
*The conference was held online in 2021 due to the COVID-19 pandemic.

All formats and editions

Format                                              Price        Delivery
Paperback (5)                                       645.09 lei   6-8 weeks
  Springer International Publishing – 11 Sep 2021   645.09 lei   6-8 weeks
  Springer International Publishing – 11 Sep 2021   647.17 lei   6-8 weeks
  Springer International Publishing – 11 Sep 2021   647.48 lei   6-8 weeks
  Springer International Publishing – 11 Sep 2021   647.81 lei   6-8 weeks
  Springer International Publishing – 12 Sep 2021   698.92 lei   6-8 weeks

From the series Lecture Notes in Computer Science

Price: 645.09 lei

Previous price: 806.36 lei
-20% New

Express points: 968

Estimated price in foreign currency:
123.47€  128.69$  102.78£

Book printed on demand

Economy delivery: 06-20 January 25

Orders by phone: 021 569.72.76

Specifications

ISBN-13: 9783030863395
ISBN-10: 3030863395
Pages: 651
Illustrations: XXIII, 651 p., 229 illus., 219 illus. in color
Dimensions: 155 x 235 mm
Weight: 0.93 kg
Edition: 1st ed. 2021
Publisher: Springer International Publishing
Collection: Springer
Series: Lecture Notes in Computer Science; Theoretical Computer Science and General Issues

Place of publication: Cham, Switzerland

Contents

Computer vision and object detection
- Selective Multi-Scale Learning for Object Detection
- DRENet: Giving Full Scope to Detection and Regression-based Estimation for Video Crowd Counting
- Sisfrutos Papaya: a Dataset for Detection and Classification of Diseases in Papaya
- Faster-LTN: a neuro-symbolic, end-to-end object detection architecture
- GC-MRNet: Gated Cascade Multi-stage Regression Network for Crowd Counting
- Latent Feature-Aware and Local Structure-Preserving Network for 3D Completion from a single depth view
- Facial Expression Recognition by Expression-Specific Representation Swapping
- Iterative Error Removal for Time-of-Flight Depth Imaging
- Blurred Image Recognition: A Joint Motion Deblurring and Classification Loss-Aware Approach
- Learning How to Zoom in: Weakly Supervised ROI-based-DAM for Fine-Grained Visual Classification

Convolutional neural networks and kernel methods
- (Input) Size Matters for CNN Classifiers
- Accelerating Depthwise Separable Convolutions with Vector Processor
- KCNet: Kernel-based Canonicalization Network for entities in Recruitment Domain
- Deep Unitary Convolutional Neural Networks

Deep learning and optimization I
- DPWTE: A Deep Learning Approach to Survival Analysis using a Parsimonious Mixture of Weibull Distributions
- First-order and second-order variants of the gradient descent in a unified framework
- Bayesian optimization for backpropagation in Monte-Carlo tree search
- Growing Neural Networks Achieve Flatter Minima
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks
- Curved SDE-Net Leads to Better Generalization for Uncertainty Estimates of DNNs
- EIS - Efficient and Trainable Activation Functions for Better Accuracy and Performance

Deep learning and optimization II
- Why Mixup Improves the Model Performance
- Mixup gamblers: Learning to abstain with auto-calibrated reward for mixed samples
- Non-Iterative Phase Retrieval With Cascaded Neural Networks
- Incorporating Discrete Wavelet Transformation Decomposition Convolution into Deep Network to Achieve Light Training
- MMF: A loss extension for feature learning in open set recognition
- On the selection of loss functions under known weak label models

Distributed and continual learning
- Bilevel Online Deep Learning in Non-stationary Environment
- A Blockchain Based Decentralized Gradient Aggregation Design for Federated Learning
- Continual Learning for Fake News Detection from Social Media
- Balanced Softmax Cross-Entropy for Incremental Learning
- Generalised Controller Design using Continual Learning
- DRILL: Dynamic Representations for Imbalanced Lifelong Learning
- Principal Gradient Direction and Confidence Reservoir Sampling for Continual Learning

Explainable methods
- Spontaneous Symmetry Breaking in Data Visualization
- Deep NLP Explainer: Using Prediction Slope To Explain NLP Models
- Empirically explaining SGD from a line search perspective
- Towards Ontologically Explainable Classifiers

Few-shot learning
- Leveraging the Feature Distribution in Transfer-based Few-Shot Learning
- One-Shot Meta-Learning for Radar-Based Gesture Sequences Recognition
- Few-Shot Learning With Random Erasing and Task-Relevant Feature Transforming
- Fostering Compositionality in Latent, Generative Encodings to Solve the Omniglot Challenge
- Better Few-shot Text Classification with Pre-trained Language Model

Generative adversarial networks
- Leveraging GANs via Non-local Features
- On Mode Collapse in Generative Adversarial Networks
- Image Inpainting Using Wasserstein Generative Adversarial Imputation Network
- COViT-GAN: Vision Transformer for COVID-19 Detection in CT Scan Images with Self-Attention GAN for Data Augmentation
- PhonicsGAN: Synthesizing Graphical Videos from Phonics Songs
- A Progressive Image Inpainting Algorithm with a Mask Auto-update Branch
- Hybrid Generative Models for Two-Dimensional Datasets
- Towards Compressing Efficient Generative Adversarial Networks for Image Translation via Pruning and Distilling