Many-Sorted Algebras for Deep Learning and Quantum Technology

  • Book

  • February 2024
  • Elsevier Science and Technology
  • ID: 5850246

Many-Sorted Algebras for Deep Learning and Quantum Technology presents a precise and rigorous description of the basic concepts of quantum technologies and how they relate to deep learning and quantum theory. The current merging of quantum theory with deep learning techniques creates the need for a source that gives readers insight into the algebraic underpinnings of these disciplines. Although analytical, topological, probabilistic, and geometrical concepts are employed in many of these areas, algebra forms the principal thread, and that thread is exposed using many-sorted algebras. The book includes hundreds of well-designed examples that illustrate intriguing concepts in quantum systems, accompanied by numerous visual displays. In particular, the polyadic graph shows the types, or sorts, of objects used in quantum computing or deep learning, and it illustrates all the inter- and intra-sort operations needed to describe the algebras; in brief, it provides the closure conditions. Throughout the book, all laws or equational identities needed to specify an algebraic structure are precisely described.
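As a minimal illustration of the many-sorted idea described above (a sketch assuming nothing beyond the description itself, not an excerpt from the book): sorts are distinct carrier types, each operation carries a signature naming its argument and result sorts, and closure means every operation returns a value of its declared result sort. The sort names `Scalar` and `Vector` and the operation names `s_add`, `v_add`, and `smul` are hypothetical, chosen only to show one intra-sort and one inter-sort operation.

```python
# Illustrative sketch of a two-sorted algebra (sorts: Scalar, Vector).
# The signatures make the closure conditions explicit: each operation
# maps its argument sorts into its declared result sort.

from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Scalar:            # sort 1: the scalar carrier
    value: float

@dataclass(frozen=True)
class Vector:            # sort 2: the vector carrier
    coords: Tuple[float, ...]

# Intra-sort operation: Scalar x Scalar -> Scalar
def s_add(a: Scalar, b: Scalar) -> Scalar:
    return Scalar(a.value + b.value)

# Intra-sort operation: Vector x Vector -> Vector
def v_add(u: Vector, v: Vector) -> Vector:
    return Vector(tuple(x + y for x, y in zip(u.coords, v.coords)))

# Inter-sort operation: Scalar x Vector -> Vector (scalar multiplication)
def smul(a: Scalar, v: Vector) -> Vector:
    return Vector(tuple(a.value * x for x in v.coords))

# Closure in action: composing operations never leaves the declared sorts.
w = smul(Scalar(2.0), v_add(Vector((1.0, 0.0)), Vector((0.0, 1.0))))
```

A polyadic graph of this signature would have two nodes (the sorts) and three labeled edges (the operations), with `smul` connecting both sorts; equational identities such as distributivity of `smul` over `v_add` would then be stated over these signatures.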

Please Note: This is an On Demand product; delivery may take up to 11 working days after payment has been received.

Table of Contents

  1. Introduction to quantum many-sorted algebra
  2. Basics of deep learning
  3. Basic algebras underlying quantum and neural net
  4. Quantum Hilbert spaces and their creation
  5. Quantum and machine learning applications involving matrices
  6. Quantum annealing and adiabatic quantum computing
  7. Operators on Hilbert space
  8. Spaces and algebras for quantum operators
  9. Von Neumann algebra
  10. Fiber bundles
  11. Lie algebras and Lie groups
  12. Fundamental and universal covering groups
  13. Spectra for operators
  14. Canonical commutation relations
  15. Fock space
  16. Underlying theory for quantum computing
  17. Quantum computing applications
  18. Machine learning and data mining
  19. Reproducing kernel and other Hilbert spaces

Authors

Charles R. Giardina, Bell Telephone Laboratories, Whippany, NJ, USA; Lucent Technologies, Whippany, NJ, USA.

Charles R. Giardina was born in the Bronx, NY, on December 29, 1942. He received the B.S. degree in mathematics from Fairleigh Dickinson University, Rutherford, NJ, and the M.S. degree in mathematics from the Carnegie Institute of Technology, Pittsburgh, PA. He also received the M.E.E. degree in 1969 and the Ph.D. degree in mathematics and electrical engineering in 1970 from Stevens Institute of Technology, Hoboken, NJ. Dr. Giardina was Professor of Mathematics, Electrical Engineering, and Computer Science at Fairleigh Dickinson University from 1965 to 1982. From 1982 to 1986, he was a Professor at Stevens Institute of Technology. From 1986 to 1996, he was a Professor at the College of Staten Island, City University of New York. From 1996, he was with Bell Telephone Laboratories, Whippany, NJ, USA. His research interests include digital signal and image processing, pattern recognition, artificial intelligence, and the constructive theory of functions. Dr. Giardina has authored numerous papers in these areas and several books, including Mathematical Models for Artificial Intelligence and Autonomous Systems, Prentice Hall; Matrix Structure Image Processing, Prentice Hall; Parallel Digital Signal Processing: A Unified Signal Algebra Approach, Regency; Morphological Methods in Image and Signal Processing, Prentice Hall; Image Processing - Continuous to Discrete: Geometric, Transform, and Statistical Methods, Prentice Hall; and A Unified Signal Algebra Approach to Two-Dimensional Parallel Digital Signal Processing, Chapman and Hall/CRC Press.