Listing 1 - 10 of 11
This book provides a comprehensive and self-contained overview of recent progress in nonequilibrium statistical mechanics, in particular, the discovery of fluctuation relations and other time-reversal symmetry relations. The significance of these advances is that nonequilibrium statistical physics is no longer restricted to the linear regimes close to equilibrium, but extends to fully nonlinear regimes. These important new results have inspired the development of a unifying framework for describing both the microscopic dynamics of collections of particles, and the macroscopic hydrodynamics and thermodynamics of matter itself. The book discusses the significance of this theoretical framework in relation to a broad range of nonequilibrium processes, from the nanoscale to the macroscale, and is essential reading for researchers and graduate students in statistical physics, theoretical chemistry and biological physics.
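The fluctuation relations mentioned above can be illustrated by two standard results of the field, the Crooks fluctuation theorem and the Jarzynski equality it implies (quoted here for context; the book's own treatment is broader):

```latex
\frac{P_{\mathrm{F}}(+W)}{P_{\mathrm{R}}(-W)} = e^{\beta(W - \Delta F)},
\qquad
\bigl\langle e^{-\beta W} \bigr\rangle = e^{-\beta \Delta F}
```

Here $W$ is the work performed along the forward driving protocol, $\Delta F$ the equilibrium free-energy difference between the initial and final states, and $\beta$ the inverse temperature. Both relations hold arbitrarily far from equilibrium, which is precisely the extension beyond the linear regime the blurb refers to.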
Statistical physics examines the collective properties of large ensembles of particles, and is a powerful theoretical tool with important applications across many different scientific disciplines. This book provides a detailed introduction to classical and quantum statistical physics, including links to topics at the frontiers of current research. The first part of the book introduces classical ensembles, provides an extensive review of quantum mechanics, and explains how their combination leads directly to the theory of Bose and Fermi gases. This allows a detailed analysis of the quantum properties of matter, and introduces the exotic features of vacuum fluctuations. The second part discusses more advanced topics such as the two-dimensional Ising model and quantum spin chains. This modern text is ideal for advanced undergraduate and graduate students interested in the role of statistical physics in current research. 140 homework problems reinforce key concepts and further develop readers' understanding of the subject.
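The two-dimensional Ising model mentioned above is also a standard target for Monte Carlo simulation. The following is a minimal sketch (not from the book) of Metropolis sampling of the Ising model with coupling J = 1 on a small periodic lattice; lattice size, temperature, and sweep count are illustrative choices:

```python
import math
import random

def ising_metropolis(L=8, beta=1.0, sweeps=200, seed=0):
    """Metropolis sampling of the 2D Ising model (J = 1) on an L x L
    periodic lattice, started from the all-up state; returns the
    final magnetization per spin."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nn  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1
    return sum(sum(row) for row in spins) / (L * L)

# Below the critical point (beta_c ~ 0.44) the lattice stays ordered;
# well above the critical temperature the magnetization decays toward zero.
print(ising_metropolis(beta=1.0), ising_metropolis(beta=0.2))
```

A single spin flip changes the energy by 2Js(i,j) times the sum of the four neighboring spins, so only local information is needed per update, which is what makes the method cheap per step.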
Statistical mechanics. --- Quantum theory. --- Quantum statistics.
Statistical mechanics. --- Mechanics --- Mechanics, Analytic --- Quantum statistics --- Statistical physics --- Thermodynamics
Aimed at advanced undergraduates and graduate students, this accessible and engaging textbook introduces the theory of statistical mechanics as well as its fascinating real-world applications. The book's original approach, covering interdisciplinary applications of statistical mechanics to a wide range of subjects including chemistry, biology, linguistics, economics, sociology and more, is bound to appeal to a wide audience. While the first part of the book introduces the various methods of statistical physics, including complexity, emergence, universality, self-organized criticality, power laws and other timely topics, the final sections focus on the specific relevance of these methods to the social, biological and physical sciences. The mathematical content is woven throughout the book in the form of equations, with further background and explanations provided in footnotes and appendices.
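Power laws, one of the topics listed above, are routinely fitted by maximum likelihood. A minimal sketch (not from the book) using the standard continuous-power-law MLE, with the exponent and cutoff as illustrative choices:

```python
import math
import random

def powerlaw_mle(xs, xmin):
    """Continuous power-law exponent estimate (the standard MLE):
    alpha_hat = 1 + n / sum(ln(x_i / xmin)) over the tail x_i >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Draw samples from p(x) ~ x^(-alpha) by inverse-transform sampling:
# for a Pareto tail, x = xmin * (1 - u) ** (-1 / (alpha - 1)).
rng = random.Random(42)
alpha, xmin = 2.5, 1.0
xs = [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(50_000)]
print(round(powerlaw_mle(xs, xmin), 2))  # close to the true alpha of 2.5
```

The estimator is consistent only above the cutoff xmin, which in empirical data (word frequencies, city sizes, avalanche statistics) usually has to be estimated as well.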
Statistical mechanics. --- Statistical physics. --- Entropy. --- Ferromagnetism. --- Mécanique statistique --- Physique statistique --- Entropie --- Ferromagnétisme
Renormalization group. --- Fluid dynamics. --- Quantum field theory. --- Group, Renormalization --- Quantum field theory --- Statistical mechanics --- Relativistic quantum field theory --- Field theory (Physics) --- Quantum theory --- Relativity (Physics) --- Dynamics --- Fluid mechanics
This book offers an introduction to statistical mechanics, special relativity, and quantum physics. It is based on the lecture notes prepared for the one-semester course "Quantum Physics" in the Bachelor of Science in Material Sciences at the University of Padova. The first chapter briefly reviews the ideas of classical statistical mechanics introduced by James Clerk Maxwell, Ludwig Boltzmann, Willard Gibbs, and others. The second chapter is devoted to the special relativity of Albert Einstein. The third chapter gives a historical analysis of the quantization of light due to Max Planck and Albert Einstein, while the fourth chapter discusses Niels Bohr's quantization of energy levels and electromagnetic transitions. The fifth chapter investigates the Schrödinger equation, which Erwin Schrödinger obtained from Louis de Broglie's idea of associating a quantum wavelength with each particle. Chapter six describes the basic axioms of quantum mechanics, which were formulated in the seminal books of Paul Dirac and John von Neumann. Chapter seven presents several important applications of quantum mechanics: the quantum particle in a box, the quantum particle in a harmonic potential, quantum tunneling, stationary perturbation theory, and time-dependent perturbation theory. Chapter eight is devoted to quantum atomic physics, with special emphasis on the spin of the electron, whose rigorous theoretical justification requires the Dirac equation. Chapter nine explains the quantum mechanics of many identical particles at zero temperature, while chapter ten extends the discussion to many quantum particles at finite temperature by introducing quantum statistical mechanics. The four appendices on the Dirac delta function, complex numbers, the Fourier transform, and differential equations are a useful mathematical aid for the reader.
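The particle in a box mentioned among the applications has the textbook spectrum E_n = n²π²ℏ²/(2mL²). A small numerical sketch (not from the book; the electron-in-a-1-nm-well numbers are an illustrative choice):

```python
import math

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s (CODATA)
M_E = 9.109_383_7015e-31   # electron mass, kg (CODATA)
EV = 1.602_176_634e-19     # one electronvolt in joules

def box_energy(n, L, m=M_E):
    """Energy of level n for a particle of mass m in a 1D infinite
    well of width L: E_n = n^2 * pi^2 * hbar^2 / (2 m L^2)."""
    return (n * math.pi * HBAR) ** 2 / (2 * m * L ** 2)

# Ground state of an electron confined to a 1 nm box, in electronvolts.
print(round(box_energy(1, 1e-9) / EV, 3))  # about 0.376 eV
```

The n² scaling means level spacings grow with n, in contrast to the evenly spaced levels of the harmonic potential treated in the same chapter.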
Quantum theory. --- Quantum dynamics --- Quantum mechanics --- Quantum physics --- Physics --- Mechanics --- Thermodynamics --- Quantum physics. --- Statistical Physics. --- Gravitation. --- Atoms. --- Molecules. --- Quantum statistics. --- Quantum Physics. --- Gravitational Physics. --- Atomic, Molecular and Chemical Physics. --- Quantum Gases and Condensates. --- Field theory (Physics) --- Matter --- Antigravity --- Centrifugal force --- Relativity (Physics) --- Mathematical statistics --- Quantum statistical mechanics --- Matrix mechanics --- Statistical mechanics --- Wave mechanics --- Chemistry, Physical and theoretical --- Stereochemistry --- Properties --- Statistical methods --- Constitution
This book presents a collection of articles reporting the current challenges in solid and fracture mechanics. The book is devoted to the 90th birthday of academician Nikita F. Morozov—a well-known specialist in the field of solid and fracture mechanics.
Fracture mechanics. --- Failure of solids --- Fracture of materials --- Fracture of solids --- Materials --- Mechanics, Fracture --- Solids --- Deformations (Mechanics) --- Strength of materials --- Brittleness --- Penetration mechanics --- Structural failures --- Fracture --- Fatigue --- Mechanics, Applied. --- Solids. --- Quantum statistics. --- Quantum physics. --- Computer simulation. --- Building materials. --- Engineering Mechanics. --- Solid Mechanics. --- Quantum Fluids and Solids. --- Quantum Simulations. --- Structural Materials. --- Architectural materials --- Architecture --- Building --- Building supplies --- Buildings --- Construction materials --- Structural materials --- Computer modeling --- Computer models --- Modeling, Computer --- Models, Computer --- Simulation, Computer --- Electromechanical analogies --- Mathematical models --- Simulation methods --- Model-integrated computing --- Quantum dynamics --- Quantum mechanics --- Quantum physics --- Physics --- Mechanics --- Thermodynamics --- Quantum statistical mechanics --- Matrix mechanics --- Statistical mechanics --- Wave mechanics --- Solid state physics --- Transparent solids --- Applied mechanics --- Engineering, Mechanical --- Engineering mathematics
Extremely popular for statistical inference, Bayesian methods are also becoming popular in machine learning and artificial intelligence problems. Bayesian estimators are often implemented by Monte Carlo methods, such as the Metropolis–Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many of the modern models in statistics are simply too complex to use such methodologies. In machine learning, the volume of the data used in practice makes Monte Carlo methods too slow to be useful. On the other hand, these applications often do not require an exact knowledge of the posterior. This has motivated the development of a new generation of algorithms that are fast enough to handle huge datasets but that often target an approximation of the posterior. This book gathers 18 research papers written by Approximate Bayesian Inference specialists and provides an overview of the recent advances in these algorithms. This includes optimization-based methods (such as variational approximations) and simulation-based methods (such as ABC or Monte Carlo algorithms). The theoretical aspects of Approximate Bayesian Inference are covered, specifically the PAC–Bayes bounds and regret analysis. Applications to challenging computational problems in astrophysics, finance, medical data analysis, and computer vision are also presented.
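The Metropolis–Hastings algorithm named above, in its random-walk form, can be sketched in a few lines (not from the book; the Gaussian toy target and step size are illustrative choices):

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2),
    accept with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        x_new = x + rng.gauss(0.0, step)
        # Accept/reject in log space; the tiny offset guards log(0).
        if math.log(rng.random() + 1e-300) < log_post(x_new) - log_post(x):
            x = x_new
        samples.append(x)
    return samples

# Toy posterior: standard normal, log p(x) = -x^2 / 2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20_000)
print(round(sum(samples) / len(samples), 1))  # sample mean near 0
```

Because the acceptance ratio only involves a ratio of posterior densities, the (usually intractable) normalizing constant cancels, which is exactly why such samplers target the exact posterior; the approximate methods surveyed in the book trade this exactness for speed on large datasets.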
Research & information: general --- Mathematics & science --- bifurcation --- dynamical systems --- Edward–Sokal coupling --- mean-field --- Kullback–Leibler divergence --- variational inference --- Bayesian statistics --- machine learning --- variational approximations --- PAC-Bayes --- expectation-propagation --- Markov chain Monte Carlo --- Langevin Monte Carlo --- sequential Monte Carlo --- Laplace approximations --- approximate Bayesian computation --- Gibbs posterior --- MCMC --- stochastic gradients --- neural networks --- Approximate Bayesian Computation --- differential evolution --- Markov kernels --- discrete state space --- ergodicity --- Markov chain --- probably approximately correct --- variational Bayes --- Bayesian inference --- Markov Chain Monte Carlo --- Sequential Monte Carlo --- Riemann Manifold Hamiltonian Monte Carlo --- integrated nested laplace approximation --- fixed-form variational Bayes --- stochastic volatility --- network modeling --- network variability --- Stiefel manifold --- MCMC-SAEM --- data imputation --- Bethe free energy --- factor graphs --- message passing --- variational free energy --- variational message passing --- approximate Bayesian computation (ABC) --- differential privacy (DP) --- sparse vector technique (SVT) --- Gaussian --- particle flow --- variable flow --- Langevin dynamics --- Hamilton Monte Carlo --- non-reversible dynamics --- control variates --- thinning --- meta-learning --- hyperparameters --- priors --- online learning --- online optimization --- gradient descent --- statistical learning theory --- PAC–Bayes theory --- deep learning --- generalisation bounds --- Bayesian sampling --- Monte Carlo integration --- PAC-Bayes theory --- no free lunch theorems --- sequential learning --- principal curves --- data streams --- regret bounds --- greedy algorithm --- sleeping experts --- entropy --- robustness --- statistical mechanics --- complex systems