Listing 1 - 10 of 1116 | << page >> |
Markov processes --- Markov, Processus de
Markov processes --- Markov proces --- Waarschijnlijkheidsrekening --- Markov, Processus de
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models. This Element provides a unified framework that handles these approaches via Markov chains. The authors consider stochastic normalizing flows as pairs of Markov chains fulfilling certain properties, and show how many state-of-the-art models for data generation fit into this framework. Indeed, numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chain point of view enables the coupling of deterministic layers, such as invertible neural networks, with stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders and diffusion normalizing flows, in a mathematically sound way. The authors' framework establishes a useful mathematical tool for combining the various approaches.
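The Metropolis-Hastings layer mentioned in this blurb can be sketched as follows. This is a minimal illustration of the idea (a stochastic layer that leaves a target density invariant), not the authors' implementation; the target distribution, step size, and sample counts are all made-up assumptions.

```python
import numpy as np

def metropolis_hastings_layer(x, log_target, step=0.5, rng=None):
    """One Metropolis-Hastings step applied independently to each row of x.

    Proposes a Gaussian perturbation and accepts/rejects so that the
    distribution with log-density `log_target` is left invariant.
    """
    rng = np.random.default_rng() if rng is None else rng
    proposal = x + step * rng.standard_normal(x.shape)
    log_alpha = log_target(proposal) - log_target(x)       # log acceptance ratio
    accept = np.log(rng.random(x.shape[0])) < log_alpha
    return np.where(accept[:, None], proposal, x)

# Assumed target: standard 2-D Gaussian.
log_target = lambda x: -0.5 * np.sum(x**2, axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal((5000, 2)) * 3.0   # start from a too-wide distribution
for _ in range(200):                       # stack 200 stochastic layers
    x = metropolis_hastings_layer(x, log_target, rng=rng)
print(np.std(x))                           # drifts toward the target's std of 1
```

Because each such layer preserves the target distribution exactly, stacking many of them pulls the samples toward the target regardless of where the deterministic layers left them.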
Dependability metrics are omnipresent in every engineering field, from simple ones to more complex measures combining performance and dependability aspects of systems. This book presents the mathematical basis of the analysis of these metrics in the most widely used framework, Markov models, describing both basic results and specialised techniques. The authors first present both discrete- and continuous-time Markov chains before focusing on dependability measures, which necessitate the study of Markov chains on subsets of states representing different user satisfaction levels for the modelled system. Topics covered include Markovian state lumping, analysis of sojourns on subsets of states of Markov chains, analysis of most dependability metrics, fundamentals of performability analysis, and bounding and simulation techniques designed to evaluate dependability measures. The book is of interest to graduate students and researchers in all areas of engineering where the concepts of lifetime, repair duration, availability, reliability and risk are important.
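The simplest dependability measure of the kind this blurb describes is the steady-state availability of a repairable system, obtained from the stationary distribution of a continuous-time Markov chain. A minimal sketch, with made-up failure and repair rates (not taken from the book):

```python
import numpy as np

# Hypothetical 2-state repairable system: state 0 = up, state 1 = down.
lam, mu = 0.01, 0.5            # assumed failure and repair rates (per hour)
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])   # CTMC generator matrix

# Steady-state distribution: solve pi @ Q = 0 subject to sum(pi) = 1.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]            # long-run fraction of time spent "up"
print(availability)             # matches the analytic value mu / (lam + mu)
```

For this two-state model the answer has the closed form mu / (lam + mu); the linear-system route is what generalises to the multi-state "user satisfaction level" models the book studies.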
Elementary treatments of Markov chains, especially those devoted to discrete-time and finite state-space theory, leave the impression that everything is smooth and easy to understand. This exposition of the works of Kolmogorov, Feller, Chung, Kato, and other mathematical luminaries, which focuses on time-continuous chains but is not so far from being elementary itself, reminds us again that the impression is false: an infinite, but denumerable, state-space is where the fun begins. If you have not heard of Blackwell's example (in which all states are instantaneous), do not understand what the minimal process is, or do not know what happens after explosion, dive right in. But beware lest you are enchanted: 'There are more spells than your commonplace magicians ever dreamed of.'
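The explosion mentioned in this blurb can be illustrated with a quick simulation: in a pure-birth chain whose jump rates grow fast enough that the sum of expected holding times converges, infinitely many jumps occur in finite time. The rate choice q_n = n^2 below is an illustrative assumption, not an example from the book.

```python
import numpy as np

rng = np.random.default_rng(42)

def time_to_explosion(rate, n_steps=200):
    """Total time for a pure-birth chain to make n_steps jumps.

    From state n the chain jumps to n+1 after an Exp(rate(n)) holding time.
    If sum(1/rate(n)) converges, the total time stays bounded as n_steps
    grows: the chain explodes in finite time.
    """
    return sum(rng.exponential(1.0 / rate(n)) for n in range(1, n_steps + 1))

# Rates q_n = n**2: expected explosion time is sum 1/n^2 = pi^2/6 ~ 1.64.
samples = [time_to_explosion(lambda n: n**2) for _ in range(2000)]
print(np.mean(samples))   # close to pi^2 / 6, despite 200 jumps having occurred
```

With constant rates, by contrast, the time to make n jumps grows linearly in n; it is the summable holding times that make the "what happens after explosion" question meaningful.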
Markov chains are an important idea, closely related to random walks, that crops up widely in applied stochastic analysis. They are used, for example, in performance modelling and evaluation of computer networks, queueing networks, and telecommunication systems. The main purpose of the present book is to provide methods, based on the construction of Lyapunov functions, for determining when a Markov chain is ergodic, null recurrent, or transient. These methods can also be extended to the study of questions of stability. Of particular concern are reflected random walks and reflected Brownian motion. The authors provide not only a self-contained introduction to the theory but also details of how the required Lyapunov functions are constructed in various situations.
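The Lyapunov-function approach this blurb refers to can be sketched on the simplest reflected random walk. This is a toy instance of Foster's drift criterion, with parameters chosen for illustration; it is not an example taken from the book.

```python
# Reflected random walk on {0, 1, 2, ...}: X_{n+1} = max(X_n + xi, 0),
# with increments xi in {-1, +1} and P(xi = +1) = p.
# Foster's criterion with Lyapunov function V(x) = x: the chain is
# ergodic if the drift E[V(X_{n+1}) - V(x)] is negative outside a
# finite set of states, which here holds whenever p < 1/2.

def drift(x, p):
    """Exact one-step drift of V(x) = x at state x."""
    up, down = p, 1 - p
    if x == 0:
        return up * 1 + down * 0      # reflection: the -1 step is absorbed
    return up * 1 + down * (-1)       # equals 2p - 1 away from the boundary

p = 0.4
print([drift(x, p) for x in range(4)])   # positive at 0, negative elsewhere
```

The finite set here is just {0}; the uniform drift of 2p - 1 < 0 everywhere else is what certifies ergodicity, and the book's contribution is constructing such V for much harder models, including reflected Brownian motion.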