Narrow your search

Library

KU Leuven (5)

KBR (2)

UGent (1)


Resource type

dissertation (6)

book (2)


Language

English (5)

German (3)


Year

2021 (2)

2020 (2)

2018 (1)

1913 (2)

1895 (1)

Listing 1 - 8 of 8

Dissertation
Das Verordnungsrecht nach dem Staatsrechte des deutschen Reiches
Author:
Year: 1895 Publisher: Würzburg : Memminger


Book
Das Prisenrecht in seiner neuesten Gestalt
Author:
Year: 1913 Publisher: Berlin : Ernst Siegfried Mittler und Sohn


Keywords

341.64.5


Book
Das Prisenrecht in seiner neuesten Gestalt
Author:
Year: 1913 Publisher: Berlin : E. S. Mittler und Sohn


Keywords

341.36


Dissertation
Evaluation of region-of-interest-based PET reconstruction
Authors: --- --- ---
Year: 2018 Publisher: Leuven : KU Leuven. Faculteit Wetenschappen

Abstract

In nuclear medicine one administers radioactive substances to patients to make a medical diagnosis. These substances travel around the body until they bind chemically with the medically relevant target. Once bound, the radioactive substance decays and gamma radiation is emitted in all directions. These gamma rays can be measured with special cameras, and from these measurements one can calculate where the radioactive substance has accumulated inside the body. In this thesis we propose a new method to calculate the amount of radioactivity within a predetermined anatomical region with higher precision. The new method is superior to conventional methods when the radioactivity is uniformly distributed over the region of interest, and its relative advantage grows as the total radioactivity increases. It is, however, not reliable for regions in which the radioactive substance is distributed very irregularly.
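
As a rough illustration of the conventional approach the abstract contrasts against (reconstruct first, then aggregate over the region), the following minimal Python sketch sums reconstructed voxel values inside a predefined anatomical mask. All array shapes, the mask, and the voxel size are hypothetical stand-ins, not values from the thesis.

    import numpy as np

    # Hypothetical stand-ins: a reconstructed PET volume (activity
    # concentration per voxel, e.g. kBq/ml) and an anatomical ROI mask.
    recon = np.random.rand(128, 128, 64)
    roi_mask = np.zeros_like(recon, dtype=bool)
    roi_mask[40:60, 50:70, 20:30] = True

    voxel_volume_ml = 0.008  # assumed 2 mm isotropic voxels (0.2 cm)^3

    # Conventional estimate: aggregate the reconstructed values in the ROI.
    total_activity = recon[roi_mask].sum() * voxel_volume_ml  # kBq in the ROI
    mean_concentration = recon[roi_mask].mean()               # mean kBq/ml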



Dissertation
Comparison of the image quality of PET/CT systems
Authors: --- --- ---
Year: 2020 Publisher: Leuven : KU Leuven. Faculteit Wetenschappen

Abstract

This thesis evaluates and compares the image quality of three PET/CT scanners: the Vereos from Philips Healthcare, the Discovery MI (DMI) from GE Healthcare, and the Biograph TruePoint TrueV (TP) from Siemens Medical Solutions. PET/CT is a medical imaging technique used for detecting and monitoring diseases. The image quality parameters are resolution, noise, and recovery coefficients (RCs). RCs express how well the reconstructed activities in the PET images correspond to the true activity values. Image quality was evaluated using phantom acquisitions. For each scanner, three acquisitions were performed with different contrasts between the spheres and the background. For each acquisition, twelve reconstructions were obtained with different reconstruction parameters, such as the pixel size (2 and 4 mm), the number of updates (27, 63, and 105), and with/without point spread function (PSF) modelling. PSF modelling is a method that improves the resolution of the reconstruction. Convolution of the images with a Gaussian filter, which reduces noise, was applied afterwards by the software (pymirc) that analyses the reconstructions. The software automatically detects the six spheres in the phantom and determines, among other things, the spatial resolution, the noise, and the RCs. These values were used to investigate the convergence of the three scanners and the resolution-noise trade-off. The differences in noise and RCs for reconstructions with the same resolution, as well as the influence of PSF, contrast, and sphere size on image quality, were also investigated. This study shows that the Vereos reconstruction software converges the fastest, owing to the algorithm it uses. Convergence means that the iterative algorithm gains no further benefit from additional iterations. The TP converges the slowest, because it is an older system without time-of-flight (TOF) capabilities. Systems with TOF capabilities can localize the detected signals better, resulting in better resolution and generally better PET performance. The DMI has the best resolution-noise trade-off, that is, the best resolution for a given amount of noise and vice versa. This is due to the better sensitivity of the DMI compared with the Vereos and the TP. Reconstructions with less contrast result in worse resolution but the same amount of noise. This is a consequence of the way the noise is determined in the software, combined with how the phantom acquisitions were carried out. Furthermore, a smaller pixel size has the greatest influence on the Vereos, again because of its reconstruction algorithm. PSF modelling results in less noise for reconstructions with the same resolution. For the Vereos there are, once more because of the reconstruction algorithm, remarkably few differences between reconstructions with and without PSF modelling. A smaller sphere size results in lower RCs for all three scanners. Again, the TP has the most noise. No conclusion could be drawn about the influence of sphere size on resolution, because some spheres were affected by the Gibbs phenomenon. Overall, it can be concluded that the DMI has the best resolution-noise trade-off of the three scanners. The TP performs the worst in every respect. The Vereos reaches convergence the fastest, is influenced the most by the pixel size, and is influenced the least by PSF modelling.
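
For readers unfamiliar with the quantities above, the following Python sketch shows a Gaussian post-smoothing step followed by a recovery-coefficient calculation for a single sphere. It illustrates the concepts only; it is not the pymirc analysis code, and the volume, mask, sigma, and true activity value are invented for the example.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Stand-in reconstructed PET volume (shapes/values are assumptions).
    recon = np.random.rand(128, 128, 90)

    # Noise-reducing Gaussian post-filter; sigma is in voxels here.
    smoothed = gaussian_filter(recon, sigma=1.5)

    # Hypothetical mask of one phantom sphere and its true activity.
    sphere_mask = np.zeros_like(recon, dtype=bool)
    sphere_mask[60:68, 60:68, 40:48] = True
    true_activity = 4.0  # assumed true activity concentration

    # Recovery coefficient: reconstructed vs. true activity (1.0 = perfect).
    rc = smoothed[sphere_mask].mean() / true_activity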



Dissertation
Optimization of a convolutional neural network to predict anatomically-guided PET reconstructions in image space
Authors: --- --- ---
Year: 2020 Publisher: Leuven : KU Leuven. Faculteit Wetenschappen

Abstract

Convolutional neural networks, and more generally machine learning and artificial intelligence, are terms that are becoming ever more ubiquitous in everyday language. These are statistical techniques that use large datasets to "teach" computers to solve traditionally challenging tasks, for example segmenting an image into different components. In reality a mathematical function is created that manipulates the numbers in the input in order to generate, on average, a highly likely output. Convolutional neural networks are a type of machine learning that is well suited to images. Artificial intelligence is starting to play a more substantial role in medicine, and convolutional neural networks are being used in all sorts of tasks involving medical images. For example, Schramm et al., in a proof-of-concept study, trained a network to estimate a magnetic-resonance-guided positron-emission-tomography (MR-guided PET) image from an MR and a PET image. When designing a network, choices need to be made about a number of tuneable parameters. These are referred to as the hyperparameters and have to be decided in advance. One can think of the hyperparameters as dials on an old-fashioned radio: certain combinations will lead to good results, but the only way to find out what works is to try them. The aim of this thesis is to experiment with different hyperparameters as a continuation of the work by Schramm et al. The hyperparameters that were tested include, for example, the number of layers where the images are treated separately and the number of layers where the images are treated together. In general the hyperparameters fall into two classes: those that add learning power and those that reduce the time it takes to train a network. Learning power comes with two risks, namely taking too long to train with large datasets and overfitting. The latter refers to the network learning to make perfect predictions on the training data while becoming so specific to this data that it can no longer make accurate predictions when it encounters something new. Experimenting with the hyperparameters can be thought of as an optimization problem in which one tries to find a combination that produces a satisfactory output and does not take too long to train. One hyperparameter that does not fit into these categories is the loss function. This measures the average quality of the outputs and is the backbone on which training is based. We showed that changing hyperparameters to increase the learning power lowers the loss function value, without signs of overfitting. However, this is not a trend that can be extrapolated without risk. Further, we demonstrate that it is best to train with a mixed loss function that combines two loss functions, structural similarity and mean absolute error. Finally, we conclude that it is reasonable to employ a time-saving strategy that randomly selects small sections from the input images to train with.
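
As an illustration of the mixed loss described above, the Python sketch below combines (1 - SSIM) with mean absolute error. It uses a simplified global SSIM rather than the windowed variant a thesis like this would more likely use, and the weight alpha and the stability constants are illustrative assumptions, not the thesis's values.

    import numpy as np

    # Simplified global SSIM over a whole image (assumption: images are
    # normalized to [0, 1]; c1, c2 are small stability constants).
    def ssim_global(x, y, c1=1e-4, c2=9e-4):
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = ((x - mx) * (y - my)).mean()
        return ((2 * mx * my + c1) * (2 * cov + c2)) / \
               ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

    # Mixed loss: weighted sum of a structural term and mean absolute error.
    # alpha = 0.5 is an illustrative choice, not taken from the thesis.
    def mixed_loss(pred, target, alpha=0.5):
        mae = np.abs(pred - target).mean()
        return alpha * (1.0 - ssim_global(pred, target)) + (1.0 - alpha) * mae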



Dissertation
Characterization of a novel algorithm for fast iterative PET reconstruction
Authors: --- --- ---
Year: 2021 Publisher: Leuven : KU Leuven. Faculteit Wetenschappen

Abstract

This thesis evaluates a new reconstruction algorithm for PET (Positron Emission Tomography) called SPDHG (Stochastic Primal-Dual Hybrid Gradient). PET is a non-invasive technique in medicine in which a radioisotope provides a measure of the metabolic activity of a specific tissue. The challenge with PET is the low number of detected photons from which one tries to make a medically useful image. This low number of photons follows a Poisson distribution, and classical PET reconstruction is done by maximizing the log-likelihood function of this Poisson distribution. The resulting iterative algorithm is called MLEM (Maximum Likelihood Expectation Maximization). MLEM has guaranteed convergence to the maximum-likelihood solution, but the drawback is that the algorithm requires a high computational time. OSEM (Ordered Subsets Expectation Maximization) addresses this by not using the entire data set in each iteration but only a subset of it. This speeds up the procedure, but at the expense of the guaranteed convergence: OSEM converges to a so-called limit cycle, where the subset used causes the solution to deviate slightly with each iteration. The newly proposed algorithm SPDHG has guaranteed convergence while still using subsets. This raises the question of whether SPDHG outperforms OSEM and can thus be of added value in clinical practice. The performance of SPDHG and OSEM is evaluated based on (1) a relative cost function (the relative log-probability), (2) the PSNR (Peak Signal-to-Noise Ratio, with MLEM as reference) of the post-smoothed reconstructions, and (3) the bias and uncertainty of the mean recovery values in different ROIs (Regions Of Interest). Two different phantoms are used: (1) the ellipse phantom, which is easily modifiable, and (2) the brain phantom, which allows the algorithm to be evaluated on anatomically relevant data. The results show that in all cases (for the correct step-size ratio of SPDHG) the relative cost is lower for SPDHG and the PSNR of the post-smoothed reconstructions is higher for SPDHG than for OSEM. The uncertainty of the recovery values is also smaller for SPDHG than for OSEM in all cases. This lower uncertainty (higher precision) gives SPDHG a better test-retest reliability. The reconstructions themselves showed that in certain cases the limit cycle of OSEM fails to show small objects in noisy environments, while the detectability of SPDHG resembled that of MLEM. They also show greater uncertainty in the center for OSEM than for SPDHG. Tuning the step-size ratio gamma proved very important for ensuring rapid convergence of SPDHG. If the correct gamma value is found, however, the new stochastic algorithm looks promising: in quality SPDHG converges to the maximum-probability solution (and thus not, like OSEM, to a limit cycle), and in speed it is comparable to OSEM. In fact, SPDHG has the potential to be even faster than OSEM, as fewer projections can be used per subset. Post-smoothed SPDHG with 112 subsets is pixel-wise closer to post-smoothed MLEM than post-smoothed OSEM after 6 iterations.
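
To make the MLEM update described above concrete, here is a minimal Python sketch on a toy system matrix (an assumption for illustration; this is not the thesis code). OSEM would apply the same multiplicative update using only a subset of the rows of A per sub-iteration, which is what produces the limit-cycle behaviour discussed above.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.random((200, 50))                  # toy system matrix: 200 LORs x 50 voxels
    x_true = rng.random(50)                    # toy ground-truth image
    y = rng.poisson(A @ x_true).astype(float)  # Poisson-distributed measured counts

    x = np.ones(50)                            # non-negative initial image
    sens = A.T @ np.ones(200)                  # sensitivity image, A^T 1
    for _ in range(50):
        # Ratio of measured to expected counts (guard against division by zero),
        # then the MLEM multiplicative update that raises the Poisson log-likelihood.
        ratio = y / np.maximum(A @ x, 1e-12)
        x *= (A.T @ ratio) / sens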



Dissertation
The role of advanced 18F-FDG PET processing techniques in the localization of epilepsy.
Authors: --- --- --- ---
Year: 2021 Publisher: Leuven : KU Leuven. Faculteit Geneeskunde

Abstract

Background: Epilepsy remains a major neurological disorder, treated first with anti-epileptic drugs. Surgical resection of the epileptogenic zone (EZ) can provide relief in patients who remain medically refractory. A clinical presurgical workup, in which 18F-FDG PET plays a substantial role, is mandatory in order to obtain a good delineation of the EZ and to achieve a good postsurgical outcome. Partial volume correction (PVC), (semi-)quantification, and 18F-FDG PET/MRI coregistration are advanced nuclear image processing techniques that have already proven beneficial in individual studies on detection of the EZ. However, no review in the current literature combines the available results concerning these advanced techniques. Methods: An electronic search of PUBMED, EMBASE and the COCHRANE LIBRARY, with inclusion and exclusion criteria applied, was performed. This resulted in a qualitative analysis of 15 articles. Results: Implementation of PVC techniques resulted in sensitivities of 68%-100% and 71% on a lesion-based and a patient-based analysis, respectively. PVC techniques with an anatomical prior were especially useful. Sensitivities for (semi-)quantitative methods and for coregistration ranged from 53%-98% and from 90%-98%, respectively. These sensitivities were higher than those of analyses without (semi-)quantitative methods or coregistration. Discussion: Combining the available results in the current literature showed benefits for PVC techniques, (semi-)quantitative methods, and 18F-FDG PET/MRI coregistration. This suggests that these techniques should be implemented in the routine clinical presurgical workup. However, further studies are needed to better define the magnitude of the potential benefit and the complementarity of the different advanced nuclear image processing techniques.

