Journal articles on the topic 'Tomographic reconstruction algorithms'

Consult the top 50 journal articles for your research on the topic 'Tomographic reconstruction algorithms.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Lück, Sebastian, Andreas Kupsch, Axel Lange, Manfred P. Hentschel, and Volker Schmidt. "STATISTICAL ANALYSIS OF TOMOGRAPHIC RECONSTRUCTION ALGORITHMS BY MORPHOLOGICAL IMAGE CHARACTERISTICS." Image Analysis & Stereology 29, no. 2 (May 3, 2011): 61. http://dx.doi.org/10.5566/ias.v29.p61-77.

Abstract:
We suggest a procedure for quantitative quality control of tomographic reconstruction algorithms. Our task-oriented evaluation focuses on the correct reproduction of phase boundary length and has thus a clear implication for morphological image analysis of tomographic data. Indirectly the method monitors accurate reproduction of a variety of locally defined critical image features within tomograms such as interface positions and microstructures, debonding, cracks and pores. Tomographic errors of such local nature are neglected if only global integral characteristics such as mean squared deviation are considered for the evaluation of an algorithm. The significance of differences in reconstruction quality between algorithms is assessed using a sample of independent random scenes to be reconstructed. These are generated by a Boolean model and thus exhibit a substantial stochastic variability with respect to image morphology. It is demonstrated that phase boundaries in standard reconstructions by filtered backprojection exhibit substantial errors. In the setting of our simulations, these could be significantly reduced by the use of the innovative reconstruction algorithm DIRECTT.
2

Ganguly, Poulami Somanya, Daniël M. Pelt, Doga Gürsoy, Francesco de Carlo, and K. Joost Batenburg. "Improving reproducibility in synchrotron tomography using implementation-adapted filters." Journal of Synchrotron Radiation 28, no. 5 (August 12, 2021): 1583–97. http://dx.doi.org/10.1107/s1600577521007153.

Abstract:
For reconstructing large tomographic datasets fast, filtered backprojection-type or Fourier-based algorithms are still the method of choice, as they have been for decades. These robust and computationally efficient algorithms have been integrated in a broad range of software packages. The continuous mathematical formulas used for image reconstruction in such algorithms are unambiguous. However, variations in discretization and interpolation result in quantitative differences between reconstructed images, and corresponding segmentations, obtained from different software. This hinders reproducibility of experimental results, making it difficult to ensure that results and conclusions from experiments can be reproduced at different facilities or using different software. In this paper, a way to reduce such differences by optimizing the filter used in analytical algorithms is proposed. These filters can be computed using a wrapper routine around a black-box implementation of a reconstruction algorithm, and lead to quantitatively similar reconstructions. Use cases for this approach are demonstrated by computing implementation-adapted filters for several open-source implementations and applying them to simulated phantoms and real-world data acquired at the synchrotron. Our contribution to a reproducible reconstruction step forms a building block towards a fully reproducible synchrotron tomography data processing pipeline.
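
As a rough sketch of the optimization idea described above (not the paper's exact routine; the reference image and the filter parameterization are assumptions for illustration), an implementation-adapted filter exploits the fact that a filtered backprojection implementation R is linear in its filter coefficients h:

\hat{h} \;=\; \arg\min_{h}\; \Big\| \sum_{k} h_k \, R(e_k; y) \;-\; x_{\mathrm{ref}} \Big\|_2^2 ,

where y is the measured sinogram, e_k is the k-th unit filter and x_ref is a reference reconstruction; the wrapper routine only needs black-box evaluations R(e_k; y) of the implementation under study.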
3

Wu, Juan, Mirna Lerotic, Sean Collins, Rowan Leary, Zineb Saghi, Paul Midgley, Slava Berejnov, et al. "Optimization of Three-Dimensional (3D) Chemical Imaging by Soft X-Ray Spectro-Tomography Using a Compressed Sensing Algorithm." Microscopy and Microanalysis 23, no. 5 (September 12, 2017): 951–66. http://dx.doi.org/10.1017/s1431927617012466.

Abstract:
Soft X-ray spectro-tomography provides three-dimensional (3D) chemical mapping based on natural X-ray absorption properties. Since radiation damage is intrinsic to X-ray absorption, it is important to find ways to maximize signal within a given dose. For tomography, using the smallest number of tilt series images that gives a faithful reconstruction is one such method. Compressed sensing (CS) methods have relatively recently been applied to tomographic reconstruction algorithms, providing faithful 3D reconstructions with a much smaller number of projection images than when conventional reconstruction methods are used. Here, CS is applied in the context of scanning transmission X-ray microscopy tomography. Reconstructions by weighted back-projection, the simultaneous iterative reconstruction technique, and CS are compared. The effects of varying tilt angle increment and angular range for the tomographic reconstructions are examined. Optimization of the regularization parameter in the CS reconstruction is explored and discussed. The comparisons show that CS can provide improved reconstruction fidelity relative to weighted back-projection and simultaneous iterative reconstruction techniques, with increasingly pronounced advantages as the angular sampling is reduced. In particular, missing wedge artifacts are significantly reduced and there is enhanced recovery of sharp edges. Examples of using CS for low-dose scanning transmission X-ray microscopy spectroscopic tomography are presented.
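
For orientation, compressed-sensing tomographic reconstruction is commonly posed as a sparsity-regularized least-squares problem; one frequently used total-variation form (a generic sketch, not necessarily the exact functional optimized in this paper) is

\hat{x} \;=\; \arg\min_{x \ge 0}\; \tfrac{1}{2}\,\| A x - b \|_2^2 \;+\; \lambda\, \mathrm{TV}(x),

where A is the projection operator for the chosen tilt series, b the measured projections, and λ the regularization parameter whose optimization is discussed in the abstract.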
4

Pelt, Daniël, Kees Batenburg, and James Sethian. "Improving Tomographic Reconstruction from Limited Data Using Mixed-Scale Dense Convolutional Neural Networks." Journal of Imaging 4, no. 11 (October 30, 2018): 128. http://dx.doi.org/10.3390/jimaging4110128.

Abstract:
In many applications of tomography, the acquired data are limited in one or more ways due to unavoidable experimental constraints. In such cases, popular direct reconstruction algorithms tend to produce inaccurate images, and more accurate iterative algorithms often have prohibitively high computational costs. Using machine learning to improve the image quality of direct algorithms is a recently proposed alternative, for which promising results have been shown. However, previous attempts have focused on using encoder–decoder networks, which have several disadvantages when applied to large tomographic images, preventing wide application in practice. Here, we propose the use of the Mixed-Scale Dense convolutional neural network architecture, which was specifically designed to avoid these disadvantages, to improve tomographic reconstruction from limited data. Results are shown for various types of data limitations and object types, for both simulated data and large-scale real-world experimental data. The results are compared with popular tomographic reconstruction algorithms and machine learning algorithms, showing that Mixed-Scale Dense networks are able to significantly improve reconstruction quality even with severely limited data, and produce more accurate results than existing algorithms.
5

Sipkens, T. A., S. J. Grauer, A. M. Steinberg, S. N. Rogak, and P. Kirchen. "New transform to project axisymmetric deflection fields along arbitrary rays." Measurement Science and Technology 33, no. 3 (December 21, 2021): 035201. http://dx.doi.org/10.1088/1361-6501/ac3f83.

Abstract:
Axisymmetric tomography is used to extract quantitative information from line-of-sight measurements of gas flow and combustion fields. For instance, background-oriented schlieren (BOS) measurements are typically inverted by tomographic reconstruction to estimate the density field of a high-speed or high-temperature flow. Conventional reconstruction algorithms are based on the inverse Abel transform, which assumes that rays are parallel throughout the target object. However, camera rays are not parallel, and this discrepancy can result in significant errors in many practical imaging scenarios. We present a generalization of the Abel transform for use in tomographic reconstruction of light-ray deflections through an axisymmetric target. The new transform models the exact path of camera rays instead of assuming parallel paths, thereby improving the accuracy of estimates. We demonstrate our approach with a simulated BOS scenario in which we reconstruct noisy synthetic deflection data across a range of camera positions. Results are compared to state-of-the-art Abel-based algorithms. Reconstructions computed using the new transform are consistently more stable and accurate than conventional reconstructions.
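
For context, the parallel-ray assumption criticized above corresponds to the classical Abel transform pair for an axisymmetric field f(r) (standard textbook formulas, not the generalized transform introduced in the paper):

F(y) \;=\; 2 \int_{y}^{\infty} \frac{f(r)\, r}{\sqrt{r^2 - y^2}}\, dr ,
\qquad
f(r) \;=\; -\frac{1}{\pi} \int_{r}^{\infty} \frac{F'(y)}{\sqrt{y^2 - r^2}}\, dy .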
6

Venkatakrishnan, Singanallur, Yuxuan Zhang, Luc Dessieux, Christina Hoffmann, Philip Bingham, and Hassina Bilheux. "Improved Acquisition and Reconstruction for Wavelength-Resolved Neutron Tomography." Journal of Imaging 7, no. 1 (January 15, 2021): 10. http://dx.doi.org/10.3390/jimaging7010010.

Abstract:
Wavelength-resolved neutron tomography (WRNT) is an emerging technique for characterizing samples relevant to the materials sciences in 3D. WRNT studies can be carried out at beam lines in spallation neutron or reactor-based user facilities. Because of the limited availability of experimental time, potential imperfections in the neutron source, or constraints placed on the acquisition time by the type of sample, the data can be extremely noisy resulting in tomographic reconstructions with significant artifacts when standard reconstruction algorithms are used. Furthermore, making a full tomographic measurement even with a low signal-to-noise ratio can take several days, resulting in a long wait time before the user can receive feedback from the experiment when traditional acquisition protocols are used. In this paper, we propose an interlaced scanning technique and combine it with a model-based image reconstruction algorithm to produce high-quality WRNT reconstructions concurrent with the measurements being made. The interlaced scan is designed to acquire data so that successive measurements are more diverse in contrast to typical sequential scanning protocols. The model-based reconstruction algorithm combines a data-fidelity term with a regularization term to formulate the wavelength-resolved reconstruction as minimizing a high-dimensional cost-function. Using an experimental dataset of a magnetite sample acquired over a span of about two days, we demonstrate that our technique can produce high-quality reconstructions even during the experiment compared to traditional acquisition and reconstruction techniques. In summary, the combination of the proposed acquisition strategy with an advanced reconstruction algorithm provides a novel guideline for designing WRNT systems at user facilities.
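
The model-based formulation mentioned above can be sketched generically (the paper's specific data model, weighting and regularizer are not reproduced here) as

\hat{x} \;=\; \arg\min_{x \ge 0}\; \tfrac{1}{2}\, \| y - A x \|_{W}^{2} \;+\; \beta\, R(x),

where y collects the wavelength-resolved measurements, A is the forward projection model, W a statistical weighting, R(x) the regularization term and β its strength.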
7

Sorzano, C. O. S., J. Vargas, J. Otón, J. M. de la Rosa-Trevín, J. L. Vilas, M. Kazemi, R. Melero, et al. "A Survey of the Use of Iterative Reconstruction Algorithms in Electron Microscopy." BioMed Research International 2017 (2017): 1–17. http://dx.doi.org/10.1155/2017/6482567.

Abstract:
One of the key steps in Electron Microscopy is the tomographic reconstruction of a three-dimensional (3D) map of the specimen being studied from a set of two-dimensional (2D) projections acquired at the microscope. This tomographic reconstruction may be performed with different reconstruction algorithms that can be grouped into several large families: direct Fourier inversion methods, back-projection methods, Radon methods, or iterative algorithms. In this review, we focus on the latter family of algorithms, explaining the mathematical rationale behind the different algorithms in this family as they have been introduced in the field of Electron Microscopy. We cover their use in Single Particle Analysis (SPA) as well as in Electron Tomography (ET).
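
As a minimal, generic illustration of the iterative family surveyed in this article (a sketch of the classical ART/Kaczmarz sweep for a linear system Ax = b, not code from any particular electron-microscopy package):

import numpy as np

def art(A, b, n_sweeps=10, relax=1.0, x0=None):
    # Classical ART/Kaczmarz: project the current estimate onto the
    # hyperplane defined by each measurement (row of A) in turn.
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.copy()
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(m):
            if row_norms[i] == 0:
                continue
            x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy usage with a small random system standing in for a projection matrix.
rng = np.random.default_rng(0)
A = rng.random((40, 25))
x_rec = art(A, A @ rng.random(25), n_sweeps=50)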
8

Müller, Jan, Dirk Fimmel, Renate Merker, and Rainer Schaffer. "A Hardware–Software System for Tomographic Reconstruction." Journal of Circuits, Systems and Computers 12, no. 02 (April 2003): 203–29. http://dx.doi.org/10.1142/s021812660300074x.

Abstract:
We present the design of a hardware–software system for the reconstruction of tomographic images. In a systematic approach we developed the parallel processor array, a reconfigurable hardware controller and processing kernel, and the software control up to the integration into a graphical user interface. The processor array, acting as a hardware accelerator, is constructed using theoretical results and methods of application-specific hardware design. The reconfigurability of the system allows one to utilize a much wider realm of algorithms than the three reconstruction algorithms implemented so far. In the paper we discuss the system design at different levels from algorithm transformations to board development.
9

Heidrich, G., C. O. Sahlmann, U. Siefker, H. Luig, C. Werner, E. Brunner, J. Meller, and M. Schünemann. "Improvement of tomographic reconstruction in bone SPECT." Nuklearmedizin 45, no. 01 (2006): 35–40. http://dx.doi.org/10.1055/s-0038-1623932.

Abstract:
Summary Aim: The comparison between iterative reconstruction and filtered backprojection in the reconstruction of bone SPECT in the diagnosis of skeletal metastases. Patients, methods: 47 consecutive patients (vertebral segments: n = 435), with suspected malignancy of the vertebral column, were examined by bone scintigraphy and MRI (maximal interval between the two procedures ± 5 weeks). The SPECT data were reconstructed with an iterative algorithm (ISA) and with filtered backprojection. We defined semiquantitative criteria in order to assess the quality of the tomograms. Conventional reconstruction was performed both by a Wiener filter and a low-pass filter. Iterative reconstruction was performed by the ISA algorithm. The clinical evaluation of the different reconstruction algorithms was performed with MRI as the gold standard. Results: Sensitivity (%): 87.3 (ISA), 86.4 (low-pass), 79.7 (Wiener); specificity (%): 95.3 (ISA), 95 (low-pass), 85.4 (Wiener). The sensitivity of iteratively reconstructed SPECT and low-pass reconstructed SPECT was significantly higher (p < 0.05) compared with the sensitivity of SPECT reconstructed by the Wiener filter. The specificity of the iterative reconstruction ISA and low-pass-filter reconstructed SPECT were significantly higher compared with the SPECT data reconstructed by the Wiener filter. ISA was significantly superior to the Wiener-SPECT with respect to all criteria of quality. Iterative reconstruction was significantly superior to the low-pass-SPECT with respect to 2 of 3 criteria. In addition, the Wiener-SPECT was significantly inferior to the low-pass-SPECT with respect to 2 of 3 criteria. Conclusion: In our series the iterative algorithm ISA was the method of choice in the reconstruction of bone SPECT data. In comparison with conventional algorithms, ISA offers a significantly higher quality of the tomograms and yields a high diagnostic accuracy.
10

Yorkey, T. J., and J. G. Webster. "A comparison of impedance tomographic reconstruction algorithms." Clinical Physics and Physiological Measurement 8, no. 4A (November 1987): 55–62. http://dx.doi.org/10.1088/0143-0815/8/4a/007.

11

Varga, László, Péter Balázs, and Antal Nagy. "Direction-dependency of binary tomographic reconstruction algorithms." Graphical Models 73, no. 6 (November 2011): 365–75. http://dx.doi.org/10.1016/j.gmod.2011.06.006.

12

Ozdiev, Ali. "X-Ray Tomography Simulation Based on Direct Radon Transform." Key Engineering Materials 743 (July 2017): 445–48. http://dx.doi.org/10.4028/www.scientific.net/kem.743.445.

Abstract:
Nowadays, X-ray tomography is one of the most important directions in the development of non-destructive testing methods. Besides the experimental setup needed to conduct X-ray tomographic measurements, it is necessary to have stable and flexible software. In most cases, existing software packages for the reconstruction of tomographic data are not freeware. This makes tomographic experiments less flexible because the source code cannot be modified. This paper explains how to implement one of the important parts of tomographic research, namely experimental data simulation, which makes it possible to further test reconstruction algorithms.
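
A minimal sketch of the kind of forward simulation advocated above, computing a sinogram of a known phantom with a direct (rotate-and-sum) discretization of the Radon transform; the discretization and variable names are illustrative assumptions, not the author's implementation:

import numpy as np
from scipy.ndimage import rotate

def simulate_sinogram(phantom, angles_deg):
    # Toy parallel-beam Radon transform: rotate the phantom and sum
    # along columns to obtain one projection row per angle.
    return np.stack([
        rotate(phantom, -a, reshape=False, order=1).sum(axis=0)
        for a in angles_deg
    ])

# Toy usage: a centred disc phantom and 180 one-degree projections.
n = 128
yy, xx = np.mgrid[:n, :n]
phantom = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2).astype(float)
sinogram = simulate_sinogram(phantom, np.arange(180))

Such simulated sinograms can then be fed to any reconstruction algorithm under test.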
13

Papoutsellis, Evangelos, Evelina Ametova, Claire Delplancke, Gemma Fardell, Jakob S. Jørgensen, Edoardo Pasca, Martin Turner, Ryan Warr, William R. B. Lionheart, and Philip J. Withers. "Core Imaging Library - Part II: multichannel reconstruction for dynamic and spectral tomography." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379, no. 2204 (July 5, 2021): 20200193. http://dx.doi.org/10.1098/rsta.2020.0193.

Abstract:
The newly developed core imaging library (CIL) is a flexible plug and play library for tomographic imaging with a specific focus on iterative reconstruction. CIL provides building blocks for tailored regularized reconstruction algorithms and explicitly supports multichannel tomographic data. In the first part of this two-part publication, we introduced the fundamentals of CIL. This paper focuses on applications of CIL for multichannel data, e.g. dynamic and spectral. We formalize different optimization problems for colour processing, dynamic and hyperspectral tomography and demonstrate CIL’s capabilities for designing state-of-the-art reconstruction methods through case studies and code snapshots. This article is part of the theme issue ‘Synergistic tomographic image reconstruction: part 2’.
14

Tsoumpas, Charalampos, Jakob Sauer Jørgensen, Christoph Kolbitsch, and Kris Thielemans. "Synergistic tomographic image reconstruction: part 2." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379, no. 2204 (July 5, 2021): 20210111. http://dx.doi.org/10.1098/rsta.2021.0111.

Abstract:
This special issue is the second part of a themed issue that focuses on synergistic tomographic image reconstruction and includes a range of contributions in multiple disciplines and application areas. The primary subject of study lies within inverse problems which are tackled with various methods including statistical and computational approaches. This volume covers algorithms and methods for a wide range of imaging techniques such as spectral X-ray computed tomography (CT), positron emission tomography combined with CT or magnetic resonance imaging, bioluminescence imaging and fluorescence-mediated imaging as well as diffuse optical tomography combined with ultrasound. Some of the articles demonstrate their utility on real-world challenges, either medical applications (e.g. motion compensation for imaging patients) or applications in material sciences (e.g. material decomposition and characterization). One of the desired outcomes of the special issues is to bring together different scientific communities which do not usually interact as they do not share the same platforms such as journals and conferences. This article is part of the theme issue ‘Synergistic tomographic image reconstruction: part 2’.
15

Bieberle, M., and U. Hampel. "Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 373, no. 2043 (June 13, 2015): 20140395. http://dx.doi.org/10.1098/rsta.2014.0395.

Abstract:
Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms.
16

Rymarczyk, Tomasz, Grzegorz Kłosowski, Anna Hoła, Jerzy Hoła, Jan Sikora, Paweł Tchórzewski, and Łukasz Skowron. "Historical Buildings Dampness Analysis Using Electrical Tomography and Machine Learning Algorithms." Energies 14, no. 5 (February 27, 2021): 1307. http://dx.doi.org/10.3390/en14051307.

Abstract:
The article deals with the problem of detecting moisture in the walls of historical buildings. As part of the presented research, the following four methods based on mathematical modeling and machine learning were compared: total variation, least-angle regression, elastic net, and artificial neural networks. Based on the simulation data, the systems for the reconstruction of “pixel by pixel” tomographic images were trained. In order to test the reconstructive algorithms obtained during the research, images were generated based on real measurements and simulation cases. The method comparison was performed on the basis of three indicators: mean square error, relative image error, and image correlation coefficient. The above indicators were applied to four selected variants that corresponded to various parts of the walls. The variants differed in the dimensions of the tested wall sections, the number of electrodes used, and the resolution of the 3D image meshes. In all analyzed variants, the best results were obtained using the elastic net algorithm. In addition, all machine learning methods generated better tomographic reconstructions than the classic Total Variation method.
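
A hedged sketch of the 'pixel by pixel' training scheme described above, here using scikit-learn's ElasticNet as the per-pixel regressor (variable names and hyperparameters are illustrative; the paper's actual training setup is not reproduced):

import numpy as np
from sklearn.linear_model import ElasticNet

def train_pixelwise(measurements, images, alpha=1e-3, l1_ratio=0.5):
    # measurements: (n_samples, n_readings) simulated tomographic data
    # images:       (n_samples, n_pixels) flattened ground-truth tomograms
    models = []
    for j in range(images.shape[1]):
        model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=5000)
        model.fit(measurements, images[:, j])   # one regressor per pixel
        models.append(model)
    return models

def reconstruct(models, measurement):
    # Each trained model contributes one pixel of the reconstructed image.
    return np.array([m.predict(measurement.reshape(1, -1))[0] for m in models])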
17

Blumensath, Thomas, and Richard Boardman. "Non-convexly constrained image reconstruction from nonlinear tomographic X-ray measurements." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 373, no. 2043 (June 13, 2015): 20140393. http://dx.doi.org/10.1098/rsta.2014.0393.

Abstract:
The use of polychromatic X-ray sources in tomographic X-ray measurements leads to nonlinear X-ray transmission effects. As these nonlinearities are not normally taken into account in tomographic reconstruction, artefacts occur, which can be particularly severe when imaging objects with multiple materials of widely varying X-ray attenuation properties. In these settings, reconstruction algorithms based on a nonlinear X-ray transmission model become valuable. We here study the use of one such model and develop algorithms that impose additional non-convex constraints on the reconstruction. This allows us to reconstruct volumetric data even when limited measurements are available. We propose a nonlinear conjugate gradient iterative hard thresholding algorithm and show how many prior modelling assumptions can be imposed using a range of non-convex constraints.
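
The non-convexly constrained strategy mentioned above can be sketched, in its generic iterative-hard-thresholding form, as

x^{(k+1)} \;=\; P_{\mathcal{C}}\!\left( x^{(k)} + \mu\, J_{\Phi}\big(x^{(k)}\big)^{\top} \big( b - \Phi(x^{(k)}) \big) \right),

where Φ is the (nonlinear) X-ray transmission model, J_Φ its Jacobian, μ a step size and P_C the projection onto the non-convex constraint set (for example, keeping only the s largest-magnitude coefficients). The paper's nonlinear conjugate gradient variant accelerates this basic scheme; the formula here is a generic sketch, not the authors' exact algorithm.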
18

Kłosowski, Grzegorz, Tomasz Rymarczyk, Tomasz Cieplak, Konrad Niderla, and Łukasz Skowron. "Quality Assessment of the Neural Algorithms on the Example of EIT-UST Hybrid Tomography." Sensors 20, no. 11 (June 11, 2020): 3324. http://dx.doi.org/10.3390/s20113324.

Abstract:
The paper presents the results of research on a hybrid industrial tomograph combining electrical impedance tomography (EIT) and ultrasonic tomography (UST) (EIT-UST), operating on the basis of electrical and ultrasonic data. The emphasis of the research was placed on the algorithmic domain. However, it should be emphasized that all hardware components of the hybrid tomograph, including electronics, sensors and transducers, have been designed and mostly made in the Netrix S.A. laboratory. The test object was a tank filled with water with several dozen percent concentration. As part of the study, the original multiple neural networks system was trained, the characteristic feature of which is the generation of each of the individual pixels of the tomographic image using an independent artificial neural network (ANN), with the input vector for all ANNs being the same. Despite the same measurement vector, each of the ANNs generates its own independent output value for a given tomogram pixel, because, during training, the networks acquire their own respective weights and biases. During the tests, the results of three tomographic methods were compared: EIT, UST and the EIT-UST hybrid. The results confirm that the use of heterogeneous tomographic systems (hybrids) increases the reliability of reconstruction in various measuring cases, which can be used to solve quality problems in managing production processes.
19

Halavataya, K. A., K. V. Kozadaev, and V. S. Sadau. "Adjusting videoendoscopic 3D reconstruction results using tomographic data." Computer Optics 46, no. 2 (April 2022): 246–51. http://dx.doi.org/10.18287/2412-6179-co-910.

Abstract:
Videoendoscopic and tomographic examinations are the two leading medical imaging solutions for detecting, classifying and characterizing a wide array of pathologies and conditions. However, the source information from these types of examination is very different, making it hard to cross-correlate them. The paper proposes a novel algorithm for combining the results of both modalities based on 3D surface reconstruction methods. This approach allows separate parts of two input 3D surfaces to be aligned: the surface obtained by applying a bundle-adjustment-based 3D surface reconstruction algorithm to the endoscopic video sequence, and the surface reconstructed directly from separate tomographic cross-section slice projections with regular density. The proposed alignment method is based on applying local feature extractor and descriptor algorithms to the source surface normal maps. This alignment allows both surfaces to be equalized and normalized relative to each other. Results show that such an adjustment reduces noise, corrects reconstruction artifacts and errors, increases the representative quality of the resulting model and establishes the correctness of the reconstruction for hyperparameter tuning.
20

Israel-Jost, Vincent, Philippe Choquet, and André Constantinesco. "A Prospective Study on Algorithms Adapted to the Spatial Frequency in Tomography." International Journal of Biomedical Imaging 2006 (2006): 1–6. http://dx.doi.org/10.1155/ijbi/2006/34043.

Abstract:
The use of iterative algorithms in tomographic reconstruction always leads to a frequency-adapted rate of convergence in that low frequencies are accurately reconstructed after a few iterations, while high frequencies sometimes require many more computations. In this paper, we propose to build frequency-adapted (FA) algorithms based on a condition of incomplete backprojection and propose an FA simultaneous algebraic reconstruction technique (FA-SART) algorithm as an example. The results obtained with the FA-SART algorithm demonstrate a very fast convergence on a highly detailed phantom when compared to the original SART algorithm. Though the use of such an FA algorithm may seem difficult, we specify in which case it is relevant and propose several ways to improve the reconstruction process with FA algorithms.
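
For reference, the standard SART update that FA-SART modifies (textbook form; the frequency-adapted incomplete-backprojection condition of the paper is not shown here):

x_j^{(k+1)} \;=\; x_j^{(k)} \;+\; \frac{\lambda}{\sum_{i} a_{ij}} \sum_{i} a_{ij}\, \frac{ b_i - \sum_{l} a_{il}\, x_l^{(k)} }{ \sum_{l} a_{il} },

where a_{ij} are the elements of the projection matrix, b_i the measured ray sums and λ a relaxation parameter.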
21

Li, Sheng, and Ke Du. "A minimum curvature algorithm for tomographic reconstruction of atmospheric chemicals based on optical remote sensing." Atmospheric Measurement Techniques 14, no. 11 (November 25, 2021): 7355–68. http://dx.doi.org/10.5194/amt-14-7355-2021.

Abstract:
Optical remote sensing (ORS) combined with the computerized tomography (CT) technique is a powerful tool to retrieve a two-dimensional concentration map over an area under investigation. Whereas medical CT usually uses a beam number of hundreds of thousands, ORS-CT usually uses a beam number of dozens, thus severely limiting the spatial resolution and the quality of the reconstructed map. The smoothness a priori information is, therefore, crucial for ORS-CT. Algorithms that produce smooth reconstructions include smooth basis function minimization, grid translation and multiple grid (GT-MG), and low third derivative (LTD), among which the LTD algorithm is promising because of the fast speed. However, its theoretical basis must be clarified to better understand the characteristics of its smoothness constraints. Moreover, the computational efficiency and reconstruction quality need to be improved for practical applications. This paper first treated the LTD algorithm as a special case of the Tikhonov regularization that uses the approximation of the third-order derivative as the regularization term. Then, to seek more flexible smoothness constraints, we successfully incorporated the smoothness seminorm used in variational interpolation theory into the reconstruction problem. Thus, the smoothing effects can be well understood according to the close relationship between the variational approach and the spline functions. Furthermore, other algorithms can be formulated by using different seminorms. On the basis of this idea, we propose a new minimum curvature (MC) algorithm by using a seminorm approximating the sum of the squares of the curvature, which reduces the number of linear equations to half that in the LTD algorithm. The MC algorithm was compared with the non-negative least square (NNLS), GT-MG, and LTD algorithms by using multiple test maps. The MC algorithm, compared with the LTD algorithm, shows similar performance in terms of reconstruction quality but requires only approximately 65 % the computation time. It is also simpler to implement than the GT-MG algorithm because it directly uses high-resolution grids during the reconstruction process. Compared with the traditional NNLS algorithm, it shows better performance in the following three aspects: (1) the nearness of reconstructed maps is improved by more than 50 %, (2) the peak location accuracy is improved by 1–2 m, and (3) the exposure error is improved by 2 to 5 times. Testing results indicated the effectiveness of the new algorithm according to the variational approach. More specific algorithms could be similarly further formulated and evaluated. This study promotes the practical application of ORS-CT mapping of atmospheric chemicals.
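
The Tikhonov viewpoint described above can be written generically (a sketch; the exact seminorms used for the LTD and MC algorithms are defined in the paper) as

\hat{x} \;=\; \arg\min_{x \ge 0}\; \| A x - b \|_2^2 \;+\; \lambda^2 \| L x \|_2^2,

where A encodes the beam-path geometry, b the path-integrated concentrations, and L a finite-difference operator approximating either the third derivative (LTD) or the curvature (MC) of the concentration map.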
22

Perkins, G. A., C. W. Renken, S. J. Young, S. Lindsey, M. H. Ellisman, and T. G. Frey. "Use of High-Performance Computing Algorithms in Combination With Parallel Computing for Tomography." Microscopy and Microanalysis 3, S2 (August 1997): 1123–24. http://dx.doi.org/10.1017/s1431927600012502.

Abstract:
The electron microscope is essential for resolving complex biological structures, such as mitochondria, which are too small to be viewed in detail with the light microscope. In contrast to a conventional instrument, the High-Voltage Electron Microscope (HVEM) located at the National Center for Microscopy and Imaging Research (NCMIR) can obtain images from relatively thick specimens that contain substantial three-dimensional structure. Though a single image acquired with the HVEM represents a projection through the specimen, tomographic methods can be applied to a set of images acquired from different orientations to derive a three-dimensional representation of its biological structure. Tomography requires extensive computation and considerable processing time on conventional workstations in order to reconstruct the typically large HVEM volumes from the tilt series. In order to expedite tomographic processing, we have implemented both the commonly used single-axis tilt, R-weighted backprojection algorithm and two iterative reconstruction methods, algebraic reconstruction (ART) and simultaneous iterative reconstruction (SIRT), on the massively parallel Intel Paragon at the San Diego Supercomputer Center.
23

Natterer, Frank. "Numerical methods in tomography." Acta Numerica 8 (January 1999): 107–41. http://dx.doi.org/10.1017/s0962492900002907.

Abstract:
In this article we review the image reconstruction algorithms used in tomography. We restrict ourselves to the standard problems in the reconstruction of a function from line or plane integrals as they occur in X-ray tomography, nuclear medicine, magnetic resonance imaging, and electron microscopy. Nonstandard situations, such as incomplete data, unknown orientations, local tomography, and discrete tomography are not dealt with. Nor do we treat nonlinear tomographic techniques such as impedance, ultrasound, and near-infrared imaging.
24

Mao, Yu, Benjamin P. Fahimian, Stanley J. Osher, and Jianwei Miao. "Development and Optimization of Regularized Tomographic Reconstruction Algorithms Utilizing Equally-Sloped Tomography." IEEE Transactions on Image Processing 19, no. 5 (May 2010): 1259–68. http://dx.doi.org/10.1109/tip.2009.2039660.

25

Ellisman, Mark H., Stephen J. Young, G. Y. Fan, Guy Perkins, Steve Lamont, and Maryann E. Martone. "Highlights of Selected Microscopy Research Resource Activities at San Diego." Microscopy and Microanalysis 3, S2 (August 1997): 275–76. http://dx.doi.org/10.1017/s1431927600008266.

Abstract:
The intermediate high-voltage electron microscope (IVEM) located at the National Center for Microscopy and Imaging Research at San Diego (NCMIR) can image relatively thick specimens that contain substantial three-dimensional (3-D) structure. Electron tomography is an important tool used at NCMIR for deriving 3-D cellular and subcellular structure from IVEM images. Reconstruction algorithms commonly used in electron tomography include weighted back-projection and iterative algebraic reconstruction techniques such as ART and SIRT. Improvements in reconstruction quality are possible using the iterative algorithms. Because these algorithms are computationally intensive, we have ported them to massively parallel computers at the San Diego Supercomputer Center, reducing the computation time over that required with workstation-level machines. The quality of tomographic data for the 3-D reconstruction of biological structures is also being enhanced by NCMIR projects to improve the microscope. We have designed and constructed special electron optics and microscope control systems for the JEOL 4000EX.
26

Deng, Junjing, Yuan Hung Lo, Marcus Gallagher-Jones, Si Chen, Alan Pryor, Qiaoling Jin, Young Pyo Hong, et al. "Correlative 3D x-ray fluorescence and ptychographic tomography of frozen-hydrated green algae." Science Advances 4, no. 11 (November 2018): eaau4548. http://dx.doi.org/10.1126/sciadv.aau4548.

Abstract:
Accurate knowledge of elemental distributions within biological organisms is critical for understanding their cellular roles. The ability to couple this knowledge with overall cellular architecture in three dimensions (3D) deepens our understanding of cellular chemistry. Using a whole, frozen-hydrated Chlamydomonas reinhardtii cell as an example, we report the development of 3D correlative microscopy through a combination of simultaneous cryogenic x-ray ptychography and x-ray fluorescence microscopy. By taking advantage of a recently developed tomographic reconstruction algorithm, termed GENeralized Fourier Iterative REconstruction (GENFIRE), we produce high-quality 3D maps of the unlabeled alga’s cellular ultrastructure and elemental distributions within the cell. We demonstrate GENFIRE’s ability to outperform conventional tomography algorithms and to further improve the reconstruction quality by refining the experimentally intended tomographic angles. As this method continues to advance with brighter coherent light sources and more efficient data handling, we expect correlative 3D x-ray fluorescence and ptychographic tomography to be a powerful tool for probing a wide range of frozen-hydrated biological specimens, ranging from small prokaryotes such as bacteria, algae, and parasites to large eukaryotes such as mammalian cells, with applications that include understanding cellular responses to environmental stimuli and cell-to-cell interactions.
27

de Lima, Camila, and Elias Salomão Helou. "Fast projection/backprojection and incremental methods applied to synchrotron light tomographic reconstruction." Journal of Synchrotron Radiation 25, no. 1 (January 1, 2018): 248–56. http://dx.doi.org/10.1107/s1600577517015715.

Abstract:
Iterative methods for tomographic image reconstruction have the computational cost of each iteration dominated by the computation of the (back)projection operator, which takes roughly O(N³) floating point operations (flops) for N×N pixel images. Furthermore, classical iterative algorithms may take too many iterations in order to achieve acceptable images, thereby making the use of these techniques impractical for high-resolution images. Techniques have been developed in the literature in order to reduce the computational cost of the (back)projection operator to O(N² log N) flops. Also, incremental algorithms have been devised that reduce by an order of magnitude the number of iterations required to achieve acceptable images. The present paper introduces an incremental algorithm with a cost of O(N² log N) flops per iteration and applies it to the reconstruction of very large tomographic images obtained from synchrotron light illuminated data.
28

Brockhauser, Sandor, Marco Di Michiel, John E. McGeehan, Andrew A. McCarthy, and Raimond B. G. Ravelli. "X-ray tomographic reconstruction of macromolecular samples." Journal of Applied Crystallography 41, no. 6 (October 11, 2008): 1057–66. http://dx.doi.org/10.1107/s002188980802935x.

Abstract:
The anomalous scattering properties of innate sulfur for proteins and phosphorus for DNA and RNA can be used to solve the phase problem in macromolecular crystallography (MX) via the single-wavelength anomalous dispersion method (SAD). However, this method, which is carried out at longer X-ray wavelengths (1.5–2.5 Å), is still not a routine tool in MX. The increased absorption from both air and sample associated with the use of longer X-ray wavelengths presents a key difficulty. The absorption can be corrected for through empirical algorithms, provided truly redundant data are available. Unfortunately, weakly diffracting macromolecular crystals suffer from radiation damage, resulting in a dose-dependent non-isomorphism which violates the assumption upon which these empirical algorithms are based. In this report, X-ray microtomography is used to reconstruct the three-dimensional shapes of vitrified macromolecular crystals including the surrounding solvent and sample holder. The setup can be integrated within an MX beamline environment and exploits both absorption and phase contrast. The dose needed for the tomographic measurements could be low enough to allow the technique to be used for crystal integrity characterization and alignment. X-ray tomography has some major benefits compared with the optical-light-based crystal alignment protocols currently used.
29

Banasiak, Robert, Radosław Wajman, Tomasz Jaworski, Paweł Fiderek, Paweł Kapusta, and Dominik Sankowski. "TWO-PHASE FLOW REGIME THREE-DIMENSONAL VISUALIZATION USING ELECTRICAL CAPACITANCE TOMOGRAPHY – ALGORITHMS AND SOFTWARE." Informatics Control Measurement in Economy and Environment Protection 7, no. 1 (March 30, 2017): 11–16. http://dx.doi.org/10.5604/01.3001.0010.4575.

Abstract:
This paper presents software for the comprehensive processing and visualization of 2D and 3D electrical tomography data. The system, named TomoKIS Studio, was developed within the framework of the DENIDIA international research project and improved within the framework of Polish Ministry of Science and Higher Education project no. 4664/B/T02/2010/38. This software is unique worldwide because it simultaneously integrates the processes of tomographic data acquisition, numerical FEM modeling and tomographic image reconstruction. The software can be adapted to specific industrial applications, particularly to the monitoring and diagnosis of two-phase flows. The software architecture is composed of independent modules. Their combination offers calibration, configuration and full-duplex communication with any tomographic acquisition system with a known and open communication protocol. The other major features are: online data acquisition and processing, online and offline 2D/3D linear and nonlinear image reconstruction and visualization, as well as raw data and tomogram processing. Another important ability is 2D/3D ECT sensor construction using FEM modeling. The presented software is supported by multi-core GPU technology and parallel computing using Nvidia CUDA technology.
30

Tsoumpas, Charalampos, Jakob Sauer Jørgensen, Christoph Kolbitsch, and Kris Thielemans. "Synergistic tomographic image reconstruction: part 1." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379, no. 2200 (May 10, 2021): 20200189. http://dx.doi.org/10.1098/rsta.2020.0189.

Abstract:
This special issue focuses on synergistic tomographic image reconstruction in a range of contributions in multiple disciplines and various application areas. The topic of image reconstruction covers substantial inverse problems (Mathematics) which are tackled with various methods including statistical approaches (e.g. Bayesian methods, Monte Carlo) and computational approaches (e.g. machine learning, computational modelling, simulations). The issue is separated in two volumes. This volume focuses mainly on algorithms and methods. Some of the articles will demonstrate their utility on real-world challenges, either medical applications (e.g. cardiovascular diseases, proton therapy planning) or applications in material sciences (e.g. material decomposition and characterization). One of the desired outcomes of the special issue is to bring together different scientific communities which do not usually interact as they do not share the same platforms (such as journals and conferences). This article is part of the theme issue ‘Synergistic tomographic image reconstruction: part 1’.
31

McDade, Ian C., and Edward J. Llewellyn. "Inversion techniques for recovering two-dimensional distributions of auroral emission rates from tomographic rocket photometer measurements." Canadian Journal of Physics 69, no. 8-9 (August 1, 1991): 1059–68. http://dx.doi.org/10.1139/p91-164.

Abstract:
In this paper we demonstrate how the spatial distribution of optical emission rates within an auroral arc may be recovered from rocket photometer measurements made in a tomographic spin scan mode. We describe the tomographic inversion procedures required to recover this information and the implementation of two inversion algorithms that are particularly well suited for dealing with the problem of noise in the observational data. These algorithms are based upon the algebraic reconstruction technique and utilize "least-squares" and "maximum-probability" iterative relaxation methods. The performance of the inversion algorithms and the limitations of the rocket tomography technique are assessed using various sets of simulated rocket measurements that were generated from "known" auroral emission-rate distributions. The simulations are used to investigate how the quality of the tomographic recovery may be influenced by various factors such as noise in the data, rocket penetration of the auroral form, background sources of emission, smearing due to the photometer field of view, and temporal variations in the auroral form.
32

Sparling, Chris, and Dave Townsend. "Tomographic reconstruction techniques optimized for velocity-map imaging applications." Journal of Chemical Physics 157, no. 11 (September 21, 2022): 114201. http://dx.doi.org/10.1063/5.0101789.

Abstract:
Examples of extracting meaningful information from image projection data using tomographic reconstruction techniques can be found in many areas of science. Within the photochemical dynamics community, tomography allows for complete three-dimensional (3D) charged particle momentum distributions to be reconstructed following a photodissociation or photoionization event. This permits highly differential velocity- and angle-resolved measurements to be made simultaneously. However, the generalized tomographic reconstruction strategies typically adopted for use with photochemical imaging—based around the Fourier-slice theorem and filtered back-projection algorithms—are not optimized for these specific types of problems. Here, we discuss pre-existing alternative strategies—namely, the simultaneous iterative reconstruction technique and Hankel Transform Reconstruction (HTR)—and introduce them in the context of velocity-map imaging applications. We demonstrate the clear advantages they afford, and how they can perform considerably better than approaches commonly adopted at present. Most notably, with HTR we can set a bound on the minimum number of projections required to reliably reconstruct 3D photoproduct distributions. This bound is significantly lower than what is currently accepted and will help make tomographic imaging far more accessible and efficient for many experimentalists working in the field of photochemical dynamics.
33

Krawczyk, Rafał, Piotr Kolasiński, Paweł Linczuk, Wojciech Zabołotny, Krzysztof Poźniak, Paweł Zienkiewicz, Ryszard Romaniuk, et al. "INTO THE FAST TOMOGRAPHIC POSTPROCESSING IN TOKAMAKS." Informatics Control Measurement in Economy and Environment Protection 7, no. 3 (September 30, 2017): 15–18. http://dx.doi.org/10.5604/01.3001.0010.5207.

Abstract:
The collaboration of the authors has led to the implementation of advanced and fast systems for diagnostics of plasma content in tokamaks. During further development of these systems, it is planned to add new functionalities, in particular algorithms of tomographic reconstruction to obtain information on the three-dimensional distribution of plasma impurities. In the article, the idea of tomographic reconstruction is introduced and issues of performance and adequate hardware selection are presented.
34

Hofmann, Jürgen, and Robert Zboray. "An In-House Cone-Beam Tomographic Reconstruction Package for Laboratory X-ray Phase-Contrast Imaging." Applied Sciences 12, no. 3 (January 28, 2022): 1430. http://dx.doi.org/10.3390/app12031430.

Abstract:
Phase-contrast, and in general multi-modal, X-ray micro-tomography has proven to be very useful for low-density, low-attenuation samples, enabling much better contrast than its attenuation-based counterpart. Therefore, it is increasingly applied in the bio- and life sciences, which primarily deal with such samples. Although there is a plethora of literature regarding phase-retrieval algorithms, access to implementations of those algorithms is relatively limited, and very few packages combining phase-retrieval methods with the full tomographic reconstruction pipeline are available. This is especially the case for laboratory-based phase-contrast imaging, which typically features cone-beam geometry. We present here an in-house cone-beam tomographic reconstruction package for laboratory X-ray phase-contrast imaging. It covers different phase-contrast techniques and phase-retrieval methods. The paper explains their implementation and integration in the filtered back-projection chain. Their functionality and efficiency are demonstrated through applications on a few dedicated samples.
35

Montisci, Augusto, Sara Carcangiu, Giuliana Sias, Barbara Cannas, and Alessandra Fanni. "A Real Time Bolometer Tomographic Reconstruction Algorithm in Nuclear Fusion Reactors." Mathematics 9, no. 11 (May 24, 2021): 1186. http://dx.doi.org/10.3390/math9111186.

Abstract:
In tokamak nuclear fusion reactors, one of the main issues is to know the total emission of radiation, which is mandatory for understanding the plasma physics and is very useful for monitoring and controlling the plasma evolution. This radiation can be measured by means of a bolometer system that consists of a number of elements sensitive to the integral of the radiation along straight lines crossing the plasma. By placing the sensors in such a way as to have families of crossing lines, sophisticated tomographic inversion algorithms allow the radiation tomography to be reconstructed in the 2D poloidal cross-section of the plasma. In tokamaks, the number of projection cameras is often quite limited, resulting in a mathematical inversion problem that is very ill-conditioned, so that it is usually solved by means of a grid-based, iterative constrained optimization procedure whose convergence time is not suitable for real-time requirements. In this paper, to illustrate the method, an assumption that is not valid in general is made on the correlation among the grid elements, based on the statistical distribution of the radiation emissivity over a set of tomographic reconstructions performed off-line. Then, a regularization procedure is carried out, which merges highly correlated grid elements, providing a square coefficient matrix with a sufficiently low condition number. This matrix, which is inverted off-line once and for all, can be multiplied by the actual bolometer measurements, returning the tomographic reconstruction with calculations suitable for real-time application. The proposed algorithm is applied, in this paper, to a synthetic case study.
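
A minimal sketch of the offline-inversion / online-multiplication pattern described above, with a regularized pseudo-inverse standing in for the paper's correlation-based merging of grid elements (matrix names are illustrative):

import numpy as np

def precompute_inverse(G, lam=1e-2):
    # Offline, once: G maps grid emissivities to line-integrated bolometer
    # signals; return a regularized pseudo-inverse for real-time use.
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T)

def reconstruct(G_inv, bolometer_signals):
    # Online, every acquisition cycle: a single matrix-vector product.
    return G_inv @ bolometer_signals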
36

Bondarenko, A. S., J. Aviles, A. Alexander, A. Korepanov, and R. Mendoza. "Tomographic and centroid reconstructions of plasma emission on C-2W via enhanced 300-channel bolometry system." Review of Scientific Instruments 93, no. 10 (October 1, 2022): 103517. http://dx.doi.org/10.1063/5.0101656.

Abstract:
The C-2W experimental device at TAE Technologies utilizes neutral beam injection and edge biasing to sustain long-lived, stable field reversed configuration (FRC) plasma. An ongoing effort is under way to optimize the electrode biasing system, which provides boundary control to stabilize the FRC. To this end, tomography offers a powerful and non-invasive technique as tomographic reconstruction of the FRC emission profile provides an important assessment of global stability. Recently, a new signal acquisition system was implemented on a bolometer array dedicated to tomography on C-2W, significantly enhancing the signal-to-noise of the collected data. The array consists of 300 simultaneously digitized photodiode channels that respond to a broad range of wavelengths, from soft x-ray to near-infrared, as well as energetic particles, yielding 180 unique lines of sight that intersect a toroidal plane of the FRC near the mid-plane. Utilizing the collected photo-signals from a set of plasma discharges in which the electrode biasing was intentionally terminated mid-shot, time-resolved reconstruction of the plasma emissivity is achieved via pixel-based 1D and 2D tomographic algorithms, revealing sharply annular profiles with a clear magnetohydrodynamic (MHD) mode structure. In addition, reconstruction of the plasma center-of-emission trajectories via a centroid algorithm applied to the same set of discharges demonstrates a cyclical plasma wobble. Crucially, both the tomography reconstruction and centroid reconstruction indicate an n = 1 toroidal mode that reverses from the electron diamagnetic direction to the ion diamagnetic direction and grows in amplitude after bias termination, qualitatively consistent with the expected stabilizing effect of electrodes.
37

Helou, E. S., M. V. W. Zibetti, and E. X. Miqueles. "Superiorization of incremental optimization algorithms for statistical tomographic image reconstruction." Inverse Problems 33, no. 4 (March 1, 2017): 044010. http://dx.doi.org/10.1088/1361-6420/33/4/044010.

38

Xu, Fang, and K. Mueller. "Accelerating popular tomographic reconstruction algorithms on commodity PC graphics hardware." IEEE Transactions on Nuclear Science 52, no. 3 (June 2005): 654–63. http://dx.doi.org/10.1109/tns.2005.851398.

39

Kasai, Ryosuke, Yusaku Yamaguchi, Takeshi Kojima, Omar M. Abou Al-Ola, and Tetsuya Yoshinaga. "Noise-Robust Image Reconstruction Based on Minimizing Extended Class of Power-Divergence Measures." Entropy 23, no. 8 (July 31, 2021): 1005. http://dx.doi.org/10.3390/e23081005.

Abstract:
The problem of tomographic image reconstruction can be reduced to an optimization problem of finding unknown pixel values subject to minimizing the difference between the measured and forward projections. Iterative image reconstruction algorithms provide significant improvements over transform methods in computed tomography. In this paper, we present an extended class of power-divergence measures (PDMs), which includes a large set of distance and relative entropy measures, and propose an iterative reconstruction algorithm based on the extended PDM (EPDM) as an objective function for the optimization strategy. For this purpose, we introduce a system of nonlinear differential equations whose Lyapunov function is equivalent to the EPDM. Then, we derive an iterative formula by multiplicative discretization of the continuous-time system. Since the parameterized EPDM family includes the Kullback–Leibler divergence, the resulting iterative algorithm is a natural extension of the maximum-likelihood expectation-maximization (MLEM) method. We conducted image reconstruction experiments using noisy projection data and found that the proposed algorithm outperformed MLEM and could reconstruct high-quality images that were robust to measured noise by properly selecting parameters.
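
For context, the classical MLEM update that the proposed EPDM-based iteration generalizes (standard form; the extended power-divergence iteration itself is derived in the paper):

x_j^{(k+1)} \;=\; \frac{x_j^{(k)}}{\sum_{i} a_{ij}} \sum_{i} a_{ij}\, \frac{ y_i }{ \big( A x^{(k)} \big)_i },

where y_i are the measured projections and a_{ij} the elements of the system matrix A.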
40

Bakirov, Vadim F., and Ronald A. Kline. "Diffusion-Based Thermal Tomography." Journal of Heat Transfer 127, no. 11 (January 26, 2005): 1276–79. http://dx.doi.org/10.1115/1.2039115.

Abstract:
Thermal imaging is one of the fastest growing areas of nondestructive testing. The basic idea is to apply heat to a material and study the way the temperature changes within the material to learn about its composition. The technique is rapid, relatively inexpensive, and, most importantly, has a wide coverage area with a single experimental measurement. One of the main research goals in thermal imaging has been to improve flaw definition through advanced image processing. Tomographic imaging is a very attractive way to achieve this goal. Although there have been some attempts to implement tomographic principles for thermal imaging, they have been only marginally successful. One possible reason for this is that conventional tomography algorithms rely on wave propagation (either electromagnetic or acoustic) and are inherently unsuitable for thermal diffusion without suitable modifications. In this research program, a modified approach to thermal imaging is proposed that fully accounts for diffusion phenomena in a tomographic imaging algorithm. Here, instead of the large area source used in conventional thermal imaging applications, a raster scanned point source is employed in order to provide the well-defined source-receiver positions required for tomographic imaging. An algorithm for the forward propagation problem, based on the Galerkin finite element method in connection with the corresponding weak formulation for the thermal diffusion is considered. A thermal diffusion modified version of the algebraic reconstruction technique (ART) is used for image reconstruction. Examples of tomographic images are presented from synthetically generated data to illustrate the utility of the approach.
41

Li, Xianyu, Yulin He, and Qun Hua. "Application of Computed Tomographic Image Reconstruction Algorithms Based on Filtered Back-Projection in Diagnosis of Bone Trauma Diseases." Journal of Medical Imaging and Health Informatics 10, no. 5 (May 1, 2020): 1219–24. http://dx.doi.org/10.1166/jmihi.2020.3036.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Objective: To improve the diagnostic rate of bone trauma diseases by applying an image reconstruction algorithm based on filtered back-projection to CT images of bone trauma. Methods: Sixty-three patients with bone trauma at our hospital were selected. After hospitalization, satisfactory localization images were acquired for all 63 patients so that the lesions shown on the localization images approached, or even exceeded, the resolution of conventional X-ray films. After scanning, post-processing workstation software was used to apply the image reconstruction algorithm based on filtered back-projection. Finally, the diagnostic accuracy of plain X-ray films, conventional CT images, and images reconstructed with filtered back-projection was compared statistically. Results: Among the 63 cases of bone trauma, 48 were detected by routine cross-sectional CT examination. The filtered back-projection reconstruction algorithm was applied to all cases, including wrist and ulnar trauma examinations. The three-dimensional images display the length, direction, and shape of the articular surface and fracture ends, as well as the size, spatial position, and placement of free small bone fragments relative to the bone trauma, stereoscopically and accurately. Discussion: The experiments show that when the projection data are complete, the filtered back-projection algorithm reconstructs the image well, and the new filtering function gives the best overall evaluation. In practice, however, the projection data are often incomplete, sometimes severely so, and an iterative reconstruction algorithm must then be adopted. Whichever algorithm is used, improving reconstruction speed, accuracy, and image quality remains the direction of future work. The FBP method is the basic algorithm for image reconstruction, underlies many other algorithms, and is widely used in medical CT and other fields. Conclusion: The improved CT image reconstruction algorithm based on filtered back-projection has high application value in the diagnosis of bone trauma diseases. By comparing the three indexes of serial processing time, information transfer interface, and image noise, suspicious sites of bone trauma can be diagnosed clearly. With the popularization of CT and the emergence of spiral CT in recent years, the approach provides good guidance for clinical diagnosis and treatment.
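A minimal parallel-beam filtered back-projection sketch, assuming a sinogram of shape (angles, detectors) and ignoring the fan-beam and spiral-CT details of the clinical workflow, illustrates the FBP principle the study relies on; the Ram-Lak filter and linear interpolation are standard choices, not specifics of the paper.

```python
import numpy as np

def fbp(sinogram, angles_deg):
    """Minimal parallel-beam filtered back-projection: ramp-filter each
    projection in Fourier space, then smear it back across the image
    along its acquisition angle (linear interpolation on the detector)."""
    n_angles, n_det = sinogram.shape
    ramp = np.abs(np.fft.fftfreq(n_det))                      # Ram-Lak filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    recon = np.zeros((n_det, n_det))
    centre = n_det // 2
    ys, xs = np.mgrid[0:n_det, 0:n_det] - centre              # pixel coordinates
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = xs * np.cos(theta) + ys * np.sin(theta) + centre  # detector position
        t = np.clip(t, 0, n_det - 1.001)
        t0 = np.floor(t).astype(int)
        w = t - t0
        recon += (1 - w) * proj[t0] + w * proj[t0 + 1]
    return recon * np.pi / (2 * n_angles)
```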
42

Kania, Konrad, Tomasz Rymarczyk, Mariusz Mazurek, Sylwia Skrzypek-Ahmed, Mirosław Guzik, and Piotr Oleszczuk. "Optimisation of Technological Processes by Solving Inverse Problem through Block-Wise-Transform-Reduction Method Using Open Architecture Sensor Platform." Energies 14, no. 24 (December 9, 2021): 8295. http://dx.doi.org/10.3390/en14248295.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
This paper presents an open architecture for a sensor platform for the collection and processing of measurement data and for image reconstruction from those data. The paper focuses on ultrasound tomography with block-wise-transform-reduction image reconstruction. The advantage of the presented solution, which is part of the project “Next-generation industrial tomography platform for process diagnostics and control”, is the ability to analyze spatial data and process it quickly. The developed solution includes industrial tomography, big data, smart sensors, computational intelligence algorithms, and cloud computing. Along with the measurement platform, we validate methods that incorporate image compression into the reconstruction process, speeding up computation and simplifying the regularisation of the inverse tomography problem. The algorithm is based on a discrete transform and compresses each block of the image separately. According to the experiments, this solution is much more efficient than deterministic methods. A feature of the method is that it can be incorporated directly into the compression of the reconstructed image. Thus, the proposed solution allows tomographic sensor-based process control, multidimensional industrial process control, and big data analysis.
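The block-wise transform reduction idea can be illustrated with a simple per-block DCT truncation; the block size, the number of retained coefficients, and the choice of the DCT itself are assumptions made for illustration, since the abstract only states that a discrete transform is applied to each block separately.

```python
import numpy as np
from scipy.fft import dctn, idctn

def blockwise_reduce(image, block=8, keep=10):
    """Per-block transform reduction: 2-D DCT of each block, keep only the
    `keep` largest-magnitude coefficients, and transform back."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            coeffs = dctn(image[r:r + block, c:c + block], norm='ortho')
            thresh = np.sort(np.abs(coeffs).ravel())[-keep]   # keep-th largest magnitude
            coeffs[np.abs(coeffs) < thresh] = 0.0             # drop weak coefficients
            out[r:r + block, c:c + block] = idctn(coeffs, norm='ortho')
    return out
```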
43

Zioga, M., A. Nikopoulou, M. Alexandridi, D. Maintas, M. Mikeli, A. N. Rapsomanikis, and E. Stiliaris. "Image Reconstruction in the Positron Emission Tomography." HNPS Proceedings 20 (December 1, 2012): 73. http://dx.doi.org/10.12681/hnps.2490.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Positron Emission Tomography (PET) has become a valuable tool with a broad spectrum of clinical applications in nuclear imaging. PET scanners can collect in vivo information from positron radiotracer distributions, which is further reconstructed to a tomographic image with the help of well established analytical or iterative algorithms. In this current work, an innovative PET image reconstruction method from raw data based on a simple mathematical model is presented. The developed technique utilizes the accumulated density distribution in a predefined voxelized volume of interest. This distribution is calculated by intersecting and weighting the two-gamma annihilation line with the specified voxels. In order to test the efficiency of the new algorithm, GEANT4/GATE simulation studies were performed. In these studies, a cylindrical PET scanner was modeled and the photon interaction points are validated on an accurate physical basis. An appropriate cylindrical phantom with different positron radiotracers was used and the reconstructed results were compared to the original phantom.
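A toy two-dimensional version of the accumulation idea, which replaces exact line-voxel intersection weights with dense sampling along each line of response, might look like the following; the unit-square coordinates, grid size, and sampling density are illustrative assumptions rather than details from the paper.

```python
import numpy as np

def accumulate_lors(lors, grid=64, n_samples=256):
    """Each line of response (LOR), given by its two detection points in the
    unit square, deposits weight into the pixels it crosses; exact line-voxel
    intersection lengths are approximated by dense sampling along the line."""
    img = np.zeros((grid, grid))
    for (x0, y0), (x1, y1) in lors:
        ts = np.linspace(0.0, 1.0, n_samples)
        xs = x0 + ts * (x1 - x0)
        ys = y0 + ts * (y1 - y0)
        ix = np.clip((xs * grid).astype(int), 0, grid - 1)
        iy = np.clip((ys * grid).astype(int), 0, grid - 1)
        np.add.at(img, (iy, ix), 1.0 / n_samples)             # per-sample weight
    return img

# Example: two crossing LORs produce a hot spot where they intersect.
image = accumulate_lors([((0.1, 0.1), (0.9, 0.9)), ((0.1, 0.9), (0.9, 0.1))])
```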
44

Derevtsov, E. Yu, S. V. Maltseva, and I. E. Svetov. "MATHEMATICAL MODELS AND ALGORITHMS FOR RECONSTRUCTION OF SINGULAR SUPPORT OF FUNCTIONS AND VECTOR FIELDS BY TOMOGRAPHIC DATA." Eurasian Journal of Mathematical and Computer Applications 3, no. 1 (2015): 4–44. http://dx.doi.org/10.32523/2306-3172-2015-3-4-4-44.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Kasai, Ryosuke, Yusaku Yamaguchi, Takeshi Kojima, and Tetsuya Yoshinaga. "Tomographic Image Reconstruction Based on Minimization of Symmetrized Kullback-Leibler Divergence." Mathematical Problems in Engineering 2018 (July 17, 2018): 1–9. http://dx.doi.org/10.1155/2018/8973131.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Iterative reconstruction (IR) algorithms based on the principle of optimization are known for producing better reconstructed images in computed tomography. In this paper, we present an IR algorithm based on minimizing a symmetrized Kullback-Leibler divergence (SKLD), also called Jeffreys’ J-divergence. The SKLD is guaranteed to decrease monotonically over the iterative steps, using a continuous dynamical method for consistent inverse problems. Specifically, we construct an autonomous differential equation for which the proposed iterative formula gives a first-order numerical discretization and demonstrate the stability of a desired solution using Lyapunov’s theorem. We describe a hybrid Euler method combined with additive and multiplicative calculus for constructing an effective and robust discretization method, thereby enabling us to obtain an approximate solution to the differential equation. We performed experiments and found that the IR algorithm derived from the hybrid discretization achieved high performance.
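For concreteness, the symmetrized Kullback-Leibler (Jeffreys’ J) divergence that the algorithm minimizes can be written in a few lines; the small epsilon regularization for zero entries is an illustrative detail, not part of the paper.

```python
import numpy as np

def skld(p, q, eps=1e-12):
    """Symmetrized Kullback-Leibler (Jeffreys' J) divergence between two
    nonnegative vectors: KL(p||q) + KL(q||p) = sum (p - q) * (log p - log q)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum((p - q) * (np.log(p) - np.log(q))))

# The measure is symmetric and zero only when the two vectors coincide.
assert skld([1.0, 2.0], [1.0, 2.0]) < 1e-9
```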
46

Brown, Richard, Christoph Kolbitsch, Claire Delplancke, Evangelos Papoutsellis, Johannes Mayer, Evgueni Ovtchinnikov, Edoardo Pasca, et al. "Motion estimation and correction for simultaneous PET/MR using SIRF and CIL." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379, no. 2204 (July 5, 2021): 20200208. http://dx.doi.org/10.1098/rsta.2020.0208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
SIRF is a powerful PET/MR image reconstruction research tool for processing data and developing new algorithms. In this research, new developments to SIRF are presented, with a focus on motion estimation and correction. SIRF’s recent inclusion of the adjoint of the resampling operator allows gradient propagation through resampling, enabling the motion-compensated image reconstruction (MCIR) technique. Another enhancement enables the registration and resampling of complex images, suitable for MRI. Furthermore, SIRF’s integration with the optimization library CIL enables the use of novel algorithms. Finally, SPM is now supported, in addition to NiftyReg, for registration. Results of MR and PET MCIR reconstructions are presented, using FISTA and PDHG, respectively. These demonstrate the advantages of incorporating motion correction and variational and structural priors. This article is part of the theme issue ‘Synergistic tomographic image reconstruction: part 2’.
47

Rymarczyk, Tomasz, Konrad Niderla, Edward Kozłowski, Krzysztof Król, Joanna Maria Wyrwisz, Sylwia Skrzypek-Ahmed, and Piotr Gołąbek. "Logistic Regression with Wave Preprocessing to Solve Inverse Problem in Industrial Tomography for Technological Process Control." Energies 14, no. 23 (December 3, 2021): 8116. http://dx.doi.org/10.3390/en14238116.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The research presented here concerns the analysis and selection of logistic regression with wave preprocessing to solve the inverse problem in industrial tomography. The presented application includes a specialized device for tomographic measurements and dedicated algorithms for image reconstruction. The subject of the research was a model of a tank filled with tap water and specific inclusions. The research mainly targeted the development and comparison of models and methods for data reconstruction and analysis. The application allows the appropriate image reconstruction method to be chosen when the specifics of the solution are known. The novelty of the presented solution is the use of original machine learning algorithms to implement electrical impedance tomography. One of the features of the presented solution is the use of many individually trained subsystems, each of which produces a unique pixel of the final image. The methods were trained on data sets generated by computer simulation and based on actual laboratory measurements. Conductivity values for individual pixels are the result of the reconstruction of vector images within the tested object. By comparing the results of image reconstruction, the most efficient methods were identified.
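A rough sketch of the "one trained subsystem per pixel" idea using ordinary logistic regression is shown below; the function names, the use of scikit-learn, and the handling of pixels that never change in the training data are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_pixel_models(measurements, images):
    """One classifier per output pixel: each model maps the full measurement
    vector to 'inclusion present at this pixel' (1) or 'absent' (0)."""
    n_cases, n_pixels = images.shape
    models = []
    for j in range(n_pixels):
        y = images[:, j]
        if y.min() == y.max():                     # pixel constant in training data
            models.append(float(y[0]))             # store the constant directly
            continue
        clf = LogisticRegression(max_iter=1000)
        clf.fit(measurements, y)
        models.append(clf)
    return models

def reconstruct(models, measurement):
    """Assemble the image pixel by pixel from the per-pixel models."""
    x = measurement.reshape(1, -1)
    return np.array([m if isinstance(m, float) else m.predict_proba(x)[0, 1]
                     for m in models])
```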
48

Munshi, P., R. K. S. Rathore, S. T. Swamy, and I. D. Dhariyal. "Tomographic reconstruction of the density distribution using direct fan-beam algorithms." Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment 257, no. 2 (June 1987): 398–405. http://dx.doi.org/10.1016/0168-9002(87)90764-9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Jang, Seong-Wook, Young-Jin Seo, Yon-Sik Yoo, and Yoon Sang Kim. "Computed Tomographic Image Analysis Based on FEM Performance Comparison of Segmentation on Knee Joint Reconstruction." Scientific World Journal 2014 (2014): 1–11. http://dx.doi.org/10.1155/2014/235858.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
The demand for an accurate and accessible image segmentation to generate 3D models from CT scan data has been increasing as such models are required in many areas of orthopedics. In this paper, to find the optimal image segmentation to create a 3D model of the knee CT data, we compared and validated segmentation algorithms based on both objective comparisons and finite element (FE) analysis. For comparison purposes, we used 1 model reconstructed in accordance with the instructions of a clinical professional and 3 models reconstructed using image processing algorithms (Sobel operator, Laplacian of Gaussian operator, and Canny edge detection). Comparison was performed by inspecting intermodel morphological deviations with the iterative closest point (ICP) algorithm, and FE analysis was performed to examine the effects of the segmentation algorithm on the results of the knee joint movement analysis.
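As an example of the kind of edge-based segmentation step compared in the paper, a Sobel gradient-magnitude mask on a single CT slice can be computed as follows; the threshold value and normalisation are illustrative assumptions rather than the study's actual parameters.

```python
import numpy as np
from scipy.ndimage import convolve

def sobel_edge_mask(slice_2d, threshold=0.2):
    """Sobel gradient magnitude on one CT slice, thresholded to a binary
    edge mask of the kind fed into 3D surface generation."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gx = convolve(slice_2d.astype(float), kx)      # horizontal gradient
    gy = convolve(slice_2d.astype(float), kx.T)    # vertical gradient
    mag = np.hypot(gx, gy)
    mag /= mag.max() + 1e-12                       # normalise to [0, 1]
    return mag > threshold
```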
50

Yu, Haiyan, Sihao Xia, Chenxi Wei, Yuwei Mao, Daniel Larsson, Xianghui Xiao, Piero Pianetta, Young-Sang Yu, and Yijin Liu. "Automatic projection image registration for nanoscale X-ray tomographic reconstruction." Journal of Synchrotron Radiation 25, no. 6 (October 23, 2018): 1819–26. http://dx.doi.org/10.1107/s1600577518013929.

Full text
APA, Harvard, Vancouver, ISO, and other styles
Abstract:
Novel developments in X-ray sources, optics and detectors have significantly advanced the capability of X-ray microscopy at the nanoscale. Depending on the imaging modality and the photon energy, state-of-the-art X-ray microscopes are routinely operated at a spatial resolution of tens of nanometres for hard X-rays or ∼10 nm for soft X-rays. The improvement in spatial resolution, however, has led to challenges in tomographic reconstruction because the imperfections of the mechanical system become clearly detectable in the projection images. Without proper registration of the projection images, a severe point spread function is introduced into the tomographic reconstructions, reducing the three-dimensional (3D) spatial resolution and enhancing image artifacts. Here we present a method that iteratively registers the experimentally measured projection images to those numerically calculated by reprojecting the 3D volume at the corresponding viewing angles. Multiple algorithms are implemented to conduct the registration, correcting the translational and/or rotational errors. A sequence that offers superior performance is presented and discussed. Going beyond visual assessment of the reconstruction results, the morphological quantification of a battery electrode particle that has undergone substantial cycling is investigated. The results show that the presented method leads to a better-quality tomographic reconstruction, which, in turn, improves the fidelity of the quantification of the sample morphology.
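One common way to implement the per-view correction inside such a registration loop is phase correlation between a measured projection and its reprojection; the sketch below assumes purely translational error and integer shifts, which is a simplification of the multi-algorithm approach described in the paper.

```python
import numpy as np

def estimate_shift(measured, reprojected):
    """Estimate the integer translational offset between a measured projection
    and its reprojected counterpart via phase correlation; the returned
    (row, column) shift is the correction to apply to the reprojected image."""
    cross = np.fft.fft2(measured) * np.conj(np.fft.fft2(reprojected))
    corr = np.real(np.fft.ifft2(cross / (np.abs(cross) + 1e-12)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# An outer loop would reconstruct, reproject at every angle, estimate and apply
# the per-view shifts, and reconstruct again until the corrections vanish.
```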