Academic literature on the topic 'Radioisotopes – Decay – Computer programs'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Radioisotopes – Decay – Computer programs.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Radioisotopes – Decay – Computer programs"

1

Koehler, Katrina E. "Low Temperature Microcalorimeters for Decay Energy Spectroscopy." Applied Sciences 11, no. 9 (April 29, 2021): 4044. http://dx.doi.org/10.3390/app11094044.

Full text
Abstract:
Low Temperature Detectors have been used to measure embedded radioisotopes in a measurement mode known as Decay Energy Spectroscopy (DES) since 1992. DES microcalorimeter measurements have been used for applications ranging from neutrino mass measurements to metrology to measurements for safeguards and medical nuclides. While the low temperature detectors have extremely high intrinsic energy resolution (several times better than semiconductor detectors), the energy resolution achieved in practice is strongly dependent on factors such as sample preparation method. This review seeks to present the literature consensus on what has been learned by looking at the energy resolution as a function of various choices of detector, absorber, and sample preparation methods.
2

Ozhigov, Yuri I. "About quantum computer software." Quantum Information and Computation 20, no. 7&8 (June 2020): 570–80. http://dx.doi.org/10.26421/qic20.7-8-3.

Full text
Abstract:
A quantum computer is the key to controlling complex processes. While its hardware can, in general, be successfully created on the physical foundations of the 20th century, the mathematical software is fundamentally lagging behind. Feynman's user interface, in the form of quantum gate arrays, cannot be used for such control, because it gives the solution of the Schrödinger equation with a quadratic slowdown compared to the real process. The software must therefore imitate the real process using appropriate program primitives written as programs for a classical supercomputer. Decoherence will be reflected by some constant: the number of basis states that can fit into the limited memory available to the software. The real value of this constant can be found from an experimental realization of Grover's search algorithm. Rough estimates of this constant are given based on the simplest processes of quantum electrodynamics and nuclear decay.
3

Obikhod, T. V., and I. A. Petrenko. "Mass Reconstruction of MSSM Higgs Boson." Ukrainian Journal of Physics 64, no. 8 (September 18, 2019): 714. http://dx.doi.org/10.15407/ujpe64.8.714.

Full text
Abstract:
The problems of the Standard Model, as well as questions related to the properties of the Higgs boson, led to the need to model ttH associated production and the decay of the Higgs boson to a top-quark pair within the MSSM. With the help of the computer programs MadGraph, Pythia, and Delphes, and using the latest kinematic cuts taken from experimental data obtained at the LHC, we have predicted the masses of the MSSM Higgs bosons A and H.
4

Akhmedova, G., R. Eshburiev, U. Tukhtaev, Sh Khasanov, A. Kahhorova, Sh Sayfiev, and E. Umarov. "Activity Concentrations Of Environmental Samples Collected In Samarkand Region Of Uzbekistan." American Journal of Applied sciences 02, no. 10 (October 31, 2020): 138–44. http://dx.doi.org/10.37547/tajas/volume02issue10-20.

Full text
Abstract:
The results of investigations of the activity concentration of natural radionuclides in certain building materials and foodstuff samples obtained from the Nurabad district of the Samarkand region of Uzbekistan are presented. The gamma-ray spectra of the samples were measured in Marinelli-beaker geometry on a γ-spectrometer with a Ge(Li) detector and an energy resolution of 6% at the 1332 keV line of 60Co. A personal computer and standard computer programs were used to accumulate and process the spectra. The characteristic density of the samples ranged from 140 to 1810 g/l. The activity concentration of 40K was observed to be comparatively higher than that of both 226Ra and 232Th in all the studied samples. The results reveal that the low activity of 137Cs in the samples is attributable to its half-life.
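The 137Cs remark rests on the standard exponential decay law, A(t) = A0·e^(−ln 2·t/T½). A minimal sketch of that calculation (illustrative only, not the authors' spectrum-processing software; the 30.1-year half-life is a standard value and the initial activity is invented):

```python
import math

def decayed_activity(a0_bq, half_life_y, elapsed_y):
    """Radioactive decay law: A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return a0_bq * math.exp(-math.log(2.0) * elapsed_y / half_life_y)

# 137Cs (T1/2 ~ 30.1 y): after one half-life the activity halves.
a = decayed_activity(100.0, 30.1, 30.1)  # ~50 Bq
```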
5

Skoneczny, Szymon. "Cellular automata-based modelling and simulation of biofilm structure on multi-core computers." Water Science and Technology 72, no. 11 (August 14, 2015): 2071–81. http://dx.doi.org/10.2166/wst.2015.426.

Full text
Abstract:
The article presents a mathematical model of biofilm growth for the aerobic biodegradation of a toxic carbonaceous substrate. Modelling of biofilm growth is of fundamental significance in numerous processes of biotechnology and in the mathematical modelling of bioreactors. A process following double-substrate kinetics with substrate inhibition proceeding in a biofilm has not been modelled so far by means of cellular automata. Each process in the proposed model, i.e. diffusion of substrates, uptake of substrates, growth and decay of microorganisms, and biofilm detachment, is simulated in a discrete manner. It was shown that, for a flat biofilm of constant thickness, the results of the presented model agree with those of a continuous model. The primary outcome of the study was to propose a mathematical model of biofilm growth; however, considerable focus was also placed on the development of efficient algorithms for its solution. Two parallel algorithms were created, differing in the way computations are distributed. Computer programs were created using the OpenMP Application Programming Interface for the C++ programming language. Simulations of biofilm growth were performed on three high-performance computers, and the speed-up coefficients of the programs were compared. Both algorithms enabled a significant reduction of computation time, which is important, inter alia, in the modelling and simulation of bioreactor dynamics.
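A discrete substrate-diffusion update of the kind used in such cellular-automata models can be sketched as follows (a minimal one-dimensional illustration, not the authors' OpenMP C++ implementation; the grid size and diffusion coefficient are arbitrary):

```python
def diffuse_step(c, d):
    """One explicit finite-difference diffusion step on a 1-D grid with
    zero-flux boundaries; each cell update reads only the previous state,
    so the loop is trivially parallelizable (e.g. with OpenMP in C++)."""
    out = c[:]
    n = len(c)
    for i in range(n):
        left = c[i - 1] if i > 0 else c[i]
        right = c[i + 1] if i < n - 1 else c[i]
        out[i] = c[i] + d * (left - 2.0 * c[i] + right)
    return out

grid = [0.0, 0.0, 1.0, 0.0, 0.0]   # a single pulse of substrate
grid = diffuse_step(grid, 0.25)    # spreads to [0, 0.25, 0.5, 0.25, 0]
```

Because each cell depends only on the previous state, the update distributes naturally across cores, which is the property the paper's parallel algorithms exploit.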
6

Ilic, Radovan, Darko Lalic, and Srboljub Stankovic. "Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries." Nuclear Technology and Radiation Protection 17, no. 1-2 (2002): 27–36. http://dx.doi.org/10.2298/ntrp0202027i.

Full text
Abstract:
This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model underlying these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.
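The core idea of Monte Carlo particle transport, sampling exponential free-path lengths between interactions, can be sketched in a few lines (a toy estimate of uncollided transmission through a slab, not the SRNA multiple-scattering model; the interaction coefficient and slab thickness are arbitrary):

```python
import random

def transmitted_fraction(mu, thickness, n=100_000, seed=1):
    """Monte Carlo estimate of uncollided transmission through a slab:
    sample exponential free-path lengths (mean 1/mu) and count the
    particles whose first interaction lies beyond the slab."""
    rng = random.Random(seed)
    survivors = sum(1 for _ in range(n) if rng.expovariate(mu) > thickness)
    return survivors / n

f = transmitted_fraction(mu=1.0, thickness=1.0)  # analytic answer: exp(-1) ~ 0.368
```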
7

Yamada, Shunji, Atsushi Kurotani, Eisuke Chikayama, and Jun Kikuchi. "Signal Deconvolution and Noise Factor Analysis Based on a Combination of Time–Frequency Analysis and Probabilistic Sparse Matrix Factorization." International Journal of Molecular Sciences 21, no. 8 (April 23, 2020): 2978. http://dx.doi.org/10.3390/ijms21082978.

Full text
Abstract:
Nuclear magnetic resonance (NMR) spectroscopy is commonly used to characterize molecular complexity because it produces informative atomic-resolution data on the chemical structure and molecular mobility of samples non-invasively by means of various acquisition parameters and pulse programs. However, analyzing the accumulated NMR data of mixtures is challenging due to noise and signal overlap. Therefore, data-cleansing steps, such as quality checking, noise reduction, and signal deconvolution, are important processes before spectrum analysis. Here, we have developed an NMR measurement informatics tool for data cleansing that combines short-time Fourier transform (STFT; a time–frequency analytical method) and probabilistic sparse matrix factorization (PSMF) for signal deconvolution and noise factor analysis. Our tool can be applied to the original free induction decay (FID) signals of a one-dimensional NMR spectrum. We show that the signal deconvolution method reduces the noise of FID signals, increasing the signal-to-noise ratio (SNR) about tenfold, and its application to diffusion-edited spectra allows signals of macromolecules and unsuppressed small molecules to be separated by the length of the T2* relaxation time. Noise factor analysis of NMR datasets identified correlations between SNR and acquisition parameters, identifying major experimental factors that can lower SNR.
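The time-frequency idea behind the tool can be illustrated with a toy FID: a sinusoid with an exponential decay envelope, whose frequency content is recovered by a discrete Fourier transform (a stdlib-only sketch, not the authors' STFT/PSMF pipeline; all signal parameters are invented):

```python
import cmath
import math

def dft_power(x):
    """Naive DFT power spectrum, positive frequencies only
    (stdlib-only; FFT routines compute the same thing faster)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2
            for k in range(n // 2)]

# Synthetic FID: a sinusoid at frequency bin 8 with an exponential
# decay envelope standing in for T2* relaxation.
n = 64
fid = [math.exp(-t / 32.0) * math.cos(2.0 * math.pi * 8.0 * t / n)
       for t in range(n)]
peak_bin = max(range(n // 2), key=dft_power(fid).__getitem__)  # recovers bin 8
```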
8

Pagnola, Marcelo R., Marcelo Barone, Mariano Malmoria, and Hugo Sirkin. "Influence of z/w relation in Chill Block Melt Spinning (CBMS) process and analysis of thickness in ribbons." Multidiscipline Modeling in Materials and Structures 11, no. 1 (June 8, 2015): 23–31. http://dx.doi.org/10.1108/mmms-02-2014-0008.

Full text
Abstract:
Purpose – The purpose of this paper is to present an analysis of the authors' own data and of data from other authors related to the Chill Block Melt Spinning (CBMS) process, and to propose a model for interpreting them. Design/methodology/approach – The methodology is to present data analysed by other authors, to organize the authors' own data similarly so that comparisons can be made, and, from established models, to propose a possible interpretation of the physical processes. Findings – Similarity between the authors' experimental data and data reported by other authors has been found for both the z/w ratio and the thicknesses of the ribbons produced. This allows an exponential decay of the studied parameters to be established, possibly linked to the Newtonian cooling to which the samples are subjected during production. Research limitations/implications – This work is the first model set up to predict dimensions in the CBMS design process as a function of the parameters of the ribbon-production process. Practical implications – Predicting the product dimensions by adjusting the initial parameters improves the ribbon-production process: it saves machine tuning time and provides certainty in the ejection of the molten material. Social implications – The efficient production of magnetic materials saves effort in preparing raw material for magnetic cores in the energy sector, improving production and benefiting society through the final product and the energy savings. Originality/value – The value of this paper is to propose a model of analysis that standardizes production parameters and could even allow the use of these models in computer programs and process simulators in a more effective manner.
9

Horvath, Jared C., Alex J. Horton, Jason M. Lodge, and John A. C. Hattie. "The impact of binge watching on memory and perceived comprehension." First Monday, September 1, 2017. http://dx.doi.org/10.5210/fm.v22i9.7729.

Full text
Abstract:
Binge watching via video-on-demand services is now considered the new ‘normal’ way to consume television programs. In fact, recent surveys suggest upwards of 80 percent of consumers prefer and indulge in binge watching behavior. Despite this, there is no evidence regarding the impact of binge watching on the enjoyment of and memory for viewed content. In this, the first empirical and controlled study of its kind, we determined that, although binge watching leads to strong memory formation immediately following program viewing, these memories decay more rapidly than memories formed after daily- or weekly-episode viewing schedules. In addition, participants in the binge watching condition reported significantly less show enjoyment than participants in the daily- or weekly-viewing conditions — though, important considerations with regards to this finding are discussed. Although it is a preferred viewing style catered to by many internet-based on-demand distribution companies, binge watching does not appear to benefit sustained memory of viewed content and may affect show enjoyment.
10

Obikhod, T. V., and E. A. Petrenko. "INVESTIGATIONS OF ELECTROWEAK SYMMETRY BREAKING MECHANISM FOR HIGGS BOSON DECAYS INTO FOUR FERMIONS." Problems of Atomic Science and Technology, September 21, 2020, 8–12. http://dx.doi.org/10.46813/2020-129-008.

Full text
Abstract:
Models with extended Higgs sectors are of prime importance for investigating the mechanism of electroweak symmetry breaking in Higgs decays into four fermions and in Higgs production in association with a vector boson. In the framework of the Two-Higgs-Doublet Model, using two scenarios obtained from the experimental measurements, we present next-to-leading-order results on the four-fermion decays of the light CP-even Higgs boson, h → 4f. With the help of the Monte Carlo program Prophecy4f 3.0, we calculated the values Γ = ΓEW/(ΓEW + ΓSM) and Γ = ΓEW+QCD/(ΓEW+QCD + ΓSM) for the Higgs boson decay channels H → νµµeνe, µµee, eeee. We did not find a significant difference when accounting for QCD corrections to EW processes in the decay modes of the Higgs boson. Using the computer programs Pythia 8.2 and FeynHiggs, we calculated the values σ(VBF)BR(H → ZZ) and σ(VBF)BR(H → WW) for the VBF production process, and σ(ggH)BR(H → WW) and σ(ggH)BR(H → ZZ) for the gluon fusion production process, at 13 and 14 TeV, and found good agreement with experimental data.

Dissertations / Theses on the topic "Radioisotopes – Decay – Computer programs"

1

Monger, Fred A. "KSIG - Kansas State University isotope generation microcomputer program." 1985. http://hdl.handle.net/2097/27506.

Full text
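An isotope-generation program of this kind typically evaluates decay-chain inventories via the Bateman equations. As a hypothetical sketch of such a calculation (the two-member chain, the nuclides, and the initial inventory are illustrative assumptions, not taken from the thesis):

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Bateman solution for a parent -> daughter decay chain:
    N1(t) = N1(0) e^(-lam1 t)
    N2(t) = N1(0) lam1/(lam2 - lam1) * (e^(-lam1 t) - e^(-lam2 t))"""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Hypothetical inventory: Mo-99 (T1/2 ~ 66 h) decaying to Tc-99m (T1/2 ~ 6 h),
# followed over one day of ingrowth.
lam_parent = math.log(2.0) / 66.0
lam_daughter = math.log(2.0) / 6.0
n_parent, n_daughter = bateman_two_member(1.0e6, lam_parent, lam_daughter, 24.0)
```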

Books on the topic "Radioisotopes – Decay – Computer programs"

1

Cramond, Wallis R. Shutdown decay heat removal analysis of a Babcock and Wilcox pressurized water reactor: Case study. Washington, DC: Division of Safety Review and Oversight, Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, 1987.

Find full text
2

Broadhead, B. L. QADS, a multidimensional point kernel analysis module. Washington, DC: Division of Safeguards and Transportation, Office of Nuclear Material Safety and Safeguards, U.S. Nuclear Regulatory Commission, 1990.

Find full text

Conference papers on the topic "Radioisotopes – Decay – Computer programs"

1

Pérot, Bertrand, Jean-Luc Artaud, Christian Passard, and Anne-Cécile Raoux. "Experimental Qualification With a Scale One Mock-Up of the “Measurement and Sorting Unit” for Bituminized Waste Drums." In ASME 2003 9th International Conference on Radioactive Waste Management and Environmental Remediation. ASMEDC, 2003. http://dx.doi.org/10.1115/icem2003-4597.

Full text
Abstract:
Within the framework of the cleaning operation of the Marcoule reprocessing plant UP1 (France), the CEA (French Atomic Energy Commission) developed a measurement system for 225-liter drums filled with bituminized radioactive sludge originating from the effluent treatment. This work was carried out for the CODEM, which is an economic interest group made up of CEA, EDF (the French public utility) and COGEMA (the operator of UP1). CODEM is in charge of UP1 dismantling operations, especially waste retrieval. The bituminized waste drums mainly contain plutonium, americium, uranium, curium and various beta emitters among which some are responsible for significant gamma irradiation, such as 137Cs. The aim of this system is to sort the packages according to their radioactive level, so as to direct them towards the French Aube Center, which is a surface repository. This means they must meet the acceptance criteria related to their activities. Otherwise, they will remain in interim storage in Marcoule, pending the choice of a final mode of management (e.g. underground disposal). The assay system, called UTM (the French acronym for “Measurement and Sorting Unit”), consists of three stations devoted to active gamma imaging, gamma-ray spectroscopy and combined passive / active neutron measurements. After nearly 3 years of optimization and design studies [1], the CEA has built a scale one mock-up of UTM, called SYMETRIC. The purpose was to validate the performances formerly assessed by numerical simulation, mainly with the computer code MCNP [2]. We present here the experimental results obtained with SYMETRIC for five real bituminized waste drums. These confirm the expected performances in the measurement time assigned for each assay, which is limited to 1200 seconds. With the help of gamma imaging, we are able to determine the density of the bituminous mix with an uncertainty of ± 10% for a confidence level of 95%. 
We can also measure the filling height with an accuracy of ± 2 cm. These data allow us to correct matrix effects in gamma and neutron measurements. For these assays, the main results concern the detection limits and measurement uncertainties on 241Am, 239Pu and 240Pu. These radioisotopes represent the major part of the total alpha activity, which is a very sensitive parameter for surface disposal limited to a maximum level of about 10 GBq per drum. The alpha activity must be calculated after a radioactive decay of 300 years, which is the survey period of the French Aube Center. If we can detect the former isotopes, the uncertainties on their measured activities are roughly 50%. If not, the detection limits are around a few GBq. These performances are sufficient to allow the sorting of the drums to either surface repository or interim storage. However, in order to increase the margin between the detection limits and the acceptance criterion on the total alpha activity, additional studies on the optimization of the measurement performances will be carried out. In this context, the experience gained with the SYMETRIC mock-up will be very helpful.
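The 300-year decay correction described for the alpha emitters is, in essence, a sum of independent exponential decay terms, one per isotope. A minimal sketch (the half-lives are standard values; the drum activities are invented for illustration):

```python
import math

# Standard half-lives (years) for the main alpha emitters named above.
HALF_LIFE_Y = {"Am-241": 432.2, "Pu-239": 24_110.0, "Pu-240": 6_561.0}

def alpha_activity_after(activities_gbq, years=300.0):
    """Total alpha activity (GBq) after a cooling period: each isotope
    decays independently as A_i(t) = A_i(0) * exp(-ln(2) * t / T_half_i)."""
    return sum(a * math.exp(-math.log(2.0) * years / HALF_LIFE_Y[iso])
               for iso, a in activities_gbq.items())

# Invented drum inventory (GBq) for illustration.
total = alpha_activity_after({"Am-241": 5.0, "Pu-239": 2.0, "Pu-240": 1.0})
```

After 300 years the short-lived 241Am fraction has dropped appreciably, while the plutonium isotopes, with much longer half-lives, are nearly unchanged.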
2

Kawada, Ken-ichi, Ikken Sato, Yoshiharu Tobita, Werner Pfrang, Laurence Buffe, and Emmanuelle Dufour. "Development of PIRT (Phenomena Identification and Ranking Table) for SAS-SFR (SAS4A) Validation." In 2014 22nd International Conference on Nuclear Engineering. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/icone22-30679.

Full text
Abstract:
SAS-SFR (derived from SAS4A) is presently the most advanced computer code for simulation of the primary phase of the Core Disruptive Accident (CDA) of MOX-fueled Sodium-cooled Fast Reactors (SFR). In the past two decades, intensive model improvement works have been conducted for SAS-SFR utilizing the experimental data from the CABRI programs. The main target of the present work is to confirm validity of these improved models through a systematic and comprehensive set of test analyses to demonstrate that the improved models has a sufficient quality assurance level for applications to reactor conditions. In order to reach these objectives, an approach of PIRT (Phenomena Identification and Ranking Table) on a set of accident scenarios has been applied. Based on the fact that there have been a significant amount of validation studies for decades, development of the code validation matrix concentrated on key issues. Different accident scenarios have been chosen for the PIRT considering typical SFR accident transients that address a large range of phenomena. As the most important and typical Core Disruptive Accident scenarios leading to generalized core melting and to be addressed with SAS-SFR in the present study, ULOF (Unprotected Loss Of Flow), UTOP (Unprotected Transient OverPower) and ULOHS (Unprotected Loss Of Heat Sink) are selected. The PIRT process applied to a given accident scenario consists in an identification of the phenomena involved during the accident, the evaluation of the importance of the phenomena regarding to the evolution and consequences, and the evaluation of the status of knowledge based on the review of available experimental results. The identified phenomena involved in ULOF are explained as follows for the primary phase. Starting from initiating events, a loss of grid power leading to flow coast down without scram is assumed. The scenario up to coolant boiling is the main point within the first part of the ULOF phenomenological chart. 
Those elements related to reactivity feedback, such as heat up of coolant, fuel and various structures and their deformation due to the thermal transient are picked up. Depending on the time scale before boiling starts, primary, secondary and tertiary loop heat transfer including the DHR (Decay Heat Removal) system response is concerned since it defines the core inlet coolant temperature. Core inlet coolant temperature gives direct impact on the thermal condition of the core. It also affects reactivity through thermal expansion of the grid plate. In the second part of the ULOF phenomenological chart, elements such as coolant boiling, mechanical response of the fuel pin leading to cladding failure, FCI (Fuel-Coolant Interaction) and post-failure material relocation are picked up. This part of the chart is basically common to the ULOHS. Respective identified phenomena are to be simulated in the SAS-SFR code. To validate the function of the models in the code, ten high priority CABRI experiments are selected. Validation studies on these tests are underway. With the present study, important phenomena involved in ULOF, UTOP and ULOHS were identified and an evaluation matrix for the selected CABRI experiments was developed.