Academic literature on the topic 'Current value method'

Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles.

Consult the lists of relevant articles, books, theses, and conference reports on the topic 'Current value method.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Current value method"

1

Motalo, Andrij, and Vasil Motalo. "ANALYSIS OF CALORIMETRIC METHOD OF MEASUREMENT OF NATURAL GAS CALORIFIC VALUE." Measuring Equipment and Metrology 82, no. 3 (2021): 32–41. http://dx.doi.org/10.23939/istcmtm2021.03.032.

Abstract:
The article considers the current methods of measuring the calorific value of natural gas that are valid in up-to-date gasometry. The procedure for measuring the gross and net volume-basis specific calorific value of natural gas by the calorimetric method is analyzed. It is shown that, to increase the accuracy and validity of measurement results, the experiment to determine the gross and net volume-basis specific calorific values should be performed on at least 5 samples of the investigated gas. A methodology is developed for estimating the accuracy of measuring the gross and net volume-basis specific calorific values of natural gas by the calorimetric method, by finding estimates of the uncertainty of the obtained measurement results that take into account both random and systematic influencing factors. Uncertainty budgets for measuring the gross and net volume-basis specific calorific values of natural gas have been developed for the practical implementation of the methodology. The results of experimental studies of samples from one of the natural gas fields are given, and objective values of the gross and net volume-basis specific calorific values, with estimates of expanded uncertainty, are obtained.
2

Gane, Nicholas. "Measure, Value and the Current Crises of Sociology." Sociological Review 59, no. 2_suppl (2011): 151–73. http://dx.doi.org/10.1111/j.1467-954x.2012.02054.x.

Abstract:
This paper returns to C. Wright Mills' The Sociological Imagination to make an argument about the crisis of sociological method and theory today. Mills' famous text opens with a stinging critique of abstracted empiricism and grand theory on the grounds that they fetishize either methods or concepts. It is argued that Mills' critique can be applied to current sociological practices and thinking. The first part of this paper centres on questions of method, and reads between Mills' critique of abstracted empiricism and a recent debate over what Mike Savage and Roger Burrows call the 'coming crisis of empirical sociology'. In the light of this, it is argued that two crises currently haunt empirical sociology: a crisis of imagination and a crisis of measurement. The second part of the paper then moves to the analysis of what Mills calls 'grand theory'. Here, two parallel crises are identified: a generational crisis within social theory that is tied in turn to what might be called a crisis of the concept. The conclusion of the paper returns to Mills in order to rethink his vision of the promise or value of sociology. It is argued that innovative conceptual work must lie at the heart of future sociological thinking if it is to move beyond the parallel traps of what Mills calls abstracted empiricism and grand theory.
3

Hu, Jun Sheng, Lei Guan, Jia Li Dong, Ying Wang, and Ying Yong Duan. "The Application and Research of Electrochemical Method Treating Acid Red 3R Simulation Wastewater by Response Surface Method." Applied Mechanics and Materials 295-298 (February 2013): 1258–62. http://dx.doi.org/10.4028/www.scientific.net/amm.295-298.1258.

Abstract:
The electrochemical oxidation method is used to treat acid red 3R simulated wastewater, and the influence of current density, electrolyte concentration, pH value, and aeration, as well as their interactions, on the chroma removal rate is investigated. Through a Box-Behnken design (BBD) and response surface analysis, the order of influence of the single variables is found to be current density > aeration > electrolyte concentration > pH value, and the order of influence of the interactions is electrolyte concentration-aeration > current density-aeration, electrolyte concentration-pH value > current density-pH value > pH value-aeration > current density-electrolyte concentration. Ultimately, the optimal removal rate is 98.4915% under the conditions of a current density of 6.51 mA/cm2, an electrolyte concentration of 0.04 mol/L, a pH value of 4.17, and an aeration rate of 0.24 m3/h.
4

Zhang, Xipeng, Nengling Tai, Pan Wu, Xiaodong Zheng, and Wentao Huang. "A Fault Line Selection Method for DC Distribution Network Using Multiple Observers." Energies 12, no. 7 (2019): 1245. http://dx.doi.org/10.3390/en12071245.

Abstract:
This paper proposes a method of fault line selection for a DC distribution network. Firstly, the 1-mode current is calculated using the measured currents of the positive and the negative line; it is then time reversed and further decomposed by wavelet technology. Secondly, the lossless mirror line network is established according to the parameters and the topology of the DC distribution network. Thirdly, it is presumed that several virtual current sources are employed at the locations of the corresponding observers, and the values of these current sources are set equal to the processed 1-mode currents. Fourthly, a fault is placed at every point of the lossless mirror line network, and the RMS value of every assumed fault current is calculated. During this process, the phase coefficient of every lossless mirror line is set to vary along the length of the line according to a Gaussian distribution. Finally, the line with the peak RMS value among the calculated currents is selected as the fault line. The result of fault line selection is updated using the fewest observers, which are set in advance according to the initial result. A DC distribution network is simulated in PSCAD/EMTDC to verify the correctness of the proposed method.
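Two of the steps above can be sketched in a few lines; the scaled-difference form of the 1-mode current and the example RMS values are assumptions for illustration, not taken from the paper.

```python
import math

# Sketch of two steps from the fault-line selection scheme. The exact modal
# transform and observer placement in the paper may differ; the 1-mode current
# here uses the common scaled-difference form, an assumption.

def one_mode_current(i_pos: float, i_neg: float) -> float:
    # 1-mode (line-mode) component of the positive- and negative-pole currents.
    return (i_pos - i_neg) / math.sqrt(2)

def select_fault_line(rms_by_line: dict[str, float]) -> str:
    # The line whose assumed fault current has the peak RMS value is selected.
    return max(rms_by_line, key=rms_by_line.get)

i1 = one_mode_current(2.0, -2.0)
# Hypothetical RMS values of the assumed fault currents on three candidate lines.
fault_line = select_fault_line({"Line1": 0.4, "Line2": 1.9, "Line3": 0.7})
```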
5

Li, Xiao Yong, Kang Xu, Li Jun Cao, and Si Yuan Wang. "Cost Management Method with Value Engineering." Advanced Materials Research 433-440 (January 2012): 2114–19. http://dx.doi.org/10.4028/www.scientific.net/amr.433-440.2114.

Abstract:
The use of value engineering in China has grown significantly in the last twenty years. With keen competition in the construction market, cost control is the key measure for increasing a company's economic benefits and enhancing the core competency of enterprises. This paper makes a holistic appraisal of value engineering as used in the Chinese construction industry by investigating current theory and practice, and evaluates value engineering projects. This research suggests a methodology for the cost control process in construction enterprises, aimed at a correct systematic approach to value engineering and target costing in cost management. Value engineering and target costing are complementary processes. The proposed approach was validated in a case study, aiming at improved product cost, functionality, and quality, in accordance with customer needs and company strategy.
6

Czarnigowska, Agata. "Earned value method as a tool for project control." Budownictwo i Architektura 3, no. 2 (2008): 015–32. http://dx.doi.org/10.35784/bud-arch.2320.

Abstract:
Earned Value is a well-known project management tool that uses information on cost, schedule and work performance to establish the current status of the project. By means of a few simple rates, it allows the manager to extrapolate current trends to predict their likely final effect. The method is based on a simplified model of a project, but has proved useful in the practice of cost control. It is being developed to account better for schedule and time aspects. The paper outlines the basic principles of the method and its recent extension, the Earned Schedule method, and, with the help of a few examples, investigates the assumptions that affect their diagnostic and predictive accuracy.
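The "few simple rates" the abstract refers to can be sketched in a few lines; the figures below are illustrative assumptions, not data from the paper.

```python
# Earned Value method: a minimal sketch of the standard indices.
# PV = planned value, EV = earned value, AC = actual cost,
# BAC = budget at completion.

def evm_indices(pv: float, ev: float, ac: float, bac: float) -> dict:
    cpi = ev / ac            # cost performance index
    spi = ev / pv            # schedule performance index
    return {
        "CV": ev - ac,       # cost variance
        "SV": ev - pv,       # schedule variance
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,    # estimate at completion if the cost trend continues
    }

# Hypothetical status: 400 of a 1000-unit budget was planned to be done,
# 300 units of work are earned, and 350 units have actually been spent.
status = evm_indices(pv=400.0, ev=300.0, ac=350.0, bac=1000.0)
```

A CPI or SPI below 1 signals a cost or schedule overrun; extrapolating the current CPI gives the simple estimate-at-completion forecast the method is known for.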
7

Loman, M. S., and V. S. Kachenya. "DETECTION OF CURRENT CIRCUITS FAULT FOR DIFFERENTIAL CURRENT PROTECTION." ENERGETIKA. Proceedings of CIS higher education institutions and power engineering associations 61, no. 2 (2018): 108–17. http://dx.doi.org/10.21122/1029-7448-2018-61-2-108-117.

Abstract:
False operation of differential current protection leads to the tripping of the most important electrical power objects. A fault in a current transformer's secondary circuits is one of the most frequent causes of false operation of differential current protection, and early detection of this malfunction increases the reliability of the protection and reduces the number of false trips. In the present article, methods of detecting an open secondary circuit for differential protection are described. Some of the methods react instantly to a malfunction of the secondary current circuits, while the others identify the fault after a certain time delay. Each of the considered methods has its advantages and disadvantages. A new method for detecting open secondary current circuits, based on the analysis of increments of the RMS values of the differential and braking currents, is proposed. In this case, the increments are calculated over half the period of the industrial frequency, which provides quick fault detection. The use of the sum of, and the difference between, the increments of the braking and differential currents makes it possible to detect open circuits in the most sensitive way. The method can be adapted to work with any type of differential protection, including transformer protection. The evaluation of the increment of the RMS current value is performed taking into account the transient process in the Fourier filter, and the error limit of this estimate is determined by means of a computational experiment. A block diagram of the algorithm for detecting open circuits based on the analysis of increments of the RMS values of the braking and differential currents is presented, and the principle of its functioning is described. The operating parameters and the sensitivity limits of the method are determined. The time characteristics of the algorithm have been determined by computational experiment in the MATLAB Simulink simulation environment.
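The core quantity of the proposed method, the increment of an RMS value evaluated half a period of the industrial frequency apart, can be sketched as follows; the 50 Hz frequency, 1 kHz sampling rate, and stepped test waveform are illustrative assumptions, not parameters from the paper.

```python
import math

# Sketch: RMS of a sampled current over one power-frequency period, and the
# increment of that RMS evaluated half a period later, as in the proposed
# open-circuit detector. Frequency, sampling rate, and waveform are assumed.

def rms(samples: list[float]) -> float:
    return math.sqrt(sum(s * s for s in samples) / len(samples))

F = 50.0                 # industrial frequency, Hz
FS = 1000.0              # sampling rate, Hz (assumed)
N = int(FS / F)          # samples per period (20)

# Simulated differential current: amplitude steps from 1 A to 3 A halfway,
# mimicking the onset of a secondary-circuit fault.
signal = [1.0 * math.sin(2 * math.pi * F * n / FS) for n in range(2 * N)]
signal += [3.0 * math.sin(2 * math.pi * F * n / FS) for n in range(2 * N)]

# One-period RMS windows taken half a period apart, straddling the step.
rms_before = rms(signal[N : 2 * N])
rms_after = rms(signal[N + N // 2 : 2 * N + N // 2])
increment = rms_after - rms_before   # a large increment flags the fault
```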
8

Anggriani, Yeni, Helmi Yazid, and Muhamad Taqi. "Fair Value Non-Current Asset, Koneksi Politik, dan Audit Fee." AFRE (Accounting and Financial Review) 3, no. 2 (2021): 159–64. http://dx.doi.org/10.26905/afr.v3i2.4708.

Abstract:
This study aims to determine the effect of fair-value non-current assets on the determination of audit fees, and to examine the moderating effect of political connections on the relationship between fair-value non-current assets and audit fees in financial companies listed on the Indonesia Stock Exchange. The research method used is the descriptive method. The population is financial companies listed on the Indonesia Stock Exchange, with a sample of 25 companies selected by purposive sampling for the period 2016-2018. The data used are secondary data collected by the documentation technique. In analyzing the data, this study used ordinary least squares (OLS) analysis and moderated regression analysis (MRA). The results indicate that fair-value non-current assets influence audit fees, and that political connections cannot moderate the effect of fair-value non-current assets on audit fees.
9

Meng, Zhao Wei, and Pei Chao Yu. "Value-at-Risk Estimation Based on Empirical Likelihood Method." Advanced Materials Research 143-144 (October 2010): 1–5. http://dx.doi.org/10.4028/www.scientific.net/amr.143-144.1.

Abstract:
Value at Risk (VaR) is a method that uses statistical knowledge to measure financial risks, and its computational core is to estimate or predict fluctuations in financial asset prices. In recent years, the main method of estimating and predicting such fluctuations has been the GARCH model, so determining a reasonable GARCH model becomes the crux of calculating VaR. In this paper, we propose using the empirical likelihood method to estimate VaR, and we show by simulation analysis that the empirical likelihood method is more effective and more concise than other current methods.
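As a point of reference for the estimator discussed above, a plain historical (empirical-quantile) VaR can be sketched as follows; this is a conventional baseline, not the paper's empirical likelihood method, and the return series is invented for illustration.

```python
# Historical (empirical-quantile) Value at Risk: a conventional baseline,
# not the empirical likelihood estimator proposed in the paper.

def historical_var(returns: list[float], alpha: float = 0.95) -> float:
    """Loss threshold exceeded in at most (1 - alpha) of the observed periods."""
    losses = sorted(-r for r in returns)   # convert returns to losses, ascending
    k = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[k]

# Toy daily return series (assumed data, for illustration only).
rets = [0.01, -0.02, 0.005, -0.035, 0.012, -0.01, 0.02, -0.05, 0.0, 0.015]
var_95 = historical_var(rets)   # with only 10 points this picks the worst loss
```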
10

SOBCZYK, JAN. "CURRENT-CURRENT CORRELATION FUNCTION ON ALGEBRAIC CURVES." Modern Physics Letters A 08, no. 12 (1993): 1153–59. http://dx.doi.org/10.1142/s0217732393002634.

Abstract:
We derive an explicit expression for a current-current correlation function on a Riemann surface represented as a 3-sheeted ramified covering of CP(1). The method used in the paper can easily be applied to more general algebraic curves. Knowledge of G(z, w) enables calculation of the expectation value of the energy-momentum tensor for a scalar field.

Dissertations / Theses on the topic "Current value method"

1

Nordgren, Sofie. "En hållbar framtid med solceller för Ljusdalshem : Solcellens betydelse i dagens samhälle samt projektering av en solcellsanläggning för elproduktion på ett flerbostadshus." Thesis, Mittuniversitetet, Avdelningen för kemiteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-31032.

Abstract:
Solar cells are constantly evolving with regard to efficiency, and with the modules' falling purchase costs, more and more people are becoming interested in solar energy. The sun is the purest source of energy, and there is great potential for profitable plants in Sweden, given the amount of solar radiation. The project plans a solar cell installation for Ljusdalshem and calculates its profitability. Furthermore, the structure of the solar cell is investigated, along with the types of solar cells on the market today and the factors that affect such a plan. The survey was conducted as a literature study of printed sources, with the internet as a complement. Some of the information was also collected from the staff at Ljusdalshem and from a supplier with experience in the area. The energy calculations were carried out manually, with two different calculation tools as a complement. Profitability is judged by the present value method and the payoff method. The building has two suitable roof areas, one facing south-east and one south-west. The roof's different orientations give a wider production span over the day but a lower total production. The plant will consist of monocrystalline silicon solar cells, inverters, and power optimizers. Together these can produce electricity corresponding to 17 percent of the building's annual consumption. Activities in the building are conducted during the daytime, which fits well with receiving the energy produced by solar cells. Taking into account the electricity price increase, the discount factor, module degradation, and reinvestments, the present value indicates an investment that is not profitable. Investment support for solar power plants is available from the County Administrative Board and is a prerequisite for the plant to have a chance of being profitable; a profitable investment also requires that the electricity price increase be greater than assumed. Measures that reduce electricity consumption are to be carried out, so the calculations are based on a lower consumption than the current one. An even lower electricity consumption would give a different result, which is a reason to make new, more exact calculations when the new annual consumption becomes known. The repayment period for the investment is longer than the technical lifetime, and thus the installation does not contribute to the company's profitability under the assumed electricity price and other conditions. Since the plant is to be regarded as a reference facility and Ljusdalshem has an environmental objective to achieve, there is scope for investment despite the economic losses.
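The two profitability measures the thesis applies, the present value method (nuvärdesmetoden) and the payoff method, can be sketched as follows; all figures below are hypothetical, not the thesis's data.

```python
# Present value method and payoff method: the two profitability measures
# used in the thesis. Investment, savings, rate, and lifetime are assumed.

def npv(investment: float, annual_savings: list[float], rate: float) -> float:
    """Net present value: discounted yearly savings minus the investment."""
    return -investment + sum(
        s / (1 + rate) ** (t + 1) for t, s in enumerate(annual_savings)
    )

def payback_years(investment: float, annual_saving: float) -> float:
    """Simple payback time, assuming a constant yearly saving."""
    return investment / annual_saving

# Hypothetical plant: 200 000 SEK investment, 9 000 SEK/year in avoided
# electricity cost, 5 % discount rate, 30-year technical lifetime.
value = npv(200_000, [9_000] * 30, 0.05)
payback = payback_years(200_000, 9_000)
```

A negative net present value, as in this toy case, is the kind of result that leads the thesis to judge the installation unprofitable under the assumed electricity price.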
2

Kubeš, Fedor. "Srovnání vybraných způsobů ocenění pro nemovitost typu byt v lokalitě Brno - Štýřice a Brno - Černovice." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2012. http://www.nusl.cz/ntk/nusl-232638.

Abstract:
The thesis compares selected methods for the valuation of apartment-type properties in the Brno - Štýřice and Brno - Černovice localities. It is notionally divided into two parts. The theoretical part describes the basic concepts associated with valuation, the valuation methods used, the real estate market, and the valued locations. The practical part focuses on the valuation of flats under the current price regulation using the comparative method, the direct comparison method, and the yield method for assessing the current price. At the end of the thesis, the findings (locations, methods, prices, costs, and the development company's profit) are explained and commented upon.
3

Nicol, Lisa Margaret. "Diagnostic and prognostic value of current phenotyping methods and novel molecular markers in idiopathic pulmonary fibrosis." Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/33098.

Abstract:
Background Idiopathic pulmonary fibrosis (IPF) is a devastating form of chronic lung injury of unknown aetiology characterised by progressive lung scarring. A diagnosis of definite IPF requires High Resolution Computed Tomography (HRCT) appearances indicative of usual interstitial pneumonia (UIP), or in patients with 'possible UIP' CT appearances, histological confirmation of UIP. However, the proportion of such patients that undergo surgical lung biopsy (SLB) varies, perhaps due to perceptions of the risk of biopsy and of its additive diagnostic value in individual patients. We hypothesised that an underlying UIP pathological pattern may confer an increased risk of death, and aimed to explore this by comparing the risk of SLB in suspected idiopathic interstitial pneumonia, stratified according to HRCT appearance. Additionally, we sought to determine the positive predictive value of biopsy for diagnosing IPF in patients with 'possible UIP' HRCT in our population. In patients with possible UIP who are not biopsied, the clinical value of bronchoalveolar lavage (BAL) is uncertain. We aimed to prospectively study the diagnostic and prognostic value of the BAL differential cell count (DCC) in suspected IPF, and to determine the feasibility of repeat BAL and the relationship between DCC and disease progression in two successive BALs. We hypothesised that BAL DCC differed between definite and possible IPF, and that baseline DCC and change in BAL DCC predicted disease progression. Alveolar macrophages (AMs) are an integral part of the lung's reparative mechanism following injury; however, in IPF they contribute to pathogenesis by releasing pro-fibrotic mediators promoting fibroblast proliferation and collagen deposition. Expansion of novel subpopulations of pulmonary monocyte-like cells (PMLCs) has been reported in inflammatory lung disease. We hypothesised that a distinct AM polarisation phenotype would be associated with disease progression.
We aimed to perform detailed phenotyping of AM and PMLCs in BAL in IPF patients. Several prognostic scoring systems and biomarkers have been described to predict disease progression in IPF but most were derived from clinical trial patients or tertiary referral centres and none have been validated in separate cohorts. We aimed to identify a predictive tool for disease progression utilising physiological, HRCT and serum biomarkers in a unique population of incident treatment naïve IPF patients. Methods Between 01/01/07 and 31/12/13, 611 consecutive incident patients with suspected idiopathic interstitial pneumonia (IIP) presented to the Edinburgh lung fibrosis clinic. Of these patients 222 underwent video-assisted thoracoscopic lung biopsy and histological pattern was determined according to ATS/ERS criteria. Post-operative mortality and complication rates were examined. Fewer than 2% received IPF-directed therapy and less than 1% of the cohort were lost to follow-up. Disease progression was defined as death or ≥10% decline in VC within 12 months of BAL. Cells were obtained by BAL and a panel of monoclonal antibodies; CD14, CD16, CD206, CD71, CD163, CD3, CD4, CD8 and HLA-DR were used to quantify and selectively characterise AMs, resident PMLCs, inducible PMLCs, neutrophils and CD4+/CD8+ T-cells using flow cytometry. Classical, intermediate and non-classical monocyte subsets were also quantified in peripheral blood. Potential biomarkers (n=16) were pre-selected from either previously published studies of IPF biomarkers or our hypothesis-driven profiling. Linear logistic regression was used on each predictor separately to assess its importance in terms of p-value of the associated weight, and the top two variables were used to learn a decision tree. Results Based on the 2011 ATS/ERS criteria, 87 patients were categorised as 'definite UIP', of whom 3 underwent SLB for clinical indications. 
IPF was confirmed in all 3 patients based on the 2013 ATS/ERS/JRS/ALAT diagnostic criteria. 222 patients were diagnosed with 'possible UIP'; 55 underwent SLB, and IPF was subsequently diagnosed in 37 patients, 4 were diagnosed with 'probable IPF', and 14 were considered 'not IPF'. In this group, 30 patients were aged 65 years or over and 25/30 (83%) had UIP on biopsy. 306 patients had HRCTs deemed 'inconsistent with UIP'; SLB was performed in 168 patients. Post-operative 30-day mortality was 2.2% overall, and 7.3% in the 'possible UIP' HRCT group. Patients with 'definite IPF' based on HRCT and SLB appearances had significantly better outcomes than patients with 'definite UIP' on HRCT alone (P=0.008, HR 0.44 (95% CI 0.240 to 0.812)). BAL DCC was not different between the definite and possible UIP groups, but there were significant differences with the 'inconsistent with UIP' group. In the 12 months following BAL, 33.3% (n=7/21) of patients in the definite UIP group and 29.5% (n=18/61) in the possible UIP group had progressed. There were no significant differences in BAL DCC between progressor and non-progressor groups. Mortality in patients with suspected IPF and a BAL DCC consistent with IPF was no different to those with a DCC inconsistent with IPF (P=0.425, HR 1.590 (95% CI 0.502 to 4.967)). There was no difference in disease progression in either group (P=0.885, HR 1.081 (95% CI 0.376 to 3.106)). There was no statistically significant difference in BAL DCC at 0 and 12 months in either group, and no significant change in DCC between the 0- and 12-month BALs between progressors and non-progressors. Repeat BAL was well tolerated in almost all patients. There was 1 death within 1 month of a first BAL and 1 death within 1 month of a second BAL; both were considered 'probably procedure-related'.
AM CD163 and CD71 (transferrin receptor) expression were significantly different between groups (P < 0.0001), with significant increases in the IPF group vs non fibrotic ILD (P < 0.0001) and controls (P < 0.0001 and P < 0.001 respectively). CD71 expression was also significantly increased in the IPF progressor vs non-progressor group (P < 0.0001) and patients with high CD71 expression had significantly poorer survival than the CD71low group (P=0.040, median survival 40.5 and 75.6 months respectively). CD206 (mannose receptor) expression was also significantly higher in the IPF progressor vs non-progressor group (P=0.034). There were no differences in baseline BAL neutrophil, eosinophil or lymphocyte percentages between IPF progressor or non-progressor groups. The percentage of rPMLCs was significantly increased in BAL fluid cells of IPF patients compared to those with non-fibrotic ILD (P < 0.0001) and healthy controls (P < 0.05). Baseline rPMLC percentage was significantly higher in IPF progressors vs IPF non-progressors (P=0.011). Baseline BAL iPMLC:rPMLC ratio was also significantly different between IPF progressor and non-progressor groups (P=0.011). Disease progression was confidently predicted by a combination of clinical and serological variables. In our cohort we identified a predictive tool based on two key parameters, one a measure of lung function and one a single serum biomarker. Both parameters were entered into a decision tree, and when applied to our cohort yielded a sensitivity of 86.4%, specificity of 92.3%, positive predictive value of 90.5% and negative predictive value of 88.9%. We also applied previously reported predictive tools such as the GAP Index, du Bois score and CPI Index to the Edinburgh IPF cohort. 
Conclusions SLB can be of value in the diagnosis of ILD; however, perhaps due to the perceived risks associated with the procedure, only a small percentage of patients undergo SLB despite recommendations that patients have histological confirmation of the diagnosis. Advanced age is a strong predictor for IPF, and in our cohort 83% of patients aged over 65 years with 'possible UIP' HRCT appearances had UIP on biopsy. BAL and repeat BAL in IPF are feasible and safe (< 1.5% mortality). Of those that underwent repeat BAL, disease progression was not associated with a change in DCC. However, 22% of lavaged patients died or were deemed too frail to undergo a second procedure at 12 months. These data emphasise the importance of BAL in identifying a novel human AM polarisation phenotype in IPF. Our data suggest a distinct relationship between AM subtypes, cell-surface expression markers, PMLC subpopulations, and disease progression in IPF, which may be utilised to investigate new targets for future therapeutic strategies. Disease progression in IPF can be predicted by a combination of clinical variables and serum biomarker profiling. We have identified a unique prediction model which, when applied to our locally referred, incident, treatment-naïve cohort, can confidently predict disease progression in IPF. IPF is a heterogeneous disease, and there is a definite clinical need to identify 'personalised' prognostic biomarkers, which may in turn lead to novel targets and the advent of personalised medicines.
4

Bozovic, Milos. "Risks in Commodity and Currency Markets." Doctoral thesis, Universitat Pompeu Fabra, 2009. http://hdl.handle.net/10803/7388.

Abstract:
This thesis analyzes market risk factors in commodity and currency markets. It focuses on the impact of extreme events on the prices of financial products traded in these markets, and on the overall market risk faced by investors. The first chapter develops a simple two-factor jump-diffusion model for the valuation of contingent claims on commodities, in order to investigate the pricing implications of shocks that are exogenous to this market. The second chapter analyzes the nature and pricing implications of abrupt changes in exchange rates, as well as the ability of these changes to explain the shapes of option-implied volatility 'smiles'. Finally, the third chapter employs the notion that key results of univariate extreme value theory can be applied separately to the principal components of the ARMA-GARCH residuals of a multivariate return series. The proposed approach yields more precise Value at Risk forecasts than conventional multivariate methods, while maintaining the same efficiency.
APA, Harvard, Vancouver, ISO, and other styles
5

Nelson, Ludwig W. "Value-Added Tax apportionment methodology applied in the higher sector of South Africa : Is the apportionment method currently applied in the higher education sector effective and appropriate in a South African context?" Master's thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/20975.

Full text
Abstract:
This dissertation focuses on the value-added tax apportionment methodology applied in the higher education sector of South Africa. The current apportionment method applied by universities is the varied input-based method. The research question posed is whether the varied input-based method is effective and appropriate in the higher education sector in a South African context, and whether there are other solutions which could be applied to alleviate the burden that apportionment has placed on the universities. In addressing the research question, the dissertation specifically discusses the principles of apportionment, the turnover-based method, the input-based method, and the varied input-based method. Furthermore, the dynamics of the higher education sector are discussed in detail, with specific focus on the income streams and expense types of universities and how these influence the application of apportionment methods. Lastly, other solutions are considered, such as a reduced value-added tax rate for the supply of educational services and the zero-rating of the supply of educational services. It is concluded that the varied input-based method is not a long-term solution, as the difficulty of its application and the financial burden make it impracticable to use. It is also concluded that a possible solution is to zero-rate the supply of educational services by universities in terms of section 11 of the Value-Added Tax Act No. 89 of 1991. This would not only alleviate the burden of apportionment, but also allow universities to claim all the VAT they incur (except for input tax denied in terms of section 17(2) of the Value-Added Tax Act No. 89 of 1991), resulting in a net gain for the universities.
Whatever is decided, the current investigation of apportionment methodology applied in the higher education sector of South Africa needs to be concluded as soon as possible, and a clear, practical, and sustainable approach needs to be agreed upon, as this is in the best interest of all parties involved as well as South Africa and the education of its people as a whole.
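The turnover-based method discussed above reduces to a simple ratio calculation. A minimal sketch with hypothetical university figures (ZAR amounts invented for illustration, not taken from the dissertation):

```python
# Turnover-based apportionment: the deductible share of VAT on mixed-use
# inputs equals the ratio of taxable supplies to total supplies.
def turnover_apportionment(taxable_supplies, exempt_supplies, mixed_input_vat):
    ratio = taxable_supplies / (taxable_supplies + exempt_supplies)
    return round(ratio * mixed_input_vat, 2)

# Hypothetical figures: exempt tuition income alongside taxable research income
deductible = turnover_apportionment(taxable_supplies=20_000_000,
                                    exempt_supplies=80_000_000,
                                    mixed_input_vat=1_500_000)
# 20% apportionment ratio applied to R1.5m of mixed input VAT
```

The varied input-based method replaces this single ratio with several cost-pool-specific ratios, which is what drives the compliance burden the dissertation describes.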
APA, Harvard, Vancouver, ISO, and other styles
6

Karch, Václav. "Vliv druhu vlastnictví bytové jednotky na obvyklou cenu v Třinci." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2017. http://www.nusl.cz/ntk/nusl-367525.

Full text
Abstract:
Keywords: real property, apartment, accommodation unit, ownership, cooperative housing, supply, demand, price, worth, current price, valuation according to price regulation, direct comparison method, yield method, real estate market
APA, Harvard, Vancouver, ISO, and other styles
7

Lal, Ghamandi. "Analysis And Design Of Test Methods And Test Circuits For HVDC Thyristor Valves." Thesis, 1996. http://etd.iisc.ernet.in/handle/2005/1754.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Cheng, Hsu Kuo, and 徐國城. "A Study on the Locational Factors Adjustment Methods for Land Valuation - the Evidence of Current Assessed Land Value Evaluation in Tainan City." Thesis, 2002. http://ndltd.ncl.edu.tw/handle/20766834375967235932.

Full text
Abstract:
Master's thesis<br>Chang Jung Management College<br>Master's Program, Department of Land Management and Development<br>90<br>The special characteristics of real estate make land valuation complicated, and comprehensive valuation criteria require careful consideration. The market data approach relies on market information and on reasonable adjustments that limit the subjectivity of the appraiser. This study uses Quantification Theory I to establish a land valuation model, taking Tainan City as the survey area. The fuzzy linguistic method is applied to handle the subjectivity of locational-variable adjustment. Integrating these results with empirical evidence, the final land valuation model based on fuzzy Quantification Theory I is established. The model identifies (1) floor area ratio, (2) slope, (3) distance to polluted areas, (4) trend of development, and (5) distance to parks as the major influencing factors. The findings show that the subjectivity of appraisers can be substantially reduced by the fuzzy Quantification Theory I model, and the valuation results are more stable and consistent.
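Quantification Theory I is, at its core, least-squares regression on dummy-coded qualitative factors. A minimal sketch with invented parcel data (the categories and prices are hypothetical, not the thesis data):

```python
import numpy as np

# Toy parcels: (floor-area-ratio category, slope category) -> land price
parcels = [("high", "flat"), ("high", "steep"), ("low", "flat"),
           ("low", "steep"), ("high", "flat"), ("low", "flat")]
prices = np.array([120.0, 95.0, 80.0, 60.0, 118.0, 82.0])

levels = [sorted({p[i] for p in parcels}) for i in range(2)]

def dummies(row):
    out = [1.0]                              # intercept
    for i, lv in enumerate(levels):          # drop first level of each factor
        out += [1.0 if row[i] == v else 0.0 for v in lv[1:]]
    return out

X = np.array([dummies(p) for p in parcels])
beta, *_ = np.linalg.lstsq(X, prices, rcond=None)  # category scores
# beta: [intercept, effect of "low" FAR, effect of "steep" slope]
```

The fuzzy extension in the thesis replaces the crisp 0/1 dummies with membership degrees, softening the appraiser's categorical judgments.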
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Current value method"

1

Swaffield, Simon R. Community perceptions of landscape values in the South Island high country: A literature review of current knowledge and evaluation of survey methods. Dept. of Conservation, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pelke, Eberhard, and Eugen Brühwiler, eds. Engineering History and Heritage Structures – Viewpoints and Approaches. International Association for Bridge and Structural Engineering (IABSE), 2017. http://dx.doi.org/10.2749/sed015.

Full text
Abstract:
The present Structural Engineering Document (SED) is a compilation of contributions devoted to the vast topic of the history of structural engineering as well as interventions on heritage structures and structures of high cultural value. Various, sometimes opposed, viewpoints and approaches are expressed and presented. The rather heterogeneous and controversial nature of the content of this SED shall stimulate lively discussions within the structural engineering community, which needs to increase its awareness of historical and cultural aspects of structures and structural engineering. Current structural engineering methods and practice are only at the very beginning of effective engineering that really integrates historical and cultural aspects in the assessment of existing structures and in intervention projects to adapt or modify structures of cultural value for future demands. Knowing the past is indispensable for modern structural engineering!
APA, Harvard, Vancouver, ISO, and other styles
3

Zuev, Sergey, Ruslan Maleev, and Aleksandr Chernov. Energy efficiency of electrical equipment systems of autonomous objects. INFRA-M Academic Publishing LLC., 2021. http://dx.doi.org/10.12737/1740252.

Full text
Abstract:
When considering the main trends in the development of modern autonomous objects (aircraft, combat vehicles, motor vehicles, floating vehicles, agricultural machines, etc.) in recent decades, two key directions can be identified. The first direction is associated with the improvement of traditional designs of autonomous objects (AO) with an internal combustion engine (ICE) or a gas turbine engine (GTD). The second direction is connected with the creation of new types of AO, namely electric AO (EAO) and AO with combined power plants (AOKEU). Energy efficiency is largely determined by the power that the generator set and the battery deliver to the electrical network in various driving modes. Most existing methods for calculating power supply systems use the average values of disturbing factors (generator speed, current of electric energy consumers, voltage in the on-board network) when choosing the characteristics of the generator set and the battery. At the same time, it is obvious that when operating a motor vehicle these parameters change depending on the driving mode. Modern methods of selecting the main parameters and characteristics of the power supply system do not provide for modeling its interaction with the power unit start-up system of a motor vehicle in operation, owing to the lack of a systematic approach. The choice of a generator set and a battery, as well as a concept for the synthesis of the power supply system, are the problems studied in the monograph. The book is intended for all those interested in electrical engineering and electronics.
APA, Harvard, Vancouver, ISO, and other styles
4

Muraru, Denisa, Ashraf M. Anwar, and Jae-Kwan Song. Heart valve disease: tricuspid valve disease. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198726012.003.0037.

Full text
Abstract:
The tricuspid valve is currently the subject of much interest from echocardiographers and surgeons. Functional tricuspid regurgitation is the most frequent aetiology of tricuspid valve pathology, is characterized by structurally normal leaflets, and is due to annular dilation and/or leaflet tethering. A primary cause of tricuspid regurgitation with/without stenosis can be identified only in a minority of cases. Echocardiography is the imaging modality of choice for assessing tricuspid valve diseases. It enables the cause to be identified, assesses the severity of valve dysfunction, monitors the right heart remodelling and haemodynamics, and helps decide the timing for surgery. The severity assessment requires the integration of multiple qualitative and quantitative parameters. The recent insights from three-dimensional echocardiography have greatly increased our understanding about the tricuspid valve and its peculiarities with respect to the mitral valve, showing promise to solve many of the current problems of conventional two-dimensional imaging. This chapter provides an overview of the current state-of-the-art assessment of tricuspid valve pathology by echocardiography, including the specific indications, strengths, and limitations of each method for diagnosis and therapeutic planning.
APA, Harvard, Vancouver, ISO, and other styles
5

Klosko, George, ed. The Oxford Handbook of the History of Political Philosophy. Oxford University Press, 2011. http://dx.doi.org/10.1093/oxfordhb/9780199238804.001.0001.

Full text
Abstract:
This book presents fifty original articles that together cover the entire subject of the history of political philosophy. It provides not only a survey of the state of research but substantial pieces that engage with, and move forward, current debates. Part I addresses questions of method. Articles discuss the contextual method, classically articulated by Quentin Skinner, along with important alternative methods associated with Leo Strauss and his followers, and contemporary post-modernism. This first part also examines the value of the history of political philosophy and the history of the discipline itself. Part II, based upon chronological periods, works through the entire history of Western political philosophy. While most articles address recognizable chronological periods, others are devoted to more specialized topics, including the influence of Roman law, medieval Arabic political philosophy, socialism, and Marxism. Aspects of the history of political philosophy that transcend specific periods are the subject of Part III. Articles on topics such as democracy, the state, and imperialism trace theoretical developments over time. The histories of major non-Western traditions (Muslim, Confucian, and Hindu) are discussed in the final part, with special reference to their relationships to Western political thought.
APA, Harvard, Vancouver, ISO, and other styles
6

Rigo, Fausto, Covadonga Fernández-Golfín, and Bruno Pinamonti. Familial cardiomyopathies. Oxford University Press, 2016. http://dx.doi.org/10.1093/med/9780198726012.003.0047.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Lancellotti, Patrizio, Julien Magne, Kim O’Connor, and Luc A. Pierard. Mitral valve disease. Oxford University Press, 2011. http://dx.doi.org/10.1093/med/9780199599639.003.0015.

Full text
Abstract:
Native mitral valve disease is the second most common valvular heart disease after aortic valve disease. For the last few decades, two-dimensional Doppler echocardiography has been the cornerstone technique for evaluating patients with mitral valve disease. Besides aetiological information, echocardiography allows the description of valve anatomy, the assessment of disease severity, and the description of the associated lesions. This chapter addresses the echocardiographic evaluation of mitral regurgitation (MR) and mitral stenosis (MS). In MR, the following findings should be assessed: 1. Aetiology. 2. Type and extent of anatomical lesions and mechanisms of regurgitation. 3. The possibility of mitral valve repair. 4. Quantification of MR severity. 5. Quantification of MR repercussions. In MS, the following findings should be assessed: 1. Aetiology. 2. Type and extent of anatomical lesions. 3. Quantification of MS severity. 4. Quantification of MS repercussions. 5. Wilkins or Cormier scores for the possibility of percutaneous mitral commissuroplasty. Management of patients with mitral valve disease is currently based on symptoms and on echocardiographic evaluation at rest. Therefore, knowing how to assess the severity of valve diseases as well as the pitfalls and the limitations of each echocardiographic method is of primary importance.
APA, Harvard, Vancouver, ISO, and other styles
8

McMurry, Timothy, and Dimitris Politis. Resampling methods for functional data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.7.

Full text
Abstract:
This article examines the current state of methodological and practical developments for resampling inference techniques in functional data analysis, paying special attention to situations where either the data and/or the parameters being estimated take values in a space of functions. It first provides the basic background and notation before discussing bootstrap results from nonparametric smoothing, taking into account confidence bands in density estimation as well as confidence bands in nonparametric regression and autoregression. It then considers the major results in subsampling and what is known about bootstraps, along with a few recent real-data applications of bootstrapping with functional data. Finally, it highlights possible directions for further research and exploration.
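The resampling idea behind such confidence bands can be illustrated with a naive curve-level bootstrap of a mean function (a toy sketch, not the article's methodology or data):

```python
import numpy as np

# 40 noisy functional observations on a common grid of 50 points
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)
curves = np.sin(2 * np.pi * t) + rng.normal(0.0, 0.3, size=(40, t.size))

# Resample whole curves with replacement and recompute the mean function
boot_means = np.array([curves[rng.integers(0, 40, size=40)].mean(axis=0)
                       for _ in range(500)])
lower, upper = np.percentile(boot_means, [2.5, 97.5], axis=0)  # pointwise band
```

Resampling entire curves preserves the within-curve dependence structure, which is the key point distinguishing functional from pointwise bootstraps.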
APA, Harvard, Vancouver, ISO, and other styles
9

Rahmstorf, Lorenz, Gojko Barjamovic, and Nicola Ialongo, eds. Merchants, Measures and Money. Understanding Technologies of Early Trade in a Comparative Perspective. Wachholtz Verlag, 2021. http://dx.doi.org/10.23797/9783529035418.

Full text
Abstract:
This second volume in the series collects papers from two workshops held at the University of Göttingen in 2019 and 2020. The international meetings tackled questions related to merchants and money in a comparative perspective, with examples spanning from the Bronze Age to the early Modern period and embracing Europe, the Mediterranean, Asia and East Africa. The first part of this volume presents historical case studies of how merchants planned and carried out commercial expeditions; how risk, cost, and potential profit was calculated; and how the value of goods was calculated and converted. The papers in the second part address current theories and methods on the development and function of money before and after the invention of coinage. The introduction of balance scales around 3000 BCE enabled the formation of overarching indexes of value and the calculation of the commercial value of goods and services. It also allowed for a selected set of commodities to take on the role of currency. Around 650 BCE, this led to the invention of coinage in the Eastern Mediterranean.
APA, Harvard, Vancouver, ISO, and other styles
10

Miskowiak, Kamilla W., and Lars V. Kessing. Cognitive enhancement in bipolar disorder: current evidence and methodological considerations. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780198748625.003.0026.

Full text
Abstract:
Cognitive dysfunction is an emerging treatment target in bipolar disorder (BD). Numerous trials have assessed the efficacy of novel pharmacological and psychological treatments on cognition. Overall, the results are disappointing, possibly due to methodological challenges. A key issue is the lack of consensus on whether and how to screen for cognitive impairment and on how to assess efficacy. We suggest that screening for cognitive impairment is critical and should involve objective neuropsychological tests. We also recommend that the primary outcome is a composite of neuropsychological tests with socio-occupational function as co-primary or secondary outcome. Trials should include fully or partially remitted patients, ensure that concomitant medication is kept stable and that statistical methods include mixed models or similar ways to take account of missing values. Future treatment development should implement a ‘circuit-based’ neuroimaging biomarker model to examine neural target engagement. Interventions targeting multiple treatment modalities may also be beneficial.
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Current value method"

1

Wei, Yao, Yening Sun, Yanjun Wei, and Hanhong Qi. "A Parameters Tuning Method of LADRC Based on Reference Value Filtered Two-Degree-of-Freedom for PMSM Current Control." In The Proceedings of the 9th Frontier Academic Forum of Electrical Engineering. Springer Singapore, 2021. http://dx.doi.org/10.1007/978-981-33-6609-1_29.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wang, YuMei, and Zhan Zhang. "The Approximate Method of Three Phase Short-Circuit Current Calculation Based on the Per-Unit Value Form of Ohm’s Law." In Advances in Computer Science, Environment, Ecoinformatics, and Education. Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-23339-5_37.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Strödeck, Ramona, and Nicole Häusler. "Elephant visitor preferences and experiences in Sri Lanka." In The elephant tourism business. CABI, 2021. http://dx.doi.org/10.1079/9781789245868.0007.

Full text
Abstract:
Abstract This chapter discusses the results of a study concerning visitors' preferences and experiences at four major elephant tourism attractions in Sri Lanka. First, a detailed summary is provided of the current discourse on elephant tourism in this South Asian country, followed by a description of the sites under study. Using a mixed-method approach, the study was able to gain insights into the visitors' profiles, preferences and experiences by focusing on each site's educational value.
APA, Harvard, Vancouver, ISO, and other styles
4

Looks, Hanna, Jannik Fangmann, Jörg Thomaschewski, María-José Escalona, and Eva-Maria Schön. "Towards a Standardized Questionnaire for Measuring Agility at Team Level." In Lecture Notes in Business Information Processing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78098-2_5.

Full text
Abstract:
Context: Twenty years after the publication of the agile manifesto, agility is becoming more and more popular in different contexts. Agile values are changing the way people work together and influence people's mindset as well as the culture of organizations. Many organizations have understood that continuous improvement is based on measurement. Objective: The objective of this paper is to present how agility can be measured at the team level. For this reason, we introduce our questionnaire for measuring agility, which is based on the agile values of the manifesto. Method: We developed a questionnaire comprising 36 items that measure the current state of a team's agility in six dimensions (communicative, change-affine, iterative, self-organized, product-driven and improvement-oriented). This questionnaire has been evaluated in several expert reviews and in a case study. Results: The questionnaire provides a method for measuring the current state of agility which takes the individual context of the team into account. Furthermore, our research shows that this technique enables the user to uncover dysfunctionalities in a team. Conclusion: Practitioners and organizations can use our questionnaire to optimize collaboration within their teams in terms of agility. In particular, the value delivery of an organization can be increased by optimizing collaboration at the team level. The development of this questionnaire is a continuous learning process with the aim of developing a standardized questionnaire for measuring agility.
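A plausible scoring scheme for such a questionnaire, six items per dimension averaged on a Likert scale, might look as follows; the item-to-dimension mapping and the 1-7 answer scale are assumptions, not taken from the paper:

```python
# Six dimensions, six consecutive items each (mapping assumed for illustration)
DIMENSIONS = ["communicative", "change-affine", "iterative",
              "self-organized", "product-driven", "improvement-oriented"]

def dimension_scores(answers):
    """Average each dimension's six consecutive items."""
    assert len(answers) == 6 * len(DIMENSIONS)
    return {dim: sum(answers[i * 6:(i + 1) * 6]) / 6.0
            for i, dim in enumerate(DIMENSIONS)}

# Hypothetical team responses on a 1-7 scale
scores = dimension_scores([5] * 12 + [3] * 6 + [6] * 18)
```

A per-dimension profile like this is what lets a team spot the kind of localized dysfunctionality the paper describes, rather than a single aggregate "agility number".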
APA, Harvard, Vancouver, ISO, and other styles
5

Ndaba, B., R. Adeleke, R. Makofane, M. O. Daramola, and M. Moshokoa. "Butanol as a Drop-In Fuel: A Perspective on Production Methods and Current Status." In Valorization of Biomass to Value-Added Commodities. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-38032-8_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Timan, Tjerk, and Zoltan Mann. "Data Protection in the Era of Artificial Intelligence: Trends, Existing Solutions and Recommendations for Privacy-Preserving Technologies." In The Elements of Big Data Value. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-68176-0_7.

Full text
Abstract:
This chapter addresses privacy challenges that stem particularly from working with big data. Several classification schemes of such challenges are discussed. The chapter continues by classifying the technological solutions as proposed by current state-of-the-art research projects. Three trends are distinguished: (1) putting the end user of data services back as the central focal point of Privacy-Preserving Technologies, (2) the digitisation and automation of privacy policies in and for big data services and (3) developing secure methods of multi-party computation and analytics, allowing both trusted and non-trusted partners to work together with big data while simultaneously preserving privacy. The chapter ends with three main recommendations: (1) the development of regulatory sandboxes; (2) continued support for research, innovation and deployment of Privacy-Preserving Technologies; and (3) support and contribution to the formation of technical standards for preserving privacy. The findings and recommendations of this chapter in particular demonstrate the role of Privacy-Preserving Technologies as an especially important case of data technologies towards data-driven AI. Privacy-Preserving Technologies constitute an essential element of the AI Innovation Ecosystem Enablers (Data for AI).
APA, Harvard, Vancouver, ISO, and other styles
7

Felderer, Michael, Wilhelm Hasselbring, Heiko Koziolek, et al. "Ernst Denert Software Engineering Awards 2019." In Ernst Denert Award for Software Engineering 2019. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58617-1_1.

Full text
Abstract:
The need to improve software engineering practices is continuously rising, and software development practitioners are highly interested in improving their software systems and the methods used to build them. Software engineering research has numerous success stories. The Ernst Denert Software Engineering Award specifically rewards researchers who value the practical impact of their work and aim to improve current software engineering practices. This chapter summarizes the award's history as well as the current award process and criteria.
APA, Harvard, Vancouver, ISO, and other styles
8

Angulo-Bejarano, Paola Isabel, Juan Luis De la Fuente Jimenez, Sujay Paul, et al. "Cell Cultures and Hairy Roots as Platform for Production of High-Value Metabolites: Current Approaches, Limitations, and Future Prospects." In Advances in Plant Transgenics: Methods and Applications. Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-9624-3_2.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Callejón-Leblic, M. A., and Pedro C. Miranda. "A Computational Parcellated Brain Model for Electric Field Analysis in Transcranial Direct Current Stimulation." In Brain and Human Body Modeling 2020. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45623-8_5.

Full text
Abstract:
Recent years have seen the use of increasingly realistic electric field (EF) models to further our knowledge of the bioelectric basis of noninvasive brain techniques such as transcranial direct current stimulation (tDCS). Such models predict a poor spatial resolution of tDCS, showing a non-focal EF distribution with similar or even higher magnitude values far from the presumed targeted regions, thus bringing into doubt the classical criteria for electrode positioning. In addition to magnitude, the orientation of the EF over selected neural targets is thought to play a key role in the neuromodulation response. This chapter offers a summary of recent works which have studied the effect of simulated EF magnitude and orientation in tDCS, as well as providing new results derived from an anatomically representative parcellated brain model based on the finite element method (FEM). The results include estimates of mean and peak tangential and normal EF values over different cortical regions and for various electrode montages typically used in clinical applications.
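The tangential and normal EF components mentioned above come from projecting each field vector onto the local surface normal. A minimal numerical sketch with synthetic vectors (not output from the chapter's FEM model):

```python
import numpy as np

# Synthetic EF samples (V/m) and unit surface normals at 100 nodes of a parcel
rng = np.random.default_rng(2)
E = rng.normal(0.0, 0.2, size=(100, 3))
n = rng.normal(size=(100, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)

En = np.abs(np.sum(E * n, axis=1))                     # normal component magnitude
Et = np.linalg.norm(E - np.sum(E * n, axis=1, keepdims=True) * n, axis=1)

mean_En, peak_En = En.mean(), np.percentile(En, 99)    # parcel-level summary
```

Reporting the peak as a high percentile rather than the raw maximum is a common way to make such summaries robust to numerical outliers at mesh nodes.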
APA, Harvard, Vancouver, ISO, and other styles
10

Al-Dousari, Noor, Modi Ahmed, Ali Al-Dousari, Musaad Al-Daihani, and Murahib Al-Elaj. "Dust Particle Size and Statistical Parameters." In Atlas of Fallen Dust in Kuwait. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-66977-5_3.

Full text
Abstract:
Grain 'size' can be specified and measured in several different ways. All methods of grain size determination have blemishes, and the choice of the most appropriate method is governed by the nature of the sample and the use to which the data are put. Four main methods are currently used for size analysis of sands: (a) sieving; (b) settling tube analysis; (c) electro-optical methods, including Coulter Counter analysis and laser granulometry; and (d) computerized image analysis. The particle size distribution of Kuwait dust was classified and mapped according to the parameters proposed by Folk and Ward (1957), which are widely used for quantitative comparisons between natural grain size distributions and the lognormal distribution; better sorted sediments have lower values of σ1. Maps of the distribution of dust in Kuwait were obtained for fine sand (F.S.), coarse sand (C.S.), medium sand (M.S.), very fine sand (V.F.S.), very coarse silt (V.C.Silt), coarse silt (C.Silt), medium silt (M.Silt), fine silt (F.Silt), and very fine silt (V.F.Silt). In addition, the deposition percentages of clay, sand, mud (silt plus clay), and silt were provided.
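The Folk and Ward (1957) graphic measures referenced above are closed-form combinations of phi-scale percentiles; a small sketch (the percentile values in the example are invented for illustration):

```python
# Folk and Ward (1957) graphic measures from phi-percentiles
def folk_ward(p):
    """p maps percentile -> phi value for the 5th, 16th, 50th, 84th, 95th."""
    mean = (p[16] + p[50] + p[84]) / 3                       # graphic mean
    sorting = (p[84] - p[16]) / 4 + (p[95] - p[5]) / 6.6     # sigma_1
    skew = ((p[16] + p[84] - 2 * p[50]) / (2 * (p[84] - p[16]))
            + (p[5] + p[95] - 2 * p[50]) / (2 * (p[95] - p[5])))
    return mean, sorting, skew

# A perfectly symmetric toy distribution: mean 3 phi, zero skewness
mean_phi, sigma1, sk = folk_ward({5: 1.0, 16: 2.0, 50: 3.0, 84: 4.0, 95: 5.0})
```

Lower `sigma1` means better sorting, which is exactly the property the chapter maps across Kuwait.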
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Current value method"

1

Wang, Qinqin, Ruifang Liu, and Xuejiao Ren. "A Method to Determine the Critical Current Value of the Bearing Electrical Corrosion." In 2019 IEEE 3rd International Electrical and Energy Conference (CIEEC). IEEE, 2019. http://dx.doi.org/10.1109/cieec47146.2019.cieec-2019375.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Yang, Yuqing, Longzhang Ke, and Yi Liu. "Fault Detection Method of Modular Multilevel Converter Based on Current Mean Value and SVM." In 2020 Chinese Control And Decision Conference (CCDC). IEEE, 2020. http://dx.doi.org/10.1109/ccdc49329.2020.9164798.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Kumagai, Takahiro, Keisuke Kusaka, and Jun-ichi Itoh. "Reduction Method of Current RMS Value, DC Current Ripple, and Radial Force Ripple for SRM based on Mathematical Model of Magnetization Characteristic." In 2019 IEEE 4th International Future Energy Electronics Conference (IFEEC). IEEE, 2019. http://dx.doi.org/10.1109/ifeec47410.2019.9015189.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Xiaofei Qin, Yunkuan Wang, Zhiqiang Wei, and Jun Zheng. "An online calibrating method for middle value of current measurement circuit in permanent magnet synchronous motor drives." In 2009 Asia-Pacific Conference on Computational Intelligence and Industrial Applications (PACIIA 2009). IEEE, 2009. http://dx.doi.org/10.1109/paciia.2009.5406374.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Xing-ming, Fan, He Jia-min, Zhang Xin, Liang Cong, Huang Zhi-chao, and Shi Wei-jian. "The K value determination research of advanced breaking current weighted cumulative method for VCB electrical endurance detection." In 2012 XXVth International Symposium on Discharges and Electrical Insulation in Vacuum (ISDEIV 2012). IEEE, 2012. http://dx.doi.org/10.1109/deiv.2012.6412544.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bhattacharjee, Joydip, and Trilochan Sahoo. "Effect of Current on Flexural Gravity Waves." In 25th International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2006. http://dx.doi.org/10.1115/omae2006-92132.

Full text
Abstract:
The effect of uniform current on the propagation of flexural gravity waves due to a floating ice sheet is analyzed in two dimensions. The problem is formulated as an initial boundary value problem in the linearized theory of water waves. By using the Laplace transform technique, the initial boundary value problem is reduced to a boundary value problem, which is solved by the application of the Fourier transform to obtain the surface elevation in terms of an integral; this integral is evaluated asymptotically for large distance and time by the method of stationary phase to obtain the far-field behavior of the progressive waves. The effect of current on the wavelength, phase velocity, and group velocity of the flexural gravity waves propagating below the floating ice sheet is analyzed theoretically to obtain certain critical values of the current speed which are of significant importance. Simple numerical computations are performed to observe the effect of uniform current on the surface elevation, wavelength, phase velocity, and group velocity of flexural gravity waves and on the far-field behavior of the progressive waves.
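The current's effect on wavelength can be illustrated with a simplified Doppler-shifted dispersion relation. The sketch below assumes the reduced form (omega - U*k)^2 = (g*k + D*k^5/rho) * tanh(k*h), neglecting ice inertia and compressive force (a simplification; the paper's model is more complete), and all material values are illustrative:

```python
import math

g, rho, h = 9.81, 1025.0, 100.0          # gravity (m/s^2), water density, depth (m)
E_ice, h_ice, nu = 5e9, 1.0, 0.3         # Young's modulus, ice thickness, Poisson ratio
D = E_ice * h_ice**3 / (12 * (1 - nu**2))   # flexural rigidity of the ice sheet

def wavenumber(omega, U, lo=1e-6, hi=0.5):
    """Solve (omega - U*k)^2 = (g*k + D*k^5/rho)*tanh(k*h) for k by bisection."""
    f = lambda k: (omega - U * k) ** 2 - (g * k + D * k**5 / rho) * math.tanh(k * h)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

k0 = wavenumber(0.5, 0.0)     # no current
kf = wavenumber(0.5, 0.5)     # following current lengthens the waves (smaller k)
ko = wavenumber(0.5, -0.5)    # opposing current shortens them (larger k)
```

The ordering kf < k0 < ko is the basic Doppler effect the paper quantifies; the critical current speeds arise where the shifted relation ceases to admit such roots.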
APA, Harvard, Vancouver, ISO, and other styles
7

Wei, Xiangxiang, and Dechang Yang. "An adaptive fault line selection method based on wavelet packet comprehensive singular value for small current grounding system." In 2015 5th International Conference on Electric Utility Deregulation and Restructuring and Power Technologies (DRPT). IEEE, 2015. http://dx.doi.org/10.1109/drpt.2015.7432398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Theodoulidis, Theodoros P. "The Truncated Region Eigenfunction Expansion Method for the Solution of Boundary Value Problems in Eddy Current Nondestructive Evaluation." In REVIEW OF PROGRESS IN QUANTITATIVE NONDESTRUCTIVE EVALUATION. AIP, 2005. http://dx.doi.org/10.1063/1.1916704.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Mantulenko, Valentina Vyacheslavovna, and Yulia Borisovna Golub. "ASSESSMENT OF THE MARKET VALUE OF THE COMPANY BY DETERMINING INTRINSIC VALUE BASED ON PJSC MMC NORILSK NICKEL." In Russian science: actual researches and developments. Samara State University of Economics, 2020. http://dx.doi.org/10.46554/russian.science-2020.03-1-220/229.

Full text
Abstract:
The authors consider the basic theoretical concepts related to the process of assessing business value. The basic coefficients for forecasting the company's activities are analyzed. The intrinsic value of the stock was calculated by two methods: the discounted cash flow method and comparison with peers. The current trends in the company's share value on the Moscow Stock Exchange are analyzed, and an investment recommendation is given.
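The cash flow discounting approach the abstract mentions can be illustrated with a minimal two-stage DCF sketch. The forecast horizon, discount rate, and terminal growth rate below are invented placeholders, not the study's figures:

```python
def dcf_intrinsic_value(cash_flows, discount_rate, terminal_growth):
    """Present value of projected free cash flows plus a Gordon-growth
    terminal value. Illustrative only: inputs are placeholder values."""
    n = len(cash_flows)
    # discount each forecast-year cash flow (year 1 through year n)
    pv_flows = sum(cf / (1 + discount_rate) ** (t + 1)
                   for t, cf in enumerate(cash_flows))
    # terminal value at year n, then discounted back to today
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv_terminal = terminal / (1 + discount_rate) ** n
    return pv_flows + pv_terminal

value = dcf_intrinsic_value([100.0, 110.0, 121.0],
                            discount_rate=0.10, terminal_growth=0.02)
```

The comparison-with-peers method mentioned alongside it would instead apply a median peer multiple (for example EV/EBITDA) to the company's own metric.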
APA, Harvard, Vancouver, ISO, and other styles
10

Willett, Fred T., and Michael R. Pothier. "An Improved Method for Evaluating Market Value of Turbine Gaspath Component Alternatives." In ASME Turbo Expo 2003, collocated with the 2003 International Joint Power Generation Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/gt2003-38707.

Full text
Abstract:
The large installed base of large frame industrial gas turbines has prompted a number of replacement part offerings in addition to the replacement parts offered by the OEM. Willett [1] proposed an economic model developed to evaluate gas turbine component alternatives for base load and cyclic duty operation. The improved method expands the capability of the earlier model by including risk level as a variable. Power plant operator value of alternative replacement turbine components for a popular large frame industrial gas turbine is evaluated. A baseline case is established to represent the current component repair and replacement situation, assuming no risk. Each of the modes of power plant operation is evaluated from a long-term financial focus. A short-term financial focus is evaluated for contrast and discussed briefly. The long-term focus is characterized by a nine-year evaluation period, while the short-term focus is based on first-year benefit only. Four factors are varied: part price, output increase, simple cycle efficiency increase, and additional risk. Natural gas fuel is considered at two different gas prices. Peak, off-peak, and spot market electricity prices are considered. Results are calculated and compared using net present value (NPV) criteria. A case study is presented to demonstrate the method's applicability to a range of different risk scenarios, from ill-fitting replacement parts to catastrophic turbine failure.
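The NPV comparison criterion can be sketched as follows. The discount rate, part price, and annual benefit are hypothetical placeholders; the paper's actual model varies part price, output, efficiency, and risk across operating modes:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time zero (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical alternative part: higher up-front price, recovered through
# output and efficiency gains over a nine-year evaluation period.
part_price = 1_000_000.0
annual_benefit = 200_000.0
alternative_npv = npv(0.08, [-part_price] + [annual_benefit] * 9)

# One way to fold in risk, in the spirit of the improved method: scale the
# expected benefit by the probability that the part performs as promised.
risk_adjusted_npv = npv(0.08, [-part_price] + [0.9 * annual_benefit] * 9)
```

A short-term (first-year-only) focus would evaluate just `cash_flows[:2]`, which is why it can rank alternatives differently from the nine-year view.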
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Current value method"

1

Ayoul-Guilmard, Q., F. Nobile, S. Ganesh, et al. D6.4 Report on stochastic optimisation for unsteady problems. Scipedia, 2021. http://dx.doi.org/10.23967/exaqute.2021.2.003.

Full text
Abstract:
This report brings together methodological research on stochastic optimisation and work on benchmark and target applications of the ExaQute project, with a focus on unsteady problems. A practical, general method for the optimisation of the conditional value at risk is proposed. Three different optimisation problems are described: an oscillator problem selected as a suitable trial and illustration case; the shape optimisation of an airfoil, chosen as a benchmark application in the project; the shape optimisation of a tall building, which is the challenging target application set for ExaQUte. For each problem, the current developments and results are presented, the application of the proposed method is discussed, and the work to be done until the end of the project is laid out.
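The conditional value at risk that the report optimises can be estimated from samples via the Rockafellar-Uryasev representation. The sketch below is a generic sample estimator in pure Python, an assumed illustration rather than the project's implementation:

```python
def cvar(losses, alpha):
    """Empirical conditional value at risk (expected shortfall) at level
    alpha, using the Rockafellar-Uryasev representation
        CVaR_a(L) = min_t  t + E[(L - t)+] / (1 - a),
    whose minimizer t is the empirical alpha-quantile (the VaR)."""
    xs = sorted(losses)
    n = len(xs)
    k = min(int(alpha * n), n - 1)
    t = xs[k]  # empirical alpha-quantile
    tail = sum(max(x - t, 0.0) for x in xs) / n
    return t + tail / (1.0 - alpha)

# Mean of the worst (1 - alpha) fraction of losses:
tail_risk = cvar([2.0, 1.5, 3.0, 8.0, 2.5, 1.0, 4.0, 2.0, 1.8, 12.0], 0.8)
```

In a stochastic optimisation loop, this scalar replaces the plain expectation as the objective, penalising designs whose worst-case tail is heavy even when their average performance is good.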
APA, Harvard, Vancouver, ISO, and other styles
2

Kyllönen, Katriina, Karri Saarnio, Ulla Makkonen, and Heidi Hellén. Verification of the validity of air quality measurements related to the Directive 2004/107/EC in 2019-2020 (DIRME2019). Finnish Meteorological Institute, 2020. http://dx.doi.org/10.35614/isbn.9789523361256.

Full text
Abstract:
This project summarizes the results from 2000–2020 and evaluates the trueness and the quality control (QC) procedures of the ongoing polycyclic aromatic hydrocarbon (PAH) and trace element measurements in Finland relating to Air Quality (AQ) Directive 2004/107/EC. The evaluation was focused on benzo(a)pyrene and other PAH compounds as well as arsenic, cadmium and nickel in PM10 and deposition. Additionally, it included lead and other metals in PM10 and deposition, gaseous mercury and mercury deposition, and briefly other specific AQ measurements such as volatile organic compounds (VOC) and PM2.5 chemical composition. This project was conducted by the National Reference Laboratory on air quality, and this was the first time these measurements were assessed. A major part of the project was field and laboratory audits of the ongoing PAH and metal measurements. Other measurements were briefly evaluated through interviews and available literature. In addition, the national AQ database, the expertise of local measurement networks and related publications were utilised. In total, all the seven measurement networks performing PAH and metal measurements in 2019–2020 took part in the audits. Eleven stations were audited, while these measurements are performed at 22 AQ stations in Finland. For the large networks, one station was chosen to represent the performance of the network. The audits also included six laboratories performing the analysis of the collected samples. The audits revealed the compliance of the measurements with the AQ Decree 113/2017, Directive 2004/107/EC and Standards of the European Committee for Standardization (CEN). In addition, general information on the measurements, instruments and quality control procedures was gained. The results of the laboratory audits were confidential, but this report includes general findings, and the measurement networks were informed of the audit results with the permission of the participating laboratories.
As a conclusion, the measurement methods used were mainly reference methods. Currently, all sampling methods are reference methods; however, before 2018 three networks used other methods that may have underestimated concentrations. Regarding these measurements, it should be noted that the results are not comparable with the reference method. Laboratory methods were reference methods excluding two cases, of which the first was considered an acceptable equivalent method. For the other, a change to a reference method was strongly recommended, and this was realized in 2020. For some new measurements, the ongoing QC procedures were not yet fully established, and advice was given. Some networks used consultants for calibration and maintenance, and thus were not fully aware of the QC procedures. EN Standards were mostly followed. The main concerns were related to the checks of flow and the calculation of measurement uncertainty, and suggestions for improvement were given. When the measurement networks implement the recommendations given in the audits, it can be concluded that the EN Standards are adequately followed in the networks. In the ongoing sampling, clear factors risking the trueness of the results were not found. This also applies to the laboratory analyses in 2020. One network had concentrations above the target value, and its indicative measurements should be upgraded to fixed measurements.
APA, Harvard, Vancouver, ISO, and other styles
3

Wright, Kirsten. Collecting Plant Phenology Data In Imperiled Oregon White Oak Ecosystems: Analysis and Recommendations for Metro. Portland State University, 2020. http://dx.doi.org/10.15760/mem.64.

Full text
Abstract:
Highly imperiled Oregon white oak ecosystems are a regional conservation priority of numerous organizations, including Oregon Metro, a regional government serving over one million people in the Portland area. Previously dominant systems in the Pacific Northwest, upland prairie and oak woodlands are now experiencing significant threat, with only 2% remaining in the Willamette Valley in small fragments (Hulse et al. 2002). These fragments are of high conservation value because of the rich biodiversity they support, including rare and endemic species, such as Delphinium leucophaeum (Oregon Department of Agriculture, 2020). Since 2010, Metro scientists and volunteers have collected phenology data on approximately 140 species of forbs and graminoids in regional oak prairie and woodlands. Phenology is the study of life-stage events in plants and animals, such as budbreak and senescence in flowering plants, and widely acknowledged as a sensitive indicator of environmental change (Parmesan 2007). Indeed, shifts in plant phenology have been observed over the last few decades as a result of climate change (Parmesan 2006). In oak systems, these changes have profound implications for plant community composition and diversity, as well as trophic interactions and general ecosystem function (Willis 2008). While the original intent of Metro’s phenology data-collection was to track long-term phenology trends, limitations in data collection methods have made such analysis difficult. Rather, these data are currently used to inform seasonal management decisions on Metro properties, such as when to collect seed for propagation and when to spray herbicide to control invasive species. Metro is now interested in fine-tuning their data-collection methods to better capture long-term phenology trends to guide future conservation strategies. Addressing the regional and global conservation issues of our time will require unprecedented collaboration. 
Phenology data collected on Metro properties is not only an important asset for Metro’s conservation plan, but holds potential to support broader research on a larger scale. As a leader in urban conservation, Metro is poised to make a meaningful scientific contribution by sharing phenology data with regional and national organizations. Data-sharing will benefit the common goal of conservation and create avenues for collaboration with other scientists and conservation practitioners (Rosemartin 2013). In order to support Metro’s ongoing conservation efforts in Oregon white oak systems, I have implemented a three-part master’s project. Part one of the project examines Metro’s previously collected phenology data, providing descriptive statistics and assessing the strengths and weaknesses of the methods by which the data were collected. Part two makes recommendations for improving future phenology data-collection methods, and includes recommendations for data-sharing with regional and national organizations. Part three is a collection of scientific vouchers documenting key plant species in varying phases of phenology for Metro’s teaching herbarium. The purpose of these vouchers is to provide a visual tool for Metro staff and volunteers who rely on plant identification to carry out aspects of their job in plant conservation. Each component of this project addresses specific aspects of Metro’s conservation program, from day-to-day management concerns to long-term scientific inquiry.
APA, Harvard, Vancouver, ISO, and other styles
4

Job, Jacob. Mesa Verde National Park: Acoustic monitoring report. National Park Service, 2021. http://dx.doi.org/10.36967/nrr-2286703.

Full text
Abstract:
In 2015, the Natural Sounds and Night Skies Division (NSNSD) received a request to collect baseline acoustical data at Mesa Verde National Park (MEVE). Between July and August 2015, as well as February and March 2016, three acoustical monitoring systems were deployed throughout the park; however, one site (MEVE002) stopped recording after a couple of days during the summer due to wildlife interference. The goal of the study was to establish a baseline soundscape inventory of backcountry and frontcountry sites within the park. This inventory will be used to establish indicators and thresholds of soundscape quality that will support the park and NSNSD in developing a comprehensive approach to protecting the acoustic environment through soundscape management planning. Additionally, results of this study will help the park identify major sources of noise within the park, as well as provide a baseline understanding of the acoustical environment as a whole for use in potential future comparative studies. In this deployment, sound pressure level (SPL) was measured continuously every second by a calibrated sound level meter. Other equipment included an anemometer to collect wind speed and a digital audio recorder collecting continuous recordings to document sound sources. In this document, “sound pressure level” refers to broadband (12.5 Hz–20 kHz), A-weighted, 1-second time averaged sound level (LAeq, 1s), hereafter referred to as “sound level.” Sound levels are measured on a logarithmic scale relative to the reference sound pressure for atmospheric sources, 20 μPa. The logarithmic scale is a useful way to express the wide range of sound pressures perceived by the human ear. Sound levels are reported in decibels (dB). A-weighting is applied to sound levels in order to account for the response of the human ear (Harris, 1998). To approximate human hearing sensitivity, A-weighting discounts sounds below 1 kHz and above 6 kHz.
Trained technicians calculated time audible metrics after monitoring was complete. See Methods section for protocol details, equipment specifications, and metric calculations. Median existing (LA50) and natural ambient (LAnat) metrics are also reported for daytime (7:00–19:00) and nighttime (19:00–7:00). Prominent noise sources at the two backcountry sites (MEVE001 and MEVE002) included vehicles and aircraft, while building and vehicle noise predominated at the frontcountry site (MEVE003). Table 1 displays time audible values for each of these noise sources during the monitoring period, as well as ambient sound levels. In determining the current conditions of an acoustical environment, it is informative to examine how often sound levels exceed certain values. Table 2 reports the percent of time that measured levels at the three monitoring locations were above four key values.
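The decibel arithmetic the abstract describes can be made concrete. The sketch below assumes the standard 20 μPa reference and shows why equivalent-continuous levels such as LAeq must be averaged on the energy scale rather than in dB:

```python
import math

P_REF = 20e-6  # reference sound pressure, pascals (20 micropascals)

def spl_db(p_rms):
    """Sound pressure level in dB relative to 20 uPa."""
    return 20.0 * math.log10(p_rms / P_REF)

def leq(levels_db):
    """Equivalent continuous level of 1-second samples (as for LAeq):
    average the squared-pressure ratios (energy), not the dB values."""
    mean_energy = sum(10.0 ** (level / 10.0) for level in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)
```

Because the average is taken on the energy scale, a single loud second dominates: `leq([50, 70])` is about 67 dB, far above the 60 dB arithmetic mean of the dB values.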
APA, Harvard, Vancouver, ISO, and other styles
5

Carney, Nancy, Tamara Cheney, Annette M. Totten, et al. Prehospital Airway Management: A Systematic Review. Agency for Healthcare Research and Quality (AHRQ), 2021. http://dx.doi.org/10.23970/ahrqepccer243.

Full text
Abstract:
Objective. To assess the comparative benefits and harms across three airway management approaches (bag valve mask [BVM], supraglottic airway [SGA], and endotracheal intubation [ETI]) by emergency medical services in the prehospital setting, and how the benefits and harms differ based on patient characteristics, techniques, and devices. Data sources. We searched electronic citation databases (Ovid® MEDLINE®, CINAHL®, the Cochrane Central Register of Controlled Trials, the Cochrane Database of Systematic Reviews, and Scopus®) from 1990 to September 2020 and reference lists, and posted a Federal Register notice request for data. Review methods. Review methods followed Agency for Healthcare Research and Quality Evidence-based Practice Center Program methods guidance. Using pre-established criteria, studies were selected and dual reviewed, data were abstracted, and studies were evaluated for risk of bias. Meta-analyses using profile-likelihood random effects models were conducted when data were available from studies reporting on similar outcomes, with analyses stratified by study design, emergency type, and age. We qualitatively synthesized results when meta-analysis was not indicated. Strength of evidence (SOE) was assessed for primary outcomes (survival, neurological function, return of spontaneous circulation [ROSC], and successful advanced airway insertion [for SGA and ETI only]). Results. We included 99 studies (22 randomized controlled trials and 77 observational studies) involving 630,397 patients. Overall, we found few differences in primary outcomes when airway management approaches were compared.
• For survival, there was moderate SOE for findings of no difference for BVM versus ETI in adult and mixed-age cardiac arrest patients. There was low SOE for no difference in these patients for BVM versus SGA and SGA versus ETI. There was low SOE for all three comparisons in pediatric cardiac arrest patients, and low SOE in adult trauma patients when BVM was compared with ETI.
• For neurological function, there was moderate SOE for no difference for BVM compared with ETI in adults with cardiac arrest. There was low SOE for no difference in pediatric cardiac arrest for BVM versus ETI and SGA versus ETI. In adults with cardiac arrest, neurological function was better for BVM and ETI compared with SGA (both low SOE).
• ROSC was applicable only in cardiac arrest. For adults, there was low SOE that ROSC was more frequent with SGA compared with ETI, and no difference for BVM versus SGA or BVM versus ETI. In pediatric patients there was low SOE of no difference for BVM versus ETI and SGA versus ETI.
• For successful advanced airway insertion, low SOE supported better first-pass success with SGA in adult and pediatric cardiac arrest patients and adult patients in studies that mixed emergency types. Low SOE also supported no difference for first-pass success in adult medical patients. For overall success, there was moderate SOE of no difference for adults with cardiac arrest, medical, and mixed emergency types.
• While harms were not always measured or reported, moderate SOE supported all available findings. There were no differences in harms for BVM versus SGA or ETI. When SGA was compared with ETI, there were no differences for aspiration, oral/airway trauma, and regurgitation; SGA was better for multiple insertion attempts; and ETI was better for inadequate ventilation.
Conclusions. The most common findings, across emergency types and age groups, were of no differences in primary outcomes when prehospital airway management approaches were compared. As most of the included studies were observational, these findings may reflect study design and methodological limitations. Due to the dynamic nature of the prehospital environment, the results are susceptible to indication and survival biases as well as confounding; however, the current evidence does not favor more invasive airway approaches. No conclusion was supported by high SOE for any comparison and patient group. This supports the need for high-quality randomized controlled trials designed to account for the variability and dynamic nature of prehospital airway management to advance and inform clinical practice as well as emergency medical services education and policy, and to improve patient-centered outcomes.
APA, Harvard, Vancouver, ISO, and other styles
6

Viswanathan, Meera, Jennifer Cook Middleton, Alison Stuebe, et al. Maternal, Fetal, and Child Outcomes of Mental Health Treatments in Women: A Systematic Review of Perinatal Pharmacologic Interventions. Agency for Healthcare Research and Quality (AHRQ), 2021. http://dx.doi.org/10.23970/ahrqepccer236.

Full text
Abstract:
Background. Untreated maternal mental health disorders can have devastating sequelae for the mother and child. For women who are currently or planning to become pregnant or are breastfeeding, a critical question is whether the benefits of treating psychiatric illness with pharmacologic interventions outweigh the harms for mother and child. Methods. We conducted a systematic review to assess the benefits and harms of pharmacologic interventions compared with placebo, no treatment, or other pharmacologic interventions for pregnant and postpartum women with mental health disorders. We searched four databases and other sources for evidence available from inception through June 5, 2020 and surveilled the literature through March 2, 2021; dually screened the results; and analyzed eligible studies. We included studies of pregnant, postpartum, or reproductive-age women with a new or preexisting diagnosis of a mental health disorder treated with pharmacotherapy; we excluded psychotherapy. Eligible comparators included women with the disorder but no pharmacotherapy or women who discontinued the pharmacotherapy before pregnancy. Results. A total of 164 studies (168 articles) met eligibility criteria. Brexanolone for depression onset in the third trimester or in the postpartum period probably improves depressive symptoms at 30 days (least square mean difference in the Hamilton Rating Scale for Depression, -2.6; p=0.02; N=209) when compared with placebo. Sertraline for postpartum depression may improve response (calculated relative risk [RR], 2.24; 95% confidence interval [CI], 0.95 to 5.24; N=36), remission (calculated RR, 2.51; 95% CI, 0.94 to 6.70; N=36), and depressive symptoms (p-values ranging from 0.01 to 0.05) when compared with placebo. Discontinuing use of mood stabilizers during pregnancy may increase recurrence (adjusted hazard ratio [AHR], 2.2; 95% CI, 1.2 to 4.2; N=89) and reduce time to recurrence of mood disorders (2 vs. 28 weeks, AHR, 12.1; 95% CI, 1.6 to 91; N=26) for bipolar disorder when compared with continued use. Brexanolone for depression onset in the third trimester or in the postpartum period may increase the risk of sedation or somnolence, leading to dose interruption or reduction when compared with placebo (5% vs. 0%). More than 95 percent of studies reporting on harms were observational in design and unable to fully account for confounding. These studies suggested some associations between benzodiazepine exposure before conception and ectopic pregnancy; between specific antidepressants during pregnancy and adverse maternal outcomes such as postpartum hemorrhage, preeclampsia, and spontaneous abortion, and child outcomes such as respiratory issues, low Apgar scores, persistent pulmonary hypertension of the newborn, depression in children, and autism spectrum disorder; between quetiapine or olanzapine and gestational diabetes; and between benzodiazepine and neonatal intensive care admissions. Causality cannot be inferred from these studies. We found insufficient evidence on benefits and harms from comparative effectiveness studies, with one exception: one study suggested a higher risk of overall congenital anomalies (adjusted RR [ARR], 1.85; 95% CI, 1.23 to 2.78; N=2,608) and cardiac anomalies (ARR, 2.25; 95% CI, 1.17 to 4.34; N=2,608) for lithium compared with lamotrigine during first-trimester exposure. Conclusions. Few studies have been conducted in pregnant and postpartum women on the benefits of pharmacotherapy; many studies report on harms but are of low quality. The limited evidence available is consistent with some benefit, and some studies suggested increased adverse events. However, because these studies could not rule out underlying disease severity as the cause of the association, the causal link between the exposure and adverse events is unclear. Patients and clinicians need to make an informed, collaborative decision on treatment choices.
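The calculated relative risks and confidence intervals quoted in the abstract follow a standard formula; a generic sketch with made-up event counts (not the review's data) is:

```python
import math

def relative_risk_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs. group B, with a log-scale Wald 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    # standard error of log(RR)
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk_ci(8, 18, 4, 18)  # hypothetical trial counts
```

When the interval crosses 1 (as it does for the sertraline response estimate of 2.24, 95% CI 0.95 to 5.24), the effect cannot be distinguished from no difference at the 5% level, which is why such findings are hedged as "may improve".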
APA, Harvard, Vancouver, ISO, and other styles
7

African Open Science Platform Part 1: Landscape Study. Academy of Science of South Africa (ASSAf), 2019. http://dx.doi.org/10.17159/assaf.2019/0047.

Full text
Abstract:
This report maps the African landscape of Open Science – with a focus on Open Data as a sub-set of Open Science. Data to inform the landscape study were collected through a variety of methods, including surveys, desk research, engagement with a community of practice, networking with stakeholders, participation in conferences, case study presentations, and workshops hosted. Although the majority of African countries (35 of 54) demonstrate commitment to science through their investment in research and development (R&D), academies of science, ministries of science and technology, policies, recognition of research, and participation in the Science Granting Councils Initiative (SGCI), the following countries demonstrate the highest commitment and political willingness to invest in science: Botswana, Ethiopia, Kenya, Senegal, South Africa, Tanzania, and Uganda. In addition to existing policies in Science, Technology and Innovation (STI), the following countries have made progress towards Open Data policies: Botswana, Kenya, Madagascar, Mauritius, South Africa and Uganda. Only two African countries (Kenya and South Africa) at this stage contribute 0.8% of their GDP (Gross Domestic Product) to R&D (Research and Development), which is the closest to the AU's (African Union's) suggested 1%. Countries such as Lesotho and Madagascar ranked at 0%, while the R&D expenditure for 24 African countries is unknown. In addition to this, science globally has become fully dependent on stable ICT (Information and Communication Technologies) infrastructure, which includes connectivity/bandwidth, high performance computing facilities and data services. This is especially applicable since countries globally are finding themselves in the midst of the 4th Industrial Revolution (4IR), which is not only “about” data, but which “is” data.
According to an article by Alan Marcus (2015) (Senior Director, Head of Information Technology and Telecommunications Industries, World Economic Forum), “At its core, data represents a post-industrial opportunity. Its uses have unprecedented complexity, velocity and global reach. As digital communications become ubiquitous, data will rule in a world where nearly everyone and everything is connected in real time. That will require a highly reliable, secure and available infrastructure at its core, and innovation at the edge.” Every industry is affected as part of this revolution – also science. An important component of the digital transformation is “trust” – people must be able to trust that governments and all other industries (including the science sector) adequately handle and protect their data. This requires accountability on a global level, and digital industries must embrace the change and go for a higher standard of protection. “This will reassure consumers and citizens, benefitting the whole digital economy”, says Marcus. A stable and secure information and communication technologies (ICT) infrastructure – currently provided by the National Research and Education Networks (NRENs) – is key to advancing collaboration in science. The AfricaConnect2 project (AfricaConnect (2012–2014) and AfricaConnect2 (2016–2018)), through establishing connectivity between NRENs, is planning to roll out AfricaConnect3 by the end of 2019. The concern, however, is that selected African governments (with the exception of a few countries such as South Africa, Mozambique, Ethiopia and others) have low awareness of the impact the Internet has today on all societal levels, how much ICT (and the 4th Industrial Revolution) have affected research, and the added value an NREN can bring to higher education and research in addressing the respective needs, which is far more complex than simply providing connectivity.
Apart from more commitment and investment in R&D, African governments – to become and remain part of the 4th Industrial Revolution – have no option other than to acknowledge and commit to the role NRENs play in advancing science towards addressing the SDGs (Sustainable Development Goals). For successful collaboration and direction, it is fundamental that policies within one country are aligned with one another. Alignment on a continental level is crucial for the future Pan-African Open Science Platform to be successful. Both the HIPSSA (Harmonization of ICT Policies in Sub-Saharan Africa) project and WATRA (the West Africa Telecommunications Regulators Assembly) have made progress towards the regulation of the telecom sector, and in particular of bottlenecks which curb the development of competition among ISPs. A study under HIPSSA identified potential bottlenecks in access at an affordable price to the international capacity of submarine cables and suggested means and tools used by regulators to remedy them. Work on the recommended measures and making them operational continues in collaboration with WATRA. In addition to sufficient bandwidth and connectivity, high-performance computing facilities and services in support of data sharing are also required. The South African National Integrated Cyberinfrastructure System (NICIS) has made great progress in planning and setting up a cyberinfrastructure ecosystem in support of collaborative science and data sharing. The regional Southern African Development Community (SADC) Cyber-infrastructure Framework provides a valuable roadmap towards high-speed Internet, developing human capacity and skills in ICT technologies, high-performance computing and more.
The following countries have been identified as having high-performance computing facilities, some as a result of the Square Kilometre Array (SKA) partnership: Botswana, Ghana, Kenya, Madagascar, Mozambique, Mauritius, Namibia, South Africa, Tunisia, and Zambia. More and more NRENs – especially the Level 6 NRENs (Algeria, Egypt, Kenya, South Africa, and recently Zambia) – are exploring offering additional services, also in support of data sharing and transfer. The following NRENs already allow for running data-intensive applications and sharing of high-end computing assets, bio-modelling and computation on high-performance/supercomputers: KENET (Kenya), TENET (South Africa), RENU (Uganda), ZAMREN (Zambia), EUN (Egypt) and ARN (Algeria). Fifteen higher education training institutions from eight African countries (Botswana, Benin, Kenya, Nigeria, Rwanda, South Africa, Sudan, and Tanzania) have been identified as offering formal courses on data science. In addition to formal degrees, a number of international short courses have been developed, and free international online courses are also available as an option to build capacity and integrate as part of curricula. The small number of higher education or research-intensive institutions offering data science is however insufficient, and there is a desperate need for more training in data science. The CODATA-RDA Schools of Research Data Science aim at addressing the continental need for foundational data skills across all disciplines, along with training conducted by The Carpentries programme (specifically Data Carpentry). Thus far, CODATA-RDA schools in collaboration with AOSP, integrating content from Data Carpentry, were presented in Rwanda (in 2018), and during 17–29 June 2019 in Ethiopia.
Awareness regarding Open Science (including Open Data) is evident through the 12 Open Science-related Open Access/Open Data/Open Science declarations and agreements endorsed or signed by African governments; 200 Open Access journals from Africa registered on the Directory of Open Access Journals (DOAJ); 174 Open Access institutional research repositories registered on openDOAR (Directory of Open Access Repositories); 33 Open Access/Open Science policies registered on ROARMAP (Registry of Open Access Repository Mandates and Policies); 24 data repositories registered with the Registry of Data Repositories (re3data.org) (although the pilot project identified 66 research data repositories); and one data repository assigned the CoreTrustSeal. Although this is a start, far more needs to be done to align African data curation and research practices with global standards. Funding to conduct research remains a challenge. African researchers mostly fund their own research, and there are few incentives for them to make their research and accompanying data sets openly accessible. Funding and peer recognition, along with an enabling research environment conducive to research, are regarded as major incentives. The landscape report concludes with a number of concerns about sharing research data openly, as well as challenges in terms of Open Data policy, ICT infrastructure supportive of data sharing, capacity building, lack of skills, and the need for incentives. Although great progress has been made in terms of Open Science and Open Data practices, more awareness needs to be created and further advocacy efforts are required for buy-in from African governments. A federated African Open Science Platform (AOSP) will not only encourage more collaboration among researchers in addressing the SDGs, but will also benefit the many stakeholders identified as part of the pilot phase.
The time is now, for governments in Africa, to acknowledge the important role of science in general, but specifically Open Science and Open Data, through developing and aligning the relevant policies, investing in an ICT infrastructure conducive for data sharing through committing funding to making NRENs financially sustainable, incentivising open research practices by scientists, and creating opportunities for more scientists and stakeholders across all disciplines to be trained in data management.
APA, Harvard, Vancouver, ISO, and other styles