Journal articles on the topic 'Proportion – Data processing'


Consult the top 50 journal articles for your research on the topic 'Proportion – Data processing.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles across a wide variety of disciplines and organise your bibliography correctly.

1

Kralev, Velin, Radoslava Kraleva, and Petia Koprinkova-Hristova. "Data modelling and data processing generated by human eye movements." International Journal of Electrical and Computer Engineering (IJECE) 11, no. 5 (October 1, 2021): 4345. http://dx.doi.org/10.11591/ijece.v11i5.pp4345-4352.

Abstract:
Data modeling and data processing are important activities in any scientific research. This research focuses on the modeling and processing of data generated by a saccadometer. The approach used is based on the relational data model, but the processing and storage of the data are done with client datasets. The experiments were performed with 26 randomly selected files from a total of 264 experimental sessions. The data from each experimental session were stored in three different formats: text, binary, and extensible markup language (XML). The results showed that the text format and the binary format were the most compact. Several data processing actions were analyzed. Based on the results obtained, the two fastest actions were loading data from a binary file and storing data to a binary file. In contrast, the two slowest actions were storing the data in XML format and loading the data from a text file. One of the more time-consuming operations turned out to be the conversion of data from text format to binary format; moreover, the time required to perform this action does not grow in proportion to the number of records processed.
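A minimal Python sketch of the kind of comparison the abstract describes; the (timestamp, x, y) record layout is an illustrative assumption, not the paper's actual saccadometer schema:

```python
import struct

# Hypothetical saccade records (timestamp, x, y); the paper's real schema
# is not given in the abstract, so this layout is an illustrative stand-in.
records = [(i, 0.1 * i, 0.2 * i) for i in range(10_000)]

# Text format: one tab-separated line per record.
text_blob = "\n".join(f"{t}\t{x}\t{y}" for t, x, y in records).encode()
# Binary format: fixed-size packed structs (int32 + two float64 = 20 bytes).
bin_blob = b"".join(struct.pack("<idd", t, x, y) for t, x, y in records)

def load_text(blob):
    out = []
    for line in blob.decode().splitlines():
        t, x, y = line.split("\t")
        out.append((int(t), float(x), float(y)))
    return out

def load_bin(blob):
    size = struct.calcsize("<idd")  # 20 bytes per record
    return [struct.unpack("<idd", blob[i:i + size])
            for i in range(0, len(blob), size)]
```

Timing the two loaders with `time.perf_counter()` would let one compare the qualitative pattern reported: the binary path avoids string parsing entirely, while text-to-binary conversion pays for both.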
2

Luo, Wen Hua. "The Processing and Analyzing of Non-Structured Data in Digital Investigation." Advanced Materials Research 774-776 (September 2013): 1807–11. http://dx.doi.org/10.4028/www.scientific.net/amr.774-776.1807.

Abstract:
Unstructured data accounts for a far larger share of total data than structured data, yet research on methods for processing and analyzing unstructured data is not as mature as that for structured data. This paper illustrates the importance of research on unstructured data and, from the perspective of digital investigation, describes the key techniques for processing and analyzing it. Drawing on the self-developed Intelligent Analyzing System of Mass Case Information and the background of handling online soccer gambling cases in mainland China, it details the specific application of unstructured data processing and analysis in digital investigation.
3

Palguyev, D. A., and A. N. Shentyabin. "Matrix application for multi-radar processing of radar data arrays." Radio industry (Russia) 30, no. 3 (September 8, 2020): 99–111. http://dx.doi.org/10.21778/2413-9599-2020-30-3-99-111.

Abstract:
In the processing of dynamically changing data, for example radar data (RD), a crucial role is played by the representation of the various data sets containing information about the routes and attributes of air objects. In practical implementations of the computational process, it previously seemed natural for RD processing in data arrays to be carried out by elementwise search. However, representing data arrays as matrices and using matrix mathematics allows the calculations in tertiary processing to be organized optimally. Forming matrices and working with them requires a significant computational resource, so the authors assume that a certain gain in calculation time may be achieved when the arrays hold a large amount of data, at least several thousand messages. The article shows the sequences of the most frequently repeated operations of tertiary network processing, such as searching for and replacing an array element. The simulation results show that the processing efficiency (relative reduction of processing time and saving of computing resources) with matrices, in comparison with elementwise search and replacement, increases in proportion to the number of messages received by the information processing device. The most significant gain is observed when processing several thousand messages (array elements). Thus, the use of matrices and the mathematical apparatus of matrix algebra for processing arrays of dynamically changing data can reduce processing time and save computational resources. The proposed matrix method of organizing calculations can also find its place in the modeling of complex information systems.
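The gain described can be illustrated with a small NumPy sketch: replacing a per-element scan with one boolean-mask operation over the whole array. The field layout and sizes are assumptions, not the authors' actual radar data structures.

```python
import numpy as np

# Illustrative message store: each row is (track_id, x, y); field names
# and sizes are assumptions, not the authors' radar data structures.
rng = np.random.default_rng(0)
n = 5000
tracks = np.column_stack([rng.integers(0, 500, n).astype(float),
                          rng.uniform(-100, 100, (n, 2))])

def replace_elementwise(arr, track_id, new_xy):
    # Classic per-element scan: Python-level iteration over every row.
    for row in arr:
        if row[0] == track_id:
            row[1:] = new_xy
    return arr

def replace_matrix(arr, track_id, new_xy):
    # Matrix form: one boolean mask, one vectorized assignment.
    arr[arr[:, 0] == track_id, 1:] = new_xy
    return arr

a = replace_elementwise(tracks.copy(), 42, (1.0, 2.0))
b = replace_matrix(tracks.copy(), 42, (1.0, 2.0))
```

Both functions produce the same result; the masked form hands the scan to compiled code, which is where the reported speedup on arrays of several thousand messages comes from.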
4

Hussin, Masnida, Raja Azlina Raja Mahmood, and Mas Rina Mustaffa. "Sensor Communication Model Using Cyber-Physical System Approach for Green Data Center." International Journal of Interactive Mobile Technologies (iJIM) 13, no. 10 (September 25, 2019): 188. http://dx.doi.org/10.3991/ijim.v13i10.11310.

Abstract:
Energy consumption in distributed computing systems has gained considerable attention recently as their processing capacity becomes significant for better business and economic operations. A comprehensive analysis of energy efficiency in a high-performance data center for distributed processing requires the ability to monitor the proportion of resource utilization versus energy consumption. In order to achieve a green data center while sustaining computational performance, a model of energy-efficient cyber-physical communication is proposed. Real-time sensor communication is used to monitor the heat emitted by processors and the room temperature. Specifically, our cyber-physical communication model dynamically identifies processing states in the data center while recommending a suitable air-conditioning temperature level. The information is then used by administrators to fine-tune the room temperature according to the current processing activities. Our automated triggering approach aims to improve edge computing performance with cost-effective energy consumption. Simulation experiments show that our cyber-physical communication achieves better energy consumption and resource utilization compared with other cooling models.
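A toy sketch of the triggering idea: map observed processor heat and utilization to an air-conditioning set-point. All thresholds and set-points here are invented for illustration, not values from the paper.

```python
# Toy trigger: map processor temperature (degrees C) and utilization to an
# air-conditioning set-point; all thresholds and set-points are invented.
def cooling_setpoint(cpu_temp_c, utilization):
    if cpu_temp_c > 80 or utilization > 0.9:
        return 18.0  # heavy processing: cool aggressively
    if cpu_temp_c > 65 or utilization > 0.6:
        return 21.0  # moderate load
    return 24.0      # light load: relax cooling to save energy

setpoints = [cooling_setpoint(85, 0.5), cooling_setpoint(70, 0.5),
             cooling_setpoint(50, 0.2)]
```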
5

Men, Hong, Donglin Chen, Xiaoting Zhang, Jingjing Liu, and Ke Ning. "Data Fusion of Electronic Nose and Electronic Tongue for Detection of Mixed Edible-Oil." Journal of Sensors 2014 (2014): 1–7. http://dx.doi.org/10.1155/2014/840685.

Abstract:
To avoid wasting edible oil in food processing and to control production costs, producers often, on the premise of food safety, add new edible oil to old frying oil that has already been used. Because different blending proportions of the oils yield different compositions and different volatile gases, this paper uses data fusion of an electronic nose and an electronic tongue to detect the blending ratio of old frying oil and new edible oil. Principal component analysis (PCA) is used to distinguish the different proportions of old frying oil and new edible oil, and partial least squares (PLS) is used to predict the blending ratio. Two conclusions are drawn: data fusion of the electronic nose and electronic tongue can be used to detect the blending ratio of old frying oil and new edible oil, and, compared with using the electronic nose or the electronic tongue alone, detection performance improves when their data are fused.
6

Pawitradewi, Anak Agung Istri, and Made Gede Wirakusuma. "Pengaruh Kinerja Lingkungan, Umur Perusahaan dan Proporsi Dewan Komisaris Independen pada Pengungkapan Informasi Lingkungan." E-Jurnal Akuntansi 30, no. 3 (March 14, 2020): 598. http://dx.doi.org/10.24843/eja.2020.v30.i03.p04.

Abstract:
This study aims to determine the effect of environmental performance, company age, and the proportion of independent commissioners on the disclosure of environmental information. The sample consisted of 24 companies, obtained by purposive sampling with the criteria of high-profile companies listed on the IDX and registered as participants in PROPER for 2016-2018. The research data were processed using multiple linear regression analysis. The results show that environmental performance and the proportion of independent commissioners have a positive influence on the disclosure of environmental information, while company age was found to have no influence on it. Keywords: Environmental Performance; Company Age; Proportion of Independent Commissioners; Environmental Disclosure.
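The multiple-linear-regression step can be sketched in a few lines of NumPy; the data below are synthetic, not the 24-firm PROPER sample, and the coefficients are invented.

```python
import numpy as np

# Synthetic data: 24 observations, three predictors (environmental
# performance, firm age, independent-board proportion); coefficients and
# noise are invented for illustration.
rng = np.random.default_rng(7)
n = 24
X = rng.uniform(0, 1, (n, 3))
y = 0.6 * X[:, 0] + 0.0 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 0.01, n)

A = np.column_stack([np.ones(n), X])           # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # [intercept, b1, b2, b3]
```

Here the recovered b2 (firm age) is near zero while b1 and b3 are positive, mirroring the pattern of results the abstract reports.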
7

Wei, P., A. Li, M. Hou, L. Zhu, D. Xu, and B. Ning. "EQUAL PROPORTION REPRODUCTION METHOD OF GROTTO BASED ON POINT CLOUD." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W15 (August 26, 2019): 1215–19. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w15-1215-2019.

Abstract:
The rapid development of 3D laser scanning and 3D printing technology provides new techniques and ideas for cultural relic protection and reproduction. Aiming at the requirement of equal-proportion reproduction of large-scale grottoes, this paper takes the point cloud data of Cave 18 of the Yungang Grottoes, obtained by 3D laser scanning, as an example and proposes a data processing and reproduction block-partitioning method for equal-proportion reproduction. The Cyclone, Geomagic, and AutoCAD software packages were used to construct the 3D model of the grotto, and 3D printing technology was used to realize the secondary design and model printing. The research focuses on the modeling of massive point clouds and a voxel-based model partitioning method, which meets the requirements of mobility and assembly while realizing equal-proportion reproduction of the whole grotto. The research results and their application can serve as a good reference for future grotto reproduction work.
8

Chen, Kai, Li Qing Fang, and Hong Kai Wang. "The Primary Processing of MEMS Devices and Applications Analysis." Advanced Materials Research 418-420 (December 2011): 2134–38. http://dx.doi.org/10.4028/www.scientific.net/amr.418-420.2134.

Abstract:
This paper details the MEMS technology system, focusing on the analysis of MEMS device processing and application status. Through an analysis of MEMS technology as applied in MEMS devices, it describes the status of MEMS devices in modern life, surveys data on the proportion of market share MEMS devices are expected to hold over the next few years, and analyzes the developments and trends of MEMS devices.
9

Haryanto, Desi Anis Anggraini, Miftachul Ulum, and Achmad Fiqhi Ibadillah. "Image Processing Based Aquaponics Monitoring System." JEEE-U (Journal of Electrical and Electronic Engineering-UMSIDA) 5, no. 1 (March 28, 2021): 37–59. http://dx.doi.org/10.21070/jeeeu.v5i1.1220.

Abstract:
Aquaponics is a cultivation approach well worth applying, since it combines fish-farming techniques with hydroponic plant cultivation. This research develops a smart aquaponics system that can control and monitor the acidity level, water temperature, and fish feeding, with a camera installed to monitor fish development. Sensors are installed in the system to collect data, so that water quality and circulation are well maintained. The results of this study are as follows: the automatic feeding system ran well in every experiment, with a success rate of 93.33%; the calibrated pH measurements ran well, with manual measurements agreeing with the pH meter sensor at a rate of 97.83%; and the flow meter measurements achieved a success rate of 91.02%. Plant development each week was quite good: in the first week the plants grew about 1 cm after sowing, 3 cm by week 2, and 7 cm by week 3. Fish weight measured by image processing did not differ much from manual measurement: fish length was 7 cm measured manually versus 5.6 cm from the image, and fish weight was 11 g manually versus 11.66 g from the image. Keywords: Aquaponics; Camera; Android; Image Processing; Flow.
10

Huang, Weihua. "Research on the Revolution of Multidimensional Learning Space in the Big Data Environment." Complexity 2021 (May 18, 2021): 1–12. http://dx.doi.org/10.1155/2021/6583491.

Abstract:
Multiuser fair sharing of clusters is a classic problem in cluster construction. However, cluster computing systems for hybrid big data applications have heterogeneous requirements, which leads more and more cluster resource managers to support fine-grained multidimensional resource management. In this context, multiuser sharing of clusters for multidimensional learning resources has become a new topic. Considering fairness alone in a shared cluster results in a huge waste of resources when resource allocation is discrete and dynamic; fairness and efficiency of cluster resource sharing for multidimensional learning resources are equally important. This paper studies big data processing technology and representative systems, analyzes multidimensional analysis and performance optimization technology, and discusses the importance of optimizing discrete multidimensional learning resource allocation in dynamic scenarios. Observing that most of the resources of a big data cluster are consumed by large jobs, which account for a small proportion of job submissions, while the small jobs that account for a large proportion use only a small part of the system's resources, it proposes allocating large jobs first to the server with the least expected residual multidimensional learning resources, while applying only a fair strategy to small jobs. The topic index is distributed and stored across the system to realize parallel search processing and improve search efficiency. The effectiveness of RDIBT is verified through experimental simulation. The results show that RDIBT outperforms the LSII indexing technique in index creation speed and search response speed; in addition, RDIBT can ensure the scalability of the index system.
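One plausible reading of the allocation policy described above, sketched in Python: large jobs are placed best-fit on the server with the least residual resources that still fits, while small jobs follow a plain fair round-robin rule. All names and the exact policy details are assumptions, not the paper's algorithm.

```python
# One plausible reading of the policy: large jobs best-fit on the server
# with the least residual resources that still fits; small jobs use a
# plain fair round-robin rule (no load check), per the abstract.
def allocate(jobs, capacity, large_threshold):
    """jobs: list of (name, demand); capacity: per-server resources."""
    remaining = list(capacity)
    placement, rr = {}, 0
    for name, demand in jobs:
        if demand >= large_threshold:
            fits = [i for i, r in enumerate(remaining) if r >= demand]
            srv = min(fits, key=lambda i: remaining[i])  # least residual
        else:
            srv = rr % len(remaining)                    # fair round-robin
            rr += 1
        remaining[srv] -= demand
        placement[name] = srv
    return placement

placement = allocate([("big", 8), ("s1", 1), ("s2", 1)], [10, 9], 5)
```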
11

Randle, Valerie, Mark Coleman, and Gregory Owen. "Evolution of the Grain Boundary Network as a Consequence of Deformation and Annealing." Materials Science Forum 550 (July 2007): 35–44. http://dx.doi.org/10.4028/www.scientific.net/msf.550.35.

Abstract:
Iterative processing, involving sequential deformation and annealing, has been carried out on copper specimens with the aim of applying grain boundary engineering (GBE) to them. The data have provided some interesting insights into the mechanisms of GBE. The results have demonstrated that development of a high proportion of Σ3s is beneficial to properties, as shown by improved strain-to-failure for the same strength. The proportion of Σ3s saturates at approximately 60% length fraction. Analysis of the data indicates that iterative processing is not always necessary for the development of beneficial properties, and it is further suggested that the condition of the starting specimen has a large influence on the subsequent microstructural development. The present new data are also compared with previous research on copper in which all five parameters of the grain boundary network population were measured.
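The Σ3 length fraction cited above is a simple length-weighted proportion; a minimal sketch with made-up boundary segments:

```python
# Length-weighted Sigma-3 fraction: each boundary is (sigma_value, length);
# the segments below are made up for illustration.
def sigma3_length_fraction(boundaries):
    total = sum(length for _, length in boundaries)
    sigma3 = sum(length for sigma, length in boundaries if sigma == 3)
    return sigma3 / total

frac = sigma3_length_fraction([(3, 6.0), (3, 1.0), (9, 2.0), (1, 1.0)])
```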
12

Li, Bing Jie, Dong Xiao Niu, Yan Lu, Hua Li Xia, and Wei Bin Ding. "Research on Management Engineering with Data Processing for Relationship between Equity Structures and Operating Performance — Take the Utilities Listed Companies as Example." Advanced Materials Research 977 (June 2014): 547–50. http://dx.doi.org/10.4028/www.scientific.net/amr.977.547.

Abstract:
A rational equity structure is key to improving operating performance. Building on the basic theory of international experts, the paper selects data from listed utility companies and, after a series of data processing steps, uses principal component analysis and a multiple linear regression model to build an index system for evaluating the operating performance of utilities; it then conducts an empirical analysis of the relationship between equity structure and operating performance. The results show no significant correlation between the proportion of state-owned restricted shares and the company's profitability, and a positive correlation between the proportion of outstanding shares and profitability. Accordingly, the paper proposes recommendations on management engineering to improve the company's operating performance and the efficiency of management engineering.
13

Tomlin, Dani, Harvey Dillon, and Andrea S. Kelly. "Allowing for Asymmetric Distributions When Comparing Auditory Processing Test Percentage Scores with Normative Data." Journal of the American Academy of Audiology 25, no. 06 (June 2014): 541–48. http://dx.doi.org/10.3766/jaaa.25.6.4.

Abstract:
Background: Raw percentage scores can be transformed to age-specific Z scores, despite the asymmetric distribution of normative data, using a process that is applicable to any percentage (or proportion)-based result. Purpose: Normative values are generated for the commonly used dichotic digit and frequency pattern behavioral tests of auditory processing. Study Sample: A total of 180 normal-hearing children aged 7 yr 0 mo to 12 yr 2 mo took part in this study. Research Design: A transformation and regression method is used that allows for the asymmetric distribution of normative results and the development of the response across the 7-12 yr age range. Data Collection and Analysis: Percentage correct scores were determined for each ear in the dichotic digit and frequency pattern tests, delivered at 50 dB HL. The scores were arcsine transformed, then regressed against age using an exponential equation, providing an age-specific estimated mean score. The residual error of the regression was then used to estimate age-specific variance. Results and Conclusions: The ability to express results along an age continuum (while accounting for the asymmetric distribution and significant developmental influences) as a standard unit across all ages enables a simplified expression of performance ability on a task.
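The transformation pipeline can be sketched as follows; the regression coefficients and residual SD below are invented placeholders, not the paper's fitted norms:

```python
import math

def arcsine_transform(p):
    # Variance-stabilizing transform for a proportion-correct score.
    return math.asin(math.sqrt(p))

# Hypothetical age norm on the transformed scale: an exponential growth
# curve as in the paper, but with invented coefficients.
def norm_mean(age):
    return 1.45 - 0.9 * math.exp(-0.35 * (age - 7))

RESID_SD = 0.08  # assumed residual SD of the regression (constant with age)

def z_score(p_correct, age):
    # Age-specific Z score: observed minus predicted, over residual SD.
    return (arcsine_transform(p_correct) - norm_mean(age)) / RESID_SD
```

The arcsine transform compresses the ceiling end of the percentage scale, which is what lets a single residual SD serve across the skewed normative distribution.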
14

Scheer, Ľ., and R. Sitko. "Assessment of some forest characteristics employing IKONOS satellite data." Journal of Forest Science 53, No. 8 (January 7, 2008): 345–51. http://dx.doi.org/10.17221/2181-jfs.

Abstract:
In recent years, satellite remote sensing has become a new tool for the estimation of forest condition. The paper deals with spruce timber growing stock and vegetation cover assessment employing IKONOS satellite data from a mountain forest area of Central Slovakia. Original digital data as well as enhanced digital images were used to estimate some forest variables. Image enhancement approaches employing topographic normalization, PCA analysis, and different vegetation indices are a very important part of data processing. Apart from spectral characteristics, texture was utilized as an additional variable. In order to improve classification accuracy, knowledge of the vertical distribution of tree species was also incorporated into the classifiers. Spectral signatures measured with the aid of training sets were utilized as auxiliary variables for the construction of spectral models for growing stock estimation. Although the standard error of these models is not very favourable, at about 30%, they offer initial information for applying different sampling designs for timber growing stock assessment, where the final precision is acceptable. Stepwise discriminant analysis was employed to choose appropriate sets for the classification of vegetation cover. Classification results show the expected contribution of categorical knowledge to increasing the proportion of correctly classified pixels; this improvement averaged about 10%. Likewise, texture contributed to better separation of some spectrally very similar classes.
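As an example of the vegetation indices mentioned, NDVI computed from red and near-infrared reflectance; the band values here are illustrative, not from the IKONOS scene in the study:

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index from NIR and red reflectance.
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

dense = float(ndvi(0.50, 0.05))  # dense vegetation: high NIR, low red
bare = float(ndvi(0.30, 0.25))   # bare soil: NIR and red similar
```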
15

Almoayed, Khaled Abdullah, Ali Bin Break, Mutahar Al-Qassimi, Ali Assabri, and Yousef Khader. "The Acute Flaccid Paralysis (AFP) Surveillance System in Yemen, 2010-2015: Descriptive Study Based on Secondary Data Analysis." JMIR Public Health and Surveillance 5, no. 4 (December 6, 2019): e14413. http://dx.doi.org/10.2196/14413.

Abstract:
Background Acute flaccid paralysis (AFP) surveillance is an essential strategy for poliovirus eradication. Objective This study aimed to evaluate the performance of the AFP surveillance system in Yemen from 2010 to 2015, identify components that require strengthening, and compare the indicators by year and governorates. Methods This descriptive study was based on secondary analysis of AFP surveillance data reported during 2010-2015 from all Yemeni governorates. The World Health Organization (WHO) minimum performance standards were used to evaluate the performance of the AFP surveillance system. Results A total of 3019 AFP cases were reported between January 2010 and December 2015. At the national level, AFP surveillance achieved WHO targets throughout the evaluation period for the nonpolio AFP rate of cases per 100,000 members of the population younger than 15 years of age, proportion of AFP cases reported within 7 days, proportion of AFP cases investigated within 48 hours of notification, proportion of AFP cases with two adequate stool specimens, and proportion of stool specimens from which nonpolio enterovirus was isolated. However, the proportion of specimens that arrived at the central level within 3 days of the first sample collection and the proportion of stool specimens with results sent from the reference laboratory within 28 days of receipt did not reach targets in 2011 and 2015, respectively. Conclusions The AFP surveillance system in Yemen has met most of the WHO indicator levels. Nevertheless, the evaluation showed areas of weakness regarding the arrival of specimens at the central level within 3 days of the first sample collection and delays in processing of the results and submitting feedback by the laboratory. Therefore, there is a need to strengthen the follow-up of specimens submitted to the laboratory.
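A sketch of checking surveillance indicators against WHO minimum performance targets; the two targets shown are the commonly cited WHO minimums and should be verified against current standards before any real use.

```python
# Targets shown are the commonly cited WHO minimums for AFP surveillance;
# verify against current WHO standards before real use.
TARGETS = {
    "nonpolio_afp_rate_per_100k_under15": 2.0,  # >= 2 per 100,000 under-15s
    "pct_adequate_stool_specimens": 80.0,       # >= 80% with two adequate specimens
}

def meets_targets(indicators):
    # True/False per indicator, treating a missing value as failing.
    return {k: indicators.get(k, 0.0) >= t for k, t in TARGETS.items()}

status = meets_targets({"nonpolio_afp_rate_per_100k_under15": 2.4,
                        "pct_adequate_stool_specimens": 76.0})
```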
16

Zeng, Le-tian, Chun-hui Yang, Mao-sheng Huang, and Yue-long Zhao. "Verification of Imaging Algorithm for Signal Processing Software within Synthetic Aperture Radar (SAR) System." Scientific Programming 2019 (February 21, 2019): 1–12. http://dx.doi.org/10.1155/2019/7105281.

Abstract:
In signal processing software testing for synthetic aperture radar (SAR), verification of the algorithms is highly specialized and accounts for a large proportion of the work. However, existing methods can only partially validate the algorithms, which adversely affects the effectiveness of the software testing. This paper proposes a procedure-based approach to algorithm validation. First, it describes the processing procedure of the polar format algorithm (PFA) under motion-error conditions and, on that basis, analyzes the questions that may arise in practice. SAR echoes are generated flexibly and efficiently by data simulation. Algorithm simulation is then used to examine the approximations adopted in the algorithm. Combined with real data processing, concealed bugs are further uncovered, implementing a comprehensive validation of PFA. Simulated experiments and real data processing validate the correctness and effectiveness of the proposed approach.
17

Zhang, Hongjun, Youliang Zhang, and Rui Zhang. "Dimension-Specific Efficiency Measurement Using Data Envelopment Analysis." Mathematical Problems in Engineering 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/247248.

Abstract:
Data envelopment analysis (DEA) is a powerful tool for evaluating and improving the performance of a set of decision-making units (DMUs). Empirically, many DMUs exhibit "efficient" status in multi-input, multi-output situations. However, it is not appropriate to assert that all efficient DMUs perform equivalently: a DMU can be evaluated as efficient as long as it performs best in a single dimension. This paper argues that an efficient DMU of a particular input-output proportion has its own specialty and may also perform poorly in some dimensions. Two DEA-based approaches are proposed to measure the dimension-specific efficiency of DMUs. One measures efficiency in multiplier form by further processing the original multiplier DEA model; the other calculates efficiency in envelopment form by comparing with an ideal DMU. The proposed approaches are applied to 26 supermarkets in the city of Nanjing, China, providing managers with new insights into efficiency.
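The multiplier-form CCR efficiency score is a linear program; a minimal SciPy sketch on toy data (two DMUs, one input, one output), omitting the small positive lower bounds on weights (ε) that fuller implementations add:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR multiplier model for DMU o.
    X: (n, m) inputs, Y: (n, s) outputs. Returns efficiency in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [v (m input weights), u (s output weights)].
    c = np.concatenate([np.zeros(m), -Y[o]])           # maximize u . y_o
    A_ub = np.hstack([-X, Y])                          # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([X[o], np.zeros(s)])[None]   # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + s), method="highs")
    return -res.fun

# Toy data: DMU 0 produces the same output from half the input of DMU 1.
X = np.array([[2.0], [4.0]])
Y = np.array([[2.0], [2.0]])
eff = [ccr_efficiency(X, Y, o) for o in range(2)]
```

This is the standard textbook model, not the dimension-specific variants the paper proposes; those further process this formulation.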
18

Holko, Gabriela, Matthew C. Kelley, Scott James Perry, and Benjamin V. Tucker. "Perception of Unfamiliar English Phonemes by Native Mandarin Speakers." Alberta Academic Review 2, no. 2 (September 10, 2019): 37–38. http://dx.doi.org/10.29173/aar46.

Abstract:
In second language acquisition, speech sounds, or phonemes, not present in a learner's native language often pose an extra challenge for speech production. When hearing one of these unfamiliar phonemes, the learner either maps it to a similar native phoneme, perceives it as a completely foreign sound, or does not perceive it as speech at all. In the first case, the learner is unable to perceive a difference between the unfamiliar phoneme and the native phoneme to which it is mapped. This mapping difficulty potentially creates problems for the learner during word recognition. The present research investigated the extent to which English phonemes absent from the Mandarin phonological inventory impact the processing of native Mandarin speakers in an auditory lexical decision task. Results of this research will expand the understanding of second language perception, especially within the context of auditory lexical decision tasks. A list of ten phonemes—/ɪ/, /æ/, /ʊ/, /ɛ/, /v/, /z/, /ʒ/, /θ/, /ð/, /ʤ/—present in the English phonological inventory but absent from that of Mandarin was identified as unfamiliar to native Mandarin speakers. Data from the Massive Auditory Lexical Decision (MALD) database, in which participants decided whether recorded utterances were English words or made-up words, were utilized. The effects of the proportion of unfamiliar phonemes, the proportion of unfamiliar vowels, and the proportion of unfamiliar consonants on reaction time, representative of processing difficulty, were then calculated using statistical techniques. It was found that the proportion of all unfamiliar phonemes in an utterance had no significant effect on the reaction time of the native Mandarin speakers. However, when the list of unfamiliar phonemes was divided into vowels and consonants, a greater proportion of unfamiliar vowels was found to increase reaction time, while a greater proportion of unfamiliar consonants was found to decrease reaction time.
Further research in this area is required to determine a concrete explanation for these results. Interestingly, when the same analysis was performed on the data of native English speakers, similar results were observed. This may reflect a common language processing mechanism in second language learners and native speakers.
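The predictor computation described — per-utterance proportions of unfamiliar phonemes, split into vowels and consonants — can be sketched directly; the example word and its transcription are illustrative:

```python
# Phoneme set from the abstract; the dental fricative is written here with
# the IPA theta. The example word and transcription are illustrative.
UNFAMILIAR = {"ɪ", "æ", "ʊ", "ɛ", "v", "z", "ʒ", "θ", "ð", "ʤ"}
UNFAMILIAR_VOWELS = {"ɪ", "æ", "ʊ", "ɛ"}

def unfamiliar_proportions(phonemes):
    # Returns (all, vowel, consonant) proportions of unfamiliar phonemes.
    n = len(phonemes)
    hits = [p for p in phonemes if p in UNFAMILIAR]
    vowel_hits = sum(p in UNFAMILIAR_VOWELS for p in hits)
    return len(hits) / n, vowel_hits / n, (len(hits) - vowel_hits) / n

# "vivid" /v ɪ v ɪ d/: four of five phonemes are unfamiliar to Mandarin.
total, vowels, consonants = unfamiliar_proportions(["v", "ɪ", "v", "ɪ", "d"])
```

These three proportions are the per-item predictors that were then regressed against MALD reaction times.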
19

Asr, Fatemeh Torabi, Mohammad Mazraeh, Alexandre Lopes, Vasundhara Gautam, Junette Gonzales, Prashanth Rao, and Maite Taboada. "The Gender Gap Tracker: Using Natural Language Processing to measure gender bias in media." PLOS ONE 16, no. 1 (January 29, 2021): e0245533. http://dx.doi.org/10.1371/journal.pone.0245533.

Abstract:
We examine gender bias in media by tallying the number of men and women quoted in news text, using the Gender Gap Tracker, a software system we developed specifically for this purpose. The Gender Gap Tracker downloads and analyzes the online daily publication of seven English-language Canadian news outlets and enhances the data with multiple layers of linguistic information. We describe the Natural Language Processing technology behind this system, the curation of off-the-shelf tools and resources that we used to build it, and the parts that we developed. We evaluate the system in each language processing task and report errors using real-world examples. Finally, by applying the Tracker to the data, we provide valuable insights about the proportion of people mentioned and quoted, by gender, news organization, and author gender. Data collected between October 1, 2018 and September 30, 2020 shows that, in general, men are quoted about three times as frequently as women. While this proportion varies across news outlets and time intervals, the general pattern is consistent. We believe that, in a world with about 50% women, this should not be the case. Although journalists naturally need to quote newsmakers who are men, they also have a certain amount of control over who they approach as sources. The Gender Gap Tracker relies on the same principles as fitness or goal-setting trackers: By quantifying and measuring regular progress, we hope to motivate news organizations to provide a more diverse set of voices in their reporting.
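The final tallying step reduces to a simple proportion over gender-resolved quotes; the quote extraction and gender resolution that the Tracker performs with NLP are assumed already done here:

```python
from collections import Counter

def quoted_proportions(quoted_genders):
    # Proportion of quotes by gender, over quotes with a resolved gender.
    counts = Counter(quoted_genders)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

# Roughly the 3:1 men-to-women ratio reported in the abstract.
props = quoted_proportions(["m"] * 75 + ["f"] * 25)
```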
APA, Harvard, Vancouver, ISO, and other styles
20

Nguyen, M., E. J. Woo, S. Winiecki, J. Scott, D. Martin, T. Botsis, R. Ball, and B. Baer. "Can Natural Language Processing Improve the Efficiency of Vaccine Adverse Event Report Review?" Methods of Information in Medicine 55, no. 02 (2016): 144–50. http://dx.doi.org/10.3414/me14-01-0066.

Full text
Abstract:
Summary. Background: Individual case review of spontaneous adverse event (AE) reports remains a cornerstone of medical product safety surveillance for industry and regulators. Previously we developed the Vaccine Adverse Event Text Miner (VaeTM) to offer automated information extraction and potentially accelerate the evaluation of large volumes of unstructured data and facilitate signal detection. Objective: To assess how the information extraction performed by VaeTM impacts the accuracy of a medical expert’s review of the vaccine adverse event report. Methods: The “outcome of interest” (diagnosis, cause of death, second level diagnosis), “onset time,” and “alternative explanations” (drug, medical and family history) for the adverse event were extracted from 1000 reports from the Vaccine Adverse Event Reporting System (VAERS) using the VaeTM system. We compared the human interpretation, by medical experts, of the VaeTM extracted data with their interpretation of the traditional full text reports for these three variables. Two experienced clinicians alternately reviewed text miner output and full text. A third clinician scored the match rate using a predefined algorithm; the proportion of matches and 95% confidence intervals (CI) were calculated. Review time per report was analyzed. Results: Proportion of matches between the interpretation of the VaeTM extracted data, compared to the interpretation of the full text: 93% for outcome of interest (95% CI: 91–94%) and 78% for alternative explanation (95% CI: 75–81%). Extracted data on the time to onset was used in 14% of cases and was a match in 54% (95% CI: 46–63%) of those cases. When supported by structured time data from reports, the match for time to onset was 79% (95% CI: 76–81%). The extracted text averaged 136 (74%) fewer words, resulting in a mean reduction in review time of 50 (58%) seconds per report. Conclusion: Despite a 74% reduction in words, the clinical conclusion from VaeTM extracted data agreed with the full text in 93% and 78% of reports for the outcome of interest and alternative explanation, respectively. The limited amount of extracted time interval data indicates the need for further development of this feature. VaeTM may improve review efficiency, but further study is needed to determine if this level of agreement is sufficient for routine use.
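As an aside on the statistics quoted in this abstract: a match proportion with a 95% confidence interval can be reproduced in a few lines. The sketch below is not from the paper (which does not state its interval method); it uses the Wilson score interval, and the counts are purely illustrative.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return p, centre - half, centre + half

# Illustrative only: 930 matches out of 1000 reviewed reports
p, low, high = proportion_ci(930, 1000)
print(f"match rate {p:.0%}, 95% CI {low:.1%}-{high:.1%}")
```

The Wilson interval is preferred over the simple normal approximation for proportions close to 0 or 1, which is relevant here given match rates above 90%.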
APA, Harvard, Vancouver, ISO, and other styles
21

Bartoněk, Dalibor. "Solving Big GIS Projects on Desktop Computers." Kartografija i geoinformacije 18, no. 32 (December 15, 2019): 44–62. http://dx.doi.org/10.32909/kg.18.32.4.

Full text
Abstract:
We are witnessing great developments in digital information technologies. The situation encroaches on spatial data, which contain both attributive and localization features, the latter determining their position unambiguously within an obligatory coordinate system. These changes have resulted in the rapid growth of digital data, significantly supported by technical advances in the devices which produce them. As technology for making spatial data advances, methods and software for big data processing are falling behind. Paradoxically, only about 2% of the total volume of data is actually used. Big data processing often requires high-performance hardware and software. Only a few users possess the appropriate information infrastructure. The proportion of processed data would improve if big data could be processed by ordinary users. In geographical information systems (GIS), these problems arise when solving projects related to extensive territory or considerable secondary complexity, which require big data processing. This paper focuses on the creation and verification of methods that make it possible to process extensive GIS projects effectively using desktop hardware and software. It is a project regarding new quick methods for the functional reduction of the data volume, optimization of processing, edge detection in 3D and automated vectorization.
APA, Harvard, Vancouver, ISO, and other styles
22

Samad, Rosdiyana, Law Wen Yan, Mahfuzah Mustafa, Nor Rul Hasma Abdullah, and Dwi Pebrianti. "Multiple Human Body Postures Detection using Kinect." Indonesian Journal of Electrical Engineering and Computer Science 10, no. 2 (May 1, 2018): 528. http://dx.doi.org/10.11591/ijeecs.v10.i2.pp528-536.

Full text
Abstract:
This paper presents a method to detect multiple human body postures using the Kinect sensor. In this study, a combination of shape features and body joint points is used as input features. The Kinect sensor, which uses an infrared camera to produce a depth image, is suitable for use in environments with varying lighting conditions. Human detection is done by processing the depth image and joint (skeleton) data, which is able to overcome several problems such as cluttered backgrounds, various articulated poses, and changes in color and illumination. Then, the body joint coordinates found on the object are used to calculate the body proportion ratio. In the experiment, the average body proportions from three body parts are obtained to verify the suitability of using the golden ratio in this work. Finally, the measured body proportion is compared with the golden ratio to determine whether the found object is a real human body or not. This method was tested in various scenarios, with a high rate of true-positive human detection across postures. The method is able to detect a human body in low lighting and in a dark room. The average body proportions obtained from the experiment show that the value is close to the golden ratio value.
APA, Harvard, Vancouver, ISO, and other styles
23

Hilchey, Shannon P., Alexander F. Rosenberg, Ollivier Hyrien, Shelley Secor-Socha, Matthew R. Cochran, Michael T. Brady, Jyh-Chiang E. Wang, et al. "Follicular lymphoma tumor–infiltrating T-helper (TH) cells have the same polyfunctional potential as normal nodal TH cells despite skewed differentiation." Blood 118, no. 13 (September 29, 2011): 3591–602. http://dx.doi.org/10.1182/blood-2011-03-340646.

Full text
Abstract:
Abstract The follicular lymphoma (FL) T-cell microenvironment plays a critical role in the biology of this disease. We therefore determined the lineage, differentiation state, and functional potential of FL-infiltrating CD4+ T-helper cells (TH) compared with reactive and normal lymph node (NLN) TH cells. Relative to NLNs, FL cells have decreased proportions of naive and central memory but increased proportions of effector memory TH cells. We further show differences in the distribution and anatomical localization of CXCR5+ TH populations that, on the basis of transcription factor analysis, include both regulatory and follicular helper T cells. On Staphylococcus enterotoxin-B stimulation, which stimulates T cells through the T-cell receptor, requires no processing by APCs, and can overcome regulator T cell-mediated suppression, the proportion of uncommitted primed precursor cells, as well as TH2 and TH17 cells is higher in FL cells than in reactive lymph nodes or NLNs. However, the proportion of TH1 and polyfunctional TH cells (producing multiple cytokines simultaneously) is similar in FL cells and NLNs. These data suggest that, although TH-cell differentiation in FL is skewed compared with NLNs, FL TH cells should have the same intrinsic ability to elicit antitumor effector responses as NLN TH cells when tumor suppressive mechanisms are attenuated.
APA, Harvard, Vancouver, ISO, and other styles
24

Nadig, Aparna S., and Julie C. Sedivy. "Evidence of Perspective-Taking Constraints in Children's On-Line Reference Resolution." Psychological Science 13, no. 4 (July 2002): 329–36. http://dx.doi.org/10.1111/j.0956-7976.2002.00460.x.

Full text
Abstract:
Young children's communication has often been characterized as egocentric. Some researchers claim that the processing of language involves an initial stage that relies on egocentric heuristics, even in adults. Such an account, combined with general developmental difficulties with late-stage processes, could provide an explanation for much of children's egocentric communication. However, the experimental data reported in this article do not support such an account: In an elicited-production task, 5- to 6-year-old children were found to be sensitive to their partner's perspective. Moreover, in an on-line comprehension task, they showed sensitivity to common-ground information from the initial stages of language processing. We propose that mutual knowledge is not distinct from other knowledge relevant for language processing, and exerts early effects on processing in proportion to its salience and reliability.
APA, Harvard, Vancouver, ISO, and other styles
25

BUZMAKOV, Vasiliy Nikolaevich, and Yuliya Vasil’evna VOLODINA. "Estimation of influence of the mineral composition of ore bodies of titanomagnetites of the Gusevogorskoye deposit on the concentration of vanadium in the products of their processing." NEWS of the Ural State Mining University 59, no. 3 (September 15, 2020): 62–68. http://dx.doi.org/10.21440/2307-2091-2020-3-62-68.

Full text
Abstract:
Purpose of the work: to study the possibilities of increasing the extraction of vanadium pentoxide into the concentrate while reducing its titanium dioxide content at EVRAZ KGOK. However, vanadium pentoxide and titanium dioxide are closely interconnected due to the complex mineral composition of the ores, which, in turn, is due to the genesis of the deposit. As a consequence, a decrease in the titanium content causes a decrease in the vanadium content. Relevance. EVRAZ KGOK develops the Kachkanar group of titanomagnetite iron ore deposits containing vanadium impurities, which make it possible to smelt high-strength alloyed steels. Impurities of titanium negatively affect the subsequent blast-furnace conversion and increase the cost of the smelting process. Methodology. To determine the extraction of vanadium pentoxide into the concentrate, magnetic analysis on a Davis tube, sieve analysis of the fineness of grind and statistical processing of the data were used. Results. The deposit is being developed in four ore bodies with different mineral and petrographic composition and, accordingly, different enrichment results. During the concentration conversion, part of the vanadium is lost in the tailings of enrichment. To ensure the planned content of vanadium in the concentrate, the ratio “proportion of vanadium pentoxide / proportion of iron in the original ore” should be at least 0.0077 (in each batch). Conclusions. To ensure the proportion of vanadium in the products of concentrate processing, it is necessary to control the process of shipping ores using information on the proportion of vanadium that can be recovered. Control should be carried out according to the indicator “the ratio of the proportion of vanadium and the proportion of iron in the original ore.” Corrective actions should be taken by changing the proportion of ore shipped from various ore bodies.
APA, Harvard, Vancouver, ISO, and other styles
26

Sclama, Gregory, and Diego Rose. "Modeling the Potential of Household-Level Maize Processing to Reduce the Burden of Zinc Deficiency Among Women of Childbearing Age in Malawi." Current Developments in Nutrition 4, Supplement_2 (May 29, 2020): 905. http://dx.doi.org/10.1093/cdn/nzaa053_110.

Full text
Abstract:
Abstract Objectives Dietary phytate is a potent inhibitor of zinc absorption. Phytate levels of cereals can be reduced by basic household processing techniques such as soaking, germinating, and fermenting. The objective of this study was to model the potential of such techniques to reduce the burden of zinc deficiency in Malawi, where high-phytate maize is a dietary staple. Methods Using nationally representative household consumption data and food composition tables, we estimated daily phytate and zinc intakes for individuals in Malawi. We then applied a mathematical model of zinc absorption based on total dietary zinc and phytate to calculate the apparent absorbed zinc for each individual. Using the Cut-Point method described by the Institute of Medicine, we determined the proportion of each physiological group with absorbed zinc below their mean requirements. We then simulated the reduction in dietary phytate resulting from maize processing and estimated the new burdens of zinc deficiency. We estimated the impact at various coverage levels and compared the results against an alternative model using zinc-biofortified maize. Results Nationally, 34% of females age 14–18 and 23% of females over age 18 were at risk of zinc deficiency. Only 13% of women of childbearing age met the zinc requirement for pregnancy, while less than 4% met the requirement for breastfeeding. The burden of zinc deficiency was highest in the South where maize intake was highest. The simulation of phytate reduction from household processing found that with 40% coverage, the proportion of at-risk females age 14–18 fell below 23%, while the proportion over age 18 fell to 14%. The potential benefits were greatest in the South, where the proportion of women at risk was reduced by over a third. Biofortification also reduced zinc deficiency, however the modeled impact of processing was greater than biofortification for all regions and subgroups. 
Conclusions Household food processing techniques may be an important strategy to reduce the burden of zinc deficiency among vulnerable women in Malawi. These techniques are low-cost and not widely practiced at present. Behavior change interventions to promote them must consider culture, gender norms, and drivers of food preference. Food-based approaches such as these should be given greater attention in nutrition and health policy and programming. Funding Sources None.
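The Cut-Point method named in this abstract reduces to a simple proportion: the share of a group whose estimated absorbed zinc falls below the group's mean requirement. The sketch below illustrates that idea only; the requirement value and the absorbed-zinc estimates are invented, not taken from the study.

```python
def cutpoint_prevalence(absorbed_zinc, mean_requirement):
    """IOM Cut-Point method: the estimated prevalence of inadequacy is the
    proportion of individuals whose absorbed nutrient falls below the
    group's mean requirement."""
    below = sum(1 for z in absorbed_zinc if z < mean_requirement)
    return below / len(absorbed_zinc)

# Illustrative absorbed-zinc estimates (mg/day) for one physiological group
intakes = [1.9, 2.4, 1.7, 3.1, 2.0, 2.8, 1.5, 2.6]
at_risk = cutpoint_prevalence(intakes, mean_requirement=2.1)
print(f"{at_risk:.0%} at risk of zinc deficiency")
```

Simulating a phytate-reduction intervention then amounts to recomputing each individual's absorbed zinc under the lower phytate intake and calling the same function again.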
APA, Harvard, Vancouver, ISO, and other styles
27

Riley, Merilyn. "Population Prevalence Rates of Birth Defects: A Data Management and Epidemiological Perspective." Health Information Management 34, no. 3 (September 2005): 94–99. http://dx.doi.org/10.1177/183335830503400307.

Full text
Abstract:
The Victorian Birth Defects Register (VBDR) is a population-based surveillance system with a primary function of monitoring trends in birth defects. This paper outlines the processes undertaken in Victoria, Australia, to obtain population prevalence rates of birth defects and investigates the effect on the prevalence rates of variations in collection and processing tasks. It includes all birth defects that were notified to the VBDR by 31 December 2004. The overall prevalence rate of birth defects in Victoria for 2003 was 4.0%, with an overall accuracy rate of 88%. However, this proportion varied according to what birth defects were included, the age by which birth defects were diagnosed, changes to sources of ascertainment, inclusion of terminations of pregnancy, or reporting by cases rate (infants affected) or birth defect rate (individual birth defects). Taking all of these factors into consideration, we are confident that 4.0% is an accurate population prevalence rate of birth defects in Victoria for 2003.
APA, Harvard, Vancouver, ISO, and other styles
28

Farrell, Anne M., Joshua O. Goh, and Brian J. White. "The Effect of Performance-Based Incentive Contracts on System 1 and System 2 Processing in Affective Decision Contexts: fMRI and Behavioral Evidence." Accounting Review 89, no. 6 (July 1, 2014): 1979–2010. http://dx.doi.org/10.2308/accr-50852.

Full text
Abstract:
ABSTRACT Managers may rely on emotional reactions to a setting to the detriment of economic considerations (“System 1 processing”), resulting in decisions that are costly for firms. While economic theory prescribes performance-based incentives to align goals and induce effort, psychology theory suggests that the salience of emotions is difficult to overcome without also inducing more deliberate consideration of both emotional and economic factors (“System 2 processing”). We link these perspectives by investigating whether performance-based incentives mitigate the costly influence of emotion by inducing more System 2 processing. Using functional magnetic resonance imaging and traditional experiments, we investigate managers' brain activity and choices under fixed wage and performance-based contracts. Under both, brain regions associated with System 1 processing are more active when emotion is present. Relative to fixed wage contracts, performance-based contracts induce System 2 processing in emotional contexts beyond that observed absent emotion, and decrease the proportion of economically costly choices. Data Availability: Contact the authors.
APA, Harvard, Vancouver, ISO, and other styles
29

Muhammad Saqlain, Syed, Anwar Ghani, Imran Khan, Shahbaz Ahmed Khan Ghayyur, Shahaboddin Shamshirband , Narjes Nabipour, and Manouchehr Shokri. "Image Analysis Using Human Body Geometry and Size Proportion Science for Action Classification." Applied Sciences 10, no. 16 (August 7, 2020): 5453. http://dx.doi.org/10.3390/app10165453.

Full text
Abstract:
Gestures are one of the basic modes of human communication and are usually used to represent different actions. Automatic recognition of these actions forms the basis for solving more complex problems like human behavior analysis, video surveillance, event detection, and sign language recognition. Action recognition from images is a challenging task, as key information like temporal data, object trajectory, and optical flow is not available in still images. Meanwhile, measuring the size of different regions of the human body (e.g., step size, arm span, and the lengths of the arm, forearm, and hand) provides valuable clues for the identification of human actions. In this article, a framework for classification of human actions is presented in which humans are detected and localized through faster region-convolutional neural networks followed by morphological image processing techniques. Furthermore, geometric features from the human blob are extracted and incorporated into the classification rules for six human actions, i.e., standing, walking, single-hand side wave, single-hand top wave, both hands side wave, and both hands top wave. The performance of the proposed technique has been evaluated using precision, recall, omission error, and commission error. The proposed technique has been comparatively analyzed in terms of overall accuracy with existing approaches, showing that it performs well in contrast to its counterparts.
APA, Harvard, Vancouver, ISO, and other styles
30

Svensson, Olof, Maciej Gilski, Didier Nurizzo, and Matthew W. Bowler. "A comparative anatomy of protein crystals: lessons from the automatic processing of 56 000 samples." IUCrJ 6, no. 5 (July 10, 2019): 822–31. http://dx.doi.org/10.1107/s2052252519008017.

Full text
Abstract:
The fully automatic processing of crystals of macromolecules has presented a unique opportunity to gather information on the samples that is not usually recorded. This has proved invaluable in improving sample-location, characterization and data-collection algorithms. After operating for four years, MASSIF-1 has now processed over 56 000 samples, gathering information at each stage, from the volume of the crystal to the unit-cell dimensions, the space group, the quality of the data collected and the reasoning behind the decisions made in data collection. This provides an unprecedented opportunity to analyse these data together, providing a detailed landscape of macromolecular crystals, intimate details of their contents and, importantly, how the two are related. The data show that mosaic spread is unrelated to the size or shape of crystals and demonstrate experimentally that diffraction intensities scale in proportion to crystal volume and molecular weight. It is also shown that crystal volume scales inversely with molecular weight. The results set the scene for the development of X-ray crystallography in a changing environment for structural biology.
APA, Harvard, Vancouver, ISO, and other styles
31

Yudiandika, I. Putu, I. Wayan Suarna, and I. Made Sudarma. "PENGARUH JUMLAH BAKTERI METHANOBACTERIUM DAN LAMA FERMENTASI TERHADAP PROPORSI GAS METANA (CH4) PADA PENGOLAHAN SAMPAH ORGANIK DI TPA SUWUNG DENPASAR." ECOTROPHIC : Jurnal Ilmu Lingkungan (Journal of Environmental Science) 11, no. 1 (May 1, 2017): 29. http://dx.doi.org/10.24843/ejes.2017.v11.i01.p05.

Full text
Abstract:
EFFECT OF THE NUMBER OF METHANOBACTERIUM AND FERMENTATION DURATION ON THE PROPORTION OF METHANE (CH4) GAS IN ORGANIC WASTE PROCESSING AT TPA SUWUNG, DENPASAR. Research was conducted to determine the effect of the number of Methanobacterium bacteria and fermentation duration on the proportion of methane (CH4) in organic waste processing at TPA Suwung, Denpasar. The methane gas produced from this organic waste will be processed into fuel for electricity generation. The aim is to capture all of the methane gas contained in the waste so that no methane is lost to the atmosphere. Four treatments were used: no bacteria (B0), a bacterial population of 106 CFU/ml (B1), a population of 107 CFU/ml (B2), and a population of 108 CFU/ml (B3). Each treatment was replicated three (3) times. For all four treatments, the gas variable was measured after fermentation of 0 weeks, 3 weeks, 5 weeks, 7 weeks and 9 weeks using a Geotech GA 2000 gas analyzer. The resulting data were then analyzed using a completely randomized factorial design (RAL). ANOVA showed a significant effect of bacteria number and fermentation duration on the proportion (percentage) of methane gas produced. The longer the fermentation takes place, the bigger the proportion of methane gas produced. However, a greater bacterial population does not always produce a bigger proportion of methane gas. To find the combination giving the best effect, the researchers used a Duncan test. The Duncan analysis showed that the combination of nine weeks of fermentation with a bacterial population of 107 CFU/ml gave the best result, a methane gas percentage of 55.10%.
APA, Harvard, Vancouver, ISO, and other styles
32

LEGACY, JACQUELINE, PASCAL ZESIGER, MARGARET FRIEND, and DIANE POULIN-DUBOIS. "Vocabulary size and speed of word recognition in very young French–English bilinguals: A longitudinal study." Bilingualism: Language and Cognition 21, no. 1 (August 19, 2016): 137–49. http://dx.doi.org/10.1017/s1366728916000833.

Full text
Abstract:
A longitudinal study of lexical development in very young French–English bilinguals is reported. The Computerized Comprehension Test (CCT) was used to directly assess receptive vocabulary and processing efficiency, and parental report (CDI) was used to measure expressive vocabulary in monolingual and bilingual infants at 16 months, and six months later, at 22 months. All infants increased their comprehension and production of words over the six-month period, and bilingual infants acquired approximately as many new words in each of their languages as the monolinguals did. Speed of online word processing was also equivalent in both groups at each wave of data collection, and increased significantly across waves. Importantly, significant relations emerged between language exposure, vocabulary size, and processing speed, with proportion of language exposure predicting vocabulary size at each time point. This study extends previous findings by utilizing a direct measure of receptive vocabulary development and online word processing.
APA, Harvard, Vancouver, ISO, and other styles
33

Spencer, Erin T., Emilie Richards, Blaire Steinwand, Juliette Clemons, Jessica Dahringer, Priya Desai, Morgan Fisher, et al. "A high proportion of red snapper sold in North Carolina is mislabeled." PeerJ 8 (June 25, 2020): e9218. http://dx.doi.org/10.7717/peerj.9218.

Full text
Abstract:
Seafood mislabeling occurs when a market label is inaccurate, primarily in terms of species identity, but also regarding weight, geographic origin, or other characteristics. This widespread problem allows cheaper or illegally-caught species to be marketed as species desirable to consumers. Previous studies have identified red snapper (Lutjanus campechanus) as one of the most frequently mislabeled seafood species in the United States. To quantify how common mislabeling of red snapper is across North Carolina, the Seafood Forensics class at the University of North Carolina at Chapel Hill used DNA barcoding to analyze samples sold as “red snapper” from restaurants, seafood markets, and grocery stores purchased in ten counties. Of 43 samples successfully sequenced and identified, 90.7% were mislabeled. Only one grocery store chain (of four chains tested) accurately labeled red snapper. The mislabeling rate for restaurants and seafood markets was 100%. Vermilion snapper (Rhomboplites aurorubens) and tilapia (Oreochromis aureus and O. niloticus) were the species most frequently substituted for red snapper (13 of 39 mislabeled samples for both taxa, or 26 of 39 mislabeled total). This study builds on previous mislabeling research by collecting samples of a specific species in a confined geographic region, allowing local vendors and policy makers to better understand the scope of red snapper mislabeling in North Carolina. This methodology is also a model for other academic institutions to engage undergraduate researchers in mislabeling data collection, sample processing, and analysis.
APA, Harvard, Vancouver, ISO, and other styles
34

Zhang, Yi Ting, Bin Wang, and Zhi Hui Zhang. "Design and Implementation of a Data Parsing Module for Power Information Equipment Log Management System." Advanced Materials Research 765-767 (September 2013): 1092–97. http://dx.doi.org/10.4028/www.scientific.net/amr.765-767.1092.

Full text
Abstract:
In order to manage the log information of Windows servers, Linux servers, network devices and security devices in a unified way, so that log data can be queried, analysed and audited conveniently, a program is put forward in which the log data of a variety of power system information devices are converted into a unified relational model and integrated into a database. The data parsing module uses Windows Workflow procedures to select, clean and merge the massive log data. The database is created and operated on the Microsoft SQL Server 2005 development platform. All of the log files are converted into a unified format and saved in centralized storage. Experiments and test results show that the module has good data processing and integration efficiency, and it greatly increases the proportion of valid data. It provides support for efficient log auditing and fault diagnosis in the future.
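The paper's pipeline is built on Windows Workflow and SQL Server; purely as an illustration of the select/clean/merge idea it describes, here is a minimal language-neutral sketch. The log formats, field names and unified schema below are invented, not taken from the paper.

```python
import csv
import io

# An assumed unified relational schema, for illustration only
UNIFIED_FIELDS = ("source", "timestamp", "severity", "message")

def parse_syslog(line):
    """Parse an (assumed) 'timestamp host message' syslog-style line."""
    timestamp, _host, message = line.split(" ", 2)
    severity = "error" if "error" in message.lower() else "info"
    return {"source": "linux", "timestamp": timestamp,
            "severity": severity, "message": message}

def parse_windows_csv(line):
    """Parse an (assumed) 'timestamp,channel,level,message' CSV export line."""
    timestamp, _channel, level, message = next(csv.reader(io.StringIO(line)))
    return {"source": "windows", "timestamp": timestamp,
            "severity": level.lower(), "message": message}

def clean_and_merge(records):
    """Select, clean and merge: keep only complete records, order by time."""
    kept = [r for r in records if all(r.get(f) for f in UNIFIED_FIELDS)]
    return sorted(kept, key=lambda r: r["timestamp"])

logs = [
    parse_windows_csv("2020-01-01T10:00:05,Security,Warning,logon failure"),
    parse_syslog("2020-01-01T10:00:00 host1 sshd error: failed password"),
]
unified = clean_and_merge(logs)
```

In the actual system the unified records would then be bulk-inserted into the relational database rather than kept in memory.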
APA, Harvard, Vancouver, ISO, and other styles
35

Zhou, Feichi, Jiewei Chen, Xiaoming Tao, Xinran Wang, and Yang Chai. "2D Materials Based Optoelectronic Memory: Convergence of Electronic Memory and Optical Sensor." Research 2019 (August 21, 2019): 1–17. http://dx.doi.org/10.34133/2019/9490413.

Full text
Abstract:
The continuous development of electron devices towards the trend of “More than Moore” requires functional diversification that can collect data (sensors) and store (memories) and process (computing units) information. Considering the large proportion of image data in both data centers and edge devices, a device integrating optical sensing with data storage and processing is highly desirable for future energy-efficient and miniaturized electronic systems. Two-dimensional (2D) materials and their heterostructures have exhibited broadband photoresponse and high photoresponsivity in the configuration of optical sensors and showed fast switching speed, multi-bit data storage, and large ON/OFF ratio in memory devices. In addition, their ultrathin body thickness and low-temperature transfer process allow 2D materials to be heterogeneously integrated with other existing material systems. In this paper, we overview the state-of-the-art optoelectronic random-access memories (ORAMs) based on 2D materials, as well as ORAM synaptic devices and their applications in neural networks and image processing. The ORAM devices potentially enable direct storage/processing of sensory data from the external environment. We also provide perspectives on possible directions for other neuromorphic sensor designs (e.g., auditory and olfactory) based on 2D materials towards future smart electronic systems for artificial intelligence.
APA, Harvard, Vancouver, ISO, and other styles
36

Landsman, David, Ahmed Abdelbasit, Christine Wang, Michael Guerzhoy, Ujash Joshi, Shaun Mathew, Chloe Pou-Prom, et al. "Cohort profile: St. Michael’s Hospital Tuberculosis Database (SMH-TB), a retrospective cohort of electronic health record data and variables extracted using natural language processing." PLOS ONE 16, no. 3 (March 3, 2021): e0247872. http://dx.doi.org/10.1371/journal.pone.0247872.

Full text
Abstract:
Background Tuberculosis (TB) is a major cause of death worldwide. TB research draws heavily on clinical cohorts which can be generated using electronic health records (EHR), but granular information extracted from unstructured EHR data is limited. The St. Michael’s Hospital TB database (SMH-TB) was established to address gaps in EHR-derived TB clinical cohorts and provide researchers and clinicians with detailed, granular data related to TB management and treatment. Methods We collected and validated multiple layers of EHR data from the TB outpatient clinic at St. Michael’s Hospital, Toronto, Ontario, Canada to generate the SMH-TB database. SMH-TB contains structured data directly from the EHR, and variables generated using natural language processing (NLP) by extracting relevant information from free-text within clinic, radiology, and other notes. NLP performance was assessed using recall, precision and F1 score averaged across variable labels. We present characteristics of the cohort population using binomial proportions and 95% confidence intervals (CI), with and without adjusting for NLP misclassification errors. Results SMH-TB currently contains retrospective patient data spanning 2011 to 2018, for a total of 3298 patients (N = 3237 with at least 1 associated dictation). Performance of TB diagnosis and medication NLP rulesets surpasses 93% in recall, precision and F1 metrics, indicating good generalizability. We estimated 20% (95% CI: 18.4–21.2%) were diagnosed with active TB and 46% (95% CI: 43.8–47.2%) were diagnosed with latent TB. After adjusting for potential misclassification, the proportion of patients diagnosed with active and latent TB was 18% (95% CI: 16.8–19.7%) and 40% (95% CI: 37.8–41.6%) respectively. Conclusion SMH-TB is a unique database that includes a breadth of structured data derived from structured and unstructured EHR data by using NLP rulesets.
The data are available for a variety of research applications, such as clinical epidemiology, quality improvement and mathematical modeling studies.
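The NLP evaluation metrics reported above (recall, precision and F1 averaged across variable labels) follow standard definitions; the sketch below assumes macro-averaging, which the abstract does not specify, and uses invented per-label counts.

```python
def prf1(tp, fp, fn):
    """Precision, recall and F1 from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def macro_average(counts_by_label):
    """Average precision/recall/F1 across variable labels (macro-averaging)."""
    scores = [prf1(*c) for c in counts_by_label.values()]
    n = len(scores)
    return tuple(sum(s[i] for s in scores) / n for i in range(3))

# Illustrative counts only: (true positives, false positives, false negatives)
counts = {"tb_diagnosis": (95, 3, 4), "medication": (90, 6, 5)}
precision, recall, f1 = macro_average(counts)
```

Micro-averaging (pooling the counts across labels before computing the metrics) is the other common convention and would give slightly different numbers when labels have unequal support.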
APA, Harvard, Vancouver, ISO, and other styles
37

Hamilton, Thomas, Julie Walker, Warren C. Rusche, and Zachary K. Smith. "233 Effects of Harvest Maturity And/or Kernel Processing on Corn Silage Processing Score and Particle Size of Corn Silage." Journal of Animal Science 99, Supplement_1 (May 1, 2021): 11–12. http://dx.doi.org/10.1093/jas/skab054.019.

Full text
Abstract:
Abstract A single corn hybrid was used to evaluate harvest maturity (Mat) and/or kernel processing (KP) effects on corn silage processing score (CSPS) and particle size (PS). Treatments were arranged in a 2 × 2 factorial of 1) Mat (early and late) and 2) KP (no or yes). A single corn field was planted on April 27, 2020. There were 12 loads (experimental unit) per simple effect treatment mean. Data were analyzed as a completely randomized design. Early harvest occurred on August 28, 2020 [yield (as is) = 39.1 Mg/hectare; DM = 43.1%; CP, NDF, and starch = 6.5, 46.0, and 32.9%, respectively (DM basis)]. Late harvest occurred on September 9, 2020 [yield = 37.8 Mg/hectare (as is); DM = 49.2%; CP, NDF, and starch = 6.6, 49.8, and 37.5%, respectively (DM basis)]. The same equipment was used for both Mat with KP achieved by narrowing processing rollers. The CSPS was determined as the proportion of starch retained below a 4.75-mm sieve. Grain content (DM basis) of the corn silage was calculated from starch/0.72. Particle size was assessed using the Penn State Particle Separator. A Mat × KP interaction (P = 0.05) was detected for CSPS. Early/no and late/no had decreased (P ≤ 0.05) CSPS compared to early/yes, and late/yes had the greatest CSPS (P ≤ 0.05) compared to others. Grain content was 13.9% greater in late compared to early (P = 0.01). A Mat × KP interaction (P = 0.03) was detected for PS. Early/no had the greatest (P ≤ 0.05) PS, early/yes and late/no were intermediate, and late/yes had decreased PS compared to others (P ≤ 0.05). These data indicate that Mat and KP influence CSPS synergistically. Producers should consider KP when corn silage is harvested at a later maturity to enhance CSPS.
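Two of the derived quantities in this abstract are simple arithmetic: grain content is calculated as starch/0.72, and CSPS is the proportion of starch passing a 4.75-mm sieve. A quick sketch using the starch values reported in the abstract (the mass units for the CSPS helper are assumed):

```python
def grain_content_dm(starch_pct_dm):
    """Grain content (% of DM) from starch content, using starch / 0.72."""
    return starch_pct_dm / 0.72

def csps(starch_below_sieve, total_starch):
    """Corn silage processing score: proportion of starch retained below
    a 4.75-mm sieve (both arguments in the same mass units)."""
    return starch_below_sieve / total_starch

early = grain_content_dm(32.9)  # early harvest, 32.9% starch (DM basis)
late = grain_content_dm(37.5)   # late harvest, 37.5% starch (DM basis)
# Because the conversion is linear, the relative difference in grain
# content equals the relative difference in starch content.
relative_increase = (late - early) / early
```

The computed relative increase of about 13.9% matches the "Grain content was 13.9% greater in late compared to early" figure reported in the abstract.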
APA, Harvard, Vancouver, ISO, and other styles
38

Müdespacher, A. "Die Diffusion von Innovationen der Telematik in der Schweiz." Geographica Helvetica 40, no. 3 (September 30, 1985): 113–22. http://dx.doi.org/10.5194/gh-40-113-1985.

Full text
Abstract:
Abstract. This article analyses the reasons for the spatial variations in the diffusion process of innovations of the new information technology. A model of the diffusion process is developed in which hypotheses from well-known economic and geographical research have been integrated and combined with hypotheses derived from the peculiarities of the examined innovations. The model is tested with statistical methods such as discriminant analysis. Statistical evaluation of three different innovations (electronic data processing, data processing by means of telecommunications, and facsimile machines) shows empirical evidence for several hypotheses, though the estimated functions are only moderately correlated with the explained variables. The most important variables in the decision of firms to adopt the new information technology are the size of plants and a high proportion of routine information activities. Spatial factors have much less influence but may prove statistically significant.
APA, Harvard, Vancouver, ISO, and other styles
39

Milne, David, Louis L. Pen, David Thompson, and William Powrie. "Automated processing of railway track deflection signals obtained from velocity and acceleration measurements." Proceedings of the Institution of Mechanical Engineers, Part F: Journal of Rail and Rapid Transit 232, no. 8 (March 19, 2018): 2097–110. http://dx.doi.org/10.1177/0954409718762172.

Full text
Abstract:
Measurements of low-frequency vibration are increasingly being used to assess the condition and performance of railway tracks. Displacements used to characterise the track movement under train loads are commonly obtained from velocity or acceleration signals. Artefacts from signal processing, which lead to a shift in the datum associated with the at-rest position, as well as variability between successive wheels, mean that interpreting measurements is non-trivial. As a result, deflections are often interpreted by inspection rather than following an algorithmic or statistical process. This can limit the amount of data that can be usefully analysed in practice, militating against widespread or long-term use of track vibration measurements for condition or performance monitoring purposes. This paper shows how the cumulative distribution function of the track deflection can be used to identify the at-rest position and to interpret the typical range of track movement from displacement data. This process can be used to correct the shift in the at-rest position in velocity or acceleration data, to determine the proportion of upward and downward movement and to align data from multiple transducers to a common datum for visualising deflection as a function of distance along the track. The technique provides a means of characterising track displacement automatically, which can be used as a measure of system performance. This enables large volumes of track vibration data to be used for condition monitoring.
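The datum-correction idea described in this abstract can be sketched in a few lines: if the track spends most of the record at rest, the steep central region of the displacement signal's empirical CDF marks the at-rest position. The quantile-based approximation below is an illustrative sketch of that idea, not the paper's exact algorithm.

```python
def correct_datum(displacements, rest_quantile=0.5):
    """Re-zero a track displacement record using its empirical CDF.

    Assumes the at-rest position dominates the record, so the value at the
    CDF's steepest region (approximated here by the chosen quantile of the
    order statistics) is taken as the datum. Sketch only.
    """
    ordered = sorted(displacements)
    n = len(ordered)
    # Value at the chosen quantile of the empirical CDF serves as the datum.
    datum = ordered[int(rest_quantile * (n - 1))]
    corrected = [x - datum for x in displacements]
    # Proportions of upward and downward movement relative to the datum.
    upward = sum(1 for x in corrected if x > 0) / n
    downward = sum(1 for x in corrected if x < 0) / n
    return corrected, datum, upward, downward
```

Once each transducer's record is re-zeroed this way, data from multiple sensors share a common datum and can be plotted as deflection versus distance along the track.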
APA, Harvard, Vancouver, ISO, and other styles
40

Chaudhry, Theresa. "Microeconomic Flexibility in India and Pakistan: Employment Adjustment at the Firm Level." LAHORE JOURNAL OF ECONOMICS 14, Special Edition (September 1, 2009): 17–27. http://dx.doi.org/10.35536/lje.2009.v14.isp.a2.

Full text
Abstract:
In this paper, we look at the pace at which firms adjust their employment levels as a measure of “microeconomic flexibility.” Flexibility aids in creative destruction processes, where less efficient establishments recede and dynamic firms can rapidly expand. Following the techniques used by Caballero, Engel, and Micco (2004), we use firm-level data from India and Pakistan to estimate the proportion of the gap closed in a year between desired and actual employment. The results for the proportion of the gap closed for India were 0.46 in 2001 and 0.45 in 2000. For Pakistan, we estimated the proportion of the gap closed as 0.2 in 2001 and 0.53 in 2000. The results for 2001 were much lower than expected (and lower than previous estimates for both countries), possibly due to the events of 9/11. Pakistan compared favorably to India in various key sectors, including chemicals, food processing, and garments. Exporters did not seem to have a quicker speed of adjustment.
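The "proportion of the gap closed" has a simple partial-adjustment form, Δe = λ(e* − e), where e* is desired and e actual employment. The snippet below estimates λ by least squares through the origin; it is a sketch of the Caballero, Engel, and Micco idea, not their exact estimator.

```python
def gap_closed(actual_changes, employment_gaps):
    """Estimate lambda in the partial-adjustment model
    Delta_e = lambda * (e_desired - e_actual),
    via least squares through the origin over firm-level observations.
    Illustrative sketch of the microeconomic-flexibility measure.
    """
    num = sum(d * g for d, g in zip(actual_changes, employment_gaps))
    den = sum(g * g for g in employment_gaps)
    return num / den
```

A λ near 1 means firms close almost the whole employment gap within a year; the paper's estimates of roughly 0.2 to 0.5 indicate slower adjustment.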
APA, Harvard, Vancouver, ISO, and other styles
41

Wu, Chao, Zhen Wang, Simon Hu, Julien Lepine, Xiaoxiang Na, Daniel Ainalis, and Marc Stettler. "An Automated Machine-Learning Approach for Road Pothole Detection Using Smartphone Sensor Data." Sensors 20, no. 19 (September 28, 2020): 5564. http://dx.doi.org/10.3390/s20195564.

Full text
Abstract:
Road surface monitoring and maintenance are essential for driving comfort, transport safety and preserving infrastructure integrity. Traditional road condition monitoring is regularly conducted by specially designed instrumented vehicles, which requires time and money and is only able to cover a limited proportion of the road network. In light of the ubiquitous use of smartphones, this paper proposes an automatic pothole detection system utilizing the built-in vibration sensors and global positioning system receivers in smartphones. We collected road condition data in a city using dedicated vehicles and smartphones with a purpose-built mobile application designed for this study. A series of processing methods were applied to the collected data, and features from different frequency domains were extracted, along with various machine-learning classifiers. The results indicated that features from the time and frequency domains outperformed other features for identifying potholes. Among the classifiers tested, the Random Forest method exhibited the best classification performance for potholes, with a precision of 88.5% and recall of 75%. Finally, we validated the proposed method using datasets generated from different road types and examined its universality and robustness.
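The time-domain features that performed well in this study can be illustrated with a minimal feature extractor for a window of vertical acceleration samples. The particular feature set below (standard deviation, peak-to-peak, RMS) is a hypothetical example in the spirit of the paper; the actual pipeline and Random Forest classifier are not reproduced here.

```python
import math

def window_features(accel_z):
    """Time-domain features from one window of vertical acceleration.

    Hypothetical feature set for pothole detection; in the paper these
    windows would feed a trained classifier such as a Random Forest.
    """
    n = len(accel_z)
    mean = sum(accel_z) / n
    var = sum((a - mean) ** 2 for a in accel_z) / n
    return {
        "std": math.sqrt(var),                       # spread of vibration
        "peak_to_peak": max(accel_z) - min(accel_z), # shock amplitude
        "rms": math.sqrt(sum(a * a for a in accel_z) / n),
    }
```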
APA, Harvard, Vancouver, ISO, and other styles
42

Xuan, Weidong, Qiang Fu, Guanghua Qin, Cong Zhu, Suli Pan, and Yue-Ping Xu. "Hydrological Simulation and Runoff Component Analysis over a Cold Mountainous River Basin in Southwest China." Water 10, no. 11 (November 21, 2018): 1705. http://dx.doi.org/10.3390/w10111705.

Full text
Abstract:
Assessment of water resources from mountainous catchments is crucial for the development of upstream rural areas and downstream urban communities. However, lack of data in these mountainous catchments prevents full understanding of the response of hydrology or water resources to climate change. Meanwhile, hydrological modeling is challenging due to parameter uncertainty. In this work, one tributary of the Yarlung Zangbo River Basin (the upper stream of the Brahmaputra River) was used as a case study for hydrological modeling. Tropical Rainfall Measuring Mission (TRMM 3B42V7) data were utilized as a substitute for gauge-based rainfall data, and the capability of simulating precipitation, snow, and groundwater contributions to total runoff by the Soil and Water Assessment Tool (SWAT) was investigated. The uncertainty in runoff proportions from precipitation, snowmelt, and groundwater was quantified by a batch-processing module. Hydrological signatures were finally used to help identify if the hydrological model simulated total runoff and corresponding proportions properly. The results showed that: (1) TRMM data were very useful for hydrological simulation in high and cold mountainous catchments; (2) precipitation was the primary contributor nearly all year round, reaching 56.5% of the total runoff on average; (3) groundwater occupied the biggest proportion during dry seasons, whereas snowmelt made a substantial contribution only in late spring and summer; and (4) hydrological signatures were useful for helping to evaluate the performance of the hydrological model.
APA, Harvard, Vancouver, ISO, and other styles
43

Tyssandier, Viviane, Emmanuelle Reboul, Jean-François Dumas, Corinne Bouteloup-Demange, Martine Armand, Julie Marcand, Marcel Sallas, and Patrick Borel. "Processing of vegetable-borne carotenoids in the human stomach and duodenum." American Journal of Physiology-Gastrointestinal and Liver Physiology 284, no. 6 (June 1, 2003): G913—G923. http://dx.doi.org/10.1152/ajpgi.00410.2002.

Full text
Abstract:
Carotenoids are thought to diminish the incidence of certain degenerative diseases, but the mechanisms involved in their intestinal absorption are poorly understood. Our aim was to obtain basic data on the fate of carotenoids in the human stomach and duodenum. Ten healthy men were intragastrically fed three liquid test meals differing only in the vegetable added, 3 wk apart and in a random order. They contained 40 g sunflower oil and mashed vegetables as the sole source of carotenoids. Tomato purée provided 10 mg lycopene as the main carotenoid, chopped spinach (10 mg lutein), and carrot purée (10 mg β-carotene). Samples of stomach and duodenal contents and blood samples were collected at regular time intervals after meal intake. All-trans and cis carotenoids were assayed in stomach and duodenal contents, in the fat and aqueous phases of those contents, and in chylomicrons. The cis-trans β-carotene and lycopene ratios did not significantly vary in the stomach during digestion. Carotenoids were recovered in the fat phase present in the stomach during digestion. The proportion of all-trans carotenoids found in the micellar phase of the duodenum was as follows (means ± SE): lutein (5.6 ± 0.4%), β-carotene (4.7 ± 0.3%), lycopene (2.0 ± 0.2%). The proportion of 13-cis β-carotene in the micellar phase was significantly higher (14.8 ± 1.6%) than that of the all-trans isomer (4.7 ± 0.3%). There was no significant variation in chylomicron lycopene after the tomato meal, whereas there was a significant increase in chylomicron β-carotene and lutein after the carrot and the spinach meals, respectively. There is no significant cis-trans isomerization of β-carotene and lycopene in the human stomach. The stomach initiates the transfer of carotenoids from the vegetable matrix to the fat phase of the meal. Lycopene is less efficiently transferred to micelles than β-carotene and lutein. The very small transfer of carotenoids from their vegetable matrices to micelles explains the poor bioavailability of these phytomicroconstituents.
APA, Harvard, Vancouver, ISO, and other styles
44

Fabre-Thorpe, Michèle, Arnaud Delorme, Catherine Marlot, and Simon Thorpe. "A Limit to the Speed of Processing in Ultra-Rapid Visual Categorization of Novel Natural Scenes." Journal of Cognitive Neuroscience 13, no. 2 (February 1, 2001): 171–80. http://dx.doi.org/10.1162/089892901564234.

Full text
Abstract:
The processing required to decide whether a briefly flashed natural scene contains an animal can be achieved in 150 msec (Thorpe, Fize, & Marlot, 1996). Here we report that extensive training with a subset of photographs over a 3-week period failed to increase the speed of the processing underlying such rapid visual categorizations: completely novel scenes could be categorized just as fast as highly familiar ones. Such data imply that the visual system processes new stimuli at a speed and with a number of stages that cannot be compressed. This rapid processing mode was seen with a wide range of complex visual images, challenging the idea that short reaction times can only be seen with simple visual stimuli and implying that highly automatic feed-forward mechanisms underlie a far greater proportion of the sophisticated image analysis needed for everyday vision than is generally assumed.
APA, Harvard, Vancouver, ISO, and other styles
45

Seghier, Mohamed L. "Clustering of fMRI data: the elusive optimal number of clusters." PeerJ 6 (October 3, 2018): e5416. http://dx.doi.org/10.7717/peerj.5416.

Full text
Abstract:
Model-free methods are widely used for the processing of brain fMRI data collected under natural stimulations, sleep, or rest. Among them is the popular fuzzy c-means algorithm, commonly combined with cluster validity (CV) indices to identify the ‘true’ number of clusters (components), in an unsupervised way. CV indices may however reveal different optimal c-partitions for the same fMRI data, and their effectiveness can be hindered by the high data dimensionality, the limited signal-to-noise ratio, the small proportion of relevant voxels, and the presence of artefacts or outliers. Here, the author investigated the behavior of seven robust CV indices. A new CV index that incorporates both compactness and separation measures is also introduced. Using both artificial and real fMRI data, the findings highlight the importance of looking at the behavior of different compactness and separation measures, defined here as building blocks of CV indices, to depict a full description of the data structure, in particular when no agreement is found between CV indices. Overall, for fMRI, it makes sense to relax the assumption that only one unique c-partition exists, and appreciate that different c-partitions (with different optimal numbers of clusters) can be useful explanations of the data, given the hierarchical organization of many brain networks.
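The compactness and separation building blocks mentioned in this abstract can be combined into a very simple validity score: lower within-cluster spread and larger between-centroid distance indicate a better partition. The one-dimensional index below is illustrative only and is not the index proposed in the paper.

```python
def cv_index(clusters):
    """Toy cluster-validity score for 1-D clusters: the ratio of
    compactness (mean absolute distance to the cluster centroid) to
    separation (minimum distance between centroids). Smaller is better.
    Illustrative sketch, not the paper's proposed index.
    """
    centroids = [sum(c) / len(c) for c in clusters]
    n = sum(len(c) for c in clusters)
    compactness = sum(abs(x - m)
                      for c, m in zip(clusters, centroids)
                      for x in c) / n
    separation = min(abs(a - b)
                     for i, a in enumerate(centroids)
                     for b in centroids[i + 1:])
    return compactness / separation
```

Evaluating such a score for a range of candidate c-partitions and picking the minimum is the usual unsupervised way to choose the number of clusters.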
APA, Harvard, Vancouver, ISO, and other styles
46

FREUDENTHAL, DANIEL, JULIAN M. PINE, and FERNAND GOBET. "Understanding the developmental dynamics of subject omission: the role of processing limitations in learning." Journal of Child Language 34, no. 1 (January 25, 2007): 83–110. http://dx.doi.org/10.1017/s0305000906007719.

Full text
Abstract:
P. Bloom's (1990) data on subject omission are often taken as strong support for the view that child language can be explained in terms of full competence coupled with processing limitations in production. This paper examines whether processing limitations in learning may provide a more parsimonious explanation of the data without the need to assume full competence. We extended P. Bloom’s study by using a larger sample (12 children) and measuring subject omission phenomena in three developmental phases. The results revealed a Verb Phrase-length effect consistent with that reported by P. Bloom. However, contrary to the predictions of the processing limitations account, the proportion of overt subjects that were pronominal increased with developmental phase. The data were simulated with MOSAIC, a computational model that learns to produce progressively longer utterances as a function of training. MOSAIC was able to capture all of the effects reported by P. Bloom through a resource-limited distributional analysis of child-directed speech. Since MOSAIC does not have any built-in linguistic knowledge, these results show that the phenomena identified by P. Bloom do not constitute evidence for underlying competence on the part of the child. They also underline the need to develop more empirically grounded models of the way that processing limitations in learning might influence the language acquisition process.
APA, Harvard, Vancouver, ISO, and other styles
47

Melnikov, P. V., V. N. Dovedov, D. Yu Kanner, and I. L. Chernikovskiy. "Artificial Intelligence in surgical practice." Pelvic Surgery and Oncology 10, no. 3-4 (December 30, 2020): 60–64. http://dx.doi.org/10.17650/2686-9594-2020-10-3-4-60-64.

Full text
Abstract:
The aim of this literature review was to highlight the basic concepts of artificial intelligence in medicine, focusing on the application of this area of technological development to changes in surgery. PubMed and Google searches were performed using the key words “artificial intelligence” and “surgery”. Further references were obtained by cross-referencing the key articles. The integration of artificial intelligence into surgical practice will take place in the fields of education and the storage and processing of medical data, and the speed of implementation will be in direct proportion to the cost of labor and the need for “transparency” of statistical data.
APA, Harvard, Vancouver, ISO, and other styles
48

Singer, Joshua, Emma Thomson, Joseph Hughes, Elihu Aranday-Cortes, John McLauchlan, Ana da Silva Filipe, Lily Tong, et al. "Interpreting Viral Deep Sequencing Data with GLUE." Viruses 11, no. 4 (April 3, 2019): 323. http://dx.doi.org/10.3390/v11040323.

Full text
Abstract:
Using deep sequencing technologies such as Illumina’s platform, it is possible to obtain reads from the viral RNA population revealing the viral genome diversity within a single host. A range of software tools and pipelines can transform raw deep sequencing reads into Sequence Alignment Mapping (SAM) files. We propose that interpretation tools should process these SAM files, directly translating individual reads to amino acids in order to extract statistics of interest such as the proportion of different amino acid residues at specific sites. This preserves per-read linkage between nucleotide variants at different positions within a codon location. The samReporter is a subsystem of the GLUE software toolkit which follows this direct read translation approach in its processing of SAM files. We test samReporter on a deep sequencing dataset obtained from a cohort of 241 UK HCV patients for whom prior treatment with direct-acting antivirals has failed; deep sequencing and resistance testing have been suggested to be of clinical use in this context. We compared the polymorphism interpretation results of the samReporter against an approach that does not preserve per-read linkage. We found that the samReporter was able to properly interpret the sequence data at resistance-associated locations in nine patients where the alternative approach was equivocal. In three cases, the samReporter confirmed that resistance or an atypical substitution was present at NS5A position 30. In three further cases, it confirmed that the sofosbuvir-resistant NS5B substitution S282T was absent. This suggests the direct read translation approach implemented is of value for interpreting viral deep sequencing data.
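The direct read-translation approach described here, translating the codon on each individual read so that nucleotide variants within a codon stay linked, can be sketched as follows. This is a Python illustration of the idea, not GLUE's or samReporter's actual implementation, and the codon table is a minimal subset for the demo.

```python
from collections import Counter

# Minimal codon-table subset, for illustration only.
CODON_TABLE = {"TCT": "S", "AGT": "S", "ACT": "T", "GCT": "A"}

def residue_proportions(reads, codon_start):
    """Translate the codon at `codon_start` on each aligned read directly
    to an amino acid, preserving per-read linkage between the three
    nucleotide positions, then tally residue proportions across reads.
    """
    counts = Counter()
    for read in reads:
        codon = read[codon_start:codon_start + 3]
        aa = CODON_TABLE.get(codon)  # skip codons outside the demo table
        if aa is not None:
            counts[aa] += 1
    total = sum(counts.values())
    return {aa: c / total for aa, c in counts.items()}
```

Because each read is translated whole, two variant nucleotides on the same read yield one substituted residue rather than two apparent single-site variants, which is the linkage the abstract argues for.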
APA, Harvard, Vancouver, ISO, and other styles
49

Shirzadifard, Maysam, Ehsan Shahghasemi, Elaheh Hejazi, and Shima Aminipour. "Life Management Strategies as Mediators Between Information Processing Style and Subjective Well-Being." SAGE Open 10, no. 4 (October 2020): 215824402096280. http://dx.doi.org/10.1177/2158244020962806.

Full text
Abstract:
This study investigates the mediating role of life management strategies to see how information processing styles indirectly influence subjective well-being. Participants were 440 university students (female = 202, male = 238) ranging in age from 18 to 50 years from all levels and all majors from universities in Quchan, Iran. In a nonexperimental design and by using path analysis, we found that selection, optimization, and compensation fully mediated the relationship between information processing styles and subjective well-being. Our proposed model fitted well to the data and could account for a significant proportion of variance in satisfaction with life, positive affects, and negative affects’ scores (42%, 51%, and 35%, respectively). These results provide empirical evidence that rational information processing style is a defining factor for planning, and its impact on subjective indicators of well-being operates indirectly and through life management strategies. This model, with a more active approach, has implications for both theory and practice in psychotherapy.
APA, Harvard, Vancouver, ISO, and other styles
50

Lung, Pei-Yau, Dongrui Zhong, Xiaodong Pang, Yan Li, and Jinfeng Zhang. "Maximizing the reusability of gene expression data by predicting missing metadata." PLOS Computational Biology 16, no. 11 (November 6, 2020): e1007450. http://dx.doi.org/10.1371/journal.pcbi.1007450.

Full text
Abstract:
Reusability is part of the FAIR data principle, which aims to make data Findable, Accessible, Interoperable, and Reusable. One of the current efforts to increase the reusability of public genomics data has been to focus on the inclusion of quality metadata associated with the data. When necessary metadata are missing, most researchers will consider the data useless. In this study, we developed a framework to predict the missing metadata of gene expression datasets to maximize their reusability. We found that when using predicted data to conduct other analyses, it is not optimal to use all the predicted data. Instead, one should only use the subset of data, which can be predicted accurately. We proposed a new metric called Proportion of Cases Accurately Predicted (PCAP), which is optimized in our specifically-designed machine learning pipeline. The new approach performed better than pipelines using commonly used metrics such as F1-score in terms of maximizing the reusability of data with missing values. We also found that different variables might need to be predicted using different machine learning methods and/or different data processing protocols. Using differential gene expression analysis as an example, we showed that when missing variables are accurately predicted, the corresponding gene expression data can be reliably used in downstream analyses.
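The abstract's key metric can be sketched concretely. The definition below, the fraction of all cases whose prediction is both confident enough and correct, is one plausible reading of Proportion of Cases Accurately Predicted (PCAP); the paper's exact formulation may differ.

```python
def pcap(confidences, correct, threshold=0.9):
    """Proportion of Cases Accurately Predicted (PCAP), sketched as the
    fraction of all cases whose predicted metadata value is both confident
    (score >= threshold) and correct. Assumed definition for illustration.
    """
    n = len(confidences)
    hits = sum(1 for conf, ok in zip(confidences, correct)
               if conf >= threshold and ok)
    return hits / n
```

Optimizing a pipeline for PCAP rather than F1 favors models that know which subset of predictions to trust, which matches the paper's finding that only the accurately predictable subset should be reused downstream.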
APA, Harvard, Vancouver, ISO, and other styles