
Journal articles on the topic "Limit CPU usage"



Consult the top 27 journal articles for your research on the topic "Limit CPU usage".


You can also download the full text of each scholarly publication as a PDF and read its abstract online when this information is included in the metadata.

Browse journal articles from a wide range of disciplines and organise your bibliography correctly.

1

Raza, Syed M., Jaeyeop Jeong, Moonseong Kim, Byungseok Kang et Hyunseung Choo. « Empirical Performance and Energy Consumption Evaluation of Container Solutions on Resource Constrained IoT Gateways ». Sensors 21, no 4 (16 février 2021) : 1378. http://dx.doi.org/10.3390/s21041378.

Texte intégral
Résumé :
Containers virtually package a piece of software and share the host Operating System (OS) upon deployment. This makes them notably lightweight and suitable for dynamic service deployment at the network edge and on Internet of Things (IoT) devices for reduced latency and energy consumption. Data collection, computation, and now intelligence are included in a variety of IoT devices, which have very tight latency and energy-consumption constraints. Recent studies satisfy the latency constraint through containerized service deployment on IoT devices and gateways, but they fail to account for the limited energy and computing resources of these devices, which restrict scalability and concurrent service deployment. This paper aims to establish guidelines and identify critical factors for containerized service deployment on resource-constrained IoT devices. For this purpose, two container orchestration tools (Docker Swarm and Kubernetes) are tested and compared on a baseline IoT gateway testbed. Experiments use Deep Learning driven data analytics and Intrusion Detection System services, and evaluate the time it takes to prepare and deploy a container (creation time), Central Processing Unit (CPU) utilization for concurrent container deployment, memory usage under different traffic loads, and energy consumption. The results indicate that container creation time and memory usage are decisive factors for containerized microservice architectures.
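As a practical aside (not part of the paper), the two metrics the study highlights, container creation time and resource footprint, can be sampled with the Docker SDK for Python. The sketch below is a minimal illustration assuming a local Docker daemon; the image name, CPU cap, and memory cap are arbitrary example values, and the layout of the stats payload can vary between Docker versions.

```python
# Minimal sketch: measure container creation time and apply CPU/memory limits.
# Assumes a running Docker daemon and the docker-py package (pip install docker).
import time
import docker

client = docker.from_env()

start = time.perf_counter()
container = client.containers.run(
    "python:3.11-slim",          # example image (assumption)
    command="sleep 30",
    detach=True,
    nano_cpus=500_000_000,       # cap the container at 0.5 CPU cores
    mem_limit="128m",            # cap memory at 128 MiB
)
creation_time = time.perf_counter() - start
print(f"creation time: {creation_time:.3f} s")

# One-shot resource snapshot; key names may differ across Docker versions.
stats = container.stats(stream=False)
print("memory usage (bytes):", stats["memory_stats"].get("usage"))

container.remove(force=True)
```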
Styles APA, Harvard, Vancouver, ISO, etc.
2

Găitan, Vasile Gheorghiță, et Ionel Zagan. « An Overview of the nMPRA and nHSE Microarchitectures for Real-Time Applications ». Sensors 21, no 13 (30 juin 2021) : 4500. http://dx.doi.org/10.3390/s21134500.

Texte intégral
Résumé :
In the context of real-time control systems, it has become possible to obtain temporal resolutions of microseconds due to the development of embedded systems and the Internet of Things (IoT), the optimization of the use of processor hardware, and the improvement of architectures and real-time operating systems (RTOSs). All of these factors, together with current technological developments, have led to efficient central processing unit (CPU) time usage, guaranteeing both the predictability of thread execution and the satisfaction of the timing constraints required by real-time systems (RTSs). This is mainly due to time sharing in embedded RTSs and the pseudo-parallel execution of tasks in single-processor and multi-processor systems. The non-deterministic behavior triggered by asynchronous external interrupts and events in general is due to the fact that, for most commercial RTOSs, the execution of the same instruction ends in a variable number of cycles, primarily due to hazards. The software implementation of RTOS-specific mechanisms may lead to significant delays that can affect deadline requirements for some RTSs. The main objective of this paper was the design and deployment of innovative solutions to improve the performance of RTOSs by implementing their functions in hardware. The obtained architectures are intended to provide feasible scheduling, even if the total CPU utilization is close to the maximum limit. The contributions made by the authors will be followed by the validation of a high-performing microarchitecture, which is expected to allow a thread context switching time and event response time of only one clock cycle each. The main purpose of the research presented in this paper is to improve these factors of RTSs, as well as the implementation of the hardware structure used for the static and dynamic scheduling of tasks, for RTOS mechanisms specific to resource sharing and intertask communication.
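For context (a classical result, not taken from this paper): the phrase "total CPU utilization close to the maximum limit" is usually read against the Liu and Layland schedulability bounds, where a task set with worst-case computation times C_i and periods T_i is guaranteed schedulable under rate-monotonic (RM) priorities when the bound below holds, while earliest-deadline-first (EDF) allows utilization up to 1.

```latex
U = \sum_{i=1}^{n} \frac{C_i}{T_i},
\qquad
\text{RM: } U \le n\left(2^{1/n} - 1\right) \xrightarrow{\;n \to \infty\;} \ln 2 \approx 0.693,
\qquad
\text{EDF: } U \le 1.
```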
Styles APA, Harvard, Vancouver, ISO, etc.
3

Gao, Meng, Bryan A. Franz, Kirk Knobelspiesse, Peng-Wang Zhai, Vanderlei Martins, Sharon Burton, Brian Cairns et al. « Efficient multi-angle polarimetric inversion of aerosols and ocean color powered by a deep neural network forward model ». Atmospheric Measurement Techniques 14, no 6 (4 juin 2021) : 4083–110. http://dx.doi.org/10.5194/amt-14-4083-2021.

Texte intégral
Résumé :
Abstract. NASA's Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) mission, scheduled for launch in the timeframe of 2023, will carry a hyperspectral scanning radiometer named the Ocean Color Instrument (OCI) and two multi-angle polarimeters (MAPs): the UMBC Hyper-Angular Rainbow Polarimeter (HARP2) and the SRON Spectro-Polarimeter for Planetary EXploration one (SPEXone). The MAP measurements contain rich information on the microphysical properties of aerosols and hydrosols and therefore can be used to retrieve accurate aerosol properties for complex atmosphere and ocean systems. Most polarimetric aerosol retrieval algorithms utilize vector radiative transfer models iteratively in an optimization approach, which leads to high computational costs that limit their usage in the operational processing of large data volumes acquired by the MAP imagers. In this work, we propose a deep neural network (NN) forward model to represent the radiative transfer simulation of coupled atmosphere and ocean systems for applications to the HARP2 instrument and its predecessors. Through the evaluation of synthetic datasets for AirHARP (airborne version of HARP2), the NN model achieves a numerical accuracy smaller than the instrument uncertainties, with a running time of 0.01 s on a single CPU core or 1 ms on a GPU. Using the NN as a forward model, we built an efficient joint aerosol and ocean color retrieval algorithm called FastMAPOL, evolved from the well-validated Multi-Angular Polarimetric Ocean coLor (MAPOL) algorithm. Retrievals of aerosol properties and water-leaving signals were conducted on both the synthetic data and the AirHARP field measurements from the Aerosol Characterization from Polarimeter and Lidar (ACEPOL) campaign in 2017. From the validation with the synthetic data and the collocated High Spectral Resolution Lidar (HSRL) aerosol products, we demonstrated that the aerosol microphysical properties and water-leaving signals can be retrieved efficiently and within acceptable error. Compared to the retrieval speed using a conventional radiative transfer forward model, the computational acceleration is about 10³ times with CPU or 10⁴ times with GPU processors. The FastMAPOL algorithm can be used to operationally process the large volume of polarimetric data acquired by PACE and other future Earth-observing satellite missions with similar capabilities.
Styles APA, Harvard, Vancouver, ISO, etc.
4

Rastogi, Saumya, Bimal Charles et Asirvatham Edwin Sam. « Prevalence and Predictors of Self-Reported Consistent Condom Usage among Male Clients of Female Sex Workers in Tamil Nadu, India ». Journal of Sexually Transmitted Diseases 2014 (1 juin 2014) : 1–7. http://dx.doi.org/10.1155/2014/952035.

Texte intégral
Résumé :
Clients of female sex workers (FSWs) have a high potential for transmitting HIV and other sexually transmitted infections from high-risk FSWs to the general population. Promotion of safer sex practices among clients is essential to limit the spread of the HIV/AIDS epidemic. The aim of this study is to estimate the prevalence of consistent condom use (CCU) among clients of FSWs and to assess the factors associated with CCU in Tamil Nadu. 146 male respondents who reportedly had sex with FSWs in exchange for cash at least once in the past month were recruited from hotspots. Data were analyzed using bivariate and multivariate methods. Overall, 48.6 and 0.8 percent of clients consistently used condoms in the past 12 months with FSWs and regular partners, respectively. Logistic regression showed that factors such as education, peers' use of condoms, and alcohol consumption significantly influenced clients' CCU with FSWs. Strategies promoting safe sexual behaviour among clients of FSWs are needed in order to limit the spread of the HIV/AIDS epidemic in the general population. The role of peer educators in experience sharing and awareness generation must also be emphasized.
Styles APA, Harvard, Vancouver, ISO, etc.
5

Wu, Wenjing, et David Cameron. « Backfilling the Grid with Containerized BOINC in the ATLAS computing ». EPJ Web of Conferences 214 (2019) : 07015. http://dx.doi.org/10.1051/epjconf/201921407015.

Texte intégral
Résumé :
Virtualization is a commonly used solution for utilizing opportunistic computing resources in the HEP field, as it provides the unified software and OS layer that HEP computing tasks require over heterogeneous opportunistic computing resources. However, there is always a performance penalty with virtualization: especially for short jobs, which are typical of volunteer computing tasks, the overhead of virtualization reduces the CPU efficiency of the jobs. With the wide usage of containers in HEP computing, we explore the possibility of adopting container technology in the ATLAS BOINC project, and we implemented a Native version in BOINC which uses the Singularity container, or the Operating System of the host machines directly, to replace VirtualBox. In this paper, we discuss 1) the implementation and workflow of the Native version in ATLAS BOINC; 2) the performance measurement of the Native version compared to the previous virtualization version; 3) the limits and shortcomings of the Native version; and 4) the practice and outcome of the Native version, which includes using it to backfill the ATLAS Grid Tier2 sites and other clusters, and to utilize idle computers from the CERN computing centre.
Styles APA, Harvard, Vancouver, ISO, etc.
6

Jacquet, Éric. « Construction de la limite interne et « bon usage du double interdit du toucher » dans des groupes thérapeutiques de jeunes enfants ». Cahiers de psychologie clinique 38, no 1 (2012) : 179. http://dx.doi.org/10.3917/cpc.038.0179.

Texte intégral
Styles APA, Harvard, Vancouver, ISO, etc.
7

Marta, Deni, M. Angga Eka Putra et Guntoro Barovih. « Analisis Perbandingan Performa Virtualisasi Server Sebagai Basis Layanan Infrastructure As A Service Pada Jaringan Cloud ». MATRIK : Jurnal Manajemen, Teknik Informatika dan Rekayasa Komputer 19, no 1 (15 septembre 2019) : 1–8. http://dx.doi.org/10.30812/matrik.v19i1.433.

Texte intégral
Résumé :
Cloud Computing provides convenience and comfort for every service. Infrastructure as a Service is one of the cloud computing services chosen by many users, so it is very important to know the performance of each available platform in order to get the maximum result according to our needs. In this study, three cloud virtualization platforms, VMware ESXi, XenServer, and Proxmox, were tested using action research methods. The performance measurements were then analyzed and compared against minimum and maximum limits. The tested indicators are response time, throughput, and resource utilization, as a comparison of server virtualization performance. In resource-utilization testing while installing an operating system, the Proxmox platform showed the lowest CPU usage at 10.72%, and the lowest RAM usage, 53.32%, was also on the Proxmox platform. In resource-utilization testing in the idle state, the lowest CPU usage was 5.78% on the Proxmox platform, while the lowest RAM usage was 57.25% on the VMware ESXi platform. On average, the resource-utilization tests indicate that the Proxmox platform is better. In the upload throughput test the XenServer platform performed better at 1.37 MB/s, while in the download throughput test the VMware ESXi platform performed better at 1.39 MB/s. Response time testing showed VMware ESXi to be the fastest at 0.180 s.
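As an illustration (not taken from the paper), the kind of host-level CPU and RAM utilization figures reported above can be sampled with the cross-platform psutil library; the one-second sampling interval and the number of samples below are arbitrary choices.

```python
# Minimal sketch: sample host CPU and RAM utilization, as one might do
# when comparing hypervisor platforms. Requires `pip install psutil`.
import psutil

samples = []
for _ in range(10):
    cpu_percent = psutil.cpu_percent(interval=1.0)   # blocks ~1 s per sample
    ram_percent = psutil.virtual_memory().percent
    samples.append((cpu_percent, ram_percent))

avg_cpu = sum(c for c, _ in samples) / len(samples)
avg_ram = sum(r for _, r in samples) / len(samples)
print(f"average CPU: {avg_cpu:.2f}%  average RAM: {avg_ram:.2f}%")
```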
Styles APA, Harvard, Vancouver, ISO, etc.
8

V, Dr Kiran, Akshay Narayan Pai et Gautham S. « Performance Analysis of Virtual Machine in Cloud Architecture ». Journal of University of Shanghai for Science and Technology 23, no 07 (19 juillet 2021) : 924–29. http://dx.doi.org/10.51201/jusst/21/07210.

Texte intégral
Résumé :
Cloud computing is a technique for storing and processing data that makes use of a network of remote servers. Cloud computing is gaining popularity due to its vast storage capacity, ease of access, and diverse variety of services. As cloud computing advanced and technologies such as virtual machines appeared, virtualization entered the scene. When customers' computing demands for storage and servers increased, however, virtual machines were unable to meet those expectations due to scalability and resource allocation limits. As a consequence, containerization became a reality. Containerization is the process of packaging software code along with all of its essential components, including frameworks, libraries, and other dependencies, such that they are isolated in their own container. Programs running in containers can execute reliably in any environment or infrastructure. Containers provide OS-level virtualization, which reduces the computational load on the host machine and enables programs to run much faster and more reliably. Performance analysis is very important in comparing the throughput of VM-based and container-based designs. To analyze this, the same web application was run in both designs, and the CPU usage and RAM usage of the two designs were compared. The results obtained are tabulated and a conclusion is given.
Styles APA, Harvard, Vancouver, ISO, etc.
9

Negru, Adrian Eduard, Latchezar Betev, Mihai Carabaș, Costin Grigoraș, Nicolae Țăpuş et Sergiu Weisz. « Analysis of data integrity and storage quality of a distributed storage system ». EPJ Web of Conferences 251 (2021) : 02035. http://dx.doi.org/10.1051/epjconf/202125102035.

Texte intégral
Résumé :
CERN uses the world’s largest scientific computing grid, WLCG, for distributed data storage and processing. Monitoring of the CPU and storage resources is an important and essential element to detect operational issues in its systems, for example in the storage elements, and to ensure their proper and efficient function. The processing of experiment data depends strongly on the data access quality, as well as its integrity and both of these key parameters must be assured for the data lifetime. Given the substantial amount of data, O(200 PB), already collected by ALICE and kept at various storage elements around the globe, scanning every single data chunk would be a very expensive process, both in terms of computing resources usage and in terms of execution time. In this paper, we describe a distributed file crawler that addresses these natural limits by periodically extracting and analyzing statistically significant samples of files from storage elements, evaluates the results and is integrated with the existing monitoring solution, MonALISA.
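To make the sampling idea concrete (an illustrative sketch only, not the ALICE crawler itself): instead of scanning every file, one can draw a random, statistically significant sample from a catalogue and verify checksums only for that sample. The file paths, catalogue format, and MD5 choice here are hypothetical.

```python
# Minimal sketch of sampling-based integrity checking: verify checksums
# for a random sample of files rather than the full catalogue.
import hashlib
import random
from pathlib import Path

def md5_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file so large files do not have to fit in memory."""
    digest = hashlib.md5()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_sample(catalogue: dict[str, str], sample_size: int) -> list[str]:
    """catalogue maps file path -> expected checksum; returns mismatching paths."""
    sample = random.sample(sorted(catalogue), k=min(sample_size, len(catalogue)))
    return [p for p in sample if md5_of(Path(p)) != catalogue[p]]

# Hypothetical usage:
# bad = check_sample({"/data/f1.root": "d41d8cd98f00b204e9800998ecf8427e"}, 100)
# print(f"{len(bad)} corrupted files in sample")
```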
Styles APA, Harvard, Vancouver, ISO, etc.
10

Saylan, Erdem, Cihangir et Denizli. « Detecting Fingerprints of Waterborne Bacteria on a Sensor ». Chemosensors 7, no 3 (25 juillet 2019) : 33. http://dx.doi.org/10.3390/chemosensors7030033.

Texte intégral
Résumé :
Human fecal contamination is a crucial threat that results in difficulties in access to clean water. Enterococcus faecalis is a bacterium which is utilized as an indicator in polluted water. Nevertheless, existing strategies face several challenges, including low affinity and the need for labelling, which limit their suitability for large-scale applications. Herein, a label-free fingerprint of the surface proteins of waterborne bacteria on a sensor was demonstrated for real-time bacteria detection from aqueous and water samples. The kinetic performance of the sensor was evaluated and shown to have a range of detection that spanned five orders of magnitude, with a low detection limit (3.4 × 10⁴ cfu/mL) and a high correlation coefficient (R² = 0.9957). The sensor also demonstrated high selectivity when other competing bacteria were employed. Its capability for multiple uses and its long shelf life are superior to other modalities. This is an impressive surface modification method that uses the target itself as a recognition element, ensuring a broad range of variability to replicate others with different structures, sizes, and physical and chemical properties.
Styles APA, Harvard, Vancouver, ISO, etc.
11

Breitenbach, S. F. M., K. Rehfeld, B. Goswami, J. U. L. Baldini, H. E. Ridley, D. Kennett, K. Prufer et al. « COnstructing Proxy-Record Age models (COPRA) ». Climate of the Past Discussions 8, no 3 (19 juin 2012) : 2369–408. http://dx.doi.org/10.5194/cpd-8-2369-2012.

Texte intégral
Résumé :
Abstract. Reliable age models are fundamental for any palaeoclimate reconstruction. Interpolation procedures between age control points are often inadequately reported, and available modeling algorithms do not allow incorporation of layer counted intervals to improve the confidence limits of the age model in question. We present a modeling approach that allows automatic detection and interactive handling of outliers and hiatuses. We use Monte Carlo simulation to assign an absolute time scale to climate proxies by conferring the dating uncertainties to uncertainties in the proxy values. The algorithm allows us to integrate incremental relative dating information to improve the final age model. The software package COPRA1.0 facilitates easy interactive usage.
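To illustrate the Monte Carlo idea in one short sketch (a toy example, not COPRA itself): each dated control point is perturbed within its dating uncertainty, an age-depth curve is interpolated for every realization, and the spread of interpolated ages at a given depth quantifies the age uncertainty transferred to the proxy. The depths, ages, and errors below are invented example numbers.

```python
# Toy Monte Carlo age-depth model: propagate dating uncertainties at control
# points to age uncertainties at arbitrary depths (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

depths_ctrl = np.array([0.0, 10.0, 25.0, 40.0])       # cm (example values)
ages_ctrl   = np.array([50.0, 820.0, 2100.0, 3900.0]) # years BP (example)
age_sigma   = np.array([20.0, 40.0, 60.0, 80.0])      # 1-sigma dating errors

query_depths = np.linspace(0.0, 40.0, 81)
n_realizations = 5000

realizations = np.empty((n_realizations, query_depths.size))
for i in range(n_realizations):
    perturbed = rng.normal(ages_ctrl, age_sigma)
    perturbed.sort()                      # crude way to keep ages monotonic with depth
    realizations[i] = np.interp(query_depths, depths_ctrl, perturbed)

median_age = np.median(realizations, axis=0)
lo, hi = np.percentile(realizations, [2.5, 97.5], axis=0)
print(f"age at 20 cm: {median_age[40]:.0f} "
      f"(+{hi[40] - median_age[40]:.0f}/-{median_age[40] - lo[40]:.0f}) yr BP")
```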
Styles APA, Harvard, Vancouver, ISO, etc.
12

Francis, Laurent A., Roselien Vercauteren, Audrey Leprince et Jacques Mahillon. « Stable Porous Silicon Membranes for Fast Bacterial Detection ». Engineering Proceedings 4, no 1 (16 avril 2021) : 45. http://dx.doi.org/10.3390/micromachines2021-09598.

Texte intégral
Résumé :
The rapid detection of hazardous bacteria is important for healthcare situations, where such identification can lead to substantial gains for patient treatment and recovery and a reduced usage of broad-spectrum antibiotics. Potential biosensors must be able to provide a fast, sensitive and selective response with as little sample preparation as possible. Indeed, some of these pathogens, such as Staphylococcus aureus, can be harmful even at very low concentrations in the blood stream, e.g., below 10 colony forming units per mL (CFU/mL). These stringent requirements limit the number of candidates, especially for point-of-care applications. Amongst several biosensing techniques, optical sensing using a porous silicon (PSi) substrate has been widely suggested in recent years thanks to unique features such as a large surface area, tunable optical characteristics, and above all relatively easy and affordable fabrication techniques. In most configurations, PSi optical biosensors are close-ended porous layers; this limits their sensitivity and responsiveness due to diffusion-limited infiltration of the analytes in the porous layer. Also, PSi is a reactive material; its oxidation in buffer solutions results in time-varying shifts. Despite its attractive properties, several challenges must still be overcome in order to reach practical applications. Our work addresses three main improvement points. The first is stability over time in saline solutions, achieved with atomic layer deposition of metal oxides inside the pores. Besides better stability, this solution increases the optical signal-to-noise ratio, thus reducing the limit of detection. The second is to lyse the bacteria prior to exposure to the sensor, such that selective detection is based upon the percolation of bacterial residues inside the pores rather than the bacteria themselves. The third is to remove the bulk silicon below a PSi layer to create a membrane that allows flow-through of the analytes, thus enhancing the interactions between the lysate and the sensor's surface. This approach allows us to avoid the surface functionalization step used in classical biosensors. Thanks to these improvements, we tested the selective detection of Bacillus cereus lysate at concentrations between 10³ and 10⁵ CFU/mL. Future work is dedicated to further improvements, including optical signal enhancement techniques and dielectrophoresis-assisted percolation in the porous silicon membrane.
Styles APA, Harvard, Vancouver, ISO, etc.
13

He, Yuan, Shunyi Zheng, Fengbo Zhu et Xia Huang. « Real-Time 3D Reconstruction of Thin Surface Based on Laser Line Scanner ». Sensors 20, no 2 (18 janvier 2020) : 534. http://dx.doi.org/10.3390/s20020534.

Texte intégral
Résumé :
The truncated signed distance field (TSDF) has been applied as a fast, accurate, and flexible geometric fusion method in 3D reconstruction of industrial products based on a hand-held laser line scanner. However, this method has some problems for the surface reconstruction of thin products. The surface mesh will collapse to the interior of the model, resulting in some topological errors, such as overlaps, intersections, or gaps. Meanwhile, the existing TSDF method ensures real-time performance through significant graphics processing unit (GPU) memory usage, which limits the scale of the reconstruction scene. In this work, we propose three improvements to the existing TSDF methods, including: (i) a thin-surface attribution judgment method in real-time processing that solves the problem of interference between the opposite sides of the thin surface; we distinguish measurements originating from different parts of a thin surface by the angle between the surface normal and the observation line of sight; (ii) a post-processing method to automatically detect and repair the topological errors in some areas where misjudgment of thin-surface attribution may occur; (iii) a framework that integrates the central processing unit (CPU) and GPU resources to implement our 3D reconstruction approach, which ensures real-time performance and reduces GPU memory usage. The results show that this method provides more accurate 3D reconstruction of a thin surface, comparable to state-of-the-art laser line scanners with 0.02 mm accuracy. In terms of performance, the algorithm can guarantee a frame rate of more than 60 frames per second (FPS) with the GPU memory footprint under 500 MB. Overall, the proposed method achieves real-time, high-precision 3D reconstruction of a thin surface.
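A minimal sketch of the attribution criterion described above (illustrative only, with an assumed 90° threshold rather than the authors' exact rule): a measurement is assigned to the front or back side of a thin surface depending on the angle between the surface normal and the line of sight from the sensor.

```python
# Toy thin-surface side attribution: compare the surface normal with the viewing
# direction; an angle below 90 degrees means the normal faces the sensor (front
# side), otherwise the measurement belongs to the opposite side. The threshold
# is an assumption for illustration.
import numpy as np

def facing_sensor(normal: np.ndarray, point: np.ndarray, sensor: np.ndarray) -> bool:
    view_dir = sensor - point                      # line of sight, point -> sensor
    view_dir = view_dir / np.linalg.norm(view_dir)
    n = normal / np.linalg.norm(normal)
    cos_angle = float(np.dot(n, view_dir))
    return cos_angle > 0.0                         # angle < 90 deg -> front side

# Example: a surface point at the origin whose normal points along +z,
# observed by a sensor above it, is attributed to the front side.
print(facing_sensor(np.array([0.0, 0.0, 1.0]),
                    np.array([0.0, 0.0, 0.0]),
                    np.array([0.0, 0.5, 2.0])))    # True
```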
Styles APA, Harvard, Vancouver, ISO, etc.
14

Cardoen, Ben, Stijn Manhaeve, Yentl Van Tendeloo et Jan Broeckhove. « A PDEVS simulator supporting multiple synchronization protocols : implementation and performance analysis ». SIMULATION 94, no 4 (1 février 2017) : 281–300. http://dx.doi.org/10.1177/0037549717690826.

Texte intégral
Résumé :
With the ever-increasing complexity of simulation models, parallel simulation becomes necessary to perform simulation within reasonable time bounds. The built-in parallelism of Parallel DEVS is often insufficient to tackle this problem on its own. Several synchronization protocols have been proposed, each with its distinct advantages and disadvantages. Due to the significantly different implementation of these protocols, most Parallel DEVS simulation tools are limited to only one such protocol. In this paper, we present a Parallel DEVS simulator, grafted on C++11 and based on PythonPDEVS, supporting both conservative and optimistic synchronization protocols. The simulator not only supports both protocols but also has the capability to switch between them at runtime. The simulator can combine each synchronization protocol with either a threaded or a sequential implementation of the PDEVS protocol. We evaluate the performance gain obtained by choosing the most appropriate synchronization protocol. A comparison is made to adevs in terms of CPU time and memory usage, to show that our modularity does not hinder performance. We compare the speedup obtained by synchronization with that of the inherent parallelism of PDEVS, in isolation and in combination, and contrast the results with the theoretical limits. We further allow for an external component to gather simulation statistics, on which runtime switching between the different synchronization protocols can be based. The effects of allocation on our synchronization protocols are also studied.
Styles APA, Harvard, Vancouver, ISO, etc.
15

Elmsheuser, Johannes, Christos Anastopoulos, Jamie Boyd, James Catmore, Heather Gray, Attila Krasznahorkay, Josh McFayden et al. « Evolution of the ATLAS analysis model for Run-3 and prospects for HL-LHC ». EPJ Web of Conferences 245 (2020) : 06014. http://dx.doi.org/10.1051/epjconf/202024506014.

Texte intégral
Résumé :
With an increased dataset obtained during Run-2 of the LHC at CERN, the even larger forthcoming Run-3 data and the expected increase of the dataset by more than one order of magnitude for the HL-LHC, the ATLAS experiment is reaching the limits of the current data production model in terms of disk storage resources. The anticipated availability of an improved fast simulation will enable ATLAS to produce significantly larger Monte Carlo samples with the available CPU, which will then be limited by insufficient disk resources. The ATLAS Analysis Model Study Group for Run-3 was set up at the end of Run-2. Its tasks have been to analyse the efficiency and suitability of the current analysis model and to propose significant improvements. The group has considered options allowing ATLAS to save, for the same sample of data and simulated events, at least 30% disk space overall, and has given recommendations on how significantly larger savings could be realised for the HL-LHC. Furthermore, suggestions were made to harmonise the current stage of analysis across the collaboration. The group has now completed its work: key recommendations will be the new small-sized analysis formats DAOD_PHYS and DAOD_PHYSLITE and the increased usage of a tape carousel mode in the centralised production of these formats. This proceeding reviews the recommended ATLAS analysis model for Run-3 and the status of its implementation. It also provides an outlook to the HL-LHC analysis.
Styles APA, Harvard, Vancouver, ISO, etc.
16

Hunca-Bednarska, Anna. « A new perspective on the usefulness of the Rorschach test in psychological assessment.Reflections on the short version of the test (Basic Rorschach) ». Current Problems of Psychiatry 20, no 4 (1 décembre 2019) : 273–88. http://dx.doi.org/10.2478/cpp-2019-0019.

Texte intégral
Résumé :
Abstract The Rorschach test is the most well-known psychological test ever invented; it has captured the imagination of entire generations of clinicians, researchers, artists, writers, and ordinary participants in mass culture. Yet, no psychological test has faced such heavy emotional criticism. The drastically ambiguous status of this test in the community of psychologists can be called an identity crisis. This is the diagnosis presented in the book titled Assessment Using the Rorschach Inkblot Test by James P. Choca and Edward D. Rossini, American professors of clinical psychology currently affiliated with Roosevelt University in Chicago. It was this book that inspired the present article. Choca and Rossini claim that the crisis associated with the use of the inkblot test stems from the lack of understanding of what the essence of this test actually is and from its improper usage. They also indicate realistic and practical ways to overcome this crisis. Faced with the excessively elaborate systems for processing and interpreting the material obtained using the test, the authors attempt to create a short version of the inkblot test (Basic Rorschach). In the short version it is possible to use a smaller number of categories or even limit oneself to using only four plates instead of ten. Choca and Rossini admit that the Basic Rorschach requires further studies; they are also willing to give psychologists a great degree of freedom and the possibility of deciding what to take into account and what to ignore in the interpretation of results. They also propose to introduce a new final phase of the test, which, in a way, involves the examinee in the process of analyzing his or her responses. In this paper I address the changes proposed by the authors, concerning both the procedure and the manner of categorizing and interpreting responses. For this purpose, I use my own clinical experience and the results of my empirical research.
Styles APA, Harvard, Vancouver, ISO, etc.
17

Boldrin, Fabio, Chiara Taddia et Gianluca Mazzini. « Web Distributed Computing Systems Implementation and Modeling ». International Journal of Adaptive, Resilient and Autonomic Systems 1, no 1 (janvier 2010) : 75–91. http://dx.doi.org/10.4018/jaras.2010071705.

Texte intégral
Résumé :
This article proposes a new approach for distributed computing. The main novelty consists in the exploitation of Web browsers as clients, thanks to the availability of JavaScript, AJAX and Flex. The described solution has two main advantages: it is client-free, so no additional programs have to be installed to perform the computation, and it requires low CPU usage, so client-side computation is not invasive for users. The solution is developed using both AJAX and Adobe® Flex® technologies, embedding a pseudo-client into a Web page that hosts the computation. While users browse the hosting Web page, computation takes place, resolving individual sub-problems and sending the solutions to the server-side part of the system. Our client-free solution is an example of a highly resilient and self-administered system that is able to organize the scheduling of the processes and the error management in an autonomic manner. A mathematical model has been developed for this solution. The main goals of the model are to describe and classify different categories of problems on the basis of their feasibility, and to find the limits in the dimensioning of the scheduling system within which this approach remains advantageous. The new architecture has been tested through different performance metrics by implementing two examples of distributed computing: the cracking of an RSA cryptosystem through the factorization of the public key, and the correlation index between samples in genetic data sets. Results have shown good feasibility of this approach both in a closed environment and in an Internet environment, in a typical real situation.
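To illustrate the sub-problem decomposition used in the RSA-factorization example (a sketch of the general idea only; the paper's actual clients are JavaScript/Flex code running in browsers): a server splits the trial-division search space into ranges, hands each range to a client, and a client reports back any factor it finds. The modulus below is a tiny toy value.

```python
# Toy decomposition of a factorization task into independent sub-ranges,
# mimicking how a server could hand work units to browser clients.
import math

def make_work_units(n: int, chunk: int = 1000):
    """Split trial divisors 2..isqrt(n) into (start, stop) ranges."""
    limit = math.isqrt(n) + 1
    return [(lo, min(lo + chunk, limit)) for lo in range(2, limit, chunk)]

def solve_unit(n: int, unit: tuple[int, int]):
    """Work a single sub-problem: search one divisor range."""
    lo, hi = unit
    for d in range(lo, hi):
        if n % d == 0:
            return d
    return None

n = 10_403                      # toy "public key" modulus: 101 * 103
for unit in make_work_units(n, chunk=25):
    factor = solve_unit(n, unit)
    if factor:
        print(f"factor found: {factor} x {n // factor}")
        break
```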
Styles APA, Harvard, Vancouver, ISO, etc.
18

Dankwa, Adwoa, Jennifer Perry et Lewis Perkins. « Assessment of Microbial Variation and Chemical Composition in Water Kefir Over Repeated Brewing Cycles and Refrigerated Storage ». Current Developments in Nutrition 5, Supplement_2 (juin 2021) : 580. http://dx.doi.org/10.1093/cdn/nzab044_011.

Texte intégral
Résumé :
Abstract Objectives Water kefir is a fermented beverage with potential probiotic effects. Its production requires a starter culture known as 'grains' and a sweetened water substrate. These grains self-propagate for use in sequential brewing cycles. Changes in microbial composition and its metabolites with repeated grain usage and prolonged storage are unknown. This may result in loss of functional properties or reduced consumer acceptability. This study evaluates the stability and functional components of microbial communities and chemical compositions in water kefir products over repeated brewing cycles and during refrigerated storage. Methods Six cultures were obtained from commercial suppliers (n = 5) or homebrewers (n = 1). Each culture system was brewed following a simplified standard recipe for 20 repeated cycles. Samples were collected at pH 4 and stored at 4 °C for analysis at predetermined time points (2, 6 and 12 weeks). Samples were enumerated for total aerobic bacteria, lactic acid bacteria (LAB), acetic acid bacteria (AAB) and yeast on TSA, MRS, ABS, and APDA media, respectively. HPLC was used for simultaneous detection of the major organic acids, sugars, and alcohol in water kefir. Data were subjected to ANOVA, MANCOVA, and mapping for visualization. Results Microbial populations and the chemical profile of water kefir were significantly affected by both repeated brewing cycles and storage. At the initial 2-week storage time, none of the brewing cycles significantly affected LAB (range: 5.5–5.7 Log CFU/g) or AAB (range: 5.2–5.4 Log CFU/g) populations. However, significant variations were seen in these populations at 12 weeks of storage, resulting in a reduction of 1.8–2.1 Log CFU/g. Overall, the microbial communities and their metabolite concentrations increased until the 10th brewing cycle and then declined to their lowest at the 20th cycle. Hydrolysis of sucrose into component monosaccharides decreased from weeks 2 to 12 for brewing cycles 1 and 10, and increased at brewing cycle 20 across all storage times. Ethanol concentration was significantly lower at the 10th and 20th brewing cycles. Conclusions These data suggest that repeated brewing beyond a certain threshold cycle adversely affects the stability and viability of the microbial culture. There is a need to limit the number of brewing cycles to achieve the optimum health benefits of water kefir. Funding Sources USDA-NIFA.
Styles APA, Harvard, Vancouver, ISO, etc.
19

Keshava-Prasad, Holavanahalli, Krishna Oza, Girindra Raval et Tamim Antakli. « Management of Intractable Bleeding after Cardiac Surgery with Recombinant Activated Factor VII ». Blood 112, no 11 (16 novembre 2008) : 4526. http://dx.doi.org/10.1182/blood.v112.11.4526.4526.

Texte intégral
Résumé :
Abstract Background Intractable hemorrhage is a dreaded complication after cardiovascular surgery, often requiring re-exploration and the administration of large quantities of blood products. In view of problems with aprotinin, a new, safer, effective agent is needed. Recombinant activated FVII is approved for use in patients with hemophilia A and B who have inhibitors to factors VIII and IX, and has shown promise in off-label use for the management of life-threatening hemorrhage in several clinical scenarios including cardiac surgery. It may help control bleeding, reduce blood product usage, and avoid potential morbidity. Its exact place in the management of bleeding during and after cardiac surgery is not yet fully known. Methods. We performed a retrospective review of patients who were given recombinant factor VIIa (rFVIIa; Novoseven, NovoNordisk, Copenhagen, Denmark) to control bleeding after major cardiovascular surgery requiring cardiopulmonary bypass (CPB) at our institution. The decision to administer rFVIIa was made empirically, based on the surgeons' observation of refractory bleeding that appeared unresponsive to conventional hemostatic agents, required large volumes of blood components, and was at least severe enough to prevent chest closure. We compared blood loss and blood component usage in patients before and after rFVIIa. We also performed a detailed review of the English literature to determine the role of rFVIIa in the treatment of bleeding after cardiac surgery. Results. Between August 2002 and February 2006, 1295 patients underwent open heart surgery at our institution; of these, 28 were given Novoseven either to control intractable bleeding or to prevent major bleeding. Table 1 shows the patient characteristics. Satisfactory hemostasis was achieved in all but 3 patients after a single 90 μg/kg intravenous dose of rFVIIa. In all patients, there was a dramatic reduction in the amount of blood components (PRBCs, platelets and FFP) used after rFVIIa infusion (Table 2). Cryoprecipitate was administered routinely with rFVIIa and its usage did not change significantly (Table 2). No thromboembolic or other complications directly related to rFVIIa occurred. Conclusions. We have demonstrated that intravenous rFVIIa is effective, safe, and valuable in the management of intractable bleeding after complicated cardiac surgeries. There are several reports and reviews in the literature which corroborate our experience and indicate that recombinant factor VIIa is a potent pro-hemostatic agent which has a role in the treatment of life-threatening refractory hemorrhage associated with cardiac surgery. Earlier preemptive administration of rFVIIa during or before surgery may be of value in patients at high risk of intractable bleeding, in order to limit blood loss and to avoid potential morbidity from large-volume blood component transfusions. Randomized, controlled trials are warranted to assess the efficacy, safety, and cost-benefit of this intervention in cardiac surgical patients.
Table 1. Characteristics and operative course of the 28 patients
Mean age: 60 yrs (range 22–85)
Sex: M 24 (85%); F 4 (15%)
Total number of surgical procedures performed: 34
Aortic valve: 7; Bentall or modified Bentall: 9 (3 emergent)
Mitral valve replacement: 4; CABG: 10; Redo: 2
Left pneumonectomy/resection of L atrial cuff & pericardium: 1
Removal of inferior vena cava tumor (renal cell ca):
Re-exploration: 6; Delayed closure: 5; Both re-exploration and delayed closure:
Median bypass time: 214 min (65–358)
Timing of Novoseven: intra-op: 21 (including elective use in 2 pts); post-op: 7
Dose of Novoseven: 90 mcg/kg in 22; 45 mcg/kg in 2 patients
Responders: 25 (89%)
Outcome: deaths 11 (38%)
Autopsies: 2; no evidence of systemic thrombosis

Table 2. Details of the blood products administered both before and after rFVIIa infusion (mean units; p value for the difference)
Component         Before rFVIIa   After rFVIIa   p value
PRBC usage        15.9            5.03           0.045
Platelet usage    4.45            1.38           0.005
FFP usage         9.93            5.79           0.042
Cryoprecipitate   21.71           12.54          0.091
Styles APA, Harvard, Vancouver, ISO, etc.
20

YU, Yonghan, Lingxi Chen, Xinyao Miao et Shuai Cheng Li. « SpecHap : a diploid phasing algorithm based on spectral graph theory ». Nucleic Acids Research, 17 août 2021. http://dx.doi.org/10.1093/nar/gkab709.

Texte intégral
Résumé :
Abstract Haplotype phasing plays an important role in understanding the genetic data of diploid eukaryotic organisms. Different sequencing technologies (such as next-generation sequencing or third-generation sequencing) produce various genetic data that require haplotype assembly. Although multiple diploid haplotype phasing algorithms exist, only a few work equally well across all sequencing technologies. In this work, we propose SpecHap, a novel haplotype assembly tool that leverages spectral graph theory. On both in silico and whole-genome sequencing datasets, SpecHap consumed less memory and required less CPU time, yet achieved accuracy comparable to state-of-the-art methods across all the test instances, which comprise sequencing data from next-generation sequencing, linked-reads, high-throughput chromosome conformation capture, PacBio single-molecule real-time, and Oxford Nanopore long-reads. Furthermore, SpecHap successfully phased an individual Ambystoma mexicanum, a species with a gigantic diploid genome, within 6 CPU hours and 945 MB peak memory usage, while other tools failed to yield results due either to memory overflow (40 GB) or to the time limit being exceeded (5 days). Our results demonstrate that SpecHap is scalable, efficient, and accurate for diploid phasing across many sequencing platforms.
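The following toy example shows the general spectral-graph idea only (it is not SpecHap's algorithm): treat variants as nodes of a weighted graph in which heavier edges connect variants whose reads suggest they belong together, and use the sign pattern of the Fiedler vector (the eigenvector of the second-smallest eigenvalue of the graph Laplacian) to split the nodes into two groups, i.e. a bipartition of the kind a phasing method needs. Graph size and weights are invented.

```python
# Toy spectral bisection with numpy: split a small weighted graph into two
# groups using the sign of the Fiedler vector. Illustrates the spectral-graph
# idea in general, not SpecHap's actual formulation.
import numpy as np

# Symmetric weight matrix for 6 nodes: two well-connected groups {0,1,2} and
# {3,4,5}, weakly linked to each other (weights are example values).
W = np.array([
    [0, 5, 4, 0, 0, 1],
    [5, 0, 6, 0, 1, 0],
    [4, 6, 0, 1, 0, 0],
    [0, 0, 1, 0, 5, 4],
    [0, 1, 0, 5, 0, 6],
    [1, 0, 0, 4, 6, 0],
], dtype=float)

D = np.diag(W.sum(axis=1))            # degree matrix
L = D - W                             # (unnormalized) graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
fiedler = eigvecs[:, 1]               # eigenvector of the 2nd-smallest eigenvalue

groups = fiedler > 0                  # sign pattern defines the bipartition
print("group A:", np.where(groups)[0], " group B:", np.where(~groups)[0])
```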
Styles APA, Harvard, Vancouver, ISO, etc.
21

« Retrograde autologous blood priming is an efficient technique for without, or minimally usage of blood infant cardiosurgery ». Medical & ; Clinical Research 2, no 1 (18 avril 2017). http://dx.doi.org/10.33140/mcr.02.01.02.

Texte intégral
Résumé :
Background: Both the severity of cardiac surgery and the technical features of the extracorporeal circulation circuit demand blood transfusion from donors, which involves a number of risks for the patient, especially one with low body weight. Priming of the cardiopulmonary bypass circuit with the patient's own blood [retrograde autologous priming (RAP)] is a technique used to limit haemodilution and reduce transfusion requirements. Methods: The study included 250 children (131 boys, 119 girls) with congenital heart disease, operated on the heart under CPB, weighing less than 20 kg (18.45 ± 2.15) and with an average age of 3.4 ± 1.7 years, who were divided into an experimental group (125 children) and a control group (125 children). In the control group, conventional CPB was performed (supplementing the priming with red blood cells), while in the study group CPB was started after RAP via the aortic cannula, with recuperation of up to 45% of the crystalloid priming. The hematocrit (Hct) and lactate (Lac) levels at two perioperative time points, and intraoperative and postoperative blood usage, were recorded. There were no significant differences in CPB time or aortic cross-clamp time between groups. Results: No hospital mortality occurred in the study and no surgical hemostasis was performed. Blood loss accounted for 6.2 ml/kg/24 h. Postoperative transfusion of homologous blood (erythrocyte mass) was needed in 73 children, making up only 29.2% of the whole study population. Amongst children who received transfusion on pump, the number of packed red blood cell units was lower in the RAP group than in the standard priming group intraoperatively and perioperatively (0.54 ± 0.17 vs. 1.48 ± 0.68 units, P = 0.03; 0.94 ± 0.54 vs. 1.69 ± 0.69 units, P = 0.15). There were no significant differences in CPB time, aortic clamp time, or Lac value between the two groups (P>0.05). Lengths of ICU and hospital stay were similar. Conclusions: Priming minimisation, autologous blood priming, and modified ultrafiltration (MUF) could diminish the necessity of perioperative blood transfusion in infant cardiac surgery.
Styles APA, Harvard, Vancouver, ISO, etc.
22

« Providing Enhanced Resource Management Framework for Cloud Storage ». International Journal of Engineering and Advanced Technology 9, no 1 (30 octobre 2019) : 3903–8. http://dx.doi.org/10.35940/ijeat.a1292.109119.

Texte intégral
Résumé :
Data centers are progressively being re-intended for workload combination with a specific end goal to receive the rewards of better resource usage, control cost, and physical space investment cost. Among the strengths driving costs are server and storage virtualization innovations. A key understanding is that there is a more noteworthy cooperative energy between the two layers of storage and server virtualization to be application piece sharing data than was beforehand thought conceivable. In this segment, we display ERMF, a platform that is intended to have MapReduce applications in virtualized cost. ERMF gives a bunch file framework that backings a uniform record framework namespace over the group by coordinating the discrete nearby storage of the individual hubs. Our paper proposes ERMF accommodates the two data and VM resource assignment with contending requirements, for example, storage usage, changing CPU load and system connect limits. ERMF utilizes a stream arrange based calculation that can improve MapReduce performance under the predetermined limitations by starting situation, as well as by straightening out through VM and data relocation also. Moreover, ERMF uncovered, generally shrouded, bring down level topology data to the MapReduce work scheduler with the goal that it makes close ideal task scheduling.
Styles APA, Harvard, Vancouver, ISO, etc.
23

« A Novel DSP Accelerator using Carry Look-Ahead Concept ». International Journal of Engineering and Advanced Technology 9, no 1 (30 octobre 2019) : 4269–72. http://dx.doi.org/10.35940/ijeat.a1343.109119.

Texte intégral
Résumé :
Hardware expanding velocity is a completed technique for digital signal processing (DSP) systems. The stimulated system uses extra computational unit gave to specific limits, for instance, planned method of reasoning, extra CPU and reviving operators' structure structures are related to execution examination booking and portion, hardware and programming co plans are done by joint gear and programming building. As opposed to grasping a strong application-express joined circuit setup. It is a new animating operator designing with versatile computational units that help the execution of a huge course of action formats seen in the DSP parts. It is isolated from past tackles versatile reviving operators by enabling estimations to be mightily perform with pass on lookahead tree. The preliminary evaluations exhibit that the proposed stimulating specialist configuration passes on reduction in delay and in essentialness usage differentiated and the past work is illustrated.
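Since the abstract text above is heavily garbled, a brief reminder of the carry look-ahead concept named in the title may help (a textbook illustration, not the paper's accelerator design): each bit position computes a generate and a propagate signal, and the carries are then derived from those signals and the initial carry, rather than waiting for a ripple chain.

```python
# Textbook carry look-ahead addition: per-bit generate/propagate signals let
# every carry be expressed in terms of g, p and the initial carry. In hardware
# the recurrence below is expanded so all carries are produced in parallel;
# here it is evaluated sequentially for clarity.
def cla_add(a: int, b: int, width: int = 8) -> int:
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]   # generate
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(width)]   # propagate
    carry = [0] * (width + 1)
    for i in range(width):
        carry[i + 1] = g[i] | (p[i] & carry[i])                   # c[i+1] = g + p*c[i]
    total = 0
    for i in range(width):
        total |= (p[i] ^ carry[i]) << i                           # sum bit i
    return total  # carry[width] would be the carry-out

print(cla_add(0b1011, 0b0110), 0b1011 + 0b0110)   # both print 17
```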
Styles APA, Harvard, Vancouver, ISO, etc.
24

Altinoluk-Mimiroglu, Pinar, et Belgin Camur-Elipek. « Vertical and horizontal composition of fecal pollution indicator bacteria in lotic and lentic ecosystems at Turkish Thrace ». Periodicum Biologorum 118, no 4 (17 mars 2017). http://dx.doi.org/10.18054/pb.v118i4.4154.

Texte intégral
Résumé :
Background and Purpose: Although freshwater ecosystems have natural bacterial populations, their distributions are negatively affected by agricultural activities and by domestic and industrial discharges. Bacterial composition at different depths can limit the usage of the water column for drinking, irrigation or other purposes. This study was designed to give indications concerning the nature of the distribution of indicator bacteria in two different freshwater ecosystem types (lotic and lentic biotopes), and also to identify the factors that might be responsible for shaping them. Materials and Methods: For this aim, stagnant and running water resources located in the Meric-Ergene River Basin in Turkish Thrace were sampled at three water depths (surface, middle, bottom) and two sediment depths (shore and bottom) between October 2014 and September 2015 at seasonal intervals. While heterotrophic bacteria, total and fecal coliform bacteria, and Escherichia coli were recorded by the CFU and MPN techniques, some features (temperature, dissolved oxygen, pH, conductivity, salinity, nutrients, ions, and elements) were also measured by classical chemical, chromatographic or spectrometric methods. Results and Conclusions: According to the data, the bacterial distribution in each ecosystem was found to be similar for the bottom and surface water columns. Results were also supported statistically by the Bray-Curtis similarity index and correspondence analysis. The relationships between bacterial distribution and environmental features were evaluated by the Spearman correlation index. Consequently, it was observed that bacterial distribution can differ both across water column/sediment depths and between lotic and lentic ecosystems. It was also suggested that the middle water column in each ecosystem is the most appropriate for human usage.
Styles APA, Harvard, Vancouver, ISO, etc.
25

Li, Zhenpeng, Hongxia Guan, Wei Wang, He Gao, Weihong Feng, Jie Li, Baowei Diao, Hongqun Zhao, Biao Kan et Jingyun Zhang. « Development of a Rapid and Fully Automated Multiplex Real-Time PCR Assay for Identification and Differentiation of Vibrio cholerae and Vibrio parahaemolyticus on the BD MAX Platform ». Frontiers in Cellular and Infection Microbiology 11 (25 février 2021). http://dx.doi.org/10.3389/fcimb.2021.639473.

Texte intégral
Résumé :
Vibrio cholerae and Vibrio parahaemolyticus are common diarrheal pathogens of great public health concern. A multiplex TaqMan-based real-time PCR assay was developed on the BD MAX platform; this assay can simultaneously detect and differentiate V. cholerae and V. parahaemolyticus directly from human fecal specimens. The assay includes two reactions. One reaction, BDM-VC, targets the gene ompW, the cholera toxin (CT) coding gene ctxA, the O1 serogroup specific gene rfbN, and the O139 serogroup specific gene wbfR of V. cholerae. The other, BDM-VP, targets the gene toxR and the toxin coding genes tdh and trh of V. parahaemolyticus. In addition, each reaction contains a sample process control. When evaluated with spiked stool samples, the detection limit of the BD MAX assay was 195–780 CFU/ml for V. cholerae and 46–184 CFU/ml for V. parahaemolyticus, and the amplification efficiency of all genes was between 95 and 115%. The assay showed 100% analytical specificity, using 63 isolates. The BD MAX assay was evaluated for its performance compared with conventional real-time PCR after automated DNA extraction steps, using 164 retrospective stool samples. The overall percent agreement between the BD MAX assay and real-time PCR was ≥ 98.8%; the positive percent agreement was 85.7% for ompW, 100% for toxR/tdh, and lower (66.7%) for trh because of a false negative. This is the first report to evaluate the usage of the BD MAX open system in detection and differentiation of V. cholerae and V. parahaemolyticus directly from human samples.
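For readers unfamiliar with the 95–115% efficiency figures quoted above: amplification efficiency in qPCR is conventionally derived from the slope of a standard curve (Cq versus log10 of template amount). This is the standard textbook relation, not something specific to the BD MAX assay.

```latex
E = 10^{-1/\text{slope}} - 1,
\qquad
\text{e.g. slope} = -3.32 \;\Rightarrow\; E = 10^{1/3.32} - 1 \approx 1.00 \;(100\%).
```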
Styles APA, Harvard, Vancouver, ISO, etc.
26

Koo, Seok Hwee, Boran Jiang, Pei Qi Lim, My-Van La et Thean Yen Tan. « Development of a rapid multiplex PCR assay for the detection of common pathogens associated with community-acquired pneumonia ». Transactions of The Royal Society of Tropical Medicine and Hygiene, 20 mai 2021. http://dx.doi.org/10.1093/trstmh/trab079.

Texte intégral
Résumé :
Abstract Background Community-acquired pneumonia (CAP) is one of the most common infectious diseases and is a significant cause of mortality and morbidity globally. A microbial cause was not determined in a sizable percentage of patients with CAP; there are increasing data to suggest regional differences in bacterial aetiology. We devised a multiplex real-time PCR assay for detecting four microorganisms (Streptococcus pneumoniae, Haemophilus influenzae, Klebsiella pneumoniae and Burkholderia pseudomallei) of relevance to CAP infections in Asia. Methods Analytical validation was accomplished using bacterial isolates (n=10–33 of each target organism for analytical sensitivity and n=117 for analytical specificity) and clinical validation using 58 culture-positive respiratory tract specimens. Results The qPCR assay exhibited 100% analytical sensitivity and analytical specificity, and 100% clinical sensitivity and 94–100% clinical specificity. The limit of detection and the efficiency of the multiplex PCR assay were 3–33 CFU/mL and 93–110%, respectively. The results showed that the PCR-based method had higher sensitivity than traditional culture-based methods. The assay also demonstrated an ability to semiquantify bacterial loads. Conclusion We have devised a reliable laboratory-developed multiplex qPCR assay, with a turnaround time within one working day, for detection of four clinically important CAP-associated microorganisms in Asia. The availability of a test with improved diagnostic capabilities potentially leads to an informed choice of antibiotic usage and appropriate management of the patient, to achieve a better treatment outcome and financial savings.
Styles APA, Harvard, Vancouver, ISO, etc.
27

Gerhard, David. « Three Degrees of “G”s : How an Airbag Deployment Sensor Transformed Video Games, Exercise, and Dance ». M/C Journal 16, no 6 (7 novembre 2013). http://dx.doi.org/10.5204/mcj.742.

Texte intégral
Résumé :
Introduction The accelerometer seems, at first, both advanced and dated, both too complex and not complex enough. It sits in our video game controllers and our smartphones allowing us to move beyond mere button presses into immersive experiences where the motion of the hand is directly translated into the motion on the screen, where our flesh is transformed into the flesh of a superhero. Or at least that was the promise in 2005. Since then, motion control has moved from a promised revitalization of the video game industry to a not-quite-good-enough gimmick that all games use but none use well. Rogers describes the diffusion of innovation, as an invention or technology comes to market, in five phases: First, innovators will take risks with a new invention. Second, early adopters will establish a market and lead opinion. Third, the early majority shows that the product has wide appeal and application. Fourth, the late majority adopt the technology only after their skepticism has been allayed. Finally the laggards adopt the technology only when no other options are present (62). Not every technology makes it through the diffusion, however, and there are many who have never warmed to the accelerometer-controlled video game. Once an innovation has moved into the mainstream, additional waves of innovation may take place, when innovators or early adopters may find new uses for existing technology, and bring these uses into the majority. This is the case with the accelerometer that began as an airbag trigger and today is used for measuring and augmenting human motion, from dance to health (Walter 84). In many ways, gestural control of video games, an augmentation technology, was an interlude in the advancement of motion control. History In the early 1920s, bulky proofs-of-concept were produced that manipulated electrical voltage levels based on the movement of a probe, many related to early pressure or force sensors. The relationships between pressure, force, velocity and acceleration are well understood, but development of a tool that could measure one and infer the others was a many-fronted activity. Each of these individual sensors has its own specific application and many are still in use today, as pressure triggers, reaction devices, or other sensor-based interactivity, such as video games (Latulipe et al. 2995) and dance (Chu et al. 184). Over the years, the probes and devices became smaller and more accurate, and eventually migrated to the semiconductor, allowing the measurement of acceleration to take place within an almost inconsequential form-factor. Today, accelerometer chips are in many consumer devices and athletes wear battery-powered wireless accelerometer bracelets that report their every movement in real-time, a concept unimaginable only 20 years ago. One of the significant initial uses for accelerometers was as a sensor for the deployment of airbags in automobiles (Varat and Husher 1). The sensor was placed in the front bumper, detecting quick changes in speed that would indicate a crash. The system was a significant advance in the safety of automobiles, and followed Rogers’ diffusion through to the point where all new cars have airbags as a standard component. Airbags, and the accelerometers which allow them to function fast enough to save lives, are a ubiquitous, commoditized technology that most people take for granted, and served as the primary motivating factor for the mass-production of silicon-based accelerometer chips. 
On 14 September 2005, a device was introduced which would fundamentally alter the principal market for accelerometer microchips. The accelerometer was the ADXL335, a small, low-power, 3-axis device capable of measuring up to 3g (1g is the acceleration due to gravity), and the device that used this accelerometer was the Wii remote, also called the Wiimote. Developed by Nintendo and its holding companies, the Wii remote was to be a defining feature of Nintendo's 7th-generation video game console, in direct competition with the Xbox 360 and the Playstation 3. The Wii remote was so successful that both Microsoft and Sony added motion control to their platforms, in the form of the accelerometer-based "dual shock" controller for the Playstation, and later the Playstation Move controller; as well as an integrated accelerometer in the Xbox 360 controller and the later release of the Microsoft Kinect 3D motion sensing camera. Simultaneously, computer manufacturing companies saw a different, more pedantic use of the accelerometer. The primary storage medium in most computers today is the Hard Disk Drive (HDD), a set of spinning platters of electro-magnetically stored information. Much like a record player, the HDD contains a "head" which sweeps back and forth across the platter, reading and writing data. As computers changed from desktops to laptops, people moved their computers more often, and a problem arose. If the HDD inside a laptop was active when the laptop was moved, the read head might touch the surface of the disk, damaging the HDD and destroying information. Two solutions were implemented: vibration dampening in the manufacturing process, and the use of an accelerometer to detect motion. When the laptop is bumped, or dropped, the hard disk will sense the motion and immediately park the head, saving the disk and the valuable data inside. As a consequence of laptop computers and Wii remotes using accelerometers, the market for these devices began to swing from their use within car airbag systems toward their use in computer systems. And with an accelerometer in every computer, it wasn't long before clever programmers began to make use of the information coming from the accelerometer for more than just protecting the hard drive. Programs began to appear that would use the accelerometer within a laptop to "lock" it when the user was away, invoking a loud noise like a car alarm to alert passers-by to any potential theft. Other programmers began to use the accelerometer as a gaming input, and this was the beginning of gesture control and the augmentation of human motion. Like laptops, most smartphones and tablets today have accelerometers included among their sensor suite (Brezmes et al. 796). These accelerometers are strictly a user-interface tool, allowing the phone to re-orient its interface based on how the user is holding it, and allowing the user to play games and track health information using the phone. Many other consumer electronic devices use accelerometers, such as digital cameras for image stabilization and landscape/portrait orientation. Allowing a device to know its relative orientation and motion provides a wide range of augmentation possibilities. The Language of Measuring Motion When studying accelerometers, their function, and applications, a critical first step is to examine the language used to describe these devices. As the name implies, the accelerometer is a device which measures acceleration; however, our everyday connotation of this term is problematic at best.
The Language of Measuring Motion

When studying accelerometers, their function, and their applications, a critical first step is to examine the language used to describe these devices. As the name implies, an accelerometer is a device which measures acceleration; however, our everyday connotation of this term is problematic at best. In colloquial language, we say “accelerate” when we mean “speed up”, but this is, in fact, two connotations removed from the physical property being measured by the device, and we must unwrap these layers of meaning before we can understand what is being measured.

Physicists use the term “accelerate” to mean any change in velocity. It is worth reminding ourselves that velocity (to a physicist) is actually a pair of quantities: a speed coupled with a direction. Given this definition, when an object changes velocity (accelerates), it can be changing its speed, its direction, or both. So a car can be said to be accelerating when speeding up, slowing down, or even turning while maintaining a constant speed. This is why the accelerometer could be used as an airbag sensor in the first place: the airbags should deploy when a car suddenly changes velocity in any direction, including getting faster (being hit from behind), getting slower (a front-impact crash), or changing direction (being hit from the side). It is because of this ability to measure changes in velocity that accelerometers have come into common usage for laptop drop sensors and video game motion controllers.

But even this understanding of accelerometers is incomplete. Because of the way that accelerometers are constructed, they actually measure “proper acceleration” within the context of a relativistic frame of reference. Discussing general relativity is beyond the scope of this paper, but it is sufficient to describe such a frame of reference as one in which no forces are felt. A familiar example is being in orbit around the planet, where astronauts (and their equipment) float freely in space. A state of free-fall is one in which no forces are felt, and this is the only situation in which an accelerometer reads 0 acceleration. Since most of us are not in free-fall most of the time, accelerometers in devices in normal use do not read 0 proper acceleration, even when apparently sitting still. This is, of course, because of the force due to gravity: an accelerometer sitting on a table experiences 1g of force from the table, acting against the gravitational acceleration. This non-zero reading for a stationary object is the reason that accelerometers can serve a second (and, today, much more common) purpose: measuring orientation with respect to gravity.

Gravity and Tilt

Accelerometers typically measure forces with respect to three linear dimensions, labeled x, y, and z. These three directions orient along the axes of the accelerometer chip itself, with x and y normally running along the long faces of the device and z pointing through the face of the device. Relative orientation within a gravity field can easily be inferred, assuming that the only force acting on the device is gravity. In this case, the single force is distributed among the three axes depending on the orientation of the device. This is how smartphones and video game controllers are able to offer “tilt” control. When the device is held in a natural position, the software records the readings on all three axes and uses them as a reference point. When the user tilts the device, the new direction of the gravitational acceleration is compared to the reference value and used to infer the tilt. This can be done hundreds of times a second and can be used to control and augment any aspect of the user experience. If, however, gravity is not the only force present, it becomes more difficult to infer orientation.
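A minimal sketch of this tilt inference follows, assuming the device is effectively at rest so that the accelerometer reading is just the gravity vector. The axis conventions, angle formulas, and function names are illustrative choices rather than any particular device’s API.

```python
import math

def pitch_roll(ax, ay, az):
    """Orientation of gravity relative to the device axes, in degrees.
    Only meaningful when gravity is effectively the sole force acting."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Capture a reference pose, then report tilt relative to it.
reference = pitch_roll(0.0, 0.0, 1.0)   # device lying flat: roughly (0, 0) degrees
current = pitch_roll(0.0, 0.5, 0.866)   # device rolled roughly 30 degrees
tilt = tuple(c - r for c, r in zip(current, reference))
print(tilt)   # approximately (0.0, 30.0): the value a game would use as input
```

Sampling this difference continuously is all that is needed to turn gravity into a steering wheel or a cursor, which is why tilt is by far the most common consumer use of the sensor.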
Another common use for accelerometers is to measure physical activity, such as counting walking steps. In this case, it is the forces on the accelerometer from each footfall that are interpreted to derive fitness measures. Tilt is unreliable in this circumstance because both gravity and the forces from the footfall are measured by the accelerometer, and it is impossible to separate the two forces from a single measurement.

Velocity and Position

A second common assumption about accelerometers is that, since they measure acceleration (the rate of change of velocity), it should be possible to infer velocity. If the device begins at rest, then any measured acceleration can be interpreted as a change to the velocity in some direction, and thus the new velocity can be inferred. Although this is theoretically possible, real-world factors prevent it from being realized. First, the assumption of beginning from a state of rest is not always reasonable; if we don’t know whether the device is moving or not, knowing its acceleration at any moment will not help us to determine its new speed or position. The most important real-world problem, however, is that accelerometers typically show small variations even when the object is at rest, because of noise and small biases in the sensor and in the way its signal is read. In normal operation these small errors are ignored, but when trying to infer velocity or position they accumulate with every measurement, to the point where any inferred velocity or position is unreliable.

A common solution to these problems is the combination of devices. Many new smartphones combine an accelerometer and a gyroscope (a device which measures rotation) into a sensing system known as an IMU (inertial measurement unit), which makes the readings from each more reliable. In this case, the gyroscope can be used to track tilt directly (instead of inferring it from gravity), and the gravity component implied by that tilt can be subtracted from the accelerometer reading, separating the motion of the device from the force of gravity.
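To illustrate how quickly those small errors compound, here is a tiny numerical sketch: a device that is truly stationary, but whose accelerometer reports a small constant bias, naively integrated twice to estimate velocity and position. The bias value and sample rate are assumed for illustration only.

```python
# True motion is zero: the device sits still, but the sensor reports a small bias.
SAMPLE_RATE_HZ = 100
DT = 1.0 / SAMPLE_RATE_HZ
BIAS = 0.01                               # m/s^2 of error while truly at rest (assumed)

velocity = 0.0
position = 0.0
for _ in range(60 * SAMPLE_RATE_HZ):      # integrate one minute of samples
    measured_acceleration = 0.0 + BIAS    # true acceleration is zero
    velocity += measured_acceleration * DT
    position += velocity * DT

print(f"inferred velocity after 1 minute: {velocity:.2f} m/s")   # about 0.6 m/s
print(f"inferred position after 1 minute: {position:.1f} m")     # about 18 m of drift
```

Even a bias of roughly one thousandth of 1g produces tens of metres of apparent travel within a minute, which is why a standalone accelerometer is not used to track position.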
Augmentation Applications in Health, Gaming, and Art

Accelerometer-based devices have been used extensively in healthcare (Ward et al. 582), either using the accelerometer within a smartphone worn in the pocket (Yoshioka et al. 502) or using a standalone accelerometer device such as a wristband or shoe tab (Paradiso and Hu 165). In many cases, these devices have been used to measure specific activity such as swimming, gait (Henriksen et al. 288), and muscular activity (Thompson and Bemben 897), as well as general activity for tracking health (Troiano et al. 181), both in children (Stone et al. 136) and the elderly (Davis and Fox 581). These simple measurements are the first step in allowing athletes to modify their performance based on past activity. In the past, athletes would pore over recorded video to analyze and improve their performance, but with accelerometer devices they can receive feedback in real time and modify their own behaviour based on these measurements. This augmentation is a competitive advantage, but it could be seen as unfair given the current unequal access to computer and electronic technology, i.e. the digital divide (Buente and Robbin 1743).

When video games were augmented with motion controls, many assumed that this would have a positive impact on health. Physical activity in children is a common concern (Treuth et al. 1259), and there was a hope that if children had to move to play games, an activity that used to be considered a problem for health could be turned into an opportunity (Mellecker et al. 343). Unfortunately, the impact of motion-controlled video games on children’s activity has been disappointing. Although fitness games have been created, it is relatively easy to figure out how to activate the controls with the least possible motion, thereby nullifying any potential benefit.

One of the most interesting applications of accelerometers, in the context of this paper, is their application to dance-based video games (Brezmes et al. 796). In these systems, participants wear devices originally intended for health tracking in order to increase the sensitivity and control options for dance. This has evolved both from the use of accelerometers for gestural control in video games and from their use in measuring and augmenting sport. Researchers and artists have also recently used accelerometers to augment dance systems in many ways (Latulipe et al. 2995), including combining multiple sensors (Yang et al. 121), as discussed above.

Conclusions

Although more and more people are using accelerometers in their research and art practice, there is a notable lack of widespread knowledge about how the devices actually work. This can be seen in the many art installations and sports research studies that do not take full advantage of the capabilities of the accelerometer, or that infer information which is unreliable because of the way accelerometers behave. This lack of understanding also limits the wider use of this powerful device, specifically in the context of augmentation tools. Being able to detect, analyze, and interpret the motion of a body part has significant applications in augmentation that are only starting to be realized. The history of accelerometers is interesting and varied, and it is worthwhile, when exploring new ideas for applications of accelerometers, to be fully aware of previous uses, current trends, and technical limitations. It is clear that applications of accelerometers to the measurement of human motion are increasing, and that many new opportunities exist, especially in the combination of sensors and new software techniques. The real novelty, however, will come from researchers and artists using accelerometers and sensors in novel and unusual ways.

References

Brezmes, Tomas, Juan-Luis Gorricho, and Josep Cotrina. “Activity Recognition from Accelerometer Data on a Mobile Phone.” In Distributed Computing, Artificial Intelligence, Bioinformatics, Soft Computing, and Ambient Assisted Living. Springer, 2009.
Buente, Wayne, and Alice Robbin. “Trends in Internet Information Behavior, 2000-2004.” Journal of the American Society for Information Science and Technology 59.11 (2008).
Chu, Narisa N.Y., Chang-Ming Yang, and Chih-Chung Wu. “Game Interface Using Digital Textile Sensors, Accelerometer and Gyroscope.” IEEE Transactions on Consumer Electronics 58.2 (2012): 184-189.
Davis, Mark G., and Kenneth R. Fox. “Physical Activity Patterns Assessed by Accelerometry in Older People.” European Journal of Applied Physiology 100.5 (2007): 581-589.
Hagstromer, Maria, Pekka Oja, and Michael Sjostrom. “Physical Activity and Inactivity in an Adult Population Assessed by Accelerometry.” Medicine and Science in Sports and Exercise 39.9 (2007): 1502-08.
Henriksen, Marius, H. Lund, R. Moe-Nilssen, H. Bliddal, and B. Danneskiod-Samsøe. “Test–Retest Reliability of Trunk Accelerometric Gait Analysis.” Gait & Posture 19.3 (2004): 288-297.
Latulipe, Celine, David Wilson, Sybil Huskey, Melissa Word, Arthur Carroll, Erin Carroll, Berto Gonzalez, Vikash Singh, Mike Wirth, and Danielle Lottridge. “Exploring the Design Space in Technology-Augmented Dance.” In CHI ’10 Extended Abstracts on Human Factors in Computing Systems. ACM, 2010.
Mellecker, Robin R., Lorraine Lanningham-Foster, James A. Levine, and Alison M. McManus. “Energy Intake during Activity Enhanced Video Game Play.” Appetite 55.2 (2010): 343-347.
Paradiso, Joseph A., and Eric Hu. “Expressive Footwear for Computer-Augmented Dance Performance.” In First International Symposium on Wearable Computers. IEEE, 1997.
Rogers, Everett M. Diffusion of Innovations. New York: Free Press of Glencoe, 1962.
Stone, Michelle R., Ann V. Rowlands, and Roger G. Eston. “Relationships between Accelerometer-Assessed Physical Activity and Health in Children: Impact of the Activity-Intensity Classification Method.” The Free Library 1 Mar. 2009.
Thompson, Christian J., and Michael G. Bemben. “Reliability and Comparability of the Accelerometer as a Measure of Muscular Power.” Medicine and Science in Sports and Exercise 31.6 (1999): 897-902.
Treuth, Margarita S., Kathryn Schmitz, Diane J. Catellier, Robert G. McMurray, David M. Murray, M. Joao Almeida, Scott Going, James E. Norman, and Russell Pate. “Defining Accelerometer Thresholds for Activity Intensities in Adolescent Girls.” Medicine and Science in Sports and Exercise 36.7 (2004): 1259-1266.
Troiano, Richard P., David Berrigan, Kevin W. Dodd, Louise C. Masse, Timothy Tilert, Margaret McDowell, et al. “Physical Activity in the United States Measured by Accelerometer.” Medicine and Science in Sports and Exercise 40.1 (2008): 181-88.
Varat, Michael S., and Stein E. Husher. “Vehicle Impact Response Analysis through the Use of Accelerometer Data.” In SAE World Congress, 2000.
Walter, Patrick L. “The History of the Accelerometer.” Sound and Vibration (Mar. 1997): 16-22.
Ward, Dianne S., Kelly R. Evenson, Amber Vaughn, Anne Brown Rodgers, Richard P. Troiano, et al. “Accelerometer Use in Physical Activity: Best Practices and Research Recommendations.” Medicine and Science in Sports and Exercise 37.11 (2005): S582-8.
Yang, Chang-Ming, Jwu-Sheng Hu, Ching-Wen Yang, Chih-Chung Wu, and Narisa Chu. “Dancing Game by Digital Textile Sensor, Accelerometer and Gyroscope.” In IEEE International Games Innovation Conference. IEEE, 2011.
Yoshioka, M., M. Ayabe, T. Yahiro, H. Higuchi, Y. Higaki, J. St-Amand, H. Miyazaki, Y. Yoshitake, M. Shindo, and H. Tanaka. “Long-Period Accelerometer Monitoring Shows the Role of Physical Activity in Overweight and Obesity.” International Journal of Obesity 29.5 (2005): 502-508.
