
Journal articles on the topic 'Sensors on dynamic environments'



Consult the top 50 journal articles for your research on the topic 'Sensors on dynamic environments.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Müller, Simone, and Dieter Kranzlmüller. "Dynamic Sensor Matching based on Geomagnetic Inertial Navigation." Journal of WSCG 30, no. 1-2 (2022): 16–25. http://dx.doi.org/10.24132/jwscg.2022.3.

Abstract:
Optical sensors can capture dynamic environments and derive depth information in near real-time. The quality of these digital reconstructions is determined by factors like illumination, surface and texture conditions, sensing speed and other sensor characteristics, as well as the sensor-object relations. Improvements can be obtained by using dynamically collected data from multiple sensors. However, matching the data from multiple sensors requires a shared world coordinate system. We present a concept for transferring multi-sensor data into a commonly referenced world coordinate system: the earth's magnetic field. The steady presence of our planetary magnetic field provides a reliable world coordinate system, which can serve as a reference for a position-defined reconstruction of dynamic environments. Our approach is evaluated using the magnetic field sensors of the ZED 2 stereo camera from Stereolabs, which provides orientation relative to the North Pole, similar to a compass. With the help of inertial measurement unit information, each camera's position data can be transferred into the unified world coordinate system. Our evaluation reveals the level of quality possible using the earth's magnetic field and provides a basis for dynamic, real-time applications of optical multi-sensors for environment detection.
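As a rough illustration of the shared-reference idea above — not the authors' pipeline — the sketch below uses a hypothetical magnetometer reading to derive a heading and rotates a sensor-local displacement into a common, north-referenced frame.

```python
import numpy as np

def heading_from_magnetometer(mx, my):
    # Heading of the sensor's forward axis relative to magnetic north (radians),
    # assuming a levelled 2-axis reading; sign conventions depend on axis orientation.
    return np.arctan2(my, mx)

def to_north_aligned(local_xy, heading):
    # Rotate a displacement expressed in the sensor frame into the shared,
    # north-referenced frame.
    c, s = np.cos(heading), np.sin(heading)
    return np.array([[c, -s], [s, c]]) @ np.asarray(local_xy, dtype=float)

# Hypothetical reading and displacement: each camera's motion, once its own
# heading is removed, is expressed in the same world-referenced frame.
heading = heading_from_magnetometer(0.22, 0.11)
print(to_north_aligned([1.0, 0.0], heading))
```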
2

Gao, Rui, Wenjun Zhang, Junmin Jing, et al. "Design, Fabrication, and Dynamic Environmental Test of a Piezoresistive Pressure Sensor." Micromachines 13, no. 7 (2022): 1142. http://dx.doi.org/10.3390/mi13071142.

Abstract:
Microelectromechanical system (MEMS) pressure sensors have a wide range of applications based on the advantages of mature technology and easy integration. Among them, piezoresistive sensors have attracted great attention with the advantage of simple back-end processing circuits. However, less research has been reported on the performance of piezoresistive pressure sensors in dynamic environments, especially considering the vibrations and shocks frequently encountered during the application of the sensors. To address these issues, this paper proposes a design method for a MEMS piezoresistive pressure sensor, and the fabricated sensor is evaluated in a series of systematic dynamic environmental adaptability tests. After testing, the output sensitivity of the sensor chip was 9.21 mV·bar⁻¹, while the nonlinearity was 0.069% FSS. The sensor reacts quickly to rapidly changing pressure environments and can withstand acceleration shocks of up to 20 g. In addition, the sensor is capable of providing normal output over the vibration frequency range of 0–5000 Hz, with a temperature coefficient of sensitivity of −0.30% FSS °C⁻¹ over the temperature range of 0–80 °C. Our proposed sensor can play a key role in applications with wide pressure ranges, high-frequency vibrations, and high acceleration shocks, and can guide the design of MEMS-based pressure sensors for high pressure ranges and complex environmental adaptability.
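For context, static figures such as the sensitivity and full-scale nonlinearity quoted above are conventionally derived from a calibration sweep; the minimal sketch below uses made-up calibration points, not the authors' data.

```python
import numpy as np

# Hypothetical calibration sweep: applied pressure (bar) vs. output voltage (mV)
pressure = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
output_mv = np.array([0.1, 9.3, 18.5, 27.9, 36.9, 46.2])

# Least-squares straight line through the sweep
slope, intercept = np.polyfit(pressure, output_mv, 1)
fit = slope * pressure + intercept

# Sensitivity = slope of the fit; nonlinearity = worst deviation as % of full-scale span
full_scale_span = output_mv[-1] - output_mv[0]
nonlinearity_fss = np.max(np.abs(output_mv - fit)) / full_scale_span * 100

print(f"sensitivity ≈ {slope:.2f} mV/bar, nonlinearity ≈ {nonlinearity_fss:.3f}% FSS")
```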
3

Fox, D., W. Burgard, and S. Thrun. "Markov Localization for Mobile Robots in Dynamic Environments." Journal of Artificial Intelligence Research 11 (November 23, 1999): 391–427. http://dx.doi.org/10.1613/jair.616.

Abstract:
Localization, that is, the estimation of a robot's location from sensor data, is a fundamental problem in mobile robotics. This paper presents a version of Markov localization which provides accurate position estimates and which is tailored towards dynamic environments. The key idea of Markov localization is to maintain a probability density over the space of all locations of a robot in its environment. Our approach represents this space metrically, using a fine-grained grid to approximate densities. It is able to globally localize the robot from scratch and to recover from localization failures. It is robust to approximate models of the environment (such as occupancy grid maps) and noisy sensors (such as ultrasound sensors). Our approach also includes a filtering technique which allows a mobile robot to reliably estimate its position even in densely populated environments in which crowds of people block the robot's sensors for extended periods of time. The method described here has been implemented and tested in several real-world applications of mobile robots, including the deployments of two mobile robots as interactive museum tour-guides.
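The grid-based Bayes filter at the heart of Markov localization is compact enough to sketch; the 1-D toy example below (hypothetical map and sensor model, not the paper's metric grid implementation) shows the alternating motion-prediction and measurement-update steps.

```python
import numpy as np

def predict(belief, shift, spread=0.1):
    """Motion update: shift the belief and blur it to model odometry noise."""
    moved = np.roll(belief, shift)
    blurred = (1 - 2 * spread) * moved + spread * (np.roll(moved, 1) + np.roll(moved, -1))
    return blurred / blurred.sum()

def update(belief, world, measurement, p_hit=0.8, p_miss=0.2):
    """Measurement update: weight each cell by how well it explains the observation."""
    likelihood = np.where(world == measurement, p_hit, p_miss)
    posterior = belief * likelihood
    return posterior / posterior.sum()

world = np.array([1, 0, 0, 1, 0])             # toy map: 1 = landmark visible, 0 = not
belief = np.full(len(world), 1 / len(world))  # uniform prior (global localization)
for move, z in [(1, 0), (1, 0), (1, 1)]:      # alternate motion and sensing
    belief = update(predict(belief, move), world, z)
print(belief)                                  # probability mass concentrates on one cell
```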
4

Russell, Joseph, Jeroen H. M. Bergmann, and Vikranth H. Nagaraja. "Towards Dynamic Multi-Modal Intent Sensing Using Probabilistic Sensor Networks." Sensors 22, no. 7 (2022): 2603. http://dx.doi.org/10.3390/s22072603.

Abstract:
Intent sensing—the ability to sense what a user wants to happen—has many potential technological applications. Assistive medical devices, such as prosthetic limbs, could benefit from intent-based control systems, allowing for faster and more intuitive control. The accuracy of intent sensing could be improved by using multiple sensors sensing multiple environments. As users will typically pass through different sensing environments throughout the day, the system should be dynamic, with sensors dropping in and out as required. An intent-sensing algorithm that allows for this cannot rely on training from only a particular combination of sensors. It should allow any (dynamic) combination of sensors to be used. Therefore, the objective of this study is to develop and test a dynamic intent-sensing system under changing conditions. A method has been proposed that treats each sensor individually and combines them using Bayesian sensor fusion. This approach was tested on laboratory data obtained from subjects wearing Inertial Measurement Units and surface electromyography electrodes. The proposed algorithm was then used to classify functional reach activities and compare the performance to an established classifier (k-nearest-neighbours) in cases of simulated sensor dropouts. Results showed that the Bayesian sensor fusion algorithm was less affected as more sensors dropped out, supporting this intent-sensing approach as viable in dynamic real-world scenarios.
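A minimal sketch of per-sensor Bayesian fusion in the spirit described above: each available sensor contributes an independent likelihood, and a sensor that has dropped out is simply left out of the product. The class names and likelihood values here are hypothetical.

```python
import numpy as np

def fuse_intents(priors, sensor_likelihoods):
    """Combine per-sensor class likelihoods with a naive-Bayes style product.

    priors: dict class -> prior probability
    sensor_likelihoods: list of dicts class -> P(observation | class); a sensor
    that has dropped out is simply absent from the list.
    """
    classes = list(priors)
    log_post = np.log([priors[c] for c in classes])
    for lik in sensor_likelihoods:            # each connected sensor adds evidence
        log_post += np.log([lik[c] for c in classes])
    post = np.exp(log_post - log_post.max())
    return dict(zip(classes, post / post.sum()))

priors = {"reach": 0.5, "rest": 0.5}
imu = {"reach": 0.7, "rest": 0.3}
emg = {"reach": 0.6, "rest": 0.4}
print(fuse_intents(priors, [imu, emg]))   # both sensors available
print(fuse_intents(priors, [imu]))        # EMG dropped out: fusion degrades gracefully
```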
5

Zainab, Begum, Tahreem Fatima Nisaa, Suzana, Harish Joshi, Uzma Kausar, and Ashok Bawge. "Smart Sensors: A Comprehensive Survey of IoT Sensor Types and Applications." Journal of Emerging Trends in Electrical Engineering 7, no. 2 (2025): 16–24. https://doi.org/10.5281/zenodo.15573961.

Abstract:
The Internet of Things (IoT) is a transformative technology, redefining modern life by embedding billions of sensors and actuators into everyday environments. This integration yields intelligent, connected scenarios where continuous data capture provides a comprehensive operational view. Widely recognized in academic research, sensors facilitate ubiquitous data acquisition and are pivotal for diverse IoT applications. This paper explores a wide range of IoT sensors and examines various sensor-driven applications that help form smart environments. By evaluating multiple sensor application areas, we identify optimal sensor configurations for distinct IoT scenarios, thereby laying a robust foundation for future advancements in this dynamic domain.
6

Yoon, John. "Trustworthiness of Dynamic Moving Sensors for Secure Mobile Edge Computing." Computers 7, no. 4 (2018): 63. http://dx.doi.org/10.3390/computers7040063.

Abstract:
Wireless sensor networks are an emerging technology, and the collaboration of wireless sensors has become one of the active research areas for utilizing sensor data. Various sensors collaborate to recognize changes in a target environment and to identify whether any radical change occurs. To improve accuracy, the calibration of sensors has been discussed, and sensor data analytics are becoming popular in research and development. However, they are not satisfactorily efficient in situations where sensor devices are dynamically moving, abruptly appearing, or disappearing. If the abrupt appearance of sensors is a zero-day attack, and the disappearance of sensors is an ill-functioning comrade, then sensor data analytics on untrusted sensors will result in an indecisive artifact. Predefined sensor requirements or metadata-based sensor verification are not adaptive enough to identify dynamically moving sensors. This paper describes a deep-learning approach to verifying the trustworthiness of sensors by considering the sensor data only. The proposed verification of sensors can be done without having to use metadata about sensors or to request consultation from a cloud server. The contributions of this paper include (1) quality preservation of sensor data for mining analytics, where the sensor data are trained to identify the characteristics of outliers: whether they are attack outliers or outlier-like abrupt changes in the environment; and (2) authenticity verification of dynamically moving sensors, in which previously unknown sensors are also identified by the deep-learning approach.
7

Schulte-Tigges, Joschua, Marco Förster, Gjorgji Nikolovski, et al. "Benchmarking of Various LiDAR Sensors for Use in Self-Driving Vehicles in Real-World Environments." Sensors 22, no. 19 (2022): 7146. http://dx.doi.org/10.3390/s22197146.

Abstract:
In this paper, we report on our benchmark results of the LiDAR sensors Livox Horizon, Robosense M1, Blickfeld Cube, Blickfeld Cube Range, Velodyne Velarray H800, and Innoviz Pro. The idea was to test the sensors in different typical scenarios that were defined with real-world use cases in mind, in order to find a sensor that meets the requirements of self-driving vehicles. For this, we defined static and dynamic benchmark scenarios. In the static scenarios, neither the LiDAR nor the detection target moves during the measurement. In the dynamic scenarios, the LiDAR sensor was mounted on the vehicle, which was driven toward the detection target. We tested all mentioned LiDAR sensors in both scenarios, show the results regarding the detection accuracy of the targets, and discuss their usefulness for deployment in self-driving cars.
8

Wang, Y., D. Ewert, T. Meisen, D. Schilberg, and S. Jeschke. "Work area monitoring in dynamic environments using multiple auto-aligning 3-D sensors." Journal of Sensors and Sensor Systems 3, no. 1 (2014): 113–20. http://dx.doi.org/10.5194/jsss-3-113-2014.

Abstract:
Compared to current industry standards, future production systems will be more flexible and robust and will adapt to unforeseen states and events. Industrial robots will interact with each other as well as with human coworkers. To be able to act in such a dynamic environment, each acting entity ideally needs complete knowledge of its surroundings, concerning working materials as well as other working entities. Therefore, new monitoring methods providing complete coverage for complex and changing working areas are needed. While single 3-D sensors already provide detailed information within their field of view, complete coverage of a work area can only be achieved by relying on a multitude of these sensors. However, to provide useful information, all data from each sensor must be aligned with each other and fused into an overall world picture. To be able to align the data correctly, the position and orientation of each sensor must be known with sufficient exactness. In a quickly changing dynamic environment, the positions of sensors are not fixed, but must be adjusted to maintain optimal coverage. Therefore, the sensors need to autonomously align themselves in real time. This can be achieved by adding defined markers with given geometrical patterns to the environment, which can be used for calibration and localization of each sensor. As soon as two sensors detect the same markers, their relative position to each other can be calculated. Additional anchor markers at fixed positions serve as global reference points for the base coordinate system. In this paper we present a prototype for a self-aligning monitoring system based on the Robot Operating System (ROS) and Microsoft Kinect. This system is capable of autonomous real-time calibration relative to a global coordinate system, as well as of detecting and tracking defined objects within the working area.
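The step in which two sensors that see the same markers compute their relative pose is, at its core, a rigid alignment of corresponding 3-D points; below is a generic Kabsch/SVD sketch with hypothetical marker coordinates, not the ROS/Kinect prototype's code.

```python
import numpy as np

def relative_pose(markers_a, markers_b):
    """Rigid transform (R, t) mapping marker coordinates in sensor A's frame
    to the same markers in sensor B's frame (Kabsch / SVD alignment)."""
    a = np.asarray(markers_a, float)
    b = np.asarray(markers_b, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    h = (a - ca).T @ (b - cb)                   # cross-covariance of centred points
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against reflections
    r = vt.T @ np.diag([1, 1, d]) @ u.T
    t = cb - r @ ca
    return r, t

# Three shared markers seen by both sensors (hypothetical coordinates)
a = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
b = [[1, 2, 0], [1, 2, 1], [0, 2, 0]]
r, t = relative_pose(a, b)
print(np.allclose((r @ np.asarray(a).T).T + t, b))   # True: markers align
```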
9

Liu, Ning Han, Hsiang Ming Hsu, and Tien Cheng Huang. "Enhanced Approaches for Processing Window Queries in Dynamic Wireless Sensor Networks." Applied Mechanics and Materials 58-60 (June 2011): 2122–27. http://dx.doi.org/10.4028/www.scientific.net/amm.58-60.2122.

Abstract:
Due to the proliferation of low cost wireless sensors, there is growing research interest in their applications, for example, in home healthcare and location tracking. However, due to sensors’ energy resource constraint, some possible applications of sensors have been restricted. In particular, in applications concerning deployment of mobile sensors in dynamic environments, high amounts of energy are consumed by sensors to maintain routing tables. Although existing methods have been proposed to query data from sensors without the use of any routing tables, these methods typically require redundant data to be sent back to the sink and not all of the aggregation functions could be executed precisely. In this paper, we modify an existing method to provide more accurate query answers and extend the lifetime of a wireless sensor network (WSN). According to our simulation, this method outperforms the existing method our approach modifies.
10

Krushnasamy, V. S., Pravin Prakash Adivarekar, Deepa G., Mohammed Azam, and Prince Williams. "MEMS-Enhanced Sensor Fusion for Autonomous Navigation in Robotics." ICTACT Journal on Microelectronics 9, no. 3 (2023): 1607–12. https://doi.org/10.21917/ijme.2023.0279.

Abstract:
In autonomous robotics, achieving precise navigation remains a formidable challenge, necessitating advancements in sensor fusion techniques. This study addresses the pivotal role of Micro-Electro-Mechanical Systems (MEMS) in enhancing sensor fusion for autonomous navigation. The pressing problem of achieving accurate and real-time navigation in dynamic environments has spurred the need for innovative solutions. The existing literature reveals a research gap in the seamless integration of MEMS sensors for robust navigation. While traditional sensor fusion methods often face limitations in handling diverse and rapidly changing environmental conditions, MEMS offer a promising avenue for overcoming these challenges. The miniature size, low power consumption, and high sensitivity of MEMS sensors make them ideal candidates for providing rich and reliable data for navigation purposes. The method employed in this research involves a comprehensive integration of MEMS sensors, such as accelerometers, gyroscopes, and magnetometers, into a unified sensor fusion framework. This framework leverages advanced algorithms to intelligently combine data from multiple sensors, mitigating individual sensor limitations and enhancing overall accuracy. The integration of MEMS sensors aims to provide a more holistic understanding of the robot's surroundings, facilitating improved decision-making in navigation tasks. The results of our study showcase a significant improvement in the accuracy and efficiency of autonomous navigation in dynamic environments. MEMS-enhanced sensor fusion proves to be a viable solution for addressing the challenges posed by unpredictable terrains and obstacles. The robot equipped with MEMS sensors demonstrates enhanced adaptability and responsiveness, showcasing the potential for real-world applications.
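One classic, lightweight way to fuse MEMS gyroscope and accelerometer data of the kind discussed above is a complementary filter; the sketch below is illustrative only (planar tilt case with synthetic data, not the framework proposed in the paper), and the same blending idea extends to heading with a magnetometer.

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, prev_angle, alpha=0.98):
    """Blend the integrated gyro rate (good short-term) with the accelerometer
    tilt angle (good long-term, but noisy) into a drift-corrected estimate."""
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

dt = 0.01                                             # 100 Hz sample rate (assumed)
t = np.arange(0, 2, dt)
true_angle = 0.5 * np.sin(t)                          # synthetic motion
gyro = np.gradient(true_angle, dt) + 0.05             # rate with a constant bias
accel = true_angle + 0.05 * np.random.randn(t.size)   # noisy tilt from gravity

angle = 0.0
for g, a in zip(gyro, accel):
    angle = complementary_filter(g, a, dt, angle)
print(f"final estimate {angle:.3f} rad vs. true {true_angle[-1]:.3f} rad")
```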
11

Jansen, Wouter, Dennis Laurijssen, and Jan Steckel. "Real-Time Sonar Fusion for Layered Navigation Controller." Sensors 22, no. 9 (2022): 3109. http://dx.doi.org/10.3390/s22093109.

Abstract:
Navigation in varied and dynamic indoor environments remains a complex task for autonomous mobile platforms. Especially when conditions worsen, typical sensor modalities may fail to operate optimally and subsequently provide inapt input for safe navigation control. In this study, we present an approach for the navigation of a dynamic indoor environment with a mobile platform with a single or several sonar sensors using a layered control system. These sensors can operate in conditions such as rain, fog, dust, or dirt. The different control layers, such as collision avoidance and corridor following behavior, are activated based on acoustic flow queues in the fusion of the sonar images. The novelty of this work is allowing these sensors to be freely positioned on the mobile platform and providing the framework for designing the optimal navigational outcome based on a zoning system around the mobile platform. Presented in this paper is the acoustic flow model used, as well as the design of the layered controller. Next to validation in simulation, an implementation is presented and validated in a real office environment using a real mobile platform with one, two, or three sonar sensors in real time with 2D navigation. Multiple sensor layouts were validated in both the simulation and real experiments to demonstrate that the modular approach for the controller and sensor fusion works optimally. The results of this work show stable and safe navigation of indoor environments with dynamic objects.
12

Vinay, Nagarad Dasavandi Krishnamurthy. "Advancements in Automotive Temperature Sensing: Technologies, Applications, and Smart Sensor Integration." European Journal of Advances in Engineering and Technology 8, no. 10 (2021): 91–95. https://doi.org/10.5281/zenodo.12771189.

Abstract:
Temperature sensors are essential components in modern automotive systems, contributing significantly to vehicle performance, safety, and efficiency. This paper provides an extensive overview of temperature sensors used in automotive applications, encompassing their types, functions, and applications. The types of temperature sensors commonly employed in automotive systems include thermistors, thermocouples, resistance temperature detectors (RTDs), and infrared devices. Each sensor type offers distinct advantages and is suited to specific temperature ranges and applications. For instance, thermocouples excel in high-temperature environments, while RTDs provide accurate and stable measurements across a wide temperature range. Various automotive temperature sensors, such as coolant temperature sensors, intake air temperature sensors, and exhaust gas temperature sensors, play critical roles in engine performance optimization, emission control, and component protection. These sensors enable precise monitoring of temperature parameters, facilitating efficient engine operation and prolonging the lifespan of vehicle components. Moreover, the integration of "smart sensor" technology enhances the functionality of automotive temperature sensors by incorporating electronic signal processing capabilities. Smart sensors offer features such as automatic gain control, dynamic threshold sensing, and digital output signals, improving sensor accuracy, reliability, and compatibility with communication networks. Furthermore, the paper discusses temperature sensor range and types used in automotive applications, highlighting the operational characteristics and applications of silicon IC sensors, thermistors, and RTDs. The comparative analysis of high-temperature sensors for exhaust monitoring applications underscores the superiority of RTDs in demanding environments.
13

Yu, Hui-Fen, He Qi, Xiao-Niu Tu, et al. "Research Progress on High-temperature Piezoelectric Vibration Sensors and Piezoelectric Materials." Acta Physica Sinica 74, no. 2 (2025). https://doi.org/10.7498/aps.74.20240906.

Abstract:
Vibration sensor technology, particularly piezoelectric vibration sensors, is extensively utilized across various fields due to their excellent dynamic response, linearity, wide bandwidth, high sensitivity, large temperature range, simple structure, and stable performance. They are widely applied in sectors such as nuclear power, aerospace, rail transportation, and defense industries. However, most piezoelectric vibration sensors are limited to operating temperatures below 500 ℃, which restricts their use in extreme high-temperature environments encountered in nuclear reactors, aircraft engines, missile systems, and internal combustion engines. These application scenarios impose higher demands on the reliability of piezoelectric vibration sensors for long-term service in extreme environments. How to improve the operating temperature of piezoelectric vibration sensors to meet the application needs in extreme environments is currently an urgent problem to be solved.

High-temperature piezoelectric materials, as the core components of piezoelectric vibration sensors, play a decisive role in determining the overall performance of the sensor. Common high-temperature piezoelectric materials include piezoelectric ceramics and single crystals. To ensure stable operation and excellent sensitivity in extreme environments, it is essential to select piezoelectric materials with high Curie temperatures, high piezoelectric coefficients, high resistivity, and low dielectric losses as the sensing elements of the sensor. Piezoelectric vibration sensors typically come in three main types: bending, compression, and shear. In addition to selecting the appropriate piezoelectric material, it is also crucial to choose the optimal sensor structure tailored to the specific application scenario.

Based on the urgent demand for ultrahigh-temperature vibration sensors, this paper primarily reviews the current research progress on high-temperature piezoelectric materials and high-temperature piezoelectric vibration sensors, summarizes the structures, advantages and disadvantages, and application scenarios of different types of high-temperature piezoelectric vibration sensors, explores the current problems and future development trends of high-temperature piezoelectric vibration sensors, and provides ideas for developing the next generation of ultrahigh-temperature vibration sensors for extreme environmental applications, which is expected to promote the further development of high-temperature piezoelectric vibration sensing technology.
14

Abdul Rahiman, M., Nishanth Jain, Arun K. Dubey, and Manoj Kumar G. "Code Aware Dynamic Source Routing for Distributed Sensor Network." International Journal of Network Security & Its Applications (IJNSA) 5, no. 2 (2013): 117–24. https://doi.org/10.5281/zenodo.3991330.

Abstract:
Sensor networks facilitate the monitoring and control of physical environments. These wireless networks consist of dense collections of sensors capable of collecting and disseminating data. They have applications in a variety of fields, such as military purposes and environment monitoring. A typical deployment of a sensor network assumes a central processing station or a gateway to which all other nodes route their data using dynamic source routing (DSR). This causes congestion at the central station and thus reduces the efficiency of the network. In this work we propose a better dynamic source routing technique using network coding to reduce the total number of transmissions in sensor networks, resulting in better efficiency.
15

Macenski, Steve, David Tsai, and Max Feinberg. "Spatio-temporal voxel layer: A view on robot perception for the dynamic world." International Journal of Advanced Robotic Systems 17, no. 2 (2020): 172988142091053. http://dx.doi.org/10.1177/1729881420910530.

Abstract:
The spatio-temporal voxel grid is an actively maintained open-source project providing an improved three-dimensional environmental representation that has been garnering increased adoption in large, dynamic, and complex environments. We provide a voxel grid and the Costmap 2-D layer plug-in, Spatio-Temporal Voxel Layer, powered by a real-time sparse occupancy grid with constant-time access to voxels that does not scale with the environment's size. We replace ray-casting with a new clearing technique we dub frustum acceleration, which does not assume a static environment and, in practice, represents moving environments better. Our method operates at nearly 400% less CPU load on average while processing 9 QVGA resolution depth cameras as compared to the voxel layer. This technique also supports sensors such as three-dimensional laser scanners, radars, and additional modern sensors that were previously unsupported in the available ROS Navigation framework, which has become a staple in the roboticist's toolbox. These sensors are becoming more widely used in robotics as sensor prices are driven down and mobile compute capabilities improve. The Spatio-Temporal Voxel Layer was developed in the open with community feedback over its development life cycle and continues to have additional features and capabilities added by the community. As of February 2019, the Spatio-Temporal Voxel Layer is being used on over 600 robots worldwide in warehouses, factories, hospitals, hotels, stores, and libraries. The open-source software can be viewed and installed on its GitHub page at https://github.com/SteveMacenski/spatio_temporal_voxel_layer.
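The core idea — a sparse voxel grid with constant-time access and time-based clearing instead of ray-casting — can be approximated with a hash map keyed by voxel index; the toy Python class below is a simplification for illustration, not the project's C++ implementation.

```python
import time

class SparseTemporalVoxelGrid:
    """Toy sparse voxel grid: each occupied voxel stores the time it was last seen,
    and voxels older than `decay_s` are dropped instead of being ray-cast clear."""

    def __init__(self, resolution=0.1, decay_s=10.0):
        self.resolution = resolution
        self.decay_s = decay_s
        self.voxels = {}                       # (ix, iy, iz) -> last-observed timestamp

    def _key(self, point):
        return tuple(int(c // self.resolution) for c in point)

    def mark(self, points, stamp=None):
        stamp = time.time() if stamp is None else stamp
        for p in points:
            self.voxels[self._key(p)] = stamp  # constant-time insert/update

    def decay(self, now=None):
        now = time.time() if now is None else now
        self.voxels = {k: t for k, t in self.voxels.items() if now - t < self.decay_s}

    def occupied(self, point):
        return self._key(point) in self.voxels  # constant-time lookup

grid = SparseTemporalVoxelGrid()
grid.mark([(1.03, 0.57, 0.22)])
print(grid.occupied((1.08, 0.51, 0.26)))   # True: same voxel
grid.decay(now=time.time() + 60)           # everything observed long ago is cleared
print(grid.occupied((1.08, 0.51, 0.26)))   # False
```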
16

Gaci, Omar, Hervé Mathieu, Jean-Pierre Deutsch, and Laurent Gomez. "Dynamic Risk Assessment by Communicating Objects in Supply Chain of Chemicals." International Journal of Applied Logistics 4, no. 2 (2013): 34–45. http://dx.doi.org/10.4018/jal.2013040103.

Abstract:
In this paper, a wireless sensor network is deployed to improve the security of goods, the environment, and persons along a supply chain handling chemicals in the European Union. Pallets are equipped with an RFID tag and a set of sensors that monitor the state of the environment in real time. By defining and monitoring constraints that the pallet environments must satisfy, a real-time risk assessment is proposed. The sensors then report accident risks to centralized software when unusual values occur. The supply chain actors responsible for the goods are thus contacted, and in parallel emergency services are contacted to plan and organize their interventions.
17

Li, Wanjing, Andeng Liu, Yimeng Wang, et al. "Implantable and Degradable Wireless Passive Protein-Based Tactile Sensor for Intracranial Dynamic Pressure Detection." Electronics 12, no. 11 (2023): 2466. http://dx.doi.org/10.3390/electronics12112466.

Abstract:
Implantable sensors normally require devices with excellent biocompatibility and flexibility as well as wireless communication. Silk fibroin (SF) is an ideal material for implantable electronic devices due to its natural biodegradability and biocompatibility. In this work, we prepared SF protein materials with different force/chemical properties through mesoscopic regulation, and realized full protein replacement from substrate to dielectric elastomer for implantable sensors, so as to achieve controlled complete degradation. In wireless tests simulating intracranial pressure, the SF-based all-protein sensor achieved a sensitivity up to 4.44 MHz/mmHg in the pressure range of 0–20 mmHg. In addition, the sensor is insensitive to temperature changes and tissue environments, and can work stably in simulated body fluids for a long time. This work provides a wireless passive, all-protein material solution for implantable pressure sensors.
18

Costa Antunes, Paulo, João Miguel Dias, Humberto Varum, and Paulo André. "Dynamic structural health monitoring of a civil engineering structure with a POF accelerometer." Sensor Review 34, no. 1 (2014): 36–41. http://dx.doi.org/10.1108/sr-04-2013-656.

Abstract:
Purpose – In this work, the paper aims to demonstrate the feasibility of plastic optical fiber (POF) based accelerometers for the structural health monitoring (SHM) of civil engineering structures based on measurements of their dynamic response, namely to estimate natural frequencies. These sensors use POFs, combining the advantages of optical technology with the robustness of this particular kind of fiber. The POF sensor output is directly compared with the signal from an electrical sensor, demonstrating the potential use of such sensors in structural monitoring applications. Design/methodology/approach – Within this work, the paper demonstrates the feasibility of using a low-cost acceleration system based on a POF accelerometer for the dynamic monitoring of a civil engineering structure, aiming at the evaluation of its natural frequency, which is a primary parameter to be used in SHM methods and in the calibration of numerical models. Findings – A low-cost POF-based accelerometer was used in the characterization of a civil engineering structural component, located in a building on the University of Aveiro campus, and was used to estimate its natural frequency with a relative error of 0.36 percent compared to the value estimated with a calibrated electronic sensor. Originality/value – Optical fiber sensors take advantage of the fibers' properties, such as immunity to electromagnetic interference and electrical isolation. They are very attractive for use in hostile environments, like submerged environments or flammable atmospheres where electrical currents might pose a hazard. The advantages of POF itself should also be considered, like resistance to harsh environments, robustness, flexibility, low-cost interrogation units, and high numerical aperture (lower cost components). The paper demonstrates the feasibility of using a low-cost acceleration system based on a POF accelerometer for the dynamic monitoring of a civil engineering structure, aiming at the evaluation of its natural frequency.
19

Rwigema, James, Hyo-Rim Choi, and TaeYong Kim. "A Differential Evolution Approach to Optimize Weights of Dynamic Time Warping for Multi-Sensor Based Gesture Recognition." Sensors 19, no. 5 (2019): 1007. http://dx.doi.org/10.3390/s19051007.

Abstract:
In this research, we present a differential evolution approach to optimize the weights of dynamic time warping for multi-sensor-based gesture recognition. Mainly, we aimed to develop a robust gesture recognition method that can be used in various environments. Both a wearable inertial sensor and a depth camera (Kinect Sensor) were used as heterogeneous sensors to verify and collect the data. The proposed approach was used for the calculation of optimal weight values and different characteristic features of heterogeneous sensor data, while having different effects during gesture recognition. In this research, we studied 27 different actions to analyze the data. As finding the optimal value of the data from numerous sensors became more complex, a differential evolution approach was used during the fusion and optimization of the data. To verify the performance accuracy of the presented method in this study, the University of Texas at Dallas Multimodal Human Action Dataset (UTD-MHAD) from previous research was used. However, the average recognition rates presented by previous research using respective methods were still low, due to the complexity in the calculation of the optimal values of the acquired data from sensors, as well as the installation environment. Our contribution was based on a method that enabled us to adjust the number of depth cameras and combine these data with inertial sensors (multi-sensors in this study). We applied a differential evolution approach to calculate the optimal values of the added weights. The proposed method achieved an accuracy 10% higher than the previous research results using the same database, indicating a much improved accuracy rate of motion recognition.
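To make the weighting idea concrete, the sketch below computes a feature-weighted DTW distance and lets SciPy's differential evolution choose weights that separate two toy gesture classes; the data and objective are hypothetical and far simpler than the UTD-MHAD pipeline.

```python
import numpy as np
from scipy.optimize import differential_evolution

def weighted_dtw(a, b, w):
    """DTW distance between multivariate sequences a, b with per-feature weights w."""
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.sqrt(np.sum(w * (a[i - 1] - b[j - 1]) ** 2))
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]

rng = np.random.default_rng(0)
# Toy gestures: feature 0 carries the motion, feature 1 is pure sensor noise
steps = np.linspace(0, 3, 15)
gesture_a = [np.column_stack([np.sin(steps), rng.normal(size=15)]) for _ in range(2)]
gesture_b = [np.column_stack([np.cos(steps), rng.normal(size=15)]) for _ in range(2)]

def objective(w):
    # Small within-class distance relative to between-class distance is good
    within = np.mean([weighted_dtw(x, y, w) for x in gesture_a for y in gesture_a])
    between = np.mean([weighted_dtw(x, y, w) for x in gesture_a for y in gesture_b])
    return within / (between + 1e-9)

result = differential_evolution(objective, bounds=[(0.05, 1.0), (0.05, 1.0)],
                                maxiter=10, seed=0)
print("optimized feature weights:", result.x)   # the informative feature is favoured
```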
20

Li, Zhiling, Gao Wang, Jianping Yin, et al. "Development and Performance Analysis of an Atomic Layer Thermopile Sensor for Composite Heat Flux Testing in an Explosive Environment." Electronics 12, no. 17 (2023): 3582. http://dx.doi.org/10.3390/electronics12173582.

Abstract:
Traditional contact heat flux sensors suffer from a lack of dynamic performance, and existing non-contact optical heat measurement equipment fails to detect convective heat transfer effectively. This limitation precludes the effective testing of composite heat flux in explosive fields. This study introduces an ultra-responsive atomic layer thermopile (ALTP) heat flux sensor, developed and employed for the first time, to evaluate the transient heat flux associated with thermobaric explosions. Measurements reveal that the ALTP sensor’s temporal resolution surpasses that of the thermal resistance thin film heat flux sensor (TFHF), attaining a spectral response time of 10 μs under pulsed laser irradiation. Beyond these radiation-based tests, the present work also conducted novel simulation analyses of high-temperature jet impacts using COMSOL software. Static simulation discovered that fluid velocity significantly influences ALTP’s sensitivity, resulting in an error of 71%. Conversely, dynamic simulation demonstrated that an increase in fluid velocity reduces the ALTP’s time constant, whereas other factors such as fluid temperature exert minimal impact on its dynamic characteristics. This confirms that the simulation model compensates for the cost and accuracy deficiencies of convection heating tests. It also provides a new way to analyze the error of explosive heat flux measurement caused by sensitivity fluctuation and insufficient dynamic performance. In thermobaric explosive trials, the maximum heat fluxes recorded were 202 kW/m2 in semi-enclosed environments and 526 kW/m2 in open environments. A distinctive double-wave phenomenon was evident in the test curve. By a fast-response thermocouple, the study was able to differentiate between radiation and convective heat flux in the explosion field. The findings substantiate that the ALTP sensor amalgamates the benefits of optical thermal measurement tools with those of traditional contact heat flux sensors, thereby facilitating composite heat flux measurements in the challenging conditions of an explosive field.
21

Godaliyadda, G. M. Dilshan P., Vijay Pothukuchi, and JuneChul Roh. "Multi-Sensor Fusion in Dynamic Environment using Evidential Grid Mapping." Electronic Imaging 2020, no. 16 (2020): 255–1. http://dx.doi.org/10.2352/issn.2470-1173.2020.16.avm-203.

Abstract:
Grid mapping is widely used to represent the environment surrounding a car or a robot for autonomous navigation. This paper describes an algorithm for evidential occupancy grid (OG) mapping that fuses measurements from different sensors, based on the Dempster-Shafer theory, and is intended for scenes with stationary and moving (dynamic) objects. Conventional OG mapping algorithms tend to struggle in the presence of moving objects because they do not explicitly distinguish between moving and stationary objects. In contrast, evidential OG mapping allows for dynamic and ambiguous states (e.g., a LIDAR measurement cannot differentiate between moving and stationary objects) that are more aligned with measurements made by sensors. In this paper, we present a framework for fusing measurements as they are received from disparate sensors (e.g., radar, camera and LIDAR) using evidential grid mapping. With this approach, we can form a live map of the environment, and also alleviate the problem of having to synchronize sensors in time. We also designed a new inverse sensor model for radar that allows us to extract more information from object-level measurements, by incorporating knowledge of the sensor's characteristics. We have implemented our algorithm in the OpenVX framework to enable seamless integration into embedded platforms. Test results show compelling performance especially in the presence of moving objects.
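Dempster's rule of combination is the arithmetic core of evidential grid mapping; the per-cell sketch below applies the generic rule over the basic frame {free, occupied} (the paper's frame also covers dynamic states, and the sensor mass values here are made up).

```python
def combine(m1, m2):
    """Dempster's rule for mass functions over {free (F), occupied (O), unknown (FO)}."""
    f1, o1, u1 = m1["F"], m1["O"], m1["FO"]
    f2, o2, u2 = m2["F"], m2["O"], m2["FO"]
    conflict = f1 * o2 + o1 * f2              # mass assigned to contradictory hypotheses
    k = 1.0 - conflict
    return {
        "F":  (f1 * f2 + f1 * u2 + u1 * f2) / k,
        "O":  (o1 * o2 + o1 * u2 + u1 * o2) / k,
        "FO": (u1 * u2) / k,
    }

lidar = {"F": 0.1, "O": 0.6, "FO": 0.3}   # LiDAR return: probably occupied
radar = {"F": 0.0, "O": 0.5, "FO": 0.5}   # radar: occupied or unknown
print(combine(lidar, radar))              # fused cell masses still sum to 1
```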
22

Philippe, Julien, Maria De Paolis, Dominique Henry, et al. "In-Situ Wireless Pressure Measurement Using Zero-Power Packaged Microwave Sensors." Sensors 19, no. 6 (2019): 1263. http://dx.doi.org/10.3390/s19061263.

Abstract:
This paper reports the indoor wireless measurement of pressure from zero-power (or passive) microwave (24 GHz) sensors. The sensors are packaged and allow the remote measurement of overpressure up to 2.1 bars. Their design, fabrication process and packaging are detailed. From the measurement of sensor scattering parameters, the outstanding sensitivity of 995 MHz/bar between 0.8 and 2.1 bars was achieved with the full-scale measurement range of 1.33 GHz. Moreover, the 3D radar imagery technique was applied for the remote interrogation of these sensors in electromagnetic reverberant environments. The full-scale dynamic range of 4.9 dB and the sensitivity of 4.9 dB/bar between 0.7 and 1.7 bars were achieved with radar detection in a highly reflective environment. These measurement results demonstrate for the first time the ability of the radar imagery technique to interrogate fully passive pressure sensors in electromagnetic reverberant environments.
23

Li, Qiang, Ruidong Liu, Yalei Liu, and Zhenzhong Wei. "Research on Registration Methods for Coupled Errors in Maneuvering Platforms." Entropy 27, no. 6 (2025): 607. https://doi.org/10.3390/e27060607.

Abstract:
The performance limitations of single-sensor systems in target tracking have led to the widespread adoption of multi-sensor fusion, which improves accuracy through information complementarity and redundancy. However, on mobile platforms, dynamic changes in sensor attitude and position introduce coupled measurement and attitude errors, making accurate sensor registration particularly challenging. Most existing methods either treat these errors independently or rely on simplified assumptions, which limit their effectiveness in dynamic environments. To address this, we propose a novel joint error estimation and registration method based on a pseudo-Kalman filter (PKF). The PKF constructs pseudo-measurements by subtracting outputs from multiple sensors, projecting them into a bias space that is independent of the target’s state. A decoupling mechanism is introduced to distinguish between measurement and attitude error components, enabling accurate joint estimation in real time. In the shipborne environment, simulation experiments on pitch, yaw, and roll motions were conducted using two sensors. This method was compared with least squares (LS), maximum likelihood (ML), and the standard method based on PKF. The results show that the method based on PKF has a lower root mean square error (RMSE), a faster convergence speed, and better estimation accuracy and robustness. The proposed approach provides a practical and scalable solution for sensor registration in dynamic environments, particularly in maritime or aerial applications where coupled errors are prevalent.
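The pseudo-measurement idea — differencing two sensors' outputs so the target state cancels and only the relative bias remains observable — can be shown with a scalar Kalman filter; the noise values below are hypothetical and the model is far simpler than the coupled attitude/measurement formulation in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
true_bias = 2.0                                       # constant relative bias A vs. B
target = np.cumsum(rng.normal(0, 1, 200))             # unknown, arbitrary target motion

za = target + rng.normal(0, 0.5, 200)                 # sensor A
zb = target + true_bias + rng.normal(0, 0.5, 200)     # sensor B carries the bias
pseudo = zb - za                  # target state cancels; only bias + noise remains

# Scalar Kalman filter estimating a constant bias from the pseudo-measurements
x, p = 0.0, 10.0                  # state estimate and variance
q, r = 1e-6, 0.5**2 + 0.5**2      # tiny process noise; summed measurement noise
for z in pseudo:
    p += q                        # predict (bias assumed constant)
    k = p / (p + r)               # Kalman gain
    x += k * (z - x)              # update with the pseudo-measurement
    p *= (1 - k)
print(f"estimated bias {x:.2f} (true {true_bias})")
```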
24

Neumann, Ulrich, Suya You, Jinhui Hu, Bolan Jiang, and Ismail Oner Sebe. "Visualizing Reality in an Augmented Virtual Environment." Presence: Teleoperators and Virtual Environments 13, no. 2 (2004): 222–33. http://dx.doi.org/10.1162/1054746041382366.

Abstract:
An Augmented Virtual Environment (AVE) fuses dynamic imagery with 3D models. An AVE provides a unique approach to visualizing spatial relationships and temporal events that occur in real-world environments. A geometric scene model provides a 3D substrate for the visualization of multiple image sequences gathered by fixed or moving image sensors. The resulting visualization is that of a world-in-miniature that depicts the corresponding real-world scene and dynamic activities. This paper describes the core elements of an AVE system, including static and dynamic model construction, sensor tracking, and image projection for 3D visualization.
25

Tan, Lihua, Yingjie Zhou, Hu Kong, Zhiliang Yue, Qilong Wang, and Lei Zhou. "Dynamic Monitoring of Steel Beam Stress Based on PMN-PT Sensor." Buildings 14, no. 9 (2024): 2831. http://dx.doi.org/10.3390/buildings14092831.

Abstract:
Steel beams are widely used load-bearing components in bridge construction. They are prone to internal stress concentration under low-frequency vibrations caused by natural disasters and adverse loads, leading to microcracks and fractures, thereby accelerating the instability of steel components. Therefore, dynamic stress monitoring of steel beams under low-frequency vibrations is crucial to ensure structural safety. This study proposed an external stress sensor based on PMN-PT material. The sensor has the advantages of high sensitivity, comprehensive frequency response, and fast response speed. To verify the accuracy and feasibility of the sensor in actual engineering, the LETRY universal testing machine and drop hammer impact system were used to carry out stress monitoring tests and finite element simulations on scaled I-shaped steel beams with PMN-PT sensors attached. The results show that: (1) The PMN-PT sensor has exceptionally high sensitivity, maintained at 1.716~1.726 V/MPa in the frequency range of 0~1000 Hz. The sensor performance is much higher than that of PVDF sensors with the same adhesive layer thickness. (2) Under low-frequency random vibration, the sensor’s time domain and frequency domain output voltages are always consistent with the waveform of the applied load, which can reflect the changes in the structural stress state in real time. (3) Under the impact of a drop hammer, the sensor signal response delay is only 0.001 s, and the sensitivity linear fitting degree is above 0.9. (4) The simulation and experimental results are highly consistent, confirming the superior performance of the PMN-PT sensor, which can be effectively used for stress monitoring of steel structures in low-frequency vibration environments.
26

Alotaibi, Theyab, Kamal Jambi, Maher Khemakhem, Fathy Eassa, and Farid Bourennani. "Deep Learning-Based Autonomous Navigation of 5G Drones in Unknown and Dynamic Environments." Drones 9, no. 4 (2025): 249. https://doi.org/10.3390/drones9040249.

Abstract:
The flexibility and rapid mobility of drones make them ideal for Internet of Things (IoT) applications, such as traffic control and data collection. Therefore, the autonomous navigation of 5G drones in unknown and dynamic environments has become a major research topic. Current methods rely on sensors to perceive the environment to plan the path from the start point to the target and to avoid obstacles; however, their limited field of view prevents them from moving in all directions and detecting and avoiding obstacles. This article proposes the deep learning (DL)-based autonomous navigation of 5G drones. This proposal uses sensors capable of perceiving the entire environment surrounding the drone and fuses sensor data to detect and avoid obstacles, plan a path, and move in all directions. We trained a convolution neural network (CNN) using a novel dataset we created for drone ascent and passing over obstacles, which achieved 99% accuracy. We also trained artificial neural networks (ANNs) to control drones and achieved a 100% accuracy. Experiments in the Gazebo environment demonstrated the efficiency of sensor fusion, and our proposal was the only one that perceived the entire environment, particularly above the drone. Furthermore, it excelled at detecting U-shaped obstacles and enabling drones to emerge from them.
27

Hofman, Jelle, Borislav Lazarov, Christophe Stroobants, Evelyne Elst, Inge Smets, and Martine Van Poppel. "Portable Sensors for Dynamic Exposure Assessments in Urban Environments: State of the Science." Sensors 24, no. 17 (2024): 5653. http://dx.doi.org/10.3390/s24175653.

Abstract:
This study presents a fit-for-purpose lab and field evaluation of commercially available portable sensor systems for PM, NO2, and/or BC. The main aim of the study is to identify portable sensor systems that are capable of reliably quantifying dynamic exposure gradients in urban environments. After an initial literature and market study resulting in 39 sensor systems, 10 sensor systems were ultimately purchased and benchmarked under laboratory and real-word conditions. We evaluated the comparability to reference analyzers, sensor precision, and sensitivity towards environmental confounders (temperature, humidity, and O3). Moreover, we evaluated if the sensor accuracy can be improved by applying a lab or field calibration. Because the targeted application of the sensor systems under evaluation is mobile monitoring, we conducted a mobile field test in an urban environment to evaluate the GPS accuracy and potential impacts from vibrations on the resulting sensor signals. Results of the considered sensor systems indicate that out-of-the-box performance is relatively good for PM (R² = 0.68–0.9, Uexp = 16–66%, BSU = 0.1–0.7 µg/m³) and BC (R² = 0.82–0.83), but maturity of the tested NO2 sensors is still low (R² = 0.38–0.55, Uexp = 111–614%) and additional efforts are needed in terms of signal noise and calibration, as proven by the performance after multilinear calibration (R² = 0.75–0.83, Uexp = 37–44%). The horizontal accuracy of the built-in GPS was generally good, achieving <10 m accuracy for all sensor systems. More accurate and dynamic exposure assessments in contemporary urban environments are crucial to study real-world exposure of individuals and the resulting impacts on potential health endpoints. A greater availability of mobile monitoring systems capable of quantifying urban pollutant gradients will further boost this line of research.
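The multilinear calibration mentioned above typically amounts to an ordinary least-squares fit of the reference value against the raw sensor signal plus environmental covariates; the sketch below uses synthetic data, not the study's sensors or coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
raw = rng.uniform(0, 50, n)              # raw NO2 sensor signal (arbitrary units)
temp = rng.uniform(5, 35, n)             # °C
rh = rng.uniform(30, 90, n)              # % relative humidity
# Synthetic "truth": the reference analyzer value depends on all three
reference = 0.8 * raw - 0.4 * temp + 0.1 * rh + rng.normal(0, 2, n)

# Multilinear calibration: reference ≈ b0 + b1*raw + b2*temp + b3*rh
X = np.column_stack([np.ones(n), raw, temp, rh])
coef, *_ = np.linalg.lstsq(X, reference, rcond=None)
calibrated = X @ coef

r2 = 1 - np.sum((reference - calibrated) ** 2) / np.sum((reference - reference.mean()) ** 2)
print("fitted coefficients:", np.round(coef, 2), "R2 =", round(r2, 3))
```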
28

Song, Chengqun, Bo Zeng, Jun Cheng, Fuxiang Wu, and Fusheng Hao. "PSMD-SLAM: Panoptic Segmentation-Aided Multi-Sensor Fusion Simultaneous Localization and Mapping in Dynamic Scenes." Applied Sciences 14, no. 9 (2024): 3843. http://dx.doi.org/10.3390/app14093843.

Abstract:
Multi-sensor fusion is pivotal in augmenting the robustness and precision of simultaneous localization and mapping (SLAM) systems. The LiDAR–visual–inertial approach has been empirically shown to adeptly amalgamate the benefits of these sensors for SLAM across various scenarios. Furthermore, methods of panoptic segmentation have been introduced to deliver pixel-level semantic and instance segmentation data in a single instance. This paper delves deeper into these methodologies, introducing PSMD-SLAM, a novel panoptic segmentation assisted multi-sensor fusion SLAM approach tailored for dynamic environments. Our approach employs both probability propagation-based and PCA-based clustering techniques, supplemented by panoptic segmentation. This is utilized for dynamic object detection and the removal of visual and LiDAR data, respectively. Furthermore, we introduce a module designed for the robust real-time estimation of the 6D pose of dynamic objects. We test our approach on a publicly available dataset and show that PSMD-SLAM outperforms other SLAM algorithms in terms of accuracy and robustness, especially in dynamic environments.
29

Chen, Chin-Ling, Tzay-Farn Shih, Yu-Ting Tsai, and De-Kui Li. "A Bilinear Pairing-Based Dynamic Key Management and Authentication for Wireless Sensor Networks." Journal of Sensors 2015 (2015): 1–14. http://dx.doi.org/10.1155/2015/534657.

Abstract:
In recent years, wireless sensor networks have been used in a variety of environments; a wireless network infrastructure, established to communicate and exchange information in a monitoring area, has also been applied in different environments. However, for sensitive applications, security is the paramount issue. In this paper, we propose using bilinear pairing to design dynamic key management and authentication scheme of the hierarchical sensor network. We use the dynamic key management and the pairing-based cryptography (PBC) to establish the session key and the hash message authentication code (HMAC) to support the mutual authentication between the sensors and the base station. In addition, we also embed the capability of the Global Positioning System (GPS) to cluster nodes to find the best path of the sensor network. The proposed scheme can also provide the requisite security of the dynamic key management, mutual authentication, and session key protection. Our scheme can defend against impersonation attack, replay attack, wormhole attack, and message manipulation attack.
30

Aboutaleb, Ahmed, Amr S. El-Wakeel, Haidy Elghamrawy, and Aboelmagd Noureldin. "LiDAR/RISS/GNSS Dynamic Integration for Land Vehicle Robust Positioning in Challenging GNSS Environments." Remote Sensing 12, no. 14 (2020): 2323. http://dx.doi.org/10.3390/rs12142323.

Abstract:
The autonomous vehicles (AV) industry has a growing demand for reliable, continuous, and accurate positioning information to ensure safe traffic and for other various applications. Global navigation satellite system (GNSS) receivers have been widely used for this purpose. However, GNSS positioning accuracy deteriorates drastically in challenging environments such as urban environments and downtown cores. Therefore, inertial sensors are widely deployed inside the land vehicle for various purposes, including the integration with GNSS receivers to provide positioning information that can bridge potential GNSS failures. However, in dense urban areas and downtown cores where GNSS receivers may incur prolonged outages, the integrated positioning solution may become prone to severe drift resulting in substantial position errors. Therefore, it is becoming necessary to include other sensors and systems that can be available in future land vehicles to be integrated with both the GNSS receivers and inertial sensors to enhance the positioning performance in such challenging environments. This work aims to design and examine the performance of a multi-sensor system that fuses the GNSS receiver data with not only the three-dimensional reduced inertial sensor system (3D-RISS), but also with the three-dimensional point cloud of onboard light detection and ranging (LiDAR) system. In this paper, a comprehensive LiDAR processing and odometry method is developed to provide a continuous and reliable positioning solution. In addition, a multi-sensor Extended Kalman filtering (EKF)-based fusion is developed to integrate the LiDAR positioning information with both GNSS and 3D-RISS and utilize the LiDAR updates to limit the drift in the positioning solution, even in challenging or ultimately denied GNSS environment. The performance of the proposed positioning solution is examined using several road test trajectories in both Kingston and Toronto downtown areas involving different vehicle dynamics and driving scenarios. The proposed solution provided a performance improvement over the standalone inertial solution by 64%. Over a GNSS outage of 10 min and 2 km distance traveled, our solution achieved position errors less than 2% of the distance travelled.
31

Shi, Yuan, Ang Li, T. K. Satish Kumar, and Craig A. Knoblock. "Building Survivable Software Systems by Automatically Adapting to Sensor Changes." Applied Sciences 11, no. 11 (2021): 4808. http://dx.doi.org/10.3390/app11114808.

Abstract:
Many software systems run on long-lifespan platforms that operate in diverse and dynamic environments. If these software systems could automatically adapt to hardware changes, it would significantly reduce the maintenance cost and enable rapid upgrade. In this paper, we study the problem of how to automatically adapt to sensor changes, as an important step towards building such long-lived, survivable software systems. We address challenges in sensor adaptation when a set of sensors are replaced by new sensors. Our approach reconstructs sensor values of replaced sensors by preserving distributions of sensor values before and after the sensor change, thereby not warranting a change in higher-layer software. Compared to existing work, our approach has the following advantages: (a) ability to exploit new sensors without requiring an overlapping period of time between the new sensors and the old ones; (b) ability to provide an estimation of adaptation quality; and (c) ability to scale to a large number of sensors. Experiments on weather data and Unmanned Undersea Vehicle (UUV) data demonstrate that our approach can automatically adapt to sensor changes with 5.7% higher accuracy compared to baseline methods.
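One simple way to preserve the distribution of sensor values across a sensor change, in the spirit described above, is quantile mapping from the new sensor's value distribution onto the old sensor's; the sketch below is a generic illustration with synthetic readings, not the authors' algorithm.

```python
import numpy as np

def quantile_map(new_values, old_reference, new_reference):
    """Map readings of a new sensor onto the old sensor's scale by matching quantiles."""
    ranks = np.searchsorted(np.sort(new_reference), new_values) / len(new_reference)
    return np.quantile(old_reference, np.clip(ranks, 0, 1))

rng = np.random.default_rng(0)
old_history = rng.normal(20.0, 3.0, 5000)   # historical values of the replaced sensor
new_history = rng.normal(68.0, 5.4, 5000)   # the new sensor reports on a different scale
new_reading = np.array([60.0, 68.0, 75.0])

# The new readings re-expressed on the old sensor's scale, so higher-layer
# software that was tuned to the old distribution needs no change.
print(quantile_map(new_reading, old_history, new_history))
```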
32

Dallo, Federico, Daniele Zannoni, Jacopo Gabrieli, et al. "Calibration and assessment of electrochemical low-cost sensors in remote alpine harsh environments." Atmospheric Measurement Techniques 14, no. 9 (2021): 6005–21. http://dx.doi.org/10.5194/amt-14-6005-2021.

Abstract:
This work presents results from an original open-source low-cost sensor (LCS) system developed to measure tropospheric O3 in a remote high altitude alpine site. Our study was conducted at the Col Margherita Observatory (2543 m above sea level), in the Italian Eastern Alps. The sensor system mounts three commercial low-cost O3/NO2 sensors that have been calibrated before field deployment against a laboratory standard (Thermo Scientific; 49i-PS), calibrated against the standard reference photometer no. 15 calibration scale of the World Meteorological Organization (WMO). Intra- and intercomparison between the sensors and a reference instrument (Thermo Scientific; 49c) have been conducted for 7 months from May to December 2018. The sensors required an individual calibration, both in the laboratory and in the field. The sensors' dependence on the environmental meteorological variables has been considered and discussed. We showed that it is possible to reduce the bias of one LCS by using the average coefficient values of another LCS working in tandem, suggesting a way forward for the development of remote field calibration techniques. We showed that it is possible to reconstruct the environmental ozone concentration during the loss of reference instrument data in situations caused by power outages. The evaluation of the analytical performances of this sensing system provides a limit of detection (LOD) <5 ppb (parts per billion), limit of quantification (LOQ) <17 ppb, linear dynamic range (LDR) up to 250 ppb, intra-Pearson correlation coefficient (PCC) up to 0.96, inter-PCC >0.8, bias >3.5 ppb and ±8.5 at 95 % confidence. This first implementation of an LCS system in an alpine remote location demonstrated how to obtain valuable data from a low-cost instrument in a remote environment, opening new perspectives for the adoption of low-cost sensor networks in atmospheric sciences.
APA, Harvard, Vancouver, ISO, and other styles
33

Han, Joong-hee, and Chi-ho Park. "Performance evaluation on GNSS, wheel speed sensor, yaw rate sensor, and gravity sensor integrated positioning algorithm for automotive navigation system." E3S Web of Conferences 94 (2019): 02003. http://dx.doi.org/10.1051/e3sconf/20199402003.

Full text
Abstract:
The Global Navigation Satellite System (GNSS) positioning technique is widely used for automotive navigation systems since it can provide stable and accurate position and velocity in most road environments at an affordable price. However, GNSS positioning performance degrades in areas where signals are blocked by buildings and tunnels. To overcome this problem, the GNSS positioning technique should be integrated with dead reckoning (DR) sensors such as accelerometers, gyroscopes, and odometers. Recently, most passenger cars have been equipped with Advanced Driver Assistance Systems (ADAS) based on numerous sensors to improve safety and convenience in driving. Among the sensors used for ADAS, vehicle dynamics sensors such as the wheel speed sensor (WSS), yaw rate sensor (YRS), and gravity sensor (GS) can be used for a DR algorithm since these sensors measure the vehicle's motion. Therefore, this paper evaluates a vehicle positioning algorithm that integrates GNSS with three-dimensional dead reckoning based on WSS, YRS, and GS measurements. The positioning algorithm is implemented through a loosely-coupled extended Kalman filter. Performance was evaluated through tests carried out on a real driving trajectory covering various GNSS signal reception environments. The results show that the proposed algorithm can be an alternative solution that compensates for the limitations of GNSS positioning without the use of a low-cost inertial measurement unit.
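The following is a minimal sketch of a loosely-coupled extended Kalman filter in the spirit of the abstract: wheel speed and yaw rate drive a 2D dead-reckoning prediction, and GNSS position fixes provide the measurement update. The state layout, noise values, and measurements are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# State: [x, y, heading]; 2D dead reckoning driven by wheel speed and yaw rate,
# corrected by GNSS position fixes in a loosely-coupled EKF (illustrative values only).
x = np.zeros(3)
P = np.eye(3) * 10.0
Q = np.diag([0.1, 0.1, 0.01])      # process noise (assumed)
R = np.diag([2.0, 2.0])            # GNSS position noise, metres (assumed)
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])    # GNSS observes position only

def predict(x, P, speed, yaw_rate, dt):
    theta = x[2]
    x_new = x + np.array([speed * np.cos(theta) * dt,
                          speed * np.sin(theta) * dt,
                          yaw_rate * dt])
    F = np.array([[1.0, 0.0, -speed * np.sin(theta) * dt],
                  [0.0, 1.0,  speed * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def update(x, P, gnss_xy):
    y = gnss_xy - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P

x, P = predict(x, P, speed=10.0, yaw_rate=0.05, dt=0.1)   # WSS + YRS input
x, P = update(x, P, gnss_xy=np.array([1.1, 0.2]))         # GNSS fix when available
print(x)
```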
APA, Harvard, Vancouver, ISO, and other styles
34

Filipowska, Anna, Wojciech Filipowski, Paweł Raif, et al. "Machine Learning-Based Gesture Recognition Glove: Design and Implementation." Sensors 24, no. 18 (2024): 6157. http://dx.doi.org/10.3390/s24186157.

Full text
Abstract:
In the evolving field of human–computer interaction (HCI), gesture recognition has emerged as a critical focus, with sensor-equipped smart gloves playing one of the most important roles. Despite the significance of dynamic gesture recognition, most research on data gloves has concentrated on static gestures, with only a small percentage addressing dynamic gestures or both. This study explores the development of a low-cost smart glove prototype designed to capture and classify dynamic hand gestures for game control, presenting a prototype data glove equipped with five flex sensors, five force sensors, and one inertial measurement unit (IMU) sensor. To classify dynamic gestures, we developed a neural network-based classifier using a convolutional neural network (CNN) with three two-dimensional convolutional layers and rectified linear unit (ReLU) activation, which achieved an accuracy of 90%. The developed glove effectively captures dynamic gestures for game control, achieving high classification accuracy, precision, and recall, as evidenced by the confusion matrix and training metrics. Despite limitations in the number of gestures and participants, the solution offers a cost-effective and accurate approach to gesture recognition, with potential applications in VR/AR environments.
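Below is a hedged PyTorch sketch of a classifier with three two-dimensional convolutional layers and ReLU activation, as described in the abstract; the channel count, window length, number of gestures, and layer widths are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    """Illustrative 3-layer 2D CNN for dynamic gestures from glove sensor windows.
    Input shape (batch, 1, channels, timesteps); 11 channels = 5 flex + 5 force +
    1 IMU magnitude is an assumption, as are all layer widths."""
    def __init__(self, n_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 1)),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = GestureCNN()
dummy = torch.randn(4, 1, 11, 64)      # batch of 4 sensor windows
print(model(dummy).shape)              # torch.Size([4, 8])
```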
APA, Harvard, Vancouver, ISO, and other styles
35

Ryu, Seonghwan, Seoyeon Kim, Jiwoo Shin, Taesik Kim, and Jinman Jung. "A Fusion Sensor System for Efficient Road Surface Monitoring on UGV." Korean Institute of Smart Media 13, no. 3 (2024): 18–26. http://dx.doi.org/10.30693/smj.2024.13.3.18.

Full text
Abstract:
Road surface monitoring is essential for maintaining road safety through the management of risk factors such as rutting and cracks. Autonomous-driving-based UGVs with high-performance 2D laser sensors enable more precise measurements; however, the increased energy consumption of these sensors is constrained by limited battery capacity. In this paper, we propose a fusion sensor system for efficient surface monitoring with UGVs. The proposed system combines color information from cameras and depth information from line laser sensors to accurately detect surface displacement. Furthermore, a dynamic sampling algorithm controls the scanning frequency of the line laser sensors based on the detection status of monitoring targets from the camera sensors, reducing unnecessary energy consumption. A power consumption model of the fusion sensor system analyzes its energy efficiency considering various crack distributions and sensor characteristics in different mission environments. Performance analysis demonstrates that setting the power consumption of the line laser sensor in the active state to twice that of the saving state increases power consumption efficiency by 13.3% compared to fixed sampling under the condition of =10, =10.
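A toy sketch of the dynamic sampling idea follows: the line laser stays in a low-rate saving state and switches to a higher active rate only while the camera reports a candidate defect. The rates, power figures, and detection flags below are assumptions, not the paper's parameters.

```python
# Illustrative duty-cycling of a line laser scanner driven by camera detections.
SAVING_HZ, ACTIVE_HZ = 5.0, 10.0          # active state sampled at twice the saving rate (assumed)

def laser_rate(camera_detects_defect: bool) -> float:
    """Pick the scan rate for the next interval from the camera's detection status."""
    return ACTIVE_HZ if camera_detects_defect else SAVING_HZ

def energy_used(detections, p_saving=1.0, p_active=2.0, interval_s=1.0):
    """Accumulate laser energy for a sequence of per-interval camera detection flags."""
    return sum((p_active if d else p_saving) * interval_s for d in detections)

detections = [False, False, True, True, False, False, False, True]
print("laser rate now:", laser_rate(detections[-1]), "Hz")
print("energy used (arbitrary units):", energy_used(detections))
```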
APA, Harvard, Vancouver, ISO, and other styles
36

Sreeram Sivadasan. "Optimized Energy-Efficient Routing and Adaptive Coverage in Wireless Sensor Networks for Multi-Access Edge Environments." Communications on Applied Nonlinear Analysis 32, no. 3s (2024): 518–28. https://doi.org/10.52783/cana.v32.2689.

Full text
Abstract:
The growing demand for edge computing necessitates optimizing wireless sensor networks (WSNs) for efficient real-time data processing. Achieving energy efficiency and dynamic coverage in WSNs, particularly in multi-access edge environments, is a key challenge. This paper introduces a framework with three core algorithms to enhance network coverage and routing in such environments. The first algorithm uses Delaunay Triangulation to detect coverage gaps, identifying areas with insufficient sensor deployment. To address these gaps, the second algorithm applies a dynamic node placement strategy, where additional sensors are strategically positioned, optimized using clustering and refined Delaunay Triangulation to integrate new nodes. The third algorithm proposes an adaptive tree-based routing protocol using a Minimum Spanning Tree (MST) to conserve energy by rerouting data through energy-efficient paths. Together, these algorithms form a resilient WSN system that adapts to environmental changes, improving data reliability and system performance. Simulation results demonstrate significant improvements in network lifetime and efficiency within multi-access edge environments.
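To illustrate the first step described above, the sketch below uses scipy's Delaunay triangulation and flags triangles whose circumradius exceeds the sensing radius as candidate coverage gaps (a new node could then be placed near the circumcenter). The heuristic, radius, and deployment data are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np
from scipy.spatial import Delaunay

def coverage_gaps(sensor_xy, sensing_radius):
    """Flag Delaunay triangles whose circumradius exceeds the sensing radius;
    their circumcircles contain points no sensor covers (a common heuristic,
    used here only to illustrate the gap-detection step)."""
    tri = Delaunay(sensor_xy)
    gaps = []
    for simplex in tri.simplices:
        p, q, r = sensor_xy[simplex]
        a, b, c = np.linalg.norm(q - r), np.linalg.norm(p - r), np.linalg.norm(p - q)
        area = 0.5 * abs((q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0]))
        if area > 1e-12 and (a * b * c) / (4.0 * area) > sensing_radius:
            gaps.append(simplex)
    return gaps

rng = np.random.default_rng(0)
sensors = rng.uniform(0, 100, size=(30, 2))     # hypothetical deployment
print(len(coverage_gaps(sensors, sensing_radius=15.0)), "candidate gap triangles")
```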
APA, Harvard, Vancouver, ISO, and other styles
37

Sreeram Sivadasan. "Optimized Energy-Efficient Routing and Adaptive Coverage in Wireless Sensor Networks for Multi-Access Edge Environments." Communications on Applied Nonlinear Analysis 32, no. 2s (2024): 435–46. https://doi.org/10.52783/cana.v32.2433.

Full text
Abstract:
The growing demand for edge computing necessitates optimizing wireless sensor networks (WSNs) for efficient real-time data processing. Achieving energy efficiency and dynamic coverage in WSNs, particularly in multi-access edge environments, is a key challenge. This paper introduces a framework with three core algorithms to enhance network coverage and routing in such environments. The first algorithm uses Delaunay Triangulation to detect coverage gaps, identifying areas with insufficient sensor deployment. To address these gaps, the second algorithm applies a dynamic node placement strategy, where additional sensors are strategically positioned, optimized using clustering and refined Delaunay Triangulation to integrate new nodes. The third algorithm proposes an adaptive tree-based routing protocol using a Minimum Spanning Tree (MST) to conserve energy by rerouting data through energy-efficient paths. Together, these algorithms form a resilient WSN system that adapts to environmental changes, improving data reliability and system performance. Simulation results demonstrate significant improvements in network lifetime and efficiency within multi-access edge environments.
APA, Harvard, Vancouver, ISO, and other styles
38

Shi, Nianbo, Jian Yang, Zhixiang Cao, and Xiangliang Jin. "A Programmable Ambient Light Sensor with Dark Current Compensation and Wide Dynamic Range." Sensors 24, no. 11 (2024): 3396. http://dx.doi.org/10.3390/s24113396.

Full text
Abstract:
Ambient light sensors are becoming increasingly popular due to their effectiveness in extending the battery life of portable electronic devices. However, conventional ambient light sensors are large in area, have a small dynamic range, and do not account for the effects of dark current. To address these problems, a programmable ambient light sensor with dark current compensation and a wide dynamic range is proposed in this paper. The proposed ambient light sensor exhibits a low current consumption of only 7.7 µA in dark environments, and it operates across a wide voltage range (2–5 V) and temperature range (−40–80 °C). It senses ambient light and provides an output current proportional to the ambient light intensity, with built-in dark current compensation to effectively suppress dark-current effects. It provides a wide dynamic range over the entire operating temperature range with three selectable output-current gain modes. The proposed ambient light sensor was designed and fabricated using a 0.18 µm standard CMOS process, and the effective area of the chip is 663 µm × 652 µm. The effectiveness of the circuit was verified through testing, making it highly suitable for portable electronic products and fluorescent fiber-optic temperature sensors.
APA, Harvard, Vancouver, ISO, and other styles
39

Priyanka, Das. "Enhancing Localization Accuracy with Sensor Fusion Techniques in Unknown Environments." INTERNATIONAL JOURNAL OF INNOVATIVE RESEARCH AND CREATIVE TECHNOLOGY 8, no. 1 (2022): 1–5. https://doi.org/10.5281/zenodo.14250577.

Full text
Abstract:
The purpose of this paper is to discuss advanced approaches to sensor fusion, such as EKF SLAM, Graph-based SLAM, and visual-inertial SLAM (VISLAM), together with methods that improve fusion quality, such as Loop Closure Detection and Dynamic Weighting. It also incorporates necessary measures to enhance the performance of SLAM. The paper established that navigation in an unknown environment is a critical feature for autonomous systems, relying on accurate localization and mapping with no prior information. It further affirmed that multi-sensor fusion methods coordinate information acquired from different sensors such as LiDAR, cameras, and IMUs to improve localization quality and avoid dependence on a solitary sensor. These findings contribute to the development of more realistic and scalable SLAM systems for modelling and navigating challenging and dynamic environments. As technologies continue to evolve, advanced systems in robotics, self-driving cars, and similar domains may further improve the performance of SLAM in unknown environments.
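As a small illustration of the dynamic weighting idea mentioned in the abstract, the sketch below fuses position estimates from several sensors with inverse-variance weights, so the weights adapt as each sensor's reported uncertainty changes. The sensors, estimates, and variances are hypothetical, and this is only one common weighting rule.

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of independent position estimates.
    Weights adapt ('dynamic weighting') as each sensor's uncertainty changes."""
    inv_var = 1.0 / np.asarray(variances, dtype=float)
    w = inv_var / inv_var.sum()
    fused = (w[:, None] * np.asarray(estimates, dtype=float)).sum(axis=0)
    fused_var = 1.0 / inv_var.sum()
    return fused, fused_var, w

# Hypothetical 2D position estimates from LiDAR, camera, and IMU dead reckoning
estimates = [[10.2, 5.1], [10.6, 4.8], [9.5, 5.9]]
variances = [0.04, 0.25, 1.0]          # LiDAR most certain, IMU least
fused, fused_var, weights = fuse(estimates, variances)
print(fused, fused_var, weights)
```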
APA, Harvard, Vancouver, ISO, and other styles
40

Wang, Yujing, Abdul Hadi Abd Rahman, Fadilla ’Atyka Nor Rashid, and Mohamad Khairulamirin Md Razali. "Tackling Heterogeneous Light Detection and Ranging-Camera Alignment Challenges in Dynamic Environments: A Review for Object Detection." Sensors 24, no. 23 (2024): 7855. https://doi.org/10.3390/s24237855.

Full text
Abstract:
Object detection is an essential computer vision task that identifies and locates objects within images or videos and is crucial for applications such as autonomous driving, robotics, and augmented reality. Light Detection and Ranging (LiDAR) and camera sensors are widely used for reliable object detection. These sensors produce heterogeneous data due to differences in data format, spatial resolution, and environmental responsiveness. Existing review articles on object detection predominantly focus on the statistical analysis of fusion algorithms, often overlooking the complexities of aligning data from these distinct modalities, especially dynamic environment data alignment. This paper addresses the challenges of heterogeneous LiDAR-camera alignment in dynamic environments by surveying over 20 alignment methods for three-dimensional (3D) object detection, focusing on research published between 2019 and 2024. This study introduces the core concepts of multimodal 3D object detection, emphasizing the importance of integrating data from different sensor modalities for accurate object recognition in dynamic environments. The survey then delves into a detailed comparison of recent heterogeneous alignment methods, analyzing critical approaches found in the literature, and identifying their strengths and limitations. A classification of methods for aligning heterogeneous data in 3D object detection is presented. This paper also highlights the critical challenges in aligning multimodal data, including dynamic environments, sensor fusion, scalability, and real-time processing. These limitations are thoroughly discussed, and potential future research directions are proposed to address current gaps and advance the state-of-the-art. By summarizing the latest advancements and highlighting open challenges, this survey aims to stimulate further research and innovation in heterogeneous alignment methods for multimodal 3D object detection, thereby pushing the boundaries of what is currently achievable in this rapidly evolving domain.
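The geometric core of LiDAR-camera alignment is transforming LiDAR points into the camera frame with an extrinsic matrix and projecting them with the pinhole intrinsics. The sketch below shows this step with placeholder calibration matrices; real values come from calibration, and this is not any specific method from the surveyed literature.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Transform 3D LiDAR points into the camera frame and project them onto the
    image plane (pinhole model). Returns pixel coordinates and depths for points
    in front of the camera."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])   # homogeneous coords
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    in_front = pts_cam[:, 2] > 0.1
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return uv, pts_cam[:, 2]

# Placeholder extrinsics (LiDAR -> camera) and intrinsics; real values come from calibration.
T_cam_lidar = np.eye(4)
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])
lidar_points = np.array([[5.0, 1.0, 0.5], [12.0, -2.0, 1.0], [-3.0, 0.0, 0.2]])
pixels, depths = project_lidar_to_image(lidar_points, T_cam_lidar, K)
print(pixels, depths)
```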
APA, Harvard, Vancouver, ISO, and other styles
41

Nur, Siti. "Precision without GPS: Multi-Sensor Fusion for Autonomous Drone Navigation in Complex Environments." International Journal of Innovative Research in Computer Science and Technology 12, no. 6 (2024): 34–43. http://dx.doi.org/10.55524/ijircst.2024.12.6.6.

Full text
Abstract:
The rapid evolution of drone technology has expanded its applications across various domains, including delivery services, environmental monitoring, and search and rescue operations. However, many of these applications face significant challenges in GPS-denied environments, such as dense urban areas and heavily forested regions, where traditional navigation methods falter. This paper presents a novel multi-sensor fusion algorithm designed to enhance the localization accuracy of autonomous drones without reliance on GPS. By integrating data from an Inertial Measurement Unit (IMU), LiDAR, and visual sensors, the proposed approach effectively compensates for the limitations of individual sensors, enabling robust navigation in complex environments. Experimental results demonstrate that the algorithm achieves an average localization accuracy of 1.2 meters in urban areas and 1.5 meters in forested settings, showcasing its resilience against sensor noise and environmental challenges. The implementation of loop closure techniques further improves long-term navigation accuracy, making it suitable for prolonged missions. This research contributes to the growing body of knowledge in autonomous drone navigation and offers significant implications for enhancing the operational capabilities of drones in real-world scenarios. Future work will focus on integrating additional sensors, exploring machine learning techniques for adaptive fusion, and conducting extensive field trials to validate the system's performance in dynamic environments.
APA, Harvard, Vancouver, ISO, and other styles
42

Hale, J. M., J. R. White, R. Stephenson, and F. Liu. "Development of piezoelectric paint thick-film vibration sensors." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 219, no. 1 (2005): 1–9. http://dx.doi.org/10.1243/095440605x8441.

Full text
Abstract:
This paper describes a programme of trials of thick-film dynamic strain sensors made using ‘piezoelectric paint’. The fabrication process is described and it is shown that the sensitivity is comparable with that of other thick-film sensors and the piezoelectric polymer polyvinylidene fluoride (PVDF). A series of dynamic and environmental tests is described. The dynamic range and bandwidth are shown to be suitable for structural vibration monitoring, and to be largely unaffected by adverse environments (rain, frost, sunlight, etc.).
APA, Harvard, Vancouver, ISO, and other styles
43

Barambones, Jose, Ricardo Imbert, and Cristian Moral. "Applicability of Multi-Agent Systems and Constrained Reasoning for Sensor-Based Distributed Scenarios: A Systematic Mapping Study on Dynamic DCOPs." Sensors 21, no. 11 (2021): 3807. http://dx.doi.org/10.3390/s21113807.

Full text
Abstract:
Context: At present, sensor-based systems are widely used to solve distributed problems in changing environments where sensors are controlled by intelligent agents. On Multi-Agent Systems, agents perceive their environment through such sensors, acting upon that environment through actuators in a continuous cycle. These problems have not always been addressed from an ad-hoc perspective, designed specifically for the circumstances of the problem at hand. Instead, they have been modelled under a common mathematical framework as distributed constrained optimisation problems (DCOP). Objective: The question to answer is how sensor-based scenarios have been modelled as DCOPs in changing environments, known as Dynamic DCOPs, and what their trends, gaps, and progression are. Method: A systematic mapping study of Dynamic DCOPs has been conducted, considering the scattered literature and the lack of consensus in the terminology. Results: Given the high complexity of distributed constraint-based problems, priority is given to obtaining sub-optimal but fast responses with a low communication cost. Other trending aspects are scalability and guaranteeing the solution over time. Conclusion: Despite some gaps in the analysis and experimentation in real-world scenarios, a large body of work applicable to changing sensor-based scenarios is evidenced, along with proposals that allow the integration of off-the-shelf constraint-based algorithms.
APA, Harvard, Vancouver, ISO, and other styles
44

Sakhai, Mustafa, Kaung Sithu, Min Khant Soe Oke, and Maciej Wielgosz. "Cyberattack Resilience of Autonomous Vehicle Sensor Systems: Evaluating RGB vs. Dynamic Vision Sensors in CARLA." Applied Sciences 15, no. 13 (2025): 7493. https://doi.org/10.3390/app15137493.

Full text
Abstract:
Autonomous vehicles (AVs) rely on a heterogeneous sensor suite of RGB cameras, LiDAR, GPS/IMU, and emerging event-based dynamic vision sensors (DVS) to perceive and navigate complex environments. However, these sensors can be deceived by realistic cyberattacks, undermining safety. In this work, we systematically implement seven attack vectors in the CARLA simulator (salt and pepper noise, event flooding, depth map tampering, LiDAR phantom injection, GPS spoofing, denial of service, and steering bias control) and measure their impact on a state-of-the-art end-to-end driving agent. We then equip each sensor with tailored defenses (e.g., adaptive median filtering for RGB and spatial clustering for DVS) and integrate an unsupervised anomaly detector (EfficientAD from anomalib) trained exclusively on benign data. Our detector achieves clear separation between normal and attacked conditions (mean RGB anomaly scores of 0.00 vs. 0.38; DVS: 0.61 vs. 0.76), yielding over 95% detection accuracy with fewer than 5% false positives. Defense evaluations reveal that GPS spoofing is fully mitigated, whereas RGB- and depth-based attacks still induce 30–45% trajectory drift despite filtering. Notably, our evaluation of DVS sensors suggests intrinsic resilience advantages in high-dynamic-range scenarios, though their asynchronous output necessitates carefully tuned thresholds. These findings underscore the critical role of multi-modal anomaly detection and indicate that DVS sensors, with their greater intrinsic resilience in high-dynamic-range scenarios, have the potential to enhance AV cybersecurity when integrated with conventional sensors.
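As an illustration of one attack/defense pair listed above, the sketch below injects salt-and-pepper noise into an RGB frame and mitigates it with a plain median filter standing in for the adaptive variant. The noise density, filter size, and synthetic frame are assumptions; this is not the authors' CARLA pipeline.

```python
import numpy as np
from scipy.ndimage import median_filter

def salt_and_pepper(image, density=0.05, rng=None):
    """Corrupt a fraction of pixels with extreme values (the attack)."""
    rng = rng or np.random.default_rng(0)
    noisy = image.copy()
    mask = rng.random(image.shape[:2]) < density
    noisy[mask] = rng.choice([0, 255], size=mask.sum())[:, None]
    return noisy

def defend(image, size=3):
    """Per-channel median filtering (a simple stand-in for the adaptive defence)."""
    return median_filter(image, size=(size, size, 1))

frame = np.full((120, 160, 3), 128, dtype=np.uint8)    # synthetic stand-in RGB frame
attacked = salt_and_pepper(frame)
restored = defend(attacked)
print("corrupted pixels:", int((attacked != frame).any(axis=2).sum()),
      "remaining after filter:", int((restored != frame).any(axis=2).sum()))
```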
APA, Harvard, Vancouver, ISO, and other styles
45

Toumi, Mohamed, Abderrahim Maizate, Mohammed Ouzzif, and Med Said Salah. "Dynamic Clustering Algorithm for Tracking Targets with High and Variable Celerity (ATHVC)." Journal of Computer Networks and Communications 2016 (2016): 1–10. http://dx.doi.org/10.1155/2016/7631548.

Full text
Abstract:
Target tracking with wireless sensor networks aims to detect and locate a target along its entire path through a region of interest. This application attracts research interest because of its many fields of use. Wireless sensor networks, thanks to their versatility, can be used in many environments that are hostile or inaccessible to humans. However, with limited energy, they cannot remain permanently active, which can significantly reduce their lifetime. Forming clusters is an effective mechanism to increase network lifetime. We propose to build optimal dynamic clusters along the target trajectory. To increase energy efficiency, our algorithm integrates, for the first time to our knowledge, strategies to avoid overlapping clusters and a model to wake up the sensors, adapted to the context of targets with high and variable speed.
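A toy sketch of the wake-up idea follows: only sensors whose sensing disc can reach the predicted target position are activated, and the wake-up radius grows with target speed so fast targets are not lost between activations. All positions, velocities, and radii are hypothetical, and the prediction rule is a simple constant-velocity assumption rather than the paper's model.

```python
import numpy as np

def sensors_to_wake(sensor_xy, last_pos, velocity, dt, sensing_radius, margin=1.5):
    """Return indices of sensors to wake around the predicted target position;
    the wake-up radius scales with target speed (illustrative heuristic only)."""
    predicted = np.asarray(last_pos, dtype=float) + np.asarray(velocity, dtype=float) * dt
    wake_radius = sensing_radius + margin * np.linalg.norm(velocity) * dt
    distances = np.linalg.norm(np.asarray(sensor_xy, dtype=float) - predicted, axis=1)
    return np.flatnonzero(distances <= wake_radius), predicted

rng = np.random.default_rng(2)
sensors = rng.uniform(0, 200, size=(50, 2))            # hypothetical sensor field
awake, predicted = sensors_to_wake(sensors, last_pos=[40.0, 60.0],
                                   velocity=[15.0, 5.0], dt=1.0, sensing_radius=20.0)
print("predicted target position:", predicted, "- sensors to wake:", awake)
```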
APA, Harvard, Vancouver, ISO, and other styles
46

Mueggler, Elias, Henri Rebecq, Guillermo Gallego, Tobi Delbruck, and Davide Scaramuzza. "The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM." International Journal of Robotics Research 36, no. 2 (2017): 142–49. http://dx.doi.org/10.1177/0278364917691115.

Full text
Abstract:
New vision sensors, such as the dynamic and active-pixel vision sensor (DAVIS), incorporate a conventional global-shutter camera and an event-based sensor in the same pixel array. These sensors have great potential for high-speed robotics and computer vision because they allow us to combine the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution, and very high dynamic range. However, new algorithms are required to exploit the sensor characteristics and cope with its unconventional output, which consists of a stream of asynchronous brightness changes (called “events”) and synchronous grayscale frames. For this purpose, we present and release a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which we hope will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications. In addition to global-shutter intensity images and asynchronous events, we provide inertial measurements and ground-truth camera poses from a motion-capture system. The latter allows comparing the pose accuracy of ego-motion estimation algorithms quantitatively. All the data are released both as standard text files and binary files (i.e. rosbag). This paper provides an overview of the available data and describes a simulator that we release open-source to create synthetic event-camera data.
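Assuming the whitespace-separated "timestamp x y polarity" layout commonly used for such text files, the sketch below parses an event stream and accumulates events into a simple frame for inspection; the file name, sensor resolution, and synthetic events are placeholders, not guaranteed details of the released dataset.

```python
import numpy as np

def load_events(path="events.txt"):
    """Parse a plain-text event file assumed to contain one event per line as
    'timestamp x y polarity' (the file name and layout are assumptions)."""
    data = np.loadtxt(path)
    return data[:, 0], data[:, 1].astype(int), data[:, 2].astype(int), data[:, 3].astype(int)

def events_to_frame(x, y, pol, width=240, height=180):
    """Accumulate signed events into a 2D histogram for quick visualisation
    (the resolution is an assumption)."""
    frame = np.zeros((height, width), dtype=np.int32)
    np.add.at(frame, (y, x), np.where(pol > 0, 1, -1))
    return frame

# Synthetic usage without a file: three events at two pixels
x = np.array([10, 10, 20]); y = np.array([5, 5, 7]); pol = np.array([1, 1, 0])
frame = events_to_frame(x, y, pol)
print(frame[5, 10], frame[7, 20])   # 2, -1
```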
APA, Harvard, Vancouver, ISO, and other styles
47

Lee, Taewoo, Yumin Choi, and Hyunbum Kim. "Reinforcing Deep Learning-Enabled Surveillance with Smart Sensors." Sensors 25, no. 11 (2025): 3345. https://doi.org/10.3390/s25113345.

Full text
Abstract:
It is critical to solidify surveillance in 3D environments with heterogeneous sensors. This study introduces an innovative deep learning-assisted surveillance reinforcement system with smart sensors for resource-constrained cyber-physical devices and mobile elements. The proposed system incorporates deep learning technologies to address the challenges of dynamic public environments. By enhancing the adaptability and effectiveness of surveillance in environments with high human mobility, this paper aims to optimize surveillance node placement and ensure real-time system responsiveness. The integration of deep learning not only improves accuracy and efficiency but also introduces unprecedented flexibility in surveillance operations.
APA, Harvard, Vancouver, ISO, and other styles
48

Henry H. James, Razu Pawel, and Gawin Saduf. "Autonomous Vehicles and Robust Decision-Making in Dynamic Environments." Fusion of Multidisciplinary Research, An International Journal 1, no. 2 (2020): 110–21. https://doi.org/10.63995/nstn6884.

Full text
Abstract:
Autonomous vehicles (AVs) are at the forefront of transforming transportation, requiring robust decision-making capabilities to navigate dynamic environments effectively. These vehicles rely on advanced sensors, machine learning algorithms, and real-time data processing to make split-second decisions that ensure safety and efficiency. Key challenges include accurately perceiving the environment, predicting the behavior of other road users, and responding to unpredictable conditions such as changing weather, road obstacles, and traffic patterns. Robust decision-making in AVs integrates various technologies, including computer vision, sensor fusion, and deep learning, to create comprehensive situational awareness. Decision-making frameworks, such as reinforcement learning and probabilistic modeling, enable AVs to evaluate multiple scenarios and select optimal actions under uncertainty. Additionally, the development of vehicle-to-everything (V2X) communication enhances situational awareness by allowing AVs to exchange information with infrastructure and other vehicles. Ensuring the reliability and safety of AVs in dynamic environments requires rigorous testing, validation, and continuous improvement of decision-making algorithms. Ethical considerations, regulatory frameworks, and public acceptance also play crucial roles in the widespread adoption of autonomous vehicles. This abstract highlights the technological and methodological advancements in robust decision-making for AVs, emphasizing their potential to revolutionize transportation by enhancing safety, reducing congestion, and improving mobility in dynamic and complex environments.
APA, Harvard, Vancouver, ISO, and other styles
49

Chung, Y. L., and S. A. Spiewak. "A Model of High Performance Dynamometer." Journal of Engineering for Industry 116, no. 3 (1994): 279–88. http://dx.doi.org/10.1115/1.2901943.

Full text
Abstract:
A narrow frequency bandwidth, strong fluctuations of the gain versus signal frequency and sensitivity to disturbances caused by the operating environments are the most common factors limiting the applicability of sensors in manufacturing systems. Self-tuning filters represent an efficient means of alleviating these limitations. Since the dynamic properties of sensors vary rapidly, a successful implementation of sensors coupled with self-tuning filters hinges upon accurate, real-time adjustments of these filters. The selection of optimum filter settings, based upon the available distorted output signals from the in-process sensors, poses a difficult problem. In general, the algorithm of self tuning requires a priori information about the sensor and its environment, condensed into a form of an analytical model. A systematic approach to the analytical modeling of sensors is proposed. To illustrate this approach, a comprehensive model of a commercial dynamometer is developed and tested.
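To illustrate the general self-tuning idea, the sketch below estimates the dominant disturbance frequency from the sensor output itself and re-tunes a notch filter accordingly. This is a generic adaptive-filtering illustration, not the paper's dynamometer model; the signal, sampling rate, and the assumption that the useful component is low-frequency are all illustrative.

```python
import numpy as np
from scipy.signal import iirnotch, lfilter

def self_tuning_notch(signal, fs, f_min=50.0, quality=20.0):
    """Estimate the dominant disturbance frequency above f_min from the sensor
    output and place a notch filter there (generic self-tuning sketch)."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = freqs > f_min                     # assume the useful signal sits below f_min
    f_dominant = freqs[band][np.argmax(spectrum[band])]
    b, a = iirnotch(f_dominant, quality, fs=fs)
    return lfilter(b, a, signal), f_dominant

# Hypothetical force signal: slow useful component plus a 320 Hz resonance ripple
fs = 5000.0
t = np.arange(0.0, 0.5, 1.0 / fs)
raw = 100.0 * np.sin(2 * np.pi * 2.0 * t) + 8.0 * np.sin(2 * np.pi * 320.0 * t)
filtered, f_hat = self_tuning_notch(raw, fs)
print("estimated disturbance frequency:", round(f_hat, 1), "Hz")
```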
APA, Harvard, Vancouver, ISO, and other styles
50

Dufour, Jonathan S., Alexander M. Aurand, Eric B. Weston, Christopher N. Haritos, Reid A. Souchereau, and William S. Marras. "Dynamic Joint Motions in Occupational Environments as Indicators of Potential Musculoskeletal Injury Risk." Journal of Applied Biomechanics 37, no. 3 (2021): 196–203. http://dx.doi.org/10.1123/jab.2020-0213.

Full text
Abstract:
The objective of this study was to test the feasibility of using a pair of wearable inertial measurement unit (IMU) sensors to accurately capture dynamic joint motion data during simulated occupational conditions. Eleven subjects (5 males and 6 females) performed repetitive neck, low-back, and shoulder motions simulating low- and high-difficulty occupational tasks in a laboratory setting. Kinematics for each of the 3 joints were measured via IMU sensors in addition to a “gold standard” passive marker optical motion capture system. The IMU accuracy was benchmarked relative to the optical motion capture system, and IMU sensitivity to low- and high-difficulty tasks was evaluated. The accuracy of the IMU sensors was found to be very good on average, but significant positional drift was observed in some trials. In addition, IMU measurements were shown to be sensitive to differences in task difficulty in all 3 joints (P < .05). These results demonstrate the feasibility for using wearable IMU sensors to capture kinematic exposures as potential indicators of occupational injury risk. Velocities and accelerations demonstrate the most potential for developing risk metrics since they are sensitive to task difficulty and less sensitive to drift than rotational position measurements.
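As a small illustration of benchmarking IMU joint-angle traces against an optical reference, the sketch below computes RMSE, end-of-trial drift, and peak angular velocity on synthetic signals; all values and the simple additive drift model are illustrative assumptions, not the study's data or processing pipeline.

```python
import numpy as np

def benchmark(imu_angle, mocap_angle, fs):
    """Compare an IMU-derived joint angle trace against an optical reference:
    RMSE over the trial, drift as the final-sample offset, and peak velocity."""
    error = imu_angle - mocap_angle
    return {"rmse_deg": float(np.sqrt(np.mean(error ** 2))),
            "drift_deg": float(error[-1]),
            "peak_velocity_deg_s": float(np.max(np.abs(np.gradient(imu_angle) * fs)))}

# Synthetic repetitive trunk-flexion trial (all values are illustrative only)
fs = 100.0                                    # sampling rate, Hz
t = np.arange(0.0, 30.0, 1.0 / fs)
mocap = 30.0 * np.sin(2 * np.pi * 0.5 * t)    # reference joint angle, degrees
imu = mocap + np.random.default_rng(1).normal(0.0, 1.0, t.size) + 0.05 * t  # noise + slow drift
print(benchmark(imu, mocap, fs))
```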
APA, Harvard, Vancouver, ISO, and other styles