Academic literature on the topic '3D IMMERSIVE TOOL'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic '3D IMMERSIVE TOOL.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "3D IMMERSIVE TOOL"

1

Chhikara, Vanshika. "IMMERSIVE ANALYTICS." INTERANTIONAL JOURNAL OF SCIENTIFIC RESEARCH IN ENGINEERING AND MANAGEMENT 08, no. 05 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem33585.

Full text
Abstract:
Immersive Analytics focuses on the benefits and challenges of using immersive environments for data analysis, and on developing designs that improve efficiency. Although immersive technologies are widely available, practical solutions have not gained widespread acceptance in real-world applications. Research in this field focuses on abstract 3D visualization, immersive environments, paper sampling, and use-case evaluation.

Introduction: To facilitate data-driven analytical reasoning, immersive analytics leverages interactive technologies such as virtual reality glasses, large flat-screen displays, and even the Internet of Things. Immersion refers to an experience that creates a genuine sense of presence in a virtual environment: a person experiences a shift in awareness from their immediate real environment to another reality. Immersive analytics is still a relatively new field that has mostly been studied in use cases and workshops, and the notion remains extremely difficult even within the small community of immersive analytics practitioners.

Related works: Brooks's early review of VR applications found them effective in specific domains such as flight simulators, automotive engineering, and astronaut training. Van Dam et al. highlighted VR applications for scientific visualization, benefiting archaeology and the medical field. Laha and Bowman reviewed VR techniques for visualizing volume data, highlighting the need for controlled experiments to explore individual components of immersion. Reda et al. summarized research on hybrid-reality environments such as the CAVE2, emphasizing the possibility of collaborative data analysis. Brath collected evidence that 3D visualizations offer advantages beyond 2D, focusing on immersive displays.

Methodology: The study focuses on immersive environments leading to a mixed-reality experience; abstract 3D visualizations must be presented in a mixed- or virtual-reality environment where hardware and user interact closely. Abstract data is data that lacks a physical representation or intrinsic spatial organization; in visualization, abstraction is obtained by using colors and shapes that are not directly associated with the object. Paper sampling in immersive analytics refers to the process of selecting and gathering relevant research papers and publications related to immersive analytics.

Results: Overall, immersive analytics can lead to better data comprehension, better decision-making, more engagement and teamwork, effective big-data exploration, creative data-visualization methods, empowerment of non-technical users, and applications in a variety of fields. These results add to the increasing importance of immersive analytics as an efficient tool for decision support and data analysis.

Conclusion and future scope: Immersive analytics has the potential to transform how we work with data by facilitating deeper understanding, better decision-making, and improved teamwork. It will develop further and contribute significantly to data analysis and decision support by tackling issues and seizing opportunities in technological innovation, domain-specific applications, ethical questions, and user-experience design.
APA, Harvard, Vancouver, ISO, and other styles
2

Sadeghi, Amir H., Wouter Bakhuis, Frank Van Schaagen, et al. "Immersive 3D virtual reality imaging in planning minimally invasive and complex adult cardiac surgery." European Heart Journal - Digital Health 1, no. 1 (2020): 62–70. http://dx.doi.org/10.1093/ehjdh/ztaa011.

Full text
Abstract:
Aims: Increased complexity in cardiac surgery over the last decades necessitates more precise preoperative planning to minimize operating time, limit the risk of complications during surgery, and aim for the best possible patient outcome. Novel, more realistic, and more immersive techniques, such as three-dimensional (3D) virtual reality (VR), could potentially contribute to the preoperative planning phase. This study presents our initial experience with the implementation of immersive VR technology as a complementary, research-based imaging tool for preoperative planning in cardiothoracic surgery. In addition, the essentials of setting up and implementing a VR platform are described.

Methods: Six patients who underwent cardiac surgery at the Erasmus Medical Center, Rotterdam, The Netherlands, between March 2020 and August 2020 were included, based on the surgeon's request and the availability of computed tomography images. After 3D VR rendering and 3D segmentation of specific structures, the reconstruction was analysed via a head-mounted display. All participating surgeons (n = 5) filled out a questionnaire evaluating the use of VR as a preoperative planning tool for surgery.

Conclusion: Our study demonstrates that immersive 3D VR visualization of anatomy might be beneficial as a supplementary preoperative planning tool for cardiothoracic surgery, and further research on this topic may be considered in order to implement this innovative tool in daily clinical practice.

Lay summary: Over the past decades, surgery on the heart and vessels has become more and more complex, necessitating more precise and accurate preoperative planning. Nowadays, operative planning is done on flat, two-dimensional computer screens, which demands a lot of spatial, three-dimensional (3D) thinking from the surgeon. Since immersive 3D virtual reality (VR) is an emerging imaging technique with promising results in other fields of surgery, this study explores its additional value in heart surgery. Our surgeons planned six different heart operations by visualizing computed tomography scans with a dedicated VR headset, enabling them to view the patient's anatomy in an immersive 3D environment. The outcomes of this preliminary study are positive, with a much more realistic simulation for the surgeon. As such, VR could potentially be beneficial as a preoperative planning tool for complex heart surgery.
3

Medeiros, Daniel, Felipe Carvalho, Lucas Teixeira, Priscilla Braz, Alberto Raposo, and Ismael Santos. "Proposal and evaluation of a tablet-based tool for 3D virtual environments." Journal on Interactive Systems 4, no. 2 (2014): 1. http://dx.doi.org/10.5753/jis.2013.633.

Full text
Abstract:
The introduction of embedded sensors in smartphones and tablets has allowed these devices to be used to interact with virtual environments. These devices also make it possible to display additional information and to perform naturally non-immersive tasks. This work presents a tablet-based 3D interaction tool that aggregates all major 3D interaction tasks, such as navigation, selection, manipulation, system control, and symbolic input. The tool targets general-purpose systems as well as engineering applications. Generally, this kind of application uses specific interaction devices with four or more degrees of freedom, plus a common keyboard and mouse for tasks that are naturally non-immersive, such as symbolic input (e.g., text or number entry). This article proposes a new tablet-based device that can perform all of these major tasks in an immersive environment. It also presents a case study of the device's use and some user tests.
4

Avola, Danilo, Luigi Cinque, and Daniele Pannone. "Design of a 3D Platform for Immersive Neurocognitive Rehabilitation." Information 11, no. 3 (2020): 134. http://dx.doi.org/10.3390/info11030134.

Full text
Abstract:
In recent years, advancements in human–computer interaction (HCI) have enabled the development of versatile immersive devices, including Head-Mounted Displays (HMDs). These devices are usually used for entertainment activities such as video gaming or for augmented/virtual reality applications for tourism or learning purposes. However, HMDs, together with the design of ad-hoc exercises, can also be used to support rehabilitation tasks, including neurocognitive rehabilitation after strokes, traumatic brain injuries, or brain surgeries. In this paper, a tool for immersive neurocognitive rehabilitation is presented. The tool allows therapists to create and configure 3D rooms that simulate home environments in which patients can perform tasks of their everyday life (e.g., find a key, set a table, do numerical exercises). The tool lets therapists implement the different exercises on the basis of a random mechanism by which different parameters (e.g., object positions, task complexity) can change over time, thus stimulating the problem-solving skills of patients. The latter aspect plays a key role in neurocognitive rehabilitation. Experiments with 35 real patients, and comparative evaluations of the proposed tool against traditional neurocognitive rehabilitation methods conducted by five therapists, highlight remarkable results in terms of motivation, acceptance, and usability, as well as recovery of lost skills.
5

Papadopoulou, A., D. Kontos, and A. Georgopoulos. "DEVELOPING A VR TOOL FOR 3D ARCHITECTURAL MEASUREMENTS." International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLVI-2/W1-2022 (February 25, 2022): 421–27. http://dx.doi.org/10.5194/isprs-archives-xlvi-2-w1-2022-421-2022.

Full text
Abstract:
Abstract. Virtual Reality technology has already matured and is capable of offering impressive immersive experiences. At the same time, head-mounted displays (HMDs) offer many possibilities, along with game-engine environments. So far, as far as their application in the domain of Cultural Heritage is concerned, these impressive technologies have mainly been used to increase the popularity of online visits and to develop serious games. In this paper we present the development of a set of VR tools that enable the user to perform accurate measurements within the immersive environment. We believe these tools will be very helpful to experts who need such measurements, as they can perform them in the laboratory instead of visiting the object itself. The toolbox includes measuring the coordinates of single points in 3D space, measuring three-dimensional distances, and producing horizontal or vertical cross-sections. The first two have already been presented previously (Kontos & Georgopoulos 2020), and this paper focuses on evaluating the toolbox's performance in determining cross-sections. The development of the tool is explained in detail, and the resulting cross-sections of the 3D model of the Holy Aedicule are compared to real measurements performed geodetically. The promising results are discussed and evaluated.
6

Byrd, B., M. Warren, J. Fenwick, and P. Bridge. "Development of a novel 3D immersive visualisation tool for manual image matching." Journal of Radiotherapy in Practice 18, no. 4 (2019): 318–22. http://dx.doi.org/10.1017/s1460396919000219.

Full text
Abstract:
Aim: The novel Volumetric Image Matching Environment for Radiotherapy (VIMER) was developed to allow users to view both computed tomography (CT) and cone-beam CT (CBCT) datasets within the same 3D model in virtual reality (VR) space. Stereoscopic visualisation of both datasets, combined with custom slicing tools and complete freedom of motion, enables alternative inspection and matching of the datasets for image-guided radiotherapy (IGRT).

Material and methods: A qualitative study was conducted to explore the challenges and benefits of VIMER with respect to image registration. Following training and use of the software, an interview session was conducted with a sample group of six university staff members with clinical experience in image matching.

Results: User discomfort and frustration stemmed from unfamiliarity with the drastically different input tools and matching interface. As the primary advantage, the users reported efficient match inspection when presented with the 3D volumetric renderings of the planning and secondary CBCT datasets.

Findings: This study provided initial evidence of the achievable benefits of, and the limitations to consider when implementing, a 3D voxel-based dataset-comparison VR tool, including the need for extensive training and for minimal interruption to the IGRT workflow. Key advantages include efficient 3D anatomical interpretation and the capability for volumetric matching.
7

Li, Yuxuan, Hua Luo, and Yiren Zhou. "Design and Implementation of Virtual Campus Roaming System Based on Unity3d." Journal of Physics: Conference Series 2173, no. 1 (2022): 012038. http://dx.doi.org/10.1088/1742-6596/2173/1/012038.

Full text
Abstract:
Abstract. With the rapid development of Internet technology and the maturity of 5G technology, virtual reality is gradually entering the public eye, and the fields it touches are also expanding. Using virtual reality technology and a head-mounted display, users' immersion and sense of authenticity can be improved to the greatest extent. In this paper, an immersive virtual campus roaming system is realized, taking Nanchang Institute of Technology as an example: 3ds Max is used to create the model, Unity3D to build the scene, the C# language to write the human–computer interaction scripts, and an Action One headset device for display.
8

Subramaniyam, Bala, Dharshan Shylesh, Jaganathan Ramasamy, and Navin Kumar. "A Virtual Reality Tool for Accuracy Assessment of 3D Models in an Immersive Virtual Environment." ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences X-4-2024 (October 18, 2024): 333–40. http://dx.doi.org/10.5194/isprs-annals-x-4-2024-333-2024.

Full text
Abstract:
Abstract. Accurate validation and assessment techniques are essential for ensuring the reliability of spatial reconstructions derived from photogrammetry, enabling well-informed decision-making across diverse domains. This study presents a Virtual Reality (VR) based accuracy assessment tool tailored for evaluating the accuracy and quality of 3D models generated by Unmanned Aerial Vehicles (UAVs). Leveraging the Unity game engine platform, our workflow entails three key steps: aligning real-world coordinates with an arbitrary Unity coordinate system, transforming the positions of Ground Control Points (GCPs) from the field survey to the arbitrary system using a reference GCP, and marking observed points on the 3D models. Absolute and relative Root Mean Square Errors (RMSE), Mean Errors (ME), and Standard Deviations of errors (SD) are computed within the virtual environment via the game-object transform properties. The error distributions around each GCP are visually depicted using Unity game engine components for enhanced interaction and comprehension. The efficacy of the tool is validated through experimentation on four 3D models generated from varying camera angles during UAV data capture. The tool provides the opportunity to directly interact with the 3D models and visualize the errors, which is quite distinct from traditional methods. Results obtained using the developed tool indicate that configurations employing camera angles of 60° + 75° exhibit notable performance in terms of relative and absolute accuracy.
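The error metrics named in this abstract (RMSE, mean error, and standard deviation over GCP discrepancies) follow standard definitions. As an illustrative sketch only, not the authors' Unity implementation, they might be computed over 3D point pairs like this:

```python
import math

def error_stats(observed, reference):
    """Accuracy statistics over paired 3D points, as in a GCP-based
    assessment: returns (RMSE, mean error, standard deviation) of the
    Euclidean distances between each observed point and its reference.
    Both arguments are equal-length sequences of (x, y, z) tuples."""
    # Euclidean distance between each observed/reference pair
    errors = [math.dist(o, r) for o, r in zip(observed, reference)]
    n = len(errors)
    me = sum(errors) / n                                   # mean error
    rmse = math.sqrt(sum(e * e for e in errors) / n)       # root mean square error
    sd = math.sqrt(sum((e - me) ** 2 for e in errors) / n) # population std. deviation
    return rmse, me, sd
```

In the tool described above, the "observed" coordinates would come from points marked on the 3D model in VR (via game-object transforms) and the "reference" coordinates from the surveyed GCPs after the coordinate-system alignment step.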
9

Hakeem, Abeer, Hind Bitar, and Ayman Alfahid. "ALHK: Integrating 3D Holograms and Gesture Interaction for Elementary Education." Inteligencia Artificial 28, no. 75 (2024): 30–45. http://dx.doi.org/10.4114/intartif.vol28iss75pp30-45.

Full text
Abstract:
The integration of technology in elementary education offers innovative ways to enhance learning. One such advancement is the use of three-dimensional holograms (3DH), which provide immersive displays that merge seamlessly with the learner’s environment, creating a dynamic and engaging atmosphere. Educators have found that 3D visual tools significantly improve student comprehension, with 94.4% agreeing in a preliminary study. However, using interactive 3D holography alone has limitations, such as the inability for students to physically touch or manipulate holographic objects. To address this, Active Learning with Holo-Kid (ALHK) is introduced as a desktop application for elementary school students (grades 1 to 6). ALHK combines Leap Motion technology’s precision with interactive 3D holography to overcome these limitations. The combination allows students to interact with virtual objects in a more immersive and realistic manner. Holograms provide visual representation, while Leap Motion enables precise gesture recognition and hand tracking, resulting in a seamless and intuitive user experience. Initial evaluations demonstrate improved student engagement and comprehension. Future iterations aim to enhance scalability by incorporating features like custom object upload, multi-user interaction, and broader age applicability. ALHK shows promise as a tool for creating an immersive and intuitive learning environment using 3D holograms and interactive technology in elementary education.
10

Chastenay, Pierre. "Digital Planetariums as New Tools for Conceptual Change." Communicating Astronomy with the Public Journal 17, no. 2 (2023): 21–23. https://doi.org/10.5281/zenodo.14986650.

Full text
Abstract:
Misconceptions in astronomy arise because of our unique, geocentric perspective on the night sky and astronomical phenomena. However, astronomy is quintessentially a “spatial” science, and three-dimensional visualisation is necessary for understanding most of its core concepts. Since the traditional optomechanical planetarium offers the same geocentric point of view on the sky, it might generate similar misconceptions in visitors. The new digital planetarium, which projects on the dome a realistic and accurate rendition of the cosmos in 3D, allows us to break free from 2D representations and transforms the planetarium theatre into a virtual spacecraft, affording different perspectives on astronomical phenomena. This new planetarium thus becomes an extraordinary tool to foster conceptual change in visitors. Recent research in astronomy education about using the digital planetarium to teach basic astronomy concepts, like the diurnal cycle, phases of the Moon, and seasons, has shown the advantages of this new tool, compared with 2D visuals and traditional classroom instruction. More research needs to be done to better understand how to use the digital planetarium in communicating astronomy with the public, especially to present concepts with a strong spatial component, but the future is promising!

Dissertations / Theses on the topic "3D IMMERSIVE TOOL"

1

Bridge, Pete. "The development and evaluation of a novel 3D radiotherapy immersive outlining tool." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/123511/1/Peter%20Bridge%20Thesis.pdf.

Full text
Abstract:
Radiotherapy target definition traditionally relies on manually drawing around the relevant anatomical structures on successive CT slices. This is a laborious process that impacts on patient throughput and can limit the adoption of more complex techniques. Although automated segmentation algorithms can create volumes rapidly, they lack the capacity to adapt to tumour volumes and to areas of abnormal or variable anatomy, and such software invariably requires considerable manual input in terms of editing the outlines. In addition, there is a growing desire among clinicians to be more actively involved and to ensure that their clinical decision-making is factored into the generated volumes. This thesis presents a novel solution to this problem. The primary aim of this "proof of principle" thesis is the development and evaluation of software capable of generating a mesh structure derived from a small number of points placed on a range of CT planes. This is in contrast to the traditional method of outlining, which relies on a large number of points placed on successive axial CT slices; use of a small number of points is hypothesised to require less clinician time. The software also allows the user to edit the resultant outline volumetrically with 3D modelling tools derived from animation applications. These tools enable multiple "slices" to be edited manually and simultaneously while retaining a smooth and clinically relevant volume shape. The thesis presents the development and evaluation of this new software application through a series of published works. The evaluation is drawn from a combination of qualitative research involving user focus groups and quantitative data collection relating to the clinical impact of the new paradigm. The first published paper reports on the development of the software tool with some preliminary user evaluation highlighting recommendations for optimum use and training.
Mesh generation from a small number of points placed on a range of planes was found to be a potentially rapid and effective means of target delineation, although further work was suggested to improve multi-slice volume sculpting prior to more formal pre-clinical testing. The second paper presents qualitative data gained from Radiation Oncologist outliners relating to the clinical value of the software for accelerating clinician-directed prostate and seminal vesicle segmentation. The new tool was well received and reported to be capable of producing very rapid and smooth volumes. This phase suffered from time pressures experienced by the cohort, and further testing of the software with a less time-poor cohort was indicated. The third paper was developed from the initial two phases of the study; it highlighted the specific challenge that radiotherapy outlining lacks a "gold standard" and suggests that the inherent variability mandates a constructivist approach to evaluation. This constructivist approach to variability may empower clinicians to accept variability as an inherent aspect of their practice. Furthermore, research efforts should be focussed on maximising the impact of training and guidelines, as well as on developing a target minimum agreed measure of intra-observer variability that educational interventions should seek to facilitate. The final published work reported on quantitative testing of the software with a less time-pressured cohort. Student radiation therapists were tasked with outlining a bladder volume with both the new tool and the industry-standard tool, and a significant (p = 0.03) time saving of 30% was found for bladder segmentation compared to axial-based outlining. The new volumetric outlining paradigm is conceptually challenging and requires users to adopt a significantly different approach to generating and editing structure outlines. It also demands high levels of spatial awareness to engage with the 3D navigation tools.
Given the increasing use of 3D visualisation in medicine and the non-axial image interpretation demands of MR imaging it is important that training in these techniques be embedded at pre-registration level. Future work aims to further develop this outlining tool and establish its role in editing of autosegmentation derived contour sets.
2

GUARNERA, LUCA. "Discovering Fingerprints for Deepfake Detection and Multimedia-Enhanced Forensic Investigations." Doctoral thesis, Università degli studi di Catania, 2021. http://hdl.handle.net/20.500.11769/539620.

Full text
Abstract:
Forensic Science, which concerns the application of technical and scientific methods to justice, investigation, and evidence discovery, has evolved over the years, giving rise to several fields such as Multimedia Forensics, which involves the analysis of digital image, video, and audio content. Multimedia data was (and still is) altered using common editing tools such as Photoshop and GIMP. Rapid advances in Deep Learning have opened up the possibility of creating sophisticated algorithms capable of manipulating images, video, and audio in a "simple" manner, causing the emergence of a powerful yet frightening new phenomenon called the deepfake: synthetic multimedia data created and/or altered using generative models. A great discovery made by forensic researchers over the years concerns the possibility of extracting a unique fingerprint that can determine the devices and software used to create the data itself. Unfortunately, extracting these traces turns out to be a complicated task. A fingerprint can be extracted not only from multimedia data, in order to determine the devices used in the acquisition phase, the social networks where the file was uploaded, or, recently, the generative models used to create deepfakes; in general, such a trace can also be extracted from evidence recovered at a crime scene, such as shells or projectiles, to determine the model of gun that fired them (Forensic Firearms Ballistics Comparison). Forensic Analysis of Handwritten Documents is another field of Forensic Science, which can determine the authors of a manuscript by extracting a fingerprint defined by careful analysis of the text style in the document. Developing new algorithms for Deepfake Detection, Forensic Firearms Ballistics Comparison, and Forensic Handwritten Document Analysis was the main focus of this Ph.D. thesis.
These three macro areas of Forensic Science have a common element, namely a unique fingerprint present in the data itself that can be extracted in order to solve the various tasks. Therefore, for each of these topics a preliminary analysis will be performed and new detection techniques will be presented obtaining promising results in all these domains.
3

Kemp, Jeremy William. "Introducing an avatar acceptance model: Student intention to use 3D immersive learning tools in an online learning classroom." Fielding Graduate University, 2012. http://pqdtopen.proquest.com/#viewpdf?dispub=3452838.

Full text
4

McGraw, Jordan M. "Implementation and Analysis of Co-Located Virtual Reality for Scientific Data Visualization." Thesis, 2020.

Find full text
Abstract:
Advancements in virtual reality (VR) technologies have led to overwhelming critique and acclaim in recent years. Academic researchers have already begun to take advantage of these immersive technologies across all manner of settings. Using immersive technologies, educators are able to more easily interpret complex information with students and colleagues. Despite the advantages these technologies bring, some drawbacks still remain. One particular drawback is the difficulty of engaging in immersive environments with others in a shared physical space (i.e., with a shared virtual environment). A common strategy for improving collaborative data exploration has been to use technological substitutions to make distant users feel they are collaborating in the same space. This research, however, is focused on how virtual reality can be used to build upon real-world interactions which take place in the same physical space (i.e., collaborative, co-located, multi-user virtual reality).

In this study we address two primary dimensions of collaborative data visualization and analysis as follows: [1] we detail the implementation of a novel co-located VR hardware and software system; [2] we conduct a formal user experience study of the novel system using the NASA Task Load Index (Hart, 1986) and introduce the Modified User Experience Inventory, a new user study inventory based upon the Unified User Experience Inventory (Tcha-Tokey, Christmann, Loup-Escande, Richir, 2016), to empirically observe the dependent measures of Workload, Presence, Engagement, Consequence, and Immersion. A total of 77 participants volunteered to join a demonstration of this technology at Purdue University. In groups ranging from two to four, participants shared a co-located virtual environment built to visualize point cloud measurements of exploded supernovae. This study is not experimental but observational. We found there to be moderately high levels of user experience and moderate levels of workload demand in our results. We describe the implementation of the software platform and present user reactions to the technology that was created. These are described in detail within this manuscript.

Books on the topic "3D IMMERSIVE TOOL"

1

Architectural Visualization with Unreal Engine 5: Leverage the Power of Immersive ArchViz Creation Using One of the Most Advanced 3D Creation Tool. Packt Publishing, Limited, 2023.

Find full text

Book chapters on the topic "3D IMMERSIVE TOOL"

1

Kumar, Abhishek. "Tools for Architectural Visualization." In Immersive 3D Design Visualization. Apress, 2020. http://dx.doi.org/10.1007/978-1-4842-6597-0_2.

Full text
2

Hajirasouli, Aso, Vito Getuli, Alessandro Bruttini, Tommaso Sorbi, and Pietro Capone. "Towards a Digital Era in AEC Higher Education: Combining Theory and Technology to Develop and Deliver Architectural Master Classes." In CONVR 2023 - Proceedings of the 23rd International Conference on Construction Applications of Virtual Reality. Firenze University Press, 2023. http://dx.doi.org/10.36253/979-12-215-0289-3.25.

Full text
Abstract:
In recent years, technology has been playing a transformative role in built environment, architecture, and construction education. It can be argued that the emergence of digital technologies has revolutionised the approach to teaching and learning in higher education in these fields. Digital technologies, such as Artificial Intelligence (AI), additive manufacturing, robotics, 3D laser scanners, and Immersive Realities (IR), have played a crucial role in enhancing sustainability and efficiency in the industry. However, the opportunities provided by the use of these technologies (as single tools or combined) in higher education within the field of Architecture, Engineering, and Construction (AEC) are still relatively unexplored. To address this gap, this work presents a novel pedagogical framework aimed at enhancing students' literacy in emerging technologies and increasing their criticality and understanding of professional practices, along with the related ethical challenges. Furthermore, to assess its effectiveness regarding the integration of immersive VR technologies into teaching practice, a learner-centred evaluation approach is proposed, based on the collection and correlation of both qualitative and quantitative data. For the former, a dedicated questionnaire is developed to collect students' subjective feedback; for the latter, a method for tracking their use of space in the virtual environment is discussed. Both the immersive pedagogical framework and the evaluation approach presented in this work will be implemented in diverse architecture and civil engineering master classes in Australia and in Italy, and their comparative outcomes and validation will be the object of future joint contributions.
APA, Harvard, Vancouver, ISO, and other styles
4

Johansson, Mikael, Mattias Roupé, and Mikael Viklund Tallgren. "Collaborative Site Layout Planning Using Multi-Touch Table and Immersive VR." In CONVR 2023 - Proceedings of the 23rd International Conference on Construction Applications of Virtual Reality. Firenze University Press, 2023. http://dx.doi.org/10.36253/979-12-215-0289-3.08.

Full text
Abstract:
Building Information Modeling (BIM) is changing the way architects and engineers produce and deliver design results, and object-oriented 3D models are now starting to replace traditional 2D drawings during the construction phase. This allows for a number of applications that increase efficiency, such as quantity take-off, cost estimation, and planning, but it also supports better communication and increased understanding at the construction site by means of detailed 3D models together with various visualization techniques. However, even in projects with a fully BIM-based design, there is one remaining part that is still done primarily using 2D drawings and sketches – the construction site layout plan. In addition to not taking advantage of the benefits offered by 3D, this also makes it difficult to integrate site layout planning within the openBIM ecosystem. In this paper we present the design and evaluation of a user-friendly, IFC-compatible software system that supports collaborative, multi-user creation of construction site layout plans using both a multi-touch table and immersive VR. By allowing temporary structures, machines, and other components to be easily added and updated, it is possible to continuously produce and communicate 3D site layout plans that are aligned with the schedule and support integration with other BIM tools.
APA, Harvard, Vancouver, ISO, and other styles
6

Bounaouara, Wafa, Louis Rivest, and Antoine Tahan. "Combining Large-Scale 3D Metrology and Mixed Reality for Assembly Quality Control in Modular Construction." In CONVR 2023 - Proceedings of the 23rd International Conference on Construction Applications of Virtual Reality. Firenze University Press, 2023. http://dx.doi.org/10.36253/979-12-215-0289-3.116.

Full text
Abstract:
The quality control (QC) of assembled modules is an essential process when constructing modular buildings such as hotels and hospitals. Defects that go undetected during module assembly may result in lost productivity in the form of unnecessary transportation, rework, or project delays. QC has traditionally been performed using specialized tools and carried out a posteriori at an inspection station dedicated solely to this task. Nowadays, large-scale 3D metrology technology provides a more efficient alternative, since it enables accurate measurements to be taken in situ. Additionally, mixed reality (MR) supports the immersive projection of information and guidance instructions. This paper introduces a proof of concept of a framework that combines industrial photogrammetry with the HoloLens 2 MR headset to assist with assembly and QC during the off-site construction phase of modular construction. Tests were conducted in both a laboratory and a factory setting to evaluate the system’s user-friendliness and possible challenges associated with its future implementation. The experiments confirmed that combining 3D metrology with MR offers an interesting solution for integrating QC into the assembly process. However, further work is needed to enhance the measurement workflow and optimize the measurement system’s accuracy.
APA, Harvard, Vancouver, ISO, and other styles
8

Richards-Rissetto, Heather, Kristy E. Primeau, David E. Witt, and Graham Goodwin. "Multisensory Experiences in Archaeological Landscapes—Sound, Vision, and Movement in GIS and Virtual Reality." In Capturing the Senses. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-23133-9_9.

Full text
Abstract:
Archaeologists are employing a variety of digital tools to develop new methodological frameworks that combine computational and experiential approaches, leading to new multisensory research. In this article, we explore vision, sound, and movement at the ancient Maya city of Copan from a multisensory and multiscalar perspective, bridging concepts and approaches from different archaeological paradigms. Our methods and interpretations employ theory-inspired variables from proxemics and semiotics to develop a methodological framework that combines computation with sensory perception. Using GIS, 3D, and acoustic tools, we create multisensory experiences in VR with spatial sound using an immersive headset (Oculus Rift) and touch controllers (for movement). The case study simulates the late eighth- and early ninth-century landscape of the ancient Maya city of Copan to investigate the role of landscape in facilitating movement, sending messages, influencing social interaction, and structuring cultural events. We perform two simulations to begin to study the impact of vegetation on viewsheds and soundsheds of a stela at ancient Copan. Our objectives are twofold: (1) design and test steps towards developing a GIS computational approach to analyse the impact of vegetation within urban agrarian landscapes on viewsheds and soundsheds, and (2) explore the cultural significance of Stela 12 and, more generally, the role of synesthetic experience in ancient Maya society using a multisensory approach that incorporates GIS and VR.
APA, Harvard, Vancouver, ISO, and other styles
9

Gascuel, J. D., H. Payno, S. Schmerber, and O. Martin. "Immersive Virtual Environment for Visuo-Vestibular Therapy: Preliminary Results." In Studies in Health Technology and Informatics. IOS Press, 2012. https://doi.org/10.3233/978-1-61499-121-2-187.

Full text
Abstract:
The sense of equilibrium aggregates several interacting cues. In vestibular areflexic patients, vision plays a major role. We developed an immersive therapeutic platform, based on 3D opto-kinetic stimulation, that enables tuning the difficulty of the balance task by managing the type of optic flow and its speed. The balance adjustments are recorded by a force plate, quantified by the length of the center-of-pressure trajectory and the detection of disequilibrium corrections (leans, compensation steps). Preliminary analysis shows that (i) patients report a strong feeling of immersion in the motion flow, triggering an intense motor response to “fight against fall”; and (ii) the ANOVA factorial design shows a significant effect of flow speed, session number, and gaze anchor. In conclusion, this study shows that 3D immersive stimulation removes essential limits of traditional opto-kinetic stimulators (limited 2D motions and remaining fixed background cues). Moreover, the immersive optic flow stimulation is an efficient tool to induce adaptive balance reactions in vestibular patients. Hence, such a platform appears to be a powerful therapeutic tool for training and relearning of balance control processes.
APA, Harvard, Vancouver, ISO, and other styles
10

Fachada, Sarah, Daniele Bonatto, Mehrdad Teratani, and Gauthier Lafruit. "View Synthesis Tool for VR Immersive Video." In Computer Game Development [Working Title]. IntechOpen, 2022. http://dx.doi.org/10.5772/intechopen.102382.

Full text
Abstract:
This chapter addresses the view synthesis of natural scenes in virtual reality (VR) using depth image-based rendering (DIBR). This method reaches photorealistic results as it directly warps photos to obtain the output, avoiding the need to photograph every possible viewpoint or to make a 3D reconstruction of a scene followed by ray-tracing rendering. An overview of the DIBR approach and frequently encountered challenges (disocclusion and ghosting artifacts, multi-view blending, handling of non-Lambertian objects) is given. Such technology finds applications in VR immersive displays and holography. Finally, a comprehensive manual of the Reference View Synthesis software (RVS), an open-source tool tested on open datasets and recognized by the MPEG-I standardization activities (where “I” refers to “immersive”), is provided for hands-on practice.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "3D IMMERSIVE TOOL"

1

Kourbane, Ikram, Panagiotis Papadakis, and Mihai Andries. "An Immersive Annotation Tool for Movement Quality Assessment with 3D Visualization." In 2025 IEEE 38th International Symposium on Computer-Based Medical Systems (CBMS). IEEE, 2025. https://doi.org/10.1109/cbms65348.2025.00036.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gabel, Jenny, Ann Lauren Osthof, Christof Berns, et al. "Exploring Written Artefacts in Virtual Reality – Potential and Challenges for Creating Immersive Tools and Applications." In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2025. https://doi.org/10.1109/vrw66409.2025.00013.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Cibilic, Iva, Vesna Posloncec-Petric, and Marko Matijevic. "GEOVISUALIZATION OF BORONGAJ CAMPUS IN MOBILE AUGMENTED REALITY." In SGEM International Multidisciplinary Scientific GeoConference. STEF92 Technology, 2024. https://doi.org/10.5593/sgem2024v/6.2/s26.36.

Full text
Abstract:
This paper explores the application of augmented reality (AR) in spatial data and 3D model visualization, particularly in the context of smart cities and campuses. The term "smart city" refers to the concept of urban planning that optimizes city space by applying digital technologies to improve the quality of life. Smart cities include innovative solutions in traffic, urban planning, energy, communications and digital management of the city and buildings to provide efficient and sustainable development. We developed a mobile AR application that integrates the 3D model of the Borongaj Campus, allowing users to visualize and interact with the model through the use of markers. The application provides an interactive environment where users can engage with the spatial data, enhancing their understanding of the campus layout and structure. Visualizing the 3D model in AR creates a dynamic and immersive experience, contributing to the growing adoption of AR in geovisualization of urban landscapes. The paper also discusses the technical aspects of integrating AR with spatial data, highlighting the challenges and solutions involved in the process. In conclusion, we evaluate the advantages and limitations of the created model, the software tools used, and mobile AR as a medium for visualizing spatial data, providing insights into the potential of this technology for future applications in smart campuses and cities.
APA, Harvard, Vancouver, ISO, and other styles
4

Zhou, Xiaoyan. "Designing Navigation Tool for Immersive Analytics in AR." In 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2023. http://dx.doi.org/10.1109/vrw58643.2023.00330.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sun, Lei. "Soundscape as a Tool for Place-Making in Industrial Heritage Sites." In 2023 Immersive and 3D Audio: from Architecture to Automotive (I3DA). IEEE, 2023. http://dx.doi.org/10.1109/i3da57090.2023.10289367.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Beltran, Fernando, David White, and Jing Geng. "Aroaro - A Tool for Distributed Immersive Mixed Reality Visualization." In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2022. http://dx.doi.org/10.1109/vrw55335.2022.00337.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Cassola, Fernando, Manuel Pinto, Daniel Mendes, Leonel Morgado, Antonio Coelho, and Hugo Paredes. "A Novel Tool for Immersive Authoring of Experiential Learning in Virtual Reality." In 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2021. http://dx.doi.org/10.1109/vrw52623.2021.00014.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Thompson, D., A. Banerjee, P. Banerjee, T. DeFanti, and S. Retterer. "Functional Specifications for Tele-Immersive Product Evaluation." In ASME 1999 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/imece1999-0159.

Full text
Abstract:
Abstract A novel virtual tele-immersive product evaluation environment is conceived. The components include a robust Virtual Reality (VR) hardware system, associated VR driving software, development tool for the tele-immersive virtual environment, networking software, user representation scheme and tools for developing 3D models and incorporating dynamic properties into the models. We have developed a model to allow users to collaboratively evaluate products using the CAVE™, Performer, CAVERN, CAVEActors, Pro/ENGINEER, and ADAMS software libraries.
APA, Harvard, Vancouver, ISO, and other styles
9

Simon, Cassandre, Manel Boukli-Hacene, Flavien Lebrun, Samir Otmane, and Amine Chellali. "Impact of Multimodal Instructions for Tool Manipulation Skills on Performance and User Experience in an Immersive Environment." In 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR). IEEE, 2024. http://dx.doi.org/10.1109/vr58804.2024.00087.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Basu, Aryabrata. "STAG: A Tool for realtime Replay and Analysis of Spatial Trajectory and Gaze Information captured in Immersive Environments." In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2022. http://dx.doi.org/10.1109/vrw55335.2022.00016.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "3D IMMERSIVE TOOL"

1

Bernstetter, Armin. Virtual Fieldwork in Unreal Engine (software). GEOMAR Helmholtz Centre for Ocean Research Kiel, 2024. http://dx.doi.org/10.3289/sw_6_2024.

Full text
Abstract:
The Virtual Fieldwork tool is an application developed in the game engine and 3D creation tool Unreal Engine. It is developed specifically for the ARENA2, a large, spatially immersive environment, and virtual reality using head-mounted displays. It contains a toolbox for quantitative measurements of visualized 3D model data. The application contains three different data sets from real-world use cases at GEOMAR.
APA, Harvard, Vancouver, ISO, and other styles