A selection of scientific literature on the topic "Augmented imaging"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Browse the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Augmented imaging".

Next to each entry in the bibliography, an "Add to bibliography" option is available. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scientific publication as a PDF and read an online annotation of the work, provided these are available in its metadata.

Journal articles on the topic "Augmented imaging":

1

DERSHAW, D. DAVID. "Imaging the Augmented Breast". Contemporary Diagnostic Radiology 21, no. 12 (1998): 1–5. http://dx.doi.org/10.1097/00219246-199821120-00001.

2

Stott, Peter. "Transcendental imaging and augmented reality". Technoetic Arts 9, no. 1 (September 5, 2011): 49–64. http://dx.doi.org/10.1386/tear.9.1.49_1.

3

Marchesini, Stefano, Andre Schirotzek, Chao Yang, Hau-tieng Wu, and Filipe Maia. "Augmented projections for ptychographic imaging". Inverse Problems 29, no. 11 (October 3, 2013): 115009. http://dx.doi.org/10.1088/0266-5611/29/11/115009.

4

Davidson, J., F. W. Poon, J. H. McKillop, and H. W. Gray. "Pethidine-augmented HMPAO leukocyte imaging". Nuclear Medicine Communications 20, no. 5 (May 1999): 479. http://dx.doi.org/10.1097/00006231-199905000-00087.

5

JACOBSON, ARNOLD F. "False-Positive Morphine Augmented Hepatobiliary Imaging". Clinical Nuclear Medicine 21, no. 1 (January 1996): 81. http://dx.doi.org/10.1097/00003072-199601000-00030.

6

Eklund, GW, RC Busby, SH Miller, and JS Job. "Improved imaging of the augmented breast". American Journal of Roentgenology 151, no. 3 (September 1988): 469–73. http://dx.doi.org/10.2214/ajr.151.3.469.

7

Douglas, David, Clifford Wilke, J. Gibson, John Boone, and Max Wintermark. "Augmented Reality: Advances in Diagnostic Imaging". Multimodal Technologies and Interaction 1, no. 4 (November 8, 2017): 29. http://dx.doi.org/10.3390/mti1040029.

8

CHANDRAMOULY, BELUR S., and RAKESH D. SHAH. "False-Positive Morphine Augmented Hepatobiliary Imaging". Clinical Nuclear Medicine 21, no. 1 (January 1996): 80–81. http://dx.doi.org/10.1097/00003072-199601000-00029.

9

Kruse, Beth D., and A. Jill Leibman. "Breast Imaging and the Augmented Breast". Plastic Surgical Nursing 12, no. 3 (1992): 109–16. http://dx.doi.org/10.1097/00006527-199201230-00005.

10

Huch, R. A., W. Künzi, J. F. Debatin, W. Wiesner, and G. P. Krestin. "MR imaging of the augmented breast". European Radiology 8, no. 3 (March 27, 1998): 371–76. http://dx.doi.org/10.1007/s003300050397.

Dissertations on the topic "Augmented imaging":

1

Shen, Xin, Hong Hua, and Bahram Javidi. "3D augmented reality with integral imaging display". SPIE-INT SOC OPTICAL ENGINEERING, 2016. http://hdl.handle.net/10150/621808.
Annotation:
In this paper, a three-dimensional (3D) integral imaging display for augmented reality is presented. By implementing the pseudoscopic-to-orthoscopic conversion method, elemental image arrays with different capturing parameters can be transferred into the identical format for 3D display. With the proposed merging algorithm, a new set of elemental images for augmented reality display is generated. The newly generated elemental images contain both the virtual objects and real world scene with desired depth information and transparency parameters. The experimental results indicate the feasibility of the proposed 3D augmented reality with integral imaging.
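The merging step described in this annotation can be pictured as a per-pixel composite of virtual and real elemental images. The following is a minimal illustrative sketch, not the paper's algorithm: it assumes grayscale elemental images stored as nested lists and a hypothetical per-pixel transparency mask.

```python
# Hypothetical sketch: compositing a virtual elemental image over a real-scene
# elemental image with a per-pixel transparency (alpha) mask, in the spirit of
# the merging step described in the abstract. Images are plain nested lists of
# grayscale values in [0, 255]; a real system would operate on full elemental
# image arrays with calibrated capture parameters.

def merge_elemental_images(real, virtual, alpha):
    """Blend virtual content over the real scene: out = a*virtual + (1-a)*real."""
    merged = []
    for r_row, v_row, a_row in zip(real, virtual, alpha):
        merged.append([
            round(a * v + (1.0 - a) * r)
            for r, v, a in zip(r_row, v_row, a_row)
        ])
    return merged

real    = [[100, 100], [100, 100]]   # background (real-world capture)
virtual = [[200, 200], [200, 200]]   # rendered virtual object
alpha   = [[1.0, 0.5], [0.0, 0.25]]  # 1.0 = fully virtual, 0.0 = fully real

print(merge_elemental_images(real, virtual, alpha))
# → [[200, 150], [100, 125]]
```

The per-pixel alpha stands in for the "transparency parameters" mentioned in the abstract; depth-dependent visibility would additionally gate which source wins at each pixel.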
2

Mela, Christopher Andrew. "Multimodal Imaging, Computer Vision, and Augmented Reality for Medical Guidance". University of Akron / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=akron1542642892866467.

3

Shelton, Brett E. "How augmented reality helps students learn dynamic spatial relationships". Thesis, Connect to this title online; UW restricted, 2003. http://hdl.handle.net/1773/7668.

4

Watson, Jeffrey R., Summer Garland, and Marek Romanowski. "Intraoperative visualization of plasmon resonant liposomes using augmented microscopy". SPIE-INT SOC OPTICAL ENGINEERING, 2017. http://hdl.handle.net/10150/625390.
Annotation:
Plasmon resonance associated with nanoparticles of gold can enable photothermal ablation of tissues or controlled drug release with exquisite temporal and spatial control. These technologies may support many applications of precision medicine. However, clinical implementations of these technologies will require new methods of intraoperative imaging and guidance. Near-infrared laser surgery is a prime example that relies on improved image guidance. Here we set forth applications of augmented microscopy in guiding surgical procedures employing plasmon resonant gold-coated liposomes. Absorption of near-infrared laser light is the first step in activation of various diagnostic and therapeutic functions of these novel functional nanoparticles. Therefore, we demonstrate examples of near-infrared visualization of the laser beam and gold-coated liposomes. The augmented microscope proves to be a promising image guidance platform for a range of image-guided medical procedures.
5

Elgort, Daniel Robert. "Real-Time Catheter Tracking and Adaptive Imaging for Interventional Cardiovascular MRI". Case Western Reserve University School of Graduate Studies / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=case1111437062.

6

Eustice, Ryan M. "Large-area visually augmented navigation for autonomous underwater vehicles". Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/39227.
Annotation:
Thesis (Ph. D.)--Joint Program in Applied Ocean Science and Engineering (Massachusetts Institute of Technology, Dept. of Ocean Engineering; and the Woods Hole Oceanographic Institution), 2005.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Includes bibliographical references (p. 173-187).
This thesis describes a vision-based, large-area, simultaneous localization and mapping (SLAM) algorithm that respects the low-overlap imagery constraints typical of autonomous underwater vehicles (AUVs) while exploiting the inertial sensor information that is routinely available on such platforms. We adopt a systems-level approach exploiting the complementary aspects of inertial sensing and visual perception from a calibrated pose-instrumented platform. This systems-level strategy yields a robust solution to underwater imaging that overcomes many of the unique challenges of a marine environment (e.g., unstructured terrain, low-overlap imagery, moving light source). Our large-area SLAM algorithm recursively incorporates relative-pose constraints using a view-based representation that exploits exact sparsity in the Gaussian canonical form. This sparsity allows for efficient O(n) update complexity in the number of images composing the view-based map by utilizing recent multilevel relaxation techniques. We show that our algorithmic formulation is inherently sparse unlike other feature-based canonical SLAM algorithms, which impose sparseness via pruning approximations. In particular, we investigate the sparsification methodology employed by sparse extended information filters (SEIFs) and offer new insight as to why, and how, its approximation can lead to inconsistencies in the estimated state errors. Lastly, we present a novel algorithm for efficiently extracting consistent marginal covariances useful for data association from the information matrix.
In summary, this thesis advances the current state-of-the-art in underwater visual navigation by demonstrating end-to-end automatic processing of the largest visually navigated dataset to date using data collected from a survey of the RMS Titanic (path length over 3 km and 3100 m² of mapped area). This accomplishment embodies the summed contributions of this thesis to several current SLAM research issues including scalability, 6 degree of freedom motion, unstructured environments, and visual perception.
by Ryan M. Eustice.
Ph.D.
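The exact-sparsity property mentioned in this abstract can be made concrete with a toy example. The sketch below is illustrative only (scalar poses and invented weights, not the thesis's algorithm): folding a relative-pose constraint between two views into the information matrix touches only the four entries associated with those two views, so the fill-in pattern mirrors the graph of pairwise image registrations.

```python
# Illustrative sketch of view-based SLAM in the Gaussian canonical
# (information) form. Adding a relative-pose measurement between views
# i and j updates only the (i,i), (i,j), (j,i), (j,j) entries of the
# information matrix; unlinked views never acquire off-diagonal terms.
# Poses are scalars here for brevity.

from collections import defaultdict

def add_relative_pose_constraint(info, i, j, weight):
    """Fold a constraint on (x_j - x_i) with information `weight`
    into the information matrix; only four entries are touched."""
    info[(i, i)] += weight
    info[(j, j)] += weight
    info[(i, j)] -= weight
    info[(j, i)] -= weight

info = defaultdict(float)
add_relative_pose_constraint(info, 0, 1, 2.0)  # registration of view 0 to view 1
add_relative_pose_constraint(info, 1, 2, 2.0)  # registration of view 1 to view 2

# Only poses linked by a measurement share off-diagonal entries:
print(sorted(k for k, v in info.items() if v != 0.0))
# → [(0, 0), (0, 1), (1, 0), (1, 1), (1, 2), (2, 1), (2, 2)]
```

Note there is no (0, 2) entry: views 0 and 2 were never registered to each other, which is the exact sparsity the thesis exploits for O(n) updates.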
7

Murray, Preston Roylance. "Flow-induced Responses of Normal, Bowed, and Augmented Synthetic Vocal Fold Models". BYU ScholarsArchive, 2011. https://scholarsarchive.byu.edu/etd/2873.
Annotation:
The voice is the primary mode of communication for humans. Because the voice is so important, voice disorders tend to severely diminish quality of life. A better understanding of the physics of voice production can help to improve treatment of voice disorders. For this thesis research a self-oscillating synthetic vocal fold model was developed, compared with previous synthetic vocal fold models, and used to explore the physical effects of augmentation injections on vibration dynamics. The research was conducted in two stages. First, four vocal fold models were evaluated by quantifying onset pressure, frequency, maximum glottal gap, flow rate, and medial surface motion. The newly developed model, differentiated from the other models by the inclusion of more layers, adjusted geometry, and an extremely soft superficial lamina propria layer, was included in this study. One of the models, created using MRI-derived geometry, had the most defined mucosal wave. The newly-developed model had the lowest onset pressure, flow rate, and smallest maximum glottal width, and the model motion compared very well with published excised human larynx data. Second, the new model was altered to simulate bowing by decreasing the volume of the body layer relative to that of a normal, unbowed model. Two models with varying degrees of bowing were created and tested while paired with normal models. Pre- and post-injection data (onset pressure, vibration frequency, glottal flow rate, open quotient, and high-speed image sequences) were recorded and compared. General pre- to post-injection trends included decreased onset pressure, glottal flow rate, and open quotient, and increased vibration frequency. Additionally, there was a decrease in mucosal wave velocity and an increase in phase angle. 
The thesis results are anticipated to aid in better understanding the physical effects of augmentation injections, with the ultimate goal of obtaining more consistent surgical outcomes, and also to contribute to the advancement of voice research through the development of the new synthetic model.
8

Habert, Séverine. "Multi-Modal Visualization Paradigms for RGBD augmented X-ray Imaging". Dissertation (supervisor: Nassir Navab; reviewers: Nassir Navab, Pascal Fallavollita). München: Universitätsbibliothek der TU München, 2018. http://d-nb.info/1164590758/34.

9

Feuerstein, Marco. "Augmented reality in laparoscopic surgery: new concepts and methods for intraoperative multimodal imaging and hybrid tracking in computer aided surgery". Saarbrücken: VDM Verlag Dr. Müller, 2007. http://d-nb.info/991301250/04.

10

Hammami, Houda. "Guidance of radioembolization procedures in the context of interventional oncology". Thesis, Rennes 1, 2021. http://www.theses.fr/2021REN1S121.
Annotation:
Radioembolization is a minimally-invasive intervention performed to treat liver cancer by administering radioactive microspheres. In order to optimize radioembolization outcomes, the procedure is carried out in two sessions: a pretreatment assessment intervention, mainly performed to locate the injection site, assess microsphere distribution, and perform a dosimetry evaluation, and a treatment intervention performed to inject the estimated proper dose of radioactive microspheres at the located injection site. Due to the complexity of the hepatic vasculature, interventional radiologists carefully manipulate the catheter during both interventions under X-ray image guidance and resort to contrast media injection in order to highlight vessels. In this thesis, we propose a novel guidance strategy that promises simpler and more accurate catheter navigation during the pretreatment assessment as well as during the treatment intervention. The proposed navigation system processes pre- and intraoperative images to achieve intraoperative image fusion through a rigid registration technique. This approach is designed to 1) assist access to the celiac trunk, 2) assist access to the injection site, and 3) automatically reproduce the injection site during the treatment intervention. Knowing that the liver undergoes motion induced by breathing, we also propose an approach that produces a dynamic overlay of the projected 3D vessels onto the fluoroscopy images.
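The overlay described in this abstract rests on a rigid transform followed by projection onto the fluoroscopy image plane. As an illustration only (the transform, focal length, and point coordinates below are invented, and the real system estimates the transform by 3D/2D registration), the geometry can be sketched as:

```python
# Hypothetical sketch of the overlay step: apply a rigid registration
# (rotation + translation) to 3D vessel points from the preoperative scan,
# then project them through a simple pinhole model onto the fluoroscopy
# image plane. All numeric values are made up for illustration.

import math

def rigid_transform(p, angle_z, t):
    """Rotate point p about the z-axis by angle_z, then translate by t."""
    c, s = math.cos(angle_z), math.sin(angle_z)
    x, y, z = p
    return (c * x - s * y + t[0], s * x + c * y + t[1], z + t[2])

def project(p, focal=1000.0):
    """Pinhole projection onto the detector plane (z is depth)."""
    x, y, z = p
    return (focal * x / z, focal * y / z)

vessel_points = [(10.0, 0.0, 500.0), (0.0, 20.0, 500.0)]
overlay = [project(rigid_transform(p, math.pi / 2, (0.0, 0.0, 0.0)))
           for p in vessel_points]
print(overlay)  # 2D detector coordinates for each vessel point
```

A breathing-compensated overlay, as proposed in the thesis, would update the transform over time rather than keep it fixed.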

Books on the topic "Augmented imaging":

1

Liao, Hongen, P. J. "Eddie" Edwards, Xiaochuan Pan, Yong Fan, and Guang-Zhong Yang, eds. Medical Imaging and Augmented Reality. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-15699-1.

2

Dohi, Takeyoshi, Ichiro Sakuma, and Hongen Liao, eds. Medical Imaging and Augmented Reality. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-79982-5.

3

Yang, Guang-Zhong, TianZi Jiang, Dinggang Shen, Lixu Gu, and Jie Yang, eds. Medical Imaging and Augmented Reality. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11812715.

4

Zheng, Guoyan, Hongen Liao, Pierre Jannin, Philippe Cattin, and Su-Lin Lee, eds. Medical Imaging and Augmented Reality. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-43775-0.

5

Yang, Guang-Zhong, and Tian-Zi Jiang, eds. Medical Imaging and Augmented Reality. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/b99698.

6

Liao, Hongen, Cristian A. Linte, Ken Masamune, Terry M. Peters, and Guoyan Zheng, eds. Augmented Reality Environments for Medical Imaging and Computer-Assisted Interventions. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-40843-4.

7

International Workshop on Medical Imaging and Augmented Reality (5th, 2010, Beijing, China). Medical imaging and augmented reality: 5th international workshop, MIAR 2010, Beijing, China, September 19–20, 2010: proceedings. Berlin: Springer, 2010.

8

Huang, Weidong. Human Factors in Augmented Reality Environments. New York, NY: Springer New York, 2013.

9

David, Hutchison. Medical Imaging and Augmented Reality: 4th International Workshop Tokyo, Japan, August 1-2, 2008 Proceedings. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg, 2008.

10

Höhl, Wolfgang. Interactive environments with open-source software: 3D walkthroughs and augmented reality for architects with Blender 2.43, DART 3.0 and ARToolKit 2.72. Wien: Springer, 2009.


Book chapters on the topic "Augmented imaging":

1

Borrelli, Claire D. "Imaging the Augmented Breast". In Digital Mammography, 223–29. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-04831-4_27.

2

Liao, Hongen. "3D Medical Imaging and Augmented Reality for Image-Guided Surgery". In Handbook of Augmented Reality, 589–602. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4614-0064-6_27.

3

Zheng, Guoyan, Hongen Liao, Pierre Jannin, Philippe Cattin, and Su-Lin Lee. "Erratum to: Medical Imaging and Augmented Reality". In Lecture Notes in Computer Science, E1. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-43775-0_40.

4

Wubbels, Peter, Erin Nishimura, Evan Rapoport, Benjamin Darling, Dennis Proffitt, Traci Downs, and J. Hunter Downs. "Exploring Calibration Techniques for Functional Near-Infrared Imaging (fNIR) Controlled Brain-Computer Interfaces". In Foundations of Augmented Cognition, 23–29. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-73216-7_3.

5

Kim, Gyoung, Joonhyun Jeon, and Frank Biocca. "M.I.N.D. Brain Sensor Caps: Coupling Precise Brain Imaging to Virtual Reality Head-Mounted Displays". In Augmented Cognition: Intelligent Technologies, 120–30. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91470-1_11.

6

Phillips, Henry L., Peter B. Walker, Carrie H. Kennedy, Owen Carmichael, and Ian N. Davidson. "Guided Learning Algorithms: An Application of Constrained Spectral Partitioning to Functional Magnetic Resonance Imaging (fMRI)". In Foundations of Augmented Cognition, 709–16. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-39454-6_76.

7

Kim, Jeong-Hyun, Zhu Teng, Dong-Joong Kang, and Jong-Eun Ha. "Multiple Plane Detection Method from Range Data of Digital Imaging System for Moving Robot Applications". In Augmented Vision and Reality, 201–15. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-55131-4_11.

8

Makeig, Scott. "Mind Monitoring via Mobile Brain-Body Imaging". In Foundations of Augmented Cognition. Neuroergonomics and Operational Neuroscience, 749–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-02812-0_85.

9

Kovalchuk, Mikhail V., and Yuri I. Kholodny. "Functional Magnetic Resonance Imaging Augmented with Polygraph: New Capabilities". In Advances in Intelligent Systems and Computing, 260–65. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-25719-4_33.

10

Edgcumbe, Philip, Rohit Singla, Philip Pratt, Caitlin Schneider, Christopher Nguan, and Robert Rohling. "Augmented Reality Imaging for Robot-Assisted Partial Nephrectomy Surgery". In Lecture Notes in Computer Science, 139–50. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-43775-0_13.

Conference papers on the topic "Augmented imaging":

1

Samset, E., D. Schmalstieg, J. Vander Sloten, A. Freudenthal, J. Declerck, S. Casciaro, Ø. Rideng, and B. Gersak. "Augmented reality in surgical procedures". In Electronic Imaging 2008, edited by Bernice E. Rogowitz and Thrasyvoulos N. Pappas. SPIE, 2008. http://dx.doi.org/10.1117/12.784155.

2

Bornik, Alexander, Bernhard Reitinger, Reinhard Beichel, Erich Sorantin, and Georg Werkgartner. "Augmented-reality-based segmentation refinement". In Medical Imaging 2004, edited by Robert L. Galloway, Jr. SPIE, 2004. http://dx.doi.org/10.1117/12.535478.

3

Schutz, Christian L., and Heinz Huegli. "Augmented reality using range images". In Electronic Imaging '97, edited by Scott S. Fisher, John O. Merritt, and Mark T. Bolas. SPIE, 1997. http://dx.doi.org/10.1117/12.274489.

4

Kim, Juwan, Haedong Kim, Byungtae Jang, Jungsik Kim, and Donghyun Kim. "Augmented reality using GPS". In Photonics West '98 Electronic Imaging, edited by Mark T. Bolas, Scott S. Fisher, and John O. Merritt. SPIE, 1998. http://dx.doi.org/10.1117/12.307190.

5

Poustinchi, Ebrahim. "Robotically Augmented Imaging (RAI Alpha)". In ACADIA 2019: Ubiquity and Autonomy. ACADIA, 2019. http://dx.doi.org/10.52842/conf.acadia.2019.352.

7

Garcia Giraldez, Jaime, Haydar Talib, Marco Caversaccio, and Miguel A. Gonzalez Ballester. "Multimodal augmented reality system for surgical microscopy". In Medical Imaging, edited by Kevin R. Cleary and Robert L. Galloway, Jr. SPIE, 2006. http://dx.doi.org/10.1117/12.651267.

8

Sauer, Frank, Sebastian Vogt, Ali Khamene, Sandro Heining, Ekkehard Euler, Marc Schneberger, Konrad Zuerl, and Wolf Mutschler. "Augmented reality visualization for thoracoscopic spine surgery". In Medical Imaging, edited by Kevin R. Cleary and Robert L. Galloway, Jr. SPIE, 2006. http://dx.doi.org/10.1117/12.654305.

9

Drascic, David, and Paul Milgram. "Perceptual issues in augmented reality". In Electronic Imaging: Science & Technology, edited by Mark T. Bolas, Scott S. Fisher, and John O. Merritt. SPIE, 1996. http://dx.doi.org/10.1117/12.237425.

10

Kitchin, Paul, and Kirk Martinez. "Toward natural fiducials for augmented reality". In Electronic Imaging 2005, edited by Andrew J. Woods, Mark T. Bolas, John O. Merritt, and Ian E. McDowall. SPIE, 2005. http://dx.doi.org/10.1117/12.585923.
