Academic literature on the topic 'Data-driven animations'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Data-driven animations.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Data-driven animations"
Ge, T., Y. Zhao, B. Lee, D. Ren, B. Chen, and Y. Wang. "Canis: A High‐Level Language for Data‐Driven Chart Animations." Computer Graphics Forum 39, no. 3 (June 2020): 607–17. http://dx.doi.org/10.1111/cgf.14005.
Klotsman, Marina, and Ayellet Tal. "Animation of Flocks Flying in Line Formations." Artificial Life 18, no. 1 (December 2011): 91–105. http://dx.doi.org/10.1162/artl_a_00050.
Wickramasinghe, M. M. T., and M. H. M. Wickramasinghe. "Impact of Using 2D Animation as a Pedagogical Tool." Psychology and Education Journal 58, no. 1 (January 1, 2021): 3435–39. http://dx.doi.org/10.17762/pae.v58i1.1283.
Schnitzer, Julia. "Generative Design For Creators – The Impact Of Data Driven Visualization And Processing In The Field Of Creative Business." Electronic Imaging 2021, no. 3 (June 18, 2021): 22–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.3.mobmu-022.
Garcia Fernandez, J., K. Tammi, and A. Joutsiniemi. "Extending the Life of Virtual Heritage: Reuse of TLS Point Clouds in Synthetic Stereoscopic Spherical Images." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-2/W3 (February 23, 2017): 317–23. http://dx.doi.org/10.5194/isprs-archives-xlii-2-w3-317-2017.
de Vries, Gwyneth, Kevin Roy, and Victoria Chester. "Using Three-Dimensional Gait Data for Foot/Ankle Orthopaedic Surgery." Open Orthopaedics Journal 3, no. 1 (November 3, 2009): 89–95. http://dx.doi.org/10.2174/1874325000903010089.
Gholba, N. D., A. Babu, S. Shanmugapriya, A. Singh, A. Srivastava, and S. Saran. "Application of Various Open Source Visualization Tools for Effective Mining of Information from Geospatial Petroleum Data." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-5 (November 19, 2018): 167–74. http://dx.doi.org/10.5194/isprs-archives-xlii-5-167-2018.
Zhang, Long, Yubo Zhang, Zhongding Jiang, Luying Li, Wei Chen, and Qunsheng Peng. "Precomputing data-driven tree animation." Computer Animation and Virtual Worlds 18, no. 4-5 (2007): 371–82. http://dx.doi.org/10.1002/cav.205.
Farahani, Navid, Zheng Liu, Dylan Jutt, and Jeffrey L. Fine. "Pathologists' Computer-Assisted Diagnosis: A Mock-up of a Prototype Information System to Facilitate Automation of Pathology Sign-out." Archives of Pathology & Laboratory Medicine 141, no. 10 (July 7, 2017): 1413–20. http://dx.doi.org/10.5858/arpa.2016-0214-oa.
주은정, Jehee Lee, and Sohmin Ahn. "Data-driven Facial Animation Using Sketch Interface." Journal of the Korea Computer Graphics Society 13, no. 3 (September 2007): 11–18. http://dx.doi.org/10.15701/kcgs.2007.13.3.11.
Full textDissertations / Theses on the topic "Data-driven animations"
Rowe, Daniel Taylor. "Using Graphics, Animations, and Data-Driven Animations to Teach the Principles of Simple Linear Regression to Graduate Students." BYU ScholarsArchive, 2004. https://scholarsarchive.byu.edu/etd/6.
Mousas, Christos. "Data-driven techniques for animating virtual characters." Thesis, University of Sussex, 2015. http://sro.sussex.ac.uk/id/eprint/52967/.
Scheidt, November. "A facial animation driven by X-ray microbeam data." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0021/MQ54745.pdf.
Yin, KangKang. "Data-driven kinematic and dynamic models for character animation." Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/31759.
Full textScience, Faculty of
Computer Science, Department of
Graduate
Naert, Lucie. "Capture, annotation and synthesis of motions for the data-driven animation of sign language avatars." Thesis, Lorient, 2020. http://www.theses.fr/2020LORIS561.
This thesis deals with the capture, annotation, synthesis and evaluation of arm and hand motions for the animation of avatars communicating in Sign Languages (SL). Currently, the production and dissemination of SL messages often depend on video recordings, which lack depth information and for which editing and analysis are complex issues. Signing avatars constitute a powerful alternative to video. They are generally animated using either procedural or data-driven techniques. Procedural animation often results in robotic and unrealistic motions, but any sign can be precisely produced. With data-driven animation, the avatar's motions are realistic, but the variety of the signs that can be synthesized is limited and/or biased by the initial database. As we considered the acceptance of the avatar to be a prime issue, we selected the data-driven approach but, to address its main limitation, we propose to use annotated motions present in an SL Motion Capture database to synthesize novel SL signs and utterances absent from this initial database. To achieve this goal, our first contribution is the design, recording and perceptual evaluation of a French Sign Language (LSF) Motion Capture database composed of signs and utterances performed by deaf LSF teachers. Our second contribution is the development of automatic annotation techniques for different tracks, based on the analysis of the kinematic properties of specific joints and on existing machine learning algorithms. Our last contribution is the implementation of different motion synthesis techniques based on motion retrieval per phonological component and on the modular reconstruction of new SL content, with the additional use of motion generation techniques such as inverse kinematics, parameterized to comply with the properties of real motions.
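The abstract above names inverse kinematics among the motion generation techniques combined with motion retrieval. As a rough, generic illustration of that building block only, not Naert's parameterized solver, the following sketch implements cyclic coordinate descent (CCD) IK for a planar joint chain; the function names, chain layout, and constants are assumptions of ours.

```python
# Minimal sketch of cyclic coordinate descent inverse kinematics for a
# planar joint chain. Illustrative only; not the thesis's implementation.
import numpy as np

def ccd_ik(joint_angles, bone_lengths, target, iterations=20, tol=1e-3):
    """Adjust joint angles so the chain's end effector reaches a 2D target."""
    angles = np.array(joint_angles, dtype=float)

    def forward(angles):
        # Forward kinematics: accumulate angles and bone offsets from the root.
        positions = [np.zeros(2)]
        total = 0.0
        for a, l in zip(angles, bone_lengths):
            total += a
            positions.append(positions[-1] + l * np.array([np.cos(total), np.sin(total)]))
        return positions

    for _ in range(iterations):
        if np.linalg.norm(forward(angles)[-1] - target) < tol:
            break
        # Sweep from the last joint to the root, rotating each joint so the
        # end effector swings toward the target.
        for i in reversed(range(len(angles))):
            positions = forward(angles)
            to_end = positions[-1] - positions[i]
            to_target = target - positions[i]
            delta = np.arctan2(to_target[1], to_target[0]) - np.arctan2(to_end[1], to_end[0])
            angles[i] += delta
    return angles

# Example: a three-bone arm reaching for a point.
print(ccd_ik([0.3, 0.2, 0.1], [1.0, 1.0, 0.5], np.array([1.2, 1.4])))
```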
Trutoiu, Laura. "Perceptually Valid Dynamics for Smiles and Blinks." Research Showcase @ CMU, 2014. http://repository.cmu.edu/dissertations/428.
Chang, Pai-chun. "Data-Driven Water Flow Animation in Oriental Paintings." Thesis, National Taiwan University of Science and Technology, 2014. http://ndltd.ncl.edu.tw/handle/23327560520382105673.
This work designs a data-driven system that takes an oriental painting as input, analyzes the structure, placement density, and ink density of its strokes, generates a smooth flow field based on the extracted flow pattern, and then places and animates strokes chosen from the extracted set according to the flow field in a virtual scene. The main contribution lies in integrating a data-driven method, which extracts the flow pattern and stroke style from an oriental painting, with physical simulation to create an oriental painting flow animation. The physical simulation uses the Navier-Stokes equations to create a static flow field and wave equations to simulate the disturbance between dynamic objects and the steady water flow. The strokes are constructed and animated using the strokes extracted from the painting according to the flow field.
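The last stage described above, placing and animating strokes according to the extracted flow field, amounts to advecting stroke anchor points through a 2D velocity field. The sketch below is a generic illustration of that step, not code from the thesis: the analytic vortex field, the bilinear sampler, and the explicit Euler integrator are all assumptions of ours.

```python
# Minimal sketch: advecting brush-stroke anchors through a 2D flow field.
import numpy as np

def sample_flow(field, pos):
    """Bilinearly sample an (H, W, 2) velocity field at a continuous (x, y) position."""
    h, w, _ = field.shape
    x = np.clip(pos[0], 0.0, w - 1.001)
    y = np.clip(pos[1], 0.0, h - 1.001)
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * field[y0, x0] + fx * field[y0, x0 + 1]
    bottom = (1 - fx) * field[y0 + 1, x0] + fx * field[y0 + 1, x0 + 1]
    return (1 - fy) * top + fy * bottom

def advect_strokes(strokes, field, dt=0.5, steps=200):
    """Move each stroke anchor through the flow field with explicit Euler steps."""
    path = np.array(strokes, dtype=float)
    history = [path.copy()]
    for _ in range(steps):
        path = path + dt * np.array([sample_flow(field, p) for p in path])
        history.append(path.copy())
    return history

# Example: a toy circular flow around the centre of a 64x64 grid.
H = W = 64
ys, xs = np.mgrid[0:H, 0:W]
field = np.dstack([-(ys - H / 2), xs - W / 2]) * 0.02  # (vx, vy) per cell
trajectories = advect_strokes([[10.0, 32.0], [50.0, 20.0]], field)
```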
Abson, Karl, and Ian J. Palmer. "Motion capture: capturing interaction between human and animal." 2015. http://hdl.handle.net/10454/9106.
We introduce a new "marker-based" model for use in capturing equine movement. This model is informed by a sound biomechanical study of the animal and can be deployed for many purposes. Unlike many other approaches, our method provides a high level of automation and hides the intricate biomechanical knowledge required to produce realistic results. As a result, solved data can be acquired with minimal manual intervention, even in real-time conditions. The approach can be replicated to produce models for many other animals. The model is first informed by the veterinary world through studies of the subject's anatomy. Second, further medical studies aimed at understanding and addressing surface processes, such as skin sliding, inform model creation; if not corrected, these processes may hinder marker-based capture. The resultant model has been tested in feasibility studies for practicality and subject acceptance during production. Data is provided for scrutiny, along with the subject digitally captured through a variety of methods. The digital subject in mesh form, as well as the motion capture model, aids in comparison and shows the level of accuracy achieved. The video reference and digital renders provide an insight into the level of realism achieved.
Chaudhry, E., S. J. Bian, Hassan Ugail, X. Jin, L. H. You, and J. J. Zhang. "Dynamic skin deformation using finite difference solutions for character animation." 2014. http://hdl.handle.net/10454/8163.
We present a new skin deformation method for creating dynamic skin deformations. The core elements of our approach are a dynamic deformation model, an efficient data-driven finite difference solution, and a curve-based representation of 3D models. We first reconstruct skin deformation models at different poses from photos taken of a male human arm movement to obtain real deformed skin shapes. Then, we extract curves from these reconstructed skin deformation models. A new dynamic deformation model is proposed to describe the physics of dynamic curve deformations, and its finite difference solution is developed to determine the shape changes of the extracted curves. In order to improve the visual realism of skin deformations, we employ data-driven methods and introduce skin shapes at the initial and final poses into our proposed dynamic deformation model. Experimental examples and comparisons made in this paper indicate that our proposed dynamic skin deformation technique can create realistic deformed skin shapes efficiently with a small data size.
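The paper's central numerical ingredient is a finite difference solution of a dynamic model governing curve deformation. As a loose, generic illustration of that kind of solver, not the authors' model, whose equation, coefficients, and boundary conditions are not given here, the sketch below integrates a damped one-dimensional wave-type equation along a sampled curve; every constant and boundary choice is an assumption.

```python
# Minimal sketch: explicit finite-difference integration of a damped 1D
# wave-type equation along a sampled curve. Illustrative only.
import numpy as np

def curve_dynamics_step(u, u_prev, dt=0.01, ds=0.1, c=1.0, gamma=0.5):
    """Advance the displacement u of each curve sample by one explicit time step."""
    # Second spatial derivative along the curve (central differences).
    u_ss = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / ds ** 2
    u_t = (u - u_prev) / dt  # backward-difference velocity for the damping term
    # Damped wave equation: u_tt = c^2 * u_ss - gamma * u_t
    u_next = 2.0 * u - u_prev + dt ** 2 * (c ** 2 * u_ss - gamma * u_t)
    u_next[0] = u_next[-1] = 0.0  # pin curve endpoints (assumed boundary condition)
    return u_next, u

# Example: release a curve of 50 samples from a bumped initial shape.
u = np.sin(np.linspace(0.0, np.pi, 50)) * 0.2
u_prev = u.copy()
for _ in range(500):
    u, u_prev = curve_dynamics_step(u, u_prev)
```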
Books on the topic "Data-driven animations"
Deng, Zhigang, and Ulrich Neumann, eds. Data-Driven 3D Facial Animation. London: Springer London, 2007. http://dx.doi.org/10.1007/978-1-84628-907-1.
Deng, Zhigang, and Ulrich Neumann, eds. Data-Driven 3D Facial Animation. Springer, 2007.
Book chapters on the topic "Data-driven animations"
Courty, Nicolas, and Thomas Corpetti. "Data-Driven Animation of Crowds." In Computer Vision/Computer Graphics Collaboration Techniques, 377–88. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-71457-6_34.
Komura, Taku, Ikhsanul Habibie, Jonathan Schwarz, and Daniel Holden. "Data-Driven Character Animation Synthesis." In Handbook of Human Motion, 1–29. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-30808-1_10-1.
Jörg, Sophie. "Data-Driven Hand Animation Synthesis." In Handbook of Human Motion, 1–13. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-30808-1_13-1.
Komura, Taku, Ikhsanul Habibie, Jonathan Schwarz, and Daniel Holden. "Data-Driven Character Animation Synthesis." In Handbook of Human Motion, 2003–31. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-14418-4_10.
Jörg, Sophie. "Data-Driven Hand Animation Synthesis." In Handbook of Human Motion, 2079–91. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-14418-4_13.
Liang, Yuan, Song-Hai Zhang, and Ralph Robert Martin. "Automatic Data-Driven Room Design Generation." In Next Generation Computer Animation Techniques, 133–48. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-69487-0_10.
Sourina, Olga, Alexei Sourin, and Vladimir Kulish. "EEG Data Driven Animation and Its Application." In Computer Vision/Computer Graphics Collaboration Techniques, 380–88. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01811-4_34.
Gamage, Vihanga, Cathy Ennis, and Robert Ross. "Latent Dynamics for Artefact-Free Character Animation via Data-Driven Reinforcement Learning." In Lecture Notes in Computer Science, 675–87. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-86380-7_55.
Vogt, David, Steve Grehl, Erik Berger, Heni Ben Amor, and Bernhard Jung. "A Data-Driven Method for Real-Time Character Animation in Human-Agent Interaction." In Intelligent Virtual Agents, 463–76. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-09767-1_57.
Conference papers on the topic "Data-driven animations"
Ge, Tong, Bongshin Lee, and Yunhai Wang. "CAST: Authoring Data-Driven Chart Animations." In CHI '21: CHI Conference on Human Factors in Computing Systems. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3411764.3445452.
White, Ryan, Keenan Crane, and D. A. Forsyth. "Data driven cloth animation." In ACM SIGGRAPH 2007 sketches. New York, New York, USA: ACM Press, 2007. http://dx.doi.org/10.1145/1278780.1278825.
Lee, Jehee. "Introduction to data-driven animation." In ACM SIGGRAPH ASIA 2010 Courses. New York, New York, USA: ACM Press, 2010. http://dx.doi.org/10.1145/1900520.1900524.
Grover, Divyanshu, and Parag Chaudhuri. "Data-driven 2D effects animation." In Proceedings of the Tenth Indian Conference on Computer Vision, Graphics and Image Processing (ICVGIP '16). New York, New York, USA: ACM Press, 2016. http://dx.doi.org/10.1145/3009977.3010000.
Yu, Hongchuan, Taku Komura, and Jian J. Zhang. "Data-driven animation technology (D2AT)." In SA '17: SIGGRAPH Asia 2017. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3154457.3154458.
Yang, Xin, Wanchao Su, Jian Deng, and Zhigeng Pan. "Real traffic data-driven animation simulation." In VRCAI '15: International Conference on Virtual Reality Continuum and Its Applications in Industry. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2817675.2817683.
Zhang, Xinyi, and Michiel van de Panne. "Data-driven autocompletion for keyframe animation." In MIG '18: Motion, Interaction and Games. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3274247.3274502.
Brandt, Sascha, Matthias Fischer, Maria Gerges, Claudius Jähn, and Jan Berssenbrügge. "Automatic Derivation of Geometric Properties of Components From 3D Polygon Models." In ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/detc2017-67528.
Li, Xi, Jun Yu, Fei Gao, and Jian Zhang. "Data-driven facial animation via hypergraph learning." In 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2016. http://dx.doi.org/10.1109/smc.2016.7844280.
Hamer, Henning, Juergen Gall, Raquel Urtasun, and Luc Van Gool. "Data-driven animation of hand-object interactions." In Gesture Recognition (FG 2011). IEEE, 2011. http://dx.doi.org/10.1109/fg.2011.5771426.