Academic literature on the topic 'Texture-based rendering'
Journal articles on the topic "Texture-based rendering"
Sun, Yuhong, Jiatao Wang, and Lijuan Han. "Pencil drawing rendering based on example texture." Journal of Computational Methods in Sciences and Engineering 17, no. 4 (November 24, 2017): 635–44. http://dx.doi.org/10.3233/jcm-170747.
Celes, Waldemar, and Frederico Abraham. "Fast and versatile texture-based wireframe rendering." Visual Computer 27, no. 10 (August 27, 2011): 939–48. http://dx.doi.org/10.1007/s00371-011-0623-6.
Caban, J. J., and P. Rheingans. "Texture-based Transfer Functions for Direct Volume Rendering." IEEE Transactions on Visualization and Computer Graphics 14, no. 6 (November 2008): 1364–71. http://dx.doi.org/10.1109/tvcg.2008.169.
Qian, Wen Hua, Dan Xu, Kun Yue, Zheng Guan, and Yuan Yuan Pu. "Texture Deviation Mapping Based on Detail Enhancement." Advanced Engineering Forum 6-7 (September 2012): 32–37. http://dx.doi.org/10.4028/www.scientific.net/aef.6-7.32.
Tsunematsu, Yuta, Norihiko Kawai, Tomokazu Sato, and Naokazu Yokoya. "Texture Transfer Based on Energy Minimization for Painterly Rendering." Journal of Information Processing 24, no. 6 (2016): 897–907. http://dx.doi.org/10.2197/ipsjjip.24.897.
Bajaj, Chandrajit, Insung Ihm, and Sanghun Park. "Compression-Based 3D Texture Mapping for Real-Time Rendering." Graphical Models 62, no. 6 (November 2000): 391–410. http://dx.doi.org/10.1006/gmod.2000.0532.
Kniss, J., P. McCormick, A. McPherson, J. Ahrens, J. Painter, A. Keahey, and C. Hansen. "Interactive texture-based volume rendering for large data sets." IEEE Computer Graphics and Applications 21, no. 4 (2001): 52–61. http://dx.doi.org/10.1109/38.933524.
Kähler, Ralf, and Hans-Christian Hege. "Texture-based volume rendering of adaptive mesh refinement data." Visual Computer 18, no. 8 (December 1, 2002): 481–92. http://dx.doi.org/10.1007/s00371-002-0174-y.
Yang, Chao, Shui Yan Dai, Ling Da Wu, and Rong Huan Yu. "Smoothly Rendering of Large-Scale Vector Data on Virtual Globe." Applied Mechanics and Materials 631-632 (September 2014): 516–20. http://dx.doi.org/10.4028/www.scientific.net/amm.631-632.516.
Zeng, Tao, Yan Liu, and Enshan Ouyang. "Combination of oriented-plane curvature reproduction and squeeze film effect-based texture reproduction to simulate curved and textured surface." Mechanics & Industry 22 (2021): 21. http://dx.doi.org/10.1051/meca/2021024.
Dissertations / Theses on the topic "Texture-based rendering"
Kwatra, Vivek. "Example-based Rendering of Textural Phenomena." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7214.
Muddala, Suryanarayana Murthy. "Free View Rendering for 3D Video : Edge-Aided Rendering and Depth-Based Image Inpainting." Doctoral thesis, Mittuniversitetet, Avdelningen för informations- och kommunikationssystem, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-25097.
Jansson, Emil. "Matematisk generering och realtidsrendering av vegetation i Gizmo3D." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2324.
Full textTo render outdoor scenes with lots of vegetation in real time is a big challenge. This problem has important applications in the areas of visualization and simulation. Some progress has been made the last years, but a previously unsolved difficulty has been to combine high rendering quality with abundant variation in scenes.
I present a method to mathematically generate and render vegetation in real time, with implementation in the scene graph Gizmo3D. The most important quality of the method is its ability to render scenes with many unique specimens with very low aliasing.
To obtain real-time performance, a hierarchical level-of-detail scheme (LOD-scheme) is used which facilitates generation of vegetation in the desired level-of-detail on the fly. The LOD-scheme is texture-based and uses textures that are common for all specimens of a whole species. The most important contribution is that I combine this LOD-scheme with the use of semi-transparency, which makes it possible to obtain low aliasing.
Scenes with semi-transparency require correct rendering order. I solve this problem by introducing a new method for approximate depth sorting. An additional contribution is a variant of axis-aligned billboards, designated blob, which is used in the LOD-scheme. Furthermore, building blocks consisting of small branches are used to increase generation performance.
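The thesis's exact depth-sorting method is not reproduced in this abstract; as a hedged illustration of the general idea, a minimal back-to-front (painter's-algorithm) ordering of semi-transparent billboards by distance to the eye might look like the following sketch, where all names and the scene data are hypothetical:

```python
def sort_back_to_front(billboards, eye):
    """Approximate depth sort for semi-transparent billboards: farthest
    centers drawn first, so alpha blending composites correctly.
    Sorting by center distance is only approximate, as in the thesis."""
    def sq_dist(entry):
        _, (x, y, z) = entry
        return (x - eye[0]) ** 2 + (y - eye[1]) ** 2 + (z - eye[2]) ** 2
    return sorted(billboards, key=sq_dist, reverse=True)

# Hypothetical scene: (name, center) pairs.
scene = [("near_blob", (0.0, 0.0, 1.0)),
         ("far_blob", (0.0, 0.0, 10.0)),
         ("mid_blob", (0.0, 0.0, 5.0))]
order = [name for name, _ in sort_back_to_front(scene, eye=(0.0, 0.0, 0.0))]
```

Sorting by billboard centers rather than exact per-pixel depth is what makes the ordering "approximate": it is cheap and usually correct for small, well-separated blobs.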
Huff, Rafael. "Recorte volumétrico usando técnicas de interação 2D e 3D." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2006. http://hdl.handle.net/10183/7385.
Visualization of volumetric datasets is common in many fields and has been an active area of research in the past two decades. In spite of developments in volume visualization techniques, interacting with large datasets still demands research effort due to perceptual and performance issues. The support of graphics hardware for texture-based visualization allows efficient implementation of rendering techniques that can be combined with interactive sculpting tools to enable interactive inspection of 3D datasets. Many studies regarding performance optimization of sculpting tools have been reported, but very few are concerned with the interaction techniques employed.

The purpose of this work is the development of interactive, intuitive, and easy-to-use sculpting tools. Initially, a review of the main techniques for direct volume visualization and sculpting is presented, highlighting the solution that best guarantees the required interaction. Afterwards, in order to identify the most user-friendly interaction technique for volume sculpting, several interaction techniques, metaphors, and taxonomies are presented. Based on that, this work presents the development of three generic sculpting tools implemented using two interaction metaphors often used in 3D applications: the virtual pointer and the virtual hand. Interactive rates for these sculpting tools are obtained by running special fragment programs on the graphics hardware which specify regions within the volume to be discarded from rendering based on geometric predicates.

After development, the performance, precision, and user preference of the sculpting tools were evaluated to compare the interaction metaphors. The tools were then evaluated by comparing a 3D mouse against a conventional wheel mouse for guiding volume and tool manipulation. Two-handed input was also tested with both types of mouse.
The results from the evaluation experiments are presented and discussed.
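The thesis's fragment programs are not shown in the abstract; as a hedged CPU-side sketch of the predicate-based discard it describes (with every name hypothetical), the per-sample test could be modeled like this:

```python
def sphere_cut(center, radius):
    """Geometric predicate: True for sample points inside the sculpting
    sphere, i.e. the points a fragment program would discard."""
    cx, cy, cz = center
    r2 = radius * radius
    return lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2 <= r2

def sculpt(samples, predicate):
    """CPU stand-in for the per-fragment test: keep only samples that
    survive the cut (a GPU shader would issue 'discard' instead)."""
    return [p for p in samples if not predicate(p)]

# Hypothetical 4x4x4 voxel grid; carve a sphere out of one corner.
grid = [(x, y, z) for x in range(4) for y in range(4) for z in range(4)]
kept = sculpt(grid, sphere_cut(center=(0, 0, 0), radius=1.5))
```

On the GPU the same predicate runs once per fragment during texture-based rendering, which is what keeps the sculpting interactive.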
Ang, Jason. "Offset Surface Light Fields." Thesis, University of Waterloo, 2003. http://hdl.handle.net/10012/1100.
Borikar, Siddharth Rajkumar. "FAST ALGORITHMS FOR FRAGMENT BASED COMPLETION IN IMAGES OF NATURAL SCENES." Master's thesis, University of Central Florida, 2004. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4424.
M.S. in Computer Science, School of Computer Science, College of Engineering and Computer Science.
Lee, Jiunn-Shyan, and 李俊賢. "A Study of Art-Based Rendering and Example-Based Texture Synthesis." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/20511688817145069158.
National Chung Hsing University
Institute of Computer Science
Academic year 92 (2003–2004)
This dissertation introduces new algorithms for computer-generated non-photorealistic images that look like hand-made paintings. Two art-based rendering algorithms are developed: Ink Diffusion Synthesis and Impressionist Line Integral Convolution. We then explore a patch-based sampling algorithm for texture synthesis and transfer, denoted Example-Based Texture Synthesis.

Calligraphy has blossomed through a long history in the Orient and is appreciated by many people. In this thesis, we first present an interactive system capable of synthesizing realistic ink diffusion for calligraphic writing education. The system provides two frames for users. The first frame presents the outline of a Chinese character selected by the user, and a menu where the user can specify parameters such as ink density and paper style. Users imitate the action of calligraphic writing through mouse movements along the skeleton of the character. Once this has been completed, an ink diffusion effect based on the user's mouse movement is synthesized and demonstrated in the second frame. We present a physically-based model combined with fibrous paper structures to synthesize the ink diffusion effect. The experimental results show that this system adequately enlightens and entertains both skillful students and novices. In conclusion, by using our system, users, especially beginners, benefit from interactively practicing calligraphy as often as they want without feeling bored.

Next, we discuss the line integral convolution (LIC) method, which was originally developed for imaging vector fields in scientific visualization and has the potential to produce images with directional characteristics. In this study, we present four techniques that explore LIC for generating images in the style of the Impressionists. In particular, we develop an Impressionist Line Integral Convolution (ILIC) algorithm to generate images with Impressionist styles.
This algorithm takes advantage of directional information provided by a photographic image, incorporates a shading technique to blend cool and warm colors into the image, and applies the revised LIC method to imitate paintings in the Impressionist style. Furthermore, we propose a color fidelity technique, which takes advantage of the cool-to-warm scheme to imitate conventional artistic painting and enhance visual depth perception. We also present an information preservation technique, which quantifies image details to control the convolution length, thus preserving subtle information during the convolution process. Finally, we demonstrate a top-down sampling technique where a series of artistic mip-maps are generated to construct aesthetic virtual environments. These maps provide constant strokes of directional cues, achieving frame-to-frame coherence in an interactive walkthrough system. Both silhouette drawing and the tour-into-the-picture (TIP) approach are employed to enhance the user's immersion in a virtual world. The experimental results demonstrate the merits of our techniques in generating images in the Impressionist style and constructing an interactive walkthrough system that provides an immersive experience rendered in painting styles.

Last, texture synthesis has been widely studied in recent years, and patch-based sampling has proven superior in synthesis quality and computation time. However, it suffers from the problem of non-parallel textures usually captured by tilted camera projection. Here we propose a novel texture synthesis framework to tackle the problem of displacement textures. Initially, we adopt a patch-based sampling algorithm that overlaps texture patches to synthesize textures of arbitrary size with similar appearance. Most importantly, we present a hybrid method of dynamic programming and a feathering technique to make possible a consistent transition between two stitched boundaries.
Secondly, we develop a synthesis system with no user intervention during the synthesis process. Our system is amenable to synthesizing tiling textures as well as constrained textures. Thirdly, a novel framework for displacement texture synthesis is proposed. Given a tilted source, our algorithm efficiently renders an extended texture with the same slant as the input sample. In addition, our method can easily rectify a displacement image to a vertical image, and vice versa. Experimental results show that the proposed framework succeeds in synthesizing frontal non-parallel textures. Finally, we propose a novel non-iterative transfer algorithm. Given a source and a target, our algorithm efficiently renders the target image by transferring matched source patches without incurring iteration. The method takes into account two principles of target fidelity and neighbor coherence. Experimental results demonstrate that our approach presents a more visually plausible appearance and runs faster than the iterative counterpart.
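The dissertation's own stitching code is not given in this abstract; a common dynamic-programming formulation of a minimum-error boundary cut through the overlap between two patches (the standard construction in patch-based sampling; the error matrix below is purely illustrative) can be sketched as:

```python
def min_error_boundary_cut(err):
    """Dynamic-programming seam through an overlap-error matrix
    (rows = scanlines, columns = positions across the overlap).
    Returns, per row, the column where the two patches switch over."""
    rows, cols = len(err), len(err[0])
    cost = [row[:] for row in err]
    for i in range(1, rows):
        for j in range(cols):
            lo, hi = max(0, j - 1), min(cols, j + 2)
            cost[i][j] += min(cost[i - 1][lo:hi])  # best of the 3 parents
    # backtrack from the cheapest end column
    seam = [min(range(cols), key=lambda j: cost[-1][j])]
    for i in range(rows - 1, 0, -1):
        j = seam[-1]
        lo, hi = max(0, j - 1), min(cols, j + 2)
        seam.append(min(range(lo, hi), key=lambda j2: cost[i - 1][j2]))
    return list(reversed(seam))

# Illustrative overlap errors: the seam should follow the 1-entries.
err = [[9, 1, 9],
       [9, 1, 9],
       [9, 9, 1]]
seam = min_error_boundary_cut(err)
```

Blending (feathering) across the returned seam, rather than switching abruptly at it, is what hides the remaining transition.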
Wu, Shun-Liang, and 吳順良. "Rendering Complex Scenes Based on Spatial Subdivision and Texture-with-Depth." Thesis, 2000. http://ndltd.ncl.edu.tw/handle/25594323917654572643.
National Chiao Tung University
Department of Computer Science and Information Engineering
Academic year 88 (1999–2000)
In this thesis, we combine geometry-based and image-based rendering techniques to design and implement a VR navigation system whose efficiency is relatively independent of scene complexity. The system has two phases. In the preprocessing phase, the x-y plane of a 3D scene is partitioned into equal-sized hexagonal cells, called navigation cells, each associated with a larger image cell sharing the same center. Each side face of the image cell stores a cached image with depth, obtained by rendering the scene using the cell's center as the projection center and the side face as the window; the depth mesh of the cached image is obtained by triangulating the cached image using depth. In the run-time phase, the participant navigates inside a navigation cell and views an image that combines geometry-based rendering of objects inside the corresponding image cell with image-based rendering of objects outside it. Objects outside the image cell are rendered by warping and reprojecting the depth meshes with their cached texture images, while objects inside are rendered using meshes of appropriate resolution. Visibility culling is also integrated to speed up the geometry-based rendering.
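The thesis does not give its cell-lookup code; as an illustrative sketch (axial coordinates for a pointy-top hexagonal grid, a standard construction, with all names hypothetical), mapping a navigation position to its containing cell could look like:

```python
import math

SQRT3 = math.sqrt(3.0)

def cell_center(q, r, size=1.0):
    """Center of the hexagonal navigation cell at axial coords (q, r)."""
    return (size * (SQRT3 * q + SQRT3 / 2.0 * r), size * 1.5 * r)

def point_to_cell(x, y, size=1.0):
    """Axial coords of the cell containing point (x, y): invert the
    center mapping, then round fractional axial coordinates via cube
    coordinates so the nearest cell center wins."""
    q = (SQRT3 / 3.0 * x - y / 3.0) / size
    r = (2.0 / 3.0 * y) / size
    s = -q - r
    rq, rr, rs = round(q), round(r), round(s)
    dq, dr, ds = abs(rq - q), abs(rr - r), abs(rs - s)
    if dq > dr and dq > ds:
        rq = -rr - rs  # q had the largest rounding error; recompute it
    elif dr > ds:
        rr = -rq - rs  # r had the largest rounding error; recompute it
    return rq, rr

cx, cy = cell_center(2, -1)
cell = point_to_cell(cx + 0.1, cy + 0.1)  # a point slightly off-center
```

At run time such a lookup decides which image cell's cached depth meshes to warp and which objects to render geometrically.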
Hsu, Keng-Jung, and 許耕榮. "GPU Implementation for Centralized Texture Depth Depacking and Depth Image-based Rendering." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/hm9f3u.
Wang, Wei-Jhih, and 王暐智. "Compression and Rendering of Multi-Spectral Bidirectional Texture Functions Using GPU-Based Tensor Approximation." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/3527nz.
Yuan Ze University
Department of Computer Science and Engineering
Academic year 105 (2016–2017)
Multi-Spectral Bidirectional Texture Functions (MSBTFs) are designed for accurate color reproduction of complex materials in virtual scenes under arbitrary illumination. However, rendering MSBTFs at interactive rates is challenging because the datasets are huge. This thesis applies a GPU-based tensor approximation framework for compressing MSBTFs and discusses practical details of compressing and rendering them. We also present a heuristic method that chooses suitable compression parameters for better offline performance, and a novel technique for efficient rendering at runtime. Finally, we analyze the offline performance of the GPU-based framework with thorough experiments to demonstrate its efficiency and identify appropriate configurations for compressing MSBTFs.
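The thesis's GPU framework is not reproduced here; as a hedged CPU sketch of the underlying idea, a truncated higher-order SVD (Tucker) compression of a small tensor, using NumPy with entirely hypothetical names and data, might look like:

```python
import numpy as np

def hosvd_compress(T, ranks):
    """Truncated HOSVD: per-mode factor matrices from the leading left
    singular vectors of each mode unfolding, plus the projected core."""
    factors = []
    for mode, rank in enumerate(ranks):
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :rank])
    core = T
    for mode, U in enumerate(factors):
        # contract mode with U^T, then move the new (rank) axis back
        core = np.moveaxis(np.tensordot(core, U.T, axes=([mode], [1])), -1, mode)
    return core, factors

def hosvd_reconstruct(core, factors):
    """Multiply the core back by each factor matrix along its mode."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(T, U, axes=([mode], [1])), -1, mode)
    return T

# A tensor of exact multilinear rank (2, 2, 2) compresses losslessly.
rng = np.random.default_rng(0)
G = rng.standard_normal((2, 2, 2))
A, B, C = (rng.standard_normal((n, 2)) for n in (6, 7, 8))
T = np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)
core, factors = hosvd_compress(T, (2, 2, 2))
```

Storing the small core plus thin factor matrices instead of the full tensor is the compression; real BTF data is only approximately low-rank, so the chosen ranks trade fidelity for size, which is what the thesis's parameter heuristic tunes.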
Book chapters on the topic "Texture-based rendering"
Debevec, Paul, Yizhou Yu, and George Borshukov. "Efficient View-Dependent Image-Based Rendering with Projective Texture-Mapping." In Rendering Techniques ’98, 105–16. Vienna: Springer Vienna, 1998. http://dx.doi.org/10.1007/978-3-7091-6453-2_10.
Adi, Waskito, and Suziah Sulaiman. "Haptic Texture Rendering Based on Visual Texture Information: A Study to Achieve Realistic Haptic Texture Rendering." In Lecture Notes in Computer Science, 279–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-05036-7_27.
Max, Nelson, Oliver Deussen, and Brett Keating. "Hierarchical Image-Based Rendering using Texture Mapping Hardware." In Eurographics, 57–62. Vienna: Springer Vienna, 1999. http://dx.doi.org/10.1007/978-3-7091-6809-7_6.
Dumont, Reynald, Fabio Pellacini, and James A. Ferwerda. "A Perceptually-Based Texture Caching Algorithm for Hardware-Based Rendering." In Eurographics, 249–56. Vienna: Springer Vienna, 2001. http://dx.doi.org/10.1007/978-3-7091-6242-2_23.
Wang, Yubin, Meijun Sun, Zheng Wang, and Shiyao Wang. "2D Texture Library Based Fast 3D Ink Style Rendering." In Proceedings of the 2012 International Conference on Cybernetics and Informatics, 1919–29. New York, NY: Springer New York, 2013. http://dx.doi.org/10.1007/978-1-4614-3872-4_246.
Zabulis, Xenophon, Manolis I. A. Lourakis, and Stefanos S. Stefanou. "3D Pose Refinement Using Rendering and Texture-Based Matching." In Computer Vision and Graphics, 672–79. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-11331-9_80.
Hasegawa, Kyoko, Kozaburo Hachimura, and Satoshi Tanaka. "3D Fused Visualization Based on Particles-Based Rendering with Opacity Using Volume Texture." In Communications in Computer and Information Science, 160–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-45037-2_15.
LaMar, Eric, Mark A. Duchaineau, Bernd Hamann, and Kenneth I. Joy. "Multiresolution Techniques for Interactive Texture-based Rendering of Arbitrarily Oriented Cutting Planes." In Eurographics, 105–14. Vienna: Springer Vienna, 2000. http://dx.doi.org/10.1007/978-3-7091-6783-0_11.
Lee, Won-Jong, Woo-Chan Park, Jung-Woo Kim, Tack-Don Han, Sung-Bong Yang, and Francis Neelamkavil. "A Bandwidth Reduction Scheme for 3D Texture-Based Volume Rendering on Commodity Graphics Hardware." In Computational Science and Its Applications – ICCSA 2004, 741–50. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-24709-8_78.
Meyer, Joerg, Ragnar Borg, Ikuko Takanashi, Eric B. Lum, and Bernd Hamann. "Segmentation and Texture-Based Hierarchical Rendering Techniques for Large-Scale Real-Color Biomedical Image Data." In Data Visualization, 169–82. Boston, MA: Springer US, 2003. http://dx.doi.org/10.1007/978-1-4615-1177-9_12.
Full textConference papers on the topic "Texture-based rendering"
Celes, W., and F. Abraham. "Texture-Based Wireframe Rendering." In 2010 23rd SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI 2010). IEEE, 2010. http://dx.doi.org/10.1109/sibgrapi.2010.28.
Li, Jialu, Aiguo Song, and Xiaorui Zhang. "Image-based haptic texture rendering." In the 9th ACM SIGGRAPH Conference. New York, New York, USA: ACM Press, 2010. http://dx.doi.org/10.1145/1900179.1900230.
Shihao, Chen, He Guiqing, and Hao Chongyang. "Rapid Texture-based Volume Rendering." In 2009 International Conference on Environmental Science and Information Application Technology, ESIAT. IEEE, 2009. http://dx.doi.org/10.1109/esiat.2009.147.
Wenquan, Sun, Tang Liyu, Chen Chongcheng, and Chen Gang. "Terrain rendering technology based on vertex texture." In 2010 International Conference on Audio, Language and Image Processing (ICALIP). IEEE, 2010. http://dx.doi.org/10.1109/icalip.2010.5684967.
Woodford, O. J., and A. Fitzgibbon. "Fast Image-based Rendering using Hierarchical Texture Priors." In British Machine Vision Conference 2005. British Machine Vision Association, 2005. http://dx.doi.org/10.5244/c.19.38.
Ndjiki-Nya, P., M. Koppel, D. Doshkov, H. Lakshman, P. Merkle, K. Muller, and T. Wiegand. "Depth image based rendering with advanced texture synthesis." In 2010 IEEE International Conference on Multimedia and Expo (ICME). IEEE, 2010. http://dx.doi.org/10.1109/icme.2010.5583559.
Pai, Hong-Yi. "Texture designs and workflows for physically based rendering using procedural texture generation." In 2019 IEEE Eurasia Conference on IOT, Communication and Engineering (ECICE). IEEE, 2019. http://dx.doi.org/10.1109/ecice47484.2019.8942651.
Dong, Junyu, Xinghui Dong, Shanli Mou, and Bo Qin. "Flow Visualization Based on Rendering of 3D Surface Texture." In 2008 International Workshop on Geoscience and Remote Sensing (ETT and GRS). IEEE, 2008. http://dx.doi.org/10.1109/ettandgrs.2008.411.
Chen, Yixin, Wenying Qiu, Xiaohao Wang, and Min Zhang. "Tactile Rendering of Fabric Textures Based on Texture Recognition." In 2019 IEEE 2nd International Conference on Micro/Nano Sensors for AI, Healthcare, and Robotics (NSENS). IEEE, 2019. http://dx.doi.org/10.1109/nsens49395.2019.9293989.
Li, Qian, Changhui Sun, and MeiKe Wang. "Low-pass filter along ray in texture-based volume rendering." In 2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE). IEEE, 2012. http://dx.doi.org/10.1109/csae.2012.6272574.