Dissertations / Theses on the topic 'Realistic'

Consult the top 50 dissertations / theses for your research on the topic 'Realistic.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Jay, C. T. "Realistic fictionalism." Thesis, University College London (University of London), 2012. http://discovery.ucl.ac.uk/1356103/.

Full text
Abstract:
This thesis, Realistic Fictionalism, argues for two main claims: first, that there is no conceptual or logical incoherence in the idea of a fictionalist theory of some discourse which accommodates a form of realism about that discourse (a claim which has been made in passing by various people, but which has never been adequately explored and assessed); and second, that just such a fictionalist theory promises to be the best theory of our ordinary moral commitments, judgements and deliberation. In Part I, I explore the spirit of fictionalism and argue that thinking of fictionalism as closely tied to an analogy between its target discourse and fiction is liable to be misleading and is not mandatory. It emerges that the fictionalist’s strategy requires just a semantic thesis (representationalism) and a thesis about the sort of ‘acceptance’ appropriate for some practice involving their target discourse (nondoxasticism). I offer a theory of what ‘acceptance’ is, which treats belief as a mode of acceptance and distinguishes the nondoxastic modes of acceptance from belief in a principled and independently plausible way. And I argue that the coherence of realistic fictionalism is preserved by the fact that a person (the realistic fictionalist) can perfectly coherently both believe and nondoxastically accept the same claims. In Part II, I employ the theory of acceptance developed in Part I to propose a fictionalist model of how our ordinary moral commitments often are and generally ought to be. I then argue that, in respect of the relation between moral commitment and action-guiding at least, it would be better if our moral commitments were to be nondoxastic. Finally, I argue that realistic fictionalism offers a better way of explaining why we ought to have any moral commitments at all than a non-realist fictionalist theory could.
APA, Harvard, Vancouver, ISO, and other styles
2

Otis, Roger W. "Realistic emergence /." Online version of thesis, 1990. http://hdl.handle.net/1850/10919.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Harrison, Stephen James Roger. "Realistic image synthesis." Thesis, University of Cambridge, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.315934.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Jaber, Mona. "Realistic 5G backhaul." Thesis, University of Surrey, 2017. http://epubs.surrey.ac.uk/842117/.

Full text
Abstract:
The hype surrounding 5G mobile networks is well justified in view of the explosive increase in mobile traffic and the inclusion of massive “non-human” users that form the internet of things. Advanced radio features such as network densification, cloud radio access networks (C-RAN), and untapped frequency bands jointly succeed in increasing the radio capacity to accommodate the increasing traffic demand. However, a new challenge has arisen: the backhaul (BH), the transport network that connects radio cells to the core network. The BH needs to expand in a timely fashion to reach the fast-spreading small cells. Moreover, realistic BH solutions are unable to deliver the unprecedented 5G performance requirements to every cell. This research therefore addresses the gap between the BH characteristics stipulated for 5G and the available BH capabilities. At the same time, heterogeneity is a leading trait of 5G networks. First, the RAN is heterogeneous since it comprises different cell types, radio access technologies, and architectures. Second, the BH is composed of a mix of different wired and wireless technologies with different limitations. In addition, 5G users have a broader range of capabilities and requirements than in any incumbent mobile network. We exploit this trait and develop a novel scheme, termed User-Centric-BH (UCB). UCB targets the user association mechanism, which is traditionally blind to users’ needs and BH conditions. UCB builds on the existing concept of cell range extension (CRE) and proposes multiple offset factors (CREO), each of which reflects the cell's joint RAN and BH capability with respect to a defined attribute (e.g., throughput, latency, resilience, etc.). In parallel, users assign different weights to different attributes and can hence make a user-centric decision. The proposed scheme significantly outperforms the state-of-the-art and unlocks the BH bottleneck by availing existing but misused resources to users in need.
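As an illustrative aside (not the thesis's CREO formulation), the user-centric association idea can be sketched as each cell advertising per-attribute offsets that reflect its joint RAN/backhaul capability, with a user picking the cell that maximises a weighted score; the field names and the additive scoring form below are assumptions made for this sketch.

```python
def choose_cell(user_weights, cells):
    """Pick the best cell for one user under a user-centric association rule.

    user_weights : dict mapping attribute name (e.g. 'throughput', 'latency')
                   to the importance this user assigns to it
    cells        : list of dicts with 'signal_dbm' and a per-attribute
                   'offsets_db' dict reflecting joint RAN/backhaul capability
    """
    def score(cell):
        # Baseline radio signal plus user-weighted capability offsets,
        # mimicking cell range extension with multiple offset factors.
        return cell['signal_dbm'] + sum(
            user_weights.get(attr, 0.0) * offset
            for attr, offset in cell['offsets_db'].items())

    return max(range(len(cells)), key=lambda i: score(cells[i]))
```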
APA, Harvard, Vancouver, ISO, and other styles
5

Brown, Steven G. "Realistic Virtue Ethics." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1339517161.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Lau, Wing Hung. "Realistic 3D image composition." Thesis, University of Cambridge, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239174.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Crause, Justin. "Fast, Realistic Terrain Synthesis." Thesis, University of Cape Town, 2015. http://pubs.cs.uct.ac.za/archive/00001052/.

Full text
Abstract:
The authoring of realistic terrain models is necessary to generate immersive virtual environments for computer games and film visual effects. However, creating these landscapes is difficult – it usually involves an artist spending many hours sculpting a model in a 3D design program. Specialised terrain generation programs, such as Bryce (2013) and Terragen (2013), exist to rapidly create artificial terrains. These make use of complex algorithms to pseudo-randomly generate the terrains, which can then be exported into a 3D editing program for fine tuning. Height-maps are a 2D data structure that stores elevation values and can be used to represent terrain data. They are also a common format used with terrain generation and editing systems. Height-maps share the same storage design as image files; as such, they can be viewed like any picture, and image transformation algorithms can be applied to them. Early techniques for generating terrains include fractal generation and physical simulation. These methods proved difficult to use, as the algorithms are manipulated through a set of parameters whose effect on the outcome is not known in advance, which results in the user changing values over several iterations to produce their desired terrain. An improved technique, known as texture-based terrain synthesis, brings in a higher degree of user control as well as improved realism. This borrows techniques from texture synthesis, which is the process of algorithmically generating a larger image from a smaller sample image. Texture-based terrain synthesis makes use of real-world terrain data to produce highly realistic landscapes, which improves upon previous techniques. Recent work in texture-based synthesis has focused on improving both the realism and user control, through the use of sketching interfaces. We present a patch-based terrain synthesis system that utilises a user sketch to control the location of desired terrain features, such as ridges and valleys. Digital Elevation Models (DEMs) of real landscapes are used as exemplars, from which candidate patches of data are extracted and matched against the user’s sketch. The best candidates are merged seamlessly into the final terrain. Because real landscapes are used, the resulting terrain appears highly realistic. Our research contributes a new version of this approach that employs multiple input terrains and acceleration using a modern Graphics Processing Unit (GPU). The use of multiple inputs increases the candidate pool of patches, and thus the system is capable of producing more varied terrains. This addresses the limitation where supplying the wrong type of input terrain would fail to synthesise anything useful, for example supplying the system with a mountainous DEM and expecting deep valleys in the output. We developed a hybrid multithreaded CPU and GPU implementation that achieves a 45-fold speedup.
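As a hedged illustration of the patch-matching step described above (a generic sum-of-squared-differences matcher, not the thesis's implementation; all names are hypothetical):

```python
import numpy as np

def best_matching_patch(sketch_patch, exemplar_dems, patch_size):
    """Return the exemplar patch whose shape best matches a user-sketched patch.

    sketch_patch  : 2D array (patch_size x patch_size) of sketched heights
    exemplar_dems : list of 2D arrays, real-world DEM height-maps
    """
    best_cost, best_patch = np.inf, None
    target = sketch_patch - sketch_patch.mean()
    step = max(1, patch_size // 2)
    for dem in exemplar_dems:
        rows, cols = dem.shape
        # Slide a window over each DEM and score every candidate patch.
        for r in range(0, rows - patch_size + 1, step):
            for c in range(0, cols - patch_size + 1, step):
                candidate = dem[r:r + patch_size, c:c + patch_size]
                # Mean-removed SSD, so the match is driven by shape
                # (ridges and valleys) rather than absolute altitude.
                cost = np.sum((candidate - candidate.mean() - target) ** 2)
                if cost < best_cost:
                    best_cost, best_patch = cost, candidate
    return best_patch
```

The best candidates found this way would then be blended into the output height-map, with the GPU used to evaluate many candidate costs in parallel.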
APA, Harvard, Vancouver, ISO, and other styles
8

Lu, Zhaoying. "Perceptually realistic flower generation." Thesis, University of Bath, 2001. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.393800.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Ghosh, Abhijeet. "Realistic materials and illumination environments." Thesis, University of British Columbia, 2007. http://hdl.handle.net/2429/31311.

Full text
Abstract:
Throughout its history, the field of computer graphics has been striving towards increased realism. This goal has traditionally been described by the notion of photo-realism, and more recently and in many cases the more ambitious goal of perceptual realism. Photo-realistic image synthesis involves many algorithms describing the phenomena of light transport in a scene as well as its interaction with various materials. On the other hand, research in perceptual realism typically involves various tone mapping algorithms for display devices as well as algorithms that mimic the natural response of the human visual system in order to recreate the visual experience of a real scene. An important aspect of realistic rendering is the accurate modeling of the scene elements such as light sources and material reflectance properties. This dissertation proposes a set of new techniques for efficient acquisition of material properties as well as new algorithms for high quality rendering with acquired data. Here, we are mostly concerned with the acquisition and rendering of local illumination effects. In particular, we propose a new optical setup for efficient acquisition of the bidirectional reflectance distribution function (BRDF) with basis illumination and various Monte Carlo strategies for efficient sampling of direct illumination. The dissertation also looks into the display end of the image synthesis pipeline and proposes algorithms for displaying scenes on high dynamic range (HDR) displays for visual realism, and for tying the room illumination with the viewing environment for a sense of presence and immersion in a virtual environment. Here, we develop real-time rendering algorithms for driving the HDR displays as well as for active control of room illumination based on dynamic scene content. Thus, we propose contributions to the acquisition, rendering, and display end of the image synthesis pipeline while targeting real-time rendering applications, as well as high quality off-line rendering with realistic materials and illumination environments.
APA, Harvard, Vancouver, ISO, and other styles
10

Loderer, Armin. "Erwartungsmanagement durch realistic service previews." München FGM-Verl, 2005. http://deposit.ddb.de/cgi-bin/dokserv?id=2658490&prov=M&dok_var=1&dok_ext=htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Eustace, Natalie Margaret. "Biological Realistic Education Technology (BRET)." Thesis, University of Canterbury. HIT Lab NZ, 2014. http://hdl.handle.net/10092/9242.

Full text
Abstract:
The aim of this project was to develop and evaluate an interactive Augmented Reality interface for teaching children aged 8 to 15 about biological systems present in the human body. The interface was designed as one component of a “human body scanner” exhibit, which is to be featured at the ScienceAlive! Science Centre. In the exhibit, the interface allows visualization of and interaction with the body systems while being moved across a human male mannequin named BRET. Prior research has shown that Augmented Reality, visualization applications, and games are viable methods to teach biology to university-aged users, and Augmented Reality and interactive systems have been used with children learning biology as well. BRET went through three iteration phases. In the first phase, prototypes were evaluated by ScienceAlive! and designs and interactions were implemented, while the use of Augmented Reality through a transparent display was rejected. Iteration two included integration of the non-transparent touch display screen and an observational evaluation with six children from 9 to 15 years old; this evaluation resulted in design and interaction changes. Iteration three was the last iteration, in which final interface and interaction modifications were made and research was conducted with 48 children aged 8 to 15. This was to determine whether learning, fun, and retention rates were higher for children who interacted with BRET versus those who watched video clips or read text. Each child used one learning method to learn the three different body systems: skeletal, circulatory, and digestive. The results of the final evaluation showed that overall there was no significant difference in the children’s rating of fun or the amount of information they retained between the different learning methods. There was a positive significant difference between some of the expected fun scores and the actual fun scores. It was also found that learning with text was higher than with the interactive condition, but there were no differences between learning with video and interaction, or with text and video.
APA, Harvard, Vancouver, ISO, and other styles
12

Mustapha, Faridah. "Realistic modelling of interspecific interactions." Thesis, University of Strathclyde, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.248284.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Westaby, Stephen. "Towards a realistic artificial heart." Thesis, University of Strathclyde, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.248952.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Xie, Liguang. "Realistic Motion Estimation Using Accelerometers." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/43368.

Full text
Abstract:
A challenging goal for both the game industry and the computer graphics research community is the generation of 3D virtual avatars that automatically perform realistic human motions at high speed and low monetary cost. So far, full-body motion estimation at the complexity of human movement remains an important open problem. We propose a realistic motion estimation framework to control the animation of 3D avatars. Instead of relying on a motion capture device as the control signal, we use low-cost and ubiquitously available 3D accelerometer sensors. The framework is developed in a data-driven fashion and includes two phases: model learning from an existing high-quality motion database, and motion synthesis from the control signal. In the model learning phase, we built a reduced-complexity, high-quality motion model learned from a large motion capture database. Then, by taking the 3D accelerometer sensor signal as input, we were able to synthesize high-quality motion from the motion model we learned. In this thesis, we present two different techniques for model learning and motion synthesis, respectively. Linear and nonlinear dimensionality reduction techniques are applied to search for a proper low-dimensional representation of the motion data. Two motion synthesis methods, interpolation and optimization, are compared using 3D acceleration signals with high noise. We evaluate the result visually against the real video and quantitatively against the ground-truth motion. The system performs well, which makes it suitable for a wide range of interactive applications, such as character control in 3D virtual environments and occupational training.
APA, Harvard, Vancouver, ISO, and other styles
15

Yu, Kaimin. "Towards Realistic Facial Expression Recognition." Thesis, The University of Sydney, 2013. http://hdl.handle.net/2123/9459.

Full text
Abstract:
Automatic facial expression recognition has attracted significant attention over the past decades. Although substantial progress has been achieved for certain scenarios (such as frontal faces in strictly controlled laboratory settings), accurate recognition of facial expression in realistic environments remains unsolved for the most part. The main objective of this thesis is to investigate facial expression recognition in unconstrained environments. As one major problem faced by the literature is the lack of realistic training and testing data, this thesis presents a web search based framework to collect realistic facial expression dataset from the Web. By adopting an active learning based method to remove noisy images from text based image search results, the proposed approach minimizes the human efforts during the dataset construction and maximizes the scalability for future research. Various novel facial expression features are then proposed to address the challenges imposed by the newly collected dataset. Finally, a spectral embedding based feature fusion framework is presented to combine the proposed facial expression features to form a more descriptive representation. This thesis also systematically investigates how the number of frames of a facial expression sequence can affect the performance of facial expression recognition algorithms, since facial expression sequences may be captured under different frame rates in realistic scenarios. A facial expression keyframe selection method is proposed based on keypoint based frame representation. Comprehensive experiments have been performed to demonstrate the effectiveness of the presented methods.
APA, Harvard, Vancouver, ISO, and other styles
16

Wang, Rongyu. "Strategic choices in realistic settings." Thesis, University of Edinburgh, 2016. http://hdl.handle.net/1842/22052.

Full text
Abstract:
In this thesis, we study Bayesian games with two players and two actions (2 by 2 games) in realistic settings where private information is correlated or players have scarcity of attention. The contribution of this thesis is to shed further light on strategic interactions in realistic settings. Chapter 1 gives an introduction to the research and contributions of this thesis. In Chapter 2, we study how the correlation of private information affects rational agents’ choices in a symmetric game of strategic substitutes. The game we study is a static 2 by 2 entry game. Private information is assumed to be jointly normally distributed. The game can, for some parameter values, be solved by a cutoff strategy: that is, enter if the private payoff shock is above some cutoff value and do not enter otherwise. Chapter 2 shows that there is a restriction on the value of the correlation coefficient such that the game can be solved by the use of cutoff strategies. In this strategic-substitutes game, there are two possibilities. When the game can be solved by cutoff strategies, either the game exhibits a unique (symmetric) equilibrium for any value of the correlation coefficient, or there is a threshold value for the correlation coefficient such that there is a unique (symmetric) equilibrium if the correlation coefficient is below the threshold, while if the correlation coefficient is above the threshold value, there are three equilibria: a symmetric equilibrium and two asymmetric equilibria. To understand how parameter changes affect players’ equilibrium behaviour, a comparative statics analysis of the symmetric equilibrium is conducted. It is found that increasing monopoly profit or duopoly profit encourages players to enter the market, while increasing information correlation or jointly increasing the variances of players’ prior distribution will make players more likely to choose entry if the equilibrium cutoff strategies are below the unconditional mean, and less likely to choose entry if the current equilibrium cutoff strategies are above the unconditional mean. In Chapter 3, we study a 2 by 2 entry game of strategic complements in which players’ private information is correlated. As in Chapter 2, the game is symmetric and private information is modelled by a joint normal distribution. We use a cutoff strategy as defined in Chapter 2 to solve the game. Given other parameters, there exists a critical value of the correlation coefficient. For correlation coefficients below this critical value, cutoff strategies cannot be used to solve the game. We explore the number of equilibria and comparative static properties of the solution with respect to the correlation coefficient and the variance of the prior distribution. As the correlation coefficient changes from the lowest feasible value (such that cutoff strategies are applicable) to one, the number of equilibria changes from 3 to 2 to 1, or from 3 to 1. Alternatively, under some parameter specifications, the game exhibits a unique equilibrium for all feasible values of the correlation coefficient. The comparative statics of equilibrium strategies depend on the sign of the equilibrium cutoff strategies and the equilibrium’s stability. We provide a necessary and sufficient condition for the existence of a unique equilibrium. This necessary and sufficient condition nests the sufficient condition for uniqueness given by Morris and Shin (2005).
Finally, if the correlation coefficient is negative for the strategic-complements games or positive for the strategic-substitutes games, there exists a critical value of the variance such that, for a variance below this threshold, the game cannot be solved in cutoff strategies. This implies that Harsanyi’s (1973) purification rationale, which supposes that the perturbed games are solved by cutoff strategies and that the uncertainty of the perturbed games vanishes as the variances of the perturbation-error distribution converge to zero, cannot be applied to a strategic-substitutes (strategic-complements) game with dependent perturbation errors that follow a joint normal distribution if the correlation coefficient is positive (negative). However, if the correlation coefficient is positive for the strategic-complements games or negative for the strategic-substitutes games, the purification rationale is still applicable even with dependent perturbation errors. There are Bayesian games that converge to the underlying complete information game as the perturbation errors degenerate to zero, and every pure-strategy Bayesian Nash equilibrium of the perturbed games will converge to the corresponding Nash equilibrium of the complete information game in the limit. In Chapter 4, we study how scarcity of attention affects strategic choice behaviour in a 2 by 2 incomplete-information strategic-substitutes entry game. Scarcity of attention is a common psychological characteristic (Kahneman 1973) and is modelled here by the rational inattention approach introduced by Sims (1998). In our game, players acquire information about their own private payoff shocks (which here follow a high-low binary distribution) at a cost. We find that, given the opponent’s strategy, as the unit cost of information acquisition increases, a player’s best response will switch from acquiring information to simply comparing the ex-ante expected payoff of each action (using the player’s prior). By studying symmetric Bayesian games, we find that scarcity of attention can generate multiple equilibria in games that ordinarily have a unique equilibrium. These multiple equilibria are generated by the information cost. In any Bayesian game where there are multiple equilibria, there always exists one pair of asymmetric equilibria in which at least one player plays the game without acquiring information. The number of equilibria differs with the value of the unit information cost: there can be 1, 5 or 3 equilibria. Increasing the unit information cost could encourage or discourage a player from choosing entry, depending on whether the prior probability of a high payoff shock is greater or less than some threshold value. We compare the rational inattention Bayesian game with a Bayesian quantal response equilibrium game in which the observation errors are additive and follow a Type I extreme value distribution. A necessary and sufficient condition is established such that both the rational inattention Bayesian game and the quantal response game have a common equilibrium.
APA, Harvard, Vancouver, ISO, and other styles
17

Marsden, Timothy. "Designing a realistic virtual bumblebee." Digital WPI, 2016. https://digitalcommons.wpi.edu/etd-theses/1304.

Full text
Abstract:
Optimal Foraging Theory is a set of mathematical models used in the field of behavioral ecology to predict how animals should weigh foraging costs and benefits in order to maximize their food intake. One popular model, referred to as the Optimal Diet Model (ODM), focuses on how individuals should respond to variation in food quality in order to optimize food selection. The main prediction of the ODM is that low quality food items should only be accepted when higher quality items are encountered below a predicted threshold. Yet, many empirical studies have found that animals still include low quality items in their diet above such thresholds, indicating a sub-optimal foraging strategy. Here, we test the hypothesis that such ‘partial preferences’ are produced as a consequence of incomplete information on prey distributions resulting from memory limitations. To test this hypothesis, we used agent-based modeling in NetLogo to create a model of flower choice behavior in a virtual bumblebee forager (SimBee). We program virtual bee foragers with an adaptive decision-making algorithm based on the classic ODM, which we have modified to include memory. Our results show that the probability of correctly rejecting a low quality food item increases with memory size, suggesting that memory limitations play a significant role in driving partial preferences. We discuss the implications of this finding and further applications of our SimBee model in research and educational contexts.
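For context, the acceptance threshold referred to in this abstract has a standard two-prey form in the Optimal Diet Model (textbook formulation, not taken from the thesis): a rate-maximising forager should accept the lower-quality prey type 2 only when

\[
\frac{e_2}{h_2} \;\ge\; \frac{\lambda_1 e_1}{1 + \lambda_1 h_1},
\]

where \(e_i\) is the energy gained from prey type \(i\), \(h_i\) its handling time, and \(\lambda_1\) the encounter rate with the high-quality type. Above the implied encounter-rate threshold the low-quality item should always be rejected, which is exactly the prediction that observed partial preferences deviate from.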
APA, Harvard, Vancouver, ISO, and other styles
18

Öhrn, Kristina. "Different Mapping Techniques for Realistic Surfaces." Thesis, University of Gävle, Department of Mathematics, Natural and Computer Sciences, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-612.

Full text
Abstract:

The different mapping techniques that are used increase the detail on surfaces without increasing the number of polygons. Image-based sculpting tools in the programs Modo and Z-Brush are used to create folds and wrinkles from photographs of actual fabrics instead of trying to create these shapes by modeling them. This method makes it easier to achieve photorealistic renderings and to produce fabric dynamics that are as realistic as possible when the maps are applied to objects.

APA, Harvard, Vancouver, ISO, and other styles
19

Rugarn, Jonatan. "Rapid Development of Realistic UAV Simulations." Thesis, Linköping University, Department of Computer and Information Science, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-17099.

Full text
Abstract:

Instrument Control Sweden (ICS) is a software company that develops NATO STANAG 4586-compatible ground station software for control of unmanned systems such as unmanned aerial vehicles (UAVs). To perform testing and demonstration of the ground station software, ICS needs a realistic UAV simulator that implements the STANAG 4586 protocol. This thesis studies which methods are best suited for the rapid development of such a simulator.

One goal of the project was to examine which existing flight simulator systems and flight dynamics models can be used to rapidly develop a UAV simulator. Another goal was to design and implement such a simulator. It is found that it is possible to quickly develop a UAV simulator based on existing projects such as the flight simulator FlightGear, the simulation framework OpenEaagles and the flight dynamics model (FDM) JSBSim.

The design of the simulator is modular, object-oriented and features real-time design techniques. The main application is a simulation of a Vehicle Specific Module, which implements the STANAG 4586 protocol. Another module based on the OpenEaagles framework simulates the aircraft and its subsystems. A third module consists of the JSBSim FDM and simulates the flight dynamics and movements of the aircraft under the forces and moments affecting it.

APA, Harvard, Vancouver, ISO, and other styles
20

Ludwigsson, Jonas. "Creating realistic hair in Autodesk Maya." Thesis, Högskolan i Gävle, Avdelningen för Industriell utveckling, IT och Samhällsbyggnad, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-14411.

Full text
Abstract:
This thesis work focuses on how to create realistic-looking hair using only the vanilla version of Autodesk Maya. It describes two approaches: the widely used polygon-stripe-based technique and Maya's built-in nHair. It also evaluates these two approaches in terms of ease of implementation, production speed and quality of final results. The conclusion is that nHair has the potential to produce realistic-looking hair but contains various bugs and is not optimized at the current stage, while the polygon-stripe-based approach is robust and flexible but the realism of its rendering results is heavily dependent upon the skill level of artists.
APA, Harvard, Vancouver, ISO, and other styles
21

Vettickal, Thomas V. "Sarvodaya of Mahatma Gandhi, realistic utopia." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/nq35355.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Green, Stuart Antony. "Multiprocessor systems for realistic image synthesis." Thesis, University of Bristol, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.329883.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Sanders, Jet G. "Face perception and hyper-realistic masks." Thesis, University of York, 2018. http://etheses.whiterose.ac.uk/22393/.

Full text
Abstract:
Previous research has shown that deliberate disguise impairs human and automatic face recognition, with consequences for person identification in criminal situations. Common forms of deliberate disguise (e.g. balaclavas or hoodies) are easy to detect. When such disguises are used, viewers can distinguish between an unmasked individual - whose identity they knowingly can observe from facial appearance - and a masked individual - whose identity they knowingly cannot. Hyper-realistic silicone masks change this. Their recent use in criminal settings suggests that they effectively disguise identity and are difficult to detect. In this thesis, I first show that viewers are strikingly poor at distinguishing hyper-realistic masks from real faces under live and photographic test conditions, and are worse in other-race conditions. I also show large individual differences in discriminating realistic masks from real faces (5%-100% accuracy), and use an image analysis to isolate the information that high performers use for effective categorisation. The analysis reveals an informative region directly below the eyes, which is used by high performers but not low performers. These findings point to selection and training as routes to improved mask detection. Second, I examine the reliability of estimates made of the person beneath the mask. Demographic profiling and social character estimates are poor, and the results show that recognition rates were only just above chance, even for familiar viewers. This analysis highlights a systematic bias in these estimates: the demographics, traits and social characteristics of the mask were attributed to the wearer. This bias has theoretical and applied consequences. First, it supports the automaticity with which viewers use a face to judge a person, even when they know the face is not that of the person. Second, it suggests that predictions of the person underneath the mask, by familiar and unfamiliar viewers alike, should be treated with great caution.
APA, Harvard, Vancouver, ISO, and other styles
24

Levene, Jonathan (Jonathan Steven) 1974. "A framework for non-realistic projections." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/49647.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1998.
Includes bibliographical references (leaves 46-48).
by Jonathan Levene.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
25

Neugebauer, R., P. Klimant, and M. Witt. "Realistic Machine Simulation with Virtual Reality." Universitätsbibliothek Chemnitz, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-151993.

Full text
Abstract:
Today, highly complex components are manufactured on NC-controlled machine tools. The NC programs controlling these machines are usually generated automatically by CAM software, and this automatic processing is often erroneous. The VR-based realistic machine simulation presented in this paper extends the usual content of a machine simulation, such as material removal and collision detection, with various new aspects. The coupling of a real NC unit allows the recognition and elimination of all process-related as well as controller-caused errors. The integration of multi-body simulation enables the consideration of inertia, machine rigidity and milling-cutter deflection.
APA, Harvard, Vancouver, ISO, and other styles
26

Pargana, Julio Balsa. "Realistic modelling of tension fabric structures." Thesis, Imperial College London, 2004. http://hdl.handle.net/10044/1/51483.

Full text
Abstract:
An accurate and reliable analysis capability for Tensioned Fabric Structures (TFS) constructed from plain-weave PTFE-coated glass-fiber fabric has been developed in this work. This analysis facility has in turn enabled an investigation of the behaviour and design of TFS to be conducted. The investigation has revealed deficiencies in current design procedures, which may ultimately result in compromised designs, as these are founded on unrealistic, overly simplistic design assumptions. The analysis facility developed is based on finite element analysis and utilizes newly developed, purpose-built elements and a specially developed material model for the fabric component of structures. The new elements and material model for fabric have been integrated into ADAPTIC (Izzuddin, 1991), an advanced nonlinear structural analysis program, allowing real structures to be analysed. The analysis facility developed enables analytical structural models to converge closer to physical reality than allowed by existing analysis facilities, which is attributed to the robust assumptions upon which the new analysis facility is based. As an example of such a robust assumption, fabric patterns are taken to be flat in their unstressed state, a realistic assumption ignored by typical current analysis capabilities for TFS. The accuracy and reliability of the developed finite elements and material model for the fabric are demonstrated to be high through a number of verification examples and through comparison of experimental test data for the material response against the modelled response. Confidence in the analysis facility is thus guaranteed. The work also contains a theoretical deliberation on the current design procedure, the integrated design procedure and a new design procedure called the combined design procedure. The combined design procedure, so called as it is based on both the current design procedure and the integrated design procedure, offers the prospect of considerable improvements to the overall design process for TFS.
APA, Harvard, Vancouver, ISO, and other styles
27

Long, Harry. "Procedurally generated realistic virtual rural worlds." Master's thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/20874.

Full text
Abstract:
Manually creating virtual rural worlds is often a difficult and lengthy task for artists, as plant species selection, plant distributions and water networks must be deduced such that they realistically reflect the environment being modelled. As virtual worlds grow in size and complexity, climates vary on the terrain itself and a single ecosystem is no longer sufficient to realistically model all vegetation. Consequentially, the task is only becoming more difficult for these artists. Procedural methods are extensively used in computer graphics to partially or fully automate some tasks and take some of the burden off the user. Input parameters for these procedural algorithms are often unintuitive, however, and their impact on the final results, unclear. This thesis proposes, implements, and evaluates an approach to procedurally generate vegetation and water networks for realistic virtual rural worlds. Rather than placing these to reflect the environment being modelled, the work-flow is mirrored and the user models the environment directly by specifying the resources available. These intuitive input parameters are subsequently used to configure procedural algorithms and determine suitable vegetation, plant distributions and water networks. By design, the placeable plant species are configurable so any type of environment can be modelled at various levels of detail. The system has been tested by creating three ecosystems with little effort on the part of the user.
APA, Harvard, Vancouver, ISO, and other styles
28

Radak, Jovan. "Algorithms for Realistic Wireless Sensor Networks." Thesis, Lille 1, 2011. http://www.theses.fr/2011LIL10079/document.

Full text
Abstract:
Wireless sensor networks can be defined as networks of small, spatially distributed devices, called sensor nodes, which work cooperatively - exchanging messages wirelessly - on the same application. Today these kinds of networks are widely used in environmental monitoring, industrial and consumer applications, and for military purposes. In this thesis we tackle different areas of research in wireless sensor networks: topology control, mobility, neighborhood discovery and large-scale experimentation. We use relative neighborhood graph reduction along with power supply data obtained from the sensor node to develop a topology control algorithm. This algorithm maintains connectivity of the network in critical situations when some of the sensors drain their batteries. Neighborhood discovery parameters are used to deduce the relative mobility of the sensor nodes. These parameters are then adapted together with the transmission range to obtain an energy-efficient neighborhood discovery algorithm. Large-scale experimentation sites are a valuable tool for developing and testing algorithms for wireless sensor networks, but they also have various deficiencies, the biggest of which is cost. We present emulation of large-scale networks as a solution: it uses small networks with specific placements of the sensor nodes that allow the behavior of large-scale networks to be replicated and thus emulated. The algorithms are tested and evaluated on the WSNet simulator and in practice using the SensLab platform and WSN430 sensor nodes.
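As a hedged sketch of the graph reduction mentioned above (a generic relative neighborhood graph filter, not the thesis code), each link is kept only if no third node is closer to both endpoints than they are to each other:

```python
import math

def relative_neighborhood_graph(nodes):
    """nodes: dict mapping node id -> (x, y) position.
    Returns the set of undirected RNG edges as frozensets {u, v}."""
    def dist(a, b):
        return math.dist(nodes[a], nodes[b])

    edges = set()
    ids = list(nodes)
    for i, u in enumerate(ids):
        for v in ids[i + 1:]:
            d_uv = dist(u, v)
            # Drop (u, v) if some witness w lies in the "lune":
            # strictly closer to both u and v than they are to each other.
            blocked = any(max(dist(u, w), dist(v, w)) < d_uv
                          for w in ids if w not in (u, v))
            if not blocked:
                edges.add(frozenset((u, v)))
    return edges
```

In a topology control setting, per-node power-supply data could then be used to further prune or re-weight these links, along the lines the abstract describes.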
APA, Harvard, Vancouver, ISO, and other styles
29

Ozkok, Ozlem. "A realistic model of network survivability." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2003. http://library.nps.navy.mil/uhtbin/hyperion-image/03sep%5FOzkok.pdf.

Full text
Abstract:
Thesis (M.S. in Information Technology Management and M.S. in Computer Science)--Naval Postgraduate School, September 2003.
Thesis advisor(s): Geoffrey Xie, Alex Bordetsky. Includes bibliographical references (p. 47-48). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
30

Gustafsson, Emil. "Synthetic Generation of Realistic Network Traffic." Thesis, Linköpings universitet, Databas och informationsteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-165285.

Full text
Abstract:
The industry shows a clear need for synthetically generated realistic network traffic. As a possible solution, this thesis proposes a method for generating such data in an automatic and controllable manner. The thesis first examines the characteristics of real network traffic and analyzes the lengths of ON/OFF periods. The theory that network traffic exhibits self-similarity and high variability is once again tested and confirmed, and thereby also the fact that the ON/OFF periods of real network traffic come from a heavy-tailed distribution. Thereafter, the thesis proposes a way to simulate user interaction with real-world applications by using a UI testing framework called WinAppDriver. This tool is then used to synthetically generate network traffic, whose characteristics are analyzed and compared to those of real network traffic. The results show that the generated network traffic is indeed statistically similar to real network traffic. Finally, everything is combined by setting up a whole network of virtual machines with simulated users.
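A minimal sketch of the kind of heavy-tailed ON/OFF source discussed above, assuming Pareto-distributed period lengths (the shape parameters are illustrative, not values from the thesis):

```python
import random

def pareto_onoff_periods(n_periods, alpha_on=1.4, alpha_off=1.2, x_min=1.0):
    """Generate alternating ON/OFF durations drawn from Pareto distributions.

    Shape parameters alpha < 2 give infinite variance, the heavy-tailed
    behaviour associated with self-similar aggregate traffic.
    """
    periods = []
    for i in range(n_periods):
        alpha = alpha_on if i % 2 == 0 else alpha_off
        # random.paretovariate(alpha) draws samples with minimum value 1.0
        periods.append(x_min * random.paretovariate(alpha))
    return periods
```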
APA, Harvard, Vancouver, ISO, and other styles
31

Macdonald, L. W. "Realistic visualisation of cultural heritage objects." Thesis, University College London (University of London), 2015. http://discovery.ucl.ac.uk/1471969/.

Full text
Abstract:
This research investigation used digital photography in a hemispherical dome, enabling a set of 64 photographic images of an object to be captured in perfect pixel register, with each image illuminated from a different direction. This representation turns out to be much richer than a single 2D image, because it contains information at each point about both the 3D shape of the surface (gradient and local curvature) and the directionality of reflectance (gloss and specularity). Thereby it enables not only interactive visualisation through viewer software, giving the illusion of 3D, but also the reconstruction of an actual 3D surface and highly realistic rendering of a wide range of materials. The following seven outcomes of the research are claimed as novel and therefore as representing contributions to knowledge in the field: (1) a method for determining the geometry of an illumination dome; (2) an adaptive method for finding surface normals by bounded regression; (3) generating 3D surfaces from photometric stereo; (4) the relationship between surface normals and specular angles; (5) modelling surface specularity by a modified Lorentzian function; (6) determining the optimal wavelengths of colour laser scanners; (7) characterising colour devices by synthetic reflectance spectra.
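The surface-normal and photometric-stereo outcomes build on the standard photometric-stereo formulation; a minimal least-squares sketch is given below (illustrative only, not the author's adaptive bounded-regression method):

```python
import numpy as np

def photometric_stereo_normals(intensities, light_dirs):
    """Recover per-pixel surface normals and albedo from many lit images.

    intensities : array (k, h, w)  pixel values under k light directions
    light_dirs  : array (k, 3)     unit vectors towards each light source
    Assumes a Lambertian surface, so I = L @ (albedo * n) at each pixel.
    """
    k, h, w = intensities.shape
    I = intensities.reshape(k, -1)                        # (k, h*w)
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)    # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)                # unit normals
    return normals.reshape(3, h, w), albedo.reshape(h, w)
```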
APA, Harvard, Vancouver, ISO, and other styles
32

Normando, Paulo Garcia. "Spatial interference alignment under realistic scenarios." reponame:Repositório Institucional da UFC, 2013. http://www.repositorio.ufc.br/handle/riufc/11051.

Full text
Abstract:
NORMANDO, P. G. Spatial interference alignment under realistic scenarios. 2013. 65 f. Dissertação (Mestrado em Engenharia de Teleinformática) – Centro de Tecnologia, Universidade Federal do Ceará, Fortaleza, 2013.
Due to the rapid growth and the aggressive throughput requirements of current wireless networks, such as 4th Generation (4G) cellular systems, interference has become an issue that can no longer be neglected. In this context, Interference Alignment (IA) arises as a promising technique that enables interference-free transmissions with high spectral efficiency. However, while recent works have focused mainly on the theoretical gains that the technique could provide, this dissertation aims to go a step further and clarify some of the practical issues in implementing this technique in a cellular network, as well as to compare it to other well-established techniques. As an initial evaluation scenario, a 3-cell network was considered, for which several realistic factors were taken into account in order to perform different analyses. The first analysis was based on channel imperfections, for which the results showed that IA is more robust than Block Diagonalization (BD) with regard to Channel State Information (CSI) errors, but both are similarly affected by correlation among transmit antennas. The impact of uncoordinated interference was also evaluated by modeling this interference with different covariance matrices in order to mimic several scenarios. The results showed that modifications to the IA algorithms can boost their performance, with an advantage for the approach that suppresses one stream when the Bit Error Rate (BER) is compared. To combine both factors, temporal channel variations were taken into account. In this set of simulations, besides the presence of an external interference, the precoders were calculated using delayed CSI, leading to results that corroborate the previous analyses. A recurring theme across the analyses considered here was the dilemma of whether to apply the Joint Processing (JP)-based algorithms in order to achieve higher sum capacities or to send the information through a more reliable link by using IA. A reasonable step towards resolving this dilemma is to actually perform the packet transmissions, which was accomplished by employing a system-level simulator composed of a large number of Transmission Points (TPs). As a result, all analyses conducted with this simulator showed that the IA technique can provide an intermediate performance between the non-cooperation and full-cooperation schemes. In conclusion, one of the main contributions of this work has been to show some scenarios/cases where the IA technique can be applied. For instance, when the CSI is not reliable it can be better to use IA than a JP-based scheme. Also, the modifications to the algorithms that take the external interference into account can boost their performance. Finally, the IA technique sits in between conventional transmissions and Coordinated Multi-Point (CoMP): IA achieves an intermediate performance while requiring a certain degree of cooperation among neighboring sectors, but demanding less infrastructure than the JP-based schemes.
APA, Harvard, Vancouver, ISO, and other styles
33

Maho, Thibault. "Neural networks security under realistic scenario." Electronic Thesis or Diss., Université de Rennes (2023-....), 2023. http://www.theses.fr/2023URENS121.

Full text
Abstract:
Artificial Intelligence is a hot topic today, driven by the revolution of neural networks, which have shown impressive performance across various tasks. Notably, in Computer Vision, they have even outperformed humans. This thesis centers on neural networks applied to image classification tasks. Yet this remarkable success is not without its vulnerabilities. Neural networks exhibit weaknesses in terms of the confidentiality, integrity, and availability of their components. The training data, the model, and the inference data are susceptible to potential attacks. Even in the realistic scenario considered in this thesis, where the model operates in a black-box setup with limitations on the number of queries, it remains possible for an attacker to steal and reconstruct the model and training data, as well as manipulate inference data. This thesis places a particular emphasis on safeguarding the confidentiality of the model, which can be compromised through techniques such as model extraction and parameter extraction. Additionally, it delves into the realm of adversarial examples, which pose threats to the integrity of model inference: the deliberate introduction of small, well-crafted perturbations can result in misclassifications. Consequently, a significant portion of this thesis is dedicated to exploring the origins of adversarial examples, their creation, and strategies for defending against them.
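For illustration only, a query-limited black-box evasion attack can be sketched as the naive random-search baseline below; this is a generic example under assumed conditions, not the attack or defence studied in the thesis:

```python
import numpy as np

def random_search_attack(predict_label, x, true_label, eps=0.05, max_queries=1000):
    """Try to find an adversarial example with only label-query access.

    predict_label : callable returning the model's predicted class for an input
    x             : clean input as a numpy array with values in [0, 1]
    Draws small random perturbations and returns the first one that flips
    the predicted label, stopping once the query budget is exhausted.
    """
    for _ in range(max_queries):
        delta = np.random.uniform(-eps, eps, size=x.shape)
        candidate = np.clip(x + delta, 0.0, 1.0)
        if predict_label(candidate) != true_label:
            return candidate          # adversarial example found
    return None                       # budget exhausted without success
```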
APA, Harvard, Vancouver, ISO, and other styles
34

Pafčo, Tomáš. "Knihovna pro generování realistických modelů stromů." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2009. http://www.nusl.cz/ntk/nusl-236726.

Full text
Abstract:
The goal of this thesis was to propose algorithms for the procedural generation of realistic three-dimensional tree models and to implement them as a library. This library takes a set of 92 mostly numerical parameters as input and can export the generated model to a 3DS or OBJ file. It is an object-oriented library written in C++ and designed mainly for the MS Windows platform. The proposed algorithms are able to generate specific biological species of broadleaf and coniferous trees.
APA, Harvard, Vancouver, ISO, and other styles
35

Quadroni, Reto. "Realistic models of medial vestibular nuclei neurons /." [S.l.] : [s.n.], 1993. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=10255.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Pueyo, Vallet Oriol. "Realistic urban layout modeling from real data." Doctoral thesis, Universitat de Girona, 2016. http://hdl.handle.net/10803/401631.

Full text
Abstract:
The computer graphics community has a strong interest in urban modeling, and especially in the design of complex, realistic cities and buildings; the topic is relevant to the film, video game, and urban-planning industries. One challenge of this thesis is to extract information from cadastral documents and to clean and structure it in a robust way. We present a semiautomatic, robust, and generic solution to detect, process, and correct 2D maps in order to reconstruct a hierarchical structure of blocks and buildings that can be extruded into 3D models. This thesis also contributes to the video game industry by providing a tool that eases game designers' work: we propose an automatic process to generate a simplified city from a real city network. The simplified city has a smaller area, in which key city features (buildings, parks, roads, etc.) remain unaffected while unimportant features are reduced, retaining the general appearance of the original city.
La Informàtica Gràfica mostra gran interès en el modelatge urbà i especialment en el disseny realista de ciutats i edificis. Temàtica d'especial interès per a indústries com el cinema, els videojocs i l'urbanisme. Un dels reptes d'aquesta tesi és el d'extreure dades reals d'informació cadastral, netejar-les i donar-los estructura. Presentem una tècnica semiautomàtica, robusta i genèrica per detectar, processar i corregir dades cadastrals 2D obtenint una estructura jeràrquica de blocs i edificis que posteriorment pot extruir-se en un model 3D. Per altra banda, aquest treball contribueix al sector dels videojocs proporcionant una eina per facilitar la feina dels dissenyadors. Presentem un procés automàtic de generació de ciutats simplificades. Partint d'una xarxa de carrers real, en genera una d'àrea reduïda que conserva intactes les seves principals característiques i punts clau (carrers, parcs, edificis, etc.), tot reduint aquelles zones prescindibles de la ciutat original. Així manté la seva aparença i essència.
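As a toy illustration of the extrusion step mentioned in this abstract (a hypothetical sketch, not the thesis pipeline), the following turns a 2D building footprint into the wall quads of a prism and writes them to an OBJ file; roof and floor triangulation is omitted:

```python
def extrude_footprint(footprint, height):
    """Turn a 2D footprint polygon [(x, y), ...] into prism geometry:
    bottom/top vertices plus one quad face per wall (1-based OBJ indices)."""
    n = len(footprint)
    vertices = [(x, y, 0.0) for x, y in footprint] + \
               [(x, y, height) for x, y in footprint]
    faces = []
    for i in range(n):
        j = (i + 1) % n
        # Wall quad: bottom i, bottom j, top j, top i.
        faces.append((i + 1, j + 1, n + j + 1, n + i + 1))
    return vertices, faces

def write_obj(vertices, faces, path):
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for face in faces:
            f.write("f " + " ".join(str(idx) for idx in face) + "\n")

# Example: a rectangular block footprint extruded to a 12 m tall building.
verts, faces = extrude_footprint([(0, 0), (20, 0), (20, 10), (0, 10)], 12.0)
write_obj(verts, faces, "building.obj")
```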
APA, Harvard, Vancouver, ISO, and other styles
37

Turksoyu, Faith. "Realistic traffic generation capability for SAAM testbed." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA390418.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Donner, Craig Steven. "Towards realistic image synthesis of scattering materials." Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2006. http://wwwlib.umi.com/cr/ucsd/fullcit?p3226771.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2006.
Title from first page of PDF file (viewed October 11, 2006). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 116-126).
APA, Harvard, Vancouver, ISO, and other styles
39

Bwanika, Daniel. "Realistic Theory as Methods for Scientific Research." Thesis, Örebro University, School of Humanities, Education and Social Sciences, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-6327.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Subbareddy, Dheeraj Reddy. "Correct, Efficient, and Realistic Wireless Network Simulations." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/14558.

Full text
Abstract:
Simulating wireless networks accurately is a non-trivial task because of the large parameter space that affects the performance of such networks, and increasing the amount of detail in the simulation model multiplies these requirements. Hence there is a need to develop suitable abstractions that maintain the accuracy of the simulation while keeping the computational resource requirements low. The topic of wireless network simulation models is explored in this research, concentrating on the medium access control and physical layers. In recent years, a large amount of research has focused on various kinds of wireless networks to fit various application domains; Mobile Ad-Hoc Networks (MANETs), Wireless Local Area Networks (WLANs), and Sensor Networks are a few examples. The IEEE 802.11 physical layer (PHY) and medium access control (MAC) layer are the most popular wireless technologies in practice. Consequently, most implementations use the IEEE 802.11 specifications as the basis for higher-layer protocol design and analyses. In this dissertation, we explore the correctness, efficiency, and realism of wireless network simulations. We concentrate on 802.11-based wireless network simulations, although the methods and results can also be used for various other wireless network simulations. While many simulators model IEEE 802.11 wireless networks, almost all of them tend to make some abstractions to lessen the computational burden and still obtain reasonable results. A comparative study of three wireless simulators is made with respect to the correctness of their ideal behavior as well as their behavior under a high degree of load. Furthermore, the physical-layer abstraction in wireless network simulations tends to be very simplistic because of the huge computational requirements needed to accurately model the various propagation, fading, and shadowing effects. When mobility is taken into account, several other issues such as the Doppler effect should also be accounted for. This dissertation explores an empirical way to model the physical layer which cumulatively accounts for all these effects: from a network protocol designer's perspective, it is the cumulative effect of all these parameters that is of interest. Our major contribution has been the investigation of novel empirical models of the wireless physical layer which account for node mobility and other effects in an outdoor environment. These models are relatively more realistic and efficient when implemented in a simulation environment. Our simulation experiments validate the models and provide simulation results which closely match our outdoor experiments. Another significant contribution is in the understanding and design of wireless network simulation models.
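One plausible shape for such an empirical physical-layer model (a hypothetical sketch under assumed trace formats, not the dissertation's implementation) is a packet-delivery-ratio table built from measured (distance, speed, received) samples and sampled per packet during simulation:

```python
import random
from collections import defaultdict

class EmpiricalPhyModel:
    """Packet delivery model learned from (distance, speed, received) traces:
    the delivery probability is estimated per (distance bin, speed bin)."""

    def __init__(self, dist_bin=10.0, speed_bin=2.0):
        self.dist_bin = dist_bin
        self.speed_bin = speed_bin
        self.counts = defaultdict(lambda: [0, 0])   # key -> [received, total]

    def _key(self, distance, speed):
        return (int(distance // self.dist_bin), int(speed // self.speed_bin))

    def fit(self, traces):
        """traces: iterable of (distance_m, speed_mps, received_bool)."""
        for distance, speed, received in traces:
            entry = self.counts[self._key(distance, speed)]
            entry[0] += int(received)
            entry[1] += 1

    def delivery_probability(self, distance, speed):
        received, total = self.counts.get(self._key(distance, speed), (0, 0))
        return received / total if total else 0.0   # unseen bins: assume loss

    def packet_received(self, distance, speed):
        """Bernoulli draw used by the simulator for each transmitted packet."""
        return random.random() < self.delivery_probability(distance, speed)

# Tiny synthetic trace standing in for outdoor measurements.
traces = [(5, 1, True)] * 90 + [(5, 1, False)] * 10 + \
         [(45, 6, True)] * 30 + [(45, 6, False)] * 70
model = EmpiricalPhyModel()
model.fit(traces)
print(model.delivery_probability(5, 1), model.packet_received(45, 6))
```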
APA, Harvard, Vancouver, ISO, and other styles
41

Mao, Yuxiong. "Computer simulations of realistic three-dimensional microstructures." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/33954.

Full text
Abstract:
A novel and efficient methodology is developed for computer simulations of realistic two-dimensional (2D) and three-dimensional (3D) microstructures. The simulations incorporate realistic 2D and 3D complex morphologies/shapes, spatial patterns, anisotropy, volume fractions, and size distributions of the microstructural features, statistically similar to those in the corresponding real microstructures. The methodology permits simulations of sufficiently large 2D as well as 3D microstructural windows that incorporate short-range (on the order of the particle/feature size) as well as long-range (a hundred times the particle/feature size) microstructural heterogeneities and spatial patterns at high resolution. The utility of the technique has been successfully demonstrated through its application to the 2D microstructures of the constituent particles in wrought Al-alloys, the 3D microstructure of discontinuously reinforced Al-alloy (DRA) composites containing SiC particles that have complex 3D shapes/morphologies and spatial clustering, and the 3D microstructure of boron-modified Ti-6Al-4V composites containing fine TiB whiskers and coarse primary TiB particles. The simulation parameters are correlated with the materials processing parameters (such as composition, particle size ratio, extrusion ratio, extrusion temperature, etc.), which enables the simulation of rational virtual 3D microstructures for parametric studies on microstructure-property relationships. The simulated microstructures have been implemented in a 3D finite-element (FE)-based framework for simulations of micro-mechanical response and stress-strain curves. Finally, a new unbiased and assumption-free dual-scale virtual cycloids probe for estimating the surface area of 3D objects constructed from 2D serial section images is also presented.
APA, Harvard, Vancouver, ISO, and other styles
42

Trinh, Ellen Man Ngoc. "Cine-animé: adaptations of realistic lighting styles." Texas A&M University, 2005. http://hdl.handle.net/1969.1/2644.

Full text
Abstract:
Animé, a style of Japanese animation, has begun to evolve into more than simple animation. The stories found in animé have reached a level of complexity similar to traditional cinema. However, lighting in animé has been minimal. Using computers to create animé, rather than creating it traditionally by hand, has allowed greater opportunities to be creative with lighting. Color and computer-generated (CG) effects can be integrated with traditional line drawings to create beautiful images in animé. Since cinematic lighting exhibits some of the finest examples of lighting, this thesis will analyze lighting styles from three different cinematographers and adapt them to three animé-style scenes in 3D. The scenes will be modeled, lit, and rendered using Alias/Wavefront MAYA™, and textured using Adobe Photoshop™. The result will be a visual CG piece that adapts the lighting style of certain distinctive cinematographers, while retaining the look of animé.
APA, Harvard, Vancouver, ISO, and other styles
43

Saunders, Ryan L. "Terrainosaurus: realistic terrain synthesis using genetic algorithms." Texas A&M University, 2006. http://hdl.handle.net/1969.1/4892.

Full text
Abstract:
Synthetically generated terrain models are useful across a broad range of applications, including computer generated art & animation, virtual reality and gaming, and architecture. Existing algorithms for terrain generation suffer from a number of problems, especially that of being limited in the types of terrain that they can produce and of being difficult for the user to control. Typical applications of synthetic terrain have several factors in common: first, they require the generation of large regions of believable (though not necessarily physically correct) terrain features; and second, while real-time performance is often needed when visualizing the terrain, this is generally not the case when generating the terrain. In this thesis, I present a new, design-by-example method for synthesizing terrain height fields. In this approach, the user designs the layout of the terrain by sketching out simple regions using a CAD-style interface, and specifies the desired terrain characteristics of each region by providing example height fields displaying these characteristics (these height fields will typically come from real-world GIS data sources). A height field matching the user's design is generated at several levels of detail, using a genetic algorithm to blend together chunks of elevation data from the example height fields in a visually plausible manner. This method has the advantage of producing an unlimited diversity of reasonably realistic results, while requiring relatively little user effort and expertise. The guided randomization inherent in the genetic algorithm allows the algorithm to come up with novel arrangements of features, while still approximating user-specified constraints.
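To make the genetic-algorithm idea concrete, here is a heavily simplified, hypothetical sketch (not the Terrainosaurus implementation): each individual assigns one example elevation patch to each cell of the user's layout, fitness rewards matching the user's target mean elevation per cell, and the population evolves by truncation selection, one-point crossover, and mutation.

```python
import random

def make_individual(grid_size, n_patches):
    # One example-patch index per terrain cell in the user's sketched layout.
    return [random.randrange(n_patches) for _ in range(grid_size)]

def fitness(individual, patch_means, target_means):
    """Negative squared error between each cell's patch mean elevation and
    the elevation requested for that cell (higher is better)."""
    return -sum((patch_means[p] - t) ** 2 for p, t in zip(individual, target_means))

def crossover(a, b):
    cut = random.randrange(1, len(a))        # one-point crossover
    return a[:cut] + b[cut:]

def mutate(individual, n_patches, rate=0.05):
    return [random.randrange(n_patches) if random.random() < rate else g
            for g in individual]

def evolve(patch_means, target_means, pop_size=40, generations=200):
    n_patches, grid_size = len(patch_means), len(target_means)
    population = [make_individual(grid_size, n_patches) for _ in range(pop_size)]
    key = lambda ind: fitness(ind, patch_means, target_means)
    for _ in range(generations):
        population.sort(key=key, reverse=True)
        survivors = population[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)), n_patches)
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=key)

# Mean elevations (metres) of example height-field chunks, and the user's layout.
patch_means = [10, 50, 120, 300, 800]
target_means = [10, 10, 120, 300, 300, 800]
print(evolve(patch_means, target_means))
```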
APA, Harvard, Vancouver, ISO, and other styles
44

Pikunic, Jorge. "Realistic Molecular Models for Disordered Porous Carbons." NCSU, 2003. http://www.lib.ncsu.edu/theses/available/etd-08172003-203233/.

Full text
Abstract:
The complex pore morphology and topology of many non-graphitizable porous carbons is not captured by the current molecular models that are used in analysis of adsorption isotherms. We present a novel constrained reverse Monte Carlo method to build models that quantitatively match carbon-carbon pair correlation functions obtained from experimental diffraction data of real nanoporous carbons. Our approach is based on reverse Monte Carlo with carefully selected constraints on the bond angles and carbon coordination numbers to describe the three-body correlations. Through successive Monte Carlo moves, using a simulated annealing scheme, the model structure is matched to the experimental diffraction data, subject to the imposed three-body constraints. We modeled a series of saccharose-based carbons and tested the resulting models against high resolution transmission electron microscopy (TEM) data. Simulated TEM images of the resulting structural models are in very good agreement with experimental ones. For the carbons studied, the pore structure is highly convoluted, and the commonly used slit pore model is not appropriate. We simulated adsorption of nitrogen and argon at 77 K using grand canonical Monte Carlo, and diffusion of argon at 300 K using canonical molecular dynamics simulations. The isosteric heats of adsorption at 77 K are in excellent agreement with experimental results. The adsorption isotherms and heats of adsorption in these models do not resemble those for fluids in slit pores having the same pore size distribution. We found that diffusion in the structural models is non-Fickian. Instead, a strong single-file character is observed, revealed by the proportionality of the mean square displacement to the square root of time at relatively long times. The single-file mode is a consequence of the small sizes of the quasi one-dimensional pores in the adsorbent models.
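The accept/reject core of such a constrained reverse Monte Carlo scheme might look like the hedged sketch below (illustrative only: the toy g(r) histogram and the empty constraint penalty stand in for the real pair correlation function and the bond-angle/coordination constraints):

```python
import math
import random

def compute_gr(positions, bins=10, r_max=5.0):
    """Toy stand-in for the pair correlation function: a histogram of
    interatomic distances below r_max."""
    hist = [0] * bins
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            r = math.dist(positions[i], positions[j])
            if r < r_max:
                hist[int(r / r_max * bins)] += 1
    return hist

def constraint_penalty(positions):
    return 0.0   # real RMC penalizes bad bond angles and coordination numbers

def chi_squared(g_sim, g_exp):
    return sum((s - e) ** 2 for s, e in zip(g_sim, g_exp))

def reverse_monte_carlo(positions, g_exp, steps=2000, t_start=1.0, t_end=1e-3,
                        max_move=0.2):
    """Simulated-annealing RMC: random single-atom moves are accepted with
    Metropolis probability on the misfit-plus-penalty cost."""
    cost = chi_squared(compute_gr(positions), g_exp) + constraint_penalty(positions)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # exponential cooling
        i = random.randrange(len(positions))
        old = positions[i]
        positions[i] = tuple(c + random.uniform(-max_move, max_move) for c in old)
        new_cost = chi_squared(compute_gr(positions), g_exp) + constraint_penalty(positions)
        if new_cost < cost or random.random() < math.exp(-(new_cost - cost) / t):
            cost = new_cost                                  # accept the move
        else:
            positions[i] = old                               # reject: restore the atom
    return positions, cost

target = [tuple(random.uniform(0.0, 5.0) for _ in range(3)) for _ in range(20)]
g_exp = compute_gr(target)                                   # "experimental" data
start = [tuple(random.uniform(0.0, 5.0) for _ in range(3)) for _ in range(20)]
model, final_cost = reverse_monte_carlo(start, g_exp)
print(final_cost)
```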
APA, Harvard, Vancouver, ISO, and other styles
45

Nguyễn. "Learning to teach realistic mathematics in Vietnam." [S.l. : Amsterdam : s.n.] ; Universiteit van Amsterdam [Host], 2005. http://dare.uva.nl/document/18047.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Mahadevan, Priya. "Mechanisms for generating realistic annotated Internet topologies." Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2007. http://wwwlib.umi.com/cr/ucsd/fullcit?p3274514.

Full text
Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2007.
Title from first page of PDF file (viewed October 3, 2007). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 134-141).
APA, Harvard, Vancouver, ISO, and other styles
47

Ward, Jason T. "Realistic texture in simulated thermal infrared imagery /." Online version of thesis, 2008. http://hdl.handle.net/1850/7067.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Biadillah, Youssef. "Hemodynamics of an anatomically realistic human aorta." Thesis, McGill University, 2005. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=82469.

Full text
Abstract:
Cardiovascular disease (CVD) is North America's leading killer for both men and women among all racial and ethnic groups. Almost 1 million North Americans die of CVD each year, which adds up to 42% of all deaths.
Numerous investigations point out that normal blood flow (hemodynamics) is essential to good health, and many studies have found a relationship between the genesis and progression of CVD and the locally irregular blood flow occurring in diseased zones.
The study of hemodynamics in the cardiovascular system is therefore key to the understanding of CVD; its genesis and progression.
The aorta is the largest artery in the body, rising from the heart's major pumping chamber, the left ventricle. It is the primary artery of the circulatory system, delivering oxygenated blood to all other arteries except those of the lungs and is a major site for CVD.
Despite the clinical importance of the aorta, relatively little is known about its hemodynamic features due in part to the difficulty of studying blood flow in this artery.
This thesis presents a numerical analysis of the hemodynamics of a realistic 3D model of the human aorta and its arch, reconstructed from Magnetic Resonance Imaging (MRI) data.
The objective was to evaluate the effect of the flow waveform and the inlet velocity profile on the hemodynamics in the proximal, medial, and distal regions of the aorta and in its branches.
APA, Harvard, Vancouver, ISO, and other styles
49

Peng, Bo. "Energy-efficient geographic routing in realistic WSNs." Thesis, University of Leeds, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.522967.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Risser, Eric. "Grass Mapping: Realistic Real Time Grass Rendering." Honors in the Major Thesis, University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETH/id/796.

Full text
Abstract:
This item is only available in print in the UCF Libraries. If this is your Honors Thesis, you can help us make it available online for use by researchers around the world by following the instructions on the distribution consent form at http://library.ucf
Bachelors
Engineering and Computer Science
Computer Science
APA, Harvard, Vancouver, ISO, and other styles