To see the other types of publications on this topic, follow the link: Time and resources.

Dissertations / Theses on the topic 'Time and resources'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Time and resources.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Medhurst, Pamela Wendy. "Part-time study, full-time lives : stories of success from part-time undergraduate students." Thesis, University of Hull, 2008. http://hydra.hull.ac.uk/resources/hull:1748.

Full text
Abstract:
This thesis is concerned with part-time undergraduate students within the higher education system in England. In particular, it focuses on the strategies this group of students employs to complete their degrees successfully. I place the experience of successful part-time students at the heart of the thesis because I think it is vital in the twenty-first century to further our understanding of this heterogeneous group in order to have an accessible higher education system that does not, by design, discriminate against those who choose a particular mode of study. By doing this I create a collective narrative for part-time students. A small qualitative sample of part-time undergraduate students who had completed their degrees was interviewed to produce the data used herein.
APA, Harvard, Vancouver, ISO, and other styles
2

Yang, Jian. "A priori planning and real-time resources allocation /." Full text (PDF) from UMI/Dissertation Abstracts International, 2000. http://wwwlib.umi.com/cr/utexas/fullcit?p9992941.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Hodgson, Anthony Malcolm. "Time, pattern, perception : integrating systems and futures thinking." Thesis, University of Hull, 2016. http://hydra.hull.ac.uk/resources/hull:16878.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Al-Jajjoka, Sam Nooh K. "Time domain threshold crossing for signals in noise." Thesis, University of Hull, 1995. http://hydra.hull.ac.uk/resources/hull:11535.

Full text
Abstract:
This work investigates the discrimination of times between threshold crossings for deterministic periodic signals with added band-limited noise. The methods cover very low signal-to-noise ratios (one or less). Investigation has concentrated on the theory of double threshold crossings, with particular care taken over correlations in the noise and their effect on the probability of detecting double crossings. A computer program has been written to evaluate these probabilities for a wide range of signal-to-noise ratios, a wide range of signal-to-bandwidth ratios, and a range of times between crossings of up to two signal periods. Correlations due to the extreme cases of a brick-wall filter and a second-order Butterworth filter have been included; other filters can easily be added to the program. The method is simulated and demonstrated by implementation on a digital signal processor (DSP) using a TMS32020. Results from the DSP technique are in agreement with the theoretical evaluations. The probability results could be used to determine optimum time thresholds and windows for signal detection and frequency discrimination, to determine the signal length needed for adequate discrimination, and to evaluate channel capacities. The ability to treat high noise, including the exact effects of time correlations, promises new applications in electronic signal detection, communications, and pulse-discrimination neural networks.
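For readers who want a concrete feel for the quantity the thesis evaluates analytically, the following Monte Carlo sketch is purely illustrative and unrelated to the thesis's own program: it estimates the probability of seeing two upward threshold crossings of a noisy sinusoid within a two-period window, with the sample rate, frequency, threshold and SNR all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0          # sample rate in Hz (assumed for illustration)
f0 = 50.0            # signal frequency in Hz
snr = 1.0            # signal-to-noise power ratio ("one or less", per the abstract)
threshold = 0.5      # detection threshold in signal-amplitude units
n_trials = 2000
t = np.arange(0.0, 2.0 / f0, 1.0 / fs)   # observation window of two signal periods

def band_limit(x, taps=5):
    """Crude moving-average filter standing in for a band-limiting noise filter."""
    return np.convolve(x, np.ones(taps) / taps, mode="same")

hits = 0
for _ in range(n_trials):
    signal = np.sin(2 * np.pi * f0 * t)            # unit-amplitude sinusoid, power 0.5
    noise = band_limit(rng.normal(size=t.size))
    noise *= np.sqrt((0.5 / snr) / noise.var())    # rescale so the SNR equals `snr`
    y = signal + noise
    upward = (y[1:] >= threshold) & (y[:-1] < threshold)
    if np.count_nonzero(upward) >= 2:              # a "double crossing" in the window
        hits += 1

print(f"estimated P(double upward crossing) = {hits / n_trials:.3f}")
```

A filter with a genuine brick-wall or Butterworth response, and a sweep over crossing separations rather than a single window, would bring the experiment closer to the cases tabulated in the thesis.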
APA, Harvard, Vancouver, ISO, and other styles
5

Grewal, Harsh Kumar. "A metaphysics of personal identity : emotion, others and time." Thesis, University of Hull, 1999. http://hydra.hull.ac.uk/resources/hull:14031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

West, Richard. "Adaptive real-time management of communication and computation resources." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/9237.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Nesti, Mark Stephen. "Anxiety and sport : time to ask what rather than why." Thesis, University of Hull, 1999. http://hydra.hull.ac.uk/resources/hull:8055.

Full text
Abstract:
Approaches to the study of anxiety in sport have tended to rely on the use of questionnaires to assess levels of competitive anxiety. The development of the Competitive State Anxiety Inventory-2 (Martens et al., 1982) has, according to Jones (1995), led to considerable research investigating the relationship between anxiety and sport performance. Study 1 reported here utilised the CSAI-2 with an additional directional scale to examine individual differences and competitive state anxiety in sport. Results revealed that there were no significant differences (p<.05) between three achievement levels of competitive swimmers (n=89) for intensity scores; however, significant differences were found for cognitive anxiety and somatic anxiety directional scores across levels. Further, unexpected correlations between CSAI-2 intensity and directional scores for several items highlighted the importance of considering individual differences in the interpretation of anxiety symptoms. Study 2 was based on Davidson and Schwartz's (1976) Matching Hypothesis, which claims that interventions, to be effective, must be matched to the individual's dominant mode of experiencing anxiety. Female high-level skaters (n=15) were assigned to a control group (n=5), a cognitive anxiety group or a somatic anxiety group based on interview data, CSAI-2 scores, coach reports, and performance at a simulated competitive event. Results revealed that there was no support for the Matching Hypothesis, and that greater attention should be devoted to using methods that allow for a more individualised approach to understanding anxiety in sport. A diary-based methodology incorporating Watson and Tellegen's (1985) concept of mood was employed in study 3 with high-level netballers (n=8) and Super League Rugby League referees (n=8) to examine the relationships between anxiety, mood, sport and other life events over a four-week period. Results suggested that this methodology allows data to be analysed idiographically as well as on an inter-individual basis, and helps to place sport anxiety into a broader context in relation to other mood states and life events. Finally, study 4 further developed the use of the diary-based methodology by investigating the relationship between mood, anxiety and performance in international student rugby players (n=11). Whilst no clear relationship was found between anxiety, mood states and match performance scores, several interesting findings revealed that much more could be achieved by redirecting focus to what anxiety means to an individual both before and after sport performance. The findings from the diary-based studies are discussed in terms of the need to address the meaning of anxiety in sport, in part by drawing on the approach taken within existential-phenomenological psychology.
APA, Harvard, Vancouver, ISO, and other styles
8

Qiang, Fu. "Bayesian multivariate time series models for forecasting European macroeconomic series." Thesis, University of Hull, 2000. http://hydra.hull.ac.uk/resources/hull:8068.

Full text
Abstract:
Research on and debate about 'wise use' of explicitly Bayesian forecasting procedures has been widespread and often heated. This situation has come about partly in response to the dissatisfaction with the poor forecasting performance of conventional methods and partly in view of the development of computational capacity and macro-data availability. Experience with Bayesian econometric forecasting schemes is still rather limited, but it seems to be an attractive alternative to subjectively adjusted statistical models [see, for example, Phillips (1995a), Todd (1984) and West & Harrison (1989)]. It provides effective standards of forecasting performance and has demonstrated success in forecasting macroeconomic variables. Therefore, there would seem to be a case for seeking some additional insights into the important role of such methods in achieving objectives within the macroeconomics profession. The primary concerns of this study, motivated by the apparent deterioration of mainstream macroeconometric forecasts of the world economy in recent years [Wallis (1989), pp.34-43], are threefold. The first is to formalize a thorough, yet simple, methodological framework for empirical macroeconometric modelling in a Bayesian spirit. The second is to investigate whether improved forecasting accuracy is feasible within a European-based multicountry context. This is conducted with particular emphasis on the construction and implementation of Bayesian vector autoregressive (BVAR) models that incorporate both a priori and cointegration restrictions. The third is to extend the approach and apply it to the joint-modelling of system-wide interactions amongst national economies. The intention is to attempt to generate more accurate answers to a variety of practical questions about the future path towards a united Europe. The use of BVARs has advanced considerably. In particular, the value of joint-modelling with time-varying parameters and much more sophisticated prior distributions has been stressed in the econometric methodology literature. See, e.g., Doan et al. (1984), Kadiyala and Karlsson (1993, 1997), Litterman (1986a), and Phillips (1995a, 1995b). Although trade-linked multicountry macroeconomic models may not be able to clarify all the structural and finer economic characteristics of each economy, they do provide a flexible and adaptable framework for analysis of global economic issues. In this thesis, the forecasting record for the main European countries is examined using the 'post mortem' of IMF, OECD and EEC sources. The formulation, estimation and selection of BVAR forecasting models, carried out using Microfit, MicroTSP, PcGive and RATS packages, are reported. Practical applications of BVAR models especially address the issues of whether combinations of forecasts explicitly outperform the forecasts of a single model, and whether the recent failures of multicountry forecasts can be attributed to an increase in the 'internal volatility' of the world economic environment. See Artis and Holly (1992), and Barrell and Pain (1992, p.3). The research undertaken consolidates existing empirical and theoretical knowledge of BVAR modelling. It provides a unified coverage of economic forecasting applications and develops a common, effective and progressive methodology for the European economies.
The empirical results reflect that in simulated 'out-of-sample' forecasting performances, the gains in forecast accuracy from imposing prior and long-run constraints are statistically significant, especially for small estimation sample sizes and long forecast horizons.
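Purely as an illustrative aside, and not the thesis's own models, the sketch below shows the core mechanics of Bayesian shrinkage in a VAR: a two-variable VAR(1) whose coefficients receive a Minnesota-style prior centred on a random walk, with the posterior mean computed in closed form under an assumed conjugate normal prior and a hand-picked tightness parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
T, k = 120, 2                               # sample length and number of series
true_B = np.array([[0.9, 0.1],
                   [0.0, 0.8]])             # persistent bivariate data-generating process
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ true_B.T + rng.normal(scale=0.5, size=k)

X, Ylead = Y[:-1], Y[1:]                    # regressors: one lag of each series
B_prior = np.eye(k)                         # random-walk prior mean (Minnesota-style)
lam = 5.0                                   # prior tightness: larger = more shrinkage

# Posterior mean under a conjugate normal prior: (X'X + lam*I)^(-1) (X'Y + lam*B_prior)
B_post = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Ylead + lam * B_prior)
B_ols = np.linalg.solve(X.T @ X, X.T @ Ylead)

print("OLS coefficients:\n", B_ols.round(2))
print("Posterior-mean coefficients (shrunk toward a random walk):\n", B_post.round(2))
print("One-step-ahead forecast of Y[T]:", (Y[-1] @ B_post).round(3))
```

The BVAR systems described in the abstract add more lags, cross-country blocks, cointegration restrictions and full posterior simulation; the closed-form posterior mean above only illustrates how prior tightness trades off against the data.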
APA, Harvard, Vancouver, ISO, and other styles
9

Sahu, Reetik Kumar. "Multi-agent real-time decision making in water resources systems." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/120636.

Full text
Abstract:
Thesis: Ph. D. in Computational Science and Engineering, Massachusetts Institute of Technology, Department of Civil and Environmental Engineering, 2018. Cataloged from PDF version of thesis. Includes bibliographical references (pages 77-83).
Optimal utilization of natural resources such as water, wind and land over extended periods of time requires a carefully designed framework coupling decision making and a mathematical abstraction of the physical system. On one hand, the choice of the decision-strategy can set limits/bounds on the maximum benefit that can be extracted from the physical system. On the other hand the mathematical formulation of the physical system determines the limitations of such strategies when applied to real physical systems. The nuances of decision making and abstraction of the physical system are illustrated with two classical water resource problems: optimal hydropower reservoir operation and competition for a common pool groundwater source. Reservoir operation is modeled as a single agent stochastic optimal control problem where the operator (agent) negotiates a firm power contract before operations begin and adjusts the reservoir release during operations. A probabilistic analysis shows that predictive decision strategies such as stochastic dynamic programming and model predictive control give better performance than standard deterministic operating rules. Groundwater competition is modeled as a multi-agent dynamic game where each farmer (agent) aims to maximize his/her personal benefit. The game analysis shows that uncooperative competition for the resource reduces economic efficiency somewhat with respect to the cooperative socially optimum behavior. However, the efficiency reduction is relatively small compared to what might be expected from incorrect assumptions about uncertain factors such as future energy and crop prices. Spatially lumped and distributed models of the groundwater system give similar pictures of the inefficiencies that result from uncooperative behavior. The spatially distributed model also reveals the important roles of the geometry and density of the pumping well network. Overall, the game analysis provides useful insight about the factors that make cooperative groundwater management beneficial in particular situations.
by Reetik Kumar Sahu. Ph. D. in Computational Science and Engineering.
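To make the phrase 'stochastic dynamic programming' concrete, here is a toy backward-induction sketch that is not the thesis's formulation: storage is discretized, inflow is an assumed i.i.d. random variable, and the stage reward values release while penalizing spill, with every number invented for illustration.

```python
import numpy as np

storages = np.arange(0, 101, 10)          # feasible storage levels (volume units)
releases = np.arange(0, 41, 10)           # feasible releases per stage
inflows = np.array([10, 20, 30])          # possible stage inflows
p_inflow = np.array([0.3, 0.5, 0.2])      # their probabilities
S_MAX, T = 100, 12                        # reservoir capacity and horizon (e.g. months)

value = np.zeros(len(storages))           # terminal value function
policy = np.zeros((T, len(storages)))

for t in reversed(range(T)):              # backward induction over stages
    new_value = np.full(len(storages), -np.inf)
    for i, s in enumerate(storages):
        for r in releases:
            if r > s:                                     # cannot release more than stored
                continue
            exp_val = 0.0
            for q, pq in zip(inflows, p_inflow):          # expectation over inflow
                s_next = s - r + q
                spill = max(s_next - S_MAX, 0)
                j = np.argmin(np.abs(storages - min(s_next, S_MAX)))  # nearest grid point
                exp_val += pq * (r - 0.5 * spill + value[j])          # reward + future value
            if exp_val > new_value[i]:
                new_value[i], policy[t, i] = exp_val, r
    value = new_value

print("Optimal first-stage release for each storage level:")
print(dict(zip(storages.tolist(), policy[0].tolist())))
```

A hydropower formulation like the one in the thesis would replace the toy reward with head-dependent power generation, add a firm-contract decision before operations, and compare the resulting policy with model predictive control and fixed operating rules.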
APA, Harvard, Vancouver, ISO, and other styles
10

Gonzalez, Sanchez Raul. "Systemic intervention to manage complexity in Mexican SMEs to last over time." Thesis, University of Hull, 2016. http://hydra.hull.ac.uk/resources/hull:16080.

Full text
Abstract:
The purpose of this research is to develop a new methodology based upon ideas on managing complexity from the Viable System Model. The context for the research is small and medium-sized enterprises (SMEs) in Mexico. Worldwide, SMEs represent the segment of the economy that contributes the largest number of economic units and employees, both in industrialised countries and in those that are less developed. However, the astonishing rate of change today influences most human activities, including business organisations and, therefore, SMEs. Organisational complexity continues to grow as organisations are forced to address more issues and greater diversity in their operating environments. So, the current challenges imposed by modern-day complexity call for new ways of approaching management practice. The research aims to adopt systems thinking approaches applied to daily life as an ongoing process, based on a learning system which aims to increase SMEs' ability to manage complexity and last over time. The research design is based on an action research approach developing a single case study intervention, based on Yin's work, in a Mexican SME in order to provide the empirical data. To do so, this work presents a novel model (ModK+) and multi-methodology (MetK+) as a way of thinking and acting, respectively, to perform a systemic intervention, linking the philosophical, methodological and practical levels. Finally, and based on the sources of evidence, the researcher reached two main findings. First, the MetK+ facilitated the adoption of systems thinking approaches in the daily practice of organisational management: it helped managers to identify and to overcome their main challenges and it enabled them to better manage their complexity. Second, the researcher identified the positive impact of building a learning system because it helped managers to refine their learning cycle to manage complexity; however, despite having such a learning system it was clear that managers would still require further accompaniment after the systemic intervention to overcome inertia in their busy daily agenda.
APA, Harvard, Vancouver, ISO, and other styles
11

Cheah, Lam Aun. "Real-time detection of auditory : steady-state brainstem potentials evoked by auditory stimuli." Thesis, University of Hull, 2010. http://hydra.hull.ac.uk/resources/hull:3433.

Full text
Abstract:
The auditory steady-state response (ASSR) is advantageous over other hearing assessment techniques because of its capability to provide objective and frequency-specific information. The objectives are to reduce the lengthy test duration, and to improve the signal detection rate and the robustness of the detection against background noise and unwanted artefacts. Two prominent state estimation techniques, the Luenberger observer and the Kalman filter, have been used in the development of the autonomous ASSR detection scheme. Both techniques are real-time implementable, while the challenges faced in their application are the very poor SNR of ASSRs (which can be as low as −30 dB) and the unknown statistics of the noise. A dual-channel architecture is proposed: one channel estimates the sinusoid and the other estimates the background noise. Simulation and experimental studies were also conducted to evaluate the performance of the developed ASSR detection scheme and to compare the new method with conventional techniques. In general, both state estimation techniques within the detection scheme produced results comparable to the conventional techniques, but achieved significant measurement time reduction in some cases. A guide is given for the determination of the observer gains, while an adaptive algorithm has been used to adjust the gains in the Kalman filters. In order to enhance the robustness of the ASSR detection scheme with adaptive Kalman filters against possible artefacts (outliers), a multisensory data fusion approach is used to combine both standard mean and median operations in the ASSR detection algorithm. In addition, self-tuned statistical thresholding using the regression technique is applied in the autonomous ASSR detection scheme. The scheme with adaptive Kalman filters is capable of estimating the variances of the system and background noise to improve the ASSR detection rate.
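As a generic illustration of the underlying idea, and not the thesis's dual-channel scheme, the sketch below runs a textbook Kalman filter whose two-dimensional state rotates at an assumed stimulus frequency, so that its first component tracks a sinusoid buried in much stronger measurement noise; the frequency, noise levels and amplitude are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, f0, n = 1000.0, 40.0, 4000            # sample rate, response frequency, sample count
w = 2 * np.pi * f0 / fs
F = np.array([[np.cos(w), -np.sin(w)],
              [np.sin(w),  np.cos(w)]])   # state rotates by one sample per step
H = np.array([[1.0, 0.0]])                # we observe only the first state component
Q = 1e-6 * np.eye(2)                      # small process noise
R = np.array([[1.0]])                     # large measurement noise (SNR well below 1)

true_amp = 0.2
t = np.arange(n) / fs
z = true_amp * np.sin(2 * np.pi * f0 * t) + rng.normal(scale=1.0, size=n)

x, P = np.zeros(2), np.eye(2)
for k in range(n):
    x, P = F @ x, F @ P @ F.T + Q                      # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
    x = x + (K @ (z[k] - H @ x)).ravel()               # update with the new sample
    P = (np.eye(2) - K @ H) @ P

print(f"true amplitude {true_amp:.3f}, estimated amplitude {np.linalg.norm(x):.3f}")
```

The thesis's scheme differs in running a second channel for the background noise, adapting the gains, fusing mean and median statistics against outliers, and thresholding the result to declare a detection.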
APA, Harvard, Vancouver, ISO, and other styles
12

Salmerón, Cabañas Julia. ""Errant in time and space" : a reading of Leonora Carrington's major literary works." Thesis, University of Hull, 1997. http://hydra.hull.ac.uk/resources/hull:11073.

Full text
Abstract:
Part One deals with Carrington's association with the Surrealist movement and looks at her texts as dreams/nightmares. Born in 1917, Carrington arrived in Paris just before her twentieth birthday. The opening chapter deals chiefly with biographical material and creates a context for Carrington's writing within the Surrealist movement. Chapters Two and Three explore Carrington's main stories of this period, examining the stylistic devices that make them dream-texts. Part Two deals with the major crisis in Carrington's life and writing: her internment in a Spanish asylum. Chapter Four looks at the biographical events that led Carrington to be interned and suggests that her father and his associations with Imperial Chemical Industries had more to do with her internment than is commonly believed (Appendix I includes a transcript of my interview with her Spanish doctor and testifies to contacts with ICI). Chapter Five analyses the "mad" narrative "Down Below", where the repression of Carrington's "playing with language" is exposed through an impressive imagery of death. Chapter Six explores the stories written in New York immediately after release: "Cast Down By Sadness", "White Rabbits", "Waiting", "The Seventh Horse" and "As They Rode Along the Edge". The grotesque female bodies and the pervasiveness of the monstrous distinguish these stories as Carrington's chaotic, "creative" resurrection. Finally Part Three looks at Carrington's Mexican period, where her writing achieves a voice that, although resonant of previous moments, stops being tragic and becomes revolutionarily comic. Chapter Seven follows Carrington's life in Mexico, where she still lives, from 1942 to the present. Chapter Eight deals with four of her best Mexican writings: the novel The Stone Door, the play The Invention of the Mole, the short story "The Happy Corpse Story" and an unpublished letter to Remedios Varo (1958) included in Appendix II. Finally Chapter Nine deals at length with The Hearing Trumpet.
APA, Harvard, Vancouver, ISO, and other styles
13

Tan, Zhenyu. "Certification of Instrumentation Techniques for Resources Management of Real-Time Systems." Ohio University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1129146878.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Liu, Qizhi. "Assessing the dynamic status of fish resources in space and time." Thesis, University of Ottawa (Canada), 2001. http://hdl.handle.net/10393/9423.

Full text
Abstract:
The purpose of this thesis is to analyze the spatial-temporal dynamic status of Scotia-Fundy herring stocks in NAFO divisions 4WX in the Bay of Fundy region and the Scotian Shelf. This thesis uses an alternative method to examine the dynamic in-season status of fish movement in space and time over the fishing season. Two models are presented based on Bayesian uncertainty statistical theory. A working computer decision model is developed to estimate the weekly stock abundance by area. A simulation model is developed to examine the strategic plan of the fishery. A visualized computer application tool, MapTest, has been developed to implement the above two models. This application uses OLE (Object Linking and Embedding) techniques to present fishing data on a GIS (geographic information system) mapping system and to link and embed to a Visual Basic application. Using MapTest, fishery scientists can estimate stock abundance from the available catch data and simulate the impact of alternative catch scenarios. This thesis used the MapTest application to analyze the catch and estimated abundance of the five spawning areas in the 4WX herring fishery and to process three years of estimated-catch analyses. The computer application developed and illustrated here provides valuable information to assist decision makers in managing the fishery in real time on a spatial-temporal basis throughout the fishing year.
APA, Harvard, Vancouver, ISO, and other styles
15

Hesslewood, Aidan. "Reconstituting troublesome youth in Newcastle upon Tyne : theorising exclusion in the night-time economy." Thesis, University of Hull, 2009. http://hydra.hull.ac.uk/resources/hull:3468.

Full text
Abstract:
Following economic stagnation and deindustrialisation in 1970s and 1980s Britain, the shift toward neoliberalism and entrepreneurial urbanism has had profound effects on the ways in which cities are experienced by different socio-cultural groups. As many urban commentators have noted, in the pursuit of maintaining a spatial capital fix, some groups have found themselves increasingly marginalised through various image-related redevelopment processes. The working classes, the homeless and, increasingly, young people continue to be faced with a number of curtailments which restrict access and spatial freedoms. Taking Newcastle upon Tyne and its night-time economy as a case in point, this thesis examines the roles of class identity, delinquency, and exclusion in contemporary nightlife, and how current representations of troublesome youth such as the ‘chav’ are used to articulate exclusionary practices. This thesis, though, also illustrates that exclusion is ultimately driven by commercially-defined imperatives commensurate with extant urban entrepreneurialism. However, whilst it was initially speculated that the young ‘lower’ classes were excluded from city centre nightlife outright, it was actually found that the night-time economy functions through a number of channelling and redistributive processes. The ‘chav element’, whilst being rejected from many venues, is not wholly excluded from the city centre, but segregated and contained in certain locales. Pointing to a more nuanced idea of exclusion as a spatial restructuring process, this thesis suggests that urban cultural geography should pay closer attention to a temporal, fluid, and fragmentary notion of exclusion that is constantly shifting and transforming alongside other changes in production and consumption.
APA, Harvard, Vancouver, ISO, and other styles
16

Lucking, Walter. "The application of time encoded signals to automated machine condition classification using neural networks." Thesis, University of Hull, 1997. http://hydra.hull.ac.uk/resources/hull:3766.

Full text
Abstract:
This thesis considers the classification of physical states in a simplified gearbox using acoustical data and simple time domain signal shape characterisation techniques allied to a basic feedforward multi-layer perceptron neural network. A novel extension to the signal coding scheme (TES), involving the application of energy based shape descriptors, was developed. This sought specifically to improve the technique's suitability to the identification of mechanical states and was evaluated against the more traditional minima based TES descriptors. The application of learning based identification techniques offers potential advantages over more traditional programmed techniques both in terms of greater noise immunity and in the reduced requirement for highly skilled operators. The practical advantages accrued by using these networks are studied together with some of the problems associated with their use within safety critical monitoring systems. Practical trials were used as a means of developing the TES conversion mechanism and were used to evaluate the requirements of the neural networks being used to classify the data. These assessed the effects upon performance of the acquisition and digital signal processing phases as well as the subsequent training requirements of networks used for accurate condition classification. Both random data selection and more operator intensive performance based selection processes were evaluated for training. Some rudimentary studies were performed on the internal architectural configuration of the neural networks in order to quantify its influence on the classification process, specifically its effect upon fault resolution enhancement. The techniques have proved to be successful in separating several unique physical states without the necessity for complex state definitions to be identified in advance. Both the computational demands and the practical constraints arising from the use of these techniques fall within the bounds of a realisable system.
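Purely to give a flavour of this family of methods, and not the coding scheme developed in the thesis, the sketch below derives crude time-encoded descriptors (epoch durations and energies between zero crossings) from synthetic signals representing two invented machine states and trains a small feedforward multi-layer perceptron on them; the signal models, feature ranges and network size are all assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)

def tes_features(x, n_bins=8):
    """Histogram the duration and energy of the epochs between zero crossings."""
    crossings = np.flatnonzero(np.signbit(x[:-1]) != np.signbit(x[1:]))
    durations, energies = [], []
    for a, b in zip(crossings[:-1], crossings[1:]):
        durations.append(b - a)
        energies.append(float(np.sum(x[a:b] ** 2)))
    d_hist, _ = np.histogram(durations, bins=n_bins, range=(1, 200), density=True)
    e_hist, _ = np.histogram(energies, bins=n_bins, range=(0, 100), density=True)
    return np.concatenate([d_hist, e_hist])

def make_signal(state, n=4000):
    """Toy 'gearbox' signal: a base tone, plus an extra tone when a fault is present."""
    t = np.arange(n)
    base = np.sin(2 * np.pi * 0.01 * t)
    fault = 0.6 * np.sin(2 * np.pi * 0.05 * t) if state == 1 else 0.0
    return base + fault + 0.3 * rng.normal(size=n)

labels = np.array([0, 1] * 40)
X = np.array([tes_features(make_signal(s)) for s in labels])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

The thesis's minima- and energy-based TES descriptors, acoustic acquisition chain and training-set selection strategies are considerably more involved; the sketch only shows the shape-descriptor-plus-MLP pipeline.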
APA, Harvard, Vancouver, ISO, and other styles
17

Ffolliott, Peter F. "Updating Hydrologic Time-Trend Response Functions of Fire Impacts." Arizona-Nevada Academy of Science, 2001. http://hdl.handle.net/10150/296580.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Jalle, Ibarra Javier. "Improving time predictability of shared hardware resources in real-time multicore systems : emphasis on the space domain." Doctoral thesis, Universitat Politècnica de Catalunya, 2016. http://hdl.handle.net/10803/397651.

Full text
Abstract:
Critical Real-Time Embedded Systems (CRTES) follow a verification and validation process for timing and functional correctness. This process includes timing analysis that provides Worst-Case Execution Time (WCET) estimates as evidence that the execution time of the system, or parts of it, remains within the deadlines. A key design principle for CRTES is incremental qualification, whereby each software component can be subject to verification and validation independently of any other component, with obvious benefits for cost. At the timing level, this requires time composability, such that the timing behavior of a function is not affected by other functions. CRTES are experiencing unprecedented growth, with rising performance demands that have motivated the use of multicore architectures. Multicores can provide the performance required and bring the potential of integrating several software functions onto the same hardware. However, multicore contention in the access to shared hardware resources creates a dependence of the execution time of a task on the rest of the tasks running simultaneously. This dependence threatens time predictability and jeopardizes time composability. In this thesis we analyze and propose hardware solutions to be applied to current multicore designs for CRTES to improve time predictability and time composability, focusing on the on-chip bus and the memory controller. At the hardware level, we propose new bus and memory controller designs that control and mitigate contention between different cores and allow time composability by design, also in the context of mixed-criticality systems. At the analysis level, we propose contention prediction models that factor in the impact of contenders and do not need modifications to the hardware. We also propose a set of Performance Monitoring Counters (PMC) that provide evidence about the contention. We give special emphasis to the space domain, focusing on the Cobham Gaisler NGMP multicore processor, which is currently being assessed by the European Space Agency for its future missions.
APA, Harvard, Vancouver, ISO, and other styles
19

Pickering, Alastair. "A study of outcomes following head injury among children and young adults in full-time education." Thesis, University of Hull, 2007. http://hydra.hull.ac.uk/resources/hull:2178.

Full text
Abstract:
Head injuries are often claimed to account for more than one million attendances at emergency departments across the United Kingdom per year. A review of head injury epidemiology in the 1970s estimated the number of attendances at emergency departments to be between 1600 and 1700 per 100,000 of the population. With the current UK population quoted as just over 60 million, this would put the attendance rate following head injury at between 960,000 and 1,020,000 per year.
APA, Harvard, Vancouver, ISO, and other styles
20

Ow, Say Cheoh. "An investigation of alternating-direction implicit finite-difference time-domain (ADI-FDTD) method in numerical electromagnetics." Thesis, University of Hull, 2003. http://hydra.hull.ac.uk/resources/hull:5496.

Full text
Abstract:
In this thesis, the alternating-direction implicit method (ADI) is investigated in conjunction with the finite difference time-domain method (FDTD) to allow crossing of the Courant-Friedrichs-Lewy (CFL) stability criterion while maintaining stability in the FDTD algorithm. The main reason for this is to be able to use a larger numerical time step than that governed by the CFL criterion. The desired effect is a significant reduction in numerical run-times. Although the ADI-FDTD method has been used in the literature, most analysis and application have been performed on simple three-dimensional cavities. This work makes an original contribution in two aspects. Firstly, a new modified alternating-direction implicit method for a three-dimensional FDTD algorithm has been successfully developed and implemented in this research. This new method allows correct modelling of a realistic physical structure such as a microstrip patch with the ADI scheme without causing instability even when the CFL criterion is not observed. However, due to the inherent property of this modified ADI-FDTD method, a decreasing reflection coefficient is observed using this scheme. The second and more important contribution this research makes in the field of numerical electromagnetics is the development of a new method of simulating realistic complex structures such as geometries comprising copper patch antennas on a dielectric substrate. With this new method, for the first time, the ADI-FDTD algorithm remains stable while still in violation of the CFL criterion, even when complex structures are being modelled. However, there is a trade-off between accuracy and computational speed in ADI-FDTD and modified ADI-FDTD methods. The larger the numerical time step, the shorter is the simulation run-time, but an increase in numerical time step causes a degradation in accuracy of numerical results. Comparison between speed and accuracy is shown in this thesis and it has to be mentioned here that these values are very much dependent on the structure being modelled.
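For context only, the following toy sketch (not taken from the thesis) runs a standard explicit 1-D FDTD update in normalized units for two assumed Courant numbers, one just below and one just above the CFL limit, to show the stability barrier that ADI-FDTD schemes are designed to step past.

```python
import numpy as np

c, dx, nx, nt = 1.0, 1.0, 200, 300        # normalized units, toy grid and step count
for courant in (0.99, 1.01):              # time step dt = courant * dx / c
    dt = courant * dx / c
    ez = np.zeros(nx)                     # electric field (PEC ends stay zero)
    hy = np.zeros(nx - 1)                 # magnetic field on the staggered grid
    for n in range(nt):
        hy += (dt / dx) * (ez[1:] - ez[:-1])            # H update from the curl of E
        ez[1:-1] += (dt / dx) * (hy[1:] - hy[:-1])      # E update from the curl of H
        ez[nx // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source
    print(f"Courant number {courant}: max |Ez| after {nt} steps = {np.abs(ez).max():.3e}")
```

Below the limit the field stays bounded; above it the shortest-wavelength components grow without bound, which is precisely the restriction an implicit ADI update removes, at the accuracy-versus-time-step cost the abstract describes.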
APA, Harvard, Vancouver, ISO, and other styles
21

Barratt, J. R. "Assessment of normality transformations in water resources time series analysis and generation /." Title page, abstract and contents only, 1990. http://web4.library.adelaide.edu.au/theses/09ENS/09ensb269.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Schliecker, Simon [Verfasser]. "Performance analysis of multiprocessor real-time systems with shared resources / Simon Schliecker." Braunschweig : Institut für Datentechnik und Kommunikationsnetze, 2011. http://d-nb.info/1231994363/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Ben, Halima Kchaou Rania. "Cost optimization of business processes based on time constraints on cloud resources." Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAS014.

Full text
Abstract:
Motivated by the need to optimize the deployment cost of business processes, organizations outsource some of their operations to cloud computing. Cloud providers offer competitive pricing strategies (e.g., on-demand, reserved, and spot) specified based on temporal constraints to accommodate users' changing and last-minute demands. Besides, organizations' business processes are time-constrained, and any violation of these constraints could lead to serious consequences. Therefore, there is a need to formally verify that the cloud resource allocation in a business process is temporally correct. However, due to the lack of a formal definition of cloud pricing strategies, which are specified in natural language, the temporal correctness of cloud resource allocation cannot be verified in a business process management context. Furthermore, the variety of cloud resources, pricing strategies, and activity requirements does not help the business process designer easily find the optimal deployment cost of a business process. In this thesis, our objectives are to: (i) improve the support in business processes of temporal constraints on activities and cloud resources, as well as pricing strategies, and (ii) minimize the business process deployment cost. To this end, we propose a formal specification for cloud resources, pricing strategies, and activities' temporal constraints.
This specification is used to formally verify the temporal correctness of cloud resource allocation in time-aware business processes. Then, we propose two linear programming models, a binary linear program and a mixed integer program, to find the optimal deployment cost of time-aware business processes on cloud resources.
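To give a feel for the decision the abstract describes, here is a deliberately tiny brute-force sketch with invented prices, provisioning delays and deadlines: it enumerates one pricing strategy per activity and keeps the cheapest assignment that respects the activities' time windows, the same choice the thesis encodes far more scalably as binary and mixed-integer programs.

```python
from itertools import product

# (strategy name, cost per hour, provisioning delay in hours) -- illustrative values only
STRATEGIES = [("on-demand", 0.50, 0.0), ("reserved", 0.30, 0.0), ("spot", 0.10, 1.0)]

# activity name -> (duration in hours, latest allowed finish time from process start)
ACTIVITIES = {"A1": (2.0, 2.5), "A2": (4.0, 8.0), "A3": (1.0, 10.0)}

best_cost, best_plan = float("inf"), None
for choice in product(range(len(STRATEGIES)), repeat=len(ACTIVITIES)):
    cost, finish, feasible = 0.0, 0.0, True
    for (name, (duration, deadline)), idx in zip(ACTIVITIES.items(), choice):
        _, price, delay = STRATEGIES[idx]
        finish += delay + duration            # activities assumed to run sequentially
        if finish > deadline:                 # temporal constraint violated
            feasible = False
            break
        cost += price * duration
    if feasible and cost < best_cost:
        best_cost = cost
        best_plan = {a: STRATEGIES[i][0] for a, i in zip(ACTIVITIES, choice)}

print("cheapest feasible plan:", best_plan, "with cost", round(best_cost, 2))
```

A real formulation would introduce binary decision variables per activity-strategy pair, capacity and billing-period constraints, and an integer-programming solver in place of enumeration.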
APA, Harvard, Vancouver, ISO, and other styles
24

Hernandez, Krystal M. "Using Spiritual Resources to Prevent Declines in Sexuality among First-Time Parents." Bowling Green State University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1300851920.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Thibeault, Nancy. "Sinclair Curriculum eXchange (SCX) Sharing Learning Resources to Improve Part-Time Instruction." NSUWorks, 2005. http://nsuworks.nova.edu/gscis_etd/880.

Full text
Abstract:
The dissertation effort focused upon improving the quality and consistency of instruction across multiple course sections taught by full-time and part-time faculty. Sinclair Curriculum eXchange (SCX), an online repository of learning objects (LOs), was designed, implemented, and used to deliver a consistent set of teaching materials to introductory Computer Information Systems (CIS) students. Experienced CIS faculty documented successful learning activities along with instructions for using those activities in the classroom. The SCX system was used to assemble the materials for three LOs and one lesson, and then the SCX system was used to share the materials with all faculty teaching the course. The quality and consistency of instruction were measured by a faculty survey and the analysis of student quiz scores. Overall, the faculty agreed that the materials were effective, they liked the teaching approach, and the materials made it easier to teach. Student quiz scores were compared across instructors, course sections, and instructor status. Statistical analysis revealed no significant differences on three of the four quizzes or on all quizzes combined. The results of the faculty survey and analysis of student quiz scores suggest that the SCX system has the potential to increase the quality and consistency of instruction across multiple course sections. It is therefore recommended that a complete course be developed in SCX and the system be re-evaluated. Two major issues surfaced during this study. Faculty participation was problematic in the development of course materials. The fine granularity level used required the creation of a prohibitive number of files.
APA, Harvard, Vancouver, ISO, and other styles
26

Kadhom, Hana M. "Prevention of pressure sores in hospital and community with special reference to the time spent for care." Thesis, University of Hull, 1989. http://hydra.hull.ac.uk/resources/hull:3632.

Full text
Abstract:
The main purpose of this study was to evaluate the amount of time which was spent in giving preventive pressure area care in both a sample of hospital patients and a sample of community patients. A pilot study was carried out to test the methodology, which was subsequently used with only minor modifications, for the main study. Bedfast or chairfast patients were studied from admission to the selected hospital wards or community nursing areas for a period of six weeks or until they were discharged from care, developed pressure sores, died, or became mobile. Data was collected by means of interviews and observations made of patients, nurses and relatives. A diary sheet was designed for use by nurses in hospital and by nurses and relatives in the community, on which they were asked to record pressure area care as it was given. Information collected by this means included the time spent in care, the method used and observation of the skin areas. The researcher also collected data about the patient's appetite, Norton Score, age, sex and diagnosis. The outcome measure used was whether or not the patient developed a pressure sore which was defined for this study as a break in the skin due to pressure. Due to geographical dispersion of patients within the community in the health district used for that part of the study, fewer community patients (n = 30) were included in the study than the number of hospital patients studied (n = 88). Discriminant analysis was used on the results to distinguish between groups of patients. Results of this study showed that a higher percentage (29%) of the hospital patients developed pressure sores than among the community patients studied (20%). The average total time spent on pressure area care daily was higher for the community patients than for the hospital patients. Interestingly, of the six community patients who developed pressure sores, five were dependent entirely upon the nursing service for pressure area care, whilst the usual pattern at home was that relatives and nurses shared the care. Frequency of pressure area care given showed a significant relationship with outcome for both hospital and community patients. It should be noted that whilst the number of patients who developed sores is reported here, and this is related to the total number of patients studied, this study is not an incidence or a prevalence study, and should not be considered as such. The study appears to show that nursing care devoted to the prevention of pressure sores in terms of time and frequency is significantly related to outcome and thus to effectiveness.
APA, Harvard, Vancouver, ISO, and other styles
27

Lupton, Michael J. "How slow can you go? : the joint effects of action preparation and emotion on the perception of time." Thesis, University of Hull, 2017. http://hydra.hull.ac.uk/resources/hull:15695.

Full text
Abstract:
People are often found to temporally overestimate the duration of emotionally salient stimuli relative to neutral stimuli. To date there has been no investigation into the behavioural consequences of such an effect or whether such an effect can be enhanced. Experiments 1, 2 and 3 investigated whether a behavioural advantage to temporally overestimating the duration of emotive stimuli exists. Reaction time facilitation was found following the display of an emotive stimulus which was more frequently temporally overestimated than a neutral stimulus. This provides support for the notion that temporal overestimation due to threat prepares one to act. However, such effects were not found in Experiment 1. Experiments 4 and 5 used multiple experimental manipulations to induce an enhanced temporal overestimation effect. Neither experiment provided evidence that one’s perception of time can be distorted to a greater amount than has been previously demonstrated. This is explained by the operation of an internal clock, such as scalar expectancy theory (SET) (Gibbon, Church, & Meck, 1984), operating at some maximum level. Finally Experiment 6 used electroencephalography to investigate the N1P2 complex in spider phobic and non-phobic individuals. The peak amplitude of the N1P2 complex was not modulated by the spider stimuli, however, the latency of the N1 component was found to be earlier when a spider stimulus was presented. It is suggested that the reaction time facilitation reported in Experiments 2 and 3 of this thesis may not be attributable to temporal overestimation per se, but instead is the result of a general cognitive speeding effect which also leads to temporal overestimation.
APA, Harvard, Vancouver, ISO, and other styles
28

Millians, Jeffrey T. "Separation of cognitive resources within a dual task scenario." Thesis, Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/29831.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

James, Fiona. "An exploration of grounded theory with reference to self and identity of part-time, mature learners in higher education." Thesis, University of Hull, 2013. http://hydra.hull.ac.uk/resources/hull:7163.

Full text
Abstract:
This study explores ‘Constructivist’ Grounded Theory, a methodology advanced by Charmaz (2006) to serve as a redevelopment of Glaser and Strauss’ original version. The study’s focus is a group of mature learners on a part-time Higher Education programme in relation to ‘self’ and ‘identity’. Data from thirteen in-depth interviews are analysed, enabling the construction of a data-driven ‘Grounded Theorisation’. Commensurate with the methodology, no extant theoretical framework is applied to this analysis. The findings from the ‘Grounded Theorisation’ were that while participants encounter varying tensions relating to the accomplishment of their own particular goals, a unifying principle is ‘operating within constraints’. Expressions of how they manage the difficulties presented in these constraining circumstances are interlaced with particular ‘selves’. On an individual level, participants confront the various pressures they experience with ‘selves’ that coincide with their coping strategies. A self that resists a sense of being ‘channelled’ by the demands of the course may take precedence. Other presentations of self include one resigned to taking a patient stance as an explorer in an undulating journey. A further analytic concept developed concerns ‘containing’. This may involve monitoring the impact of studying upon one’s life; alternatively, ‘containing’ may pertain to a personal resolve to block out external impediments and remain on track. When the literature was consulted, the Grounded Theorisation resonated with the extant concepts: ‘identity work’ and ‘framing’ of self. ‘Identity work’ entails thinking of self as resistant, submitting only reluctantly. Further, to be ‘bloody minded’ and resist, rather than circumvent obstacles, might represent a student’s sense-making and their efforts to maintain a feeling of integrity amidst turbulence. Finally, participants’ collective commitment to clearing hurdles is glimpsed via particular constructs and shared phrases: participants ‘frame’, or ‘make sense of’ themselves and their actions with respect to navigating obstacles presented by the course.
APA, Harvard, Vancouver, ISO, and other styles
30

Avocanh, Jean-Thierry Stephen. "Resources allocation in high mobility scenarios of LTE networks." Thesis, Sorbonne Paris Cité, 2015. http://www.theses.fr/2015USPCD052/document.

Full text
Abstract:
Our thesis focuses on issues related to resource allocation in LTE networks. In particular, the purpose of this study is to design efficient scheduling algorithms to improve the QoS of real-time flows in a context of high mobility of the users. To reach this goal, the study has been carried out in two steps. At first, in order to have expert knowledge of the key facets of LTE scheduling, we conducted the study in a context where the high-mobility aspect of the nodes was not taken into account. This helped not only to critically analyze the literature but also to propose new schemes to improve the QoS of real-time applications. After that, the high-mobility parameter was added and innovative methods dealing with this context were designed. Nevertheless, due to the existing differences between the downlink and the uplink, the issue was tackled in each of the aforementioned directions. We firstly addressed the problem of improving the scheduling of downlink communications in a context where high mobility was not taken into account. Two major methods have been designed for this purpose. The first one is an innovative scheme which improves resource assignment in overbooking scenarios by making a trade-off between fairness, overall system throughput and QoS requirements. The second one is an enhanced scheduling scheme which provides strict delay bounds and guarantees a very low packet loss rate to multimedia flows. The performance of the proposed schemes has been evaluated by simulations and compared to other schemes in the literature. The analyses demonstrated their effectiveness and showed that they outperformed the existing ones. The second contribution concerned the problem of improving the scheduling of uplink communications in a context where high mobility was not taken into account. We designed a novel scheduling protocol which improves resource allocation for videotelephony flows and reduces the delay caused by dynamic scheduling. It consists in scheduling such traffic using a semi-persistent strategy associated with a provisioning process. The performance of our proposed method has been evaluated by simulations, and the results demonstrated its effectiveness by showing that it improved videotelephony flow performance and provided the best QoS support compared to dynamic scheduling.
The last contribution addressed the problem of resource allocation in high-mobility scenarios. In this part, the high-mobility aspect was taken into account in designing suitable schemes for vehicular scenarios. We proposed two efficient strategies. The first one is a technique which maintains the required level of QoS for supporting video users at high velocities. It consists in identifying, depending on the UE's velocity, the minimum CQI report rate needed to maintain the required QoS. The second proposed strategy is an opportunistic method which improves the performance of high-speed video users. With this strategy, more priority is given to the UEs having the highest velocity. Simulation results demonstrated its effectiveness and showed that it improved the QoS support of the video users having the highest velocity.
APA, Harvard, Vancouver, ISO, and other styles
31

顔尊還 and Tsuen-wan Ngan. "On the effectiveness of additional resources for on-line firm deadline scheduling." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B29710935.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Awan, Malik Shahzad K. "Performance characterization of computational resources for time-constrained job execution in P2P environments." Thesis, University of Warwick, 2013. http://wrap.warwick.ac.uk/57452/.

Full text
Abstract:
Peer-to-peer (P2P) computing, involving the participation of thousands of general purpose, public computers, has established itself as a viable paradigm for executing loosely-coupled, complex scientific applications requiring significant computational resources. The paradigm provides cheap, general-purpose computing resources with comparable computing power (FLOP/s) to an otherwise expensive supercomputer. The main characteristic of the paradigm is the volunteer participation of the general public, without any legal obligation, who dedicate their heterogeneous computational resources to advancing scientific research. The development of several middleware solutions has also furthered the application of P2P computing to solving complex scientific problems. The Berkeley Open Infrastructure for Network Computing (BOINC) is one of the most widely deployed middleware platforms in P2P systems, and has been deployed on more than 7.5 million general purpose computers for scientific computations, achieving an overall performance of 16,632.605 TeraFLOPS. ClimatePrediction.net, a large P2P project based on the BOINC middleware, involves more than 429,000 machines representing 200 different microprocessor architectures and running 21 distinct operating systems. The availability of such a large and diverse set of computational resources requires an in-depth investigation into the performance aspects of available computational resources in this dynamic P2P environment. This thesis analyses the performance data of ClimatePrediction.net, primarily collected using two benchmarks, Dhrystone and Whetstone, which form part of the BOINC middleware. The results reveal a significant variation in integer and floating-point operational performance, characterized by Dhrystone and Whetstone respectively, for similar microprocessors, operating systems and hardware configurations. Under the BOINC environment, these performance results could be useful for: i) the selection of a suitable computing platform for executing time-constrained jobs; ii) calculating an incentive unit for rewarding project participants for their volunteer participation in large P2P projects to advance scientific research; and iii) efficient and effective utilization of available computational resources. However, the inconsistency in the performance results of Dhrystone and Whetstone significantly affects their usefulness for the aforementioned three important application areas, and highlights the need for reliability and consistency of performance results for obtaining maximum benefit in an uncontrolled and dynamic P2P environment. This thesis, based on the analysis of the performance data of ClimatePrediction.net, identifies the key challenges associated with benchmarking in P2P environments. The thesis further suggests the design of a new light-weight P2P representative benchmark, based on the source code of large P2P projects. The design outline of this new light-weight P2P representative benchmark, MalikStone, is presented, and the results of MalikStone are compared with Dhrystone, Whetstone and SPEC CPU2006, showing its superiority in terms of consistency over both Dhrystone and Whetstone. For floating-point performance, MalikStone gave more representative results than Whetstone for Intel Core i5-2400, Q9400, Q6600 and Pentium D processors, with the standard deviation of repeated runs remaining less than 1 for each of the platforms.
Similarly, for integer operations, MalikStone also performed more consistently than Dhrystone, with the standard deviation of repeated runs remaining less than 1, and gave more representative results for Core i5-2400, Q9400, Q6600 and Pentium D processors. In addition to the consistency of its performance results, MalikStone captures broader performance characteristics by measuring floating-point, integer, bitwise-logic, string manipulation and programming construct operations. The performance results of MalikStone are further used for designing a new incentive unit, MalikCredit, to ensure fairness in rewarding project participants for their volunteer participation in large P2P projects to advance scientific research. MalikCredit is compared with BOINC's existing incentive unit, the Cobblestone, at three levels: 1) hourly level; 2) work-unit level; and 3) team level; with the results showing fairness in the rewards awarded using MalikCredit. This in turn is useful for retaining existing project participants and attracting new volunteers to participate in large P2P projects, thereby enhancing the application of P2P computing to solving scientific problems. A comparison of the credit values for the considered microprocessor architectures reveals that MalikCredit values are at least 2X higher than Cobblestone values before normalization, while the difference increases up to 3.3X for the fastest microprocessor once normalization is applied to the claimed Cobblestones. The performance characterization done by MalikStone is further applied to scheduling computational resources by dynamically slicing work-units, keeping in view the available computational time of the resources and the estimated execution time of the work-unit. The results of this new scheduling policy highlight its usefulness in maximizing the utilization of available computational resources when compared to BOINC's traditional scheduling policies. The results revealed that the policy improved the utilization of available computational resources by approximately 10% for the considered set of computational resources under the experimental setup considered in the case study (see Chapter 5). The findings of this thesis are envisaged to be primarily of significance to three main stakeholders: i) application developers; ii) project participants; and iii) project administrators. For application developers, the performance characterization done by MalikStone will be useful in exploiting the characteristics of underlying platforms for efficient execution, while at the same time supporting improvement efforts for future versions of the software. The results will support project participants by informing them as to the amount of RAM, swap memory and main memory consumed during execution. The fairness of the received rewards will encourage existing project participants to continue participating in the lengthy execution of large P2P projects and will motivate new volunteers to dedicate their computational resources to large P2P projects. For project administrators, the findings of this thesis will be useful in identifying a suitable processor, operating system and hardware component configuration for best-case execution. In such a case the middleware might be instructed to postpone the allocation of work until a more effective architecture becomes available.
Further, the newly proposed scheduling policy involving dynamic slicing of work-units based on the performance characterization of MalikStone could be deployed to improve the utilization of available computational resources. Finally, a few avenues of future research have been identified which, if explored, could further enhance the appeal of this dynamic and uncontrolled P2P computing paradigm for cheaply solving complex and lengthy scientific problems that otherwise require an enormous amount of financial cost as well as computational resources, even exceeding those of traditional supercomputers.
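The consistency criterion used throughout this abstract (standard deviation of repeated benchmark runs below 1) is easy to express in code. The sketch below is illustrative only; the run scores are made up and this is not MalikStone itself.

```python
# Minimal sketch: judge the consistency of repeated benchmark runs by their
# sample standard deviation, mirroring the "std dev of repeated runs < 1"
# criterion. The scores below are invented, not measurements from the thesis.
from statistics import mean, stdev

def is_consistent(scores, threshold=1.0):
    """Return (mean, sample std dev, consistent?) for a list of run scores."""
    m, s = mean(scores), stdev(scores)
    return m, s, s < threshold

if __name__ == "__main__":
    runs = [1520.4, 1520.9, 1519.8, 1521.1, 1520.2]  # hypothetical scores
    m, s, ok = is_consistent(runs)
    print(f"mean={m:.1f} stdev={s:.2f} consistent={ok}")
```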
APA, Harvard, Vancouver, ISO, and other styles
33

Dexter, H. "A system for real-time allocation of irrigation resources : Lower Jordan Valley, Israel." Thesis, University of Salford, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.372151.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Al-Shethry, Abdulaziz H. "Juvenile delinquency in Saudi Arabia : with special reference to the use of free time among delinquent youth in Riyadh City." Thesis, University of Hull, 1993. http://hydra.hull.ac.uk/resources/hull:4704.

Full text
Abstract:
This study examines the impact of free time activities and companions on juvenile delinquency in the city of Riyadh, in the context of social change in Saudi society as a whole, as new forms of leisure and recreation seem to have arisen as a result of the process of urbanization taking place in the country. The field work was conducted in 1990-91 in the Social Observation Home in Riyadh. Social survey and case study methods were employed in the research. The findings of the study show that the major factors influencing juvenile delinquency in the city of Riyadh fall into four groups concerning: the family, the school, the community and the society. As expected, the recent economic growth in Saudi Arabia has had a particular influence upon the situation of the youth in the society, in various social and cultural aspects, as a result of the cultural contact with foreigners in and outside the Kingdom and other factors. It is found that the peer-group has a strong influence on its members through many aspects of play, enjoyment, friendship and passing time which may, eventually, lead them to misbehaviour and delinquency. The impact of delinquent companions is visible from many indications: a) The majority of the sample had committed their offences in groups. b) A large number of them mentioned the desire to follow or please friends as the reason for committing the offence. c) Most importantly, about two thirds of the whole sample reported that they had friends with a previous history of delinquency.
APA, Harvard, Vancouver, ISO, and other styles
35

Ngan, Tsuen-wan. "On the effectiveness of additional resources for on-line firm deadline scheduling /." Hong Kong : University of Hong Kong, 2002. http://sunzi.lib.hku.hk/hkuto/record.jsp?B24520962.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Hopkins, Mark. "Intelligent dispatch for distributed renewable resources." Thesis, Manhattan, Kan. : Kansas State University, 2009. http://hdl.handle.net/2097/1512.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Allan, Shaun Michael. "Part-time defenders of the Realm : is the history of the Territorial Army a likely indicator of Future Reserves 2020 success?" Thesis, University of Hull, 2017. http://hydra.hull.ac.uk/resources/hull:16422.

Full text
Abstract:
This research investigates whether the Army Reserve (AR), the new name for the Territorial Army, can become a more integrated, more efficient, better trained, and more deployable force, as intimated by the government, than its previous incarnations. The hoped-for better-trained AR is expected to take a far greater role, taking over some Regular Army roles, providing better trained part-time soldiers for overseas operations and filling capability gaps left by the retrenchment of 20,000 Regular soldiers by the year 2020. To investigate whether these aspirations are achievable, and whether it is possible to train volunteer soldiers better than they have been in the past, this research completes an historical analysis of the AR's antecedents, the Territorial Force and the Territorial Army, their training, kit and equipment, and overseas deployment record. The thesis also explores historical social, economic and cultural issues which have had an impact upon Territorials, such as civilian employers' attitudes towards the volunteers and their organisation (and a sense of what wider society thought about the Territorials). Furthermore, research into the Territorial's family issues, support or otherwise, also sheds light upon the influence the family unit has upon the volunteer's decision to join and how long he/she stays in service. Coupled with family support is research into the support the family received from the government when their Territorial was deployed and what happens when the part-timer returns from war, from 1908 to 2012. This historical comparison will help highlight past struggles and failures inherent in the framework for training and making ready the volunteers of the past compared with today's AR, which uses the same structure for training, fitting around civilian commitments and family life.
APA, Harvard, Vancouver, ISO, and other styles
38

Goering, Dustin C. "Decision support for Wisconsin's manure spreaders: Development of a real-time Runoff Risk Advisory Forecast." Thesis, The University of Arizona, 2013. http://pqdtopen.proquest.com/#viewpdf?dispub=1545431.

Full text
Abstract:
The Runoff Risk Advisory Forecast (RRAF) provides Wisconsin's farmers with an innovative decision support tool which communicates the threat of undesirable conditions for manure and nutrient spreading for up to 10 days in advance. The RRAF is a pioneering example of applying the National Weather Service's hydrologic forecasting abilities towards the Nation's water quality challenges. Relying on the North Central River Forecast Center's (NCRFC) operational Snow17 and Sacramento Soil Moisture Accounting models, runoff risk is predicted for 216 modeled watersheds in Wisconsin. The RRAF is the first-of-its-kind real-time forecast tool to incorporate 5 days of future precipitation as well as 10 days of forecast temperatures to generate runoff risk guidance. The forecast product is updated three times daily and hosted on the Wisconsin Department of Agriculture, Trade, and Consumer Protection (DATCP) website. Developed with inter-agency collaboration, the RRAF model was validated against both edge-of-field observed runoff and small USGS-gauged basin response. This analysis indicated promising results, with a Bias Score of 0.93 and a False Alarm Ratio (FAR) of only 0.34 after applying a threshold method. Although the threshold process did dampen the Probability of Detection (POD) from 0.71 to 0.53, it was found that the magnitude of the events categorized as hits was 10 times larger than those classified as misses. The encouraging results from this first generation tool are aiding State of Wisconsin officials in increasing awareness of risky runoff conditions to help minimize contaminated agricultural runoff from entering the State's water bodies.
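The verification statistics quoted above (Bias Score, FAR, POD) follow the standard 2x2 contingency-table definitions. The sketch below only restates those formulas on made-up counts; it is not the RRAF validation code.

```python
# Standard forecast-verification scores from a 2x2 contingency table.
# The counts are illustrative placeholders, not the RRAF validation data.
def verification_scores(hits, misses, false_alarms):
    pod = hits / (hits + misses)                    # Probability of Detection
    far = false_alarms / (hits + false_alarms)      # False Alarm Ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency Bias Score
    return pod, far, bias

if __name__ == "__main__":
    pod, far, bias = verification_scores(hits=53, misses=47, false_alarms=27)
    print(f"POD={pod:.2f} FAR={far:.2f} Bias={bias:.2f}")
```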
APA, Harvard, Vancouver, ISO, and other styles
39

Williamson, Hugh. "PENNSYLVANIA HIGH SCHOOL INSTRUMENTAL MUSIC TEACHERS' PERCEPTIONS OF CHANGES IN INSTRUCTIONAL TIME AND RESOURCES." Diss., Temple University Libraries, 2014. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/249629.

Full text
Abstract:
Music Education, Ph.D. The purpose of this study was to determine Pennsylvania public high school instrumental music teachers' perceptions of changes to instrumental music instruction that may have been the result of a narrowing focus on student performance on standardized tests and sanctions linked to the No Child Left Behind Act of 2001 (2002). The study used a descriptive design to investigate ways that standardized testing may have influenced student opportunities to participate in school instrumental music, instructional time available for instrumental lessons and performing ensembles, budgetary resources and funding sources, staffing, and instrumental music curricula in Pennsylvania high schools. Data were gathered via an anonymous web-based survey. Of the entire population of 710 full-time high school instrumental music teachers in Pennsylvania, 304 responded. Of those, 247 successfully completed the survey and were appropriate for analysis. Results suggested that across PA high schools, instrumental music opportunities were varied and inconsistent with regard to instructional time, financial resources, access and availability of students, and support for instrumental music within the larger curriculum of the schools. These inconsistencies may have resulted in unequal opportunities to participate in instrumental music programs, partially because of funding and policy priorities at the state and local level that value test-based accountability rather than more comprehensive methods of evaluating child development and learning. Prior research suggested that opportunities to participate in instrumental music were linked to individual and group standardized test performance. Schools in very large urban districts with high percentages of low-income and minority students were the most likely to face reductions in instrumental music opportunities. Implications included the possibility of inequitable reductions to music programs potentially undermining efforts to help reduce or prevent achievement gaps. Reductions in instrumental music opportunities for elementary-level students were a particular concern, since neurobiological research findings suggest special benefits for early childhood music instruction. Recommendations for further research included replication of the study using identifiable data, case studies of individual high schools, the continuation and expansion of longitudinal studies between neuroscientists and music educators, and a survey of school administrator attitudes toward music education. (Temple University--Theses)
APA, Harvard, Vancouver, ISO, and other styles
40

Hua, Xiayu. "Theoretical Analysis of Real-Time Scheduling on Resources with Performance Degradation and Periodic Rejuvenation." Thesis, Illinois Institute of Technology, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10603696.

Full text
Abstract:
In 1973, Liu and Layland published their seminal paper on schedulability analysis of real-time systems for both EDF and RM schedulers. In this work, they provide schedulability conditions and schedulability utilization bounds for the EDF and RM scheduling algorithms, respectively. In the following four decades, scheduling algorithms, utilization bounds and schedulability analyses for real-time tasks have been studied intensively. Amongst those studies, most of the research relies on a strong assumption that the performance of a computing resource does not change during its lifetime. Unfortunately, for many long-standing real-time systems, such as data acquisition systems (DAQ), deep-space exploration programs and SCADA systems for power, water and other national infrastructures, the performance of computational resources suffers notable degradation after a long and continuous execution period.
To overcome the performance degradation in long-standing systems, countermeasures, also called system rejuvenation approaches in the literature, were introduced and studied in depth in the last two decades. Rejuvenation approaches recover system performance when invoked and hence benefit most long-standing applications. However, for applications with real-time requirements, the system downtime caused by the rejuvenation process, along with the decreasing performance during the system's available time, makes the existing real-time scheduling theories difficult to apply directly.
To address this problem, this thesis studies the schedulability of a real-time task set running on long-standing computing systems that suffer performance degradation and use a rejuvenation mechanism to recover.
Our first study in the thesis focuses on a simpler resource model, the periodic resource model, which only considers periodic rejuvenations. We introduce a method, Periodic Resource Integration, to combine multiple periodic resources into a single equivalent periodic resource, and provide the schedulability analysis based on the combined periodic resource for real-time tasks. By integrating multiple periodic resources into one, existing real-time scheduling research on a single periodic resource can be directly applied to multiple periodic resources.
In our second study, we extend the periodic resource model to a new resource model, the P²-resource model, to characterize resources with both performance degradation and periodic rejuvenation. We formally define the P²-resource and analyze the schedulability of real-time task sets on a P²-resource. In particular, we first analyze the resource supply status of a given P²-resource and provide its supply bound and linear supply bound functions. We then develop the schedulability conditions for a task set running on a P²-resource under the EDF or RM scheduling algorithms, respectively. We further derive utilization bounds of both the EDF and RM scheduling algorithms for schedulability test purposes.
With the P²-resource model and the schedulability analysis on a single P²-resource, we further extend our work to multiple P²-resources. In this research, we 1) analyze the schedulability of a real-time task set on multiple P²-resources under a fixed-priority scheduling algorithm, 2) introduce the GP-RM-P2 algorithm and 3) provide the utilization bound for this algorithm. Simulation results show that in most cases the sufficient bounds we provide are tight.
As rejuvenation technology keeps advancing, many systems are now able to perform rejuvenation in different system layers. To accommodate this new advance, we study the schedulability conditions of a real-time task set on a single P²-resource with both cold and warm rejuvenations. We introduce a new resource model, the P²-resource with dual-level rejuvenation (P²D-resource), to accommodate this new feature. We first study the supply bound and the linear supply bound of a given P²D-resource. We then study the sufficient utilization bounds for both the RM and EDF scheduling algorithms, respectively.
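For readers who want the classical starting point behind these bounds, the sketch below implements the original Liu and Layland utilization tests for EDF and RM on a single ideal (non-degrading) processor. The P²-resource and P²D-resource bounds derived in the thesis generalise these and are not reproduced here.

```python
# Classical Liu & Layland (1973) utilization-based schedulability tests for
# implicit-deadline periodic tasks on one ideal (non-degrading) processor.
# The thesis generalises these bounds to P^2-resources; that is not shown here.
def utilization(tasks):
    """tasks: list of (execution_time, period) pairs."""
    return sum(c / t for c, t in tasks)

def edf_schedulable(tasks):
    # EDF: necessary and sufficient for implicit deadlines on one processor.
    return utilization(tasks) <= 1.0

def rm_schedulable_sufficient(tasks):
    # RM: sufficient (not necessary) bound n*(2^(1/n) - 1).
    n = len(tasks)
    return utilization(tasks) <= n * (2 ** (1.0 / n) - 1)

if __name__ == "__main__":
    taskset = [(1, 4), (2, 6), (1, 8)]            # (C_i, T_i)
    print(round(utilization(taskset), 4))          # 0.7083
    print(edf_schedulable(taskset))                # True
    print(rm_schedulable_sufficient(taskset))      # True (0.7083 <= 0.7798 for n=3)
```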
APA, Harvard, Vancouver, ISO, and other styles
41

Bailey, Roy Douglas. "Autogenic regulation training (ART), sickness absence, personal problems, time and the emotional-physical stress of student nurses in general training : a report of a longitudinal field investigation." Thesis, University of Hull, 1985. http://hydra.hull.ac.uk/resources/hull:5040.

Full text
Abstract:
A field investigation was carried out with student nurses entering General Training in a School of Nursing. Autogenic Regulation Training (ART), sickness absence, personal problems, time and their emotional-physical experience were evaluated. Measures used in the study included: the Sickness Absence Record (SAR), the Mooney Problem Checklist (MPC), the Crown-Crisp Experiential Index (CCEI) and the Personal Observations Inventory (POI). Data were collected at different time periods early in their nurse education. The study was carried out to investigate the effectiveness of ART in providing a method of coping with individual stress. Analyses were made between and within an ART group of student nurses and a comparison group who did not receive training in ART. Consideration was also given to individual differences of student nurses in each group. Particular attention was paid to the hypotheses that 1) ART is associated with reduced sickness absence in student nurses when analysed against a comparison group of student nurses not trained in ART; and 2) ART is associated with reduced stress in student nurses when compared with student nurses not trained in ART. It is generally concluded that student nurses trained in ART may reduce their level of sickness absence and that ART can alleviate stress for some student nurses. However, examination of individual student nurse reports of ART and its usefulness and practice within these group data suggests more complex interpretations of the study. Despite the study limitations, implications for methods of stress control for nurses, curriculum development and cost-effective savings for nursing administrations are suggested, and possibilities for the development of comprehensive counselling services for nurses are raised. These issues, it is suggested, should be examined within a broader programme of research into coping with stress amongst nurses.
APA, Harvard, Vancouver, ISO, and other styles
42

Kenneally, Michael Martin. "The contribution of the Presentation Brothers to Irish education 1960-1998 : a study of a Roman Catholic religious teaching institute in a time of change and transition." Thesis, University of Hull, 1998. http://hydra.hull.ac.uk/resources/hull:11536.

Full text
Abstract:
The Institute of Presentation Brothers is a Roman Catholic religious Congregation founded by Edmund Rice in Waterford, in 1802. The Brothers declare their mission to be Christian formation, primarily of youth and in particular of the poor and disadvantaged. The aim of this thesis is to outline and examine the contribution of the Brothers to education in Ireland in the period 1960-1998. Taking account of the Catholic Christian tradition, and against the background of nineteenth-century Ireland, the thesis describes the growth and development of the Brothers' work. Particular attention is focused on the period from 1960 onwards and how the twin forces of change in society and in the Catholic Church impacted on the Brothers' contribution to education. The thesis considers how the Brothers have dealt with the major educational issues of the time. The key issues of training and personnel are dealt with, along with an analysis of the special role of religious education, Irish culture and sport in the Brothers' schools. The educational philosophy of the Brothers is traced from its origins, as is the challenge to articulate a contemporary Presentation philosophy of education. The contribution of a number of significant educational leaders among the Brothers is highlighted, and the views of a range of past-pupil writers are offered regarding the quality of their educational experience in Presentation schools. The primary motivation for the Brothers' involvement in education is religious. They are committed to a Catholic vision of education which has profound implications for the lives of young people. The rapidity of change has radically altered the presence and role of the Brothers in Irish education in the last forty years. The thesis contends that this period can be divided into two phases, roughly approximating to twenty years each. During the first phase the Brothers' educational mission lacked vision and strategy. It was a time of confusion. The second phase has seen the Presentation Brothers and their co-workers grapple with deeper educational questions. A new vision is forming, and the present position of the Brothers and their associates is analysed along with the contemporary challenges they face in education. During the period 1960-1998, the Brothers conducted a network of schools at primary and secondary level. In the last ten years they have also developed a variety of other educational initiatives. This study contends that the Presentation Brothers have made and continue to make a distinctive contribution to the education of thousands of young Irish people. The problems that face the Presentation Brothers as we move into the new millennium are many and complex. An analysis of the past may provide valuable lessons for the future, and so an evaluation of the Brothers' contribution to education since the onset of rapid change in the 1960s is attempted. The study contends that the Catholic/Edmund Rice educational vision of the Brothers, given re-articulation and commitment, has much to offer to young people and to the Ireland of the future.
APA, Harvard, Vancouver, ISO, and other styles
43

Marlow, Gregory. "Week 01, Video 06: Maya UI Time Slider and Range Slider." Digital Commons @ East Tennessee State University, 2020. https://dc.etsu.edu/digital-animation-videos-oer/11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Xu, Zongying M. Eng Massachusetts Institute of Technology. "On-time delivery improvement at a semiconductor equipment manufacturing facility : developing robust work plan to improve resources utilization and shorten production cycle time." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/113729.

Full text
Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017. This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. Cataloged from the student-submitted PDF version of the thesis. Includes bibliographical references (pages 119-120). The boom in demand for semiconductor equipment has brought Varian, a giant in semiconductor equipment manufacturing, a period of increasing demand. Unsatisfactory on-time delivery in subassembly production in SMKT, however, has gradually become a critical issue that prevents production from operating smoothly and efficiently, as well as resulting in a significant amount of extra financial cost. While SMKT on-time delivery can generally be improved from two aspects, the material shortage issue and the capacity and priority issue, the second is the focus of this thesis, where it is analyzed and resolved. Capacity analysis indicates that more manpower is needed to fulfill the production demand, and an Excel-based analysis tool was created to help with capacity management and planning in the long term. A series of matching algorithms was designed to solve the prioritization issue by providing efficient work plans that direct production in the SMKT assembly process, improving SMKT on-time delivery by 10-20% and shortening the shop order cycle time in the assembly process by one day. Moreover, recommendations on data utilization, capacity management and resources management in SMKT are provided to make SMKT production more proficient and well organized, which will greatly benefit Varian in the long term. The implementation of this algorithm generates no financial cost except for increasing the assemblers' capacity, which indicates that all solutions provided in this thesis are both economical and effective. By Zongying Xu. M. Eng.
APA, Harvard, Vancouver, ISO, and other styles
45

O'Donnell, Jeffrey Michael. "Collaboration And Conflict In The Adirondack Park: An Analysis Of Conservation Discourses Over Time." ScholarWorks @ UVM, 2015. http://scholarworks.uvm.edu/graddis/391.

Full text
Abstract:
The role of collaboration within conservation is of increasing interest to scholars, managers and forest communities. Collaboration can take many forms, but one under-studied topic is the form and content of public discourses across conservation project timelines. To understand the discursive processes that influence conservation decision-making, this research evaluates the use of collaborative rhetoric and claims about place within discourses of conservation in the Adirondacks. Local newspaper articles and editorials published from January 1996 to December 2013 and concerning six major conservation projects were studied using content analysis. Results show that collaborative rhetoric increased during the study period, and conflict discourses declined, in concert with the rise of collaborative planning efforts. Data also show an increasing convergence between conservation sponsors and local communities regarding the economic benefits of conservation and the importance of public participation. The study has value in examining representations of place and media claims-making strategies within conservation discourses, an important topic as natural resource managers increasingly embrace community-based natural resource management.
APA, Harvard, Vancouver, ISO, and other styles
46

Lamata, Martinez Ignacio. "The integration of earthquake engineering resources." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:5c5ca053-efc7-49a2-a52e-234189f5fb3c.

Full text
Abstract:
Earthquake engineering is increasingly focusing on large international collaborations to address complex problems. Recent computing advances have greatly contributed to the way scientific collaborations are conducted, where web-based solutions are an emerging trend to manage and present results to the scientific community and the general public. However, collaborations in earthquake engineering lack a common interoperability framework, resulting in tedious and complex processes to integrate results, which cannot be efficiently used by third-party institutions. The work described in this thesis applies novel computing techniques to enable the interoperability of earthquake engineering resources, by integrating data, distributed simulation services and laboratory facilities. This integration focuses on distributed approaches rather than centralised solutions, and has been materialised in a platform called Celestina, which supports the integration of hazard mitigation resources. The prototype of Celestina has been implemented and validated within the context of two of the current largest earthquake engineering networks, the SERIES network in Europe and the NEES network in the USA. It has been divided into three sub-systems to address different problems: (i) Celestina Data, to develop best methods to define, store, integrate and share earthquake engineering experimental data. Celestina Data uses a novel approach based on Semantic Web technologies, and it has accomplished the first data integration between earthquake engineering institutions from the United States and Europe by means of a formalised infrastructure. (ii) Celestina Tools, to research applications that can be implemented on top of the data integration, in order to provide a practical benefit for the end user. (iii) Celestina Simulations, to create the most efficient methods to integrate distributed testing software and to support the planning, definition and execution of the experimental workflow from a high-level perspective. Celestina Simulations has been implemented and validated by conducting distributed simulations between the Universities of Oxford and Kassel. Such validation has demonstrated the feasibility of conducting both flexible, general-purpose and high-performance simulations under the framework. Celestina has enabled global analysis of data requirements for the whole community, the definition of global policies for data authorship, curation and preservation, more efficient use of efforts and funding, more accurate decision support systems and more efficient sharing and evaluation of data results in scientific articles.
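As a flavour of the Semantic Web approach mentioned for Celestina Data (and only as a flavour: the namespace and properties below are invented for this sketch, not Celestina's actual schema), two small RDF datasets from different institutions can be merged into one graph and queried uniformly, for example with rdflib:

```python
# Toy illustration of Semantic-Web-style data integration with rdflib.
# The vocabulary is hypothetical; it is not the Celestina Data model.
from rdflib import Graph

TTL_A = """
@prefix ex: <http://example.org/eq#> .
ex:test1 ex:specimen "steel frame" ; ex:peakLoad 120.5 .
"""
TTL_B = """
@prefix ex: <http://example.org/eq#> .
ex:test2 ex:specimen "RC column" ; ex:peakLoad 87.2 .
"""

g = Graph()
g.parse(data=TTL_A, format="turtle")  # data from institution A
g.parse(data=TTL_B, format="turtle")  # data from institution B

# One SPARQL query over the merged graph, regardless of where the data came from.
query = (
    "PREFIX ex: <http://example.org/eq#> "
    "SELECT ?t ?s ?p WHERE { ?t ex:specimen ?s ; ex:peakLoad ?p }"
)
for row in g.query(query):
    print(row.t, row.s, row.p)
```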
APA, Harvard, Vancouver, ISO, and other styles
47

Martinoty, Laurine. "Intrahousehold Allocation of Time and Consumption during Hard Times." Thesis, Lyon, École normale supérieure, 2015. http://www.theses.fr/2015ENSL1021/document.

Full text
Abstract:
The consequences of adverse aggregate shocks on households have been repeatedly documented, but far less has been said about the way they are passed on to individuals through the mediation of the household. Does the household contribute to mitigating the effects? Or does the economic shock rather invite itself to the family negotiating table? Using the Argentine 2001 economic crisis as a natural experiment, I first show that married women are more likely to enter the labor market if their husband experienced a loss in income, giving credit to the insurance mechanism. Then, I show that the business cycle matters for investments in education, and that the long-run labor outcomes of Argentine men are persistently affected by the initial conditions upon graduation. Finally, I consider the "Mancession" dimension of the Great Recession in Spain and demonstrate that the resource share accruing to wives for their own consumption increases as the unemployment gap narrows, which supports the bargaining hypothesis.
APA, Harvard, Vancouver, ISO, and other styles
48

Riquelme, Victor. "Optimal control problems for bioremediation of water resources." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT290/document.

Full text
Abstract:
This thesis consists of two parts. In the first part we study minimal-time strategies for the treatment of pollution in large water volumes, such as lakes or natural reservoirs, using a single continuous bioreactor that operates in a quasi-steady state. The control consists of feeding the bioreactor from the resource, with the clean output returning to the resource at the same flow rate. We drop the hypothesis of homogeneity of the pollutant concentration in the water resource by proposing three spatially structured models. The first model considers two zones connected to each other by diffusion, with only one of them treated by the bioreactor. With the help of the Pontryagin Maximum Principle, we show that the optimal state feedback depends only on the measurements of pollution in the treated zone, with no influence of volume, diffusion parameter, or pollutant concentration in the untreated zone. We show that the effect of a recirculation pump that helps to mix the two zones is beneficial if operated at full speed. We prove that the family of minimal-time functions depending on the diffusion parameter is decreasing. The second model consists of two zones connected to each other by diffusion and each of them connected to the bioreactor. This is a problem with a non-convex velocity set, for which it is not possible to directly prove the existence of its solutions. We overcome this difficulty and fully solve the studied problem by applying Pontryagin's principle to the associated problem with relaxed controls, obtaining a feedback control that treats the most polluted zone up to the homogenization of the two concentrations. We also obtain explicit bounds on its value function via Hamilton-Jacobi-Bellman techniques. We prove that the minimal-time function is non-monotone as a function of the diffusion parameter. The third model consists of a system of two zones connected to the bioreactor in series, and a recirculation pump between them. The control set depends on the state variable; we show that this constraint is active from some time up to the final time. We show that the optimal control consists of waiting until a time from which it is optimal to mix at maximum speed, and then to repollute the second zone with the concentration of the first zone. This is a non-intuitive result. Numerical simulations illustrate the theoretical results, and the obtained optimal strategies are tested in hydrodynamic models, showing that they are good approximations of the solution of the inhomogeneous problem.
The second part consists of the development and study of a stochastic model of a sequencing batch reactor. We obtain the model as a limit of birth and death processes. We establish the existence and uniqueness of solutions of the controlled equation, which does not satisfy the usual assumptions. We prove that with any control law the probability of extinction is positive, which is a non-classical result.
We study the problem of maximizing the probability of attaining a target pollution level, with the reactor at maximum capacity, prior to extinction. This problem does not satisfy any of the usual assumptions (non-Lipschitz dynamics, degenerate locally Hölder diffusion parameter, restricted state space, intersecting reach and avoid sets), so the problem must be studied in two stages: first, we prove the continuity of the uncontrolled cost function for initial conditions with maximum volume, and then we develop a dynamic programming principle for a modification of the problem as an optimal control problem with final cost and without state constraint.
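As background, the minimal-time problems of the first part fit the following generic template (an illustrative formulation only, not the exact spatially structured models of the thesis; the dynamics f, the control Q and the target set are placeholders):

```latex
% Generic minimal-time formulation (illustrative placeholder, not the thesis's models)
\begin{align*}
  \min_{Q(\cdot)} \; & T \\
  \text{s.t. } \; & \dot{s}(t) = f\bigl(s(t), Q(t)\bigr), \qquad Q(t) \in [0, Q_{\max}], \\
  & s(0) = s_0, \qquad s(T) \in \mathcal{T}.
\end{align*}
```

The Pontryagin Maximum Principle then requires an optimal control to maximise the Hamiltonian H(s, p, Q) = p · f(s, Q) + p⁰ (with p⁰ ≤ 0 constant) pointwise in Q, which is the mechanism behind the bang-bang and feedback structures described in the abstract.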
APA, Harvard, Vancouver, ISO, and other styles
49

Dullien, Starley Beatrix. "In time on time: Website for teachers of English to speakers of other languages." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2730.

Full text
Abstract:
The purpose of the "In Time On Time TESOL" website for Teachers of English to Students of Other Languages (TESOL) is to provide adult-education teachers online access to classroom managing techniques, teaching and learning strategies, and online resources based on constructivism and adult-learning theory. The instructional design and navigation structure is based on Random Access Instruction (RAI) and hypertext theory.
APA, Harvard, Vancouver, ISO, and other styles
50

Mrong, Clewestam Sufola. "Utilizing the potential resources of elderly people : An interview study about the potential resources of elderly and young (40+) people, what those resources consists of and how they can be defined." Thesis, Linköpings universitet, Institutionen för samhälls- och välfärdsstudier, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-94538.

Full text
Abstract:
The overall purpose of this master's essay is to investigate whether there is an interest among people near retirement in contributing their knowledge and experience after they reach the defined retirement age. Furthermore, I investigate whether they want to continue with their old jobs, full time or part time, or to do something else that is beneficial to society. I also want to find out how older people view the prospect of continuing to work, as well as which types of social and structural barriers might exist for them and why. The aim is also to investigate what types of knowledge and experience they believe could be used in the future community. For that purpose I structured interview questions and conducted eight individual interviews. Below is a brief summary of the main results. The study shows that there is a general interest in working after retirement. Most people prefer to work only part time. Participation and influence are meaningful to the individuals and are seen as having a positive impact from individual, group and societal perspectives. At the individual level, the relationship between the possibility to choose both the type of work and the working hours and the desire to continue working is very strong. This choice contributes to the feeling that life is meaningful, which proved to be a strong motivation to participate in working life. The study shows the importance of paying attention to flexible working hours adapted at the individual level. The results also show that the elderly are often pictured as competent, knowledgeable and skilled people whose resources can be utilized in society. It also appeared that older people are carriers of knowledge and experience which can be taken care of and transferred to the younger generations instead of being lost. The study pointed out that there are manual work situations that are particularly risky for older people. It also revealed negative factors such as competition for jobs between younger and older people, which can create an opinion in society against letting older people work. The results also indicate that mixing older and younger people creates a transfer of knowledge and experience which leads to new approaches to work tasks. A number of factors also emerged that influence an individual's choice to take part in work: it is important to take an individual approach and to assess the participant's interests and abilities for the particular job, their desired working hours, and the need for upgrading or retraining.
APA, Harvard, Vancouver, ISO, and other styles