Academic literature on the topic 'Quantity of interest'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic 'Quantity of interest.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Quantity of interest"

1

DeVore, Ronald, Simon Foucart, Guergana Petrova, and Przemyslaw Wojtaszczyk. "Computing a Quantity of Interest from Observational Data." Constructive Approximation 49, no. 3 (June 1, 2018): 461–508. http://dx.doi.org/10.1007/s00365-018-9433-7.

2

Ireland, Peter N. "THE MACROECONOMIC EFFECTS OF INTEREST ON RESERVES." Macroeconomic Dynamics 18, no. 6 (May 24, 2013): 1271–312. http://dx.doi.org/10.1017/s1365100512000934.

Abstract:
This paper uses a New Keynesian model with banks and deposits to study the macroeconomic effects of policies that pay interest on reserves. Although their effects on output and inflation are small, these policies require major adjustments in the way that the monetary authority manages the supply of reserves, as liquidity effects vanish in the short run. In the long run, however, the additional freedom the monetary authority acquires by paying interest on reserves is best described as affecting the real quantity of reserves: policy actions that change prices must still change the nominal quantity of reserves proportionally.
3

Box-Steffensmeier, Janet M., Dino P. Christenson, and Matthew P. Hitt. "Quality Over Quantity: Amici Influence and Judicial Decision Making." American Political Science Review 107, no. 3 (July 10, 2013): 446–60. http://dx.doi.org/10.1017/s000305541300021x.

Abstract:
Interest groups often make their preferences known on cases before the U.S. Supreme Court via amicus curiae briefs. In evaluating the case and related arguments, we posit that judges take into account more than just the number of supporters for the liberal and conservative positions. Specifically, judges’ decisions may also reflect the relative power of the groups. We use network position to measure interest group power in U.S. Supreme Court cases from 1946 to 2001. We find that the effect of interest group power is minimal in times of heavily advantaged cases. However, when the two sides of a case are approximately equal in the number of briefs, such power is a valuable signal to judges. We also show that justice ideology moderates the effect of liberal interest group power. The results corroborate previous findings on the influence of amicus curiae briefs and add a nuanced understanding of the conditions under which the quality and reputation of interest groups matter, not just the quantity.
4

Chaudhry, Jehanzeb Hameed, Eric C. Cyr, Kuo Liu, Thomas A. Manteuffel, Luke N. Olson, and Lei Tang. "Enhancing Least-Squares Finite Element Methods Through a Quantity-of-Interest." SIAM Journal on Numerical Analysis 52, no. 6 (January 2014): 3085–105. http://dx.doi.org/10.1137/13090496x.

5

Tang, Zuqi, Suyang Lou, Abdelkader Benabou, Emmanuel Creuse, Serge Nicaise, Julien Korecki, and Jean-Claude Mipo. "Guaranteed Quantity of Interest Error Estimate Based on Equilibrated Flux Reconstruction." IEEE Transactions on Magnetics 57, no. 6 (June 2021): 1–4. http://dx.doi.org/10.1109/tmag.2021.3071641.

6

Congdon, Tim. "Interest rates or quantity of money? Edward Nelson on Milton Friedman." Economic Affairs 41, no. 2 (June 2021): 320–35. http://dx.doi.org/10.1111/ecaf.12467.

7

Griesse, Roland, and Boris Vexler. "Numerical Sensitivity Analysis for the Quantity of Interest in PDE‐Constrained Optimization." SIAM Journal on Scientific Computing 29, no. 1 (January 2007): 22–48. http://dx.doi.org/10.1137/050637273.

8

Taghizadeh, Jonas Larsson. "Quality over quantity? Technical information, interest advocacy and school closures in Sweden." Interest Groups & Advocacy 4, no. 2 (December 16, 2014): 101–19. http://dx.doi.org/10.1057/iga.2014.17.

9

Peterson, Andrew, and Arthur Spirling. "Classification Accuracy as a Substantive Quantity of Interest: Measuring Polarization in Westminster Systems." Political Analysis 26, no. 1 (January 2018): 120–28. http://dx.doi.org/10.1017/pan.2017.39.

Abstract:
Measuring the polarization of legislators and parties is a key step in understanding how politics develops over time. But in parliamentary systems—where ideological positions estimated from roll calls may not be informative—producing valid estimates is extremely challenging. We suggest a new measurement strategy that makes innovative use of the “accuracy” of machine classifiers, i.e., the number of correct predictions made as a proportion of all predictions. In our case, the “labels” are the party identifications of the members of parliament, predicted from their speeches along with some information on debate subjects. Intuitively, when the learner is able to discriminate members in the two main Westminster parties well, we claim we are in a period of “high” polarization. By contrast, when the classifier has low accuracy—and makes a relatively large number of mistakes in terms of allocating members to parties based on the data—we argue parliament is in an era of “low” polarization. This approach is fast and substantively valid, and we demonstrate its merits with simulations, and by comparing the estimates from 78 years of House of Commons speeches with qualitative and quantitative historical accounts of the same. As a headline finding, we note that contemporary British politics is approximately as polarized as it was in the mid-1960s—that is, in the middle of the “postwar consensus”. More broadly, we show that the technical performance of supervised learning algorithms can be directly informative about substantive matters in social science.
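
As a rough illustration of the measurement strategy described above, the sketch below trains a party classifier on a handful of made-up speeches and reports its cross-validated accuracy as a polarization proxy. The toy speeches, labels, and the TF-IDF plus logistic-regression pipeline are assumptions chosen for brevity; this is not the authors' replication code, which works on 78 years of House of Commons debates.

```python
# Illustrative sketch (not the authors' code): classifier accuracy as a polarization measure.
# Real use would need thousands of speeches; the labels below are hypothetical party IDs.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

speeches = [
    "we must cut taxes and shrink the state",
    "public ownership of rail is long overdue",
    "free enterprise rewards hard work",
    "we will invest in the national health service",
]
party = ["Con", "Lab", "Con", "Lab"]  # hypothetical labels

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
# Higher held-out accuracy -> parties are easier to tell apart from speech -> "higher" polarization.
scores = cross_val_score(model, speeches, party, cv=2, scoring="accuracy")
print("polarization proxy (mean held-out accuracy):", scores.mean())
```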
10

Yu, Alan C. L. "Quantity, stress and reduplication in Washo." Phonology 22, no. 3 (December 2005): 437–75. http://dx.doi.org/10.1017/s0952675705000679.

Abstract:
Plural internal reduplication in Washo has generated much interest in the phonological literature. This study presents a novel analysis that unifies the treatment of a set of seemingly disparate aspects of this plural reduplication pattern (e.g. variation in the placement and size of the reduplicant, contrastive vowel length in stressed syllables, post-tonic gemination, and vowel-length inheritance in reduplication), relying on the interaction between constraints on weight assignment, affix anchoring and stress assignment. In particular, the odd placement of the plural reduplicant in roots with internal consonant sequences and the restricted distribution of long vowels in Washo can be attributed to a previously unnoticed emerging preference for heavy stressed syllables on the surface. The results of this study have implications for theories of reduplication and theories of weight phenomena in general.

Dissertations / Theses on the topic "Quantity of interest"

1

Djatouti, Zohra. "Amélioration de la prédiction de quantités d'intérêt par modélisation inverse : application à la thermique du bâtiment." Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC2006.

Abstract:
In the current context of climate change and resource depletion, reducing the final energy consumption of the residential and commercial building sector is a major challenge: in France, this sector accounts for roughly 45% of final energy consumption, the building stock is dominated by old, energy-intensive buildings, and the annual renewal rate is very low (1% to 2%), which makes existing buildings a large source of potential energy savings. Estimating a building's actual consumption before undertaking an energy retrofit requires a good knowledge of its thermal characteristics, which inverse methods coupling physical models with measurements can provide. This work introduces an original inverse strategy for model parameter identification that can be used for onsite building characterization in view of energy performance assessment and as a tool for decision-making during energy retrofitting of existing buildings. Unlike standard global inverse approaches such as the Tikhonov regularization method, which aim at identifying all the model parameters in order to best fit the measurement data, the goal-oriented inverse method is formulated for a robust prediction of a quantity of interest: it only updates the model parameters that most affect the computation of the sought quantity of interest. In order to optimize the computation time, the goal-oriented inverse method is combined with the Proper Generalized Decomposition (PGD) model order reduction method. The proposed identification strategy is applied to two existing buildings, part of the “Sense-City” equipment of excellence, that were instrumented for this purpose. The results show that the goal-oriented inverse method robustly predicts the sought quantities of interest by only updating the model parameters to which they are sensitive, and that it converges faster than the Tikhonov regularization method. Finally, the proposed inverse strategy can be applied to occupied buildings, extended to the district scale, and used for the optimal placement of sensors.
2

Kwablah, Andrews. "Financial Crowding Out of Ghanaian Private Sector Corporations." ScholarWorks, 2018. https://scholarworks.waldenu.edu/dissertations/4932.

Abstract:
The government of Ghana borrows from both domestic and foreign sources to finance the budget deficit. By the year 2013, the domestic debt was 55% of the public debt. Government domestic borrowing is competitive and can potentially crowd out the private corporate sector. Therefore, the specific research problem addressed in this study was whether the Ghanaian government's domestic debt (DEBT) caused financial crowding out (FCO) in Ghana. FCO theory is not conclusive and not proven specifically for Ghana, so the purpose of this research was to investigate its presence in Ghana. The neoclassical theory of FCO underpinned the research. The 2 research questions investigated FCO along the quantity and cost channels. The research examined the relationship between DEBT as the independent variable, the quantity of private sector credit (PSCREDIT), and the net interest margin (NIM) of banks as dependent variables. Covariates were macroeconomic and banking industry variables. The research population was the banking sector of the financial services industry. The research was correlational, and it used time series data from the Bank of Ghana and the World Bank. Data analysis used the autoregressive distributed lag method. The analysis returned a negative relationship between DEBT and PSCREDIT, and a positive relationship between NIM and DEBT. These results indicated the presence of FCO along both the quantity and cost channels. The research provides policymakers with a means of quantifying the extent and effects of fiscal policies. The study may contribute to positive social change by promoting the revision of fiscal policies to favor the private corporate sector to invest, create jobs, and grow the Ghanaian economy.
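
The autoregressive distributed lag analysis mentioned in the abstract can be sketched, in its simplest form, as a regression of private-sector credit on its own lag and on current and lagged government debt. The synthetic series, variable names, and lag order below are illustrative assumptions, not the study's data or specification.

```python
# Illustrative ARDL-style regression on synthetic data (not the study's data or exact model).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 80
debt = np.cumsum(rng.normal(0.5, 1.0, n))          # hypothetical government domestic debt
credit = 50 - 0.3 * debt + rng.normal(0, 1.0, n)   # private-sector credit, negatively related by construction

df = pd.DataFrame({"credit": credit, "debt": debt})
df["credit_lag1"] = df["credit"].shift(1)   # autoregressive term
df["debt_lag1"] = df["debt"].shift(1)       # distributed-lag term
df = df.dropna()

X = sm.add_constant(df[["credit_lag1", "debt", "debt_lag1"]])
res = sm.OLS(df["credit"], X).fit()
print(res.params)  # a negative long-run coefficient on debt would point toward crowding out
```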
3

Edwards, Paul. "Quantile hedging interest rate derivatives using the Libor market model." Thesis, Imperial College London, 2005. http://hdl.handle.net/10044/1/11361.

4

Verdugo, Rojano Francesc. "Error assessment and adaptivity for structural transient dynamics." Doctoral thesis, Universitat Politècnica de Catalunya, 2013. http://hdl.handle.net/10803/286745.

Abstract:
The finite element method is a valuable tool for simulating complex physical phenomena. However, any finite element based simulation has an intrinsic amount of error with respect to the exact solution of the selected physical model. Being aware of this error is of crucial importance if sensitive engineering decisions are taken on the basis of the numerical results. Assessing the error in elliptic problems (such as structural statics) is a well-known problem. However, assessing the error in structural transient dynamics is still ongoing research. The present thesis aims at contributing to error assessment techniques for structural transient dynamics. First, a new approach is introduced to compute bounds of the error measured in some quantity of interest. The proposed methodology yields error bounds with better quality than the already available approaches. Second, an efficient methodology to compute approximations of the error in the quantity of interest is introduced. The proposed technique uses modal analysis to compute the solution of the adjoint problem associated with the selected quantity of interest. The resulting error estimate is very well suited for time-dependent problems, because the cost of computing the estimate at each time step is very low. Third, a space-time adaptive strategy is proposed. The local error indicators driving the adaptive process are computed using the previously mentioned modal-based error estimate. The resulting adapted approximations are more accurate than the ones obtained with a straightforward uniform mesh refinement. That is, the adapted computations lead to lower errors in the quantity of interest than the non-adapted ones for the same number of space-time elements. Fourth, a new type of quantity of interest is introduced for error assessment in time-dependent problems. These quantities (referred to as timeline-dependent quantities of interest) are scalar time-dependent outputs of the transient solution and are better suited to time-dependent problems than the standard scalar ones. The error in timeline-dependent quantities is efficiently assessed using the modal-based description of the adjoint solution. The thesis contributions are enclosed in five papers which are attached to the thesis document.
5

Klabi, Ramzi. "Essai sur la reformulation de la théorie quantitative de la monnaie par Maurice Allais." Thesis, Aix-Marseille, 2016. http://www.theses.fr/2016AIXM2002.

Abstract:
In 1965, Allais proposed an original restatement of the quantity theory of money. It is the Hereditary and Relativistic (HR) theory of the money demand. Published a decade after Friedman’s restatement and Cagan’s model of hyperinflations, the HR theory remained unknown. Many reasons contributed to the lack of success of this theory, one of which is related to its conceptual framework which is incongruous with the standard approach. The HR theory is based upon the notion of time relativity from a psychological point of view, and the idea that the behavior of economic agents is conditioned by a hereditary effect of past events.Our thesis aims to investigate the contribution of the HR theory as a restatement of the quantity theory with regard to the question of the stability of money demand.The thesis is composed of three parts. The first part contains necessary preludes to the analysis of the HR theory (Part I). The second and the third part contain the two main results of the thesis. The first one is that the HR theory represents an ontological restatement of the quantity theory based on the notion of “psychological time”- time as experienced by the collectivity as a whole (Part II). The second result is that the HR theory, as a macroeconomic theory, contains a paradigmatic shift which echoes the one introduced in physics by the theory of relativity: in the HR theory, an explanation of some monetary phenomena using the psychological distortion of time is substituted to the explanation through causal relations between aggregates (Part III)
6

Eoff, Brian David. "Using genetic programming to quantify the effectiveness of similar user cluster history as a personalized search metric." Auburn, Ala., 2005. http://repo.lib.auburn.edu/Send%2012-16-07/EOFF_BRIAN_7.pdf.

7

Thomas, Soby. "Residential mortgage loan securitization and the subprime crisis / S. Thomas." Thesis, North-West University, 2010. http://hdl.handle.net/10394/4591.

Abstract:
Many analysts believe that problems in the U.S. housing market initiated the 2008–2010 global financial crisis. In this regard, the subprime mortgage crisis (SMC) shook the foundations of the financial industry by causing the failure of many iconic Wall Street investment banks and prominent depository institutions. This crisis stymied credit extension to households and businesses thus creating credit crunches and, ultimately, a global recession. This thesis specifically discusses the SMC and its components, causes, consequences and cures in relation to subprime mortgages, securitization, as well as data. In particular, the SMC has highlighted the fact that risk, credit ratings, profit and valuation as well as capital regulation are important banking considerations. With regard to risk, the thesis discusses credit (including counterparty), market (including interest rate, basis, prepayment, liquidity and price), tranching (including maturity mismatch and synthetic), operational (including house appraisal, valuation and compensation) and systemic (including maturity transformation) risks. The thesis introduces the IDIOM hypothesis that postulates that the SMC was largely caused by the intricacy and design of subprime agents, mortgage origination and securitization that led to information problems (loss, asymmetry and contagion), valuation opaqueness and ineffective risk mitigation. It also contains appropriate examples, discussions, timelines as well as appendices about the main results on the aforementioned topics. Numerous references point to the material not covered in the thesis, and indicate some avenues for further research. In the thesis, the primary subprime agents that we consider are house appraisers (HAs), mortgage brokers (MBs), mortgagors (MRs), servicers (SRs), SOR mortgage insurers (SOMIs), trustees, underwriters, credit rating agencies (CRAs), credit enhancement providers (CEPs) and monoline insurers (MLIs). Furthermore, the banks that we study are subprime interbank lenders (SILs), subprime originators (SORs), subprime dealer banks (SDBs) and their special purpose vehicles (SPVs) such as Wall Street investment banks and their special structures as well as subprime investing banks (SIBs). The main components of the SMC are MRs, the housing market, SDBs/hedge funds/money market funds/SIBs, the economy as well as the government (G) and central banks. Here, G either plays a regulatory or policymaking role. Most of the aforementioned agents and banks are assumed to be risk neutral with SOR being the exception since it can be risk (and regret) averse on occasion. The main aspects of the SMC - subprime mortgages, securitization, as well as data - that we cover in this thesis and the chapters in which they are found are outlined below. In Chapter 2, we discuss the dynamics of subprime SORs' risk and profit as well as their valuation under mortgage origination. In particular, we model subprime mortgages that are able to fully amortize, voluntarily prepay or default and construct a discrete–time model for SOR risk and profit incorporating costs of funds and mortgage insurance as well as mortgage losses. In addition, we show how high loan–to–value ratios due to declining housing prices curtailed the refinancing of subprime mortgages, while low ratios imply favorable house equity for subprime MRs. 
Chapter 3 investigates the securitization of subprime mortgages into structured mortgage products such as subprime residential mortgage-backed securities (RMBSs) and collateralized debt obligations (CDOs). In this regard, our discussions focus on information, risk and valuation as well as the role of capital under RMBSs and RMBS CDOs. Our research supports the view that incentives to monitor mortgages have been all but removed when changing from a traditional mortgage model to a subprime mortgage model. In the latter context, we provide formulas for IB's profit and valuation under RMBSs and RMBS CDOs. This is illustrated via several examples. Chapter 3 also explores the relationship between mortgage securitization and capital under Basel regulation and the SMC. This involves studying bank credit and capital under the Basel II paradigm where risk-weights vary. Further issues dealt with are the quantity and pricing of RMBSs, RMBS CDOs as well as capital under Basel regulation. Furthermore, we investigate subprime RMBSs and their rates with slack and holding constraints. Also, we examine the effect of SMC-induced credit rating shocks in future periods on subprime RMBSs and RMBS payout rates. A key problem is whether Basel capital regulation exacerbated the SMC. Very importantly, the thesis answers this question in the affirmative. Chapter 4 explores issues related to subprime data. In particular, we present mortgage and securitization level data and forge connections with the results presented in Chapters 2 and 3. The work presented in this thesis is based on 2 peer-reviewed chapters in books (see [99] and [104]), 2 peer-reviewed international journal articles (see [48] and [101]), and 2 peer-reviewed conference proceeding papers (see [102] and [103]).
Thesis (Ph.D. (Applied Mathematics))--North-West University, Potchefstroom Campus, 2011.
8

Prakash, Anila. "Three Essays on Labor Market Outcomes." Diss., The University of Arizona, 2015. http://hdl.handle.net/10150/560807.

Abstract:
The three chapters in this dissertation look at different aspects of the labor market and its players. The first chapter estimates the impact of using the internet for job search on job match quality. Using both the semi-parametric Meyer (1990) model and the non-parametric Hausman Woutersen (2014) hazard model, the paper finds that the exit rate from employment is at least 28% lower when the internet is used as a job search tool. The second chapter looks at the effect of past unemployment on future wages. It is believed that employers may use past unemployment as a signal of low productivity. In this situation workers with a history of unemployment may receive lower wages. The paper uses the Machado Mata (2005) quantile decomposition technique to decompose the wage difference into differences due to characteristics and differences due to rewards. Results indicate that workers with an unemployment spell of more than three months receive at least 12% lower wages and that more than 40% of this wage difference can be attributed to the lower rewards received by the previously unemployed. The last chapter focuses on human capital formation and looks at some of the reasons behind the low levels of schooling in India. Using the Indian Household Development Survey (2005), the paper finds that income continues to be an important factor behind the low level of primary school enrollment. On average, poor students have at least 3% lower enrollment rates, when compared to similarly skilled non-poor students.
9

Lesage, François. "Modélisation et expérimentation des transferts de matière et de quantité de mouvement dans les réacteurs à lit fixe." Phd thesis, Institut National Polytechnique de Lorraine - INPL, 2000. http://tel.archives-ouvertes.fr/tel-00790847.

Abstract:
The first part of this work deals with the modelling and simulation of hydrodynamics and mass transfer in trickle-bed reactors. The porous medium was treated as a continuum by averaging the microscopic transport equations. Several models accounting for the wall effect of our system were proposed, solved and compared with experimental results. The results obtained are fairly satisfactory and made it possible to select the most appropriate models. More complex models, built from the experimental results, should further improve the quality of the simulations. In the second part, a local study of the hydrodynamics was carried out, essentially using microelectrodes inserted in a pore, which allow the electrochemical measurement of velocity gradients. In single-phase liquid flow, the boundaries of the flow regimes were determined. The turbulent behaviour at high flow rates was confirmed and the flow within a pore was characterized, notably by means of auto- and cross-correlations of the measurements. Finally, a surface-renewal model coupled with the VITA (Variable Interval Time Averaging) method was used to compute the mean velocity gradient. In gas-liquid flow, the electrochemical technique is less useful; however, pulsed flows could be characterized using wall pressure probes.
10

de Rezende, Rafael B. "Essays on Macro-Financial Linkages." Doctoral thesis, Handelshögskolan i Stockholm, Institutionen för Finansiell ekonomi, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:hhs:diva-2259.

Abstract:
This doctoral thesis is a collection of four papers on the analysis of the term structure of interest rates with a focus at the intersection of macroeconomics and finance. "Risk in Macroeconomic Fundamentals and Bond Return Predictability" documents that factors related to risks underlying the macroeconomy such as expectations, uncertainty and downside (upside) macroeconomic risks are able to explain variation in bond risk premia. The information provided is found to be, to a large extent, unrelated to that contained in forward rates and current macroeconomic conditions. "Out-of-sample bond excess returns predictability" provides evidence that macroeconomic variables, risks in macroeconomic outcomes as well as the combination of these different sources of information are able to generate statistical as well as economic bond excess returns predictability in an out-of-sample setting. Results suggest that this finding is not driven by revisions in macroeconomic data. The term spread (yield curve slope) is largely used as an indicator of future economic activity. "Re-examining the predictive power of the yield curve with quantile regression" provides new evidence on the predictive ability of the term spread by studying the whole conditional distribution of GDP growth. "Modeling and forecasting the yield curve by extended Nelson-Siegel class of models: a quantile regression approach" deals with yield curve prediction. More flexible Nelson-Siegel models are found to provide better fitting to the data, even when penalizing for additional model complexity. For the forecasting exercise, quantile-based models are found to overcome all competitors.

Diss. Stockholm: Stockholm School of Economics, 2014. Introduction together with 4 papers.
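
The third paper summarized above re-examines the predictive power of the yield-curve slope by looking at the whole conditional distribution of GDP growth through quantile regression. A minimal sketch of that idea on synthetic data follows; the data-generating process and the chosen quantiles are assumptions for illustration only.

```python
# Sketch: quantile regression of future GDP growth on the yield-curve slope (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
spread = rng.normal(1.0, 1.0, n)                                            # hypothetical term spread
growth = 1.5 + 0.8 * spread + rng.normal(0, 1 + 0.5 * np.abs(spread), n)    # heteroskedastic growth

X = sm.add_constant(spread)
for q in (0.1, 0.5, 0.9):
    fit = sm.QuantReg(growth, X).fit(q=q)
    # Each quantile gets its own slope, tracing out the conditional distribution of growth.
    print(f"q={q:.1f}  intercept={fit.params[0]:.2f}  slope={fit.params[1]:.2f}")
```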


Books on the topic "Quantity of interest"

1

MacKinnon, K. Inflation, interest rates and the capital stock in a cash in advance economy. Toronto, Ont: Department of Economics, York University, 1989.

2

Sugioka, Michio. The neo-theories of the real quantity of money to the dynamic balancing economic social growth. [Hyogo, Japan: Michio Sugioka], 2008.

3

Sugioka, Michio. The neo-theories of the real quantity of money to the dynamic balancing economic social growth. [Hyogo, Japan: Michio Sugioka], 2008.

4

Die deutsche Zeitstruktur der Zinssätze im Lichte der Wicksellschen Kredittheorie. Frankfurt am Main: P. Lang, 1998.

5

Lukanin, Alleksandr. Cleaning of gas and air emissions. INFRA-M Academic Publishing LLC, 2020. http://dx.doi.org/10.12737/1070340.

Abstract:
The monograph examines the currently existing industrial gas emissions in the chemical, petrochemical, microbiological, pharmaceutical and related industries, methods for calculating their quantity and methods for protecting the air basin from them. The materials are based on an in-depth analysis of methods for cleaning frequently occurring, most dangerous substances that enter the Earth's atmosphere with waste gases of large-scale production. Recommendations are given on methods for calculating gross emissions of harmful substances for a large number of specific industries. The subject of the monograph is related to the scientific areas "Technosphere safety" and "Engineering environmental protection", training profiles: engineering environmental protection of localities, engineering environmental protection of industrial enterprises and environmental protection and resource conservation. It will be of interest to engineering and technical staff, graduate students and teachers.
6

Pansy, ed. Betty & Pansy's severe queer review of San Francisco. 7th ed. San Francisco, Calif: Cleis Press, 2003.

7

Benigni, Valentina, Lucyna Gebert, and Julija Nikolaeva, eds. Le lingue slave tra struttura e uso. Florence: Firenze University Press, 2016. http://dx.doi.org/10.36253/978-88-6453-328-5.

Abstract:
The volume, which originates from the Fifth Meeting of Slavic Linguistics (Rome, 25-27 September 2014), reflects the state of the most recent research conducted in Italy in this discipline, continuing the tradition begun at the end of the 1980s with the series Problemi di morfosintassi delle lingue slave (Bologna 1988, 1990, 1991; Padua 1994, 1995) and recently renewed in the volumes that followed the meetings in Bergamo (2007), Padua (2008), Forlì (2010) and Milan (2014). The present contribution testifies to a broadening of interests beyond the boundaries of morphosyntax, toward other areas of theoretical and applied linguistics such as pragmatics, semantics, acquisition and sociolinguistics. Given the variety of topics addressed and methodologies used, the publication may interest not only those carrying out theoretical and applied research on the Slavic languages, but also scholars of general linguistics.
8

Snell, Lawrence D., Sanjiv V. Bhave, Laszlo Takacs, and Boris Tabakoff. Biological Markers of Substance Use. Edited by Kenneth J. Sher. Oxford University Press, 2015. http://dx.doi.org/10.1093/oxfordhb/9780199381708.013.23.

Abstract:
Ascertaining an individual’s history of alcohol consumption is an important component in the proper treatment of accidental trauma or acute or chronic illness, as well as for matters of public health and safety, legal issues, insurance coverage, and the management of and recovery from hazardous/harmful levels of alcohol consumption. Although self-report of alcohol consumption in both research and clinical settings represents the most common mode of assessment, there is long-standing interest in developing objective measures of alcohol consumption that do not rely on the ability or willingness of a person to truthfully report consumption. Biologic diagnostic tests or biomarkers can provide information on current and past quantity and frequency of alcohol consumption. This chapter discusses and evaluates many of the biomarker candidates that have been investigated and provides insights into future searches for optimal diagnostic tools to provide biologic evidence of duration, quantity, and frequency of individual alcohol consumption. We have included a limited discussion of biomarkers for assessing cannabis use since cannabis and alcohol use many times are a concomitant feature of intoxication.
9

Complete ready reckoner, or, Trader's companion: Shewing, at one view, the value of any quantity of goods from one to one thousand, at any price from one farthing to one pound with French and English headings, also containing interest tables. [Quebec?: s.n.], 1985.

10

Mann, Peter. Newton’s Three Laws. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198822370.003.0001.

Abstract:
This chapter introduces Newton’s laws, the Newtonian formulation of mechanics and key concepts such as configuration space and phase space for later development. In 1687, the natural philosopher Sir Isaac Newton published the Principia Mathematica and, with it, sparked the revolutionary ideas key to all branches of classical physics. In this chapter, the system is the object of interest and is considered to be either a single or a collection of generic particles that are not governed by quantum mechanics, for quantum systems do not follow these laws explicitly. Results for systems of particles and conservation laws are presented as the invariance of a given quantity under time evolution. The N-body problem, first integrals, initial value problems and Galilean transformations are all introduced and the Picard iteration and the Verlet algorithm are discussed.
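
Since the abstract mentions the Verlet algorithm as one of the numerical tools introduced, here is a minimal velocity-Verlet sketch for a one-dimensional harmonic oscillator. The force law, step size, and initial condition are illustrative choices, not taken from the chapter.

```python
# Minimal velocity-Verlet integrator for m*x'' = -k*x (illustrative, not from the chapter).
import numpy as np

m, k = 1.0, 1.0
dt, steps = 0.01, 1000
x, v = 1.0, 0.0                      # initial condition

def force(x):
    return -k * x

a = force(x) / m
for _ in range(steps):
    x += v * dt + 0.5 * a * dt**2    # position update
    a_new = force(x) / m
    v += 0.5 * (a + a_new) * dt      # velocity update with averaged acceleration
    a = a_new

energy = 0.5 * m * v**2 + 0.5 * k * x**2
print(f"x={x:.4f}, v={v:.4f}, energy={energy:.6f}")  # energy stays near 0.5, as expected for Verlet
```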

Book chapters on the topic "Quantity of interest"

1

Neittaanmäki, Pekka, Sergey Korotov, and Janne Martikainen. "A Posteriori Error Estimation of “Quantities of Interest” on “Quantity-Adapted” Meshes." In Scientific Computation, 171–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-642-18560-1_11.

2

Pivetti, Massimo. "Interest, Prices and the Quantity of Money: Thomas Tooke and His Main Critic." In An Essay on Money and Distribution, 74–86. London: Palgrave Macmillan UK, 1991. http://dx.doi.org/10.1007/978-1-349-21334-4_8.

3

Romano, Maurizio, Francesco Mola, and Claudio Conversano. "Decomposing tourists’ sentiment from raw NL text to assess customer satisfaction." In Proceedings e report, 147–51. Florence: Firenze University Press, 2021. http://dx.doi.org/10.36253/978-88-5518-304-8.29.

Abstract:
The importance of Word of Mouth is growing day by day in many areas. This phenomenon is evident in everyday life, e.g., the rise of influencers and social media managers. If more people positively discuss specific products, then even more people are encouraged to buy them, and vice versa. This effect is directly affected by the relationship between the potential customer and the reviewer. Moreover, considering the negative reporting bias, it is evident why Word of Mouth analysis is of great interest in many fields. We propose an algorithm to extract the sentiment from a natural language text corpus. The combined approach of Neural Networks, with high predictive power but more challenging interpretation, with simpler but informative models, allows us to quantify a sentiment with a numeric value and to predict whether a sentence has a positive (negative) sentiment. The assessment of an objective quantity improves the interpretation of the results in many fields. For example, it is possible to identify crucial specific sectors that require intervention, improving the company's services while also finding the strengths of the company itself (useful for advertising campaigns). Moreover, since time information is usually available in textual data with a web origin, trends on macro/micro topics can be analyzed. After showing how to properly reduce the dimensionality of the textual data with a data-cleaning phase, we show how to combine WordEmbedding, K-Means clustering, SentiWordNet, and the Threshold-based Naïve Bayes classifier. We apply this method to Booking.com and TripAdvisor.com data, analyzing the sentiment of people who discuss a particular issue, providing an example of customer satisfaction assessment.
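
A much-simplified sketch of the pipeline outlined in the abstract (vectorize reviews, cluster them, then score sentiment against a lexicon with a threshold rule) is given below. The tiny review set, the hand-made lexicon standing in for SentiWordNet, and the threshold value are all assumptions for illustration; the authors additionally use word embeddings and a neural component.

```python
# Simplified sketch of the described pipeline: TF-IDF -> K-Means clustering -> lexicon-based
# threshold classification. The mini lexicon below stands in for SentiWordNet; it is not the
# authors' implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

reviews = [
    "great location and friendly staff",
    "the room was dirty and the service terrible",
    "excellent breakfast, will come back",
    "noisy at night and overpriced",
]

# Topic-style grouping of reviews.
X = TfidfVectorizer().fit_transform(reviews)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Hypothetical word-level polarity scores (SentiWordNet would supply these in the paper).
lexicon = {"great": 0.8, "friendly": 0.6, "excellent": 0.9, "dirty": -0.7,
           "terrible": -0.9, "noisy": -0.5, "overpriced": -0.6}
threshold = 0.0

for text, cl in zip(reviews, clusters):
    score = sum(lexicon.get(w, 0.0) for w in text.split())
    label = "positive" if score > threshold else "negative"
    print(f"cluster {cl} | score {score:+.2f} | {label} | {text}")
```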
4

Zhou, Lin, Zhen Li, Yingmei Chen, and Tong Li. "The Research on Electronic Tag Quantity Estimate Arithmetic Based on Probability Statistics." In Internet of Things, 254–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-32427-7_35.

5

Reeves, Matt, Inés Ibáñez, Dana Blumenthal, Gang Chen, Qinfeng Guo, Catherine Jarnevich, Jennifer Koch, et al. "Tools and Technologies for Quantifying Spread and Impacts of Invasive Species." In Invasive Species in Forests and Rangelands of the United States, 243–65. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-45367-1_11.

Abstract:
The need for tools and technologies for understanding and quantifying invasive species has never been greater. Rates of infestation vary with the species or organism being examined across the United States, and notable examples can be found. For example, from 2001 to 2003 alone, ash (Fraxinus spp.) mortality progressed at a rate of 12.97 km per year (Siegert et al. 2014), and cheatgrass (Bromus tectorum) is expected to increase dominance on 14% of Great Basin rangelands (Boyte et al. 2016). The magnitude and scope of problems that invasive species present suggest novel approaches for detection and management are needed, especially those that enable more cost-effective solutions. The advantages of using technologically advanced approaches and tools are numerous, and the quality and quantity of available information can be significantly enhanced by their use. They can also play a key role in development of decision-support systems; they are meant to be integrated with other systems, such as inventory and monitoring, because often the tools are applied after a species of interest has been detected and a threat has been identified. In addition, the inventory systems mentioned in Chap. 10 are regularly used in calibrating and validating models and decision-support systems. For forested areas, Forest Inventory and Analysis (FIA) data are most commonly used (e.g., Václavík et al. 2015) given the long history of the program. In non-forested systems, national inventory datasets have not been around as long (see Chap. 10), but use of these data to calibrate and validate spatial models is growing. These inventory datasets include the National Resources Inventory (NRI) (e.g., Duniway et al. 2012) and the Assessment Inventory and Monitoring program (AIM) (e.g., McCord et al. 2017). Similarly, use of the Nonindigenous Aquatic Species (NAS) database is growing as well (e.g., Evangelista et al. 2017). The consistent protocols employed by these programs prove valuable for developing better tools, but the data they afford are generally limited for some tools because the sampling intensity is too low.
6

Choi, Baek-Young, Zhi-Li Zhang, and David Hung-Chang Du. "Quantile Sampling for Practical Delay Monitoring in Internet Backbone Networks." In Scalable Network Monitoring in High Speed Networks, 111–43. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4614-0119-3_5.

7

Ndaw, Marie, Gervais Mendy, Samuel Ouya, and Diaraf Seck. "Quantify the Maturity of Internet Banking Security Measures in WAEMU (West African Economic and Monetary Union) Banks." In Innovation and Interdisciplinary Solutions for Underserved Areas, 125–30. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-72965-7_11.

8

Zon, María A., Fernando J. Arévalo, Adrian M. Granero, Sebastián N. Robledo, Gastón D. Pierini, Walter I. Riberi, Jimena C. López, and Héctor Fernández. "Development of Modern Electroanalytical Techniques Based on Electrochemical Sensors and Biosensors to Quantify Substances of Interest in Food Science and Technology." In Practical Applications of Physical Chemistry in Food Science and Technology, 109–28. Innovations in Physical Chemistry: Monographic Series. Apple Academic Press, 2020. http://dx.doi.org/10.1201/9781003020004-5.

9

Brown, James E., Rui Qiang, Paul J. Stadnik, Larry J. Stotts, and Jeffrey A. Von Arx. "RF-Induced Unintended Stimulation for Implantable Medical Devices in MRI." In Brain and Human Body Modeling 2020, 283–92. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-45623-8_17.

Abstract:
Historically, patients with implantable medical devices have been denied access to magnetic resonance imaging (MRI) due to several potentially hazardous interactions. There has been significant interest in recent years to provide access to MRI to patients with implantable medical devices, as it is the preferred imaging modality for soft tissue imaging. Among the potential hazards of MRI for patients with an active implantable medical device is radio frequency (RF)-induced unintended stimulation. RF energy incident on the device may be rectified by internal active components. Any rectified waveform present at the lead electrodes may stimulate nearby tissue. In order to assess the risk to the patient, device manufacturers use computational human models (CHMs) to quantify the incident RF on the device and perform in vitro testing to determine the likelihood of unintended stimulation. The use of CHMs enables the investigation of millions of scenarios of scan parameters, patient sizes and anatomies, and MR system technologies.
10

Friedman, Milton, and Michael D. Bordo. "Interest Rates and the Demand for Money." In The Optimum Quantity of Money, 141–56. Routledge, 2017. http://dx.doi.org/10.4324/9781315133607-7.


Conference papers on the topic "Quantity of interest"

1

Heinz, Jeffrey. "Learning quantity insensitive stress systems via local inference." In the Eighth Meeting of the ACL Special Interest Group. Morristown, NJ, USA: Association for Computational Linguistics, 2006. http://dx.doi.org/10.3115/1622165.1622168.

2

Bane, Max, and Jason Riggle. "Three correlates of the typological frequency of quantity-insensitive stress systems." In the Tenth Meeting of ACL Special Interest Group. Morristown, NJ, USA: Association for Computational Linguistics, 2008. http://dx.doi.org/10.3115/1626324.1626330.

3

Bashir, Muhammad Ahmad, Umar Farooq, Maryam Shahid, Muhammad Fareed Zaffar, and Christo Wilson. "Quantity vs. Quality: Evaluating User Interest Profiles Using Ad Preference Managers." In Network and Distributed System Security Symposium. Reston, VA: Internet Society, 2019. http://dx.doi.org/10.14722/ndss.2019.23392.

4

Wentworth, Mami T., and Ralph C. Smith. "Construction of Bayesian Prediction Intervals for Smart Systems." In ASME 2013 Conference on Smart Materials, Adaptive Structures and Intelligent Systems. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/smasis2013-3168.

Abstract:
In this paper, we employ adaptive Metropolis algorithms to construct densities for parameters and quantities of interest for models arising in the analysis of smart material structures. In the first step of the construction, MCMC algorithms are used to quantify the uncertainty in parameters due to measurement errors. We then combine uncertainties from the input parameters and measurement errors, and construct prediction intervals for the quantity of interest by propagating uncertainties through the models.
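
A minimal sketch of the two-step construction described in the abstract: sample a parameter posterior with a random-walk Metropolis algorithm, then push the samples, together with measurement noise, through the model to obtain a prediction interval for the quantity of interest. The linear model, flat prior, proposal width, and noise level are assumptions, not the smart-material models used in the paper.

```python
# Sketch: Metropolis sampling of a model parameter, then propagation to a prediction
# interval for a quantity of interest (illustrative linear model, not the paper's).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from y = theta * x + noise
theta_true, sigma = 2.0, 0.3
x = np.linspace(0, 1, 30)
y = theta_true * x + rng.normal(0, sigma, x.size)

def log_post(theta):
    # Gaussian likelihood, flat prior
    return -0.5 * np.sum((y - theta * x) ** 2) / sigma**2

# Random-walk Metropolis
samples, theta_cur, lp_cur = [], 1.0, log_post(1.0)
for _ in range(5000):
    prop = theta_cur + rng.normal(0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        theta_cur, lp_cur = prop, lp_prop
    samples.append(theta_cur)
samples = np.array(samples[1000:])          # discard burn-in

# Propagate parameter + measurement uncertainty to the quantity of interest q = theta * x_new
x_new = 0.8
q = samples * x_new + rng.normal(0, sigma, samples.size)
lo, hi = np.percentile(q, [2.5, 97.5])
print(f"95% prediction interval for q at x={x_new}: [{lo:.2f}, {hi:.2f}]")
```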
5

Dommel, Johannes, Dennis Wieruch, Zoran Utkovski, and Slawomir Stanczak. "A Semantics-Aware Communication Scheme to Estimate the Empirical Measure of A Quantity of Interest Via Multiple Access Fading Channels." In 2021 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2021. http://dx.doi.org/10.1109/ssp49050.2021.9513758.

6

Raj, D. Samuel, Jerome Arul Praveen C., and Aarthi S. Kumaran. "Studies on Cryogenic Treated Drills Under Nano-Fluid Based Reduced Quantity Lubrication Conditions for Machining Ti6Al4V." In ASME 2018 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/imece2018-86941.

Abstract:
Minimum quantity lubrication (MQL) machining is gaining wide acceptance because of the need to make machining more environmentally friendly. With cutting fluids contributing more than 10–15% of the manufacturing costs, the economics of MQL is also gaining interest. The initial cost of commercial MQL systems is one of the limiting factors hindering their adoption in small and medium scale industries. This study uses a simple commercial paint sprayer with a portable compressor (with a total cost < $125) to produce an MQL-like spray with a slightly higher flow rate as compared to MQL (and is thus called reduced quantity lubrication) in drilling Ti6Al4V. Tungsten carbide drills of 7 mm diameter were subjected to deep cryogenic treatment, which has been acknowledged as a means of improving tool wear resistance. There was a marginal improvement in drill microhardness as a result of cryogenic treatment. The performance of untreated and cryo-treated drills was compared for tool wear, hole surface roughness and cutting forces. In addition to the normal air-coolant mist, the effect of adding nano-particles in the coolant was also studied. It is found that cryo-treated drills under nano-fluid based reduced quantity lubrication (RQL) conditions performed better than untreated drills and conventional cutting fluids, in terms of all the three measured parameters. This can be attributed to the higher thermal conductivity of nano-fluid based coolants and their ability to reduce the coefficient of friction at the chip-tool interface.
7

Wilson, John P. "Compression of Barotropic Turbulence Simulation Data Using Wavelet-Based Lossy Coding." In ASME 2002 Joint U.S.-European Fluids Engineering Division Conference. ASMEDC, 2002. http://dx.doi.org/10.1115/fedsm2002-31120.

Abstract:
Single-precision floating point data from a simulation of barotropic turbulence is compressed with a wavelet-based method. The quantity being compressed is vorticity. The compression error is evaluated both in terms of error in the vorticity and the error in various quantities derived from the vorticity. Numerical error is evaluated in all quantities, and visualizations of the vorticity and the correlation of the error with the uncompressed data are also examined. It is found that depending on the quantities of interest and the evaluation criteria, compression ratios of 4:1 to 256:1 are achievable. Under a conservative definition of acceptable error, it is possible to recover quantities of interest from data compressed 4:1 (8bpp), the data rate that in existing practice is used for visualization.
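
The workflow the abstract describes (transform a 2-D field, keep only the largest wavelet coefficients, reconstruct, and measure the error at a chosen compression ratio) looks roughly like the sketch below. The random test field, the db4 wavelet, and the 16:1 target ratio are assumptions; this is not the study's coder.

```python
# Sketch of wavelet-based lossy compression of a 2-D field (e.g., vorticity).
# Synthetic data and wavelet choice are illustrative; this is not the paper's coder.
import numpy as np
import pywt

rng = np.random.default_rng(0)
field = rng.standard_normal((256, 256)).astype(np.float32)   # stand-in for a vorticity snapshot

coeffs = pywt.wavedec2(field, "db4", level=4)
# Pool all coefficients to pick a threshold that keeps roughly 1/16 of them (about 16:1).
all_c = np.concatenate([coeffs[0].ravel()] + [d.ravel() for lvl in coeffs[1:] for d in lvl])
keep = all_c.size // 16
thr = np.sort(np.abs(all_c))[-keep]

# Zero out small detail coefficients (the coarse approximation is kept intact here).
compressed = [coeffs[0]] + [
    tuple(pywt.threshold(d, thr, mode="hard") for d in lvl) for lvl in coeffs[1:]
]
recon = pywt.waverec2(compressed, "db4")[: field.shape[0], : field.shape[1]]

rel_err = np.linalg.norm(recon - field) / np.linalg.norm(field)
print(f"relative L2 error at ~16:1 thresholding: {rel_err:.3f}")
```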
8

Ramachandran, Raveesh, Benjamin W. Caldwell, and Gregory M. Mocko. "A User Study to Evaluate the Function Model and Function Interaction Model for Concept Generation." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-47660.

Abstract:
Function modeling is often performed during the conceptual design phase to identify what the product must do in a solution-independent form. However, function-based design approaches do not adequately enable designers to capture and analyze the non-functional requirements, interactions between the product of interest and other products, and interactions between the product of interest and human users. This paper presents the results of a user study to evaluate how two models, (1) traditional function models (FM) and (2) the function interaction model (FIM), relate to functional and non-functional engineering requirements, and how they affect the creation of design solutions. Forty students were divided into two groups and asked to generate solutions for a design problem using either the function model (FM) or the function interaction model (FIM). The concepts were then evaluated in terms of quantity and quality by an external panel. Results from this study indicate that the quantity of solutions generated by the function model (FM) group is greater than that of the function interaction model (FIM) group. However, the quality of design concepts from the function interaction model (FIM) group is greater than that of the function model (FM) group. Further, non-functional requirements that are important to the design solutions cannot be captured in the function model and thus are not reflected in the associated solution concepts.
9

Chamoin, Ludovic, Pierre Ladevèze, and Florent Pled. "Goal-Oriented Control of Finite Element Models: Recent Advances and Performances on 3D Industrial Applications." In ASME 2012 11th Biennial Conference on Engineering Systems Design and Analysis. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/esda2012-83009.

Abstract:
In this work, we present two recent developments in goal-oriented error estimation applied to finite element simulations. The first one is a non-intrusive enrichment of the adjoint solution using handbook techniques, inserting locally the (generalized) Green functions associated with the quantity of interest under study. The second one is a new bounding technique, based on homothetic domains, that is an alternative to classical Cauchy-Schwarz bounds. Technical aspects and capabilities of the resulting verification tool are demonstrated on 3D numerical experiments.
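
Goal-oriented error estimation of the kind discussed above is commonly built on the dual-weighted-residual idea; the sketch below applies it to a 1-D Poisson model problem with a simple quantity of interest. The model problem, meshes, and quantity of interest are assumptions chosen for brevity and do not reproduce the authors' handbook enrichment or homothetic-domain bounding techniques.

```python
# Minimal goal-oriented (dual-weighted residual) error estimate for a 1-D model problem:
# -u'' = 1 on (0,1), u(0)=u(1)=0, quantity of interest J(u) = integral of u (exact value 1/12).
# Illustrative sketch only; not the verification tool described in the paper.
import numpy as np

def solve_poisson(n_el):
    """Linear FEM on a uniform mesh with n_el elements; returns interior nodal values, h, K, F."""
    h = 1.0 / n_el
    n = n_el - 1                               # interior nodes
    K = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)) / h
    F = h * np.ones(n)                         # load vector for f = 1
    return np.linalg.solve(K, F), h, K, F

def qoi(u, h):
    return h * u.sum()                         # trapezoidal rule with zero boundary values

# Coarse primal solve and its quantity of interest
u_c, h_c, _, _ = solve_poisson(8)
J_c = qoi(u_c, h_c)

# Adjoint problem for J(u) = integral of u has the same right-hand side; solve it on a finer mesh
z_f, h_f, K_f, F_f = solve_poisson(16)

# Interpolate the coarse solution (with boundary zeros) onto the fine interior nodes
x_c = np.linspace(0.0, 1.0, 8 + 1)
x_f = np.linspace(0.0, 1.0, 16 + 1)[1:-1]
u_c_on_f = np.interp(x_f, x_c, np.concatenate(([0.0], u_c, [0.0])))

# Dual-weighted residual estimate: J(u) - J(u_c) ~ F(z) - a(u_c, z), evaluated on the fine space
eta = F_f @ z_f - u_c_on_f @ (K_f @ z_f)
print(f"QoI (coarse) = {J_c:.6f},  estimated error = {eta:.2e},  true error = {1/12 - J_c:.2e}")
```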
APA, Harvard, Vancouver, ISO, and other styles
10

Barone, Dominic, Eric Loth, and Philip H. Snyder. "Particle Dynamics of a 2-D Inertial Particle Separator." In ASME Turbo Expo 2014: Turbine Technical Conference and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/gt2014-26922.

Full text
Abstract:
The effects of sand and dust ingestion often limit the useful life of turbine engines operating in austere environments, and efforts are needed to reduce the quantity of particulate entering the engine. Several Engine Air Particle Separation (EAPS) systems exist to accomplish this task. Inertial Particle Separators (IPS) are of particular interest because they offer significant weight savings and are more compact. This study focuses on how small particles are affected by the dynamic fluid forces present in the IPS. Using Multi-Phase Particle Image Velocimetry (MP-PIV), 10 μm and 35 μm glass spheres were tracked through the IPS. The data were also used to analyze the particles' coefficient of restitution normal to the surface (CoR_n̂) where they impact the Outer Surface Geometry (OSG) of the IPS.
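As a minimal illustration of the restitution measurement mentioned above, the sketch below computes a normal coefficient of restitution from incoming and outgoing particle velocities at a wall. The velocities, the wall normal, and the function name are illustrative assumptions, not data or code from the cited study.

```python
# Hedged sketch: normal coefficient of restitution from tracked particle
# velocities before and after a wall impact. Inputs are made up for
# illustration, not measurements from the cited experiment.
import numpy as np

def normal_cor(v_in, v_out, n_hat):
    """CoR_n = |outgoing normal velocity| / |incoming normal velocity|."""
    n_hat = np.asarray(n_hat, dtype=float)
    n_hat = n_hat / np.linalg.norm(n_hat)    # unit wall normal
    v_n_in = np.dot(v_in, n_hat)             # incoming normal component
    v_n_out = np.dot(v_out, n_hat)           # outgoing normal component
    return abs(v_n_out) / abs(v_n_in)

# Example: a particle approaching the surface at 30 m/s and leaving at 18 m/s
# along the normal has CoR_n = 0.6.
print(normal_cor(v_in=[25.0, -30.0], v_out=[24.0, 18.0], n_hat=[0.0, 1.0]))
```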
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Quantity of interest"

1

Carpenter, Jennifer, Fangzhou Lu, and Robert Whitelaw. The Price and Quantity of Interest Rate Risk. Cambridge, MA: National Bureau of Economic Research, February 2021. http://dx.doi.org/10.3386/w28444.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Mercado, Leo A. The National Security Strategy and National Interests: Quantity or Quality? Fort Belvoir, VA: Defense Technical Information Center, April 2001. http://dx.doi.org/10.21236/ada393498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Cavalli, Nicolò. Future orientation and fertility: cross-national evidence using Google search. Verlag der Österreichischen Akademie der Wissenschaften, December 2020. http://dx.doi.org/10.1553/populationyearbook2020.res06.

Full text
Abstract:
Using digital traces to investigate demographic behaviours, in this paper I leverage aggregated web search data to develop a Future Orientation Index for 200 countries and territories across the world. This index is expressed as the ratio of Google search volumes for 'next year' (e.g., 2021) to search volumes for 'current year' (e.g., 2020), adjusted for country-level internet penetration rates. I show that countries with lower levels of future orientation also have higher levels of fertility. Fertility rates decrease quickly as future orientation levels increase, but at the highest levels of future orientation this correlation flattens out. Theoretically, I reconstruct the role that varying degrees of future orientation might play in fertility decisions by incorporating advances in behavioural economics into a traditional quantity-quality framework à la Becker.
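A minimal sketch of the index as described in this abstract is given below. The abstract does not specify the exact form of the internet-penetration adjustment, so the simple division used here is an assumption, and the input numbers are invented for illustration.

```python
# Hedged sketch of the Future Orientation Index described above: the ratio of
# search volume for 'next year' to 'current year', adjusted for internet
# penetration. The adjustment form and the numbers are assumptions.
def future_orientation_index(vol_next_year, vol_current_year, internet_penetration):
    """Ratio of 'next year' to 'current year' search volume, scaled by the
    share of the population that is online (assumed adjustment)."""
    raw_ratio = vol_next_year / vol_current_year
    return raw_ratio / internet_penetration

# Example: search volume index 40 for '2021' vs 80 for '2020' with 50% of the
# population online gives (40 / 80) / 0.5 = 1.0.
print(future_orientation_index(40, 80, 0.5))
```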
APA, Harvard, Vancouver, ISO, and other styles
4

Robledo, Ana, and Amber Gove. What Works in Early Reading Materials. RTI Press, February 2019. http://dx.doi.org/10.3768/rtipress.2018.op.0058.1902.

Full text
Abstract:
Access to books is key to learning to read and sustaining a love of reading. Yet many low- and middle-income countries struggle to provide their students with reading materials of sufficient quality and quantity. Since 2008, RTI International has provided technical assistance in early reading assessment and instruction to ministries of education in dozens of low- and middle-income countries. The central objective of many of these programs has been to improve learning outcomes—in particular, reading—for students in the early grades of primary school. Under these programs, RTI has partnered with ministry staff to produce and distribute evidence-based instructional materials at a regional or national scale, in quantities that increase the likelihood that children will have ample opportunities to practice reading skills, and at a cost that can be sustained in the long term by the education system. In this paper, we seek to capture the practices RTI has developed and refined over the last decade, particularly in response to the challenges inherent in contexts with high linguistic diversity and low operational capacity for producing and distributing instructional materials. These practices constitute our approach to developing and producing instructional materials for early grade literacy. We also touch upon effective planning for printing and distribution procurement, but we do not consider the printing and distribution processes in depth in this paper. We expect this volume will be useful for donors, policymakers, and practitioners interested in improving access to cost-effective, high-quality teaching and learning materials for the early grades.
APA, Harvard, Vancouver, ISO, and other styles
5

Hilbrecht, Margo, Sally M. Gainsbury, Nassim Tabri, Michael J. A. Wohl, Silas Xuereb, Jeffrey L. Derevensky, Simone N. Rodda, Sheila McKnight, Jess Voll, and Brittany Gottvald. Prevention and education evidence review: Gambling-related harm. Edited by Margo Hilbrecht. Greo, September 2021. http://dx.doi.org/10.33684/2021.006.

Full text
Abstract:
This report supports an evidence-based approach to the prevention and education objective of the National Strategy to Reduce Harm from Gambling. Applying a public health policy lens, it considers three levels of measures: universal (for the benefit of the whole population), selective (for the benefit of at-risk groups), and indicated (for the benefit of at-risk individuals). Six measures are reviewed by drawing upon a range of evidence in the academic and grey literature. The universal level measures are “Regulatory restriction on how gambling is provided” and “Population-based safer gambling/responsible gambling efforts.” Selective measures focus on age cohorts in a chapter entitled, “Targeted safer gambling campaigns for children, youth, and older adults.” The indicated measures are “Brief internet delivered interventions for gambling,” “Systems and tools that produced actual (‘hard’) barriers and limit access to funds,” and “Self-exclusion.” Since the quantity and quality of the evidence base varied by measure, appropriate review methods were selected to assess publications using a systematic, scoping, or narrative approach. Some measures offered consistent findings regarding the effectiveness of interventions and initiatives, while others were less clear. Unintended consequences were noted since it is important to be aware of unanticipated, negative consequences resulting from prevention and education activities. After reviewing the evidence, authors identified knowledge gaps that require further research, and provided guidance for how the findings could be used to enhance the prevention and education objective. The research evidence is supplemented by consultations with third sector charity representatives who design and implement gambling harm prevention and education programmes. Their insights and experiences enhance, support, or challenge the academic evidence base, and are shared in a separate chapter. Overall, research evidence is limited for many of the measures. Quality assessments suggest that improvements are needed to support policy decisions more fully. Still, opportunities exist to advance evidence-based policy for an effective gambling harm prevention and education plan.
APA, Harvard, Vancouver, ISO, and other styles