
Journal articles on the topic 'Shifted Lognormal Forward Rates'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 24 journal articles for your research on the topic 'Shifted Lognormal Forward Rates.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

DECAMPS, MARC, MARC GOOVAERTS, and WIM SCHOUTENS. "SELF EXCITING THRESHOLD INTEREST RATES MODELS." International Journal of Theoretical and Applied Finance 09, no. 07 (November 2006): 1093–122. http://dx.doi.org/10.1142/s0219024906003937.

Abstract:
In this paper, we study a new class of tractable diffusions suitable for modeling the primitives of interest rates. We consider scalar diffusions with scale s′(x) and speed m(x) densities discontinuous at the level x*. We call this family of processes Self-Exciting Threshold (SET) diffusions. Following Gorovoi and Linetsky [18], we obtain semi-analytical expressions for the transition density of (killed) SET diffusions. We propose several applications to interest-rate modeling. We show that SET short-rate processes do not generate arbitrage opportunities, and we adapt the HJM procedure to forward rates with discontinuous scale density. We also extend the CEV and the shifted-lognormal LIBOR market models. Finally, the models are calibrated to the US market. SET diffusions can also be used to model stock prices, stochastic volatility, credit spreads, etc.
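For orientation on the topic of this bibliography, the shifted-lognormal forward-rate setup that this paper (and several others below) extends can be sketched as follows. This is the standard displaced-diffusion formulation in generic notation, not an excerpt from the paper; the shift parameter α and the caplet notation are illustrative.

```latex
% Shifted (displaced) lognormal dynamics for the forward LIBOR rate F_k
% under its payment-date forward measure Q^{T_{k+1}}, with shift \alpha > 0:
\[
  dF_k(t) = \sigma_k(t)\,\bigl(F_k(t) + \alpha\bigr)\,dW_k(t).
\]
% Because F_k + \alpha is lognormal, the caplet with strike K and year
% fraction \tau_k is priced by Black's formula applied to shifted arguments:
\[
  \mathrm{Cpl}(0) = P(0,T_{k+1})\,\tau_k
  \Bigl[\bigl(F_k(0)+\alpha\bigr)\Phi(d_1) - (K+\alpha)\Phi(d_2)\Bigr],
  \qquad
  d_{1,2} = \frac{\ln\frac{F_k(0)+\alpha}{K+\alpha} \pm \tfrac12\bar\sigma^2 T_k}
                 {\bar\sigma\sqrt{T_k}},
\]
% with \bar\sigma^2 T_k = \int_0^{T_k} \sigma_k(t)^2\,dt.
```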
2

MERCURIO, FABIO. "MODERN LIBOR MARKET MODELS: USING DIFFERENT CURVES FOR PROJECTING RATES AND FOR DISCOUNTING." International Journal of Theoretical and Applied Finance 13, no. 01 (February 2010): 113–37. http://dx.doi.org/10.1142/s021902491000570x.

Abstract:
We introduce an extended LIBOR market model that is compatible with the current market practice of building different yield curves for different tenors and for discounting. The new paradigm is based on modeling the joint evolution of FRA rates and forward rates belonging to the discount curve. We will start by analyzing the basic lognormal case and then add stochastic volatility. The dynamics of FRA rates under different measures will be obtained, and closed-form formulas for caplets and swaptions derived in the lognormal and Heston (1993) cases.
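As a hedged sketch of the multi-curve setup described in this abstract (standard notation from the post-2008 multi-curve literature, not quoted from the paper):

```latex
% Multi-curve building blocks (standard notation):
%   P_D(t,T): discount-curve (e.g. OIS) zero-coupon bond,
%   L(T_{k-1},T_k): LIBOR fixing for the tenor interval [T_{k-1},T_k].
% The modelled quantity is the FRA rate, the expectation of the fixing
% under the discount curve's T_k-forward measure:
\[
  \mathrm{FRA}_k(t) = \mathbb{E}^{Q_D^{T_k}}\!\left[ L(T_{k-1},T_k) \,\middle|\, \mathcal{F}_t \right],
\]
% which is a martingale under Q_D^{T_k}. In the basic lognormal case,
%   d\,\mathrm{FRA}_k(t) = \sigma_k(t)\,\mathrm{FRA}_k(t)\,dW_k(t),
% and caplets follow from Black's formula with discounting taken off P_D.
```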
3

DUN, TIM, GEOFF BARTON, and ERIK SCHLÖGL. "SIMULATED SWAPTION DELTA–HEDGING IN THE LOGNORMAL FORWARD LIBOR MODEL." International Journal of Theoretical and Applied Finance 04, no. 04 (August 2001): 677–709. http://dx.doi.org/10.1142/s0219024901001127.

Abstract:
Alternative approaches to hedging swaptions are explored and tested by simulation. Hedging methods implied by the Black swaption formula are compared with a lognormal forward LIBOR model approach encompassing all the relevant forward rates. The simulation is undertaken within the LIBOR model framework for a range of swaptions and volatility structures. Despite incompatibilities with the model assumptions, the Black method performs as well as the LIBOR method, yielding very similar distributions for the hedging profit and loss — even at high rehedging frequencies. This result demonstrates the robustness of the Black hedging technique and implies that — being simpler and generally better understood by financial practitioners — it would be the preferred method in practice.
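For readers who want to see what "hedging methods implied by the Black swaption formula" amounts to, here is a minimal Python sketch of the Black payer-swaption price and its forward delta. It is an illustration in standard notation, not the authors' simulation code, and the example figures at the bottom are made up.

```python
import math
from statistics import NormalDist

def black_payer_swaption(fwd_swap_rate, strike, vol, expiry, annuity):
    """Black (1976) payer swaption price and forward delta.

    fwd_swap_rate : forward swap rate S(0)
    strike        : fixed rate K
    vol           : lognormal (Black) volatility of the swap rate
    expiry        : option expiry in years
    annuity       : PV of the fixed leg per unit rate, sum(tau_i * P(0, T_i))
    """
    n = NormalDist()
    std = vol * math.sqrt(expiry)
    d1 = math.log(fwd_swap_rate / strike) / std + 0.5 * std
    d2 = d1 - std
    price = annuity * (fwd_swap_rate * n.cdf(d1) - strike * n.cdf(d2))
    delta = annuity * n.cdf(d1)  # sensitivity to the forward swap rate
    return price, delta

# Illustrative numbers only: 1y expiry, S0 = K = 2.5%, 20% vol, annuity 3.8
print(black_payer_swaption(0.025, 0.025, 0.20, 1.0, 3.8))
```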
4

Goldys, Beniamin. "A note on pricing interest rate derivatives when forward LIBOR rates are lognormal." Finance and Stochastics 1, no. 4 (September 1, 1997): 345–52. http://dx.doi.org/10.1007/s007800050028.

5

VAN APPEL, JACQUES, and THOMAS A. MCWALTER. "EFFICIENT LONG-DATED SWAPTION VOLATILITY APPROXIMATION IN THE FORWARD-LIBOR MODEL." International Journal of Theoretical and Applied Finance 21, no. 04 (June 2018): 1850020. http://dx.doi.org/10.1142/s0219024918500206.

Abstract:
We provide efficient swaption volatility approximations for longer maturities and tenors under the lognormal forward-LIBOR model (LFM). In particular, we approximate the swaption volatility with a mean update of the spanning forward rates. Since the joint distribution of the forward rates is not known under a typical pricing measure, we resort to numerical discretization techniques. More specifically, we approximate the mean forward rates with a multi-dimensional weak order 2.0 Itô–Taylor scheme. The higher-order terms allow us to more accurately capture the state dependence in the drift terms and compute conditional expectations with second-order accuracy. We test our approximations for longer maturities and tenors using a quasi-Monte Carlo (QMC) study and find them to be substantially more effective when compared to the existing approximations, particularly for calibration purposes.
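A common baseline in this strand of the literature is Rebonato's frozen-weight swaption volatility approximation; the sketch below states it in standard notation (my summary, not an excerpt from the paper) so that the refinement described in the abstract has a reference point.

```latex
% Rebonato's frozen-weight approximation. Write the forward swap rate as a
% weighted combination of the spanning forward rates,
%   S(t) = \sum_i w_i(t) F_i(t),
% then freeze weights and rates at time zero to obtain the Black swaption
% volatility over [0,T]:
\[
  \sigma_{\mathrm{swpt}}^{2}\,T \;\approx\; \sum_{i,j}
  \frac{w_i(0)\,w_j(0)\,F_i(0)\,F_j(0)\,\rho_{ij}}{S(0)^{2}}
  \int_{0}^{T} \sigma_i(t)\,\sigma_j(t)\,dt .
\]
% The refinement described in the abstract replaces the frozen F_i(0) with
% mean-updated forward rates computed by a weak order 2.0 Ito-Taylor scheme.
```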
6

VAN APPEL, JACQUES, and THOMAS A. MCWALTER. "MOMENT APPROXIMATIONS OF DISPLACED FORWARD-LIBOR RATES WITH APPLICATION TO SWAPTIONS." International Journal of Theoretical and Applied Finance 23, no. 07 (November 2020): 2050046. http://dx.doi.org/10.1142/s0219024920500466.

Abstract:
We present an algorithm to approximate moments for forward rates under a displaced lognormal forward-LIBOR model (DLFM). Since the joint distribution of rates is unknown, we use a multi-dimensional full weak order 2.0 Itô–Taylor expansion in combination with a second-order Delta method. This more accurately accounts for state dependence in the drift terms, improving upon previous approaches. To verify this improvement we conduct quasi-Monte Carlo simulations. We use the new mean approximation to provide an improved swaption volatility approximation, and compare this to the approaches of Rebonato, Hull–White and Kawai, adapted to price swaptions under the DLFM. Rebonato and Hull–White are found to be the least accurate. While Kawai is the most accurate, it is computationally inefficient. Numerical results show that our approach strikes a balance between accuracy and efficiency.
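To make the numerical scheme concrete, the sketch below (my own illustration, not the authors' code) applies a weak order 2.0 Itô–Taylor step to the simplest possible case: a single displaced-lognormal forward rate that is driftless under its own forward measure. With zero drift and diffusion b(L) = σ(L + δ), the higher-order weak terms vanish and the step reduces to the Milstein form; the paper's actual contribution lies in handling the state-dependent drifts that appear when many rates are simulated under one common measure.

```python
import numpy as np

def weak_taylor_2_driftless_displaced(L0, sigma, delta, T, n_steps, n_paths, seed=0):
    """Weak order 2.0 Ito-Taylor simulation of dL = sigma * (L + delta) dW,
    a displaced-lognormal rate that is driftless under its own forward measure.

    For b(L) = sigma * (L + delta) we have b' = sigma and b'' = 0, so the
    weak 2.0 step reduces to L += b*dW + 0.5*b*b'*(dW**2 - dt).
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    L = np.full(n_paths, float(L0))
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        b = sigma * (L + delta)
        L = L + b * dW + 0.5 * b * sigma * (dW**2 - dt)
    return L

# Sanity check: the simulated mean should stay near L0 (martingale property).
paths = weak_taylor_2_driftless_displaced(L0=0.02, sigma=0.3, delta=0.01,
                                          T=1.0, n_steps=12, n_paths=100_000)
print(paths.mean())  # approximately 0.02
```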
7

Scheler, Gabriele. "Logarithmic distributions prove that intrinsic learning is Hebbian." F1000Research 6 (July 25, 2017): 1222. http://dx.doi.org/10.12688/f1000research.12130.1.

Abstract:
In this paper, we document lognormal distributions for spike rates, synaptic weights and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights and gains in all brain areas. The difference between strongly recurrent and feed-forward connectivity (cortex vs. striatum and cerebellum), neurotransmitter (GABA (striatum) or glutamate (cortex)) or the level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turns out to be irrelevant for this feature. Logarithmic scale distribution of weights and gains appears as a functional property that is present everywhere. Secondly, we created a generic neural model to show that Hebbian learning will create and maintain lognormal distributions. We could prove with the model that not only weights, but also intrinsic gains, need to have strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This settles a long-standing question about the type of plasticity exhibited by intrinsic excitability.
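The mechanism at work here generalises Gibrat's law: updates that are multiplicative in the current value make the logarithm perform a random walk, which yields a lognormal distribution. The toy simulation below is my illustration of that general point, not the paper's model; all parameters are arbitrary.

```python
import numpy as np

# Toy illustration: multiplicative (Hebbian-like) weight updates, in which
# each change is proportional to the current weight, drive the weight
# distribution toward lognormal (the log-weight performs a random walk).
rng = np.random.default_rng(1)
n_synapses, n_updates = 10_000, 500
w = np.ones(n_synapses)
for _ in range(n_updates):
    # small multiplicative potentiation/depression, zero mean in log-space
    w *= np.exp(rng.normal(0.0, 0.05, n_synapses))
log_w = np.log(w)
print(log_w.mean(), log_w.std())  # log-weights approx normal => w approx lognormal
```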
8

Scheler, Gabriele. "Logarithmic distributions prove that intrinsic learning is Hebbian." F1000Research 6 (October 11, 2017): 1222. http://dx.doi.org/10.12688/f1000research.12130.2.

Abstract:
In this paper, we present data for the lognormal distributions of spike rates, synaptic weights and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights and gains in all brain areas examined. The difference between strongly recurrent and feed-forward connectivity (cortex vs. striatum and cerebellum), neurotransmitter (GABA (striatum) or glutamate (cortex)) or the level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turns out to be irrelevant for this feature. Logarithmic scale distribution of weights and gains appears to be a general, functional property in all cases analyzed. We then created a generic neural model to investigate adaptive learning rules that create and maintain lognormal distributions. We conclusively demonstrate that not only weights, but also intrinsic gains, need to have strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This provides a solution to the long-standing question about the type of plasticity exhibited by intrinsic excitability.
9

Kuznetsov, Victor P., Andery S. Skorobogatov, and Vladimir G. Gorgots. "IMPACT OF INDENTOR SLIDING VELOCITY AND LOADING REPETITION FACTOR ON SHEAR STRAIN AND STRUCTURE DISPERSION IN NANOSTRUCTURING BURNISHING." Facta Universitatis, Series: Mechanical Engineering 17, no. 2 (July 26, 2019): 161. http://dx.doi.org/10.22190/fume190330023k.

Abstract:
The article probes the relationship between the shear strain intensity and shear strain rate in the surface layer on the one hand, and the sliding velocity of a spherical indentor and its loading repetition factor on the other. It puts forward an experimental procedure to evaluate the shear strain intensity and rate by analyzing the geometrical parameters of the bulge of plastically displaced metal and the thickness of the shifted layer at different sliding velocities and feed rates.
10

Khripach, Ludmila V., T. D. Knjazeva, S. M. Yudin, S. V. German, and I. E. Zykova. "COMPARATIVE ANALYSIS OF SERUM ANTIBODY RESPONSES TO H.PYLORI AND TO RECOMBINANT CAGA IN THE COHORT OF WORKING-AGE MOSCOW ADULTS." Hygiene and sanitation 97, no. 9 (September 15, 2018): 785–90. http://dx.doi.org/10.18821/0016-9900-2018-97-9-785-790.

Abstract:
Introduction. Helicobacter pylori (Hp) is a helix-shaped bacterium evolutionarily adapted to living in the mucous layer of the stomach. It is usually considered one of the factors in the development of gastritis, peptic ulcer and gastric cancer, although opposing opinions have also been discussed. The aim of this study was to assess levels of serum antibodies to Hp and recombinant CagA in a cohort of working-age Moscow adults. Methods. Commercial ELISA kits “IFA-Helicobacter IgG”© (ZAO EKOlab, Russia) and “HelicoBest-antibodies”© (ZAO Vector-Best, Russia) were applied for the estimation of serum antibodies to Hp and CagA, respectively, in the observed cohort (adults of both genders, N=319). Results. 85% of the cohort (N=271) had positive rates of IgG antibodies against complex Hp antigen, with a lognormal distribution of IgG titers (median 1:688; Q1–Q3 1:370–1:1223) and a cut-off value equal to 1:100. 54% of the cohort (N=172) were seropositive to recombinant CagA, with levels of total serum antibodies (IgM, IgA and IgG) from 23 to 129 ELISA units (median 87.9; Q1–Q3 56.7–102.5) and a cut-off value equal to 18.5 EU. The distribution of CagA antibody levels was sharply different from the lognormal distribution of IgG titers to complex Hp antigen and showed signs of bimodality, with the main maximum shifted to the right. In the complete cohort under observation (N=319), the levels of serum antibodies to Hp and CagA were associated with a weak (R=0.217) but highly significant (p=0.00009) positive linkage; persons seropositive to both antigens showed no association between the markers. Discussion. Possible reasons for the differences in the shape of the distributions of the studied markers are discussed. Taking into account the extraordinary genetic variability of natural Hp isolates, the lognormal distribution of antibodies to complex Hp antigen can reflect combinatorial differences in the degree of proximity between the Hp antigenic determinants of the persons under observation and the antigenic preparation. The bimodal distribution of antibody levels to the individual protein CagA possibly reflects genetically determined differences in immunoreactivity within the observed cohort.
11

Burnete, Sorin. "Industries in Central and Eastern Europe. Enhancing Competitiveness by Integrating Services into Manufacturing." Human and Social Studies 4, no. 1 (March 1, 2015): 30–42. http://dx.doi.org/10.1515/hssr-2015-0003.

Abstract:
During the last two decades, the intra-industry trade between western companies and former socialist enterprises in Central and Eastern Europe gradually shifted from the subcontracting of marginal operations such as final assembly to the outsourcing of products and intermediate inputs. To further enhance their competitiveness, firms in Central and Eastern Europe have yet to take one more step forward: integrate services with manufacturing. Developing such capabilities hinges, aside from intensive training and learning, on the existence of functional, interactive, knowledge-based innovation systems. Whereas Central and East European economies exhibit conspicuous weaknesses in this last respect, they still possess a countervailing advantage that is apt to lure foreign investors into the region: lower wage rates relative to western countries across all industries and skill levels. Offshoring therefore seems to be the most appropriate means to reconcile the two sides of the coin.
12

Hofmann, B., D. Düvelmeyer, and K. Krumbiegel. "APPROXIMATE SOURCE CONDITIONS IN TIKHONOV REGULARIZATION‐NEW ANALYTICAL RESULTS AND SOME NUMERICAL STUDIES." Mathematical Modelling and Analysis 11, no. 1 (March 31, 2006): 41–56. http://dx.doi.org/10.3846/13926292.2006.9637301.

Abstract:
We present some new ideas and results for finding convergence rates in Tikhonov regularization for ill-posed linear inverse problems with compact and non-compact forward operators, based on the consideration of approximate source conditions and corresponding distance functions. The new results and studies complement and extend in numerous points the recent papers [5, 7, 8, 10] that also exploit the distance functions originally introduced in [2], which measure the violation of a moderate source condition that works as a benchmark. In this context, we distinguish, as in [8], logarithmic, power and exponential decay rates for the distance functions and their consequences. Under specific range inclusions the decay rate of distance functions is verified explicitly, whereas in [10] this result is also used but formulated only in an implicit manner. Applications to non-compact multiplication operators are briefly reviewed from [8]. An important new result is that we can show for compact operators a one-to-one correspondence between the maximal power-type decay rates for the distance functions and maximal exponents of Hölder rates in Tikhonov regularization, linked by the specific singular value expansion of the solution element. Some numerical studies on simple integration illustrate the compact operator case and the specific situation of discretized problems. Finally, some ideas of generalization are mentioned concerning the fact that the benchmark of the distance function can be shifted.
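For orientation, the standard objects behind this abstract can be written out as follows (textbook notation, not taken from the paper):

```latex
% Tikhonov regularisation of the ill-posed linear equation A x = y:
\[
  x_\alpha^{\delta} = \arg\min_{x \in X}\; \|A x - y^{\delta}\|^2 + \alpha \|x\|^2 .
\]
% Benchmark (moderate) source condition for the exact solution x^\dagger:
%   x^\dagger = A^* w  for some source element w.
% Distance function measuring the violation of that benchmark at radius R:
\[
  d(R) = \inf\left\{ \|x^{\dagger} - A^{*} w\| \;:\; \|w\| \le R \right\},
\]
% whose logarithmic, power, or exponential decay as R grows translates into
% corresponding (e.g. Hölder-type) convergence rates for x_\alpha^\delta.
```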
13

Middlebrooks, J. C., and E. I. Knudsen. "Changes in external ear position modify the spatial tuning of auditory units in the cat's superior colliculus." Journal of Neurophysiology 57, no. 3 (March 1, 1987): 672–87. http://dx.doi.org/10.1152/jn.1987.57.3.672.

Abstract:
This study examines the influence of external ear position on the auditory spatial tuning of single units in the superior colliculus of the anesthetized cat. Unit responses to broad-band stimuli presented in a free sound field were measured with the external ears in a forward symmetrical position or with one or the other ear turned 40 degrees to the side; the ears are referred to as contra- or ipsilateral with respect to the side of the recording site. Changes in the position of either ear modified the spatial tuning of units. The region of space from which a stimulus was most effective in activating a unit is referred to as the unit's “best area”. Whenever the contralateral ear was turned to the side, best areas shifted peripherally and somewhat upward, roughly in proportion to the magnitude of the change in ear position. A turn of the ipsilateral ear to the side had more variable effects, but best areas generally shifted frontally. Best areas located between approximately 10 and 40 degrees contralateral when the ears were forward were least affected by changes in ipsilateral ear position. Changes in ear position also modified the maximum response rates of many units. Units with best areas located within approximately 20 degrees of the frontal midline when the ears were forward exhibited a pronounced decrease in responsiveness when either ear was turned. Units with more peripheral best areas tended to show no change or a slight increase in responsiveness. The influence of ear position on the directionality of the external ears was determined by mapping the cochlear microphonic response to tones or one-third-octave bands of noise before and after turning the ear. When the ears were forward, maximum interaural intensity differences (IIDs) were produced by high-frequency sound sources (greater than or equal to 20 kHz) located 20-40 degrees from the frontal midline and by lower frequency sources located further peripherally. The influence of ear position on the locations from which maximum IIDs were produced was similar to the influence of ear position on unit best areas. Changes in ipsilateral ear position had different effects on high- and low-frequency IIDs that were comparable with the effects of changes in ear position on frontally and peripherally located best areas, respectively.(ABSTRACT TRUNCATED AT 400 WORDS)
14

Wada, Tokio, and Larry D. Jacobson. "Regimes and stock-recruitment relationships in Japanese sardine (Sardinops melanostictus), 1951-1995." Canadian Journal of Fisheries and Aquatic Sciences 55, no. 11 (November 1, 1998): 2455–63. http://dx.doi.org/10.1139/f98-135.

Abstract:
We used reproductive success, rather than abundance or catch, to identify regimes because reproductive success responds faster to environmental changes. Peak abundance of Japanese sardine during 1951-1995 was about 1000 times higher than minimum abundance. A regime shift occurred in the early 1970s when carrying capacity (measured using spawner-recruit models) increased by about 75 times. We hypothesize that this was due to large-scale changes in the Kuroshio and Oyashio Current systems. Long-term environmental variation (regimes), interannual variability in recruitment success, and density-dependent recruitment and growth rates affected dynamics of Japanese sardine. We hypothesize that density-dependent effects on recruitment of Sardinops spp. are common but usually obscured in short data sets by environmental variability and measurement error. Virtual population analysis and forward-simulation modeling approaches gave similar biomass and recruitment estimates. The relationship between sardine biomass and catch per unit search time was nonlinear. Mass-at-age and biomass were correlated, and it may be possible to use mass-at-age as an abundance index. Current abundance is low, and we believe that the environment has shifted to a regime that is unfavorable for Japanese sardine.
15

De Groof, Vicky, Marta Coma, Tom Arnot, David J. Leak, and Ana B. Lanham. "Medium Chain Carboxylic Acids from Complex Organic Feedstocks by Mixed Culture Fermentation." Molecules 24, no. 3 (January 22, 2019): 398. http://dx.doi.org/10.3390/molecules24030398.

Abstract:
Environmental pressures caused by population growth and consumerism require the development of resource recovery from waste, hence a circular economy approach. The production of chemicals and fuels from organic waste using mixed microbial cultures (MMC) has become promising. MMC use the synergy of bio-catalytic activities from different microorganisms to transform complex organic feedstock, such as by-products from food production and food waste. In the absence of oxygen, the feedstock can be converted into biogas through the established anaerobic digestion (AD) approach. The potential of MMC has shifted to production of intermediate AD compounds as precursors for renewable chemicals. A particular set of anaerobic pathways in MMC fermentation, known as chain elongation, can occur under specific conditions producing medium chain carboxylic acids (MCCAs) with higher value than biogas and broader applicability. This review introduces the chain elongation pathway and other bio-reactions occurring during MMC fermentation. We present an overview of the complex feedstocks used, and pinpoint the main operational parameters for MCCAs production such as temperature, pH, loading rates, inoculum, head space composition, and reactor design. The review evaluates the key findings of MCCA production using MMC, and concludes by identifying critical research targets to drive forward this promising technology as a valorisation method for complex organic waste.
16

Kazmi, Aqdas Ali. "An Econometric Estimation of Tax-discounting in Pakistan." Pakistan Development Review 34, no. 4III (December 1, 1995): 1067–77. http://dx.doi.org/10.30541/v34i4iiipp.1067-1077.

Abstract:
The debt neutrality hypothesis, which has been a source of major controversies in the theory of public finance and macroeconomics, has at the same time generated a vast literature on the implications of budgetary deficits and public debt for various subsectors/variables of the economy, such as inflation, interest rates, the current account deficit, etc. Tax-discounting has been one of the fields of research associated with debt neutrality. The econometric estimation of some of the standard models of tax-discounting has shown that consumer response to fiscal policy in Pakistan reflects neither the extreme Barro-like rational anticipation of future tax liabilities nor the Buchanan-type extreme fiscal myopia. It broadly follows a middle path between these extremes. The controversy relating to debt neutrality is quite old in economic theory. However, due to its serious and far-reaching implications for the formulation of fiscal policy and macroeconomic management, the issues of debt neutrality have assumed a foremost position in economic theorisation and empirical testing. This controversy is based on two important questions: (a) Who bears the burden of the debt? (b) Should debt be used to finance public expenditure? The first question centres on whether the debt can be shifted forward in time, while the second question explores whether taxation is equivalent to debt in its effects on the national economy.
17

Aguiar, Guilherme Brasileiro de, Rafael Gomes dos Santos, Vinícius Ricieri Ferraz, André Freitas Nunes, Rodrigo Salmeron de Toledo Aguiar, Maurício Jory, Mario Luiz Marques Conti, and José Carlos Esteves Veiga. "Blister like aneurysm: a review about its endovascular management / Aneurismas Blister-Like: uma revisão sobre seu tratamento endovascular." Arquivos Médicos dos Hospitais e da Faculdade de Ciências Médicas da Santa Casa de São Paulo 63, no. 3 (December 10, 2018): 208. http://dx.doi.org/10.26432/1809-3019.2018.63.3.208.

Abstract:
Introduction: Blood blister-like aneurysms (BBAs) are rare cerebrovascular lesions for which the endovascular treatment methods are reviewed here. The reported pathogenesis varies, and hemodynamic stress, arterial dissection, and arteriosclerotic ulceration have all been described. The excessive fragility of BBAs and their parent vessels can make microsurgical clipping technically difficult. Surgical treatment is associated with high rates of complications, morbidity, and mortality. The approach to the treatment of BBAs in recent times has shifted from microsurgical treatment to endovascular treatment, thanks to ongoing innovations in endovascular techniques and devices. Method: The authors performed a review of available endovascular techniques used for the treatment of blood blister-like aneurysms. The PubMed database was used as the search source, with “blood blister-like aneurysm” and “blister aneurysms” as keywords. The most relevant articles and those that focused on endovascular treatment techniques were selected. Discussion: Endovascular interventional techniques have evolved as an effective treatment for intracranial aneurysms. Considerable interest has emerged regarding the use of endovascular approaches to treat BBAs. In some studies, endovascular treatment of BBAs was associated with high rates of complete occlusion and good mid- to long-term neurological outcomes. Various endovascular techniques have been applied to treat BBAs, such as coil embolization, stenting, stent-assisted coiling and flow-diverting stents. Conclusion: Of the available endovascular techniques, flow-diverting stents appear to be the safest and most effective treatment modality, with a higher rate of complete occlusion of the aneurysm and a lower rate of retreatment. The development of novel flow-diverting stents with decreased thrombogenic properties may represent a key step forward and increase the potential for flow diversion becoming the gold standard for endovascular treatment of BBAs. Keywords: Intracranial aneurysm; Aneurysm, ruptured; Carotid artery disease; Subarachnoid hemorrhage; Endovascular procedures
18

Nzewi, Ogochukwu. "Gender and HIV/AIDS: Exploring Men and Vulnerability Towards Effective HIV/AIDS Policy Interventions and Sub-Saharan Africa." Africa’s Public Service Delivery and Performance Review 1, no. 1 (June 1, 2012): 55. http://dx.doi.org/10.4102/apsdpr.v1i1.24.

Abstract:
This article examines the dynamics between HIV/Aids gender policy strategies and the socio-political demands on HIV/Aids interventions in sub-Saharan Africa. Gender in HIV/Aids intervention seems inescapable. Nowhere else is this more marked than in the social dimensions of HIV/Aids prevention in sub-Saharan Africa. This has resulted in prevention strategies which are encumbered by the reality of poverty, gender, access, power and the various debates on behavioural change. The social constructions of gender roles and power relations play a significant role in the region’s HIV/Aids dynamic. To this end, the mainstreaming of gender issues into national political, social and economic agendas and policies has been championed by international development and economic institutions. In developing HIV/Aids intervention policies, gender has also been mainstreamed, especially where epidemiological data show the disparity in infection rates between men and women, with women seen as more susceptible to infection. The gendered approach to HIV/Aids appears to typecast women as the vulnerable and suffering face of HIV/Aids, while men, as ‘the other’, are generally regarded as the perpetrators and spreaders of the virus. While there is no doubt that women’s vulnerability in this milieu is well documented in the research evidence, the institutional (social, cultural and economic) and historical vulnerabilities of African men are sometimes overlooked. Recently, greater focus has shifted to curbing infection rates in men, based on new scientific evidence showing that the risk of transmission in circumcised men is reduced. The article argues that such movement towards showing areas of men’s vulnerability as a focus in HIV/Aids policy interventions may have the potential to shift the observed burden that current HIV/Aids policy thrusts inadvertently place on African women. The article will put forward an argument for ‘the vulnerable other’ in HIV/Aids policy intervention, suggesting a new continental policy strategy that sees men going from peripheral footnotes to the centre of HIV/Aids policy and intervention programmes.
19

de Reus, M., S. Borrmann, A. Bansemer, A. J. Heymsfield, R. Weigel, C. Schiller, V. Mitev, et al. "Evidence for ice particles in the tropical stratosphere from in-situ measurements." Atmospheric Chemistry and Physics 9, no. 18 (September 18, 2009): 6775–92. http://dx.doi.org/10.5194/acp-9-6775-2009.

Abstract:
In-situ ice crystal size distribution measurements are presented from the tropical troposphere and lower stratosphere. The measurements were performed using a combination of a Forward Scattering Spectrometer Probe (FSSP-100) and a Cloud Imaging Probe (CIP), which were installed on the Russian high-altitude research aircraft M55 "Geophysica" during the SCOUT-O3 campaign in Darwin, Australia. One of the objectives of the campaign was to characterise the Hector convective system, which appears on an almost daily basis during the pre-monsoon season over the Tiwi Islands, north of Darwin. In total, 90 encounters with ice clouds between 10 and 19 km altitude were selected from the dataset and analysed. Six of these encounters were observed in the lower stratosphere, up to 1.4 km above the local tropopause. Concurrent lidar measurements on board "Geophysica" indicate that these ice clouds were a result of overshooting convection. Large ice crystals, with a maximum dimension up to 400 μm, were observed in the stratosphere. The stratospheric ice clouds had an ice water content ranging from 7.7×10⁻⁵ to 8.5×10⁻⁴ g m⁻³ and were observed at ambient relative humidities (with respect to ice) between 75 and 157%. Three-modal lognormal size distributions were fitted to the average size distributions for different potential temperature intervals, showing that the shape of the size distribution of the stratospheric ice clouds is similar to those observed in the upper troposphere. In the tropical troposphere the effective radius of the ice cloud particles decreases from 100 μm at about 10 km altitude to 3 μm at the tropopause, while the ice water content decreases from 0.04 to 10⁻⁵ g m⁻³. No clear trend in the number concentration was observed with altitude, due to the thin and inhomogeneous characteristics of the observed cirrus clouds. The ice water content calculated from the observed ice crystal size distribution is compared to the ice water content derived from two hygrometer instruments. This independent measurement of the ice water content agrees within the combined uncertainty of the instruments for ice water contents exceeding 3×10⁻⁴ g m⁻³. Stratospheric residence times, calculated based on gravitational settling, and evaporation rates show that the ice crystals observed in the stratosphere over the Hector storm system had a high potential of humidifying the stratosphere locally. Utilizing total aerosol number concentration measurements from a four-channel condensation particle counter during two separate campaigns, it can be shown that the ratio of ice particles to the number of remaining aerosol particles ranges from 1:300 to 1:30,000 for tropical upper-tropospheric ice clouds with ambient temperatures below −75°C.
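The "three-modal lognormal size distributions" referred to here have a standard functional form in the aerosol and cloud literature, sketched below in conventional notation (not copied from the paper):

```latex
% Multi-modal lognormal number size distribution (3 modes):
\[
  \frac{dN}{d\ln D} \;=\; \sum_{i=1}^{3}
  \frac{N_i}{\sqrt{2\pi}\,\ln\sigma_i}\,
  \exp\!\left[ -\frac{\left(\ln D - \ln \bar{D}_i\right)^{2}}{2\,\ln^{2}\sigma_i} \right],
\]
% where, for mode i, N_i is the number concentration, \bar{D}_i the median
% diameter and \sigma_i the geometric standard deviation.
```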
20

Franz, Gerhard. "Plasma Enhanced Chemical Vapor Deposition of Organic Polymers." Processes 9, no. 6 (June 1, 2021): 980. http://dx.doi.org/10.3390/pr9060980.

Abstract:
Chemical Vapor Deposition (CVD), with its plasma-enhanced variant (PECVD), is a powerful instrument in the toolbox of surface refinement for covering a surface with a layer of very even thickness. Its lateral and vertical conformity is remarkable and second to none. Originating from the evaporation of elements, the technique was soon applied to deposit compound layers by simultaneous evaporation of two or three elemental sources; today, CVD is applied mainly to vaporous reactants, whereas the evaporation of solid sources has almost completely shifted to epitaxial processes with even lower deposition rates but growth that is adapted to the crystalline substrate. CVD means first breaking chemical bonds, followed by an atomic reorientation. As a result, a new compound is generated. Breaking bonds requires energy, i.e., heat. Therefore, it was a giant step forward to use plasmas for this rate-limiting step. In most cases, the maximum temperature could be significantly reduced, and eventually organic compounds also moved into the preparative focus. Even molecules with saturated bonds (CH4) were subjected to plasmas—and the result was diamond! In this article, some of these strategies are portrayed. One issue is the variety of reaction paths that can occur in a low-pressure plasma: it can act as a source for deposition and for etching, which turn out to be two sides of the same coin. Therefore, the view is directed to the reasons for this behavior. The advantages and disadvantages of three of the most widespread types, namely microwave-driven plasmas and the two types of radio-frequency-driven plasmas, denoted Capacitively-Coupled Plasmas (CCPs) and Inductively-Coupled Plasmas (ICPs), are described. The view is also directed towards the surface analytics of the deposited layers—a very delicate issue, because carbon is the atom most prone to forming multiple bonds and branched polymers, which causes multifold reaction paths in almost all cases. Purification of a mixture of volatile compounds is not at all an easy task, but it is impossible for solids. Therefore, the characterization of film properties is often oriented more towards typical surface properties, e.g., hydrophobicity or dielectric strength, than towards chemical parameters, e.g., certain spectra which characterize the purity (infrared or Raman). Besides diamond and Carbon Nano Tubes (CNTs), one of the polymers which exhibits an almost threadlike character is poly-p-xylylene, commercially denoted parylene, which has turned out to be a film with outstanding properties when compared to other synthetics. Therefore, CVD deposition of parylene is making inroads in several technical fields. Even applications with tight requirements on coating quality, like gate dielectrics for the semiconductor industry and semi-permeable layers for drug-eluting implants in medical science, are coming within its purview. Plasma enhancement of chemical vapor deposition has opened the window for coatings with remarkable surface qualities. In the case of diamond and CNTs, purity can be proven by spectroscopic methods. In all other cases, quantitative measurements of bulk or surface parameters are more appropriate for describing and evaluating the quality of the coatings.
21

"A Resolution to Valuation Conflicts of Swaptions/Caps and OIS/LIBOR." Journal of Fixed Income, January 1, 2019. http://dx.doi.org/10.3905/jfi.2018.28.3.068.

Abstract:
In this article, the authors provide a unified valuation framework under which a multicurve economy can be established and caps/floors and swaptions can be consistently priced. Furthermore, if a lognormal distribution is employed for the forward price (or 1 plus forward rate), then a “model-free” volatility calibration can be achieved, and all swaptions and caps/floors are perfectly repriced. This article leverages earlier work by Chen, Hsieh, and Huang (2017) who fix a crucial drift-adjustment problem of the traditional LIBOR market model (LMM) where the LIBOR rates follow a lognormal distribution. By assuming 1 + LIBOR to be lognormal (hence LIBOR is shifted lognormal), Chen, Hsieh, and Huang achieve an exact and deterministic drift-adjustment term. In this article, they extend the model to provide a perfect calibration to both swaptions and caps/floors (which is not doable under the traditional LMM), and by using a foreign currency analogy, they show that the model supports multiple curves, which is a key element to overnight index swap (OIS) discounting.
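The "shifted lognormal" observation at the heart of this approach can be sketched in standard LMM notation (my summary of the mechanism, not an excerpt from the article):

```latex
% If X_k(t) = 1 + \tau_k L_k(t) is modelled as lognormal under the
% T_{k+1}-forward measure,
%   dX_k(t) = \sigma_k(t)\, X_k(t)\, dW_k(t),
% then the LIBOR rate itself is a shifted lognormal variable:
\[
  L_k(t) = \frac{X_k(t) - 1}{\tau_k},
  \qquad L_k(t) + \frac{1}{\tau_k} \ \text{lognormal}.
\]
% Changes between adjacent forward measures are driven by the bond-price
% ratio P(t,T_k)/P(t,T_{k+1}) = X_k(t); with X_k lognormal, the resulting
% drift-adjustment term is exact and deterministic rather than state-dependent.
```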
22

Rolloos, Frido. "A Multi-Factor Shifted Lognormal Model for Forward Starting Variance Swaps." SSRN Electronic Journal, 2019. http://dx.doi.org/10.2139/ssrn.3486194.

23

Fronteira, I., J. Simoes, and G. Augusto. "Informal carer in Portugal: moving towards political recognition in Portugal." European Journal of Public Health 29, Supplement_4 (November 1, 2019). http://dx.doi.org/10.1093/eurpub/ckz187.178.

Abstract:
Informal care represents around 80% of all long-term care provided in EU countries. Nevertheless, the need for this type of care is expected to increase in the coming years in all OECD countries. Portugal is among the OECD countries with the highest ageing index (21.5% of the population was older than 65 years in 2017) due to high life expectancy and low fertility rates. As this demographic trend continues, Portugal is expected to have more than 40% of the population over 65 years by 2037, and the expected prevalence of dementia is 3% in 2050. In 2015, 2.1% of people over 65 were receiving long-term care, representing 52% of all long-term care users. Around 38% were receiving care at home. It is estimated that 287,000 people in Portugal depend on informal carers. The agenda towards the official recognition of informal carers has been pushed forward in the country. Since 2015, several recommendations have been issued by the Parliament, as well as legislative initiatives, and a proposal for a Status of the Informal Carer is currently under discussion. We analyse the process of formulation of this policy in terms of the sectors and stakeholders involved, the definition and scope of the informal carer, rights and obligations, the role of the person being cared for, formal protection (e.g., labor, social, financial, training) and implementation. Recognition of the informal carer is a sector-wide approach. One of its main features is economic, social and labor protection, mainly through reconciliation between work life and caring activities and promotion of the carer’s well-being. Notwithstanding, and from a health system perspective, community health teams are to be the focal point for informal carers, supporting them and providing specific training whenever needed. Despite its relevance, informal care should not be professionalized, and the responsibility of care should not be shifted from health services to informal carers. Key messages: Needs for informal care are expected to increase in the coming years in OECD countries. Recognition of the informal carer is a sector-wide approach.
24

Egliston, Ben. "Building Skill in Videogames: A Play of Bodies, Controllers and Game-Guides." M/C Journal 20, no. 2 (April 26, 2017). http://dx.doi.org/10.5204/mcj.1218.

Abstract:
Introduction

In his now-seminal book, Pilgrim in the Microworld (1983), David Sudnow details his process of learning to play the game Breakout on the Atari 2600. Sudnow develops an account of his graduation from a novice (having never played a videogame prior, and middle-aged at time of writing) to being able to fluidly perform the various configurative processes involved in an acclimated Breakout player’s repertoire. Sudnow’s account of videogame skill-development is not at odds with common-sense views on the matter: people become competent at videogames by playing them—we get used to how controllers work and feel, and to the timings of the game and those required of our bodies, through exposure. We learn by playing, failing, repeating, and ultimately internalising the game’s rhythms—allowing us to perform requisite actions. While he does not put it in as many words, Sudnow’s point is that intertwined technical systems like software and human-interface devices—with their respective temporal rhythms, which coalesce and conflict with those of the human player—require management to play skilfully. The perspective Sudnow develops here is no doubt important, but modes of building competency cannot be strictly fixed around a player-videogame relationship; a relatively noncontroversial view in game studies. Videogame scholars have shown that there is currency in understanding how competencies in gameplay arise from engaging with ancillary objects beyond the thresholds of player-game relations; the literature to date casting a long shadow across a broad spectrum of materials and practices. Pursuing this thread, this article addresses the enterprise (and conceptualisation) of ‘skill building’ in videogames (taken as the ability to ‘beat games’ or cultivate the various competencies to do so) via the invocation of peripheral objects or practices. More precisely, this article develops the perspective that we need to attend to the impacts of ancillary objects on play—positioned as hybrid assemblage, as described in the work of writers like Sudnow. In doing so, I first survey how the intervention of peripheral game material has been researched and theorised in game studies, suggesting that many accounts deal too simply with how players build skill through these means—eliding the fact that play works as an engine of many moving parts. We do not simply become ‘better’ at videogames by engaging peripheral material. Furthering this view, I visit recent literature broadly associated with disciplines like post-phenomenology, which handles the hybridity of play and its extension across bodies, game systems, and other gaming material—attending to how skill building occurs; that is, through the recalibration of perceptual faculties operating in the bodily and temporal dimensions of videogame play. We become ‘better’ at videogames by drawing on peripheral gaming material to augment how we negotiate the rhythms of play. Following on from this, I conclude by mobilising post-phenomenological thinking to further consider skill-building through peripheral material, showing how such approaches can generate insights into important and emerging areas of this practice.
Following recent games research, such as the work of James Ash, I adopt Bernard Stiegler’s formulation of technicity—pointing toward the conditioning of play through ancillary gaming objects: focusing particularly on the relationship between game skill, game guides, and embodied processes of memory and perception. In short, this article considers videogame skill-building, through means beyond the game, as a significant recalibration of embodied, temporal, and technical entanglements involved in play.

Building Skill: From Guides to Bodies

There is a handsome literature that has sought to conceptualise the influence of ancillary game material, which can be traced to earlier theories of media convergence (Jenkins). More incisive accounts (pointing directly at game-skill) have been developed since, through theoretical rubrics such as paratext and metagaming. A point of congruence is the theme of relation: the idea that the locus of understanding and meaning can be specified through things outside the game. For scholars like Mia Consalvo (who popularised the notion of paratext in game studies), paratexts are a central motor in play. As Consalvo suggests, paratexts are quite often primed to condition how we do things in and around videogames; there is a great instructive potential in material like walkthrough guides, gaming magazines and cheating devices. Subsequent work has since made productive use of the concept to investigate game-skill and peripheral material and practice. Worth noting is Chris Paul’s research on World of Warcraft (WoW). Paul suggests that players disseminate high-level strategies through a practice known as ‘Theorycraft’ in the game’s community: one involving the use of paratextual statistics applications to optimise play—the results then disseminated across Web-forums (see also: Nardi). Metagaming (Salen and Zimmerman 482) is another concept that is often used to position the various extrinsic objects or practices installed in play—a concept deployed by scholars to conceptualise skill building through both games and the things at their thresholds (Donaldson). Moreover, the ability to negotiate out-of-game material has been positioned as a form of skill in its own right (see also: Donaldson). Becoming familiar with paratextual resources and being able to parse this information could then constitute skill-building. Ancillary gaming objects are important, and as some have argued, central in gaming culture (Consalvo). However, critical areas are left unexamined with respect to skill-building, because scholars often fail to place paratexts or metagaming in the contexts in which they operate; that is, amongst the complex technical, embodied and temporal conjunctures of play—such as those described by Sudnow. Conceptually, much of what Sudnow says in Microworld undergirds the post-human, object-oriented, or post-phenomenological literature that has begun to populate game studies (and indeed media studies more broadly). This materially-inflected writing takes seriously the fact that technical objects (like videogames) and human subjects are caught up in the rhythms of each other; digital media exists “as a mode or cluster of operations in consort with matter”, as Anna Munster tells us (330). To return to videogames, Patrick Crogan and Helen Kennedy argue that gameplay is about a “technicity” between human and nonhuman things, irreducible to any sole actor. Play is a confluence of metastable forces and conditions, a network of distributed agencies (see also Taylor, Assemblage).
Others like Brendan Keogh forward post-phenomenological approaches (operating under scholars like Don Ihde)—looking past the subject-centred nature of videogame research. Ultimately, these theorists situate play as an ‘exploded diagram’, challenging anthropocentric accounts. This position has proven productive in research on ‘skilled’ or ‘high-level’ play (fertile ground for considering competency-development). Emma Witkowski, T.L. Taylor (Raising), and Todd Harper have suggested that skilled play in games emerges from the management of complex embodied and technical rhythms (echoing the points raised prior by Sudnow).

Placing Paratexts in Play

While we have these varying accounts of how skill develops within and beyond player-game relationships, these two perspectives are rarely consolidated. That said, I address some of the limited body of work that has sought to place the paratext in the complex and distributed conjunctures of play; building a vocabulary and framework via encounters with what could loosely be called post-phenomenological thinking (not dissimilar to the just surveyed accounts). The strength of this work lies in its development of a more precise view of the operational reality of playing ‘with’ paratexts. The recent work of Darshana Jayemanne, Bjorn Nansen, and Thomas Apperley theorises the outward expansion of games and play, into diverse material, social, and spatial dimensions (147), as an ‘aesthetics of recruitment’. Consideration is given to ‘paratextual’ play and skill. For instance, they provide the example of players invoking the expertise they have witnessed broadcast through Websites like Twitch.tv or YouTube—skill-building operating here across various fronts, and through various modalities (155). Players are ‘recruited’, in different capacities, through expanded interfaces, which ultimately contour phenomenological encounters with games. Ash provides a fine-grained account in research on spatiotemporal perception and videogames—one much more focused on game-skill. Ash examines how high-level communities of players cultivate ‘spatiotemporal sensitivity’ in the game Street Fighter IV through—in Stiegler’s terms—‘exteriorising’ (Fault) game information into various data sets—producing what he calls ‘technicity’. In this way, Ash suggests that these paratextual materials don’t merely ‘influence play’ (Technology 200), but rather direct how players perceive time, and habituate exteriorised temporal rhythms into their embodied facility (a translation of high-level play). By doing so, the game can be played more proficiently. Following the broadly post-phenomenological direction of these works, I develop a brief account of two paratextual practices. Like Ash, I deploy the work of Stiegler (drawing also on Ash’s usage). I utilise Stiegler’s theoretical schema of technicity to roughly sketch how some other areas of skill-building via peripheral material can be placed within the context of play—looking particularly at the conditioning of embodied faculties of player anticipation, memory and perception through play and paratext alike.

A Technicity of Paratext

The general premise of Stiegler’s technicity is that the human cannot be thought of independent from their technical supplements—that is, ‘exterior’ technical objects which could include, but are not limited to, technologies (Fault). Stiegler argues that the human, and their fundamental memory structure, is finite, and as such is reliant on technical prostheses, which register and transmit experience (Fault 17).
This technical supplement is what Stiegler terms ‘tertiary retention’. In short, for Stiegler, technicity can be understood as the interweaving of ‘lived’ consciousness (Cinematic 21) with tertiary retentional apparatus—which is palpably felt in our orientations in and toward time (Fault) and space (including the ‘space’ of our bodies, see New Critique 11). To be more precise, tertiary retention conditions the relationship between perception, anticipation, and subjective memory (or what Stiegler—by way of phenomenologist Edmund Husserl, whose work he renovates—calls primary retention, protention, and secondary retention respectively). As Ash demonstrates (Technology), Stiegler’s framework is rich with potential in investigating the relationship between videogames and their peripheral materials. Invoking technicity, we can rethink—and expand on—commonly encountered forms of paratexts, such as game guides or walkthroughs (an example Consalvo gives in Cheating). Stiegler’s framework provides a means to assess the technical organisation (through both games and paratexts) of embodied and temporal conditions of ‘skilled play’. Following Stiegler, Consalvo’s example of a game guide is a kind of ‘exteriorisation of play’ (to the guide) that adjusts the embodied and temporal conditions of anticipation and memory (which Sudnow would tell us are key in skill-development). To work through an example, if I was playing a hard game (such as Dark Souls [From Software]), the general idea is that I would be playing from memories of the just experienced, and with expectations of what’s to come based on everything that’s happened prior (following Stiegler). There is a technicity in the game’s design here, as Ash would tell us (Technology 190-91). By way of Stiegler (and his reading of Heidegger), Ash argues a popular trend in game design is to force a technologically-mediated interplay between memory, anticipation, and perception by making videogames ‘about’ “a future outside of present experience” (Technology 191), but hinging this on past-memory. Players then, to be ‘skilful’, and move forward through the game environment without dying, need to manage cognitive and somatic memory (which, in Dark Souls, is conventionally accrued through trial-and-error play; learning through error incentivised through punitive game mechanics, such as item-loss). So, if I was playing against one of the game’s ‘bosses’ (powerful enemies), I would generally only be familiar with the way they manoeuvre, the speed with which they do so, and where and when to attack based on prior encounter. For instance, my past-experience (of having died numerous times) would generally inform me that using a two-handed sword allows me to get in two attacks on a boss before needing to retreat to avoid fatal damage. Following Stiegler, we can understand the inscription of videogame experience in objects like game guides as giving rise to anticipation and memory—albeit based on a “past that I have not lived but rather inherited as tertiary retentions” (Cinematic 60). Tertiary retentions trigger processes of selection in our anticipations, memories, and perceptions.
Where videogame technologies are traditionally the tertiary retentions in play (Ash, Technologies), the use of game-guides refracts anticipation, memory, and perception through joint systems of tertiary retention—resulting in the outcome of more efficiently beating a game. To return to my previous example of navigating Dark Souls: where I might have died otherwise, via the guide, I’d be cognisant of the timings within which I can attack the boss without sustaining damage, and when to dodge its crushing blows—allowing me to eventually defeat it and move toward the stage’s end (prompting somatic and cognitive memory shifts, which influence my anticipation in-game). Through ‘neurological’ accounts of technology—such as Stiegler’s technicity—we can think more closely about how playing with a skill-building apparatus (like a game guide) works in practice; allowing us to identify how various situations in-game can be managed via deferring functions of the player (such as memory) to exteriorised objects—shifting conditions of skill building. The prism of technicity is also useful in conceptualising some of the new ways players are building skill beyond the game. In recent years, gaming paratexts have transformed in scope and scale. Gaming has shifted into an age of quantification—with analytics platforms which harvest, aggregate, and present player data gaining significant traction, particularly in competitive and multiplayer videogames. These platforms perform numerous operations that assist players in developing skill—and are marketed as tools for players to improve by reflecting on their own practices and the practices of others (functioning similarly to the previously noted practice of TheoryCraft, but operating at a wider scale). To focus on one example, the WarCraftLogs application in WoW (Image 1) is a highly-sophisticated form of videogame analytics; the perspective of technicity providing insights into its functionality as skill-building apparatus.

Image 1: WarCraftLogs. Image credit: Ben Egliston.

Following Ash’s use of Stiegler (Technology), quantifying the operations that go into playing WoW can be conceptualised as what Stiegler calls a system of traces (Technology 196). Because of his central thesis of ‘technical existence’, Stiegler maintains that ‘interiority’ is coincident with technical support. As such, there is no calculation, no mental phenomena, that does not arise from internal manipulation of exteriorised symbols (Cinematic 52-54). Following on with his discussion of videogames, Ash suggests that in the exteriorisation of gameplay there is “no opposition between gesture, calculation and the representation of symbols” (Technology 196); the symbols working as an ‘abbreviation’ of gameplay that can be read as such. Drawing influence from this view, I show that ‘Big Data’ analytics platforms like WarCraftLogs similarly allow users to ‘read’ play as a set of exteriorised symbols—with significant outcomes for skill-building; allowing users to exteriorise their own play, examine the exteriorised play of others, and compare exteriorisations of their own play with those of others.

Image 2: WarCraftLogs Gameplay Breakdown. Image credit: Ben Egliston.

Image 2 shows a screenshot of the WarCraftLogs interface. Here we can see the exteriorisation of gameplay, and how the platform breaks down player inputs and in-game occurrences (written and numeric, like Ash’s game data).
The screenshot shows a ‘raid boss’ encounter (where players team up to defeat powerful computer-controlled enemies)—atomising the sequence of inputs a player has made over the course of the encounter. This is an accurate ledger of play—a readout that can speak to mechanical performance (specific in-game events occurred at a specific time), as well as caching and providing parses of somatic inputs and execution (e.g. the ability to trace the rates at which players expend in-game resources can provide insights into the rapidity of button presses). If information falls outside what is presented, players can work with an Application Programming Interface to develop customised readouts, as sketched below (this is encouraged through other game-data platforms, like OpenDota in Dota 2). Through this system, players can exteriorise their own input and output or view the play of others—both useful in building skill.

The first point here—of exteriorising one’s own experience—resonates with Stiegler’s renovation of Husserl’s ‘temporal object’—that is, an object that exists in and is formed through time—through temporal fluxes of what appears, what happens, and what manifests itself in disappearing (Cinematic 14). Stiegler suggests that tertiary retentional apparatus (e.g. a gramophone) allow us to re-experience a temporal object (e.g. a melody), which would otherwise not be possible due to the finitude of human memory. To elaborate, Stiegler argues that primary memories recede into secondary memory (which is the selective reactivation of perception), but through technologies of recording (such as game data) we can re-experience these things verbatim. So ultimately, games analytics platforms—as exteriorised technologies of recording—facilitate this after-the-fact interplay between primary and secondary memory, where players can ‘audit’ their past performance, reflecting on well-played encounters or revising error. These platforms allow the detailed examination of responses to game mechanics, and provide readouts of the technical and embodied rhythms of play (which can be incorporated into future play via reading the data).

Beyond self-reflection, these platforms allow the examination of others’ play. The aggregation and sorting of game data makes expertise both visible and legible. To elaborate, players are ranked on their performance based on all submitted log data, offering a view of how expertise ‘works’.

Image 3: Top-Ranking Players in WarCraftLogs. Image credit: Ben Egliston.

Image 3 shows the top-ranked players on an encounter (the top 10 of over 100,000 logs), which means that these players have performed most competently out of all gameplay parses (the metric being most damage dealt per second in defeating a boss). Users of the platform can look in detail at the actions performed by top players in that encounter—reading and mobilising data in a similar manner to game guides; markedly different, however, in terms of scope (i.e. there are many available logs to draw from) and richness of data (more detailed and current—with log rankings recalibrated regularly). Conceptually, we can also draw parallels with previous work (see Ash, Technology)—where the habituation of expert game data can produce new videogame technicities: ways of ‘experiencing’ play as ‘higher-level’ organisation of space and time (Ash, Technology). So, if a player wanted to ‘learn from the experts’, they would restructure their own rhythms of play around high-level logs, which provide an ordered readout of the various sequences of inputs involved in playing well.
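To make the idea of a customised, API-driven readout concrete, a minimal sketch follows. It is an illustrative fragment only, not part of the platforms discussed above: written in Python, it assumes OpenDota’s publicly documented REST endpoint for match data, and the match identifier used is hypothetical. The sketch retrieves a match, derives a simple per-minute metric for each player, and ranks players by it—the same logic of exteriorised, sortable parses described above.

# Illustrative sketch only: pull exteriorised play data from a public
# game-data API and rank players by a simple per-minute metric, echoing
# the damage-per-second parses discussed above. Assumes OpenDota's
# public REST API; the match identifier below is hypothetical.
import requests

MATCH_ID = 1234567890  # hypothetical match identifier

response = requests.get(f"https://api.opendota.com/api/matches/{MATCH_ID}")
response.raise_for_status()
match = response.json()

minutes = match["duration"] / 60  # match length is reported in seconds

# One row per player, ranked by kills per minute (a stand-in for any
# performance metric a platform might surface).
readout = sorted(
    (
        {
            "player": p.get("personaname") or "anonymous",
            "kills_per_min": p["kills"] / minutes,
            "deaths": p["deaths"],
            "assists": p["assists"],
        }
        for p in match["players"]
    ),
    key=lambda row: row["kills_per_min"],
    reverse=True,
)

for rank, row in enumerate(readout, start=1):
    print(rank, row["player"], f"{row['kills_per_min']:.2f}", row["deaths"], row["assists"])

A readout of this kind is, in the terms used throughout this article, a tertiary retention: play is inscribed, sortable, and re-readable after the fact.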
Moreover, the platform allows players to compare their logs to those of others—so these various introspective and outward-facing uses can work together, conditioning anticipations with inscriptions of past play and ‘prosthetic’ memories through others’ log data. In my experience as a WoW player, I often performed better (or built skill) by comparing and contrasting my own detailed readouts of play with the inputs and outputs of the best players in the world.

To summarise: through technicity, I have briefly shown how exteriorising play shifts the conditions of skill building from recalibrating mnesic and anticipatory processes through ‘firsthand’ play, to reworking these functions through engaging both games and extrinsic objects, like game guides and analytics platforms. Additionally, by reviewing and adopting various usages of technicity, I have pointed out how we might more holistically situate the gaming paratext in skill building.

Conclusion

There is little doubt—as exemplified through both scholarly and popular interest—that paratextual videogame material reframes modes of building game skill. Following recent work, and by providing a brief account of two paratextual practices (venturing the framework of technicity, via Stiegler and Ash—showing the complication of memory, perception, and anticipation in skill building), I have contended that videogame skill building—via paratextual material—can be rendered a process of operating outside of, but still caught up in, the complex assemblages of time, bodies, and technical architectures described by Sudnow at this article’s outset. Additionally, by reviewing and adopting ideas associated with technics and post-phenomenology, this article has aimed to contribute to the development of more ‘complete’ accounts of the processes and practices comprising the skill-building regimens of contemporary videogame players.

References

Ash, James. “Technology, Technicity and Emerging Practices of Temporal Sensitivity in Videogames.” Environment and Planning A 44.1 (2012): 187-201.
———. “Technologies of Captivation: Videogames and the Attunement of Affect.” Body and Society 19.1 (2013): 27-51.
Consalvo, Mia. Cheating: Gaining Advantage in Videogames. Cambridge: Massachusetts Institute of Technology P, 2007.
Crogan, Patrick, and Helen Kennedy. “Technologies between Games and Culture.” Games and Culture 4.2 (2009): 107-14.
Donaldson, Scott. “Mechanics and Metagame: Exploring Binary Expertise in League of Legends.” Games and Culture (2015). 4 Jun. 2015 <http://journals.sagepub.com/doi/abs/10.1177/1555412015590063>.
From Software. Dark Souls. PlayStation 3 game. 2011.
Harper, Todd. The Culture of Digital Fighting Games: Performance and Practice. New York: Routledge, 2014.
Jayemanne, Darshana, Bjorn Nansen, and Thomas H. Apperley. “Postdigital Interfaces and the Aesthetics of Recruitment.” Transactions of the Digital Games Research Association 2.3 (2016): 145-72.
Jenkins, Henry. Convergence Culture: Where Old and New Media Collide. New York: New York UP, 2006.
Keogh, Brendan. “Across Worlds and Bodies.” Journal of Games Criticism 1.1 (2014). Jan. 2014 <http://gamescriticism.org/articles/keogh-1-1/>.
Munster, Anna. “Materiality.” The Johns Hopkins Guide to Digital Media. Eds. Marie-Laure Ryan, Lori Emerson, and Benjamin J. Robertson. Baltimore: Johns Hopkins UP, 2014. 327-30.
Nardi, Bonnie. My Life as Night Elf Priest: An Anthropological Account of World of Warcraft. Ann Arbor: Michigan UP, 2010.
OpenDota. OpenDota. Web browser application. 2017.
Paul, Christopher A. “Optimizing Play: How Theory Craft Changes Gameplay and Design.” Game Studies: The International Journal of Computer Game Research 11.2 (2011). May 2011 <http://gamestudies.org/1102/articles/paul>.
Salen, Katie, and Eric Zimmerman. Rules of Play: Game Design Fundamentals. Cambridge: Massachusetts Institute of Technology P, 2004.
Stiegler, Bernard. Technics and Time, 1: The Fault of Epimetheus. Stanford: Stanford UP, 1998.
———. For a New Critique of Political Economy. Cambridge: Polity, 2010.
———. Technics and Time, 3: Cinematic Time and the Question of Malaise. Stanford: Stanford UP, 2011.
Sudnow, David. Pilgrim in the Microworld. New York: Warner Books, 1983.
Taylor, T.L. “The Assemblage of Play.” Games and Culture 4.4 (2009): 331-39.
———. Raising the Stakes: E-Sports and the Professionalization of Computer Gaming. Cambridge: Massachusetts Institute of Technology P, 2012.
WarCraftLogs. WarCraftLogs. Web browser application. 2016.
Witkowski, Emma. “On the Digital Playing Field: How We ‘Do Sport’ with Networked Computer Games.” Games and Culture 7.5 (2012): 349-74.