Academic literature on the topic 'Great Britain. Government Code and Cypher School'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Great Britain. Government Code and Cypher School.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Great Britain. Government Code and Cypher School"

1

Rico, José M. "L’indemnisation des victimes d’actes criminels." Acta Criminologica 1, no. 1 (January 19, 2006): 261–311. http://dx.doi.org/10.7202/017003ar.

Abstract:
Compensation to Victims of Criminal Offences

The system of composition, which developed during the Middle Ages, especially under Germanic penal law, represents not only an abatement of the system of collective vengeance characteristic of that era, but also the first step towards the principle of compensation to victims of criminal offences. With the development and consolidation of a strong central power, the State asked for a share of these transactions, either as a sanction or as the price of its intervention. When at last the central government obtained the full and exclusive right to inflict punishment, and private justice gave way to public justice, the State's share of compensation increased progressively and took the form of fines, while the victim's share gradually diminished and withdrew little by little from the penal system to become civil compensation for damages. Nevertheless, the total separation between public action, whose aim is to ensure punishment, and civil action, whose main object is to secure compensation for the victim, did not materialise until very recently. This principle of total separation, adopted by the classical school of criminal law, resulted in the victim's right to compensation being completely overlooked in daily legal practice. New solutions were therefore proposed to remedy this deficiency in penal systems, the most original and daring being those found in the Spanish Penal Codes of 1822 and 1848, which compel the State to compensate victims of criminal offences when the wrongdoers or other responsible persons are unable to do so. This idea of compensation by the State to victims of crime, although taken up and elaborated several years later by Bentham and the Italian Positivist School, had no practical repercussions. It was only in the second half of the twentieth century that an Englishwoman, Margery Fry, drew attention to the problem. Inspired by her compatriot Bentham, she proclaimed that compensation for harm caused to victims of criminal violence should be assumed by the State. This was the starting point of a considerable development in the study of compensation to the victim. During the last ten years, not only were many papers and conferences devoted to the subject, but many legal systems also adopted the progressive solution of conferring upon the State the task of compensating victims of criminal offences. In most contemporary penal legislation, the dissociation between public and civil action has resulted in relegating the subject of compensation solely to the civil domain. A certain number of penal systems (France, Belgium, Germany, etc.), while accepting in principle the civil character of the matter, nevertheless offer the injured party the possibility of bringing an action for damages before the criminal courts. A last group of systems (Spain, Italy, Switzerland) treat the problem within the framework of the criminal code, although in most cases they do little more than repeat analogous paragraphs of the civil code. Upon examining these different methods of coping with the problem of compensating the victim for damages caused by criminal violence, we find that certain reforms were put into effect but that they chiefly hinge upon one preliminary question: the means available to the victim for bringing his case before the criminal courts and engaging in the criminal procedure, in order to obtain recognition of his rights by the court.
However, it often happens that once sentence has been passed, the victim is obliged to act on his own to recover the sum of the indemnity. Modern penal law, progressive and innovative as it is in certain respects, often neglects the victim of crime. Certain solutions have been proposed, and even introduced into positive penal legislation, with a view to securing for the injured party, as far as possible, recovery of the compensation awarded by the courts, especially in cases where the offender is destitute. Among such solutions, one should stress joint liability between co-offenders, priority accorded to the compensation debt, accessory imprisonment, compulsory work in prison and at liberty, compulsory insurance, and the creation of a compensation fund. Similar proposals tend to make compensation of the victim an indispensable condition for obtaining certain privileges (pardon, parole, probation, legal rehabilitation, etc.). Given the insufficiency of the classical systems and of the solutions intended to secure compensation of the victim by the offender, the question arose once again whether the State should not undertake the charge of repairing damages caused by crime. The main argument offered in favour of this system is the State's failure to prevent crime and to protect its citizens against felonious acts. Despite numerous criticisms concerning the essentially judicial composition of the courts charged with applying the system, as well as the procedure to be followed, the infractions to be compensated, the amount to be paid, and the total cost of the system, some countries have recognised the right of the victim to be compensated and have consequently adopted measures to enforce this principle (New Zealand, 1963; Great Britain, 1964; the states of California and New York, 1966; the Canadian province of Saskatchewan, 1967).
2

Rice, Ken, Ben Wynne, Victoria Martin, and Graeme J. Ackland. "Effect of school closures on mortality from coronavirus disease 2019: old and new predictions." BMJ, October 7, 2020, m3588. http://dx.doi.org/10.1136/bmj.m3588.

Abstract:
Objective: To replicate and analyse the information available to UK policymakers when the lockdown decision was taken in March 2020 in the United Kingdom. Design: Independent calculations using the CovidSim code, which implements Imperial College London's individual-based model, with data available in March 2020 applied to the coronavirus disease 2019 (covid-19) epidemic. Setting: Simulations considering the spread of covid-19 in Great Britain and Northern Ireland. Population: About 70 million simulated people matched as closely as possible to actual UK demographics, geography, and social behaviours. Main outcome measures: Replication of summary data on the covid-19 epidemic reported to the UK government Scientific Advisory Group for Emergencies (SAGE), and a detailed study of unpublished results, especially the effect of school closures. Results: The CovidSim model would have produced a good forecast of the subsequent data if initialised with a reproduction number of about 3.5 for covid-19. The model predicted that school closures and isolation of younger people would increase the total number of deaths, albeit postponed to second and subsequent waves. The findings of this study suggest that prompt interventions are highly effective at reducing peak demand for intensive care unit (ICU) beds but also prolong the epidemic, in some cases resulting in more deaths in the long term. This happens because covid-19 related mortality is highly skewed towards older age groups. In the absence of an effective vaccination programme, none of the proposed mitigation strategies in the UK would reduce the predicted total number of deaths below 200,000. Conclusions: It was predicted in March 2020 that, in response to covid-19, a broad lockdown, as opposed to a focus on shielding the most vulnerable members of society, would reduce immediate demand for ICU beds at the cost of more deaths in the long term. The optimal strategy for saving lives in a covid-19 epidemic is different from that anticipated for an influenza epidemic, which has a different mortality age profile.
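A note for readers interested in the modelling logic summarised above (a reproduction number near 3.5, a temporary intervention, deaths derived from cumulative infections): it can be illustrated with a toy compartmental model. The Python sketch below is a minimal SIR simulation, not CovidSim, which is a far richer individual-based model; every parameter here (population, infectious period, intervention window, the flat 0.9% infection fatality rate) is an illustrative assumption rather than a value taken from the paper.

# Toy SIR sketch of the abstract's reasoning: R0 near 3.5 plus a *temporary*
# reduction in transmission. NOT CovidSim; all parameters are assumptions.
def simulate(r0=3.5, infectious_days=5.0, intervention=(0, 0, 1.0),
             population=67_000_000, i0=1_000, days=365, ifr=0.009):
    gamma = 1.0 / infectious_days            # daily recovery rate
    beta0 = r0 * gamma                       # baseline transmission rate
    start, end, scale = intervention         # window and contact multiplier
    s, i, r = population - i0, float(i0), 0.0
    peak_i, cumulative = i, float(i0)
    for day in range(days):                  # Euler steps of one day
        beta = beta0 * (scale if start <= day < end else 1.0)
        new_inf = beta * s * i / population  # new infections today
        new_rec = gamma * i                  # recoveries today
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak_i = max(peak_i, i)
        cumulative += new_inf
    return peak_i, cumulative * ifr          # crude deaths via an assumed flat IFR

for label, window in [("no intervention", (0, 0, 1.0)),
                      ("90-day lockdown from day 30", (30, 120, 0.4))]:
    peak, deaths = simulate(intervention=window)
    print(f"{label}: peak infectious {peak:,.0f}, deaths {deaths:,.0f}")

With these assumed numbers, the lockdown run shows a far lower peak but a rebound once restrictions lift, echoing the abstract's point that suppression postpones rather than eliminates the epidemic. The paper's stronger finding, that school closures can increase total deaths, additionally depends on age-structured contacts and an age-skewed fatality rate that this homogeneous sketch deliberately omits.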
3

King, Emerald L., and Denise N. Rall. "Re-imagining the Empire of Japan through Japanese Schoolboy Uniforms." M/C Journal 18, no. 6 (March 7, 2016). http://dx.doi.org/10.5204/mcj.1041.

Abstract:
Introduction

“From every kind of man obedience I expect; I’m the Emperor of Japan.” (“Miyasama,” from Gilbert and Sullivan’s musical The Mikado, 1885)

This commentary is facilitated by surprisingly resilient oriental stereotypes of an imagined Japan (think of Oscar Wilde’s assertion, in 1889, that Japan was a European invention). During the Victorian era in Britain there was a craze for all things oriental, particularly ceramics: “there was a craze for all things Japanese and no middle class drawing room was without its Japanese fan or teapot” (V&A Victorian). These pastoral depictions of ‘oriental life’ included figures of men and women in oriental garb, with fans, stilt shoes, kimono-like robes, and appropriate headdresses, engaging in garden-based activities, especially tea ceremony variations (Landow). In fact, tea itself, and the idea of a ceremony of serving it, had taken up a central role, even an obsession, in middle- and upper-class Victorian life. Similarly, landscapes with wild seas, rugged rocks and stunted pines, wizened monks, pagodas and temples, and particular fauna and flora (cranes and other birds flying through clouds of peonies, cherry blossoms and chrysanthemums) were very popular motifs (see Martin and Koda). Rather than authenticity, these designs heightened the Western-based romantic stereotypes associated with a stylised form of Japanese life, conducted sedately under the rule of the Japanese Imperial Court. In reality, prior to the Meiji period (1868–1912), the Emperor was largely removed from everyday concerns, residing as an isolated, holy figure in Kyoto, the traditional capital of Japan. Japan was instead ruled from Edo (modern day Tokyo), led by the Shogun and his generals according to a strict Confucian-influenced code (see Keene). In Japan, as elsewhere, feudal-style governance included policies that determined much of everyday life, including restrictions on clothing (Rall 169). The Samurai code was no different, and included a series of protocols that restricted rank, movement, behaviour, and clothing. As Vincent has noted in the case of the ‘lace tax’ in Great Britain, these restrictions were designed to punish those who sought to penetrate the upper classes through their costume (28-30). In Japan, pre-Meiji sumptuary laws, for example, restricted the use of gold and prohibited the use of a certain shade of red by the merchant classes (V&A Kimono). The importance of clothing and textiles in the governance of pre-globalised societies is therefore evident; as Jones and Stallybrass comment: We need to understand the animatedness of clothes, their ability to “pick up” subjects, to mould and shape them both physically and socially—to constitute subjects through their power as material memories […] Clothing is a worn world: a world of social relations put upon the wearer’s body. (2-3, emphasis added)

The significant re-imagining of Japanese cultural and national identities is explored here through the cataclysmic impact of Western ideologies on Japanese cultural traditions. There are many ways to examine how indigenous cultures respond to European, British, or American (hereafter Western) influences, particularly in times of conflict (Wilk). Western ideology arrived in Japan, after a long period of isolation during which Japan’s only contact was with Dutch traders, through the threat of military hostility and war. It was after this outside threat was realised that Japan’s adoption of military and industrial practices began.
The re-imagining of their national identity took many forms, and the inclusion of Western-style military costuming as a schoolboy uniform became a highly visible indicator of Japan’s mission to protect its sovereign integrity. A brief history of Japan’s rise from a collection of isolated feudal states to a unified military power, not only in the Asia-Pacific region but globally, demonstrates the speed at which it adopted the Western mode of warfare.

Gunboats on Japan’s Shorelines

Japan was forcefully opened to the West in the 1850s by America, under the threat of Commodore Matthew Perry’s ‘gunboat diplomacy’ (Hillsborough 7-8). Following this, Japan underwent a rapid period of modernisation and an upsurge in nationalism and military expansion, driven by a desire to catch up to the European powers present in the Pacific. As Niall Ferguson notes in Civilization: The West and the Rest: Unsure, the Japanese decided […] to copy everything […] Japanese institutions were refashioned on Western models. The army drilled like Germans; the navy sailed like Britons. An American-style system of state elementary and middle schools was also introduced. (221, emphasis added)

This was nothing short of a wide-scale reorganisation of Japan’s entire social structure and governance. Under the Emperor Meiji, who wrested power from the Shogunate and reclaimed it for the Imperial head, Japan steamed into an industrial revolution, achieving in a matter of years what had taken Europe over a century. Japan quickly became a major player on the world stage. However, as an island nation, Japan lacked the essentials of both coal and iron with which to fashion not only industrial machinery but also military equipment, the machinery of war. In 1875 Japan forced Korea to open itself to foreign (read: Japanese) trade. In the same treaty, Korea was recognised as a sovereign nation, separate from Qing China (Tucker 1461). The necessity for raw materials then led to the Sino-Japanese War (1894–95), a conflict between Japan and China that marked the emergence of Japan as a major world power. The Korean Peninsula had long been China’s most important client state, but its strategic location adjacent to the Japanese archipelago, and its natural resources of coal and iron, attracted Japan’s interest. Later, the Russo-Japanese War (1904–05) allowed a victorious Japan to force Russia to abandon its expansionist policy in the Far East, becoming the first Asian power in modern times to defeat a European power. The Russo-Japanese War developed out of the rivalry between Russia and Japan for dominance in Korea and Manchuria, again in the struggle for natural resources (Tucker 1534-46). Japan’s victories, together with the country’s drive for resources, meant that Japan could now determine its role within the Asia-Pacific sphere of influence. As Japan’s military, and its adoption of Westernised combat, proved effective in maintaining national integrity, other social institutions also looked to the West (Ferguson 221). In an ironic twist—while Victorian and Continental fashion was busy adopting the exotic, oriental look (Martin and Koda)—the kimono, along with other essentials of Japanese fashion, was rapidly altered (both literally and figuratively) to suit the new, warlike ideology. It should be noted that kimono literally means ‘things that you wear’ and that, prior to exposure to Western fashions, it signified all worn clothing (Dalby 65-119).
“Wearing Things” in Westernised Japan

As Japan modernised during the late 1800s, the kimono was positioned as symbolising barbaric, pre-modern, ‘oriental’ Japan. Indeed, on 17 January 1887 the Meiji Empress issued a memorandum on the subject of women’s clothing in Japan: “She [the Empress] believed that western clothes were in fact closer to the dress of women in ancient Japan than the kimonos currently worn and urged that they be adopted as the standard clothes of the reign” (Keene 404). The resemblance between Western skirts and blouses and the simple skirt and separate top that had been worn in ancient times by a people descended from the sun goddess, Amaterasu ōmikami, was used to give authority and cultural authenticity to Japan’s modernisation projects. The Imperial Court, with its newly ennobled European-style aristocrats, exchanged kimono silks for Victorian finery, and samurai armour for military pomp and splendour (Figure 1).

Figure 1: The Meiji Emperor, Empress and Crown Prince resplendent in European fashions on an outing to Asukayama Park. Illustration: Toyohara Chikanobu, circa 1890.

It is argued here that the function of a uniform is to prepare the body for service. Maids and butlers, nurses and courtesans, doctors, policemen, and soldiers are all distinguished by their garb. Prudence Black states: “as a technology, uniforms shape and code the body so they become a unit that belongs to a collective whole” (93). The requirement to discipline bodies through clothing, particularly through uniforms, is well documented (see Craik, Peoples, and Foucault). The need to distinguish enemies from allies on the battlefield requires adherence to a set of defined protocols, as referenced in military fashion compendiums (see Molloy). While the postcolonial adoption of Western-based clothing reflects a new form of subservience (Rall; Kuechler and Miller), in Japan the indigenous garments were clearly designed in the interests of ideological allegiance. To understand Japanese sartorial traditions, the kimono itself must be read as providing a strong disciplinary element. The traditional garment is designed to represent an upright and unbending column, in which two meters of under-bindings used to discipline the body into shape are topped with a further four meters of stiffened silk obi wrapped around the waist and lower chest. To dress formally in such a garment requires helpers (see Dalby). The kimono both constructs and confines the women who wear it, and presses them into their roles as dutiful, upper-class daughters (see Craik). From the 1890s through to the 1930s, when Japan again entered a period of militarism, the myth of the kimono changed again as it was integrated into the build-up towards World War II. Decades later, when Japan re-established itself as a global economic power in the 1970s and 1980s, the kimono was re-authenticated as Japan’s ‘traditional’ garment. This time it was not the myth of a people descended from solar deities that was on display, but that of samurai strength and propriety for men, alongside an exaggerated femininity for women, invoking a powerful vision of Japanese sartorial tradition. This reworking of the kimono was only possible because the garment was already contained within the framework of Confucian family duty. However, in the lead-up to World War II, Japanese military advancement demanded of its people soldiers who could win European-style wars.
The quickest solution was to copy the military acumen and strategies of global warfare, and the costumes of the soldiery and seamen of Europe, including Great Britain (Ferguson). It was also acknowledged that soldiers were ‘made, not born’, so the Japanese educational system was revamped to emulate those of its military rivals (McVeigh). It was in the uptake of schoolboy uniforms that this re-imagining of Japanese imperial strength took place.

The Japanese Schoolboy Uniform

Central to its rapid modernisation, Japan adopted a constitutional system of education that borrowed from American and French models (Tipton 68-69). The government viewed education as a “primary means of developing a sense of nation,” and at its core was the imperial authorities’ obsession with defining “Japan and Japaneseness” (Tipton 68-69). Numerous reforms eventually saw, after the abolition of fees, nearly 100% attendance by both boys and girls, despite a lingering mindset that educating women was “a waste of time” (Tipton 68-69). A boys’ uniform, based on the French and Prussian military uniforms of the 1860s and 1870s respectively (Kinsella 217), was adopted in 1879 (McVeigh 47). This jacket, initially with Prussian cape and cap, consists of a square body, a standing mandarin-style collar, and a buttoned front. It was through these education reforms, as visually symbolised by the adoption of military-style school uniforms, that citizen making, education, and military training became interrelated aspects of Meiji modernisation (Kinsella 217). Known as the gakuran (gaku: to study; ran: meaning both orchid, and a pun on Horanda, meaning Holland, the only Western country with trading relations in pre-Meiji Japan), these jackets were a symbol of education, indicating European knowledge, power and influence, and came to reflect all things European in Meiji Japan. By adopting these jackets, two objectives were realised: through the magical power of imitation, Japan would, by adopting the clothing of the West, naturally rise in military power; and boys were uniformed to become not only educated quasi-Europeans but fighting soldiers and sons (suns) of the nation. The gakuran jacket was first popularised by state-run schools; however, in the century and a half that the garment has been in use it has come to symbolise young Japanese masculinity, showcased in campus films, anime, manga, and computer games, and, as fashion, it is the preeminent garment for boybands and Japanese hipsters. While the gakuran is central to the rise of global militarism in Japan (McVeigh 51-53), the jacket would go on to form the basis of the Sun Yat Sen and Mao suits as symbols of revolutionary China (see McVeigh). Supposedly, Sun Yat Sen saw the schoolboy jacket in Japan as a utilitarian garment and adopted it with a turn-down collar (Cumming et al.). For Sun Yat Sen, the gakuran was the perfect mix of civilian (schoolboy) and military (the garment’s Prussian heritage), allowing him to walk a middle path between the demands of both. Furthermore, the garment allowed Sun to navigate between Western-style suits and old-fashioned Qing dynasty styles (Gerth 116); one was associated with the imperialism of the National Products Movement, while the other represented the corruption of the old dynasty. In this way, the gakuran was further politicised from a national (Japanese) symbol to a global one.
While military uniforms have always been political garments, in the late 1800s and early 1900s, as the world was rocked by revolutions and war, civilian clothing also became a means of expressing political ideals (McVeigh 48-49). Note that Mahatma Gandhi’s clothing choices also evolved from wholly Western styles to traditional ones that emphasised domestic products (Gerth 116). Mao adopted this style circa 1927, further defining it when he came to power by adding elements from the trousers, tunics, and black cotton shoes worn by peasants. The suit was further codified during the 1960s, reaching its height in the Cultural Revolution. While the gakuran has always been a scholarly black (see Figure 2), subtle differences in the colour palette differentiated the Chinese population: peasants and workers donned indigo blue Mao jackets, while People’s Liberation Army soldiers donned khaki green. This limited colour scheme somewhat paradoxically ensured that subtle hierarchical differences were maintained even whilst advocating egalitarian ideals (Davis 522). Both the Sun Yat Sen suit and the Mao jacket represented the rejection of bourgeois (Western) norms that objectified the female form, in favour of a uniform society. Neo-Maoism and the Mao fever of the early 1990s saw the Mao suit emerge again as a desirable piece of iconic/ironic youth fashion.

Figure 2: An example of the gakuran uniform next to the girls’ equivalent on display at Ichikawa Gakuen School (Japan). Photo: Emerald King, 2015.

There is a clear and vital link between the influence of the Prussian-style Japanese schoolboy uniform and the later creation of the Mao jacket: that of the uniform as an integral piece of worn propaganda (Atkins). For Japan, the rapid deployment of new military and industrial technologies, as well as a sartorial need to present her leaders as modern (read: Western), demanded the adoption of European-style uniforms. The Imperial family had always been removed from Samurai battlefields, so the adoption of Western military costume allowed Japan’s rulers to present a uniform face to other global powers. When Japan found itself in conflict in the Asia-Pacific region without an organised military, the first requirement was to completely reorganise its system of warfare from a feudal base and to train up national servicemen. Within an American-style compulsory education system, the European-based curriculum included training in mathematics, engineering and military history, just as young Britons had for generations begun their education in Greek and Latin with the study of ancient Greek and Roman wars (Bantock). It is only in the classroom that ideological change on a mass scale can take place (Reference Please), a lesson not missed by later leaders such as Mao Zedong.

Conclusion

In the 1880s, Japanese leaders established their position in global politics by adopting clothing and practices from the West (Europeans, Britons, and Americans) in order to quickly re-shape their country’s educational system and military establishment. The prevailing military costumes from foreign cultures not only disciplined their adopted European bodies, they also enforced a new regime through dress (Rall 157-174). For boys, the gakuran symbolised the unity of education and militarism as central to Japanese masculinity. Wearing a uniform, as many authors suggest, furthers compliance (Craik; Nagasawa, Kaiser and Hutton; McVeigh).
As conscription became a part of Japanese reality in World War II, schoolboys simply swapped their military-inspired school uniforms for genuine military garments. Re-imagining a Japanese schoolboy uniform from a European military costume might suit ideological purposes (Atkins), but there is more. The gakuran, as a uniform based on a close but not fitted jacket, was the product of a process of advanced industrialisation in the garment-making industry also taking place in the 1800s: Between 1810 and 1830, technical calibrations invented by tailors working at the very highest level of the craft [in Britain] eventually made it possible for hundreds of suits to be cut up and made in advance [...] and the ready-to-wear idea was put into practice for men’s clothes […] originally for uniforms for the War of 1812. (Hollander 31)

In this way, industrialisation became a means to mass production, which furthered militarisation: “the uniform is thus the clothing of the modern disciplinary society” (Black 102). There is a perfect resonance between Japan’s appetite for a modern military and its rise to an industrialised society, and its conquests in the Asia-Pacific supplied the material resources that made such a rapid deployment possible. The Japanese schoolboy uniform was an integral part of the process of both industrialisation and militarisation, which instilled in the wearer a social role required by modern Japanese society in its rise to global power. Garments are never just clothing, but offer a “world of social relations put upon the wearer’s body” (Jones and Stallybrass 3-4). Today, both the Japanese kimono and the Japanese schoolboy uniform continue to interact with, and interrogate, global fashions as contemporary designers continue to call on the tropes of ‘military chic’ (Tonchi) and Japanese-inspired clothing (Kawamura).

References

Atkins, Jaqueline. Wearing Propaganda: Textiles on the Home Front in Japan, Britain, and the United States. New Haven: Yale UP, 2005.
Bantock, Geoffrey Herman. Culture, Industrialisation and Education. London: Routledge & K. Paul, 1968.
Black, Prudence. “The Discipline of Appearance: Military Style and Australian Flight Hostess Uniforms 1930–1964.” Fashion & War in Popular Culture. Ed. Denise N. Rall. Bristol: Intellect/U Chicago P, 2014. 91-106.
Craik, Jennifer. Uniforms Exposed: From Conformity to Transgression. Oxford: Berg, 2005.
Cumming, Valerie, Cecil Willett Cunnington, and Phillis Emily Cunnington. “Mao Style.” The Dictionary of Fashion History. Oxford: Berg, 2010.
Dalby, Liza, ed. Kimono: Fashioning Culture. London: Vintage, 2001.
Davis, Edward L., ed. Encyclopaedia of Contemporary Chinese Culture. London: Routledge, 2005.
Dees, Jan. Taisho Kimono: Speaking of Past and Present. Milan: Skira, 2009.
Ferguson, Niall. Civilization: The West and the Rest. London: Penguin, 2011.
Foucault, Michel. Discipline and Punish: The Birth of the Prison. Trans. Alan Sheridan. London: Penguin, 1997.
Gerth, Karl. China Made: Consumer Culture and the Creation of the Nation. Cambridge: East Asian Harvard Monograph 224, 2003.
Gilbert, W.S., and Arthur Sullivan. The Mikado, or The Town of Titipu. 1885. 16 Nov. 2015 ‹http://math.boisestate.edu/gas/mikado/mk_lib.pdf›.
Hillsborough, Romulus. Samurai Revolution: The Dawn of Modern Japan Seen through the Eyes of the Shogun’s Last Samurai. Vermont: Tuttle, 2014.
Jones, Anne R., and Peter Stallybrass. Renaissance Clothing and the Materials of Memory. Cambridge: Cambridge UP, 2000.
Keene, Donald. Emperor of Japan: Meiji and His World, 1852–1912. New York: Columbia UP, 2002.
King, Emerald L. “Schoolboys and Kimono Ladies.” Presentation to the Un-Thinking Asian Migrations Conference, University of Otago, Dunedin, New Zealand, 24-26 Aug. 2014.
Kinsella, Sharon. “What’s Behind the Fetishism of Japanese School Uniforms?” Fashion Theory 6.2 (2002): 215-37.
Kuechler, Susanne, and Daniel Miller, eds. Clothing as Material Culture. Oxford: Berg, 2005.
Landow, George P. “Liberty and the Evolution of the Liberty Style.” 22 Aug. 2010 ‹http://www.victorianweb.org/art/design/liberty/lstyle.html›.
Martin, Richard, and Harold Koda. Orientalism: Vision of the East in Western Dress. New York: Metropolitan Museum of Art, 1994.
McVeigh, Brian J. Wearing Ideology: State, Schooling, and Self-Presentation in Japan. Oxford: Berg, 2000.
Molloy, John. Military Fashion: A Comparative History of the Uniforms of the Great Armies from the 17th Century to the First World War. New York: Putnam, 1972.
Peoples, Sharon. “Embodying the Military: Uniforms.” Critical Studies in Men’s Fashion 1.1 (2014): 7-21.
Rall, Denise N. “Costume & Conquest: A Proximity Framework for Post-War Impacts on Clothing and Textile Art.” Fashion & War in Popular Culture. Ed. Denise N. Rall. Bristol: Intellect/U Chicago P, 2014. 157-74.
Tipton, Elise K. Modern Japan: A Social and Political History. 3rd ed. London: Routledge, 2016.
Tucker, Spencer C., ed. A Global Chronology of Conflict: From the Ancient World to the Modern Middle East. Santa Barbara, CA: ABC-CLIO, 2013.
V&A Kimono. Victoria and Albert Museum. “A History of the Kimono.” 2004. 2 Oct. 2015 ‹http://www.vam.ac.uk/content/articles/h/a-history-of-the-kimono/›.
V&A Victorian. Victoria and Albert Museum. “The Victorian Vision of China and Japan.” 10 Nov. 2015 ‹http://www.vam.ac.uk/content/articles/t/the-victorian-vision-of-china-and-japan/›.
Vincent, Susan J. The Anatomy of Fashion: Dressing the Body from the Renaissance to Today. Oxford: Berg, 2009.
Wilde, Oscar. “The Decay of Lying.” 1889. Intentions. New York: Brentano’s, 1905. 16 Nov. 2015 ‹http://virgil.org/dswo/courses/novel/wilde-lying.pdf›.
Wilk, Richard. “Consumer Goods as a Dialogue about Development.” Cultural History 7 (1990): 79-100.
4

Starrs, Bruno. "Publish and Graduate?: Earning a PhD by Published Papers in Australia." M/C Journal 11, no. 4 (June 24, 2008). http://dx.doi.org/10.5204/mcj.37.

Abstract:
Refereed publications (also known as peer-reviewed) are the currency of academia, yet many PhD theses in Australia result in only one or two such papers. Typically, a doctoral thesis requires the candidate to present (and pass) a public Confirmation Seminar, around nine to twelve months into candidacy, in which a panel of the candidate’s supervisors and invited experts adjudicates upon whether the work is likely to continue and ultimately succeed in the goal of a coherent and original contribution to knowledge. A Final Seminar, also public and sometimes involving the traditional viva voce or oral defence of the thesis, is presented two or three months before approval is given to send the 80,000 to 100,000 word tome off for external examination. And that soul-destroying or elation-releasing examiner’s verdict can be many months in the delivery: a limbo-like period during which the candidate’s status as a student is ended and her or his receipt of any scholarship or funding guerdon is terminated with perfunctory speed. This is the only time most students spend seriously writing up their research for publication, although, naturally, many are more involved in job hunting as they pin their hopes on passing the thesis examination. There is, however, a slightly more palatable alternative to this nail-biting process of the traditional PhD, and that is the PhD by Published Papers (also known as the PhD by Publications or PhD by Published Works). The form of my own soon-to-be-submitted thesis, it permits the submission for examination of a collection of papers that have been refereed and accepted (or are in the process of being refereed) for publication in academic journals or books. Apart from the obvious benefits of getting published early in one’s (hopefully) burgeoning academic career, it also takes away a lot of the stress come final submission time. After all, I try to assure myself, the thesis examiners can’t really discredit the process of double-blind peer review the bulk of the thesis has already undergone: their job is to examine how well I’ve unified the papers into a cohesive thesis … right? But perhaps they should at least be wary because, unfortunately, the requirements for this kind of PhD vary considerably from institution to institution, and there have been some cases where the submitted work is of questionable quality compared to that produced by graduates from more demanding universities. Hence, this paper argues that in my subject area of interest, film and television studies, there is a huge range in the set requirements for doctorates: from universities that award the degree to film artists for prior published work that has undergone little or no academic scrutiny and has involved little or no on-campus participation, to at least three Australian universities that require candidates to be enrolled for a minimum period of full-time study and to submit only scholarly work generated and published (or submitted for publication) during candidature. I would also suggest that uncertainty about where a graduate’s work rests on this continuum risks confusing a hard-won PhD by Published Papers with the sometimes risible honorary doctorate. Let’s begin by dredging the depths of those murky, quasi-academic waters to examine the occasionally less-than-salubrious honorary doctorate. The conferring of this degree is generally a recognition of an individual’s body of (usually published) work, but it is often conferred for contributions to knowledge or society in general that are not even remotely academic.
The honorary doctorate does not usually carry with it the right to use the title “Dr” (although many self-aggrandising recipients in the non-academic world flout this unwritten code of conduct, and, indeed, Monash University’s Monash Magazine had no hesitation in describing its 2008 recipient, musician, screenwriter, and art-school-dropout Nick Cave, as “Dr Cave” (O’Loughlin)). Some shady universities even offer such degrees for sale or ‘donation’, and thus do great damage to their own credibility as well as to the credibility of the degree itself. Such overseas “diploma mills”—including Ashwood University, Belford University, Glendale University and Suffield University—are identified by their advertising of “Life Experience Degrees,” for which a curriculum vitae outlining the prospective graduand’s oeuvre is accepted at face value as long as the applicant’s credit card is not rejected. An aspiring screen auteur simply specifies film and television as their major, and before you can shout “Cut!” there’s a degree in the mail. Most of these pseudo-universities are not based in Australia but are perfectly happy to confer their ‘titles’ on any well-heeled, vanity-driven Australians capable of completing the online form. Nevertheless, many academics fear a similarly disreputable marketplace might develop here, and Norfolk Island-based Greenwich University presents a particularly illuminating example. Previously empowered by an Act of Parliament consented to by Senator Ian Macdonald, the then Minister for Territories, this “university” had the legal right to confer honorary degrees from 1998. The Act was eventually overridden by legislation passed in 2002, after a concerted effort by the Australian Universities Quality Agency Ltd. and the Australian Vice-Chancellors’ Committee to force the accreditation requirements of the Australian Qualifications Framework upon the institution in question, thus preventing it from making degrees available for purchase over the Internet. Greenwich University did not seek re-approval and soon relocated to its original home of Hawaii (Brown). But even real universities flounder in similarly muddy waters when, unsolicited, they make dubious decisions to grant degrees to individuals they hold in high esteem. Although meaning well by not courting pecuniary gain, they nevertheless invite criticism over their choice of recipient for their honoris causa, despite the decision usually only being reached after a process of debate and discussion by university committees. Often people are rewarded, it seems, as much for their fame as for their achievements or publications. One such example of a celebrity who has had his onscreen renown recognised with an honorary doctorate is film and television actor/comedian Billy Connolly, who was awarded an Honorary Doctor of Letters by The University of Glasgow in 2006, prompting Stuart Jeffries to complain that “something has gone terribly wrong in British academia” (Jeffries). Eileen McNamara also bemoans the levels to which some institutions will sink in search of media attention and exposure when she writes of St Andrews University in Scotland conferring an honorary doctorate upon film actor and producer Michael Douglas: “What was designed to acknowledge intellectual achievement has devolved into a publicity grab with universities competing for celebrity honorees” (McNamara).
Fame as an actor (and the list gets even weirder when the scope of enquiry is widened beyond the field of film and television) seems, according to some universities, to be an achievement worth recognising with an honorary doctorate, and this kind of discredit is best avoided by Australian institutions of higher learning if they are to maintain credibility. Certainly, universities down under would do well not to follow in the footsteps of Long Island University’s Southampton College. Perhaps the height of academic prostitution of parchments for the attention of mass media occurred when, in 1996, this US school bestowed an Honorary Doctorate of Amphibious Letters upon that mop-like puppet of film and television fame known as the “muppet,” Kermit the Frog. Indeed, this polystyrene and cloth creation with an anonymous hand operating its mouth had its acceptance speech duly published (see “Kermit’s Acceptance Speech”), and Long Island University’s Southampton College received much valuable press. After all, any publicity is good publicity. Or perhaps this furry frog’s honorary degree was a cynical stunt meant to highlight the ridiculousness of the practice? In 1986 a similar example, much closer to my own home, occurred when, in anticipation and condemnation of the conferral of an honorary doctorate upon Prince Philip by Monash University in Melbourne, the “Members of the Monash Association of Students had earlier given a 21-month-old Chihuahua an honorary science degree” (Jeffries), effectively suggesting that the honorary doctorate is, in fact, a dog of a degree. On a more serious note, there have been honorary doctorates conferred upon far more worthy recipients in the field of film and television by some Australian universities. Indigenous film-maker Tracey Moffatt was awarded an honorary doctorate by Griffith University in November 2004. Moffatt was a graduate of Griffith University’s film school and had an excellent body of work, including the films Night Cries: A Rural Tragedy (1990) and beDevil (1993). Acclaimed playwright and screenwriter David Williamson was presented with an Honorary Doctorate of Letters by The University of Queensland in December 2004. His work had previously picked up four Australian Film Institute awards for best screenplay. An Honorary Doctorate of Visual and Performing Arts was given to film director Fred Schepisi AO by The University of Melbourne in May 2006. His films had also been recognised earlier with Australian Film Institute awards, as well as the Golden Globe for Best Miniseries or Television Movie for Empire Falls in 2006. Director George Miller was crowned with an Honorary Doctorate in Film from the Australian Film, Television, and Radio School in April 2007, although he already had a medical doctor’s testamur on his wall. In May of this year, filmmaker George Gittoes, a fine arts dropout from The University of Sydney, received an honorary doctorate from The University of New South Wales. His documentaries, Soundtrack to War (2005) and Rampage (2006), screened at the Sydney and Berlin film festivals, and he has been employed by the Australian Government as an official war artist. Interestingly, the high-quality screen work recognised by these Australian universities may have earned the recipients ‘real’ PhDs had they sought the qualification.
Many of these film artists could just as easily have submitted their work for the degree of PhD by Published Papers at several universities that accept prior work in lieu of an original exegesis, and where a film is equated with a book or journal article. But such universities still invite comparisons of their PhDs by Published Papers with honorary doctorates due to rather too-easy-to-meet criteria. The privately funded Bond University, for example, recommends a minimum full-time enrolment of just three months and certainly seems more lax in its regulations than other Antipodean institutions: a healthy curriculum vitae and payment of the prescribed fee (currently AUD$24,500 per annum) are the only requirements. Restricting my enquiries once again to the field of my own research, film and television, I note that Dr Ingo Petzke achieved his 2004 PhD by Published Works based upon films produced in Germany well before enrolling at Bond, contextualised within a discussion of the history of avant-garde film-making in that country. Might not a cynic enquire as to how this PhD significantly differs from an honorary doctorate? Although Petzke undoubtedly paid his fees and met all of Bond’s requirements for his thesis, entitled Slow Motion: Thirty Years in Film, one cannot criticise that cynic for wondering if Petzke’s films are indeed equivalent to a collection of refereed papers. It should be noted that Bond is not alone when it comes to awarding candidates the PhD by Published Papers for work published or screened in the distant past. Although yet to grant it in the area of film or television, Swinburne University of Technology (SUT) is an institution that distinctly specifies that its PhD by Publications is to be awarded for “research which has been carried out prior to admission to candidature” (8). Similarly, the Griffith Law School states: “The PhD (by publications) is awarded to established researchers who have an international reputation based on already published works” (1). It appears that Bond is no solitary voice in the academic wilderness, for SUT and the Griffith Law School also apparently consider the usual milestones of Confirmation and Final Seminars to be unnecessary if the so-called candidate is already well published. Like Bond, Griffith University (GU) is prepared to consider a collection of films to be equivalent to a number of refereed papers. Dr Ian Lang’s 2002 PhD (by Publication) thesis, entitled Conditional Truths: Remapping Paths To Documentary ‘Independence’, contains not refereed, scholarly articles but the following videos: Wheels Across the Himalaya (1981); Yallambee, People of Hope (1986); This Is What I Call Living (1988); The Art of Place: Hanoi Brisbane Art Exchange (1995); and Millennium Shift: The Search for New World Art (1997). While this is a most impressive body of work, and is well unified by appropriate discussion within the thesis, the cynic who raised eyebrows at Petzke’s thesis might also question this one: Dr Lang’s videos all preceded enrolment at GU, and none have been refereed or acknowledged with major prizes. Certainly, the act of releasing a film for distribution has much in common with book publishing, but should these videos be considered on a par with academic papers published in, say, the prestigious and demanding journal Screen? While recognition at awards ceremonies might arguably correlate with peer review, there is still the question as to how scholarly a film actually is.
Of course, documentary films such as those in Lang’s thesis can be shown to be addressing gaps in the literature, as is the expectation of any research paper, but the onus remains on the author/film-maker to demonstrate this via a detailed contextual review and a well-written, erudite argument that unifies the works into a cohesive thesis. This Lang has done, to the extent that a suspicious cynic might wonder why he chose not to present his work for a standard PhD award. Another issue unaddressed by most institutions is the possibility that the publications have been self-refereed or refereed by the candidate’s editorial colleagues, in a case wherein the papers appear in a book the candidate has edited or co-edited. Dr Gillian Swanson’s 2004 GU thesis, Towards a Cultural History of Private Life: Sexual Character, Consuming Practices and Cultural Knowledge, which addresses amongst many other cultural artefacts the film Lawrence of Arabia (David Lean, 1962), has nine publications, five of which come from two books she co-edited: Nationalising Femininity: Culture, Sexuality and Cinema in Britain in World War Two (Gledhill and Swanson 1996) and Deciphering Culture: Ordinary Curiosities and Subjective Narratives (Crisp et al. 2000). While few would dispute the quality of Swanson’s work, the persistent cynic might wonder if these five papers really qualify as refereed publications. The tacit understanding of a refereed publication is that it is blind reviewed, i.e. the contributor’s name is removed from the document. Such a system is used to prevent bias and favouritism, but this level of anonymity might be absent when the contributor to a book is also one of the book’s editors. Of course, Dr Swanson probably took great care to distance herself from the refereeing process undertaken by her co-editors, but without an inbuilt check, allegations of cronyism from unfriendly cynics may well result. A related factor in making comparisons of different universities’ PhDs by Published Papers is the requirements different universities have about the standard of the journal the paper is published in. It used to be a simple matter in Australia: the government’s Department of Education, Science and Training (DEST) held a Register of Refereed Journals. If your benefactor in disseminating your work was on the list, your publications were of near-unquestionable quality. Not any more: DEST will no longer accept nominations for listing on the Register and will not undertake to rule on whether a particular journal article meets the HERDC [Higher Education Research Data Collection] requirements for inclusion in publication counts. HEPs [Higher Education Providers] have always had the discretion to determine if a publication produced in a journal meets the requirements for inclusion in the HERDC regardless of whether or not the journal was included on the Register of Refereed Journals. As stated in the HERDC specifications, the Register is not an exhaustive list of all journals which satisfy the peer-review requirements (DEST). The last listing for the DEST Register of Refereed Journals was on 3 February 2006, making way for a new tiered list of academic journals, which is currently under review in the Australian tertiary education sector (see discussion of this development in the Redden and Mitchell articles in this issue).
In the interim, some university faculties created their own rankings of journals, but not the Faculty of Creative Industries at the Queensland University of Technology (QUT), where I am studying for my PhD by Published Papers. Although QUT does not have a list of ranked journals for a candidate to submit papers to, it is otherwise quite strict in its requirements. The QUT University Regulations state, “Papers submitted as a PhD thesis must be closely related in terms of subject matter and form a cohesive research narrative” (QUT PhD regulation 14.1.2). Thus there is the requirement at QUT that, apart from the usual introduction, methodology and literature review, an argument must be made as to how the papers present a sustained research project via “an overarching discussion of the main features linking the publications” (14.2.12). It is also therein stated that it should be an “account of research progress linking the research papers” (4.2.6). In other words, a unifying essay must make an argument for consideration of the sometimes diversely published papers as a cohesive body of work, undertaken in a deliberate journey of research. In my own case, an aural auteur analysis of sound in the films of Rolf de Heer, I argue that my published papers (eight in total) represent a journey from genre analysis (one paper), to standard auteur analysis (three papers), to an argument that sound should be considered in auteur analysis (one paper), to the major innovation of the thesis, aural auteur analysis (three papers). It should also be noted that, unlike at Bond, GU or SUT, the QUT regulations for the standard PhD still apply: a Confirmation Seminar, a Final Seminar, and a minimum two years of full-time enrolment (with a minimum of three months residency in Brisbane) are all compulsory. Such milestones and sine qua non ensure the candidate’s academic progress and intellectual development such that she or he is able to confidently engage in meaningful quodlibets regarding the thesis’s topic. Another interesting and significant feature of the QUT guidelines for this type of degree is the edict that papers submitted must be “published, accepted or submitted during the period of candidature” (14.1.1). Similarly, the University of Canberra (UC) states, “The articles or other published material must be prepared during the period of candidature” (10). Likewise, Edith Cowan University (ECU) will confer its PhD by Publications to those candidates whose thesis consists of “only papers published in refereed scholarly media during the period of enrolment” (2). In other words, one cannot simply front up to ECU, QUT, or UC with a résumé of articles or films published over a lifetime of writing or film-making and ask for a PhD by Published Papers. Publications prepared by the candidate prior to commencement of candidature are simply not acceptable at these institutions, and such PhDs by Published Papers from QUT, UC and ECU are entirely different to those offered by Bond, GU and SUT. Furthermore, without a requirement for a substantial period of enrolment and residency, recipients of PhDs by Published Papers from Bond, GU, or SUT are unlikely to have participated significantly in the research environment of their relevant faculty and peers. Such newly minted doctors may be as unfamiliar with the campus and its research activities as the recipient of an honorary doctorate usually is, as he or she poses for the media’s cameras en route to the glamorous awards ceremony.
Much of my argument in this paper is built upon the assumption that the process of refereeing a paper (or, for that matter, a film) guarantees a high level of academic rigour, but I confess that this premise is patently naïve, if not actually flawed. Refereeing can result in the rejection of new ideas that conflict with the established opinions of the referees. Interdisciplinary collaboration can be impeded, and the lack of referee accountability is a potential problem too. It can also be no less nail-biting a process than the examination of a finished thesis, given that some journals take over a year to complete the refereeing process, and some journals’ editorial committees have recognised this shortcoming. Despite blind refereeing being a mainstay of its editorial approach since 1869, the prestigious science journal Nature, which only publishes about 7% of its submissions, has led the way with regard to varying the procedure, implementing in 2006 a four-month trial of ‘Open Peer Review’. Its website states: Authors could choose to have their submissions posted on a preprint server for open comments, in parallel with the conventional peer review process. Anyone in the field could then post comments, provided they were prepared to identify themselves. Once the usual confidential peer review process is complete, the public ‘open peer review’ process was closed and the editors made their decision about publication with the help of all reports and comments (Campbell). Unfortunately, the experiment was unpopular with both authors and online peer reviewers. What the Nature experiment does demonstrate, however, is that the traditional process of blind refereeing is not yet perfected and can possibly evolve into something less problematic in the future. Until then, refereeing continues to be the best system there is for applying structured academic scrutiny to submitted papers. With the reforms of the higher education sector, including forced mergers of universities and colleges of advanced education and the re-introduction of university fees (carried out under the aegis of John Dawkins, Minister for Employment, Education and Training from 1987 to 1991), and the subsequent rationing of monies according to research dividends (calculated according to numbers of research degree conferrals and publications), there has been a veritable explosion in the number of institutions offering PhDs in Australia. But the general public may not always be capable of differentiating between legitimately accredited programs and diploma mills, given that the requirements for the former differ substantially. From relatively easily obtainable PhDs by Published Papers at Bond, GU and SUT, to more rigorous requirements at ECU, QUT and UC, there is undoubtedly a huge range in the demands of degrees that recognise a candidate’s published body of work. The cynical reader may assume that with this paper I am simply trying to shore up my own forthcoming graduation with a PhD by Published Papers against potential criticisms that it is on a par with a ‘purchased’ doctorate. Perhaps they are right, for this is a new degree in QUT’s Creative Industries faculty and has only been awarded to one other candidate (Dr Marcus Foth, for his 2006 thesis entitled Towards a Design Methodology to Support Social Networks of Residents in Inner-City Apartment Buildings). But I believe QUT is setting a benchmark, along with ECU and UC, to which other universities should aspire.
In conclusion, I believe further efforts should be undertaken to heighten the differences in status between PhDs by Published Papers generated during enrolment, PhDs by Published Papers generated before enrolment, and honorary doctorates awarded for non-academic published work. Failure to do so courts cynical comparison of all PhDs by Published Papers with unearned doctorates bought from Internet shysters.

References

Brown, George. “Protecting Australia’s Higher Education System: A Proactive Versus Reactive Approach in Review (1999–2004).” Proceedings of the Australian Universities Quality Forum 2004. Australian Universities Quality Agency, 2004. 11 June 2008 ‹http://www.auqa.edu.au/auqf/2004/program/papers/Brown.pdf›.
Campbell, Philip. “Nature Peer Review Trial and Debate.” Nature: International Weekly Journal of Science December 2006. 11 June 2008 ‹http://www.nature.com/nature/peerreview/›.
Crisp, Jane, Kay Ferres, and Gillian Swanson, eds. Deciphering Culture: Ordinary Curiosities and Subjective Narratives. London: Routledge, 2000.
Department of Education, Science and Training (DEST). “Closed—Register of Refereed Journals.” Higher Education Research Data Collection, 2008. 11 June 2008 ‹http://www.dest.gov.au/sectors/research_sector/online_forms_services/higher_education_research_data_collection.htm›.
Edith Cowan University. “Policy Content.” Postgraduate Research: Thesis by Publication, 2003. 11 June 2008 ‹http://www.ecu.edu.au/GPPS/policies_db/tmp/ac063.pdf›.
Gledhill, Christine, and Gillian Swanson, eds. Nationalising Femininity: Culture, Sexuality and Cinema in Britain in World War Two. Manchester: Manchester UP, 1996.
Griffith Law School, Griffith University. Handbook for Research Higher Degree Students. 24 March 2004. 11 June 2008 ‹http://www.griffith.edu.au/centre/slrc/pdf/rhdhandbook.pdf›.
Jeffries, Stuart. “I’m a Celebrity, Get Me an Honorary Degree!” The Guardian 6 July 2006. 11 June 2008 ‹http://education.guardian.co.uk/higher/comment/story/0,,1813525,00.html›.
Kermit the Frog. “Kermit’s Commencement Address at Southampton Graduate Campus.” Long Island University News 19 May 1996. 11 June 2008 ‹http://www.southampton.liu.edu/news/commence/1996/kermit.htm›.
McNamara, Eileen. “Honorary Senselessness.” The Boston Globe 7 May 2006. ‹http://www.boston.com/news/local/articles/2006/05/07/honorary_senselessness/›.
O’Loughlin, Shaunnagh. “Doctor Cave.” Monash Magazine 21 (May 2008). 13 Aug. 2008 ‹http://www.monash.edu.au/pubs/monmag/issue21-2008/alumni/cave.html›.
Queensland University of Technology. “Presentation of PhD Theses by Published Papers.” Queensland University of Technology Doctor of Philosophy Regulations (IF49). 12 Oct. 2007. 11 June 2008 ‹http://www.mopp.qut.edu.au/Appendix/appendix09.jsp#14%20Presentation%20of%20PhD%20Theses›.
Swinburne University of Technology. Research Higher Degrees and Policies. 14 Nov. 2007. 11 June 2008 ‹http://www.swinburne.edu.au/corporate/registrar/ppd/docs/RHDpolicy&procedure.pdf›.
University of Canberra. Higher Degrees by Research: Policy and Procedures (The Gold Book). 7.3.3.27 (a). 15 Nov. 2004. 11 June 2008 ‹http://www.canberra.edu.au/research/attachments/goldbook/Pt207_AB20approved3220arp07.pdf›.
APA, Harvard, Vancouver, ISO, and other styles
5

Mahon, Elaine. "Ireland on a Plate: Curating the 2011 State Banquet for Queen Elizabeth II." M/C Journal 18, no. 4 (August 7, 2015). http://dx.doi.org/10.5204/mcj.1011.

Full text
Abstract:
Introduction
Firmly located within the discourse of visible culture as the lofty preserve of art exhibitions and museum artefacts, the noun “curate” has gradually transformed into the verb “to curate”. Williams writes that “curate” has become a fashionable code word among the aesthetically minded to describe a creative activity. Designers no longer simply sell clothes; they “curate” merchandise. Chefs no longer only make food; they also “curate” meals. They are chosen for their keen eye for a particular style or a precise shade, and it is their knowledge of their craft, their reputation, and their sheer ability to choose among countless objects which make the creative process a creative activity in itself. Writing from within the framework of “curate” as a creative process, this article discusses how the state banquet for Queen Elizabeth II, hosted by Irish President Mary McAleese at Dublin Castle in May 2011, was carefully curated to represent Ireland’s diplomatic, cultural, and culinary identity. The paper will focus in particular on how the menu for the banquet was created and how the banquet’s brief, “Ireland on a Plate”, was fulfilled.
History and Background
Food has been used by nations for centuries to display wealth, cement alliances, and impress foreign visitors. Since the feasts of the Numidian kings (circa 340 BC), culinary staging and presentation have belonged to “a long, multifaceted and multicultural history of diplomatic practices” (IEHCA 5). According to the works of Baughman, Young, and Albala, food has defined the social, cultural, and political position of a nation’s leaders throughout history.
In early 2011, Ross Lewis, Chef Patron of Chapter One Restaurant in Dublin, was asked by the Irish Food Board, Bord Bía, if he would be available to create a menu for a high-profile banquet (Mahon 112). The name of the guest of honour was divulged several weeks later after vetting by the protocol and security divisions of the Department of the Taoiseach (Prime Minister) and the Department of Foreign Affairs and Trade. Lewis was informed that the menu was for the state banquet to be hosted by President Mary McAleese at Dublin Castle in honour of Queen Elizabeth II’s visit to Ireland the following May.
Hosting a formal banquet for a visiting head of state is a key feature in the statecraft of international and diplomatic relations. Food is the societal common denominator that links all human beings, regardless of culture (Pliner and Rozin 19). When world leaders publicly share a meal, that meal is laden with symbolism, illuminating each diner’s position “in social networks and social systems” (Sobal, Bove, and Rauschenbach 378). The public nature of the meal signifies status and symbolic kinship and that “guest and host are on par in terms of their personal or official attributes” (Morgan 149). While the field of academic scholarship on diplomatic dining might be young, there is little doubt of the value ascribed to the semiotics of diplomatic gastronomy in modern power structures (Morgan 150; De Vooght and Scholliers 12; Chapple-Sokol 162), for, as Firth explains, symbols are malleable and perfectly suited to exploitation by all parties (427).
Political Diplomacy
When Ireland gained independence in December 1921, it marked the end of eight centuries of British rule. The outbreak of “The Troubles” in 1969 in Northern Ireland upset the gradually improving environment of British–Irish relations, and it would be some time before a state visit became a possibility. 
Beginning with the peace process in the 1990s, the IRA ceasefire of 1994, and the Good Friday Agreement in 1998, a state visit was firmly set in motion by the visit of Irish President Mary Robinson to Buckingham Palace in 1993, followed by the unofficial visit of the Prince of Wales to Ireland in 1995, and the visit of Irish President Mary McAleese to Buckingham Palace in 1999. An official invitation to Queen Elizabeth from President Mary McAleese in March 2011 was accepted, and the visit was scheduled for mid-May of the same year.
The visit was a highly performative occasion, orchestrated and ordained in great detail, displaying all the necessary protocol associated with the state visit of one head of state to another: inspection of the military, a courtesy visit to the nation’s head of state on arrival, the laying of a wreath at the nation’s war memorial, and a state banquet.
These aspects of protocol between Britain and Ireland were particularly symbolic. By inspecting the military on arrival, an institution whose existence is a key indicator of independence, Queen Elizabeth effectively demonstrated her recognition of Ireland’s national sovereignty. On making the customary courtesy call to the head of state, the Queen was received by President McAleese at her official residence, Áras an Uachtaráin (The President’s House), which had formerly been the residence of the British monarch’s representative in Ireland (Robins 66). The state banquet was held in Dublin Castle, once the headquarters of British rule, where the Viceroy, the representative of Britain’s Court of St James, had maintained court (McDowell 1).
Cultural Diplomacy
The state banquet provided an exceptional showcase of Irish culture and design and generated a level of preparation previously unseen among Dublin Castle staff, who described it as “the most stage managed state event” they had ever witnessed (Mahon 129).
The castle was cleaned from top to bottom, and inventories were taken of the furniture and fittings. The Waterford Crystal chandeliers were painstakingly taken down, cleaned, and reassembled; the Killybegs carpets and rugs of Irish lamb’s wool were cleaned and repaired. A special edition Newbridge Silverware pen was commissioned for Queen Elizabeth and Prince Philip to sign the newly ordered Irish leather-bound visitors’ book. A new set of state tableware was ordered for the President’s table. Irish manufacturers of household goods necessary for the guest rooms, such as towels and soaps, hand creams and body lotions, candle holders and scent diffusers, were sought. Members of Her Majesty’s staff conducted a “walk-through” several weeks in advance of the visit to ensure that the Queen’s wardrobe would not clash with the surroundings (Mahon 129–32).
The promotion of Irish manufacture is a constant thread throughout history. Irish linen, writes Kane, enjoyed a reputation as far afield as the Netherlands and Italy in the 15th century, and archival documents from the Vaucluse attest to the purchase of Irish cloth in Avignon in 1432 (249–50). Support for Irish-made goods was raised in 1720 by Jonathan Swift, and by the 18th century, writes Foster, Dublin had become an important centre for luxury goods (44–51).
It has been Irish government policy since the late 1940s to use Irish-manufactured goods for state entertaining, so the material culture of the banquet was distinctly Irish: Arklow Pottery plates, Newbridge Silverware cutlery, Waterford Crystal glassware, and Irish linen tablecloths. 
In order to decide upon the table setting for the banquet, four tables were laid in the King’s Bedroom in Dublin Castle. The Executive Chef responsible for the banquet menu, and certain key personnel, helped determine which setting would facilitate serving the food within the time schedule allowed (Mahon 128–29). The style of service would be service à la russe, so widespread in restaurants today as to seem unremarkable. Each plate is prepared in the kitchen by the chef and then served to each individual guest at table. In the mid-19th century, this style of service replaced service à la française, in which guests typically entered the dining room after the first course had been laid on the table and selected food from the choice of dishes displayed around them (Kaufman 126).
The guest list was compiled by government and embassy officials on both sides and was a roll call of Irish and British life. At the President’s table, 10 guests would be served by a team of 10 staff in Dorchester livery. The remaining tables would each seat 12 guests, served by 12 liveried staff. The staff practised for several days prior to the banquet to make sure that service would proceed smoothly within the time frame allowed. The team of waiters, each carrying a plate, would emerge from the kitchen in single file. They would then take up positions around the table, each waiter standing to the left of the guest they would serve. On receipt of a discreet signal, each plate would be laid in front of each guest at precisely the same moment, after which the waiters would about turn and return to the kitchen in single file (Mahon 130).
Post-prandial entertainment featured distinctive styles of performance and instruments associated with Irish traditional music. These included reels, hornpipes, and slipjigs, voice and harp, sean-nós (old style) singing, and performances by established Irish artists on the fiddle, bouzouki, flute, and uilleann pipes (Office of Public Works).
Culinary Diplomacy: Ireland on a Plate
Lewis was given the following brief: the menu had to be Irish, the main course must be beef, and the meal should represent the very best of Irish ingredients. There were no restrictions on menu design. There were no dietary requirements or specific requests from the Queen’s representatives, although Lewis was informed that shellfish is excluded de facto from Irish state banquets as a precautionary measure. The meal was to be four courses long and had to be served to 170 diners within exactly 1 hour and 10 minutes (Mahon 112). A small army of 16 chefs and 4 kitchen porters would prepare the food in the kitchen of Dublin Castle under tight security. The dishes would be served on state tableware by 40 waiters, 6 restaurant managers, a banqueting manager, and a sommelier. Lewis would be at the helm of the operation as Executive Chef (Mahon 112–13).
Lewis started by drawing up “a patchwork quilt” of the products he most wanted to use and built the menu around it. The choice of suppliers was based on experience but also on a supplier’s ability to deliver perfectly ripe goods in mid-May, typically a black spot in the Irish fruit and vegetable growing calendar as it sits between the end of one season and the beginning of another. Lewis consulted the Queen’s itinerary and the menus to be served so as to avoid repetitions. 
He had to discard his initial plan to feature lobster in the starter and rhubarb in the dessert—the former for the precautionary reasons mentioned above, and the latter because it featured on the Queen’s lunch menu on the day of the banquet (Mahon 112–13).
Once the ingredients had been selected, the menu design focused on creating tastes, flavours, and textures. Several draft menus were drawn up and myriad dishes were tasted and discussed in the kitchen of Lewis’s own restaurant. Various wines were paired and tasted with the different courses, the final choice being a Château Lynch-Bages 1998 red and a Château de Fieuzal 2005 white, both from French Bordeaux estates with an Irish connection (Kellaghan 3). Two months and two menu sittings later, the final menu was confirmed and signed off by state and embassy officials (Mahon 112–16).
The Starter
The banquet’s starter featured organic Clare Island salmon cured in a sweet brine, laid on top of a salmon cream combining wild smoked salmon from the Burren and Cork’s Glenilen Farm crème fraîche, set over a lemon balm jelly from the Tannery Cookery School Gardens, Waterford. Garnished with horseradish cream, wild watercress, and chive flowers from Wicklow, the dish was finished with rapeseed oil from Kilkenny and a little sea salt from West Cork (Mahon 114).
Main Course
A main course of Irish beef featured as the pièce de résistance of the menu. A rib of beef from Wexford’s Slaney Valley was provided by Kettyle Irish Foods in Fermanagh and served with ox cheek and tongue from Rathcoole, County Dublin. From along the eastern coastline came the ingredients for the traditional Irish dish of smoked champ: cabbage from Wicklow combined with potatoes and spring onions grown in Dublin. The new season’s broad beans and carrots were served with wild garlic leaf, which adorned the dish (Mahon 113).
Cheese Course
The cheese course was made up of Knockdrinna, a Tomme-style goat’s milk cheese from Kilkenny; Milleens, a Munster-style cow’s milk cheese produced in Cork; Cashel Blue, a cow’s milk blue cheese from Tipperary; and Glebe Brethan, a Comté-style cheese made from raw cow’s milk from Louth. Ditty’s Oatmeal Biscuits from Belfast accompanied the course.
Dessert
Lewis chose to feature Irish strawberries in the dessert. Pat Clarke guaranteed delivery of ripe strawberries on the day of the banquet. They married perfectly with cream and yoghurt from Glenilen Farm in Cork. The cream was set with Irish carrageen moss, overlaid with strawberry jelly and sauce, and garnished with meringues made with Irish apple balsamic vinegar from Lusk in North Dublin, yoghurt mousse, and Irish soda bread tuiles made with wholemeal flour from the Mosse family mill in Kilkenny (Mahon 113).
The following day, President McAleese telephoned Lewis, saying of the banquet “Ní hé go raibh sé go maith, ach go raibh sé míle uair níos fearr ná sin” (“It’s not that it was good but that it was a thousand times better”). The President observed that the menu was not only delicious but that it was “amazingly articulate in terms of the story that it told about Ireland and Irish food.” The Queen had particularly enjoyed the stuffed cabbage leaf of tongue, cheek, and smoked colcannon (a traditional Irish dish of mashed potatoes with curly kale or green cabbage) and had noted the diverse selection of Irish ingredients from Irish artisans (Mahon 116). 
Irish Cuisine
When the topic of food is explored in Irish historiography, the focus tends to be on the consequences of the Great Famine (1845–49), which left the country “socially and emotionally scarred for well over a century” (Mac Con Iomaire and Gallagher 161). Some commentators consider the term “Irish cuisine” oxymoronic, according to Mac Con Iomaire and Maher (3). As Goldstein observes, Ireland has suffered twice—once from its food deprivation and a second time because these deprivations present an obstacle for the exploration of Irish foodways (xii). Writing about Italian, Irish, and Jewish migration to America, Diner states that the Irish did not have a food culture to speak of and that Irish writers “rarely included the details of food in describing daily life” (85). Mac Con Iomaire and Maher note that Diner’s methodology overlooks a centuries-long tradition of hospitality in Ireland, such as that described by Simms (68), and shows an unfamiliarity with the wealth of food-related sources in the Irish language, as highlighted by Mac Con Iomaire (“Exploring” 1–23).
Recent scholarship on Ireland’s culinary past is unearthing a fascinating story of a much more nuanced culinary heritage than has been previously understood. This is clearly demonstrated in the research of Cullen, Cashman, Deleuze, Kellaghan, Kelly, Kennedy, Legg, Mac Con Iomaire, Mahon, O’Sullivan, Richman Kenneally, Sexton, and Stanley, Danaher, and Eogan.
In 1996, Ireland was described by McKenna as having the most dynamic cuisine in any European country, a place where in the last decade “a vibrant almost unlikely style of cooking has emerged” (qtd. in Mac Con Iomaire “Jammet’s” 136). By 2014, there were nine restaurants in Dublin which had been awarded Michelin stars or Red Ms (Mac Con Iomaire “Jammet’s” 137). Ross Lewis, Chef Patron of Chapter One Restaurant, who would be chosen to create the menu for the state banquet for Queen Elizabeth II, has maintained a Michelin star since 2008 (Mac Con Iomaire, “Jammet’s” 138). Most recently, the current strength of Irish gastronomy is globally apparent in Mark Moriarty’s award as San Pellegrino Young Chef 2015 (McQuillan). As Deleuze succinctly states: “Ireland has gone mad about food” (143).
This article is part of a research project into Irish diplomatic dining, and the author is part of a research cluster into Ireland’s culinary heritage within the Dublin Institute of Technology. The aim of the research is to add to the growing body of scholarship on Irish gastronomic history and, ultimately, to contribute to the discourse on the existence of a national cuisine. If, as Zubaida says, “a nation’s cuisine is its court’s cuisine,” then it is time for Ireland to “research the feasts as well as the famines” (Mac Con Iomaire and Cashman 97).
Conclusion
The Irish state banquet for Queen Elizabeth II in May 2011 was a highly orchestrated and formalised process. From the menu, material culture, entertainment, and level of consultation in the creative content, it is evident that the banquet was carefully curated to represent Ireland’s diplomatic, cultural, and culinary identity.
The effects of the visit appear to have been felt in the years which have followed. Hennessy wrote in the Irish Times newspaper that Queen Elizabeth is privately said to regard her visit to Ireland as the most significant of the trips she has made during her 60-year reign. 
British Prime Minister David Cameron is said to mention the visit before every Irish audience he encounters, and British Foreign Secretary William Hague has spoken in particular of the impact the state banquet in Dublin Castle made upon him. Hennessy points out that one of the most significant indicators of the peaceful relationship which exists between the two countries nowadays was the subsequent state visit by Irish President Michael D. Higgins to Britain in 2013. This was the first state visit to the United Kingdom by a President of Ireland and would have been unimaginable 25 years ago. The fact that the President and his wife stayed at Windsor Castle and that the attendant state banquet was held there instead of Buckingham Palace were both deemed to be marks of special favour and directly attributed to the success of Her Majesty’s 2011 visit to Ireland.
As the research demonstrates, eating together unites rather than separates, gathers rather than divides, diffuses political tensions, and confirms alliances. It might be said then that the 2011 state banquet hosted by President Mary McAleese in honour of Queen Elizabeth II, curated by Ross Lewis, gives particular meaning to the axiom “to eat together is to eat in peace” (Taliano des Garets 160).
Acknowledgements
Supervisors: Dr Máirtín Mac Con Iomaire (Dublin Institute of Technology) and Dr Michael Kennedy (Royal Irish Academy)
Fáilte Ireland
Photos of the banquet dishes supplied and permission to reproduce them for this article kindly granted by Ross Lewis, Chef Patron, Chapter One Restaurant ‹http://www.chapteronerestaurant.com/›.
Illustration ‘Ireland on a Plate’ © Jesse Campbell Brown
Remerciements
The author would like to thank the anonymous reviewers for their feedback and suggestions on an earlier draft of this article.
References
Albala, Ken. The Banquet: Dining in the Great Courts of Late Renaissance Europe. Chicago: U of Illinois P, 2007.
———. “The Historical Models of Food and Power in European Courts of the Nineteenth Century: An Expository Essay and Prologue.” Royal Taste, Food Power and Status at the European Courts after 1789. Ed. Daniëlle De Vooght. Surrey: Ashgate Publishing, 2011. 13–29.
Baughman, John J. “The French Banqueting Campaign of 1847–48.” The Journal of Modern History 31 (1959): 1–15.
Cashman, Dorothy. “That Delicate Sweetmeat, the Irish Plum: The Culinary World of Maria Edgeworth.” ‘Tickling the Palate’: Gastronomy in Irish Literature and Culture. Eds. Máirtín Mac Con Iomaire and Eamon Maher. Oxford: Peter Lang, 2014. 15–34.
———. “French Boobys and Good English Cooks: The Relationship with French Culinary Influence in Eighteenth- and Nineteenth-Century Ireland.” Reimagining Ireland: Proceedings from the AFIS Conference 2012. Vol. 55, Reimagining Ireland. Eds. Benjamin Keatinge and Mary Pierse. Bern: Peter Lang, 2014. 207–22.
———. “‘This Receipt Is as Safe as the Bank’: Reading Irish Culinary Manuscripts.” M/C Journal 16.3 (2013). ‹http://journal.media-culture.org.au/index.php/mcjournal›.
———. “Ireland’s Culinary Manuscripts.” Irish Traditional Cooking, Recipes from Ireland’s Heritage. By Darina Allen. London: Kyle Books, 2012. 14–15.
Chapple-Sokol, Sam. “Culinary Diplomacy: Breaking Bread to Win Hearts and Minds.” The Hague Journal of Diplomacy 8 (2013): 161–83.
Cullen, Louis M. The Emergence of Modern Ireland 1600–1900. London: Batsford, 1981.
Deleuze, Marjorie. “A New Craze for Food: Why Is Ireland Turning into a Foodie Nation?” ‘Tickling the Palate’: Gastronomy in Irish Literature and Culture. Eds. Máirtín Mac Con Iomaire and Eamon Maher. Oxford: Peter Lang, 2014. 143–58.
“Details of the State Dinner.” Office of Public Works. 8 Apr. 2013. ‹http://www.dublincastle.ie/HistoryEducation/TheVisitofHerMajestyQueenElizabethII/DetailsoftheStateDinner/›.
De Vooght, Daniëlle, and Peter Scholliers. Introduction. Royal Taste, Food Power and Status at the European Courts after 1789. Ed. Daniëlle De Vooght. Surrey: Ashgate Publishing, 2011. 1–12.
Diner, Hasia. Hungering for America: Italian, Irish & Jewish Foodways in the Age of Migration. Cambridge, MA: Harvard UP, 2001.
Firth, Raymond. Symbols: Public and Private. London: George Allen & Unwin, 1973.
Foster, Sarah. “Buying Irish: Consumer Nationalism in 18th Century Dublin.” History Today 47.6 (1997): 44–51.
Goldstein, Darra. Foreword. ‘Tickling the Palate’: Gastronomy in Irish Literature and Culture. Eds. Máirtín Mac Con Iomaire and Eamon Maher. Oxford: Peter Lang, 2014. xi–xvii.
Hennessy, Mark. “President to Visit Queen in First State Visit to the UK.” The Irish Times 28 Nov. 2013. 25 May 2015 ‹http://www.irishtimes.com/news/world/uk/president-to-visit-queen-in-first-state-visit-to-the-uk-1.1598127›.
“International Historical Conference: Table and Diplomacy—from the Middle Ages to the Present Day—Call for Papers.” Institut Européen d’Histoire et des Cultures de l’Alimentation (IEHCA) 15 Feb. 2015. ‹http://www.iehca.eu/IEHCA_v4/pdf/16-11-3-5-colloque-table-diplomatique-appel-a-com-fr-en.pdf›.
Kane, Eileen M.C. “Irish Cloth in Avignon in the Fifteenth Century.” The Journal of the Royal Society of Antiquaries of Ireland 102.2 (1972): 249–51.
Kaufman, Cathy K. “Structuring the Meal: The Revolution of Service à la Russe.” The Meal: Proceedings of the Oxford Symposium on Food and Cookery 2001. Ed. Harlan Walker. Devon: Prospect Books, 2002. 123–33.
Kellaghan, Tara. “Claret: The Preferred Libation of Georgian Ireland’s Elite.” Dublin Gastronomy Symposium. Dublin, 6 Jun. 2012. ‹http://arrow.dit.ie/dgs/2012/june612/3/›.
Kelly, Fergus. “Early Irish Farming.” Early Irish Law Series. Ed. Fergus Kelly. Vol. IV. Dublin: Dublin Institute for Advanced Studies, 1997.
Kennedy, Michael. “‘Where’s the Taj Mahal?’: Indian Restaurants in Dublin since 1908.” History Ireland 18.4 (2010): 50–52. ‹http://www.jstor.org/stable/27823031›.
Legg, Marie-Louise. “‘Irish Wine’: The Import of Claret from Bordeaux to Provincial Ireland in the Eighteenth Century.” Irish Provincial Cultures in the Long Eighteenth Century: Making the Middle Sort (Essays for Toby Barnard). Eds. Raymond Gillespie and R[obert] F[itzroy] Foster. Dublin: Four Courts Press, 2012.
Mac Con Iomaire, Máirtín. “Haute Cuisine Restaurants in Nineteenth and Twentieth Century Ireland.” Proceedings of the Royal Irish Academy. Section C (2015). DOI: 10.3318/PRIAC.2015.115.06.
———. “‘From Jammet’s to Guilbaud’s’: The Influence of French Haute Cuisine on the Development of Dublin Restaurants.” ‘Tickling the Palate’: Gastronomy in Irish Literature and Culture. Eds. Máirtín Mac Con Iomaire and Eamon Maher. Oxford: Peter Lang, 2014. 121–41. ‹http://arrow.dit.ie/tschafbk/15/›.
———. “Exploring the ‘Food Motif’ in Songs from the Irish Tradition.” Dublin Gastronomy Symposium. Dublin, 3 Jun. 2014. ‹http://arrow.dit.ie/dgs/2014/june314/7/›.
———. “Gastro-Topography: Exploring Food Related Placenames in Ireland.” Canadian Journal of Irish Studies 38.1-2 (2014): 126–57.
———. “The Pig in Irish Cuisine and Culture.” M/C Journal 13.5 (2010). ‹http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/296›.
———. “The Emergence, Development and Influence of French Haute Cuisine on Public Dining Restaurants 1900–2000: An Oral History.” Doctoral Thesis. Dublin Institute of Technology, 2009. ‹http://arrow.dit.ie/tourdoc/12/›.
———. “A History of Seafood in Irish Cuisine and Culture.” Wild Food: Proceedings of the Oxford Symposium on Food and Cookery 2004. Ed. Richard Hosking. Totnes, Devon: Prospect Books, 2006. ‹http://arrow.dit.ie/tfschcafcon/3/›.
———. “The Pig in Irish Cuisine Past and Present.” The Fat of the Land: Proceedings of the Oxford Symposium on Food and Cookery 2002. Ed. Harlan Walker. Bristol: Footwork, 2003. 207–15. ‹http://arrow.dit.ie/tfschcafcon/1/›.
———, and Dorothy Cashman. “Irish Culinary Manuscripts and Printed Books: A Discussion.” Petits Propos Culinaires 94 (2011): 81–101. 16 Mar. 2012 ‹http://arrow.dit.ie/tfschafart/111/›.
———, and Tara Kellaghan. “Royal Pomp: Viceregal Celebrations and Hospitality in Georgian Dublin.” Celebration: Proceedings of the Oxford Symposium on Food and Cookery 2011. Ed. Mark McWilliams. Totnes, Devon: Prospect Books, 2012. ‹http://arrow.dit.ie/tfschafart/109/›.
———, and Eamon Maher. Introduction. ‘Tickling the Palate’: Gastronomy in Irish Literature and Culture. Eds. Máirtín Mac Con Iomaire and Eamon Maher. Oxford: Peter Lang, 2014. 1–11. ‹http://arrow.dit.ie/tschafbk/11/›.
———, and Pádraic Óg Gallagher. “The Potato in Irish Cuisine and Culture.” Journal of Culinary Science and Technology 7.2-3 (2009): 152–67. 24 Sep. 2012 ‹http://arrow.dit.ie/tfschafart/3/›.
McConnell, Tara. “‘Brew as Much as Possible during the Proper Season’: Beer Consumption in Elite Households in Eighteenth-Century Ireland.” ‘Tickling the Palate’: Gastronomy in Irish Literature and Culture. Eds. Máirtín Mac Con Iomaire and Eamon Maher. Oxford: Peter Lang, 2014. 177–89.
McDowell, R[obert] B[rendan]. Historical Essays 1938–2001. Dublin: The Lilliput Press, 2003.
McQuillan, Deirdre. “Young Irish Chef Wins International Award in Milan.” The Irish Times 28 June 2015. 30 June 2015 ‹http://www.irishtimes.com/life-and-style/food-and-drink/young-irish-chef-wins-international-award-in-milan-1.2265725›.
Mahon, Bríd. Land of Milk and Honey: The Story of Traditional Irish Food and Drink. Cork: Mercier Press, 1991.
Mahon, Elaine. “Eating for Ireland: A Preliminary Investigation into Irish Diplomatic Dining since the Inception of the State.” Diss. Dublin Institute of Technology, 2013.
Morgan, Linda. “Diplomatic Gastronomy: Style and Power at the Table.” Food and Foodways: Explorations in the History and Culture of Human Nourishment 20.2 (2012): 146–66.
O’Sullivan, Catherine Marie. Hospitality in Medieval Ireland 900–1500. Dublin: Four Courts Press, 2004.
Pliner, Patricia, and Paul Rozin. “The Psychology of the Meal.” Dimensions of the Meal: The Science, Culture, Business, and Art of Eating. Ed. Herbert L. Meiselman. Gaithersburg, MD: Aspen, 2000. 19–46.
Richman Kenneally, Rhona. “Cooking at the Hearth: The ‘Irish Cottage’ and Women’s Lived Experience.” Memory Ireland. Ed. Oona Frawley. Vol. 2. Syracuse: Syracuse UP, 2012. 224–41.
Robins, Joseph. Champagne and Silver Buckles: The Viceregal Court at Dublin Castle 1700–1922. Dublin: The Lilliput Press, 2001.
Sexton, Regina. A Little History of Irish Food. Dublin: Gill and Macmillan, 1998.
Sobal, Jeffrey, Caron Bove, and Barbara Rauschenbach. “Commensal Careers at Entry into Marriage: Establishing Commensal Units and Managing Commensal Circles.” The Sociological Review 50.3 (2002): 378–97.
Simms, Katharine. “Guesting and Feasting in Gaelic Ireland.” Journal of the Royal Society of Antiquaries of Ireland 108 (1978): 67–100.
Stanley, Michael, Ed Danaher, and James Eogan, eds. Dining and Dwelling. Dublin: National Roads Authority, 2009.
Swift, Jonathan. “A Proposal for the Universal Use of Irish Manufacture.” The Prose Works of Jonathan Swift D.D. Ed. Temple Scott. Vol. 7: Historical and Political Tracts. London: George Bell & Sons, 1905. 17–30. 29 July 2015 ‹http://www.ucc.ie/celt/published/E700001-024/›.
Taliano des Garets, Françoise. “Cuisine et Politique.” Vingtième Siècle: Revue d’histoire 59 (1998): 160–61.
Williams, Alex. “On the Tip of Creative Tongues.” The New York Times 4 Oct. 2009. 16 June 2015 ‹http://www.nytimes.com/2009/10/04/fashion/04curate.html?pagewanted=all&_r=0›.
Young, Carolin. Apples of Gold in Settings of Silver. New York: Simon & Schuster, 2002.
Zubaida, Sami. “Imagining National Cuisines.” TCD/UCD Public Lecture Series. Trinity College, Dublin. 5 Mar. 2014.
APA, Harvard, Vancouver, ISO, and other styles
6

Cushing, Nancy. "To Eat or Not to Eat Kangaroo: Bargaining over Food Choice in the Anthropocene." M/C Journal 22, no. 2 (April 24, 2019). http://dx.doi.org/10.5204/mcj.1508.

Full text
Abstract:
Kangatarianism is the rather inelegant word coined in the first decade of the twenty-first century to describe an omnivorous diet in which the only meat consumed is that of the kangaroo. First published in the media in 2010 (Barone; Zukerman), the term circulated in Australian environmental and academic circles, including the Global Animal conference at the University of Wollongong in July 2011, where I first heard it from members of the Think Tank for Kangaroos (THINKK) group. By June 2017, it had gained enough attention to be named the Oxford English Dictionary’s Australian word of the month (following on from May’s “smashed avo,” another Australian food innovation), but it took the Nine Network reality television series Love Island Australia to raise kangatarian to trending status on social media (Oxford UP). During the first episode, aired in late May 2018, Justin, a concreter and fashion model from Melbourne, declared himself to have previously been a kangatarian as he chatted with fellow contestant, Millie. Vet nurse and animal lover Millie appeared to be shocked by his revelation but was tentatively accepting when Justin explained what kangatarian meant, and justified his choice on the grounds that kangaroo are not farmed. In the social media response, it was clear that eating only the meat of kangaroos as an ethical choice was an entirely new concept to many viewers, with one tweet stating “Kangatarian isn’t a thing”, while others variously labelled the diet brutal, intriguing, or quintessentially Australian (see #kangatarian on Twitter).
There is a well-developed literature around the arguments for and against eating kangaroo, and why settler Australians tend to be so reluctant to do so (see, for example, Probyn; Cawthorn and Hoffman). Here, I will concentrate on the role that ethics play in this food choice by examining how the adoption of kangatarianism can be understood as a bargain struck to help to manage grief in the Anthropocene, and the limitations of that bargain. As Lesley Head has argued, we are living in a time of loss and of grieving, when much that has been taken for granted is becoming unstable, and “we must imagine that drastic changes to everyday life are in the offing” (313). Applying the classic (and contested) model of five stages of grief, first proposed by Elisabeth Kübler-Ross in her book On Death and Dying in 1969, much of the population of the western world seems now to be experiencing denial, her first stage of loss, while those in the most vulnerable environments have moved on to anger with developed countries for destructive actions in the past and inaction in the present. The next stages (or states) of grieving—bargaining, depression, and acceptance—are likely to be manifested, although not in any predictable sequence, as the grief over current and future losses continues (Haslam).
The great expansion of food-restrictive diets in the Anthropocene can be interpreted as part of this bargaining state of grieving, as individuals attempt to respond to the imperative to reduce their environmental impact but also to limit the degree of change to their own diet required to do so. Meat has long been identified as a key component of an individual’s environmental footprint. 
From Frances Moore Lappé’s 1971 Diet for a Small Planet through the United Nations’ Food and Agriculture Organisation’s 2006 report Livestock’s Long Shadow to the 2019 report of the EAT–Lancet Commission on Healthy Diets from Sustainable Food Systems, the advice has been consistent: meat consumption should be minimised in, if not eradicated from, the human diet. The EAT–Lancet Commission Report quantified this as less than 28 grams (just under one ounce) of beef, lamb, or pork per day (12, 25). For many this would be keenly felt, in terms of how meals are constructed, the sensory experiences associated with eating meat, and perceptions of well-being, but meat is offered up as a sacrifice to bring about the return of the beloved healthy planet.
Rather than accept the advice to cut out meat entirely, those seeking to bargain with the Anthropocene also find other options. This has given rise to a suite of foodways based around restricting meat intake in volume or type. Reducing the amount of commercially produced beef, lamb, and pork eaten is one approach, while substituting a meat the production of which has a smaller environmental footprint, most commonly chicken or fish, is another. For those willing to make deeper changes, the meat of free-living animals, especially those killed accidentally on the roads or deliberately for environmental management purposes, is another option. Further along this spectrum are the novel protein sources suggested in the Lancet report, including insects, blue-green algae, and laboratory-cultured meats.
Kangatarianism is another form of this bargain, and is backed by at least half a century of advocacy. The Australian Conservation Foundation made calls to reduce the numbers of other livestock and begin a sustainable harvest of kangaroo for food in 1970, when the sale of kangaroo meat for human consumption was still illegal across the country (Conservation of Kangaroos). The idea was repeated by biologist Gordon Grigg in the late 1980s (Jackson and Vernes 173), and again in the Garnaut Climate Change Review in 2008 (547–48). Kangaroo meat is high in protein and iron, low in fat, and high in healthy polyunsaturated fatty acids and conjugated linoleic acid, and, as these authors showed, has a smaller environmental footprint than beef, lamb, or pork. Kangaroos require less water than cattle, sheep, or pigs, and no land is cleared to grow feed for them or give them space to graze. Their paws cause less erosion and compaction of soil than do the hooves of common livestock. They eat less fodder than ruminants, and their digestive processes result in lower emissions of the powerful greenhouse gas methane and less solid waste.
As Justin of Love Island was aware, kangaroo are not farmed in the sense of being deliberately bred, fed, confined, or treated with hormones, drugs, or chemicals, which also adds to their lighter impact on the environment. However, some pastoralists argue that because they cannot prevent kangaroos from accessing the food, water, shelter, and protection from predators they provide for their livestock, they do effectively farm them, although they receive no income from sales of kangaroo meat. This type of light-touch farming of kangaroos has a very long history in Australia, going back to the continent’s first peopling some 60,000 years ago. Kangaroos were so important to Aboriginal people that a wide range of environments were manipulated to produce their favoured habitats of open grasslands edged by sheltering trees. 
As Bill Gammage demonstrated, fire was used as a tool to preserve and extend grassy areas, to encourage regrowth which would attract kangaroos, and to drive the animals from one patch to another or towards hunters waiting with spears (passim, for example, 58, 72, 76, 93). Gammage and Bruce Pascoe agree that this was a form of animal husbandry in which the kangaroos were drawn to the areas prepared for them for the young grass or, more forcefully, physically directed using nets, brush fences, or stone walls. Burnt ground served to contain the animals in place of fencing, and regular harvesting kept numbers from rising to levels which would place pressure on other species (Gammage 79, 281–86; Pascoe 42–43). Contemporary advocates of eating kangaroo have promoted the idea that kangaroos should be deliberately co-produced with other livestock instead of being killed to preserve feed and water for sheep and cattle (Ellicott; Wilson 39). Substituting kangaroo for the meat of more environmentally damaging animals would facilitate a reduction in the numbers of cattle and sheep, lessening the harm they do.
Most proponents have assumed that their audience is current meat eaters who would substitute kangaroo for the meat of other more environmentally costly animals, but kangatarianism can also emerge from vegetarianism. Wendy Zukerman, who wrote about kangaroo hunting for New Scientist in 2010, was motivated to conduct the research because she was considering becoming an early adopter of kangatarianism as the least environmentally taxing way to counter the long-term anaemia she had developed as a vegetarian. In 2018, George Wilson, honorary professor in the Australian National University’s Fenner School of Environment and Society, called for vegetarians to become kangatarians as a means of boosting overall consumption of kangaroo for environmental and economic benefits to rural Australia (39).
Given these persuasive environmental arguments, it might be expected that many people would have perceived eating kangaroo instead of other meat as a favourable bargain and taken up the call to become kangatarian. Certainly, there has been widespread interest in trying kangaroo meat. In 1997, only five years after the sale of kangaroo meat for human consumption had been legalised in most states (South Australia did so in 1980), 51% of 500 people surveyed in five capital cities said they had tried kangaroo. However, it had not become a meat of choice, with very few found to eat it more than three times a year (Des Purtell and Associates iv). Just over a decade later, a study by Ampt and Owen found an increase to 58% of 1599 Australians surveyed across the country who had tried kangaroo, but just 4.7% eating it at least monthly (14). Bryce Appleby, in his study of kangaroo consumption in the home based on interviews with 28 residents of Wollongong in 2010, specifically noted the absence of kangatarians—then a very new concept. A study of 261 Sydney university students in 2014 found that half had tried kangaroo meat and 10% continued to eat it with any regularity. Only two respondents identified themselves as kangatarian (Grant 14–15). 
Kangaroo meat advocate Michael Archer declared in 2017 that “there’s an awful lot of very, very smart vegetarians [who] have opted for semi vegetarianism and they’re calling themselves ‘kangatarians’, as they’re quite happy to eat kangaroo meat”, but unless there had been a significant change in a few years, the surveys did not bear out his assertion (154).
The ethical calculations around eating kangaroo are complicated by factors beyond the strictly environmental. One Tweeter advised Justin: “‘I’m a kangatarian’ isn’t a pickup line, mate”, and certainly the reception of his declaration could have been very cool, especially as it was delivered to a self-declared animal warrior (N’Tash Aha). All of the studies of beliefs and practices around the eating of kangaroo have noted a significant minority of Australians who would not consider eating kangaroo based on issues of animal welfare and animal rights. The 1997 study found that 11% were opposed to the idea of eating kangaroo, while in Grant’s 2014 study, 15% were ethically opposed to eating kangaroo meat (Des Purtell and Associates iv; Grant 14–15). Animal ethics complicate the bargains calculated principally on environmental grounds.
These ethical concerns work across several registers. One is around the flesh and blood kangaroo as a charismatic native animal unique to Australia, which Australians have an obligation to respect and nurture. Sheep, cattle, and pigs have been subject to long-term propaganda campaigns which entrench the idea that they are unattractive and unintelligent, and veil their transition to meat behind euphemistic language and abattoir walls, making it easier to eat them. Kangaroos are still seen as resourceful and graceful animals, and no linguistic tricks shield consumers from the knowledge that it is a roo on their plate. A proposal in 2009 to market a “coat of arms” emu and kangaroo-flavoured potato chip brought complaints to the Advertising Standards Bureau that this was disrespectful to these native animals, although the flavours were to be simulated and the product vegetarian (Black). Coexisting with this high regard for kangaroos is its antithesis: a valuation of them informed by their designation as a pest in the pastoral industry, and the use of the carcasses of those killed to feed dogs and other companion animals. Appleby identified a visceral disgust response to the idea of eating kangaroo in many of his informants, including both vegetarians who would not consider eating kangaroo because of their commitment to a plant-based diet, and at least one omnivore who would prefer to give up all meat rather than eat kangaroo. While diametrically opposed, the end point of both positions is that kangaroo meat should not be eaten.
A second animal ethics stance relates to the imagined kangaroo, a cultural construct which for most urban Australians is much more present in their lives and more likely to shape their actions than the living animals. It is behind the rejection of eating an animal which holds such an iconic place in Australian culture: to the dexter on the 1912 national coat of arms; hopping through the Hundred Acre Wood as Kanga and Roo in A.A. Milne’s Winnie-the-Pooh children’s books from the 1920s and the Disney movies later made from them; as a boy’s best friend, Skippy the Bush Kangaroo, in a fondly remembered 1970s television series; and high in the sky on QANTAS planes. The anthropomorphising of kangaroos permitted the spectacle of the boxing kangaroo from the late nineteenth century. 
By framing natural kangaroo behaviours as boxing, these exhibitions encouraged an ambiguous understanding of kangaroos as human-like, moving them further from the category of food (Golder and Kirkby). Australian government bodies used this idea of the kangaroo to support food exports to Britain, with kangaroos as cooks or diners rather than ingredients. The Kangaroo Kookery Book of 1932 (see fig. 1 below) portrayed kangaroos as a nuclear family in a suburban kitchen, and another official campaign supporting sales of Australian produce in Britain in the 1950s featured a Disney-inspired kangaroo eating apples and chops washed down with wine (“Kangaroo to Be ‘Food Salesman’”). This imagining of kangaroos as human-like has persisted, leading to the opinion expressed in a 2008 focus group, that consuming kangaroo amounted to “‘eating an icon’ … Although they are pests they are still human nature … these are native animals, people and I believe that is a form of cannibalism!” (Ampt and Owen 26).
Figure 1: Rather than promoting the eating of kangaroos, the portrayal of kangaroos as a modern suburban family in the Kangaroo Kookery Book (1932) made it unthinkable. (Source: Kangaroo Kookery Book, Director of Australian Trade Publicity, Australia House, London, 1932.)
The third layer of ethical objection on the ground of animal welfare is more specific, being directed to the method of killing the kangaroos which become food. Kangaroos are perhaps the only native animals for which state governments set quotas for commercial harvest, on the grounds that they compete with livestock for pasturage and water. In most jurisdictions, commercially harvested kangaroo carcasses can be processed for human consumption, and they are the ones which ultimately appear in supermarket display cases.
Kangaroos are killed by professional shooters at night, using swivelling spotlights mounted on their vehicles to locate and daze the animals. While clean head shots are the ideal and regulations state that animals should be killed when at rest and without causing “undue agonal struggle”, this is not always achieved and some animals do suffer prolonged deaths (NSW Code of Practice for Kangaroo Meat for Human Consumption). By regulation, the young of any female kangaroo must be killed along with her. While averting a slow death by neglect, this is considered cruel and wasteful. The hunt has drawn international criticism, including from Greenpeace, which organised campaigns against the sale of kangaroo meat in Europe in the 1980s, and Viva!, which was successful in securing the withdrawal of kangaroo from sale in British supermarkets (“Kangaroo Meat Sales Criticised”). These arguments circulate and influence opinion within Australia.
A final animal ethics issue is that what is actually behind the push for greater use of kangaroo meat is not concern for the environment or animal welfare but the quest to turn a profit from these animals. The Kangaroo Industries Association of Australia, formed in 1970 to represent those who dealt in the marsupials’ meat, fur, and skins, has been a vocal advocate of eating kangaroo and a sponsor of market research into how it can be made more appealing to the market. The Association argued in 1971 that commercial harvest was part of the intelligent conservation of the kangaroo. They sought minimum size regulations to prevent overharvesting and protect their livelihoods (“Assn. Backs Kangaroo Conservation”). 
The Association’s current website makes the claim that wild harvested “Australian kangaroo meat is among the healthiest, tastiest and most sustainable red meats in the world” (Kangaroo Industries Association of Australia). That this is intended to initiate a new and less controlled branch of the meat industry for the benefit of hunters and processors, rather than foster a shift from sheep or cattle to kangaroos which might serve farmers and the environment, is the opinion of Dr. Louise Boronyak, of the Centre for Compassionate Conservation at the University of Technology Sydney (Boyle 19).
Concerns such as these have meant that kangaroo is most consumed where it is least familiar, with most of the meat for human consumption recovered from culled animals being exported to Europe and Asia. Russia has been the largest export market. There, kangaroo meat is made less strange by blending it with other meats and traditional spices to make processed meats, avoiding objections to its appearance and uncertainty around preparation. With the kangaroo having only a low profile as a novelty animal in Russia, there are fewer sentimental concerns about consuming it, although the additional food miles undermine its environmental credentials. The variable acceptability of kangaroo in more distant markets speaks to the role of culture in determining how patterns of eating are formed and can be shifted, or, as Elspeth Probyn phrased it, “how natural entities are transformed into commodities within a context of globalisation and local communities”, underlining the impossibility of any straightforward ethics of eating kangaroo (33, 35).
Kangatarianism is a neologism which makes the eating of kangaroo meat something it has not been in the past: a voluntary restriction based on environmental ethics. These environmental benefits are well founded, and eating kangaroo can be understood as an Anthropocenic bargain struck to allow the continuation of the consumption of red meat while reducing one’s environmental footprint. Although the bargain is superficially attractive, the numbers entering into it remain small because environmental ethics cannot be disentangled from animal ethics. The anthropomorphising of the kangaroo and its use as a national symbol coexist with its categorisation as a pest and the use of its meat as food for companion animals. Both understandings of kangaroos make their meat uneatable for many Australians. Paired with concerns over how kangaroos are killed and the commercialisation of a native species, kangaroo meat has a very mixed reception despite decades of advocacy for eating it in place of the meat of more harmed and more harmful introduced species. Given these constraints, kangatarianism is unlikely to become widespread, and indeed it should be viewed as at best a temporary expedient. As the climate warms and rainfall becomes more erratic, even animals which have evolved to suit Australian conditions will come under increasing pressure, and humans will need to reach Kübler-Ross’ final state of grief: acceptance. In this case, this would mean acceptance that our needs cannot be placed ahead of those of other animals.
References
Ampt, Peter, and Kate Owen. Consumer Attitudes to Kangaroo Meat Products. Canberra: Rural Industries Research and Development Corporation, 2008.
Appleby, Bryce. “Skippy the ‘Green’ Kangaroo: Identifying Resistances to Eating Kangaroo in the Home in a Context of Climate Change.” BSc Hons, U of Wollongong, 2010 <http://ro.uow.edu.au/thsci/103>.
Archer, Michael. “Zoology on the Table: Plenary Session 4.” Australian Zoologist 39.1 (2017): 154–60.
“Assn. Backs Kangaroo Conservation.” The Beverley Times 26 Feb. 1971: 3. 22 Feb. 2019 <http://nla.gov.au/nla.news-article202738733>.
Barone, Tayissa. “Kangatarians Jump the Divide.” Sydney Morning Herald 9 Feb. 2010. 13 Apr. 2019 <https://www.smh.com.au/lifestyle/kangatarians-jump-the-divide-20100209-gdtvd8.html>.
Black, Rosemary. “Some Australians Angry over Idea for Kangaroo and Emu-Flavored Potato Chips.” New York Daily News 4 Dec. 2009. 5 Feb. 2019 <https://www.nydailynews.com/life-style/eats/australians-angry-idea-kangaroo-emu-flavored-potato-chips-article-1.431865>.
Boyle, Rhianna. “Eating Skippy.” Big Issue Australia 578 (11–24 Jan. 2019): 16–19.
Cawthorn, Donna-Mareè, and Louwrens C. Hoffman. “Controversial Cuisine: A Global Account of the Demand, Supply and Acceptance of ‘Unconventional’ and ‘Exotic’ Meats.” Meat Science 120 (2016): 26–7.
Conservation of Kangaroos. Melbourne: Australian Conservation Foundation, 1970.
Des Purtell and Associates. Improving Consumer Perceptions of Kangaroo Products: A Survey and Report. Canberra: Rural Industries Research and Development Corporation, 1997.
Ellicott, John. “Little Pay Incentive for Shooters to Join Kangaroo Meat Industry.” The Land 15 Mar. 2018. 28 Mar. 2019 <https://www.theland.com.au/story/5285265/top-roo-shooter-says-harvesting-is-a-low-paid-job/>.
Gammage, Bill. The Biggest Estate on Earth: How Aborigines Made Australia. Sydney: Allen and Unwin, 2012.
Garnaut, Ross. Garnaut Climate Change Review. 2008. 26 Feb. 2019 <http://www.garnautreview.org.au/index.htm>.
Golder, Hilary, and Diane Kirkby. “Mrs. Mayne and Her Boxing Kangaroo: A Married Woman Tests Her Property Rights in Colonial New South Wales.” Law and History Review 21.3 (2003): 585–605.
Grant, Elisabeth. “Sustainable Kangaroo Harvesting: Perceptions and Consumption of Kangaroo Meat among University Students in New South Wales.” Independent Study Project (ISP). U of NSW, 2014. <https://digitalcollections.sit.edu/isp_collection/1755>.
Haslam, Nick. “The Five Stages of Grief Don’t Come in Fixed Steps – Everyone Feels Differently.” The Conversation 22 Oct. 2018. 28 Mar. 2019 <https://theconversation.com/the-five-stages-of-grief-dont-come-in-fixed-steps-everyone-feels-differently-96111>.
Head, Lesley. “The Anthropoceans.” Geographical Research 53.3 (2015): 313–20.
Jackson, Stephen, and Karl Vernes. Kangaroo: Portrait of an Extraordinary Marsupial. Sydney: Allen and Unwin, 2010.
Kangaroo Industries Association of Australia. Kangaroo Meat. 26 Feb. 2019 <http://www.kangarooindustry.com/products/meat/>.
“Kangaroo Meat Sales Criticised.” The Canberra Times 13 Sep. 1984: 14. 22 Feb. 2019 <http://nla.gov.au/nla.news-article136915919>.
“Kangaroo to Be Food ‘Salesman.’” Newcastle Morning Herald and Miners’ Advocate 2 Dec. 1954. 22 Feb. 2019 <http://nla.gov.au/nla.news-article134089767>.
Kübler-Ross, Elisabeth. On Death and Dying: What the Dying Have to Teach Doctors, Nurses, Clergy, and Their Own Families. New York: Touchstone, 1997.
Lappé, Frances Moore. Diet for a Small Planet. New York: Ballantine Books, 1971.
N’Tash Aha (@Nsvasey). “‘I’m a Kangatarian’ isn’t a Pickup Line, Mate. #LoveIslandAU.” Twitter post. 27 May 2018. 5 Apr. 2019 <https://twitter.com/Nsvasey/status/1000697124122644480>.
“NSW Code of Practice for Kangaroo Meat for Human Consumption.” Government Gazette of the State of New South Wales 24 Mar. 1993. 22 Feb. 2019 <http://nla.gov.au/nla.news-page14638033>.
Oxford University Press, Australia and New Zealand. Word of the Month. June 2017. <https://www.oup.com.au/dictionaries/word-of-the-month>.
Pascoe, Bruce. Dark Emu, Black Seeds: Agriculture or Accident? Broome: Magabala Books, 2014.
Probyn, Elspeth. “Eating Roo: Of Things That Become Food.” New Formations 74.1 (2011): 33–45.
Steinfeld, Henning, Pierre Gerber, Tom Wassenaar, Vincent Castel, Mauricio Rosales, and Cees de Haan. Livestock’s Long Shadow: Environmental Issues and Options. Rome: Food and Agriculture Organisation of the United Nations, 2006.
Trust Nature. Essence of Kangaroo Capsules. 26 Feb. 2019 <http://ncpro.com.au/products/all-products/item/88139-essence-of-kangaroo-35000>.
Victoria Department of Environment, Land, Water and Planning. Kangaroo Pet Food Trial. 28 Mar. 2019 <https://www.wildlife.vic.gov.au/managing-wildlife/wildlife-management-and-control-authorisations/kangaroo-pet-food-trial>.
Willett, Walter, et al. “Food in the Anthropocene: The EAT–Lancet Commission on Healthy Diets from Sustainable Food Systems.” The Lancet 16 Jan. 2019. 26 Feb. 2019 <https://www.thelancet.com/commissions/EAT>.
Wilson, George. “Kangaroos Can Be an Asset Rather than a Pest.” Australasian Science 39.1 (2018): 39.
Zukerman, Wendy. “Eating Skippy: The Future of Kangaroo Meat.” New Scientist 208.2781 (2010): 42–5.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Great Britain. Government Code and Cypher School"

1

Secret days: Code-breaking in Bletchley Park. London: Frontline Books, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Smith, Michael. The emperor's codes: The breaking of Japan's secret ciphers. New York: Arcade Pub., 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

The emperor's codes: The breaking of Japan's secret ciphers. New York: Arcade Pub., 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

The emperor's codes: Bletchley Park and the breaking of Japan's secret ciphers. London: Bantam, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Amber Shadows. Thorndike Press, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

The amber shadows. Pegasus Books, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ribchester, Lucy, and Lucy Scott. Amber Shadows. Oakhill Publishing (CD), 2016.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Amber Shadows: A Novel. Pegasus Books, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Ribchester, Lucy. Amber Shadows. Simon & Schuster, Limited, 2016.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ribchester, Lucy. Amber Shadows: A Novel. Pegasus Books, 2017.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Great Britain. Government Code and Cypher School"

1

Kornicki, Peter. "Japan Must Fight Britain." In Eavesdropping on the Emperor, 1–20. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780197602805.003.0001.

Full text
Abstract:
After the Anglo-Japanese Alliance came to an end in 1923, and especially in the 1930s, relations between Britain and Japan gradually worsened. This had been predicted in Britain, privately by Lt.-Gen. Sir Ian Hamilton and publicly by Hector Bywater, and in Japan publicly by Ishimaru Tōta, whose books were translated into English. Although the War Office made no linguistic preparations for war, GC&CS (the Government Code & Cypher School) had begun working on Japanese naval codes in the 1920s and for this purpose hired former members of the British consular service in Japan, who had a good knowledge of Japanese, along with Eric Nave, a brilliant Australian linguist and cryptographer working for the Royal Australian Navy. The outbreak of war in Europe in 1939 created a need for linguists to work as censors, and this brought the famous translator Arthur Waley and a retired naval captain with a good knowledge of Japanese, Oswald Tuck, back to work.
APA, Harvard, Vancouver, ISO, and other styles
2

Greenberg, Joel. "The Enigma machine." In The Turing Guide. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198747826.003.0018.

Full text
Abstract:
Shortly after the end of the First World War, the German Navy learned that its encrypted communications had been read throughout the hostilities by both Britain and Russia. The German military realized that its approach to cipher security required a fundamental overhaul, and from 1926 different branches of the military began to adopt the encryption machine known as Enigma. By the start of the Second World War a series of modifications to military Enigma had made the machine yet more secure, and Enigma was at the centre of a remarkably effective military communications system. It would take some of the best minds in Britain—and before that, in Poland—to crack German military Enigma. The exact origins of the encryption machine that played such an important role in the Second World War are not entirely clear. In the early 1920s patent applications for a wheel-based cipher machine were filed by a Dutch inventor, Hugo Koch, as well as by a German engineer, Arthur Scherbius. In 1923, a company called Chiffriermaschinen AG exhibited a heavy and bulky encryption machine at the International Postal Congress in Bern, Switzerland. This machine had a standard typewriter keyboard for input, and its design followed Scherbius’s original patent closely. Scherbius had named his machine ‘Enigma’, and this ‘Model A’ was the first of a long line of models to emerge. Models B, C, and D soon followed, and by 1927 Model D was selling widely for commercial use. A number of governments purchased Enigma machines in order to study them, and Edward Travis—the deputy head of Britain’s signals intelligence unit, the Government Code and Cypher School—bought one on behalf of the British government in the mid-1920s. In 1925, the German Navy decided to put Enigma into use the following year, despite having rejected one of Scherbius’s previous encryption mechanisms in 1918. Meanwhile, the German Army began to redesign Enigma, with the intention of strengthening its security. By 1928, Model G was in use, and in June 1930 Model I (Eins) became the standard version, deployed first by the army, then by the navy in October 1934, and by the air force in August 1935.
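To illustrate the rotor principle behind the “wheel-based cipher machine” described in this abstract, here is a minimal single-rotor sketch in Python. It is a hypothetical illustration under stated assumptions, not a reconstruction of any historical machine: real Enigma added multiple rotors, a reflector, and a plugboard, and stepped its rotors before rather than after each key press. The wiring string used is the commonly published wiring of Enigma I’s Rotor I, serving here only as a convenient permutation of the alphabet.

```python
import string

ALPHABET = string.ascii_uppercase

# Commonly published wiring of Enigma I's Rotor I, used here simply as
# an example permutation of A-Z; any permutation would do for this sketch.
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"

def encipher(text: str, wiring: str = WIRING) -> str:
    """Toy single-rotor cipher: the wheel advances one step after each
    letter, so the substitution alphabet changes with every key press."""
    rotor = dict(zip(ALPHABET, wiring))
    out = []
    offset = 0  # current rotational position of the wheel
    for ch in text.upper():
        if ch not in ALPHABET:
            continue  # skip spaces and punctuation
        # Rotate the input letter into the wheel's frame, substitute
        # through the fixed wiring, then rotate back out.
        entered = ALPHABET[(ALPHABET.index(ch) + offset) % 26]
        exited = rotor[entered]
        out.append(ALPHABET[(ALPHABET.index(exited) - offset) % 26])
        offset = (offset + 1) % 26  # the wheel steps after each letter
    return "".join(out)

# Repeated plaintext letters no longer map to a single ciphertext letter,
# which is what defeats simple frequency analysis.
print(encipher("AAAAAA"))  # prints EJKCHB, six different letters for six A's
```

Enigma compounded this idea by gearing three or more such wheels together, so that the combined substitution did not repeat for many thousands of key presses, which is what made the machine appear secure to its operators.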
APA, Harvard, Vancouver, ISO, and other styles
