Academic literature on the topic 'Reasons All Can Accept'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Reasons All Can Accept.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Reasons All Can Accept"

1

Bohman, James, and Henry S. Richardson. "Liberalism, Deliberative Democracy, and “Reasons that All Can Accept”." Journal of Political Philosophy 17, no. 3 (September 2009): 253–74. http://dx.doi.org/10.1111/j.1467-9760.2008.00330.x.

2

Steglich-Petersen, Asbjørn, and Mattias Skipper. "An Instrumentalist Account of How to Weigh Epistemic and Practical Reasons for Belief." Mind 129, no. 516 (October 1, 2019): 1071–94. http://dx.doi.org/10.1093/mind/fzz062.

Abstract:
When one has both epistemic and practical reasons for or against some belief, how do these reasons combine into an all-things-considered reason for or against that belief? The question might seem to presuppose the existence of practical reasons for belief. But we can rid the question of this presupposition. Once we do, a highly general ‘Combinatorial Problem’ emerges. The problem has been thought to be intractable due to certain differences in the combinatorial properties of epistemic and practical reasons. Here we bring good news: if we accept an independently motivated version of epistemic instrumentalism—the view that epistemic reasons are a species of instrumental reasons—we can reduce The Combinatorial Problem to the relatively benign problem of how to weigh different instrumental reasons against each other. As an added benefit, the instrumentalist account can explain the apparent intractability of The Combinatorial Problem in terms of a common tendency to think and talk about epistemic reasons in an elliptical manner.
3

Lafont, Cristina. "The Priority of Public Reasons and Religious Forms of Life in Constitutional Democracies." European Journal for Philosophy of Religion 11, no. 4 (December 20, 2019): 45. http://dx.doi.org/10.24204/ejpr.v11i4.3036.

Abstract:
In this essay I address the difficult question of how citizens with conflicting religious and secular views can fulfill the democratic obligation of justifying the imposition of coercive policies to others with reasons that they can also accept. After discussing the difficulties of proposals that either exclude religious beliefs from public deliberation or include them without any restrictions, I argue instead for a policy of mutual accountability that imposes the same deliberative rights and obligations on all democratic citizens. The main advantage of this proposal is that it recognizes the right of all democratic citizens to adopt their own cognitive stance (whether religious or secular) in political deliberation in the public sphere without giving up on the democratic obligation to provide reasons acceptable to everyone to justify coercive policies with which all citizens must comply.
4

Bespalov, Andrei. "Religious Faith and the Fallibility of Public Reasons." Oxford Journal of Law and Religion 8, no. 2 (June 1, 2019): 223–46. http://dx.doi.org/10.1093/ojlr/rwz014.

Abstract:
Rawlsian liberals define legitimacy in terms of the public justification principle (PJP): the exercise of political power is legitimate only if it is justified on the grounds of reasons that all may reasonably be expected to accept. Does PJP exclude religious reasons from public justification of legal provisions? I argue that the requirement of ‘reasonable acceptability’ is not clear enough to answer this question. Furthermore, it fails to address the problematic fact that justification on the grounds of religious faith involves non-negotiable claims, which is incompatible with respect for fellow citizens as co-legislators. Accordingly, I reformulate PJP in fallibilistic terms: the exercise of political power is legitimate only if it is justified on the grounds of reasons that can be subject to reasonable criticism. I show that reasons based on religious faith do not meet this principle, just like any other reasons that involve claims about final values.
5

Cornell, David M. "Salvaging Truth from Ontological Scrap." Philosophy 96, no. 3 (February 17, 2021): 433–55. http://dx.doi.org/10.1017/s0031819121000048.

Abstract:
What should one do when one's philosophical conclusions run counter to common sense? Bow to the might of ordinary opinion or follow the indiscriminate force of philosophical reason, no matter where it leads? A few strategies have recently been proposed which suggest we needn't have to make this difficult choice at all. According to these views, we can accept the truths of common sense whilst simultaneously endorsing philosophical views with which they seem to conflict. We can, for instance, accept it as true that the Taj Mahal is in India, whilst also eliminating the Taj Mahal from our ontology. I argue that these strategies generate a new conflict with common sense and thus undercut one of the central motivations that drives them. I also argue for the stronger claim that these kinds of ‘truth-salvaging’ strategy are incapable in principle of reconciling theory with common sense. This does not mean that they must be abandoned, for there may be good independent reasons for endorsing them, but it does eliminate one of their most promising advantages. The upshot of the paper will be two-fold. First, one of the major motivations for endorsing these kinds of strategy will be severely undermined. Secondly, and perhaps more significantly, it will mean that for those who think philosophy should be strictly constrained by common sense, all radical ontological views will effectively be off the table.
6

Komine, Atsushi. "Beveridge and his pursuit of an ideal economics: why did he come to accept Keynes’s ideas?" International Journal of Social Economics 43, no. 9 (September 12, 2016): 917–30. http://dx.doi.org/10.1108/ijse-06-2015-0149.

Abstract:
Purpose: The purpose of this paper is to examine two (accidental and inevitable) reasons why W.H. Beveridge, who in 1936/1937 had rejected all of the elements of Keynes’s General Theory, came to accept it enthusiastically in the 1940s. Design/methodology/approach: The paper answers this question in three steps. First, it distinguishes apparently changeable factors in Beveridge’s views from consistent ones. Second, it looks for factors of the latter type in his three goals for economics. Third, it compares his goals with those of Keynes. Findings: Beveridge’s three goals overlapped with Keynes’s ideals for economics and economists, and this is not historically accidental: economics should be useful as a basis for verification by fresh observations (as an exact science); economic knowledge should be useful in business and policy-making processes (for new kinds of educated professions); and economic studies require a wide range of related subjects (a liberal education). Research limitations/implications: This paper attempts to clarify the cognitive assumptions of the two economists. This clarification can contribute to understanding the process and reasons behind Beveridge’s acceptance of Keynesian economic theory and policies on a theoretical level. Originality/value: This paper examines previously ignored reasons for Beveridge’s acceptance of Keynesian economics. Moreover, it suggests certain pre-analytic assumptions concerning the co-existence of social insurance and full employment policies. This perspective will be useful for historians of economics and the welfare state.
7

Chawla, Raina, Rashmi Ahuja, and Priyanka Sharma. "Awareness of post partum intra uterine contraceptive device and reasons for its low acceptance in an urban Indian population." International Journal of Research in Medical Sciences 8, no. 2 (January 27, 2020): 701. http://dx.doi.org/10.18203/2320-6012.ijrms20200260.

Abstract:
Background: The safety and efficacy of the Post-Partum Intra Uterine Contraceptive Device (PPIUCD) has been documented worldwide. With increasing institutional deliveries and greater sensitization, the aim is to increase PPIUCD insertions. Many areas still report poor acceptance. The objectives of this study were to determine the proportion of antenatal women willing to accept PPIUCD insertion and the reasons behind refusal to accept this method. Methods: A prospective questionnaire study of 200 women was done between January 2019 and June 2019. Inclusion criteria were antenatal women in the 2nd/3rd trimester. Exclusion criteria were those opting for a permanent method of contraception and those with a contra-indication. Results: Eighty-four women (42%) had never used any method of contraception. Earlier intrauterine device (IUD) use (including both interval and PPIUCD) was reported by only 18.9% of all contraceptive users. Only 2 women in the group had ever used PPIUCD. 79% of women were aware of IUDs. Those unaware were mainly nulliparous. Amongst those aware of an IUD, 88 (56%) were aware it could be inserted postpartum. Only 18% were aware it could be inserted intra-cesarean. All women who participated were offered the option of a PPIUCD. Fifty-nine (29.5%) of all women expressed their willingness, but on follow-up till delivery only 18 of these women had a PPIUCD inserted. Amongst those not willing for PPIUCD insertion, the commonest reason was general apprehension (39%), followed by partner refusal (33%) and fear of complications (31%). Six women (4.2%) gave a history of complications following earlier use and were unwilling for its repeat use. Conclusion: The large unmet need for contraception in India can be addressed through repeated counselling and discussions with the woman during her antenatal visits. Alleviating apprehension and addressing the concerns of the couple will increase PPIUCD acceptance.
8

McCullagh, C. Behan. "The Truth of Basic Historical Descriptions." Journal of the Philosophy of History 9, no. 1 (March 27, 2015): 97–117. http://dx.doi.org/10.1163/18722636-12341293.

Abstract:
Most historians and many philosophers of history persist in believing that present evidence can warrant belief in the truth of descriptions of particular events in the past. In most of his books on historical knowledge and understanding Alun Munslow has expressed his faith in basic historical descriptions too. Recently, however, he has presented several reasons for doubting their truth. He sees all historical descriptions as nothing but literary creations, reflecting not only the language but also the beliefs and conventions of the historian’s culture. He can find no meaningful relation between texts and events, especially between historical texts and past events that are beyond observation. He allows that we often accept the truth of historical descriptions for everyday purposes, but he offers philosophical reasons for denying that they have any intelligible relation to the past. In this paper I consider the reasons for his scepticism, discuss several popular theories of truth, and then explain why, and in what sense, we are often justified in believing that historical descriptions give us a true account of what happened in the past.
9

van der Weele, Gerda M., Roos de Jong, Margot W. M. de Waal, Philip Spinhoven, Herman A. H. Rooze, Ria Reis, Willem J. J. Assendelft, Jacobijn Gussekloo, and Roos C. van der Mast. "Response to an unsolicited intervention offer to persons aged ≥ 75 years after screening positive for depressive symptoms: a qualitative study." International Psychogeriatrics 24, no. 2 (August 16, 2011): 270–77. http://dx.doi.org/10.1017/s1041610211001530.

Abstract:
Background: Screening can increase detection of clinically relevant depressive symptoms, but screen-positive persons are not necessarily willing to accept a subsequent unsolicited treatment offer. Our objective was to explore limiting and motivating factors in accepting an offer to join a “coping with depression” course, and perceived needs among persons aged ≥75 years who screened positive for depressive symptoms in general practice. Methods: In a randomized controlled trial, in which 101 persons who had screened positive for depressive symptoms were offered a “coping with depression” course, a sample of 23 persons were interviewed, of whom five (22%) accepted the treatment offer. Interview transcripts were coded independently by two researchers. Results: All five individuals who accepted a place on the course felt depressed and/or lonely and had positive expectations about the course. The main reasons for declining to join the course were: not feeling depressed, having negative thoughts about the course effect, concerns about group participation, or concerns about being too old to change and learn new things. Although perceived needs to relieve depressive symptoms largely matched the elements of the course, most of those who had been screened were not (yet) prepared to accept an intervention offer. Many expressed the need to discuss this treatment decision with their general practitioner. Conclusions: Although the unsolicited treatment offer closely matched the perceived needs of people screening positive for depressive symptoms, only those who combined feelings of being depressed or lonely with positive expectations about the offered course accepted it. Treatment should perhaps be more individually tailored to the patient's motivational stage towards change, a process in which general practitioners can play an important role.
10

Olschewski, M., and H. Scheurlen. "Comprehensive Cohort Study: An Alternative to Randomized Consent Design in a Breast Preservation Trial." Methods of Information in Medicine 24, no. 03 (July 1985): 131–34. http://dx.doi.org/10.1055/s-0038-1635365.

Abstract:
If in a clinical trial only a few patients consent to randomization, all those patients meeting the clinical inclusion criteria should be considered for analysis. Prerandomization is shown to be efficient only if patients can be persuaded to accept the prescribed therapy and do not introduce a self-selection bias. Under full informed consent conditions, as, for example, in a breast preservation trial, the first requirement must be questioned for ethical, the second one for methodological reasons. Conventionally randomized trials with the inclusion of patients having therapeutic preferences appear to be preferable. We briefly discuss the way of analysing the data from such a cohort study.

Dissertations / Theses on the topic "Reasons All Can Accept"

1

Tonkin, Ryan. "Public reasons or public justification: conceptualizing “can” and the elimination of exclusion in politics." Thesis, 2011. http://hdl.handle.net/1828/3445.

Abstract:
In this essay, I aim to elucidate a concept of public justification. I outline several challenges faced by political philosophers, including a desire to secure stability and treat people respectfully against a background of reasonable pluralism. I suggest that John Rawls' account of public reason provides a helpful starting point for accomplishing these goals. But critics have been both persistent and persuasive in their objections to public reason's central element of reasons all can accept. I explicate three dominant criticisms: incomprehensibility, attenuation and exclusion. First, some critics have argued that the very idea of reasons all can accept cannot be plausibly articulated. Second, critics maintain that the set of reasons all can accept is insufficiently robust to solve constitutional essentials and matters of basic justice. Third, critics note that if public justification is constrained by reasons all can accept, then many informative and effective arguments must be excluded from the public sphere. In response to these criticisms, I argue for an interpretation of reasons all can accept which is sensitive to critics' reasonable demand for an explicit account of each element of the doctrine. My interpretation demonstrates the superfluity of what I call the sharability constraint—the thesis that only reasons acceptable to all can function as justifications in the public sphere. Once the sharability constraint is rejected, I argue that the problem of exclusion dissipates, but that substantive restrictions on acceptable reasons are still possible. I am optimistic that this approach is less attenuating than one constrained by sharability and that, at least under favourable empirical conditions, more problems can be resolved by this approach than by standard Rawlsian theory. I draw on actual convergence in the international realm to bolster this optimism. Finally, I relate this approach to the widespread influence of deliberative democracy. I argue that procedural apparatuses are insufficient for political legitimacy, but that deliberation may be an invaluable tool for uncovering reasons required by substantive justification.

Books on the topic "Reasons All Can Accept"

1

Omodeo, Pietro Daniel. Amerigo Vespucci: The Historical Context of His Explorations and Scientific Contribution. Venice: Fondazione Università Ca’ Foscari, 2020. http://dx.doi.org/10.30687/978-88-6969-402-8.

Abstract:
This book offers a new reconstruction of Amerigo Vespucci’s navigational and scientific endeavours in their historical context. The author argues that all of the manuscripts or texts that Vespucci left to posterity are reliable and true, except for several amendments imposed upon him for reasons linked to the political and economic interests of those who authorised him to undertake his journeys or which were the result of relationships with his companions. The earliest genuine documentation, which dates from the late fifteenth century or early sixteenth century, confirms this position. Fortunately, careful philological studies of Vespucci’s principal written works are available, while some of his original drawings, which confirm, clarify and enrich what he narrated in his letters, can be identified in Waldseemüller’s large map known as Universalis cosmographia (1507).
2

Mattelaer, Johan. For this Relief, Much Thanks ... Translated by Ian Connerty. Amsterdam: Amsterdam University Press, 2018. http://dx.doi.org/10.5117/9789462987326.

Abstract:
Even though peeing is something we all do several times a day, it is still a taboo subject. From an early age, we are taught to master our urinary urges and to use decent words for this most necessary physiological activity. This paradox has not gone unnoticed by artists through the ages. For this Relief, Much Thanks! Peeing in Art is a journey through time and space, stopping along the way to look at many different art forms. The reader-viewer will see how peeing figures - men and women, young and old, human and angelic - have been depicted over the centuries. You will be amazed to discover how often, even in famous works of art, you can find a man quietly peeing in a corner or a putto who is 'irrigating' some grassy field. A detail you will never have seen before, but one that you will never forget when confronted with those same art works in future! Artists have portrayed pee-ers in a variety of different ways and for a variety of different reasons: serious, frivolous, humorous, to make a protest, to make a statement... Whatever their purpose, these works of art always intrigue, not least because of their secret messages and symbolic references, which sometimes can only be unravelled by an expert - like the author of this book. The extensive background information about the artists and their work also gives interesting insights into the often complex origins of the different art forms. In short, a fascinating voyage of discovery awaits you!
3

Hartley, Christie, and Lori Watson. Equal Citizenship and Public Reason. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190683023.001.0001.

Abstract:
This book is a defense of political liberalism as a feminist liberalism. The first half of the book develops and defends a novel interpretation of political liberalism. It is argued that political liberals should accept a restrictive account of public reason and that political liberals’ account of public justification is superior to the leading alternative, the convergence account of public justification. In the second half of the book, it is argued that political liberalism’s core commitments restrict all reasonable conceptions of justice to those that secure genuine, substantive equality for women and other marginalized groups. Here it is demonstrated how public reason arguments can be used to support law and policy needed to address historical sites of women’s subordination to advance equality; prostitution, the gendered division of labor and marriage, in particular, are considered.
4

Foster, Charles. On Being Not Depressed. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198801900.003.0003.

Abstract:
This chapter is written from the perspective of someone who claims never to have suffered from depression. When asked if he has ever been depressed, the author of this chapter reports that he responds vaguely. He states he can be evasive and pompous, insisting that he does not think he satisfies all the clinical criteria. The reason why he is reluctant to accept the notion that he is depressed is not because he feels ashamed about being depressive. On the contrary, he feels that he does not belong to the community of the depressed—an elite club with a black, glorious fellowship of agony in which he cannot share. Another reason is that depression and its symptoms are impossible to describe, even if the will to describe them is intense. No metaphors or similes are sufficient to describe what happens. The author says he is better off with the unnamed and unnamable. He concludes by suggesting that the least unsatisfactory picture is of auto-immune disease: self-consumption.
5

Bielefeldt, Heiner, Nazila Ghanea, and Michael Wiener. Part 3 Vulnerable Groups, 3.2 Persons Deprived of Their Liberty. Oxford University Press, 2016. http://dx.doi.org/10.1093/law/9780198703983.003.0020.

Abstract:
This chapter addresses the right to freedom of religion or belief, which all detainees should enjoy regardless of the reasons of their detention. Freedom of religion or belief can be deeply significant for detainees, since it can offer them comfort, rehabilitation, and hope at a time when they are experiencing a paucity of social interaction. The chapter highlights the positive duties upon the State in relation to detention due to the heightened risk of religious violations such as indoctrination, forced conversion or involuntary access to prison chaplains. Moreover, imprisonment imposes particular risks on limitations being imposed on manifestations of religion or belief such as fasting, access to religious materials, diet, clothing, and headdress.
6

Marboe, Irmgard. 1 Introduction. Oxford University Press, 2017. http://dx.doi.org/10.1093/law/9780198749936.003.0001.

Abstract:
The calculation of compensation and damages plays an important role in international investment arbitration. Claimants are usually interested above all with the question of how much they can expect to receive after a possibly long-lasting legal procedure. In addition, general preventive reasons should also be kept in mind. It follows that the financial assessment of the legal claims should as closely as possible reflect economic realities. This means that generally accepted valuation standards and approaches should be applied which so far has not been self-evident in international legal disputes.
7

Dooley, Brendan, ed. The Continued Exercise of Reason. The MIT Press, 2018. http://dx.doi.org/10.7551/mitpress/9780262535007.001.0001.

Abstract:
George Boole (1815–1864), remembered by history as the developer of an eponymous form of algebraic logic, can be considered a pioneer of the information age not only because of the application of Boolean logic to the design of switching circuits but also because of his contributions to the mass distribution of knowledge. In the classroom and the lecture hall, Boole interpreted recent discoveries and debates in a wide range of fields for a general audience. This collection of lectures, many never before published, offers insights into the early thinking of an innovative mathematician and intellectual polymath. Bertrand Russell claimed that “pure mathematics was discovered by Boole,” but before Boole joined a university faculty as professor of mathematics in 1849, advocacy for science and education occupied much of his time. He was deeply committed to the Victorian ideals of social improvement and cooperation, arguing that “the continued exercise of reason” joined all disciplines in a common endeavor. In these talks, Boole discusses the genius of Isaac Newton; ancient mythologies and forms of worship; the possibility of other inhabited planets in the universe; the virtues of free and open access to knowledge; the benefits of leisure; the quality of education; the origin of scientific knowledge; and the fellowship of intellectual culture. The lectures are accompanied by a substantive introduction that supplies biographical and historical context.
8

Shaviro, Daniel N. Economics of Tax Law. Edited by Francesco Parisi. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199684250.013.020.

Abstract:
This chapter considers the question of how tax law can be designed with an eye to maximizing economic efficiency. From the standpoint of efficiency, no lump-sum tax is better than any other—by definition all succeed equally in avoiding the creation of deadweight loss. This leads directly to two main questions. First, why are lump-sum taxes, or instruments that come as close to them as possible, so absent, not just in actual practice but even in theoretical debate about tax policy? The answer turns on the importance of distributional issues. Second, how do considerations of efficiency operate once we have accepted, for distributional reasons, the need for tax instruments that have the unfortunate side effect of discouraging productive activity?
9

Stroud, Barry. Naturalism and Skepticism in the Philosophy of Hume. Edited by Paul Russell. Oxford University Press, 2014. http://dx.doi.org/10.1093/oxfordhb/9780199742844.013.003.

Abstract:
Hume takes his “naturalistic” study of human nature to show that certain general “principles of the imagination” can explain how human beings come to think, feel, believe, and act in all the ways they do independently of the truth or reasonableness of those responses. This appears to leave the reflective philosopher with no reason for assenting to what he has discovered he cannot help believing anyway. Relief from this unacceptably extreme skepticism is found in acknowledging and acquiescing in those forces of “nature” that inevitably overcome the apparent dictates of “reason” and return the philosopher to the responses and beliefs of everyday life. Living in full recognition of these forces and limitations is what Hume means by the “mitigated scepticism” he accepts.
10

Wedgwood, Ralph. The Pitfalls of ‘Reasons’. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780198802693.003.0005.

Abstract:
Many philosophers working on normative issues follow the ‘Reasons First’ program. According to this program, the concept of a ‘normative reason’ for an action or an attitude is the most fundamental normative concept, and all other normative and evaluative concepts can be defined in terms of this fundamental concept. This paper criticizes the foundational assumptions of this program. In fact, there are many different concepts that can be expressed by the term ‘reason’ in English. The best explanation of the data relating to these concepts is that they can all be defined in terms of explanatory concepts and other normative or evaluative notions: for example, in one sense, a ‘reason’ for you to go is a fact that helps to explain why you ought to go, or why it is good for you to go. This implies that none of the concepts expressed by ‘reason’ is fundamental.

Book chapters on the topic "Reasons All Can Accept"

1

Taslaman, Caner. "Can a Muslim be an Evolutionist?" In Abrahamic Reflections on Randomness and Providence, 107–17. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-75797-7_6.

Abstract:
I will argue that no claim about the emergence of life forms and humans can contradict Islam. Although the Quran clearly asserts that all species of life, including humans, are created by God, it does not reveal how God created. Since the Quran doesn’t teach how God created species, the Quran is compatible with evolution. Yet, although a Muslim can believe in evolution, I argue against the claim that a Muslim must accept evolution.
2

Hutto, Daniel D., and Erik Myin. "Reasons to REConceive." In Evolving Enactivism. The MIT Press, 2017. http://dx.doi.org/10.7551/mitpress/9780262036115.003.0002.

Abstract:
Chapter 2 introduces REC’s Equal Partner Principle, according to which invoking neural, bodily, and environmental factors all make equally important contributions when it comes to explaining cognitive activity. In line with that principle, it is made clear how REC can accept that cognitive capacities depend on structural changes that occur inside organisms and their brains, without understanding such changes in information processing and representationalist terms. This chapter explicates the Hard Problem of Content, aka the HPC, as basis for a compelling argument for REC. The HPC is a seemingly intractable theoretical puzzle for defenders of unrestricted CIC. A straight solution to the HPC requires explaining how it is possible to get from informational foundations that are noncontentful to a theory of mental content using only the resources of a respectable explanatory naturalism that calls on the resources of the hard sciences. It is revealed how the need to deal with the HPC can be avoided by adopting REC’s revolutionary take on basic cognition, and why going this way has advantages over other possible ways of handling the HPC.
3

Kogelmann, Brian, and Stephen G. W. Stich. "When Public Reason Falls Silent." In Oxford Studies in Political Philosophy Volume 7, 161–93. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780192897480.003.0006.

Full text
Abstract:
Public reason theorists argue that coercive state action must be justified to those subject to such action. Doing so requires citizens to give only those reasons that all can accept. These reasons, the chapter argues, include scientific and social scientific considerations. One ineliminable and arguably salutary property of the modern administrative state is that the coercive policies it produces can be justified only on the basis of extremely complex scientific and social scientific considerations. Many of these considerations are neither understood by most ordinary citizens nor agreed upon by experts. This means that the overwhelming majority of citizens do not accept the reasons justifying coercive administrative policies. As a result, public reason is inconsistent with the administrative state. There are deep implications to this result: if public reason is inconsistent with the administrative state, then it is also inconsistent with forms of social organization that presuppose it. This, the chapter argues, includes egalitarianism, which many proponents of public reason also endorse. Public reason theorists thus must choose: justification through public reason, or distributive equality?
APA, Harvard, Vancouver, ISO, and other styles
4

Hatzis, Nicholas. "Reasons for State Action." In Offensive Speech, Religion, and the Limits of the Law, 35–54. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198758440.003.0003.

Full text
Abstract:
The claim that the government ought to prohibit offensive speech is a form of practical reasoning. It tells us what an agent has reason to do under certain circumstances. The first part of the chapter explores in more detail the structure of that claim and the underlying idea that respect for religious feelings is a value which is realized when insults are censored. The second part explores the types of reasons which can be legitimately invoked to justify the exercise of state coercion. We expect restrictions of liberty to be based on reasons which all citizens can be expected to accept, regardless of their own view about what kind of life is worth living (i.e. a public reason requirement). The fact that an act is incompatible with the teachings of a religion is never an adequate reason for its prohibition. After discussing different versions of public reason theory, I suggest that even those which allow for some reliance on religious justifications cannot support the use of coercion against speakers who hurt their listeners’ religious feelings.
APA, Harvard, Vancouver, ISO, and other styles
5

King, Zoë Johnson. "We Can Have Our Buck and Pass It, Too." In Oxford Studies in Metaethics Volume 14, 167–88. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198841449.003.0008.

Full text
Abstract:
Chapter 8 argues against the view that the moral rightness of an act is not a reason to perform it, and our reasons are instead the features that make the act right. Philosophers typically defend this view by noting that it seems redundant to take rightness to be an additional reason, once it has been acknowledged that the right-making features are already reasons. The author shows that this argument dramatically overgeneralizes, ruling out all cases in which two or more reasons are arranged in relationships of metaphysical constitution. She then proposes an alternative way of thinking about these metaphysical hierarchies: Rather than assuming that at most one of the facts in each hierarchy is the “real” reason, bearing all the normative weight, it should be accepted that these facts can all be genuine reasons, whose normative weight is shared in virtue of the metaphysical relationships between them. Some tests are offered that can be used to determine which facts occur in metaphysical hierarchies with shared weight, and it is argued that the fact that an act is morally right passes the tests. The author then explains what she takes to be some kernels of truth underlying the redundancy argument, arguing that these phenomena are pragmatic, not metaphysical.
APA, Harvard, Vancouver, ISO, and other styles
6

Kekes, John. "Is There an Absolute Value that Overrides All Other Considerations?" In Hard Questions, 15–42. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190919986.003.0002.

Full text
Abstract:
One answer was given in 2 Maccabees by a venerable old man and his imagined younger alter ego. The old man was guided by what he took to be an absolute value and died rather than violate it. The young one accepted the same value, but when it came into conflict with other things he had reason to value, he did not think it was absolute. This chapter considers how the reasons for the certainty of the old man and the flexibility of the young one could be evaluated. It shows that although their reasons were derived from their different personal and social circumstances, the reasons of one were better than those of the other. Having their cases in front of us, we can consider whether we have or should have an absolute value that we would be ready to die for, if it becomes necessary. I give reasons why we may individually make a reasonable absolute commitment to a person, ideal, or cause without claiming that reason requires everyone to make that or any other absolute commitment.
APA, Harvard, Vancouver, ISO, and other styles
7

Rowland, Richard. "Reasons as The Unity among the Varieties of Goodness." In The Normative and the Evaluative, 76–100. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198833611.003.0005.

Full text
Abstract:
The Buck-Passing Account of Value (BPA) analyses goodness simpliciter in terms of reasons for pro-attitudes. The Value-First Account (VFA) analyses reasons for pro-attitudes in terms of value. And the No-Priority View (NPV) holds that neither reasons nor value can be analysed in terms of one another. This chapter argues that BPA should be accepted rather than VFA or NPV because if BPA is accepted, then what all the different varieties of goodness have in common can be explained: but if VFA or NPV is accepted, what the different varieties of goodness have in common cannot be explained. In making this argument this chapter motivates and defends accounts of goodness for (prudential value) and goodness of a kind (attributive goodness) in terms of reasons for pro-attitudes. It shows that the objections that have been made to buck-passing accounts of goodness for and goodness of a kind can be overcome and that there are many advantages to accepting such accounts.
APA, Harvard, Vancouver, ISO, and other styles
8

Obladen, Michael. "Cast aside." In Oxford Textbook of the Newborn, edited by Michael Obladen, 191–96. Oxford University Press, 2021. http://dx.doi.org/10.1093/med/9780198854807.003.0027.

Full text
Abstract:
Trisomy 21 originated with Homo sapiens, or even before, as it exists in other primates. However, in antiquity, Down’s syndrome was rare: mothers were younger, and children failed to reach adulthood. For centuries, trisomy 21 and hypothyroidism were confused. Scientific reports originated from asylums for the mentally retarded. In 1866, John Langdon Down at Earlswood published a description of symptoms in his ‘Ethnic classification of idiots’ and coined the term ‘Mongolian’. Jérôme Lejeune identified an additional chromosome 21 as causing the disorder. Maternal age rose markedly for various reasons, as did the prevalence of trisomy 21. From 1968, high-risk pregnancies were screened and interrupted because of Down’s syndrome. Non-invasive techniques now enable all pregnancies to be screened to detect chromosomal anomalies early and precisely. The topic is hotly debated and consensus unlikely. Legislation will not halt scientific progress, but it should ensure that in the same society contradictory attitudes can be held and mutually respected: the right to accept a disabled infant and the right not to accept it.
APA, Harvard, Vancouver, ISO, and other styles
9

Wood, William. "What Is Analytic Theology?" In Analytic Theology and the Academic Study of Religion, 48–78. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780198779872.003.0005.

Full text
Abstract:
Chapter 5 considers the still open question “What is Analytic Theology?” In dialogue with Timothy Pawl and William Hasker, I argue that analytic theology is a form of faith seeking understanding and a form of constructive theology. I then consider some efforts to push analytic theology into comparatively neglected areas, including topics related to social justice. I focus especially on Sameer Yadav’s call for analytic liberation theology. I conclude the chapter with a bit of additional reflection on why analytic theology is valuable. Christians should agree that it is good to try to answer rational objections to key Christian doctrines, and similarly good to try to give positive models for how to understand them in a way that coheres with other things we take ourselves to know. Those tasks are perennial, and analytic theology is one way that Christians today can pursue them fruitfully. At the same time, however, I also think that there are non-Christian reasons for valuing analytic theology, reasons that anyone might accept. Intellectual inquiry is good. Focused thinking about terrifically difficult problems is good. Watching very intelligent people think as deeply and carefully as they can about things that matter to them more than anything else—that too is good. Analytic theology displays all of these goods, something that anyone can recognize, even without accepting the underlying Christian framework.
APA, Harvard, Vancouver, ISO, and other styles
10

Gulmez, Hakan. "Detection of Chronic Disease in Primary Care Using Artificial Intelligence Techniques." In Advances in Healthcare Information Systems and Administration, 195–219. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-2581-4.ch009.

Full text
Abstract:
Chronic diseases are the leading causes of death and disability worldwide. By 2020, chronic diseases are expected to account for 73% of all deaths and 60% of the global burden of disease. For all these reasons, early diagnosis and treatment of chronic diseases is very important. Machine learning is an application of artificial intelligence that gives systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning is the development of computer programs that can access data and use it to learn for themselves. The learning process starts by searching for patterns in examples, experiences, or observations; based on these, the system makes faster and better decisions in the future. The primary purpose of machine learning is to allow computers to learn automatically, without human help or intervention. Considering all the reasons above, this chapter finds the most appropriate artificial intelligence technique for the early detection of chronic diseases.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Reasons All Can Accept"

1

Belhaj, Hadi, Mohamad Haroun, and Terry Lay. "Keeping Net Cash Flow Alive for a Petroleum Exploration Project: Risk Analysis Approach." In ASME 2010 International Mechanical Engineering Congress and Exposition. ASMEDC, 2010. http://dx.doi.org/10.1115/imece2010-37190.

Full text
Abstract:
Meaningful risk analysis can be a tedious task to perform for many reasons, yet extremely rewarding. Lack of information, uncertainty surrounding risk parameters and their distributions, failure to define proper correlations relating some risk parameters, inappropriate selection of the risk analysis criterion, and misinterpretation of results are among these reasons. Risking net cash flow (NCF) through traditional approaches can be a leap of faith. Rather, NCF should be treated with more subjectivity and in-depth understanding of all risk parameters and their interrelationships. Current practice of risk management in the petroleum industry adopts schemes that separate risk into two main categories to understand, simplify, analyze, and evaluate existing contingencies. Commonly, the first category is referred to as subsurface risk, which includes resource size, production rate, and access cost. The second category is surface risk, which comprises total expenditure, facilities delivery, delays, performance, oil/gas prices, etc. Risk analysis of each is normally performed alone. Our study shows that separating risks for an investment with a singular outcome is misleading and extremely dangerous. In this paper, we introduce comprehensive criteria for handling risk associated with oil and gas exploration as well as development of mature reservoirs through EOR and IOR that involves large cash expenditures for in-fill drilling, waterflooding, gas injection, and thermal and chemical treatment of heavy oil recovery. Uncertainty in one or a combination of these elements may create “business risk” that may cause “business impact”. The impact can be positive, leading to “business opportunity”, or negative, leading to “business threat”.
Also, instead of risking NCF using risk parameters like gross revenue, which consists of hydrocarbon in-place and the unit price of oil and gas, and net expenditure (CAPEX and OPEX) by simply defining their risk distributions and parameters, our approach breaks down each risk parameter into sub-parameters, then risk components, and finally risk fragments. This produces a break-down model of risk analysis that includes all parameters with no stage separation, avoiding the risk of poor assumptions. Hence, risk parameters are simplified by evaluating specific distributions. A case study involving one major Gulf States oil reservoir is used to demonstrate the approach presented in this paper. Results show great improvement compared to the traditional method.
APA, Harvard, Vancouver, ISO, and other styles
2

Kolomytsev, Alexander, and Yulia Pronyaeva Pronyaeva. "3D PETROPHYSICS FOR HAWE: CASE STUDIES." In 2021 SPWLA 62nd Annual Logging Symposium Online. Society of Petrophysicists and Well Log Analysts, 2021. http://dx.doi.org/10.30632/spwla-2021-0055.

Full text
Abstract:
Most conventional log interpretation techniques use the radial model, which was developed for vertical wells and works well in them. But applying this model to horizontal wells can result in false conclusions. The reasons for this are property changes in the vertical direction and the different depths of investigation (DOI) of logging tools. The DOI area can include a response from different layers with different properties. All of this complicates petrophysical modeling. The 3D approach for high-angle well evaluation (HAWE) is forward modeling in 3D. For this modeling, it is necessary to identify the geological concept near the horizontal well section using multiscale data. The accuracy of modeling depends on the detail of the accepted geological model, which is based on data from borehole images, logs, geosteering inversion, and seismic. 3D modeling can be applied to improve the accuracy of reservoir characterization, well placement, and completion. The radial model is often useless for HAWE because LWD tools have different DOI and no invasion zone has formed. But the difference between volumetric and azimuthal measurements is important for comprehensive interpretation because various formations have different properties in the vertical direction. Resistivity tools have the biggest DOI. It is important to understand and be able to determine the reason for changes in log response: a change in the properties of the current layer, or the approach of layers with other properties. For this, it is necessary to know the distance to the boundaries of formations with various properties and, therefore, to understand the geological structure of the discovered deposits; such information on the scale of well logs can be obtained either by modeling or by using extra-deep resistivity inversion (mapping). The largest amount of multidisciplinary information is needed for modeling purposes, from images and logs to mapping and seismic data.
Case studies include successful examples from Western Siberia clastic formations. Within these cases, different tasks have been solved: developing the geological concept, updating petrophysical properties for STOIIP and completion, and providing solutions during geosteering. Multiscale modeling, which includes seismic, geosteering mapping data, LWD, and imagers, has been used for all cases.
APA, Harvard, Vancouver, ISO, and other styles
3

Ahmadi, A., S. B. M. Beck, and R. Stanway. "The Swirling Orifice Plate: A Flowmeter for All Reasons." In ASME 2005 Fluids Engineering Division Summer Meeting. ASMEDC, 2005. http://dx.doi.org/10.1115/fedsm2005-77349.

Full text
Abstract:
The orifice plate flow meter is the most common form of differential pressure flow meter used in industry. The standard discharge coefficient, defined by both British Standard and ISO 5167, is only valid if the flow approaching the meter is perfectly settled and fully developed. However, in practical applications the flow approaching the orifice meter is often disturbed by pipe fittings, and consequently the measurements become inaccurate. Designing orifice plate meters that are independent of upstream disturbances is thus a main goal of orifice plate metering. This can be achieved either by using a long straight settling length upstream and downstream of the orifice plate or by using a flow conditioner upstream of the orifice plate. In addition, the standard orifice plate is vulnerable when metering dirty flow, as dirt accumulating in front of the orifice plate can also alter the accuracy of metering. In this paper the effect of the swirler flow conditioner for both standard and non-standard flow conditions has been investigated in an experimental rig, and the results have been validated with appropriate CFD domains. In these investigations the effect of different designs of swirler flow conditioners has been examined in asymmetric and swirling flow profiles. The results so far show the cone swirler flow conditioner has a desirable effect for both asymmetric and swirling flow disturbances. They also show the metering error for non-standard velocity profiles with the swirler flow conditioner is typically 1.5%, compared to around 4% for a standard orifice plate. Moreover, using a swirler conditioner tends to keep particles in suspension and thus prevents the accumulation of dirt particles in front of the orifice plate. All experimental and numerical results are presented for different velocity profiles, both swirling and asymmetric, for different mass flow rates, and for β = 0.5.
APA, Harvard, Vancouver, ISO, and other styles
4

Bunnik, Tim, Jule Scharnke, and Erik-Jan de Ridder. "Efficient Indicators for Screening of Random Waves for Wave Impacts on a Jacket Platform and a Fixed Offshore Wind Turbine." In ASME 2019 38th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/omae2019-95481.

Full text
Abstract:
Abstract Renewed interest in wave impact assessment has risen for various reasons: • the low airgap of some existing Mobile Units in the North Sea; • the COSL Innovator incident and, related to this topic, the new DNV-GL guidelines (OTG 13 and OTG 14); • the installation of many large-diameter monopile foundations for wind turbines in increasingly deep water in the North Sea; • the installation of many large-diameter wind turbines in increasingly deep water in the North Sea; • seabed subsidence (and possibly water level rises due to global warming) and their effect on the decreasing airgap of fixed platforms. Wave impact assessment has been the subject of many recent studies and research projects, and there has been strong knowledge and tool development during the last decade, both within model testing and numerical (CFD) analysis (Huang et al. (2017), de Ridder et al. (2017), Vestbøstad et al. (2017), Bunnik et al. (2018)). However, there is still a lack of efficient methods and tools to properly analyze wave impacts and derive the statistical variation of these impacts in the sea states to which these structures are exposed during their lifetime. To reduce the statistical uncertainties that naturally arise in estimates of design loads related to extreme waves, sufficient data must be gathered. In order to estimate the design loads it is common practice not to investigate all possible sea states (i.e. long-term analysis) but to investigate a few sea states and assume that the design value occurs at a prescribed probability level in the sea states with the same probability level (i.e. the contour line approach). The estimate of the design value at that probability level is then based on results from a limited number of random realizations of these sea states. For linear or weakly nonlinear response types it is possible to estimate design loads accurately with a quite limited number of realizations.
For strongly nonlinear problems, however, this is not possible due to the large statistical variation in the maximum observations, inherent to a random nonlinear process. Estimating the tail of the load distribution accurately requires many more realizations. This approach is restricted by time and costs, and eventually one may have to accept an estimated design load with a large statistical uncertainty and account for the uncertainty with a higher safety margin. In this paper an improved methodology for estimating design loads related to extreme wave impacts is presented. The methodology is based on screening many 3-hour realizations of the design sea states with simplified, fast but sufficiently accurate methods, and focusing only on the potentially critical events with a model containing a more complete description of the physics. This can be either a model test or a nonlinear impact simulation (i.e. CFD analysis). By doing this, many more rare/critical events can be assessed, reducing the statistical uncertainty in the estimate of the design load. A screening method/wave impact indicator is presented for a jacket platform and for a fixed offshore wind turbine. Existing model test data is used to show the correlation between the indicator and actual impact events and to derive the efficiency of the impact indicators.
APA, Harvard, Vancouver, ISO, and other styles
5

Aumuller, John J., and Vincent A. Carucci. "Evaluation of ASME Pressure Vessel Code Prohibitions on Rod and Bar Stock and Potential Remedies." In ASME 2014 Pressure Vessels and Piping Conference. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/pvp2014-28081.

Full text
Abstract:
The ASME Codes and referenced standards provide industry and the public the necessary rules and guidance for the design, fabrication, inspection and pressure testing of pressure equipment. Codes and standards evolve as the underlying technologies, analytical capabilities, materials and joining methods or experiences of designers improve; sometimes competitive pressures may be a consideration. As an illustration, the design margin for unfired pressure vessels has decreased from 5:1 in the earliest ASME Code edition of the early 20th century to the present day margin of 3.5:1 in Section VIII Division 1. Design by analysis methods allow designers to use a 2.4:1 margin for Section VIII Division 2 pressure vessels. Code prohibitions are meant to prevent unsafe use of materials, design methods or fabrication details. Codes also allow the use of designs that have proven themselves in service in so much as they are consistent with mandatory requirements and prohibitions of the Codes. The Codes advise users that not all aspects of construction activities are addressed and these should not be considered prohibited. Where prohibitions are specified, it may not be readily apparent why these prohibitions are specified. The use of “forged bar stock” is an example where use in pressure vessels and for certain components is prohibited by Codes and standards. This paper examines the possible motive for applying this prohibition and whether there is continued technical merit in this prohibition, as presently defined. A potential reason for relaxing this prohibition is that current manufacturing quality and inspection methods may render a general prohibition overly conservative. A recommendation is made to better define the prohibition using a more measurable approach so that higher quality forged billets may be used for a wider range and size of pressure components. 
Jurisdictions with a regulatory authority may find that the authority is rigorous and literal in applying Code provisions, and prohibitions can be particularly difficult to accept when the underlying engineering principles are opaque. This puts designers and users in these jurisdictions at a technical and economic disadvantage. This paper reviews the possible engineering considerations motivating these Code and standard prohibitions and proposes modifications to allow wider Code use of “high quality” forged billet material to reflect some user experiences.
APA, Harvard, Vancouver, ISO, and other styles
6

Henry, Chris, and Steven Grant. "Implementing New Automated Ticketing Technology at Virginia Railway Express." In 2012 Joint Rail Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/jrc2012-74054.

Full text
Abstract:
Virginia Railway Express (VRE) is at a crossroads with its current technology. In the near future, VRE will be required to replace its existing Automated Fare Collection (AFC) system. While this may not initially sound so different from what all rail agencies must eventually go through, ensuring that the system can be integrated into the neighboring Washington Metropolitan Area Transit Authority’s (WMATA) impending New Electronic Payments Program (NEPP) is a completely different story, for many reasons. VRE is a key regional partner of WMATA and, as such, the two work hand-in-hand to ensure interoperability between the two systems is maximized for the passengers who ride both services. Key to this is NEPP as an eventual replacement of WMATA’s SmarTrip® program. Since the majority of VRE’s ridership is Federal employees who carry PIV (Personal Identity Verification)/CAC (Common Access Card) cards and are making their way into the nation’s capital from Virginia and Maryland, the SmarTrip® program has been a major focus for VRE. While the NEPP program has several years before it goes live, it presents VRE with a valuable opportunity to review its current AFC system and use the interim to implement various concepts of operations for a future system. As such, VRE has become a willing partner for WMATA as a host for technology proofs-of-concept that will aid both VRE and WMATA in the long term. VRE is looking into hosting various technology options to pilot at key stations that may include mobile ticketing, Near Field Communication (NFC), or PIV/CAC cards as forms of payment, as well as proof of payment. As an open-gated system, VRE must tackle the problem of fare evasion, so maximizing its proof-of-payment capabilities with the latest technology is key.
VRE would like to share with the rail community its thoughts and ideas for proof-of-concepts to utilize the latest payment technologies, as well as discuss its plans on interoperability with WMATA to assist agencies with similar challenges.
APA, Harvard, Vancouver, ISO, and other styles
7

Brown, Taylor, Sandra Hope, and Brian D. Jensen. "Fabrication and Testing of a MEMS System for Injection of DNA Into Plant Cells." In ASME 2019 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/detc2019-98019.

Full text
Abstract:
Abstract This paper describes the fabrication and testing of a system to inject DNA into plant leaves. Arrays of silicon lances were made using photolithographic and STS DRIE Bosch techniques. A nanoinjector device was also made to accept the silicon lance arrays and perform nanoinjections. Nanoinjections were performed on Arabidopsis and cotton cotyledons. The effects of changing the force applied during a nanoinjection and of varying the number of repeated nanoinjections on the same cotyledon were observed. Too much force or too many repeated injections caused physical damage to the cotyledon; an optimal force and number of repeated injections can be applied without causing such damage. Several injections using DNA were performed without successful transfection of the leaves. Possible reasons for this failure to transfect are explored.
APA, Harvard, Vancouver, ISO, and other styles
8

Chin, Jessica, Ibrahim Zeid, Claire Duggan, and Sagar Kamarthi. "Why Engineering-Based Learning Can Revolutionize STEM Teaching in High Schools." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-86355.

Full text
Abstract:
For many years, the literature has documented the benefits of project-based learning (PBL) and its impact on student learning, especially at the high school level. More often than not, however, students are still losing interest in STEM (Science, Technology, Engineering, and Mathematics) education because current teaching pedagogies have become antiquated and are not impacting student learning as they should. Our elicitation of high school educators found that the main reason for such disinterest is students’ inability to connect abstract STEM concepts and theory with STEM application, and thus to appreciate the value of learning STEM. With access to information easier than ever, students are forgetting that learning is not about getting the right answer but about understanding how to solve a complex problem. In the past, PBL has benefited students by engaging them in hands-on learning; however, with a more complex paradigm shift in student learning styles, PBL and lecture-based learning are no longer the most effective methods of teaching. Engineering-based learning has the opportunity and potential to modify STEM education and revolutionize STEM teaching pedagogy by changing the one-size-fits-all model to an individual, student-centered learning approach where education is mass customized. This paper discusses a new teaching pedagogy dubbed Engineering-Based Learning (EBL), a more systematic approach to high school STEM teaching for open-ended problems. This paper presents the EBL model, the EBL tools, and their impact thus far on high school students. It also presents sample feedback from both teachers and students and how EBL has influenced their outlook on engineering and STEM in the real world.
The purpose of this paper is also to disseminate this new teaching pedagogy to support the notion that STEM education can be taught successfully, providing students with a structured, systematic, hands-on approach, as well as the appropriate tools and resources allowing them to connect complex STEM theory and real-world application.
APA, Harvard, Vancouver, ISO, and other styles
9

Moody, Janette. "Public Perceptions of Biometric Devices:The Effect of Misinformation on Acceptance and Use." In InSITE 2004: Informing Science + IT Education Conference. Informing Science Institute, 2004. http://dx.doi.org/10.28945/2743.

Full text
Abstract:
Organizations are introducing biometric devices into various sectors of the economy for various reasons. What began as a security feature for a limited number of government organizations has been adapted to uses as diverse as paying for schoolchildren’s lunches and tracking employees’ work attendance. From an organizational perspective, justifications for the use of biometric devices are plentiful. However, the public’s perception of these devices may be quite different. These perceptions in turn will influence public willingness to accept and use biometric devices. Although employee use of biometric devices can be mandated, a more productive alternative might be to understand their perceptions and address them specifically through education and information. This paper describes common types of biometrics, reviews their current use in organizations, presents findings of a recent survey of public perceptions to determine the areas requiring the most education, and concludes with suggestions for providing this education.
APA, Harvard, Vancouver, ISO, and other styles
10

Tandon, S., S. Mitra, M. K. Sharma, U. Saxena, P. Ahlawat, I. Kaur, A. Chowdhary, and P. Surkar. "Image guided interstitial brachytherapy for locally advanced disease after external beam radiotherapy in a case of carcinoma cervix – our institutional experience." In 16th Annual International Conference RGCON. Thieme Medical and Scientific Publishers Private Ltd., 2016. http://dx.doi.org/10.1055/s-0039-1685276.

Full text
Abstract:
Purpose/Objective: Cervical cancer is the third most common cancer in women worldwide. Definitive chemoradiation is the accepted standard of care, especially for patients with locally advanced cervical cancers. Intracavitary brachytherapy (ICBT) is an important part of definitive radiotherapy shown to improve overall survival. Interstitial brachytherapy (ISBT) is generally reserved for patients either with extensive pelvic and/or vaginal residual disease after external beam radiotherapy (EBRT) or with anatomy not allowing ICBT with standard applicators, in an attempt to improve local control. We have conducted an observational study of patients who underwent image guided HDR-ISBT at our institute. Materials and Methods: Seven patients diagnosed with carcinoma cervix were selected from the period of 2012 to 2015 who received EBRT by IMRT and for whom ICBT could not be done for various reasons. These patients were then taken up for Martinez Universal Perineal Interstitial Template (MUPIT) image based ISBT. A descriptive analysis was done for doses received by HRCTV, bladder, rectum and sigmoid colon. At the end of treatment, early response at 3 months along with overall survival (OS) and disease free survival (DFS) was also calculated. Results: All the patients recruited were locally advanced: 3 patients in stage IIB, 1 in IIIA and 3 in IIIB. The mean dose received by 95% of the high risk CTV (HRCTV) by IMRT was 49.75 Gy. Of the 7 patients, 3 were taken up for ISBT due to anatomical restriction whereas the remaining 4 were included because of lack of dose coverage by ICBT. The mean doses received by 90% of HRCTV, 2 cc bladder, 2 cc rectum and 2 cc sigmoid colon were 20.58 Gy, 2.73 Gy, 3.19 Gy and 2.82 Gy respectively. The early response at 3 months was 57.14%. The DFS at one year and OS at 3 years were 53.6% and 53.3% respectively.
Conclusions: Our descriptive analysis of seven patients treated by image based ISBT revealed that locally advanced cervical cancer patients for whom ICBT is unsuitable can achieve comparable local control (LRC) and OS with a combination of EBRT by IMRT and image based HDR-ISBT.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Reasons All Can Accept"

1

Setiawan, Ken M. P., Bronwyn A. Beech Jones, Rachael Diprose, and Amalinda Savirani, eds. Women’s Journeys in Driving Change: Women’s Collective Action and Village Law Implementation in Indonesia. University of Melbourne with Universitas Gadjah Mada and MAMPU, 2020. http://dx.doi.org/10.46580/124331.

Full text
Abstract:
This volume shares the life journeys of 21 women from rural villages from Sumatra, to Java, to Kalimantan, Sulawesi and East and West Nusa Tenggara (for ethical reasons, all names have been anonymised). In each of these villages, CSOs introduced and/or strengthened interventions to support gender inclusion, women’s collective action and empowerment. The stories of these village women offer unique insights into women’s aspirations, the challenges they have encountered and their achievements across multiple scales and domains, illustrating the lived complexities of women in rural Indonesia, particularly those from vulnerable groups. The stories shared highlight women’s own pathways of change and their resilience and determination often in the face of resistance from their families and communities, to ultimately reduce rural gender inequities and bolster gender inclusiveness. The stories also illustrate the important role CSOs—those that are focused on gender inclusion and facilitating grassroots women’s agency and empowerment—can play in supporting women’s voice and agency as they undertake this journey.
APA, Harvard, Vancouver, ISO, and other styles
3

Karlstrom, Karl, Laura Crossey, Allyson Matthis, and Carl Bowman. Telling time at Grand Canyon National Park: 2020 update. National Park Service, April 2021. http://dx.doi.org/10.36967/nrr-2285173.

Full text
Abstract:
Grand Canyon National Park is all about time and timescales. Time is the currency of our daily life, of history, and of biological evolution. Grand Canyon’s beauty has inspired explorers, artists, and poets. Behind it all, Grand Canyon’s geology and sense of timelessness are among its most prominent and important resources. Grand Canyon has an exceptionally complete and well-exposed rock record of Earth’s history. It is an ideal place to gain a sense of geologic (or deep) time. A visit to the South or North rims, a hike into the canyon of any length, or a trip through the 277-mile (446-km) length of Grand Canyon are awe-inspiring experiences for many reasons, and they often motivate us to look deeper to understand how our human timescales of hundreds and thousands of years overlap with Earth’s many timescales reaching back millions and billions of years. This report summarizes how geologists tell time at Grand Canyon, and the resultant “best” numeric ages for the canyon’s strata based on recent scientific research. By best, we mean the most accurate and precise ages available, given the dating techniques used, geologic constraints, the availability of datable material, and the fossil record of Grand Canyon rock units. This paper updates a previously-published compilation of best numeric ages (Mathis and Bowman 2005a; 2005b; 2007) to incorporate recent revisions in the canyon’s stratigraphic nomenclature and additional numeric age determinations published in the scientific literature. From bottom to top, Grand Canyon’s rocks can be ordered into three “sets” (or primary packages), each with an overarching story. The Vishnu Basement Rocks were once tens of miles deep as North America’s crust formed via collisions of volcanic island chains with the pre-existing continent between 1,840 and 1,375 million years ago. 
The Grand Canyon Supergroup contains evidence for early single-celled life and represents basins that record the assembly and breakup of an early supercontinent between 729 and 1,255 million years ago. The Layered Paleozoic Rocks encode stories, layer by layer, of dramatic geologic changes and the evolution of animal life during the Paleozoic Era (period of ancient life) between 270 and 530 million years ago. In addition to characterizing the ages and geology of the three sets of rocks, we provide numeric ages for all the groups and formations within each set. Nine tables list the best ages along with information on each unit’s tectonic or depositional environment, and specific information explaining why revisions were made to previously published numeric ages. Photographs, line drawings, and diagrams of the different rock formations are included, as well as an extensive glossary of geologic terms to help define important scientific concepts. The three sets of rocks are separated by rock contacts called unconformities formed during long periods of erosion. This report unravels the Great Unconformity, named by John Wesley Powell 150 years ago, and shows that it is made up of several distinct erosion surfaces. The Great Nonconformity is between the Vishnu Basement Rocks and the Grand Canyon Supergroup. The Great Angular Unconformity is between the Grand Canyon Supergroup and the Layered Paleozoic Rocks. Powell’s term, the Great Unconformity, is used for contacts where the Vishnu Basement Rocks are directly overlain by the Layered Paleozoic Rocks. The time missing at these and other unconformities within the sets is also summarized in this paper—a topic that can be as interesting as the time recorded. Our goal is to provide a single up-to-date reference that summarizes the main facets of when the rocks exposed in the canyon’s walls were formed and their geologic history. 
This authoritative and readable summary of the age of Grand Canyon rocks will hopefully be helpful to National Park Service staff including resource managers and park interpreters at many levels of geologic understandings...
APA, Harvard, Vancouver, ISO, and other styles
4

Lazonick, William, Philip Moss, and Joshua Weitz. The Unmaking of the Black Blue-Collar Middle Class. Institute for New Economic Thinking Working Paper Series, May 2021. http://dx.doi.org/10.36687/inetwp159.

Full text
Abstract:
In the decade after the Civil Rights Act of 1964, African Americans made historic gains in accessing employment opportunities in racially integrated workplaces in U.S. business firms and government agencies. In the previous working papers in this series, we have shown that in the 1960s and 1970s, Blacks without college degrees were gaining access to the American middle class by moving into well-paid unionized jobs in capital-intensive mass production industries. At that time, major U.S. companies paid these blue-collar workers middle-class wages, offered stable employment, and provided employees with health and retirement benefits. Of particular importance to Blacks was the opening up to them of unionized semiskilled operative and skilled craft jobs, for which in a number of industries, and particularly those in the automobile and electronic manufacturing sectors, there was strong demand. In addition, by the end of the 1970s, buoyed by affirmative action and the growth of public-service employment, Blacks were experiencing upward mobility through employment in government agencies at local, state, and federal levels as well as in civil-society organizations, largely funded by government, to operate social and community development programs aimed at urban areas where Blacks lived. By the end of the 1970s, there was an emergent blue-collar Black middle class in the United States. Most of these workers had no more than high-school educations but had sufficient earnings and benefits to provide their families with economic security, including realistic expectations that their children would have the opportunity to move up the economic ladder to join the ranks of the college-educated white-collar middle class. 
That is what had happened for whites in the post-World War II decades, and given the momentum provided by the dominant position of the United States in global manufacturing and the nation’s equal employment opportunity legislation, there was every reason to believe that Blacks would experience intergenerational upward mobility along a similar education-and-employment career path. That did not happen. Overall, the 1980s and 1990s were decades of economic growth in the United States. For the emerging blue-collar Black middle class, however, the experience was of job loss, economic insecurity, and downward mobility. As the twentieth century ended and the twenty-first century began, moreover, it became apparent that this downward spiral was not confined to Blacks. Whites with only high-school educations also saw their blue-collar employment opportunities disappear, accompanied by lower wages, fewer benefits, and less security for those who continued to find employment in these jobs. The distress experienced by white Americans with the decline of the blue-collar middle class follows the downward trajectory that has adversely affected the socioeconomic positions of the much more vulnerable blue-collar Black middle class from the early 1980s. In this paper, we document when, how, and why the unmaking of the blue-collar Black middle class occurred and intergenerational upward mobility of Blacks to the college-educated middle class was stifled. We focus on blue-collar layoffs and manufacturing-plant closings in an important sector for Black employment, the automobile industry from the early 1980s. We then document the adverse impact on Blacks that has occurred in government-sector employment in a financialized economy in which the dominant ideology is that concentration of income among the richest households promotes productive investment, with government spending only impeding that objective. 
Reduction of taxes primarily on the wealthy and the corporate sector, the ascendancy of political and economic beliefs that celebrate the efficiency and dynamism of “free market” business enterprise, and the denigration of the idea that government can solve social problems all combined to shrink government budgets, diminish regulatory enforcement, and scuttle initiatives that previously provided greater opportunity for African Americans in the government and civil-society sectors.
APA, Harvard, Vancouver, ISO, and other styles
5

Führ, Martin, Julian Schenten, and Silke Kleihauer. Integrating "Green Chemistry" into the Regulatory Framework of European Chemicals Policy. Sonderforschungsgruppe Institutionenanalyse, July 2019. http://dx.doi.org/10.46850/sofia.9783941627727.

Full text
Abstract:
20 years ago a concept of “Green Chemistry” was formulated by Paul Anastas and John Warner, aiming at an ambitious agenda to “green” chemical products and processes. Today the concept, laid down in a set of 12 principles, has found support in various arenas. This diffusion was supported by enhancements of the legislative framework; not only in the European Union. Nevertheless industry actors – whilst generally supporting the idea – still see “cost and perception remain barriers to green chemistry uptake”. Thus, the questions arise how additional incentives as well as measures to address the barriers and impediments can be provided. An analysis addressing these questions has to take into account the institutional context for the relevant actors involved in the issue. And it has to reflect the problem perception of the different stakeholders. The supply chain into which the chemicals are distributed are of pivotal importance since they create the demand pull for chemicals designed in accordance with the “Green Chemistry Principles”. Consequently, the scope of this study includes all stages in a chemical’s life-cycle, including the process of designing and producing the final products to which chemical substances contribute. For each stage the most relevant legislative acts, together establishing the regulatory framework of the “chemicals policy” in the EU are analysed. In a nutshell the main elements of the study can be summarized as follows: Green Chemistry (GC) is the utilisation of a set of principles that reduces or eliminates the use or generation of hazardous substances in the design, manufacture and application of chemical products. Besides, reaction efficiency, including energy efficiency, and the use of renewable resources are other motives of Green Chemistry. Putting the GC concept in a broader market context, however, it can only prevail if in the perception of the relevant actors it is linked to tangible business cases. 
Therefore, the study analyses the product context in which chemistry is to be applied, as well as the substance’s entire life-cycle – in other words, the six stages in product innovation processes: 1. Substance design, 2. Production process, 3. Interaction in the supply chain, 4. Product design, 5. Use phase and 6. After use phase of the product (towards a “circular economy”). The report presents an overview of the extent to which the existing framework, i.e. legislation and the wider institutional context along the six stages, is setting incentives for actors to adequately address problematic substances and their potential impacts, including the learning processes intended to invoke creativity of various actors to solve challenges posed by these substances. In this respect, measured against the GC and Learning Process assessment criteria, the study identified shortcomings (“delta”) at each stage of product innovation. Some criteria are covered by the regulatory framework and to a relevant extent implemented by the actors. With respect to those criteria, there is thus no priority need for further action. Other criteria are only to a certain degree covered by the regulatory framework, due to various and often interlinked reasons. For those criteria, entry points for options to strengthen or further nuance coverage of the respective principle already exist. Most relevant are the deltas with regard to those instruments that influence the design phase; both for the chemical substance as such and for the end-product containing the substance. Due to the multi-tier supply chains, provisions fostering information, communication and cooperation of the various actors are crucial to underpin the learning processes towards the GCP. The policy options aim to tackle these shortcomings in the context of the respective stage in order to support those actors who are willing to change their attitude and their business decisions towards GC.
The findings are in general coherence with the strategies to foster GC identified by the Green Chemistry & Commerce Council.
APA, Harvard, Vancouver, ISO, and other styles
6

Holland, Darren, and Nazmina Mahmoudzadeh. Foodborne Disease Estimates for the United Kingdom in 2018. Food Standards Agency, January 2020. http://dx.doi.org/10.46756/sci.fsa.squ824.

Full text
Abstract:
In February 2020 the FSA published two reports which produced new estimates of foodborne norovirus cases. These were the ‘Norovirus Attribution Study’ (NoVAS study) (O’Brien et al., 2020) and the accompanying internal FSA technical review ‘Technical Report: Review of Quantitative Risk Assessment of foodborne norovirus transmission’ (NoVAS model review) (Food Standards Agency, 2020). The NoVAS study produced a Quantitative Microbiological Risk Assessment model (QMRA) to estimate foodborne norovirus. The NoVAS model review considered the impact of using alternative assumptions and other data sources on these estimates. From these two pieces of work, a revised estimate of foodborne norovirus was produced. The FSA has therefore updated its estimates of annual foodborne disease to include these new results and also to take account of more recent data related to other pathogens. The estimates produced include:
• Estimates of GP presentations and hospital admissions for foodborne norovirus based on the new estimates of cases. The NoVAS study only produced estimates for cases.
• Estimates of foodborne cases, GP presentations and hospital admissions for 12 other pathogens
• Estimates of unattributed cases of foodborne disease
• Estimates of total foodborne disease from all pathogens
Previous estimates
An FSA funded research project ‘The second study of infectious intestinal disease in the community’, published in 2012 and referred to as the IID2 study (Tam et al., 2012), estimated that there were 17 million cases of infectious intestinal disease (IID) in 2009. These include illness caused by all sources, not just food. Of these 17 million cases, around 40% (around 7 million) could be attributed to 13 known pathogens. These pathogens included norovirus. The remaining 60% of cases (equivalent to 10 million cases) were unattributed cases. These are cases where the causal pathogen is unknown. Reasons for this include: the causal pathogen was not tested for, the test was not sensitive enough to detect the causal pathogen, or the pathogen is unknown to science. A second project ‘Costed extension to the second study of infectious intestinal disease in the community’, published in 2014 and known as the IID2 extension (Tam, Larose and O’Brien, 2014), estimated that there were 566,000 cases of foodborne disease per year caused by the same 13 known pathogens. Although a proportion of the unattributed cases would also be due to food, no estimate was provided for this in the IID2 extension.
New estimates
We estimate that there were 2.4 million cases of foodborne disease in the UK in 2018 (95% credible intervals 1.8 million to 3.1 million), with 222,000 GP presentations (95% Cred. Int. 150,000 to 322,000) and 16,400 hospital admissions (95% Cred. Int. 11,200 to 26,000). Of the estimated 2.4 million cases, 0.9 million (95% Cred. Int. 0.7 million to 1.2 million) were from the 13 known pathogens included in the IID2 extension and 1.4 million (95% Cred. Int. 1.0 million to 2.0 million) were unattributed cases. Norovirus was the pathogen with the largest estimate with 383,000 cases a year. However, this estimate is within the 95% credible interval for Campylobacter of 127,000 to 571,000. The pathogen with the next highest number of cases was Clostridium perfringens with 85,000 (95% Cred. Int. 32,000 to 225,000). While the methodology used in the NoVAS study does not lend itself to producing credible intervals for cases of norovirus, this does not mean that there is no uncertainty in these estimates. There were a number of parameters used in the NoVAS study which, while based on the best science currently available, were acknowledged to have uncertain values. Sensitivity analysis undertaken as part of the study showed that changes to the values of these parameters could make big differences to the overall estimates.
Campylobacter was estimated to have the most GP presentations with 43,000 (95% Cred. Int. 19,000 to 76,000) followed by norovirus with 17,000 (95% Cred. Int. 11,000 to 26,000) and Clostridium perfringens with 13,000 (95% Cred. Int. 6,000 to 29,000). For hospital admissions Campylobacter was estimated to have 3,500 (95% Cred. Int. 1,400 to 7,600), followed by norovirus 2,200 (95% Cred. Int. 1,500 to 3,100) and Salmonella with 2,100 admissions (95% Cred. Int. 400 to 9,900). As many of these credible intervals overlap, any ranking needs to be undertaken with caution. While the estimates provided in this report are for 2018 the methodology described can be applied to future years.
APA, Harvard, Vancouver, ISO, and other styles