
Journal articles on the topic 'Descriptive complexity theory'



Consult the top 50 journal articles for your research on the topic 'Descriptive complexity theory.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Sureson, Claude. "Descriptive set theory and Boolean complexity theory." Comptes Rendus de l'Académie des Sciences - Series I - Mathematics 326, no. 2 (January 1998): 255–60. http://dx.doi.org/10.1016/s0764-4442(97)89481-5.

2

Galambos, Adam. "Descriptive complexity and revealed preference theory." Mathematical Social Sciences 101 (September 2019): 54–64. http://dx.doi.org/10.1016/j.mathsocsci.2019.06.006.

3

Grohe, Martin. "Finite Variable Logics in Descriptive Complexity Theory." Bulletin of Symbolic Logic 4, no. 4 (December 1998): 345–98. http://dx.doi.org/10.2307/420954.

Abstract:
Throughout the development of finite model theory, the fragments of first-order logic with only finitely many variables have played a central role. This survey gives an introduction to the theory of finite variable logics and reports on recent progress in the area. For each k ≥ 1 we let L^k be the fragment of first-order logic consisting of all formulas with at most k (free or bound) variables. The logics L^k are the simplest finite-variable logics. Later, we are going to consider infinitary variants and extensions by so-called counting quantifiers. Finite variable logics have mostly been studied on finite structures. Like the whole area of finite model theory, they have interesting model theoretic, complexity theoretic, and combinatorial aspects. For finite structures, first-order logic is often too expressive, since each finite structure can be characterized up to isomorphism by a single first-order sentence, and each class of finite structures that is closed under isomorphism can be characterized by a first-order theory. The finite variable fragments seem to be promising candidates with the right balance between expressive power and weakness for a model theory of finite structures. This may have motivated Poizat [67] to collect some basic model theoretic properties of the L^k. Around the same time Immerman [45] showed that important complexity classes such as polynomial time (PTIME) or polynomial space (PSPACE) can be characterized as collections of all classes of (ordered) finite structures definable by uniform sequences of first-order formulas with a fixed number of variables and varying quantifier-depth.
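A standard textbook illustration of the variable re-use behind these fragments (supplied as a gloss; the example is not taken from the survey itself): with only the three variables $x, y, z$ one can express that there is a path of length three from $x$ to $y$ by re-binding $x$ inside the formula, $\varphi(x,y) := \exists z\,(E(x,z) \wedge \exists x\,(E(z,x) \wedge E(x,y)))$, so $\varphi$ belongs to L^3, whereas the naive formalisation $\exists z\,\exists w\,(E(x,z) \wedge E(z,w) \wedge E(w,y))$ uses four variables.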
4

Tanaka, Hisao. "Some descriptive-set-theoretical problems in complexity theory." Publications of the Research Institute for Mathematical Sciences 28, no. 4 (1992): 603–14. http://dx.doi.org/10.2977/prims/1195168210.

5

Bannach, Max, and Till Tantau. "On the Descriptive Complexity of Color Coding." Algorithms 14, no. 3 (March 19, 2021): 96. http://dx.doi.org/10.3390/a14030096.

Abstract:
Color coding is an algorithmic technique used in parameterized complexity theory to detect “small” structures inside graphs. The idea is to derandomize algorithms that first randomly color a graph and then search for an easily-detectable, small color pattern. We transfer color coding to the world of descriptive complexity theory by characterizing—purely in terms of the syntactic structure of describing formulas—when the powerful second-order quantifiers representing a random coloring can be replaced by equivalent, simple first-order formulas. Building on this result, we identify syntactic properties of first-order quantifiers that can be eliminated from formulas describing parameterized problems. The result applies to many packing and embedding problems, but also to the long path problem. Together with a new result on the parameterized complexity of formula families involving only a fixed number of variables, we get that many problems lie in FPT just because of the way they are commonly described using logical formulas.
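The sketch below is a minimal, illustrative Python version of the classical randomized color-coding routine that the paper studies from a logical angle; it is not taken from the paper itself (which is about replacing the random coloring by syntactic, first-order means), and the graph encoding, function names, and trial count are assumptions made only for the example. It detects whether a graph contains a simple path on k vertices.

import random

def has_colorful_path(adj, colors, k):
    # reachable[v] = colour-sets of "colourful" paths (all colours distinct) ending at v
    reachable = {v: {frozenset([colors[v]])} for v in adj}
    for _ in range(k - 1):
        extended = {v: set() for v in adj}
        for v in adj:
            for u in adj[v]:
                for s in reachable[u]:
                    if colors[v] not in s:
                        extended[v].add(s | {colors[v]})
        for v in adj:
            reachable[v] |= extended[v]
    return any(len(s) == k for v in adj for s in reachable[v])

def detect_k_path(adj, k, trials=200):
    # a fixed k-vertex path becomes colourful with probability k!/k^k per random colouring,
    # so enough independent trials detect it with high probability (one-sided error)
    for _ in range(trials):
        colors = {v: random.randrange(k) for v in adj}
        if has_colorful_path(adj, colors, k):
            return True
    return False

# example: a cycle on six vertices contains simple paths on four vertices
cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(detect_k_path(cycle, 4))  # True (with overwhelming probability)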
6

Leivant, Daniel. "Descriptive characterizations of computational complexity." Journal of Computer and System Sciences 39, no. 1 (August 1989): 51–83. http://dx.doi.org/10.1016/0022-0000(89)90019-6.

7

Saluja, S., K. V. Subrahmanyam, and M. N. Thakur. "Descriptive Complexity of #P Functions." Journal of Computer and System Sciences 50, no. 3 (June 1995): 493–505. http://dx.doi.org/10.1006/jcss.1995.1039.

8

Debs, Gabriel, and Jean Saint Raymond. "The descriptive complexity of connectedness in Polish spaces." Fundamenta Mathematicae 249, no. 3 (2020): 261–86. http://dx.doi.org/10.4064/fm754-7-2019.

9

Lautemann, Clemens, Pierre McKenzie, Thomas Schwentick, and Heribert Vollmer. "The Descriptive Complexity Approach to LOGCFL." Journal of Computer and System Sciences 62, no. 4 (June 2001): 629–52. http://dx.doi.org/10.1006/jcss.2000.1742.

10

Cucker, Felipe, and Klaus Meer. "Logics which capture complexity classes over the reals." Journal of Symbolic Logic 64, no. 1 (March 1999): 363–90. http://dx.doi.org/10.2307/2586770.

Abstract:
In this paper we deal with the logical description of complexity classes arising in the real number model of computation introduced by Blum, Shub, and Smale [4]. We adapt the approach of descriptive complexity theory for this model developed in [14] and extend it to capture some further complexity classes over the reals by logical means. Among the latter we find NC_ℝ, PAR_ℝ, EXP_ℝ and some others more.
11

Verbitsky, Oleg, and Maksim Zhukovskii. "The Descriptive Complexity of Subgraph Isomorphism Without Numerics." Theory of Computing Systems 63, no. 4 (April 30, 2018): 902–21. http://dx.doi.org/10.1007/s00224-018-9864-3.

12

Kechris, Alexander S. "New Directions in Descriptive Set Theory." Bulletin of Symbolic Logic 5, no. 2 (June 1999): 161–74. http://dx.doi.org/10.2307/421088.

Abstract:
§1. I will start with a quick definition of descriptive set theory: It is the study of the structure of definable sets and functions in separable completely metrizable spaces. Such spaces are usually called Polish spaces. Typical examples are ℝ^n, ℂ^n, (separable) Hilbert space and more generally all separable Banach spaces, the Cantor space 2^ℕ, the Baire space ℕ^ℕ, the infinite symmetric group S∞, the unitary group (of the Hilbert space), the group of measure preserving transformations of the unit interval, etc. In this theory sets are classified in hierarchies according to the complexity of their definitions and the structure of sets in each level of these hierarchies is systematically analyzed. In the beginning we have the Borel sets in Polish spaces, obtained by starting with the open sets and closing under the operations of complementation and countable unions, and the corresponding Borel hierarchy (Σ^0_ξ, Π^0_ξ sets). After this come the projective sets, obtained by starting with the Borel sets and closing under the operations of complementation and projection, and the corresponding projective hierarchy (Σ^1_n, Π^1_n sets). There are also transfinite extensions of the projective hierarchy and even much more complex definable sets studied in descriptive set theory, but I will restrict myself here to Borel and projective sets, in fact just those at the first level of the projective hierarchy, i.e., the Borel (Δ^1_1), analytic (Σ^1_1) and coanalytic (Π^1_1) sets.
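For orientation, the standard definitions behind this notation (supplied as a gloss; not part of the quoted abstract): $\Sigma^0_1$ is the class of open sets, $\Pi^0_\xi$ consists of the complements of $\Sigma^0_\xi$ sets, and for $\xi > 1$, $\Sigma^0_\xi$ consists of countable unions of sets from $\bigcup_{\eta<\xi} \Pi^0_\eta$; at the projective level, $\Sigma^1_1$ (analytic) sets are the projections of Borel sets, $\Pi^1_1$ (coanalytic) sets are their complements, and by Suslin's theorem $\Delta^1_1 = \Sigma^1_1 \cap \Pi^1_1$ is exactly the class of Borel sets.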
13

Ghawadrah, Ghadeer. "The descriptive complexity of approximation properties in an admissible topology." Fundamenta Mathematicae 249, no. 3 (2020): 303–9. http://dx.doi.org/10.4064/fm758-8-2019.

14

Kechris, A. S., and A. Louveau. "Descriptive set theory and harmonic analysis." Journal of Symbolic Logic 57, no. 2 (June 1992): 413–41. http://dx.doi.org/10.2307/2275277.

Abstract:
During the 1989 European ASL Summer Meeting in Berlin, the authors gave a series of eight lectures (short course) on the topic of the title. This survey article consists basically of the lecture notes for that course distributed to the participants of that conference. We have purposely tried in this printed version to preserve the informal style of the original notes. Let us say first a few things about the content of these lectures. Our aim has been to present some recent work in descriptive set theory and its applications to an area of harmonic analysis. Typical uses of descriptive set theory in analysis are most often through regularity properties of definable sets, like measurability, the property of Baire, capacitability, etc., which are used to show that certain problems have solutions that behave nicely. In the theory we will present, definability itself, in fact the precise analysis of the “definable complexity” of certain sets, will be the main concern. It will be through such knowledge that we will be able to infer important structural properties of various objects which will then be used to solve analysis problems. The first lecture provides a short historical introduction to the subject of uniqueness for trigonometric series, which is the area of harmonic analysis whose problems are the origin of this work. As is well known, it was Cantor who proved the first major result in this subject in 1870, and it was his subsequent work here that led him to the creation of set theory.
15

Durand, Arnaud, Anselm Haak, Juha Kontinen, and Heribert Vollmer. "Descriptive complexity of #P functions: A new perspective." Journal of Computer and System Sciences 116 (March 2021): 40–54. http://dx.doi.org/10.1016/j.jcss.2020.04.002.

16

Ferrarotti, Flavio, Senén González, José María Turull Torres, Jan Van den Bussche, and Jonni Virtema. "Descriptive complexity of deterministic polylogarithmic time and space." Journal of Computer and System Sciences 119 (August 2021): 145–63. http://dx.doi.org/10.1016/j.jcss.2021.02.003.

17

Cozman, Fabio Gagliardi, and Denis Deratani Mauá. "The finite model theory of Bayesian network specifications: Descriptive complexity and zero/one laws." International Journal of Approximate Reasoning 110 (July 2019): 107–26. http://dx.doi.org/10.1016/j.ijar.2019.04.003.

18

Portugali, Juval. "Complexity Theory as a Link between Space and Place." Environment and Planning A: Economy and Space 38, no. 4 (April 2006): 647–64. http://dx.doi.org/10.1068/a37260.

Abstract:
Since the early 1970s, the notions of space and place have been located on the two sides of a barricade that divides what has been described as science's two great cultures. Space is located among the ‘hard’ sciences as a central term in the attempt of geography to transform the discipline from a descriptive into a quantitative, analytic, and thus scientific, enterprise. Place, on the other hand, is located among the ‘soft’ humanities and social philosophy oriented social sciences as an important notion in the post-1970 attempt to transform geography from a positivistic into a humanistic, structuralist, hermeneutic, critical science. More recently, the place-oriented geographies have adopted postmodern, poststructuralist, and deconstruction approaches, while the quantitative spatial geographies have been strongly influenced by theories of self-organization and complexity. In this paper I first point to, and then explore, structural similarities between complexity theories and theories oriented toward social philosophy. I then elaborate the thesis that, in consequence, complexity theories have the potential to bridge the geographies of space and place and, by implication, the two cultures of science. Finally, I discuss in some detail conceptual and methodological implications.
19

Mangraviti, Francesco, and Luca Motto Ros. "A descriptive Main Gap Theorem." Journal of Mathematical Logic 21, no. 01 (June 24, 2020): 2050025. http://dx.doi.org/10.1142/s0219061320500257.

Abstract:
Answering one of the main questions of [S.-D. Friedman, T. Hyttinen and V. Kulikov, Generalized descriptive set theory and classification theory, Mem. Amer. Math. Soc. 230(1081) (2014) 80, Chap. 7], we show that there is a tight connection between the depth of a classifiable shallow theory [Formula: see text] and the Borel rank of the isomorphism relation [Formula: see text] on its models of size [Formula: see text], for [Formula: see text] any cardinal satisfying [Formula: see text]. This is achieved by establishing a link between said rank and the [Formula: see text]-Scott height of the [Formula: see text]-sized models of [Formula: see text], and yields to the following descriptive set-theoretical analog of Shelah’s Main Gap Theorem: Given a countable complete first-order theory [Formula: see text], either [Formula: see text] is Borel with a countable Borel rank (i.e. very simple, given that the length of the relevant Borel hierarchy is [Formula: see text]), or it is not Borel at all. The dividing line between the two situations is the same as in Shelah’s theorem, namely that of classifiable shallow theories. We also provide a Borel reducibility version of the above theorem, discuss some limitations to the possible (Borel) complexities of [Formula: see text], and provide a characterization of categoricity of [Formula: see text] in terms of the descriptive set-theoretical complexity of [Formula: see text].
20

Kernick, D. "Migraine — New Perspectives from Chaos Theory." Cephalalgia 25, no. 8 (August 2005): 561–66. http://dx.doi.org/10.1111/j.1468-2982.2005.00934.x.

Abstract:
Converging from a number of disciplines, non-linear systems theory and in particular chaos theory offer new descriptive and prescriptive insights into physiological systems. This paper briefly reviews an approach to physiological systems from these perspectives and outlines how these concepts can be applied to the study of migraine. It suggests a wide range of potential applications including new approaches to classification, treatment and pathophysiological mechanisms. A hypothesis is developed that suggests that dysfunctional consequences can result from a mismatch between the complexity of the environment and the system that is seeking to regulate it and that the migraine phenomenon is caused by an incongruity between the complexity of mid brain sensory integration and cortical control networks. Chaos theory offers a new approach to the study of migraine that complements existing frameworks but may more accurately reflect underlying physiological mechanisms.
21

Mwangi, Patriciah G., Zachary B. Awino, Kennedy O. Ogollah, and Ganesh P. Pokhariyal. "Competitive Repertoire Complexity: A Potential Mediator in the Upper Echelons Propositions?" International Journal of Business and Management 13, no. 10 (September 6, 2018): 83. http://dx.doi.org/10.5539/ijbm.v13n10p83.

Abstract:
This study sought to evaluate the relationships between top management team (TMT) heterogeneity, competitive repertoire complexity and firm performance. The study was grounded on the upper echelons theory which argues that the TMT characteristics affect the organization’s performance through their influence on strategic choices. This study sought to investigate this relationship using the complete array of strategies deployed by heterogeneous TMTs. The study was conducted through a cross sectional descriptive survey of 53 large food and beverage manufacturers in Kenya. Primary data and secondary data was collected through a structured questionnaire and checklist respectively and analyzed by descriptive and inferential statistics. The study established that TMT heterogeneity had a significant negative effect on financial, internal processes and social performance in line with the upper echelons theory. Competitive repertoire complexity was not associated with TMT heterogeneity and did not significantly mediate the relationship between TMT heterogeneity and firm performance as expected from the information processing theory. This study contributed to the strategic management field by providing empirical evidence to the upper echelons and resource based view. Managers would benefit by careful consideration of how their TMTs were designed. Policy makers would also be aware about the competitive actions they adopted and their effect on their organizations performance.
22

Durand, Arnaud, Neil D. Jones, Johann A. Makowsky, and Malika More. "Fifty years of the spectrum problem: survey and new results." Bulletin of Symbolic Logic 18, no. 4 (December 2012): 505–53. http://dx.doi.org/10.2178/bsl.1804020.

Abstract:
In 1952, Heinrich Scholz published a question in The Journal of Symbolic Logic asking for a characterization of spectra, i.e., sets of natural numbers that are the cardinalities of finite models of first order sentences. Günter Asser in turn asked whether the complement of a spectrum is always a spectrum. These innocent questions turned out to be seminal for the development of finite model theory and descriptive complexity. In this paper we survey developments over the last 50-odd years pertaining to the spectrum problem. Our presentation follows conceptual developments rather than the chronological order. Originally a number theoretic problem, it has been approached by means of recursion theory, resource bounded complexity theory, classification by complexity of the defining sentences, and finally by means of structural graph theory. Although Scholz' question was answered in various ways, Asser's question remains open.
23

Laengle, Sigifredo. "Articulating bargaining theories: movement, chance, and necessity as descriptive principles." Central European Journal of Operations Research 29, no. 1 (January 18, 2021): 49–71. http://dx.doi.org/10.1007/s10100-020-00729-y.

Abstract:
The Nash Demand Game (NDG) has been one of the first models (Nash in Econometrica 21(1):128–140, 1953. 10.2307/1906951) that has tried to describe the process of negotiation, competition, and cooperation. This model has had enormous repercussions and has leveraged basic and applied research on bargaining processes. Therefore, we wonder whether it is possible to articulate extensive and multiple developments into a single unifying framework. The Viability Theory has this inclusive approach. Thus, we investigate the NDG under this point of view, and, carrying out this work, we find that the answer is not only affirmative but that we also advance in characterising viable NDGs. In particular, we found foundations describe the distributive Bargaining Theory: the principle of movement and the principle of chance and necessity. Finally, this initial work has many interesting perspectives. The probably most important idea is to integrate developments of the Bargaining Theory and thus capture the complexity of the real world in an articulated way.
24

Selivanov, V. L. "Fine hierarchies and Boolean terms." Journal of Symbolic Logic 60, no. 1 (March 1995): 289–317. http://dx.doi.org/10.2307/2275522.

Abstract:
We consider fine hierarchies in recursion theory, descriptive set theory, logic and complexity theory. The main results state that the sets of values of different Boolean terms coincide with the levels of suitable fine hierarchies. This gives new short descriptions of these hierarchies and shows that collections of sets of values of Boolean terms are almost well ordered by inclusion. For the sake of completeness we mention also some earlier results demonstrating the usefulness of fine hierarchies.
25

Segoufin, Luc. "M. Grohe, Descriptive Complexity, Canonisation, and Definable Graph Structure Theory, Cambridge University Press, Cambridge, 2017, x + 544 pp." Bulletin of Symbolic Logic 23, no. 4 (December 2017): 493–94. http://dx.doi.org/10.1017/bsl.2018.1.

26

Cerovic, Slobodan. "Complexity of the strategic management in tourism." Privredna izgradnja 46, no. 3-4 (2003): 207–20. http://dx.doi.org/10.2298/priz0304207c.

Abstract:
Studying and application of management in tourism, although recently established, represents very complex research area of management that gather and interweave different, not only scientific, but also practical disciplines: economy, statistics, mathematics, psychology, informative technologies, sociology and many others, that means the interdisciplinary approach is requested. From this point of view, the management is a critical process for the organization, the leadership is the support of the management and all kinds of decision making are the key of leadership. The aim of this work is, to a certain extent, in overcoming the gap between normative and empiricist approach, in offering the basic solutions for making strategic and other decisions, thus the basis of the management in tourism, which partly relies on the logical consequences of normative theory and partly on the empiricist findings of descriptive studies. Therefore, the selection of the tourism enterprise management strategy is specially emphasized and based on that the conclusion that there is not a formula, but each enterprise has to analyze its market position for itself and, according to the formulated strategic goals, to select an adequate strategy.
27

Calvert, Wesley, and Julia F. Knight. "Classification from a Computable Viewpoint." Bulletin of Symbolic Logic 12, no. 2 (June 2006): 191–218. http://dx.doi.org/10.2178/bsl/1146620059.

Abstract:
Classification is an important goal in many branches of mathematics. The idea is to describe the members of some class of mathematical objects, up to isomorphism or other important equivalence, in terms of relatively simple invariants. Where this is impossible, it is useful to have concrete results saying so. In model theory and descriptive set theory, there is a large body of work showing that certain classes of mathematical structures admit classification while others do not. In the present paper, we describe some recent work on classification in computable structure theory. Section 1 gives some background from model theory and descriptive set theory. From model theory, we give sample structure and non-structure theorems for classes that include structures of arbitrary cardinality. We also describe the notion of Scott rank, which is useful in the more restricted setting of countable structures. From descriptive set theory, we describe the basic Polish space of structures for a fixed countable language with fixed countable universe. We give sample structure and non-structure theorems based on the complexity of the isomorphism relation, and on Borel embeddings. Section 2 gives some background on computable structures. We describe three approaches to classification for these structures. The approaches are all equivalent. However, one approach, which involves calculating the complexity of the isomorphism relation, has turned out to be more productive than the others. Section 3 describes results on the isomorphism relation for a number of mathematically interesting classes—various kinds of groups and fields. In Section 4, we consider a setting similar to that in descriptive set theory. We describe an effective analogue of Borel embedding which allows us to make distinctions even among classes of finite structures. Section 5 gives results on computable structures of high Scott rank. Some of these results make use of computable embeddings. Finally, in Section 6, we mention some open problems and possible directions for future work.
28

Charalambidis, Angelos, Christos Nomikos, and Panos Rondogiannis. "The Expressive Power of Higher-Order Datalog." Theory and Practice of Logic Programming 19, no. 5-6 (September 2019): 925–40. http://dx.doi.org/10.1017/s1471068419000279.

Abstract:
A classical result in descriptive complexity theory states that Datalog expresses exactly the class of polynomially computable queries on ordered databases (Papadimitriou 1985; Grädel 1992; Vardi 1982; Immerman 1986; Leivant 1989). In this paper we extend this result to the case of higher-order Datalog. In particular, we demonstrate that on ordered databases, for all k ≥ 2, k-order Datalog captures (k − 1)-EXPTIME. This result suggests that higher-order extensions of Datalog possess superior expressive power and they are worthwhile of further investigation both in theory and in practice.
29

Ansah, Richard, Ebenezer Agbaglo, and Regina A. T. Mensah. "Gender Variation in the Writings of Ghanaian Colleges of Education Students: A Study of Syntactic Complexity." Advances in Language and Literary Studies 12, no. 4 (August 31, 2021): 140. http://dx.doi.org/10.7575/aiac.alls.v.12n.4.p.140.

Abstract:
This study explored the differences in the writings produced by both male and female students in colleges of education in Ghana with respect to syntactic complexity. The study was based on a corpus of two hundred examination essays which were collected from two hundred students in Assin Fosu, Wesley and Presbyterian colleges of education who took the English language Studies course (FDC 211) in 2018/2019 academic year. The study adopted a descriptive design, involving qualitative and quantitative methods. The analysis showed that the male students were more syntactically complex than the female students in their writings. The study established clear variations in the areas of length of production unit, sentence complexity, amount of subordination and coordination and particular structures. It has therefore upheld the difference version of gender and language theory as compared to the discursive theory. Implications and areas for further research are also discussed.
30

Nascimento, Keyla Cristiane do, and Alacoque Lorenzini Erdmann. "Understanding the dimensions of intensive care: transpersonal caring and complexity theories." Revista Latino-Americana de Enfermagem 17, no. 2 (April 2009): 215–21. http://dx.doi.org/10.1590/s0104-11692009000200012.

Abstract:
This is a descriptive, interpretive and qualitative study carried out at the ICU of a Brazilian teaching hospital. It aimed to understand the dimensions of human caring experienced by health care professionals, clients and their family members at an ICU, based on human caring complexity. The Transpersonal Caring and Complexity theories support theory and data analysis. The following dimensions of care emerged from the themes analyzed according to Ricoeur: self-care, care as an individual value, professional vs. informal care, care as supportive relationship, affective care, humanized care, care as act/attitude, care practice; educative care, dialogical relationship, care coupled to technology, loving care, interactive care, non-care, care ambience, the essence of life and profession, and meaning/purpose of care. We believe in care that encompasses several dimensions presented here, based on the relationship with the other, on the empathetic, sensitive, affectionate, creative, dynamic and understanding being in the totality of the human being.
31

Endsley, Mica R. "Toward a Theory of Situation Awareness in Dynamic Systems." Human Factors: The Journal of the Human Factors and Ergonomics Society 37, no. 1 (March 1995): 32–64. http://dx.doi.org/10.1518/001872095779049543.

Abstract:
This paper presents a theoretical model of situation awareness based on its role in dynamic human decision making in a variety of domains. Situation awareness is presented as a predominant concern in system operation, based on a descriptive view of decision making. The relationship between situation awareness and numerous individual and environmental factors is explored. Among these factors, attention and working memory are presented as critical factors limiting operators from acquiring and interpreting information from the environment to form situation awareness, and mental models and goal-directed behavior are hypothesized as important mechanisms for overcoming these limits. The impact of design features, workload, stress, system complexity, and automation on operator situation awareness is addressed, and a taxonomy of errors in situation awareness is introduced, based on the model presented. The model is used to generate design implications for enhancing operator situation awareness and future directions for situation awareness research.
32

Bharath Booshan, M. S., et al. "Hotel Selection Criteria Among Customers with Reference to Bangalore City." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 2 (April 11, 2021): 1265–73. http://dx.doi.org/10.17762/turcomat.v12i2.1187.

Abstract:
Hotel industry is considered to be one of the reputed industries in India. The study is to analyse the attitude of the respondents towards discounting offers and attractive packages given by hotels and to find out the impact of price complexity, discounting offers and attractive packages towards selection of hotels. For this purpose a sample of 120 was collected from the respondents, and percentage analysis, descriptive statistics, factor analysis, the Kruskal-Wallis test, one-way ANOVA and multiple regression were used as tools to analyse the data. The conclusion of the study is that price complexity has higher impact on price complexity than other variables taken for the study. It is also concluded that the quality of service has to be enhanced in future period of time based on packages designed and discounting factors.
33

Behrendt, Hauke. "Soziale Teilhabe als Tatsache, Wert und Aufgabe." Deutsche Zeitschrift für Philosophie 67, no. 3 (September 10, 2019): 464–89. http://dx.doi.org/10.1515/dzph-2019-0037.

Abstract:
This paper aims to present the current debates in contemporary inclusion research in a structured manner and to highlight some systematic weaknesses. Although one cannot speak of a uniform research program in the strict sense, theoreticians of systems theory and inequality research in particular have taken up this young research subject. While within systems-theory approaches, inclusion is framed from the perspective of a functionalist social theory, inequality research has a rather socio-critical impetus which treats exclusion as a social problem. My thesis is that both approaches suffer from a lack of complexity regarding their concept of inclusion. While systems-theory approaches fail to include the ethical and political dimension of inclusion, inequality research is inadequate with regard to the descriptive dimension.
34

Hella, Lauri, Phokion G. Kolaitis, and Kerkko Luosto. "Almost Everywhere Equivalence of Logics in Finite Model Theory." Bulletin of Symbolic Logic 2, no. 4 (December 1996): 422–43. http://dx.doi.org/10.2307/421173.

Abstract:
We introduce a new framework for classifying logics on finite structures and studying their expressive power. This framework is based on the concept of almost everywhere equivalence of logics, that is to say, two logics having the same expressive power on a class of asymptotic measure 1. More precisely, if L, L′ are two logics and μ is an asymptotic measure on finite structures, then L ≡a.e. L′ (μ) means that there is a class C of finite structures with μ(C) = 1 and such that L and L′ define the same queries on C. We carry out a systematic investigation of ≡a.e. with respect to the uniform measure and analyze the ≡a.e.-equivalence classes of several logics that have been studied extensively in finite model theory. Moreover, we explore connections with descriptive complexity theory and examine the status of certain classical results of model theory in the context of this new framework.
35

Fogal, Gary G. "Investigating Variability in L2 Development: Extending a Complexity Theory Perspective on L2 Writing Studies and Authorial Voice." Applied Linguistics 41, no. 4 (February 12, 2019): 575–600. http://dx.doi.org/10.1093/applin/amz005.

Abstract:
Applying a complex dynamic systems view of writing development, this study explored how developmental variability can contribute to conceptualizing changes in L2 writing. Forty-two writing samples were collected from one Thai university student in Thailand studying actuary science in English. The writing samples were composed over four years and were holistically coded for degrees of appropriate authorial voice. Descriptive techniques, including linear and polynomial trend lines and a min-max graph, informed a visual inspection of the data. These techniques revealed quantitatively distinct projections of authorial voice that were marked by periods of progress and regress. A Monte Carlo simulation then tested the hypothesis that the variability was not due to chance. The analysis showed that variability played a statistically significant role in constructing authorial voice. This work demonstrates developmental behavior consistent with complex systems and how other measures of L2 writing mature and substantiates findings on the meaningful role variability contributes to L2 development. This study also expands the explanatory potential of complex dynamic systems theory for conceptualizing writing and more generally L2 development.
36

Grohe, Martin, and Daniel Neuen. "Isomorphism, canonization, and definability for graphs of bounded rank width." Communications of the ACM 64, no. 5 (May 2021): 98–105. http://dx.doi.org/10.1145/3453943.

Abstract:
We investigate the interplay between the graph isomorphism problem, logical definability, and structural graph theory on a rich family of dense graph classes: graph classes of bounded rank width. We prove that the combinatorial Weisfeiler-Leman algorithm of dimension (3k + 4) is a complete isomorphism test for the class of all graphs of rank width at most k. A consequence of our result is the first polynomial time canonization algorithm for graphs of bounded rank width. Our second main result addresses an open problem in descriptive complexity theory: we show that fixed-point logic with counting expresses precisely the polynomial time properties of graphs of bounded rank width.
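For readers unfamiliar with the Weisfeiler-Leman family mentioned above, the following is a minimal sketch of its simplest member, 1-dimensional colour refinement; it is an illustration only (the paper works with the (3k + 4)-dimensional variant, which this sketch does not implement), and the graph encoding and function name are assumptions made for the example.

def color_refinement(adj):
    # iteratively refine vertex colours by the multiset of neighbour colours
    colors = {v: 0 for v in adj}                 # start with a uniform colouring
    for _ in range(len(adj)):                    # the partition stabilises within |V| rounds
        signatures = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        refined = {v: palette[signatures[v]] for v in adj}
        if refined == colors:
            break
        colors = refined
    return colors

# example: on a path with four vertices, endpoints and inner vertices end up with different colours
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(color_refinement(path))  # {0: 0, 1: 1, 2: 1, 3: 0}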
37

Ghasedi, Parviz, Farideh Okati, Habibollah Mashhady, and Nasser Fallah. "The Effects of Symmetrical and Asymmetrical Scaffolding on Speaking Complexity, Accuracy, and Fluency." Indonesian EFL Journal 4, no. 1 (January 29, 2018): 1. http://dx.doi.org/10.25134/ieflj.v4i1.793.

Abstract:
This experimental study set out to explore the efficacy of symmetrical and asymmetrical scaffolding in boosting speaking complexity, accuracy, and fluency among 38 upper-intermediate EFL learners. To this end, the participants were assigned into random, homogeneous, and heterogeneous groups. The control group participated in a normal speaking classroom, while the experimental groups shared their ideas and collaboratively completed tasks related to 7 lessons of New Interchange 2 during 15 sessions. Two different versions of the IELTS speaking test were used as pre/post-test. The data were audio recorded and transcribed for statistical analysis. The results of Multivariate tests revealed that there was a statistically significant difference between the mean scores of control and experimental groups on complexity and fluency. On the other hand, descriptive statistics showed the superiority of heterogeneous groups over homogeneous ones. However, the results of the Independent sample t-test indicated that the differences between homogeneous and heterogeneous groups reached the significant level just for complexity, not fluency and accuracy. Briefly, the results lend support to Vygotsky's (1978) socio-cultural theory. The findings and pedagogical implications were discussed in detail at the end of the study. Keywords: accuracy, asymmetrical scaffolding, complexity, fluency, symmetrical scaffolding
38

Mayer, Robert. "The Complexity Assessment Method of Logical Reasonings." Scientific Research and Development. Socio-Humanitarian Research and Technology 9, no. 3 (September 21, 2020): 35–40. http://dx.doi.org/10.12737/2587-912x-2020-35-40.

Abstract:
The development of methods for assessing the various didactic characteristics of teaching texts and teacher messages is an urgent problem of modern teaching theory. Its solution allows to rank texts (descriptions, proofs, explanations, etc.) according to the degree of complexity, and this helps to optimize the learning process and increase the objectivity of assessing students' knowledge. In general, educational texts contain a descriptive-narrative and logical component; the share of the latter can be defined as the ratio of informative content of logical reasonings to the overall informative content of the text. The complexity (informativity) of the text can be found as a sum of the complexities of its constituent scientific terms and ordinary words. The semantic complexity of the term relative to the Z0 thesaurus is equal to the number of words that need to be spoken to explain the term to a student with the thesaurus Z0. It should be noted that each new mention of the term reduces its contribution to the overall complexity of the text. The proposed method for assessing the complexity of logical reasonings involves: 1) identifying all facts and logical rules used, the complexity of which exceeds the chosen level of knowledge; 2) expanding the text by adding all the statements necessary for conducting reasonings; 3) creation of text file text.txt containing verbal part of educational text and verbally encoded formulas and figures; 4) creation of a dictionary (slovar.txt file) containing the scientific terms used with indication of their level of complexity; 5) analysis of the text.txt file using the special computer program addressing to the dictionary which allows to determine the overall semantic complexity of the evaluated text; 6) isolating logical reasonings from the text and determining their total complexity; 7) finding the share of logical reasoning in the text. The complexity of four text fragments was estimated, for which the share of logical reasonings and the average coefficient of information folding were calculated.
39

Beckman, Robert A., Irina Kareva, and Frederick R. Adler. "How Should Cancer Models Be Constructed?" Cancer Control 27, no. 1 (January 1, 2020): 107327482096200. http://dx.doi.org/10.1177/1073274820962008.

Abstract:
Choosing and optimizing treatment strategies for cancer requires capturing its complex dynamics sufficiently well for understanding but without being overwhelmed. Mathematical models are essential to achieve this understanding, and we discuss the challenge of choosing the right level of complexity to address the full range of tumor complexity from growth, the generation of tumor heterogeneity, and interactions within tumors and with treatments and the tumor microenvironment. We discuss the differences between conceptual and descriptive models, and compare the use of predator-prey models, evolutionary game theory, and dynamic precision medicine approaches in the face of uncertainty about mechanisms and parameter values. Although there is of course no one-size-fits-all approach, we conclude that broad and flexible thinking about cancer, based on combined modeling approaches, will play a key role in finding creative and improved treatments.
40

Lee, JoAnn S., and Michael Wolf-Branigin. "Innovations in Modeling Social Good: A Demonstration With Juvenile Justice Intervention." Research on Social Work Practice 30, no. 2 (May 30, 2019): 174–85. http://dx.doi.org/10.1177/1049731519852151.

Abstract:
Objectives: Using agent-based modeling (ABM) within a complexity theory framework provides an alternative and promising method for significantly advancing the study of social good. Complexity theory is a systems approach based on the idea that aggregate patterns arise from the interactions of agents and their environments. Such systems operate according to a set of simple rules, and patterns emerge from these simple interactions that sometimes cannot be predicted by examining those interactions alone. ABM is a computational approach that simulates the interactions of autonomous agents with each other and their environments (social and/or physical). Methods: We adapted the Rebellion model from the NetLogo software library to demonstrate the potential of this approach to measure social good. Specifically, we examine the impact of variables related to juvenile justice involvement on the converse of social good, social exclusion, which in this model was conceptualized as the lack of educational attainment among youth at risk of juvenile justice involvement. After designing our ABM, we ran a total of 2400 simulations where we systematically varied key variables, including arrest risk and maximum sentence. Results: We report the descriptive statistics from our simulations for key output variables in the ABM, including percent socially excluded and average accumulated jail time, and demonstrate the usefulness of this method by identifying nonlinear, bivariate associations across the simulations. Conclusion: Our model demonstrates the usefulness of an innovative methodological approach, complexity theory, coupled with an innovative technology, ABM, in developing policies and programs that will maximize social good.
41

Bezgin, Kostyantyn, and Volodymyr Ushkalyov. "Behavioral Economics: An Epistemic Turn in the Interpretation of Rationality." Economy of Ukraine 2019, no. 7-8 (August 23, 2019): 3–15. http://dx.doi.org/10.15407/economyukr.2019.07.003.

Abstract:
The purpose of the article is to formulate a hypothesis regarding the relationship and dynamic balance between normative and descriptive epistemology, which is established in the process of interpreting rational human behavior to create and accumulate congruent economic knowledge. In the face of growing complexity and uncertainty of the external environment, the role of critical thinking skills is increasing, which intensify the cognitive co-evolution of a person and environment by neutralizing evolutionarily formed cognitive dysfunctions. As an axiological nucleus it is proposed to use the theory of rational choice – the standard of human behavior, which contributes to the diffusion of complexity and uncertainty of the external environment. However, the presence of an axiological nucleus is a necessary but not sufficient condition, which allows it to be adequately integrated with the subject substrate. For this, one requires the relevant knowledge of those behavioral features that are presented by the epistemic periphery that dynamically changes and also permanently detects and fixes the properties and characteristics of the neural substrate, its phenomenology and behavioral characteristics. The knowledge being accumulated on the epistemic periphery of economic science allows adequately reloading the programs of the human mind to bring them into line with modern requirements for the cognitive abilities of economic agents, and also to create an internally holistic and consistent system of economic knowledge, which will take into account the complexity and multidimensional development of human-sized systems. The epistemic balance of normative and descriptive epistemology in the context of the interpretation of rational behavior may lie in a hypothesis that eliminates the dichotomy of normative and positive economic knowledge and is based on taking into account the structure of human cognitive processes, as well as the growing complexity and uncertainty of the external environment.
42

Karcagi-Kovács, Andrea. "Human rationality, environmental challenges and evolutionary game theory." Applied Studies in Agribusiness and Commerce 8, no. 1 (August 31, 2014): 15–19. http://dx.doi.org/10.19041/apstract/2014/1/2.

Abstract:
In recent years, game theory is more often applied to analyse several sustainable development issues such as climate change and biological diversity, but the explanations generally remain within a non-cooperative setting. In this paper, after reviewing important studies in this field, I will show that these methods and the assumptions upon which these explanations rest lack both descriptive accuracy and analytical power. I also argue that the problem may be better investigated within a framework of the evolutionary game theory that focuses more on the dynamics of strategy change influenced by the effect of the frequency of various competing strategies. Building on this approach, the paper demonstrates that evolutionary games can better reflect the complexity of sustainable development issues. It presents models of human – nature and human – human conflicts represented by two-player and multi-player games (with a very large population of competitors). The benefit in these games played several times (continuously) will be the ability of the human race to survive. Finally, the paper attempts to identify and classify the main problems of sustainable development on which the game theory could be applied and demonstrates that this powerful analytical tool has many further possibilities for analysing global ecological issues.
43

Resches, Mariela, and Miguel Pérez Pereira. "Referential communication abilities and Theory of Mind development in preschool children." Journal of Child Language 34, no. 1 (January 25, 2007): 21–52. http://dx.doi.org/10.1017/s0305000906007641.

Abstract:
This work aims to analyse the specific contribution of social abilities (here considered as the capacity for attributing knowledge to others) in a particular communicative context. 74 normally developing children (aged 3;4 to 5;9, M=4·6) were given two Theory of Mind (ToM) tasks, which are considered to assess increasing complexity levels of epistemic state attribution: Attribution of knowledge-ignorance (Pillow, 1989; adapted by Welch-Ross, 1997) and Understanding of False-belief (Baron Cohen, Leslie & Frith, 1985). Subjects were paired according to their age and level of performance in ToM tasks. These dyads participated in a referential communication task specially designed for this research. The resulting communicative interchanges were analysed using a three-level category system (pragmatic functions, descriptive accuracy, and ambiguity of messages). The results showed significant differences among subjects with different levels of social comprehension regarding the type of communicative resources used by them in every category level. In particular, understanding of false belief seems to be the most powerful predictor of changes in the children’s development of communicative competence.
44

Leivant, Daniel, and Bob Constable. "Editorial." Journal of Functional Programming 11, no. 1 (January 2001): 1. http://dx.doi.org/10.1017/s0956796801009030.

Abstract:
This issue of the Journal of Functional Programming is dedicated to work presented at the Workshop on Implicit Computational Complexity in Programming Languages, affiliated with the 1998 meeting of the International Conference on Functional Programming in Baltimore. Several machine-independent approaches to computational complexity have been developed in recent years; they establish a correspondence linking computational complexity to conceptual and structural measures of complexity of declarative programs and of formulas, proofs and models of formal theories. Examples include descriptive complexity of finite models, restrictions on induction in arithmetic and related first order theories, complexity of set-existence principles in higher order logic, and specifications in linear logic. We refer to these approaches collectively as Implicit Computational Complexity. This line of research provides a framework for a streamlined incorporation of computational complexity into areas such as formal methods in software development, programming language theory, and database theory. A fruitful thread in implicit computational complexity is based on exploring the computational complexity consequences of introducing various syntactic control mechanisms in functional programming, including restrictions (akin to static typing) on scoping, data re-use (via linear modalities), and iteration (via ramification of data). These forms of control, separately and in combination, can certify bounds on the time and space resources used by programs. In fact, all results in this area establish that each restriction considered yields precisely a major computational complexity class. The complexity classes thus obtained range from very restricted ones, such as NC and Alternating logarithmic time, through the central classes Poly-Time and Poly-Space, to broad classes such as the Elementary and the Primitive Recursive functions. Considerable effort has been invested in recent years to relax as much as possible the structural restrictions considered, allowing for more flexible programming and proof styles, while still guaranteeing the same resource bounds. Notably, more flexible control forms have been developed for certifying that functional programs execute in Poly-Time. The 1998 workshop covered both the theoretical foundations of the field and steps toward using its results in various implemented systems, for example in controlling the computational complexity of programs extracted from constructive proofs. The five papers included in this issue nicely represent this dual concern of theory and practice. As they are going to print, we should note that the field of Implicit Computational Complexity continues to thrive: successful workshops dedicated to it were affiliated with both the LICS'99 and LICS'00 conferences. Special issues, of Information and Computation dedicated to the former, and of Theoretical Computer Science to the latter, are in preparation.
45

Estoque, Homelo Valenzuela, and Reynold Culimay Padagas. "From Nursing to Courtroom: A Qualitative Descriptive Study of the Preparations, Motivations, and Barriers of Nurses Becoming Lawyers." Nurse Media Journal of Nursing 11, no. 1 (February 12, 2021): 10–23. http://dx.doi.org/10.14710/nmjn.v11i1.33886.

Abstract:
Background: Transitioning is a common phenomenon that happens such as in a career shift provoked by either internal or external factors. This phenomenon also occurs to nurses becoming lawyers. Considering its complexity, such transition entails a process. Purpose: This study aimed to describe and uncover the preparations, motivations, and barriers of nurses who transitioned into nurse-lawyers in the Philippines. Methods: The study employed descriptive-qualitative research design utilizing twenty participants selected through purposive and snowball or referral sampling techniques. A semi-structured interview guide was used for the data collection using Google form. Braun and Clarke’s thematic analysis was utilized as the primary treatment of the transcribed data. Strict observance of ethical standards in conducting research was ensured. Results: The study found out several themes and subcategories from the thematic analysis conducted. These included (1) “pre-planning emotive expressions”; (2) “motivations of career shift”; (3) “support mechanisms to afford career shift”; (4) “barriers to career shift”; (5) “the interconnectedness of law and nursing”; and (6) “impacts of the career shift.” Conclusion: Generally, the career shift of the nurse-lawyers presented significant themes pertinent to their preparations, motivations, and barriers in becoming lawyers. Apparently, these are all primordial in the career transition of the nurse-lawyers. Essentially, the study provides preliminary findings that may become a springboard in the construction of a grounded theory that would explicate the transition of the nurse-lawyers as a phenomenon uniting and expanding nursing and the practice of law as complementary sciences.
46

Rosen, Eric. "Some Aspects of Model Theory and Finite Structures." Bulletin of Symbolic Logic 8, no. 3 (September 2002): 380–403. http://dx.doi.org/10.2178/bsl/1182353894.

Abstract:
Model theory is concerned mainly, although not exclusively, with infinite structures. In recent years, finite structures have risen to greater prominence, both within the context of mainstream model theory, e.g., in work of Lachlan, Cherlin, Hrushovski, and others, and with the advent of finite model theory, which incorporates elements of classical model theory, combinatorics, and complexity theory. The purpose of this survey is to provide an overview of what might be called the model theory of finite structures. Some topics in finite model theory have strong connections to theoretical computer science, especially descriptive complexity theory (see [26, 46]). In fact, it has been suggested that finite model theory really is, or should be, logic for computer science. These connections with computer science will, however, not be treated here. It is well-known that many classical results of ‘infinite model theory’ fail over the class of finite structures, including the compactness and completeness theorems, as well as many preservation and interpolation theorems (see [35, 26]). The failure of compactness in the finite, in particular, means that the standard proofs of many theorems are no longer valid in this context. At present, there is no known example of a classical theorem that remains true over finite structures, yet must be proved by substantially different methods. It is generally concluded that first-order logic is ‘badly behaved’ over finite structures. From the perspective of expressive power, first-order logic also behaves badly: it is both too weak and too strong. Too weak because many natural properties, such as the size of a structure being even or a graph being connected, cannot be defined by a single sentence. Too strong, because every class of finite structures with a finite signature can be defined by an infinite set of sentences. Even worse, every finite structure is defined up to isomorphism by a single sentence. In fact, it is perhaps because of this last point more than anything else that model theorists have not been very interested in finite structures. Modern model theory is concerned largely with complete first-order theories, which are completely trivial here.
47

Carotenuto, Gemma. "Handmade Density Sets." Journal of Symbolic Logic 82, no. 1 (March 2017): 208–23. http://dx.doi.org/10.1017/jsl.2016.53.

Abstract:
Given a metric space (X, d), equipped with a locally finite Borel measure, a measurable set $A \subseteq X$ is a density set if the points where A has density 1 are exactly the points of A. We study the topological complexity of the density sets of the real line with Lebesgue measure, with the tools—and from the point of view—of descriptive set theory. In this context a density set is always in $\Pi_3^0$. We single out a family of true $\Pi_3^0$ density sets, an example of true $\Sigma_2^0$ density set and finally one of true $\Pi_2^0$ density set.
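For concreteness, the density condition behind this definition (a standard gloss, not part of the quoted abstract): on the real line with Lebesgue measure $\lambda$, a point $x$ has density 1 in a measurable set $A$ when $\lim_{h \to 0^{+}} \lambda(A \cap [x-h, x+h])/(2h) = 1$, and $A$ is a density set exactly when the set of points satisfying this condition is $A$ itself.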
48

Lydon-Staley, David M., Eli J. Cornblath, Ann Sizemore Blevins, and Danielle S. Bassett. "Modeling brain, symptom, and behavior in the winds of change." Neuropsychopharmacology 46, no. 1 (August 28, 2020): 20–32. http://dx.doi.org/10.1038/s41386-020-00805-6.

Abstract:
Neuropsychopharmacology addresses pressing questions in the study of three intertwined complex systems: the brain, human behavior, and symptoms of illness. The field seeks to understand the perturbations that impinge upon those systems, either driving greater health or illness. In the pursuit of this aim, investigators often perform analyses that make certain assumptions about the nature of the systems that are being perturbed. Those assumptions can be encoded in powerful computational models that serve to bridge the wide gulf between a descriptive analysis and a formal theory of a system’s response. Here we review a set of three such models along a continuum of complexity, moving from a local treatment to a network treatment: one commonly applied form of the general linear model, impulse response models, and network control models. For each, we describe the model’s basic form, review its use in the field, and provide a frank assessment of its relative strengths and weaknesses. The discussion naturally motivates future efforts to interlink data analysis, computational modeling, and formal theory. Our goal is to inspire practitioners to consider the assumptions implicit in their analytical approach, align those assumptions to the complexity of the systems under study, and take advantage of exciting recent advances in modeling the relations between perturbations and system function.
49

Ando, Hiroshi, and Yasumichi Matsuzawa. "The Weyl–von Neumann theorem and Borel complexity of unitary equivalence modulo compacts of self-adjoint operators." Proceedings of the Royal Society of Edinburgh: Section A Mathematics 145, no. 6 (October 29, 2015): 1115–44. http://dx.doi.org/10.1017/s0308210515000293.

Abstract:
The Weyl–von Neumann theorem asserts that two bounded self-adjoint operators A, B on a Hilbert space H are unitarily equivalent modulo compacts, i.e. uAu* + K = B for some unitary u ∈ U(H) and compact self-adjoint operator K, if and only if A and B have the same essential spectrum: σ_ess(A) = σ_ess(B). We study, using methods from descriptive set theory, the problem of whether the above Weyl–von Neumann result can be extended to unbounded operators. We show that if H is separable infinite dimensional, the relation of unitary equivalence modulo compacts for bounded self-adjoint operators is smooth, while the same equivalence relation for general self-adjoint operators contains a dense G_δ-orbit but does not admit classification by countable structures. On the other hand, the apparently related equivalence relation A ~ B ⇔ ∃u ∈ U(H) [u(A - i)^{-1}u* - (B - i)^{-1} is compact] is shown to be smooth.
50

Wati, Linda R., Kunawati T. Dewi, and Erdiana D. Putri. "Qualitative Study: The Diffusion Innovation Theory to Long Term Reversible Contraception Method Selection in High Risk Women, Malang District." Journal of Midwifery 6, no. 1 (July 7, 2021): 84. http://dx.doi.org/10.25077/jom.6.1.84-93.2021.

Abstract:
Research objective: Long-term reversible contraceptive method (LTRC) is the most effective form of contraception but women prefer less effective methods. In this study we explored whether women of reproductive age will accept or reject LTRC as their contraceptive innovation, and how their perceptions on the innovation's attributes influence their decision in choosing LTRC. Design, participants, interventions, and outcomes: The research design used was a qualitative descriptive study with high-risk reproductive women who were over 35 years old (16 people) spread over 4 health centers (puskesmas) as participants. Data collection was done by using focus group discussion. The results were analyzed using thematic content analysis. Result: Data analysis revealed that there were 2 subgroups of participants based on their tendency to use LTRC: positive (n = 6) and negative (n = 10). Most of the participants were aware of the advantages of long-term reversible contraceptive methods. They get information from health workers, the media and other people's experiences. They think that both IUD and implants have a high complexity / difficulty, especially the IUD. The characteristics of innovation in the form of relative advantage, compatibility, complexity, trialability and ease of observation greatly influence the decision to choose a long-term reversible contraceptive method. Most of the participants refused to try using the LTRC because it was too risky to try and too complicated to use. Conclusion: Most women are still hesitant to choose LTRC as a contraceptive, especially intrauterine contraceptives. They need more information about the advantages of LTRC.
