To see the other types of publications on this topic, follow the link: IHM intuitive.

Journal articles on the topic 'IHM intuitive'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 42 journal articles for your research on the topic 'IHM intuitive.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Budňák, Jan. "„Es hat mich immer zur Totalität gedrängt.“ Jüdische und christliche Charaktere im Werk von Jakob Julius David." Brünner Hefte zu Deutsch als Fremdsprache 4, no. 1-2 (2011): 43–53. http://dx.doi.org/10.5817/bhdf2011-1-2-43.

Full text
Abstract:
Three facts about the religious standpoint of the German-Moravian writer Jakob Julius David (1859-1906) may be considered well known: his origins in the rural Jewish milieu of northern Moravia, his early conversion to Catholicism, which is often interpreted as assimilation, and his preference for a humanist ethic over a religious one. Nevertheless, his work (poetry, prose, essays) contains numerous engagements with religious motifs and themes. In this regard, the present contribution finds that David's image of Christianity emphasizes its life-affirming, intuitive component (e.g. the story "Filippinas Kind"), in contrast to Judaism, which appears in his work as oppressive (e.g. the novel Das Höferecht).
APA, Harvard, Vancouver, ISO, and other styles
2

Levy, Haim. "The Investment Home Bias with Peer Effect." Journal of Risk and Financial Management 13, no. 5 (2020): 94. http://dx.doi.org/10.3390/jrfm13050094.

Full text
Abstract:
Observed international diversification implies an investment home bias (IHB). Can bivariate preferences with a local domestic peer group rationalize the IHB? For example, it is argued that wishing to have a large correlation with the Standard and Poor's 500 stock index (S&P 500 stock index) may induce an increase in the domestic investment weight of American investors and, hence, rationalize the IHB. While this argument is valid in the mean-variance framework, employing bivariate first-degree stochastic dominance (BFSD), we prove that this intuition is generally invalid. Counterintuitively, employing a "keeping up with the Joneses" (KUJ) preference with actual international data even enhances the IHB phenomenon.
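A minimal numerical sketch of the mean-variance intuition the abstract starts from (not the paper's BFSD analysis): adding a peer term that rewards correlation of the portfolio with a domestic index tilts the optimal weight toward the domestic asset. All returns, volatilities and preference parameters below are illustrative assumptions.

```python
# Toy two-asset mean-variance problem with a "keeping up with the Joneses" term.
import numpy as np
from scipy.optimize import minimize_scalar

mu = np.array([0.08, 0.09])          # expected returns: [domestic, foreign] (assumed)
sigma = np.array([0.18, 0.20])       # volatilities (assumed)
rho = 0.5                            # cross-market correlation (assumed)
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])
lam = 4.0                            # risk aversion (assumed)

def utility(w, gamma):
    """Mean-variance utility plus a peer term rewarding correlation
    of the portfolio with the domestic (peer) index."""
    wv = np.array([w, 1 - w])
    mean = wv @ mu
    var = wv @ cov @ wv
    corr = (wv @ cov[:, 0]) / (np.sqrt(var) * sigma[0])
    return mean - 0.5 * lam * var + gamma * corr

for gamma in (0.0, 0.05):            # gamma = 0: no peer effect
    res = minimize_scalar(lambda w: -utility(w, gamma), bounds=(0, 1), method="bounded")
    print(f"gamma={gamma}: optimal domestic weight = {res.x:.3f}")
```

With gamma > 0 the optimal domestic weight rises, which is the mean-variance rationalization of the IHB that the paper then challenges under BFSD.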
APA, Harvard, Vancouver, ISO, and other styles
3

Jeremiah, Sean S., H. Zabiri, M. Ramasamy, B. Kamaruddin, W. K. Teh, and A. A. A. Mohd Amiruddin. "IAM: An Intuitive ANFIS-based method for stiction detection." IOP Conference Series: Materials Science and Engineering 458 (December 24, 2018): 012054. http://dx.doi.org/10.1088/1757-899x/458/1/012054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Cavallucci, Denis, Philippe Lutz, and Fabrice Thiebaud. "Intuitive Design Method (IDM): A New Framework For Design Method Integration." Journal for Manufacturing Science and Production 3, no. 2-4 (2000): 95–102. http://dx.doi.org/10.1515/ijmsp.2000.3.2-4.95.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Čižiks, Jānis, and Pēteris Grabusts. "DATA PROCESSING USING THE IBM SPSS MODELER TOOL." HUMAN. ENVIRONMENT. TECHNOLOGIES. Proceedings of the Students International Scientific and Practical Conference, no. 23 (April 24, 2019): 16. http://dx.doi.org/10.17770/het2019.23.4388.

Full text
Abstract:
The IBM SPSS Modeler platform is a commercial rival of the RapidMiner platform, characterized by a low entry threshold for beginners. Its friendliness to beginners is expressed in the "autopilot" modes: the auto models (Auto Numeric, Auto Classifier) try several possible models with different parameters and identify the ones that perform best. Even an inexperienced analyst using such a solution is able to develop an adequate model. The SPSS user interface is constantly improving, making the system intuitive to understand. For simple tasks, there is in principle no need for special preparation. This makes IBM SPSS Modeler a good solution for data analysis for beginners.
APA, Harvard, Vancouver, ISO, and other styles
6

Hird, Andrew P. "The Impact of Entrepreneurial Cognition on the Founding and Survival of New Small Businesses." Industry and Higher Education 26, no. 6 (2012): 453–60. http://dx.doi.org/10.5367/ihe.2012.0124.

Full text
Abstract:
This paper reports on an investigation into nascent entrepreneurship. Developing and sustaining a new business is a complex and uncertain process, and different types of individuals react to this uncertainty in different ways. It is argued that cognitive factors play a crucial role. Validated and reliable psychometric instruments were administered to 119 nascent entrepreneurs in the UK. The respondents were tracked through the nascent phase, business launch and to six months after launch. The findings indicate that cognitive style is not a predictor of nascent entrepreneurship but that it is highly influential in the process of founding a business. Both intuitive and analytic nascent entrepreneurs started businesses and cognitive style did not affect survival rates, but the process of business formation and survival developed in different ways. Most research to date has argued that an intuitive cognitive style is associated with the necessary characteristics for launching an entrepreneurial venture. It is possible that this conclusion has been drawn because most studies have been conducted among existing entrepreneurs. The findings of the present study indicate that, at the nascent stage of entrepreneurship, and particularly among inexperienced nascent entrepreneurs, this assertion is open to challenge. An awareness of an entrepreneur's cognitive style may assist those who seek to support and advise the nascent entrepreneur, but may also help individual entrepreneurs to recognize their strengths and weaknesses and, so, to develop appropriate strategies for business launch and survival.
APA, Harvard, Vancouver, ISO, and other styles
7

Cheng, Feng Lan, and Feng He Wu. "Topology Optimization of Headstock of Heavy Machine Tool." Advanced Materials Research 305 (July 2011): 442–45. http://dx.doi.org/10.4028/www.scientific.net/amr.305.442.

Full text
Abstract:
The topology optimization of the large headstock (spindle box) of a heavy machine tool was studied. Based on ICM topology optimization, the optimization parameters were updated through sensitivity analysis and the sensitivity parameters were normalized; during the optimization, the topological parameters were modified, and unit cells were removed, based on the sensitivity of the cross-section unit cells. An example showed that the integrated optimization method made the structure optimization more intuitive; it not only effectively avoided the appearance of anomalous loads in the optimization process but also accelerated the convergence speed.
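The sensitivity-driven element removal described above can be sketched roughly as follows. This is a toy illustration of normalizing sensitivities and deleting the least sensitive cells, not the authors' ICM implementation, and the sensitivity values are randomly generated stand-ins.

```python
# One iteration of a sensitivity-driven removal step in topology optimization.
import numpy as np

rng = np.random.default_rng(0)
sens = rng.random(100)                 # per-element sensitivities (assumed given by FE analysis)
sens_norm = (sens - sens.min()) / (sens.max() - sens.min())  # normalize to [0, 1]

keep_fraction = 0.9                    # remove the 10% least sensitive elements this iteration
threshold = np.quantile(sens_norm, 1 - keep_fraction)
active = sens_norm > threshold         # boolean mask of retained elements
print(f"retained {active.sum()} of {active.size} elements")
```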
APA, Harvard, Vancouver, ISO, and other styles
8

Szyszkowska, Małgorzata A. "Narrative moment. Musical performance according to Lawrence Kramer and James Baldwin." Interdisciplinary Studies in Musicology, no. 20 (December 31, 2020): 21–32. http://dx.doi.org/10.14746/ism.2020.20.2.

Full text
Abstract:
My aim is to investigate how the concept of the narrative moment may be helpful in capturing the role of music in creating profound communication on the level of performing as well as listening to musical performance. I aim to show how sharing a culminating moment in a musical experience may lead to inducing a state of self-awareness and confidence in place of critical separation and distrust. I discuss Lawrence Kramer's idea of the narrative moment, explained in the original in reference to a literary example and to improvised music. It is presented as an example of communicative potential in music performance which, as I argue, is worth exploring and explaining further. Suggesting the possibility of a narrative moment in the experience of musical performances offers a comprehensible and applicable vision of the communicative potential of music that is far-reaching even if rarely achieved: a possibility of communication that is direct and intuitive, flexible and affective. Defining musical meaning in terms of music's communicative power and far-reaching social consequences suggests deep connections between the social/intersubjective, individual/subjective and aesthetic aspects of life. A proper explanation of the meaning of music requires drawing from different domains, including metaphors and highly persuasive literary and musical examples.
APA, Harvard, Vancouver, ISO, and other styles
9

Zhao, Shibo, Xingmin Ren, Yihao Liu, Kuan Lu, Chao Fu, and Yongfeng Yang. "A Dynamic-Balancing Testing System Designed for Flexible Rotor." Shock and Vibration 2021 (August 30, 2021): 1–17. http://dx.doi.org/10.1155/2021/9346947.

Full text
Abstract:
In this paper, a dynamic-balancing testing system is designed. The innovative feature of the testing system is the dynamic balancing of the rotor system with robustness and high balancing efficiency, which meets the requirements of engineering application. The transient characteristic-based balancing method (TCBM) interface and the influence coefficient method (ICM) interface are designed in the testing system. The TCBM calculates the unbalance from the transient vibration responses while the rotor accelerates, operating without a trial weight. The ICM calculates the unbalance from the steady-state vibration responses while the rotor system operates at constant speed with a trial weight. The testing system has the functions of monitoring operations synchronously, measuring and recording the required vibration responses, analyzing the dynamic characteristics, and identifying the unbalance parameters. Experiments on a single-disc rotor system were carried out, and the maximum deflection of the measuring point decreased by 73.11% after balancing through the TCBM interface. The maximum amplitude of the measuring point at 2914 r/min decreased by 77.74% after balancing through the ICM interface, while the maximum deflection during the whole operation decreased by 70.00%. The experiments prove the effectiveness of the testing system, which has the advantages of convenient and intuitive operation, high balancing efficiency, and safety.
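The classical single-plane influence coefficient method mentioned in the abstract reduces to a short complex-arithmetic calculation. A minimal sketch, with illustrative vibration phasors and trial weight rather than measured data:

```python
# Single-plane influence-coefficient balancing: vibration phasors as complex numbers.
import numpy as np

V0 = 2.0 * np.exp(1j * np.deg2rad(30))   # original vibration: 2.0 mm/s at phase 30 deg (assumed)
T  = 0.5 * np.exp(1j * np.deg2rad(0))    # trial weight: 0.5 g at 0 deg (assumed)
V1 = 1.4 * np.exp(1j * np.deg2rad(80))   # vibration with trial weight attached (assumed)

alpha = (V1 - V0) / T                    # influence coefficient: response per unit weight
W = -V0 / alpha                          # correction weight that cancels the original vibration

print(f"correction mass: {abs(W):.3f} g at {np.degrees(np.angle(W)) % 360:.1f} deg")
```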
APA, Harvard, Vancouver, ISO, and other styles
10

ODEDAIRO, BABATUNDE OMONIYI, EMILOLA HELEN ALABA, and INYENEOBONG EDEM. "A SYSTEM DYNAMICS MODEL TO DETERMINE THE VALUE OF INVENTORY HOLDING COST." Journal of Engineering Studies and Research 26, no. 3 (2020): 112–23. http://dx.doi.org/10.29081/jesr.v26i3.213.

Full text
Abstract:
Traditionally, Inventory Holding Cost (IHC) is assumed to be a combination of several costs and is determined by the summation of these cost components. Several authors have suggested that the value of IHC ranges between 12% and 50% of the procurement cost of an item. However, due to the absence of a generally acceptable methodology, many practitioners still determine this percentage based on estimates, benchmarks and intuition. Given this reality, a mathematical model to determine the value of IHC using a system dynamics approach was developed. IHC was viewed holistically to identify relevant quantities, their interactions (static or dynamic), behaviour and consequences. A Causal Loop Diagram (CLD) was developed to establish the relationships among these quantities. Thereafter, the CLD was transformed into a Flow Diagram (FD). The FD was used to formulate a set of system dynamics equations to obtain IHC. The interaction among the fraction of goods ordered per month (FOM), the fraction sold per month (FSM) and the fraction damaged per month (FDM) was simulated to obtain percentage values of IHC. The value of IHC obtained from the model and simulation analysis ranges between 22.58% and 25.39% of the value of the item held in stock. Based on these results, it is concluded that the developed model can be used for simulation and system analysis of the holding cost component of an inventory system under different contextual settings.
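A minimal stock-and-flow sketch in the spirit of a system dynamics inventory model (not the authors' exact equations): FOM, FSM and FDM drive the stock, and a monthly holding charge is accumulated and expressed as a percentage of procurement value. All rates and costs below are illustrative assumptions.

```python
# Euler-integrated stock-and-flow model of inventory holding cost.
FOM, FSM, FDM = 0.30, 0.25, 0.02      # monthly fractions ordered, sold, damaged (assumed)
unit_cost = 100.0                      # procurement cost per item (assumed)
holding_rate = 0.02                    # monthly holding charge per unit of stock value (assumed)

stock = 1000.0                         # initial stock level (items)
holding_cost = 0.0
purchased_value = stock * unit_cost

for month in range(12):                # Euler integration, dt = 1 month
    inflow = FOM * stock               # orders arriving
    outflow = (FSM + FDM) * stock      # sales plus damage
    holding_cost += holding_rate * stock * unit_cost
    purchased_value += inflow * unit_cost
    stock += inflow - outflow

print(f"IHC as % of procurement value: {100 * holding_cost / purchased_value:.2f}%")
```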
APA, Harvard, Vancouver, ISO, and other styles
11

Kesting, Arne, Martin Treiber, and Dirk Helbing. "Enhanced intelligent driver model to access the impact of driving strategies on traffic capacity." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 368, no. 1928 (2010): 4585–605. http://dx.doi.org/10.1098/rsta.2010.0084.

Full text
Abstract:
With an increasing number of vehicles equipped with adaptive cruise control (ACC), the impact of such vehicles on the collective dynamics of traffic flow becomes relevant. By means of simulation, we investigate the influence of variable percentages of ACC vehicles on traffic flow characteristics. For simulating the ACC vehicles, we propose a new car-following model that also serves as the basis of an ACC implementation in real cars. The model is based on the intelligent driver model (IDM) and inherits its intuitive behavioural parameters: desired velocity, acceleration, comfortable deceleration and desired minimum time headway. It eliminates, however, the sometimes unrealistic behaviour of the IDM in cut-in situations with ensuing small gaps that regularly are caused by lane changes of other vehicles in dense or congested traffic. We simulate the influence of different ACC strategies on the maximum capacity before breakdown and the (dynamic) bottleneck capacity after breakdown. With a suitable strategy, we find sensitivities of the order of 0.3, i.e. 1 per cent more ACC vehicles will lead to an increase in the capacities by about 0.3 per cent. This sensitivity multiplies when considering travel times at actual breakdowns.
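The IDM acceleration law the abstract refers to is compact enough to state directly. A minimal sketch with typical textbook parameter values, not the paper's calibration:

```python
# Intelligent driver model (IDM) acceleration law.
import math

def idm_acceleration(v, gap, dv, v0=33.3, T=1.5, a=1.0, b=2.0, s0=2.0):
    """v: own speed (m/s); gap: bumper-to-bumper distance to leader (m);
    dv: approaching rate v - v_leader (m/s). Parameter values are typical defaults."""
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired dynamic gap
    return a * (1 - (v / v0) ** 4 - (s_star / gap) ** 2)

print(idm_acceleration(v=25.0, gap=40.0, dv=2.0))
```

The four behavioural parameters named in the abstract appear directly: v0 (desired velocity), a (acceleration), b (comfortable deceleration) and T (desired minimum time headway).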
APA, Harvard, Vancouver, ISO, and other styles
12

Pavlin, Samo. "The role of higher education in supporting graduates’ early labour market careers." International Journal of Manpower 35, no. 4 (2014): 576–90. http://dx.doi.org/10.1108/ijm-05-2013-0105.

Full text
Abstract:
Purpose – The purpose of this paper is to explore perceptions by higher education (HE) senior professors and managers of their role in preparing graduates for entry to the labour market. By providing a theoretical and empirical overview of the functional role of HE institutions in preparing graduates for work, the paper designs its own research framework for future developments in this area. Design/methodology/approach – The conclusions in the paper are based on a literature review and approximately 240 semi-structured interviews with HE professors and managers from six European countries and six different study fields. The results are broadly compared with an international survey of graduates from 20, mainly European, countries. Findings – The senior HE professors and managers see their own role in supporting graduates' careers in a surprisingly intuitive way. There are some important differences and similarities among the study fields, although the interviewees are generally not motivated to apply an evidence-based approach to study and programme development. Originality/value – This paper is based on the results of a three-year survey by a European research network, and two international conferences comprising more than 100 contributions from over 30, mainly European, countries.
APA, Harvard, Vancouver, ISO, and other styles
13

Haegeman, Liliane. "Be going to and will: a pragmatic account." Journal of Linguistics 25, no. 2 (1989): 291–317. http://dx.doi.org/10.1017/s0022226700014110.

Full text
Abstract:
In the literature on English tense usage, expressions of futurity such as

(1) (a) I will/shall leave next week.
(b) I'm going to leave next week.

have already received a lot of attention, especially so in the pedagogical descriptive tradition of English linguistics (cf. Close, 1977; Haegeman, 1981, 1983; Leech, 1971; Quirk et al., 1985; Palmer, 1974, 1979; Wekker, 1976, etc.). Although these accounts are attractive, they raise further questions because most of them do not propose to deal with the problem against a formal theoretical background. As a consequence, the rules formulated to describe the use of shall/will or be going to in (1) tend to be intuitive and often do not really allow any decisive choice to be made in many instances of usage. On the other hand, Reichenbach-type analyses of tense interpretation are usually mainly concerned with the general problem of tense representation and treat both examples in (1) as illustrations of future tense without detailed discussion of the contrasts between them.
APA, Harvard, Vancouver, ISO, and other styles
14

Oxnard, Geoffrey R., Mireille Cantarini, Paul Frewer, et al. "SAVANNAH: A Phase II trial of osimertinib plus savolitinib for patients (pts) with EGFR-mutant, MET-driven (MET+), locally advanced or metastatic non-small cell lung cancer (NSCLC), following disease progression on osimertinib." Journal of Clinical Oncology 37, no. 15_suppl (2019): TPS9119. http://dx.doi.org/10.1200/jco.2019.37.15_suppl.tps9119.

Full text
Abstract:
TPS9119 Background: The toxicity profile of the third-generation EGFR-tyrosine kinase inhibitor (TKI) osimertinib makes it an attractive backbone for combination with other targeted agents, possibly overcoming acquired resistance mechanisms. Combination with a MET-inhibitor is an intuitive approach as MET-amplification was identified as the most common mechanism of resistance to osimertinib in preliminary ctDNA data from the Phase III FLAURA (15% of pts) and AURA3 (19% of pts) studies. Savolitinib (AZD6094, HMPL-504, volitinib) is an oral, potent and highly selective MET-TKI that had an acceptable safety profile when combined with osimertinib in the Phase Ib TATTON study, providing the basis for this Phase II SAVANNAH study (NCT03778229). Other mechanisms of acquired resistance to osimertinib, including secondary EGFR mutations (e.g. C797S), RAS/RAF activation, and oncogenic gene fusions, provide additional opportunities for developing osimertinib-based combinations. Methods: Eligible pts will have histologically/cytologically confirmed EGFR-mutant NSCLC, and MET+ disease by central FISH, central IHC, or local NGS (retrospectively confirmed by central FISH/IHC). Pts must have documented radiological progression following 1–3 lines of prior therapy (must include osimertinib). Pts will receive osimertinib 80 mg plus weight-based dosing with savolitinib 300 or 600 mg PO QD, in 28-day cycles. The primary objective is efficacy (RECIST 1.1) by overall response rate (ORR) in pts who are MET+ by central FISH. Secondary endpoints include: ORR ( MET+ by central IHC and all pts); progression-free survival, overall survival, duration of response, percent change in tumor size, HRQoL, and EGFR mutation ctDNA clearance ( MET+ by central FISH, central IHC, and all pts); safety, and pharmacokinetics (all pts). Based on the TATTON study, we anticipate enrolling ~172 MET+ pts to include ≥117 pts with MET+ disease by central FISH. Enrolment began in Q1 2019. Ongoing development of complementary trials targeting other osimertinib resistance mechanisms will also be discussed. Clinical trial information: NCT03778229.
APA, Harvard, Vancouver, ISO, and other styles
15

Zalieckaitė, Laima, and Rimvydas Skyrius. "Analitinės programinės įrangos diegimo į mokymo procesą tyrimas." Informacijos mokslai 61 (January 1, 2012): 144–55. http://dx.doi.org/10.15388/im.2012.0.1071.

Full text
Abstract:
A changing and complex economic environment forces business organizations to react rapidly to market changes and to look for new opportunities, and this requires decisions grounded in high-quality and timely information. The large volume of data stored by organizations and the number of heterogeneous information sources mean that organizations must apply business intelligence supported by analytical software. At the same time, tools of this type become a decision-making environment that provides simple, interactive and intuitively comprehensible analytics. The expansion of the analytical software market and the improvement of its functional capabilities attest to its demand in organizational management, and academic institutions preparing management specialists must respond to this. This article analyses the preconditions, problems and possible implementation methods of applying analytical software in the teaching process. Keywords: analytical software, business intelligence, intelligent analytical data processing.

Research on the Implementation of Business Intelligence Software into Study Process
Laima Zalieckaitė, Rimvydas Skyrius

Summary
The dynamic and complicated economic environment is driving businesses to respond swiftly to market changes and look for new opportunities; this drive requires decisions based on reliable and timely information. To handle huge volumes of accumulated data and a variety of information sources, companies have to apply business intelligence approaches. The set of business intelligence tools serves as a decision support environment providing simple, interactive and intuitive analytical functions. The growth of the business intelligence software market and functionality indicates its growing demand for business management; this growth has to be considered by academic institutions engaged in management studies. This paper examines the possible ways of using business intelligence software in the study process, the possible problems and implementation modes.
APA, Harvard, Vancouver, ISO, and other styles
16

Novikova, Z. D., and N. V. Dvoryanchikov. "The study of self-image in individuals with disorders of gender identity as an additional criterion of differential diagnosis in the examination of disputable sexual states." Psychology and Law 6, no. 4 (2016): 142–54. http://dx.doi.org/10.17759/psylaw.2016060413.

Full text
Abstract:
The article discusses approaches to the in-depth analysis of self-image in individuals with disorders of gender identity, as applied in the examination of disputable sexual states. A comparative analysis is made of the self-image and of the specifics of gender role internalization at the emotional and logical levels. Using examples from four different nosological groups, a study was conducted of the central concepts of identity: the image of the "real self" and the image of the "ideal self". With the help of additional assessment factors, an attempt is made to formalize the data so as to differentiate between problems of sexual identity and problems of integrative identity that affect, among other things, sexual identity, as well as to assess more accurately the possible development of sexual identity and the belonging of a case to a particular type of identity disturbance. The study presents an attempt to identify the psychological characteristics of gender identity and a possible predictive criterion: the criterion of adaptability. Objective data are complemented by attempts to explicate intuitive experience and make it available for objective assessment during the examination of disputable sexual states. The results can be used by specialists in various fields, first and foremost clinical psychologists involved in conducting examinations of individuals with FIR in the examination of disputable sexual states.
APA, Harvard, Vancouver, ISO, and other styles
17

Yao, Bo, Pascal Belin, and Christoph Scheepers. "Silent Reading of Direct versus Indirect Speech Activates Voice-selective Areas in the Auditory Cortex." Journal of Cognitive Neuroscience 23, no. 10 (2011): 3146–52. http://dx.doi.org/10.1162/jocn_a_00022.

Full text
Abstract:
In human communication, direct speech (e.g., Mary said: “I'm hungry”) is perceived to be more vivid than indirect speech (e.g., Mary said [that] she was hungry). However, for silent reading, the representational consequences of this distinction are still unclear. Although many of us share the intuition of an “inner voice,” particularly during silent reading of direct speech statements in text, there has been little direct empirical confirmation of this experience so far. Combining fMRI with eye tracking in human volunteers, we show that silent reading of direct versus indirect speech engenders differential brain activation in voice-selective areas of the auditory cortex. This suggests that readers are indeed more likely to engage in perceptual simulations (or spontaneous imagery) of the reported speaker's voice when reading direct speech as opposed to meaning-equivalent indirect speech statements as part of a more vivid representation of the former. Our results may be interpreted in line with embodied cognition and form a starting point for more sophisticated interdisciplinary research on the nature of auditory mental simulation during reading.
APA, Harvard, Vancouver, ISO, and other styles
18

Demchenko, I., B. Maksymchuk, O. Protas, et al. "Dynamics of pedagogical skill development and interaction of factors of its formation." Scientific Journal of National Pedagogical Dragomanov University. Series 15. Scientific and pedagogical problems of physical culture (physical culture and sports), no. 1(121) (January 29, 2020): 34–39. http://dx.doi.org/10.31392/npu-nc.series15.2019.1(121)20.07.

Full text
Abstract:
Pedagogical skill is a component both of the general professional training of a teacher of any specialty and of a high degree of sectoral competence (the pedagogical skill of a language teacher, a teacher of fine arts, a teacher of physical culture, etc.). The purpose of the article is to provide a theoretical foundation for the dynamics of pedagogical skill development and for the interaction of the factors of its formation. Methods of the research: theoretical, namely analysis and synthesis of literary sources, comparison, systematization, generalization, abstraction, the hypothetical-deductive method, individualization, classification and analogy. A qualitative triad "activity - skill - art" exists in cultural studies and the theory of creativity, including pedagogy. Extrapolated to the didactic plane of IHE, it can be formulated as "ability - skills - study - planned realization (activity) - plan-situational (improvisational-regulated) activity, or skill". That is, the main difference between an ordinary student and a student who possesses pedagogical skill is that the former approaches the solution of pedagogical situations in his teaching practice more instructively and in a standardized way, while the latter acts situationally, intuitively and more creatively. The general conclusion regarding the basis of the highest level of pedagogical skill is as follows: if a future teacher of physical culture perfectly assimilates, reproduces and uses pedagogical and sporting achievements in practice but does not include his personality in the cognitive and creative process, he displays a high level of pedagogical activity; the student who adds to the above his own view, experience, interpretation and creativity in solving pedagogical problems, thereby increasing the efficiency of the pedagogical process, fully embodies pedagogical skill.
APA, Harvard, Vancouver, ISO, and other styles
19

Yang, Mengyuan, Dan Li, Wu Jiang, et al. "Development and external validation of a novel nomogram for screening Chinese Lynch syndrome: based on a multicenter, population study." Therapeutic Advances in Medical Oncology 13 (January 2021): 175883592110232. http://dx.doi.org/10.1177/17588359211023290.

Full text
Abstract:
Background: This multicenter study aimed to reveal the genetic spectrum of colorectal cancer (CRC) with deficient mismatch repair (dMMR) and to build a screening model for Lynch syndrome (LS). Methods: Through immunohistochemical (IHC) screening of mismatch repair protein results in postoperative CRC patients, 311 dMMR cases, whose germline and somatic variants were detected using the ColonCore panel, were collected. Univariate and multivariate logistic regression analysis was performed on the clinical characteristics of these dMMR individuals, and a clinical nomogram, incorporating statistically significant factors identified using multivariate logistic regression analysis, was constructed to predict the probability of LS. The model was validated externally in an independent cohort. Results: In total, the 311 CRC patients with IHC dMMR included 95 cases with identified germline MMR variants (LS) and 216 cases without pathogenic or likely pathogenic variants in MMR genes (non-Lynch-associated dMMR). Of the 95 individuals, approximately 51.6%, 28.4%, 14.7%, and 5.3% of cases carried germline MLH1, MSH2, MSH6, and PMS2 pathogenic or likely pathogenic variants, respectively. A novel nomogram was then built to predict the probability of LS for CRC patients with dMMR in an intuitive way. The receiver operating characteristic (ROC) curve showed that this nomogram-based screening model could identify LS with higher specificity and sensitivity (area under the curve (AUC) of 0.87) than current screening criteria based on family history. In the external validation cohort, the AUC of the ROC curve reached 0.804, suggesting the screening model's broad applicability. We recommend that dMMR-CRC patients with a probability of LS greater than 0.435 receive further germline sequencing. Conclusion: This novel screening model, based on the differences in clinical characteristics between LS and non-Lynch-associated dMMR, may assist clinicians in preliminarily screening for LS and referring susceptible patients to experienced specialists.
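The modelling pattern behind such a nomogram is, at its core, a logistic regression whose coefficients are drawn as axis scales. A minimal sketch with synthetic data (the predictors, coefficients and resulting AUC are stand-ins; only the 0.435 cut-off is taken from the abstract):

```python
# Logistic-regression screening model with ROC/AUC evaluation and a referral cut-off.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(311, 4))                        # assumed clinical predictors
y = (X @ np.array([1.2, -0.8, 0.5, 0.0]) + rng.normal(size=311) > 0).astype(int)

model = LogisticRegression().fit(X, y)               # a nomogram visualizes these coefficients
prob = model.predict_proba(X)[:, 1]
print(f"AUC: {roc_auc_score(y, prob):.3f}")

flagged = prob > 0.435                               # refer for further germline sequencing
print(f"patients flagged: {flagged.sum()} of {len(y)}")
```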
APA, Harvard, Vancouver, ISO, and other styles
20

Corwin, D. L., and B. L. Waggoner. "TETrans: A User-Friendly, Functional Model of Solute Transport." Water Science and Technology 24, no. 6 (1991): 57–65. http://dx.doi.org/10.2166/wst.1991.0141.

Full text
Abstract:
A beta-test version of TETrans (acronym for Trace Element Transport) is presented which models the movement of inorganic solutes through the vadose zone under transient-state conditions. TETrans is a complete software package consisting of an interactive tutorial, user's guide and applications software. Both IBM-compatible and Macintosh versions are available to users. TETrans is specifically designed to be intuitive in its operation and to require only readily available input parameters, in order to enhance its utility as a real-world applications tool for transport modeling. The transport model uses a mass-balance approach to determine solute concentration distributions over time and solute loading to the groundwater. Several modeling options are available for simulating such transport-influencing factors as plant-water uptake, hydraulic bypass and adsorption. In the Macintosh version, TETrans makes full use of the Macintosh interface to enhance the user-friendliness of the model. All functions are available from pull-down menus, and simulation results are displayed in text and graphic windows. Test data are shown which include a comparison of simulated and measured results for the movement of chloride through a soil lysimeter column over a 900-day study period. Excellent agreement is found when a single parameter for bypass is used. TETrans is distinguished from other transport models in the straightforward manner in which it handles the exacerbating problem of hydraulic bypass. The hydraulic bypass parameter, termed the mobility coefficient, is determined from simple chemical analysis of chloride in the soil solution through the soil profile following the application of a plug of chloride in the irrigation water. The mobility coefficient reflects the volume of soil water which is not subject to piston displacement.
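A minimal sketch of mass-balance, piston-displacement transport with a mobility coefficient for hydraulic bypass, in the spirit of TETrans but not its actual code; the layer geometry, water contents and irrigation schedule are assumptions:

```python
# Layer-by-layer piston-displacement transport with a mobility coefficient.
import numpy as np

n_layers, theta, depth = 10, 0.30, 0.1       # layers, water content, layer thickness (m), assumed
mobility = 0.8                                # fraction of soil water subject to piston flow
water_per_layer = theta * depth * mobility    # mobile water volume per unit area (m)

conc = np.zeros(n_layers)                     # solute mass per layer (g/m^2)
leached = 0.0
for event in range(30):                       # successive irrigation events
    infil = 0.015                             # infiltration per event (m), assumed
    inflow = 1.0 if event == 0 else 0.0       # a single solute pulse applied in event 0 (g/m^2)
    for layer in range(n_layers):
        displaced = min(1.0, infil / water_per_layer)  # fraction of mobile water displaced
        out = displaced * conc[layer]         # solute pushed to the layer below
        conc[layer] += inflow - out
        inflow = out
    leached += inflow                         # whatever exits the bottom layer
print(conc.round(3), f"leached: {leached:.3f}")
```

A smaller mobility coefficient means less of the resident water is displaced per event, so the solute pulse spreads and leaches more slowly, which is the bypass behaviour the model parameterizes.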
APA, Harvard, Vancouver, ISO, and other styles
21

Fu, Dan, Leqiu Chen, and Zhou Cheng. "Integration of Wearable Smart Devices and Internet of Things Technology into Public Physical Education." Mobile Information Systems 2021 (August 18, 2021): 1–10. http://dx.doi.org/10.1155/2021/6740987.

Full text
Abstract:
With the development of the Internet, virtual reality technology is manifested in various products through a more intuitive visual experience. As the carrier of virtual reality technology, smart wearable devices are, according to many predictions, the main development direction of the next few years; the era of intelligence also poses new requirements and new challenges for the education and teaching models of ordinary colleges and universities. This paper aims to integrate wearable smart devices into public physical education, adopting a comparative experiment method and statistical analysis, and designs and conducts teaching experiments with wearable smart devices based on Internet of Things technology. The experimental class students' shooting percentage score increased from 12.80 points to 21.43 points, as calculated with SPSS, a highly significant difference (P = 0.003 < 0.01). SPSS ("Statistical Products and Service Solutions") is the general term for a series of software products and related services launched by IBM for statistical analysis, data mining, predictive analysis, and decision support tasks, with Windows and Mac OS X versions; originally named "Statistical Package for the Social Sciences", it was officially renamed in 2000 as the product's service field expanded and its service depth increased, marking a major adjustment of SPSS's strategic direction. At the same time, after the experiment, the comparison of the two groups of students' skill scores also showed a significant difference (P = 0.003 < 0.05). A modern leap-forward teaching model that organically integrates wearable smart devices and public physical education is realized.
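The reported group comparison is a standard two-sample significance test. A minimal sketch using SciPy rather than SPSS, on synthetic stand-in scores:

```python
# Independent-samples comparison of class scores (Welch's t-test).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(loc=13.0, scale=3.0, size=30)        # control-class scores (synthetic)
experimental = rng.normal(loc=21.0, scale=3.0, size=30)   # experimental-class scores (synthetic)

t, p = stats.ttest_ind(experimental, control, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.01 would indicate a highly significant difference
```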
APA, Harvard, Vancouver, ISO, and other styles
22

Lebedinski, Lara, and Vincent Vandenberghe. "Assessing education’s contribution to productivity using firm-level evidence." International Journal of Manpower 35, no. 8 (2014): 1116–39. http://dx.doi.org/10.1108/ijm-06-2012-0090.

Full text
Abstract:
Purpose – There is plenty of individual-level evidence, based on the estimation of Mincerian equations, showing that better-educated individuals earn more. This is usually interpreted as a proof that education raises labour productivity. Some macroeconomists, analysing cross-country time series, also support the idea that the continuous expansion of education has contributed positively to growth. Surprisingly, most economists with an interest in human capital have neglected the level of the firm to study the education-productivity-wage nexus. And the few published works considering firm-level evidence lack a proper strategy to cope with the endogeneity problem inherent to the estimation of production and wage functions. The purpose of this paper is to provide estimates of the causal effect of education on productivity and wage labour costs. Design/methodology/approach – This paper taps into a rich, firm-level, Belgian panel database that contains information on productivity, labour cost and the workforce's educational attainment to deliver estimates of the causal effect of education on productivity and wage/labour costs. Therefore, it exclusively resorts to within-firm changes to deal with time-invariant heterogeneity bias. What is more, it addresses the risk of simultaneity bias (endogeneity of firms' education-mix choices in the short run) using the structural approach suggested by Ackerberg et al. (2006), alongside more traditional system-GMM methods (Blundell and Bond, 1998) where lagged values of labour inputs are used as instruments. Findings – Results suggest that human capital, in particular larger shares of university-educated workers inside firms, translates into significantly higher firm-level labour productivity, and that labour costs are relatively well aligned with education-driven labour productivity differences. In other words, the authors find evidence that the Mincerian relationship between education and individual wages is driven by a strong positive link between education and firm-level productivity. Originality/value – Surprisingly, most economists with an interest in human capital have neglected the level of the firm to study the education-productivity-pay nexus. Other characteristics of the workforce, like gender or age, have been much more investigated at the level of the firm by industrial or labour economists (Hellerstein et al., 1999; Aubert and Crépon, 2003; Hellerstein and Neumark, 2007; Vandenberghe, 2011a, b, 2012; Rigo et al., 2012; Dostie, 2011; van Ours and Stoeldraijer, 2011). At present, the small literature based on firm-level evidence provides some suggestive evidence of the link between education, productivity and pay at the level of firms. Examples are Hægeland and Klette (1999) and Haltiwanger et al. (1999). Other notable papers examining a similar question are Galindo-Rueda and Haskel (2005), Prskawetz et al. (2007) and Turcotte and Whewell Rennison (2004). But, despite offering plausible and intuitive results, many of the above studies essentially rely on cross-sectional evidence and most of them do not tackle the two crucial aspects of the endogeneity problem affecting the estimation of production and wage functions (Griliches and Mairesse, 1995): first, heterogeneity bias (unobserved time-invariant determinants of firms' productivity that may be correlated to the workforce structure) and second, simultaneity bias (endogeneity in input choice, in the short run, that includes the workforce mix of the firm).
While the authors know that labour productivity is highly heterogeneous across firms (Syverson, 2011), only Haltiwanger et al. (1999) control for firm-level unobservables using firm fixed effects. The problem of simultaneity has also generally been overlooked. Certain short-term productivity shocks affecting the choice of labour inputs can be anticipated by the firms and influence their employment decision and thus the workforce mix. Yet these shocks and the resulting decisions by firms' managers are unobservable by the econometrician. Hægeland and Klette (1999) try to solve this problem by proxying productivity shocks with intermediate goods, but their methodology, inspired by Levinsohn and Petrin (2003), suffers from serious identification issues due to collinearity between labour and intermediate goods (Ackerberg et al., 2006).
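A minimal sketch of the within-firm (fixed-effects) idea the abstract leans on: demeaning by firm removes time-invariant firm heterogeneity before relating the education share to productivity. The panel below is synthetic, and plain OLS after demeaning stands in for the paper's system-GMM and structural estimators.

```python
# Within transformation (firm fixed effects) on a synthetic firm panel.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
firms, years = 200, 6
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), years),
    "educ_share": rng.uniform(0, 1, firms * years),   # share of university-educated workers
})
firm_effect = rng.normal(size=firms)                  # unobserved time-invariant firm heterogeneity
df["log_productivity"] = (0.4 * df["educ_share"]      # true education coefficient = 0.4 (assumed)
                          + firm_effect[df["firm"]]
                          + rng.normal(scale=0.1, size=len(df)))

# Demean both variables by firm, then compute the OLS slope on the demeaned data.
demeaned = df.groupby("firm")[["educ_share", "log_productivity"]].transform(lambda s: s - s.mean())
beta = ((demeaned["educ_share"] @ demeaned["log_productivity"])
        / (demeaned["educ_share"] @ demeaned["educ_share"]))
print(f"within-firm education coefficient: {beta:.3f}")  # recovers roughly 0.4
```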
APA, Harvard, Vancouver, ISO, and other styles
23

Posokhova, S. T., M. Kh Izotova, and M. V. Zemlyanykh. "Pedagogical empathy as a professional resource to create an inclusive environment." SHS Web of Conferences 87 (2020): 00060. http://dx.doi.org/10.1051/shsconf/20208700060.

Full text
Abstract:
The article is devoted to one of the most important components of interpersonal communication between teachers and students: empathy. The article discusses the features of the pedagogical empathy of teachers of correctional schools teaching children with mental retardation and of teachers of secondary schools. The research is based on the definition of empathy given by I.M. Yusupov, who considers empathy within the framework of the rational, emotional and behavioral modes of understanding and emphasizes that empathy occupies a key place in a person's understanding of objects of a social nature and in a person's acquisition of communicative competence in the effective interaction of a teacher and a student [1]. The purpose of the study was to identify the level and structure of empathy among teachers of correctional schools. The study involved 107 teachers, 57 of them teachers of correctional schools and 50 teachers of secondary schools. The age range of the study participants was from 36 to 45 years, and their teaching experience from 8 to 11 years. Research methods: diagnostics of the level of empathic abilities according to V.V. Boyko, the "Balanced Emotional Empathy Scale - BEES" by A. Mehrabyan as modified by N. Epstein, and "Determination of personality orientation" by B. Bass. To identify the statistical significance of differences, Student's t-test was used. The interdependencies between psychodiagnostic indicators were studied by correlation analysis using Spearman's rank correlation coefficient. The results of the study showed that, compared with teachers of general education schools, teachers of correctional institutions are distinguished by significantly higher indicators of the rational, emotional and intuitive channels of empathy, fewer emotional barriers in communication, and a greater focus on emotional communication and a variety of forms of empathic response.
APA, Harvard, Vancouver, ISO, and other styles
24

Palomba, M. Lia, Kelly Piersanti, Andrew D. Zelenetz, Marcel R. M. van den Brink, and Gregoire Altan-Bonnet. "Chronic Lymphocytic Leukemia (CLL) Signaling Profiling Identifies CLL-Specific Signaling Impairment That Discriminates CLL B-Cells From Normal Circulating B Cells." Blood 116, no. 21 (2010): 1374. http://dx.doi.org/10.1182/blood.v116.21.1374.1374.

Full text
Abstract:
Abstract 1374 CLL is the most common leukemia in the Western world. It is characterized by the clonal expansion of CD5+/CD19+ B cells, expressing surface immunoglobulins (sIg), most often of the IgM subset. Clinically, it can either manifest as an indolent disease with little impact on the lifespan of a subset of patients or as an aggressive, highly drug-resistant disease with a lethal outcome, though all possible scenarios between these extremes are routinely observed. The degree of somatic mutation of the sIg has been shown to affect survival; patients expressing highly mutated sIg have a better prognosis than patients whose sIg are less than 2% different from germline sequences. Therefore, it is intuitive that signaling through the sIg, or B-cell receptor (BCR), must have a role in determining the fate of the CLL cells, veering the intracellular machinery towards proliferation or apoptosis. We have systematically investigated the downstream events caused by BCR stimulation via crosslinking of the sIg in a series of about 60 heterogeneous CLL patients and peripheral blood mononuclear cell (PBMC) samples from 15 normal donors. Using multiplexed phosphoflow cytometry we were able to assess the intensity of multiple signaling events at specific stages of the BCR signaling pathway. By combining phosphospecific staining and classic immunophenotyping we were also able to evaluate signaling intensity at the single-cell level, as well as to correlate the signal intensity of CLL-specific surface markers with that of the phosphoproteins analyzed. We used principal component analysis to combine phosphospecific staining within individual cells and generate a variable that discriminates efficiently between CLL and PBMC B cells. We found that the signaling properties of CLL B cells were impaired when compared to normal B cells from healthy donors, both at the level of upstream members of the BCR signaling pathway (pBLNK, pSYK, pZap70, pBTK, pPLCg2) and of downstream components such as pp38 and pERK. However, within the cohort of CLL samples we identified a small subpopulation in which signaling proceeded as efficiently as, or more efficiently than, in normal B cells. We also used our principal component variable to quantify the heterogeneity of signaling responses within populations of CLL B cells and correlate them with clinical parameters that are known prognostic indicators in CLL. In conclusion, multiplexed phosphoflow cytometry of individual CLL cells provides a novel and highly specific method to determine the use of signaling pathways (such as BCR-mediated) in comparison to normal B cells. Our studies indicate a significant decrease in BCR signaling of CLL B cells, which involves both upstream and downstream components. Disclosures: van den Brink: Cytheris: Research Funding.
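A minimal sketch of the analysis pattern described: principal component analysis combines several phospho-marker intensities per cell into one score that separates the two populations. The marker matrix below is synthetic.

```python
# PCA over per-cell phospho-marker intensities to build one discriminating score.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
# rows: single cells; columns: phospho-signals (e.g. pSYK, pBTK, pPLCg2, pERK, pp38)
normal_b = rng.normal(1.0, 0.3, size=(500, 5))   # responsive normal B cells (synthetic)
cll_b = rng.normal(0.4, 0.3, size=(500, 5))      # impaired CLL B cells (synthetic)
X = np.vstack([normal_b, cll_b])

pca = PCA(n_components=1)
score = pca.fit_transform(X).ravel()             # one combined signaling score per cell
# (the sign of a principal component is arbitrary; only the separation matters)
print(f"mean score, normal B: {score[:500].mean():.2f}; CLL B: {score[500:].mean():.2f}")
```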
APA, Harvard, Vancouver, ISO, and other styles
25

Zhao, Fengyi, Lei Zhang, Yan Qin, et al. "Characterization of Methylation Patterns in Diffuse Large B Cell Lymphoma By Genome-Wide Methylation Analysis." Blood 134, Supplement_1 (2019): 1243. http://dx.doi.org/10.1182/blood-2019-131656.

Full text
Abstract:
Background: Diffuse large B cell lymphoma (DLBCL) is the most common type of non-Hodgkin lymphoma worldwide. Although gene expression profiling (GEP) is considered the reference standard for identifying the cell types, immunohistochemistry (IHC) is the most common commercially available method. The purpose of this study was to characterize the circulating cell-free DNA (cfDNA) methylation profile in DLBCL and to compare this profile with the methylation observed in formalin-fixed paraffin-embedded (FFPE) tissues. Additional efforts were made to correlate the observed methylation patterns with prognostic analysis and selected clinical features. Methods: cfDNA and DNA from FFPE tissue were extracted from 72 patients and 39 patients, respectively. We assessed DNA methylation in plasma samples obtained before treatment from 29 individuals with GCB DLBCL, along with 43 samples of non-GCB DLBCL as controls. DNA from FFPE tissues was extracted from 11 individuals with GCB DLBCL and 28 individuals with non-GCB DLBCL. DNA methylation was analyzed with the Infinium MethylationEPIC BeadChip, which quantitatively measures the methylation levels of more than 850,000 CpG sites across the genome. M values were used for visualization and intuitive interpretation of the results. Moreover, pathway enrichment analysis was performed with the Kyoto Encyclopedia of Genes and Genomes (KEGG) Pathway Database. Results: We found a total of 207 significant differentially methylated positions (DMPs) in cfDNA between the GCB and non-GCB groups, identified with a p value of 0.001 (Fig. 1A). Of these, 65 presented at least a 10% (|Δbeta| > 0.1) difference in methylation level between GCB and non-GCB; 29 (44.6%) were found hypermethylated in GCB DLBCL, while 36 (55.4%) appeared hypomethylated (Fig. 1B). The distribution of the identified DMPs according to their location relative to CpG islands (CGI) is represented in Fig. 1C. Unsupervised clustering performed on DNA methylation values for the 207 identified DMPs is presented in Fig. 1D. These results highlight the differences between GCB and non-GCB samples. There were 1549 significant DMPs in DNA from FFPE tissue between the GCB and non-GCB groups, identified with a p value of 0.001 (Fig. 1E). Of these, 1512 presented at least a 10% (|Δbeta| > 0.1) difference in methylation level between GCB and non-GCB; 1370 (90.6%) were found hypermethylated in GCB DLBCL, while 142 (9.4%) appeared hypomethylated (Fig. 1F). The distribution of the identified DMPs according to their location relative to CpG islands (CGI) is represented in Fig. 1G. Unsupervised clustering performed on DNA methylation values for the 1549 identified DMPs is presented in Fig. 1H. These results highlight the differences between GCB and non-GCB in FFPE samples, in accordance with those in serum. The KEGG pathway enrichment analysis of methylation in DNA from FFPE tissue revealed that the processes "PI3K/Akt, Ras, MAPK signaling pathway" and "Human papillomavirus infection" are likely major contributors to the Hans pathological type. In addition, the enrichment analysis of cfDNA methylation revealed that the process "MAPK signaling pathway" is likely the most important factor. Furthermore, we also analyzed the methylation levels of refractory or relapsed (R/R) DLBCL patients and of individuals with a good prognosis; differential methylation patterns were found both in serum and in FFPE tissues. Conclusions: DNA methylation differs between GCB and non-GCB DLBCL patients. The MAPK signaling pathway plays an important role in this difference; the mechanism needs to be further explored. Figure 1 Disclosures No relevant conflicts of interest to declare.
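A minimal sketch of the DMP selection described in the abstract: convert beta values to M values, test per-CpG group differences, and keep sites with p < 0.001 and |Δbeta| > 0.1. The methylation matrices are synthetic.

```python
# Beta-to-M conversion and two-threshold DMP filtering across CpG sites.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_cpg, n_gcb, n_non = 5000, 29, 43                 # CpG sites and sample sizes (from the abstract)
beta_gcb = np.clip(rng.beta(2, 2, size=(n_cpg, n_gcb)), 1e-3, 1 - 1e-3)
beta_non = np.clip(rng.beta(2, 2, size=(n_cpg, n_non)), 1e-3, 1 - 1e-3)

m_gcb = np.log2(beta_gcb / (1 - beta_gcb))         # M values stabilize variance near 0 and 1
m_non = np.log2(beta_non / (1 - beta_non))

t, p = stats.ttest_ind(m_gcb, m_non, axis=1)       # per-CpG group comparison
delta_beta = beta_gcb.mean(axis=1) - beta_non.mean(axis=1)
dmps = (p < 0.001) & (np.abs(delta_beta) > 0.1)    # the abstract's two thresholds
print(f"DMPs found: {dmps.sum()}")
```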
APA, Harvard, Vancouver, ISO, and other styles
26

Kalu, Nene, Patricia A. O'Neal, Catherine Nwokolo, Sharmin Diaz, and Oluwakemi Owoyemi. "The Use of Marijuana and Hydroxyurea Among Sickle Cell Patients." Blood 128, no. 22 (2016): 5913. http://dx.doi.org/10.1182/blood.v128.22.5913.5913.

Full text
Abstract:
Objective Sickle cell crises are a common, painful complication of sickle cell disease. These crises are treated in the emergency department with narcotic analgesia. However, due to concerns about addiction, tolerance and side effects of pain medications, physicians, specifically hematologists, recommend the use of hydroxyurea as an alternative therapeutic medication for sickle cell patients. Hydroxyurea is the only FDA-approved therapeutic medication proven to decrease the painful vasoocclusive process in sickle cell disease. Similarly, marijuana is being studied as an alternative for reducing pain in sickle cell disease. However, few have looked at the prevalence of hydroxyurea and marijuana use in sickle cell disease patients. We wanted to explore the prevalence of marijuana use among our hydroxyurea and non-hydroxyurea users. Method We developed a structured, self-administered, anonymous questionnaire that was provided to our adult sickle cell patients. One hundred and three (103) patients with HbSS, HbSC, HbSBetaThalassemia and HbSS disease with Alpha Thalassemia trait gave consent and had the purpose of the questionnaire explained to them. The questionnaire was administered in a private room in our clinic. The questionnaire focused on the following variables: (1) current regimen of their prescribed narcotic medication at steady state; (2) current regimen of their prescribed narcotic medication during a moderate to severe pain crisis; (3) use of marijuana and hydroxyurea during steady state and during a moderate to severe pain crisis; (4) whether or not they were taking hydroxyurea; and (5) physical effects of combining prescribed narcotic medication with illicit drug use. The questionnaire consisted of twenty-five (25) questions and took 10-15 minutes on average. The answers to the questionnaire were entered into REDCap (Research Electronic Data Capture). It is a secure, web-based application designed to support data capture for research studies, providing 1) an intuitive interface for validated data entry; 2) audit trails for tracking data manipulation and export procedures; 3) automated export procedures for seamless data downloads to common statistical packages; and 4) procedures for importing data from external sources. The data were analyzed using SPSS 23.0 (IBM Corp., Armonk, NY). Chi-square or Fisher exact testing was used to test the associations between variables regarding the prevalence of cannabis use for pain management and other symptom relief, and its side effects during self-administration, in patients with sickle cell disease. Results Our analysis showed that fifty-one percent (51%) of the participants were female, while forty-nine percent (49%) were male. Thirty-six percent (36%) of patients were between 25 and 34 years of age, and the median age was 30 years. Thirty-three percent (33%) had a college degree or higher. Sixty percent (60%) of participants admitted to using marijuana at least once. Fifty-six percent (56%) of patients used both hydroxyurea and marijuana, while forty-four percent (44%) of patients did not use hydroxyurea but used marijuana. Thirty percent (30%) of participants stated they used marijuana within the last 12 months to relieve symptoms associated with sickle cell disease. The main reasons for use were to increase appetite, improve sleep, mood and concentration, relieve stress and anxiety, and alleviate pain. Eighty percent (80%) of participants stated that marijuana was not as effective in managing their sickle-related pain as hydroxyurea.
There was no significant difference (p = 0.173) in the use of marijuana between hydroxyurea users and non-hydroxyurea users. Five percent (5%) of participants reported combining marijuana with prescribed narcotic medications to alleviate their sickle-related pain. Conclusion In summary, our data showed that between the groups of hydroxyurea users and non-hydroxyurea users, there was no significant difference in the use of marijuana. Both groups use marijuana as much as the general population. However, the use of marijuana among our sickle cell patients was not solely dependent on pain control. Further studies need to be conducted to show any significant differences between hydroxyurea and non-hydroxyurea users. Disclosures No relevant conflicts of interest to declare.
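The reported association test is a chi-square test on a contingency table. A minimal sketch with SciPy; the 2x2 counts below are synthetic stand-ins, not the study's data:

```python
# Chi-square test of association: marijuana use vs. hydroxyurea use.
import numpy as np
from scipy.stats import chi2_contingency

#                 marijuana: yes  no
table = np.array([[40, 20],       # hydroxyurea users (synthetic counts)
                  [25, 22]])      # non-users (synthetic counts)

chi2, p, dof, expected = chi2_contingency(table)  # Yates correction applied for 2x2 by default
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # p > 0.05: no significant association
```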
APA, Harvard, Vancouver, ISO, and other styles
27

Lopes, Régia Lúcia. "Editorial." Revista Diálogos da Extensão 1, no. 1 (2015): 1. http://dx.doi.org/10.15628/dialogos.2015.3924.

Full text
Abstract:
The extension activities developed by IFRN date back to its predecessor, the "Escola Técnica Federal do Rio Grande do Norte", which, as early as 1992, launched educational projects to meet community demands. The first projects developed in this context took students of the old "Sanitation" course to towns in the interior of Rio Grande do Norte, at that time extending theory, combined with professional practice, for the benefit of the community. The students, organized into groups by knowledge area, after starting the academic semester and grasping the theoretical and practical aspects of their profession, went into the field to carry out data surveys, draw up projects and propose solutions, often simple ones, that nevertheless left in the community the mark of the knowledge produced at the institution. Law no. 11.892/2008, which created the Federal Institutes, establishes as one of the purposes and characteristics of these institutions the development of extension programmes and of scientific and technological dissemination. This challenge, together with the diversification of course offerings across several levels, teaching modalities and knowledge areas, requires systematizing the actions that can be undertaken by staff with the fundamental participation of students, whether as scholarship holders or volunteers, so that these actions can genuinely constitute extension work. In this way, the promotion of extension activities operates within the virtuous cycle of knowledge generation, socialization of knowledge and return of the transformed knowledge to the institution, thereby promoting citizenship education and professional growth while contributing to the development of the region where the institution is located. Extension activities must therefore be articulated with the demands of society, with emphasis on the production, development and diffusion of scientific and technological knowledge. Under this new institutional framework, extension work has expanded more and more, earning recognition and importance on the institutional scene with the creation, from 2011, of the IFRN Programme of Institutional Support for Extension, under which 70 projects developed on the campuses then in operation came to be supported with resources allocated by the Pro-Rectorate of Extension (PROEX) and with the campuses' own resources. In addition to these projects, the institute has been raising funds from the University Extension Programme (PROEXT), among other funding bodies, for the development of programmes and projects on its campuses. The number of projects has grown year by year and, in 2014, 91 projects were funded, developed by teaching and administrative staff with the support of student scholarship holders and volunteers. These actions are embodied in IFRN's objectives, promoting the integration of the institution with the community and contributing to the education of students and to the development of the local community. Extension at IFRN seeks to act in harmony with teaching and research activities and develops important projects that enable the exchange of practices and the transfer of knowledge to society.
They therefore express our academic work and our expertise in the field of professional education, without losing sight of, and indeed emphasizing, human and citizenship education, so as to promote the social development of the community and of the students. For all these reasons, PROEX/IFRN has the immense satisfaction of presenting the first issue of Revista Diálogos da Extensão, which offers the community a collection of 20 reports of experiences lived during the execution of projects in the thematic areas of culture, communication, education, health, work, and technology and production. The projects were funded with resources from the 2012 to 2014 calls and with campus support. The reports were selected through Call 06/2014-PROEX and constitute a space for disseminating important extension actions developed by IFRN. This is a challenge to be pursued, and also a first critical analysis of what we do in the general context of our extension practices, with the aim of increasingly consolidating an extension policy and an internal culture that broaden relations with teaching, research and society. We hope that, in future issues, new reports will be presented, thus strengthening extension as a strategy for sustainable territorial development and citizenship. Enjoy your reading!
APA, Harvard, Vancouver, ISO, and other styles
28

Lugya, Fredrick Kiwuwa. "User-friendly libraries for active teaching and learning." Information and Learning Science 119, no. 5/6 (2018): 275–94. http://dx.doi.org/10.1108/ils-07-2017-0073.

Full text
Abstract:
Purpose The purpose of this paper is to report the training of college librarians, academic and management staff, IT managers and students on how to organise, manage and use a user-friendly library. In Uganda, as in many countries, the problem is that school and/or college libraries are managed by librarians who may have good cataloguing and management skills, but who do not have the pedagogic skills and knowledge of the school curricula that are necessary for librarians to be able to guide and mentor both teachers and students, organise curriculum-related activities or facilitate research. The development of user-friendly libraries contributes to improving education quality by nurturing the interest of students and teachers in literacy activities and the active search for knowledge. Under the stewardship of the Belgium Technical Cooperation and the Ministry of Education in Uganda, library stakeholders were trained on how to put users – rather than themselves – at the centre of the library's operations and were introduced to active teaching and learning methodologies and activities, with emphasis on getting engaged in transforming spaces, services, outreach to users and collections. Several measures, short and long term, were taken to address the gaps limiting the performance of the librarians. Given the disparities in the trainees' education level and work experience, the training was delivered in seven modules divided into three units for over eight months in 2015. By the end of the training, trainees had developed a unique library strategic plan, library policies and procedures, the capacity to use library systems, physical design and maintenance systems, partnerships, a library structure and staff job descriptions. Design/methodology/approach To effectively engage the participants, each topic was conducted using active teaching and learning (ATL) methodologies, including: lecture with slides and hands-on practice – each topic was introduced in lecture form with slides and hands-on exercises. The main goal was to introduce the participants to the concepts discussed, offer opportunities to explore alternative approaches, and define boundaries for discussion through brainstorming. The question-and-answer approach kept the participants alert and prompted them to think critically about the topic discussed – brainstorming sessions allowed thinking beyond the presentation room, drawing on personal experiences to provide alternatives to anticipated challenges. The goal here was for the participants to provide individual choices and approaches for real-life problems; group discussions: case study/scenario and participant presentations – participants were provided with a scenario and asked to provide alternative approaches that could solve the problem based on their personal experience at their colleges. By the end of the group discussion, participants presented a draft of the deliverable for the topic under discussion. Moreover, group discussions were an excellent way to test participants' teamwork skills and ability to compromise, as well as their respect for team decisions. It was an opportunity to see how librarians would work with the library committees. Group discussions further initiated and cemented the much-needed librarian – academic staff – college management relationship.
During the group discussions, librarians, teaching staff, ICT staff and college management staff, specifically the Principals and Deputy Principals, interacted freely, thus starting and cultivating a new era of working relationships between them. Individual presentation: prior to the workshop, participants were sent instructions to prepare a presentation on a topic. For example, participants were asked to provide their views of what a "user-friendly library" would look like or what would constitute a "user-friendly library"; the college library of HTC-Mulago was asked to talk about its experience working with book reserves, the challenges faced and its plans to address those challenges, while the college librarian from NTC-Kaliro was asked to describe a situation where they were able to assist a patron, the limitations they faced and how they addressed them. Doing so not only helped to prepare the participants emotionally for the training but also made them start thinking about the training in relation to their libraries and work. Take-home assignment: at the end of each session, participants were given home assignments to revise the training material and prepare for the next day's training. Further, the take-home assignments provided time for the participants to discuss with their colleagues outside the training room so as to reach a common ground/understanding on some very sensitive issues. The most interesting assignment was when participants were asked to review an article and make a presentation in relation to their library experiences. Participant reports: participant reports resulted from the take-home assignments, and participants were asked to make a submission on a given topic. For example, participants were asked to review the IFLA section on library management and write a two-page report on how such information supported their own work; another participant report came from their own observations after a library visit. Invited talks with library experts: two invited talks were given by library experts from the Consortium of Uganda University Libraries and the Uganda Library and Information Science Association, with the goal of sharing their experience and motivating the participants to strive higher and achieve great things for their libraries. Library visitation: library visits were conducted on three separate days – International Hospital Kampala (IHK) Library, Makerere University Library and Aga Khan University Hospital Library. Each of these library visits provided unique opportunities for the participants to explore best practices and implement similar practices in their libraries. Visual aids – videos, building plans and still photos: these were visual learning aids to supplement text during the lectures, because they carried a lot of information while prompting different thoughts based on the participants' past experience and expertise. The training advocated the use of ATL methodologies, and likewise similar methodologies were used to encourage participants to do the same in their classrooms. Findings Addressing key concerns: several measures, both long and short term, were taken to address the gaps limiting the performance of the librarians.
The measures taken included: selecting a representative sample of participants including all college stakeholders, as discussed above; applying active teaching and learning methodologies in the training and blending them into the content of the training materials; initiating and formulating approaches to collaborations, networks and partnerships; visiting different libraries to benchmark library practices and encourage future job-shadowing opportunities; and encouraging participants to relate freely and to understand and value each other's work in order to change their mindsets. College librarians were encouraged to ensure library priorities remain on the agenda through advocacy campaigns. Short-term measures: The UFL training was designed as a practical and hands-on training blended with individual and group tasks, discussions, take-home assignments and presentations by participants. This allowed participants to engage with the material and take responsibility for their own work. Further, the training material was prepared with the view that librarians support the academic life of teaching staff and students. Participants were tasked to develop and later fine-tune materials designed to support their work, for example, developing a subject bibliography and posting it on the library website designed using open source tools such as Google Sites, wikis and blogs. The developed library manual includes user-friendly policies and procedures, referred to as "dos and don'ts in the library", that promote equitable open access to information; drafting book selection memos; new book arrival lists; subscribing to open access journals; current awareness services and selective dissemination of information services; displays and electronic bulletins. Based on their library needs and semester calendar, participants developed action points and timelines to implement tasks in their libraries at the end of each unit of training. Librarians were encouraged to share their experiences through library websites, a Facebook page, a group e-mail/listserv and Instagram; however, they were hampered by limited internet access. College libraries were rewarded for their extraordinary work. Given their pivotal role in the management and administration of financial and material resources, the participants in this training were, in addition to librarians, college administrators/management, teaching and ICT staff, researchers and student leadership. Participants were selected to address the current and future needs of the college library. These are individuals perceived to have a great impact on furthering the college library agenda. The practical nature of this training warranted conducting the workshops in developed but similar library spaces, for example, Aga Khan University Library, Kampala Capital City Authority (KCCA) Library, Makerere University Library, International Hospital Kampala Library and Uganda Christian University Library. Participants observed orientation sessions, reference desk management and interviews, collection management practices, preservation and conservation, secretarial bureau management, etc. Long-term measures: Changing the mindset of librarians, college administrators and teaching staff is a long-term commitment which continues to demand innovative interventions.
For example: job shadowing allowed college librarians short-term attachments to Makerere University Library, Uganda Christian University Library, Aga Khan Hospital University Library and International Hospital Kampala Library – these libraries were selected because of their comparable practices and size. The mentorship programme lasted between two and three weeks; on-spot supervision and follow-up visits assessed progress with the action plan by the librarians, college administration and college library committee; ensuring that all library documents – library strategic plan, library manual, library organogram, etc. – are approved by the College Governing Council and are part of the college-wide governing documents; and establishing the library committee with a job description for each member – this has strengthened the library, most especially as an advocacy tool, a planning and budgeting mechanism and an awareness channel for library practices, while bringing the library to the fore – reemphasizing the library's agenda. To bridge the widened gap between librarians and the rest of the stakeholders, i.e. teaching staff, ICT staff, college administration and students, a college library committee structure and its mandate were established, comprising: Library Committee Chairperson – a member of the teaching staff; Library Committee Secretary – the College Librarian; Student Representative – must be a member of the student Guild with library work experience; and a representative from each college academic department. A library consortium was formed involving all four project-supported colleges to participate in resource-sharing practices and shared work practices like shared cataloguing, information literacy training, reference interviews and referral services, as well as a platform for sharing experiences. A library consortium further demanded the automation of library functions to facilitate collaboration and shared work. Plans are in place to install the Koha integrated library system, which will cultivate a strong working relationship between librarians and students, academic staff, college administration and IT managers. This was achieved by ensuring that librarians innovatively implement library practices and skills acquired from the workshop as well as show their relevance to the academic life of the academic staff. Cultivating relationships takes a great deal of time, thus college librarians were coached on: creating inclusive library committees, responding to user needs in a timely manner, designing library programmes that address user needs, keeping up with changing technology to suit changing user needs, seeking customer feedback and collecting user statistics to support their requests, strengthening the library's financial base by starting a secretarial bureau and conducting user surveys to understand users' information-seeking behaviour. To improve awareness of new developments in the library world, college librarians were introduced to library networks at national, regional and international levels; as a result, they participated in conferences, workshops and seminars at local, regional and international levels. For example, for the first time and with funding from the Belgium Technical Cooperation, college librarians attended the 81st IFLA World Library and Information Congress in South Africa in 2015.
College libraries are now members of the Consortium of Uganda University Libraries and the Uganda Library and Information Science Association, two very important library organisations in Uganda's LIS profession, and the college librarians have attended meetings and workshops organised by these two organisations. Originality/value At the end of the three-unit training, participants were able to develop: a strategic plan for their libraries; an organogram with staffing needs and job descriptions matching staff functions; a Library Committee for each library, with a structure unifying all four project-supported Colleges; a library action plan with due dates, including deliverables and responsibilities for implementation; a workflow plan and organisation of key sections of the library, such as reserved and public spaces; a furniture and equipment inventory (assets); a library manual and collection development policy; partnerships with KCCA Library and the Consortium of Uganda University Libraries; skills to use the Koha ILMS for performing library functions including cataloguing, circulation, acquisitions, serials management, reporting and statistics; skills in searching library databases and information literacy skills; and skills in designing simple and intuitive websites using Google Sites tools; an improved working relationship between the stakeholders was also visible. To further the user-friendly libraries principle of putting users at the centre of the library's operations and to support ATL methodologies and activities, with emphasis on getting engaged in transforming spaces, services, outreach to users and collections, the following initiatives are currently implemented in the colleges: getting approval of all library policy documents by the College Governing Council, initiating job-shadowing opportunities, conducting on-spot supervision, guiding libraries to set up college library committees and their job descriptions, designing library websites, developing dissemination sessions for all library policies, incorporating user-friendly language in all library documents, initiating income-generation activities for libraries, setting terms of reference for library staff and staffing as per the college organogram, procuring library tools like DDC and the Library of Congress Subject Headings (LCSH), encouraging attendance at webinars and planning space for the new libraries.
APA, Harvard, Vancouver, ISO, and other styles
29

Magoma, Tshepo, Sithembiso Khumalo, and Tanya Du Plessis. "Affordability of IBM Cognos business intelligence tool features suitable for small- and medium-sized enterprises' decision-making." SA Journal of Information Management 23, no. 1 (2021). http://dx.doi.org/10.4102/sajim.v23i1.1291.

Full text
Abstract:
Background: Business intelligence (BI) tools are generally associated with organisations that have the resources to purchase and implement these tools. Evidence abounds regarding the correlation between BI tools and improved business decision-making. This study's unit of analysis is affordability as a feature of IBM Cognos making it suitable for small- and medium-sized enterprises (SMEs). Objective: The research aim was to identify the fundamental features of IBM Cognos which would address the decision-making needs of SMEs. The objective was to determine the significance of BI tool features by identifying affordable features suitable for SMEs' decision-making. Method: A quantitative research design and a deductive approach were best suited for assessing the fundamental features of IBM Cognos for SMEs' decision-making needs. The signification framework variables, such as the presumed, prized and perceived value of BI tool features, were quantified and measured using statistical analysis tools. A non-probability convenience sampling technique was used with a sample size of 200, that is, 80 BI consultants, 60 SME BI developers and 60 SME managers. Results: Affordable key features of BI tools in the context of SMEs' business decision-making include consistency and comfort, an intuitive interface, avoiding impulsivity, cost effectiveness, availability of information, best programmed visualisations, reporting quickly and easily, and financial decision-making. Conclusion: The signification framework's presumed, prized and perceived value indicators link affordable BI tool features to the consistency of the decision-making process and present an alternative view of affordability. An intuitive interface relates to convenience and ease of authoring content, designing, building and securing reports for the SME, which helps in improving consistent decision-making.
APA, Harvard, Vancouver, ISO, and other styles
30

Cornell, Stephen J., Yevhen F. Suprunenko, Dmitri Finkelshtein, Panu Somervuo, and Otso Ovaskainen. "A unified framework for analysis of individual-based models in ecology and beyond." Nature Communications 10, no. 1 (2019). http://dx.doi.org/10.1038/s41467-019-12172-y.

Full text
Abstract:
Individual-based models, 'IBMs', naturally describe the dynamics of interacting organisms or social or financial agents. They are considered too complex for mathematical analysis, but computer simulations of them cannot give the general insights required. Here, we resolve this problem with a general mathematical framework for IBMs containing interactions of an unlimited level of complexity, and derive equations that reliably approximate the effects of space and stochasticity. We provide software, specified in an accessible and intuitive graphical way, so any researcher can obtain analytical and simulation results for any particular IBM without algebraic manipulation. We illustrate the framework with examples from movement ecology, conservation biology, and evolutionary ecology. This framework will provide unprecedented insights into a hitherto intractable panoply of complex models across many scientific fields.
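For readers unfamiliar with the genre, the following is a minimal sketch (ours, not the authors' software; all parameter values are hypothetical) of a spatially explicit stochastic IBM of the kind such frameworks analyse: individuals on a periodic unit square reproduce at a constant rate and die at a rate that grows with local crowding, simulated event-by-event with the Gillespie algorithm in Python.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative rates: birth b, baseline death d, competition strength dd,
# and a Gaussian interaction/dispersal scale sigma (all assumed values).
b, d, dd, sigma = 0.4, 0.2, 0.001, 0.1
L = 1.0  # side of the periodic unit square

def competition(points, i):
    """Crowding felt by individual i: Gaussian-kernel sum over neighbours."""
    diff = np.abs(points - points[i])
    diff = np.minimum(diff, L - diff)          # periodic boundary distance
    r2 = (diff ** 2).sum(axis=1)
    w = np.exp(-r2 / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
    return w.sum() - w[i]                      # exclude self-interaction

points = rng.random((30, 2))                   # initial population positions
t, t_end = 0.0, 10.0
while t < t_end and len(points) > 0:
    n = len(points)
    birth = np.full(n, b)
    death = d + dd * np.array([competition(points, i) for i in range(n)])
    rates = np.concatenate([birth, death])
    total = rates.sum()
    t += rng.exponential(1.0 / total)          # Gillespie waiting time
    k = rng.choice(2 * n, p=rates / total)     # pick the next event
    if k < n:                                  # birth: offspring near parent
        child = (points[k] + rng.normal(0.0, sigma, 2)) % L
        points = np.vstack([points, child])
    else:                                      # death: remove an individual
        points = np.delete(points, k - n, axis=0)

print("final population size:", len(points))

Analytical approximations of exactly such simulations – moment equations correcting mean-field dynamics for space and stochasticity – are what the paper's framework automates.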
APA, Harvard, Vancouver, ISO, and other styles
31

LaSarre, Breah, David T. Kysela, Barry D. Stein, Adrien Ducret, Yves V. Brun, and James B. McKinlay. "Restricted Localization of Photosynthetic Intracytoplasmic Membranes (ICMs) in Multiple Genera of Purple Nonsulfur Bacteria." mBio 9, no. 4 (2018). http://dx.doi.org/10.1128/mbio.00780-18.

Full text
Abstract:
In bacteria and eukaryotes alike, proper cellular physiology relies on robust subcellular organization. For the phototrophic purple nonsulfur bacteria (PNSB), this organization entails the use of a light-harvesting, membrane-bound compartment known as the intracytoplasmic membrane (ICM). Here we show that ICMs are spatially and temporally localized in diverse patterns among PNSB. We visualized ICMs in live cells of 14 PNSB species across nine genera by exploiting the natural autofluorescence of the photosynthetic pigment bacteriochlorophyll (BChl). We then quantitatively characterized ICM localization using automated computational analysis of BChl fluorescence patterns within single cells across the population. We revealed that while many PNSB elaborate ICMs along the entirety of the cell, species across at least two genera restrict ICMs to discrete, nonrandom sites near cell poles in a manner coordinated with cell growth and division. Phylogenetic and phenotypic comparisons established that ICM localization and ICM architecture are not strictly interdependent and that neither trait fully correlates with the evolutionary relatedness of the species. The natural diversity of ICM localization revealed herein has implications both for the evolution of phototrophic organisms and their light-harvesting compartments and for the mechanisms underpinning the spatial organization of bacterial compartments. IMPORTANCE Many bacteria organize their cellular space by constructing subcellular compartments that are arranged in specific, physiologically relevant patterns. The purple nonsulfur bacteria (PNSB) utilize a membrane-bound compartment known as the intracytoplasmic membrane (ICM) to harvest light for photosynthesis. It was previously unknown whether ICM localization within cells is systematic or irregular and whether ICM localization is conserved among PNSB. Here we surveyed ICM localization in diverse PNSB and show that ICMs are spatially organized in species-specific patterns. Most strikingly, several PNSB resolutely restrict ICMs to regions near the cell poles, leaving much of the cell devoid of light-harvesting machinery. Our results demonstrate that bacteria of a common lifestyle utilize unequal portions of their intracellular space to harvest light, despite light harvesting being a process that is intuitively influenced by surface area. Our findings therefore raise fundamental questions about ICM biology and evolution.
APA, Harvard, Vancouver, ISO, and other styles
32

Muhammed, Abiola, Anne Dodd, Suzanne Guerin, Susan Delaney, and Philip Dodd. "Complicated grief knowledge and practice: a qualitative study of general practitioners in Ireland." Irish Journal of Psychological Medicine, January 22, 2021, 1–6. http://dx.doi.org/10.1017/ipm.2020.122.

Full text
Abstract:
Objective: Complicated grief is a debilitating condition that individuals may experience after losing a loved one. General practitioners (GPs) are well positioned to provide patients with support for grief-related issues. Traditionally, Irish GPs play an important role in providing patients with emotional support regarding bereavement. However, GPs have commonly reported not being aptly trained to respond to bereavement-related issues. This study explores GPs’ current knowledge of and practice regarding complicated grief. Methods: A qualitative study adopting a phenomenological approach to explore the experiences of GPs on this issue. Semi-structured interviews were carried out with a purposive sample of nine GPs (five men and four women) in Ireland. Potential participants were contacted via email and phone. Interviews were audio-recorded, transcribed and analysed using Braun & Clarke’s (2006) model of thematic analysis. Results: GPs had limited awareness of the concept of complicated grief and were unfamiliar with relevant research. They also reported that their training was either non-existent or outdated. GPs formed their own knowledge of grief-related issues based on their intuition and experiences. For these reasons, there was not one agreed method of how to respond to grief-related issues reported by patients, though participants recognised the need for intervention, onward referral and review. Conclusions: The research highlighted that GPs felt they required training in complicated grief so that they would be better able to identify and respond to complicated grief.
APA, Harvard, Vancouver, ISO, and other styles
33

Seruffo, Marcos César da Rocha, Bruno Anderson Pereira Ferreira, and Yomara Pinheiro Pires. "Planejamento e construção de um protótipo de aplicativo mobile para visualização de dados de sistema de monitoramento de máquinas e equipamentos." Revista Principia - Divulgação Científica e Tecnológica do IFPB, September 5, 2021. http://dx.doi.org/10.18265/1517-0306a2021id5435.

Full text
Abstract:
The growing demand for digitalised processes driven by technological evolution in Industry 4.0 is of great importance, and there is a need to access information in faster, more intuitive and cheaper ways. In this context, this article proposes the development of a native Android mobile application prototype for an existing web system that supports data visualisation, with the aim of monitoring to support preventive maintenance of machines and equipment at a company in the Brazilian energy sector. To this end, the application was developed following the essential stages of a software project, and Human–Computer Interaction (HCI) recommendations were taken into account, ensuring the usability and navigability of the application. As a result, an application prototype is presented that was validated through semi-structured interviews with specialists, who reported that the proposed objective was achieved, with great potential for use and several opportunities for further study and improvement.
APA, Harvard, Vancouver, ISO, and other styles
34

Aliakbari, Fatemeh, Masoumeh Ghaedamini, Fatemeh Deris, and Reza Masoudi. "Relationship Between Nurses’ Decision-Making Style and Their Disaster Response Competencies." Disaster Medicine and Public Health Preparedness, September 23, 2020, 1–6. http://dx.doi.org/10.1017/dmp.2020.225.

Full text
Abstract:
Nurses are the first responders in critical situations and therefore must be able to effectively manage those situations using their competencies. Given that decision-making style under stressful critical situations is an important component of the care process in these situations, this study was conducted with the aim of determining the relationship between decision-making style and nurses' disaster response competencies. This descriptive, analytical study was conducted in Shahr-e Kord city in 2018. A total of 300 nurses were selected from Ayatollah Kashani and Hajar hospitals by multistage sampling and from the Emergency Medical Services Center by the census method. Data were collected using the Disaster Nursing Competence Assessment and decision-making style questionnaires and analyzed with SPSS 21 (IBM Corp, Armonk, NY). Most of the nurses used the intuitive decision-making style, and the total score of disaster nursing competencies was 162.58 ± 22.70. Pearson's correlation coefficient indicated a positive relationship between decision-making style and nurses' disaster response competencies. The results show that decision-making style affects nurses' competencies for disaster response and provide evidence for the development of educational policies in disaster nursing education.
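For illustration, the correlation analysis reported here is of the kind sketched below in Python (synthetic stand-in data; the study used SPSS and its own questionnaire scores).

import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n = 300                                        # sample size, as in the study
style = rng.normal(50, 10, n)                  # decision-making style scores
competency = 100 + 1.2 * style + rng.normal(0, 15, n)  # competency scores

r, p = pearsonr(style, competency)             # Pearson's correlation test
print(f"r = {r:.2f}, p = {p:.3g}")             # positive, significant r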
APA, Harvard, Vancouver, ISO, and other styles
35

Kaviza, M. "LEVEL OF HISTORICAL THINKING SKILLS PRACTICE AND ACQUASITION: STUDENTS PERSPECTIVE BASED ON FOUR DIMENSION OF LEARNING STYLE." International Journal of Education, Psychology and Counseling, July 7, 2019, 32–45. http://dx.doi.org/10.35631/ijepc.431004.

Full text
Abstract:
The purpose of this study is to examine the level of historical thinking skills practice and acquisition among students across four dimensions of learning style: input (visual and verbal), processing (active and reflective), observation (sensing and intuitive) and understanding (sequential and global). The study uses a descriptive survey design involving a total of 865 Form Two students from 30 secondary schools in a state in northern Peninsular Malaysia, selected using a simple random sampling technique. Two adapted questionnaires and a performance test, verified by a content expert and showing good reliability values and discrimination and difficulty indices, were used in this study. The data were analyzed with descriptive statistics using IBM SPSS version 24. The findings indicate that historical thinking skills are practised at a moderate level and acquired at a low level among the students, and the same pattern holds across the four dimensions of learning style. The study is therefore expected to help history teachers design more effective teaching and learning methods that improve historical thinking skills based on the ways students receive, understand and apply knowledge and skills.
APA, Harvard, Vancouver, ISO, and other styles
36

Siket, Matthew S., and Jay M. Baruch. "Principles of Neurologic Ethics." DeckerMed Medicine, October 25, 2018. http://dx.doi.org/10.2310/im.1287.

Full text
Abstract:
Neuroethics refers to the branch of applied bioethics pertaining to the neurosciences and emerging technologies that impact our ability to understand or enhance a human mind. In the setting of emergency medicine, the clinician will encounter neuroethical dilemmas pertaining to the acutely brain injured or impaired; similar to other ethical decisions encountered in emergency medicine, such neuroethical dilemmas are often complicated by insufficient information regarding the patient’s wishes and preferences and a short time frame in which to obtain this information. This review examines the basis of neuroethics in emergency medicine; neuroethical inquiry; the neuroscience of ethics and intuition; issues regarding autonomy, informed consent, paternalism, and persuasion; shared decision making; situations in which decision-making capacity is in question; beneficence/nonmaleficence; incidental findings and their implications; risk predictions; and issues of justice. The figure shows the use of tissue plasminogen activator (t-PA) for cerebral ischemia within 3 hours of onset and changes in outcome due to treatment. Tables list common ethical theories, virtues/values of an acute care provider, components of informed consent discussion unique to t-PA in acute ischemic stroke, models of the physician-patient relationship, eight ways to promote effective shared decision making, components of capacity assessment, and emergency department assessment of futility. This review contains 1 figure, 9 tables, and 90 references. Keywords: Ethics, autonomy, shared decision-making, moral dilemmas, framing, decision-making capacity, beneficence and nonmaleficence
APA, Harvard, Vancouver, ISO, and other styles
37

Millán Vaquero, Ricardo Manuel, Alexander Vais, Sean Dean Lynch, et al. "Helical Axis Data Visualization and Analysis of the Knee Joint Articulation." Journal of Biomechanical Engineering 138, no. 9 (2016). http://dx.doi.org/10.1115/1.4034005.

Full text
Abstract:
We present processing methods and visualization techniques for accurately characterizing and interpreting kinematical data of flexion–extension motion of the knee joint based on helical axes. We make use of the Lie group of rigid body motions and particularly its Lie algebra for a natural representation of motion sequences. This allows us to analyze and compute the finite helical axis (FHA) and instantaneous helical axis (IHA) in a unified way without redundant degrees of freedom or singularities. A polynomial fitting based on Legendre polynomials within the Lie algebra is applied to provide a smooth description of a given discrete knee motion sequence, which is essential for obtaining stable instantaneous helical axes for further analysis. Moreover, this allows for an efficient overall similarity comparison across several motion sequences in order to differentiate among several cases. Our approach combines a specifically designed patient-specific three-dimensional visualization based on the processed helical axis information, incorporating computed tomography (CT) scans for an intuitive interpretation of the axes and their geometrical relation to the knee joint anatomy. In addition, in the context of the study of diseases affecting the musculoskeletal articulation, we propose to integrate the above tools into a multiscale framework for exploring related data sets distributed across multiple spatial scales. We demonstrate the utility of our methods by processing, as an example, a collection of motion sequences acquired from experimental data involving several surgery techniques. Our approach enables an accurate analysis, visualization and comparison of knee joint articulation, contributing to evaluation and diagnosis in medical applications.
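For orientation, the standard screw-theory relations of rigid-body kinematics (general results, not taken from this paper) recover the instantaneous helical axis from a twist $(\boldsymbol{\omega}, \mathbf{v}) \in \mathfrak{se}(3)$, where a body point $\mathbf{x}$ moves with velocity $\dot{\mathbf{x}} = \boldsymbol{\omega} \times \mathbf{x} + \mathbf{v}$:

$$\hat{\mathbf{n}} = \frac{\boldsymbol{\omega}}{\lVert\boldsymbol{\omega}\rVert}, \qquad \mathbf{p} = \frac{\boldsymbol{\omega} \times \mathbf{v}}{\lVert\boldsymbol{\omega}\rVert^{2}}, \qquad h = \frac{\boldsymbol{\omega} \cdot \mathbf{v}}{\lVert\boldsymbol{\omega}\rVert^{2}},$$

with $\hat{\mathbf{n}}$ the IHA direction, $\mathbf{p}$ the axis point closest to the origin, and $h$ the pitch (translation per radian of rotation). The relations require $\boldsymbol{\omega} \neq \mathbf{0}$, which is why a smooth fit such as the Legendre-polynomial description above is essential for stable axes.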
APA, Harvard, Vancouver, ISO, and other styles
38

Ku, Huan-Yu, Neill Lambert, Feng-Jui Chan, Clive Emary, Yueh-Nan Chen, and Franco Nori. "Experimental test of non-macrorealistic cat states in the cloud." npj Quantum Information 6, no. 1 (2020). http://dx.doi.org/10.1038/s41534-020-00321-x.

Full text
Abstract:
The Leggett–Garg inequality attempts to classify experimental outcomes as arising from one of two possible classes of physical theories: those described by macrorealism (which obey our intuition about how the macroscopic classical world behaves) and those that are not (e.g., quantum theory). The development of cloud-based quantum computing devices enables us to explore the limits of macrorealism. In particular, here we take advantage of the programmable nature of the IBM quantum experience to observe the violation of the Leggett–Garg inequality (in the form of a 'quantum witness') as a function of the number of constituent systems (qubits), while simultaneously maximizing the 'disconnectivity', a potential measure of macroscopicity, between constituents. Our results show that two- and four-qubit 'cat states' (which have large disconnectivity) are seen to violate the inequality, and hence can be classified as non-macrorealistic. In contrast, a six-qubit cat state does not violate the 'quantum witness' beyond a so-called clumsy invasive-measurement bound, and thus is compatible with 'clumsy macrorealism'. As a comparison, we also consider un-entangled product states with n = 2, 3, 4 and 6 qubits, in which the disconnectivity is low.
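As a point of reference, cat states of the kind tested here are conventionally prepared on IBM's cloud devices with a Hadamard followed by a CNOT chain; below is a minimal Qiskit sketch (ours, not the authors' code) that builds n-qubit GHZ states and checks their ideal statistics.

from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def cat_state(n: int) -> QuantumCircuit:
    """Prepare (|0...0> + |1...1>)/sqrt(2): Hadamard, then a CNOT chain."""
    qc = QuantumCircuit(n)
    qc.h(0)
    for i in range(1, n):
        qc.cx(i - 1, i)
    return qc

for n in (2, 4, 6):                      # the sizes studied in the paper
    sv = Statevector.from_instruction(cat_state(n))
    print(n, sv.probabilities_dict())    # ideally only 0...0 and 1...1, each 0.5

On real hardware the measured statistics degrade as n grows, which is the regime where the clumsy invasive-measurement bound discussed above becomes relevant.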
APA, Harvard, Vancouver, ISO, and other styles
39

Moura, Kamilla de Castro Carvalho, Tamires Ribeiro Afonso, Liliane da Costa Jacobs Lames, and Edilei Rodrigues de Lames. "Planejamento de custos: um estudo de caso na prestação de serviços elétricos em Hortolândia/SP." Revista da Micro e Pequena Empresa, January 5, 2021, 61–82. http://dx.doi.org/10.48099/1982-2537/2020v14n2p6182.

Full text
Abstract:
In the business world, entrepreneurs' lack of knowledge about the importance of cost planning and cost analysis for their companies is notorious. The objective of this study is to demonstrate the relevance of cost planning and analysis and their applicability in service companies. The study is exploratory, with a qualitative approach, using participant research, interviews and document analysis. Data were obtained through an interview, examined using content analysis, with the owner of a company providing electrical services in the Metropolitan Region of Campinas. Secondary data were analyzed using the Microsoft Excel spreadsheet editor and IBM SPSS Statistics software. It was found that the company kept no control or records of its expenses and that selling prices were set intuitively. Management was unaware of the actual result of each period, as well as of the management tools necessary for the smooth running of the business. It is concluded that the analyses carried out at this company reinforce the importance of planning, control, record-keeping and knowledge of these management instruments, including in small companies; they provide tools for analyzing results, the break-even point, contribution margin, pricing and expense classification, among others. These elements improve the company's results and make management less amateurish and more strategic, which is essential for the company to remain competitive in the market. The results also serve as a model for micro and small companies wishing to implement cost control and planning. Keywords: Controllership; Cost planning; Service providers.
APA, Harvard, Vancouver, ISO, and other styles
40

Razia, Deepika, Amy Trahan, Luca Giulini, Komeil M. Baboli, and Sumeet K. Mittal. "807 ASSOCIATION OF BOLUS TRANSIT TIME ON BARIUM ESOPHAGOGRAM WITH ESOPHAGEAL PERISTALSIS." Diseases of the Esophagus 34, Supplement_1 (2021). http://dx.doi.org/10.1093/dote/doab052.807.

Full text
Abstract:
Abstract The threshold criteria for diagnosing ineffective esophageal motility (IEM) have changed over the years and are based on the proportion of failed and weak peristalses. Bolus transit time (BTT) on barium esophagogram (BE) can intuitively be regarded as the 'gold standard' for assessing the effectiveness of esophageal peristalsis. The aim of this study was to associate upright and prone BTT with esophageal peristalsis and dysphagia in patients with normal lower esophageal sphincter (LES) parameters. Methods Patients with a normal LES on high-resolution manometry (HRM) who also had a standard-protocol BE from 2017 to 2020 were included. Patients with previous foregut surgery, hiatal hernia, jackhammer esophagus, distal esophageal spasm, fragmented peristalsis, and those with < or > 10 single swallows on HRM were excluded. Based on the number of normal swallows (DCI > 450 mmHg.s.cm), the patients were divided into 11 groups (10 normal to 0 normal). Upright and prone BTT were measured on BE. Fractional polynomial and logistic regression analyses were used to study the association (along with the rate of change) between BTT, dysphagia, and peristalsis. Results In total, 146 patients met the inclusion criteria. Prone BTT increased in tandem with a decrease in the number of normal peristalses (p < 0.001), but no difference was noted in upright BTT (p = 0.317). Two deflection points were noted in the association between peristalsis and prone BTT, at 50%, 40 seconds and 30%, 80 seconds on the y- and x-axes, respectively, after which declining peristaltic function was independent of prone BTT. Patients with prone BTT > 40 seconds had nearly 6-fold higher odds of having zero normal peristalses (p = 0.002). Increasing prone BTT was associated with increasing dysphagia (p < 0.05). Conclusion Prone, but not upright, BTT correlates with the proportion of normal esophageal peristalses and with dysphagia. The phenotype of abnormal swallows (failed, weak) appears to have minimal impact on BTT. The current perspective of manometric classification may need to be adjusted to use the proportion of normal peristalses as a criterion.
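For illustration, the headline odds-ratio estimate is of the kind sketched below with statsmodels in Python (synthetic stand-in data; the study's actual patient records are not reproduced here).

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 146                                        # cohort size, as in the study
btt_over_40 = rng.integers(0, 2, n)            # 1 if prone BTT > 40 s
# Simulate 'zero normal peristalses' with ~6-fold higher odds when BTT > 40 s
logit_p = -2.0 + np.log(6.0) * btt_over_40
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(float)

X = sm.add_constant(btt_over_40.astype(float)) # intercept + binary predictor
res = sm.Logit(y, X).fit(disp=0)               # logistic regression
print("odds ratio:", np.exp(res.params[1]))    # ~6 on average over draws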
APA, Harvard, Vancouver, ISO, and other styles
41

Deck, Andy. "Treadmill Culture." M/C Journal 6, no. 2 (2003). http://dx.doi.org/10.5204/mcj.2157.

Full text
Abstract:
Since the first days of the World Wide Web, artists like myself have been exploring the new possibilities of network interactivity. Some good tools and languages have been developed and made available free for the public to use. This has empowered individuals to participate in the media in ways that are quite remarkable. Nonetheless, the future of independent media is clouded by legal, regulatory, and organisational challenges that need to be addressed. It is not clear to what extent independent content producers will be able to build upon the successes of the 90s – it is yet to be seen whether their efforts will be largely nullified by the anticyclones of a hostile media market. Not so long ago, American news magazines were covering the Browser War. Several real wars later, the terms of surrender are becoming clearer. Now both of the major Internet browsers are owned by huge media corporations, and most of the states (and Reagan-appointed judges) that were demanding the break-up of Microsoft have given up. A curious about-face occurred in U.S. Justice Department policy when John Ashcroft decided to drop the federal case. Maybe Microsoft's value as a partner in covert activity appealed to Ashcroft more than free competition. Regardless, Microsoft is now turning its wrath on new competitors, people who are doing something very, very bad: sharing the products of their own labour. This practice of sharing source code and building free software infrastructure is epitomised by the continuing development of Linux. Everything in the Linux kernel is free, publicly accessible information. As a rule, the people building this "open source" operating system software believe that maintaining transparency is important. But U.S. courts are not doing much to help. In a case brought by the Motion Picture Association of America against Eric Corley, a federal district court blocked the distribution of source code that enables these systems to play DVDs. In addition to censoring Corley's journal, the court ruled that any programmer who writes a program that plays a DVD must comply with a host of license restrictions. In short, an established and popular media format (the DVD) cannot be used under open source operating systems without sacrificing the principle that software source code should remain in the public domain. Should the contents of operating systems be tightly guarded secrets, or subject to public review? If there are capable programmers willing to create good, free operating systems, should the law stand in their way? The question concerning what type of software infrastructure will dominate personal computers in the future is being answered as much by disappointing legal decisions as it is by consumer choice. Rather than ensuring the necessary conditions for innovation and cooperation, the courts permit a monopoly to continue. Rather than endorsing transparency, secrecy prevails. Rather than aiming to preserve a balance between the commercial economy and the gift-economy, sharing is being undermined by the law. Part of the mystery of the Internet for a lot of newcomers must be that it seems to disprove the old adage that you can't get something for nothing. Free games, free music, free pornography, free art. Media corporations are doing their best to change this situation. The FBI and trade groups have blitzed the American news media with alarmist reports about how children don't understand that sharing digital information is a crime. 
Teacher Gail Chmura, the star of one such media campaign, says of her students, "It's always been interesting that they don't see a connection between the two. They just don't get it" (Hopper). Perhaps the confusion arises because the kids do understand that digital duplication lets two people have the same thing. Theft is at best a metaphor for the copying of data, because the original is not stolen in the same sense as a material object. In the effort to liken all copying to theft, legal provisions for the fair use of intellectual property are neglected. Teachers could just as easily emphasise the importance of sharing and the development of an electronic commons that is free for all to use. The values advanced by the trade groups are not beyond question and are not historical constants. According to Donald Krueckeberg, Rutgers University Professor of Urban Planning, native Americans tied the concept of property not to ownership but to use. "One used it, one moved on, and use was shared with others" (qtd. in Batt). Perhaps it is necessary for individuals to have dominion over some private data. But who owns the land, wind, sun, and sky of the Internet – the infrastructure? Given that publicly-funded research and free software have been as important to the development of the Internet as have business and commercial software, it is not surprising that some ambiguity remains about the property status of the dataverse. For many the Internet is as much a medium for expression and the interplay of languages as it is a framework for monetary transaction. In the case involving DVD software mentioned previously, there emerged a grass-roots campaign in opposition to censorship. Dozens of philosophical programmers and computer scientists asserted the expressive and linguistic bases of software by creating variations on the algorithm needed to play DVDs. The forbidden lines of symbols were printed on T-shirts, translated into different computer languages, translated into legal rhetoric, and even embedded into DNA and pictures of MPAA president Jack Valenti (see e.g. Touretzky). These efforts were inspired by a shared conviction that important liberties were at stake. Supporting the MPAA's position would do more than protect movies from piracy. The use of the algorithm was not clearly linked to an intent to pirate movies. Many felt that outlawing the DVD algorithm, which had been experimentally developed by a Norwegian teenager, represented a suppression of gumption and ingenuity. The court's decision rejected established principles of fair use, denied the established legality of reverse engineering software to achieve compatibility, and asserted that journalists and scientists had no right to publish a bit of code if it might be misused. In a similar case in April 2000, a U.S. court of appeals found that First Amendment protections did apply to software (Junger). Noting that source code has both an expressive feature and a functional feature, this court held that First Amendment protection is not reserved only for purely expressive communication. Yet in the DVD case, the court opposed this view and enforced the inflexible demands of the Digital Millennium Copyright Act. Notwithstanding Ted Nelson's characterisation of computers as literary machines, the decision meant that the linguistic and expressive aspects of software would be subordinated to other concerns. A simple series of symbols were thereby cast under a veil of legal secrecy. 
Although they were easy to discover, and capable of being committed to memory or translated to other languages, fair use and other intuitive freedoms were deemed expendable. These sorts of legal obstacles are serious challenges to the continued viability of free software like Linux. The central value proposition of Linux-based operating systems – free, open source code – is threatening to commercial competitors. Some corporations are intent on stifling further development of free alternatives. Patents offer another vulnerability. The writing of free software has become a minefield of potential patent lawsuits. Corporations have repeatedly chosen to pursue patent litigation years after the alleged infringements have been incorporated into widely used free software. For example, although it was designed to avoid patent problems by an array of international experts, the image file format known as JPEG (Joint Photographic Experts Group) has recently been dogged by patent infringement charges. Despite good intentions, low-budget initiatives and ad hoc organisations are ill equipped to fight profiteering patent lawsuits. One wonders whether software innovation is directed more by lawyers or computer scientists. The present copyright and patent regimes may serve the needs of the larger corporations, but it is doubtful that they are the best means of fostering software innovation and quality. Orwell wrote in his Homage to Catalonia, There was a new rule that censored portions of the newspaper must not be left blank but filled up with other matter; as a result it was often impossible to tell when something had been cut out. The development of the Internet has a similar character: new diversions spring up to replace what might have been so that the lost potential is hardly felt. The process of retrofitting Internet software to suit ideological and commercial agendas is already well underway. For example, Microsoft has announced recently that it will discontinue support for the Java language in 2004. The problem with Java, from Microsoft's perspective, is that it provides portable programming tools that work under all operating systems, not just Windows. With Java, programmers can develop software for the large number of Windows users, while simultaneously offering software to users of other operating systems. Java is an important piece of the software infrastructure for Internet content developers. Yet, in the interest of coercing people to use only their operating systems, Microsoft is willing to undermine thousands of existing Java-language projects. Their marketing hype calls this progress. The software industry relies on sales to survive, so if it means laying waste to good products and millions of hours of work in order to sell something new, well, that's business. The consequent infrastructure instability keeps software developers, and other creative people, on a treadmill. From Progressive Load by Andy Deck, artcontext.org/progload As an Internet content producer, one does not appeal directly to the hearts and minds of the public; one appeals through the medium of software and hardware. Since most people are understandably reluctant to modify the software running on their computers, the software installed initially is a critical determinant of what is possible. Unconventional, independent, and artistic uses of the Internet are diminished when the media infrastructure is effectively established by decree. 
Unaccountable corporate control over infrastructure software tilts the playing field against smaller content producers who have neither the advance warning of industrial machinations, nor the employees and resources necessary to keep up with a regime of strategic, cyclical obsolescence. It seems that independent content producers must conform to the distribution technologies and content formats favoured by the entertainment and marketing sectors, or else resign themselves to occupying the margins of media activity. It is no secret that highly diversified media corporations can leverage their assets to favour their own media offerings and confound their competitors. Yet when media giants AOL and Time-Warner announced their plans to merge in 2000, the claim of CEOs Steve Case and Gerald Levin that the merged companies would "operate in the public interest" was hardly challenged by American journalists. Time-Warner has since fought to end all ownership limits in the cable industry; and Case, who formerly championed third-party access to cable broadband markets, changed his tune abruptly after the merger. Now that Case has been ousted, it is unclear whether he still favours oligopoly. According to Levin, global media will be and is fast becoming the predominant business of the 21st century ... more important than government. It's more important than educational institutions and non-profits. We're going to need to have these corporations redefined as instruments of public service, and that may be a more efficient way to deal with society's problems than bureaucratic governments. Corporate dominance is going to be forced anyhow because when you have a system that is instantly available everywhere in the world immediately, then the old-fashioned regulatory system has to give way (Levin). It doesn't require a lot of insight to understand that this "redefinition," this sleight of hand, does not protect the public from abuses of power: the dissolution of the "old-fashioned regulatory system" does not serve the public interest. From Lexicon by Andy Deck, artcontext.org/lexicon As an artist who has adopted telecommunications networks and software as his medium, it disappoints me that a mercenary vision of electronic media's future seems to be the prevailing blueprint. The giantism of media corporations, and the ongoing deregulation of media consolidation (Ahrens), underscore the critical need for independent media sources. If it were just a matter of which cola to drink, it would not be of much concern, but media corporations control content. In this hyper-mediated age, content – whether produced by artists or journalists – crucially affects what people think about and how they understand the world. Content is not impervious to the software, protocols, and chicanery that surround its delivery. It is about time that people interested in independent voices stop believing that laissez faire capitalism is building a better media infrastructure. The German writer Hans Magnus Enzensberger reminds us that the media tyrannies that affect us are social products. The media industry relies on thousands of people to make the compromises necessary to maintain its course. The rapid development of the mind industry, its rise to a key position in modern society, has profoundly changed the role of the intellectual. He finds himself confronted with new threats and new opportunities.
Whether he knows it or not, whether he likes it or not, he has become the accomplice of a huge industrial complex which depends for its survival on him, as he depends on it for his own. He must try, at any cost, to use it for his own purposes, which are incompatible with the purposes of the mind machine. What it upholds he must subvert. He may play it crooked or straight, he may win or lose the game; but he would do well to remember that there is more at stake than his own fortune (Enzensberger 18). Some cultural leaders have recognised the important role that free software already plays in the infrastructure of the Internet. Among intellectuals there is undoubtedly a genuine concern about the emerging contours of corporate, global media. But more effective solidarity is needed. Interest in open source has tended to remain superficial, leading to trendy, cosmetic, and symbolic uses of terms like "open source" rather than to a deeper commitment to an open, public information infrastructure. Too much attention is focussed on what's "cool" and not enough on the road ahead. Various media specialists – designers, programmers, artists, and technical directors – make important decisions that affect the continuing development of electronic media. Many developers have failed to recognise (or care) that their decisions regarding media formats can have long reaching consequences. Web sites that use media formats which are unworkable for open source operating systems should be actively discouraged. Comparable technologies are usually available to solve compatibility problems. Going with the market flow is not really giving people what they want: it often opposes the work of thousands of activists who are trying to develop open source alternatives (see e.g. Greene). Average Internet users can contribute to a more innovative, free, open, and independent media – and being conscientious is not always difficult or unpleasant. One project worthy of support is the Internet browser Mozilla. Currently, many content developers create their Websites so that they will look good only in Microsoft's Internet Explorer. While somewhat understandable given the market dominance of Internet Explorer, this disregard for interoperability undercuts attempts to popularise standards-compliant alternatives. Mozilla, written by a loose-knit group of activists and programmers (some of whom are paid by AOL/Time-Warner), can be used as an alternative to Microsoft's browser. If more people use Mozilla, it will be harder for content providers to ignore the way their Web pages appear in standards-compliant browsers. The Mozilla browser, which is an open source initiative, can be downloaded from http://www.mozilla.org/. While there are many people working to create real and lasting alternatives to the monopolistic and technocratic dynamics that are emerging, it takes a great deal of cooperation to resist the media titans, the FCC, and the courts. Oddly enough, corporate interests sometimes overlap with those of the public. Some industrial players, such as IBM, now support open source software. For them it is mostly a business decision. Frustrated by the coercive control of Microsoft, they support efforts to develop another operating system platform. For others, including this writer, the open source movement is interesting for the potential it holds to foster a more heterogeneous and less authoritarian communications infrastructure. 
Many people can find common cause in this resistance to globalised uniformity and consolidated media ownership. The biggest challenge may be to get people to believe that their choices really matter, that by endorsing certain products and operating systems and not others, they can actually make a difference. But it's unlikely that this idea will flourish if artists and intellectuals don't view their own actions as consequential. There is a troubling tendency for people to see themselves as powerless in the face of the market. This paralysing habit of mind must be abandoned before the media will be free. Works Cited Ahrens, Frank. "Policy Watch." Washington Post (23 June 2002): H03. 30 March 2003 <http://www.washingtonpost.com/ac2/wp-dyn/A27015-2002Jun22?language=printer>. Batt, William. "How Our Towns Got That Way." 7 Oct. 1996. 31 March 2003 <http://www.esb.utexas.edu/drnrm/WhatIs/LandValue.htm>. Chester, Jeff. "Gerald Levin's Negative Legacy." Alternet.org 6 Dec. 2001. 5 March 2003 <http://www.democraticmedia.org/resources/editorials/levin.php>. Enzensberger, Hans Magnus. "The Industrialisation of the Mind." Raids and Reconstructions. London: Pluto Press, 1975. 18. Greene, Thomas C. "MS to Eradicate GPL, Hence Linux." 25 June 2002. 5 March 2003 <http://www.theregus.com/content/4/25378.php>. Hopper, D. Ian. "FBI Pushes for Cyber Ethics Education." Associated Press 10 Oct. 2000. 29 March 2003 <http://www.billingsgazette.com/computing/20001010_cethics.php>. Junger v. Daley. U.S. Court of Appeals for 6th Circuit. 00a0117p.06. 2000. 31 March 2003 <http://pacer.ca6.uscourts.gov/cgi-bin/getopn.pl?OPINION=00a0117p.06>. Levin, Gerald. "Millennium 2000 Special." CNN 2 Jan. 2000. Touretzky, D. S. "Gallery of CSS Descramblers." 2000. 29 March 2003 <http://www.cs.cmu.edu/~dst/DeCSS/Gallery>. Links http://artcontext.org/lexicon/ http://artcontext.org/progload http://pacer.ca6.uscourts.gov/cgi-bin/getopn.pl?OPINION=00a0117p.06 http://www.billingsgazette.com/computing/20001010_cethics.html http://www.cs.cmu.edu/~dst/DeCSS/Gallery http://www.democraticmedia.org/resources/editorials/levin.html http://www.esb.utexas.edu/drnrm/WhatIs/LandValue.htm http://www.mozilla.org/ http://www.theregus.com/content/4/25378.html http://www.washingtonpost.com/ac2/wp-dyn/A27015-2002Jun22?language=printer
APA, Harvard, Vancouver, ISO, and other styles
42

Cesarini, Paul. "‘Opening’ the Xbox." M/C Journal 7, no. 3 (2004). http://dx.doi.org/10.5204/mcj.2371.

Full text
Abstract:
“As the old technologies become automatic and invisible, we find ourselves more concerned with fighting or embracing what’s new”
—Dennis Baron, From Pencils to Pixels: The Stages of Literacy Technologies

What constitutes a computer, as we have come to expect it? Are they necessarily monolithic “beige boxes”, connected to computer monitors, sitting on computer desks, located in computer rooms or computer labs? In order for a device to be considered a true computer, does it need to have a keyboard and mouse? If this were 1991 or earlier, our collective perception of what computers are and are not would largely be framed by this “beige box” model: computers are stationary, slab-like, and heavy, and their natural habitats must be in rooms specifically designated for that purpose. In 1992, when Apple introduced the first PowerBook, our perception began to change. Certainly there had been other portable computers prior to that, such as the Osborne 1, but these were more luggable than portable, weighing just slightly less than a typical sewing machine. The PowerBook and subsequent waves of laptops, personal digital assistants (PDAs), and so-called smart phones from numerous other companies have steadily forced us to rethink and redefine what a computer is and is not, how we interact with them, and the manner in which these tools might be used in the classroom.

However, this reconceptualization of computers is far from over, and is in fact steadily evolving as new devices are introduced, adopted, and subsequently adapted for uses beyond their original purpose. Pat Crowe’s Book Reader project, for example, has morphed Nintendo’s GameBoy and GameBoy Advance into a viable electronic book platform, complete with images, sound, and multi-language support. (Crowe, 2003) His goal was to take this existing technology, previously framed only within the context of proprietary adolescent entertainment, and repurpose it for open, flexible uses typically associated with learning and literacy. Similar efforts are underway to repurpose Microsoft’s Xbox, perhaps the ultimate symbol of “closed” technology given Microsoft’s propensity for proprietary code, in order to make it a viable platform for Open Source Software (OSS). However, these efforts are not foregone conclusions, and are in fact typical of the ongoing battle over who controls the technology we own in our homes, and how open source solutions are often at odds with a largely proprietary world.

In late 2001, Microsoft launched the Xbox with a multimillion dollar publicity drive featuring events, commercials, live models, and statements claiming this new console gaming platform would “change video games the way MTV changed music”. (Chan, 2001) The Xbox launched with the following technical specifications:
- 733 MHz Pentium III
- 64 MB RAM
- 8 or 10 GB internal hard disk drive
- CD/DVD-ROM drive (speed unknown)
- Nvidia graphics processor, with HDTV support
- 4 USB 1.1 ports (adapter required), AC3 audio
- 10/100 Ethernet port, optional 56k modem
(TechTV, 2001)

While current computers dwarf these specifications in virtually all areas now, in 2001 they were roughly on par with many desktop systems. The retail price at the time was $299, but it steadily dropped to nearly half that, with additional price cuts anticipated. Based on these features, the preponderance of “off the shelf” parts and components used, and the relatively reasonable price, numerous programmers quickly became interested in seeing if it was possible to run Linux and additional OSS on the Xbox.
In each case, the goal has been similar: to exceed the original purpose of the Xbox and determine if and how well it might be used for basic computing tasks. If these attempts prove successful, the Xbox could allow institutions to dramatically increase the student-to-computer ratio in select environments, or allow individuals who could not otherwise afford a computer to instead buy an Xbox, download and install Linux, and use this new device to write, create, and innovate. This drive to literally and metaphorically “open” the Xbox comes from many directions. Such efforts include Andrew Huang’s self-published “Hacking the Xbox” book, in which, under the auspices of reverse engineering, Huang analyzes the architecture of the Xbox, detailing step-by-step instructions for flashing the ROM, upgrading the hard drive and/or RAM, and generally prepping the device for use as an information appliance. Additional initiatives include Lindows CEO Michael Robertson’s $200,000 prize to encourage Linux development on the Xbox, and the Xbox Linux Project at SourceForge.

What is Linux?
Linux is an alternative operating system initially developed in 1991 by Linus Benedict Torvalds. Linux was based on a derivative of the MINIX operating system, which in turn was a derivative of UNIX. (Hasan 2003) Linux is currently available for Intel-based systems that would normally run versions of Windows, PowerPC-based systems that would normally run Apple’s Mac OS, and a host of other handheld, cell phone, or so-called “embedded” systems. Linux distributions are based almost exclusively on open source software, graphical user interfaces, and middleware components. While there are commercial Linux distributions available, these mainly just package the freely available operating system with bundled technical support, manuals, some exclusive or proprietary commercial applications, and related services. Anyone can still download and install numerous Linux distributions at no cost, provided they do not need technical support beyond the community / enthusiast level. Typical Linux distributions come with open source web browsers, word processors and related productivity applications (such as those found in OpenOffice.org), and related tools for accessing email, organizing schedules and contacts, etc. Certain Linux distributions are more or less designed for network administrators, system engineers, and similar “power users” whose needs are somewhat distanced from those of our students. However, several distributions, including Lycoris, Mandrake, LindowsOS, and others, are specifically tailored as regular, desktop operating systems, with regular, everyday computer users in mind. As Linux has no draconian “product activation key” method of authentication, and no digital rights management-laden features associated with installation and implementation on typical desktop and laptop systems, it is becoming an ideal choice both individually and institutionally. It still faces an uphill battle in terms of achieving widespread acceptance as a desktop operating system, however. As Finnie points out in Desktop Linux Edges Into The Mainstream: “to attract users, you need ease of installation, ease of device configuration, and intuitive, full-featured desktop user controls. It’s all coming, but slowly. With each new version, desktop Linux comes closer to entering the mainstream.
It’s anyone’s guess as to when critical mass will be reached, but you can feel the inevitability: There’s pent-up demand for something different.” (Finnie 2003)

Linux is already spreading rapidly in numerous capacities, in numerous countries. Linux has “taken hold wherever computer users desire freedom, and wherever there is demand for inexpensive software.” Reports from technology research company IDG indicate that roughly a third of computers in Central and South America run Linux. Several countries, including Mexico, Brazil, and Argentina, have all but mandated that state-owned institutions adopt open source software whenever possible to “give their people the tools and education to compete with the rest of the world.” (Hills 2001)

The Goal
Less than a year after Microsoft introduced the Xbox, the Xbox Linux Project was formed. The Xbox Linux Project has the goal of developing and distributing Linux for the Xbox gaming console, “so that it can be used for many tasks that Microsoft don’t want you to be able to do. ...as a desktop computer, for email and browsing the web from your TV, as a (web) server” (Xbox Linux Project 2002). Since the Linux operating system is open source, meaning it can freely be tinkered with and distributed, those who opt to download and install Linux on their Xbox can do so with relatively little overhead in terms of cost or time. Additionally, Linux itself looks very “windows-like”, making for a fairly low learning curve. To help increase overall awareness of this project and assist in diffusing it, the Xbox Linux Project offers step-by-step installation instructions, with the end result being a system capable of using common peripherals such as a keyboard and mouse, scanner, printer, a “webcam and a DVD burner, connected to a VGA monitor; 100% compatible with a standard Linux PC, all PC (USB) hardware and PC software that works with Linux.” (Xbox Linux Project 2002)

Such a system could have tremendous potential for technology literacy. Pairing an Xbox with Linux and OpenOffice.org, for example, would provide our students essentially the same capability any of them would expect from a regular desktop computer. They could send and receive email, communicate using instant messaging, IRC, or newsgroup clients, and browse Internet sites just as they normally would. In fact, the overall browsing experience for Linux users is substantially better than that for most Windows users. Internet Explorer, the default browser on all systems running Windows-based operating systems, lacks basic features standard in virtually all competing browsers. Native blocking of “pop-up” advertisements is still not possible in Internet Explorer without the aid of a third-party utility. Tabbed browsing, which involves the ability to easily open and sort through multiple Web pages in the same window, often with a single mouse click, is also missing from Internet Explorer. The same can be said for a robust download manager, “find as you type”, and a variety of additional features. Mozilla, Netscape, Firefox, Konqueror, and essentially all other OSS browsers for Linux have these features. Of course, most of these browsers are also available for Windows, but Internet Explorer is still considered the standard browser for the platform.

If the Xbox Linux Project becomes widely diffused, our students could edit and save Microsoft Word files in OpenOffice.org’s Writer program, and do the same with PowerPoint and Excel files in similar OpenOffice.org components. They could access instructor comments originally created in Microsoft Word documents, and in turn could add their own comments and send the documents back to their instructors. They could even perform many functions not yet possible in Microsoft Office, including saving files in PDF or Flash format without needing Adobe’s Acrobat product or Macromedia’s Flash Studio MX. Additionally, by way of this project, the Xbox can also serve as “a Linux server for HTTP/FTP/SMB/NFS, serving data such as MP3/MPEG4/DivX, or a router, or both; without a monitor or keyboard or mouse connected.” (Xbox Linux Project 2003) In a very real sense, our students could use these inexpensive systems, previously framed only within the context of entertainment, for educational purposes typically associated with computer-mediated learning (a brief sketch of such headless file serving follows below).
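To make the “Linux server” scenario concrete, here is a minimal sketch of headless media serving over HTTP. It is not taken from the Xbox Linux Project’s own tooling: it assumes a stock, modern Python 3 environment rather than anything Xbox-specific, and the media directory and port are hypothetical placeholders.

# Minimal sketch: serve a directory of media files over HTTP.
# Not from the Xbox Linux Project's documentation; assumes stock Python 3.7+.
# MEDIA_DIR and PORT are hypothetical placeholders.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

MEDIA_DIR = "/srv/media"  # hypothetical directory of MP3/MPEG4/DivX files
PORT = 8080

if __name__ == "__main__":
    # Restrict the handler to MEDIA_DIR and bind to all interfaces, so other
    # machines on the LAN can fetch files from a box with no monitor attached.
    handler = partial(SimpleHTTPRequestHandler, directory=MEDIA_DIR)
    with ThreadingHTTPServer(("0.0.0.0", PORT), handler) as server:
        server.serve_forever()

Once running, any browser on the local network could fetch files from port 8080 on the box’s address; heavier-duty setups would use Apache, Samba, or an NFS export instead, as the project’s own list of protocols suggests.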
Problems: Control and Access
The existing rhetoric of technological control surrounding current and emerging technologies appears to be stifling many of these efforts before they can even be brought to the public. This rhetoric of control is largely typified by overly restrictive digital rights management (DRM) schemes antithetical to education, and by the Digital Millennium Copyright Act (DMCA). Combined, both are currently being used as technical and legal clubs against these efforts. Microsoft, for example, has taken a dim view of any efforts to adapt the Xbox to Linux. Microsoft CEO Steve Ballmer, who has repeatedly referred to Linux as a cancer and has equated OSS with being un-American, stated, “Given the way the economic model works - and that is a subsidy followed, essentially, by fees for every piece of software sold - our license framework has to do that.” (Becker 2003) Since the Xbox is based on a subsidy model, meaning that Microsoft actually sells the hardware at a loss and instead generates revenue off software sales, Ballmer launched a series of concerted legal attacks against the Xbox Linux Project and similar efforts. In 2002, Nintendo, Sony, and Microsoft simultaneously sued Lik Sang, Inc., a Hong Kong-based company that produces programmable cartridges and “mod chips” for the PlayStation II, Xbox, and Game Cube. Nintendo states that its company alone loses over $650 million each year due to piracy of their console gaming titles, which typically originate in China, Paraguay, and Mexico. (GameIndustry.biz) Currently, many attempts to “mod” the Xbox require the use of such chips. As Lik Sang is one of the only suppliers, initial efforts to adapt the Xbox to Linux slowed considerably. Despite the fact that such chips can still be ordered and shipped here by less conventional means, it does not change the fact that the chips themselves would be illegal in the U.S. due to the anticircumvention clause in the DMCA itself, which is designed specifically to protect any DRM-wrapped content, regardless of context. The Xbox Linux Project then attempted to get Microsoft to officially sanction their efforts. They were not only rebuffed, but Microsoft then opted to hire programmers specifically to create technological countermeasures for the Xbox, to defeat additional attempts at installing OSS on it. Undeterred, the Xbox Linux Project eventually arrived at a method of installing and booting Linux without the use of mod chips, and has taken a more defiant tone with Microsoft regarding its circumvention efforts. (Lettice 2002) The project states that “Microsoft does not want you to use the Xbox as a Linux computer, therefore it has some anti-Linux-protection built in, but it can be circumvented easily, so that an Xbox can be used as what it is: an IBM PC.” (Xbox Linux Project 2003)
Problems: Learning Curves and Usability
In spite of the difficulties imposed by the combined technological and legal attacks on this project, it has succeeded at infiltrating this closed system with OSS. It has done so beyond the mere prototype level, too, as evidenced by the Xbox Linux Project now having both complete, step-by-step instructions available for users to modify their own Xbox systems, and an alternate plan catering to those who have the interest in modifying their systems, but not the time or technical inclination. Specifically, this option involves users mailing their Xbox systems to community volunteers within the Xbox Linux Project, and basically having these volunteers perform the necessary software preparation or actually do the full Linux installation for them, free of charge (presumably not including shipping). This particular aspect of the project, dubbed “Users Help Users”, appears to be fairly new. Yet it already lists over sixty volunteers capable and willing to perform this service, since “Many users don’t have the possibility, expertise or hardware” to perform these modifications. Amazingly enough, in some cases these volunteers are barely out of junior high school. One such volunteer stipulates that those seeking his assistance keep in mind that he is “just 14” and that when performing these modifications he “...will not always be finished by the next day”. (Steil 2003)

In addition to this interesting, if somewhat unusual, level of community-driven support, there are currently several Linux-based options available for the Xbox. The two that are perhaps the most developed are GentooX, which is based on the popular Gentoo Linux distribution, and Ed’s Debian, based on the Debian GNU/Linux distribution. Both Gentoo and Debian are “seasoned” distributions that have been available for some time now, though Daniel Robbins, Chief Architect of Gentoo, refers to the product as actually being a “metadistribution” of Linux, due to its high degree of adaptability and configurability. (Gentoo 2004) Specifically, Robbins asserts that Gentoo is capable of being “customized for just about any application or need. ...an ideal secure server, development workstation, professional desktop, gaming system, embedded solution or something else—whatever you need it to be.” (Robbins 2004) He further states that the whole point of Gentoo is to provide a better, more usable Linux experience than that found in many other distributions. Robbins states that: “The goal of Gentoo is to design tools and systems that allow a user to do their work as pleasantly and efficiently as possible, as they see fit. Our tools should be a joy to use, and should help the user to appreciate the richness of the Linux and free software community, and the flexibility of free software. ...Put another way, the Gentoo philosophy is to create better tools. When a tool is doing its job perfectly, you might not even be very aware of its presence, because it does not interfere and make its presence known, nor does it force you to interact with it when you don’t want it to.
The tool serves the user rather than the user serving the tool.” (Robbins 2004)

There is also a so-called “live CD” Linux distribution suitable for the Xbox, called dyne:bolic, as well as an in-progress release of Slackware Linux. According to the Xbox Linux Project, the only difference between the standard releases of these distributions and their Xbox counterparts is that “...the install process – and naturally the bootloader, the kernel and the kernel modules – are all customized for the Xbox.” (Xbox Linux Project, 2003)

Of course, even if Gentoo is as user-friendly as Robbins purports, even if the Linux kernel itself has become significantly more robust and efficient, and even if Microsoft again drops the retail price of the Xbox, is this really a feasible solution in the classroom? Does the Xbox Linux Project have an army of 14-year-olds willing to modify dozens, perhaps hundreds of these systems for use in secondary schools and higher education? Of course not. If such an institutional rollout were to be undertaken, it would require significant support not only from faculty, but from Department Chairs, Deans, IT staff, and quite possibly Chief Information Officers. Disk images would need to be customized for each institution to reflect their respective needs, ranging from setting specific home pages on web browsers, to bookmarks, to custom back-up and/or disk re-imaging scripts, to network authentication (a small sketch of one such customization follows at the end of this section). This would be no small task. Yet the steps mentioned above are essentially no different than what would be required of any IT staff when creating a new disk image for a computer lab, be it one for a Windows-based system or a Mac OS X-based one. The primary difference would be Linux itself—nothing more, nothing less. The institutional difficulties in undertaking such an effort would likely be encountered prior to even purchasing a single Xbox, in that they would involve the same difficulties associated with any new hardware or software initiative: staffing, budget, and support. If the institution in question is either unwilling or unable to address these three factors, it would not matter if the Xbox itself were as free as Linux.

An Open Future, or a Closed One?
It is unclear how far the Xbox Linux Project will be allowed to go in its efforts to invade an essentially proprietary system with OSS. Unlike Sony, which has made deliberate steps to commercialize similar efforts for its PlayStation 2 console, Microsoft appears resolute in fighting OSS on the Xbox by any means necessary. It will continue to crack down on any companies selling so-called mod chips, and will continue to employ technological protections to keep the Xbox “closed”. Despite clear evidence to the contrary, in all likelihood Microsoft will continue to equate any OSS efforts directed at the Xbox with piracy-related motivations. Additionally, Microsoft’s successor to the Xbox would likely incorporate additional anticircumvention technologies that could set the Xbox Linux Project back by months or years, or could stop it cold. Of course, it is difficult to say with any degree of certainty how this “Xbox 2” (perhaps a more appropriate name might be “Nextbox”) will impact this project. Regardless of how this device evolves, there can be little doubt of the value of Linux, OpenOffice.org, and other OSS to teaching and learning with technology. This value exists not only in terms of price, but in increased freedom from policies and technologies of control.
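As a concrete illustration of the disk-image customization mentioned above, the following sketch stamps an institutional home page into a Mozilla profile before a lab image is cloned. It is hypothetical rather than drawn from any institution’s actual tooling: the profile path and URL are placeholders, and only Mozilla’s documented user_pref() mechanism in prefs.js is assumed.

# Hypothetical sketch: set an institutional home page in a Mozilla profile
# as one step of preparing a lab disk image before cloning.
# PROFILE_DIR and HOMEPAGE are placeholders, not real institutional values.
import os

PROFILE_DIR = "/home/student/.mozilla/default"  # hypothetical profile location
HOMEPAGE = "http://www.example.edu/library"     # hypothetical institutional page

def set_homepage(profile_dir, url):
    # Mozilla reads user_pref() lines from prefs.js at startup, so appending
    # one here overrides the default start page on every cloned machine.
    prefs_path = os.path.join(profile_dir, "prefs.js")
    with open(prefs_path, "a") as prefs:
        prefs.write('user_pref("browser.startup.homepage", "%s");\n' % url)

if __name__ == "__main__":
    set_homepage(PROFILE_DIR, HOMEPAGE)

Bookmarks, proxy settings, and similar preferences could be stamped in the same pass, before the image is captured and rolled out to the lab.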
New Linux distributions from Gentoo, Mandrake, Lycoris, Lindows, and other companies are just now starting to focus their efforts on Linux as a user-friendly, easy-to-use desktop operating system, rather than just a server or “techno-geek” environment suitable for advanced programmers and computer operators. While metaphorically opening the Xbox may not be for everyone, and may not be a suitable computing solution for all, I believe we as educators must promote and encourage such efforts whenever possible. I suggest this because I believe we need to exercise our professional influence and ultimately shape the future of technology literacy, both individually as faculty and collectively as departments, colleges, or institutions. Moran and Fitzsimmons-Hunter argue this very point in Writing Teachers, Schools, Access, and Change. One of the fundamental provisions they use to define “access” asserts that there must be a willingness for teachers and students to “fight for the technologies that they need to pursue their goals for their own teaching and learning.” (Taylor / Ward 160) Regardless of whether or not this debate is grounded in the “beige boxes” of the past or the Xboxes of the present, much is at stake. Private corporations should not be in a position to control the manner in which we use legally-purchased technologies, regardless of whether or not these technologies are then repurposed for literacy uses. I believe the exigency associated with this control, and the ongoing evolution of what is and is not a computer, dictates that we insert ourselves more actively into this discussion. We must take steps to provide our students with the best possible computer-mediated learning experience, however seemingly unorthodox the technological means might be, so that they may think critically, communicate effectively, and participate actively in society and in their future careers.

About the Author
Paul Cesarini is an Assistant Professor in the Department of Visual Communication & Technology Education, Bowling Green State University, Ohio. Email: pcesari@bgnet.bgsu.edu

Works Cited
Baron, Dennis. “From Pencils to Pixels: The Stages of Literacy Technologies.” Passions, Pedagogies and 21st Century Technologies. Eds. Gail E. Hawisher and Cynthia L. Selfe. Utah: Utah State University Press, 1999. 15–33.
Becker, David. “Ballmer: Mod Chips Threaten Xbox.” News.com 21 Oct. 2002. <http://news.com.com/2100-1040-962797.php>.
Finnie, Scott. “Desktop Linux Edges Into The Mainstream.” TechWeb 8 Apr. 2003. <http://www.techweb.com/tech/software/20030408_software>.
http://xbox-linux.sourceforge.net/docs/debian.php
http://news.com.com/2100-1040-978957.html?tag=nl
http://archive.infoworld.com/articles/hn/xml/02/08/13/020813hnchina.xml
http://www.neoseeker.com/news/story/1062/
http://www.bookreader.co.uk
http://www.theregister.co.uk/content/archive/29439.html
http://gentoox.shallax.com/
http://ragib.hypermart.net/linux/
http://www.itworld.com/Comp/2362/LWD010424latinlinux/pfindex.html
http://www.xbox-linux.sourceforge.net
http://www.theregister.co.uk/content/archive/27487.html
http://www.theregister.co.uk/content/archive/26078.html
http://www.us.playstation.com/peripherals.aspx?id=SCPH-97047
http://www.techtv.com/extendedplay/reviews/story/0,24330,3356862,00.html
http://www.wired.com/news/business/0,1367,61984,00.html
http://www.gentoo.org/main/en/about.xml
http://www.gentoo.org/main/en/philosophy.xml
http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2869075,00.html
http://xbox-linux.sourceforge.net/docs/usershelpusers.html
http://www.cnn.com/2002/TECH/fun.games/12/16/gamers.liksang/

Citation reference for this article
MLA Style
Cesarini, Paul. “‘Opening’ the Xbox.” M/C: A Journal of Media and Culture <http://www.media-culture.org.au/0406/08_Cesarini.php>.
APA Style
Cesarini, P. (2004, Jul 1). ‘Opening’ the Xbox. M/C: A Journal of Media and Culture, 7, <http://www.media-culture.org.au/0406/08_Cesarini.php>.
APA, Harvard, Vancouver, ISO, and other styles