Academic literature on the topic 'Educational systems evaluation'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Educational systems evaluation.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Educational systems evaluation"
Kurubacak, Gulsun. "Evaluation of Educational Management Systems." i-manager's Journal of Educational Technology 2, no. 4 (March 15, 2006): 19–27. http://dx.doi.org/10.26634/jet.2.4.750.
García-Peñalvo, Francisco José, Lourdes Moreno López, and Mª Cruz Sánchez-Gómez. "Empirical evaluation of educational interactive systems." Quality & Quantity 52, no. 6 (August 13, 2018): 2427–34. http://dx.doi.org/10.1007/s11135-018-0808-4.
Parchami, Abbas, and Mashaallah Mashinchi. "The evaluation of educational systems: an application study." Mathematical Sciences 6, no. 1 (2012): 61. http://dx.doi.org/10.1186/2251-7456-6-61.
Al-Husseini, Khansaa Azeez Obayes, Ali Hamzah Obaid, and Ola Najah Kadhim. "Evaluating the Effectiveness of E-learning: Based on Academic Staff Evaluation." Webology 19, no. 1 (January 20, 2022): 367–79. http://dx.doi.org/10.14704/web/v19i1/web19027.
Stetskyi, Vasyl. "Local educational system." Visnyk of the Lviv University. Series Geography, no. 47 (November 27, 2014): 265–72. http://dx.doi.org/10.30970/vgg.2014.47.971.
Zhang, Shuai. "Review of automated writing evaluation systems." Journal of China Computer-Assisted Language Learning 1, no. 1 (August 1, 2021): 170–76. http://dx.doi.org/10.1515/jccall-2021-2007.
Lawton, Stephen B., Ethel Auster, and David To. "A Systems Evaluation of the Educational Information System for Ontario." Journal of the American Society for Information Science 30, no. 1 (September 6, 2007): 33–40. http://dx.doi.org/10.1002/asi.4630300107.
Harvey, Mark T., Michael E. May, and Craig H. Kennedy. "Nonconcurrent Multiple Baseline Designs and the Evaluation of Educational Systems." Journal of Behavioral Education 13, no. 4 (December 2004): 267–76. http://dx.doi.org/10.1023/b:jobe.0000044735.51022.5d.
Giang, Christian, Alberto Piatti, and Francesco Mondada. "Heuristics for the Development and Evaluation of Educational Robotics Systems." IEEE Transactions on Education 62, no. 4 (November 2019): 278–87. http://dx.doi.org/10.1109/te.2019.2912351.
Kutsenko, O. I., V. D. Yakovenko, and Ye O. Yakovenko. "EVALUATION OF QUALITY MANAGEMENT SYSTEMS." Scientific Notes of Junior Academy of Sciences of Ukraine, no. 3(19) (2020): 59–70. http://dx.doi.org/10.51707/2618-0529-2020-19-07.
Dissertations / Theses on the topic "Educational systems evaluation"
Mosley, Dracaena. "A Mixed Methods Evaluation of New Teacher Support Systems at an Urban Elementary." ScholarWorks, 2014. https://scholarworks.waldenu.edu/dissertations/154.
Neigel, Scott. "Assessing the Meaning and Value of Traditional Grading Systems: Teacher Practices and Perspectives." Thesis, University of Southern California, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10683026.
This study employed a mixed methods approach to evaluate the meaning and value of grades within a traditional grading system. Teachers' grading and assessment practices were examined in terms of clarity, consistency, and the extent to which assessment guided instruction. Teachers from a high-performing suburban high school in the Northeast responded to an electronic survey and participated in focus groups regarding their grading and assessment practices. Gradebooks were analyzed to triangulate teacher practices and perspectives regarding the meaning of student grades. Clark and Estes's (2008) Gap Analysis Framework was utilized to assess knowledge, motivational, and organizational influences on teachers' grading and assessment practices. The findings of this assessment revealed that teachers possessed knowledge about assessment and the motivation to apply it, but faced organizational barriers to implementing effective practices in a traditional grading system. Responses indicated that teachers understood and used formative assessment during class, but also included it in students' grades to elicit effort and ensure sufficient graded assignments to justify student performance. Organizational constructs such as marking periods and online grading systems, in addition to an overall lack of organizational support and training, were found to be substantial obstacles to teachers achieving the stakeholder and organizational goals. The findings of this study emphasized the need for enhanced training, collaboration, and communication on grading and assessment. The development and implementation of an effective plan to address these organizational issues could shift schools from using traditional grading systems to rank and sort students to assessment programs that promote student learning.
Smith, Laura. "A Mixed Methods Comparative Analysis of the Implementation of the Multi-Tiered Systems of Support in Missouri Elementary Public Schools." Thesis, Lindenwood University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10934605.
This study consisted of a mixed-methods comparative analysis of the implementation of the Multi-Tiered Systems of Support (MTSS) in public school districts in the state of Missouri. The researcher surveyed nine public school districts similar in demographics of socio-economic representation, free and reduced lunch percentage, and average daily funds expended to educate students. One district administrator responsible for the implementation of MTSS represented each school district. In the qualitative component of the study, the researcher utilized an original electronic survey to gather insights into the unique implementation path each district employed. Coding and analysis resulted in identification of themes, similarities, and differences. The researcher interviewed two state-level leaders integral in the design and implementation recommendations from a state-level perspective. Coding and analysis of interview responses resulted in identification of similarities and differences in state- and district-level implementation of MTSS. The quantitative component of the study included collection and analysis of secondary data obtained from the Missouri Department of Elementary and Secondary Education via the Missouri Comprehensive Data System. The researcher obtained and analyzed elementary achievement and student attendance data to determine whether there was a difference between districts with full and partial implementation of MTSS. Through analysis of the qualitative surveys and interviews, the researcher found unique implementation paths among the study districts. All nine study districts implemented differently, and none utilized a recommended path or blueprint. District implementations varied from perceptions held among the state-level leaders interviewed. Through analysis of the quantitative component of the study, the researcher identified no difference in achievement and student attendance between districts deemed full implementation and those deemed partial implementation. The researcher recommended continued attention to successful implementation of MTSS at state and district levels. Future attention with a focus on increased technical support and funding at the state level held the promise of prompt, appropriate supports for students who struggle in the academic, behavioral, and social skill areas.
Silva, Manoela Milena Oliveira da. "Evaluation of the Use of Augmented Reality Tools in the Education Field." Universidade Federal de Pernambuco, 2015. https://repositorio.ufpe.br/handle/123456789/14921.
Augmented Reality (AR) technology has a huge potential to be applied in the education field. The coexistence of real and virtual environments enables experiences that would not be possible without this technology. Some of the reasons why AR learning experiences differ from those with other technologies are: (i) AR enables contextualized interaction between real and virtual worlds, (ii) it enables tangible interaction metaphors for object manipulation, and, finally, (iii) it enables smooth transition between real and virtual contents. While AR offers new learning opportunities, it also creates new challenges for education in different domains, such as technological, learning, and pedagogical issues. This work intends to provide some reflections about the challenges involved in the process of evaluating AR educational technologies. In order to better understand those issues, a systematic review was carried out aiming to identify how AR technology has been evaluated. Taking into account lessons learned during the review, a projective educational AR tool especially designed for young children's education, the ARBlocks, was evaluated. This tool was evaluated in the field of language learning with three different groups. The study involved the teacher as an instructional designer along with the use of multiple metrics. From the analysis of the ARBlocks in the classrooms, it was possible to observe that this tool offered different possibilities for teaching language to young children. The results obtained demonstrated that, in general, the ARBlocks contributed to students' learning and to the practice and reinforcement of language abilities. From the reflections presented, some guidelines were proposed in order to assist the evaluation of AR educational tools. The use of multiple metrics, as well as the active involvement of teachers in the elaboration of contents, is encouraged as a way to better understand the impact of technology on the teaching and learning process.
McNaughton, Amy K. "Instructional management profiles: the relationship between teaching styles, grade level preferences, and related factors." Lynchburg, Va.: Liberty University, 2007. http://digitalcommons.liberty.edu.
Donmez, Ayca. "The Evaluation of Communication and Customer Relations Training Program at Tepe Defence and Security Systems." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12605671/index.pdf.
The study evaluated the communication and customer relations training program at Tepe Defence and Security Systems by examining the security staff's reaction to the training program and finding out behavioral changes of the security staff which may positively affect the company. The instruments, an evaluation scale and two different case studies, were administered to 204 randomly selected security staff working at Tepe Security in Ankara. Data were analyzed both qualitatively and quantitatively, drawing on interviews, observation notes, frequencies, percentages, means, standard deviations, and t-tests.
Tucker, Pamela DuPriest. "Administrative response to teacher incompetence: The role of teacher evaluation systems." W&M ScholarWorks, 1997. https://scholarworks.wm.edu/etd/1539618396.
McAdoo, Charlie Edward II. "The Identification and Prioritization of the Professional Development Needs for Teachers of Career, Technical, and Agricultural Subjects within Georgia Metropolitan Area School Systems." Thesis, Valdosta State University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10930416.
The purpose of this research study was to identify and prioritize the professional development needs for teachers of CTAE subjects within metropolitan Atlanta school systems. The methodology was primarily relational, with descriptive components that relied on quantitative data. The administered survey called for participants to self-report demographic groups (i.e., Experience Level, School Type, and School Population). Participants then completed online surveys yielding data that identified professional development needs relative to demographic variables. A modified Borich (1980) Needs Assessment Model was used to identify the perceived importance and perceived competency of 20 competencies prescribed by the Georgia Teacher Assessment of Performance Standards (TAPS). Once the data were analyzed, the researcher identified and described professional development needs relative to demographic variables.
Mize, Brenda Gail. "Teachers' Perceptions of the Impact of Online Grading Systems." Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1321.
Nordstrom, Karen Lynn. "Pedagogical Praxis Models in Sustainability Education: A Focus on Food Systems and Environment." ScholarWorks @ UVM, 2015. http://scholarworks.uvm.edu/graddis/390.
Books on the topic "Educational systems evaluation"
Flagg, Barbara N. Formative evaluation for educational technologies. Hillsdale, N.J: L. Erlbaum Associates, 1990.
Dively, Dwight. Evaluation of the Washington Educational Telecommunications System. Seattle, Wash: The Board, 1985.
Linn, Robert L. The design and evaluation of educational assessment and accountability systems. Los Angeles, CA: Center for the Study of Evaluation, National Center for Research on Evaluation, Standards, and Student Testing, Graduate School of Education & Information Studies, University of California, Los Angeles, 2001.
Capursi, Vincenza, and SpringerLink (Online service), eds. Statistical Methods for the Evaluation of University Systems. Heidelberg: Springer-Verlag Berlin Heidelberg, 2011.
Desjardins, Richard. Benchmarking education and training systems in Europe: An international comparative study. Stockholm: Institute of International Education, Stockholm University, 2004.
Dively, Dwight. Evaluation of the Washington Educational Telecommunications System: Report. Seattle, Wash. (2101 Fourth Avenue, Suite 250, Seattle, WA 98121): The Board, 1985.
How to conduct a formative evaluation. Alexandria, Virginia: Association for Supervision and Curriculum Development, 1995.
SETS, Inc. Lesotho, Basic and Non-formal Education Systems (BANFES): Interim evaluation (632-0222). [Honolulu, Hawaii]: SETS, Inc., 1988.
Lewis, Anne. Comprehensive systems for educational accounting and improvement: R&D results: 1998 CRESST conference proceedings. Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing, Center for the Study of Evaluation, Graduate School of Education & Information Studies, University of California, Los Angeles, 1999.
Planning and conducting formative evaluations: Improving the quality of education and training. London: Kogan Page, 1993.
Book chapters on the topic "Educational systems evaluation"
Toçoğlu, Mansur Alp, and Aytuğ Onan. "Sentiment Analysis on Students’ Evaluation of Higher Educational Institutions." In Advances in Intelligent Systems and Computing, 1693–700. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-51156-2_197.
Jiménez, Samantha, Reyes Juárez-Ramírez, Víctor H. Castillo, Alan Ramírez-Noriega, and Sergio Inzunza. "Affective Evaluation of Educational Lexicon in Spanish for Learning Systems." In Advances in Intelligent Systems and Computing, 1074–83. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77712-2_103.
Karampiperis, Pythagoras, and Demetrios G. Sampson. "Performance Evaluation of Decision-Based Content Selection Approaches in Adaptive Educational Hypermedia Systems." In Intelligent and Adaptive Educational-Learning Systems, 161–82. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-30171-1_7.
Gupta, Heena, and Sanjay Kumar Dubey. "Ranking of Educational Web Sites in Indian Perspective for Usability Evaluation." In Advances in Intelligent Systems and Computing, 839–45. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-10-5903-2_87.
Sasaki, Toshiya. "Research on Design Skills for Personnel Evaluation Systems and Educational Programs." In Advances in Intelligent Systems and Computing, 1284–88. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-39512-4_196.
Eagle, Michael, and Tiffany Barnes. "Intelligent Tutoring Systems, Educational Data Mining, and the Design and Evaluation of Video Games." In Intelligent Tutoring Systems, 215–17. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13437-1_23.
Endo, Keiichi, Ayame Onoyama, Dai Okano, Yoshinobu Higami, and Shinya Kobayashi. "Comparative Evaluation of Bluetooth and Wi-Fi Direct for Tablet-Oriented Educational Applications." In Intelligent Information and Database Systems, 345–54. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-54472-4_33.
Perłowski, Ryszard, Arkadiusz Gola, and Katarzyna Antosz. "Evaluation of the Effectiveness of Standard Scheduling Rules – An Educational Approach." In Advances in Manufacturing Processes, Intelligent Methods and Systems in Production Engineering, 344–57. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-90532-3_26.
Holtermann, Renan Silveira, Ricardo Matos Chaim, and Bruno Contessotto Bragança Pinheiro. "Evaluation, Analysis, and Treatment of Educational Risks in Migration of Presential Teaching for Remote." In Advances in Intelligent Systems and Computing, 232–43. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-72660-7_23.
O'Mahony, C. D. "Information systems effectiveness and organisational culture: an underlying model for ITEM evaluation." In Information Technology in Educational Management for the Schools of the Future, 65–72. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-0-387-35090-5_9.
Conference papers on the topic "Educational systems evaluation"
Matulčíková, Marta, and Daniela Breveníková. "QUALITY OF EDUCATION AND SYSTEMS–BASED EDUCATIONAL EVALUATION." In 44th International Academic Conference, Vienna. International Institute of Social and Economic Sciences, 2018. http://dx.doi.org/10.20472/iac.2018.044.029.
Yao, Xueying, Tianyu Zhou, and Yafeng Niu. "The Construction of the Evaluation Index System of Children's Educational Game Learning Accessibility." In Intelligent Human Systems Integration (IHSI 2022) Integrating People and Intelligent Systems. AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1001093.
Liu, Lizhen, Wenbin Xu, Wei Song, Hanshi Wang, Hao Liu, Haining Xu, and Tao Chi. "Study Quality Evaluation in Educational System." In 2014 Enterprise Systems Conference (ES). IEEE, 2014. http://dx.doi.org/10.1109/es.2014.12.
Osma, Jose Ignacio Palacios, Jose Andres Gamboa Suarez, Carlos Enrique Montenegro Marin, and Jose Ignacio Rodriguez Molano. "Metric LMS: Educational evaluation platforms." In 2016 11th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2016. http://dx.doi.org/10.1109/cisti.2016.7521434.
de Sales, Andre Barros, and Joao Gabriel Antunes. "Evaluation of Educational Games Usage Satisfaction." In 2021 16th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2021. http://dx.doi.org/10.23919/cisti52073.2021.9476400.
Pajor, Gizella Csikos, and Dragica Radosav. "Semiautomatic evaluation using educational software eMax." In 2010 IEEE 8th International Symposium on Intelligent Systems and Informatics (SISY 2010). IEEE, 2010. http://dx.doi.org/10.1109/sisy.2010.5647291.
da Silva Gomes de Oliveira, Eloiza, Marcia Souto Maior Mourao Sa, Caio Abitbol Carvalho, and Raphael Silberman Dereczynski. "New educational technologies and learning evaluation: A challenge presented to education." In 2014 9th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2014. http://dx.doi.org/10.1109/cisti.2014.6876962.
Marante, Yelco, Vinicius Alberto Alves da Silva, Jorão Gomes Jr., Marluce Aparecida Vitor, André Ferreira Martins, and Jairo Francisco De Souza. "Evaluating Educational Recommendation Systems: a systematic mapping." In Simpósio Brasileiro de Informática na Educação. Sociedade Brasileira de Computação, 2020. http://dx.doi.org/10.5753/cbie.sbie.2020.912.
Chrysafiadi, Konstantina, Spyros Papadimitriou, and Maria Virvou. "Which is better for learning: a web-based educational application or an educational game?" In 2019 International Symposium on Performance Evaluation of Computer and Telecommunication Systems (SPECTS). IEEE, 2019. http://dx.doi.org/10.23919/spects.2019.8823232.
Veiga, Francisco, and Antonio Andrade. "Evaluation of apps used in an educational context." In 2020 15th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2020. http://dx.doi.org/10.23919/cisti49556.2020.9140964.
Reports on the topic "Educational systems evaluation"
Kaffenberger, Michelle, Jason Silberstein, and Marla Spivack. Evaluating Systems: Three Approaches for Analyzing Education Systems and Informing Action. Research on Improving Systems of Education (RISE), April 2022. http://dx.doi.org/10.35489/bsg-rise-wp_2022/093.
Thomson, Sue, Nicole Wernert, Sima Rodrigues, and Elizabeth O'Grady. TIMSS 2019 Australia. Volume I: Student performance. Australian Council for Educational Research, December 2020. http://dx.doi.org/10.37517/978-1-74286-614-7.
McGuinn, Patrick. Evaluating Progress: State Education Agencies and the Implementation of New Teacher Evaluation Systems. Consortium for Policy Research in Education, September 2015. http://dx.doi.org/10.12698/cpre.wp2015-09.seas.
Armas, Elvira, and Magaly Lavadenz. The Observation Protocol for Academic Literacies (OPAL); A Tool for Supporting Teachers of English Language Learners. CEEL, 2011. http://dx.doi.org/10.15365/ceel.article.2011.1.
Rabush, Carol M., Melissa S. Berkowitz, and Richard M. Modjeski. The Evaluation of the Army Education Information System. Fort Belvoir, VA: Defense Technical Information Center, August 1985. http://dx.doi.org/10.21236/ada171421.
McGuinn, Patrick. State Education Agencies and the Implementation of New Teacher Evaluation Systems. Consortium for Policy Research in Education, September 2015. http://dx.doi.org/10.12698/cpre.pb15-2.2015.
Aiyar, Yamini, Vincy Davis, Gokulnath Govindan, and Taanya Kapoor. Rewriting the Grammar of the Education System: Delhi's Education Reform (A Tale of Creative Resistance and Creative Disruption). Research on Improving Systems of Education (RISE), November 2021. http://dx.doi.org/10.35489/bsg-rise-misc_2021/01.
Brasil, André. Multidimensionality through self-evaluation: From theory to practice in the Brazilian graduate system. Fteval - Austrian Platform for Research and Technology Policy Evaluation, April 2022. http://dx.doi.org/10.22163/fteval.2022.546.
Braslavskaya, Elena, and Tatyana Pavlova. English for IT-Specialists. SIB-Expertise, June 2021. http://dx.doi.org/10.12731/er0464.21062021.
Aiginger, Karl, Andreas Reinstaller, Michael Böheim, Rahel Falk, Michael Peneder, Susanne Sieber, Jürgen Janger, et al. Evaluation of Government Funding in RTDI from a Systems Perspective in Austria. Synthesis Report. WIFO, Austria, August 2009. http://dx.doi.org/10.22163/fteval.2009.504.