Academic literature on the topic 'Assistant logiciel'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Assistant logiciel.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Assistant logiciel"

1

Contamines, Julien, Gilbert Paquette, and Richard Hotte. "LÉO, assistant logiciel pour une scénarisation pédagogique dirigée par les compétences." Revue internationale des technologies en pédagogie universitaire 6, no. 2-3 (2009): 26. http://dx.doi.org/10.7202/1000009ar.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ribaud, Vincent, Philippe Saliou, and Claude Y. Laporte. "Un assistant de mémoire pour les très petits projets d’ingénierie du logiciel." Études de communication, no. 36 (June 1, 2011): 67–86. http://dx.doi.org/10.4000/edc.2631.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Donin, Nicolas, and Jacques Theureau. "L’atelier d’un réalisateur en informatique musicale." Circuit 18, no. 1 (2008): 31–38. http://dx.doi.org/10.7202/017906ar.

Full text
Abstract:
This interview with Gilbert Nouno examines the everyday work of a musical assistant (or computer music designer). In February 2007, Nouno was completing a collaboration with Xavier Dayer on his work Delights (for choir, ensemble, and electronics) and pursuing a long-term collaboration with Jonathan Harvey on two works: Wagner Dream and a new piece for orchestra. These projects involve varied modes of collaboration (periodicity, the kind of involvement the composer has in the computing work, etc.) and diverse uses of the Max/MSP software and associated programs. Nouno also discusses interactions with Pierre Boulez, Michael Jarrell, and Marc Monnet. Finally, the interview questions the variability, plasticity, and even precariousness of the work of a computer music designer (and of his studio).
APA, Harvard, Vancouver, ISO, and other styles
4

Kuehni, Morgane, and Julie Tiberghien. "Introduction d’un logiciel de gestion dans les services sociaux." Réseaux N° 249-250, no. 1-2 (2025): 247–75. https://doi.org/10.3917/res.249.0247.

Full text
Abstract:
This article draws on a ten-month ethnographic study conducted in five social services in French-speaking Switzerland. It explores how the work of social workers was reconfigured by the introduction of management software in 2021. It analyses the consequences of the digitalisation of work on daily professional practices, examining the constraints imposed by the digital system and the effects it produces, notably feelings of surveillance and control among professionals. At its core, the article reveals the arrangements developed by social workers, which translate into practices protecting both their person and their profession.
APA, Harvard, Vancouver, ISO, and other styles
5

Chaix, B., G. Lobre, S. Mahboub, G. Delamon, J. E. Bibault, and B. Brouard. "Le chatbot, outil d’accompagnement thérapeutique de la dépression chez les patientes atteintes d’un cancer du sein." Psycho-Oncologie 14, no. 1-2 (2020): 17–21. http://dx.doi.org/10.3166/pson-2020-0113.

Full text
Abstract:
Objective: A chatbot is a software program that uses machine learning to simulate a conversation through text or voice messages. The chatbot Vik was developed to improve the quality of life of patients with cancer or a chronic disease. The aim of this pilot study was to measure the mood of breast cancer patients before and after support from the chatbot Vik. Materials and methods: Patients were recruited upon their first use of Vik. They were screened against the inclusion criteria (age > 18 years, breast cancer under treatment, non-opposition, familiarity with the Internet). They could not be under follow-up for depressive disorders or undergoing psychotherapy. The PHQ-9 questionnaire was used to assess symptoms of depression. Only patients with a score above 5 were included in the study (day 0). The PHQ-9 was then administered again at day 15 and day 30. Results: The recruited users (n = 74) were between 26 and 78 years old, with a mean age of 50 years. The overall satisfaction rate was 94%. The mean PHQ-9 score before using the chatbot (day 0) was 9.73 (SD: 2.02). At the end of the 30-day experiment, it was 5.00 (SD: 2.82). Mood improved over time as participants talked with the chatbot. Conclusion: This study provides promising evidence that a virtual assistant designed to support patients can offer an engaging method of follow-up that complements traditional therapy.
APA, Harvard, Vancouver, ISO, and other styles
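As a purely illustrative aside, the sketch below shows how the PHQ-9 depression score used in the study above is typically computed, along with the reported inclusion threshold (score > 5) and a before/after mean comparison; the item values, cohort data, and function names are hypothetical and are not data from the study.

# Illustrative sketch only: PHQ-9 scoring as described in the abstract above.
# The nine item responses (each 0-3) are summed; patients with a total > 5
# were included, and mean scores were compared at day 0 and day 30.
from statistics import mean, stdev

def phq9_total(item_responses):
    """Sum of the nine PHQ-9 items, each scored 0-3."""
    assert len(item_responses) == 9
    assert all(0 <= r <= 3 for r in item_responses)
    return sum(item_responses)

# Hypothetical example data (not taken from the study).
patient_items = [1, 2, 1, 0, 2, 1, 1, 0, 1]
score_day0 = phq9_total(patient_items)
included = score_day0 > 5  # inclusion threshold reported in the abstract

# Hypothetical cohort scores at day 0 and day 30.
day0_scores = [9, 11, 8, 10, 12]
day30_scores = [5, 6, 4, 5, 7]
print(f"day 0: mean={mean(day0_scores):.2f}, sd={stdev(day0_scores):.2f}")
print(f"day 30: mean={mean(day30_scores):.2f}, sd={stdev(day30_scores):.2f}")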
6

Chandrakar, Chandu Lal, and Yuan Bentao. "From Learning Theory to Academic Organisation: The Institutionalisation of Higher Education Teaching Assistant Position in China." International Journal of Higher Education 7, no. 3 (2018): 124. http://dx.doi.org/10.5430/ijhe.v7n3p124.

Full text
Abstract:
This exploratory study critically investigates the teaching assistant regulations of higher education institutions in China. On the basis of a content analysis of the teaching assistant regulations of five premier Chinese universities, it analyses possible discrepancies that might compromise the principles of transparency, equal opportunity, and encouraging excellence as stipulated in the vision, mission, and goals of the regulations. Teaching assistants make up more than two thirds of the academic staff at universities in China, and China has the second-largest higher education system in the world in terms of scale. Practices of sharing skills and imparting knowledge at these institutions have been intermediated by a semi-institutionalized position, that of the teaching assistant. The informal, unrecorded submission of assignments at the PhD level therefore calls into question the integrity and academic freedom of higher education at these universities. On the basis of an instrumentalised framework guided by the dimensions of decision-making and learning-organization theories, this content-analysis study formulates recommendations for institutions selecting and training students as teaching assistants. It also provides a critical but logical examination of the teaching assistant regulations with regard to academic integrity.
APA, Harvard, Vancouver, ISO, and other styles
7

Rocha, Berenice I. "Language Assistant Programs and CLIL: Developing and Validating Questionnaires to Determine Program Training Needs." Journal of Language Teaching and Research 16, no. 2 (2025): 359–70. https://doi.org/10.17507/jltr.1602.01.

Full text
Abstract:
Research on language assistants in content and language integrated learning (CLIL) classrooms in the context of Spain has been scarce. Given the unique, demanding nature of CLIL instruction, more investigation into the implementation of language assistant programs in CLIL using validated research tools is essential. Therefore, this paper outlines the process for developing and validating questionnaires created to analyze the perceptions and training needs of language assistants, CLIL teachers, and bilingual program coordinators in relation to effectively implementing language assistant programs in CLIL. In this study, three questionnaires were created for three specific cohorts (i.e., language assistants, CLIL teachers, and bilingual program coordinators). The first step of the double-fold pilot process used to validate the questionnaires entailed an expert rating approach, where the experts rated each item’s clarity, precision, and relevance, followed by rating the entire questionnaire’s logical order, number of items, and content validity. The second step involved a pilot study conducted among all three cohorts. Next, Cronbach’s alpha was calculated for each section of the three questionnaires, except for demographic information, and for each questionnaire as a whole. The results show that all sections and questionnaires are internally consistent and thus could be used in a wide array of studies analyzing the implementation of language assistant programs in CLIL. The questionnaires allow the use of data source triangulation to determine the training needs of three cohorts in both their CLIL pedagogical knowledge and language assistant program implementation skills, specifically in a secondary school classroom setting.
APA, Harvard, Vancouver, ISO, and other styles
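Since the abstract above reports Cronbach's alpha as the internal-consistency measure for each questionnaire section, here is a minimal, hedged sketch of how that statistic is computed from item-level responses; the data and function name are hypothetical, not the study's materials.

# Illustrative sketch: Cronbach's alpha for a block of questionnaire items.
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
import numpy as np

def cronbach_alpha(responses):
    """responses: 2-D array, rows = respondents, columns = items."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                      # number of items
    item_vars = responses.var(axis=0, ddof=1)   # per-item sample variances
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-scale answers from five respondents to four items.
answers = [[4, 5, 4, 4],
           [3, 3, 4, 3],
           [5, 5, 5, 4],
           [2, 3, 2, 3],
           [4, 4, 4, 5]]
print(f"Cronbach's alpha = {cronbach_alpha(answers):.2f}")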
8

Shazhaev, Ilman, Dmitry Mikhaylov, Abdulla Shafeeg, Arbi Tularov, and Islam Shazhaev. "Personal Voice Assistant: from Inception to Everyday Application." Indonesian Journal of Data and Science 4, no. 2 (2023): 64–72. http://dx.doi.org/10.56705/ijodas.v4i2.69.

Full text
Abstract:
The creation of automatic speech recognition systems is a popular trend in the development of information technology. These technologies are developing at a very fast pace and gradually covering more and more areas; we can already say how firmly they have settled into our lives. The term "speech technologies" covers a fairly large layer of information technologies, but one of the most advanced products in this area is the voice assistant, which draws on all types of speech technologies: speech recognition, speech synthesis, systems for developing and analyzing voice information, and voice biometrics. A voice assistant is software that allows you to control your device using voice commands. The range of possibilities does not end with the execution of commands; a modern assistant is even able to hold a conversation with the user. Since a voice assistant is a complex innovation consisting of many different technologies, the task of a smart assistant is to ensure that they work smoothly with one another. There are now many voice assistants on the market that can make a person's life easier. You no longer have to manually type a question into a search engine or search for a song; you just need to say what you want, and the voice assistant will find everything on its own. But every year there are more and more assistants, and it becomes harder for the average user to choose, because each assistant has its own characteristics. Nevertheless, the daily use of this technology is becoming more widespread. Another area that is developing rapidly is gaming. With more and more innovations being adopted, the next logical step is to implement a voice assistant, at least at the training stage. However, real progress remains out of reach for the time being.
APA, Harvard, Vancouver, ISO, and other styles
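The abstract above describes a voice assistant as software that maps spoken commands to actions; the toy sketch below illustrates that idea with a keyword-based command dispatcher operating on already-transcribed text (speech recognition is assumed to happen upstream, and all names here are hypothetical).

# Toy sketch of the command-dispatch step of a voice assistant.
# Real assistants add speech recognition, NLU, and dialogue management on top.

def play_song(query):
    return f"Playing '{query}'..."

def web_search(query):
    return f"Searching the web for '{query}'..."

# Keyword -> handler table; the transcript is assumed to come from an ASR engine.
COMMANDS = {
    "play": play_song,
    "search": web_search,
}

def dispatch(transcript):
    words = transcript.lower().split()
    for keyword, handler in COMMANDS.items():
        if keyword in words:
            rest = " ".join(w for w in words if w != keyword)
            return handler(rest)
    return "Sorry, I did not understand that."

print(dispatch("please play yesterday"))
print(dispatch("search weather in Paris"))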
9

Niu, Yue, Jonathan Sterling, Harrison Grodin, and Robert Harper. "A cost-aware logical framework." Proceedings of the ACM on Programming Languages 6, POPL (2022): 1–31. http://dx.doi.org/10.1145/3498670.

Full text
Abstract:
We present calf, a cost-aware logical framework for studying quantitative aspects of functional programs. Taking inspiration from recent work that reconstructs traditional aspects of programming languages in terms of a modal account of phase distinctions, we argue that the cost structure of programs motivates a phase distinction between intension and extension. Armed with this technology, we contribute a synthetic account of cost structure as a computational effect in which cost-aware programs enjoy an internal noninterference property: input/output behavior cannot depend on cost. As a full-spectrum dependent type theory, calf presents a unified language for programming and specification of both cost and behavior that can be integrated smoothly with existing mathematical libraries available in type theoretic proof assistants. We evaluate calf as a general framework for cost analysis by implementing two fundamental techniques for algorithm analysis: the method of recurrence relations and the physicist's method for amortized analysis. We deploy these techniques on a variety of case studies: we prove a tight, closed bound for Euclid's algorithm, verify the amortized complexity of batched queues, and derive tight, closed bounds for the sequential and parallel complexity of merge sort, all fully mechanized in the Agda proof assistant. Lastly we substantiate the soundness of quantitative reasoning in calf by means of a model construction.
APA, Harvard, Vancouver, ISO, and other styles
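The abstract above mentions the method of recurrence relations, applied in calf (fully mechanized in Agda) to merge sort among other examples. As a loose, informal illustration outside any proof assistant, the sketch below evaluates the standard worst-case comparison-count recurrence for merge sort and checks it against the n*log2(n) upper bound; it is not calf code and does not reproduce the paper's tight closed bounds.

# Informal illustration of the recurrence-relation method mentioned above.
# Worst-case comparison count of merge sort:
#   W(1) = 0,  W(n) = W(ceil(n/2)) + W(floor(n/2)) + (n - 1)
# which is bounded above by n * log2(n).
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def W(n):
    if n <= 1:
        return 0
    return W(math.ceil(n / 2)) + W(n // 2) + (n - 1)

for n in (2, 10, 100, 1000):
    bound = n * math.log2(n)
    print(f"n={n:5d}  W(n)={W(n):8d}  n*log2(n)={bound:10.1f}  ok={W(n) <= bound}")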
10

St Dizier de Almeida, Valérie. "Modélisation d'une assistance interactive pour améliorer l'accessibilité d'un logiciel." Sciences et techniques éducatives 4, no. 1 (1997): 13–39. http://dx.doi.org/10.3406/stice.1997.1326.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Dissertations / Theses on the topic "Assistant logiciel"

1

Bienvenu, Olivier. "Conception d'un assistant pour un logiciel d'éléments finis dédié aux calculs de champs électromagnétiques." Paris 6, 1998. http://www.theses.fr/1998PA066033.

Full text
Abstract:
Simulation software has become increasingly powerful. In particular, the finite element method is used more and more to determine the characteristics of a system or to optimize it. However, using this method requires knowledge and experience in numerical techniques and in the physics of the problem being treated. To make the finite element method more accessible, a user-assistance tool was developed. After studying different approaches, the choice fell on a software assistant, a decision-support method that is well suited to finite element software. To fully implement the assistant, it was necessary to develop a finite element package comprising all the tools needed to carry out a finite element computation: a pre-processor, a mesher, a dynamically generated computation code, and a post-processor. The assistant works by exploiting a file containing interlinked questions; the link between questions is determined by the answer the user provides, and the possible answers are given in the assistance file. To act on the finite element software, actions are recorded at the same time as each answer is given. At the end of the expert session, all the recorded actions are applied so that the user obtains the expected result. In conclusion, a software assistant proves entirely satisfactory for helping users make the decisions needed to use a finite element package correctly, and the use of a file makes it easy to evolve the assistant when new features are added to the finite element program.
APA, Harvard, Vancouver, ISO, and other styles
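The abstract above describes an assistant driven by a file of interlinked questions, where each answer selects the next question and records actions that are applied at the end of the session. The sketch below is a hedged, toy reconstruction of that mechanism; the question graph and action names are invented for illustration and are not taken from the thesis.

# Toy reconstruction of a question-driven assistant: each answer selects the
# next question and records an action; all actions are applied at the end.
QUESTIONS = {
    "start": {
        "text": "Is the problem magnetostatic or electrostatic?",
        "answers": {
            "magnetostatic": ("mesh", ["use_vector_potential"]),
            "electrostatic": ("mesh", ["use_scalar_potential"]),
        },
    },
    "mesh": {
        "text": "Do you need a fine mesh near material boundaries?",
        "answers": {
            "yes": (None, ["refine_boundary_mesh"]),
            "no": (None, []),
        },
    },
}

def run_assistant(answer_for):
    """answer_for: callable mapping a question text to the chosen answer key."""
    actions, current = [], "start"
    while current is not None:
        question = QUESTIONS[current]
        choice = answer_for(question["text"])
        current, recorded = question["answers"][choice]
        actions.extend(recorded)
    return actions  # applied to the target software at the end of the session

# Example session with scripted answers.
script = iter(["magnetostatic", "yes"])
print(run_assistant(lambda _text: next(script)))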
2

Zimmermann, Théo. "Challenges in the collaborative evolution of a proof language and its ecosystem." Thesis, Université de Paris (2019-....), 2019. http://www.theses.fr/2019UNIP7163.

Full text
Abstract:
In this thesis, I present the application of software engineering methods and knowledge to the development, maintenance, and evolution of Coq, an interactive proof assistant based on type theory, and its package ecosystem. Coq has been developed at Inria since 1984, but has only more recently seen a surge in its user base, which leads to much stronger concerns about its maintainability, and the involvement of external contributors in the evolution of both Coq and its ecosystem of plugins and libraries. Recent years have seen important changes in the development processes of Coq, of which I have been a witness and an actor (adoption of GitHub as a development platform, first for its pull request mechanism, then for its bug tracker, adoption of continuous integration, switch to shorter release cycles, increased involvement of external contributors in the open source development and maintenance process). The contributions of this thesis include a historical description of these changes, the refinement of existing processes and the design of new ones, the design and implementation of new tools to help the application of these processes, and the validation of these changes through rigorous empirical evaluation. Involving external contributors is also very useful at the level of the package ecosystem. This thesis additionally contains an analysis of package distribution methods, and a focus on the problem of the long-term maintenance of single-maintainer packages.
APA, Harvard, Vancouver, ISO, and other styles
3

Savary-Leblanc, Maxime. "Augmenting software engineers with modeling assistants." Thesis, Université de Lille (2018-2021), 2021. https://pepite-depot.univ-lille.fr/LIBRE/EDMADIS/2021/2021LILUB027.pdf.

Full text
Abstract:
Domain knowledge is a prerequisite for producing software designs and implementations tailored to stakeholders' requirements. One common way to formalize that knowledge is through conceptual models, which are commonly used to describe or simulate a system. Acquiring such expertise requires discussing with knowledgeable stakeholders and/or getting access to useful documents, both of which might not always be easily accessible. At the same time, more and more model samples can be gathered from multiple sources, which represents an increasing number of already formalized and accessible pieces of knowledge. For example, some companies keep archives of internal model repositories. There are also numerous open source projects that contain models, while some modeling tools even offer the possibility to create public projects that are free to browse. Such data sources could be exploited to create domain knowledge that could be provided to software engineers while modeling. To be useful, this knowledge must be of high quality, but must also be well integrated into the software modeling process. The focus of this thesis is to provide a framework for exploiting knowledge to assist users of computer-based modeling tools with software modeling assistants. This thesis first introduces our research questions based on a systematic mapping study of software assistants for software engineering, and then focuses on software assistants for modeling. It reports on the design of modeling assistants based on a user-centered approach. We present the conclusions of interviews conducted with experts in modeling, a stage in which requirements are collected. Then, we develop a prototype modeling knowledge base allowing (i) the creation of general and specific artificial modeling knowledge, and (ii) making it available to any software client via recommendations. After introducing the results of an experiment regarding the accuracy of the system, we discuss these preliminary results. Finally, this thesis presents a software modeling assistant implementation integrated into the Papyrus tool, which aims to cognify the UML modeling environment by integrating the previously created knowledge. Our work helps to clarify the need for assistance during software modeling work, presents an initial approach to the design of software assistants for software modeling, and identifies research challenges in modeling assistance.
APA, Harvard, Vancouver, ISO, and other styles
4

Delépine, Ludovic. "L'assistance à la navigation hyperdocumentaire : un assistant logiciel d'aide à la recherche de documents visités par un lecteur dans le contexte du Web : une approche sémio-technologique." Dijon, 2003. http://www.theses.fr/2003DIJOS009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Zucchini, Rébecca. "Bibliothèque certifiée en Coq pour la provenance des données." Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG040.

Full text
Abstract:
This thesis is about the formalization of data provenance using the Coq proof assistant, at the intersection of the formal methods and database communities. It explores the importance of data provenance, which tracks the origin and history of data, in addressing issues such as poor data quality, incorrect interpretations, and lack of transparency in data processing. The thesis proposes formalizations of two commonly used types of data provenance, How-provenance and Where-provenance. Formalizing both types of provenance allowed us to compare their semantics, highlighting their differences and complementarities. Additionally, the formalization of these two types of provenance led to the proposal of an algebraic structure that provides a unifying semantics.
APA, Harvard, Vancouver, ISO, and other styles
6

Filou, Vincent. "Une étude formelle de la théorie des calculs locaux à l'aide de l'assistant de preuve Coq." Thesis, Bordeaux 1, 2012. http://www.theses.fr/2012BOR14708/document.

Full text
Abstract:
The goal of this work is to build a framework allowing the study, in a formal setting, of the correctness of local computation systems as well as the expressivity of this model. A local computation system is a set of graph relabelling rules with limited scope, corresponding to a class of distributed algorithms. Our first contribution is the formalisation, in the Coq proof assistant, of a relational semantics for local computation systems. This work is based on an original formal graph theory for Coq. Ambiguities inherent to a "pen and paper" definition of local computations are corrected, and we prove that our definition captures all sub-classes of relabelling relations studied in the remainder. We propose a draft of a proof methodology for local computation systems in Coq. Our second contribution is the study of the expressivity of classes of local computations inside our framework. We provide, for instance, a formal proof of D. Angluin's results on election and graph coverings. We propose original "meta-theorems" concerning the LC0 class of local computations, and use these theorems to produce formal impossibility proofs. Finally, we study possible transformations of local computation systems and of their proofs. To this end, we adapt the notion of Forward Simulation, originally formulated by N. Lynch, to local computations. We use this notion to define certified transformations of LC0 systems. We show how those certified transformations can be used to study the expressivity of certain classes of algorithms in our framework. We define, as certified transformations, two notions of composition for LC0 systems. A Coq library of ~50000 lines of code, containing the formal proofs of the theorems presented in the thesis, has been produced in collaboration with Pierre Castéran.
APA, Harvard, Vancouver, ISO, and other styles
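The thesis above studies graph relabelling systems with local scope (local computations). As a hedged illustration of the general idea only, the sketch below repeatedly applies a single edge-relabelling rule (attach an unvisited endpoint to a growing tree) to a small graph, which is the classic spanning-tree example for such systems; it is plain Python, not the Coq development described in the thesis.

# Toy local computation: repeatedly apply one edge rule until no edge matches.
# Rule (on an edge {u, v}): if exactly one endpoint is labelled "in-tree",
# relabel the other endpoint "in-tree" and mark the edge as a tree edge.
import random

edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4)]
in_tree = {0}          # node 0 is the initially distinguished vertex
tree_edges = set()

def applicable(edge):
    u, v = edge
    return (u in in_tree) != (v in in_tree)  # exactly one endpoint in the tree

while True:
    candidates = [e for e in edges if applicable(e)]
    if not candidates:
        break                        # no rule applies: the computation terminates
    u, v = random.choice(candidates)  # rules fire on arbitrarily chosen edges
    in_tree.update((u, v))
    tree_edges.add((u, v))

print("tree edges:", sorted(tree_edges))   # a spanning tree of the graph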
7

Lelay, Catherine. "Repenser la bibliothèque réelle de Coq : vers une formalisation de l'analyse classique mieux adaptée." Thesis, Paris 11, 2015. http://www.theses.fr/2015PA112096/document.

Full text
Abstract:
Real analysis is pervasive to many applications, if only because it is a suitable tool for modeling physical or socio-economical systems. As such, its support is warranted in proof assistants, so that users have a way to formally verify mathematical theorems and the correctness of critical systems. The Coq system comes with an axiomatization of standard real numbers and a library of theorems on real analysis. Unfortunately, this standard library is lacking some widely used results. For instance, the definitions of integrals and derivatives are based on dependent types, which make them cumbersome to use in practice. This thesis first describes various state-of-the-art libraries available in proof assistants. To palliate the inadequacies of the Coq standard library, we have designed a user-friendly formalization of real analysis: Coquelicot. An easier way of writing formulas and theorem statements is achieved by relying on total functions in place of dependent types for limits, derivatives, integrals, power series, and so on. To help with the proof process, the library comes with a comprehensive set of theorems that cover not only these notions, but also some extensions such as parametric integrals and asymptotic behaviors. Moreover, an algebraic hierarchy makes it possible to apply some of the theorems in a more generic setting, such as complex numbers or matrices. Coquelicot is a conservative extension of the classical analysis of Coq's standard library and we provide correspondence theorems between the two formalizations. We have exercised the library on several use cases: in an exam at university entry level, for the definitions and properties of Bessel functions, and for the solution of the one-dimensional wave equation.
APA, Harvard, Vancouver, ISO, and other styles
8

Mouhcine, Houda. "Formal Proofs in Applied Mathematics : A Coq Formalization of Simplicial Lagrange Finite Elements." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG112.

Full text
Abstract:
This thesis is dedicated to developing formal proofs of mathematical theorems and propositions within the field of real analysis, using the Coq proof assistant to ensure their correctness. The core of this work is divided into two main parts. The first part focuses on using Coq to formalize key mathematical principles such as the Lebesgue induction principle and the Tonelli theorem, allowing the computation of double integrals on product spaces by iteratively integrating with respect to each variable. This work builds upon previous research in measure theory and the Lebesgue integral. The second part of this thesis operates within the framework of the Finite Element Method (FEM), a widely used numerical technique for solving partial differential equations (PDEs). FEM plays an important role in numerous industrial simulation programs, particularly in approximating solutions to complex problems such as heat transfer, fluid dynamics, and electromagnetic field simulations. Specifically, we aim to construct finite elements, focusing on simplicial Lagrange finite elements with evenly distributed degrees of freedom. This work engages a broad range of algebraic concepts, including finite families, monoids, vector spaces, affine spaces, substructures, and finite-dimensional spaces. To conduct this study, we formalize in Coq, in classical logic, several foundational components. We begin by constructing a general finite element, then proceed to define several foundational components, including the construction of the Lagrange approximation space, expressing its Lagrange polynomial basis, and formalizing affine geometric transformations and the unisolvence property of Lagrange finite elements.
APA, Harvard, Vancouver, ISO, and other styles
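For readers unfamiliar with the Lagrange polynomial basis mentioned above, the following standard one-dimensional formulas recall its definition and the interpolation property underlying unisolvence; they are textbook material, not an excerpt from the thesis.

\[
  \ell_i(x) = \prod_{j=0,\; j \ne i}^{k} \frac{x - x_j}{x_i - x_j},
  \qquad
  \ell_i(x_m) = \delta_{im},
  \qquad
  p(x) = \sum_{i=0}^{k} p(x_i)\,\ell_i(x) \quad \text{for every } p \in \mathbb{P}_k .
\]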
9

Nemouchi, Yakoub. "Model-based Testing of Operating System-Level Security Mechanisms." Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS061/document.

Full text
Abstract:
Formal methods can be understood as the art of applying mathematical reasoning to the modeling, analysis and verification of computer systems. Three main verification approaches can be distinguished: verification based on deductive proofs, model checking and model-based testing. Model-based testing, in particular in its radical form of theorem proving-based testing, bridges seamlessly the gap between the theory, the formal model, and the implementation of a system. Actually, theorem proving based testing techniques offer a possibility to directly interact with "real" systems: via different formal properties, tests can be derived and executed on the system under test. Suitably supported, the entire process can be fully automated. The purpose of this thesis is to create a model-based sequence testing environment for both sequential and concurrent programs. First a generic testing theory based on monads is presented, which is independent of any concrete program or computer system. It turns out that it is still expressive enough to cover all common system behaviours and testing concepts. In particular, we consider here: sequential executions, concurrent executions, synchronised executions, executions with abort. On the conceptual side, it brings notions like test refinements, abstract test cases, concrete test cases, test oracles, test scenarios, test data, test drivers, conformance relations and coverage criteria into one theoretical and practical framework. In this framework, both behavioural refinement rules and symbolic execution rules are developed for the generic case and then refined and used for specific complex systems. As an application, we instantiate our framework with an existing sequential model of a microprocessor called VAMP developed during the Verisoft project. For the concurrent case, we use our framework to model and test the IPC API of a real industrial operating system called PikeOS. Our framework is implemented in Isabelle/HOL. Thus, our approach directly benefits from the existing models, tools, and formal proofs in this system.
APA, Harvard, Vancouver, ISO, and other styles
10

Guettala, Abdelheq Et-Tahir. "VizAssist : un assistant utilisateur pour le choix et le paramétrage des méthodes de fouille visuelle de données." Thesis, Tours, 2013. http://www.theses.fr/2013TOUR4017/document.

Full text
Abstract:
In this thesis, we deal with the problem of automating the process of choosing an appropriate visualization and its parameters in the context of visual data mining. To solve this problem, we developed a user assistant, VizAssist, whose main objective is to guide users (experts and novices) during the process of exploration and analysis of their dataset. We illustrate the approach used by VizAssist to help users in the visualization selection and parameterization process. VizAssist proposes a process based on two steps. In the first step, VizAssist collects the user's objectives and the description of the dataset to be visualized, and then proposes a subset of candidate visualizations to represent it. In this step, VizAssist suggests different mappings between the dataset to be visualized and the set of visualizations it manages. The second step allows the user to refine the different mappings suggested by the system. In this step, VizAssist uses an interactive genetic algorithm that lets users visually evaluate and adjust these mappings. We finally present the results of the user evaluation we performed and the contributions of our tool to accomplishing some data mining tasks.
APA, Harvard, Vancouver, ISO, and other styles
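The second step described above relies on an interactive genetic algorithm in which the fitness of a visualization parameterization is the user's own rating. The sketch below is a generic, hedged outline of such a loop; the parameter encoding, the rating source, and the rates are invented for illustration and do not reproduce VizAssist.

# Generic interactive genetic algorithm loop: the "fitness" of each candidate
# parameterization is a rating supplied by a user (simulated here).
import random

def random_candidate():
    # A candidate visualization parameterization, e.g. (point size, opacity).
    return [random.uniform(1, 10), random.uniform(0.1, 1.0)]

def user_rating(candidate):
    # Placeholder for the interactive step: in a real tool the user looks at
    # the rendered visualization and gives a score; here we fake a preference.
    size, opacity = candidate
    return -abs(size - 4.0) - abs(opacity - 0.7)

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(candidate, rate=0.3):
    return [g + random.gauss(0, 0.5) if random.random() < rate else g
            for g in candidate]

population = [random_candidate() for _ in range(8)]
for generation in range(10):
    ranked = sorted(population, key=user_rating, reverse=True)
    parents = ranked[:4]                       # keep the best-rated candidates
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

print("best parameterization:", max(population, key=user_rating))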
More sources

Books on the topic "Assistant logiciel"

1

Khawaja, Sarmad. Measuring statistical capacity building: A logical framework approach. International Monetary Fund, Statistics Department, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Morrison, Thomas K., and Sarmad Khawaja. Measuring Statistical Capacity Building: A Logical Framework Approach. International Monetary Fund, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Handbook of Research on Investigations in Artificial Life Research and Development. IGI Global, 2018.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Brown, Christopher D. Crash Course in Technology Planning. ABC-CLIO, LLC, 2016. http://dx.doi.org/10.5040/9798400633102.

Full text
Abstract:
This easy-to-use primer will empower anyone—even those with no IT background—to face the challenge of adding one or more technologies to library services or maintaining existing technologies. Most of the public libraries in the United States are operating on tight budgets without allocation for IT personnel; in school libraries, the librarian often takes on the lion's share of IT responsibility. This book is an invaluable guide for library staff members who are put in the position of maintaining their own networks and computers with very little training or support. Authored by an assistant library director with years of firsthand experience working as a solo IT manager within public libraries, this guide provides practical information about overcoming the unique challenges of managing IT in a smaller organization, juggling multiple job roles, being limited by a restrictive budget, and working directly with the public. Crash Course in Technology Planning addresses a wide variety of IT topics in the library sphere, providing information in a logical manner and order. It begins with an explanation of triaging existing IT issues, then moves into diagnosing and repairing both individual PCs as well as the library Local Area Network (LAN). The following chapters cover other important topics, such as the best way to inventory computers and equipment, how to budget for and procure new equipment, and recommended ways for an IT layperson to set and achieve goals.
APA, Harvard, Vancouver, ISO, and other styles
5

Horan, Timothy. Create Your School Library Writing Center. ABC-CLIO, LLC, 2016. http://dx.doi.org/10.5040/9798400633164.

Full text
Abstract:
Colleges typically have writing centers to which students can bring their writing assignments to a peer tutor for assistance, but most high schools and middle schools do not. This book advocates for the creation of writing centers in 7–12 schools and explains why the school library is the best place for the writing center. There is a glaring absence of writing centers in today's K–12 schools. More and more students are being asked in college entrance testing to submit samples of their writing, and employers are expecting their workers to write correctly and clearly. This book addresses the critical lack of writing centers below the undergraduate level. It demonstrates how middle school and high school librarians can create writing centers in their school libraries, explains how to assist students through a one-on-one writing tutorial method, and gives students and teachers the tools for learning and understanding the complex art of writing. Author Timothy Horan, inventor of the School Library Writing Center, establishes why school libraries represent the best, and most logical, places to create writing centers, and why school librarians are the natural choice to direct writing center operations. He then takes readers through the process of creating a writing center from original conception up through opening day. Additional topics covered include how to publicize and "grow" your School Library Writing Center; maintaining your writing center for efficient operation on a daily basis as well as for years to come; how to become an effective writing center director and writing tutor; the most current technology that can be used to assist in the writing, composition, and research process; and working with English language learner (ELL) students within your writing center.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Assistant logiciel"

1

Chatzikyriakidis, Stergios, and Zhaohui Luo. "Proof Assistants for Natural Language Semantics." In Logical Aspects of Computational Linguistics. Celebrating 20 Years of LACL (1996–2016). Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-53826-5_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

El-Fiqi, Heba, Kathryn Kasmarik, and Hussein A. Abbass. "Logical Shepherd Assisting Air Traffic Controllers for Swarm UAV Traffic Control Systems." In Unmanned System Technologies. Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-60898-9_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Nirmala Devi, K., Pinaka Pani Kasetty, Surya Prakash Kukati, and Srikanth Kolakani. "IoT Based Logical Smart Glove Design with Voice Assistance to Support Deaf and Dumb People." In Lecture Notes in Electrical Engineering. Springer Nature Singapore, 2025. https://doi.org/10.1007/978-981-96-3694-5_25.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Maemura, Yu Oliver. "Jijodoryoku: The Spirit of Self-Help in Development Cooperation." In The Semantics of Development in Asia. Springer Nature Singapore, 2024. http://dx.doi.org/10.1007/978-981-97-1215-1_9.

Full text
Abstract:
Jijodoryoku "self-help effort" is a central and prominent ODA policy concept that undoubtedly resonates amongst policymakers and industry stakeholders in Japan. However, attempts to systematically define and operationalize the concept are noticeably lacking within the development assistance literature. What such a policy principle entails in terms of its impact on, or coherence with, institutional practice, program design, or project logic, is worthy of critical examination. This chapter critically re-examines and attempts to refine the definition of jijodoryoku as a policy principle, by considering how the conceptualization of "self-help" in Japanese and international ODA policy has evolved over time. The chapter illustrates how current Japanese policy conceptualizes the term as an approach to ODA by examining the logical structure of high-level Japanese ODA policy, while distinguishing it from "self-reliance", which can also be observed in global development policy. The historical and comparative examination of ODA policies presented in this chapter reveals how the policy principle of "support for self-help efforts" can be traced back to discussions among DAC countries regarding the sharing of the financial burden of assistance, which contrasts starkly with its current vernacular usage as an a priori condition of the approach to Japanese ODA. The chapter ends by considering the roots of "self-help" in Japan to argue that, rather than a concept denoting praxis, it would be more accurate to conceptualize jijodoryoku as a highly general and abstract term denoting the social-psychological state (i.e., the "spirit") of a recipient, which could then be utilized to stipulate and evaluate general conditions for ODA recipients.
APA, Harvard, Vancouver, ISO, and other styles
5

Chan, Jonathan, and Stephanie Weirich. "Stratified Type Theory." In Lecture Notes in Computer Science. Springer Nature Switzerland, 2025. https://doi.org/10.1007/978-3-031-91118-7_10.

Full text
Abstract:
A hierarchy of type universes is a rudimentary ingredient in the type theories of many proof assistants to prevent the logical inconsistency resulting from combining dependent functions and the type-in-type axiom. In this work, we argue that a universe hierarchy is not the only option for universes in type theory. Taking inspiration from Leivant's Stratified System F, we introduce Stratified Type Theory, where rather than stratifying universes by levels, we stratify typing judgements and restrict the domain of dependent functions to strictly lower levels. Even with type-in-type, this restriction suffices to enforce consistency. We consider a number of extensions beyond just stratified dependent functions. First, a subsystem employs McBride's crude-but-effective stratification (also known as displacement) as a simple form of level polymorphism, where global definitions with concrete levels can be displaced uniformly to any higher level. Second, to recover some expressivity lost due to the restriction on dependent function domains, the full theory includes a separate nondependent function type with a floating domain whose level matches that of the overall function type. Finally, we have implemented a prototype type checker for the theory extended with datatypes and inference for level and displacement annotations, along with a small core library. We have proven the subsystem to be consistent and the full theory to be type safe, but consistency of the full theory remains an open problem, largely due to the interaction between floating functions and cumulativity of judgements. Nevertheless, we believe the full theory to be consistent, and as evidence have verified the ill-typedness of some well-known type-theoretic paradoxes using our implementation.
APA, Harvard, Vancouver, ISO, and other styles
6

"Operator Assistance." In Why Did the Logician Cross the Road? Bloomsbury Academic, 2021. http://dx.doi.org/10.5040/9781350178946.ch-006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Dingli, Alexiei, Charlie Abela, and Ilenia D'Ambrogio. "A Pervasive Assistant for Hospital Staff." In Ambient Intelligence and Smart Environments. IOS Press, 2011. https://doi.org/10.3233/978-1-60750-795-6-751.

Full text
Abstract:
Owing to increasing population health needs, the ratio of nurses and doctors to patients keeps diminishing, yet the quality of healthcare services is expected to rise. PINATA seeks to tackle this matter by merging Ambient Intelligence (AmI) and semantic web technologies. The quality of healthcare services is enhanced by the use of pervasive devices that help doctors and nurses concentrate on the patient. In this paper we discuss similar AmI system architectures; summarize the physical and logical design of PINATA; and give details of the knowledge base modelled in RDF/S and the ontologies designed for components of interest, including resources and context-related notions. An additional set of classes and properties amalgamates these individual models into an associated, context-rich data model. The presented results are based on a number of tests and are very promising.
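As a purely illustrative sketch of the kind of RDF/S modelling described above (the namespace, class, and property names are hypothetical, not taken from PINATA), a small context-rich staff/patient graph could be built and queried with Python's rdflib:

```python
from rdflib import Graph, Namespace, RDF, RDFS

# Hypothetical namespace; PINATA's actual ontology IRIs are not given in the abstract.
HOSP = Namespace("http://example.org/hospital#")

g = Graph()
g.bind("hosp", HOSP)

# Minimal RDF/S class hierarchy covering resources and context-related notions.
g.add((HOSP.Staff, RDF.type, RDFS.Class))
g.add((HOSP.Nurse, RDFS.subClassOf, HOSP.Staff))
g.add((HOSP.Patient, RDF.type, RDFS.Class))
g.add((HOSP.Ward, RDF.type, RDFS.Class))

# Properties that amalgamate the individual models into one context-rich data model.
g.add((HOSP.assignedTo, RDF.type, RDF.Property))
g.add((HOSP.locatedIn, RDF.type, RDF.Property))

# Example individuals: a nurse assigned to a patient, both currently on the same ward.
g.add((HOSP.nurse_1, RDF.type, HOSP.Nurse))
g.add((HOSP.patient_7, RDF.type, HOSP.Patient))
g.add((HOSP.nurse_1, HOSP.assignedTo, HOSP.patient_7))
g.add((HOSP.nurse_1, HOSP.locatedIn, HOSP.ward_3))
g.add((HOSP.patient_7, HOSP.locatedIn, HOSP.ward_3))

# A pervasive device could ask: which of my assigned patients are on the ward I am in?
query = """
SELECT ?patient WHERE {
  ?nurse   hosp:assignedTo ?patient .
  ?nurse   hosp:locatedIn  ?ward .
  ?patient hosp:locatedIn  ?ward .
}
"""
for row in g.query(query, initNs={"hosp": HOSP}):
    print(row.patient)
```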
APA, Harvard, Vancouver, ISO, and other styles
8

Tharpe, Kimberly. "Assisting Teachers With Grieving Students." In Advances in Educational Marketing, Administration, and Leadership. IGI Global, 2024. http://dx.doi.org/10.4018/979-8-3693-1375-6.ch007.

Full text
Abstract:
The educational setting has provided students with supports in the event of emotional struggles such as grief. Teachers are often the first individual a student seeks for support and some understanding. This chapter is divided into logical, structured sections that guide educators and counselors through understanding the roles of the school, school counselors, and teachers when effectively assisting grieving students. Issues of grief associated with the impact of COVID on schools will be addressed. A brief theoretical background examining relational developmental systems (RDS) metatheory as a conceptual framework for understanding the outcomes of school relationships and connectivity will be discussed. Additionally, specific strategies will be provided for school counselors to use when assisting teachers with grieving students.
APA, Harvard, Vancouver, ISO, and other styles
9

Schlichtkrull, Anders. "Formalization of Algorithms and Logical Inference Systems in Proof Assistants." In Frontiers in Artificial Intelligence and Applications. IOS Press, 2015. https://doi.org/10.3233/978-1-61499-589-0-188.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Ciabattoni, Agata, Thomas Vetterlein, and Klaus-Peter Adlassnig. "A Formal Logical Framework for Cadiag-2." In Studies in Health Technology and Informatics. IOS Press, 2009. https://doi.org/10.3233/978-1-60750-044-5-648.

Full text
Abstract:
Cadiag-2, where “Cadiag” stands for “computer-assisted diagnosis”, is an expert system based on fuzzy logic that assists in differential diagnosis in internal medicine. With its aid, conjectures about present diseases can be derived from possibly vague information about a patient's symptoms, signs, laboratory test results, and clinical findings. In this paper, we provide a mathematical formalization of the inferential mechanism of Cadiag-2. The aim is to have at hand a formal logical calculus that corresponds to the mode of operation of Cadiag-2 and that is, among other things, needed to perform consistency checking of Cadiag-2's medical knowledge base.
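A minimal sketch of this style of fuzzy inference (the rules, weights, and max-min composition below are generic illustrations, not Cadiag-2's actual knowledge base or calculus) might look like this in Python:

```python
# Minimal fuzzy-inference sketch: each rule says "IF symptom THEN disease" with a
# strength-of-confirmation weight in [0, 1]; evidence degrees are also in [0, 1].
# Hypothetical rules and symptoms for illustration only.
rules = [
    ("elevated_crp", "rheumatoid_arthritis", 0.6),
    ("joint_swelling", "rheumatoid_arthritis", 0.8),
    ("morning_stiffness", "rheumatoid_arthritis", 0.7),
]

# Degrees to which the (possibly vague) patient findings are present.
evidence = {"elevated_crp": 0.9, "joint_swelling": 0.5, "morning_stiffness": 1.0}

def infer(rules, evidence):
    """Per disease, take the maximum over rules of min(evidence degree, rule weight),
    i.e. a common max-min composition."""
    conjectures = {}
    for symptom, disease, weight in rules:
        degree = min(evidence.get(symptom, 0.0), weight)
        conjectures[disease] = max(conjectures.get(disease, 0.0), degree)
    return conjectures

print(infer(rules, evidence))  # {'rheumatoid_arthritis': 0.7}
```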
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Assistant logiciel"

1

Geetha, R., S. Kalpana, R. Roopa Chandrika, M. Preetha, P. Santhoshini, and Ezhil E. Nithila. "An Advanced Decentralized Architecture enabled Logical Identity based Encryption Logic using Blockchain Assistance." In 2024 4th International Conference on Intelligent Technologies (CONIT). IEEE, 2024. http://dx.doi.org/10.1109/conit61985.2024.10626801.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

S, Arvinthan D., Gugan K., Kalaimanivel K. A., and Kumar P. "Smart Industrial Automation System Using Logical Sensors with IoT Assistance." In 2023 3rd International Conference on Pervasive Computing and Social Networking (ICPCSN). IEEE, 2023. http://dx.doi.org/10.1109/icpcsn58827.2023.00188.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Maiorano, Massimo, and Enrico Sciubba. "Heat Exchangers Networks Synthesis and Optimisation Performed by an Exergy-Based Expert Assistant." In ASME 1999 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/imece1999-0851.

Full text
Abstract:
This paper presents a novel method for the design of “optimal” (or quasi-optimal) heat exchanger networks (HEN). The method consists of an Expert System (“ES”) based on a small number of powerful and strongly selective heuristic rules. The important contribution of this study does not lie in the formulation of the rules, which have been adapted from the existing literature, but in their expression as logical propositions and in their subsequent implementation in a prototype ES that performs interactively with the user. It is not unusual to find chemical processes with as many as 100 interacting streams, and even simple thermal processes, excluding refineries and chemical plants, contain at least a 10-stream HEN: hence the high demand for an “automatic” (in some sense) Design Procedure that may conveniently be adapted to design-and-optimisation problems. Pinch Technology (“PT”), at present the almost universally adopted design procedure, is very successful in most types of applications (except in cases where mechanical and thermal power must be optimised concurrently), but it constitutes an operative tool and does not improve its user’s comprehension of the problem: it assumes, rather, that the user is already familiar with the design of HEN. The approach we present in this paper is entirely different: we do not “mask” the thermodynamic and thermo-economic principles that guide the engineer along the path towards the “optimal” HEN configuration, and we do not allow concerns about “user friendliness” to impair the necessary participation of the user in the HEN synthesis procedure. In fact, though our ES (which we prefer to call “Expert Assistant”, to underline its peculiarity of constantly interacting with the user) still lacks many of the capabilities that a good designer possesses, the underlying procedure is, unlike any of the other existing Design-and-Optimisation Procedures, entirely inspectable by the user as far as its decision-making rules are concerned. It can be interrogated about its decision making, so that the logical path followed from the design data to the final solution can be inspected at will, and it can be used to directly compare different alternatives in a logically systematic fashion. The paper begins with a brief review of the HEN design problem, followed by a critical discussion of the heuristic rules that form the basis for the Inference Engine of the Expert System. The formalisation of these rules into logical propositions suitable for Knowledge Based Methods is then presented, and the resulting macrocode developed. As a preliminary validation, two examples of application of the code (named Heat Exchanger Network Expert Assistant, HENEA for short) are presented and discussed: since both cases have been published and their “optimal” solutions are known, the performance of HENEA can be assessed by comparison.
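As a toy illustration of what encoding a HEN heuristic as a logical proposition can look like (the rule, stream data, and minimum approach temperature below are generic textbook-style assumptions, not HENEA's actual rule base), a candidate-match predicate over hot/cold stream pairs could be written as:

```python
from dataclasses import dataclass
from itertools import product

# Generic stream description for illustration; HENEA's data model is not shown in the abstract.
@dataclass
class Stream:
    name: str
    t_supply: float   # supply temperature, degC
    t_target: float   # target temperature, degC
    mcp: float        # heat capacity flow rate, kW/K

    @property
    def is_hot(self) -> bool:
        return self.t_supply > self.t_target

DT_MIN = 10.0  # assumed minimum approach temperature, K

def feasible_match(hot: Stream, cold: Stream) -> bool:
    """Logical proposition: a hot/cold pair is a candidate match if the hot stream
    enters hotter than the cold stream leaves by at least DT_MIN (a simplified
    counter-current feasibility check)."""
    return hot.is_hot and not cold.is_hot and hot.t_supply - cold.t_target >= DT_MIN

streams = [
    Stream("H1", 180.0, 60.0, 3.0),
    Stream("H2", 150.0, 30.0, 1.5),
    Stream("C1", 20.0, 135.0, 2.0),
    Stream("C2", 80.0, 145.0, 4.0),
]

hots = [s for s in streams if s.is_hot]
colds = [s for s in streams if not s.is_hot]
candidates = [(h.name, c.name) for h, c in product(hots, colds) if feasible_match(h, c)]
print(candidates)  # [('H1', 'C1'), ('H1', 'C2'), ('H2', 'C1')]
```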
APA, Harvard, Vancouver, ISO, and other styles
4

Schroeder, Gustavo Lazarotto, Leonardo dos Santos Paula, Rosemary Francisco, and Jorge Luis Victória Barbosa. "A-Track: An Ontological Approach to Assisting Anxiety Management." In Simpósio Brasileiro de Computação Aplicada à Saúde. Sociedade Brasileira de Computação - SBC, 2025. https://doi.org/10.5753/sbcas.2025.6928.

Full text
Abstract:
Anxiety, a natural survival mechanism, can escalate under modern stressors into chronic disorders with multifaceted health impacts. While early detection is crucial, healthcare systems struggle with scalability. This study introduces the A-Track Ontology, a digital tool designed to model anxiety through personalized context histories. Validated through logical consistency, domain coverage, and utility assessments, the ontology synthesizes multimodal data into actionable insights for proactive intervention. By integrating ontological reasoning with real-world context awareness, this approach addresses clinical scalability gaps, enabling personalized, data-driven strategies for anxiety management.
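As a small, purely hypothetical sketch of the "personalized context history" idea (the record fields, threshold, and rule are invented for illustration and are not part of the A-Track Ontology), a context history could be scanned for an actionable insight such as sustained high anxiety:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical context record; the ontology's actual classes and properties are not
# detailed in the abstract.
@dataclass
class ContextRecord:
    timestamp: str
    heart_rate: int             # bpm, e.g. from a wearable
    self_reported_anxiety: int  # 0-10 scale

def sustained_high_anxiety(history: List[ContextRecord],
                           threshold: int = 7, window: int = 3) -> bool:
    """Actionable-insight rule: flag if the last `window` records all report
    anxiety at or above `threshold`."""
    recent = history[-window:]
    return len(recent) == window and all(r.self_reported_anxiety >= threshold for r in recent)

history = [
    ContextRecord("2025-03-01T09:00", 72, 4),
    ContextRecord("2025-03-01T13:00", 88, 7),
    ContextRecord("2025-03-01T17:00", 95, 8),
    ContextRecord("2025-03-01T21:00", 91, 9),
]
print(sustained_high_anxiety(history))  # True: the last three records are all >= 7
```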
APA, Harvard, Vancouver, ISO, and other styles
5

McVea, William M., and Kamyar Haghighi. "KADS2: A Next Generation Knowledge Aided Design System." In ASME 1997 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1997. http://dx.doi.org/10.1115/detc97/dac-4312.

Full text
Abstract:
Research has been conducted in the areas of design methodology, automation, and the use of knowledge-based systems as tools to improve design efficiency, accuracy, and consistency for mechanical power transmissions. The research capitalized on previous work related to component-level design synthesis and analysis. The next logical step in the research progression was to look into system development and the integration of design synthesis and analysis tools. Deliverables from this research include new knowledge acquisition techniques, a more complete model of design information flow and development, and a knowledge-based design assistant system capable of integrating multiple discrete and disparate design tools.
APA, Harvard, Vancouver, ISO, and other styles
6

Srivastava, Shashank, Amos Azaria, and Tom Mitchell. "Parsing Natural Language Conversations using Contextual Cues." In Twenty-Sixth International Joint Conference on Artificial Intelligence. International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/571.

Full text
Abstract:
In this work, we focus on semantic parsing of natural language conversations. Most existing methods for semantic parsing are based on understanding the semantics of a single sentence at a time. However, understanding conversations also requires an understanding of conversational context and discourse structure across sentences. We formulate semantic parsing of conversations as a structured prediction task, incorporating structural features that model the “flow of discourse” across sequences of utterances. We create a dataset for semantic parsing of conversations, consisting of 113 real-life sequences of interactions of human users with an automated email assistant. The data contains 4759 natural language statements paired with annotated logical forms. Our approach yields significant gains in performance over traditional semantic parsing.
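To make the idea of discourse-aware structured prediction concrete, here is a minimal sketch (the feature names, greedy decoder, and scoring interface are illustrative assumptions, not the paper's actual feature set or model):

```python
# Illustrative discourse-aware features for parsing a conversation into logical forms.
from typing import Dict, List

def contextual_features(utterances: List[str], i: int, prev_logical_form: str) -> Dict[str, float]:
    """Features for utterance i that look beyond the sentence itself, modelling the
    'flow of discourse': position in the dialogue, the previously predicted logical
    form, and whether the previous utterance was a question."""
    tokens = utterances[i].lower().split()
    feats = {f"word={w}": 1.0 for w in tokens}
    feats["position_in_dialogue"] = i / max(len(utterances) - 1, 1)
    feats[f"prev_form={prev_logical_form}"] = 1.0        # discourse-transition feature
    if i > 0 and utterances[i - 1].rstrip().endswith("?"):
        feats["follows_question"] = 1.0                   # e.g. elliptical answers
    return feats

def parse_conversation(utterances, candidate_forms, score):
    """Greedy left-to-right structured prediction: each utterance's logical form is
    scored jointly with the form chosen for the previous utterance."""
    parses, prev = [], "<START>"
    for i in range(len(utterances)):
        feats = contextual_features(utterances, i, prev)
        prev = max(candidate_forms, key=lambda lf: score(feats, lf))
        parses.append(prev)
    return parses
```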
APA, Harvard, Vancouver, ISO, and other styles
7

Liu, Qian, Robert de Simone, Xiaohong Chen, and Jing Liu. "Multiform Logical Time & Space for Specification of Automated Driving Assistance Systems: Work-in-Progress." In 2020 International Conference on Embedded Software (EMSOFT). IEEE, 2020. http://dx.doi.org/10.1109/emsoft51651.2020.9244041.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Liu, Qian, Robert de Simone, Xiaohong Chen, et al. "Multiform Logical Time & Space for Mobile Cyber-Physical System With Automated Driving Assistance System." In 2020 27th Asia-Pacific Software Engineering Conference (APSEC). IEEE, 2020. http://dx.doi.org/10.1109/apsec51365.2020.00050.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Wakatani, Akiyoshi, and Toshiyuki Maeda. "Prototype Advisory System for Learning C Programming Using Generative AI." In 15th International Conference on Education and Educational Psychology. Emanate Publishing House Ltd., 2024. https://doi.org/10.70020/eesh.2024.12.5.

Full text
Abstract:
The technology of generative AI using LLMs (Large Language Models) has made remarkable progress, and attempts to apply program code generation technology to programming education have attracted much attention. In this paper, we evaluate the suitability for beginner-level learners of a virtual TA (Teaching Assistant) system using OpenAI’s API, which takes C programs with errors and their error messages as input to the LLM and outputs appropriate advice. For syntactic and semantic errors, the proposed system was generally effective in generating advice appropriate enough for learners to solve the problems on their own. For logical errors, the proposed system was also generally effective in generating appropriate advice, although in some cases the advice was only a general explanation. Moreover, there are cases in which the prompts give appropriate advice to review the part of the statement that includes division, so appropriate advice may be obtained by asking for advice more than once.
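A minimal sketch of how such a virtual TA might be wired to OpenAI's chat completions API (the model name, prompt wording, and function below are illustrative assumptions, not the system described in the paper):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def advise(c_source: str, compiler_output: str) -> str:
    """Ask the LLM for beginner-friendly advice rather than a corrected program."""
    prompt = (
        "You are a teaching assistant for a C programming course. "
        "Given the student's program and the compiler/runtime messages, "
        "explain the likely cause of the error and give a hint, "
        "but do not write the corrected code.\n\n"
        f"Program:\n{c_source}\n\nMessages:\n{compiler_output}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # model choice is an assumption, not the paper's
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```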
APA, Harvard, Vancouver, ISO, and other styles
10

Castro, Elena, Dolores Cuadra, Paloma Martinez, and Ana Iglesias. "Integrating Intelligent Methodological and Tutoring Assistance in a CASE Platform: The PANDORA Experience." In 2002 Informing Science + IT Education Conference. Informing Science Institute, 2002. http://dx.doi.org/10.28945/2458.

Full text
Abstract:
The database design discipline involves aspects as different as conceptual and logical modelling knowledge and domain understanding. That implies a great effort to carry out the real-world abstraction task and represent it through a data model. CASE tools emerged in order to automate the database development process. These platforms try to help the database designer in the different database design phases. Nevertheless, these tools are frequently mere diagrammers and do not completely carry out the design methodology that they are supposed to support; furthermore, they do not offer intelligent methodological advice to novice designers. This paper introduces the PANDORA tool (acronym of Platform for Database Development and Learning via Internet), which is being developed in a research project that tries to mitigate some of the deficiencies observed in several CASE tools by defining methods and techniques for database development which are useful for students and practitioners. Specifically, this work is focused on two PANDORA components: the Conceptual Modelling and Learning Support subsystems.
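As a toy sketch of the conceptual-to-logical mapping step that such CASE platforms automate (the entity, attributes, and generated SQL are invented for illustration; PANDORA's actual transformation rules are not described in the abstract):

```python
# Toy conceptual-to-logical mapping: turn an entity description into a CREATE TABLE
# statement, the kind of step a database-design CASE tool automates for the designer.
def entity_to_sql(entity: str, attributes: dict, primary_key: str) -> str:
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in attributes.items())
    return (
        f"CREATE TABLE {entity} (\n"
        f"  {cols},\n"
        f"  PRIMARY KEY ({primary_key})\n"
        f");"
    )

# Hypothetical conceptual schema fragment.
print(entity_to_sql(
    "Student",
    {"student_id": "INTEGER", "name": "VARCHAR(100)", "enrolment_year": "INTEGER"},
    "student_id",
))
```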
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Assistant logiciel"

1

Reproductive decisionmaking in the context of HIV/AIDS in Ndola, Zambia. Population Council, 1999. http://dx.doi.org/10.31899/rh1999.1018.

Full text
Abstract:
Family planning (FP) programs are increasingly being considered as a logical focal point for STD and HIV/AIDS prevention services because they serve large numbers of women at risk, they address the sensitive issues of sexual behavior and fertility control, and the methods for preventing unwanted pregnancy and disease can be the same. FP programs, by providing contraceptive methods, are currently one of the few sources of assistance in the sub-Saharan African region for preventing perinatal transmission of HIV, while the promotion of barrier methods contributes to the prevention of heterosexual transmission. Given this potential, research is needed to understand how the HIV epidemic influences reproductive decision-making. The Africa OR/TA II Project undertook an exploratory study of women’s and men’s attitudes and experiences regarding reproductive decision-making in a setting of high HIV prevalence in Ndola, Zambia. The objectives, as described in this report, were to examine perceptions of risk among men and women living in a high HIV-prevalence setting, to explore how these perceptions are related to decisions about childbearing and contraceptive use, and to identify opportunities for FP programs to expand services to address HIV prevention.
APA, Harvard, Vancouver, ISO, and other styles