Academic literature on the topic 'Inductive supervised learning'
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Inductive supervised learning.'
Journal articles on the topic "Inductive supervised learning"
Wu, Haiping, Khimya Khetarpal, and Doina Precup. "Self-Supervised Attention-Aware Reinforcement Learning." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10311–19. http://dx.doi.org/10.1609/aaai.v35i12.17235.
Bisio, Federica, Sergio Decherchi, Paolo Gastaldo, and Rodolfo Zunino. "Inductive bias for semi-supervised extreme learning machine." Neurocomputing 174 (January 2016): 154–67. http://dx.doi.org/10.1016/j.neucom.2015.04.104.
Hovsepian, Karen, Peter Anselmo, and Subhasish Mazumdar. "Supervised inductive learning with Lotka–Volterra derived models." Knowledge and Information Systems 26, no. 2 (January 16, 2010): 195–223. http://dx.doi.org/10.1007/s10115-009-0280-5.
Juan, Liu, and Li Weihua. "A hybrid genetic algorithm for supervised inductive learning." Wuhan University Journal of Natural Sciences 1, no. 3-4 (December 1996): 611–16. http://dx.doi.org/10.1007/bf02900895.
B, Amarnath, and S. Appavu alias Balamurugan. "Feature Selection for Supervised Learning via Dependency Analysis." Journal of Computational and Theoretical Nanoscience 13, no. 10 (October 1, 2016): 6885–91. http://dx.doi.org/10.1166/jctn.2016.5642.
Zhu, Ruifeng, Fadi Dornaika, and Yassine Ruichek. "Inductive semi-supervised learning with Graph Convolution based regression." Neurocomputing 434 (April 2021): 315–22. http://dx.doi.org/10.1016/j.neucom.2020.12.084.
Yang, Shuyi, Dino Ienco, Roberto Esposito, and Ruggero G. Pensa. "ESA☆: A generic framework for semi-supervised inductive learning." Neurocomputing 447 (August 2021): 102–17. http://dx.doi.org/10.1016/j.neucom.2021.03.051.
Dornaika, F., R. Dahbi, A. Bosaghzadeh, and Y. Ruichek. "Efficient dynamic graph construction for inductive semi-supervised learning." Neural Networks 94 (October 2017): 192–203. http://dx.doi.org/10.1016/j.neunet.2017.07.006.
Zhang, Zhao, Lei Jia, Mingbo Zhao, Qiaolin Ye, Min Zhang, and Meng Wang. "Adaptive non-negative projective semi-supervised learning for inductive classification." Neural Networks 108 (December 2018): 128–45. http://dx.doi.org/10.1016/j.neunet.2018.07.017.
Tian, Xilan, Gilles Gasso, and Stéphane Canu. "A multiple kernel framework for inductive semi-supervised SVM learning." Neurocomputing 90 (August 2012): 46–58. http://dx.doi.org/10.1016/j.neucom.2011.12.036.
Full textDissertations / Theses on the topic "Inductive supervised learning"
Khalid, Fahad. "Measure-based Learning Algorithms: An Analysis of Back-propagated Neural Networks." Thesis, Blekinge Tekniska Högskola, Avdelningen för interaktion och systemdesign, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4795.
The study investigates the feasibility of using a generic inductive bias for backpropagation artificial neural networks, one that could incorporate any single problem-specific performance metric, or a combination of them, as the quantity to be optimized. We identify several limitations of both the standard error backpropagation mechanism and the inherent gradient-search approach. These limitations suggest exploring methods other than backpropagation, as well as using global search methods instead of gradient search. We also emphasize the importance of taking the representational bias of the neural network into consideration, since only a combination of procedural and representational bias can provide highly optimized solutions.
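As a rough, hypothetical illustration of the idea this abstract sketches, optimizing a small network directly against a problem-specific performance measure with a global, non-gradient search instead of backpropagating a fixed error, the following minimal Python sketch uses random hill-climbing on an F1 measure. The toy data, network shape, measure, and search procedure are all assumptions for illustration, not the thesis's actual algorithm.

```python
# Minimal sketch (illustrative assumptions only): optimize a tiny one-hidden-layer
# network against an arbitrary performance measure (here F1) by random hill-climbing,
# rather than backpropagating a fixed squared-error loss.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data, assumed for illustration.
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

def forward(params, X):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def f1_measure(y_true, y_prob):
    """Example of a problem-specific measure optimized directly."""
    y_pred = (y_prob >= 0.5).astype(float)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return 2 * tp / (2 * tp + fp + fn + 1e-12)

def init_params(n_in=2, n_hidden=8):
    return [rng.normal(scale=0.5, size=(n_in, n_hidden)),
            np.zeros(n_hidden),
            rng.normal(scale=0.5, size=(n_hidden, 1)),
            np.zeros(1)]

# Global-style search: perturb the current weights and keep the candidate
# whenever the chosen measure improves.
params = init_params()
best = f1_measure(y, forward(params, X).ravel())
for step in range(2000):
    candidate = [p + rng.normal(scale=0.05, size=p.shape) for p in params]
    score = f1_measure(y, forward(candidate, X).ravel())
    if score >= best:
        params, best = candidate, score

print(f"F1 after random hill-climbing: {best:.3f}")
```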
Carroll, James Lamond. "A Bayesian Decision Theoretical Approach to Supervised Learning, Selective Sampling, and Empirical Function Optimization." Diss., Brigham Young University, 2010. http://contentdm.lib.byu.edu/ETD/image/etd3413.pdf.
Lehmann, Jens. "Learning OWL Class Expressions." Doctoral thesis, Universitätsbibliothek Leipzig, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-38351.
Full textBayoudh, Meriam. "Apprentissage de connaissances structurelles à partir d’images satellitaires et de données exogènes pour la cartographie dynamique de l’environnement amazonien." Thesis, Antilles-Guyane, 2013. http://www.theses.fr/2013AGUY0671/document.
Classical methods for satellite image analysis are inadequate for today's massive data flows, so automating the interpretation of such images becomes crucial for analyzing and managing phenomena that change in time and space and are observable by satellite. This work therefore aims at automating land-cover cartography from satellite images through an expressive and easily interpretable mechanism that explicitly takes into account the structural aspects of geographic information. It is part of the object-based image analysis framework and assumes that useful contextual knowledge can be extracted from maps. A supervised parameterization method for a segmentation algorithm is first proposed. Secondly, a supervised classification of geographic objects is presented; it combines machine learning by inductive logic programming with a multi-class rule-set intersection approach. These approaches are applied to the cartography of the French Guiana coastline. The results demonstrate the feasibility of the segmentation parameterization, but also its variability as a function of the reference-map classes and of the input data; nevertheless, the methodological developments make an operational implementation of such an approach conceivable. The results of the supervised object classification show that it is possible to induce expressive classification rules that convey consistent structural information in a given application context and lead to reliable predictions, with overall accuracy and Kappa values of 84.6% and 0.7, respectively. In conclusion, this work contributes to the automation of dynamic cartography from remotely sensed images and proposes original and promising perspectives.
Motta, Eduardo Neves. "Supervised Learning Incremental Feature Induction and Selection." Pontifícia Universidade Católica do Rio de Janeiro, 2014. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=28688@1.
Non-linear feature induction from basic features is a way to obtain more precise predictive models for classification problems. However, feature induction may rapidly lead to a huge number of features, causing overfitting and models with low predictive power. To prevent this side effect, regularization techniques are employed to obtain a trade-off between a reduced feature set representative of the domain and generalization power. In this work, we describe a supervised machine learning approach that incrementally induces and selects feature conjunctions derived from base features. This approach integrates decision trees, support vector machines, and feature selection using sparse perceptrons in a machine learning framework named IFIS (Incremental Feature Induction and Selection). Using IFIS, we generate regularized non-linear models with high performance using a linear algorithm. We evaluate our system on two natural language processing tasks in two different languages. For the first task, POS tagging, we use two corpora: the WSJ corpus for English and Mac-Morpho for Portuguese. Our results are competitive with state-of-the-art performance on both, with accuracies of 97.14 per cent and 97.13 per cent, respectively. For the second task, dependency parsing, we use the CoNLL 2006 Shared Task Portuguese corpus, surpassing the results reported during that competition and remaining competitive with the state of the art for this task, with a UAS score of 92.01 per cent. Applying model regularization with a sparse perceptron, we obtain SVM models up to 10 times smaller while maintaining their accuracy. Model reduction is achieved through regularization of the feature domains, which can reach 99 per cent, and shrinks the physical size of the models by up to 82 per cent. Prediction time is cut by up to 84 per cent. Downsizing domains and models also improves feature engineering, through compact domain analysis and the incremental introduction of new features.
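As a rough, hypothetical illustration of the kind of pipeline this abstract outlines, inducing non-linear conjunction features, pruning them with a sparse L1-regularized linear model, and training a linear SVM on the reduced set, the following minimal scikit-learn sketch treats each leaf of a shallow decision tree as a conjunction of threshold tests on the base features. It is not the authors' IFIS implementation; the synthetic dataset, tree depth, and regularization strength are assumptions.

```python
# Minimal sketch (illustrative assumptions only) of a feature-induction-and-selection
# pipeline: tree-leaf conjunction features -> sparse L1 selection -> linear SVM.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectFromModel
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

# Synthetic data standing in for base features (assumed for illustration).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) Induce conjunction features: each leaf of a shallow decision tree corresponds
#    to a conjunction of threshold tests on the base features.
tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)
enc = OneHotEncoder(handle_unknown="ignore")
F_tr = enc.fit_transform(tree.apply(X_tr).reshape(-1, 1))
F_te = enc.transform(tree.apply(X_te).reshape(-1, 1))

# 2) Select a sparse subset of the induced features with an L1-penalized linear
#    model (playing the role the sparse perceptron plays in the abstract).
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
).fit(F_tr, y_tr)
S_tr, S_te = selector.transform(F_tr), selector.transform(F_te)

# 3) Train a linear SVM on the reduced, regularized feature set.
svm = LinearSVC().fit(S_tr, y_tr)
print("induced features:", F_tr.shape[1], "-> selected:", S_tr.shape[1])
print("test accuracy: %.3f" % accuracy_score(y_te, svm.predict(S_te)))
```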
Boonkwan, Prachya. "Scalable semi-supervised grammar induction using cross-linguistically parameterized syntactic prototypes." Thesis, University of Edinburgh, 2014. http://hdl.handle.net/1842/9808.
Packer, Thomas L. "Scalable Detection and Extraction of Data in Lists in OCRed Text for Ontology Population Using Semi-Supervised and Unsupervised Active Wrapper Induction." BYU ScholarsArchive, 2014. https://scholarsarchive.byu.edu/etd/4258.
Augier, Sébastien. "Apprentissage Supervisé Relationnel par Algorithmes d'Évolution." PhD thesis, Université Paris Sud - Paris XI, 2000. http://tel.archives-ouvertes.fr/tel-00947322.
Book chapters on the topic "Inductive supervised learning"
Zhang, Mingxing, Fumin Shen, Hanwang Zhang, Ning Xie, and Wankou Yang. "Hashing with Inductive Supervised Learning." In Lecture Notes in Computer Science, 447–55. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-24078-7_45.
Lahbib, Dhafer, Marc Boullé, and Dominique Laurent. "Itemset-Based Variable Construction in Multi-relational Supervised Learning." In Inductive Logic Programming, 130–50. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38812-5_10.
El Hamri, Mourad, Younès Bennani, and Issam Falih. "Inductive Semi-supervised Learning Through Optimal Transport." In Communications in Computer and Information Science, 668–75. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-92307-5_78.
Bisio, Federica, Sergio Decherchi, Paolo Gastaldo, and Rodolfo Zunino. "Inductive Bias for Semi-supervised Extreme Learning Machine." In Proceedings of ELM-2014 Volume 1, 61–70. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-14063-6_6.
Venturini, Gilles. "SIA: A supervised inductive algorithm with genetic search for learning attributes based concepts." In Machine Learning: ECML-93, 280–96. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/3-540-56602-3_142.
Govada, Aruna, Pravin Joshi, Sahil Mittal, and Sanjay K. Sahay. "Hybrid Approach for Inductive Semi Supervised Learning Using Label Propagation and Support Vector Machine." In Machine Learning and Data Mining in Pattern Recognition, 199–213. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21024-7_14.
Martin, Eric, Samuel Kaski, Fei Zheng, Geoffrey I. Webb, Xiaojin Zhu, Ion Muslea, Kai Ming Ting, et al. "Supervised Descriptive Rule Induction." In Encyclopedia of Machine Learning, 938–41. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_802.
Novak, Petra Kralj, Nada Lavrač, and Geoffrey I. Webb. "Supervised Descriptive Rule Induction." In Encyclopedia of Machine Learning and Data Mining, 1–4. Boston, MA: Springer US, 2016. http://dx.doi.org/10.1007/978-1-4899-7502-7_808-1.
Novak, Petra Kralj, Nada Lavrač, and Geoffrey I. Webb. "Supervised Descriptive Rule Induction." In Encyclopedia of Machine Learning and Data Mining, 1210–13. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_808.
Toscano, David S., and Enrique V. Carrera. "Failure Detection in Induction Motors Using Non-supervised Machine Learning Algorithms." In Systems and Information Sciences, 48–59. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59194-6_5.
Conference papers on the topic "Inductive supervised learning"
Shi, Yuan, Zhenzhong Lan, Wei Liu, and Wei Bi. "Extending Semi-supervised Learning Methods for Inductive Transfer Learning." In 2009 Ninth IEEE International Conference on Data Mining (ICDM). IEEE, 2009. http://dx.doi.org/10.1109/icdm.2009.75.
Hovsepian, Karen, Peter Anselmo, and Subhasish Mazumdar. "Supervised Inductive Learning with Lotka-Volterra Derived Models." In 2008 Eighth IEEE International Conference on Data Mining (ICDM). IEEE, 2008. http://dx.doi.org/10.1109/icdm.2008.108.
Zhan, Wang, and Min-Ling Zhang. "Inductive Semi-supervised Multi-Label Learning with Co-Training." In KDD '17: The 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3097983.3098141.
Sarkar, Anoop, and Gholamreza Haffari. "Inductive semi-supervised learning methods for natural language processing." In the Human Language Technology Conference of the NAACL, Companion Volume: Tutorial Abstracts. Morristown, NJ, USA: Association for Computational Linguistics, 2006. http://dx.doi.org/10.3115/1614101.1614106.
Yoo, Jaemin, Hyunsik Jeon, and U. Kang. "Belief Propagation Network for Hard Inductive Semi-Supervised Learning." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/580.
Chen, Wentao, Chenyang Si, Wei Wang, Liang Wang, Zilei Wang, and Tieniu Tan. "Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images." In Thirtieth International Joint Conference on Artificial Intelligence {IJCAI-21}. California: International Joint Conferences on Artificial Intelligence Organization, 2021. http://dx.doi.org/10.24963/ijcai.2021/313.
Wang, De, Feiping Nie, and Heng Huang. "Large-scale adaptive semi-supervised learning via unified inductive and transductive model." In KDD '14: The 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY, USA: ACM, 2014. http://dx.doi.org/10.1145/2623330.2623731.
Li, Zhi, and Zhoujun Li. "Inductive and Effective Privacy-preserving Semi-supervised Learning with Harmonic Anchor Mixture." In ISEEIE 2021: 2021 International Symposium on Electrical, Electronics and Information Engineering. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3459104.3459187.
de Sousa, Celso A. R. "An inductive semi-supervised learning approach for the Local and Global Consistency algorithm." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727722.
Cropper, Andrew. "Playgol: Learning Programs Through Play." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/841.