|Course Name:||Machine Learning Techniques: Performance and Applicability|
|Date:||June 5 to June 16, 2018|
|Professor:||Arnulfo Palacio Azcarraga / Judith J. Azcarraga|
By the end of the Machine Learning (ML) course, the participant will be able to apply the essential ML techniques in a host of application domains – and to build on these techniques and adapt them to suit the specific nature of the problem at hand. The participant will also be able to compare the various ML techniques not only on how they operate, but also on their applicability given the nature of the problem, the amount of computing resources available, and the importance of being able to verify the results produced by each ML technique. The course begins by discussing ML as just one of several approaches to building intelligent systems. Some of the basic ML techniques will then be used to illustrate the basic components of any computational intelligence system that relies on data and examples to automatically improve its performance – as opposed to being programmed to be “intelligent” by anticipating every possible scenario for the system (and being given the directives for how to deal with each scenario). Naive Bayes (NB), K-Nearest Neighbors (KNN), and Decision Trees (DT) will be used to discuss performance and verifiability as important considerations for choosing from among a wide range of possible ML techniques.
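As a taste of one of these baseline techniques, the KNN idea can be sketched in a few lines of plain Python – the toy data, function names, and choice of k below are illustrative only, not material prescribed by the course:

```python
# Minimal k-nearest-neighbors sketch: classify a query point by majority
# vote among the k training examples closest to it in feature space.
import math

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label) pairs.
    neighbors = sorted(train, key=lambda ex: euclidean(ex[0], query))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)  # majority vote

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((4.0, 4.2), "B"), ((3.8, 4.0), "B")]
print(knn_predict(train, (1.1, 0.9), k=3))  # -> A
```

KNN needs no training phase at all, which is one reason it serves as a convenient baseline: any technique that requires substantial training time should at least outperform it.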
These three techniques, often used as baselines against which better-performing ML systems are benchmarked, will also be used to familiarize the participants with RapidMiner, and possibly Weka and other off-the-shelf ML tools. The necessary steps of data preparation and synchronization, the various options for feature selection and dimensionality reduction, and the plan for training, validation, and testing will also be discussed. The important notions of supervised and unsupervised learning will likewise be covered, with supplemental discussions on Self-Organizing Maps, the K-Means algorithm, and other clustering methods suitable for Big Data applications. Wherever useful, related issues in neurobiology and cognitive science will be discussed as well. Finally, the higher-performing ML techniques, particularly Artificial Neural Networks, will be tackled. Discussions on Multi-Layered Perceptrons (MLP) will be extended to cover Convolutional Neural Networks (CNN) and Deep Learning, the cutting-edge technologies that have brought artificial neural networks into the mainstream, promising to make 2018 the “year of AI”.
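To illustrate the training/validation/testing plan mentioned above, a dataset is commonly partitioned into three disjoint subsets before any model is fit. The sketch below uses only the Python standard library; the 60/20/20 ratios and the seed are illustrative assumptions, not ratios mandated by the course:

```python
# Partition a dataset into training, validation, and test subsets after a
# reproducible shuffle. The model is fit on the training set, tuned on the
# validation set, and evaluated once on the held-out test set.
import random

def split_dataset(data, train_frac=0.6, val_frac=0.2, seed=42):
    data = list(data)
    random.Random(seed).shuffle(data)   # seeded, so the split is repeatable
    n_train = int(len(data) * train_frac)
    n_val = int(len(data) * val_frac)
    return (data[:n_train],                      # training set
            data[n_train:n_train + n_val],       # validation set
            data[n_train + n_val:])              # remainder: test set

train_set, val_set, test_set = split_dataset(range(100))
print(len(train_set), len(val_set), len(test_set))  # -> 60 20 20
```

Tools such as RapidMiner and Weka provide this kind of split (and cross-validation) as built-in operators, but seeing it written out makes clear why the test set must stay untouched until the very end.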