Advanced Topics in Artificial Intelligence

NEWS

Mar 20: slides from 19/3 are on the page; the text on Gibbs & VE has been revised.

COVID-19

Dear Students,

Due to the current limitations, classes are being given through Zoom video-conference.

Take care and thanks!

Vítor


Program


  1. Graphical Systems (Probabilistic)

  2. Neural Systems

  3. Logic-Based Approaches

  4. Other representations

  5. Integrated Systems

Instructors

Classes


Class I - Welcome Class [14/02]

Class II - The beginning is a good place to start [14/02]

Class III - BFF: Linear Regression and AI [21/02]

Class IV - Seeing Things: Linear Regression and AI [21/02]

Class V - A Light in Darkness: Probabilistic Graphical Models [28/02]

Slides available for Naive Bayes.

Class VI - Into the Deeps: Deep Learning Basics

Mini-Tasks

Inference

Implement a program that outputs the posterior probabilities of a random variable given a Bayes network and evidence.

  1. The R bnlearn repository has a collection of Bayes nets that you can use.

  2. Implement the algorithm of your choice: look into this text for a brief overview of two simple algorithms, Gibbs Sampling and Variable Elimination (a minimal Gibbs-sampling sketch follows this list). Feel free to use an existing BN file parser. Last, implement a small query-answering mechanism.

  3. Evaluate by comparing with existing libraries.
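
Below is a minimal sketch of step 2 using Gibbs sampling on a tiny hand-coded network (the classic rain/sprinkler/wet-grass example) rather than one parsed from the bnlearn repository; the CPT values and function names are illustrative only, not part of the assignment.

```python
# Gibbs sampling sketch: estimate P(Rain=True | GrassWet=True) on a toy BN.
# CPTs are hand-coded for illustration; a real solution would read a bnlearn network.
import random

p_rain = 0.2                                             # P(Rain=True)
p_sprinkler = {True: 0.01, False: 0.4}                   # P(Sprinkler=True | Rain)
p_wet = {(True, True): 0.99, (True, False): 0.9,         # P(GrassWet=True | Sprinkler, Rain)
         (False, True): 0.8, (False, False): 0.0}

def joint(rain, sprinkler, wet):
    """Probability of one full assignment (product of the CPT entries)."""
    p = p_rain if rain else 1 - p_rain
    p *= p_sprinkler[rain] if sprinkler else 1 - p_sprinkler[rain]
    p *= p_wet[(sprinkler, rain)] if wet else 1 - p_wet[(sprinkler, rain)]
    return p

def gibbs_rain_given_wet(n_samples=50_000, burn_in=1_000):
    """Estimate P(Rain=True | GrassWet=True) by resampling each free variable in turn."""
    rain, sprinkler, wet = True, True, True              # evidence GrassWet=True stays fixed
    hits = 0
    for i in range(burn_in + n_samples):
        pt, pf = joint(True, sprinkler, wet), joint(False, sprinkler, wet)
        rain = random.random() < pt / (pt + pf)          # P(Rain | rest) is proportional to joint
        pt, pf = joint(rain, True, wet), joint(rain, False, wet)
        sprinkler = random.random() < pt / (pt + pf)     # P(Sprinkler | rest) likewise
        if i >= burn_in:
            hits += rain
    return hits / n_samples

print(gibbs_rain_given_wet())                            # roughly 0.36 for these CPTs
```

The same interface, a function from evidence to a posterior estimate, is what step 3 compares against an existing library.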

Eval criteria:

Learning from data

The goal is to compare deep network technology against simpler baselines on a dataset: what kinds of techniques are useful, and how does the search for a good model proceed?

  1. First, you should choose a dataset.

  2. Is it large enough? Too large? Too many parameters? Too few? Missing data?

  3. Are there published results? If so, are they reproducible?

  4. Try Naive Bayes (or linear regression) and random forests (RFs): they will be the baseline (an end-to-end sketch follows this list).

  5. Generate a DNN. There are many ways to do so; one is to start from the Keras tutorials at tensorflow.org.

  6. Refine the model.

  7. Evaluate on held-out data, or using cross-validation.
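
A minimal end-to-end sketch of steps 4-7 follows, with a synthetic dataset (make_classification) standing in for the one you choose; swap in your own features and labels, and treat the network architecture as a starting point to refine in step 6.

```python
# Baselines (Naive Bayes, random forest) versus a small dense network on held-out data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
import tensorflow as tf

# Synthetic stand-in for your dataset: 2000 examples, 20 numeric features, 2 classes.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Step 4: simple baselines.
for name, model in [("naive bayes", GaussianNB()),
                    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    model.fit(X_train, y_train)
    print(name, "accuracy:", model.score(X_test, y_test))

# Step 5: a small dense network, in the spirit of the Keras tutorials at tensorflow.org.
dnn = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(X.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
dnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Steps 6-7: refine (layers, units, epochs, regularisation) and evaluate on held-out data.
dnn.fit(X_train, y_train, epochs=20, batch_size=32, validation_split=0.1, verbose=0)
print("dnn accuracy:", dnn.evaluate(X_test, y_test, verbose=0)[1])
```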

Eval criteria:

Deadlines:

Submission:

To submit, send the source code, a small report, and a run log (e.g., a Jupyter notebook) to vscosta AT fc.up.pt, subject TAIA.

Support Material

Tutorials

Why attend a tutorial? Introduce, explain, and comment on the material of an AI tutorial. Look for good tutorials at:

Slides

Probabilistic Systems


Main slides; also check the quick-reading slides:

  1. Adrian Weller, MLSALT4 graphical model

  2. Marc Toussaint (University of Stuttgart, Summer 2015), Machine Learning: Graphical Models

  3. For detailed information, try Daphne Koller's Open Class slides

Not-so-quick Reading:

a. Daphne Koller and Nir Friedman, Probabilistic Graphical Models: everything you wanted and everything you didn't

b. Kevin P. Murphy, Machine Learning: A Probabilistic Perspective: a probabilistic view of the world.

c. David Barber, Bayesian Reasoning and Machine Learning: PDF available from the author.

Propositional Inference

The following slides discuss the connection between SAT solvers, BDDs, and trees:

  1. Binary Decision Diagrams (BDDs) are among the most widely used tools in CS. Their application to BNs was proposed by Minato et al., but they are not a very popular approach to compiling BNs. Model counting is also not widely used; a brute-force weighted-model-counting illustration follows this list.

  2. The ProbLog language was initially implemented on BDDs. The ProbLog2 system can use BDDs, model counting, or trees.

  3. Arithmetic Circuits are a very efficient approach to Bayesian inference; the Darwiche group recently proposed an extension, SDDs.

  4. The connection to constraint solving is discussed by Dechter.
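
For intuition on what these compilers buy you, here is a minimal sketch of inference by weighted model counting; it brute-forces the enumeration that BDDs, SDDs, and arithmetic circuits avoid, and the weights and clauses below are purely illustrative.

```python
# Weighted model counting by brute-force enumeration over a small CNF formula.
from itertools import product

variables = ["a", "b", "c"]
weight = {"a": (0.3, 0.7), "b": (0.6, 0.4), "c": (0.5, 0.5)}   # (weight if True, weight if False)
# CNF: (a or b) and (not a or c); each clause is a list of (variable, polarity) literals.
clauses = [[("a", True), ("b", True)], [("a", False), ("c", True)]]

def satisfies(assignment, clauses):
    """True if every clause has at least one satisfied literal."""
    return all(any(assignment[v] == pol for v, pol in clause) for clause in clauses)

def weighted_model_count(clauses, evidence=None):
    """Sum of the weights of all assignments consistent with the CNF and the evidence."""
    evidence = evidence or {}
    total = 0.0
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if any(assignment[v] != val for v, val in evidence.items()):
            continue
        if satisfies(assignment, clauses):
            w = 1.0
            for v, val in assignment.items():
                w *= weight[v][0] if val else weight[v][1]
            total += w
    return total

# A conditional query is a ratio of two weighted counts, e.g. P(a=True | formula):
print(weighted_model_count(clauses, {"a": True}) / weighted_model_count(clauses))   # about 0.26
```

BDD, SDD, and arithmetic-circuit compilation make the same computation feasible for large networks by sharing sub-counts instead of enumerating every assignment.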

Optimisation

Optimisation is a fundamental tool for modern machine learning and other areas of AI. Techniques are often based on work from the Operations Research and Constraint communities.