We can plug any values we like into this formula to infer the probability of the conclusion given the evidence.

We can use Bayes' rule to calculate the probability of passing the exam:

The probability of passing the exam given you attend lectures is 0.96.
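As a sketch, Bayes' rule P(H|E) = P(E|H)·P(H)/P(E) can be computed directly. The lecture-attendance numbers below are illustrative assumptions, not the figures behind the 0.96 result above.

```python
# A minimal sketch of Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# All numeric values below are assumed for illustration.

def bayes(p_e_given_h, p_h, p_e):
    """Return P(H|E) via Bayes' rule."""
    return p_e_given_h * p_h / p_e

# Hypothetical values: H = "pass the exam", E = "attend lectures".
p_pass = 0.7                 # assumed prior P(pass)
p_attend_given_pass = 0.9    # assumed likelihood P(attend|pass)
p_attend = 0.65              # assumed evidence P(attend)

print(bayes(p_attend_given_pass, p_pass, p_attend))
```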

This works for simple scenarios, e.g., where we have a lot of probabilities relating diseases to symptoms, and want a rule that produces a diagnosis from the symptoms shown.

But in more complex cases, we may have a network or chain of probabilistic relationships to deal with.

For example,

cheapMoney => consumerBorrowing => highDemand => inflation

How do we represent and perform inference with complex chains of this sort?

Answer: **Bayesian networks**

We have a tangle of relationships to take into account. Sunshine increases the humidity but so does rain and temperature, both of which are themselves affected by sunshine.

- Represent each element of the domain as a *variable* which takes certain values (e.g., sun=yes, sun=no).
- Represent the relationships between variables in terms of conditional probabilities, e.g., probabilities like P(temp=high|sun=yes) = 0.8, P(temp=high|sun=no) = 0.2, P(rain=yes|temp=low) = 0.6, etc.

This is also known as forwards propagation.
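One minimal way to represent such variables and conditional probabilities in code is as nested dictionaries (crude conditional probability tables). The entries below not given in the text, including the prior on sun, are assumed for illustration.

```python
# Prior for the root variable (assumed value).
p_sun = {"yes": 0.4, "no": 0.6}

# P(temp | sun), using the probabilities from the text.
p_temp_given_sun = {
    "yes": {"high": 0.8, "low": 0.2},
    "no":  {"high": 0.2, "low": 0.8},
}

# P(rain | temp); P(rain=yes|temp=low) = 0.6 is from the text,
# the temp=high row is assumed.
p_rain_given_temp = {
    "low":  {"yes": 0.6, "no": 0.4},
    "high": {"yes": 0.3, "no": 0.7},
}

# Each conditional distribution should sum to 1.
for dist in list(p_temp_given_sun.values()) + list(p_rain_given_temp.values()):
    assert abs(sum(dist.values()) - 1.0) < 1e-9
```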

An arrow points from parent to child. If an arrow points from variable X to variable Y, then Y is said to be X's *child*, while Y and all of Y's children are X's *descendants*.

When reasoning is done using probability propagation, the assumption is made that each variable is conditionally independent of all its *non-descendants* given its parents.

This is another way of saying that variables are only directly influenced by their parents.

This defines the distribution over each variable recursively. The probability of each value is obtained by iterating over the combinations of parental values, taking the product of each combination's probability and the probability of the value which is *conditional* on that combination, and summing the results:

P(X=x) = Σᵤ P(u) · P(X=x | u)

where u ranges over the combinations of values of X's parents.

Note that distributions must sum to 1 (so normalization may be required).
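The recursive computation just described can be sketched as top-down propagation through a chain sun → temp → rain. The prior P(sun) and the temp=high row of P(rain|temp) are assumed values; the other probabilities are from the text.

```python
# Assumed prior and CPTs; P(temp=high|sun=yes)=0.8, P(temp=high|sun=no)=0.2
# and P(rain=yes|temp=low)=0.6 are from the text, the rest are assumptions.
p_sun = {"yes": 0.4, "no": 0.6}
p_temp_given_sun = {"yes": {"high": 0.8, "low": 0.2},
                    "no":  {"high": 0.2, "low": 0.8}}
p_rain_given_temp = {"high": {"yes": 0.3, "no": 0.7},
                     "low":  {"yes": 0.6, "no": 0.4}}

def propagate(parent_dist, cpt):
    """P(child=c) = sum over parent values v of P(v) * P(c|v)."""
    child = {}
    for v, pv in parent_dist.items():
        for c, pc in cpt[v].items():
            child[c] = child.get(c, 0.0) + pv * pc
    return child

p_temp = propagate(p_sun, p_temp_given_sun)
p_rain = propagate(p_temp, p_rain_given_temp)

print(p_temp)  # {'high': 0.44, 'low': 0.56}
print(p_rain)  # sums to 1, as every propagated distribution must
```

Because each variable in the chain depends only on its single parent, propagating the marginal of the parent is all that is needed at each step.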

But depending on how variables are related, we can easily end up with very uncertain conclusions.

The key factor which affects performance is the level of uncertainty we have about conditioned variables.

This is termed **equivocation**. It can be measured as the conditional entropy

equivocation = − Σⱼ qⱼ Σᵢ pᵢⱼ log₂ pᵢⱼ

where pᵢⱼ is the conditional probability of the *i*th value of the conditioned variable given the *j*th conditioning value, and qⱼ is the probability of the *j*th conditioning value.

Other things being equal, higher equivocation will mean less successful Bayesian reasoning, i.e., less certain conclusions.
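A minimal sketch of computing equivocation as the conditional entropy of a conditioned variable given a conditioning variable. The sun/temp conditional probabilities are from the text; the prior P(sun=yes)=0.4 is an assumed value.

```python
import math

def equivocation(parent_dist, cpt):
    """Conditional entropy: -sum_j q_j * sum_i p_ij * log2(p_ij)."""
    h = 0.0
    for v, q in parent_dist.items():      # q_j: probability of conditioning value
        for p in cpt[v].values():         # p_ij: conditional probability
            if p > 0:
                h -= q * p * math.log2(p)
    return h

p_sun = {"yes": 0.4, "no": 0.6}           # assumed prior
p_temp_given_sun = {"yes": {"high": 0.8, "low": 0.2},
                    "no":  {"high": 0.2, "low": 0.8}}

print(equivocation(p_sun, p_temp_given_sun))  # ≈ 0.722 bits
```

A value of 0 would mean temp is fully determined by sun; the maximum (1 bit, for a two-valued variable) would mean sun tells us nothing about temp.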


- Bayes rule again
- Probabilistic representation
- Use of Bayesian networks
- Reasoning as propagation
- Top-down propagation
- Equivocation

- Let's say the university communicates your degree
result to you using either a tick or a cross. What is
the level of equivocation?

- Use Bayes' rule to work out P(east|sun) given that
P(sun)= 0.3, P(east)=0.4 and P(sun|east)=0.6.
- Use the frequency interpretation of probability to explain why Bayes' rule works.