Bayes Network Tool To Save Christmas in Springfield
For this mini-project, you will be using the Bayes net tool available at:
http://www.aispace.org/bayes/index.shtml
The tool is reasonably intuitive to use. Nevertheless, a very nice
interactive tutorial on using the tool is available on the same page.
Problem:
We have decided to use Bayes network technology to model the
Springfield nuclear power plant (where Homer Simpson is gainfully
employed). Here is our understanding of the domain (this is all
official, based on my conversations with Bart). The presence of
inferior plutonium or low-quality heavy water (D2O) in the plant
reactor leads to a core meltdown in the reactor. When a core meltdown
occurs, it tends to irradiate the employees (such as Homer), causing
them to glow in the dark. A core meltdown also tends to shut off the
power grid, which in turn causes the slurpees in Apu's convenience
store (called Squishees at Kwikee Mart) to melt and get watery.
Here are a bunch of probabilities I got from the Springfield
Statistics Bureau: the dependency between Core Meltdown and its causes,
Inferior Plutonium and Low-Quality Heavy Water, can be modeled as a
"Noisy-OR" distribution. Inferior plutonium fails to cause a core
meltdown with probability 0.7, and low-quality heavy water fails to
cause a core meltdown with probability 0.8.
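For reference, the standard leak-free Noisy-OR reading of the sentence above is: each cause can be inhibited independently, and a meltdown never happens when neither cause is present. A minimal sketch of the resulting CM table (variable and function names are my own; these are the values you would enter in the tool):

    # Minimal sketch, assuming a leak-free Noisy-OR model for Core Meltdown (CM).
    # q_ip and q_lhw are the probabilities that each cause, acting alone,
    # FAILS to produce a meltdown.
    q_ip, q_lhw = 0.7, 0.8

    def p_cm(ip, lhw):
        """P(CM = true | IP = ip, LHW = lhw) under the Noisy-OR assumption."""
        failure = (q_ip if ip else 1.0) * (q_lhw if lhw else 1.0)
        return 1.0 - failure

    for ip in (False, True):
        for lhw in (False, True):
            print(f"P(CM=true | IP={ip}, LHW={lhw}) = {p_cm(ip, lhw):.2f}")
    # prints 0.00, 0.20, 0.30, 0.44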
Core meltdown leads to glowing-in-the-dark employees with probability
0.5. What with the lax quality control over at the Springfield plant,
even under normal circumstances Homer and his buddies tend to glow in
the dark with probability 0.05. Core meltdown causes slurpee
liquefaction with probability 0.9, and over at Apu's, slurpees tend to
get watery even without a core meltdown, with probability 0.1.
Finally, the probability that the Springfield plant gets inferior-quality
plutonium is 0.3, and the probability that it gets low-quality heavy
water is 0.4 (you know that wacky Burns--he is always trying to buy
cheap stuff and make more bucks).
Tasks:
Do all these tasks. Write down your observations for each
task. Include snapshots of the Bayes net tool as appropriate.
Part I. [Bayesnet]
1. Create the Bayes network described above in the Bayes net
tool. Enter the conditional probability tables as appropriate. To
show that you have done this task, you need to (1) include a bitmap
of the network (use Alt-PrintScreen in Windows) and (2) include the
.xml format representation of the network (you can output this
by going to the Edit menu and selecting the first command).
2. Now go to the solve pane and evaluate the following
queries--in that order. *Comment* on whether the relative values
are in accordance with our intuitions.
P(IP)
P(IP|ASL)
P(IP|CM)
P(IP|CM,ASL)
P(IP|CM,LHW)
If, in the above, any of the probabilities do not change when extra
evidence is added, use the D-Separation criterion you learned in
class to verify that this is as expected (for example, if P(IP|CM) is
the same as P(IP|CM,ASL), then it must be the case that IP is
conditionally independent of ASL given CM). A brute-force enumeration
sketch for cross-checking these numbers appears after the glossary below.
(You can accomplish these easily by "monitoring" the IP node, and
observing/de-observing the appropriate variables).
(Glossary: IP -> Inferior Plutonium, ASL -> Apu's Slurpees Liquefy,
CM -> Core Meltdown, GID -> Glow In the Dark, LHW -> Low-Quality
Heavy Water.)
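Here is a rough, self-contained sketch (not required by the assignment; all names are my own) of cross-checking the Part I.2 queries by brute-force enumeration of the joint distribution, assuming the CPTs were entered as stated above:

    # Brute-force enumeration over all 2^5 worlds of (IP, LHW, CM, GID, ASL),
    # assuming the probabilities stated in the problem description.
    from itertools import product

    P_IP, P_LHW = 0.3, 0.4            # priors on the root causes

    def p_cm(cm, ip, lhw):            # Noisy-OR CPT for Core Meltdown
        p_true = 1.0 - (0.7 if ip else 1.0) * (0.8 if lhw else 1.0)
        return p_true if cm else 1.0 - p_true

    def p_gid(gid, cm):               # Glow In the Dark
        p_true = 0.5 if cm else 0.05
        return p_true if gid else 1.0 - p_true

    def p_asl(asl, cm):               # Apu's Slurpees Liquefy
        p_true = 0.9 if cm else 0.1
        return p_true if asl else 1.0 - p_true

    def joint(ip, lhw, cm, gid, asl):
        return ((P_IP if ip else 1 - P_IP) * (P_LHW if lhw else 1 - P_LHW)
                * p_cm(cm, ip, lhw) * p_gid(gid, cm) * p_asl(asl, cm))

    def query_ip(**evidence):
        """P(IP = true | evidence), e.g. query_ip(cm=True, asl=True)."""
        num = den = 0.0
        for ip, lhw, cm, gid, asl in product((False, True), repeat=5):
            world = dict(ip=ip, lhw=lhw, cm=cm, gid=gid, asl=asl)
            if any(world[k] != v for k, v in evidence.items()):
                continue
            p = joint(ip, lhw, cm, gid, asl)
            den += p
            if ip:
                num += p
        return num / den

    print(query_ip())                       # P(IP)
    print(query_ip(asl=True))               # P(IP | ASL)
    print(query_ip(cm=True))                # P(IP | CM)
    print(query_ip(cm=True, asl=True))      # P(IP | CM, ASL)
    print(query_ip(cm=True, lhw=True))      # P(IP | CM, LHW)

If two of these come out identical, that is the numerical counterpart of the D-Separation argument the task asks for.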
Part II [Relations with logic]
1. Modify the network so that the causations are "perfect" and
"exhaustive" (e.g., inferior plutonium _always_ causes core meltdown,
and core meltdown will not be true if none of its causing variables
is true). Confirm that you modified it by including a .xml format
representation of the new network.
[Note: You will only change the causations. You will keep the prior
probabilities of IP and LHW as before. One reading of "perfect and
exhaustive" is sketched right after this note.]
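Under this reading (my interpretation, with my own names), every causal CPT becomes a deterministic (0/1) table. For CM, for instance:

    # Sketch of a deterministic CPT for CM under "perfect and exhaustive"
    # causation: a meltdown occurs exactly when at least one cause is present.
    # (GID and ASL would get analogous 0/1 tables.)
    def p_cm_deterministic(ip, lhw):
        """P(CM = true | IP, LHW) when causation is perfect and exhaustive."""
        return 1.0 if (ip or lhw) else 0.0

    for ip in (False, True):
        for lhw in (False, True):
            print(f"P(CM=true | IP={ip}, LHW={lhw}) = {p_cm_deterministic(ip, lhw)}")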
2. Write down a set of propositional logic statements that capture the
knowledge encoded in the bayes network.
3. Evaluate the following probabilities in this Bayes network:
P(IP|ASL)
P(IP|ASL,~LHW)
P(IP|ASL,~GID)
Interpret the answers. Comment on whether these answers are in line
with what propositional logic would have us derive given the
formulation in II.2.
Part III [Reformulating Bayesnet]
Consider the following alternative way of specifying this Bayes
net. Here, we introduce the random variables into the network in the
following order:
1. Apu's slurpees are liquefied
2. Core meltdown occurred
3. Employees glow in the dark
4. Low-quality heavy water
5. Inferior-quality plutonium
1. Show the network that will result if we specify the numbers this
way (you need both the topology as a screen dump and the .xml format
representation). Note that you also need to put in the CPTs
(conditional probability tables) for each variable. To do this, you
will have to use the network as it existed at the end of Part I.1 as
the expert, and get the required CPTs by querying that network (a
sketch of what one such query amounts to is given below).
Comment on whether this network is better or worse in terms of the
number of probabilities that you needed to assess.
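To make "querying the expert" concrete, here is a small illustrative sketch (names are my own; in practice you simply monitor the relevant node in the tool) that recovers one entry of the reordered network, P(CM | ASL), from the Part I.1 numbers by enumeration:

    # One CPT entry of the reordered network, obtained as a posterior in the
    # original network.  GID is omitted because it sums out of the joint.
    from itertools import product

    P_IP, P_LHW = 0.3, 0.4

    def joint(ip, lhw, cm, asl):
        p_cm = 1.0 - (0.7 if ip else 1.0) * (0.8 if lhw else 1.0)
        p_asl = 0.9 if cm else 0.1
        return ((P_IP if ip else 1 - P_IP) * (P_LHW if lhw else 1 - P_LHW)
                * (p_cm if cm else 1 - p_cm) * (p_asl if asl else 1 - p_asl))

    num = sum(joint(ip, lhw, True, True)
              for ip, lhw in product((False, True), repeat=2))
    den = sum(joint(ip, lhw, cm, True)
              for ip, lhw, cm in product((False, True), repeat=3))
    print("P(CM=true | ASL=true) =", num / den)

The same pattern (observe the variables introduced earlier, monitor the one being added) yields every entry the new ordering requires.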
Once you specify the entire new network, to check that this network
and the earlier one are equivalent, compute the following
probabilities for the new network and compare your values with the
answers for Part I.2.
P(IP)
P(IP|ASL)
P(IP|CM)
P(IP|CM,ASL)
P(IP|CM,LHW)
Part IV:
We found out more information about the causes behind Inferior
Plutonium (IP) and Low-Quality Heavy Water (LHW). It turns out that
Mr. Burns' stinginess is partly to blame for these. We know that
Mr. Burns _is_ stingy with probability 0.99. We also found that when
he is stingy, he buys inferior plutonium with probability 0.3 and
low-quality heavy water with probability 0.4. When he is not stingy,
he buys inferior plutonium with probability 0.0001 and low-quality
heavy water with probability 0.0002.
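Read literally, the paragraph above makes Stinginess a parent of both IP and LHW, with a prior of 0.99 on Stingy; a quick transcription of the two new CPTs (names are mine):

    # New CPTs for IP and LHW once Mr. Burns' stinginess (Stingy) is a parent.
    P_STINGY = 0.99

    def p_ip(stingy):
        """P(IP = true | Stingy)."""
        return 0.3 if stingy else 0.0001

    def p_lhw(stingy):
        """P(LHW = true | Stingy)."""
        return 0.4 if stingy else 0.0002

    for s in (True, False):
        print(f"Stingy={s}: P(IP)={p_ip(s)}, P(LHW)={p_lhw(s)}")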
1. Modify the Bayes network to show this improved understanding of the
domain. Show the topology as well as the .xml representation.
2. Is the new network singly connected or multiply connected? If it is
multiply connected, please provide an equivalent singly connected
network (once again, you will need to show the topology and the .xml
representation). A small utility for checking single-connectedness is
sketched below.
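If it helps, here is a rough, generic utility (not part of the assignment; the placeholder edge list is mine) for checking single-connectedness: a network is multiply connected exactly when some pair of nodes is joined by more than one undirected path.

    # Counts simple undirected paths between every pair of nodes in a DAG.
    from itertools import combinations

    edges = [("A", "B"), ("B", "C")]      # placeholder arcs; replace with your Part IV arcs

    nodes = sorted({n for e in edges for n in e})
    adj = {n: set() for n in nodes}
    for a, b in edges:                    # treat arcs as undirected links
        adj[a].add(b)
        adj[b].add(a)

    def count_paths(src, dst, visited=None):
        """Number of simple undirected paths from src to dst."""
        visited = (visited or set()) | {src}
        if src == dst:
            return 1
        return sum(count_paths(n, dst, visited)
                   for n in adj[src] if n not in visited)

    multiply = any(count_paths(a, b) > 1 for a, b in combinations(nodes, 2))
    print("multiply connected" if multiply else "singly connected")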
Subbarao Kambhampati
Last modified: [Oct 27, 2015]