hbayes 0.3

There is a lot of news in this version: big networks, MPE, soft evidence, logical tables, and more.

The documentation is available in the package.

1. Big graphs

First, I changed the cost function used for the graph triangulation and removed some space leaks. As a consequence, hbayes can now process a graph as big as the Diabetes one. You can look at the attached PDF! It is a HUGE graph, and most of the nodes have 10 or 20 levels!

Of course, hbayes is not as fast as professional C libraries. It takes a bit more than two minutes on my MacBook Air to compute the junction tree and a first posterior. But it looks like it is working, and the memory consumption is reasonable.
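To give a feel for what a triangulation cost function does, here is a small sketch of the common "min-weight" elimination heuristic: pick the node whose elimination creates the cheapest clique, measured by the product of the domain sizes involved. This matters a lot on a graph like Diabetes where nodes have 10 or 20 levels. The names below are illustrative only; this is not the hbayes code, and the cost function hbayes actually uses may differ.

```haskell
import Data.List (minimumBy)
import Data.Ord (comparing)

type Node = Int

-- Weight of eliminating a node: the product of the domain sizes of
-- the clique it would create (the node plus its neighbours).
eliminationWeight :: [(Node, Int)] -> [(Node, Node)] -> Node -> Int
eliminationWeight sizes edges n = product [sz m | m <- n : neighbours]
  where
    neighbours = [b | (x, b) <- edges, x == n] ++ [x | (x, b) <- edges, b == n]
    sz m = maybe 1 id (lookup m sizes)

-- Pick the cheapest node to eliminate next.
nextToEliminate :: [(Node, Int)] -> [(Node, Node)] -> [Node] -> Node
nextToEliminate sizes edges =
    minimumBy (comparing (eliminationWeight sizes edges))
```

A node with 10 levels sitting between two binary nodes is eliminated last here, because merging its neighbourhood is expensive.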

2. Typed discrete variables

I have introduced typed discrete variables (TDV). They may be less convenient in some cases, but they generally make the code more readable.
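The idea can be sketched with a phantom type parameter: the type records which values a variable can take, so the compiler rejects instantiations of the wrong type. The names below are hypothetical illustrations, not the hbayes internals.

```haskell
-- A variable tagged with the (phantom) type of its values.
newtype TypedVar a = TypedVar String deriving (Show, Eq)

-- Pair a variable with a value of its own type. Trying to pair a
-- `TypedVar Bool` with an Int is a compile-time error.
(=::) :: Show a => TypedVar a -> a -> (String, String)
TypedVar name =:: value = (name, show value)
```

With untyped variables, such a mismatch would only surface at run time, if at all.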

3. Logical tables

Making complex logical queries with a junction tree is not easy, so it is generally better to introduce auxiliary nodes into the graph. For this, hbayes introduces a collection of operators to create logical tables.

The philosophy of Bayesian networks is to factor big probability tables, because they are not manageable. So even though it is possible to create big logical tables, it is not advised: it is better to combine small logical tables into a subnetwork.

Here is an example:

exampleLogical :: ([TDV Bool], SBN CPT)
exampleLogical = runBN $ do
    -- two Boolean inputs and three derived auxiliary nodes
    a <- variable "a" (t :: Bool)
    b <- variable "b" (t :: Bool)
    notV <- variable "notV" (t :: Bool)
    andV <- variable "andV" (t :: Bool)
    orV <- variable "orV" (t :: Bool)
    let ta = a .==. True
        tb = b .==. True
    -- each auxiliary node gets a deterministic (logical) table
    logical notV ((.!.) ta)
    logical andV (ta .&. tb)
    logical orV (ta .|. tb)
    return [a, b, notV, andV, orV]

4. Noisy OR

Noisy OR is often used in network modeling, so I have added an operator for it.
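As a reminder of what the distribution computes, here is a minimal sketch of plain noisy OR (this is not the hbayes operator itself): each true parent i independently fails to trigger the child with probability (1 - p_i), so P(child = True | parents) = 1 - ∏ (1 - p_i) over the true parents.

```haskell
-- Each parent is (state, activation probability p_i).
-- Parents that are False contribute nothing.
noisyOr :: [(Bool, Double)] -> Double
noisyOr parents = 1 - product [1 - p | (True, p) <- parents]
```

With two true parents at 0.9 and 0.8, the child is true with probability 1 - 0.1 * 0.2 = 0.98; with no true parent, the probability is 0 (a leak term would raise that floor).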

5. Soft evidence

A few new functions have been introduced to set soft evidence. It is not as easy as hard evidence, since you need to change the values in the probability table of an auxiliary node.


inferencesWithSoftEvidence = do
    let ((a,seNode),exampleG) = exampleSoftEvidence
        jt = createJunctionTree nodeComparisonForTriangulation exampleG
        -- build a new factor for the auxiliary node: the sensor
        -- reports the right value with probability x
        theNewFactor x = fromJust $ se seNode a x
        jt' = changeEvidence [seNode =: True] jt
    print "Sensor 90%"
    print $ posterior (changeFactor (theNewFactor 0.9) jt') a

    print "Sensor 50%"
    print $ posterior (changeFactor (theNewFactor 0.5) jt') a

    print "Sensor 10%"
    print $ posterior (changeFactor (theNewFactor 0.1) jt') a
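The arithmetic behind the auxiliary node can be shown on a toy Boolean case (the names below are hypothetical, not the hbayes API): observing a sensor that is right with probability q rescales the prior by the sensor's likelihoods and renormalises.

```haskell
-- Posterior P(a = True | sensor said True), for a Boolean variable
-- with the given prior and a sensor of reliability q.
softPosterior :: Double  -- prior P(a = True)
              -> Double  -- sensor reliability q
              -> Double
softPosterior prior q = num / (num + den)
  where
    num = prior * q              -- a is True,  sensor is right
    den = (1 - prior) * (1 - q)  -- a is False, sensor is wrong
```

With a uniform prior, a 50% sensor leaves the posterior at 0.5 (the evidence is uninformative), while a 90% sensor pushes it to 0.9, which is the intuition behind the three queries above.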

6. MPE

It is now possible, using variable elimination, to compute the most probable explanation (MPE), i.e. the most likely instantiation of the unobserved variables.

The algorithm is simple, so you probably won't be able to use it on big networks.

mpeStandardNetwork = do
    let ([winter,sprinkler,rain,wet,road,roadandrain],exampleG) = example

    print exampleG
    print "Most likely explanation if grass wet and road slippery"
    let m = mpe exampleG [wet,road] [winter,sprinkler,rain,roadandrain] [wet =: True, road =: True]
        typedResult = map (map tdvi) m :: [[(TDV Bool,Bool)]]
    print typedResult
    putStrLn ""
    -- the same query with a different split of the variables
    let m' = mpe exampleG [wet,road,roadandrain,winter] [sprinkler,rain] [wet =: True, road =: True]
        typedResult' = map (map tdvi) m' :: [[(TDV Bool,Bool)]]
    print typedResult'
    putStrLn ""
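What MPE computes can be shown by brute force on a toy two-variable rain/wet network (hypothetical names and numbers, not the hbayes implementation): enumerate every joint instantiation consistent with the evidence and keep the one with the highest joint probability. Variable elimination performs the same maximisation without the full enumeration.

```haskell
import Data.List (maximumBy)
import Data.Ord (comparing)

type Assignment = [(String, Bool)]

-- Joint probability P(rain) * P(wet | rain) for a toy network.
jointProb :: Assignment -> Double
jointProb a = pRain * pWet
  where
    rain = maybe False id (lookup "rain" a)
    wet  = maybe False id (lookup "wet" a)
    pRain = if rain then 0.2 else 0.8
    pWet = case (rain, wet) of        -- P(wet | rain)
             (True,  True)  -> 0.9
             (True,  False) -> 0.1
             (False, True)  -> 0.3
             (False, False) -> 0.7

-- Enumerate all completions of the evidence and keep the best one.
bruteForceMPE :: [(String, Bool)] -> Assignment
bruteForceMPE evidence = maximumBy (comparing jointProb) candidates
  where
    free = [v | v <- ["rain", "wet"], lookup v evidence == Nothing]
    candidates = [ evidence ++ zip free vals
                 | vals <- mapM (const [False, True]) free ]
```

Given the evidence that the grass is wet, the explanation "no rain" wins here (0.8 * 0.3 = 0.24 beats 0.2 * 0.9 = 0.18), which illustrates why MPE can disagree with the individually most likely marginals.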

You can download hbayes from Hackage.
