Used for propVars; we only import 'union':

> import Prelude hiding (traverse)  -- 'traverse' is defined below for BFS
> import Data.List (union)

* Exercise session 6

This is my second-to-last exercise session. You can request topics for the next exercise session (questions, repetition, additional explanations, ...). Send me a mail:

Sheet 6 will be the last sheet graded by us. Sheet 7 and its solution will be released next Tuesday, and the FM part exercises will start the following week. We recommend that you solve that exercise sheet before looking at the solution.

* Previous sheet
  - Ex. 3
  - Ex. 4
* Fold and structural induction
* Lazy/eager evaluation
* Induction hypothesis strengthening

** Ex. 3

*** (a)

Formulas in propositional logic are built from variables of type 'a', negations, conjunctions, and disjunctions. Specify a Haskell data type 'Prop a' to represent formulas in propositional logic.

Your first task was translating this from text to Haskell (an *important* skill, also for the exam!):

> data Prop a = PVar a | Not (Prop a)
>             | And (Prop a) (Prop a) | Or (Prop a) (Prop a)
>   deriving Show

*** (b)

> foldProp :: (a -> b)      -- PVar
>          -> (b -> b)      -- Not
>          -> (b -> b -> b) -- And
>          -> (b -> b -> b) -- Or
>          -> Prop a -> b
> foldProp fVar fNot fAnd fOr = go
>   where go (PVar name) = fVar name
>         go (Not x)     = fNot (go x)
>         go (And x y)   = fAnd (go x) (go y)
>         go (Or x y)    = fOr (go x) (go y)

Folding allows for short function definitions.

*** (c)

> evalProp :: (a -> Bool) -> (Prop a -> Bool)
> evalProp v = foldProp v not (&&) (||)

*** (d)

> propVars :: Eq a => Prop a -> [a]
> propVars = foldProp (\x -> [x]) id union union

The function 'union' from Data.List returns the list union of two lists. If both input lists are duplicate-free, then the resulting list is duplicate-free as well.

*** (e)

The basic idea:
- Collect all variables that occur in the formula.
- Build all assignments for these variables.
- Evaluate the proposition under each of these assignments.
- The formula is satisfiable if and only if it evaluates to True under one of these assignments.
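Before these steps are assembled into satProp, it is worth sanity-checking evalProp and propVars from (c) and (d). Below is a self-contained sketch that repeats the definitions from (a)-(d); the example formula 'ex' is our own and not part of the exercise sheet.

```haskell
import Data.List (union)

-- Definitions repeated from (a)-(d) so this sketch runs on its own.
data Prop a = PVar a | Not (Prop a)
            | And (Prop a) (Prop a) | Or (Prop a) (Prop a)
  deriving Show

foldProp :: (a -> b) -> (b -> b) -> (b -> b -> b) -> (b -> b -> b)
         -> Prop a -> b
foldProp fVar fNot fAnd fOr = go
  where go (PVar name) = fVar name
        go (Not x)     = fNot (go x)
        go (And x y)   = fAnd (go x) (go y)
        go (Or x y)    = fOr (go x) (go y)

evalProp :: (a -> Bool) -> Prop a -> Bool
evalProp v = foldProp v not (&&) (||)

propVars :: Eq a => Prop a -> [a]
propVars = foldProp (\x -> [x]) id union union

-- Our own example formula: (p ∨ q) ∧ ¬p.
ex :: Prop String
ex = And (Or (PVar "p") (PVar "q")) (Not (PVar "p"))

main :: IO ()
main = do
  print (propVars ex)                 -- ["p","q"]
  -- An assignment is a function a -> Bool; here p = False, q = True:
  print (evalProp (`elem` ["q"]) ex)  -- True
  -- Here p = True, q = False:
  print (evalProp (`elem` ["p"]) ex)  -- False
```

Representing an assignment as "membership in a list of true variables" is exactly the trick that satProp below uses via 'flip elem'.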
The function propVars returns the list of variables that occur in a given formula.

> satProp :: Eq a => Prop a -> Bool
> satProp p = any (\f -> evalProp f p) assg
>   where assg = map (flip elem) (pow (propVars p))

> pow :: [a] -> [[a]]
> pow []     = [[]]
> pow (x:xs) = let a = pow xs in map (x:) a ++ a

Alternative definition:

> satProp' :: Eq a => Prop a -> Bool
> satProp' p = aux (pow (propVars p))
>   where aux []     = evalProp (\a -> False) p
>         aux (t:ts) = evalProp (\a -> elem a t) p || aux ts

** Ex. 4 a)

> data Tree t = Leaf | Node t (Tree t) (Tree t)

The task was to implement a breadth-first search of the tree. This is not trivial, since the algebraic data type definition makes DFS more natural.

Using this tree as an example:

        1
       / \
      /   \
     2     3
    / \   / \
   4   5 6   7

We can proceed in a "level-by-level" fashion; we carry around a list of the nodes that are on the same level.

> bfs :: Tree a -> [a]
> bfs x = traverse [x]

> traverse :: [Tree a] -> [a]
> traverse [] = []
> traverse ts = rootlabels ++ traverse (concatMap children ts)
>   where rootlabels = [x | Node x _ _ <- ts]
>         children Leaf         = []
>         children (Node _ l r) = [l, r]

(Note: since GHC 7.10 the Prelude exports its own 'traverse', so this definition needs 'import Prelude hiding (traverse)' or a different name.)

*** Exercise:

        1
       / \
      /   \
     2     3
    / \   / \
   4   5 6   7

Another idea for a solution is to use a helper function which also performs a BFS, but has type Tree a -> [[a]], where the "levels" of the tree are kept separate in the result; i.e. for the tree above, the result would be [[1], [2, 3], [4, 5, 6, 7]].

Exercise:
- Write the bfsHelper function described above.
- Construct a Tree Int value corresponding to the tree depicted above and test your function.
- Re-implement the above function using foldTree.

** Ex. 4 b)

You were asked to use *structural induction* again. There were again problems with defining the exact predicate over whose parameter we induct. Guidelines:
- Normally you will need to define the predicate with just one parameter, e.g.
  P(x).
- You normally don't quantify over x itself *inside* the predicate; the quantifier is added in the conclusion of the induction rule, which normally has the form ∀x :: a. P(x).
- A litmus test to check whether your predicate makes sense, using trees as an example: P(Leaf) is obtained by substituting "Leaf" for x everywhere in P. This gives you the statement you need to prove in your base case.

Also, note that there is one canonical structural induction rule per data type. The fold function of an algebraic data type is very similar to its structural induction scheme. Let's use Tree as an example:

Fold for 'Tree a':
  INPUT:  for each constructor C, a function fC that combines the results of "folding" the immediate subtrees of 'C t1 .. tk' into a value of type b.
  OUTPUT: a function from 'Tree a' to 'b'.

Induction scheme for 'Tree a':
  INPUT:  for each constructor C, a proof that if P holds for all immediate subtrees of 'C t1 .. tk', then P also holds for 'C t1 .. tk'.
  OUTPUT: a proof that property P holds for all 't :: Tree a'.

foldTree :: b (~= () -> b)        Γ ⊢ P(Leaf)
         -> (a -> b -> b -> b)    Γ ⊢ ∀ v :: a, t,t' :: Tree a. P(t) ∧ P(t') ==> P(Node v t t')
                                  —————————————————————————————————————————————————————————————
         -> (Tree a -> b)         Γ ⊢ ∀ t :: Tree a. P(t)

*** Exercise:

Given the type of foldProp, come up with the structural induction scheme for Prop a.

foldProp :: (a -> b)
         -> (b -> b)
         -> (b -> b -> b)
         -> (b -> b -> b)
         -> (Prop a -> b)

**** Solution:

foldProp :: (a -> b)         Γ ⊢ ∀ v :: a. P(PVar v)
         -> (b -> b)         Γ ⊢ ∀ t :: Prop a. P(t) ==> P(Not t)
         -> (b -> b -> b)    Γ ⊢ ∀ t,t' :: Prop a. P(t) ∧ P(t') ==> P(And t t')
         -> (b -> b -> b)    Γ ⊢ ∀ t,t' :: Prop a. P(t) ∧ P(t') ==> P(Or t t')
                             ——————————————————————————————————————————————————
         -> (Prop a -> b)    Γ ⊢ ∀ t :: Prop a. P(t)

** Let typing

  Γ ⊢ t1 :: σ    Γ, x:σ ⊢ t2 :: τ
  ———————————————————————————————  (x not in Γ)
  Γ ⊢ let x = t1 in t2 :: τ

*** Another approach: Translate the let into a known language construct.
  (let x = t1 in t2) ~= (λx. t2) t1

  Γ ⊢ t1 :: σ    Γ, x:σ ⊢ t2 :: τ
  ———————————————————————————————  (x not in Γ)
  Γ ⊢ (λx. t2) t1 :: τ
  ———————————————————————————————
  Γ ⊢ let x = t1 in t2 :: τ

* Sheet 6

** Lazy & Eager Evaluation

Lazy evaluation strategy for 't1 t2':
  o Evaluate t1.
  o t2 is substituted without being evaluated.
  o No evaluation under an abstraction.

Eager evaluation strategy for 't1 t2':
  o Evaluate t1.
  o t2 is evaluated prior to substitution.
  o Evaluation is carried out under an abstraction.

        | Reduction under abstraction | Reduction of arguments
  ———————————————————————————————————————————————————————————
  eager |             YES             |          YES
  lazy  |             NO              |          NO

What we call lazy is also called call-by-name reduction; the corresponding normal form is called weak head normal form. What we call eager is also called call-by-value reduction; the corresponding normal form is called normal form.

*** Exercise 1:

Evaluate the following expression both lazily and eagerly:

  (%x. x) (%x. (%y. y) x)

**** Eager

  (%x. x) (%x. (%y. y) x)
= (%x. x) (%x. x)
= (%x. x)

**** Lazy

  (%x. x) (%x. (%y. y) x)
= (%x. (%y. y) x)

*** Exercise 2:

The following example shows how to avoid capturing free variables:

  (%x. x (%y. x y)) (%x. y x)

**** Eager

  (%x. x (%y. x y)) (%x. y x)
= (%x. y x) (%y'. (%x. y x) y')
= (%x. y x) (%y'. y y')
= y (%y'. y y')

**** Lazy

  (%x. x (%y. x y)) (%x. y x)
= (%x. y x) (%y'. (%x. y x) y')
= y (%y'. (%x. y x) y')

** Strengthening the induction hypothesis

Exercise 2 will ask you for yet another induction proof. However, this time the task will not be so easy. Being precise about your induction predicate will be crucial.

Hint: review the slides for w2-lists, slides 29-32.
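Since the hint is brief, here is a classic illustration of why strengthening is sometimes necessary. This is our own example, not necessarily the one on the slides; the name 'itrev' and the predicates are our assumptions.

```haskell
-- itrev reverses a list using an accumulator.
itrev :: [a] -> [a] -> [a]
itrev []     acc = acc
itrev (x:xs) acc = itrev xs (x:acc)

-- Goal: itrev xs [] == reverse xs.
-- Inducting on xs with the predicate
--   P(xs) = (itrev xs [] == reverse xs)
-- gets stuck: in the step case the accumulator is (x:acc), not [],
-- so the induction hypothesis does not apply.
-- Strengthened predicate:
--   P(xs) = ∀acc. itrev xs acc == reverse xs ++ acc.
-- This goes through by structural induction on xs, and the original
-- goal is the instance acc = [].

main :: IO ()
main = do
  print (itrev [1, 2, 3 :: Int] [] == reverse [1, 2, 3])
  -- The strengthened statement also holds for a non-empty accumulator:
  print (itrev [1, 2 :: Int] [9] == reverse [1, 2] ++ [9])
```

The pattern is typical: whenever a helper function carries an accumulating parameter, the induction predicate usually has to quantify over that parameter.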