Our implementation (`BinomialHeap.elm`) exports the same type signatures as the previous implementations of min-heaps.

```
type alias Rank = Int
type Tree a = Node Rank a (List (Tree a))
rank (Node r _ _) = r
root (Node _ x _) = x
```

*Binomial trees* of rank `r` are defined inductively as follows. `Node 0 n []` is a binomial tree of rank `0`. `Node r n ts` is a binomial tree of rank `r > 0` if `ts` is a list of `r` binomial trees with ranks `r-1` through `0`, respectively.

A binomial tree of rank `r` has *2 ^{r}* nodes. A binomial tree of rank `r + 1` is formed by linking together two trees of rank `r`, making one the leftmost child of the other.
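To make the *2 ^{r}* claim concrete, here is a small Python sketch (an illustration only, not the course's Elm code) that encodes a binomial tree as a `(rank, root, children)` tuple; `size` and `build` are helper names of my own.

```python
# Illustration only -- the course code is the Elm above.
# A binomial tree is encoded as a tuple (rank, root, children).

def link(t1, t2):
    """Link two trees of equal rank r into one of rank r+1,
    keeping the smaller root on top."""
    (r, x1, ts1) = t1
    (_, x2, ts2) = t2
    if x1 <= x2:
        return (r + 1, x1, [t2] + ts1)
    return (r + 1, x2, [t1] + ts2)

def size(t):
    """Total number of nodes in a binomial tree."""
    (_, _, children) = t
    return 1 + sum(size(c) for c in children)

def build(r):
    """Build a rank-r binomial tree (all values 0) by repeated linking."""
    if r == 0:
        return (0, 0, [])
    return link(build(r - 1), build(r - 1))
```

Since linking doubles the node count and a rank-`0` tree has one node, a rank-`r` tree has exactly *2 ^{r}* nodes.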

The `link` function below links two binomial trees, choosing to keep the smaller of the two elements at the root. Therefore, if `t1` and `t2` are both heap-ordered, then so is the result.

```
link : Tree comparable -> Tree comparable -> Tree comparable
link t1 t2 =
  let
    (Node r x1 ts1) = t1
    (Node _ x2 ts2) = t2
  in
    if x1 <= x2
      then Node (r+1) x1 (t2::ts1)
      else Node (r+1) x2 (t1::ts2)
```

```
type Heap a = Heap (List (Tree a))
```

A *binomial heap* is a list of heap-ordered binomial trees, kept in strictly-increasing order of rank. A binomial heap containing *n* elements is represented using at most *O(log n)* binomial trees, analogous to the binary representation of *n*.
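The analogy can be stated directly: the ranks of the trees in an *n*-element heap are exactly the positions of the 1 bits in *n*. The following hypothetical helper (`tree_ranks` is my own name, not part of `BinomialHeap.elm`) computes them.

```python
# Hypothetical helper illustrating the binary-representation analogy;
# not part of the course implementation.

def tree_ranks(n):
    """Ranks of the trees in an n-element binomial heap: exactly the
    positions of the 1 bits in n's binary representation."""
    return [i for i in range(n.bit_length()) if (n >> i) & 1]
```

For example, a 5-element heap (binary `101`) consists of one rank-`0` tree and one rank-`2` tree.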

```
empty : Heap comparable
empty = Heap []

isEmpty : Heap comparable -> Bool
isEmpty h = h == empty
```

The `findMin` function searches for the smallest root among all of the binomial trees, taking *O(log n)* time.

```
findMin : Heap comparable -> Maybe comparable
findMin (Heap ts) =
  case List.map root ts of
    [] -> Nothing
    n::ns -> Just (List.foldl min n ns)
```

See Homework 4 for a way to implement `findMin` so that it runs in *O(1)* time, as for other heap implementations.

Inserting into a binomial heap requires pairwise `link`ing of `Tree`s with equal rank. Think “sum” and “carry” bits, as in binary addition.

```
insert : comparable -> Heap comparable -> Heap comparable
insert x (Heap ts) = Heap (insertTree (Node 0 x []) ts)

insertTree : Tree comparable -> List (Tree comparable) -> List (Tree comparable)
insertTree t ts =
  case ts of
    [] -> [t]
    t1::ts1 ->
      if rank t == rank t1 then insertTree (link t t1) ts1
      else if rank t < rank t1 then t :: ts
      else Debug.crash "insertTree: impossible"
```
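The carry cascade can be traced with a small Python sketch (mirroring the Elm above, with trees as `(rank, root, children)` tuples; an illustration, not the course code). Inserting into a heap whose trees occupy ranks 0 and 1 triggers two links and leaves a single rank-2 tree, just as binary 1 + 11 = 100.

```python
# Illustrative Python mirror of insertTree; trees are (rank, root, children).

def link(t1, t2):
    (r, x1, ts1) = t1
    (_, x2, ts2) = t2
    if x1 <= x2:
        return (r + 1, x1, [t2] + ts1)
    return (r + 1, x2, [t1] + ts2)

def insert_tree(t, ts):
    if not ts:
        return [t]
    t1 = ts[0]
    if t[0] == t1[0]:
        # Equal ranks: link, then carry the result into the rest of the list.
        return insert_tree(link(t, t1), ts[1:])
    # Ranks are strictly increasing, so t's rank must be smaller here.
    return [t] + ts

# Ranks 0 and 1 are occupied, so inserting a rank-0 tree cascades two
# links and leaves one rank-2 tree: binary 1 + 11 = 100.
heap = [(0, 1, []), (1, 2, [(0, 3, [])])]
after = insert_tree((0, 5, []), heap)
```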

There are *O(m)* recursive calls to `insertTree` (where *m* is the length of `ts`), each of which performs *O(1)* work (to `link` two trees). Thus, `insert` runs in *O(m) = O(log n)* time.

```
merge : Heap comparable -> Heap comparable -> Heap comparable
merge (Heap ts1) (Heap ts2) = Heap (merge_ ts1 ts2)

merge_
  : List (Tree comparable) -> List (Tree comparable)
  -> List (Tree comparable)
merge_ ts1 ts2 =
  case (ts1, ts2) of
    ([], _) -> ts2
    (_, []) -> ts1
    (t1::ts1_rest, t2::ts2_rest) ->
      if rank t1 < rank t2 then t1 :: merge_ ts1_rest ts2
      else if rank t2 < rank t1 then t2 :: merge_ ts1 ts2_rest
      else insertTree (link t1 t2) (merge_ ts1_rest ts2_rest)
```

To analyze the running time of `merge_`, let *m* be the total number of trees in both `ts1` and `ts2`. The first two cases run in *O(1)* time. Each recursive call to `merge_` decreases *m* by one (in the third and fourth cases) or two (in the fifth case). The cons operations in the third and fourth cases require *O(1)* work. In the fifth case, `link` requires *O(1)* time, `insertTree` requires *O(m)* time, and the recursive call to `merge_` requires *T(m-2)* time. There are *O(m)* recursive calls, each of which requires at most *O(m)* time. The result is an *O(m ^{2})* running time, which is *O(log ^{2} n)*.

A more subtle analysis, however, can be used to argue that the implementation of `merge_` runs in *O(log n)* time. The argument requires a more careful accounting of how many times `link` is called (which is the crux of both `insertTree` and `merge_`), based on the analogy between merging lists and adding two numbers in binary representation.
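The analogy can be checked on small inputs. Below is a Python sketch of the two-pass `merge_` (again with tuple-encoded trees; `from_list` is a helper name of my own, not part of the Elm module): merging heaps of 5 and 3 elements leaves a single rank-3 tree, matching 5 + 3 = 8 = `1000` in binary.

```python
# Illustrative Python mirror of the two-pass merge_;
# trees are (rank, root, children) tuples.

def link(t1, t2):
    (r, x1, ts1) = t1
    (_, x2, ts2) = t2
    if x1 <= x2:
        return (r + 1, x1, [t2] + ts1)
    return (r + 1, x2, [t1] + ts2)

def insert_tree(t, ts):
    if not ts:
        return [t]
    if t[0] == ts[0][0]:
        return insert_tree(link(t, ts[0]), ts[1:])
    return [t] + ts

def merge(ts1, ts2):
    if not ts1:
        return ts2
    if not ts2:
        return ts1
    t1, t2 = ts1[0], ts2[0]
    if t1[0] < t2[0]:
        return [t1] + merge(ts1[1:], ts2)
    if t2[0] < t1[0]:
        return [t2] + merge(ts1, ts2[1:])
    # Equal ranks: link them and insert the "carry" into the merged rest.
    return insert_tree(link(t1, t2), merge(ts1[1:], ts2[1:]))

def from_list(xs):
    """Build a heap (list of trees) by repeated insertion."""
    ts = []
    for x in xs:
        ts = insert_tree((0, x, []), ts)
    return ts
```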

For our purposes, we will consider an alternative, one-pass definition (also drawn from this post) that is slightly easier to analyze. Notice the new `r == r1 == r2` case for `merge_wc`.

```
merge_one_pass
  : List (Tree comparable) -> List (Tree comparable)
  -> List (Tree comparable)
merge_one_pass ts1 ts2 =
  case (ts1, ts2) of
    ([], _) -> ts2
    (_, []) -> ts1
    (t1::ts1_rest, t2::ts2_rest) ->
      if rank t1 < rank t2 then t1 :: merge_one_pass ts1_rest ts2
      else if rank t2 < rank t1 then t2 :: merge_one_pass ts1 ts2_rest
      else merge_wc (link t1 t2) ts1_rest ts2_rest

merge_wc
  : Tree comparable -> List (Tree comparable) -> List (Tree comparable)
  -> List (Tree comparable)
merge_wc t ts1 ts2 =
  case (ts1, ts2) of
    ([], _) -> insertTree t ts2
    (_, []) -> insertTree t ts1
    (t1::ts1_rest, t2::ts2_rest) ->
      let (r, r1, r2) = (rank t, rank t1, rank t2) in
      if r < r1 && r < r2 then t :: merge_one_pass ts1 ts2
      else if r < r1 && r == r2 then merge_wc (link t t2) ts1 ts2_rest
      else if r == r1 && r < r2 then merge_wc (link t t1) ts1_rest ts2
      else if r == r1 && r == r2 then t :: merge_wc (link t1 t2) ts1_rest ts2_rest
      -- else if r == r1 && r == r2 then merge_wc (link t t1) ts1_rest ts2
      else Debug.crash "merge_wc: impossible"
```

Let *T(m)* and *S(m)* be the running times of `merge_one_pass` and `merge_wc`, respectively, where *m* is an upper bound on the number of trees in both input lists combined. Consider each of the five cases of `merge_one_pass`:

- Cases 1 and 2: *T(m)* = *O(1)*
- Cases 3 and 4: *T(m)* = *O(1)* + *T(m-1)*
- Case 5: *T(m)* = *O(1)* + *S(m-2)*

Consider each of the six cases of `merge_wc`:

- Cases 1 and 2: *S(m)* = *O(m)*
- Case 3: *S(m)* = *O(1)* + *T(m)*
- Cases 4, 5, and 6: *S(m)* = *O(1)* + *S(m-1)*

There are at most *O(m)* mutually recursive calls between the two functions. The last call to `merge_wc` may take *O(m)* time, but all other calls take *O(1)* time. Thus, the worst-case running time for each of these two functions is *O(m) + O(m) = O(m)*. That is, *O(log n)* time.
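As a sanity check on the one-pass definition, here is a Python transliteration of `merge_one_pass` and `merge_wc` (tuple-encoded trees; an illustrative sketch, not the course code, with `from_list` as my own helper). On small examples it produces heaps with the expected tree ranks.

```python
# Illustrative Python mirror of merge_one_pass / merge_wc;
# trees are (rank, root, children) tuples.

def link(t1, t2):
    (r, x1, ts1) = t1
    (_, x2, ts2) = t2
    if x1 <= x2:
        return (r + 1, x1, [t2] + ts1)
    return (r + 1, x2, [t1] + ts2)

def insert_tree(t, ts):
    if not ts:
        return [t]
    if t[0] == ts[0][0]:
        return insert_tree(link(t, ts[0]), ts[1:])
    return [t] + ts

def from_list(xs):
    ts = []
    for x in xs:
        ts = insert_tree((0, x, []), ts)
    return ts

def merge_one_pass(ts1, ts2):
    if not ts1:
        return ts2
    if not ts2:
        return ts1
    t1, t2 = ts1[0], ts2[0]
    if t1[0] < t2[0]:
        return [t1] + merge_one_pass(ts1[1:], ts2)
    if t2[0] < t1[0]:
        return [t2] + merge_one_pass(ts1, ts2[1:])
    return merge_wc(link(t1, t2), ts1[1:], ts2[1:])

def merge_wc(t, ts1, ts2):
    """Merge with a carry tree t; t's rank never exceeds any rank in ts1, ts2."""
    if not ts1:
        return insert_tree(t, ts2)
    if not ts2:
        return insert_tree(t, ts1)
    t1, t2 = ts1[0], ts2[0]
    r, r1, r2 = t[0], t1[0], t2[0]
    if r < r1 and r < r2:
        return [t] + merge_one_pass(ts1, ts2)
    if r < r1 and r == r2:
        return merge_wc(link(t, t2), ts1, ts2[1:])
    if r == r1 and r < r2:
        return merge_wc(link(t, t1), ts1[1:], ts2)
    if r == r1 and r == r2:
        return [t] + merge_wc(link(t1, t2), ts1[1:], ts2[1:])
    raise AssertionError("merge_wc: impossible")
```

For instance, merging heaps of 7 and 9 elements (binary `111` and `1001`) should leave a single rank-4 tree, since 7 + 9 = 16 = `10000`.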

```
removeMinTree
  : List (Tree comparable)
  -> (Tree comparable, List (Tree comparable))
removeMinTree ts =
  case ts of
    [] -> Debug.crash "removeMinTree: impossible"
    [t] -> (t, [])
    t::ts_rest ->
      let (minTree, restTrees) = removeMinTree ts_rest in
      if root t < root minTree
        then (t, ts_rest)
        else (minTree, t::restTrees)

deleteMin : Heap comparable -> Maybe (comparable, Heap comparable)
deleteMin (Heap ts) =
  case ts of
    [] -> Nothing
    _ ->
      let (Node _ x ts1, ts2) = removeMinTree ts in
      Just (x, Heap (merge_ (List.reverse ts1) ts2))
```
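Putting the pieces together, the following self-contained Python mirror of the operations (my own sketch, not the course code; `heap_sort` is a hypothetical helper) drains a heap with `delete_min` to confirm that elements come out in sorted order. Note that the removed root's children are stored in *decreasing* rank order, which is why they are reversed before merging.

```python
# Self-contained illustrative mirror of the heap operations;
# trees are (rank, root, children) tuples.

def link(t1, t2):
    (r, x1, ts1) = t1
    (_, x2, ts2) = t2
    if x1 <= x2:
        return (r + 1, x1, [t2] + ts1)
    return (r + 1, x2, [t1] + ts2)

def insert_tree(t, ts):
    if not ts:
        return [t]
    if t[0] == ts[0][0]:
        return insert_tree(link(t, ts[0]), ts[1:])
    return [t] + ts

def insert(x, ts):
    return insert_tree((0, x, []), ts)

def merge(ts1, ts2):
    if not ts1:
        return ts2
    if not ts2:
        return ts1
    t1, t2 = ts1[0], ts2[0]
    if t1[0] < t2[0]:
        return [t1] + merge(ts1[1:], ts2)
    if t2[0] < t1[0]:
        return [t2] + merge(ts1, ts2[1:])
    return insert_tree(link(t1, t2), merge(ts1[1:], ts2[1:]))

def remove_min_tree(ts):
    """Split a non-empty tree list into (tree with smallest root, rest)."""
    if len(ts) == 1:
        return ts[0], []
    min_tree, rest = remove_min_tree(ts[1:])
    if ts[0][1] < min_tree[1]:
        return ts[0], ts[1:]
    return min_tree, [ts[0]] + rest

def delete_min(ts):
    if not ts:
        return None
    (_, x, children), rest = remove_min_tree(ts)
    # Children are in decreasing rank order; reverse before merging.
    return x, merge(list(reversed(children)), rest)

def heap_sort(xs):
    """Insert everything, then repeatedly delete the minimum."""
    ts = []
    for x in xs:
        ts = insert(x, ts)
    out = []
    while ts:
        x, ts = delete_min(ts)
        out.append(x)
    return out
```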

We can reuse the `removeMinTree` helper function to reimplement `findMin`.

```
findMin (Heap ts) =
  case ts of
    [] -> Nothing
    _ -> Just (root (Tuple.first (removeMinTree ts)))
```

- Okasaki, Chapter 3.2