Comonads


In the last few posts, I've been talking about how we can derive monads and monad transformers from comonads. Along the way we learned that there are more monads than comonads in Haskell.

The question I hope to answer this time is whether or not we can turn any Haskell Comonad into a comonad transformer.

(more...)

Last time, in Monad Transformers from Comonads, I showed that given any comonad we can derive the monad transformer

 
newtype CoT w m a = CoT { runCoT :: forall r. w (a -> m r) -> m r }
 

and so demonstrated that there are fewer comonads than monads in Haskell, because while every Comonad gives rise to a Monad transformer, there are Monads that do not arise this way, such as IO, ST s, and STM.
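For reference, here is a minimal sketch of the Monad instance such a CoT admits (assuming the definition above and a Comonad class providing extract and extend); notably, it asks nothing of m, only that w is a Comonad:

-- a sketch, not the original post's verbatim code: only the Comonad
-- structure of w is used, which is the whole point
instance Comonad w => Monad (CoT w m) where
    return a    = CoT (\w -> extract w a)
    CoT k >>= f = CoT (k . extend (\wa a -> runCoT (f a) wa))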

I want to elaborate a bit more on this topic.

(more...)

Last time, I showed that we can transform any Comonad in Haskell into a Monad in Haskell.

Today, I'll show that we can go one step further and derive a monad transformer from any comonad!

(more...)

Today I'll show that you can derive a Monad from any old Comonad you have lying around.
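The construction has a very small footprint. As a sketch (essentially the CoT type recapped in the later post above, with the transformer part dropped), it is just a rank-2, CPS-flavored newtype:

{-# LANGUAGE RankNTypes #-}

-- a sketch of the shape of the construction; only the Comonad
-- structure of w will be needed to make this a Monad
newtype Co w a = Co { runCo :: forall r. w (a -> r) -> r }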

(more...)

Last time, I said that I was going to put our cheap new free monad to work, so let's give it a shot.

(more...)

Last night, Chung-Chieh Shan posted an example of a pointed-set monad on his blog, which happens to be isomorphic to a non-empty stream monad with a different emphasis.

But I thought I should point out that the pointed set that he posted also has a comonadic structure, which may be exploited, since it is just a variation on the "zipper comonad," a structure that is perhaps more correctly called a "pointing comonad."
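To make that concrete, here is a rough sketch of the shape I have in mind (my own rendering for illustration, with made-up names, not Shan's code): a focused element together with the remaining elements, where extract reads the point and duplicate re-points the structure at each element in turn.

import Data.List (inits, tails)

-- a sketch: a pointed set as a focus plus the rest of the elements
data Pointed a = Pointed a [a]

instance Functor Pointed where
    fmap f (Pointed a as) = Pointed (f a) (map f as)

-- read off the point
extractP :: Pointed a -> a
extractP (Pointed a _) = a

-- re-point the structure at each element in turn, keeping the old
-- focus among the rest; this is the "pointing" structure in question
duplicateP :: Pointed a -> Pointed (Pointed a)
duplicateP p@(Pointed a as) =
    Pointed p [ Pointed x (a : pre ++ post)
              | (pre, x:post) <- zip (inits as) (tails as) ]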

(more...)

I want to spend some more time talking about Kan extensions, composition of Kan extensions, and the relationship between a monad and the monad generated by a monad.

But first, I want to take a moment to recall adjunctions and show how they relate to some standard (co)monads, before tying them back to Kan extensions.

Adjunctions 101

An adjunction between categories $\mathcal{C}$ and $\mathcal{D}$ consists of a pair of functors $F : \mathcal{C} \to \mathcal{D}$ and $G : \mathcal{D} \to \mathcal{C}$, together with a natural isomorphism:

$\phi : \mathrm{Hom}_\mathcal{D}(F-, =) \cong \mathrm{Hom}_\mathcal{C}(-, G=)$

We call $F$ the left adjoint functor and $G$ the right adjoint functor, call $(F, G)$ an adjoint pair, and write this relationship as $F \dashv G$.
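Rendered in Haskell, an adjunction between endofunctors can be captured by a pair of functions witnessing $\phi$ and its inverse. The class below is an illustrative sketch (it mirrors the shape of what later became Data.Functor.Adjunction, and is not a definition from this post); the instance is the classic example, where composing the two functors one way around gives the State monad and the other way gives the Store (costate) comonad:

{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-}

-- a sketch of an adjunction F -| G between Haskell endofunctors
class (Functor f, Functor g) => Adjunction f g where
    leftAdjunct  :: (f a -> b) -> a -> g b   -- one direction of phi
    rightAdjunct :: (a -> g b) -> f a -> b   -- ... and its inverse

-- the standard example: ((,) e) -| ((->) e); phi is just currying
-- with the arguments shuffled
instance Adjunction ((,) e) ((->) e) where
    leftAdjunct  f a e    = f (e, a)
    rightAdjunct f (e, a) = f a e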

(more...)

I think I may spend a post or two talking about Kan extensions.

They appear to be black magic to Haskell programmers, but as Saunders Mac Lane said in Categories for the Working Mathematician:

All concepts are Kan extensions.

So what is a Kan extension? They come in two forms: right and left Kan extensions.

First I'll talk about right Kan extensions, since Haskell programmers have a better intuition for them.
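One common Haskell encoding of the right Kan extension has a tiny footprint (a sketch; the name Ran and this particular formulation are my choices here, not something defined in this excerpt):

{-# LANGUAGE RankNTypes #-}

-- a sketch: the right Kan extension of h along g
newtype Ran g h a = Ran { runRan :: forall b. (a -> g b) -> h b }

-- note that Ran m m a has the familiar CPS shape
-- forall b. (a -> m b) -> m b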

(more...)

 
> import Control.Arrow ((|||),(&&&),left)
> newtype Mu f = InF { outF :: f (Mu f) }
 

I want to talk about a novel recursion scheme that hasn't received a lot of attention from the Haskell community and its even more obscure dual -- which is necessarily more obscure because I believe this is the first time anyone has talked about it.

Jiri Adámek, Stefan Milius and Jiri Velebil have done a lot of work on Elgot algebras. Here I'd like to translate them into Haskell, dualize them, observe that the dual can encode primitive recursion, and provide some observations.

You can kind of think of an Elgot algebra as a hylomorphism that cheats.

 
> -- An Elgot algebra: like a hylomorphism, except the coalgebra may
> -- short-circuit (Left) with a finished result instead of unfolding
> -- further (Right).
> elgot :: Functor f => (f b -> b) -> (a -> Either b (f a)) -> a -> b
> elgot phi psi = h where h = (id ||| phi . fmap h) . psi
 
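For comparison, here is a sketch of what the dual looks like when transcribed the same way (my rendering; the post's own development of it is behind the cut). The algebra additionally gets to see the original seed, which is what lets the dual encode primitive-recursion-like definitions:

> -- a sketch of the dual: the algebra also sees the seed it came from
> coelgot :: Functor f => ((a, f b) -> b) -> (a -> f a) -> a -> b
> coelgot phi psi = h where h = phi . (id &&& fmap h . psi)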

(more...)

Kefer asked a question in the comments of my post on (co)monadic (co)strength about Uustalu and Vene's ComonadZip class from p. 157 of The Essence of Dataflow Programming. The class in question is:

 
class Comonad w => ComonadZip w where
     czip :: w a -> w b -> w (a, b)
 

In response I added Control.Functor.Zip [Source] to my nascent rebundled version of category-extras, which was posted up to hackage earlier today.
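To make the class concrete, here is what an instance might look like for a simple dataflow comonad, infinite streams (a sketch; the Stream type and these instances are my own illustration, not necessarily what ships in Control.Functor.Zip):

-- a sketch, assuming a Comonad class along the lines of the one
-- shown further down this page
data Stream a = Cons a (Stream a)

instance Functor Stream where
    fmap f (Cons a as) = Cons (f a) (fmap f as)

instance Comonad Stream where
    extract (Cons a _) = a
    duplicate s@(Cons _ as) = Cons s (duplicate as)

instance ComonadZip Stream where
    czip (Cons a as) (Cons b bs) = Cons (a, b) (czip as bs)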

(more...)

 
{-# OPTIONS -fglasgow-exts -fallow-undecidable-instances #-}
import Control.Monad
import Control.Monad.Identity
import Control.Arrow ((&&&), (***),(+++), (|||))
 

I want to talk about duality briefly. I don't want to go all the way to Filinski-style or Haskell is Not Not ML-style value/continuation duality, but I do want to poke a bit at the variant/record duality exemplified by the extensible cases used to handle variants in MLPolyR.

The need for extensible cases to handle open variants is part of the expression problem as stated by Wadler:

The goal is to define a data type by cases, where one can add new cases to the data type and new functions over the data type, without recompiling existing code, and while retaining static type safety.

One obvious trick is to use an extensible record of functions as a 'case' statement, with each field corresponding to one of the variants. To index into records you can use an extensible variant of functions to represent field selection. In a purer form, à la the Filinski or Haskell is Not Not ML approach mentioned above, you can replace the word 'function' with 'continuation' and everything works out.
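In plain, non-extensible Haskell the trick degenerates to something familiar: a closed two-way variant is Either, the 'record of functions' is just a pair of handlers, and indexing into the record is case analysis (a toy sketch):

-- a toy, closed version of the record-of-functions-as-case idea
caseOf :: (a -> r, b -> r) -> Either a b -> r
caseOf (f, g) = either f g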

Swierstra recently tackled the extensible variant side of the equation in Data types à la carte, using the free monad coproduct to handle the 'variant' side of things and leaving the handling of cases to typeclasses, but we can see if we can go one better and just exploit the variant/record duality directly.

(more...)

No, this isn't some uplifting piece about deriving courage from sloth in the face of adversity.

What I want to talk about is monadic strength.

Transcribing the definition from category theory into Haskell, we find that a strong monad is a monad equipped with a natural transformation:

$t_{A, B} : M A \times B \to M (A \times B)$

with a couple of conditions on it that I'll get to later.

Currying that to get something that feels more natural to a Haskell programmer we get:

 
mstrength :: Monad m => m a -> b -> m (a,b)
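-- one obvious candidate definition (a sketch; Pardo's formulation,
-- mentioned below, is behind the cut)
mstrength ma b = ma >>= \a -> return (a, b)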
 

Pardo provided us with a nice definition for that in Towards merging recursion and comonads:

(more...)

In case it wasn't obvious, I thought I should mention that Kabanov and Vene's dynamorphisms, which optimize histomorphisms for dynamic programming, can readily be expressed as chronomorphisms; they just use an anamorphism instead of a futumorphism.
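In code, the observation is a one-liner. Taking the chrono combinator from the chronomorphism post further down this page, together with the free monad's return (a sketch, with those definitions assumed from the linked source):

-- a sketch: a dynamorphism is a chronomorphism whose coalgebra never
-- jumps ahead, so fmap return suffices to lift a plain anamorphism
-- coalgebra into the form chrono expects
dyna :: Functor f => (f (Cofree f b) -> b) -> (a -> f a) -> a -> b
dyna phi psi = chrono phi (fmap return . psi)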

(more...)

Back in the days of HYLO, it was common to write hylomorphisms with an additional natural transformation in them. Well, I was still coding in evil imperative languages back then, but I have it on reliable, er.. well supposition, that this is probably the case, or at least that they liked to do it back in the HYLO papers anyways.

Transcoding the category theory mumbo-jumbo into Haskell, so I can have a larger audience, we get the following 'frat combinator' -- you can blame Jules Bean from #haskell for that.

 
hyloEta :: Functor f =>
     (g b -> b) ->
     (forall x. f x -> g x) ->
     (a -> f a) ->
     a -> b
hyloEta phi eta psi = phi . eta . fmap (hyloEta phi eta psi) . psi
 

We placed eta in the middle of the argument list because it is evocative of the fact that it occurs between phi and psi, and because that seems to be where everyone else puts it.

Now, clearly, we could roll eta into phi and get the more traditional hylo where f = g. Less obviously we could roll it into psi because it is a natural transformation and so the following diagram commutes:

$$\begin{array}{ccc}
F(A) & \xrightarrow{\;F([\![f, g]\!])\;} & F(B) \\
{\scriptstyle\eta_A}\big\downarrow & & \big\downarrow{\scriptstyle\eta_B} \\
G(A) & \xrightarrow{\;G([\![f, g]\!])\;} & G(B)
\end{array}$$

This 'Hylo Shift' property (mentioned in that same paper) allows us to move the 'eta' term into the phi term or into the psi term as we see fit. Since we can move the eta term around and it adds no value to the combinator, it quietly returned to the void from whence it came. hyloEta offers us no more power than hylo, so out it goes.
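Concretely, both absorptions are one-liners (a sketch; the rank-2 argument needs -fglasgow-exts or RankNTypes):

-- plain hylo, for reference
hylo :: Functor f => (f b -> b) -> (a -> f a) -> a -> b
hylo phi psi = phi . fmap (hylo phi psi) . psi

hyloEtaL, hyloEtaR :: (Functor f, Functor g) =>
     (g b -> b) -> (forall x. f x -> g x) -> (a -> f a) -> a -> b
hyloEtaL phi eta psi = hylo (phi . eta) psi   -- eta rolled into phi
hyloEtaR phi eta psi = hylo phi (eta . psi)   -- eta rolled into psi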

So, if it's dead, why talk about it?

(more...)

First, we can make the generalized hylomorphism from the other day more efficient by noting that once you inline the hylomorphism, you can see that you do 3 fmaps over the same structure, so we can fuse those together yielding:

 
g_hylo :: (Comonad w, Functor f, Monad m) =>
          (forall a. f (w a) -> w (f a)) ->
          (forall a. m (f a) -> f (m a)) ->
          (f (w b) -> b) ->
          (a -> f (m a)) ->
          (a -> b)
g_hylo w m f g = extract . g_hylo' w m f g . return

-- | the kernel of the generalized hylomorphism
g_hylo' :: (Comonad w, Functor f, Monad m) =>
           (forall a. f (w a) -> w (f a)) ->
           (forall a. m (f a) -> f (m a)) ->
           (f (w b) -> b) ->
           (a -> f (m a)) ->
           (m a -> w b)
g_hylo' w m f g =
    liftW f . w .
    fmap (duplicate . g_hylo' w m f g . join) .
    m . liftM g
 

Also, the above made me realize that most of the generalized cata-, ana-, etc. morphisms give you a little more interesting stuff to do if you separate out the recursive part. Then you can pass it a monad built with something other than return to perform substitution on, or inspect the comonadic wrapper on the result.

Oh, and to support my earlier claim that g_hylo generalizes g_cata and g_ana, here are derivations of each in terms of g_hylo.

 
g_cata :: (Functor f, Comonad w) =>
          (forall a. f (w a) -> w (f a)) ->
          (f (w a) -> a) ->
          Mu f -> a
 
g_cata k f = g_hylo k (fmap Id . runId) f (fmap Id . outF)
 
g_ana :: (Functor f, Monad m) =>
         (forall a. m (f a) -> f (m a)) ->
         (a -> f (m a)) ->
         a -> Nu f
g_ana k g = g_hylo (Id . fmap runId) k (InF . fmap runId) g
 
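The derivations above lean on an identity functor that is simultaneously a monad and a comonad, and on a greatest-fixpoint type Nu; a minimal sketch of what is being assumed (the linked source may spell these differently) is:

newtype Id a = Id { runId :: a }

instance Functor Id where
    fmap f (Id a) = Id (f a)

instance Monad Id where
    return = Id
    Id a >>= f = f a

instance Comonad Id where
    extract = runId
    duplicate = Id

-- in Haskell, least and greatest fixed points coincide, so Nu can
-- simply be identified with Mu here
type Nu = Mu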

As an aside, histomorphisms have a dual that seems to be elided from most lists of recursion schemes: Uustalu and Vene call it a futumorphism. It basically lets you return a structure with seeds multiple levels deep rather than have to plumb 'one level at a time' through the anamorphism. While a histomorphism is a generalized catamorphism parameterized by the cofree comonad of your functor, a futumorphism is a generalized anamorphism parameterized by the free monad of your functor.

 
futu :: Functor f => (a -> f (Free f a)) -> a -> Nu f
futu f = ana ((f ||| id) . runFree) . return
 

Now, g_hylo is painfully general, so let's look at a particularly interesting choice of comonad and monad for a given functor that always have a distributive law: the cofree comonad and the free monad of that very same functor!

This gives rise to a particular form of morphism that I haven't seen talked about in the literature, which, after kicking a few names around on the #haskell channel, we chose to call a chronomorphism, because it subsumes histo- and futumorphisms.

 
chrono :: Functor f =>
          (f (Cofree f b) -> b) ->
          (a -> f (Free f a)) ->
          a -> b
 

Unlike most of the types of these generalized recursion schemes, chrono's type is quite readable!

A chronomorphism's fold operation can 'look back' at the results it has given, and its unfold operation can 'jump forward' by returning seeds nested multiple levels deep. It relies on the fact that you always have a distributive law for the cofree comonad of your functor over the functor itself and also one for the functor over its free monad and so it works for any Functor.

You can generalize it like you generalize histomorphisms and futumorphisms, and you can derive anamorphisms and catamorphisms from it by noting that you can fmap extract or fmap return to deal with the cofree comonad or free monad parts of the term.
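Spelled out (a sketch, assuming the Cofree comonad's extract and the Free monad's return from the linked source, and identifying Nu with Mu as above):

cata :: Functor f => (f b -> b) -> Mu f -> b
cata phi = chrono (phi . fmap extract) (fmap return . outF)

ana :: Functor f => (a -> f a) -> a -> Nu f
ana psi = chrono (InF . fmap extract) (fmap return . psi)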

Alternatively, since the 'identity comonad' can be viewed as the cofree comonad of the $\perp$ Functor that maps everything to $\perp$, you can also choose to rederive generalized futumorphisms from generalized chronomorphisms using the distributive law of the identity comonad.

Below you'll find source code for generalized hylo- cata- ana- histo- futu- chrono- etc... morphisms and their separated kernels.

Source Code

As an aside, Dan Doel (dolio) has started packaging these up for addition to category-extras in Hackage.

I haven't seen the following operator (g_hylo) written up anywhere; it is defined in the spirit of generalized catamorphisms and generalized anamorphisms, and seems to follow rather naturally from the definition of both -- I'm using liftW & liftM rather than fmap to make it clear what is being lifted over what.

 
class Functor w => Comonad w where
        -- minimal definition: extend & extract or duplicate & extract
        duplicate :: w a -> w (w a)
        extend :: (w a -> b) -> w a -> w b
        extract :: w a -> a
        extend f = fmap f . duplicate
        duplicate = extend id
 
liftW :: Comonad w => (a -> b) -> w a -> w b
liftW f = extend (f . extract)
 
g_hylo :: (Comonad w, Functor f, Monad m) =>
          (forall a. f (w a) -> w (f a)) ->
          (forall a. m (f a) -> f (m a)) ->
          (f (w b) -> b) ->
          (a -> f (m a)) ->
          a -> b
g_hylo w m f g =
     extract .
     hylo (liftW f . w . fmap duplicate) (fmap join . m . liftM g)
     . return
   where
     hylo f g = f . fmap (hylo f g) . g
 

In the above, w and m are the distributive laws for the comonad and monad respectively, and hylo is a standard hylomorphism. In the style of Dave Menendez's Control.Recursion code it would be a 'refoldWith', and it can rederive a whole lot of recursion and corecursion patterns, if not all of them.
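As a quick sanity check (a sketch, reusing an identity functor Id with the obvious Monad and Comonad instances, like the one sketched further up this page), instantiating both w and m to the identity collapses g_hylo back to an ordinary hylomorphism:

hylo' :: Functor f => (f b -> b) -> (a -> f a) -> a -> b
hylo' phi psi = g_hylo (Id . fmap runId) (fmap Id . runId)
                       (phi . fmap runId) (fmap Id . psi)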

Anyone?