Monads


Recently, a fellow in category land discovered a fact that we in Haskell land have actually known for a while (in addition to things most of us probably don't). Specifically, given two categories $\mathcal{C}$ and $\mathcal{D}$, a functor $G : \mathcal{C} \rightarrow \mathcal{D}$, and provided some conditions in $\mathcal{D}$ hold, there exists a monad $T^G$, the codensity monad of $G$.

In category theory, the codensity monad is given by the rather frightening expression:

$ T^G(a) = \int_r \left[\mathcal{D}(a, Gr), Gr\right] $
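
Despite the frightening integral, in Haskell that end unrolls into an ordinary rank-2 type. Here is a minimal sketch, essentially the Codensity type from the kan-extensions package:

 
{-# LANGUAGE RankNTypes #-}

newtype Codensity m a = Codensity
  { runCodensity :: forall r. (a -> m r) -> m r }

-- note that none of the instances need anything from m
instance Functor (Codensity m) where
  fmap f (Codensity m) = Codensity (\k -> m (k . f))

instance Applicative (Codensity m) where
  pure a = Codensity (\k -> k a)
  Codensity mf <*> Codensity ma = Codensity (\k -> mf (\f -> ma (k . f)))

instance Monad (Codensity m) where
  Codensity m >>= f = Codensity (\k -> m (\a -> runCodensity (f a) k))
 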

(more...)

Andrej Bauer recently gave a really nice talk on how you can exploit side-effects to make a faster version of Martin Escardo's pseudo-paradoxical combinators.

A video of the talk is available over on his blog; his presentation is remarkably clear and would serve as a good preamble to the code I'm going to present below.

Andrej gave a related invited talk back at MSFP 2008 in Iceland, and afterwards over lunch I cornered him (with Dan Piponi) and explained how you could use parametricity to close over the side-effects of monads (or arrows, etc.), but I think that trick was lost in the chaos of the weekend. So I've chosen to resurrect it here, improve it to handle some of his more recent performance enhancements, and show that you don't need side-effects to speed up the search after all!

(more...)

Last time we derived an entailment relation for constraints, now let's get some use out of it.

Reflecting Classes and Instances

Most of the implications we use on a day to day basis come from our class and instance declarations, but last time we only really dealt with constraint products.
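
As a quick refresher, here is a minimal sketch of the entailment vocabulary from last time, in the form it later took in the constraints package:

 
{-# LANGUAGE ConstraintKinds, GADTs, KindSignatures, RankNTypes, TypeOperators #-}
import GHC.Exts (Constraint)

-- a first-class witness that a constraint holds
data Dict (p :: Constraint) where
  Dict :: p => Dict p

-- entailment: from p we can produce evidence for q
newtype p :- q = Sub (p => Dict q)

-- reflecting a class declaration: Ord's superclass gives an implication
ordEntailsEq :: Ord a :- Eq a
ordEntailsEq = Sub Dict
 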

(more...)

Max Bolingbroke has done a wonderful job on adding Constraint kinds to GHC.

Constraint Kinds adds a new kind, Constraint, such that Eq :: * -> Constraint and Monad :: (* -> *) -> Constraint. Since Constraint is a kind, we can write type families that compute constraints, and even parameterize constraints on constraints.
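
For example (a small sketch; Stringy and Suitable are names invented here for illustration):

 
{-# LANGUAGE ConstraintKinds, KindSignatures, TypeFamilies #-}
import GHC.Exts (Constraint)

-- a constraint synonym: a tuple of constraints is itself a constraint
type Stringy a = (Show a, Read a)

-- a type family that computes a constraint from a type
type family Suitable (f :: * -> *) a :: Constraint
type instance Suitable [] a    = Eq a
type instance Suitable Maybe a = Ord a
 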

So, let's play with them and see what we can come up with!

(more...)

Last time, I showed that we can build a small parsec clone with packrat support.

This time I intend to implement packrat directly on top of Parsec 3.

Ever since Bryan Ford's initial release of Pappy, one of the main topics of discussion around packrat parsing has been that in general you shouldn't memoize every rule; instead you should apply Amdahl's law and memoize only those rules where the cost of the lookup is repaid by the hit rate and the expense of re-evaluation. This is great news for us, since we only want to memoize a handful of expensive combinators.

(more...)

You've never heard of the Millennium Falcon? It's the ship that made the Kessel Run in 12 parsecs.

I've been working on a parser combinator library called trifecta, and so I decided I'd share some thoughts on parsing.

Packrat parsing (as provided by frisby, pappy, rats! and the Scala parsing combinators) and more traditional recursive descent parsers (like Parsec) are often held up as somehow different.

Today I'll show that you can add monadic parsing to a packrat parser, sacrificing asymptotic guarantees in exchange for convenient context sensitivity, and conversely, that you can easily add packrat parsing to a traditional monadic parser combinator library.
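
The tension between the two is easy to demonstrate. Here is the classic context-sensitive example, sketched in Parsec style (any monadic combinator library would do): the parser for the payload depends on a value parsed from the input's head, so results can no longer be memoized purely by rule and position.

 
import Text.Parsec
import Text.Parsec.String (Parser)

-- a length-prefixed field, e.g. "3:abc": monadic bind lets the parser
-- for the payload depend on the number we just parsed
field :: Parser String
field = do
  n <- read <$> many1 digit
  _ <- char ':'
  count n anyChar

-- parse field "" "3:abcde" == Right "abc"
 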

(more...)

Today I hope to start a new series of posts exploring constructive abstract algebra in Haskell.

In particular, I want to talk about a novel encoding of linear functionals, polynomials and linear maps in Haskell, but first we're going to have to build up some common terminology.

Having obtained the blessing of Wolfgang Jeltsch, I replaced the algebra package on hackage with something... bigger, although still very much a work in progress.

(more...)

In the last few posts, I've been talking about how we can derive monads and monad transformers from comonads. Along the way we learned that there are more monads than comonads in Haskell.

The question I hope to answer this time is whether or not we can turn any Haskell Comonad into a comonad transformer.

(more...)

Last time in Monad Transformers from Comonads I showed that given any comonad we can derive the monad transformer

 
newtype CoT w m a = CoT { runCoT :: forall r. w (a -> m r) -> m r }
 

and so demonstrated that there are fewer comonads than monads in Haskell, because while every Comonad gives rise to a Monad transformer, there are Monads, like IO, ST s, and STM, that do not arise from a comonad this way.
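
For the curious, the instances more or less write themselves once you chase the types. Here is a sketch, assuming the CoT newtype above (with RankNTypes enabled), using extract and extend from the comonad package, plus the Functor and Applicative boilerplate modern GHC requires:

 
import Control.Comonad (Comonad (..))

instance Functor w => Functor (CoT w m) where
  fmap f (CoT k) = CoT (k . fmap (. f))

instance Comonad w => Applicative (CoT w m) where
  pure a = CoT (\w -> extract w a)
  mf <*> ma = mf >>= \f -> fmap f ma

instance Comonad w => Monad (CoT w m) where
  CoT k >>= f = CoT (k . extend (\w a -> runCoT (f a) w))

-- and it really is a transformer:
liftCoT :: (Comonad w, Monad m) => m a -> CoT w m a
liftCoT m = CoT (\w -> m >>= extract w)
 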

I want to elaborate a bit more on this topic.

(more...)

Last time, I showed that we can transform any Comonad in Haskell into a Monad in Haskell.

Today, I'll show that we can go one step further and derive a monad transformer from any comonad!

(more...)

Today I'll show that you can derive a Monad from any old Comonad you have lying around.

(more...)

Last time, I said that I was going to put our cheap new free monad to work, so let's give it a shot.

(more...)

Last time, I started exploring whether or not Codensity was necessary to improve the asymptotic performance of free monads.

This time I'll show that the answer is no; we can get by with something smaller.
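
As a point of comparison, here is a sketch of the kind of smaller gadget in question: the Church-encoded free monad, which today lives in the free package as Control.Monad.Free.Church.

 
{-# LANGUAGE RankNTypes #-}

newtype F f a = F { runF :: forall r. (a -> r) -> (f r -> r) -> r }

instance Functor (F f) where
  fmap f (F g) = F (\kp kf -> g (kp . f) kf)

instance Applicative (F f) where
  pure a = F (\kp _ -> kp a)
  F mf <*> F ma = F (\kp kf -> mf (\f -> ma (kp . f) kf) kf)

instance Monad (F f) where
  F m >>= f = F (\kp kf -> m (\a -> runF (f a) kp kf) kf)

-- no Functor f constraint is needed anywhere, and binds associate
-- to the right by construction, which is where the speedup comes from
 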

(more...)

A couple of years back Janis Voigtländer wrote a nice paper on how one can use the codensity monad to improve the asymptotic complexity of algorithms that use free monads. He didn't use the name Codensity in the paper, but this is essentially the meaning of his type C.

I just returned from running a workshop on domain-specific languages at McMaster University with the more than able assistance of Wren Ng Thornton. Among the many topics covered, I spent a lot of time talking about how to use free monads to build up term languages for various DSLs with simple evaluators, and then made them efficient by using Codensity.

This has been shown to be a sufficient tool for this task, but is it necessary?
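
For reference, the embedding at the heart of the paper is small. Here is a sketch using the paper's rep/abs names (abs primed here to dodge Prelude.abs), with C standing in for Codensity:

 
{-# LANGUAGE RankNTypes #-}

newtype C m a = C { runC :: forall r. (a -> m r) -> m r }

rep :: Monad m => m a -> C m a
rep m = C (m >>=)

abs' :: Monad m => C m a -> m a
abs' (C k) = k return

-- abs' . rep = id by the monad laws, and binds on the C side reassociate
-- to the right, so a left-nested pile of binds over a free monad costs
-- linear rather than quadratic time
 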

(more...)

I have updated the reflection package on hackage to use an idea for avoiding dummy arguments that Bertram Felgenhauer posted to the Haskell cafe mailing list; it adapts nicely to the case of handling reflection. The reflection package implements the ideas from the Functional Pearl: Implicit Configurations paper by Oleg Kiselyov and Chung-chieh Shan.

Now, you no longer need to use big scary undefineds throughout your code and can instead program with implicit configurations more naturally, using Applicative and Monad sugar.
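
Here is a minimal sketch of the pattern, written against the reify/reflect API as the reflection package exposes it today (which may differ in detail from the interface at the time of this post):

 
import Data.Reflection (reify, reflect)

-- reify smuggles the value 42 into a fresh type s; reflect reads it back
-- from any proxy for s, with no dummy value-level argument in sight
example :: Int
example = reify (42 :: Int) (\p -> reflect p + 1)  -- 43
 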

(more...)

I want to spend some more time talking about Kan extensions, composition of Kan extensions, and the relationship between a monad and the monad generated by a monad.

But first, I want to take a moment to recall adjunctions and show how they relate to some standard (co)monads, before tying them back to Kan extensions.

Adjunctions 101

An adjunction between categories $\mathcal{C}$ and $\mathcal{D}$ consists of a pair of functors $F : \mathcal{C} \to \mathcal{D}$ and $G : \mathcal{D} \to \mathcal{C}$, and a natural isomorphism:

$\phi : \mathrm{Hom}_\mathcal{D}(F-, =) \to \mathrm{Hom}_\mathcal{C}(-, G=)$

We call $F$ the left adjoint functor and $G$ the right adjoint functor, call $(F,G)$ an adjoint pair, and write this relationship as $F \dashv G$.
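
In Haskell terms, an adjunction can be sketched as a two-parameter type class (along the lines of what the adjunctions package now provides in Data.Functor.Adjunction); the classic instance below already ties adjunctions back to standard (co)monads:

 
{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-}

class (Functor f, Functor g) => Adjunction f g where
  unit   :: a -> g (f a)    -- eta : Id -> GF
  counit :: f (g a) -> a    -- epsilon : FG -> Id

-- the classic adjoint pair: (,) e is left adjoint to (->) e
instance Adjunction ((,) e) ((->) e) where
  unit a        = \e -> (e, a)
  counit (e, f) = f e

-- composing right after left, e -> (e, a), is the State e monad;
-- composing left after right, (e, e -> a), is the Store e comonad
 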

(more...)

I think I may spend a post or two talking about Kan extensions.

They appear to be black magic to Haskell programmers, but as Saunders Mac Lane said in Categories for the Working Mathematician:

All concepts are Kan extensions.

So what is a Kan extension? They come in two forms: right and left Kan extensions.

First I'll talk about right Kan extensions, since Haskell programmers have a better intuition for them.
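
As a teaser, here is the standard Haskell encoding of the right Kan extension, essentially the Ran type from the kan-extensions package:

 
{-# LANGUAGE RankNTypes #-}

-- the right Kan extension of h along g
newtype Ran g h a = Ran { runRan :: forall b. (a -> g b) -> h b }

-- two familiar faces: Ran g g is the codensity monad of g,
-- and Ran Identity f recovers Yoneda f
 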

(more...)

 
> import Control.Arrow ((|||),(&&&),left)
> newtype Mu f = InF { outF :: f (Mu f) }
 

I want to talk about a novel recursion scheme that hasn't received a lot of attention from the Haskell community and its even more obscure dual -- which is necessarily more obscure because I believe this is the first time anyone has talked about it.

Jiri Adámek, Stefan Milius and Jiri Velebil have done a lot of work on Elgot algebras. Here I'd like to translate them into Haskell, dualize them, show that the dual can encode primitive recursion, and make a few observations along the way.

You can kind of think of an Elgot algebra as a hylomorphism that cheats.

 
> elgot :: Functor f => (f b -> b) -> (a -> Either b (f a)) -> a -> b
> elgot phi psi = h where h = (id ||| phi . fmap h) . psi
 
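
Dualizing trades the Either in psi for pairing in phi, so instead of the unfolding being able to bail out early, the folding side gets access to the original seed. A sketch of the dual (call it coelgot):

 
> coelgot :: Functor f => ((a, f b) -> b) -> (a -> f a) -> a -> b
> coelgot phi psi = h where h = phi . (id &&& fmap h . psi)
>
> -- with psi = outF, phi sees the seed alongside the recursive results,
> -- which is primitive recursion (a paramorphism) in disguise
 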

(more...)

Twan van Laarhoven pointed out that fzip from the other day is a close cousin of Applicative, chosen to be an inverse of the universal construction 'unfzip'.

In that post I also put off talking about the dual of zipping, so I figured I'd bring up 'cozipping', choosing a notion of it by defining it as an inverse to the universally definable 'counzipping'.
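
For concreteness: both 'un' directions come for free for any Functor whatsoever, which is what makes them the universal halves of the definitions. A quick sketch:

 
import Control.Arrow ((&&&), (|||))

-- splitting comes for free for every Functor...
unfzip :: Functor f => f (a, b) -> (f a, f b)
unfzip = fmap fst &&& fmap snd

-- ...and so does its dual; 'cozipping' is then whatever inverts this
counzip :: Functor f => Either (f a) (f b) -> f (Either a b)
counzip = fmap Left ||| fmap Right
 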

(more...)

 
{-# OPTIONS -fglasgow-exts -fallow-undecidable-instances #-}
import Control.Monad
import Control.Monad.Identity
import Control.Arrow ((&&&), (***),(+++), (|||))
 

I want to talk about duality briefly. I don't want to go all the way to Filinski-style or Haskell is Not Not ML-style value/continuation duality, but I do want to poke a bit at the variant/record duality exemplified by the extensible cases used to handle variants in MLPolyR.

The need for extensible cases to handle open variants is part of the expression problem as stated by Wadler:

The goal is to define a data type by cases, where one can add new cases to the data type and new functions over the data type, without recompiling existing code, and while retaining static type safety.

One obvious trick is to use an extensible record of functions as a 'case' statement, with each field corresponding to one of the variants. Dually, to index into records you can use an extensible variant of functions to represent a field selection. In a purer form, à la the Filinski or Haskell is Not Not ML approaches mentioned above, you can replace the word 'function' with 'continuation' and everything works out.
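
To make the trick concrete in a closed (non-extensible) setting, since extensibility is exactly the hard part (the names below are invented for illustration):

 
-- a variant and its dual: a record of functions, one field per case
data Shape = Circle Double | Rect Double Double

data ShapeCase r = ShapeCase
  { onCircle :: Double -> r
  , onRect   :: Double -> Double -> r
  }

-- indexing the record by a variant is exactly 'case'
match :: ShapeCase r -> Shape -> r
match c (Circle r)  = onCircle c r
match c (Rect w h)  = onRect c w h

-- e.g. area = match (ShapeCase (\r -> pi * r * r) (\w h -> w * h))
 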

Swierstra recently tackled the extensible-variant side of the equation in Data types à la carte, using free monads over functor coproducts to handle the 'variant' side of things, leaving the handling of cases to type classes, but let's see if we can go one better and just exploit the variant/record duality directly.

(more...)
