[Source](http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/ "Permalink to Seemingly impossible functional programs | Mathematics and Computation")

# Seemingly impossible functional programs

Posted on [September 28, 2007][9] by [Martin Escardo][13] in [Computation][10], [Guest post][11], [Tutorial][12].

Andrej has invited me to write about certain surprising functional programs. The first program, due to [Ulrich Berger][14] (1990), performs exhaustive search over the "[Cantor space][15]" of infinite sequences of binary digits. I have included references at the end.

A weak form of exhaustive search checks whether or not a total predicate holds for all elements of the Cantor space; this amounts to universal quantification over the Cantor space. Can this possibly be done algorithmically, in finite time? A stronger form finds an example for which the predicate holds, if such an example exists, and reports that there isn't any otherwise.

I will use the language [Haskell][16], but it is possible to quickly translate the programs to e.g. ML or [OCaml][17]. The source code shown here is attached as [seemingly-impossible.hs][18].

We could use the booleans to represent binary digits, or even the integers, but I prefer to use a different type to avoid confusion:

> data Bit = Zero | One
>            deriving (Eq)

The `deriving` clause tells Haskell to figure out how to decide equality of bits automatically.

For the type of infinite sequences, we could use the built-in type of lazy lists for most algorithms considered here. But, in order to illustrate certain points, I will take the mathematical view and regard sequences as functions defined on the natural numbers (an aside below sketches the correspondence with lazy lists). The next version of the definition of Haskell will have a built-in type of natural numbers. For the moment, I implement it as the type of integers:

> type Natural = Integer
> type Cantor  = Natural -> Bit

The operator `(#)` takes a bit `x` and a sequence `a` and produces a new sequence `x # a` with `x` as the head and `a` as the tail (very much like the built-in operation `(:)` for lists):

> (#) :: Bit -> Cantor -> Cantor
> x # a = \i -> if i == 0 then x else a(i-1)

Notice that the notation `\i -> ...` stands for $\lambda i.\ \dots$.

Next, we come to the heart of the matter, the functions that perform exhaustive search over the Cantor space. The specification of the function `find` is that, for any total `p`, the sequence `find p` is always a total element of the Cantor space, and, moreover, if there is `a` in the Cantor space with `p a = True`, then `a = find p` is an example of such an `a`.

> forsome, forevery :: (Cantor -> Bool) -> Bool
> find :: (Cantor -> Bool) -> Cantor

Because I will have several implementations of `find`, I have to choose one to be able to compile and run the program. A canonical choice is the first one,

> find = find_i

but you are invited to experiment with the other ones. For the following definition of `find_i` to make sense, you have to take the above choice.
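Before moving on, here is the aside promised above on the correspondence between the functional representation of sequences and lazy lists. This is only a minimal sketch; the names `toList` and `fromList` are ad hoc, the two definitions are not part of the attached program, and nothing below depends on them:

> toList :: Cantor -> [Bit]
> toList a = map a [0..]
>
> fromList :: [Bit] -> Cantor
> fromList xs = \i -> xs !! fromIntegral i  -- meaningful only for infinite lists of bits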
The function `find` takes a predicate on the Cantor space, and hence it will typically have a $\lambda$-expression as argument. In the following definition this is not necessary, because `(\a -> p a) = p` by the $\eta$ rule. But I have adopted it for the sake of clarity, as then we can read "`find(\a -> p a)`" aloud as "find `a` such that `p(a)`":

> forsome p = p(find(\a -> p a))
> forevery p = not(forsome(\a -> not(p a)))

Notice that the function `forevery` (universal quantification) is obtained from the function `forsome` (existential quantification) via the [De Morgan Law][19]. The functionals `forsome` and `find_i` are defined by mutual recursion:

> find_i :: (Cantor -> Bool) -> Cantor
> find_i p = if forsome(\a -> p(Zero # a))
>            then Zero # find_i(\a -> p(Zero # a))
>            else One  # find_i(\a -> p(One  # a))

The intuitive idea of the algorithm `find_i` is clear: if there is an example starting with zero, then the result is taken to start with zero, otherwise it must start with one. Then we recursively build the tail using the same idea. What may not be clear is whether the recursion eventually produces a digit, because of the indirect recursive call via the call to `forsome`. A mathematical proof proceeds by induction on the modulus of uniform continuity of `p`, defined below.

It may be more natural to return an example only if there is one, and otherwise to report that there isn't any:

> search :: (Cantor -> Bool) -> Maybe Cantor
> search p = if forsome(\a -> p a) then Just(find(\a -> p a)) else Nothing

The `Maybe` type constructor is predefined by Haskell as `data Maybe a = Just a | Nothing`.

Type-theoretic remark: the type `Maybe a` corresponds to the sum type $A+1$, where the only element of $1$ is called `Nothing` and where `Just` is the insertion $A \to A+1$.

Exercise: show that both `forsome` and `find` can be defined directly from `search`, assuming we had defined `search` first. (One possible answer is sketched below, just after the interpreter session is started.)

Common wisdom tells us that function types don't have decidable equality. In fact, e.g. the function type `Integer -> Integer` doesn't have decidable equality because of the [Halting Problem][20], as is well known. However, common wisdom is not always correct, and, in fact, some other function types do have decidable equality, for example the type `Cantor -> y` for any type `y` with decidable equality, without contradicting Turing:

> equal :: Eq y => (Cantor -> y) -> (Cantor -> y) -> Bool
> equal f g = forevery(\a -> f a == g a)

This seems strange, even fishy, because the Cantor space is in some sense bigger than the integers. In a follow-up post, I'll explain that this has to do with the fact that the Cantor space is topologically compact, but the integers are not.

Let's run an example:

> coerce :: Bit -> Natural
> coerce Zero = 0
> coerce One  = 1
>
> f, g, h :: Cantor -> Integer
>
> f a = coerce(a(7 * coerce(a 4) + 4 * (coerce(a 7)) + 4))
>
> g a = coerce(a(coerce(a 4) + 11 * (coerce(a 7))))
>
> h a = if a 7 == Zero
>       then if a 4 == Zero then coerce(a 4)  else coerce(a 11)
>       else if a 4 == One  then coerce(a 15) else coerce(a 8)

Now we call the `ghci` interpreter:

    $ ghci seemingly-impossible.lhs
       ___         ___ _
      / _ \ /\  /\/ __(_)
     / /_\// /_/ / /  | |      GHC Interactive, version 6.6, for Haskell 98.
    / /_\\/ __  / /___| |      http://www.haskell.org/ghc/
    \____/\/ /_/\____/|_|      Type :? for help.

    Loading package base ... linking ... done.
    [1 of 1] Compiling Main             ( seemingly-impossible.lhs, interpreted )
    Ok, modules loaded: Main.
    *Main>

At this point we can evaluate expressions at the interpreter's prompt.
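As promised, here is one possible answer to the exercise above, given only as a minimal sketch; the primed names are used to avoid clashing with the definitions already in the program:

> forsome' :: (Cantor -> Bool) -> Bool
> forsome' p = case search p of
>                Just _  -> True
>                Nothing -> False
>
> find' :: (Cantor -> Bool) -> Cantor
> find' p = case search p of
>             Just a  -> a
>             Nothing -> \i -> Zero  -- the specification allows any total sequence here

When `search p` returns `Nothing`, there is no example, and the specification of `find` allows any total sequence to be returned.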
First I ask it to print time and space usage after each evaluation:

    *Main> :set +s

On my Dell 410 laptop running at 1.73GHz, I test the following expressions:

    *Main> equal f g
    False
    (0.10 secs, 3490296 bytes)
    *Main> equal f h
    True
    (0.87 secs, 36048844 bytes)
    *Main> equal g h
    False
    (0.09 secs, 3494064 bytes)
    *Main> equal f f
    True
    (0.91 secs, 38642544 bytes)
    *Main> equal g g
    True
    (0.15 secs, 6127796 bytes)
    *Main> equal h h
    True
    (0.83 secs, 32787372 bytes)

By changing the implementation of `find`, I'll make this faster and also will be able to run bigger examples. But let's carry on with the current implementation for the moment.

The following was Berger's main motivation for considering the above constructions:

> modulus :: (Cantor -> Integer) -> Natural
> modulus f = least(\n -> forevery(\a -> forevery(\b -> eq n a b --> (f a == f b))))

This is sometimes called the _Fan Functional_. It goes back to Brouwer (1920s) and is well known in the higher-type computability theory community (see Normann (2006) below). It finds the _modulus of uniform continuity_, defined as the least natural number $n$ such that

> $\forall \alpha, \beta.\ (\alpha =_n \beta \to f(\alpha) = f(\beta)),$

where

> $\alpha =_n \beta \iff \forall i < n.\ \alpha_i = \beta_i.$

What is going on here is that computable functionals are continuous, which amounts to saying that finite amounts of the output depend only on finite amounts of the input. But the Cantor space is compact, and in analysis and topology there is a theorem that says that continuous functions defined on a compact space are [_uniformly_][21] continuous. In this context, this amounts to the existence of a single $n$ such that for all inputs it is enough to look at depth $n$ to get the answer (which in this case is always finite, because it is an integer). I'll explain all this in another post. Here I will illustrate it by running the program on some examples.

Notice that the Haskell definition is the same as the mathematical one, provided we define all the other needed ingredients:

> least :: (Natural -> Bool) -> Natural
> least p = if p 0 then 0 else 1 + least(\n -> p(n+1))
>
> (-->) :: Bool -> Bool -> Bool
> p --> q = not p || q
>
> eq :: Natural -> Cantor -> Cantor -> Bool
> eq 0 a b = True
> eq (n+1) a b = a n == b n  &&  eq n a b

To understand the modulus functional in practice, define projections as follows:

> proj :: Natural -> (Cantor -> Integer)
> proj i = \a -> coerce(a i)

Then we get:

    *Main> modulus (\a -> 45000)
    0
    (0.00 secs, 0 bytes)
    *Main> modulus (proj 1)
    2
    (0.00 secs, 0 bytes)
    *Main> modulus (proj 2)
    3
    (0.01 secs, 0 bytes)
    *Main> modulus (proj 3)
    4
    (0.05 secs, 820144 bytes)
    *Main> modulus (proj 4)
    5
    (0.30 secs, 5173540 bytes)
    *Main> modulus (proj 5)
    6
    (1.69 secs, 31112400 bytes)
    *Main> modulus (proj 6)
    7
    (9.24 secs, 171456820 bytes)

So, intuitively, the modulus is the last index of the input that the function uses, plus one. For a constant function, like the one above, the modulus is zero, because no index is used.
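As a further check on this intuition (an example not among those run above), a function that inspects two positions should have as modulus the larger index plus one: by the definition of the modulus, one expects `modulus (\a -> coerce(a 2) + coerce(a 5))` to evaluate to `6`, because index 5 is the deepest position inspected.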
**Technical remark**. The notion of modulus of uniform continuity needed for the proof of termination of `find_i` is not literally the same as above, but a slight variant (sometimes called the _intensional_ modulus of uniform continuity, whereas ours is referred to as the _extensional_ one). But I won't go into such mathematical subtleties here. The main idea is that when the modulus is $0$ the recursion terminates and one of the branches of the definition of `find_i` is followed, and a new recursion is started, to produce the next digit of the example. When the modulus of `p` is $n+1$, the modulus of the predicate `\a -> p(Zero # a)` is $n$ or smaller, and so recursive calls are always made with smaller moduli and hence eventually terminate. **End of remark.**

Now I'll try to get faster implementations of `find`. I'll modify the original implementation in several stages. Firstly, I will remove the mutual recursion by expanding the definition of the function `forsome` in the definition of the function `find_i`:

> find_ii p = if p(Zero # find_ii(\a -> p(Zero # a)))
>             then Zero # find_ii(\a -> p(Zero # a))
>             else One  # find_ii(\a -> p(One  # a))

This should have essentially the same speed. Now notice that the branches of the conditional are the same if we make zero and one into a parameter `h`. Hence one can "factor out" the conditional as follows:

> find_iii :: (Cantor -> Bool) -> Cantor
> find_iii p = h # find_iii(\a -> p(h # a))
>   where h = if p(Zero # find_iii(\a -> p(Zero # a))) then Zero else One

This is (exponentially!) faster for some examples. A clue for this is that `h` will be evaluated only if and when it is "used" (our language is lazy). Let's run an example, replacing the above definition of `find` by `find = find_iii`:

    *Main> equal f h
    True
    (0.00 secs, 522668 bytes)
    *Main> equal (proj 1000) (proj 1000)
    True
    (0.00 secs, 0 bytes)
    *Main> equal (proj 1000) (proj 4000)
    False
    (0.03 secs, 1422680 bytes)
    *Main> equal (proj 1000) (proj(2^20))
    False
    (7.02 secs, 336290704 bytes)

As you can see, the bigger the projection functions we try, the longer the comparison takes. To see how bad the first algorithm is, let's switch back to `find = find_i`:

    *Main> equal (proj 10) (proj 10)
    True
    (0.05 secs, 1529036 bytes)
    *Main> equal (proj 10) (proj 15)
    False
    (1.61 secs, 72659036 bytes)
    *Main> equal (proj 10) (proj 20)
    False
    (60.62 secs, 2780497676 bytes)

The previous examples cannot be run with this algorithm unless we had more bits available than there are atoms in the observable universe and were willing to wait several billion-billion years, because the algorithm is exponential in the modulus of continuity.

You probably noticed that there is another obvious improvement starting from `find_ii`:

> find_iv :: (Cantor -> Bool) -> Cantor
> find_iv p = let leftbranch = Zero # find_iv(\a -> p(Zero # a))
>             in if p(leftbranch)
>                then leftbranch
>                else One # find_iv(\a -> p(One # a))

Actually, I never thought about the performance of this algorithm or experimented with it. Let's see what we get (you need to replace `find = find_iv`):

    *Main> equal (proj 10) (proj 20)
    False
    (0.00 secs, 522120 bytes)
    *Main> equal (proj 10) (proj 200)
    False
    (0.04 secs, 1550824 bytes)
    *Main> equal (proj 10) (proj 2000)
    False
    (3.71 secs, 146039744 bytes)
    *Main> equal (proj 10) (proj 20000)
    Interrupted.

Much better than `find_i`, but much worse than `find_iii`! I gave up in the last example, because it started to slow down my editing of this post after a minute or so.

But there is a much better algorithm, which I now present. I won't attempt to explain the working of this algorithm in this post (see my LICS'2007 paper below if you are really interested), but I include a few remarks below:

> find_v :: (Cantor -> Bool) -> Cantor
> find_v p = \n -> if q n (find_v(q n)) then Zero else One
>   where q n a = p(\i -> if i < n then find_v p i else if i == n then Zero else a(i-n-1))

All the above algorithms, except this one, can be easily rewritten to use lazy lists rather than functions defined on the natural numbers.
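To make the lazy-list remark concrete, here is a minimal sketch of `find_iii` transcribed to lazy lists of bits; the names `CantorL` and `find_iiiL` are ad hoc, and this transcription is not part of the attached program:

> type CantorL = [Bit]
>
> find_iiiL :: (CantorL -> Bool) -> CantorL
> find_iiiL p = h : find_iiiL(\a -> p(h : a))
>   where h = if p(Zero : find_iiiL(\a -> p(Zero : a))) then Zero else One

For total predicates this behaves like `find_iii`, with `(:)` playing the role of `(#)`.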
This algorithm takes advantage of the fact that, to access an element of a sequence represented as a function, it is not necessary to scan all the preceding elements. In a perhaps mysterious way, this algorithm implicitly figures out which entries its argument `p` uses, and constructs only those explicitly. You can access the other ones if you wish, but the algorithm `find_v` doesn't force their evaluation. One way to see that `find_v` is correct is to show, by induction on `n`, that `find_i p n = find_v p n`, which is not too difficult, although the calculations get big at some stages if one doesn't carefully introduce suitable auxiliary notation. A better way is to understand this directly, as done in the above paper (you need to look for the product functional, which generalizes this). Now this gets really fast (take `find = find_v`):

    *Main> equal (proj (2^300)) (proj (2^300))
    True
    (0.00 secs, 522148 bytes)
    *Main> equal (proj (2^300)) (proj (2^400))
    False
    (0.00 secs, 525064 bytes)

But if the functions use several of their arguments, not just one (see the example below), this isn't so good any more. To fix this, first rewrite the above program as follows, introducing an auxiliary variable `b` to name the result, and replace one of the recursive calls (there are two) to use `b` instead:

> find_vi :: (Cantor -> Bool) -> Cantor
> find_vi p = b
>   where b = \n -> if q n (find_vi(q n)) then Zero else One
>         q n a = p(\i -> if i < n then b i else if i == n then Zero else a(i-n-1))

Lazy evaluation doesn't help here, because `b` is a function, and in fact this makes the program slightly slower. Now, to make it significantly faster, we apply the identity function to the definition of `b`. Or rather an elaborate implementation of the identity function, which stores `b` into an infinite binary tree in a breadth-first manner and then retrieves it back (this trick implements memoization with logarithmic overhead; a standalone illustration of it is given after the timings below):

> find_vii :: (Cantor -> Bool) -> Cantor
> find_vii p = b
>   where b = id'(\n -> if q n (find_vii(q n)) then Zero else One)
>         q n a = p(\i -> if i < n then b i else if i == n then Zero else a(i-n-1))
>
> data T x = B x (T x) (T x)
>
> code :: (Natural -> x) -> T x
> code f = B (f 0) (code(\n -> f(2*n+1)))
>                  (code(\n -> f(2*n+2)))
>
> decode :: T x -> (Natural -> x)
> decode (B x l r) n | n == 0    = x
>                    | odd n     = decode l ((n-1) `div` 2)
>                    | otherwise = decode r ((n-2) `div` 2)
>
> id' :: (Natural -> x) -> (Natural -> x)
> id' = decode.code

Now take `find = find_vii`, and test:

> f', g', h' :: Cantor -> Integer
>
> f' a = a'(10*a'(3^80)+100*a'(4^80)+1000*a'(5^80)) where a' i = coerce(a i)
>
> g' a = a'(10*a'(3^80)+100*a'(4^80)+1000*a'(6^80)) where a' i = coerce(a i)
>
> h' a = a' k
>   where i = if a'(5^80) == 0 then 0    else 1000
>         j = if a'(3^80) == 1 then 10+i else i
>         k = if a'(4^80) == 0 then j    else 100+j
>         a' i = coerce(a i)

    *Main> equal f' g'
    False
    (6.75 secs, 814435692 bytes)
    *Main> equal f' h'
    True
    (3.20 secs, 383266912 bytes)
    *Main> equal g' h'
    False
    (6.79 secs, 813083216 bytes)
    *Main> equal f' f'
    True
    (3.20 secs, 383384908 bytes)
    *Main> equal g' g'
    True
    (3.31 secs, 400711084 bytes)
    *Main> equal h' h'
    True
    (3.22 secs, 383274252 bytes)

Among all the above algorithms, only `find_vii` can cope with the above examples.
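As promised, here is a standalone illustration of the `id'` memoization trick, independent of the search algorithms; the name `mfib` is ad hoc and the definition is not part of the attached program. The point is that the recursive calls go through `mfib` itself, and hence through the memo tree built by `id'`:

> mfib :: Natural -> Natural
> mfib = id' f
>   where f 0 = 0
>         f 1 = 1
>         f n = mfib(n-1) + mfib(n-2)  -- these calls are answered from the tree

Without `id'` this would take exponential time; with it, each value is computed at most once, which is also why `find_vii` never recomputes the digits named by `b`.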
A more interesting example is this. Two finite sequences $s$ and $t$ of natural numbers have the same _set_ of elements iff the two functions

> $\bigwedge_{i < |s|} \mathrm{proj}_{s_i}$ and $\bigwedge_{i < |t|} \mathrm{proj}_{t_i}$

are equal, where the above notation indicates the pointwise logical-and (conjunction) of the projections, and where $|s|$ is the length of $s$. Here is an implementation of this idea:

> pointwiseand :: [Natural] -> (Cantor -> Bool)
> pointwiseand []    = \b -> True
> pointwiseand (n:a) = \b -> (b n == One) && pointwiseand a b
>
> sameelements :: [Natural] -> [Natural] -> Bool
> sameelements a b = equal (pointwiseand a) (pointwiseand b)

    *Main> sameelements [6^60, 5^50, 1, 5, 6, 6, 8, 9, 3, 4, 6, 2, 4,6, 1, 6^60, 7^700, 1, 1, 1, 3, 3^30] [1, 2, 3, 4, 5, 6, 7, 8, 9, 3^30, 5^50, 6^60, 7^70]
    False
    (0.14 secs, 21716248 bytes)
    *Main> sameelements [6^60, 5^50, 1, 5, 6, 6, 8, 9, 3, 4, 6, 2, 4,6, 1, 6^60, 7^70, 1, 1, 1, 3, 3^30] [1, 2, 3, 4, 5, 6, 7, 8, 9, 3^30, 5^50, 6^60, 7^70]
    False
    (0.10 secs, 14093520 bytes)
    *Main> sameelements [6^60, 5^50, 1, 5, 6, 6, 8, 9, 3, 4, 6, 2, 4,6, 1, 6^60, 7^70, 1, 1, 1, 3, 3^30] [1, 2, 3, 4, 5, 6, 8, 9, 3^30, 5^50, 6^60, 7^70]
    True
    (0.10 secs, 12610056 bytes)
    *Main> sameelements [6^60, 5^50, 1, 5, 6, 6, 8, 9, 3, 4, 6, 2, 4,6, 1, 6^60, 7^70, 1, 1, 1, 3, 3^30] [1, 2, 3, 4, 5, 6, 8, 9, 3^30, 5^50, 6^60, 7^700]
    False
    (0.12 secs, 17130684 bytes)
    *Main> sameelements [6^60, 5^50, 1, 5, 6, 6, 8, 9, 3, 4, 6, 2, 4,6, 1, 6^60, 7^700, 1, 1, 1, 3, 3^30] [1, 2, 3, 4, 5, 6, 8, 9, 3^30, 5^50, 6^60, 7^700]
    True
    (0.12 secs, 17604776 bytes)

It is natural to ask whether there are applications to program verification. I don't know, but [Dan Ghica][22] and I speculate that there are, and we are planning to investigate this.

An even faster search algorithm is offered in the first comment below.

* * *

### References with comments

1. [Ulrich Berger][14]. **Totale Objekte und Mengen in der Bereichtheorie**. PhD thesis, Munich LMU, 1990.

2. [M.H. Escardo][23]. **[Infinite sets that admit fast exhaustive search][24]**. In LICS'2007, Wroclaw, Poland, July 2007. Download the [companion Haskell program][25]. This paper investigates which kinds of infinite sets admit exhaustive search. It gives several algorithms for systematically building new searchable sets from old. It also shows that, for a rich collection of types, any subset that admits a quantifier also admits a searcher. The algorithm for constructing the searcher from the quantifier is slow, and so at present this result is of theoretical interest only. But the other algorithms are fast.

3. M.H. Escardo. **[Synthetic topology of data types and classical spaces][26]**. ENTCS, Elsevier, volume 87, pages 21-156, November 2004. I would nowadays call this algorithmic topology of program types, rather than synthetic topology of data types, and in future writings this is how I'll refer to it. It shows how notions and theorems from general topology can be directly mapped into programming languages (I use Haskell again). But it also shows how this can be mapped back to classical topology. Exhaustively searchable sets feature there as computational manifestations of compact sets.

4. M.H. Escardo and W.K. Ho. **[Operational domain theory and topology of a sequential programming language][27]**. In Proceedings of the 20th Annual IEEE Symposium on Logic in Computer Science (LICS), June 2005, pages 427-436. If you want to learn a bit about the use of domain theory and topology for reasoning about programs, this is a possible starting point.
Rather than defining a denotational semantics using domain theory and topology, we extract them directly from the operational semantics of the language (PCF, which can be regarded as a faithful subset of Haskell), side-stepping denotational semantics. Many definitions of domain theory and denotational semantics arise as theorems here. There is a proof of Berger's program, including the appropriate notion of modulus of uniform continuity needed for the proof.

5. [Dag Normann][28]. **[Computing with functionals – computability theory or computer science?][29]**. Bulletin of Symbolic Logic, 12(1):43-59, 2006. This is a nice short history and survey of higher-type computability theory (the subject was established in the late 1950s by work of Kleene and Kreisel, but had some precursors). This theory is not well known among functional programmers, but it probably should be, at least among those with theoretical inclinations. Somebody someday should write an expository account for computer scientists.

6. [Alex Simpson][30]. **[Lazy Functional Algorithms for Exact Real Functionals][31]**. In _Mathematical Foundations of Computer Science 1998_, Springer LNCS 1450, pp. 456-464, 1998. One can use infinite sequences to represent real numbers exactly. Using the universal quantification functional, Alex Simpson developed an algorithm for Riemann integration. The algorithm turns out to be inefficient, but this is not the universal quantifier's fault: the Haskell profiler tells me that the integration program performs about 10000 universal quantifications per second, 3000 of which return `True`.

## About Martin Escardo

Reader at the School of Computer Science of the University of Birmingham, UK.

## 21 thoughts on "Seemingly impossible functional programs"

1. [**Martin Escardo**][23] says: [October 1, 2007 at 16:28][35]

I've found a way of making the fastest algorithm about 40 times faster by avoiding the trees and instead treating functions as trees:

> find = find_viii
>
> findBit :: (Bit -> Bool) -> Bit
> findBit p = if p Zero then Zero else One
>
> branch :: Bit -> Cantor -> Cantor -> Cantor
> branch x l r n | n == 0    = x
>                | odd n     = l ((n-1) `div` 2)
>                | otherwise = r ((n-2) `div` 2)
>
> find_viii :: (Cantor -> Bool) -> Cantor
> find_viii p = branch x l r
>   where x = findBit(\x -> forsome(\l -> forsome(\r -> p(branch x l r))))
>         l = find_viii(\l -> forsome(\r -> p(branch x l r)))
>         r = find_viii(\r -> p(branch x l r))

This doesn't need the memoization, because the things one needs to remember are bound to variables in the where-clause. Using this algorithm, comparing f' and g', and f' and h', for equality moves from 6.75 and 3.20 seconds to 0.14 and 0.09 seconds respectively (using the Glasgow interpreter in all cases).

2. Pingback: [What do We Have to Know About a Function in Order to Compute its Definite Integral? « XOR's Hammer][37]

3. Pingback: [A Rant A Day - The Power of The Functional or: I'll bet you can't do this in Java][38]

4. [**Jason Orendorff**][39] says: [December 9, 2009 at 20:06][40]

Martin, this whole post was fascinating, but even so those last two sentences pack the biggest punch of all. Where should I look for the follow-up?
5. **Martin Escardo** says: [December 16, 2009 at 11:29][42]

The promised follow-up blog post hasn't been written, although there is a related one at There are, however, a couple of papers that develop/apply these ideas, available from my web page, written after this post:

1. Exhaustible sets in higher-type computation.
2. (With P. Oliva) Selection functions, bar recursion, and backward induction.
3. Semi-decidability of may, must and probabilistic testing in a higher-type setting.
4. Computability of continuous solutions of higher-type equations.

6. Pingback: [Searchable Data Types « Luke Palmer][44]

7. Pingback: [Not all countably infinite sets are equal | Amateur Topologist][45]

8. [**Bodo Junglas**][46] says: [July 14, 2011 at 01:58][47]

Call me crazy, but I took the "You can't do this in Java" bet. Here it comes, the seemingly impossible Java implementation of a seemingly impossible functional problem: [https://github.com/untoldwind/seemingly_impossible][48]

Ok, I just implemented the first three find's, and I admit that this implementation is a sure way to exhaust the Java stack quite easily (if you get a StackOverflow running this stuff you might have to use -Xss512m or something). For completeness I've added my own find-variant (FindJ), i.e. how would you solve this little problem in "regular Java", whatever that means (*cough* it's doing the examples above in <1ms *cough*).

Now, before the tomatoes fly: this is a hack! Also this is **not** about "my language is better than yours". Functional programming languages have their appeal and are able to solve certain types of problems in a very natural way; the Java implementation is a syntactic mess in comparison. But: I seriously doubt that there are problems you can only solve with a functional programming language. After all, the Haskell compiler cannot do magic, even though it might seem so sometimes.

9. Pingback: [A Haskell monad for infinite search in finite time « Mathematics and Computation][50]

10. Pingback: [How to make the "impossible" functionals run even faster « Mathematics and Computation][51]

11. **Ralph Loader** says: [January 26, 2012 at 05:05][52]

Do these work for functions that have state? To state precisely what's wanted one would have to be careful about extensionality (although you glossed over it, just as you need to be careful about totality in the pure functional setting). [Even if the particular programs above don't work with stateful functions, the Kreisel-Lacombe-Shoenfield theorem makes one think that it is possible to do the same thing for stateful functions.]

12. [**Andrej Bauer**][54] says: [January 26, 2012 at 09:11][55]

@Ralph: This post is written in the context of purely functional programs, i.e., no state, and all functionals are presumed to be (hereditarily) total. With state things are more complicated (global state, local state, what does it mean for an extensional functional to use state?).

13. [**Martin Escardo**][23] says: [January 26, 2012 at 12:22][57]

Yes, everything here is extensional and total, as Andrej says, and there are no effects. But Andrej gave a nice talk at MAP'2011 in the Lorentz Center showing how to do it faster with effects. In that case, as you say, it is difficult to treat extensionality rigorously.

14. **Ralph Loader** says: [January 28, 2012 at 03:28][59]

It turns out that you can give a sensible, negative, answer to my question.
`find` can be parametrized with a monad pretty easily, e.g. [conflating Bit and Bool to save typing]:

    module MFind where

    import Control.Monad.State

    findM :: Ord i => Monad m => ((i -> m Bool) -> m Bool) -> (i -> m Bool)
    findM p n = do { x <- q (findM q) ; return (not x) }
      where q a = p (\i -> if i == n then return False
                           else if i < n then findM p i else a i)

    -- Make a dummy call to (c x), where x increments by 1 each time we're called.
    -- Then return (c 5).
    fifthWithDummy :: (Integer -> State Integer Bool) -> State Integer Bool
    fifthWithDummy c = do x <- get
                          put (x + 1)
                          b <- c x
                          if b then c 5 else c 5

    -- Now compare
    --   findM ($ 5) 0
    -- with
    --   runState (findM (fifthWithDummy) 0) 0
    -- which never returns.

15. [**Andrej Bauer**][54] says: [January 28, 2012 at 09:14][61]

@Martin: my talk about faster functionals using effects is [on line][51].

16. Pingback: [The topology of the set of all types « Mathematics and Computation][63]

17. Pingback: [Seemingly impossible proofs | Mathematics and Computation][64]

18. **Andrew Zabavnikov** says: [August 30, 2015 at 09:46][65]

Martin, please consider answering the following question: Thanks in advance for attention :-)

19. Pingback: [Functional Programming: Links, News And Resources (16) | Angel "Java" Lopez on Blog][67]

20. Pingback: [Abusing the continuation monad – Obsolete Wall Street][68]

21. Pingback: [[Solved]: Turing Completeness + Dataflow Unification = Arbitrarily invertible (pure, nonrecursive) functions? – Ignou Group][69]
[9]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/ "Permalink to Seemingly impossible functional programs"
[10]: http://math.andrej.com/category/computation/
[11]: http://math.andrej.com/category/guest-post/
[12]: http://math.andrej.com/category/tutorial/
[13]: http://math.andrej.com/author/mhe/ "View all posts by Martin Escardo"
[14]: http://www.cs.swan.ac.uk/~csulrich/
[15]: http://en.wikipedia.org/wiki/Cantor_space
[16]: http://www.haskell.org/
[17]: http://www.ocaml.org
[18]: http://math.andrej.com/wp-content/uploads/2007/09/seemingly-impossible.hs
[19]: http://en.wikipedia.org/wiki/De_Morgan's_laws
[20]: http://en.wikipedia.org/wiki/Halting_problem
[21]: http://en.wikipedia.org/wiki/Uniform_continuity
[22]: http://www.cs.bham.ac.uk/~drg/
[23]: http://www.cs.bham.ac.uk/~mhe/
[24]: http://www.cs.bham.ac.uk/~mhe/papers/exhaustive.pdf
[25]: http://www.cs.bham.ac.uk/~mhe/papers/exhaustive.hs
[26]: http://www.cs.bham.ac.uk/~mhe/papers/entcs87.pdf
[27]: http://www.cs.bham.ac.uk/~mhe/papers/escardo-ho-operational.pdf
[28]: http://www.math.uio.no/~dnormann/
[29]: http://www.math.uio.no/~dnormann/Bulletin.Normann.04.pdf
[30]: http://homepages.inf.ed.ac.uk/als/
[31]: http://homepages.inf.ed.ac.uk/als/Research/lazy.ps.gz
[35]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-5868
[37]: http://xorshammer.wordpress.com/2008/08/21/compute-definite-integral/
[38]: http://arantaday.com/blog/the-power-of-the-functional/
[39]: http://jorendorff.blogspot.com/
[40]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-13004
[42]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-13032
[44]: http://lukepalmer.wordpress.com/2010/11/17/searchable-data-types/
[45]: http://www.amateurtopologist.com/2011/01/10/not-all-countably-infinite-sets-are-equal/
[46]: http://objectcode.de
[47]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-16512
[48]: http://github.com/untoldwind/seemingly_impossible
[50]: http://math.andrej.com/2008/11/21/a-haskell-monad-for-infinite-search-in-finite-time/
[51]: http://math.andrej.com/2011/12/06/how-to-make-the-impossible-functionals-run-even-faster/
[52]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-18831
[54]: http://andrej.com/
[55]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-18832
[57]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-18834
[59]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-18852
[61]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-18856
[63]: http://math.andrej.com/2012/03/30/the-topology-of-the-set-of-all-types/
[64]: http://math.andrej.com/2014/05/08/seemingly-impossible-proofs/
[65]: http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/#comment-57551
[67]: https://ajlopez.wordpress.com/2015/09/23/functional-programming-links-news-and-resources-16/
[68]: https://obsoletewallstreet.wordpress.com/2016/09/22/abusing-the-continuation-monad/
[69]: https://www.ignougroup.com/2017/02/solved-turing-completeness-dataflow/