There are multiple aspects to this:
- You want memoization as a cheap and cheerful way of making top-down recursion emulate dynamic programming, so that the same subproblems aren’t revisited by multiple paths through the recursion tree. Dynamic programming systematically evaluates the subproblems up from the bottom, so each subproblem gets evaluated exactly once, but…
- A nice thing that can happen in theory with the top-down recursion is that not all the subproblems evaluated by a dynamic programming approach need to be evaluated by a recursive approach - that is what I meant by the ‘thinning-out’. I did some investigation of this for a three-way longest common subsequence algorithm and found that yes, there is some slight thinning on average when using recursion-with-memoization, but not all that much.
Some numbers taken from a property-based test for a longest common subsequence algorithm; in each pair, the first number is the count of subproblem evaluations via top-down recursion with memoization, the second is the count via dynamic programming:
(85139,85140)
(5481,5508)
(1612,1620)
(12990,12992)
(19998,20000)
(6894,44772)
(16606,18676)
(52285,52316)
(40497,40500)
(44456,44462)
(25165,25168)
(5513,36504)
(1193,1200)
(16976,42398)
(21793,21840)
(1343,5984)
(63165,63168)
(22388,75348)
(446,8800)
(7149,7150)
(3862,3900)
(22617,22620)
(32637,32640)
(15282,15283)
(15707,15708)
(21915,21945)
(27986,28014)
(98788,98838)
(32719,47040)
(19962,19992)
(80353,80401)
Note that there are the occasional big wins, but overall the recursive strategy doesn’t save all that many evaluations. Furthermore…
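To make the comparison concrete, here is a sketch in Scala of how such counts can be gathered, for the simpler two-way case rather than the three-way one above; the names and the plain `String` inputs are my own for illustration, not taken from the property-based test:

```scala
import scala.collection.mutable

// Top-down recursion with memoization: count how many distinct
// subproblems actually get evaluated.
def lcsTopDown(a: String, b: String): (Int, Int) = {
  val cache = mutable.Map.empty[(Int, Int), Int]
  var evaluations = 0
  def go(i: Int, j: Int): Int =
    if (i == 0 || j == 0) 0
    else
      cache.get((i, j)) match {
        case Some(v) => v
        case None =>
          evaluations += 1
          val v =
            if (a(i - 1) == b(j - 1)) go(i - 1, j - 1) + 1
            else math.max(go(i - 1, j), go(i, j - 1))
          cache((i, j)) = v
          v
      }
  (go(a.length, b.length), evaluations)
}

// Bottom-up dynamic programming: every subproblem in the table is
// evaluated exactly once.
def lcsBottomUp(a: String, b: String): (Int, Int) = {
  val table = Array.ofDim[Int](a.length + 1, b.length + 1)
  var evaluations = 0
  for (i <- 1 to a.length; j <- 1 to b.length) {
    evaluations += 1
    table(i)(j) =
      if (a(i - 1) == b(j - 1)) table(i - 1)(j - 1) + 1
      else math.max(table(i - 1)(j), table(i)(j - 1))
  }
  (table(a.length)(b.length), evaluations)
}
```

The bottom-up count is always the full m × n, whereas the top-down count can only ever be at most that, since each distinct subproblem is evaluated once at most - the gap between the two is the ‘thinning-out’.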
- There is the question of stack space when recursing. Even in a language with non-strict evaluation as a possibility, the subproblems require strict evaluation…
…or so I found. Again, I would like to be proven wrong here with a demonstration, because I like the clean recursive approach.
EDIT: what am I blathering on about? - I ended up using Cats to unroll the recursion via continuations IIRC, but it was incredibly slow.
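For what it’s worth, the standard library’s `scala.util.control.TailCalls` trampoline can stand in for the Cats continuation machinery; here is a sketch of the memoized recursion rewritten that way, purely to illustrate the stack-safety point - no claim that it avoids the slowness I ran into:

```scala
import scala.collection.mutable
import scala.util.control.TailCalls._

// Memoized LCS length with the recursion trampolined via TailRec,
// so deeply nested subproblem chains don't overflow the call stack.
def lcsLength(a: String, b: String): Int = {
  val cache = mutable.Map.empty[(Int, Int), Int]
  def go(i: Int, j: Int): TailRec[Int] =
    if (i == 0 || j == 0) done(0)
    else
      cache.get((i, j)) match {
        case Some(v) => done(v)
        case None =>
          val computed: TailRec[Int] =
            if (a(i - 1) == b(j - 1)) tailcall(go(i - 1, j - 1)).map(_ + 1)
            else
              for {
                left <- tailcall(go(i - 1, j))
                up   <- tailcall(go(i, j - 1))
              } yield math.max(left, up)
          // Record the result before handing it back up the trampoline.
          computed.map { v => cache((i, j)) = v; v }
      }
  go(a.length, b.length).result
}
```

Note that this only addresses the stack; the memo cache still grows without bound, which is the next point.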
- There is also the question of heap space for the cached subproblems - this is not a trivial amount. It is possible to organise the dynamic programming algorithm to work within a bounded heap space requirement (what I call a ‘swathe’), but I couldn’t find a way around this for the top-down recursive memoized approach. Here is my mumbling to myself on this topic before I gave up: Revisit `LongestCommonSubsequence.of` to use a top-down recursive approach. · Issue #107 · sageserpent-open/kineticMerge · GitHub.
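For contrast, this is roughly what I mean by a ‘swathe’ in the simplest two-way setting - the bottom-up table only ever consults the previous row, so the heap stays bounded by one row rather than the whole table (a generic sketch of the idea, not the kineticMerge code):

```scala
// Bottom-up LCS length keeping only the previous row of the table,
// so the heap requirement is one row's worth - a minimal 'swathe'.
def lcsRowByRow(a: String, b: String): Int = {
  var previous = Array.fill(b.length + 1)(0)
  for (i <- 1 to a.length) {
    val current = Array.fill(b.length + 1)(0)
    for (j <- 1 to b.length)
      current(j) =
        if (a(i - 1) == b(j - 1)) previous(j - 1) + 1
        else math.max(previous(j), current(j - 1))
    previous = current
  }
  previous(b.length)
}
```

The catch is that this only yields the length; recovering the actual subsequence in similarly bounded space takes more work (Hirschberg’s divide-and-conquer trick, for instance), which is part of why the bookkeeping gets interesting.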
Not trying to put you off this at all - please do experiment; I just found the topic interesting.
Have fun with it.