Chris Cogan | 10 Nov 02:59 2008

[stack] Evolutionary Programming

It's been a long time since I've participated in this forum, but it hasn't
been because of loss of interest. Mostly, I've been preoccupied with other
things. I'm very glad to see that this forum is still active.

I thought I'd report on an evolutionary programming project using Joy as the
language of the evolved programs. Several months ago, I started this project
but only worked on it for an hour or two, and then set it aside until the
day before yesterday.

General Description

The idea is to write a program that writes Joy programs and tests them for
fitness against samples of the task data, where the task is to do something
like handwriting recognition. Those that produce the most nearly correct
results are saved and used as parents for the next generation, which will
consist mostly of variations on each of the parent programs. Because each
one will be tested in isolation against the same test data, there will be no
direct competition for resources (as there would be in Tom Ray's Tierra).

1. A program generator takes as input an existing program and returns a
small modification of it, such as adding, deleting, or replacing a single
atom or item in a list, etc. (I'll talk in Joy terms, since I'm not
sufficiently knowledgeable about any of the other languages discussed in
this forum.)

2. Use this generator to generate a slew of offspring from each of the
programs, if any, that made it through fitness testing for the previous
generation.

3. Execute each generated program on the test data and measure fitness based
on the results. Save the programs with the highest fitness rating and
discard all the rest.

4. When completely successful code is produced, stop.

5. Otherwise go back to step 2, using the most successful programs so far as
parents of the next generation.
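
As a sketch, the generate/test/select loop above might look like the following minimal Python illustration (the seed program, mutation operator, and fitness function here are stand-ins I've invented, not part of any actual Joy evolver):

```python
import random

random.seed(0)  # reproducible run

def evolve(seed, mutate, fitness, pop_size=100, survivors=5,
           target=1.0, max_gens=1000):
    """Steps 1-5: mutate parents, test offspring, keep the fittest, repeat."""
    parents = [seed]
    for _ in range(max_gens):
        # Steps 1-2: a slew of small variations on each surviving parent.
        offspring = [mutate(p) for p in parents
                     for _ in range(pop_size // len(parents))]
        # Step 3: score every candidate against the same test data; keep the best.
        parents = sorted(offspring, key=fitness, reverse=True)[:survivors]
        # Step 4: stop when a completely successful candidate appears.
        if fitness(parents[0]) >= target:
            break
        # Step 5: loop, with the survivors as the next generation's parents.
    return parents[0]

# Toy stand-in task: evolve an 8-bit "program" toward all ones.
best = evolve(
    seed=[0] * 8,
    mutate=lambda p: [b ^ (random.random() < 0.1) for b in p],
    fitness=lambda p: sum(p) / len(p),
)
```

Note there is no direct competition between candidates: each is scored in isolation against the same data, exactly as described above.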

Why Use Joy?

I considered using Prolog for this kind of thing (before I ever heard of
Joy), because Prolog programs can effectively modify themselves in useful
ways (mainly by adding and deleting rules), and because an automatic
solution-tree search is at the core of how Prolog programs work.

However, a modified Joy is a lot easier to work with, because Joy has no
syntax to speak of (although Prolog's syntax is simple compared to that of,
say, C++), except at the lowest level (lists must have a bracket at each
end, numbers can't have two decimal points, etc.). This makes generation and
modification trivial, because the code doing either doesn't have to worry
about complex control constructs. Once you have code for properly
recognizing aggregate data types (lists, strings, etc.), your work is
largely done.
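
To illustrate just how little the mutator needs to know when there is no syntax to respect, here is a minimal Python sketch (the atom set is an invented stand-in; a real version would also recurse into lists):

```python
import random

# An invented stand-in alphabet of Joy-ish atoms.
ATOMS = ["dup", "swap", "pop", "+", "*", "0", "1"]

def mutate(tokens):
    """Return a small modification: add, delete, or replace one token."""
    tokens = list(tokens)
    op = random.choice(["add", "delete", "replace"])
    if op == "add" or not tokens:
        tokens.insert(random.randrange(len(tokens) + 1), random.choice(ATOMS))
    elif op == "delete":
        del tokens[random.randrange(len(tokens))]
    else:
        tokens[random.randrange(len(tokens))] = random.choice(ATOMS)
    return tokens

child = mutate(["1", "1", "+"])
```

Because every token sequence is a program, the mutator never has to check that its output still parses.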

There is a problem, but it is relatively minor: Exceptions, such as dividing
by zero. Normally, these can crash a program, and code generated
unintelligently can easily lead to such errors.

The solution is a modified Joy that basically handles all errors by ignoring
them. In Visual Basic, this can be done with a "Try" block, which can be
used to cause execution to resume after the block if an error occurs in it.
It can also be done with two versions of the "On Error" statement: "On Error
Resume Next" and "On Error Goto <label>" (where <label> marks the location
in the code where you want execution to resume if an error occurs).
Once one has a version of Joy that won't crash on errors (made so simply by
modifying a bunch of the primitives), one can then generate truly arbitrary
strings of code, as long as they are syntactically correct, and they will
"execute" (perhaps doing nothing much at all, but still terminating without
error). Infinite loops are still possible in combinators that repeat chunks
of code, but even this can be counteracted by putting a fixed limit of, say,
one million, on the number of iterations.
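
Here is a minimal sketch of that crash-proof idea in Python rather than Joy or VB (the word set is invented for illustration): each primitive is wrapped so that any runtime error restores the stack and execution simply continues with the next word.

```python
def safe(op, arity):
    """Wrap a primitive so any error (divide by zero, stack underflow,
    type mismatch) leaves the stack as it was and execution resumes."""
    def wrapped(stack):
        saved = stack[:]
        try:
            args = [stack.pop() for _ in range(arity)]
            stack.append(op(*args))
        except Exception:
            stack[:] = saved  # pretend the word did nothing
    return wrapped

# An invented mini-vocabulary; args arrive in reverse pop order.
PRIMS = {
    "+": safe(lambda b, a: a + b, 2),
    "/": safe(lambda b, a: a / b, 2),
    "dup": lambda s: s.append(s[-1]) if s else None,
}

def run(program):
    stack = []
    for word in program:
        if word in PRIMS:
            PRIMS[word](stack)
        else:
            stack.append(word)  # anything else is a literal
    return stack

assert run([6, 0, "/"]) == [6, 0]  # error ignored, stack untouched
```

A looping combinator would get the same treatment plus a hard iteration cap, so every generated program terminates.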

Sticking closely to the evolutionary paradigm can mean that one must start
with truly simple versions of the task, such as distinguishing between two
values of a variable. If we start with samples of the full task, such as
recognizing full documents of text correctly, we won't be able to provide
any fitness tests that are both meaningful and useful, and it could be
billions of years before anything works. Narrowing the task down to
something completely simple allows for some fitness to be discovered early
on (then we increase the difficulty of the task a little).

So, I'm planning to start my own evolver on a single Boolean variable. When
the evolver finally produces one that can handle this task reliably, I can
switch to having it try to make code that correctly reports the values of
two Booleans. At some point, I should be able to add code for creating
simple images in memory, and start over with one-pixel images, and only one
distinction (white vs. non-white, perhaps).

Eventually, I should be able to use images with enough pixels to represent
symbols such as plus-signs and minus-signs. From there, I should be able to
start using actual image files, which will require scanning samples of my
handwritten material (over two hundred pounds of the stuff, written over the
past twenty-odd years) and cutting it into small, one-character image files.
This will also require that I provide the correct text for each sample. At
first, I will probably just include the text in the names of the files (and
make sure that the generated code has no access to this information, to
avoid evolving code that only reports text from the file names!).

For now, I'll be happy just to get my evolver working well enough to evolve
code to make some simple distinction reliably, even if I have to hand-tweak
mutation rates and the types of mutations allowed. I'm using what will be my
third implementation of Joy (more of a mini-Joy, in this case); I've got
one implementation in Visual Basic (Visual Studio) and another in Visual
Basic for Applications (the macro language for the Microsoft Office programs
-- Microsoft Word, in this case).

My Visual Studio version of Joy "compiles" to lists of what I call "code
objects." This makes for a very generalized implementation and allows for
easy de-compilation, etc., but it's also slow, so my new version will not
use "objects." That will eliminate at least one level of indirection, in
that it will use "delegates" (which are basically pointers to functions,
but with type-checking for parameters and return types). It will also not
"compile" code, but will only process code into arrays of strings: even a
list with many elements will just be a single string until it is "executed"
(pushed onto the stack), at which point it will be made into an array of
strings (one string for each element).
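
A sketch of that representation in Python (my own illustration of the idea, not Chris's actual code): the source is split into top-level tokens, a bracketed list stays one single string, and it is only split into its elements when it is "pushed".

```python
def tokenize(src):
    """Split Joy-ish source into top-level tokens; a whole [...] list
    stays one single token, however deeply nested."""
    tokens, depth, cur = [], 0, []
    for ch in src:
        if ch == "[":
            depth += 1
            cur.append(ch)
        elif ch == "]":
            depth -= 1
            cur.append(ch)
        elif ch.isspace() and depth == 0:
            if cur:
                tokens.append("".join(cur))
                cur = []
        else:
            cur.append(ch)
    if cur:
        tokens.append("".join(cur))
    return tokens

def push(token):
    """Only when a list token is pushed is it expanded into elements."""
    if token.startswith("[") and token.endswith("]"):
        return tokenize(token[1:-1])
    return token

toks = tokenize("1 2 [3 [4 5] dup] i +")
# toks == ['1', '2', '[3 [4 5] dup]', 'i', '+']
```

The inner list `[4 5]` stays a single string even after one expansion, which is the laziness described above.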

This version of Joy will also be the least faithful to Manfred's version.
I'm more interested, this time, in making a language that simply has a
suitable set of primitives for my evolutionary programming purposes. One
difference will be that executing code can define new functions (and
re-define them, thus making them into variables). This violates Joy's
variable-free nature, especially if I allow non-local variables (but I
haven't decided on this yet).

It's interesting that Joy code is more like the genetic code of DNA than
ordinary programming languages are, in that DNA doesn't have much in the way
of syntax, and it doesn't have any explicit looping or other typical control
constructs. DNA is constructed in a concatenative way, as long chains,
without loops or other interconnections. Of course, the parallel is very
limited, but it suggests why I think Joy is a good programming language for
evolutionary programming: the only structuring is done by defining functions
and making chunks of code into lists. Both of these are forms of
modularization, and defining a function is basically just naming a list
(quoted program) so it can be used by reference rather than being included
bodily (which also requires an extra step to execute its contents as code).

I had hoped to have some evolving going on by now (Sunday evening), but I
took time out to write this post so I may not get to actual evolution for a
day or so, depending on available time. But, I'm close.

I'm working on the Joy source code preprocessor now. Since I've done it
twice before and since I'm using a simplified version of Joy, this shouldn't
take long.

I'll report on results soon, I hope.

Chris Cogan

    chris glur | 15 Nov 08:32 2008

    Re: [stack] Evolutionary Programming

    > The idea is to write a program that writes Joy programs,
    > tests them for fitness against samples of the task data,
    > where the task is do something like handwriting recognition.
    --snip--
    >3. Execute each generated program on the test data and measure
    >fitness based on the results. Save the programs with the highest
    >fitness rating and discard all the rest.

    Having just read 75% of 'The Collapse of Chaos' by Cohen &
    Ian Stewart, I'm less of a believer than when I previously
    read of some claimed successes with this scheme.

    I see now that my previous extension of neural-net
    concepts, where the state space searched is continuous
    and 'hill climbing' is the aim, was completely misplaced.

    The authors start by looking at Mandelbrot fractals, etc.,
    where even with infinitesimal errors the output is still
    unpredictable, i.e. effectively a discontinuity.

    >Sticking closely to the evolutionary paradigm can mean that
    >one must start with truly simple versions of the task, such
    >as distinguishing between two values of a variable. If we
    >start with samples of the full task, such as recognizing full
    >documents of text correctly, we won't be able to provide any
    >fitness tests that are both meaningful and useful, and it could
    >be billions of years before anything works. Narrowing the task
    >down to something completely simple allows for some fitness to
    >be discovered early on (then we increase the difficulty of the
    >task a little).

    OK, this sounds plausible.
    But then only the micro-modules evolve, and the intelligent
    human uses them to build the complete algorithm.
    Or you can automate the combination of the modules, but
    then the human has designed the initial intelligent module
    partitioning first.

    >It's interesting that Joy code is more like the genetic code
    >of DNA than ordinary programming languages are, in that DNA
    >doesn't have much in the way of syntax, and it doesn't have
    >any explicit looping or other typical control constructs.
    I suspect these claims are mostly intuitive?

    IMO 'looping' can only apply in a digital computer.
    Whereas DNA is digital, I think the functions which it
    'controls' are analog-like, i.e. "DO x UNTIL enough".
    So you could say "the molecules of water evaporated
    until the cup was dry", but is that really 'looping'?

    I suspect that the 'divide by zero' is only one of [possibly
    infinitely] many discontinuities in the scheme, as I understand
    your description of it.

    You can simulate an analog system in your output, but
    your input is discontinuous/chaotic?
    OTOH you could simulate all the analog processes with
    your digital computer?

    Can you give any mathematical insight into why:
    run M*N should be closer to a goal than run N?

    It seems like one of those schemes which sound
    plausible on a first superficial consideration, but
    fail on deeper analysis.

    A fascinating can-O-worms?

    It would be nice to be able to consult biologist Cohen
    & mathematician Stewart on this?

    == Chris Glur.

      janvanlent | 16 Nov 20:06 2008

      [stack] Re: Evolutionary Programming

      --- In concatenative <at> yahoogroups.com, "Chris Cogan" <ccogan <at> ...> wrote:
      > I thought I'd report on an evolutionary programming project using Joy
      > as the language of the evolved programs.

      You might be interested in the work by Lee Spector on the Push, PushGP
      and Pushpop systems [1].

      jan

      [1] http://hampshire.edu/lspector/push.html

        Stevan Apter | 16 Nov 20:25 2008

        Re: [stack] Re: Evolutionary Programming

        or search back in the concatenative archives and read billy
        tanksley's remarks about his languages 01 and 10 and their
        suitability for darwinian evolutionary experiments.

        i did an implementation in k (www.nsl.com/k/10.k) and stopped
        at the point where i needed to adapt dennis shasha's tree-
        distance algorithm. it still seems to me that 10 is a perfect
        candidate for this kind of work.


          William Tanksley, Jr | 16 Nov 21:46 2008

          Re: [stack] Re: Evolutionary Programming

          Stevan Apter <sa <at> nsl.com> wrote:
          > or search back in the concatenative archives and read billy
          > tanksley's remarks about his languages 01 and 10 and their
          > suitability for darwinian evolutionary experiments.

          The nice thing about 01 (or 10, it's just a logical NOT away) is that
          you can cut and splice anything anywhere and it's still a valid
          program. The bad thing is that I still haven't decided how to express
          input/output... It's merely a matter of convention, but it's a pretty
          important convention.

          I've not touched "tworing", my 10 lab program, in a while; one of its
          unit tests fails, and I'm not sure why. I want to use it to design a
          more effective base ... I've got some theories about what might make
          one base better than another; I just need to do some testing.

          BTW, while you're thinking about evolutionary algorithms, check this
          page out: http://williamtozier.com/slurry/2008/04/02/search-algorithms.

          -Wm

            Stevan Apter | 16 Nov 23:00 2008

            Re: [stack] Re: Evolutionary Programming


            ----- Original Message -----
            From: "William Tanksley, Jr" <wtanksleyjr <at> gmail.com>
            >
            > I've not touched "tworing", my 10 lab program, in a while; one of its
            > unit tests fail, and I'm not sure why.

            perhaps you can post your 10 unit tests. i'd like to try running them in
            my implementation.

              John Meacham | 17 Nov 05:21 2008

              Re: [stack] Evolutionary Programming

              Hi, you would probably be really interested in the work of Jurgen
              Schmidhuber, in particular the 'OOPS' optimal ordered problem solver[1].

              It is an algorithm to optimally find the solution to any problem via
              a brute-force search of the program space. Now, it has been known for a
              while that there exists a simple, provably optimal solution to all
              definable problems (see LSEARCH and HSEARCH); however, it was of purely
              theoretical interest, since the constant factor was on the order of
              2^500.

              OOPS has actually made this sort of algorithm practical via the clever
              use of... a concatenative language, in particular a variant of Forth (it
              is described in the paper). The novel use of Forth in OOPS was taking
              advantage of the fact that a program that solves a subset of a certain
              problem is more likely to be of use to a program that solves the general
              problem, and that the concatenation of valid Forth programs is a valid
              program. In particular, it attempts to solve problems in order from
              smallest to largest, incrementally extending the Forth programs, biased
              towards using successful runs of smaller functions.

              As to a couple of other points mentioned: when it comes to exceptions, you
              actually, perhaps counterintuitively, want to abort programs that come
              across an exception rather than trying to handle it. In fact, you should
              be ultra-paranoid. You want to weed out the bad programs as soon as
              possible so they don't take up time when you could be testing other
              candidates. In particular, you want the most 'dense' encoding possible,
              in that you don't want to spend time testing algorithms that have
              already been tested; by ignoring exceptions, you have increased your
              search space with a lot of 'duplicate' entries. A program that has an
              ignored exception and one that never attempted the invalid op in the
              first place behave identically, and you will end up running both to
              completion needlessly. Of course, there is no way to remove all
              redundancy, but you certainly can make it less likely by allowing
              exceptions to kill your program.

              Also, when it comes to automatically generating Joy programs, you can
              avoid having to deal with the bit of syntax Joy has by using the Floy[2]
              variant of Joy. Floy is Joy where all quotations are restricted to a
              single symbol. It turns out to be just as expressive as Joy, but you
              don't need to worry about matching quotations; your primitive 'base
              pairs' are simply the Joy primitives and the quoted versions of each,
              and nothing else.

              When it comes to recursion, there's no need to fear: you actually can
              express arbitrary recursion without bindings by use of a y combinator[3],
              which is easy enough to provide as a primitive in Joy. If you include it
              as a choice in your genetic code, you are not limited by your lack of
              definitions in any way.

              I think the following works... but someone check my work.

              fix == dup [[fix] cons] dip i;
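
              For anyone who wants to sanity-check the idea in a mainstream
              language, the same trick looks like this in Python (a
              strict-language Z combinator, i.e. an eta-expanded Y):

```python
# Z combinator: a fixed-point combinator that works under strict evaluation.
fix = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Anonymous recursion: factorial with no named recursive definition.
fact = fix(lambda rec: lambda n: 1 if n <= 1 else n * rec(n - 1))
```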

              John

              [1] http://www.idsia.ch/~juergen/oops.html
              [2] http://www.latrobe.edu.au/philosophy/phimvt/joy/jp-flatjoy.html
              [3] http://en.wikipedia.org/wiki/Fixed_point_combinator

              --
              John Meacham - ⑆repetae.net⑆john⑈

                chris glur | 18 Nov 17:37 2008

                Re: [stack] Evolutionary Programming

                If it's to run as an auto-evolving machine, then decoding,
                analysing & reverse-engineering the results is not realistic.
                And not a goal.

                What is the advantage of having a less syntax-heavy language?
                Well, what does syntax mean?
                In this context it's a human aid to grouping and classifying,
                i.e. imposing order. Which is exactly what you DON'T want.
                You want the finest possible granularity.
                At the same time you want to use the cost-effectiveness and
                speed of a PC.

                In fact, why bother with Joy?
                Just go as close as manageable [without crashes] to the
                native instruction set of the machine/PC.

                For those who know PCs down to the register level,
                "divide by zero" is a completely artificial human concept.
                In fact, "divide" is not known at the register level.
                [It might be interesting to 'evolve' a divide machine-instruction
                sequence from the basic add, subtract, shift, and
                various logical and test/branch instructions?!]

                Similarly, the concept of a "loop" is just a human construct,
                invented for manageability purposes.

                When the stone fell from the Tower of Pisa, it didn't think in
                terms of a loop: REPEAT decr(1 mm.) UNTIL (crash).
                -------------

                So yes, Joy is much more appropriate than most
                other "languages", but a more general-purpose,
                finely granulated instruction set, as close as possible
                to the hardware's instruction set, is best.

                Of course, the finer the granularity, the closer to a
                continuous/analog computer and the fewer 'problems'
                with chaotic discontinuity.

                Alternatively use standard languages to simulate a
                continuous analog machine, which would be orders of
                magnitude slower, and impose a whole extra layer of
                complexity in between.

                IMO the divide-by-zero crash is merely one example
                of a more general problem. The 'problem' is stability:
                a monotonic increase towards infinity.

                Similarly, I guess, any infinite loop [by definition a
                stability] is a failure in the game-of-life algorithm.

                So how does natural evolution handle this?
                1. A random mutation breaks the stability; or
                2. the species goes off towards infinity before a mutation
                'turns it', and a failed evolutionary branch results.

                == Chris Glur.


                  John Nowak | 19 Nov 15:13 2008

                  [stack] utility of the retain stack

                  Hello all. One of the problems I've run into with a second-order
                  language is that you don't get partial application. (I've recently
                  been able to extend the intersection-based type system to a higher
                  order language, so this second-order restriction may not last anyway.)

                  Essentially, the problem is something like this. Let's say I want to
                  write a function that has this effect:

                  ('a0 .. 'aN) 'x -> ('a0 F 'x G .. 'aN F 'x G) 'x

                  In other words, given a list of objects 'a0 to 'aN and some object 'x,
                  we want to map over the list applying some function F to each element,
                  followed by applying G with 'x on the stack. We want to keep 'x around
                  at the end. An implementation in Cat would look as such:

                  pick [[quote [dip] compose] dip compose papply map] dip

                  We can demonstrate its correctness as follows:

                  ('a0 .. 'aN) 'x [F] [G] pick [[quote [dip] compose] dip compose papply map] dip
                  ('a0 .. 'aN) 'x [F] [G] 'x [[quote [dip] compose] dip compose papply map] dip
                  ('a0 .. 'aN) 'x [F] [G] [quote [dip] compose] dip compose papply map 'x
                  ('a0 .. 'aN) 'x [F] quote [dip] compose [G] compose papply map 'x
                  ('a0 .. 'aN) 'x [[F]] [dip] compose [G] compose papply map 'x
                  ('a0 .. 'aN) 'x [[F] dip] [G] compose papply map 'x
                  ('a0 .. 'aN) 'x [[F] dip G] papply map 'x
                  ('a0 .. 'aN) ['x [F] dip G] map 'x
                  ('a0 F 'x G .. 'aN F 'x G) 'x

                  This works fine (there may be a nicer way to write it), but there are
                  two problems. One, it's too complicated. The same thing with named
                  variables (i.e. naming 'x and lifting it into the function) is *much*
                  more direct (where \ is a lambda of sorts):

                  ('a0 .. 'aN) 'x \y. [F y G] map
                  ('a0 .. 'aN) [F 'x G] map
                  ('a0 F 'x G .. 'aN F 'x G) 'x

                  Two, there's no way to do it in a second order language as we're
                  relying on higher order functions.

                  The solution seems to be to introduce a retain stack a la Forth (and
                  Factor). Here, we indicate the value of the retain stack by placing it
                  to the right of the dot. '+r' copies the top element of the retain
                  stack to the data stack:

                  ('a0 .. 'aN) 'x >r [F +r G] map r>
                  ('a0 .. 'aN) [F +r G] map r> . 'x
                  ('a0 F +r G .. 'aN F +r G) r> . 'x
                  ('a0 F 'x G .. 'aN F 'x G) r> . 'x
                  ('a0 F 'x G .. 'aN F 'x G) 'x

                  It's worth noting here that we can only evaluate as such because
                  we're assuming F, G, and map do not alter the retain stack. This is
                  something a compiler could easily enforce by tracking the retain stack
                  at the type level and disallowing named functions or arguments to a
                  combining form from using elements already on the retain stack at the
                  time they're called. An annotation could be used to bypass this
                  restriction.
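
                  For what it's worth, the discipline is easy to model with two
                  Python lists standing in for the data and retain stacks (names
                  mirror the Forth-style words above; F and G are stand-ins):

```python
data, retain = [], []

def to_r():
    retain.append(data.pop())    # >r : move top of data stack to retain stack

def r_from():
    data.append(retain.pop())    # r> : move it back

def copy_r():
    data.append(retain[-1])      # +r : copy top of retain, leaving it in place

def joy_map(fn):
    """Apply fn to each element of the list on top of the data stack."""
    data.append([fn(x) for x in data.pop()])

# Trace of:  ('a0 .. 'aN) 'x  >r  [F +r G] map  r>
# with stand-ins F = (1 +) and G = (*).
data.extend([[1, 2, 3], 10])     # ('a0 .. 'aN) 'x
to_r()                           # 'x moves to the retain stack

def body(a):
    fa = a + 1                   # F
    copy_r()                     # +r : 'x reappears on the data stack
    return fa * data.pop()       # G

joy_map(body)
r_from()                         # 'x returns: data == [[20, 30, 40], 10]
```

                  The mapped function never has to thread 'x through explicitly,
                  which is the whole point of the second stack.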

                  Considering how much simpler this is than the version without the
                  retain stack, I'm wondering if Joy and Cat are missing something by
                  not offering that second stack. Here's how the retain stack version
                  might work in Joy:

                  ('a0 .. 'aN) 'x r> [F +r G] map r>
                  ('a0 .. 'aN) [F +r G] map r> . 'x
                  ('a0 F +r G .. 'aN F +r G) r> . 'x
                  ('a0 F 'x G .. 'aN F 'x G) r> . 'x
                  ('a0 F 'x G .. 'aN F 'x G) 'x

                  Any thoughts or am I just rambling?

                  - John

                    John Nowak | 19 Nov 15:16 2008

                    Re: [stack] utility of the retain stack


                    On Nov 19, 2008, at 9:13 AM, John Nowak wrote:

                    > ('a0 .. 'aN) 'x \y. [F y G] map
                    > ('a0 .. 'aN) [F 'x G] map
                    > ('a0 F 'x G .. 'aN F 'x G) 'x

                    Minor correction:

                    ('a0 .. 'aN) 'x \y. [F y G] map y
                    ('a0 .. 'aN) [F 'x G] map 'x
                    ('a0 F 'x G .. 'aN F 'x G) 'x

