Genetic Programming: Evolution of Mona Lisa

Added FAQ here:
Added Gallery here:

This weekend I decided to play around a bit with genetic programming and put evolution to the test, the test of fine art :-)

I created a small program that keeps a string of DNA for polygon rendering. 
The procedure of the program is quite simple:

0) Set up a random DNA string (application start)

1) Copy the current DNA sequence and mutate it slightly
2) Use the new DNA to render polygons onto a canvas
3) Compare the canvas to the source image
4) If the new painting looks more like the source image than the previous painting did, then overwrite the current DNA with the new DNA
5) Repeat from 1
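The steps above describe a classic (1+1) hill climber, and they can be sketched in a few lines of Python. The original app is C#; everything here is illustrative, and the polygon-rendering step is abstracted away (the DNA is treated directly as a small pixel buffer, with a sum-of-squared-differences fitness as one plausible choice for step 3):

```python
import random

def fitness(candidate, target):
    # Step 3: compare the canvas to the source image (sum of squared differences).
    return sum((c - t) ** 2 for c, t in zip(candidate, target))

def mutate(dna):
    # Step 1: copy the current DNA and nudge one gene slightly.
    child = dna[:]
    i = random.randrange(len(child))
    child[i] = min(255, max(0, child[i] + random.randint(-10, 10)))
    return child

def evolve(target, generations=20000):
    dna = [0] * len(target)              # start from a blank (black) canvas
    best = fitness(dna, target)
    for _ in range(generations):
        child = mutate(dna)
        score = fitness(child, target)
        if score < best:                 # step 4: keep only improvements
            dna, best = child, score
    return dna
```

With a real image, `fitness` would compare the rendered polygons against the source pixels; the accept/reject loop stays exactly the same.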

Now to the interesting part :-)

Could you paint a replica of the Mona Lisa using only 50 semi-transparent polygons?

That is the challenge I decided to set for my application.

The image below is the result of that test:
(The number below each image is the number of generations it took to reach that specific painting)


So what do you think?

342 thoughts on “Genetic Programming: Evolution of Mona Lisa”

  1. Very cool, I have been thinking about doing the same thing actually (but not using polygons, by using sort of line creatures).

    What algorithm / library did you use for image comparison?

    I have been pretty much obsessed with evolution stuff this past 1–2 years and have read all of Dawkins’s books. I can really recommend The Selfish Gene, The Blind Watchmaker, and The Ancestor’s Tale; Genome by Matt Ridley was great too. It’s really interesting how DNA can be viewed as a hard drive littered with the remains of neutralized viruses and other self-serving copying processes.

    If I ever did PhD research it would be in evolution simulation :)

  2. …interesting. If the representation of the polygons was small enough, you’ve probably got a new compression algorithm.

  3. Very nice! That’s a vectorizer and image compressor in one. And the compression level is scalable: just add another 20 polygons.

    I guess you have to trade time, though :-)

    Same request: source would be nice – a toy to play with on Christmas :-)

  4. Wow, your phenotype–genotype mapping seems particularly effective for this sort of thing!

    Dare I ask, what is it?

  5. I like this! Any chance of seeing the evolution as a video?

    One thing that puzzles me, why aren’t there 50 polygons in the initial images?

  6. Pretty impressive. However, you said the DNA sequence was randomly created… why is the first screen black, instead of showing randomly placed triangles?

  7. It’s not really a genetic algorithm that produced the picture, but a stochastic hill-climber. No recombination is done, the population size is one, so the genotype-phenotype mapping is not really important. Also it seems step 0 was not executed, as others already pointed out.

    And finally, looking at the number of steps it did take to produce that drawing, the convergence speed is not very impressive.

  8. How long does it take to generate the final result?

    Does the number of points in the polygons change over time? If it does, how do you take that into account in your DNA sequence?

  9. That’s very cool. How did you do the selection? Was it automated somehow, or did you manually compare each image to decide whether to keep/discard it?

  10. I am curious about your fitness function. How, for example, did you decide that 466 looks more like the Mona Lisa than 372?

  11. I’ve implemented genetic algorithms before but only for optimization. Neat application of it!
    Unfortunately, seems like it takes an exponential number of iterations to get finer and finer details.

  12. Seems to me the real magic is in two words: “mutate slightly.” That is the part of the code I would like to see :)
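In this kind of setup, “mutate slightly” usually means picking one polygon and nudging one of its properties. The author’s actual operators and rates are not published here, so the following is a purely hypothetical sketch (all probabilities and ranges are guesses):

```python
import random

# A polygon here is a dict with an RGBA color and a list of [x, y] points.
# All mutation rates and step sizes below are illustrative guesses.
def mutate_polygon(poly, width=200, height=200):
    # Work on a deep copy so the parent polygon is left untouched.
    poly = {"color": list(poly["color"]),
            "points": [list(p) for p in poly["points"]]}
    roll = random.random()
    if roll < 0.4:                                  # tweak one color channel
        i = random.randrange(4)
        poly["color"][i] = min(255, max(0, poly["color"][i] + random.randint(-15, 15)))
    elif roll < 0.8:                                # shift one vertex slightly
        p = random.choice(poly["points"])
        p[0] = min(width, max(0, p[0] + random.randint(-10, 10)))
        p[1] = min(height, max(0, p[1] + random.randint(-10, 10)))
    else:                                           # add a vertex at a random spot
        poly["points"].insert(random.randrange(len(poly["points"])),
                              [random.randrange(width), random.randrange(height)])
    return poly
```

Small, local changes like these are what keep each generation close enough to its parent for the accept-if-better rule to make steady progress.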

  13. Cool! What compression ratio do you get from the DNA compared to the original source image?

  14. Brilliant idea!

    Like the others asking about the algorithm, I’m curious about it. I imagine the genome has 50 genes, each one being 3 points and 4 colour components (RGBA). To show the ‘construction’ of the image, you just draw all of the genes in (a random) order?

    The fitness function sounds like it’d be fun. Objective image comparison is hard to create and might fall apart at higher image sizes. :) Are you using line tracing or something like that?

  15. Some more information would be cool. How long did it run? How did you compare the picture to the actual Mona Lisa?
    You had a population of one. I feel that might be best in this case, but I do not see exactly why.

  16. Wow! I am impressed. Of course you left out a number of important details in your description. For starters, you state that in step 3 you “Compare the canvas to the source image”. Does that mean a simple pixel-by-pixel comparison, luminance averaging within a fixed or titrating grid, facial recognition, or what? As always, the fitness function(s) are at the heart of a GA/GP, but you give no clue.

    Also, you took 900K generations to get there, but you never mentioned your population size, rate of crossover, mutation, etc. Basically, I’d love to know as much about the algorithms you used as you’d be willing to share. A look at the code would be best, although I agree with Andy Dwelly that if the storage size was significantly less than MPEG-4, etc. (and the calculation effort was small enough!) you might have an interesting and marketable compression algorithm. In that case I’d work hard to get it patented and to market ASAP.

    Again, great work and thanks for sharing….

  17. One explanation could be that one of the mutation possibilities is adding a new polygon to the generating DNA string.

  19. @Luis: probably because it starts off with DNA being initialized to a blank slate (even though it’s not stated in the steps) and then generates the random polygons. So in this case, the black screen was a closer match than random polygons.

  20. This is seriously impressive. How long did it take to reach that final image? What algorithm did you use to compare the images?

    Is it really Genetic Programming though? From your description it sounds more like some other kind of evolutionary algorithm. It would be interesting to run it again with a larger (>1) population and cross-over. I’d imagine it would converge much quicker.

  21. I would say it is using a genetic algorithm, and if the description is correct it sounds like a single-individual elitist GA. The easiest fitness function would be to do a per-pixel RGB comparison, optimising for the lowest possible fitness value.

    It might fall somewhat under the GP category if the number of points in the polygons is variable. The easiest approach would probably be to use triangles with each triangle represented as 3 positions and RGBA-components.

    Going to try this out using Covariance Matrix Adaptation Evolution Strategies once I have more free time. This might actually result in a fast enough optimisation to make this a viable method for vectorising and compressing images with high complexity.

  22. @Dan: The algorithm as described sounds like a simple form of simulated annealing, but without any probability of accepting worse matches and without a cool-off schedule. Another algorithm that could work is MCMC (Markov chain Monte Carlo), which is usually used with Bayesian statistics and probabilities, but can easily be applied as a generic optimization algorithm.
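For comparison, the piece that separates the accept-only-improvements rule used here from simulated annealing is the acceptance criterion. A sketch of the standard Metropolis rule (illustrative, not from the post):

```python
import math
import random

def accept(old_score, new_score, temperature):
    """Metropolis rule: always keep improvements, and sometimes keep
    worse candidates, with probability shrinking as temperature cools.
    Scores are errors, so lower is better."""
    if new_score <= old_score:
        return True
    if temperature <= 0:
        return False        # zero temperature degenerates to pure hill climbing
    return random.random() < math.exp((old_score - new_score) / temperature)
```

With `temperature` fixed at zero, this is exactly the hill climber described in the post; a slow cooling schedule is what lets annealing escape local optima.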

  24. Compression through GA search is not a new idea. I have tried something similar to this myself a while ago.
    This implementation works better because of the alpha-blended polygons though.

  25. The comparison function could be as simple as the quadratic error, sum(square(pxy – qxy)). They used it in fractal image compression. (Note the possible applications of this to fractal image compression? ;))

    It’s interesting how one can get close to the target in the presence of a fitness function. There’s, however, a drawback to the algorithm that’s not immediately apparent. The approach is entirely greedy and with this limited population of (two) images at each generation you won’t be able to take the image through depression points. You don’t notice this, because your tools (polygons) are coarse grained and hence the goal is not a very fine picture to begin with.

    In any case, impressive.

  26. Dude, any chance of posting this up on youtube as a proper video/animation?

    Not sure about how effective this is as a genetic algorithm, but you could have a real gem of a video effect here!

  27. Nice! I would be interested in more detail:

    What limit did you put on the maximum degree for each poly? Did you consider limiting the total number of vertices rather than the number of polygons? How does varying the poly limit affect the final image? (ie what is the limit of convergence for different poly counts?)

    What did your mutation function look like – shift-vertex, add-vertex, tweak-color, tweak-alpha, what else? What relative-occurrence-probabilities did you use?

    Do you use edge-crossing-detection to ‘untwist’ the poly, or just ignore cross-overs? Do you limit new-vertex locations, ie midpoint-offset, or just use random-in-canvas? Did you consider a vertex-pruning mutator? Have you got any statistics on result-acceptance rates for different mutators? How do these rates change at different stages of convergence?

    What do you do when you hit the max number of polys – drop the least-fit, or just stop creating new ones? If drop, how did you determine least-fit?

    What color space did you use, RGB/LSV/? What measure did you use for image-comparison (sum of absolute pixel-differences?)? Have you considered using variable-resolution comparison to speed things up (ie start with a lo-res image and increase resolution as fitness increases)?

    Is it possible to calculate per-poly best-fit alphas as part of the image comparison? If so, can we do the same for each color channel?

    It sounds like you used population-1, ie random-walk search. Any idea what a valid crossover operator would look like – something like polygon-swapping with alpha adjustment?

    Have you tried using smart/directed mutators – ie, pick one poly to mutate, cache the image-and-difference for everything-but, then try a set of mutations on the poly and use the one which increases the fitness the most?

    Have you investigated how changing some of the parameters affects convergence speed in terms of either generations or overall computation time?

    For fun – what about using this alg to morph from one image to another?

  29. The representation of triangles needs 3 vertices, each having X and Y coordinates, plus the color (R,G,B), so 9 numbers per triangle, times 50, i.e. 450 numbers. This could be 1800 bytes if they were all single-precision floats, or we could do integer arithmetic with, say, 16-bit X/Y coordinates and 24-bit color, i.e. 15 bytes per triangle, or 750 bytes total.
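The arithmetic in that size estimate is easy to check in a few lines (note that the integer packing works out to 15 bytes per triangle, i.e. 750 bytes total, rather than 19/950):

```python
TRIANGLES = 50
COORDS = 3 * 2                      # 3 vertices, an X and a Y for each
FLOAT_BYTES = (COORDS + 3) * 4      # 9 single-precision floats per triangle
INT_BYTES = COORDS * 2 + 3          # 16-bit coordinates plus 24-bit RGB

print(FLOAT_BYTES * TRIANGLES)      # 1800 bytes as floats
print(INT_BYTES * TRIANGLES)        # 750 bytes packed as integers
```

Adding an 8-bit alpha channel (which the post’s semi-transparent polygons would need) raises the packed figure to 16 bytes per triangle, 800 bytes total.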

  30. I would be curious to see this where every corner of a triangle has a different color, interpolating the polygon colors to create a gradient. You might be able to get a more faithful representation that way, and possibly be able to reduce the number of polygons.

  31. To silveira: I do not have the code right here but I can sketch the approach: the gene size was variable, encoding a variable number of rectangles, each with a colour. The approach described on this page has a fixed number of arbitrary polygons, and the colour combination function is some type of blending. I think the variable number of polygons is a better feature, but this experiment shows that having general polygons instead of rectangles, and especially blending colours (which makes the fitness function change more smoothly), is a good improvement. I wasn’t trying to solve exactly the same problem though.

    You can do a Google search on ‘genetic algorithms image compression’ and you’ll see a few hits on the subject.

  32. Having now made a movie out of it myself, it is a lot easier to see what is going on.

    I think it would be interesting to place some limitations on the evolution:

    1) each polygon is limited to up to n-sides, or
    2) the total number of vertices is limited to some number.

    Or both.

    Also, I too would like to know more about your fitness algorithm. I would’ve thought that some regions would’ve gained fidelity (detail) more quickly than others, but it seems — from the samples you’ve provided — that detail sort of emerges together.

  33. Hmmmm…. I don’t think this is a true test of Evolution, though… The algorithm always had access to the original “Design” of Mona Lisa to compare itself with. The process and theory of evolution states that there is no Designer and that things happened randomly (uncoordinated). So, for me, evolution requires MORE faith than believing in God or an intelligent Designer.

  34. OK OK, I will try to make a movie out of this.

    I’ll even make a new run and output SVG or some other vector format from the images :-)

    Also, the polygons are 3 to x points large, so they are not only triangles but can be big complex polygons.
    (I saw a few comments about triangles in here.)

    And I’ll blog about the various aspects on this topic as soon as the kids go to bed.
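Exporting such a polygon list as SVG is straightforward. A hypothetical sketch of what that export could look like (the data layout and all names here are mine, not the author’s):

```python
def polygons_to_svg(polygons, width, height):
    """Render a list of ((r, g, b, a), points) tuples as a black-backed SVG string."""
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">',
             f'<rect width="{width}" height="{height}" fill="black"/>']
    for (r, g, b, a), points in polygons:
        pts = " ".join(f"{x},{y}" for x, y in points)
        # Alpha is stored 0-255; SVG wants fill-opacity in 0.0-1.0.
        parts.append(f'<polygon points="{pts}" fill="rgb({r},{g},{b})" '
                     f'fill-opacity="{a / 255:.3f}"/>')
    parts.append("</svg>")
    return "\n".join(parts)

svg = polygons_to_svg([((200, 30, 30, 128), [(0, 0), (100, 0), (50, 80)])], 200, 200)
```

Because the DNA is already a list of coloured polygons, this mapping to vector format is essentially lossless.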

  35. Could you post the code? It would be interesting to see it. Especially the code used to compare to the original and estimate the match.

    My understanding is that that is where the magic really happens, and that it is the most difficult thing to design.

  37. About that first black image: I would guess it’s the first generation, and selecting a strand with all initial values at zero is just as good as a random starting strand.

    Basically you need something to compare the next generation with and a black screen works? =)

  38. This is thoroughly awesome. It really is.
    Just curious – how long did the computation take ?

  39. I wouldn’t call this a genetic algorithm because you are not keeping multiple candidates at each iteration nor breeding them (that is, swapping DNA between multiple candidates). This is simply a straightforward hill-climbing algorithm. Not that the choice of method detracts from the results.

    If it took about 1M iterations to get to your result with hill climbing, I would expect you could do much much better with a genetic algorithm provided you were smart about how DNA got swapped. If it was swapped spatially, somehow, then you could get a result where a candidate which was good at the face bred with a candidate which was good at the background and you got a candidate which was good at both.

    Which isn’t to say that such an approach might not take more computation overall; keeping around and breeding multiple candidates takes time.

  41. Hey, instead of so narrowly directing the evolution, you could do something more like real life, by coupling a facial recognition program to the fitness testing. See what kinds of ugly mugshots you get coming out!

  43. I will release the source tomorrow, I just have to clean it up so I don’t have to feel too ashamed of the actual code quality.

    (Threading it so that it won’t lock up your computer, etc.)

    The app is written in C# 3 using .NET Framework 3.5.
    So those are the requirements for using/compiling it.

    Just so you know :-)

    I’m off to bed.

  44. source or more information, please? (in particular, i’m interested in how you compared particular states with the actual mona lisa)

    and, like others said, this isn’t really genetic programming (as there is only a population of 1), is it? more like a genetic algorithm?

  45. It is really impressive, but I would also like to know more details about the algorithm. I think it would also be interesting to pit several runs of the same test against each other, to find a better result with the same number of polygons (or a similar one with fewer).
    I also vote for a canvas version here ;)
    And maybe we can all try to optimise the algorithm.

  47. Interesting, but it isn’t evolution. You just showed intelligent design. You had an end result that you compared your program to. So you had an ultimate goal in mind. You didn’t really do it “randomly” though. So thanks for showing designs have designers. :)

  48. Incredible! As a programmer of genetic algorithms myself, I must tip my hat to your ingenuity. Even I, with no 3D programming experience, could write something to combine 50 semi-transparent polygons; but the ingenuity of thought, the stroke of brilliance to apply such a seemingly impossible task to such simple constraints: that is true genius.

    *tips hat*

  50. Just to echo a lot of other people, I’d really love to see the source. I’ve been fascinated by genetic algorithms ever since I first heard of them eight years ago, and I’ve long wanted to experiment with them myself, but the initial hurdle of coming up with a GA framework on my own has proved to be too much. If you could provide the source for this project (say, under the GNU GPL), that’d be a great starting point for me and many others. I’d start by evolving a different image, and then from there, who knows what I’d do?

    Thanks a lot.

  51. While I find this very interesting (and, frankly, cool), I hardly think this could be used for a new image compression algorithm — at least not in any practical manner.

    First, it’s likely to be hugely costly in terms of memory and computation in order to do so (almost 1M generations, not to mention an unknown population size). It’s probably also quite likely that the polygons evolved in the process are non-trivial. What you’re really looking at is the building of an SVG. Therefore, it’s possible that this could be used either by itself or perhaps in tandem with an existing algorithm to convert raster images to vector representations.

    I’m skeptical as to whether there’s any commercial viability to such code. If I’m proven wrong, then kudos! But not every project is meant to be commercialized.

    Good work!

  54. I’m wondering if the 50 polygons are particularly suited to rendering a face. Might 50 polygons have a harder time rendering “The Scream”? Would 50 polygons be really good at rendering a Picasso?

  55. For all those people who want the code, why don’t you just follow the procedure that’s explained at the top of the page?

    If you can’t do that by yourself, then what the heck are you going to do with code except get confused?

  56. Sorry, but I think you are not using a genetic algorithm at all, as Johannes and Johan mentioned. Please change the title and description to reflect this, since you are misleading other people.

  57. For MrE

    Why don’t you write the code based on the procedure and then explain it to us, so that we won’t get confused by your code?

  58. Are you sure this is GP or GA? Unless some of your polygons are being drawn off the canvas, I’m not quite following. Since the image is slowly converging by *adding* polygons and not mutating their positions, it looks like hill climbing to me.

  59. Damn. I did this same thing a year and a half ago. Didn’t think there would be a Wired article about it.

  60. Very cool. I agree with previous posters though: this is a hill climber, not an EA.
    I’m guessing a hill climber is likely to work well in the given fitness landscape. Depends on the fitness function of course, but it would likely have no local maxima, making a hill climber more effective.

    Still: Seeing Mona Lisa slowly appearing like that is really cool.

  61. It is interesting to note that 10^6 generations is equivalent to, for example, 57×10^3 years of evolution of Drosophila. So: how complex a structure may be evolved in this short time…

  62. Kakaz: Because each iteration happens immediately after the other; there is no need to wait for the old generation to “reproduce”.

  63. Every mention of evolution shouldn’t turn into a god vs. science debate, but I’m going to fuel the fire anyway. To those saying that this is no proof of evolution: true, the algorithm did have access to the ‘finished product’. However, the final form provided to the algorithm (AFAIK) is a raster image, not the polygons from which it is formed by the algorithm. In the same way, the final form dictated by evolution is survival, but the method to obtain that isn’t completely specified.

  65. Yes I agree it’s not a genetic algorithm, but I think the convergence rate is fine. Why is that an issue? He’s come up with a good, 50-polygon approximation with a “population size” of 1 and less than a million iterations. This is a reasonable amount of work compared to the true GAs I’ve seen with populations of thousands and needing thousands of iterations. But, more to the point, why optimize a one-shot calculation? That would be premature …

    To the ID straw-clutchers, this is not evidence that mankind evolved through ID. Evolution is not purely random; it occurs within an epigenetic landscape which includes (at least) the fixed laws of physics and chemistry. This calculation uses an epigenetic landscape of the Mona Lisa.

    A good example to demonstrate true evolution to the non-believers would be to have a million people rate the output of the algorithm and evolve it using their opinions. Then the output would still be “designed”, but by a million people rather than one all-powerful “designer”.

    Now create a statistical approximation of the ratings you got, and use that instead of people to rate the pictures. I bet you’d get a picture out which was aesthetically pleasing but had no designer. That’d be a nail in the coffin for those who believe ID because they don’t understand evolution :)

  68. How about creating three sets of polygons, each set for one channel in YCbCr -color space. They could still be alpha blended in this stage.
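The separate polygon set per channel is the commenter’s idea, not the post’s, but for reference the standard ITU-R BT.601 conversion from RGB to YCbCr that it relies on looks like this:

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 full-range conversion from 8-bit RGB to YCbCr."""
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

Since the eye is far more sensitive to luma (Y) than to the chroma channels, the Y polygon set could plausibly be given more polygons than Cb and Cr.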

  71. Wow, this is certainly impressive. Admittedly, I personally won’t ever need something like that, but then again, screw practical use! It is just cool ;).
    I too can’t wait to have a look at the source (even though I’ll have a hard time running it w/o Windows), but I doubt it will take long before this algorithm is ported to a few other languages :)

    Well done!

  72. It doesn’t really have much to do with nature’s evolution when you set up a specific goal, and discard everything that doesn’t take you one step closer to it. This is a misconception that many people on both sides seem to have.

  73. Very impressive. Do you use 50 polygons from the start? It seems as though the algorithm adds them progressively.

  74. An etch-a-sketch picture is ONE polygon with a huge number of sides. I’ve seen TV interviews with artists who can produce superb quality pictures just using this child’s toy, so it’s quite surprising that this code needs 50x as many polygons for the Mona Lisa.

    As previous comments have said, you should try limiting the number of sides or vertices to make this more impressive.

    “I have been pretty much obsessed with evolution stuff this past 1–2 years and have read all of Dawkins’s books. I can really recommend The Selfish Gene, The Blind Watchmaker, and The Ancestor’s Tale; Genome by Matt Ridley was great too. It’s really interesting how DNA can be viewed as a hard drive littered with the remains of neutralized viruses and other self-serving copying processes.”

    Problem with these two guys is that they write nicely but have very little actual knowledge of what they’re talking about. Neither has any background in genetics – let alone modern genetics (genomics/postgenomics) – that goes beyond undergrad level. Ridley is a passable zoologist, while Dawkins just writes nicely but is not even a biologist at all, and has fscked up the meaning of “gene” thoroughly and beyond redemption, thank you very much NOT: when Dawkins says “gene”, it may be anything between a single allele and a multigene complex like the bacterial flagellum and its chemical “rotary motor”.

    If they were correct, how come life ever progressed beyond protists? The price you pay for multicellularism is death, and that’s where it is hard to argue for “selfishness”. Sexual reproduction, as far as can be told, evolved not coincident with terminal mortality.

    The present example brings the point across rather nicely: no single “DNA sequence” benefits from being too “selfish”. They have to carefully tread the middle ground between resource-hogging but replicating abundantly, and evolving but dying out for lack of replication.

    Try Nature Reviews Genetics – there, you’ll find the real hardcore stuff. Dawkins and Ridley are, from a philosophical perspective, rather annoying logical positivists, with all weaknesses for their theories this implies: they only tell you what fits. That there are counterexamples galore, they’ll never tell you.

  76. Hm. I like that.


    Anthropomorphized “goal”: reproduction and survival (micro and macro). Fitness algorithm: reproduction and survival. Generation production: reproducing and surviving, or the opposite.

    How “much” does it have to do with “nature’s evolution” before it has *something* to do with it? And what’s your point, anyway? It’s still G(evolutionary)P–which is all that’s claimed.

  77. “It doesn’t really have much to do with nature’s evolution when you set up a specific goal, and discard everything that doesn’t take you one step closer to it. This is a misconception that many people on both sides seem to have.”

    Indeed. You can only assume temporary local optima; the “global optimum” is a misconception that smacks of ID, the pareidolia of the intelligent observer.

    “Un dimanche après-midi à l’Île de la Grande Jatte” might be a better work to start with. Notwithstanding, you’d need an optimization against locally differing *variables*.

    You might *not* need randomness except in the “mutation” and as a starting condition however. The oldest biogenic stromatolites are more than half as old as this planet itself, so the first *nonrandom structures* created *by organisms* must be older.

    (Dawkins’s one *really* great idea was the extended phenotype. For anything else, refer to current peer-reviewed journals.)

  78. For those interested in genetic algorithms, I have a program (in C++) which encodes words into a Boggle grid. The source is freely available to view and play with. It is also written using templates so you can modify it easily to work on other genetic optimization problems.

    Look for the “Inverse Boggle Encoder” at

  80. Marvelous!

    Are the polygons always convex? Looking at the samples I don’t see any that aren’t.

  81. Evolutionist: The goal-and-discard is simply a substitute for the “fittest” criterion of Nature’s “survival of the fittest”. However, the earlier criticism is correct to some degree – this is a very limited version of “genetic programming”.

  82. I agree with Evolutionist – this is goal-directed evolution (more like – egads – intelligent design). A truly evolutionary algorithm would compare each picture and the LAST picture against a shifting set of environmental conditions. The order emerges from that process over time.

    However, as a demonstration that a randomised process can produce a highly non-random outcome, it’s certainly nice. As others have identified, the exact criteria for accepting or rejecting each picture as closer than the last are critical. As ecological modelers say, the devil’s in the assumptions.

  83. *sigh* As someone with a degree in AI (who has written GAs that did vaguely the reverse: sought a set of transparent rectangles to best classify images), I’ve got to apologize for all the misguided, and even downright stupid, comments that have been posted.

    Your description was perfectly adequate to code an equivalent. I’m not at all surprised by the result, though it is a wonderful demonstration.

    It’s a shame people keep trying to read complexity into the algorithm, rather than appreciating its sheer simplicity. I hope this becomes a textbook example.

    Finally, all GAs are “hill climbers” (variants of simulated annealing), just with different randomizing functions and population sets. Just because you only evolved one code doesn’t make it any less a GA. There might have been efficiencies in using a thousand parallel codes and cross-breeding, but it’s mathematically the same thing.

    Kudos to you.

  84. I would love to see the source code and make it truly genetic. We have here a hill-climbing solution with 1,000,000 iterations. What would be the result for 1000 generations of 1000 individuals with a true genetic algorithm? What would it be with 10 generations of 100,000 individuals each? Looks fun to try.


  143. Pingback: Col’s Rational World » Blog Archive » Yay! Evolving art!

  144. Pingback: Mona lisa remake | Mundo Gump

  145. Pingback: Besserwisser blog « My Blog

  146. Pingback: Modulo Errors » Blog Archive » Genetic Algorithms with Processing

  147. Pingback: This week on C9: Oxite, Mona Lisa, Pool hacks, and Coding4Fun Gifts | This Week On Channel 9 | Channel 9

  148. Pingback: This week on C9: Oxite, Mona Lisa, Pool hacks, and Coding4Fun Gifts | Games Money

  149. Pingback: Top Posts «

  150. Pingback: All about link building.

  151. Pingback: Matthew Lynch vs Rationality » Blog Archive » Fun with Genetic Algorithms

  152. Pingback: blueblog » Blog Archive » genetische Evolution von Bildern

  153. Pingback: Living Code » Blog Archive » Drawing with opacity

  154. Pingback: Besserwisser post on EvoLisa « Dan Byström’s Weblog

  155. Pingback: Microsoft NZ MSDN Blog : MSDN Flash Christmas Special

  156. Pingback: Nigel Parker's Outside Line : MSDN Flash Christmas Special

  157. Pingback: This week on C9: Oxite, Mona Lisa, Pool hacks, and Coding4Fun Gifts | CHARGED's Digital Lifestyle at Work or Play

  158. Pingback: Hugh’s ramblings » Blog Archive » Genetic Vectorizer

  159. Pingback: RSS agregator » Blog Archive » Interesting genetic programming exercise

  160. Pingback: Genetic Programming Example « Rare Intellect Blog

  161. Pingback: Optimizing away « Dan Byström’s Weblog

  162. Pingback: Alt Gr

  163. Pingback: turn that shit UP » Blog Archive » Genetic Programming: Evolution of Mona Lisa « Roger Alsing Weblog

  164. Pingback: » Blog Archive » Building The Mona Lisa

  165. Pingback: » about wealth and taste

  166. Pingback: Free of All » Blog Archive » Genetic programming

  167. Pingback: Alt Gr

  168. Pingback: Tweak and Geek » Blog Archive » TG#16.5: Mona Lisa Evolution

  169. Pingback: Interesting stuff for December 2008 « The Outer Hoard

  170. Pingback: blog@kde» Blogarchiv » Genetische Programmierung: Evolution der Mona Lisa

  171. Pingback: Quirky Sunday Links

  172. Pingback: Pinceladas de evolución | la ciudadela

  173. Pingback: Complicated Algorithm, Neat Images | Slaptijack

  174. Pingback: The Evolution of Mona Lisa | Nature Notes from Harold Stiver

  175. Pingback: Evolucija Mona Lise

  176. Pingback: Evolving Genetic Algorithms in Lisp | So Jake Says

  177. Pingback: Kangaroo sequences, cocaininated bees and stupid robots « It’s Alive!!

  178. Pingback: » Blog Archive » Evolutionary Computing

  179. Pingback: Real-Time Rendering » Blog Archive » This, That, and the Other

  180. Pingback: West Karana » The Evolution of Ebony

  181. Pingback: Genetic Program Creates Image of Mona Lisa at The Tech Record

  182. Pingback: Lousy Canuck » Python Evolution - part 1

  183. Pingback: The World of Stuff - Blog Archive - We are the champions

  184. Pingback: » Blog Archive » Tumblelog: 090112

  185. Pingback: EvoLisa: Optimizations and Improved quality « Roger Alsing Weblog

  186. Pingback: Me made out of Genetic Programming! | Oscar and Friends

  187. Pingback: C Hees . info » Mona Lisa with Genetic Programming

  188. Pingback: New Adventures in Software » Practical Evolutionary Computation: An Introduction

  189. Pingback: Brian Low » EvoLisa Video

  190. Pingback: Hevolisa « e7ektroblog

  191. Pingback: STNDE | Image evolution | Javascriptで画像マイニング

  192. Pingback: The evolution of a media player « Subjects Keeping Me Awake

  193. Pingback: Evolving vectorization - Westhoffswelt - Welcome to the real world

  194. Pingback: » Blog Archive » Simulated evolution parlor tricks

  195. Pingback: Genetic Algorithms.., A Primer « The Green Destiny

  196. Pingback: Evolution of Mona Lisa « Fatals To Browser

  197. Pingback: Today’s Mindboggler: 1000:1 compression ratios? « OpticalFlow

  198. Pingback: Genetic Algorithm Part I « Eager to Code, Enjoy to Debug ~ Embark into Each Stage with Your Heart

  199. Pingback: :: Grrr

  200. Pingback: Evolutionary Compression « Roger Alsing Weblog

  201. Pingback: LCS and GBML » Blog Archive » Evolution of Mona Lisa

  202. Pingback: Lousy Canuck » Python Picture Evolution part 2, finally

  203. Pingback: Russian Coding 4 Fun : Мона Лиза у меня неплохо получилась

  204. Pingback: » Blog Archive » Genetic Algorithms: Evolving Human Faces

  205. Pingback: New Adventures in Software » Watchmaker 0.6.0 - Evolutionary Computation for Java

  206. Pingback: {5} Setfive - Talking to the World » Blog Archive » Monkeys and shakespeare: genetic algorithms with Jenes

  207. Pingback: Hodgepodge « Tish Tosh Tesh

  208. Pingback: Tagz | "Genetic Programming: Evolution of Mona Lisa « Roger Alsing Weblog" | Comments

  209. Pingback: Logic Nest · The Consequences of Trusting Computers

  210. Pingback: mental_floss Blog » Evolving Mona Lisa

  211. Pingback: Künstliche Intelligenz – Durch Evolution zur Mona Lisa | Das Meinungs-Blog

  212. Pingback: Artistic Evolution » Blog Archive » The Application of an Idea

  213. Pingback: InVisible Blog » links for 2009-10-25

  214. Pingback: Whole

  215. Pingback: Masuto Studios » Blog Archive » Monalisa por un algoritmo genetico

  216. Pingback: » Making fun things from the knowledge of evolution

  217. Pingback: Painting with Computers | Flirting With Models

  218. Pingback: pings from the void » Evolving Geometry. Or what I did with my Sunday.

  219. Pingback: Interesting Reading… – The Blogs at HowStuffWorks

  220. Pingback: Tim

  221. Pingback: La Mathematica non è un Pantone « Macworld Online

  222. Pingback: AC » Archive » genetic programming 多边形近似

  223. Pingback: Some applications ! : Ben's Blog

  224. Pingback: Mona Lisa replicated in software “using only 50 semi transparent polygons”

  225. Pingback: [links] Link salad’s face is at first just ghostly |

  226. Pingback: Friday Links: Zach Attack Edition

  227. Pingback: PHP Hackery » Blog Archive » How To Heat Your House With PHP

  228. Pingback: 27遗传程序:蒙娜丽莎的演化 - 新鲜事 - 演化 - 蒙娜丽莎 - 米叻!

  229. Pingback: Reproducing images with primitive shapes. (Graphics optimization problem) |

  230. Pingback: The Mario Genome! « Kele's Science Blog

  231. Pingback: Mona Lisa, algoritmos genéticos y HTML5: la evolución de una sonrisa | Shft

  232. Pingback: Genetic Algorithm picture generation | sharpsgenetic

  233. Pingback: Nice Vector Images photos

  234. Pingback: Dabbler : Genetic Algorithms in Perl

  235. Pingback: » Időszaki szumma

  236. Pingback: genetic painting agent | Mathisonian

  237. Pingback: links for 2011-02-12 « Boskabout

  238. Pingback: Genetic Algorithm Examples «

  239. Pingback: HTML5 » Blog Archive » Mona Lisa, algoritmos genéticos y HTML5: la evolución de una sonrisa

  240. Pingback: Marbled Rye and Evolutionary Algorithms « The Happy Technologist

  241. Pingback: » Marbled Rye and Evolutionary Algorithms

  242. Pingback: A Arte dos Algoritmos | Teoria da Conspiração

  243. Pingback: Genetic Programming: Evolution of Mona Lisa |

  244. Pingback: SO COOL! |

  245. Pingback: Qball’s Weblog » EvO – Image vectorization using evolution

  246. Pingback: Beating PNG « Bainbridge Code

  247. Pingback: Pinceladas de evolución |

  248. Pingback: Nature of Code Final | ITP Blog

  249. Pingback: RESTART IT HERE | MonaLisa in Nerd style | Massimo Meijer's inspiration blog

  250. Pingback: “进化”出来的蒙娜丽莎 « T客网 ︱ Techpot

  251. Pingback: Generating Faces from Random Polygons | Monkeyologist

  252. Pingback: Reproducing images with primitive shapes. (Graphics optimization problem)

  253. Pingback: Pareidoloop – algorithmic face generation | CreativeJS

  254. Pingback: Pareidoloop :: iobound

  255. Pingback: СЛУЧАЙНЫЕ ЛИЦА | Altsoph

  256. Pingback: Computer Computer Interaction | Inventing Interactive

  257. Pingback: What qualifies something as AI? | A.I. Spark

  258. Pingback: What are some interesting and relatively simple to implement projects to learn Neural Networks? | A.I. Spark

  259. Pingback: Should real-valued neural network inputs be scaled based on minimum/maximum values or by variance? | A.I. Spark

  260. Pingback: Genetic Algorithms: Links and resources (1) « Angel ”Java” Lopez on Blog

  261. Pingback: Pareidoloop | Robot Monkeys

  262. Pingback: How can I avoid premature convergence to a local optimum on my genetic algorithm? | A.I. Spark

  263. Pingback: » What is Computer Science? ACM@LETU

  264. Pingback: Genetics | Ene mene mu …

  265. Pingback: Class 11: Digitisation – Links « Waving at the Machines / ITP

  266. Pingback: Thought this was cool: Genetic Programming: Evolution of Mona Lisa | Roger Alsing Weblog « CWYAlpha

  267. Pingback: BrightestYoungThings – NYC – Rise and Shine: The Internet Told Me So …

  268. Pingback: BrightestYoungThings – Planet – Rise and Shine: The Internet Told Me So …

  269. Pingback: Guerrilla Monkey – Bookmarks

  270. Pingback: Winter Break Project | leadiv

  271. Pingback: BrightestYoungThings – DC – Rise and Shine: The Internet Told Me So …

  272. Pingback: Drain the Main Brain

  273. Pingback: Poster 08 – Creative, Easy Charts and Graphs Using Flash and XML #heweb10 | Elemental Cabal of Aardwolf mud

  274. Pingback: » Quinta clase #Optimización de estructuras de hormigón

  275. Pingback: An aid to AID: SMS image xfer service « Chewy Chunks

  276. Pingback: Ross Churchley › Mildly Interesting | Image Evolution

  277. Pingback: Ross Churchley › Blog › Image Evolution

  278. Pingback: Introduction to Simulated Annealing: I

  279. Pingback: Home Manchester | Playing with HOME data

  280. Pingback: Genetic Algorithm | Robert Murrish

  281. Pingback: Tile-Based Image Compression | Pointers Gone Wild

  282. Pingback: Mona Lisa in 50 Evolutionary Polygons | Beyond the Beyond | Wired

  283. Pingback: Image Estimation using Stochastic Hill Climbing | CCTV Computer Vision Blog

  284. Pingback: documenting gleitzeit paul jaisini invisible

  285. Pingback: Paul Jaisini Invisible Paintings from 1994

  286. Pingback: Natural selection at work | Jeremy Yoder

  287. Pingback: parse calm record into small settlement controlling matlab - Zietlow Gyan

  288. Pingback: Boo! The optics behind “ghost” imaging | Skulls in the Stars

Comments are closed.