10 responses

  1. Leen
    March 25, 2010

    Dammit they are about to take my job as an artist now! Pretty cool robot though.

  2. Six
    March 25, 2010

    I fail to see how exactly this isn’t just drawing what it is programmed to. It looks to just be running some image processing over a camera image, then using a simple algorithm to try to replicate that image with a pen on paper. Unless there is some level of self learning I’m not aware of, that seems to me to just be drawing as programmed.

    • Aaron Saenz
      March 25, 2010

      @Six. I think the innovation comes from that “simple algorithm”, which isn’t translating pixel for pixel but rather deciding what constitutes a line and selectively choosing which lines to draw.

      “It looks to just be running some image processing over a camera image, then using a simple algorithm to try to replicate that image with a pen on paper.” –This could also be used to describe human drawing to some degree.
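
The distinction Aaron points at (deciding what counts as a line, then choosing which lines to draw) can be sketched in a few lines of Python. This is an illustrative toy, not AIkon's published algorithm; the gradient threshold and the run-based definition of a "line" are assumptions made up for the example:

```python
# Toy "line extractor": it does not copy pixels, it decides which
# edges count as strokes and keeps only the strongest few.

def gradient_magnitude(img):
    """Approximate edge strength with simple finite differences."""
    h, w = len(img), len(img[0])
    mag = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag[y][x] = abs(gx) + abs(gy)
    return mag

def select_strokes(img, threshold=40, keep=2):
    """Pick only the longest horizontal edge runs as pen strokes."""
    mag = gradient_magnitude(img)
    runs = []
    for y, row in enumerate(mag):
        x = 0
        while x < len(row):
            if row[x] >= threshold:
                start = x
                while x < len(row) and row[x] >= threshold:
                    x += 1
                runs.append((y, start, x - 1))  # (row, x_start, x_end)
            else:
                x += 1
    # "Selectively choosing which lines to draw": keep the longest few.
    runs.sort(key=lambda r: r[2] - r[1], reverse=True)
    return runs[:keep]

# Tiny synthetic image: a dark horizontal band on a light background.
image = [[200] * 8 for _ in range(8)]
for x in range(1, 7):
    image[4][x] = 20

strokes = select_strokes(image)
print(strokes)  # two strokes along the band's top and bottom edges
```

The point of the sketch is that the output is a handful of chosen strokes rather than a pixel-for-pixel copy, which is what separates this kind of system from a printer driver.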

  3. Stafford Williams
    March 26, 2010

    this is ridiculous. how is this any different from a printer? a print driver tells the hardware how to produce a copy of a digital image. this looks like an expensive (time and/or money) gimmick.

    • Frederic Fol Leymarie
      March 31, 2010

      Hi Stafford,
      have a look at the project’s website please: http://www.aikon-gold.com
      We are trying to better understand how the mind (of the artist) functions by building what amounts to a simulator made of software and hardware parts.
      AIkon is also a subject of study for us, where we learn from it, by trying/implementing/testing various models and ideas on various steps we think are at play when the artist thinks and acts.

      The input is an image, but we do not just process the image and re-present it in a single step. Rather, we have defined a system with multiple steps and are now introducing feedback loops. We have tested some form of visual feedback so that AIkon will eventually be directly influenced by what it draws. At the moment it is mainly influenced by what it sees when looking at the sitter, although there is some uncertainty built in, both from using a cheap mechanical robot arm and from noise in the algorithmically created plan that produces the gesture curves for the robot to follow.

      LMK if this is helpful or not.

      Regards,
      Frederic

      • Nunya
        October 5, 2010

        How about focusing on building automated alternatives to jobs people don’t WANT to do, like mopping or cleaning the toilet?

      • PantOpticon
        December 10, 2010

        That’s TRADITIONAL robot research thinking. This is getting outside of the box.
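
The multi-step loop Frederic describes above (plan from what the system sees, execute with a noisy arm, let visual feedback of what was drawn influence the next step) can be sketched as follows. The feedback gain, the uniform noise model, and all names here are illustrative assumptions, not AIkon's actual design:

```python
import random

random.seed(0)  # deterministic for the example

def execute(planned_y, noise=0.5):
    """A cheap robot arm lands near, not exactly on, the planned point."""
    return planned_y + random.uniform(-noise, noise)

def draw_with_feedback(target_ys, gain=0.8):
    """Plan each stroke, execute it noisily, and fold the observed
    error back into the next plan (the visual-feedback step)."""
    drawn, correction = [], 0.0
    for target in target_ys:
        plan = target + correction      # planning step
        actual = execute(plan)          # noisy execution by the arm
        drawn.append(actual)            # "look at what was drawn"
        correction += gain * (target - actual)  # feedback correction
    return drawn

# Try to draw a flat line at height 1.0 with five strokes.
result = draw_with_feedback([1.0] * 5)
print([round(y, 2) for y in result])
```

Even with arm noise, the correction term keeps the drawn strokes near the target, which is the sense in which such a system is influenced by what it draws rather than only by what it sees.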

  4. cybotic
    May 19, 2010

    I hope they’re using Epsom approved ink in that pen ;-)

  5. Plasticfrank
    October 5, 2010

    Man… People are so closed-minded. Even I can see that this project is more closely related to AI research than automation. It just seems like some smart people trying to understand creativity filtered through digital processes. I thought this was a tech blog.

    • Mgbeers
      October 5, 2010

      This is dumb. What kind of AI is this? There have been booths you can sit in where a camera would capture a face and simulate drawing it with a pen. This is no more a robotic innovation than that. Simply put, the “eye” could take a picture and “draw” it out (really just a plotter with a simulated drawing technique). Photoshop-like software can apply filters to an image before printing, and the “robot” would just render the image that way. What a joke; this is not futuristic at all!
