
Podcast with Jeffrey Ventrella – on the Versatilist

Hey wiggy peeps,

I was recently interviewed on the Versatilist Podcast, by Patrick O’Shea.


In this podcast, Patrick and I kick around lots of ideas on artificial life, artificial intelligence, and augmented reality. I describe my ongoing efforts to develop a kind of self-animated character that can thrive in our highly augmented future.

Wiglet on a Wiener: How to make your own Wiglet videos

How did I make this video?  Read on to learn!

Initial Preparation: You’ll need to make sure that your iPhone/iPad is upgraded to iOS 8 and that your desktop/laptop Mac is upgraded to OS X Yosemite.  Also, your iPhone/iPad must be a newer model that uses a cable with an 8-pin Lightning connector for recharging.  Unfortunately, devices that use cables with the older 30-pin connector won’t work.


If the end of the cable that plugs into your iPad/iPhone looks like this, then you’re all set.

Install Peck-Peck’s Perch on your iPhone/iPad and then visit Wiggle Planet’s Shop to download and print out our free augmented reality markers.

“I’m ready for my closeup. Let’s get going!”

Step 1: Connect your iPhone/iPad to your computer using the Lightning cable.  Open QuickTime Player on your Mac and choose “New Movie Recording” from the File menu.


Step 2: Click the little arrow next to the red record button and choose iPhone or iPad as the source camera. Also select your device name in the Microphone section to record all audio coming out of your iOS device.


Step 3: Run Peck-Peck’s Perch on your iPad/iPhone and press the red record button on your Mac to start the recording.

Point your iPad/iPhone at one of the printed augmented reality markers and let the fun begin!  When you’ve had enough, click the stop button and save your movie on your Mac.  You can then upload it to YouTube to share with the world.

If you make a cool Wiglet video, please leave a comment so we can check it out.  Thanks!

Take care,
-John “Pathfinder” Lester

Is “Artificial Life Game” an Oxymoron?

Artificial Life (Alife) began with a colorful collection of biologists, robot engineers, computer scientists, artists, and philosophers. It is a cross-disciplinary field, although many believe that biologists have gained the upper hand in setting Alife’s agenda. This highly nuanced debate is alluded to in this article.


What better way to get a feel for the magical phenomenon of life than through simulation games! (You might argue that spending time in nature is the best way to get a feel for life; I would suggest that a combination of time with nature and time with well-crafted simulations is a great way to get deep intuition. And I would also recommend reading great books like The Ancestor’s Tale :)

Simulation games can help build intuition on subjects like adaptation, evolution, symbiosis, inheritance, swarming behavior, food chains… the list goes on.

On the more abstract end of the spectrum are simulation-like interactive experiences involving semi-autonomous visuals (or sound) that generate novelty. Kinetic art that you can touch and influence, and whose lifelike dynamics you can witness, can be more than just aesthetically and intellectually stimulating.

These interactive experiences can also build intuition and insight about the underlying forces of nature that come together to oppose the direction of entropy (that ever-present tendency for things in the universe to decay).


On the less-abstract end of the spectrum, we have virtual pets and avatars (a subject I discussed at a keynote I gave at VISIGRAPP Barcelona).

“Hierarchy Hinders” –  Lesson from Spore

Will Wright, the designer of Spore, is a celebrated maker of simulation-style games who introduced many Alife concepts in the “Sim” series. Many of us worried that his epic Spore would encounter some challenges, considering that Maxis had been acquired by Electronic Arts. The Sims was quite successful, but Spore fell short of expectations. It turns out there is a huge difference between building a digital dollhouse game and building a game about evolving lifeforms.

Also, mega-game corporations have their share of social hierarchy, with well-paid executives at the top and sweatshop animators and code monkeys at the bottom. Hierarchy (of any kind) is generally not friendly to artificial life.

For blockbuster games, there are expectations of reliable, somewhat repeatable behavior, highly crafted game levels, player challenges, scoring, etc. Managing expectations for artificial-life-based games is problematic. It’s also hard to market a game which is essentially a bunch of game mechanics rolled into one. Each sub-game features a different “level of emergence” (see the graph below for reference). Spore presents several slices of emergent reality, with significant gaps in between. Spore may have also suffered partly due to overhyped marketing.

Artificial Life is naturally and inherently unpredictable. It is a close cousin of chaos theory, fractals, emergence, and, uh… life itself.


At the right is a graph I drew which shows how an Alife simulation (or any emergent system) creates novelty, creativity, adaptation, and emergent behavior. This emergence grows out of the base-level inputs into the system. At the bottom are atoms, molecules, and bio-chemistry. Simulated protein-folding for discovering new drugs might be an example of a simulation that explores the space of possibilities and essentially pushes up to a higher level (protein-folding creates the 3-dimensional structure that makes complex life possible).

The middle level might represent some evolutionary simulation whereby new populations emerge that find a novel way to survive within a fitness landscape. On the higher level, we might place artificial intelligence, where basic rules of language, logic, perception, and internal modeling of the world might produce intelligent behavior.

In all cases, there is some level of emergence that takes the simulation to a higher level. The more emergence, the more the simulation is able to exhibit behaviors on the higher level. What is the best level of reality to create an artificial life game? And how much emergence is needed for it to be truly considered “artificial life”?

Out Of Control

Can a mega-corporation like Electronic Arts give birth to a truly open-ended artificial life game? Alife is all about emergence. An Alife engineer or artist expects the unexpected. Surprise equals success. And the more unexpected, the better. Surprise, emergent novelty, and the unexpected – these are not easy things to manage…or to build a brand around – at least not in the traditional way.

Maybe the best way to make an artificial life game is to spread the primordial soup out into the world, and allow “crowdsourced evolution” of emergent lifeforms.  OpenWorm comes to mind as a creative use of crowdsourcing.

What if we replaced traditional marketing with something that grows organically within the culture of users? What if, in addition to planting the seeds of evolvable creatures, we also planted the seeds of an emergent culture of users? This is not an unfamiliar kind of problem for many internet startups.

Are you a fan of artificial life-based games? God games? Simulations for emergence? What is your opinion of Spore, and the Sims games that preceded it?

This is a subject that I have personally been interested in for my entire career. I think there are still unanswered questions. And I also think that there is a new genre of artificial life game that is just waiting to be invented…

…or evolved in the wild.

Onward and Upward.


The Open Brain

No, this is not a blog post about being open-minded.

Nor is it a blog post about brain surgery.


It’s a blog post about open-sourcing the code of wiglets so that others can develop artificial intelligence (AI) algorithms for them.


The fact is, even though I have a graduate degree from MIT, I may not be the best one to write the AI code for wiglets.

Okay, maybe I am the best… BUT, do I have the time? Is there any time left in the day as I try to start a company?

Bloody no.

And besides, open-sourcing the AI component of autonomous animated characters is totally reasonable, considering that the primary goal of our technology is to allow for user-generated content: digital goods created by all you people out there in user-land. I want the wonderful world of wiglets to emerge from the populace – not from the board rooms of marketing teams.

Your creativity and interest can be the driving factor for how these critters come into being, and eventually evolve into the muppets of the digital age.

So, how will we make the brain open-source?

The key is to use the four pillars of situated AI:


Think of actuators as the body. Your body acts on the environment (and generally changes the environment in the immediate vicinity of the body). The sensors perceive the environment and inform the brain of what’s going on. The brain then takes it in and decides (or not) what to do.
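The loop described above can be sketched in code. This is a minimal illustration of the sense-think-act idea, assuming invented names throughout — none of these types come from the actual wiglet codebase:

```cpp
// What the creature perceives this tick.
struct Sensors {
    double distanceToFood;
};

// How the body acts on the environment this tick.
struct Actuators {
    double moveSpeed;
};

// The "brain" maps perception to action; its internals can be anything.
Actuators think(const Sensors& in) {
    Actuators out{};
    // Toy rule: approach distant food at full speed,
    // then slow down as the creature gets close.
    out.moveSpeed = (in.distanceToFood > 1.0) ? 1.0 : in.distanceToFood;
    return out;
}
```

The host simulation would call `think` once per frame, feeding it fresh sensor data and applying the returned actuator data to the body.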

Here’s the cool part: what happens inside of the brain can be just about anything. When I was at MIT, Marvin Minsky told a bunch of us that the brain is a magnificent hack: there is no single perfect AI algorithm. In fact, there are many, many hacks that have been messily munged together over the course of animal evolution to give us the brains we have.


It’s our frontal lobes that create the illusion that we are making clear, rational decisions – that our brains are well-designed.

This is why some of the early AI programmers made the mistake of looking for the perfect AI. It seemed (to them) that there must be a way to engineer that perfect feeling of clarity that we call consciousness and rational thought.

But it’s just a feeling.

Uh, what’s my point? My point is that we can take this fact of animal intelligence and apply it to the simulation of wiglets. You (the folks I’d like to put in charge of building the brains of wiglets) get to use whatever you want to make wiglets do what they do.

Think of it as crowd-sourced AI.

You can use neural nets; you can use finite state machines; you can design a thousand if-then statements to account for every combination of stimuli; you can attach a big pipe to Google and use the power of the internet; you can make it completely random and hallucinogenic.
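To make the finite-state-machine option concrete, here is a toy example. The states and the stimulus are invented for illustration, not taken from any real wiglet brain:

```cpp
// A toy finite-state-machine "brain": two states, one stimulus.
enum class State { Wandering, Feeding };

// The transition rule: feed while food is in sight, otherwise wander.
// Richer machines would branch differently depending on the current state.
State nextState(State current, bool seesFood) {
    switch (current) {
        case State::Wandering: return seesFood ? State::Feeding : State::Wandering;
        case State::Feeding:   return seesFood ? State::Feeding : State::Wandering;
    }
    return current; // unreachable, keeps compilers happy
}
```

Even a machine this small produces character: the creature "notices" food and "loses interest" when it disappears.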

Uh, what?


Let’s start a Cambrian Explosion of Brain Design!

I have finished version 1 of the Brain Interface, which implements the sensors and actuators (the inputs and outputs). If you (or anyone you know) know the C++ language and would like to try out our new brain interface, let me know :)
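For a feel of what a pluggable brain interface could look like, here is a hypothetical sketch — the real version-1 interface may well differ. The idea is that the host engine owns the sensors and actuators, and your code only fills in what happens in between:

```cpp
// Invented types for illustration; not the actual interface.
struct SensorInput    { double targetX, targetY; }; // e.g. offset to a point of interest
struct ActuatorOutput { double velX, velY; };       // desired velocity

// The pluggable brain: the engine calls update() once per frame.
class Brain {
public:
    virtual ~Brain() = default;
    virtual ActuatorOutput update(const SensorInput& in) = 0;
};

// A trivial plug-in brain: steer halfway toward the sensed target.
class SeekBrain : public Brain {
public:
    ActuatorOutput update(const SensorInput& in) override {
        return { in.targetX * 0.5, in.targetY * 0.5 };
    }
};
```

Under a design like this, a neural net, a finite state machine, or a pile of if-then rules would all just be different subclasses of `Brain`.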