Theater of Lost Species
"People simply have no history of living in a world without an abundance of other animals in the wild. We have no precedent for it. Such a world will be new to the children who come after us; indeed, it will be alien. This planet will no longer be our old, familiar home, but something completely other. And that will change the character, the aesthetics, the ideals of our descendants, growing up on a globe that has almost in the blink of an eye been purged of its ancient evolutionary richness." -Lydia Millet "The Child's Menagerie", NYT Op-Ed, 9/12/12
Part virtual menagerie, part memory chamber, part urban spectacle, The Theater of Lost Species invites one to view and interact with a virtual collection of swarming digital creatures, flora and fauna. Screens mounted at the end of glowing viewing cones display a curated virtual ecology of lost species. Sensors enable these digital creatures to react to visitors, while slowly pulsating light rods create a dynamic atmosphere at night. When the Theater is not in use, it autonomously navigates through its environment, scanning and documenting plants and animals before they are forever lost.
Read more about it here.
Design: Jason Kelly Johnson & Nataly Gattegno
Team: Ji Ahn, Fernando Amenedo, Ripon DeLeon, Shawn Komlos
In collaboration with: Matthew Clapham (UCSC); Dr. Jonathan Payne (Stanford University)
Exhibited at: Ambiguous Territory Symposium, University of Michigan Taubman College of Architecture and Urban Planning, October 2017; Ambiguous Territory Exhibition, University of Michigan Taubman College Gallery, September-October 2017; Pratt Manhattan Gallery, New York, December 2018-January 2019; Ithaca College, New York, August 28-December 15, 2019.
Hi-Res Images: LINK
FAQ - FREQUENTLY ASKED QUESTIONS:
1. What's the big idea?
The Theater of Lost Species is an object for collective celebration and mourning, a catalyst for conversation, philosophical debate and ecological engagement. It is a device for both viewing and interacting with a collection of fantastic, yet extinct, sea creatures. The project is inspired by a number of influences, such as Traveling Menageries, Chinese Lanterns and portable Camera Obscura devices from the 1800s, Time Capsules from the 1950s and '60s, and various recreations of Noah's Ark. An important tandem project is the Seed Bank, such as the one at Svalbard, Norway. After reading Lydia Millet's Op-Ed in the NYT, "The Child's Menagerie", we began the process of conceptualizing and designing the Theater. We set out to address Millet's question: "Can you feel the loss of something you never knew in the first place?"
2. What will visitors see through the viewing cones? What will it look like at night?
The long viewing cones focus on digital display screens that are portals to a seamless virtual aquarium. Within the aquarium, digital sea creatures swarm toward the viewing cones, responding to the subtle motions of viewers. In the evening the Theater will glow and pulsate as the swarms slowly navigate the virtual aquarium. We are developing a custom physical-digital interface using the Processing programming language (connecting Arduino microcontrollers, infrared (IR) sensors and LEDs) that allows viewers to actively engage the virtual creatures. We have been inspired by projects like Soda Constructor, Oasis and Manifest (* these links will not work on iOS mobile devices).
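The project's interface code is not published, but the sensor-to-light behavior described above can be sketched in Python (one of the languages used on the project). This is a hypothetical illustration, not the Theater's actual firmware: it assumes a 10-bit IR proximity reading (0-1023, as an Arduino `analogRead()` would return) and maps it to an 8-bit LED brightness, keeping the pins at a dim idle glow until a visitor approaches.

```python
def proximity_to_brightness(reading, threshold=200, max_reading=1023):
    """Map a raw 10-bit IR sensor reading to an 8-bit LED brightness.

    Readings below the threshold (no visitor nearby) keep the light
    pins at a dim ambient glow; closer visitors produce higher
    readings and brighten the pins toward full output.
    The threshold and idle values here are illustrative assumptions.
    """
    IDLE = 30  # dim ambient glow when no one is near
    if reading < threshold:
        return IDLE
    # Linearly scale the above-threshold range up to full brightness (255).
    span = max_reading - threshold
    return IDLE + int((reading - threshold) * (255 - IDLE) / span)
```

In the real installation this mapping would run per viewing cone, with the result written to the LED driver each frame so the pins appear to pulse as visitors come and go.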
3. How big is it and what will it be made out of?
The Theater has a footprint of approx. 16' x 16' and is 16' tall. It will be made out of lightweight carbon fiber-reinforced polymer (FRP) panels and resin. The 15 unique hexagonal panels will be fabricated by Kreysler & Associates in the Bay Area. The glowing pins will be made out of translucent cast rods connected to super-bright LEDs. A custom steel chassis will connect all the pieces and will be bolted to the ground. The entire assembly will be prototyped at Future Cities Lab in the Dogpatch neighborhood of San Francisco.
4. How did you design and prototype this?
The Theater is being designed using Rhino and Grasshopper software with the Kangaroo and Firefly plug-ins. The interactive components are being programmed in Processing, Arduino, Python and Ruby. The physical components are being prototyped using a combination of laser cutting, CNC milling and 3D printing (we used a MakerBot Replicator 2 generously donated to us by Jake Lodwick from Elepath). Each viewing cone has three integrated IR sensors that let the microcontrollers sense visitors' proximity, which in turn activates the virtual swarm and the glowing LED pins.
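The swarm activation described above can be illustrated with a minimal steering sketch, again in Python. This is an assumption about how such a system might work, not the project's actual code: each agent in the virtual aquarium is nudged toward the nearest viewing cone whose IR sensors currently detect a visitor, producing the swarming-to-the-cone behavior.

```python
import math

def steer_swarm(agents, active_cones, attraction=0.05):
    """Nudge each swarm agent toward the nearest active viewing cone.

    agents: list of dicts with 'pos' and 'vel' as (x, y) tuples.
    active_cones: (x, y) positions of cones whose IR sensors are
    currently triggered. With no active cones, agents are unchanged.
    The attraction factor is an illustrative tuning parameter.
    """
    if not active_cones:
        return agents
    for a in agents:
        ax, ay = a['pos']
        # Find the cone nearest this agent among those sensing a visitor.
        cx, cy = min(active_cones,
                     key=lambda c: (c[0] - ax) ** 2 + (c[1] - ay) ** 2)
        dx, dy = cx - ax, cy - ay
        dist = math.hypot(dx, dy) or 1.0  # avoid division by zero
        vx, vy = a['vel']
        # Steer a small fraction toward the cone each frame, then move.
        a['vel'] = (vx + attraction * dx / dist, vy + attraction * dy / dist)
        a['pos'] = (ax + a['vel'][0], ay + a['vel'][1])
    return agents
```

Called once per animation frame, this loop makes the digital creatures drift toward whichever cone a visitor is looking through, and disperse again when the sensors go quiet.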