THE THIRD EYE – International Project for the European Capital of Culture Ruhr.2010
THE THIRD EYE is a four-year interdisciplinary art project for the European Capital of Culture Ruhr.2010. “What is a city, how do I live?” and “What could a city be?” – in interaction with science, education and art, these guiding questions are examined to create visions for the metropolises of the future. Scientists, pupils, students, artists, architects, urban planners and designers from different parts of Europe construct miniature models of their future metropolises – such as CYBERCITY Ruhrstadt (GER), Randstad (NL) or Tallinn (EST) – in a process built around the central themes of urbanism, identity and integration. When constructing the models, pupils from different cities combine places and buildings from their actual living environment with elements they miss in their city and wish to have in the future. From the future cities of the children and teenagers and the ideas of the artists, architects, urban planners and designers, visions for new cities and metropolises emerge, made comprehensible within the models.
These impulses for future European metropolises will be presented in a central interactive, multimedia exhibition. In the exhibition, viewers roam around the models with the help of remote-controlled video robots – which serve as “avatars” – and virtually “dive” into the model worlds to explore them from the perspective of a pedestrian. People can visit the exhibition on site or, via the internet, from anywhere in the world.
DATE – 2010
DISCIPLINE – Art
MEDIUM – Interactive video installation
STATUS – Displayed at Transmediale in Berlin, Germany
WEBLINKS
https://transmediale.de/de/vernissage-intersection-graham-smith
https://transmediale.de/de/content/graham-smith
INTERSECTION is a work exploring the history of the Berlin Wall. It comprises an interactive video installation that morphs between a set of panoramic murals shot at the same Berlin Wall site between 1988 and 2009. The 3.6-meter-high by 15-meter-long piece uses beamers, cameras and computer software to track the location of people in front of it. The piece is designed to convey a rich, complex impression of the enormous physical and geopolitical transformations the area around the Wall has undergone over the last 20 years by allowing people to literally “walk in and out of time”. By letting viewers explore history with their bodies as well as their eyes, the piece aims to create an image that becomes alive, reacting to the movement of the viewers to project a sense of the multifaceted history the area holds. The INTERSECTION exhibition was funded through a Media Arts grant from the Canada Council for the Arts.
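As an illustration of the tracking-to-morph idea described above, here is a minimal sketch of my own – not the installation’s actual software; the capture years and the linear wall mapping are assumptions for the example:

```python
# Hypothetical sketch: map a viewer's position along the 15 m wall to a
# crossfade between panoramas shot in different years. The year list and
# the linear mapping are illustrative assumptions.

YEARS = [1988, 1994, 2001, 2005, 2009]  # assumed capture years

def morph_weights(viewer_x: float, wall_width: float = 15.0):
    """Return indices of the two panoramas to blend, plus the blend factor.

    viewer_x -- tracked viewer position along the wall, in metres.
    """
    # Normalise the position to a fractional index into the year list.
    t = max(0.0, min(1.0, viewer_x / wall_width)) * (len(YEARS) - 1)
    lo = int(t)                        # earlier panorama
    hi = min(lo + 1, len(YEARS) - 1)   # later panorama
    alpha = t - lo                     # 0.0 = fully 'lo', 1.0 = fully 'hi'
    return lo, hi, alpha

# Example: a viewer standing 9 m along the wall.
lo, hi, alpha = morph_weights(9.0)
print(f"blend {YEARS[lo]} -> {YEARS[hi]} at {alpha:.2f}")  # 2001 -> 2005 at 0.40
```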
DATE – 2010 – 2017
DISCIPLINE – Industrial Design
MEDIUM – Videoconferencing telepresence display
STATUS – In production with 100 deployed
WEBLINKS
www.webchair.com
I have been involved in the space art community for 25 years with the artistic aim of exploring the relationship between our perceptions of the environment and how we conceptualize it as a world view. In the early stages I experimented with panoramic photographs which I wanted to have shot in space, and later developed a technology similar to today’s QuickTime VR/Google Street View that displayed panoramic imagery to viewers wearing virtual reality head-mounted displays; at SIGGRAPH 1990 I demonstrated a version called Videosphere with VR pioneer Jaron Lanier. In 1993, with the help of Gord Harris of the IMAX Corporation, I conducted a test using a 2-meter-wide screen in a pool to demonstrate that movies can be projected underwater onto the inside of domes. As part of the experiment I entered the pool with a diving mask and found seeing the earth in this new way to be a very powerful experience. In 2004 I was a speaker at the 8th Space Art conference held at the European Space Agency in Noordwijk, Netherlands, and in 2009-2010 I was an artist in residence at the same facility, invited to demonstrate the core underwater video technology.
I developed the core underwater cinema technology in 2010 as an artist in residence at the European Space Agency’s technology research centre (ESTEC) in Noordwijk, Netherlands (http://www.esa.int/esaMI/ESTEC/index.html), where I built and tested two versions of the underwater theatre. The project was showcased on the Discovery Channel in 2010 and represents 25 years of work experimenting with and building various technologies to exhibit the concept. With the physical task of building the core display technology complete (thanks to the European Space Agency), I wish to focus on the creation of an interactive element, imagery and an exhibition at V2.
The core concept behind the project is the realization of a way to de-contextualize video imagery for viewers so that they can see the image in a new way. Immersed underwater in a weightless state, people lose their sensation of a local environment and “feel” the work in a new and unique way. Everyone who participated in the tests we did at ESA commented on the realism of the experience and felt the sensation astronauts have called the “Overview Effect”. As part of my research at ESA I contacted astronaut Edgar Mitchell (Apollo 14), the 6th person to walk on the moon, and talked with him about how it has proven quite difficult to communicate more than a portion of this potentially revolutionary experience to people. LIBERATION is a project which will overcome this physical barrier by providing people with a convincing, weightless simulation of the experience.
“For V2, the project LIBERATION is a next step in the research and understanding of bodily perception in general, and of perception in media environments or variable gravitational environments specifically. This research lines up with the Unstable Media practices that V2 has been developing over the last 20 years in the domain of the arts, always within an interdisciplinary setting. Graham Smith’s project will be a major contribution to this, as it brings together a perfect match of theoretical issues and direct experiences around perception and media environments.”
Alex Adriaansens – Director V2
The world would be a better place if every person could spend an hour space-walking above the earth’s surface to experience for themselves the world as it truly is: a fragile, beautiful sphere spinning through the vastness of space. We still live on an earth that seems flat, because that is how we physically experience it in our everyday lives, and how we express it as we watch the sun rise in the east, move across the sky and finally set in the west. Only the few people who have actually been in space, those who have felt for themselves the so-called Overview Effect, understand and perceive the world in this wholly new way. The majority of humanity, though, never truly comprehends this reality, due to our inability to easily break the bonds of gravity and propel our bodies into space to experience first-hand the world as an interconnected whole.
“Liberation” is a space art project which re-creates the Overview Effect by enabling people to experience a weightless simulation of floating in space and traveling around the earth. It is an underwater movie theater (Submersive Cinema) constructed in a swimming pool, projecting imagery onto a 7-meter-wide, bowl-shaped screen situated underwater. A 35mm movie projector in a waterproof housing, held by a mechanical arm over the center of the screen and pointing straight down from 1 meter underwater, projects a 120-degree image of the earth from orbit onto the screen. Six people at a time, equipped with scuba masks, snorkels and neutral-buoyancy vests, immerse themselves in the pool to experience what astronauts call the Overview Effect for themselves. In addition, a railing running around the edge of the pool allows a further 50 audience members to gain a sense of the experience without getting wet. The ultimate goal is the placement of a high-resolution video camera with a panoramic lens on the space station or a satellite, continuously transmitting a live world-view from orbit.
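As a rough check of the stated geometry – my own back-of-envelope sketch, treating the bowl as a flat disc and ignoring refraction and the housing optics – the throw distance needed for a 120-degree image to fill a 7-meter screen works out to about 2 meters:

```python
import math

# Back-of-envelope check of the stated projection geometry. This treats
# the bowl screen as a flat disc and ignores refraction and the housing
# optics, so it is only a sanity check, not the real design figure.
FOV_DEG = 120.0      # stated field of view of the projected image
SCREEN_DIAM = 7.0    # stated screen diameter, metres

half_angle = math.radians(FOV_DEG / 2.0)
# A cone with half-angle 60 degrees illuminates a disc of diameter
# 2 * h * tan(60) at distance h, so filling the screen needs:
throw = SCREEN_DIAM / (2.0 * math.tan(half_angle))
print(f"required projector-to-screen distance: {throw:.2f} m")  # about 2.02 m
```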
I have been working on the “Liberation” project since 1982, when I began taking panoramic photographs which I wanted to have shot in space. I developed a technology that displays panoramic imagery to viewers wearing virtual reality head-mounted displays, and at SIGGRAPH 1990 in Dallas I demonstrated a version called Videosphere with VR pioneer Jaron Lanier. With the help of the IMAX Corporation, in 1993 I conducted a test using a 2-meter-wide screen in a pool to demonstrate that movies can be projected underwater onto the inside of domes. As part of the experiment I entered the pool with a diving mask and found seeing the earth in this new way to be a very powerful experience.
I am currently experimenting with different ways to create the imagery for the project, for example by combining satellite databases such as the GeoSphere Project by space art pioneer Tom Van Sant with computer-graphics clouds. Another approach I am researching is the use of an optical printer to re-film enlarged still images of the earth, simulating a movie of the earth from orbit. Recently, space science researchers from Canada’s York University have expressed an interest in using Submersive Cinema to perform studies into human perceptual cues while weightless.
At the dawn of the 21st century the human race is finally beginning to understand how physically connected we all are. To move forward as a species we must overcome the differences which separate us and act as one planet. LIBERATION is an artwork that will help people understand this ideal and allow them to see their world in a new way.
Often understood to be diametric opposites, art and technology are two central aspects that permeate my artistic philosophy. Artists convey their thoughts and feelings uniquely through creative works, and technology enables them with a multiplicity of choices, offering alternative methods and mediums. Because the mind is perpetually limited by human physicality (McLuhan 19XX), the translation of thought into tangible physical objects outside the body is, at best, an attempt to creatively express or represent the intended meaning. Working within the limitations of television and film, artists must express their message through the video and audio channels, shaping their message to fit this means of dissemination. A problem surfaces, however, when an audience member is unable to access one of these avenues of meaning-making – for example, a viewer who is deaf or hard of hearing. Although several strategies have been developed to address this problem, evident in the widespread adoption of closed captioning, those who are deaf or hard of hearing still receive a diluted and potentially highly subjective interpretation of non-dialogue sound information such as music and sound effects: spoken dialogue is prioritized, and only so much can fit within the available caption space and time.
I want to construct a multimedia system that allows people who are deaf and hard of hearing the opportunity to access and experience sound information through alternative means, using visual and tactile media to represent meaning in a way that is more indicative of the film or television producer’s original goal. Working with the Ryerson human-computer interaction team, I want to enable these individuals to feel and see sound information, privileging the auditory experience as a whole rather than focusing on information gained from dialogue alone.
My role as the artist in this project will be to determine how different types of sound information should be expressed. Music, for example, ranges in style, mood and intensity, while spoken dialogue varies according to the speaker’s intonation, speed and emphasis, among other factors. The translation of these factors into visual and tactile outputs requires my particular perspective, as I am knowledgeable and experienced in morphing ideas into unconventional physical and multimedia representations.
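One simplified way to picture such a mapping – a sketch under my own assumptions, not the Ryerson team’s actual signal chain – is to split each audio frame into frequency bands and let each band’s energy drive one vibration motor, in the spirit of the Model Human Cochlea work listed in the publications below:

```python
import numpy as np

# Illustrative only: split one audio frame into frequency bands and map
# each band's energy to the intensity of one vibration motor along the
# chair back. Motor count and normalisation are assumptions.

SAMPLE_RATE = 44100
N_MOTORS = 8  # assumed number of vibrotactile actuators

def frame_to_intensities(frame: np.ndarray) -> np.ndarray:
    """Map an audio frame to per-motor vibration intensities in 0..1."""
    spectrum = np.abs(np.fft.rfft(frame))
    # Divide the spectrum into N_MOTORS contiguous bands, low to high,
    # so low motors respond to bass and high motors to treble.
    bands = np.array_split(spectrum, N_MOTORS)
    energy = np.array([band.mean() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy

# Example: a 100 Hz tone should mostly drive the lowest-frequency motor.
t = np.arange(2048) / SAMPLE_RATE
print(frame_to_intensities(np.sin(2 * np.pi * 100 * t)).round(2))
```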
In one of my projects, for example, I wanted to show that while technology can enable communication over distances, something is lost when an individual’s presence is limited to the space and functionality of a computer screen. I created a presence chair that uses a screen at the back of a chair to display the image of the remote person in a video conference, so that this person appears to be sitting in the chair. She can also swivel the chair from the remote location to show that she is directing her attention to someone else in the physical room. Using such devices, she regains some of the important physical movements commonly used in meetings or public speaking events (such as turning to face a speaker). This project was part of the Sentient Creatures lecture series, where Jaron Lanier, the renowned digital media artist and computer scientist, presented a “guest” lecture from a remote location to an audience of sixty. Jaron was able to interact with his audience: when he moved, the chair mimicked a variety of his movements, allowing him to have a physical presence within the room (see http://connectmedia.waag.org/media/SentientCreatures/jaronlanier.mov).
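The remote-swivel link can be pictured with a small sketch (my own illustration; the address, port and message format are assumptions, not the chair’s actual protocol):

```python
import json
import socket

# Hypothetical sketch of the remote-swivel link. The host, port and
# message format are assumptions, not the chair's actual protocol.

CHAIR_ADDR = ("chair.local", 9000)  # placeholder address

def send_pan(angle_deg: float) -> None:
    """Remote side: transmit the desired chair heading in degrees."""
    message = json.dumps({"pan": angle_deg}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, CHAIR_ADDR)

def chair_loop() -> None:
    """Chair side: receive headings and drive the (stubbed) swivel motor."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", CHAIR_ADDR[1]))
        while True:
            data, _ = sock.recvfrom(1024)
            angle = json.loads(data)["pan"]
            print(f"rotating chair to {angle:.0f} degrees")  # motor stub
```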
I have made several preliminary sketches (see Figure XX) to illustrate my initial thoughts on the possible functions and aesthetic look of the device that we are constructing. I am eager to develop a relationship with the Ryerson team, for I know that this project can serve a community that has long been marginalized by mainstream entertainment. By providing alternative representations of sound information, I seek to create an entertainment experience that is as rich and meaningful for deaf and hard of hearing individuals as the audio-visual experience that most hearing individuals are able to enjoy.
For the most part, my work as an artist has centered on the interrelation of art and media. In creating multimedia installations, I challenge my audiences to embrace what is created through the interplay of computers, art and media, and to acknowledge their own preconceptions. For example, my art project “Sentor PoBot Goes to Washington” used an early video-conferencing system built into a wireless robot, allowing poets and musicians to communicate with people on the streets in front of the White House in Washington, DC from various remote locations around the world. Artists were able to express their opinions on a “virtual soap box”, while audience members on the street experienced a different physical and audio-visual form of the artist. Audience members were able to see how art and music, with the assistance of technology, could give an artist presence within a remote location which he would otherwise have been unable to access.
Working with Jeff Mann and Michelle Teran, I created the “Live Form Telekinetics” project, which occurred simultaneously at InterAccess in Toronto and the WAAG in Amsterdam. We worked with eight artists to create a telepresence dinner linked by various telepresence sculptural objects. Physical objects were duplicated on each side, and when one was manipulated in real space, an equivalent manipulation occurred on the other side. For example, if someone “clinked” the wineglass in Toronto, a robotic mechanism clinked the equivalent wineglass in Amsterdam. A telepresence table was built where half the table contained physical objects and the other half contained a video projection from the remote location; if a person in the remote location placed their hand on the table, it would appear as an actual-size hand on the virtual portion of the table at the other location.
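The mirrored-object behaviour can be sketched abstractly as follows (an illustration of the general idea; the class and event names are hypothetical, not the original implementation):

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch of the mirrored-object idea: a manipulation sensed
# at one site is replayed by an actuator on the twin object at the other
# site. Class and event names are illustrative, not the original code.

@dataclass
class TelekineticEvent:
    object_id: str  # e.g. "wineglass"
    action: str     # e.g. "clink"

class MirroredObject:
    def __init__(self, object_id: str, actuator: Callable[[str], None]):
        self.object_id = object_id
        self.actuator = actuator  # drives the local robotic mechanism

    def on_remote_event(self, event: TelekineticEvent) -> None:
        # Replay the remote manipulation on the local twin object.
        if event.object_id == self.object_id:
            self.actuator(event.action)

# Example: Toronto senses a clink; Amsterdam's mechanism replays it.
amsterdam_glass = MirroredObject("wineglass", lambda a: print(f"robot: {a}"))
amsterdam_glass.on_remote_event(TelekineticEvent("wineglass", "clink"))
```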
PUBLICATIONS – ACCEPTED/PUBLISHED

REFEREED JOURNAL ARTICLES
Karam, M., Russo, F., & Fels, D.I. (2009). Designing the Model Human Cochlea: An ambient crossmodal audio-tactile display. IEEE Transactions on Haptics: Special Issue on Ambient Haptic Systems, 2(3), 160-169.

CONFERENCE PRESENTATIONS/POSTERS
Branje, C., Maksimowski, M., Karam, M., Russo, F., & Fels, D.I. (in press). Vibrotactile display of music on the human back. Proceedings of the Third International Conference on Advances in Computer-Human Interactions (ACHI 2010), Barcelona.
Karam, M., Branje, C., Russo, F., & Fels, D.I. (accepted). The Emoti-Chair: An interactive crossmodal tactile music exhibit. ACM CHI, Atlanta, April 2010.
Branje, C.J., Karam, M., Russo, F., & Fels, D.I. (2009). Enhancing entertainment through a multimodal chair interface. IEEE Symposium on Human Factors and Ergonomics, Toronto.
Karam, M., Nespoli, G., Russo, F., & Fels, D.I. (2009). Modelling perceptual elements of music in a vibrotactile display for deaf users: A field study. Proceedings of the Second International Conference on Advances in Computer-Human Interactions (ACHI 2009), Cancun.
Branje, C., Maksimowski, M., Nespoli, G., Karam, M., Fels, D.I., & Russo, F. (2009). Development and validation of a sensory-substitution technology for music. 2009 Conference of the Canadian Acoustical Association, Niagara-on-the-Lake.
Karam, M., Russo, F., Branje, C., Price, E., & Fels, D.I. (2008). Towards a model human cochlea: Sensory substitution for crossmodal audio-tactile displays. Proceedings of Graphics Interface, Windsor, pp. 267-274.
Karam, M., & Fels, D.I. (2008). Designing a Model Human Cochlea: Issues and challenges in crossmodal audio-to-touch displays. Invited paper, Workshop on Haptics in Ambient Systems, Quebec City.

OTHER (INCLUDING TECHNICAL REPORTS, NON-REFEREED ARTICLES, ETC.)
Feel the Music: An accessible concert for deaf or hard-of-hearing audiences. Live music concert, Clinton’s Tavern, Toronto (2009).
Feel the Music: Concert at the Ontario Science Centre, with three live bands and the vibrochair (2009).
Emoti-Chair: Five-month installation exhibit at the Ontario Science Centre, Weston Family Ideas Centre (2009).
DATE – 2007
DISCIPLINE – Art
MEDIUM – Interactive robotic sculpture
STATUS – Displayed at the Ontario Science Centre, Toronto, Canada
WEBLINKS
http://gaggio.blogspirit.com/archive/2007/04/11/mobi.html
MOBI (Mobile Operating Bi-directional Interface), by Graham Smith, is a human-sized telepresence robot that users remotely control to move through distant environments, seeing through its camera eye, talking through its speakers and hearing via its microphone ear. Simultaneously, a life-sized image of the user is projected onto the robot’s LCD face, creating a robotic avatar. MOBI allows people to “explore far away art shows, attend distant presentations and make public appearances from anywhere on earth, thus helping to reduce air travel and reduce global warming”. MOBI was also shown at DEAF 07.