DATE – 2017-2018
DISCIPLINE – Art
MEDIUM – Robotic Installation
STATUS – HUBOT MOBI, Dutch Design Week, Eindhoven, The Netherlands
WEBLINKS – https://www.nextnature.net/projects/hubot/
The HUBOT MOBI robots were created for the NEXT NATURE exhibition to allow people to interact within the exhibition and be guided through the exhibits. The 2 robots feature motorized bases, LCD touch-screen interfaces, cameras, speakers, videoconferencing software and printers that give users the results of their feedback.
DATE – June 25, 26, 2017
LOCATION – Ryerson University, Rogers Communications Centre,
80 Gould Street, Toronto, Ontario, Canada
DISCIPLINE – Interactive art
MEDIUM – Telepresence HMD
STATUS – Confirmed
WEBLINK – https://conference.virtualreality.to
FUNDING – Canada Council, Media Arts
David Rokeby Graham Smith Vincent John Vincent
1983 – Very Nervous System 1983 – Displaced Perspective 1986 – Mandala
2016 – Hand-held 2017 – Displaced Perspective 2.0 2017 – Mandala 2.0
What happens to presence when it is subjected to screens, to total surround, to telepresence, to interactivity, to simultaneous access to and from geographically separate other presences? Probing precisely this dimension, 3 international media artists from Toronto revisit 30 years of experimentation and achievements.
Featuring 3 of the world’s first interactive Virtual Reality artworks, “Presence Fields” is an exhibition by Toronto artists David Rokeby, Graham Smith and Vincent John Vincent that explores the possible message behind the new medium of Virtual Reality. First created by the artists in the early 1980s, these works are paralleled in the exhibit with 3 of their current works in the same field of artistic exploration. The exhibit is a manifestation of McLuhan’s concepts that artists have a historical role as technological pioneers and that avant-garde artworks can be important “markers” that help define how a new medium impacts society.
Presence Fields interrogates the meaning and the experience of presence in different configurations and situations as they are affected by technology. Telepresence comes to mind first because it is the most obvious addition introduced by new technologies, from the time of radio and the telephone, but improving every decade all the way from voice, image and movement transmission to touch and 3D today.
Presence, however, means more than “being there”. Presence is a complex feeling, rich in variations, closely related to touch, affecting emotions in both positive and negative ways. A close cousin to the feeling of intimacy, the sense of presence reaches deep into human sensibility. It’s also a physical paradox, because it requires the object to be there (except in all kinds of delusion), yet that object can be out of sensorial reach and reside legitimately only in imagination.
The experiences offered to the patron in Presence Fields thus tend to underline the user’s presence as much as the sense or intuition of the presence of another. Their value is authentic in that it is not enough to read about them or watch them in an internet video, because they only reveal their meaning by the experience itself.
One thing these artists all share is a genuine respect for Marshall McLuhan, a respect that the Canadian prophet malgré lui reciprocated. “The artist,” he wrote, “is the man in any field, scientific or humanistic, who grasps the implications of his actions and of new knowledge in his own time. He is the man of integral awareness.”
Derrick de Kerckhove
Director of the McLuhan Program in Culture and Technology, University of Toronto, 1983 to 2008
David Rokeby is a Toronto-based artist who works with a variety of digital media to critically explore the impacts these media are having on contemporary human lives. His early work “Very Nervous System” (1982-1991) was a pioneering work of interactive art, translating physical gestures into real-time interactive sound environments.
Graham Smith is a pioneer in the area of VR, having exhibited the world’s first Head Mounted Display artwork in 1983, the first trans-Atlantic telepresence robotic artwork in 1986 and the first use of panoramic imagery in a VR environment in 1988. He is Chief Science Officer at the Dutch company Webchair and a PhD student at University College Dublin, where he is researching how telepresence technology can help re-integrate autistic children into school environments through a Ryerson University-based research project called WEBMOTI.
Vincent John Vincent
Vincent John Vincent is a Toronto-based artist, inventor, and entrepreneur who co-invented video gesture control technology with his partner, Francis McDougall, starting back in 1985. They have been the world leaders in forging and establishing the applications and market for the technology through their company, GestureTek, winning countless awards and picking up over 45 patents along the way.
David Rokeby Graham Smith Vincent John Vincent
1983 – Very Nervous System 1983 – Displaced Perspective 1986 – Mandala
2016 – Hand-held 2017 – Displaced Perspective 2.0 2017 – Mandala 2.0
Presence is a complex feeling, rich in variations, closely related to touch, affecting emotions in both positive and negative ways. A close cousin to the feeling of intimacy, the sense of presence reaches deep into human sensibility. It’s also a physical paradox, because it requires the object to be there (except in all kinds of delusion), yet that object can be out of sensorial reach and reside legitimately only in imagination.
What happens to presence when it is subjected to screens, to total surround, to telepresence, to interactivity, to simultaneous access to and from geographically separate other presences, etc.? Probing precisely this dimension, three international media artists from Toronto revisit 30 years of experimentation and achievements.
David Rokeby, for example, has been testing the sensations of touch and presence, first creating a new perception of space itself by involving the whole body in the production of sound. In Very Nervous System, the specific sound produced by specific gestures in space reveals, in real time, a continuity of my presence in and with space itself; it gives me a new sense of my own presence. The VNS version presented in Rotterdam in 2011 also uses the combination of bodily movement and sound, but in a very different way: instead of producing sound as a musical instrument would, VNS2 allows two or more persons to enter into immediate contact with each other via sound, as the body of one runs virtually into that of the other on the other side of the ocean. The real body and the virtual lock into a paradoxical and sonorous embrace; the sense of presence in that installation is extremely powerful. Hand-held, the well-named installation of an invisible sculpture revealed by touch alone, has an area where the user’s hand touches other hands in the sculpture, strongly evoking the tactile experience. This is virtual presence par excellence.
Graham Smith’s Displaced Perspective 2.0 adds a significant feature over the 1.0 version of 1983: it allows the user to bodily experience the space on the other side of the telepresence link via the pool table’s gaming surface. Smith quite literally wants to be in two places at once. In William Gibson’s 1984 best-seller, Neuromancer, there is mention of a program called the “Rider” that allows a physical person to tele-operate a virtual projection into a real but inhospitable space so as to explore and manage it without danger. His fascination with tele-robotics has led Smith since the early 1990s precisely in that direction, which is now rapidly turning into an industry. It is worth noting that Smith’s other dominant, not to say obsessive, fascination is now, as then, the experience and the transmission of total-surround, real-time, 360-degree video and photography.
By going 3D and high definition, the Mandala 2.0 is also a big improvement over the first edition. It also invites the user to occupy another space, this one not real, but virtual and allowing nevertheless complex interactions with objects that respond in movement, sound and texture, as real ones would. Vincent John Vincent is offering a new level of depth-of-field interactions that permits one to experience vertigo in total safety.
The experiences offered to the patron in Presence Fields thus tend to underline the user’s presence as much as the sense or intuition of the presence of another. Their value is authentic in that it is not enough to read about them or watch them on Vimeo, because they only reveal their meaning by the experience itself.
David Rokeby, Graham Smith and Vincent John Vincent met at, or via, the McLuhan Program in Culture and Technology at the University of Toronto. Graham Smith was a resident artist of the Program. He created there the Virtual Reality Artist Access Program (VRAAP), an offshoot of the weekly “artist-engineers” seminar series I had created and run at the Program since 1983. I was deeply inspired by McLuhan’s personal injunctions to privilege the artist if I really wanted to know anything useful about my own time. He was fond of quoting Wyndham Lewis’s statement that “The artist is always engaged in writing a detailed history of the future because he is the only person aware of the nature of the present.” The series hosted many Toronto and international artists, such as Norman White, Timothy Leary, Joe Davis, Stelarc, Warren Rudd, Kit Galloway and Sherrie Rabinowitz, who fed into an informal network of pioneers. These artists didn’t think of themselves as pioneers at the time; they just did whatever it is they did. It is only looking back that one realizes they were almost alone in experimenting with various aspects of presence that today we take for granted and that have become the norm.
Most of them became very successful, the three artists featured in Presence Fields being international stars. All three were involved in a dozen events initiated by the McLuhan Program, among them two transatlantic videoconferences for artistic performances: the first, Strategic Arts Initiatives (1986), between Salerno in Italy and Toronto, and later between Paris and Toronto; the second, Les Transinteractifs/Transinteractivity (1988), between Toronto, Salerno and Paris. Doug Back (in Toronto, at the Art Culture Resources Centre) and Norman White (in Salerno) performed the world’s first transatlantic arm-wrestling, each one weighing on a steel rod haptically connected to the other city by a pressure-measuring and activating motor on each side, with very low-speed modems feeding that pressure across the ocean in streams of digital bits. They repeated the performance in 2011, from V2 in Rotterdam to InterAccess in Toronto, again brought together by Graham Smith.
Did they form a “school”? Not really. If there is a leadership, very loose and unconstraining, it could be attributed to Graham Smith, who has done the most, including this event, to bring the others together. One thing they all share is a genuine respect for Marshall McLuhan, a respect that the Canadian prophet malgré lui reciprocated. “The artist,” he wrote, “is the man in any field, scientific or humanistic, who grasps the implications of his actions and of new knowledge in his own time. He is the man of integral awareness.”
By Derrick de Kerckhove, June 12, 2017
David Rokeby is a Toronto-based artist who works with a variety of digital media to critically explore the impacts these media are having on contemporary human lives. His early work “Very Nervous System” (1982-1991) was a pioneering work of interactive art, translating physical gestures into real-time interactive sound environments. He has exhibited and lectured extensively internationally and has received numerous international awards including a Governor General’s Award in Visual and Media Arts (2002), a Prix Ars Electronica Golden Nica for Interactive Art (2002), and a British Academy of Film and Television Arts “BAFTA” award in Interactive art (2000).
1983 – Very Nervous System (1991 version)
“Very Nervous System” was one of the first interactive artworks to present a compellingly visceral and immersive interactive audio experience. The 1983 version used custom cameras, an Apple ][ computer, a tape loop, an analogue synthesizer and a custom sound distribution system to create a responsive sound environment that translated movement into sound in real time. The project was under development from 1981 to 1991, evolving as the technologies of personal computing, video sensing and sound synthesis evolved. It has been presented widely internationally, including at the Venice Biennale in 1986.
2016 – Hand-held (Documentation)
“Hand-held”, presented at Nuit Blanche Toronto this past October, creates an invisible 3-dimensional sound and image construct suspended in the exhibition space. This ‘sculpture of possibilities’ reveals itself on the hands of the interacting public as they explore the installation.
Graham Smith is a pioneer in the area of VR, having exhibited the world’s first Head Mounted Display artwork in 1983, the first trans-Atlantic telepresence robotic artwork in 1986 and the first use of panoramic imagery in a VR environment in 1988. He is Chief Science Officer at the Dutch company Webchair and a PhD student at University College Dublin, where he is researching how telepresence technology can help re-integrate autistic children into school environments through a Ryerson University-based research project called WEBMOTI. He is currently completing 2 major interactive VR artworks, “Sensing Presence” and “Intersections”, which will be exhibited in 2018.
1983 – Displaced Perspective
“Displaced Perspective” is the world’s first Head Mounted Display artwork. It uses a live camera and a custom-built HMD holding two displays, suspended via counterweights from a movable boom on the ceiling. When viewers put on the HMD they see themselves from a different perspective, 2 meters away, looking back at themselves, which provides a remote point of view of their own body image and kinesthetic feedback as they move around the room. A second, recorded camera perspective is also displayed, creating a déjà vu-like sensation for viewers, who cannot tell what is live and what was recorded. This alteration of people’s perception of real versus recorded space creates a powerful sensation of telepresence for the audience and generates a new perspective on perspective.
2017 – Displaced Perspective 2.0
“Displaced Perspective 2.0” is a telepresence installation that allows people to play a physical game of pool with someone in another location. It is part of the Cohesion Cafés initiative, a collaboration with Dr Martine Delfos that aims to create a global network of cafés connecting people in a natural setting that feels like a local café. Unlike Skype or other videoconferencing systems, the life-sized imagery and display technologies used project a realistic sensation of human presence, allowing the audience to become immersed within the linked environments and creating a new form of “environmental telepresence”.
Vincent John Vincent
Vincent John Vincent is a Toronto-based artist, inventor, and entrepreneur who co-invented video gesture control technology with his partner, Francis McDougall, starting back in 1985. They have been the world leaders in forging and establishing the applications and market for the technology through their company, GestureTek, winning countless awards and picking up over 45 patents along the way. Vincent toured the globe playing music on virtual instruments as audiences went on a creative journey while he danced through cyberspace. The Smithsonian Institution in D.C. is just one of 9,500 installations worldwide. Vincent has received a Lifetime Achievement Award from the Canadian New Media Association (1995), was inducted into the Digifest Digital Pioneer Hall of Fame (2012), and has been given the “Meet the Media Guru” award in Milan, Italy.
1986 – Mandala
The initial video gesture control system, known as the “Mandala Virtual World System”, was written on an Amiga computer and later moved to a 486-based PC. A standard color camera placed your live video image on the screen, inside a virtual world that responded to gesture interaction. It had a full 2D skeleton tracking system and let the user touch, push, pick up, bounce or throw animated objects. In the PC version the video image was digital and could shrink or grow within the 3D worlds; users leaned right, left, up or down to move through the virtual worlds. Their company, GestureTek, licensed the technology and patents to Sony for the EyeToy/Move and to Microsoft for the Kinect.
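The core interaction described above, a live video silhouette touching on-screen objects, can be sketched with simple background subtraction. The following is a minimal, hypothetical illustration of the general technique (the threshold, frame sizes and object boxes are invented for the demo), not GestureTek's actual tracking software:

```python
import numpy as np

def silhouette_mask(frame, background, threshold=30):
    """Pixels that differ from the stored background by more than
    `threshold` are treated as part of the user's silhouette."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return diff > threshold

def touches(mask, obj_box):
    """An on-screen object counts as 'touched' when any silhouette
    pixel falls inside its bounding box (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = obj_box
    return bool(mask[y0:y1, x0:x1].any())

# Synthetic demo: an empty background, then a frame with a "hand".
background = np.zeros((120, 160), dtype=np.uint8)
frame = background.copy()
frame[40:60, 70:90] = 200            # bright blob standing in for the user

mask = silhouette_mask(frame, background)
print(touches(mask, (45, 55, 75, 85)))   # blob overlaps this object -> True
print(touches(mask, (0, 10, 0, 10)))     # empty corner -> False
```

The systems described here layered 2D skeleton tracking and richer object responses on top of a basic "does the silhouette overlap the object" test like this one.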
2017 – Mandala 2.0
The system now uses the Unity 3D graphics engine to create high-resolution 3D virtual worlds in millions of colors, together with the 3D depth gesture recognition software they invented. The system can be installed on walls, windows, counters and floors, or run on a single flat screen or a video wall, with the ability for multiple people to use it simultaneously. It is currently being integrated with true 3D sound.
DATE – 2017
DISCIPLINE – Art
MEDIUM – Interactive video installation
STATUS – In production for display in 2019
FUNDING – Ontario Arts Council, Media Arts
INTERSECTIONS is an interactive video installation that documents the transformation the areas around the former Berlin Wall have gone through from 1988 to 2009 to 2017, morphing between time periods depending on the motion and location of the audience. The 3.6-meter-high by 80-meter-long piece uses cameras and computer software to track the location of people in front of it, allowing the audience to shift the images shown on the 14 video projectors accordingly. The original 675-image cubist panoramas that form the image set were shot before the wall came down in 1988, using a unique 5-Polaroid-camera system of my own invention that created panoramic imagery. The second set of images was shot in 2009, using the same panoramic camera rig as in 1988 but fitted with 5 digital cameras, re-photographing exactly the same locations. The final set of images will be shot in 2017 and will be used as the third set for an exhibition in 2019 to celebrate the 30-year anniversary of the fall of the wall and the transformation of Berlin.
The piece is designed to convey a rich, complex impression of the enormous physical and geopolitical transformations the area around the wall has undergone over the last 20 years by allowing people to literally “walk in and out of time”. By giving viewers the ability to explore history with their bodies as well as their eyes, the piece aims to create an image that becomes alive, reacting to the movement of the viewers to project a sense of the multifaceted history the area holds. Just as history is constantly in motion, re-evaluated by each new generation, INTERSECTIONS takes static images from the past and transforms them into a constantly morphing vista of the present.
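The time-morphing behaviour described above can be thought of as computing blend weights over the three image sets from a tracked viewer position. The following is a minimal sketch, assuming a normalized position t in [0, 1] mapped linearly across the eras (a hypothetical simplification, not the installation's actual software):

```python
ERAS = ("1988", "2009", "2017")

def era_weights(t):
    """Map a tracked position t in [0, 1] to blend weights over the
    three image sets, linearly cross-fading between adjacent eras."""
    t = min(max(t, 0.0), 1.0)
    s = t * (len(ERAS) - 1)          # 0..2 across the three eras
    i = min(int(s), len(ERAS) - 2)   # index of the earlier era
    frac = s - i
    w = [0.0] * len(ERAS)
    w[i] = 1.0 - frac
    w[i + 1] = frac
    return w

def morph(images, t):
    """Weighted blend of the per-era frames for one projector."""
    return sum(wi * img for wi, img in zip(era_weights(t), images))

print(era_weights(0.0))   # [1.0, 0.0, 0.0]  -> pure 1988
print(era_weights(0.5))   # [0.0, 1.0, 0.0]  -> pure 2009
print(era_weights(0.75))  # [0.0, 0.5, 0.5]  -> halfway between 2009 and 2017
```

In a full installation the tracked position of each visitor would drive a t value per projector, so different parts of the 80-meter image could sit in different decades at once.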
1) 1988 Polaroids – The first set of images is 600 separate images that make up 3 panoramic murals I shot of the Berlin Wall area in 1988 with 5 Polaroid cameras and a custom dolly. The cameras recorded 360-degree scans of the environment as the dolly was moved forward in space, and a clock was mounted on the dolly to record the passage of time.
“Checkpoint” – 150 images total – 30 images long – 2.5 panoramic scans
The “Checkpoint” image was shot at Checkpoint Charlie, the symbolic Cold War focal point, which was still extremely dangerous in 1988. The East German border guards were very unhappy with my project; they not only tried their best to stay out of my photograph (unsuccessfully) but also tried to arrest me numerous times as I crossed the white line that marked the exact border, forcing them to cross the cameras’ field of view and become part of the image.
“Door” – 300 images total – 60 images long – 5 panoramic scans
The “Door” image is the longest and most complex of the images, taking in 5 complete panoramic scans in one long continuous movement that involved shooting 300 Polaroid images in sequence. The image moves 6 meters from inside a hallway, beside a beautiful old door, across a street to within centimeters of the Berlin Wall.
“Bridge” – 150 images total – 30 images long – 2.5 panoramic scans
The “Bridge” image is similar in nature to the “Door” image, as it involves not only a series of 360-degree scans of the environment but also a simultaneous linear dolly move. The image was shot on a railway bridge that became useless after the wall was erected and is now part of the S-Bahn commuter rail system, close to the Köllnische Heide station.
2) 2009 Digital images – With the help of a Canada Council Media Arts grant I traveled back to Berlin in the fall of 2009 and reshot the Checkpoint Charlie images from exactly the same positions and angles, using an adaptation of the original camera rig fitted with 5 digital cameras. These contemporary high-resolution images were morphed with the original 1988 images and displayed at Transmediale in Berlin in 2010, allowing the audience to interact with the landscape using their bodies as input devices, moving through time and space to reveal a dramatically changed city in which the wall is gone and 20 years of peace have transformed Berlin.
3) Berlin Wall – I photographed the remaining parts of the Berlin Wall with a digital SLR camera and created a composite image of the structure in Photoshop. This image is displayed when people are close to the projection; the recreation is very realistic because the wall is essentially a 2-dimensional concrete surface that projects convincingly in this way.
4) 2017/18 Digital Images – A final set of images will be shot in 2017/18 and will form the basis of an exhibition in 2019 celebrating the 30-year anniversary of the fall of the wall and the transformation of the “death strip” into its current manifestation as an example of modern urban renewal.
DATE – 2017
DISCIPLINE – Art
MEDIUM – Telepresence, VR and robotic art retrospective
STATUS – In development for exhibition in the Fall of 2017
FUNDING: Canada Council, Media Arts
“Sensing Presence” is a project to restore 3 early media artworks which are historically significant as the world’s first art installation to use a Head Mounted Display (1983, Displaced Perspective), the first telepresence robotic sculpture (1986, Displaced Perspectives) and the first virtual reality panoramic video exhibition (1989, Videosphere). The second part of the project involves constructing 2.0 versions of these early works using current technology, creating an exhibition that explores both the historical roots and the future potential of virtual artistic expression.
As an artist I have been exhibiting artworks using virtual reality, telepresence robotics and panoramic video imaging since the early 1980s, and have been credited with exhibiting the world’s first artworks in these fields. This project will restore the early works and create updated versions using current technology. It is a project close to my heart, as these works were exhibited at the very start of my artistic career; now, 33 years later, I feel I have developed a unique perspective on where these new media evolved from, the impact they have had on society and the impact they will have in the future.
“Sensing Presence” is a project that recreates the world’s first Head Mounted Display (HMD) virtual reality and telepresence robotic artworks, which I created between 1983 and 1989, in an exhibit that will feature both the original works and new versions based on the original ideas but using 21st-century technology. These early VR and telepresence works are regarded historically as the world’s first artworks using HMDs, telepresence robots and panoramic imagery in virtual reality environments; displayed beside the updated 2.0 versions of each piece, they will provide the audience with a historical reference for current projects now being created around the world.
ORIGINAL WORK 2.0 VERSION
1983 – Displaced Perspective 2017 – Displaced Perspective 2.0
1986 – Displaced Perspectives 2011 – Displaced Perspectives 2.0
1989 – Videosphere 2017 – Nanocopter
1983 – DISPLACED PERSPECTIVE
Graduation exhibition at the Ontario College of Art
Immersive Head Mounted Display & live camera video installation
As part of my graduation exhibition at the Ontario College of Art I displayed the world’s first head mounted display artwork, called Displaced Perspective. The piece used a live camera and a custom-built pair of video glasses holding two small black and white TVs, suspended via counterweights from a movable boom on the ceiling. When viewers put on the video glasses they saw themselves from a different perspective, 2 m away, looking back at themselves with the glasses on, which provided a remote point of view of their own body image and kinesthetic feedback as they moved. A second camera displayed a prerecorded version of the scene, with people moving through and looking around, which created a strange déjà vu feeling, as viewers did not know what was real and what was live. This produced a powerful sensation of telepresence and a new perspective on what perspective was.
1986 – DISPLACED PERSPECTIVES
Strategic Arts Initiative exhibition at Arc in Toronto & U of Salerno in Italy
Telepresence robotic sculptures linking Toronto to Salerno
1989 – VIDEOSPHERE
SIGGRAPH exhibition in Las Vegas
Virtual reality panoramic video artwork using panoramic imagery
In the 1980s I developed a series of panoramic imaging artworks using custom-built camera rigs that I invented, and displayed these internationally in North America and Europe. My panoramic film Frame to Frame, shot in 1986, was one of these projects and continues to this day. The Nanosphere project, developed with the help of the National Research Council, explored the sensation of scale and presence through artwork. My series Funny Farm, which used an adapted interactive microfiche, was another piece, displayed in Montréal in 1992 using an early form of panoramic display similar to what later became Google Street View.
This is a new artwork called NANOCOPTER that will allow an audience to shrink their perspective and literally enter the 360-degree world of the microscope, like a character in the film “Fantastic Voyage”. This transformation is designed to alter people’s perception of the nature of the world around them and to extend our understanding of the nanoscale ecosystem that envelops us but of which we are oblivious. By creating an experience that allows people to physically enter the microscopic world in a 360-degree panoramic VR environment, NANOCOPTER will result in an artwork that pushes both esthetic and technical boundaries and opens up a new frontier of artistic potential.
Our ability to see and explore the world around us is critical in providing us with a perspective on our environment and the ability to comprehend the interactions that surround us. When viewing the microscopic world through microscopes we are given a window into this world, but only from the perspective of looking down on it, as if from an aircraft or satellite. We can’t truly enter the world of the ant or the grass, to see it from that scale and move within that environment, rather than simply observing it from above or through the video eye of an endoscope-type probe. Our ability to fully understand and relate to the microscopic world that surrounds and defines us is limited by this inability to see and interact with it in a natural and human way. The Nanocopter project is designed to overcome this limitation and, for the first time, “shrink” people down to a microscopic level using VR display systems like the Oculus Rift and my own invention, the world’s first microscopic panoramic camera.
HISTORY OF THE PROJECT
The history of my artwork using virtual reality Head Mounted Displays (HMDs) and panoramic imaging systems like Google Street View dates back to the early 1980s; these were the world’s first artworks to use such display technologies. In 1983, as my graduation project at the Ontario College of Art, I displayed the world’s first HMD-based artwork, DISPLACED PERSPECTIVE, which used a custom-built HMD that I fabricated from 2 small camera eyepieces linked to a live camera. In the installation the viewer wore the HMD and saw themselves from a live perspective 2 meters away. The second eyepiece in the HMD displayed the same image perspective but pre-recorded, which created a complex experience for the viewer, as it was not obvious what or who in the image was real or recorded. The kinesthetic feeling of seeing yourself live from a different viewpoint was very powerful and created a strong sensation of telepresence.
In the mid-1980s I developed a panoramic imaging system that recorded the entire 360-degree by 180-degree image sphere, displaying all possible perspectives in one image made up of 432 separately shot images. I displayed these images in numerous exhibitions in Canada and Europe, and also found that I could cut the images into gore-like strips and place them onto a physical globe, which undistorted the image back into a conventional perspective. Looking at the top of the sphere recreated a view of looking straight up, and the 360-degree circumference gave a horizontal panning perspective.
In the late 1980s I experimented with a system that used a mirrored half-sphere as a lens to record motion panoramic imagery for an early virtual reality headset system. This worked very well, and numerous companies now use the concept to record imagery, with various versions available such as ????.com. As part of these experiments I used a tiny ball bearing that I had vapor-coated with a mirrored reflective coating, creating a very small mirrored sphere like the larger ones I had first tested.
I then placed the 1 mm mirrored sphere on the end of a pin, set it in a slide tray and positioned a microscope overhead. I shot a series of images with various types of very small-scale objects in the slide tray, photographing the sphere so that it filled the microscope’s field of view. The resulting images were microscopic-scale panoramic images taking in the full 360-degree environment. My idea back in 1989 was to use computer algorithms to unwrap the images and display them to people using head mounted displays, but this proved impossible at the time due to the limitations of the computers then available, as well as my lack of computer programming skills.
With the exponential increase in computer power, the availability of low-cost HMDs like the Oculus Rift, and the development of panoramic software capable of transforming spherical imagery in real time, my early experiment of 26 years ago has become possible for the first time. My proposal is for a piece called NANOCOPTER, in which people can sit in a chair, put on an HMD and suddenly become immersed within the nano world as a live panoramic eye. By looking around, people can see the tiny creatures that inhabit this environment from their own perspective, and move around within it by using a joystick to control the slide specimen tray. In many ways it will be similar to actually taking a “FANTASTIC VOYAGE”: because of the live nature of the imagery and the interactive capabilities of the system, people will for the first time experience nano-scale life forms as if they were human-sized. This shift in perspective will give users a new outlook on the world and transform their understanding of the invisible world that surrounds us.
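The unwrapping of a mirror-ball photograph into viewing directions can be sketched with simple reflection geometry. A minimal Python sketch, assuming an orthographic photograph of the mirrored sphere with the camera looking along -z (a common simplification; the function name and conventions here are my own, not taken from the original system):

```python
import math

def mirrorball_to_angles(x, y):
    """Map a point on a mirrored-sphere photo (coordinates normalised
    so the sphere's rim is at radius 1) to the azimuth/elevation of the
    environment direction it reflects. Assumes an orthographic view
    with the camera looking along -z."""
    r2 = x * x + y * y
    if r2 > 1.0:
        raise ValueError("point lies outside the sphere")
    nz = math.sqrt(1.0 - r2)            # surface normal is (x, y, nz)
    # reflect the incoming view ray d = (0, 0, -1) about the normal
    rx, ry, rz = 2 * x * nz, 2 * y * nz, 2 * nz * nz - 1
    azimuth = math.atan2(rx, rz)        # 0 = straight back toward the camera
    elevation = math.asin(max(-1.0, min(1.0, ry)))
    return azimuth, elevation
```

Building a full equirectangular panorama would invert this mapping for every output pixel and resample the source photograph accordingly; the sphere's rim corresponds to the directions pointing directly away from the camera.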
“Froken MOBI” is a robotic sculpture installed as a permanent exhibit at the Vitenfabrikken Science Centre in Sandnes, Norway. The robot lives in her own custom-built house in the main lobby and comes out to meet guests twice a day. “Froken MOBI” is a remotely controlled telepresence robot, operated from a distance over a wireless Internet connection by a human operator who can see through her eye, listen via her microphone and speak through her speakers. The operator could be located in the same building or on the other side of the world.
The robot “Froken MOBI” is a further development of this cooperation, in which art and technology, combined with dialogue and communication with children, are central. In the summer of 2015 she moved into Casa Mobius, a small house that stands in the cafe area at the Science Center. She originally comes from the Netherlands and is the youngest member of the large robot family that Graham Smith has created. “Froken MOBI” both lives and works at the Science Center. Her main job is to greet and chat with guests, preferably children. She also tries to keep up with what is happening at the Science Center so she can pass it on to visitors. At other times she must rest a little in her house and recharge.
“Froken MOBI” has her own Facebook page, where she tells a little about life at the Science Center.
DATE – 2015
DISCIPLINE – Art
MEDIUM – Kinetic sculpture
STATUS – Displayed at Delft Sculpture Park from 2014 to 2015
“Undulations” is a solar-powered kinetic sculpture that was installed at the Delft Sculpture Park in the Netherlands from September 2014 to February 2015. The piece reacts to its environment like a plant in nature, storing solar energy and discharging it as motion into its surroundings. Its limbs hold clusters of solar cells that accumulate power, which is released as a wavelike pulse into a 10m tall spring that hangs vertically like a stainless-steel stem. During sunny periods a mechanism creates waveforms in the hanging spring at quick intervals; during cloudy periods the cycle of waves slows down. At night the piece stops completely and goes to sleep, and at sunrise it awakens with the first light that re-energizes the system. The piece represents a balanced technological ecosystem that takes in and releases levels of energy compatible with its habitat. “Undulations” was funded through a Media Arts grant from the Canada Council for the Arts.
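The light-driven rhythm described above — frequent waves in sun, slower waves under cloud, sleep at night — can be illustrated with a toy function. The thresholds and interval range below are illustrative assumptions, not values from the sculpture’s actual control system:

```python
def pulse_interval(light_level, min_interval=5.0, max_interval=60.0):
    """Return seconds between wave pulses for a light level in [0, 1].
    Returns None (sleep) when there is effectively no light.
    All numbers are illustrative, not taken from the sculpture."""
    if light_level <= 0.05:          # night: the piece goes to sleep
        return None
    # brighter light -> shorter interval -> more frequent waves
    return max_interval - light_level * (max_interval - min_interval)
```

Full sun gives a pulse every 5 seconds, half light roughly every half minute, and darkness suspends the cycle entirely, mirroring the sleep/wake behaviour the piece exhibits.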
DATE – 2014
DISCIPLINE – Art
MEDIUM – Telepresence robotic sculpture
STATUS – Exhibited from July to August in 2014
MOBI Jr 2.0 was exhibited at the Vitenfabrikken Science Center in Sandnes, Norway in July and August 2014. The telepresence robot was controlled by visitors, who used a haptic step-pad interface to see through the robot’s eye and control its movement through the exhibition hall.
DATE – 2013
DISCIPLINE – Art
What is the individual’s role in society when confronted with a duplicity so horrifying and enormous that it impacts the very nature of social reality? This is the question Reichstagsbrand 2.0 asks, as it confronts viewers with a choice: continue to accept the obvious deception of 9/11, or resolve to alter the situation by whatever means possible and confront it in all its manifestations.
September 11th 2013
Berthelsdorfer Strasse 12, 12043, Berlin
September 11th – 14th 2013
September 11th 19:00 Vernissage
12th – 14th 15:00 – 19:00
DATE – 2012
DISCIPLINE – Art
MEDIUM – Interactive LED fabric done in collaboration with fashion designer Anouk Wipprecht
STATUS – Exhibited at Technosensual in Vienna, Austria
The “Chameleon Dress” is a joint project by Graham Smith and Anouk Wipprecht: an interactive fashion initiative to create a new form of wearable clothing that changes color in relation to the environment it is placed in. The dress utilizes advanced LED, camera and fiber-optic technologies, combined in new ways to redefine how fabric interacts with its surroundings. The first version of the Chameleon Dress will use a live camera and a miniature battery-powered LED beamer connected to a bundle of 300 fiber-optic cables, which will transmit the color sensation of the environment into the fabric. The effect will mimic the way a real chameleon reacts to its environment, and will redefine the concept of what color something is, as the garment constantly changes depending on where it is placed.
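As a rough illustration of the camera-to-beamer step, a sketch that reduces a camera frame to one drive colour by averaging its pixels; the function and approach are assumptions for illustration, not the project’s actual pipeline:

```python
def average_color(pixels):
    """Average an iterable of (r, g, b) tuples into a single colour.
    A stand-in for however the dress derives the beamer's drive colour
    from the camera feed (the real pipeline is not documented here)."""
    pixels = list(pixels)
    if not pixels:
        raise ValueError("no pixels to average")
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))
```

The resulting colour would be fed to the LED beamer, whose light the 300 fiber-optic cables then distribute across the fabric.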