HELIX


LOCATION – Redwood Theatre
1300 Gerrard St E, Toronto, ON M4L 1V7
DATE – Saturday Sept. 17, 2022, Noon – 9pm
Sunday Sept. 18, 2022, Noon – 6pm
September 21-24 – By appointment (437) 249-1137
DISCIPLINE – Art
MEDIUM – Video/Photo Installation
Writer, Producer and Director – Graham Smith
Editor and Cinematographer – Chris Terry
Media Artist – Jeff Mann
First Nations Consultant – Philip Cote
Video Documentation – Dave Brunning
Assistant Camera – Ron Hewitt
Production Co-ordinator – Tony Tobias
Engineering Consultant – Gerd Kurz
Electronics – Doug Back
Metal Fabrication – James Maxwell
Photo Documentation – Miki Mervaslov
Assistant Editor – Samantha Chung Sang
Web Design – Steev Morgan

The world exists without the construct of a frame: the spherical, 360-by-180-degree bubble of all possible perspectives that defines a point in space is in essence “borderless”. It is the constraint of human vision, which limits our view to only a portion of this “sphere in space”, that defines a frame. As humans we look all around us over time, scanning our environment as a series of “frames” that lets us build up an understanding of the world around us and navigate through it. It is this element of the human experience, the limits of our visual construct and the integration of time into our perceptual understanding of the world, that lies at the heart of HELIX, a piece that expands the human perception of panoramic space.
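The “portion of the sphere” idea can be made concrete with the standard solid-angle formula for a rectangular field of view, Ω = 4·arcsin(sin(h/2)·sin(v/2)), out of the full sphere’s 4π steradians. The sketch below is illustrative only; the 120-degree figure is an assumption, not a value from the text.

```python
import math

def fov_fraction(h_deg: float, v_deg: float) -> float:
    """Fraction of the full 360-by-180-degree sphere covered by a
    rectangular field of view of h_deg x v_deg, using the solid-angle
    formula omega = 4 * arcsin(sin(h/2) * sin(v/2))."""
    h = math.radians(h_deg)
    v = math.radians(v_deg)
    omega = 4 * math.asin(math.sin(h / 2) * math.sin(v / 2))
    return omega / (4 * math.pi)  # full sphere = 4*pi steradians

# Illustrative value only: a roughly 120 x 120 degree binocular gaze
# frames only about a quarter of the "borderless" bubble.
print(f"{fov_fraction(120, 120):.2f}")  # → 0.27
```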

Übergang

LOCATION – Zimmerstrasse 23, 10969, Berlin, Germany
DATE – Nov. 9th, 2019 – From 9:00 until 22:00
DISCIPLINE – Art
MEDIUM – Video/Photo Installation
THANKS TO – Ontario Arts Council – Media Arts
The Canada Council – Photography
ThoughtWorks Berlin – Exhibition

Übergang was first presented at ThoughtWorks in Berlin, November 2019.
Announcements: Art in Berlin (in German), ThoughtWorks

Time is measured not only in segments called seconds, minutes, hours, days and years, but also in 30-year periods called generations, which relate to human maturation from vulnerable newborn to independent adult. In many ways this cycle forms the basis of human interpersonal interaction, as it bonds people within a delineated population who have experienced the same significant events within a given period of time. “Übergang” is an interactive video/photo-based artwork that explores this generational interconnection of time and space as it relates to the fracturing of Germany by the construction of the Berlin Wall in 1961 and the nation’s eventual reintegration after 1989, when the barrier was demolished. The work focuses both on the global experience of the passage of “meta time”, as defined by 30-year generational periods, and on the more immediate, personal movement of the physical body through space that defines the human experience of time in “real space”. “Übergang” is designed to convey a rich, complex impression of the enormous physical and geopolitical transformations the area around the wall has undergone and is going through, allowing people to literally walk in and out of time periods by using their bodies as input devices. The piece aims to create an exhibition environment that is alive for viewers, reacting to the movement of the people exploring it, taking the static images from the past and transforming them into a complex vista of the currently unnamed era that is still being defined.

Übergang is made up of 3 interrelated components…

Übergang #1 – An interactive video installation projected onto 2 opposing walls, each 2.5 m high and 3.6 m wide, that allows people to alter both the physical movement of the camera perspective and the time period they are viewing. It works like an interactive time portal for exploring the area around “Checkpoint Charlie” between 1988 and 2019, using a combination of original panoramic imagery shot meters from the exhibition space and life-sized photos of the actual Berlin Wall. Viewers move their perspective forwards or backwards in both time and space using a simple “step pad” interface that is intuitive to use as well as wheelchair friendly, so that everyone who visits the exhibition can interact with the piece regardless of age or physical challenge.

Übergang #2 – An interactive photograph, 1 m high by 3.6 m long, that displays dual perspectives shot in 1988 and 2009 at Checkpoint Charlie. It uses the “Tabula Scalata” faceted-image technique, in which 2 photographs are displayed simultaneously in vertical wedge-shaped triangular sections, allowing viewers to shift their visual perspective from one time period to the next as they move back and forth along the image. It is a simple yet effective way to display 2 time periods at once, using an ancient imaging technique to bring to life the transformation the area only 30 meters away has undergone since the fall of the wall.
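As a rough sketch of the column bookkeeping behind the Tabula Scalata effect: two images are cut into vertical strips and alternated, so that each viewing position favours one of the two source pictures. The real piece mounts the strips on angled wedge facets; this flat interleave, with hypothetical names and data, only illustrates the alternating-strip idea.

```python
# Image data is modelled as a list of columns, each column a list of
# pixel values. Strips of `strip_width` columns alternate between the
# two source images: A-strip, B-strip, A-strip, ...

def interleave_columns(image_a, image_b, strip_width=1):
    """Alternate vertical strips from two equally wide images."""
    assert len(image_a) == len(image_b), "images must be the same width"
    result = []
    for start in range(0, len(image_a), strip_width):
        source = image_a if (start // strip_width) % 2 == 0 else image_b
        result.extend(source[start:start + strip_width])
    return result

# Two tiny 4-column "images", one per time period.
old = [["1988"]] * 4
new = [["2009"]] * 4
print(interleave_columns(old, new))  # columns alternate between years
```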

Übergang #3 – The central sculptural element of the work is the panoramic machine invented by the artist to shoot the work in both 1988 and 2019, which uses 5 Polaroid cameras on a rotating, wheeled base with a clock attached to record the passage of time. This portion of the display anchors the work to its origins, as the original imagery was shot just meters from the exhibition space at ThoughtWorks, and adds a physical, 3-dimensional quality that references the origins of the work itself. It features the 5 Polaroid cameras used to shoot the original work in 1988, offering a historical facet to the piece and functioning as both iconic design element and imaging tool.

ARTISTS


Graham Smith

Graham Smith is the lead artist responsible for the entire project.

Jeff Mann

Jeff Mann is a media artist responsible for interaction and interface design, software development, and electronics for the installation.


Joshua Tzventarny is a co-artist who has been involved in the development of Übergang #2.

Gallery exhibition views – ThoughtWorks Berlin, November 2019:

ThoughtWorks Opening Night
Photo credit: Jeff Mann


 

HUBOT MOBI

DATE – 2017-2018
DISCIPLINE – Art
MEDIUM – Robotic Installation
STATUS – HUBOT MOBI, Dutch Design Week, Eindhoven, The Netherlands
WEBLINKS – https://www.nextnature.net/projects/hubot/

The HUBOT MOBI robots were created for the NEXT NATURE exhibition to allow people to interact within the exhibition and be guided through the exhibits. The 2 robots feature motorized bases, LCD touch-screen interfaces, cameras, speakers, videoconferencing software and printers that give users the results of their feedback.

 

Presence Fields

Displaced Perspective - Graham Smith - 1983 VRTO Conference June 24-26, 2017 Ryerson University, Rogers Communications Centre 80 Gould St., Toronto, Ontario, Canada

DATE – June 25, 26, 2017
LOCATION – Ryerson University, Rogers Communications Centre,
80 Gould Street, Toronto, Ontario, Canada
DISCIPLINE – Interactive art
MEDIUM – Telepresence HMD
STATUS – Confirmed
WEBLINK – https://conference.virtualreality.to
https://keram-malicki-sanche.ongoodbits.com/2017/05/02/vrto-newsletter-no-50-giants
FUNDING – Canada Council, Media Arts

“The artist picks up the message of cultural and technological challenge decades before its transforming impact occurs.” – Understanding Media: The Extensions of Man – Marshall McLuhan © 1964

David Rokeby – 1983: Very Nervous System; 2016: Hand-held
Graham Smith – 1983: Displaced Perspective; 2017: Displaced Perspective 2.0
Vincent John Vincent – 1986: Mandala; 2017: Mandala 2.0

What happens to presence when it is subjected to screens, to total surround, to telepresence, to interactivity, to simultaneous access to and from geographically separate other presences? Probing precisely this dimension, 3 international media artists from Toronto revisit 30 years of experimentation and achievements.

Featuring 3 of the world’s first interactive Virtual Reality artworks, “Presence Fields” is an exhibition by Toronto artists David Rokeby, Graham Smith and Vincent John Vincent that explores the possible message behind the new medium of Virtual Reality. First created by the artists in the early 1980s, the exhibit parallels these works with 3 of their current works in the same field of artistic exploration. The exhibit is a manifestation of McLuhan’s concepts that artists have a historical role as technological pioneers and that avant-garde artworks can be important “markers” that help define how a new medium impacts society.

Derrick de Kerckhove
Director of the McLuhan Program in Culture and Technology, University of Toronto, 1983 to 2008


“I think of art, at its most significant, as a DEW line, a Distant Early Warning system that can always be relied on to tell the old culture what is beginning to happen to it” – Marshall McLuhan

Presence Fields interrogates the meaning and the experience of presence in different configurations and situations as they are affected by technology. Telepresence comes to mind first because it is the most obvious addition introduced by new technologies, from the time of radio and the telephone, but improving every decade all the way from voice, image and movement transmission to touch and 3D today. Presence, however, means more than “being there”.

Presence is a complex feeling, rich in variations, closely related to touch, affecting emotions in both positive and negative ways. A close cousin to the feeling of intimacy, the sense of presence reaches deep into human sensibility. It is also a physical paradox, because it requires the object to be there (delusions of all kinds excepted), yet that object can be out of sensorial reach and reside legitimately only in imagination.

What happens to presence when it is subjected to screens, to total surround, to telepresence, to interactivity, to simultaneous access to and from geographically separate other presences, etc.? Probing precisely this dimension, three international media artists from Toronto revisit 30 years of experimentation and achievements.

David Rokeby, for example, has been testing the sensations of touch and presence, first creating a new perception of space itself by involving the whole body in the production of sound. In Very Nervous System, the specific sound produced by specific gestures in space reveals in real time a continuity of my presence in and with space itself. It gives me a new sense of my own presence. The VNS version presented in Rotterdam in 2011 also uses the combination of bodily movement and sound, but in a very different way: instead of producing sound as would a musical instrument, VNS2 allows two or more persons to enter into immediate contact with each other via sound, as the body of one runs virtually into that of the other on the other side of the ocean. The real body and the virtual lock into a paradoxical and sonorous embrace. The presential power of that installation is extraordinary. Hand-held, the aptly named installation of an invisible sculpture revealed by touch alone, has an area where the user’s hand touches other hands in the sculpture, strongly evoking the tactile experience. This is virtual presence par excellence.

Graham Smith’s Displaced Perspective 2.0 adds a significant feature over the 1.0 version of 1983: it allows the user to experience bodily the space on the other side of the telepresence transportation via the pool table’s gaming surface. Smith quite literally wants to be in two places at once. In William Gibson’s 1984 best-seller, Neuromancer, there is mention of a program called the “Rider” that allows a physical person to tele-operate a virtual projection into a real but inhospitable space so as to explore and manage it without danger. His fascination with tele-robotics has led Smith since the early 1990s precisely in that direction, which is now rapidly turning into an industry. It is worth noting that Smith’s other dominant, not to say obsessive, fascination is now as then the experience and transmission of total-surround, real-time 360-degree video and photography.

By going 3D and high definition, the Mandala 2.0 is also a big improvement over the first edition. It also invites the user to occupy another space, this one not real, but virtual and allowing nevertheless complex interactions with objects that respond in movement, sound and texture, as real ones would. Vincent John Vincent is offering a new level of depth-of-field interactions that permits one to experience vertigo in total safety.

The experiences offered to the patron in Presence Fields thus tend to underline the user’s presence as much as the sense or intuition of the presence of another. Their value is authentic in that it is not enough to read about them or watch them on Vimeo, because they only reveal their meaning by the experience itself.

David Rokeby, Graham Smith and Vincent John Vincent met at or via the McLuhan Program in Culture and Technology, at the University of Toronto. Graham Smith was a resident artist of the Program. He created there the Virtual Reality Artist Access Program (VRAAP), an offshoot of the weekly “artist-engineers” seminar series I had created and run at the Program since 1983, precisely. I was deeply inspired by McLuhan’s personal injunctions to privilege the artist if I really wanted to know anything really useful about my own time. He was fond of quoting Wyndham Lewis’s statement that “The artist is always engaged in writing a detailed history of the future because he is the only person aware of the nature of the present.” The series hosted many other Toronto and international artists, such as Norman White, Timothy Leary, Joe Davis, Stelarc, Warren Rudd, Kit Galloway, Sherrie Rabinowitz and many others who fed into an informal network of pioneers. These artists didn’t think of themselves as pioneers at the time. They just did it, whatever it was they did. It is only looking back that one realizes they were almost alone in experimenting with various aspects of presence that today we take for granted and that have become the norm.

Most of them became very successful, the three artists featured in Presence Fields being international stars. All three were involved in a dozen events initiated by the McLuhan Program, among which were two transatlantic videoconferences for artistic performances: the first, Strategic Arts Initiatives (1986), between Salerno in Italy and Toronto, and later between Paris and Toronto; the second, Les Transinteractifs/Transinteractivity (1988), between Toronto, Salerno and Paris. Doug Back (in Toronto, at the Art Culture Resources Centre) and Norman White (in Salerno) performed the world’s first transatlantic arm-wrestling match, each one weighing on a steel rod haptically connected to the other city by a pressure-measuring and activating motor on each side, with very low-speed modems feeding that pressure across the ocean in streams of digital bits. They repeated the performance in 2011 from V2_ in Rotterdam to InterAccess in Toronto, again brought together by Graham Smith.

Did they form a “school”? Not really. If there is a leadership, very loose and unconstraining, it could be attributed to Graham Smith, who has done the most, including this event, to bring the others together. One thing they all share is a genuine respect for Marshall McLuhan, a respect that the Canadian prophet malgré lui reciprocated. “The artist, he wrote, is the man in any field, scientific or humanistic, who grasps the implications of his actions and of new knowledge in his own time. He is the man of integral awareness.”

By Derrick de Kerckhove, June 12, 2017

David Rokeby
www.davidrokeby.com
David Rokeby is a Toronto-based artist who works with a variety of digital media to critically explore the impacts these media are having on contemporary human lives. His early work “Very Nervous System” (1982-1991) was a pioneering work of interactive art, translating physical gestures into real-time interactive sound environments. He has exhibited and lectured extensively internationally and has received numerous international awards including a Governor General’s Award in Visual and Media Arts (2002), a Prix Ars Electronica Golden Nica for Interactive Art (2002), and a British Academy of Film and Television Arts “BAFTA” award in Interactive art (2000).

1983 – Very Nervous System (1991 version)
“Very Nervous System” was one of the first interactive artworks to present a compellingly visceral and immersive interactive audio experience. The 1983 version used custom cameras, an Apple ][ computer, a tape loop, an analogue synthesizer and a custom sound distribution system to create a responsive sound environment that translated movement into sound in real time. The project was under development from 1981 to 1991, evolving as the technologies of personal computing, video sensing and sound synthesis evolved. It has been presented widely internationally, including at the Venice Biennale in 1986.

2016 – Hand-held (Documentation)
“Hand-held”, presented at Nuit Blanche Toronto this past October, creates an invisible 3-dimensional sound and image construct suspended in the exhibition space. This “sculpture of possibilities” reveals itself on the hands of the interacting public as they explore the installation.

Graham Smith
grahamthomassmith.com
Graham Smith is a pioneer in the area of VR having exhibited the world’s first Head Mounted Display artwork in 1983, the first trans-Atlantic telepresence robotic artwork in 1986 and the first use of panoramic imagery in a VR environment in 1988. He is Chief Science Officer at the Dutch company Webchair and a PhD student at University College Dublin where he is researching the positive effects telepresence technology can have to help re-integrate autistic children back into school environments in a Ryerson University based research project called WEBMOTI. He is currently completing 2 major interactive VR artworks called “Sensing Presence” and “Intersections” which will be exhibited in 2018.

1983 – Displaced Perspective
“Displaced Perspective” is the world’s first Head Mounted Display artwork. It uses a live camera and a custom-built HMD holding two displays, suspended via counterweights from a movable boom on the ceiling. When viewers put on the HMD they see themselves from a different perspective, 2 meters away, looking back at themselves, which provides them with a remote point of view of their body image and kinesthetic feedback as they move around the room. A second, recorded camera perspective is also displayed, creating a déjà vu-like sensation for viewers who do not know what is real and what was recorded. This alteration of people’s perception of real versus recorded space created a powerful sensation of telepresence for the audience and generated a new perspective on perspective.

2017 – Displaced Perspective 2.0
“Displaced Perspective 2.0” is a telepresence installation that allows people to play a physical game of pool with someone in another location. It is part of the Cohesion Cafés initiative, in collaboration with Dr Martine Delfos, which aims to create a global network of cafés that connect people in a natural setting that feels like a local café. Unlike Skype or other videoconferencing systems, the life-sized imagery and display technologies used project a realistic sensation of human presence, allowing the audience to become immersed within the linked environments and creating a new form of “environmental telepresence” for the audience.

Vincent John Vincent
http://www.vjvincent.com
Vincent John Vincent is a Toronto-based artist, inventor, and entrepreneur who co-invented video gesture control technology with his partner, Francis McDougall, starting back in 1985. They have been the world leaders in forging and establishing the application and market for the technology through their company, GestureTek, winning countless awards and picking up over 45 patents along the way. Vincent toured the globe playing music on virtual instruments as audiences went on a creative journey while he danced through cyberspace. The Smithsonian Institution in D.C. is just one of 9,500 installations worldwide. Vincent has received a Lifetime Achievement Award from the Canadian New Media Association (1995), was inducted into the Digifest Digital Pioneer Hall of Fame (2012), and has been given the “Meet the Media Guru” award in Milan, Italy.

1986 – Mandala
The initial video gesture control system, known as the “Mandala Virtual World System”, was written on an Amiga computer and later moved to a 486-based PC. A standard color camera placed your live video image on the screen, inside the virtual world, where it responded to gesture interaction. It had a full 2D skeleton-tracking system and let the user touch, push, pick up, bounce or throw an animated object. In the PC version the video image was digital and could shrink or grow in the 3D worlds. Users leaned right, left, up or down to move through the virtual worlds. Their company, GestureTek, licensed their technology and patents to Sony for the EyeToy/Move and to Microsoft for the Kinect.
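The touch-and-push interaction described above can be sketched as a simple collision test: the user’s on-screen image is reduced to tracked 2D points, and an animated object reacts when any point enters its bounding box. All names and numbers below are illustrative assumptions; the actual Mandala software is not documented here.

```python
# Toy sketch: does any tracked skeleton point fall inside an object's
# axis-aligned bounding box (x0, y0, x1, y1)?

def hits_object(skeleton_points, box):
    """True if any tracked (x, y) point lies inside the box."""
    x0, y0, x1, y1 = box
    return any(x0 <= x <= x1 and y0 <= y <= y1 for x, y in skeleton_points)

ball = (100, 50, 140, 90)           # an animated object on screen
hand_near = [(120, 70), (10, 10)]   # one tracked point inside the ball
hand_far = [(10, 10), (20, 30)]     # no point inside

print(hits_object(hand_near, ball), hits_object(hand_far, ball))  # → True False
```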

2017 – Mandala 2.0
The system now uses the Unity 3D graphics engine to create high-resolution 3D virtual worlds in millions of colors, together with the 3D depth-gesture-recognition software they invented. The system can be installed on walls, windows, counters and floors, or run on a single flat screen or a video wall, with the ability for multiple people to use it simultaneously, and it is currently being integrated with true 3D sound.

Intersections


DATE – 2017
DISCIPLINE – Art
MEDIUM – Interactive video installation
STATUS – In production for display in 2019
WEBLINKS –https://www.youtube.com/watch?v=6YOFNSRiexY
FUNDING – Ontario Arts Council, Media Arts


INTERSECTIONS is an interactive video installation that documents the transformation the areas around the former Berlin Wall have gone through from 1988 to 2009 to 2017. It morphs between time periods depending on the motion and location of the audience. The 3.6-meter-high by 80-meter-long piece uses cameras and computer software to track the location of people in front of it, shifting the images shown on the 14 video projectors accordingly. The original 675-image cubist panoramas that form the image set were shot before the wall came down in 1988, using a unique 5-Polaroid-camera system of my own invention that created panoramic imagery. The 2nd set of images was shot in 2009 using the same panoramic camera rig as in 1988, but fitted with 5 digital cameras, re-photographing exactly the same locations. The final set of images will be shot in 2017 and will be used as the 3rd set for an exhibition in 2019 to celebrate the 30th anniversary of the fall of the wall and the transformation of Berlin.
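The tracking logic can be imagined roughly as follows. This is a hedged sketch only: the 80 m wall and 14 projectors come from the text, but the distance thresholds, the distance-to-era mapping and all function names are assumptions for illustration, not the installation’s actual software.

```python
# Assumed mapping from a tracked viewer position to an era and a
# projector. The installation only states that images shift with the
# motion and location of the audience; the rules below are invented.

ERAS = [1988, 2009, 2017]

def era_for_distance(distance_m: float) -> int:
    """Assumed rule: nearer viewers see older imagery."""
    if distance_m < 2.0:
        return 1988
    elif distance_m < 4.0:
        return 2009
    return 2017

def projector_index(x_m: float, wall_length_m: float = 80.0,
                    projectors: int = 14) -> int:
    """Which of the 14 projectors covers a viewer standing x metres
    along the 80 m wall (dimensions from the text)."""
    idx = int(x_m / wall_length_m * projectors)
    return min(max(idx, 0), projectors - 1)

print(era_for_distance(1.0), projector_index(40.0))  # → 1988 7
```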

The piece is designed to convey a rich, complex impression of the enormous physical and geopolitical transformations the area around the wall has undergone over the last 20 years by allowing people to literally “walk in and out of time”. By giving viewers the ability to explore history with their bodies as well as their eyes, the piece aims to create an image that becomes alive, reacting to the movement of the viewers to project a sense of the multifaceted history the area holds. Just as history is constantly in motion, re-evaluated by each new generation, INTERSECTIONS takes the static images from the past and transforms them into a constantly morphing vista of the present.

4 IMAGE SETS
1) 1988 Polaroids
2) 2009 Digital images
3) Berlin Wall
4) 2017 Digital images

1) 1988 Polaroids – The 1st set consists of 600 separate images that make up 3 panoramic murals I shot of the Berlin Wall area in 1988 with 5 Polaroid cameras and a custom dolly. The cameras recorded 360-degree scans of the environment as the dolly moved forward in space, and a clock mounted on the dolly recorded the passage of time.

“Checkpoint” – 150 images total – 30 images long – 2.5 panoramic scans
The “Checkpoint” image was shot at Checkpoint Charlie, the symbolic Cold War focal point, which was still extremely dangerous in 1988. The East German border guards were very unhappy with my project: not only did they try their best to stay out of my photograph (unsuccessfully), they also tried to arrest me numerous times as I crossed the white line that marked the exact border, forcing them to cross the cameras’ field of view and become part of the image.

“Door” – 300 images total – 60 images long – 5 panoramic scans
The “Door” image is the longest and most complex of the images, taking in 5 complete panoramic scans in 1 long continuous movement that involved shooting 300 Polaroid images in sequence. The image moves 6 meters, from inside a hallway beside a beautiful old door, across a street, to centimeters from the Berlin Wall.

“Bridge” – 150 images total – 30 images long – 2.5 panoramic scans
The “Bridge” image is similar in nature to the “Door” image, as it involves not only a series of 360-degree scans of the environment but also a simultaneous linear dolly move. The image was shot on a railway bridge that became useless after the wall was erected and is now part of the S-Bahn commuter rail system, close to the Kolln Heide station.

“Steinstucken” – 75 images total – 15 images long – 1.25 panoramic scans
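As a sanity check, the per-mural counts quoted above are internally consistent with the 5-camera rig (each dolly position yields 5 frames, and every 12 positions complete one full 360-degree scan: 30/2.5 = 60/5 = 15/1.25 = 12), and they sum to the 675 images mentioned earlier.

```python
# Figures taken directly from the mural headings above.
murals = {
    # name: (total images, images long, panoramic scans)
    "Checkpoint":   (150, 30, 2.5),
    "Door":         (300, 60, 5.0),
    "Bridge":       (150, 30, 2.5),
    "Steinstucken": (75, 15, 1.25),
}

CAMERAS = 5  # Polaroid cameras on the rotating dolly

for name, (total, length, scans) in murals.items():
    assert total == length * CAMERAS        # 5 frames per dolly position
    assert length / scans == 12             # 12 positions per full scan

print("total images:", sum(t for t, _, _ in murals.values()))  # → 675
```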

2) 2009 Digital images – With the help of a Canada Council Media Arts grant, I traveled back to Berlin in the fall of 2009 and reshot the Checkpoint Charlie images from exactly the same positions and angles, using an adaptation of the original camera rig fitted with digital cameras. These contemporary high-resolution images were morphed with the original 1988 images and displayed at Transmediale in Berlin in 2010, allowing the audience to interact with the landscape by using their bodies as input devices, moving through time and space to reveal a dramatically changed city in which the wall is gone, transformed by 20 years of peace.
https://transmediale.de/content/graham-smith
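At its simplest, the morph between the 1988 and 2009 images can be pictured as a position-driven cross-dissolve. The sketch below is a minimal illustration under that assumption; the installation's actual morphing and body-tracking pipeline are not described here, and `frame_1988` / `frame_2009` are hypothetical stand-ins for two equally sized pixel buffers:

```python
def blend_frames(frame_1988, frame_2009, t):
    """Linear cross-dissolve between two equally sized frames.

    t = 0.0 shows only the 1988 image, t = 1.0 only the 2009 image;
    in an installation, t would be driven by the viewer's position.
    Frames are flat lists of pixel intensities for simplicity.
    """
    assert 0.0 <= t <= 1.0 and len(frame_1988) == len(frame_2009)
    return [(1.0 - t) * a + t * b for a, b in zip(frame_1988, frame_2009)]
```

Walking toward the projection would then map floor position onto `t`, so the viewer's body literally scrubs between eras.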

INTERSECTIONS: 2009 Digital images
DISCIPLINE – Art
MEDIUM – Interactive video installation
STATUS – In production for display in 2019
WEBLINKS – https://www.youtube.com/watch?v=6YOFNSRiexY
FUNDING – Ontario Arts Council, Media Arts

3) Berlin Wall – I photographed the remaining parts of the Berlin Wall using a digital SLR camera and created a composite image of the structure in Photoshop. This image is displayed when people are close to the projection and is a very realistic recreation, because the wall is essentially a two-dimensional concrete surface that can be projected convincingly in this way.

INTERSECTIONS: Berlin Wall
DATE – 2009
DISCIPLINE – Art
MEDIUM – Interactive video installation
STATUS – In production for display in 2019
WEBLINKS – https://www.youtube.com/watch?v=6YOFNSRiexY
FUNDING – Ontario Arts Council, Media Arts

4) 2017/18 Digital Images – A final set of images will be shot in 2017/18 to form the basis of an exhibition in 2019 celebrating the 30 year anniversary of the fall of the wall and the transformation of the “death strip” into its current manifestation as an example of modern urban renewal.

Painting the wind using the wind

DATE – 2015
DISCIPLINE – Education
MEDIUM – Installation and kinetic art
WEBLINKS

http://v2.nl/events/hacking-event
A two-day event in collaboration with the ‘Hacking’ minor of the Willem de Kooning Academy in Rotterdam, plus an opening night. Wind and wrestling and dancing robots!
APRIL 17-18 2015
18:00 TO 18:00
Location: V2_ Institute for the Unstable Media, Eendrachtsstraat 10, Rotterdam, and WORM

Paint the Wind Using the Wind vs Sumo Wrestling and Dancing Robot Competition is an event featuring the best projects from the Willem de Kooning Academy Hacking Department course. The second-year students have constructed artworks that take the concept of ‘painting the wind using the wind’ in many directions. Expect the V2_ Groundfloor to look and feel like a windy beach…
Opening: 17 April, 18:00h
Opening times 18 April: 12:00 – 18:00h
On the 18th of April there will be a robot building workshop and competition by the students. The assignment is to build Dancing Robots and Sumo Wrestling Robots.
A Sumo Wrestling Robot and Dancing Robot event will take place at WORM (just around the corner from V2_) from 13:00 to 16:00h on 18 April. Afterwards the robots will be on display at V2_.

The event features works by:

  • Jurriaan Boerman
  • Barbra Boers
  • Esther Brakel
  • Meike Brand
  • Benita Brouwer
  • Kotryna Buruckaite
  • Aylin Buyruk
  • Jade  Cadogan
  • Phillp Ficozzi
  • Steven van der Gaarden
  • Lynette van der Giessen
  • Brian Groenhorst
  • Chanelle Hool
  • Josien kamp
  • Fay Klaassen
  • Sanneke Kleingeld
  • Follkert Koelewijn
  • Leendert van der Meer
  • Annebel de Kok
  • Max Kowalski
  • Elise Marcus
  • Marish van Meurs
  • Barbra Monster
  • Dorinda Oosterhout
  • Nicole Pashchenko
  • Corne Schep
  • Rachelle Joy Slingerland
  • Michiel Tollig
  • Carlijn Veld
  • Jordi Verbaan
  • Menno Visser
  • Corlieske Visser
  • Mickey Vissers
  • Alexandra Smorenberg

 

Teachers:

  • Graham Smith
  • Rob Dielissen
  • Simon de Bakker
  • Suzzanne Rademaker

Webmoti

DATE – 2017
DISCIPLINE – Science
MEDIUM – Telepresence and tactile interface for autistic children
STATUS – Research project started Jan. 2017 at Ryerson University and University College Dublin
FUNDING – Ontario Centres of Excellence

PhD Project Proposal
Graham Thomas Smith
Thematic PhD in Inclusive Design and Creative Technology Innovation
Cognate Fields: Inclusive Design, Robotics, Telematics, Autism, Education

Proposed Supervisory/DSP Team as discussed and agreed with team members:
Principal Supervisor: Prof Lizbeth Goodman, MME: Creative Technology Innovation/Inclusive Design
Co-Supervisor: Dr Brian MacNamee, Insight/Computer Science
Doctoral Studies Panel: Chair- Dr Suzanne Guerin, Psychology/Disability Studies, Prof Goodman, Dr MacNamee

PhD Project Title – WEBMOTI – a multimodal computer interface to help autistic children mature their sensory systems so that they can balance their cognitive and emotional development.
Main Field of Study – Inclusive Design
Research Question – Can a multimodal computer interface that includes both videoconferencing and tactile feedback allow young people with autism to mature their sensory systems and balance their cognitive and emotional development?

METHODOLOGY
Children with autism can face ongoing and lifelong challenges with social, cultural and well-being issues. They are often ostracized by peers, mistreated by an education system with limited resources and incomplete or ineffective tools, misunderstood and excluded. Not only does this potentially contravene the Canadian Charter of Rights and Freedoms, but it can also reduce or eliminate an individual’s ability to make cultural, intellectual, work and family contributions as a citizen in society. That loss can have a serious and direct impact on the individual’s educational achievement, health and well-being, and in turn taxes the educational and health care systems.

WebMoti is a tool for observing, with or without participating, the way in which peers in the classroom behave and socialize, using a powerful camera that can zoom in on the classroom. By being the “coolest kid” in the class (inside a computer) and being able to observe without having to participate in person, students with ASC are less prone to ostracism and bullying, which can translate into improved self-esteem and a corresponding reduction in absenteeism, depression and related health issues. WebMoti can make students aware of their hearing and tactile senses, and let them manipulate these so they can attend lessons without being overwhelmed. Manipulating these thresholds may also stimulate sensory awareness and self-actualization [24, 29].
The main qualitative outcomes of this project involve improved participation in school, which may also translate into improved mental health for children with ASC through support for the sensory maturation process and a reduction in the impact of bullying and ostracization. Using a technological intervention that combines a multi-sensory output system (tactile, auditory and visual modalities are available) with a real-time, connected representative (the Webchair), children with ASC can control and mediate their sensory environment and their participation in the social setting of school. The main quantitative outcomes will be improved attendance, participation in school, and perhaps academic achievement scores.
The newest research shows that autism spectrum conditions (ASC) are not a defect but rather a difference [9]. According to the theory of the Socioscheme with the MAS1P (Mental Age Spectrum within 1 Person), autism is about accelerated cognitive and delayed social and physical maturation of the central nervous system [10, 13].
The number of children diagnosed with ASC is about 1 in 94 in Canada [26] and 1 in 100 in Europe [10]. A child with autism is often ashamed, feels suppressed, is angry, and is amazed not to be heard. They are fiercely and regularly bullied and have an acute sense of failure. Children with autism tend to drop out of school for sheer fear of peers and of the educational system, and they run a high risk of depression and lower quality of life [29, 30].
Children with autism often have issues with sensory integration [28, 16], and it has been suggested that this results from a lack of synaptic pruning in the brain [5, 14]. For example, in terms of auditory sensing, typical maturation strengthens foreground sounds and suppresses background sounds. When maturation lags, autistic individuals hear everything as it presents itself, which produces a disturbing and painful sound situation. At school, concentration is almost impossible and children are exhausted, which potentially has a serious impact on the child’s health and wellbeing. Allowing children to explore and play with sensory information may help ASC children better understand their sensory needs and self-advocate for necessary accommodations [6, 12, 24].
ASC children have significant sensory processing challenges compared with typically developing peers [20]. Educational programs rarely take this into account when working to accommodate ASC children, and the ABA/IBI therapeutic approach currently widely used in Ontario does little to address cognitive overload. Our WebMoti approach recognizes the need for ASC children to have opportunities to reduce cognitive/sensory overload through remote presence, and at the same time to use this remote presence to fine-tune the level of interaction and engagement they have with their peers. The potential here is to support conditions whereby the child can gradually increase the amount of face-to-face interaction with peers through guided experience, in full knowledge of the opportunity to step back into our digitally mediated environment. At the same time, WebMoti allows the child to control and fine-tune the kind and amount of somatosensory stimulation necessary for homeostatic balance. Helping a child modulate sensory information based on their own personal needs not only has the potential to develop a greater understanding of their sensory needs and interests [17], but also has a clear influence on their behaviour and can lead to greater self-regulation [9].
Sensory substitution, such as vibration or visualisation replacing sound and vice-versa (e.g., [1, 15]), offers opportunities to mediate sensory stimulation and allow an individual to obtain information using alternative means [2, 3, 4]. The Emoti-chair is a sensory substitution system; combined with the Webchair, it lets students with ASC use the sensory modality best suited to their informational needs when interacting with others at a distance and in the safety of their own environment. While both systems have been used independently and been shown to be effective in some domains (e.g., access to sound-based music for the Emoti-chair and remote connectivity for hospital-bound children for the Webchair), there is no system known to the authors that provides multi-sensory input/output across a distance specifically oriented toward children with ASC, who often benefit from alternative stimuli. In this project we will provide such a system by integrating the Webchair and Emoti-chair, and demonstrate its effectiveness through formative and summative evaluations as described in Section 4.
Webchair in the Netherlands is the only company in the world using videoconferencing technology to give homebound autistic students the chance to integrate into conventional school classrooms. To date over 20 students have been re-integrated into their classes using the technology and pedagogical program developed by the company, which gives the autistic child control over how and when they attend class. When the students are too stressed to attend class physically, for whatever reason, they can come into class via the Webchair from home or from one of the dedicated distance learning centers set up as alternatives to remaining at home in isolation. As a third possibility, the students can even attend class from a “safe room” set up in the student’s school, as sometimes they need to physically leave the classroom but still want to remain in class remotely.
This approach has been very successful as the students can control the amount of sensory input they are exposed to in relation to their level of stress at any given time. The students are encouraged to sometimes physically be in their classrooms but this decision is left up to them. It has been observed that the more the children actually attend class the more they connect with other students and begin to understand the complex social environment of the classroom.
The knowledge of how to connect and re-integrate autistic students, together with the dedicated piece of technological equipment (the Webchair), are the two things that will be transferred to Ontario. Webchair works closely with Dr. Martine Delfos, our leading research professional in the area of autism, and her expertise will also be transferred to the Ontario team at Ryerson University.
Overall Objective
The main objective of this project is to develop and evaluate a multi-media, multi-sensory connection system, called Webmoti, which supports the social and educational needs of children and teenagers with autism. For those children who have issues with hearing, Webmoti will be evaluated with a variety of audio controls (e.g., stop, adjust volume levels) and sensory substitution techniques for sound alternatives (e.g., vibrotactile or visual representations) to suit different hearing needs.
The goal is a new way of allowing students with ASC to participate in school, one that provides them with control and agency. However, it is important that this process fits within a school setting, given the need for technology, particularly video technology, to be present in the classroom. The Accessibility for Ontarians with Disabilities Act and the Intersection of Disability, Achievement and Equity document of the Toronto District School Board [8] provide for accessibility to be embraced by Ontario school boards, and thus this project is consistent with these mandates. The school systems in the Netherlands have begun to incorporate an earlier version of the WebMoti system, called the WebChair, with hundreds installed over the past three years. However, consultation and pilot studies are required before the WebMoti system can be considered a widespread solution for children with ASC in Ontario, as there are implications for staff training and for understanding some of the issues that can arise (e.g., occasionally sound can become distorted).

Objective 1: Prototype development
The first objective will be to combine the Webchair and Emoti-chair technologies into one prototype system. Combining the two systems may also involve adding other sensory substitution techniques, such as smell, according to research carried out by [18]. However, decisions regarding specific sensory preferences and technology options will be made in consultation with users. Part of Nolan’s contribution to the research will focus on the further development and testing of a wearable prototype, developed in a previous research project [18], that will provide real-time telemetry related to the stress/anxiety levels of participants.
A formative user-study method will be used to evaluate early iterations of the system with 4 to 5 participant users with ASC, assessing usability elements (ease of learning and use, and satisfaction) [27]. The 10-item System Usability Scale [7] will be used. In addition, the Life Satisfaction Matrix well-being instrument [19] and the Sensory Behaviour Scale [11] will be evaluated for their efficacy in this context. The objective of these evaluations is to gather users’ reactions to and performance with different versions of the prototype, engaging users in finding interaction problems and in problem-solving technological solutions for the identified problems during prototyping.
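For reference, the 10-item System Usability Scale has a fixed published scoring rule, which can be sketched in a few lines (a generic implementation of Brooke's procedure, not project code): odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 onto a 0-100 range.

```python
def sus_score(responses):
    """Score one System Usability Scale questionnaire.

    responses: list of 10 answers on a 1-5 Likert scale, in
    questionnaire order (item 1 first). Returns a 0-100 score.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses):
        # i is 0-based, so even indices are the odd-numbered items
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5
```

A respondent who strongly agrees with every positive item and strongly disagrees with every negative item scores 100; all-neutral answers score 50.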
Data from the formative user studies will be analysed using qualitative methods such as thematic analysis [22] as there is insufficient power to use more than descriptive statistics. The results from the analysis will be fed back into the development process.
Objective 2: Summative evaluation
Once the research team is satisfied that the system does not have major interaction issues, a longitudinal summative evaluation will occur. For this evaluation, systems will be placed with 16 participants in Ontario for at least one school term to assess the long-term efficacy of the system.
Participants will use WebMoti to attend school when they are unable to attend physically (which can vary from infrequently to often). Each session with WebMoti will be video/audio recorded. In addition, all stakeholders, including the student him/herself, teachers, parents, and healthcare providers, will be asked to complete a pre-study survey followed by bi-weekly surveys to track change and difference. Once the study term is complete, all stakeholders will be asked to either complete a summary survey or participate in one-on-one interviews, depending on availability. The purpose of this last step is to gather impressions and attitudes towards the process and technology, likes/dislikes, and any issues that should be addressed in future work. We anticipate this phase will last 18 months, corresponding with school terms (i.e., not including summer months).
The quantitative data (responses to forced-choice questions) will be analyzed using repeated measures ANOVA and correlation to examine change over time. Qualitative data will be analyzed using thematic analyses. The results will then be used to inform a framework of sensory substitution techniques, ASC and health and wellbeing outcomes based on the instruments described in Objective 1.
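As an illustration of the planned quantitative analysis, a one-way repeated-measures ANOVA can be computed from first principles. The sketch below is a generic textbook computation (one row of scores per participant, one column per measurement occasion), not the project's actual analysis code, which would use a standard statistics package:

```python
def repeated_measures_anova(data):
    """One-way repeated-measures ANOVA F statistic.

    data: list of per-subject lists, one score per condition
    (e.g. per survey wave). Returns (F, df_condition, df_error).
    """
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    cond_means = [sum(row[j] for row in data) / n for j in range(k)]
    # Partition total variability into subject, condition and error terms
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_error = ss_total - ss_subj - ss_cond
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    f = (ss_cond / df_cond) / (ss_error / df_error)
    return f, df_cond, df_error
```

The resulting F statistic would then be compared against the F distribution with (df_condition, df_error) degrees of freedom to test for change over time.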
REFERENCES
[1] Abboud, S., Hanassy, S., Levy-Tzedek, S., Maidenbaum, S., & Amedi, A. (2014). EyeMusic: Introducing a “visual” colorful experience for the blind using auditory sensory substitution. Restorative neurology and neuroscience, 32(2), 247-257.
[2] Attwood, T. (2006). The complete guide to Asperger’s syndrome. Jessica Kingsley Publishers.
[3] Baranek, G. T. (2002). Efficacy of sensory and motor interventions for children with autism. Journal of autism and developmental disorders, 32(5), 397-422.
[4] Baranek, G. T., David, F. J., Poe, M. D., Stone, W. L., & Watson, L. R. (2006). Sensory Experiences Questionnaire: Discriminating sensory features in young children with autism, developmental delays, and typical development. Journal of Child Psychology and Psychiatry, 47(6), 591-601
[5] Bourgeron, T. (2009). A synaptic trek to autism. Current opinion in neurobiology, 19(2), 231- 234.
[6] Broderick, A. A., and A. Ne’eman. 2008. Autism as metaphor: Narrative and counter-narrative. International Journal of Inclusive Education 12 (5–6): 459–576.
[7] Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability evaluation in industry, 189(194), 4-7.
[8] Brown, R. S. & Parekh, G. (2013). The intersection of disability, achievement, and equity: A system review of special education in the TDSB (Research Report No. 12-13-12). Toronto, Ontario, Canada: Toronto District School Board.
[9] Case-Smith, J., Weaver, L. L., & Fristad, M. A. (2014). A systematic review of sensory processing interventions for children with autism spectrum disorders. Autism, 1362361313517762.
[10] Delfos, M.F. (2016). Wondering about the World: About Autism Spectrum Conditions. Amsterdam: SWP.
Elsabbagh, M., Divan, G., Koh, Y. J., Kim, Y. S., Kauchali, S., Marcín, C., … & Fombonne, E. (2012). Global prevalence of autism and other pervasive developmental disorders. Autism Research, 5(3), 160-179.
[11] Harrison, J., & Hare, D. J. (2004). Brief report: Assessment of sensory abnormalities in people with autistic spectrum disorders. Journal of Autism and Developmental Disorders, 34(6), 727-730. doi:10.1007/s10803-004-5293-z.
[12] Harley, D., McBride, M., Chu, J. H., Kwan, J., Nolan, J., & Mazalek, A. (2016). Sensing context: Reflexive design principles for intersensory museum interactions. MW2016: Museums and the Web 2016.
[13] Hong, H., Kim, J. G., Abowd, G. D., & Arriaga, R. I. (2012, February). Designing a social network to support the independence of young adults with autism. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work (pp. 627-636). ACM.
[14] Johnson, M. H. (2005). Sensitive periods in functional brain development: Problems and prospects. Developmental psychobiology, 46(3), 287-292.
[15] Karam, M., Branje, C., Russo, F., Fels, D.I. (2010). Vibrotactile display of music on the human back. ACHI2010. Barcelona. Pp. 154-159.
[16] Kern, J. K., Trivedi, M. H., Grannemann, B. D., Garver, C. R., Johnson, D. G., Andrews, A. A.,… & Schroeder, J. L. (2007). Sensory correlations in autism. Autism, 11(2), 123-134.
[17] Kern, J. K., Trivedi, M. H., Garver, C. R., Grannemann, B. D., Andrews, A. A., Savla, J. S., & Schroeder, J. L. (2006). The pattern of sensory processing abnormalities in autism. Autism, 10(5), 480-494.
[18] Koller, D., McPherson, A., Lockwood, I., Blain-Moraes, S., & Nolan, J. (Submitted). The Impact of Snoezelen in Pediatric Complex Continuing Care: A Pilot Study. Journal of Pediatric Rehabilitation Medicine
[19] Lyons, G. (2005), The Life Satisfaction Matrix: an instrument and procedure for assessing the subjective quality of life of individuals with profound multiple disabilities. Journal of Intellectual Disability Research, 49: 766–769. doi: 10.1111/j.1365-2788.2005.00748.x
[20] Marco, E. J., Hinkley, L. B., Hill, S. S., & Nagarajan, S. S. (2011). Sensory processing in autism: a review of neurophysiologic findings. Pediatric Research, 69, 48R-54R.
[21] McBride, M., Harley, D., Mazalek, A., & Nolan, J. (2016). Beyond Vapourware: Considerations for Meaningful Design with Smell. CHI’16, Extended Abstracts on Human Factors in Computing Systems. ACM. San Jose, U.S.
[22] Miles, M.B., Huberman, M.A., Saldana, J. (2014). Qualitative Data Analysis. Sage Publishing: London
[23] Nielsen, J. (1993). “Iterative User Interface Design”. IEEE Computer vol.26 no.11 pp 32-41.
[24] Nolan, J., McBride, M., & Harley, D. (2016; draft). Making sense of my choices: an autistic selfadvocacy handbook. RE/Lab working paper series.
[25] Oesterwalder, A. (2016). Business Model Canvas Tool. Retrieved Feb. 22, 2016 from http://businessmodelgeneration.com/canvas/bmc.
[26] Ouellette-Kuntz, H., Coo, H., Lam, M., Breitenbach, M. M., Hennessey, P. E., Jackman, P. D.,… & Chung, A. M. (2014). The changing prevalence of autism in three regions of Canada. Journal of autism and developmental disorders, 44(1), 120-136.
[27] Rogers, Y., Sharp, H., & Preece, J. (2011). Interaction Design: Beyond Human-Computer Interaction. Wiley: Toronto.
[28] Stoddart, K. P. (2005). Children, youth and adults with Asperger syndrome: Integrating multiple perspectives. Jessica Kingsley Publishers.
[29] Strang, J. F., Kenworthy, L., Daniolos, P., Case, L., Wills, M. C., Martin, A., & Wallace, G. L. (2012). Depression and anxiety symptoms in children and adolescents with autism spectrum disorders without intellectual disability. Research in Autism Spectrum Disorders, 6(1), 406-412.
[30] van Heijst, B. F., & Geurts, H. M. (2014). Quality of life in autism across the lifespan: A meta- analysis. Autism, 136-153.

Sensing Presence

DATE – 2017
DISCIPLINE – Art
MEDIUM – Telepresence, VR and robotic art retrospective
STATUS –  In development for exhibition in the Fall of 2017
FUNDING: Canada Council, Media Arts
DevPost
VideoSphere Siggraph

“I think of art, at its most significant, as a DEW line, a Distant Early Warning system that can always be relied on to tell the old culture what is beginning to happen to it” – Marshall McLuhan

“Sensing Presence” is a project to restore three historically significant early media artworks: the world’s first art installation to use a Head Mounted Display (1983, Displaced Perspective), the first telepresence robotic sculpture (1986, Displaced Perspectives) and the first virtual reality panoramic video exhibition (1989, Videosphere). The second part of the project involves constructing 2.0 versions of these early works using current technology, to create an exhibition that explores both the historical roots and the future potential of virtual artistic expression.

As an artist I have been exhibiting artworks using virtual reality, telepresence robotics and panoramic video imaging since the early 1980s, and have been credited with exhibiting the world’s first artworks in these fields. This project will restore the early works and create updated versions using current technology. It is a project close to my heart, as these works were exhibited at the very start of my artistic career; now, 33 years later, I feel I have developed a unique perspective on where these new media evolved from, the impact they have had on society and the role they will play in the future.

“Sensing Presence” recreates the world’s first Head Mounted Display (HMD) virtual reality and telepresence robotic artworks, which I created between 1983 and 1989, in an exhibit featuring both the original works and new versions based on the original ideas but using 21st century technology. These early VR and telepresence works are regarded historically as the world’s first artworks using HMDs, telepresence robots and panoramic imagery in virtual reality environments; displayed beside the updated 2.0 versions of each piece, they will provide the audience with a historical reference for current projects now being created around the world.

ORIGINAL WORK                                            2.0 VERSION
1983 – Displaced Perspective                        2017 – Displaced Perspective 2.0
1986 – Displaced Perspectives                       2011- Displaced Perspectives 2.0
1989 – Videosphere                                      2017 – Nanocopter

1983 – DISPLACED PERSPECTIVE
Graduation exhibition at the Ontario College of Art
Immersive Head Mounted Display & live camera video installation

As part of my graduation exhibition at the Ontario College of Art I displayed the world’s first head mounted display artwork, called Displaced Perspective. The piece used a camera and a custom-built pair of video glasses holding two small black and white TVs, suspended via counterweights from a movable boom on the ceiling. When viewers put on the video glasses they saw themselves from a different perspective, 2 m away, looking back at themselves with the video glasses on, which gave them a remote perspective on their own body image and kinesthetics. A second camera displayed a prerecorded version of the scene, with people moving through it and looking around, which created a strange déjà vu feeling in viewers, as they did not know what was real and what was live. The result was a powerful sensation of telepresence and a new perspective on perspective itself.

1986 – DISPLACED PERSPECTIVES
Strategic Arts Initiative exhibition at Arc in Toronto & U of Salerno in Italy
Telepresence robotic sculptures linking Toronto to Salerno

1989 – VIDEOSPHERE
SIGGRAPH exhibition in Las Vegas
Virtual reality panoramic video artwork using panoramic imagery
In the 1980s I developed a series of panoramic imaging artworks using custom-built camera rigs that I invented, and I displayed these internationally in North America and Europe. My panoramic film Frame to Frame was one of these projects; it was first shot in 1986 and continues to this day. Another of these projects, developed with the help of the National Research Council, allowed the sensation of scale and presence to come into play within an artwork. My series Funny Farm used an adapted interactive microfiche and was another piece, displayed in Montréal in 1992, shown using an early form of panoramic display of the kind that later became Google Street View.

DISPLACED PERSPECTIVE: NANOCOPTER is a new artwork that will allow an audience to shrink their perspective and literally enter the 360 degree world of the microscope, like a character in the film “Fantastic Voyage”. This transformation is designed to alter people’s perception of the nature of the world around them and to extend our understanding of the nano-scale ecosystem that envelops us but of which we are oblivious. By creating an experience that allows people to physically enter the microscopic world in a 360 degree panoramic VR environment, NANOCOPTER will result in an artwork that pushes both aesthetic and technical boundaries and opens up a new frontier of artistic potential.

Our ability to see and explore the world around us is critical in providing us with a perspective on our environment and the ability to comprehend the interactions that surround us. Microscopes give us a window into the microscopic world, but only from the perspective of looking down on it, as if from an aircraft or satellite. We cannot truly enter the world of the ant or the grass, to see it from that scale and move within that environment rather than simply observing it from above or through the video eye of an endoscope-type probe. Our ability to fully understand and relate to the microscopic world that surrounds and defines us is limited by this inability to see and interact with it in a natural and human way. The Nanocopter project is designed to overcome this limitation and, for the first time, “shrink” people down to a microscopic level using VR display systems like the Oculus Rift and my own invention, the world’s first microscopic panoramic camera.

HISTORY OF THE PROJECT
The history of my artwork using virtual reality head-mounted displays (HMDs) and panoramic imaging systems like Google Street View dates back to the early 1980s and includes the world’s first artworks to use such display technologies. In 1983, as my graduation project at the Ontario College of Art, I displayed the world’s first HMD-based artwork, DISPLACED PERSPECTIVE, which used a custom-built HMD that I fabricated from two small camera eyepieces linked to a live camera. In the installation the viewer wore the HMD and saw themselves from a live perspective 2 meters away. The second eyepiece in the HMD displayed the same image perspective, but pre-recorded, which created a complex experience for the viewer as it was not obvious what, or who, in the image was real or recorded. The kinesthetic feeling of seeing yourself live from a different viewpoint was very powerful and produced a strong sensation of telepresence.

In the mid-1980s I developed a panoramic imaging system that recorded the entire 360-degree by 180-degree image sphere, capturing all possible perspectives in one image made up of 432 separately shot photographs. I displayed these images in numerous exhibitions in Canada and Europe, and also found that I could cut the images into gore-like strips and place them onto a physical globe, which undistorted the image back into a conventional perspective. Looking at the top of the sphere recreated the view of looking straight up, and the 360-degree circumference gave a horizontal panning perspective.
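The gore technique described above can be sketched in code. What follows is a minimal illustration, not the artist's original hand-made process: it cuts an equirectangular panorama into vertical strips and narrows each strip toward the poles by the cosine of the latitude, the sinusoidal profile that lets flat strips wrap onto a globe without stretching. All function and parameter names here are illustrative choices, not from the work.

```python
import numpy as np

def make_gores(equirect: np.ndarray, n_gores: int = 12) -> list:
    """Cut an equirectangular panorama (H x W) into printable gores.

    Each gore keeps its full width at the equator and shrinks by
    cos(latitude) toward the poles, so the paper strips can be glued
    onto a physical globe with minimal distortion.
    """
    h, w = equirect.shape[:2]
    gore_w = w // n_gores
    gores = []
    for g in range(n_gores):
        strip = equirect[:, g * gore_w:(g + 1) * gore_w]
        gore = np.zeros_like(strip)
        for row in range(h):
            # latitude of this row, from -90 deg (bottom) to +90 deg (top)
            phi = np.pi * (row + 0.5) / h - np.pi / 2
            keep = max(1, int(round(gore_w * np.cos(phi))))  # shrunken width
            # resample this row down to 'keep' pixels, centred in the gore
            src = np.linspace(0, strip.shape[1] - 1, keep).astype(int)
            start = (gore_w - keep) // 2
            gore[row, start:start + keep] = strip[row, src]
        gores.append(gore)
    return gores
```

Printing the twelve gores and wrapping them around a sphere reproduces the effect the text describes: the top of the assembled globe shows the straight-up view, and walking around it pans the horizon.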

In the late 1980s I experimented with a system that used a mirrored half-sphere as a lens to record motion panoramic imagery for an early virtual reality headset system.

This worked very well, and numerous companies now use this concept to record imagery, with various versions available such as ????.com. As part of these experiments I used a tiny ball bearing that I had vapor-coated with a reflective mirror finish, creating a very small mirrored sphere like the larger ones I had first tested.

I then placed the 1 mm mirrored sphere on the end of a pin, set it in a slide tray, and positioned a microscope overhead. I shot a series of images with various types of very small-scale objects in the slide tray, photographing the sphere through the microscope so that it filled the image’s field of view. The resulting images were microscopic-scale panoramic images that took in the full 360-degree environment. My idea back in 1989 was to then use computer algorithms to unwrap the images and display them to people using head-mounted displays, but this proved impossible at the time due to the limitations of the computers then available, as well as my own lack of computer programming skills.
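The unwrapping step that was out of reach in 1989 is straightforward today. The sketch below shows the standard mirror-ball mapping, assuming the sphere fills a square input image and the camera is far enough away to be treated as orthographic, so a pixel at normalized radius r reflects a ray at angle 2·asin(r) from the camera axis. This is a generic illustration of the technique, not the artist's actual software.

```python
import numpy as np

def unwrap_mirror_ball(ball: np.ndarray, out_h: int = 256) -> np.ndarray:
    """Unwrap a photo of a mirrored sphere into an equirectangular panorama.

    For each output pixel we compute its viewing direction, find the
    angle theta between that direction and the camera axis, and sample
    the ball image at radius r = sin(theta / 2), the mirror-ball relation.
    """
    h, w = out_h, out_h * 2
    size = ball.shape[0]
    lat = np.pi * (np.arange(h) + 0.5) / h - np.pi / 2      # -90..+90 deg
    lon = 2 * np.pi * (np.arange(w) + 0.5) / w - np.pi      # -180..+180 deg
    lon, lat = np.meshgrid(lon, lat)
    # unit direction of each panorama pixel; +z points toward the camera
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = np.cos(lat) * np.cos(lon)
    theta = np.arccos(np.clip(dz, -1.0, 1.0))   # angle from the camera axis
    r = np.sin(theta / 2)                       # mirror-ball sample radius
    phi = np.arctan2(dy, dx)                    # azimuth around the axis
    u = ((r * np.cos(phi) + 1) / 2 * (size - 1)).astype(int)
    v = ((r * np.sin(phi) + 1) / 2 * (size - 1)).astype(int)
    return ball[v, u]
```

Because the mapping is computed per output pixel, the same few lines run in real time on modern hardware, which is exactly what a live HMD view of the 1 mm sphere requires.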

With the exponential increase in computing power, the availability of low-cost HMDs like the Oculus Rift, and the development of panoramic software capable of transforming spherical imagery in real time, my early experiment of 26 years ago has become feasible for the first time. My proposal is for a piece called NANOCOPTER in which people can sit in a chair, put on an HMD and suddenly become immersed in the nano world as a live panoramic eye. By looking around, people can see the tiny creatures that inhabit this environment from their own perspective, and move within it by using a joystick to control the specimen slide tray. In many ways it will be like actually taking a “Fantastic Voyage”: due to the live nature of the imagery and the interactive capabilities of the system, people will for the first time experience nano-scale life forms as if they were human-sized. This shift in perspective will give users a new outlook on the world and transform their understanding of the invisible world that surrounds us.