Webmoti

DATE 2017
DISCIPLINE Science
MEDIUM Telepresence and tactile interface for autistic children
STATUS Research project started Jan. 2017 at Ryerson University and University College Dublin
FUNDING Ontario Centres of Excellence

PhD Project Proposal
Graham Thomas Smith
Thematic PhD in Inclusive Design and Creative Technology Innovation
Cognate Fields: Inclusive Design, Robotics, Telematics, Autism, Education

Proposed Supervisory/DSP Team as discussed and agreed with team members:
Principal Supervisor: Prof Lizbeth Goodman, MME: Creative Technology Innovation/Inclusive Design
Co-Supervisor: Dr Brian MacNamee, Insight/Computer Science
Doctoral Studies Panel: Chair- Dr Suzanne Guerin, Psychology/Disability Studies, Prof Goodman, Dr MacNamee

PhD Project Title – WEBMOTI – a multimodal computer interface to help autistic children mature their sensory systems so that they can balance their cognitive and emotional development.
Main Field of Study – Inclusive Design
Research Question – Can the use of a multimodal computer interface that includes both videoconferencing and tactile feedback allow young people dealing with autism to mature their sensory systems and balance their cognitive and emotional development?

METHODOLOGY
Children with autism can face ongoing, lifelong challenges with social, cultural and well-being issues. They are often ostracized by peers, underserved by an education system with limited resources and incomplete or ineffective tools, misunderstood and excluded. This not only potentially contravenes the Canadian Charter of Rights and Freedoms; it can also reduce or eliminate an individual’s ability to make cultural, intellectual, work and family contributions as a citizen, with a serious and direct impact on that individual’s educational achievement, health and well-being, and a corresponding burden on the educational and health care systems.

The WebMoti is a tool for observing, with or without participating, the way peers in the classroom behave and socialize, using a powerful camera that can zoom in on the classroom. By being the “coolest kid” in the class (inside a computer) and being able to observe without having to participate in person, students with ASC are less prone to ostracism and bullying, which can translate into improved self-esteem and a corresponding reduction in absenteeism, depression and related health issues. The WebMoti can make students aware of their hearing and tactile senses and let them manipulate these so that they can attend lessons without being overwhelmed. Manipulating these thresholds may also stimulate sensory awareness and self-actualization [24, 29].
The main qualitative outcome of this project is improved participation in school, which may also translate into improved mental health for children with ASC through support for the sensory maturation process and a reduction in the impact of bullying and ostracization on these children. Using a technological intervention that combines a multi-sensory output system (tactile, auditory and visual modalities are available) with a real-time, connected representative (Webchair), children with ASC can control and mediate their sensory environment and their participation in the social setting of school. The main quantitative outcomes will be improved attendance, participation in school, and perhaps academic achievement scores.
The newest research shows that autism spectrum conditions (ASC) are not a defect but rather a difference [9]. According to the theory of the Socioscheme with the MAS1P (Mental Age Spectrum within 1 Person), autism is about accelerated cognitive and delayed social and physical maturation of the central nervous system [10, 13].
The number of children diagnosed with ASC is about 1 in 94 in Canada [26] and 1 in 100 in Europe [10]. A child with autism is often ashamed, feels suppressed, is angry, and amazed not to be heard. They are fiercely and regularly bullied and have an acute sense of failure. Children with autism tend to drop out of school for sheer fear of peers, of the educational system and run a high risk for depression and lower quality of life [29, 30].
Children with autism often have issues with sensory integration [28, 16], and it has been suggested that this results from a lack of synaptic pruning in the brain [5, 14]. For example, in terms of auditory sensing, typical maturation strengthens foreground sounds and pushes background sounds into the background. When maturation lags, autistic individuals hear everything as it presents itself, which produces a disturbing and painful sound situation. At school, concentration is almost impossible and children become exhausted, which potentially has a serious impact on the child’s health and wellbeing. Allowing children to explore and play with sensory information may help children with ASC better understand their sensory needs and self-advocate for necessary accommodations [6, 12, 24].
Children with ASC have significant sensory processing challenges compared with typically developing peers [20]. Educational programs rarely take this into account when working to accommodate children with ASC, and the ABA/IBI therapeutic approach currently widely used in Ontario does little to address cognitive overload. Our WebMoti approach recognizes the need for children with ASC to have opportunities to reduce cognitive/sensory overload through remote presence and, at the same time, to use this remote presence to fine-tune the level of interaction and engagement they have with their peers. The potential here is to support the creation of conditions whereby the child can gradually increase the amount of face-to-face interaction with peers through guided experience, in full knowledge of the opportunity to step back into our digitally mediated environment. At the same time, WebMoti allows the child to control and fine-tune the kind and amount of somatosensory stimulation necessary for homeostatic balance. Helping a child modulate sensory information based on their own personal needs not only has the potential to develop a greater understanding of their sensory needs and interests [17], but also has a clear influence on their behaviour and can lead to greater self-regulation [9].
Sensory substitution, such as vibration or visualisation replacing sound and vice versa (e.g., [1, 15]), offers opportunities to mediate sensory stimulation and allow an individual to obtain information through alternative means [2, 3, 4]. The Emoti-chair is a sensory substitution system; combined with the Webchair, it lets students with ASC use the sensory modality best suited to their informational needs when interacting with others at a distance and in the safety of their own environment. While both systems have been used independently and been shown to be effective in some domains (e.g., access to sound-based music for the Emoti-chair, and remote connectivity for hospital-bound children for the Webchair), no system known to the authors provides multi-sensory input/output across a distance specifically oriented to children with ASC, who often benefit from alternative stimuli. In this project, we will provide such a system by integrating the Webchair and Emoti-chair, and demonstrate its effectiveness through formative and summative evaluations as described in Section 4.
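The sound-to-touch mapping at the heart of such a system can be sketched in a few lines. The sketch below is illustrative only and does not represent the Emoti-chair’s actual signal chain; the sample rate, band count, frequency range and 0–255 intensity scale are assumptions chosen for the example.

```python
import numpy as np

def audio_to_vibration(frame, sample_rate=8000, n_motors=8, max_freq=1000.0):
    """Map one audio frame to drive levels (0-255) for a row of vibrotactile
    motors: low-frequency energy drives the first motors, higher-frequency
    energy the later ones (a simple sensory-substitution mapping)."""
    spectrum = np.abs(np.fft.rfft(frame))                # magnitude spectrum
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    edges = np.linspace(0.0, max_freq, n_motors + 1)     # band boundaries
    energies = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    peak = energies.max()
    if peak == 0:                                        # silence: motors off
        return np.zeros(n_motors, dtype=int)
    return np.round(255 * energies / peak).astype(int)

# A pure 100 Hz tone should drive only the lowest-frequency motor.
tone = np.sin(2 * np.pi * 100 * np.arange(800) / 8000)
levels = audio_to_vibration(tone)
```

In a real deployment the band boundaries and intensity curve would be tuned per child, consistent with the project’s emphasis on user control over sensory thresholds.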
Webchair in the Netherlands is the only company in the world using videoconferencing technology to give homebound autistic students the chance to integrate into conventional school classrooms. To date, over 20 students have been re-integrated into their classes using the technology and pedagogical program developed by the company, which gives the autistic child control over how and when they attend class. When students are too stressed to attend class physically, for whatever reason, they can come into class via the Webchair from home or from one of the dedicated distance learning centers set up as alternatives to remaining at home in isolation. The students can even attend class from a “safe room” set up in the student’s school as a third possibility, as sometimes they need to physically leave the classroom but still want to continue to be in class remotely.
This approach has been very successful as the students can control the amount of sensory input they are exposed to in relation to their level of stress at any given time. The students are encouraged to sometimes physically be in their classrooms but this decision is left up to them. It has been observed that the more the children actually attend class the more they connect with other students and begin to understand the complex social environment of the classroom.
The knowledge of how to connect and re-integrate autistic students, together with the dedicated piece of technological equipment (the Webchair), are the two things that will be transferred to Ontario. Webchair works closely with Dr. Martine Delfos, our leading research professional in the area of autism, and this knowledge will also be transferred to the Ontario team at Ryerson University.
Overall Objective
The main objective of this project is to develop and evaluate a multi-media, multi-sensory connection system, called Webmoti, which supports the social and educational needs of children and teenagers with autism. For those children who have issues with hearing, Webmoti will be evaluated with a variety of audio controls (e.g., stop, adjust volume levels) and sensory substitution techniques for sound alternatives (e.g., vibrotactile or visual representations) to suit different hearing needs.
The goal is to achieve a new way for students with ASC to participate in school, one that provides them with control and agency. However, it is important that this process fit within a school setting, given the need for technology, particularly video technology, to be present in the classroom. The Accessibility for Ontarians with Disabilities Act and the Intersection of Disability, Achievement and Equity document of the Toronto District School Board [8] provide for accessibility to be embraced by Ontario school boards, and this project is thus consistent with these mandates. The school systems in the Netherlands have begun to incorporate an earlier version of the WebMoti system, called the WebChair, with hundreds being installed over the past three years. However, consultation and pilot studies are required before the WebMoti system can be considered as a wide-spread solution for children with ASC in Ontario, as there are implications for staff training and for understanding some of the issues that can arise (e.g., occasionally sound can become distorted).

Objective 1: Prototype development
The first objective will be to combine the Webchair and Emoti-chair technologies into one prototype system. Combining the two systems may also involve adding other sensory substitution techniques, such as smell, following the research carried out in [18]. However, decisions regarding specific sensory preferences and technology options will be made in consultation with users. Part of Nolan’s contribution to the research will focus on the further development and testing of a wearable prototype, developed in a previous research project [18], that will provide real-time telemetry related to participants’ stress/anxiety levels.
A formative user-study method will be used to evaluate early iterations of the system with 4 to 5 participant users with ASC, assessing usability elements (ease of learning and use, and satisfaction) [27]. The 10-item System Usability Scale [7] will be used. In addition, the attitude-toward-well-being instrument the Life Satisfaction Matrix [19] and the Sensory Behaviour Scale [11] will be evaluated for their efficacy in this context. The objective of these evaluations is to gather users’ reactions to and performance with different versions of the prototype, engaging users in finding interaction problems and in problem-solving technological solutions for these identified problems during prototyping.
Data from the formative user studies will be analysed using qualitative methods such as thematic analysis [22] as there is insufficient power to use more than descriptive statistics. The results from the analysis will be fed back into the development process.
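For reference, SUS responses are conventionally converted into a single 0–100 score: odd-numbered items are positively worded and even-numbered items negatively worded [7]. A minimal scoring sketch:

```python
def sus_score(responses):
    """Convert ten SUS item responses (1-5 Likert) into the 0-100 SUS score.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are scaled by 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even
                for i, r in enumerate(responses))   # indices are odd items
    return total * 2.5

best = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # most favourable answers
neutral = sus_score([3] * 10)                     # all "neutral" answers
```

With only 4 to 5 participants, these scores would be reported descriptively rather than tested inferentially, consistent with the analysis plan above.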
Objective 2: Summative evaluation
Once the research team is satisfied that the system does not have major interaction issues, a longitudinal summative evaluation will occur. For this evaluation, systems will be placed with 16 participants in Ontario for at least one school term to assess the long-term efficacy of the system.
Participants will use WebMoti to attend school when they are unable to attend physically (which can vary from infrequently to often). Each session with WebMoti will be video/audio recorded. In addition, all stakeholders, including the student him/herself, teachers, parents, and healthcare providers, will be asked to complete a pre-study survey followed by bi-weekly surveys to track change and difference. Once the study term is complete, all stakeholders will be asked either to complete a summary survey or to participate in one-on-one interviews, depending on availability. The purpose of this last step is to gather impressions and attitudes towards the process and technology, likes/dislikes, and any issues that should be addressed in future work. We anticipate this phase will last 18 months, corresponding with school terms (e.g., not including summer months).
The quantitative data (responses to forced-choice questions) will be analyzed using repeated measures ANOVA and correlation to examine change over time. Qualitative data will be analyzed using thematic analyses. The results will then be used to inform a framework of sensory substitution techniques, ASC and health and wellbeing outcomes based on the instruments described in Objective 1.
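The repeated-measures analysis can be illustrated with a minimal one-way repeated-measures ANOVA sketch (rows are participants, columns are survey time points). This is the generic textbook computation under assumed example data, not the project’s actual analysis code; in practice a statistics package would be used.

```python
import numpy as np

def rm_anova(data):
    """One-way repeated-measures ANOVA (textbook partition of sums of squares).

    data: rows = participants, columns = repeated time points (e.g., scores
    from successive bi-weekly surveys). Returns (F, df_effect, df_error)."""
    x = np.asarray(data, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_time = n * ((x.mean(axis=0) - grand) ** 2).sum()  # effect of time
    ss_subj = k * ((x.mean(axis=1) - grand) ** 2).sum()  # between subjects
    ss_total = ((x - grand) ** 2).sum()
    ss_error = ss_total - ss_time - ss_subj              # residual
    df_time, df_error = k - 1, (n - 1) * (k - 1)
    f_stat = (ss_time / df_time) / (ss_error / df_error)
    return f_stat, df_time, df_error

# Three participants measured at two time points (illustrative numbers only).
f_stat, df1, df2 = rm_anova([[3, 5], [4, 6], [6, 7]])
```

Removing between-subject variability from the error term is what makes the repeated-measures design sensitive to change over time with as few as 16 participants.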
REFERENCES
[1] Abboud, S., Hanassy, S., Levy-Tzedek, S., Maidenbaum, S., & Amedi, A. (2014). EyeMusic: Introducing a “visual” colorful experience for the blind using auditory sensory substitution. Restorative neurology and neuroscience, 32(2), 247-257.
[2] Attwood, T. (2006). The complete guide to Asperger’s syndrome. Jessica Kingsley Publishers.
[3] Baranek, G. T. (2002). Efficacy of sensory and motor interventions for children with autism. Journal of autism and developmental disorders, 32(5), 397-422.
[4] Baranek, G. T., David, F. J., Poe, M. D., Stone, W. L., & Watson, L. R. (2006). Sensory Experiences Questionnaire: Discriminating sensory features in young children with autism, developmental delays, and typical development. Journal of Child Psychology and Psychiatry, 47(6), 591-601
[5] Bourgeron, T. (2009). A synaptic trek to autism. Current opinion in neurobiology, 19(2), 231-234.
[6] Broderick, A. A., and A. Ne’eman. 2008. Autism as metaphor: Narrative and counter-narrative. International Journal of Inclusive Education 12 (5–6): 459–576.
[7] Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability evaluation in industry, 189(194), 4-7.
[8] Brown, R. S. & Parekh, G. (2013). The intersection of disability, achievement, and equity: A system review of special education in the TDSB (Research Report No. 12-13-12). Toronto, Ontario, Canada: Toronto District School Board.
[9] Case-Smith, J., Weaver, L. L., & Fristad, M. A. (2014). A systematic review of sensory processing interventions for children with autism spectrum disorders. Autism, 1362361313517762.
[10] Delfos, M.F. (2016). Wondering about the world. About Autism Spectrum Conditions. Amsterdam: SWP.
Elsabbagh, M., Divan, G., Koh, Y. J., Kim, Y. S., Kauchali, S., Marcín, C., … & Fombonne, E. (2012). Global prevalence of autism and other pervasive developmental disorders. Autism Research, 5(3), 160-179.
[11] Harrison, J., & Hare, D. J. (2004). Brief report: Assessment of sensory abnormalities in people with autistic spectrum disorders. Journal of Autism and Developmental Disorders, 34(6), 727-730. doi:10.1007/s10803-004-5293-z.
[12] Harley, D., McBride, M., Chu, J. H., Kwan, J., Nolan, J., & Mazalek, A. (2016). Sensing context: Reflexive design principles for intersensory museum interactions. MW2016: Museums and the Web 2016.
[13] Hong, H., Kim, J. G., Abowd, G. D., & Arriaga, R. I. (2012, February). Designing a social network to support the independence of young adults with autism. In Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work (pp. 627-636). ACM.
[14] Johnson, M. H. (2005). Sensitive periods in functional brain development: Problems and prospects. Developmental psychobiology, 46(3), 287-292.
[15] Karam, M., Branje, C., Russo, F., Fels, D.I. (2010). Vibrotactile display of music on the human back. ACHI2010. Barcelona. Pp. 154-159.
[16] Kern, J. K., Trivedi, M. H., Grannemann, B. D., Garver, C. R., Johnson, D. G., Andrews, A. A.,… & Schroeder, J. L. (2007). Sensory correlations in autism. Autism, 11(2), 123-134.
[17] Kern, J. K., Trivedi, M. H., Garver, C. R., Grannemann, B. D., Andrews, A. A., Savla, J. S., & Schroeder, J. L. (2006). The pattern of sensory processing abnormalities in autism. Autism, 10(5), 480-494.
[18] Koller, D., McPherson, A., Lockwood, I., Blain-Moraes, S., & Nolan, J. (Submitted). The Impact of Snoezelen in Pediatric Complex Continuing Care: A Pilot Study. Journal of Pediatric Rehabilitation Medicine
[19] Lyons, G. (2005), The Life Satisfaction Matrix: an instrument and procedure for assessing the subjective quality of life of individuals with profound multiple disabilities. Journal of Intellectual Disability Research, 49: 766–769. doi: 10.1111/j.1365-2788.2005.00748.x
[20] Marco, E. J., Hinkley, L. B., Hill, S. S., & Nagarajan, S. S. (2011). Sensory processing in autism: a review of neurophysiologic findings. Pediatric Research, 69, 48R-54R.
[21] McBride, M., Harley, D., Mazalek, A., & Nolan, J. (2016). Beyond Vapourware: Considerations for Meaningful Design with Smell. CHI’16, Extended Abstracts on Human Factors in Computing Systems. ACM. San Jose, U.S.
[22] Miles, M.B., Huberman, M.A., Saldana, J. (2014). Qualitative Data Analysis. Sage Publishing: London
[23] Nielsen, J. (1993). “Iterative User Interface Design”. IEEE Computer vol.26 no.11 pp 32-41.
[24] Nolan, J., McBride, M., & Harley, D. (2016; draft). Making sense of my choices: an autistic selfadvocacy handbook. RE/Lab working paper series.
[25] Osterwalder, A. (2016). Business Model Canvas Tool. Retrieved Feb. 22, 2016 from http://businessmodelgeneration.com/canvas/bmc.
[26] Ouellette-Kuntz, H., Coo, H., Lam, M., Breitenbach, M. M., Hennessey, P. E., Jackman, P. D.,… & Chung, A. M. (2014). The changing prevalence of autism in three regions of Canada. Journal of autism and developmental disorders, 44(1), 120-136.
[27] Rogers, Y., Sharp, H., Preece, J. (2011). Interaction Design: beyond human-computer interaction. Wiley: Toronto.
[28] Stoddart, K. P. (2005). Children, youth and adults with Asperger syndrome: Integrating multiple perspectives. Jessica Kingsley Publishers.
[29] Strang, J. F., Kenworthy, L., Daniolos, P., Case, L., Wills, M. C., Martin, A., & Wallace, G. L. (2012). Depression and anxiety symptoms in children and adolescents with autism spectrum disorders without intellectual disability. Research in Autism Spectrum Disorders, 6(1), 406-412.
[30] van Heijst, B. F., & Geurts, H. M. (2014). Quality of life in autism across the lifespan: A meta-analysis. Autism, 136-153.

TARI – Telepresence Autism Research Initiative

DATE – 2011
DISCIPLINE – Science

Webchair Autism Transition Technology (WATT) … a bridge to communication, education, transformation, and personal freedom.

A. Harris Stone, Ed.D., Chancellor, The Graduate Institute
Andrew D. Summa, Ph.D., Ed.S., Ed.D., Executive Director, National Center for Electronically Mediated Learning

The increased prevalence of Autism Spectrum Disorders (ASD) in children aged 3-12 (i.e., one (1) out of ninety-one (91) children is diagnosed with ASD, Centers for Disease Control Statistics issued in 2010) has led to a clarion call for educational interventions that mitigate the current constraints in educating children who are afflicted by one or more of the Autism Spectrum Disorders (i.e., Autism, Pervasive Developmental Disorder – Not Otherwise Specified [PDD-NOS] and Asperger Disorder).¹

For the purpose of establishing a working definition of ASD for this article, the term Autism Spectrum Disorders (ASDs) is defined as a group of disabilities characterized by atypical development in three behavioral domains: 1) social interaction; 2) language, communication and imaginative play; and 3) diminished range of interests and activities.²

However, the thrust of this article is not to examine reasons for the increased prevalence of Autism Spectrum Disorders. Rather, this article is intended to herald a promising new technology, referred to as the Webchair Autism Transition Technology, that holds the potential to address the educational, social, and communication needs of students diagnosed with an Autism Spectrum Disorder.

The Webchair Autism Transition Technology, referred to as “WATT”, is designed to connect autistic individuals to their regular school environment and/or community through the use of a revolutionary assistive technology that utilizes telepresence to effect three measurable outcomes: 1) promote self-sufficiency and interpersonal (i.e., social) skills that facilitate transitions from autism center-based placements to inclusionary school classrooms or workplace settings; 2) enhance the academic, vocational, sensory, and psychological development of individuals on the autism spectrum; and 3) facilitate learning that converts disparate information into personal meaning.

Webchair Operational Design
Webchair is an enhanced videoconferencing technology designed to link hospitalized, homebound, or autistic children to their school classrooms. Controlled by the child at his/her remote location (i.e., hospital, home or autism center), it places the child virtually in his/her regular (i.e., school-site) classroom – in real time and with dynamic communication potential. Webchair integrates children with their regular classroom activities by enabling them to function and interact as if actually at school. It creates a “telepresence” in the classroom where teachers and classmates interact with hospitalized, homebound or autistic children located at remote sites. Thus, Webchair brings a dynamic force to the life of the hospitalized, homebound, or autistic child. As a result, it eases the tension associated with placement at an autism center or hospital, mediates anxiety, maintains the child’s educational, social, and familial networks, and introduces fun and hope into the child’s difficult daily life while hospitalized or assigned to a remote autism center.

How Webchair Works
Webchair accomplishes its goals through highly developed, yet simply operated, assistive technology that provides seamless videoconferencing between the remote site and the school classroom. Using two systems, with both units controlled by the child, Webchair allows the child to interact with the teacher and engage in all classroom activities and school functions, while communicating directly with the classroom practitioner (i.e., the teacher) and speaking in real-time to friends and classmates throughout the school day.

The Autism Transition Technology. . . Webchair in transformation
One perspective concerning the profound increase in diagnoses of children on the autism spectrum is grounded in the relationship between new media (i.e., digital technology, iPod, iPad, etc.) and its deadening effect on language and social development and, by extension, on the neural development of young children.³ With the advent and popularity of infant/early childcare programs, language development as an integral component of neural development has been seriously truncated by many custodial-based childcare programs. In essence, the behaviors demonstrated by many autistic children, ages 5-12, may be the exacerbated result of their relationship with new media technology. Clearly, one-way digital technology has created a generation conditioned by the fragmented focus induced by deep and repeated involvement with pseudo-interactive technology.

The Webchair Autism Transition Technology (WATT) provides a developmental bridge connecting the world of the autistic child to the environmental trappings and institutions of society. It expands on children’s early operant conditioning induced by many childcare facilities by providing enriched opportunities for language development in real-world applications. The Webchair technology allows children on the autism spectrum to assume full control of their personal interactions, including formal and informal communication. It also accelerates their cognitive development in both “remote” (i.e., autism center) educational settings and/or authentic classroom settings (i.e., public school classrooms). The Webchair technology offers opportunities that mitigate traditional programmatic constraints that inhibit educational, social, sensory, and cognitive development. The technology also impacts the autistic child’s access to, and comfort level with, the implicate order of his/her perceived external world.

The Webchair Autism Transition Technology (WATT) is designed to replace the autistic child’s current “one-way digital technology” world with highly interactive Webchair technology-based experiences. The Webchair assistive technology serves as the singular “Portal of Possibility” that provides access to the closed world of the autistic individual.

Realities of Autism
Autism, a developmental disability with a complex set of psycho-social implications, is manifest as a developmental disorder in a variety of ways in afflicted individuals. Although a high percentage of children and adults diagnosed with Autistic Spectrum Disorders possess above-average intelligence, they are often constrained from participation in traditional learning environments (e.g., classrooms) by their inability to communicate directly with others, lack of control of their emotional state, or inappropriate social behavior.4 In some cases, these individuals are mysteriously captured in a world of silence. Their interactions with peers and adults are often characterized by having little or no eye contact, absence of speech, and the inability to be in control of their own emotions, cognition, and spirit. The condition is profoundly debilitating to the individual, extremely disruptive to the achievement of productive educational outcomes, and extraordinarily costly to families, schools, communities and, by extension, to the nation’s culture.

Webchair Engagement Protocol: The Parallax Phenomenon
The Webchair Autism Transition technology addresses the visual and social engagement issues so critical to educating children diagnosed with autism. The Webchair technology utilizes the Parallax Phenomenon, available only through the medium of “telepresence”, as an instrument to transform an autistic person’s behavioral traits. The parallax phenomenon addresses the “direct view (i.e., eye gaze) aspect of a child’s interactive diversion,” a behavior common to children with autism, by facilitating less threatening child-adult visual and social interactions. Utilizing the naturally occurring downward gaze intrinsic to the Webchair telepresence technology, the child’s visual and cognitive connectivity to the classroom environment is supported by the effect of a slightly diverted interactive gaze of the teacher as projected through the Webchair “telepresence” camera technology. This subtle aspect is crucial to enabling student-teacher connectivity through the phenomenon of priming, the condition in which readiness and receptivity for learning becomes manifest.5 Thus, learning is facilitated by the indirect eye gaze phenomenon (i.e., Parallax Phenomenon) of the revolutionary Webchair technology.
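The size of this diverted gaze can be estimated with simple trigonometry: the angular offset grows with the camera’s displacement from the on-screen eyes and shrinks with viewing distance. The sketch below is illustrative; the offset and distance values are hypothetical, not measurements of the Webchair hardware.

```python
import math

def gaze_offset_deg(camera_offset_m, viewing_distance_m):
    """Angle (degrees) between a direct gaze into the viewer's eyes and the
    gaze actually captured, when the camera sits camera_offset_m above the
    on-screen eyes of the remote participant."""
    return math.degrees(math.atan2(camera_offset_m, viewing_distance_m))

# A camera 5 cm above the screen, viewed from 60 cm away, produces a gaze
# diverted by a few degrees; a camera in the eye line produces none.
slight = gaze_offset_deg(0.05, 0.60)
direct = gaze_offset_deg(0.0, 0.60)
```

On this account, the small, consistent downward divergence produced by the camera placement is precisely what makes the interaction feel less confrontational than direct eye contact.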

In the cognitive domain, the Webchair Autism Transition Technology (WATT) autonomically informs the child’s implicate cognitive, social, sensory, and cultural experience. It allows for the child’s explicate interaction with external environments through the internalization of language, creation of personal meaning, ordering and filtering of sensory stimuli, interpreting and integrating symbolic language, and facilitating personal communicative processes. In the affective domain, the Webchair technology facilitates the stabilization of emotion-based behaviors. Moreover, the technology facilitates the autistic child’s capacity to experience the explicate world as perceived by non-autistic individuals.

The results of two pilot studies conducted using Beta (i.e., PEBBLES) telepresence systems from FY 2007-09 at Blythedale Children’s Hospital-Mt. Pleasant School District, Valhalla, NY, and at the Center for Discovery, Harris, NY, the northeast’s largest autism center, provide preliminary data supporting the efficacy of the Webchair technology as a developmental bridge connecting the autistic child to real-world experiences in the classroom, in the home, and in the community.

Scope and Prevalence of Autism in the US
• One (1) in ninety-one (91) children (early childhood – elementary aged) is diagnosed with autism⁶ (Source: Autism Spectrum Disorder Prevalence Statistics from the Centers for Disease Control and Prevention, 2009).
• 1.5 million adults are diagnosed with autism⁷ (Source: Centers for Disease Control and Prevention).
• Autism Spectrum Disorders represent the fastest-growing developmental disability in the United States.
• Trends in the Incidence of Autism: 1987-2007⁸ (U.S. Department of Education’s Twenty-First Annual Report to Congress on the IDEA Statistics)
• U.S. school-age population diagnosed with autism, 1987-2009⁹
> (1987) 1 in 3,000 (Centers for Disease Control statistics)
> (2005) 1 in 166 (Centers for Disease Control statistics)
> (2009) 1 in 91 (Centers for Disease Control statistics)
> (2017) 1 in 30 (Projected Statistic – Centers for Disease Control)
• Annual cost for ASD services: $90 billion (2008)¹⁰
• Cost of life-long ASD care can be reduced by two-thirds with early diagnosis and interventions such as the WEBCHAIR technology (Source: Jarbrink, K., Knapp, M. 2001. The Economic Impact of Autism)¹¹
• By 2017, the annual cost for providing services to individuals with autism will be $400 billion (Source: Autism Society of America)¹²

Implications of Early Diagnosis and Treatment
Research indicates that early diagnosis is associated with dramatically better educational and social outcomes for individuals with autism. The Centers for Disease Control and Prevention state that "it is critical that we treat the developmental disability of autism as a condition of urgent public health concern, [and] do all we can to identify children's learning needs, and start interventions as early as possible" [13].

Comparison of Autism Disorder Rates to Rates of Other Disabilities and Diseases Diagnosed in Children
• Down Syndrome, the most commonly identified cause of mental retardation, occurs in 1 in 800 births [14].
• Juvenile Diabetes occurs in 1 in 400 children and adolescents [15].
• Autism Spectrum Disorders occur in 1 in 91 children [16].

Webchair Project Accomplishments (to date)

In its first phase, which used the earlier PEBBLES telepresence systems and extends from 2001 to the present, the Webchair technology demonstrated its efficacy in mitigating the constraints that many Local Educational Agencies (i.e., school districts) faced in educating medically fragile children. The Webchair technology enabled numerous hospitalized or homebound children to maintain virtual contact with their educational community. It also allowed medically fragile students to sustain their academic performance and social connectivity to a degree that not only fulfilled their educational needs but also contributed to their recovery from challenging medical afflictions.

The National Center for Electronically Mediated Learning (NCEML), located in Bethany, Connecticut, implemented a national demonstration and dissemination of the Webchair initiative in 2001 that is still in operation today. Since the Project’s inception, the Webchair Assistive Technology has been deployed in eighteen hospitals and twenty children’s homes, nationwide.

The National Center for Electronically Mediated Learning, Inc. (NCEML) currently maintains the proprietary rights to the Webchair technology. Since 2001, the NCEML has received $2.9 million from the U.S. Department of Education to support the Webchair initiative. Clearly, this project has been a work of spirit and commitment.

In FY 2007, personnel from the National Center for Electronically Mediated Learning, working in collaboration with Blythdale Children's Hospital (NY) and the Center for Discovery (NY), initiated an eighteen-month pilot study on the application and efficacy of the technology with children with autism. Preliminary data from the pilot demonstrated that the Webchair Autism Transition Technology may be an effective tool in addressing the educational and social needs of children on the autism spectrum. The promising results of this application demand that the Webchair Assistive Technology be expanded to address the educational needs of autistic children at the national level. Hence, the National Center for Electronically Mediated Learning is currently seeking funding to deploy and empirically measure the efficacy of the Webchair Autism Transition Technology.

The scope and severity of the autism epidemic require a bold and innovative approach. The Webchair Autism Transition Technology (WATT) may prove a powerful remedy for this disorder.

A. Harris Stone, Ed.D. is the Founder and Chancellor of The Graduate Institute in Bethany, CT, where he pursues his interest in developing educational offerings that effect the positive transformation of culture. Andrew Summa, Ph.D., Ed.S., Ed.D., is the Vice President for Institutional Advancement at The Graduate Institute, where he oversees the growth and expansion of the Institute's Master's degree programs in emerging fields of inquiry. Dr. Summa also serves as the Executive Director of The National Center for Electronically Mediated Learning, Inc.
References

1. CDC. Prevalence of Autism Spectrum Disorder – Autism and Developmental Disabilities Monitoring Network. Surveillance Summaries, Jan. 2010.

2. Yeargin-Allsopp, M., Rice, C., Karapurkar, T., Doernberg, N., Boyle, C. and Murphy, C. Prevalence of Autism in a U.S. Metropolitan Area. JAMA. 2003; 289: 49-55.

3. Rutter, M. Incidence of Autism Spectrum Disorders: Changes Over Time and Their Meaning. Acta Paediatrica. 2005; 94: 2-15.

4. Hertz-Picciotto, I. and Delwiche, L. The Rise in Autism and the Role of Age at Diagnosis. Epidemiology. 2009; 20: 84-90.

5. Posserud, M.B., Lundervold, A.J., and Gillberg, C. Autistic Features in a Total Population of 7-9 Year-Old Children Assessed by the ASSQ (Autism Spectrum Screening Questionnaire). Journal of Child Psychology and Psychiatry. 2006; 47: 167-175.

6-16. CDC, National Center for Health Statistics. Estimates of the July 1, 2000 – July 1, 2009 United States resident population from the vintage 2007-09 postcensal series by year, county, age, sex, race, and ethnicity. Prepared under a collaborative arrangement with the U.S. Census Bureau. Bethesda, MD: U.S. Department of Health and Human Services, CDC, National Center for Health Statistics; 2007. Available at www.cdc.gov/nchs/bridged_race/data_documentation.htm, # Vintage 2007-09. Accessed December, 2009.

Liberation

DATE – 2010
DISCIPLINE – Science
MEDIUM –
STATUS – Funded by the European Space Agency Technology Centre
WEBLINKS
http://www.olats.org/space/13avril/2004/mono_index.html
http://www.esa.int/spaceinimages/Images/2011/05/AquaCinema_s_monitor_shows_space_view_in_pool
http://esamultimedia.esa.int/docs/TTP/space-to-business-2011-01.pdf

I have been involved in the space art community for 25 years with the artistic aim of exploring the relationship between our perceptions of the environment and how we conceptualize it as a world view. In the early stages I experimented with panoramic photographs which I wanted to have shot in space, and later developed a technology similar to today's QuickTime VR/Google Streetview that displayed panoramic imagery to viewers wearing virtual reality head-mounted displays; at SIGGRAPH 1990 I demonstrated a version called Videosphere with VR pioneer Jaron Lanier. In 1993, with the help of Gord Harris of the IMAX Corporation, I conducted a test using a 2-meter-wide screen in a pool to demonstrate that movies can be projected underwater onto the inside of domes. As part of the experiment I entered the pool with a diving mask and found seeing the earth in this new way to be a very powerful experience. In 2004 I was a speaker at the 8th Space Art conference held at the European Space Agency in Noordwijk, Netherlands, and in 2009-2010 I was an artist in residence at the same facility, invited to demonstrate the core underwater video technology.

I developed the core underwater cinema technology in 2010 when I was an artist in residence at the European Space Agency's technology research centre (ESTEC) in Noordwijk, Netherlands (http://www.esa.int/esaMI/ESTEC/index.html), where I built and tested two versions of the underwater theatre. The project was showcased on Discovery Channel in 2010 and represents 25 years of work experimenting with and building various technologies to exhibit the concept. With the physical task of building the core display technology complete (thanks to the European Space Agency), I wish to focus on the creation of an interactive element, imagery and an exhibition at V2.

The core concept behind the project is the realization of a way to de-contextualize video imagery for viewers so that they can see the image in a new way. By immersing the participant underwater in a weightless state, people lose their sensation of a local environment and "feel" the work in a new and unique way. Everyone who participated in the tests we did at ESA commented on the realism of the experience and felt the sensation astronauts have called the "Overview Effect". As part of my research at ESA I contacted astronaut Edgar Mitchell (Apollo 14), the sixth person to walk on the moon, and we talked about how it has proven quite difficult to communicate more than just a portion of this potentially revolutionary experience to other people. LIBERATION is a project which will overcome this physical barrier by providing people with a convincing, weightless simulation of this experience.

“For V2 the project LIBERATION is a next step in the research and understanding of Bodily perception in general and in media environments, or variable gravitational environments specifically. This research lines up with the Unstable Media practices that V2 has been developing over the last 20 years in the domain of the arts but always within an interdisciplinary setting. The project of Graham Smith will be a major contribution to this as it brings together a perfect match of theoretical issues and direct experiences around perception and Media environments”.
Alex Adriaansens – Director V2
The world would be a better place if every person could spend an hour space-walking above the earth’s surface to experience for themselves the world as it truly is, a fragile beautiful sphere spinning through the vastness of space. We still live on an earth that seems flat because that is how we physically experience it in our everyday lives. It is how we express it as we see the sun rising in the east, moving across the sky and finally setting in the west. 
Only the few people who have actually been in space, those that have felt for themselves the so called Overview Effect, understand and perceive the world in this wholly new way. The majority of humanity though never truly comprehends this reality due to our inability to easily break the bonds of gravity and propel our bodies into space to experience first hand the world as an interconnected whole.
“Liberation” is a space art project which re-creates the Overview Effect by enabling people to experience a weightless simulation of floating in space and traveling around the earth. It is an underwater movie theater (Submersive Cinema) constructed in a swimming pool, projecting imagery onto a 7-meter-wide, bowl-shaped screen situated underwater. The 35mm movie projector, held by a mechanical arm over the center of the screen, points straight down. From 1 meter underwater, in a waterproof housing, it projects a 120-degree image of the earth from orbit onto the underwater screen. Six people at a time, equipped with scuba masks, snorkels and neutral-buoyancy vests, immerse themselves in the pool to experience what astronauts call the Overview Effect for themselves. In addition, a railing running around the edge of the pool allows another 50 audience members to gain a sense of the experience without getting wet. The ultimate goal is the placement of a high-resolution video camera with a panoramic lens on the space station or a satellite, continuously transmitting a live world-view from orbit.
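As a rough sanity check on these dimensions, a simplified flat-screen model relates the 120-degree projection cone to the 7-meter screen width. It ignores the bowl curvature and the refraction of water, both of which matter in the real installation, so the figure is only indicative:

```python
import math

def throw_distance(screen_width_m: float, cone_angle_deg: float) -> float:
    """Distance from projector lens to screen plane needed for a
    projection cone of the given full angle to span the screen width."""
    half_angle = math.radians(cone_angle_deg / 2)
    return (screen_width_m / 2) / math.tan(half_angle)

# Figures from the text: a 7 m wide screen and a 120-degree image.
print(f"required throw distance: {throw_distance(7.0, 120.0):.2f} m")  # ~2.02 m
```

This puts the projector roughly two meters above the screen plane, consistent with a housing suspended about a meter below the surface of a deep pool.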
I have been working on the “Liberation” project since 1982, when I began taking panoramic photographs which I wanted to have shot in space. I developed a technology that displays panoramic imagery to viewers wearing virtual reality head-mounted displays, and at SIGGRAPH 1990 in Dallas I demonstrated a version called Videosphere with VR pioneer Jaron Lanier. With the help of the IMAX Corporation, in 1993 I conducted a test using a 2-meter-wide screen in a pool to demonstrate that movies can be projected underwater onto the inside of domes. As part of the experiment I entered the pool with a diving mask and found seeing the earth in this new way to be a very powerful experience.
I am currently experimenting with different ways to create the imagery for the project by combining satellite databases, such as the Geosphere Project by space art pioneer Tom Van Sant, with computer-graphics clouds. Another approach I am researching is the use of an optical printer to re-film enlarged still images of the earth to simulate a movie of the earth from orbit. Recently, space science researchers from Canada's York University have expressed an interest in using Submersive Cinema to perform studies into human perceptual cues while weightless.
At the dawn of the 21st century the human race is finally beginning to understand how physically connected we all are. To move forward as a species we must overcome the differences which separate us and act as one planet. LIBERATION is an artwork that will help people understand this ideal and allow them to see their world in a new way.

Emotichair

DATE – 2007- 2009
DISCIPLINE – Science
MEDIUM – Interactive Display
STATUS – Funded by a joint NSERC and Canada Council grant
WEBLINKS

http://imdc.ca/asid/ASID.html
http://www.ryerson.ca/psychology/emotichair/
https://www.newscientist.com/article/dn16738-vibrating-chair-lets-users-rock-to-the-beat/
http://www.ryerson.ca/news/news/Research_News/20090227_Emoti/
http://torontoist.com/2009/02/emoti-chairs_at_clintons/
http://www.torontosun.com/news/torontoandgta/2009/02/26/8536856-sun.html
https://www.thestar.com/life/health_wellness/2008/07/02/emotichair_delivers_good_vibrations_to_deaf.html

Often understood to be diametric opposites, art and technology are two central aspects that permeate my artistic philosophy. Artists purvey their thoughts and feelings through creative works, and technology enables them by multiplying their choices, offering alternative methods and mediums. Because the mind is perpetually limited by human physicality (McLuhan 19XX), the translation of thought into tangible physical objects outside the body is, at best, an attempt to creatively express or represent the intended meaning. Working within the limitations of television and film, artists must express their message through the video and audio channels, shaping their message to fit this means of dissemination. A problem surfaces, however, when an audience member is unable to access one of these avenues of meaning-making: the deaf or hard of hearing, for example. Although several strategies have been developed to address this problem, evident in the widespread adoption of closed captioning, those who are deaf or hard of hearing receive a diluted and potentially highly subjective interpretation of non-dialogue sound information such as music and sound effects, as spoken dialogue is prioritized and only so much can fit within the available caption space and time.

I want to construct a multimedia system that allows people who are deaf and hard of hearing the opportunity to access and experience sound information through alternative means, using visual and tactile media to represent meaning that is more indicative of the film or television content producer’s original goal. Working with the Ryerson human computer interaction team, I want to enable these individuals to feel and see sound information, privileging the auditory experience as a whole rather than focusing solely on information gained from dialogue alone.

My role as the artist in this project will be to determine how different types of sound information should be expressed, based on a variety of factors. Music, for example, ranges in style, mood and intensity while spoken dialogue varies according to the speaker’s intonation, speed, and the words that are emphasized, among other factors. The translation of these different factors into visual and tactile outputs requires my unique perspective, as I am knowledgeable and experienced in morphing my ideas into unconventional physical and multimedia representations.
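One way to sketch this translation, loosely following the band-splitting idea behind the Model Human Cochlea work cited at the end of this section, is to divide the audio spectrum into a few frequency bands and drive one vibrotactile actuator per band. The function and parameters below are illustrative assumptions, not the actual Emoti-Chair implementation:

```python
import numpy as np

def audio_to_actuators(samples: np.ndarray, rate: int,
                       n_channels: int = 8,
                       f_lo: float = 50.0, f_hi: float = 1000.0) -> np.ndarray:
    """Split an audio frame into logarithmically spaced frequency bands
    and return one normalized vibration intensity per actuator channel."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)
    energy = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    peak = energy.max()
    return energy / peak if peak > 0 else energy

# A pure 220 Hz tone concentrates its energy in a single low-mid band.
rate = 8000
t = np.arange(rate) / rate
print(audio_to_actuators(np.sin(2 * np.pi * 220 * t), rate).round(2))
```

Musical factors such as mood and emphasis would then modulate how each band's energy is rendered, which is precisely where the artistic judgment described above comes in.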

In one of my projects, for example, I wanted to show that while technology can enable communication over distances, something is lost when an individual's presence is limited to the space and functionality of a computer screen. I created a presence chair that uses a screen at the back of a chair to display the image of the remote person in a video conference. This person then looks like she is sitting in the chair. She can also swivel the chair from the remote location to show that she is directing her attention to someone else in the physical room. Using similar devices, she then has some of the important physical movements that are commonly used in meetings or public speaking events (such as turning to see a speaker). This project was part of the Sentient Creatures lecture series, where Jaron Lanier, renowned digital media artist and computer scientist, presented a "guest" lecture from a remote location to an audience of sixty. Jaron was able to interact with his audience, for when he moved, the chair mimicked a variety of his movements, allowing him to have a physical presence within the room (see http://connectmedia.waag.org/media/SentientCreatures/jaronlanier.mov).

I have made several preliminary sketches (see Figure XX) to illustrate my initial thoughts on the possible functions and aesthetic look of the device that we are constructing. I am eager to develop a relationship with the Ryerson team, for I know that this project has the ability to cater to a community that has long been marginalized by mainstream entertainment. By providing alternative representations of sound information for the deaf and hard of hearing, I seek to create an entertainment experience that is as rich and meaningful for deaf and hard of hearing individuals as the audio visual experience that most hearing individuals are able to enjoy.

For the most part, my work as an artist has centered on the interrelation of art and media. In creating multimedia installations, I challenge my audiences to embrace that which is created through the interplay of computers, art and media, and to acknowledge their preconceptions. For example, my art project "Sentor PoBot Goes to Washington" used an early videoconferencing system built into a wireless robot. Its purpose was to allow poets and musicians, from various remote locations around the world, to communicate with people on the streets in front of the White House in Washington, DC. Artists were able to express their opinions on a "virtual soap box" while audience members on the street experienced a different physical and audio-visual form of the artist. Audience members were able to see how art and music, with the assistance of technology, could give an artist presence within a remote location which he would otherwise have been unable to access.

Working with Jeff Mann and Michelle Teran, I created the "Live Form Telekinetics Projects," which occurred simultaneously at InterAccess in Toronto and the WAAG in Amsterdam. We worked with eight artists to create a telepresence dinner linked by various telepresence sculptural objects. Physical objects were duplicated on each side, and when one was manipulated in real space an equivalent manipulation occurred on the other side. For example, if someone clinked the wineglass in Toronto, a robotic mechanism clinked the equivalent wineglass in Amsterdam. A telepresence table was built where half the table contained physical objects and the other half contained a video projection from the remote location. If a person in the remote location placed their hand on the table, it would appear as an actual-size hand on the virtual portion of the table at the other location.


FULL CITATION (TITLE/REFERENCE), categorized as refereed journal articles, conference presentations/posters, or other (including technical reports, non-refereed articles, etc.)

Accepted/Published

Karam, M., Russo, F. and Fels, D. (2009). Designing the Model Human Cochlea: An ambient crossmodal audio-tactile display. IEEE Transactions on Haptics: Special Issue on Ambient Haptic Systems. 2(3). 160-169.

Branje, C., Maksimowski, M., Karam, M., Russo, F. & Fels, D.I. (in press). Vibrotactile display of music on the human back. Proceedings of the Third International Conference on Advances in Computer-Human Interactions (ACHI 2010), Barcelona.

Karam, M., Branje, C., Russo, F., Fels, D.I. (accepted). The EmotiChair: An interactive crossmodal tactile music exhibit. ACM CHI, Atlanta, April 2010.

Branje, C.J., Karam, M., Russo, F., Fels, D.I. (2009). Enhancing entertainment through a multimodal chair interface. IEEE Symposium on Human Factors and Ergonomics. Toronto.

Karam, M., Nespoli, G., Russo, F. and Fels, D.I. (2009). Modelling perceptual elements of music in a vibrotactile display for deaf users: A field study. Proceedings of the Second International Conference on Advances in Computer-Human Interactions (ACHI 2009). Cancun.

Branje, C., Maksimowski, M., Nespoli, G., Karam, M., Fels, D.I. & Russo, F. (2009). Development and validation of a sensory-substitution technology for music. 2009 Conference of the Canadian Acoustical Association. Niagara-on-the-Lake.

Karam, M., Russo, F., Branje, C., Price, E., & Fels, D.I. (2008). Towards a model human cochlea: Sensory substitution for crossmodal audio-tactile displays. Proceedings of Graphics Interface. Windsor. pp. 267-274.

Karam, M. & Fels, D.I. (2008). Designing a model human cochlea: Issues and challenges in crossmodal audio to touch displays. Invited paper: Workshop on Haptics in Ambient Systems. Quebec City.

Feel the Music: An accessible concert for deaf or hard of hearing audiences. Live music concert. Clinton's Tavern, Toronto. (2009)

Feel the Music: Concert at Ontario Science Centre. Three live bands and vibrochair. (2009)

Emoti-chair: Five-month installation exhibit at Ontario Science Centre, Weston Family Ideas Centre. (2009)

Teleconferencing robot with swiveling video monitor

DATE – 2005
DISCIPLINE – Science
MEDIUM – Patent
STATUS – Assignees: Telbotics Inc., Ryerson Polytechnic University, University of Toronto
WEBLINKS
Patent number: 7123285

Abstract: A teleconferencing robot for enabling a remote conferee to project a sense of presence into a group meeting. The teleconferencing robot includes: a base; a video monitor movably mounted to the base for receiving and displaying an image of the remote conferee; an attention getting device for getting the attention of conferees in the group meeting; a control device mounted on the base for moving the video monitor and actuating the attention getting device in response to an input control signal derived from a remote signal generated by the remote conferee and sending an outgoing data signal to the remote conferee providing feedback to the remote conferee from the robot; and the video monitor and attention getting device move in response to the input control signal to enable the remote conferee to project a sense of presence into the group meeting, and to confirm the movement by the outgoing data signal.
Type: Grant
Filed: May 31, 2005
Date of Patent: October 17, 2006
Assignees: Telbotics Inc., Ryerson Polytechnic University and the University of Toronto
Inventors: Graham Thomas Smith, Deborah Ingrid Fels, Jutta Treviranus
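The command-and-confirmation loop this abstract describes can be sketched as a pair of message types. The field names below are hypothetical, chosen only to mirror the behaviors named in the claims, not taken from the patent itself:

```python
from dataclasses import dataclass

@dataclass
class ControlSignal:
    """Input control signal derived from the remote conferee's commands."""
    pan_degrees: float   # swivel the video monitor left or right
    attention: bool      # trigger the attention-getting device

@dataclass
class FeedbackSignal:
    """Outgoing data signal confirming what the robot actually did."""
    pan_degrees: float
    attention_fired: bool

def apply_signal(cmd: ControlSignal) -> FeedbackSignal:
    # A real robot would drive motors here; this stub only echoes the
    # resulting state back, which is the confirmation loop the abstract names.
    return FeedbackSignal(cmd.pan_degrees, cmd.attention)

fb = apply_signal(ControlSignal(pan_degrees=30.0, attention=True))
print(fb)  # FeedbackSignal(pan_degrees=30.0, attention_fired=True)
```

The key design point is that the feedback signal closes the loop: the remote conferee learns whether the projected gesture actually happened, rather than commanding blind.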

Video conferencing apparatus

DATE – 2002
DISCIPLINE – Science
MEDIUM – Patent
STATUS – Assignee: Telbotics Inc.
WEBLINKS
Patent number: 6784916

Abstract: An apparatus and system for improving the projection of a remote conferee's presence and improving eye contact between the remote video conferee and proximate conferee during a videoconference is disclosed. The image of the remote conferee's face is shown on a video monitor with a camera located along the eye level of the image of the remote conferee's face, and within the interocular distance of about 1.5 inches to 3 inches. A feedback screen showing the image of the proximate video conferee is also located near the camera and preferably within the interocular distance. Because the camera is within the interocular distance, the proximate conferee will appear, to the remote conferee, to be looking at the eyes of the remote conferee when looking at the monitor. This effect is accentuated when the proximate conferee uses the feedback image, which is near the camera.
Type: Grant
Filed: February 11, 2002
Date of Patent: August 31, 2004
Assignee: Telbotics Inc.
Inventor: Graham Thomas Smith
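The eye-contact trick in the patent above rests on simple geometry: the smaller the camera's offset from the on-screen eyes relative to the viewing distance, the smaller the apparent gaze deviation. A quick sketch, where the 24-inch viewing distance is an assumed figure rather than a number from the patent:

```python
import math

def gaze_offset_degrees(camera_offset_in: float, viewing_distance_in: float) -> float:
    """Apparent gaze deviation when the camera sits a small distance from
    the on-screen eyes of the remote conferee."""
    return math.degrees(math.atan2(camera_offset_in, viewing_distance_in))

# Camera 2 inches from the displayed eyes, viewer 24 inches from the screen.
print(round(gaze_offset_degrees(2.0, 24.0), 1))  # 4.8
```

A deviation of a few degrees falls within the range people typically read as direct eye contact, which is why keeping the camera within the interocular distance is enough.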

PEBBLES

DATE – 1996-2001
DISCIPLINE – Science
MEDIUM – Telepresence
STATUS – Published in 8 journals

SELECTED SCIENTIFIC JOURNALS & CONFERENCE PROCEEDINGS
2001 TeleMedicine 2001. Fels, D.I., Williams, L., Smith, G., Treviranus, J. The Supply Student. June 2001, Montreal, Canada.

1999 Telemedicine Journal. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Eagleson, R. (1999). Developing a Video-Mediated Communication System for Hospitalized Children.

1998 International Journal of Industrial Ergonomics. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Eagleson, R. (1998). Control of a Remote Communication System by Children.

1998 International Conference of the Learning Sciences. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Eagleson, R. (1998). Analysis of a Video-Mediated Communication System for Children.

1998 Canadian Society for Mechanical Engineers. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Eagleson, R. (1998). Creating an Interactive Supply Student.

1997 Human Factors and Ergonomics. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Eagleson, R. (1997). Using PEBBLES to Facilitate Remote Learning. Albuquerque, pp. 320.

1997 Seventh International Conference on Human-Computer Interaction. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Eagleson, R. (1997). PEBBLES: Providing Education by Bringing Learning Environments to Students. San Francisco, pp. 115-118.

1996 CYBERG Conference. Williams, L., Fels, D.I., Smith, G., Treviranus, J., Spargo, G., Eagleson, R. (1996). Control of a Remote Communication System by Children.

Teleconferencing robot with swiveling video monitor

DATE – 1998
DISCIPLINE – Science
MEDIUM – Patent
STATUS – Assignee: Telbotics Inc.
WEBLINKS
Patent number: 6914622

Abstract: This invention relates to an apparatus for the projection of a remote conferee’s presence into a group meeting environment by using a combination of videoconferencing/teleconferencing and robotics technology. The remote conferee’s face is shown substantially life-size on a video monitor. The remote conferee’s eyes appear to be looking directly at the viewer. The video monitor can turn left or right to point at the person speaking, automatically or by manual control, to give the impression that the remote conferee is turning his head to look at the person speaking. The video screen can be raised and lowered to give the impression that the remote conferee is standing up and sitting down. An attention-getting mechanism prompts the attention of the other conferees when the remote conferee wants to interrupt or enter a conversation.
Type: Grant
Filed: May 6, 1998
Date of Patent: July 5, 2005
Assignee: Telbotics Inc.
Inventors: Graham Thomas Smith, Deborah Ingrid Fels, Jutta Treviranus

Panoramic Interactive System (4 patents)

Panoramic interactive system including coordinated movement of a film record within a viewing head

DATE – 1991
DISCIPLINE – Science
MEDIUM – Patent
STATUS – Assignee: Horizonscan Inc.

Patent number: 5253107
Abstract: An image viewing system is disclosed for viewing images recorded on a film record in a particular known angular relationship. The viewing system comprises a rotatable viewing head for selective viewing of portions of the film record as a function of the angular position of the viewing head. As the viewing head is rotated, the film record is adjusted to reflect a similar angular movement in the scenes of the film record. This arrangement corresponds to the dynamics involved if the user was to actually view the scenes of the film record.
Type: Grant
Filed: October 2, 1991
Date of Patent: October 12, 1993
Inventor: Graham T. Smith

Panoramic Interactive System

DATE – 1991
DISCIPLINE – Science
MEDIUM – Patent
STATUS – Assignee: Horizonscan Inc.

Patent number: 5153716
Abstract: This invention relates to a method and apparatus for viewing of a panorama or large portion thereof by selectively displaying a portion thereof on a video display device or other means and in a manner that forces the user to change his own orientation to vary the portion of the panorama viewed. Such a method and apparatus coordinates the user’s normal feedback responses associated with changing orientation to changes in the portion of the panorama viewed. In a preferred embodiment, a simple apparatus for allowing a user or users to view the recorded panorama is disclosed.
Type: Grant
Filed: July 29, 1991
Date of Patent: October 6, 1992
Assignee: Horizonscan Inc.
Inventor: Graham T. Smith

Panoramic interactive system

DATE – 1990
DISCIPLINE – Science
MEDIUM – Patent number: 5040055
STATUS – Assignee: Horizonscan Inc.

Abstract: This invention relates to a method and apparatus for viewing of a panorama or large portion thereof by selectively displaying a portion thereof on a video display device or other means and in a manner that forces the user to change his own orientation to vary the portion of the panorama viewed. Such a method and apparatus coordinates the user’s normal feedback responses associated with changing orientation to changes in the portion of the panorama viewed. In a preferred embodiment, a simple apparatus for allowing a user or users to view the recorded panorama is disclosed.
Type: Grant
Filed: October 22, 1990
Date of Patent: August 13, 1991
Assignee: Horizonscan Inc.
Inventor: Graham T. Smith

Panoramic interactive system

DATE – 1989
DISCIPLINE – Science
MEDIUM – Patent number: 4985762
STATUS – Assignee: Horizonscan Inc.

Abstract: This invention relates to a method and apparatus for recording of a panorama or large portion thereof in a manner for display or selective display of a portion thereof on a video display device. The method initially records the panorama in a manner not suitable for video display and thereafter projects the recorded image and records the projection of the panorama in a manner suitable for selective reproduction on a video display device. The method and apparatus are particularly suitable for recording of real time panoramas where the initial recording is time dependent and occurs quickly with sufficient accuracy for effective recording of the panorama, allowing the projecting step and second recording step to be independent of the initial demanding time restraint. Staging the recording of the panorama simplifies the recording and allow specialization of the steps to improve the quality of the final reproduction.
Type: Grant 
Filed: December 11, 1989
Date of Patent: January 15, 1991
Assignee: Horizonscan Inc.
Inventor: Graham T. Smith