Objective evaluation of spatial information acquisition using a visuo-tactile sensory substitution device
Luca Brayda1, Claudio Campus1, Ryad Chellali1, Guido Rodriguez2

1 Human Robots Mediated Interactions lab., Telerobotics and Applications dept., Italian Institute of Technology, Via Morego 30, Genova, Italy
{luca.brayda, claudio.campus, ryad.chellali}@iit.it
2 Clinical Neurophysiology, Department of Neurosciences, Ophthalmology and Genetics, University of Genoa, Italy

Abstract. This paper proposes a method to objectively assess the degree of sensory transfer in a visuo-tactile sensory substitution task. Using a special-purpose, single-taxel-based device, users explore by touch virtual objects of increasing geometrical complexity while EEG signals are recorded. Successful reconstruction, as well as cognitive load and brain programming, are found to be associated with relative activation/deactivation of specific EEG bands and with the users' exploration strategies, which allows us to objectively assess the perceived complexity of the reconstruction task. Our metrics also provide an evaluation tool for the design of devices useful for the navigation of visually impaired people.

Keywords: sensory substitution, neurophysiology, EEG, telerobotics





1 Introduction

The central problem of telerobotics is to provide effective tools allowing humans to perform physical modifications of distant environments. In such a context, telerobots have to capture users' motor commands and translate them into remote robot commands. Conversely, telerobots acquire data through embedded sensors and build a description of the remote environment to be displayed to operators. This closed-loop control scheme is mainly supported by bilateral interfaces, which convey sensory-motor information between the two partners. Unfortunately, these interfaces are known to distort and reduce information flows: the operator has a partial and incorrect knowledge of the remote world's status. Likewise, the operator's actions and motor intents are not fully taken into account. As a dual system, the telerobot is in fact inherently asymmetric, because humans and robots belong to two different sensory-motor spaces. This situation can easily be extended to more generic contexts where humans interact with machines or through machines.

Fig. 1. The taxel-based device and the way tactile feedback is provided during exploration of a virtual object.

Our work deals with an extreme case of sensory asymmetry, namely sensory substitution. Our aim is to understand how users integrate information normally dedicated to sensory modality X when it is displayed to sensory modality Y. Such mismatches are common whenever one needs to convey radiation, temperature, pressure, levels of gas concentration, etc.; usually, the visual channel is the best candidate for this task. However, we need to estimate the cognitive cost of such a transfer. One way to do this is to estimate fatigue indirectly through performance. Every sensory transfer has a cognitive cost, often leading to mistakes, misunderstood mental representations and fatigue. Sensory substitution is often the only way to display feedback and remote-world descriptions in mediated interactions. Nowadays, the evaluation of the effectiveness of such sensory transfer rarely goes beyond qualitative evaluation; psychophysics-based techniques offer a more quantitative way to measure users' performance on chosen tasks, but still rely on subjective reports, and few techniques that go beyond the user's subjective interpretation exist today. To overcome this methodological barrier, research is increasingly active, often adopting a multidisciplinary approach, in addressing the strong need for objective measurements. A quantitative assessment is at once a way to prove that a sensory substitution process has reached its goal and a validation tool for off-line analysis and improvement of new technological devices [1][2]. This paper proposes methods to objectively assess the degree of sensory transfer in a visuo-tactile sensory substitution task, using a special-purpose device and measuring signals which are linked to the degree of integration of spatial information and, indirectly, to the cognitive load. For this purpose, we consider parameters extracted from the neurophysiological signal (EEG) as suitable candidates for such an objective evaluation.
In fact, the high time resolution of neurophysiological measures makes possible an on-line monitoring of the subject's general condition and particularly of the sensory and cognitive state induced by the ongoing task. Our objective is threefold: first, to stimulate tactile sensory feedback, possibly evolving into a learning process, related to the exploration of a 3D virtual environment through tactile navigation; second, to objectively assess the transfer of information from the sense of touch to a spatial representation of that environment; third, to find a correlation between our objective measures and the degree of difficulty users experience with different kinds of virtual environments. We will show evidence that brain signals related to spatial exploration are measured when objects are being virtually touched. We will also show that these signals modulate differently depending on the kind and complexity of the virtual objects. The remainder of the paper is organized as follows: Section 2 summarizes the state of the art on the use of EEG-based measures; Section 3 describes our methodology and the experimental setup; results are detailed and commented on in Section 4. Finally, Section 5 contains the discussion and Section 6 concludes the paper.

2. State of the art

The earliest sensory substitution devices converted visual stimuli to tactile representations for the blind and visually impaired [3][4]. Tactile-vision substitution systems (TVSS) were the first to be developed [5], translating visual information into spatial and temporal patterns of cutaneous stimulation for blind people [6]. In a typical system, a camera receives visual information which is converted to a tactile representation on a two-dimensional pin array. One of the earliest works in this area was the Optacon, which converted printed letters to a spatially distributed vibrotactile representation on the fingertip using a miniature handheld camera [7]. Although reading speeds were significantly slower than with Braille, the Optacon allowed blind people to access any text or graphics without having to wait for it to be converted into Braille. Early pioneering work in TVSS was also performed by Paul Bach-y-Rita [8] and colleagues in the late 1960s: tactile displays could potentially provide a discrete and portable means of accessing graphical information in an intuitive, non-visual manner. Many advances have also been made through the adoption of tactile displays in telerobotics [9], [10], [11] and virtual reality, to represent physical contact with a remote or simulated environment and guarantee a stronger telepresence. However, many of these systems have been limited to engineering prototypes; to the best of our knowledge, none of them conquered a significant market share. In our view, this happened because too much attention was given to precisely acquiring sensory modality X, while less effort was dedicated to verifying how well the substitution through modality Y actually succeeded.
Previous studies [1][12][13][14][15][16] on substituting vision with touch showed activity in the visual cortex during tactile perception both in sighted participants and in those who had experienced visual deprivation of varying duration. Dynamic changes in neuroimaging and spectral EEG were reported while processing visually/haptically presented spatial information [17][18][19][20][21][22][23]. One application that allows investigating the quantitative use of neurophysiological parameters for sensory substitution is navigation [24] in virtual environments. By construction, such environments can provide controlled stimuli and are thus suitable to test the link between a given geometrical representation of space, even a very simple one, and the way it is learnt through tactile feedback alone. We have already shown that this is possible in [2], where we assessed the degree of complexity of navigation-related tasks such as discrimination, recognition and reconstruction of geometrically simple virtual objects. In this study we attempt to find a correlation between certain neurophysiological parameters and specific events in the learning process of a simple tactile-based navigation task. We seek to derive, from such correlation, inferences about how effectively the device and its interface have carried out sensory substitution.

3. Methodology and Experimental Setup

Our experimental setup is aimed at measuring brain activity during the exploration of a 3D virtual environment with a single-taxel-based, mouse-shaped device. This device (Fig. 1) provides the third dimension (the height) of virtual objects, while the first two dimensions (their length and width) are given by the device's position on a flat surface [2]. In practice, exploration through this device approximates what happens when a single finger explores the physical profile of an object: the higher the finger lifts, the higher the touched object is. The user is thus able to integrate tactile feedback with proprioceptive feedback, forming a cognitive map of the space. The virtual objects the users are asked to interact with are depicted in Fig. 2: each environment is a stair-shaped object resembling a ziggurat, and its complexity is determined by the number of steps to explore and learn.

Fig. 2. The three virtual objects sensed through the device
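As an illustrative aside, the stair-shaped objects of Fig. 2 can be modelled as a simple height map. The sketch below is our own reconstruction, not the device's actual rendering code: all names, the unit workspace and the concentric square steps are illustrative assumptions. It returns the height that the taxel would display for a given planar position of the mouse.

```python
def ziggurat_height(x, y, n_steps, side=1.0, total_height=1.0):
    """Height under position (x, y) for an n-step centered 'ziggurat'.

    The object is a stack of n_steps concentric square slabs on a square
    workspace of the given side; heights are illustrative assumptions.
    """
    half = side / 2.0
    # Chebyshev (max-coordinate) distance from the centre selects the ring.
    d = max(abs(x - half), abs(y - half))
    if d >= half:
        return 0.0  # outside the object: flat ground
    # Each ring of width half/n_steps lowers the height by one step.
    step = n_steps - int(d / (half / n_steps))
    step = min(step, n_steps)
    return step * (total_height / n_steps)
```

For example, a 2-step ziggurat reports full height near the centre and half height on the outer ring, which is exactly the profile the finger "feels" while crossing the virtual object.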

Five subjects (3M, 2F, aged 29±5 years) participated in the experiment. The following protocol was applied:

• The subjects underwent a complete neurological/neuropsychological examination.
• The subjects were blindfolded and their EEG signals monitored in the following five phases:
  1. Pure resting state, followed by a visual stimulation (both with open and closed eyes).
  2. Pure motor task: subjects freely moved the device on a tablet, without getting tactile feedback.
  3. Exploration: subjects explored three increasingly complex virtual environments. The environments were automatically changed after ten trials (exploration/rest paced by two sounds, start/stop). A pure resting state was induced between consecutive exploration trials.
  4. Pure motor task.
  5. Pure resting state.
• At the end of the whole experiment, subjects were unblindfolded and asked to depict each explored environment and to assign it a difficulty coefficient.

Regarding signal acquisition and processing, monopolar EEG was recorded with Ag/AgCl cup electrodes at 14 active scalp sites according to the international 10/20 system. Electrooculogram (EOG), electrocardiogram (EKG) and electromyogram (EMG) were also acquired. Data were sampled at 1024 Hz. The experimental video and the coordinates perceived by the subject through the device (x, y and height of the virtual objects) were also recorded to study possible behavioral-neurophysiological relations. Ocular, cardiac and muscular artifacts were removed using independent component analysis (ICA). The EEG was band-pass filtered (0.5-50 Hz) and post-processed using averaging and spectral/wavelet analysis to analyze the spatial and temporal evolution of the usual EEG bands and the other measures around the start/stop sounds. We also applied some linear synchronization measures and ICA. Matlab and EEGLab [26][27] were used for processing.
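The preprocessing chain described above (band-pass filtering followed by ICA decomposition) can be sketched as follows. This is a minimal Python analogue of the Matlab/EEGLab pipeline, not the authors' actual code: the synthetic data, the component count and all function names are our own assumptions; only the sampling rate (1024 Hz), channel count (14) and pass band (0.5-50 Hz) come from the text.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.decomposition import FastICA

FS = 1024  # sampling rate in Hz, as reported in the paper

def bandpass(eeg, low=0.5, high=50.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter; channels in rows."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=1)

def decompose(eeg, n_components):
    """Unmix filtered EEG into independent components (sources in rows)."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg.T).T   # (components, samples)
    return sources, ica.mixing_            # mixing_ maps components back to channels

# Synthetic 14-channel, 4-second recording: noise plus a shared 10 Hz rhythm.
rng = np.random.default_rng(0)
t = np.arange(4 * FS) / FS
raw = rng.standard_normal((14, t.size)) + np.sin(2 * np.pi * 10 * t)
filtered = bandpass(raw)
sources, mixing = decompose(filtered, n_components=5)
```

The second-order-sections form (`sosfiltfilt`) is used because a 0.5 Hz corner at a 1024 Hz sampling rate is numerically fragile in transfer-function form.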

4. Results

The resting brain signals, in accordance with the literature, showed a wide presence of the alpha (8-12 Hz) band, with a particular concentration in the occipital region. For both the pure motor and exploration phases, we studied the evolution of the power in some known EEG bands around the event given by the sound starting the motion of the device. In particular, for each subject and each phase of the experiment (i.e. phases 2-4), we obtained the independent components of the raw EEG signal using Independent Component Analysis (ICA), which is widely used to reconstruct the localization (i.e. the spatial distribution) and the evolution (usually the time-frequency distribution) of the processes and sources generating the EEG signal. The components of all subjects were then clustered using a K-means algorithm to obtain a global result (mean topographical distribution and mean time-frequency distribution of the clustered components from different subjects) for the whole studied group in each single phase of the experiment. For the two motor sessions, Fig. 3 shows in the left panel the topographical distribution of each single component contributing to the cluster (smaller heads) and the mean spatial distribution of the cluster (bigger head), as well as their time-frequency distributions (right panel). The deactivation, or Event-Related Desynchronization (ERD), of the alpha (8-12 Hz) and beta (16-32 Hz) bands after the start sound (in blue) is an expected result: it is linked to the onset of an externally paced task, namely the movement of the user's arm following the start sound. This complies with previous literature.
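The grouping step described above, pooling the independent components of all subjects and clustering them by scalp topography with K-means, can be sketched as follows. This is an illustrative reconstruction under our own assumptions (toy topographies, two clusters, sklearn instead of EEGLab's clustering tools), not the study's actual procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_components(topographies, n_clusters=2):
    """Cluster pooled IC scalp topographies across subjects.

    topographies: (n_components_total, n_channels) array, one row per
    independent component; returns cluster labels and centroids.
    """
    # Normalise each topography so clustering compares shape, not amplitude.
    norms = np.linalg.norm(topographies, axis=1, keepdims=True)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = km.fit_predict(topographies / norms)
    return labels, km.cluster_centers_

# Toy pool: five noisy "frontal" and five noisy "occipital" topographies
# over 14 channels, mimicking components collected from several subjects.
rng = np.random.default_rng(1)
frontal = np.r_[np.ones(7), np.zeros(7)]
occipital = np.r_[np.zeros(7), np.ones(7)]
pool = np.vstack(
    [frontal + 0.1 * rng.standard_normal(14) for _ in range(5)]
    + [occipital + 0.1 * rng.standard_normal(14) for _ in range(5)]
)
labels, centers = cluster_components(pool, n_clusters=2)
```

The cluster centroids then play the role of the "bigger heads" of Fig. 3: a mean topography summarizing similar components from different subjects.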

Fig. 4, instead, shows the same signal analysis applied to the exploration and tactile discovery of the 1-step ziggurat. In the frontal cluster, an evident Event-Related Synchronization (ERS) of the theta (4-8 Hz) band (in red and yellow) can be seen: in the literature, this has been related to spatial processing and to the memory activation used to build a mental image of real or virtual objects. In the second exploration session of phase 3 (2 steps), as depicted in Fig. 5, two clusters were identified. The most relevant is the frontal one, which shows an ERS not only in the theta but also in the sigma (12-16 Hz) band, and an ERD in the alpha and beta bands: this can be related to the augmented effort in discovering the internal details and is a sign of cognitive effort. Finally, the case of the most complex environment (4 steps) is represented in Fig. 6. The fronto-central cluster shows an ERS in the theta and sigma bands, this time also in the gamma (32-50 Hz) band, and an ERD similar to the 1-step and 2-steps cases. The gamma band is related to cognitive effort. Interestingly, the gamma activation was absent in the only subject who was able to perfectly recognize the four-step ziggurat, while in some other subjects the same activation even preceded the sound starting the exploration. We summarize the detected variations in the EEG bands just after the starting sound in Table 1: clearly, as the virtual object becomes more complex, more EEG bands and higher-frequency signals are activated, all reflecting the increased effort needed to reconstruct the virtual object. These neurophysiological variations seem to bear a certain relationship to the type of exploration strategy, as we will discuss in the next section.
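The band-specific power changes discussed above are conventionally quantified as an ERD/ERS percentage: the band power in a test window relative to a pre-event reference window, negative for desynchronization (ERD) and positive for synchronization (ERS). The sketch below illustrates this standard measure on a toy signal; the window lengths and the periodogram-based power estimate are our own illustrative choices, not the paper's exact analysis.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power of `signal` within [low, high] Hz via the periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def erd_ers_percent(test_power, reference_power):
    """Relative band-power change in percent (negative = ERD, positive = ERS)."""
    return 100.0 * (test_power - reference_power) / reference_power

# Toy example: a 10 Hz (alpha) oscillation whose amplitude halves after the
# start sound, so its power drops to a quarter: an ERD of about -75%.
fs = 256
t = np.arange(fs) / fs                    # 1-second windows
pre = np.sin(2 * np.pi * 10 * t)          # reference window (before the sound)
post = 0.5 * np.sin(2 * np.pi * 10 * t)   # test window (after the sound)
change = erd_ers_percent(band_power(post, fs, 8, 12), band_power(pre, fs, 8, 12))
```

The same computation applied per band (theta, alpha, sigma, beta, gamma) and per cluster yields the pattern of activations and deactivations summarized in Table 1.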

Fig. 3. Pure motor session. Topographical (left panel) and time-frequency (right panel) distributions of clustered components. Two clusters are found, both showing a general EEG deactivation, especially in the alpha and beta bands after the start of the exploration (t≥0).

Fig. 4. One-step exploration session. Two clusters are found. One, mainly frontal, shows an ERS in the theta EEG band and an ERD in the alpha and beta bands for t≥0.

Fig. 5. Two-steps exploration session. The frontal cluster shows an ERS not only in the theta but also in the sigma band, and an ERD in the alpha and beta bands, similar to the 1-step case.

Fig. 6. Four-steps exploration session. The fronto-central cluster shows an ERS in the theta and sigma bands, this time also in the gamma band, and an ERD similar to the 1-step and 2-steps cases.

Table 1. Summary of EEG-based measures for increasing complexity of the virtual environment. ERSFC is the event-related synchronization at the fronto-central location. The progressive synchronization of the theta, sigma and gamma bands clearly relates to the progressively increased complexity.

t>0            Motor   1 step   2 steps   4 steps
γ [32-50 Hz]   -       -        -         ERSFC
β [16-32 Hz]   ERD     ERD      ERD       ERD
σ [12-16 Hz]   -       -        ERSFC     ERSFC
α [8-12 Hz]    ERD     ERD      ERD       ERD
θ [4-8 Hz]     -       ERSFC    ERSFC     ERSFC




5 Discussion

The aim of our work is to identify possible brain reactions during the use of a new technological device which gives a tactile representation of a virtual environment. We also acquired a baseline at rest and during a pure motor task in which nothing was requested other than freely moving the device, without spatially related tactile stimulation. This baseline differs significantly from the signals recorded during the exploration of the three virtual objects, clearly showing that looking at EEG time-frequency components is informative for tactile-based spatial reconstruction tasks. The reconstruction of the 3D shape of the first object was successful for all subjects; thus, the slower EEG bands are well associated with spatial processing. The situation changed radically with the two-steps structure. All subjects understood that they had explored a centered, symmetrical object, and were able to delineate its bounds; however, only two subjects correctly reconstructed its internal structure. The observed mistakes were an unconscious smoothing and interpolation of the steps, giving the appearance of a continuous conical pyramid. Subjects also reported a strong increase in difficulty in this case. This agrees with our findings, where faster EEG bands, related to the reconstruction effort, are captured together with the slower ones, related to spatial processing. Finally, the most complex situation presented a dramatically greater difficulty, as reported by four of the subjects; only one was able to correctly identify the whole structure. Four subjects expressed great surprise when, after the experiment, a small plastic model of the four-steps ziggurat was shown to them. Consistently with this complexity and difficulty, an even faster EEG band (gamma) was found together with theta and sigma. Some important processes are related to this band, in particular brain programming and cognitive load. In some trials, in fact, we even found activations in this band before the starting sound, i.e. when the user was not moving any body part: we speculate that this may correspond to the user mentally programming future movements or re-working the acquired information. Fig. 7 shows the trajectories that the mouse virtually traced on the tablet: interestingly, the user with the best performance (all objects reconstructed) shows a more regular exploration strategy than the other users (in the black rectangle). We emphasize that this user did not show a significant activation of the gamma band with any of the three objects, thus providing additional evidence for our hypothesis.


Fig. 7. Trajectories and strategies employed in the exploration of the three environments by the five subjects studied. Only the last subject was able to recognize the most complex environment, showing a different strategy and clear ease in using the device.

In all cases, many of the mistakes that occurred were probably ascribable to a non-uniform (e.g. insufficient in some areas) sampling of the explored space, as can be observed in Fig. 7: subjects generally over-explored one part of the environment, probably making some kind of "inference" about the other parts. Another important source of misunderstanding can be the orientation angle of the mouse (which is integral with the hand and, to some extent, with the arm) with respect to the tablet: such rotation is common among PC users, but it is normally compensated, largely and intuitively, by the visual feedback, which is absent here. In Fig. 7 the first and second users in fact show exploration strategies with straight lines rotated clockwise, which implies "observing" an object rotated counter-clockwise. It can also be seen that users generally develop two strategies: the first aims at "circumnavigating" the objects, identifying the boundaries of the whole structure; the second, mainly radial, relates to the internal navigation of the objects, aimed at reconstructing the details. We emphasize that we cast the problem of visuo-tactile sensory substitution in its worst-case scenario, i.e. with minimal tactile information, almost no training at all, and a reconstruction task: this was done on purpose, to capture the immediate mental reaction and effort of users and to avoid bias due to different learning curves.

6 Conclusions

In this paper we proposed a method to objectively assess the degree of sensory transfer in a visuo-tactile sensory substitution task. We showed that even with a single-taxel-based device, users are able to discover and reconstruct virtual objects of increasing geometrical complexity. We also showed that an objective measure of the information acquisition process through sensory substitution can be found: the spatial representation of the discovered objects, the increasing difficulty of the reconstruction task, as well as brain programming, are related to the relative activation/deactivation of specific EEG bands. Such metrics are a first step towards a more objective assessment of the issues underlying telerobotic tasks when sensory deprivation occurs. Our metrics also provide an evaluation tool for the design of devices useful for the navigation of visually impaired people.

References
[1] A. Bujnowski, M. Drozd, R. Kowalik, J. Wtorek, "A tactile system for informing the blind on direction of a walk," in Conference on Human System Interactions, (2008)
[2] R. Chellali, L. Brayda, E. Fontaine, "How taxel-based displaying devices can help blind people to navigate safely," ICARA 2009, Wellington, New Zealand, (2009)
[3] Y. Kawai and F. Tomita, "Interactive tactile display system: a support system for the visually disabled to recognize 3D objects," in Proceedings of the second annual ACM conference on Assistive technologies, Vancouver, British Columbia, Canada, (1996)
[4] S. A. Wall and S. Brewster, "Sensory substitution using tactile pin arrays: Human factors, technology and applications," Signal Processing, vol. 86, no. 12, pp. 3674-3695, (2006)
[5] T. Maucher, J. Schemmel, and K. Meier, "The Heidelberg tactile vision substitution system," in Int. Conf. on Computers Helping People with Special Needs, (2000)
[6] T. Way and K. Barner, "Automatic visual to tactile translation. II. Evaluation of the TACTile image creation system," IEEE Transactions on Rehabilitation Engineering, vol. 5, no. 1, pp. 95-105, (1997)
[7] L. H. Goldish, H. E. Taylor, "The Optacon: A Valuable Device for Blind Persons," New Outlook for the Blind, 68, 2, pp. 49-56, (1974)
[8] K. Kaczmarek, J. Webster, P. Bach-y-Rita, and W. Tompkins, "Electrotactile and vibrotactile displays for sensory substitution systems," IEEE Transactions on Biomedical Engineering, vol. 38, no. 1, pp. 1-16, Jan. (1991)
[9] G. Moy, C. Wagner, R. S. Fearing, "A compliant tactile display for teletaction," in Proc. of IEEE International Conference on Robotics and Automation, (2000)
[10] A. Kron and G. Schmidt, "Multi-fingered tactile feedback from virtual and remote environments," in Proc. 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (2003)
[11] A. Yamamoto, S. Nagasawa, H. Yamamoto, and T. Higuchi, "Electrostatic tactile display with thin film slider and its application to tactile telepresentation systems," IEEE Transactions on Visualization and Computer Graphics, vol. 12, no. 2, pp. 168-177, (2006)
[12] L. B. Merabet, J. D. Swisher, S. A. McMains, M. A. Halko, A. Amedi, A. Pascual-Leone et al., "Combined activation and deactivation of visual cortex during tactile sensory processing," Journal of Neurophysiology 97, no. 2, (2007)
[13] D. Senkowski, T. R. Schneider, J. J. Foxe, and A. K. Engel, "Crossmodal binding through neural coherence: implications for multisensory processing," Trends in Neurosciences 31, no. 8, (2008)
[14] D. Bavelier and H. J. Neville, "Cross-modal plasticity: where and how?," Nature Reviews Neuroscience 3, no. 6, (2002)
[15] L. Merabet, G. Thut, B. Murray, J. Andrews, S. Hsiao, A. Pascual-Leone, "Feeling by Sight or Seeing by Touch?," Neuron 42, no. 1, (2004)
[16] A. Zangaladze, C. M. Epstein, S. T. Grafton and K. Sathian, "Involvement of visual cortex in tactile discrimination of orientation," Nature 401, no. 6753, pp. 587-590, (1999)
[17] Y. Li, K. Umeno, E. Hori, H. Takakura, S. Urakawa, T. Ono et al., "Global synchronization in the theta band during mental imagery of navigation in humans," Neuroscience Research 65, no. 1, pp. 44-52, (2009)
[18] J. B. Caplan, J. R. Madsen, A. Schulze-Bonhage, R. Aschenbrenner-Scheibe, E. L. Newman, and M. J. Kahana, "Human theta oscillations related to sensorimotor integration and spatial learning," The Journal of Neuroscience 23, no. 11, pp. 4726-4736, (2003)
[19] H. J. Spiers, "Keeping the goal in mind: Prefrontal contributions to spatial navigation," Neuropsychologia 46, no. 7, pp. 2106-2108, (2008)
[20] M. Grunwald, T. Weiss, W. Krause, L. Beyer, R. Rost, I. Gutberlet, "Power of theta waves in the EEG of human subjects increases during recall of haptic information," Neuroscience Letters 260, no. 3, pp. 189-192, (1999)
[21] A. Gallace and C. Spence, "The cognitive and neural correlates of 'tactile consciousness': a multisensory perspective," Consciousness and Cognition 17, no. 1, pp. 370-407, (2008)
[22] D. Osipova, A. Takashima, R. Oostenveld, G. Fernández, E. Maris, and O. Jensen, "Theta and gamma oscillations predict encoding and retrieval of declarative memory," The Journal of Neuroscience, no. 28, pp. 7523-7531, (2006)
[23] M. Grunwald, T. Weiss, W. Krause, L. Beyer, R. Rost, I. Gutberlet et al., "Theta power in the EEG of humans during ongoing processing in a haptic object recognition task," Brain Research. Cognitive Brain Research 11, no. 1, pp. 33-37, (2001)
[24] O. Lahav and D. Mioduser, "Haptic-feedback support for cognitive mapping of unknown spaces by people who are blind," Int. J. Hum.-Comput. Stud. 66, no. 1, pp. 23-35, (2008)
[25] K. Sathian, "Visual cortical activity during tactile perception in the sighted and the visually deprived," Developmental Psychobiology 46, no. 3, pp. 279-286, (2005)
[26] A. Delorme and S. Makeig, "EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics," Journal of Neuroscience Methods 134, pp. 9-21, (2004)
[27] A. Delorme, T. Fernsler, H. Serby, and S. Makeig, EEGLAB Tutorial.
