Recognising Individuals Using Their Brain Patterns 



K.V.R. Ravi, *Ramaswamy Palaniappan
Faculty of Information Science and Technology, Multimedia University, Melaka, Malaysia
*Dept. of Computer Science, University of Essex, Colchester, United Kingdom
[email protected]; [email protected]

Abstract
In this paper, a novel method to recognise persons using their brain patterns is proposed. These brain patterns are obtained while the individuals perceive a picture. High-frequency brain energy is used as the feature set, which is classified by an Elman backpropagation neural network. Experimental results using 1600 brain signals from 40 individuals give classification accuracy of up to 96.63%. This pilot investigation shows that the proposed method of recognising persons from their brain signals is worth further study.

1. Introduction
Fingerprint is the most common biometric for recognising (identifying) or authenticating an individual [1,2]. However, the individuality of fingerprints has been challenged [2], so it is important to find alternative biometric methods to replace or augment fingerprint technology. In this regard, other biometrics like palmprint, hand geometry, iris, face, and the electrocardiogram [3] have been proposed. Using the electroencephalogram (EEG) as a biometric is relatively new compared to these other biometrics. Poulos et al. [4] proposed a method using autoregressive (AR) modelling of EEG signals and a Learning Vector Quantisation neural network to classify an individual as distinct from other individuals with 72-80% success, but the method was not tried on recognising each individual in a group. Paranjape et al. [5] used AR modelling of EEG with discriminant analysis to identify individuals, with classification accuracy ranging from 49 to 85%. Both methods used EEG signals recorded while the individuals were resting with eyes closed [4] or with eyes closed or open [5].

In this paper, a novel method of identifying individuals using their brain signals is proposed. These brain signals are evoked by a visual stimulus (i.e. seeing a picture) and are commonly known as Visual Evoked Potentials (VEP). High-frequency energies from 61 electrodes (channels) in the gamma band range of 30-50 Hz are computed from the recorded VEP signals and used as biometric features. The gamma band is specifically chosen over other frequency bands because earlier studies [6,7] have successfully used gamma band spectral features to classify alcoholics and non-alcoholics, and Basar et al. [8] have discussed the relationship of the gamma band with focused arousal. Because the method uses features computed from 61 VEP channels, it is unlikely that different individuals will have similar activity in all parts of the brain, which makes these features suitable for biometric applications. The computed features are used to train an Elman backpropagation (EBP) neural network to classify (i.e. recognise) the different individuals.

2. Experimental Methodology
The proposed method can be divided into three stages. The first stage involves recording the VEP signals from the individuals. In the second stage, these signals are processed to remove trials contaminated by eye blinks, set the mean to zero, and extract features. The third stage involves the EBP classification experiment.

2.1. VEP data
Forty individuals participated in the experimental study. The subjects are seated in a reclining chair located in a sound-attenuated, RF-shielded room. Measurements are taken from 61 channels placed on the subject's scalp and sampled at 256 Hz. The electrode positions (shown in Figure 1) are located at standard sites using an extension of the Standard Electrode Position Nomenclature of the American Electroencephalographic Association. The signals are hardware band-pass filtered between 0.02 and 50 Hz.

Figure 1. Locations of electrodes (61 active channels inside hexagon)

Figure 3: Example of visual stimulus presentation (stimulus duration: 300 ms; inter-trial duration: 5100 ms)

The VEP signals are recorded while the subjects are exposed to a stimulus consisting of pictures of objects chosen from the Snodgrass and Vanderwart picture set [9]. These pictures are common black-and-white line drawings, such as an airplane, a banana, and a ball, chosen according to a set of rules that provide consistency of pictorial representation. The pictures have been standardised on variables of central relevance to memory and cognitive processing. They represent different concrete objects that are easily named, i.e. they have definite verbal labels. Figure 2 shows some of these pictures.

Figure 2: Some pictures from the Snodgrass and Vanderwart set

The subjects are asked to remember or recognise the stimulus. Each picture is shown for 300 ms, with an inter-trial interval of 5.1 s. All the stimuli are shown on a computer display unit located 1 meter from the subject's eyes. One-second measurements after each stimulus onset are stored. Figure 3 shows an illustrative example of the stimulus presentation. This data set is a subset of a larger experiment designed to study short-term memory differences between alcoholics and non-alcoholics [10].
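As an illustration of how the stored one-second post-stimulus measurements relate to the recording parameters above (61 channels sampled at 256 Hz), the following sketch cuts such epochs from a continuous recording. The paper itself does not describe this step in code; the function and variable names, and the continuous-recording layout, are assumptions, with NumPy standing in for the MATLAB environment used in the original work.

```python
import numpy as np

FS = 256             # sampling rate (Hz), as stated in Section 2.1
EPOCH_SAMPLES = FS   # one-second measurement stored after each stimulus onset

def extract_epochs(recording, stimulus_onsets):
    """Cut one-second post-stimulus epochs from a continuous recording.

    recording       : array of shape (61, n_samples), one row per channel
    stimulus_onsets : sample indices at which each picture was presented
    returns         : array of shape (n_trials, 61, 256)
    """
    epochs = [recording[:, t:t + EPOCH_SAMPLES] for t in stimulus_onsets]
    return np.stack(epochs)
```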

2.2. VEP processing and feature extraction

2.2.1. Eye blink removal. VEP signals contaminated by eye blink artifacts are removed using a computer program written to detect VEP signals with magnitudes above 100 µV. The detected trials are discarded from the experimental study and additional trials are conducted as replacements. The threshold value of 100 µV is used because blinking produces a 100-200 µV potential lasting about 250 ms [11,12]. A total of 40 artifact-free trials are stored for each subject, giving 1600 single-trial VEP signals for analysis. Next, the mean is removed from the data to set the pre-stimulus baseline to zero.
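A minimal sketch of this screening step is given below, assuming each trial is a (61, 256) array in microvolts. The 100 µV threshold and the zero-mean baseline correction follow the text; the per-channel mean removal and all names are illustrative assumptions.

```python
import numpy as np

BLINK_THRESHOLD_UV = 100.0  # blinks produce 100-200 uV potentials [11,12]

def is_artifact_free(trial):
    """Reject a trial if any sample on any channel exceeds the blink threshold."""
    return np.max(np.abs(trial)) <= BLINK_THRESHOLD_UV

def remove_mean(trial):
    """Set the pre-stimulus baseline to zero.

    The paper only states that the mean is removed; removing it per channel
    is an assumption made here for illustration.
    """
    return trial - trial.mean(axis=1, keepdims=True)
```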

2.2.2. Gamma band spectral power and normalisation. A 4th-order forward and 4th-order backward elliptic digital IIR filter is used to extract the VEP in the 3-dB passband of 30 to 50 Hz, i.e. the gamma band range. The combined forward and backward operation gives a zero-phase response, which removes the non-linear phase distortion caused by the filtering; MATLAB's (MathWorks Inc.) filtfilt function is used for this purpose. Order 4 is chosen since it gives a minimum stopband attenuation of 30 dB at 25 and 55 Hz. Parseval's theorem can now be applied to obtain the equivalent spectral energy of the filtered signal $\tilde{x}$ using

$$\text{Spectral power} = \frac{1}{N}\sum_{n=1}^{N}\tilde{x}(n)^{2}, \qquad (1)$$

where N is the total number of samples in the filtered signal. The energy values computed from each of the 61 channels are concatenated into one feature vector representing the particular VEP pattern, and each value is normalised by the energy values from all 61 channels. A sketch of this feature computation is given below.
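The sketch below outlines the feature computation using SciPy's elliptic filter design and zero-phase filtfilt as stand-ins for the MATLAB functions named above. The filter specification (order 4, 3 dB passband ripple, 30 dB stopband attenuation, 30-50 Hz passband at 256 Hz) follows the text; normalising by the sum of the channel energies, and all names, are illustrative assumptions.

```python
import numpy as np
from scipy.signal import ellip, filtfilt

FS = 256  # sampling rate (Hz)

# 4th-order elliptic band-pass: 3 dB passband ripple, 30 dB stopband
# attenuation, 30-50 Hz passband (the paper uses MATLAB's equivalents).
b, a = ellip(4, 3, 30, [30, 50], btype='bandpass', fs=FS)

def gamma_band_features(trial):
    """Compute the normalised gamma-band energy feature vector for one trial.

    trial : array of shape (61, 256), one zero-mean VEP epoch
    returns 61 values, each channel's mean squared amplitude (Eq. (1))
    divided by the total over all channels.
    """
    filtered = filtfilt(b, a, trial, axis=1)   # zero-phase 30-50 Hz filtering
    energy = np.mean(filtered ** 2, axis=1)    # Parseval energy per channel
    return energy / energy.sum()               # normalise across the 61 channels
```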

2.2.3. Elman backpropagation network. These feature vectors are classified by an EBP network [13] into the different categories that represent the individuals. The EBP network is commonly a three-layer network with feedback from the hidden layer output to the input layer. It has tansig (hyperbolic tangent) neurons in its hidden layer and purelin (linear) neurons in its output layer. This combination allows the network to approximate any function (with a finite number of discontinuities) to arbitrary accuracy, provided the hidden layer has enough neurons. More hidden neurons are needed here than in a standard backpropagation network because of the recurrent feedback, but the EBP network generally gives better generalisation. However, the EBP network gives slightly varying performance across repeated runs due to its recurrent behaviour. MATLAB's newelm function is used to simulate the EBP network here.
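The paper relies on MATLAB's newelm; purely as a rough illustration of the architecture it describes (tansig hidden layer with recurrent feedback, purelin output layer, 61 inputs and 40 output classes), the following NumPy sketch shows a single forward pass. The weight initialisation, the way the context state is handled between patterns, and the omission of the training loop are all assumptions, not the authors' implementation.

```python
import numpy as np

class ElmanNetwork:
    """Minimal Elman network: tanh (tansig) hidden layer with context
    feedback, linear (purelin) output layer."""

    def __init__(self, n_in=61, n_hidden=100, n_out=40, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # recurrent feedback
        self.W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_o = np.zeros(n_out)
        self.context = np.zeros(n_hidden)  # previous hidden-layer output

    def forward(self, x):
        """x: 61-dimensional normalised gamma-band feature vector."""
        h = np.tanh(self.W_in @ x + self.W_ctx @ self.context + self.b_h)
        self.context = h                    # feed hidden output back as context
        return self.W_out @ h + self.b_o    # linear scores, one per individual
```

Training such a network to the mean-square error criterion used in the experiments would require backpropagation through this structure, which newelm handles in the original work.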

Figure 4. Flowchart of the experimental study: VEP signal → eye blink removal and setting mean to zero → elliptic filtering to extract the 30-50 Hz VEP → Parseval's theorem to compute energy → VEP features → EBP network classification

In all the experiments, half of the available VEP patterns (i.e. 20 from each subject) are used for training and the other half for testing, giving 800 VEP patterns for training and 800 for testing. The assignment of VEP signals to the training and testing sets is done randomly and is fixed for all the experiments. The number of hidden units of the EBP network is varied from 100 to 200 in steps of 20, and training is conducted until the mean-square error drops below 0.0001.
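A sketch of the fixed random split described above is given below. The 20/20 split per subject follows the text; the data layout, seed, and function name are assumptions.

```python
import numpy as np

def split_patterns(features, seed=42):
    """Split 40 trials per subject into 20 for training and 20 for testing.

    features : array of shape (40 subjects, 40 trials, 61 features)
    returns (train_x, train_y, test_x, test_y); the split is random but
    fixed by the seed, so the same partition is reused in every experiment.
    """
    rng = np.random.default_rng(seed)
    train_x, train_y, test_x, test_y = [], [], [], []
    for subject in range(features.shape[0]):
        order = rng.permutation(features.shape[1])
        train_x.extend(features[subject, order[:20]])
        test_x.extend(features[subject, order[20:]])
        train_y.extend([subject] * 20)
        test_y.extend([subject] * 20)
    return (np.array(train_x), np.array(train_y),
            np.array(test_x), np.array(test_y))
```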

3. Results and Discussion
Table 1 shows the results of the EBP classification for hidden unit counts varied from 100 to 200 in steps of 20, together with the averaged values; the table also gives the number of training epochs and the training and testing times for 800 patterns. From Table 1, it can be seen that the number of hidden units does not influence the classification performance significantly. The best performance, 96.63%, was obtained with 100 hidden units. Another interesting observation is that the number of training epochs needed was approximately the same for any number of hidden units. The entire process of feature extraction and classification takes a fraction of a second, so a real-time implementation is feasible.

Table 1. EBP classification results

Hidden units   Training epochs   Training time (s)   Testing time (s)   Classification (%)
100            118               23.29               0.118              96.63
120            117               26.78               0.125              95.00
140            120               31.47               0.126              95.50
160            119               35.93               0.141              95.50
180            116               40.54               0.157              94.25
200            119               46.96               0.188              95.63
Average        118.17            34.17               0.143              95.42

4. Conclusion
In this paper, a novel method using EBP classification of VEP features has been proposed as a biometric tool to recognise individuals. The VEP features consist of gamma band energy values computed from 61 channels recorded while the individuals view a picture. The positive results obtained in this paper show promise for the method to be studied further as a biometric tool to recognise or identify individuals, either as a uni-modal (stand-alone) system or as part of a multi-modal identification system. The proposed method is advantageous because it is difficult to reproduce another individual's exact VEP output (i.e. difficult to forge), but the stability of VEP signals over longer periods of time requires further investigation.

5. References
[1] Palaniappan, R., Raveendran, P., and Omatu, S., "VEP Optimal Channel Selection Using Genetic Algorithm for Neural Network Classification of Alcoholics," IEEE Transactions on Neural Networks, vol. 13, no. 2, pp. 486-491, March 2002.
[2] Pankanti, S., Bolle, R.M., and Jain, A., "Biometrics: The Future of Identification," Special Issue of IEEE Computer on Biometrics, pp. 46-49, Feb. 2000.
[3] Biel, L., Pettersson, O., Philipson, L., and Wide, P., "ECG Analysis: A New Approach in Human Identification," IEEE Transactions on Instrumentation and Measurement, vol. 50, no. 3, pp. 808-812, June 2001.
[4] Poulos, M., Rangoussi, M., Chrissikopoulos, V., and Evangelou, A., "Person Identification Based on Parametric Processing of the EEG," Proceedings of the 6th IEEE International Conference on Electronics, Circuits, and Systems, vol. 1, pp. 283-286, 1999.


[5] Paranjape, R.B., Mahovsky, J., Benedicenti, L., and Koles, Z., "The Electroencephalogram as a Biometric," Proceedings of the Canadian Conference on Electrical and Computer Engineering, vol. 2, pp. 1363-1366, 2001.
[6] Misulis, K.E., Spehlmann's Evoked Potential Primer: Visual, Auditory and Somatosensory Evoked Potentials in Clinical Diagnosis, Butterworth-Heinemann, 1994.
[7] Palaniappan, R., Anandan, S., and Raveendran, P., "Two Level PCA to Reduce Noise and EEG From Evoked Potential Signals," Proceedings of the 7th International Conference on Control, Automation, Robotics and Vision, Singapore, pp. 1688-1693, December 2-5, 2002.
[8] Basar, E., Eroglu, C.B., Demiralp, T., and Schurman, M., "Time and Frequency Analysis of the Brain's Distributed Gamma-Band System," IEEE Engineering in Medicine and Biology Magazine, pp. 400-410, July/August 1995.
[9] Snodgrass, J.G., and Vanderwart, M., "A Standardized Set of 260 Pictures: Norms for Name Agreement, Image Agreement, Familiarity, and Visual Complexity," Journal of Experimental Psychology: Human Learning and Memory, vol. 6, no. 2, pp. 174-215, 1980.
[10] Zhang, X.L., Begleiter, H., Porjesz, B., and Litke, A., "Electrophysiological Evidence of Memory Impairment in Alcoholic Patients," Biological Psychiatry, vol. 42, pp. 1157-1171, 1997.
[11] Kasuba, T., "Simplified Fuzzy ARTMAP," AI Expert, vol. 8, no. 11, pp. 19-25, 1993.
[12] Kriss, A., "Recording Technique," in Evoked Potentials in Clinical Testing, edited by Halliday, A.M., Churchill Livingstone, 1993.
[13] Elman, J.L., "Finding Structure in Time," Cognitive Science, vol. 14, pp. 179-211, 1990.
