Simultaneous measurement of in-plane and out-of-plane displacement derivatives using dual-wavelength digital holographic interferometry

Gannavarpu Rajshekhar,1 Sai Siva Gorthi,1,2 and Pramod Rastogi1,*

1Applied Computing and Mechanics Laboratory, Ecole Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland
2Rowland Institute at Harvard University, Cambridge, Massachusetts 02142, USA
*Corresponding author: [email protected]

Received 17 May 2011; revised 12 July 2011; accepted 26 July 2011; posted 27 July 2011 (Doc. ID 147684); published 27 September 2011

The paper introduces a method for simultaneously measuring the in-plane and out-of-plane displacement derivatives of a deformed object in digital holographic interferometry. In the proposed method, lasers of different wavelengths simultaneously illuminate the object along different directions, such that a unique wavelength is used for each direction. The holograms formed by the multiple reference-object beam pairs of different wavelengths are recorded by a three-color CCD camera with red, green, and blue channels. Each channel stores the hologram related to the corresponding wavelength and hence to the corresponding illumination direction. The complex reconstructed interference field is obtained for each wavelength by numerical reconstruction and digital processing of the holograms recorded before and after deformation. Subsequently, the phase derivative is estimated for each wavelength using the two-dimensional pseudo Wigner-Ville distribution, and the in-plane and out-of-plane components are obtained from the estimated phase derivatives using the sensitivity vectors of the optical configuration. © 2011 Optical Society of America

OCIS codes: 120.2880, 090.1995.

1. Introduction

Digital holographic interferometry (DHI) is an important optical metrological tool for deformation analysis in the areas of nondestructive testing and experimental mechanics. In DHI, the derivative of the phase of the reconstructed interference field is related to the displacement derivative of a deformed object. For many practical applications, information about both the in-plane and out-of-plane components of the displacement derivatives is required. The popular optical configuration for multidimensional deformation analysis in DHI usually relies on multidirectional illumination of the object [1,2]. For the measurement of multiple components of displacement derivatives, methods relying on sequential illumination of the object along different directions have been proposed [3,4]. The sequential operation makes these methods difficult to apply for simultaneous measurement of the in-plane and out-of-plane components. Similarly, phase-shifting based methods [5,6] have been developed, but they could pose difficulties for simultaneous measurements because they require multiple frames. Recently, methods based on multiwavelength or color DHI [7,8] were presented for multidimensional displacement measurements. However, these methods provide wrapped phase estimates and hence necessitate an unwrapping operation followed by numerical differentiation to obtain the phase derivatives, which adds computational complexity and noise susceptibility.

In this paper, we propose a method that directly provides information about the in-plane and out-of-plane phase derivatives without requiring any unwrapping or differentiation operations. The method relies on multidirectional illumination of the object with lasers of different wavelengths and digital recording of holograms using a three-CCD color camera, i.e., a three-color CCD with red, green, and blue channels. Since each channel records only the intensity associated with the corresponding wavelength, digital processing of the recorded hologram provides a reconstructed interference field for each channel. The phase derivative can subsequently be obtained from the reconstructed interference field using phase derivative estimation methods such as the pseudo Wigner-Ville distribution [9,10], piecewise polynomial phase approximation [11], etc. In this paper, the two-dimensional pseudo Wigner-Ville distribution (2D-PSWVD) [10] is used for phase derivative estimation because of its high robustness against noise. The theory of the proposed method is presented in the next section. Experimental results obtained with the proposed method are shown in Section 3, followed by conclusions.

2. Theory

In the proposed method, a dual-beam illumination of the diffuse object is considered, where red and green beams with wavelengths $\lambda_r$ and $\lambda_g$ are incident on the object as shown in Fig. 1. The three-color CCD camera is located at a distance $d$ from the object. Every channel of the camera records the hologram formed by the superposition of the reference and object beams of the corresponding wavelength. Denoting the reference and object waves in the CCD plane $x'y'$ as $R_r(x', y')$ and $O_r(x', y')$ for the red wavelength and $R_g(x', y')$ and $O_g(x', y')$ for the green wavelength, the intensities recorded at the red and green channels of the CCD are given as

$$I_r(x', y') = |R_r(x', y') + O_r(x', y')|^2 = I_{r0}(x', y') + R_r^*(x', y')\,O_r(x', y') + R_r(x', y')\,O_r^*(x', y'), \quad (1)$$

$$I_g(x', y') = |R_g(x', y') + O_g(x', y')|^2 = I_{g0}(x', y') + R_g^*(x', y')\,O_g(x', y') + R_g(x', y')\,O_g^*(x', y'), \quad (2)$$

where $*$ denotes the complex conjugate, $I_{r0} = |R_r(x', y')|^2 + |O_r(x', y')|^2$, and $I_{g0} = |R_g(x', y')|^2 + |O_g(x', y')|^2$.

Fig. 1. Dual-beam illumination with red and green wavelengths.
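As a purely illustrative sketch of this recording model (not from the paper; the object and reference fields below are hypothetical stand-ins), the two channel intensities of Eqs. (1) and (2) can be simulated as follows; each channel depends only on the reference-object pair at its own wavelength.

```python
import numpy as np

# Hypothetical simulation of Eqs. (1)-(2): each color channel records only the
# interference of the reference and object waves at its own wavelength.
rng = np.random.default_rng(0)
ny, nx = 768, 1024                                      # sensor format used in Section 3
yy, xx = np.mgrid[0:ny, 0:nx]

O_r = np.exp(1j * 2 * np.pi * rng.random((ny, nx)))     # stand-in object wave, red
O_g = np.exp(1j * 2 * np.pi * rng.random((ny, nx)))     # stand-in object wave, green
R_r = np.exp(1j * 2 * np.pi * (0.05 * xx + 0.02 * yy))  # stand-in reference wave, red
R_g = np.exp(1j * 2 * np.pi * (0.04 * xx + 0.03 * yy))  # stand-in reference wave, green

I_r = np.abs(R_r + O_r) ** 2        # Eq. (1): intensity stored in the red channel
I_g = np.abs(R_g + O_g) ** 2        # Eq. (2): intensity stored in the green channel
frame = np.dstack([I_r, I_g, np.zeros((ny, nx))])       # 768 x 1024 x 3 color hologram
```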

The object wave field at the object plane $xy$ is numerically reconstructed using the Fresnel transform as [12]

$$\Gamma_{k0}(x, y) = \frac{j}{\lambda_k d}\exp\!\left[\frac{-j\pi}{\lambda_k d}(x^2 + y^2)\right]\exp\!\left(\frac{-j2\pi d}{\lambda_k}\right)\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} R_k(x', y')\,I_k(x', y')\exp\!\left[\frac{-j\pi}{\lambda_k d}(x'^2 + y'^2)\right]\exp\!\left[\frac{j2\pi}{\lambda_k d}(xx' + yy')\right]dx'\,dy', \quad (3)$$

where $\Gamma_{k0}$ denotes the complex amplitude before object deformation. In the manuscript, the subscript $k$ is used to indicate a particular color, with $k = r$ or $g$ depending on the red or green wavelength. For our analysis, the terms outside the integral can be neglected since they do not affect the overall interference phase, and the above equation can be modified as

$$\Gamma_{k0}(x, y) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} R_k(x', y')\,I_k(x', y')\exp\!\left[\frac{-j\pi}{\lambda_k d}(x'^2 + y'^2)\right]\exp\!\left[\frac{j2\pi}{\lambda_k d}(xx' + yy')\right]dx'\,dy'. \quad (4)$$

For an $N_x \times N_y$ pixel CCD camera with pixel sizes $\Delta x'$ and $\Delta y'$ along the horizontal and vertical directions, the above equation can be discretized using a 2D inverse Fourier transform and expressed as [12]

$$\Gamma_{k0}(m\Delta x_k, n\Delta y_k) = \sum_{p=0}^{N_x-1}\sum_{q=0}^{N_y-1} R_k(p\Delta x', q\Delta y')\,I_k(p\Delta x', q\Delta y')\exp\!\left\{\frac{-j\pi}{\lambda_k d}\left[(p\Delta x')^2 + (q\Delta y')^2\right]\right\}\exp\!\left[j2\pi\left(\frac{pm}{N_x} + \frac{qn}{N_y}\right)\right] \quad \forall\, m \in [0, N_x - 1],\ n \in [0, N_y - 1]. \quad (5)$$
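A minimal numerical sketch of the discrete Fresnel reconstruction of Eq. (5) is given below (illustrative only, not the authors' implementation). It assumes NumPy's inverse-FFT convention, a unit-amplitude reference when none is supplied, and drops the constant factors of Eq. (3), which do not affect the interference phase; the wavelength-dependent sampling steps it returns are the quantities given in Eqs. (6) and (7) below.

```python
import numpy as np

def fresnel_reconstruct(hologram, wavelength, d, dx_p, dy_p, ref=None):
    """Numerical Fresnel reconstruction following Eq. (5).

    hologram   : recorded intensity I_k of one color channel (2D array, rows ~ y')
    wavelength : lambda_k in metres
    d          : recording distance in metres
    dx_p, dy_p : pixel pitches Delta x', Delta y' in metres
    ref        : sampled reference wave R_k (unit-amplitude plane wave if None)
    """
    ny, nx = hologram.shape
    if ref is None:
        ref = np.ones((ny, nx), dtype=complex)
    q, p = np.mgrid[0:ny, 0:nx]                 # q: row index (y'), p: column index (x')
    # quadratic (chirp) phase factor appearing inside the sum of Eq. (5)
    chirp = np.exp(-1j * np.pi / (wavelength * d) *
                   ((p * dx_p) ** 2 + (q * dy_p) ** 2))
    # the exp[+j*2*pi*(p*m/Nx + q*n/Ny)] kernel of Eq. (5) is a 2D inverse DFT
    gamma = np.fft.ifft2(ref * hologram * chirp) * (nx * ny)
    # wavelength-dependent sampling steps of the reconstruction plane, Eqs. (6)-(7)
    dx_k = wavelength * d / (nx * dx_p)
    dy_k = wavelength * d / (ny * dy_p)
    return gamma, dx_k, dy_k
```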


Here, $\Delta x_k$ and $\Delta y_k$ are the spatial resolutions in the $xy$ plane and are given as

$$\Delta x_k = \frac{\lambda_k d}{N_x \Delta x'}, \quad (6)$$

$$\Delta y_k = \frac{\lambda_k d}{N_y \Delta y'}. \quad (7)$$
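As a rough numerical illustration (not from the paper; the recording distance is not stated there and is assumed here to be $d = 1\ \mathrm{m}$), using the camera parameters of Section 3 ($N_x = 1024$, $\Delta x' = 4.65\ \mu\mathrm{m}$), Eq. (6) gives

$$\Delta x_r \approx \frac{(633\ \mathrm{nm})(1\ \mathrm{m})}{1024 \times 4.65\ \mu\mathrm{m}} \approx 133\ \mu\mathrm{m}, \qquad \Delta x_g \approx \frac{(532\ \mathrm{nm})(1\ \mathrm{m})}{1024 \times 4.65\ \mu\mathrm{m}} \approx 112\ \mu\mathrm{m},$$

so the red and green reconstructions are sampled on grids whose steps differ by the ratio $\lambda_r/\lambda_g \approx 1.19$; this grid mismatch is what later necessitates the interpolation of Eq. (18).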

From the above equations, it is clear that the spatial resolutions $\Delta x_k$ and $\Delta y_k$ are functions of the wavelength and are hence different for the red and green beams. Using the same numerical reconstruction procedure as above, the complex amplitude of the object wave after deformation, i.e., $\Gamma_{k1}$, is calculated. Finally, the complex reconstructed interference field $\Gamma_k$ is obtained by multiplying the post-deformation complex amplitude with the conjugate of the predeformation complex amplitude. Substituting $x_k = m\Delta x_k$ and $y_k = n\Delta y_k$, we have

$$\Gamma_k(x_k, y_k) = \Gamma_{k1}(x_k, y_k)\,\Gamma_{k0}^*(x_k, y_k) = a_k(x_k, y_k)\exp[\,j\Delta\phi_k(x_k, y_k)\,], \quad (8)$$

where $a_k$ is the amplitude and $\Delta\phi_k$ is the interference phase for a given wavelength $\lambda_k$.
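A short continuation of the earlier sketch illustrates this step (hypothetical array names; the recording distance is again an assumed value, and fresnel_reconstruct refers to the sketch given after Eq. (5)):

```python
import numpy as np

lam_r, d = 633e-9, 1.0            # red wavelength; assumed recording distance (m)
dx_p = dy_p = 4.65e-6             # pixel pitch from Section 3 (m)

# holo_r_before / holo_r_after: red-channel holograms recorded before and after
# deformation (hypothetical arrays); fresnel_reconstruct is the earlier sketch.
gamma_r0, dx_r, dy_r = fresnel_reconstruct(holo_r_before, lam_r, d, dx_p, dy_p)
gamma_r1, _, _ = fresnel_reconstruct(holo_r_after, lam_r, d, dx_p, dy_p)

gamma_r = gamma_r1 * np.conj(gamma_r0)    # Eq. (8): reconstructed interference field
fringes_r = gamma_r.real                  # fringe pattern, cf. Figs. 3(e)-3(f)
```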

When the object is deformed, the displacement vector $\vec{d}$ of the object is related to the interference phase through the sensitivity vector, which is defined as the difference between the observation and illumination unit vectors. In Fig. 1, the illumination unit vectors are $\hat{e}_r$ and $\hat{e}_g$ for the red and green beams, and the observation unit vector is $\hat{u}$. So, we have

$$\Delta\phi_r = \frac{2\pi}{\lambda_r}\,\vec{d}\cdot(\hat{u} - \hat{e}_r) = \frac{2\pi}{\lambda_r}\left[d_z(1 + \cos\theta) + d_x\sin\theta\right] \quad (9)$$

and

$$\Delta\phi_g = \frac{2\pi}{\lambda_g}\,\vec{d}\cdot(\hat{u} - \hat{e}_g) = \frac{2\pi}{\lambda_g}\left[d_z(1 + \cos\theta) - d_x\sin\theta\right]. \quad (10)$$

Here $d_x$ and $d_z$ are the in-plane and out-of-plane displacement components. The spatial derivatives with respect to $y$ can be written as

$$\frac{\partial\Delta\phi_r}{\partial y} = \frac{2\pi}{\lambda_r}\left[\frac{\partial d_z}{\partial y}(1 + \cos\theta) + \frac{\partial d_x}{\partial y}\sin\theta\right], \quad (11)$$

$$\frac{\partial\Delta\phi_g}{\partial y} = \frac{2\pi}{\lambda_g}\left[\frac{\partial d_z}{\partial y}(1 + \cos\theta) - \frac{\partial d_x}{\partial y}\sin\theta\right]. \quad (12)$$

From the above two equations, we have

$$\lambda_r\frac{\partial\Delta\phi_r}{\partial y} + \lambda_g\frac{\partial\Delta\phi_g}{\partial y} = 4\pi\frac{\partial d_z}{\partial y}(1 + \cos\theta), \quad (13)$$

$$\lambda_r\frac{\partial\Delta\phi_r}{\partial y} - \lambda_g\frac{\partial\Delta\phi_g}{\partial y} = 4\pi\frac{\partial d_x}{\partial y}\sin\theta. \quad (14)$$

The above equations clearly indicate that the in-plane and out-of-plane displacement derivatives can be estimated from the spatial phase derivatives $\partial\Delta\phi_r/\partial y$ and $\partial\Delta\phi_g/\partial y$ for the red and green beams. To obtain the phase derivative from the reconstructed interference field $\Gamma_k$, the 2D-PSWVD [10] is used, which is given as

$$G_k(x_k, y_k; \omega_1, \omega_2) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} w(\tau_1, \tau_2)\,\Gamma_k(x_k + \tau_1, y_k + \tau_2)\,\Gamma_k^*(x_k - \tau_1, y_k - \tau_2)\exp[-2j(\omega_1\tau_1 + \omega_2\tau_2)]\,d\tau_1\,d\tau_2. \quad (15)$$

Here $w$ is a real symmetric window function. For the analysis, a 2D Gaussian window was used:

$$w(\tau_1, \tau_2) = \frac{1}{2\pi\sigma_x\sigma_y}\exp\!\left[-\left(\frac{\tau_1^2}{2\sigma_x^2} + \frac{\tau_2^2}{2\sigma_y^2}\right)\right] \quad \forall\, \tau_1 \in \left[-\frac{\sigma_x}{2}, \frac{\sigma_x}{2}\right],\ \tau_2 \in \left[-\frac{\sigma_y}{2}, \frac{\sigma_y}{2}\right], \quad (16)$$

with $\sigma_x = \sigma_y = 32$. For the 2D-PSWVD, the spatial frequencies $[\omega_1, \omega_2]$ at which the distribution attains its maximum correspond to the phase derivatives. In other words,

$$\left[\frac{\partial\Delta\phi_k(x_k, y_k)}{\partial x_k}, \frac{\partial\Delta\phi_k(x_k, y_k)}{\partial y_k}\right] = \arg\max_{\omega_1, \omega_2}\, G_k(x_k, y_k; \omega_1, \omega_2). \quad (17)$$

Here "arg max" signifies the value of $[\omega_1, \omega_2]$ at which $G_k(x_k, y_k; \omega_1, \omega_2)$ has a peak. Using the above equation, the phase derivative can be estimated for the red and green wavelengths, i.e., for $k = r$ and $g$. Since the derivative is obtained directly, there is no further requirement of unwrapping or numerical differentiation operations. The phase derivative is denoted as $\omega_k$ in the rest of the manuscript, where $\omega_k = \partial\Delta\phi_k/\partial y$.
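The following sketch illustrates one possible implementation of Eqs. (15)-(17) (not the authors' code; the window half-size, zero-padding length, and bin-to-frequency scaling are choices made here for illustration). For each pixel it forms the windowed Wigner-Ville kernel, evaluates its 2D spectrum with an FFT, and converts the peak location along the $\tau_1$ (i.e., $y$) axis into the phase derivative $\omega_k = \partial\Delta\phi_k/\partial y$; because of the factor of 2 in the exponent of Eq. (15), one FFT cycle per pixel corresponds to $\pi$ radians per pixel.

```python
import numpy as np

def pswvd_phase_derivative(gamma, dy_k, half=16, sigma=32.0, nfft=128):
    """Estimate the y-phase-derivative map of a complex interference field
    with the 2D pseudo Wigner-Ville distribution, Eqs. (15)-(17).

    gamma : complex 2D array Gamma_k(x_k, y_k), rows along y
    dy_k  : sampling step along y in the reconstruction plane (e.g., in microns)
    half  : window half-width in pixels (sigma/2 = 16 for sigma = 32, cf. Eq. (16))
    sigma : Gaussian window parameter of Eq. (16)
    nfft  : FFT length used to sample the (omega_1, omega_2) plane
    """
    ny, nx = gamma.shape
    tau = np.arange(-half, half + 1)
    t1, t2 = np.meshgrid(tau, tau, indexing="ij")   # t1: lag along y, t2: lag along x
    # Gaussian window of Eq. (16); the constant normalization does not affect the peak
    w = np.exp(-(t1 ** 2 + t2 ** 2) / (2.0 * sigma ** 2))
    freqs = np.fft.fftfreq(nfft)                    # cycles per pixel along each lag axis
    omega_y = np.full((ny, nx), np.nan)
    for m in range(half, ny - half):                # border pixels are skipped
        for n in range(half, nx - half):
            # windowed Wigner-Ville kernel Gamma(x+tau) * conj(Gamma(x-tau)), Eq. (15)
            kernel = w * gamma[m + t1, n + t2] * np.conj(gamma[m - t1, n - t2])
            G = np.abs(np.fft.fft2(kernel, s=(nfft, nfft)))
            i, _ = np.unravel_index(np.argmax(G), G.shape)
            # FFT kernel exp(-j*2*pi*f*tau) versus exp(-j*2*omega*tau) in Eq. (15):
            # omega = pi * f per pixel; divide by the pixel step for rad per unit length
            omega_y[m, n] = np.pi * freqs[i] / dy_k
    return omega_y
```

Applying such a routine to $\Gamma_g$ and $\Gamma_r$ with their respective sampling steps $\Delta y_g$ and $\Delta y_r$ yields the maps $\omega_g(x_g, y_g)$ and $\omega_r(x_r, y_r)$ used below.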

It needs to be emphasized that the phase derivative $\omega_k$ is evaluated at the spatial coordinates $(x_k, y_k)$, which are different for the red and green beams since the spatial resolution $[\Delta x_k, \Delta y_k]$ depends on the wavelength $\lambda_k$. Consequently, the phase derivatives for the two wavelengths are evaluated on different grids and hence cannot be directly combined as in Eqs. (13) and (14). This problem originates from the use of the discrete Fourier transform in Eq. (5), which transforms the $(N_x - 1)\Delta x' \times (N_y - 1)\Delta y'$ grid in the CCD plane to a $(N_x - 1)\Delta x_k \times (N_y - 1)\Delta y_k$ grid in the reconstruction plane, with the grid resolution proportional to the wavelength. To mitigate this problem, an interpolation scheme is used in the proposed method, since the spatial coordinates or grid points $(x_k, y_k)$ are known for both wavelengths. Accordingly, the phase derivative obtained for the red wavelength is interpolated on the grid points corresponding to the green wavelength. With $\omega_g(x_g, y_g)$ and $\omega_r(x_r, y_r)$ as the estimated phase derivatives for the green and red wavelengths, the interpolated derivative $\omega_{\mathrm{interp}}(x_g, y_g)$ is obtained by using MATLAB's "interp2" function and can be written as

$$\omega_{\mathrm{interp}}(x_g, y_g) = \mathrm{interp2}\big(x_r, y_r, \omega_r(x_r, y_r), x_g, y_g, \text{'spline'}\big). \quad (18)$$

The "interp2" function in the above equation interpolates the function $\omega_r(x_r, y_r)$ over the $x_g y_g$ grid using two-dimensional cubic spline interpolation. After the interpolation, the phase derivative estimates for both wavelengths correspond to the same grid. Using Eqs. (13) and (14), the out-of-plane and in-plane components of the displacement derivative can then be obtained as

$$\lambda_r\omega_{\mathrm{interp}} + \lambda_g\omega_g = 4\pi\frac{\partial d_z}{\partial y}(1 + \cos\theta), \quad (19)$$

$$\lambda_r\omega_{\mathrm{interp}} - \lambda_g\omega_g = 4\pi\frac{\partial d_x}{\partial y}\sin\theta. \quad (20)$$
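A Python counterpart of Eqs. (18)-(20) is sketched below (illustrative only; SciPy's RectBivariateSpline stands in for MATLAB's interp2 with the 'spline' option, the illumination angle $\theta$ is an assumed value, and wavelengths are taken in micrometers so that they are consistent with phase derivatives in rad/μm).

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

def combine_derivatives(omega_r, x_r, y_r, omega_g, x_g, y_g,
                        lam_r=0.633, lam_g=0.532, theta=np.deg2rad(30.0)):
    """Interpolate the red-wavelength map onto the green grid (Eq. (18)) and form
    the out-of-plane and in-plane displacement derivatives (Eqs. (19)-(20)).

    omega_r, omega_g : phase-derivative maps (rad/um) on their own grids (rows ~ y)
    x_r, y_r, x_g, y_g : strictly increasing 1D grid coordinates (um)
    lam_r, lam_g : wavelengths (um); theta : illumination angle (assumed value)
    """
    # Eq. (18): 2D spline interpolation of omega_r(x_r, y_r) onto the (x_g, y_g) grid
    spline = RectBivariateSpline(y_r, x_r, omega_r)     # first axis ~ y, second ~ x
    omega_interp = spline(y_g, x_g)                      # evaluated on the green grid

    # Eqs. (19)-(20): sum and difference give the two displacement-derivative maps
    ddz_dy = (lam_r * omega_interp + lam_g * omega_g) / (4.0 * np.pi * (1.0 + np.cos(theta)))
    ddx_dy = (lam_r * omega_interp - lam_g * omega_g) / (4.0 * np.pi * np.sin(theta))
    return ddz_dy, ddx_dy
```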

3. Experimental Analysis

The schematic (not drawn to scale) of the optical configuration is shown in Fig. 2. The green and red beams from the lasers are split by the beam splitters BS1 and BS2 to obtain individual reference and object illumination beams. The reference beams for the red and green colors are combined via beam splitter BS3. For hologram recording, a three-color CCD camera is used. The red and green channels of the camera store the intensity information related to the interference of the reference and object beams of the corresponding color. Effectively, each color channel records the hologram associated with the corresponding color at the same time, which permits simultaneous multicolor illumination of the object in the optical configuration.

Fig. 2. (Color online) Schematic of the DHI setup with dual-color illumination. BS1-BS3: beam splitters; BE1-BE3: beam expanders; M1-M3: mirrors; OBJ: object.

To analyze the proposed method, a DHI experiment was performed in which a clamped object was subjected to external loading and in-plane rotation. A helium-neon laser (633 nm) and a Coherent Verdi laser (532 nm) were used to generate the red and green beams. A JAI-M9-CL 3-CCD camera [1024 (horizontal) × 768 (vertical) pixels] with pixel resolution $[\Delta x', \Delta y'] = [4.65\ \mu\mathrm{m}, 4.65\ \mu\mathrm{m}]$ was used to record the holograms before and after object deformation. Because of the three color channels, every image recorded by the camera can be expressed as a three-dimensional array of size 768 × 1024 × 3, where the third dimension signifies the red, green, or blue color information.
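As an illustration of this data format (the file name and the use of the imageio package are assumptions made here for the sketch; any routine returning the raw 768 × 1024 × 3 array would serve), the two holograms are obtained by simple channel indexing:

```python
import imageio.v3 as iio
import numpy as np

# Hypothetical acquisition file: a 768 x 1024 x 3 color hologram frame
frame = np.asarray(iio.imread("hologram_before.png"), dtype=float)

holo_red = frame[:, :, 0]      # red channel   -> hologram for the 633 nm beam
holo_green = frame[:, :, 1]    # green channel -> hologram for the 532 nm beam
# frame[:, :, 2] (blue channel) is unused in the dual-wavelength configuration
```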

The intensities recorded in the green and red channels before object deformation are shown in Figs. 3(a) and 3(b). The complex amplitudes $\Gamma_{g0}$ and $\Gamma_{r0}$ are obtained for the green and red channels using the discrete Fresnel transform, and the corresponding intensities are shown in Figs. 3(c) and 3(d). The real and virtual images of the object are clearly visible in the figures. For the analysis, regions of interest were selected from the virtual images of the object for both wavelengths. The center of the object served as the common origin, i.e., the spatial coordinate (0, 0), for both regions, though any other reference location, such as a marked point on the object, could be used equivalently. The reconstructed interference fields $\Gamma_g(x_g, y_g)$ and $\Gamma_r(x_r, y_r)$ were calculated using Eq. (8), and their real parts, which constitute the fringe patterns, are shown in Figs. 3(e) and 3(f) for the selected regions. In these figures, the coordinates $x_g$ and $y_g$ for the green wavelength vary in steps of $\Delta x_g$ and $\Delta y_g$, whereas $x_r$ and $y_r$ for the red wavelength vary in steps of $\Delta x_r$ and $\Delta y_r$.

Fig. 3. (a) Intensity recorded in the green channel. (b) Intensity recorded in the red channel. (c) $|\Gamma_{g0}(x_g, y_g)|^2$. (d) $|\Gamma_{r0}(x_r, y_r)|^2$. (e) Fringe pattern for the green wavelength. (f) Fringe pattern for the red wavelength.

The phase derivative estimates $\omega_g(x_g, y_g)$ and $\omega_r(x_r, y_r)$ in radians/μm obtained using the 2D-PSWVD for the green and red wavelengths are shown in Figs. 4(a) and 4(b). The interpolated phase derivative for the red wavelength, i.e., $\omega_{\mathrm{interp}}(x_g, y_g)$, is shown in Fig. 4(c). The information about the in-plane and out-of-plane components was obtained using Eqs. (19) and (20), as shown in Figs. 4(d) and 4(e). For the analysis, the points near the borders were neglected. The accuracy of the proposed method is affected by interpolation errors, color channel crosstalk, misalignment of the CCDs, etc., which could be explored in the future.

Fig. 4. (Color online) Phase derivatives (a) $\omega_g(x_g, y_g)$, (b) $\omega_r(x_r, y_r)$, and (c) $\omega_{\mathrm{interp}}(x_g, y_g)$ in radians/μm. (d) The sum $\lambda_r\omega_{\mathrm{interp}} + \lambda_g\omega_g$. (e) The difference $\lambda_r\omega_{\mathrm{interp}} - \lambda_g\omega_g$.

4. Conclusions

In this paper, a dual-wavelength DHI-based method was presented for the measurement of in-plane and out-of-plane displacement derivatives. The major advantage of the proposed method is the feasibility of simultaneous multidimensional measurements. In addition, the method directly provides the phase derivative without requiring unwrapping operations, numerical differentiation, or multiple data frames. The utility of the proposed method for practical applications was validated through experimental results.

References
1. G. Pedrini, Y. L. Zou, and H. J. Tiziani, "Simultaneous quantitative evaluation of in-plane and out-of-plane deformations by use of a multidirectional spatial carrier," Appl. Opt. 36, 786–792 (1997).
2. P. Picart, E. Moisson, and D. Mounier, "Twin-sensitivity measurement by spatial multiplexing of digitally recorded holograms," Appl. Opt. 42, 1947–1957 (2003).
3. M. De La Torre-Ibarra, F. Mendoza-Santoyo, C. Perez-Lopez, and T. Saucedo-A, "Detection of surface strain by three-dimensional digital holography," Appl. Opt. 44, 27–31 (2005).
4. M. De la Torre-Ibarra, F. M. Santoyo, C. Perez-Lopez, T. S. Anaya, and D. D. Aguayo, "Surface strain distribution on thin metallic plates using 3-D digital holographic interferometry," Opt. Eng. 45 (2006).
5. Y. Morimoto, T. Matui, M. Fujigaki, and A. Matsui, "Three-dimensional displacement analysis by windowed phase-shifting digital holographic interferometry," Strain 44, 49–56 (2008).
6. Y. Morimoto, T. Matui, and M. Fujigaki, "Application of three-dimensional displacement and strain distribution measurement by windowed phase-shifting digital holographic interferometry," Adv. Mater. Res. 47–50, 1262–1265 (2008).
7. P. Picart, D. Mounier, and J. M. Desse, "High-resolution digital two-color holographic metrology," Opt. Lett. 33, 276–278 (2008).
8. T. Saucedo-A, M. H. De La Torre-Ibarra, F. M. Santoyo, and I. Moreno, "Digital holographic interferometer using simultaneously three lasers and a single monochrome sensor for 3D displacement measurements," Opt. Express 18, 19867–19875 (2010).
9. G. Rajshekhar, S. S. Gorthi, and P. Rastogi, "Strain, curvature, and twist measurements in digital holographic interferometry using pseudo-Wigner-Ville distribution based method," Rev. Sci. Instrum. 80, 093107 (2009).
10. G. Rajshekhar, S. S. Gorthi, and P. Rastogi, "Estimation of displacement derivatives in digital holographic interferometry using a two-dimensional space-frequency distribution," Opt. Express 18, 18041–18046 (2010).
11. S. S. Gorthi, G. Rajshekhar, and P. Rastogi, "Strain estimation in digital holographic interferometry using piecewise polynomial phase approximation based method," Opt. Express 18, 560–565 (2010).
12. U. Schnars and W. P. O. Juptner, "Digital recording and numerical reconstruction of holograms," Meas. Sci. Technol. 13, R85–R101 (2002).
