Proceedings of the 2002 IEEE International Conference on Robotics & Automation Washington, DC • May 2002

Visual Servo Control for the Hovering of an Outdoor Robotic Airship

Geraldo F. Silveira, José R. Azinheira†, Patrick Rives‡, José R. H. Carvalho, Ely C. de Paiva, Samuel S. Bueno

Abstract
This paper addresses the automatic hovering of an outdoor autonomous airship using image-based visual servoing. The hovering controller is designed using a full dynamic model of the airship, in a PD error feedback scheme, taking as output the visual signals extracted from an on-board camera. The behavior and stability of the airship motion during the task execution, subjected to wind disturbance, are studied. The approach is finally validated in simulation using an accurate airship model.

Keywords
visual servoing; aerial unmanned robots; vision-based control; robotics.

Fig. 1. Robotic Airship of AURORA Project.

I. Introduction

The application of vision based control techniques in robotics has experienced great advances since the early works in the 70's [10]. Most initial results were based on a look-and-move approach, where the robot acquires the image, computes its position and orientation with regard to a reference frame of the scene, and uses a path planning algorithm to design a desired Cartesian trajectory to reach its goal in a step by step fashion. More recently, visual servoing techniques appeared, tackling the vision process directly in the closed-loop control design. Further results have established visual servoing as a sound approach for autonomously controlling robots using images as feedback signals [2], [5]. Although vision based control approaches have been successfully applied to many different scenarios, such as mobile robots, hand-eye systems [6] or underwater vehicles [7], applications in the field of aerial robots are still incipient. Vision based autonomous landing for airplanes is presented in [11], whereas vision guided systems for autonomous helicopters can be found in [13] and [12]. Previous works on vision based control for airships have addressed only small indoor blimps, without the presence of wind and gust disturbances, as in [14] and [15].

In the present article, the authors address the problem of a visual servoing control scheme for the automatic hovering of an autonomous outdoor airship. The robotic airship (Fig. 1) is developed under the AURORA Project [1], conceived for aerial inspection and environmental monitoring tasks. The main challenges in the flight control design for this aerial vehicle are related to (i) its very large volume (24 m³ for a 9 m length) and low density, which result in increased time constants and a high sensitivity to wind or gust disturbances, and (ii) its actuation deficiency and the resulting non-holonomic constraints.

For the problem of automatic hovering of the airship over a reference target, the methodology proposed in [2] is completed with the inclusion of the vehicle dynamics. The target is composed of a circle on the ground and a ball floating just over the center of the circle, as depicted in Fig. 3. An optimal controller is then designed for the error feedback, taking the visual signals as output. The main idea here is to carry out a feasibility analysis in order to validate the approach for future onboard implementation. Also, the technique could easily be extended to the vertical landing of the airship by a proper change of the desired visual signals.

The paper is organized as follows: in Section 2 the problem is stated and both the vehicle and vision models are introduced. In Section 3 the optimal controller design is presented. Section 4 presents numerical simulation results and Section 5 summarizes the final conclusions.

Research Center Renato Archer, LRV/CenPRA, CP 6162, CEP 13082-120, Campinas/SP, Brazil. E-mail: {First_name.Last_name}@cenpra.gov.br
† Instituto Superior Técnico, IDMEC/IST, Av. Rovisco Pais, 1049-001, Lisboa, Portugal. E-mail: [email protected]
‡ INRIA Sophia Antipolis, 2004 Route des Lucioles, BP 93, 06902 Sophia Antipolis Cedex, France. E-mail: [email protected]

0-7803-7272-7/02/$17.00 © 2002 IEEE


II. Modeling and control aspects

For the design of the hovering control of an aerial robot using visual servoing, it is first necessary to define an accurate model of all the components of the system, namely the target, the vision setup and the airship dynamics. These components are presented in this section.

A. Dynamics of the airship

In this section we briefly review the nonlinear dynamic model representing the AURORA airship. A detailed presentation can be found in [3]. The dynamic model describes the motion of a frame attached to the airship, with the origin at the Center of Volume (O) (Fig. 3). The usual transformation based on the Euler angles (φ, θ, ψ) is used to switch from the airship body-fixed frame to an Earth-fixed frame. The dynamic model can be stated as:

M_a Ẋ = F_a + F_g + F_p + F_k    (1)

where X = [U, V, W, P, Q, R]^T is the vector of inertial velocities, composed of the linear (U, V, W) and angular (P, Q, R) inertial velocities of the vehicle in the body frame; F_a represents the generalized forces produced by aerodynamics, F_g gravity, F_p the propulsion forces, F_k the kinematics inertia, and M_a is the generalized apparent mass matrix that includes both the airship actual inertia and the virtual inertia elements associated with the dynamics of buoyant vehicles [3].

As control inputs, the airship has the tail deflection surfaces, a pair of main thrusters, and a small stern thruster, as depicted in Fig. 2. The tail fins can be deflected from -25° to +25°, and the engines can be vectored from -30° to +120° up.

Fig. 2. Airship actuators.

For the design of the controller, the dynamic Eq. (1) is linearized around the trim position, for hovering above the target, resulting in an LTI system:

ẋ = A x + B u    (2)

where x = [x_pos, x_vel]^T, with x_pos = [x_a, y_a, z_a, φ, θ, ψ] and x_vel = [u, v, w, p, q, r]. The position variables given by x_pos represent, respectively, the perturbations in the airship position, projected on the body frame, and the Euler angles. The velocity variables given by x_vel represent, respectively, the perturbations in the forward, lateral and vertical linear velocities as well as in the angular rates (note that for hovering over a fixed target the reference velocity is null, x_vel = 0). The control vector is given by u = [δ_T, δ_v, δ_r, δ_tail]^T, corresponding to the perturbations around the trim values for the main propellers thrust, vectoring angle, rudder deflection angle and tail thrust.
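To make the linearization step concrete, the sketch below shows one common way of obtaining the A and B matrices of Eq. (2) numerically, by finite differences around the hover trim point. The function airship_dynamics mentioned in the usage comment is a stand-in for the full nonlinear model of Eq. (1), not the AURORA implementation; trim values and the step size are illustrative.

import numpy as np

def linearize(f, x_trim, u_trim, eps=1e-6):
    """Finite-difference Jacobians of x_dot = f(x, u) about a trim point,
    giving the LTI model of Eq. (2)."""
    n, m = x_trim.size, u_trim.size
    f0 = f(x_trim, u_trim)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x_trim + dx, u_trim) - f0) / eps
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x_trim, u_trim + du) - f0) / eps
    return A, B

# Usage (airship_dynamics is a placeholder for the model of Eq. (1)):
# A, B = linearize(airship_dynamics, x_trim=np.zeros(12), u_trim=np.zeros(4))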


B. Vision aspects

Let us consider a camera fixed under the gondola of the airship, pointing downwards (Fig. 3). The pose (position and orientation) of the camera is an element r of R³ × SO(3), which is a six-dimensional differential manifold. Moreover, let us consider that the information in the image may be modeled as a set of elementary signals characterizing the geometric features which result from the projection onto the image of the 3D objects belonging to the scene. A vector s(r, t) is then constructed from these elementary signals. Refer to [2] for further details.

Fig. 3. Camera, frames and target representations.

The relationship between the velocities of the image features and the velocities of the camera is expressed by the Interaction Matrix L^T:

ṡ = L^T [V_cam; Ω_cam]    (3)

where V_cam and Ω_cam are respectively the linear and angular velocities of the camera.

This interaction matrix can be derived for many exteroceptive sensors and defines a diffeomorphism between the configuration space of the vehicle and the image plane. In the case of vision sensors, the elementary signals can be chosen among a set of geometric primitives. When the image features are general algebraic curves, an analytical expression for the interaction matrix can also be obtained [2].
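As a concrete instance of Eq. (3), the sketch below builds the classical interaction matrix of a point feature with normalized image coordinates (x, y) observed at depth Z. This is the standard form from the visual servoing literature (e.g. [2]), not code taken from the paper, and the numerical values in the usage lines are purely illustrative.

import numpy as np

def point_interaction_matrix(x, y, Z):
    """Classical 2x6 interaction matrix of a normalized image point (x, y) at
    depth Z, mapping the camera twist [V_cam; Omega_cam] to [x_dot, y_dot]."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,       -(1.0 + x * x),  y],
        [0.0,     -1.0 / Z,  y / Z, 1.0 + y * y, -x * y,         -x],
    ])

# Stacking the rows of several features gives a full matrix L^T as in Eq. (3):
L_circle = point_interaction_matrix(0.0, 0.24, 25.0)   # circle center (illustrative)
L_ball = point_interaction_matrix(0.0, 0.28, 21.0)     # ball (illustrative)
L = np.vstack([L_circle, L_ball])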

C. Target definition

The target chosen is the airship mast positioned in the center of a circle drawn on the ground, see Fig. 3. For simulation purposes, this corresponds to a circle (with radius r_c = 2 m) lying on the ground and a small ball (considered as punctual) floating over the circle center (at height h_b = 4 m). In the camera plane, the circle appears as an ellipse with center (x_c, y_c) and the ball appears as a point with coordinates (x_b, y_b). The vector of visual signals, extracted from the image, is then chosen as:

s = [x_c, y_c, x_b, y_b, δ]^T,  with  δ = √(μ_20 + μ_02)    (4)

where μ_20 and μ_02 are the reduced moments of the ellipse and d is the vertical distance from the optical center to the center of the circle. The parameter δ depends only on the projection of the circle and incorporates the projected ellipse area.

From such a vector of visual signals, an analytical form of the interaction matrix may be computed (see [8] for more details on the computation). The computation of the interaction matrix at the desired airship configuration x_pos = x*_pos, or equivalently at the desired image (s = s*), for a selected airship altitude and distance from the target axis, ensures convergence for initial conditions near the final position. This configuration has been selected such that the airship is aligned with the circle and ball (x_c = x_b = 0), with a flat attitude (φ = 0, θ = 0), at a desired altitude h and distance x_a from the circle axis. The corresponding desired vector of visual signals is s* = [0, y*_c, 0, y*_b, δ*]^T, and the interaction matrix evaluated at this configuration, L^T|_{s=s*}, given in Eq. (5), is a 5 × 6 matrix whose nonzero entries are functions of 1/d, 1/(d - h_b), y*_c, y*_b and the circle radius r_c.

This image Jacobian relates the variation of the visual signals extracted from the image to the camera motion. The definition of the target and of the visual signals was chosen in order to give the image output a good sensitivity to any motion of the camera. Other improvements obtained with this configuration are:

• Both the target and the visual signals were selected to leave the heading unobserved, such that the yaw angle is indeterminate, allowing the scheme to cope with a previously unknown wind direction;

• The dimension of the subspace spanned by the interaction matrix L^T yields the degrees of freedom of the blimp configuration that can be controlled by the visual feedback, in this case rank L^T = 5. The corresponding free motion is a circular motion at constant altitude. This motion is associated to a zero-dynamics of the blimp, which can be excited in the presence of noise or modelling errors. Conversely, the presence of wind with a constant direction acts as an external control input that stabilizes this zero-dynamics and constrains the blimp to be aligned with the wind direction. Here, an analogy with a pendulum motion can be made, where the center of rotation is the axis of the image reference, and the "gravity" is the aerodynamic force due to the wind, plus the aerodynamic damping. However, differently from a "free pendulum" case, the loop is indeed closed by the visual servoing, with some interaction with the yawing motion, which may significantly change the damping characteristics if the visual servoing is not much quicker than the pendulum motion.
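A sketch of how the visual signal vector of Eq. (4) could be measured from a segmented image is given below, using OpenCV image moments. It assumes two binary masks (the projected circle and the ball) and a known intrinsic matrix K; the normalization applied to the second-order moments is a placeholder, since the exact definition of the reduced moments follows [8].

import cv2
import numpy as np

def visual_signals(circle_mask, ball_mask, K):
    """Extract s = [x_c, y_c, x_b, y_b, delta] (Eq. (4)) from two binary masks.
    K is the 3x3 camera intrinsic matrix used to normalize pixel coordinates."""
    def normalized_centroid(mask):
        m = cv2.moments(mask, binaryImage=True)
        u, v = m["m10"] / m["m00"], m["m01"] / m["m00"]   # pixel centroid
        x = (u - K[0, 2]) / K[0, 0]                        # normalized coordinates
        y = (v - K[1, 2]) / K[1, 1]
        return x, y, m

    x_c, y_c, m_circle = normalized_centroid(circle_mask)
    x_b, y_b, _ = normalized_centroid(ball_mask)

    # Second-order central moments of the ellipse, scaled to the normalized
    # image plane; this scaling is an assumption standing in for the reduced
    # moments of [8].
    mu20 = m_circle["mu20"] / (m_circle["m00"] * K[0, 0] ** 2)
    mu02 = m_circle["mu02"] / (m_circle["m00"] * K[1, 1] ** 2)
    delta = np.sqrt(mu20 + mu02)

    return np.array([x_c, y_c, x_b, y_b, delta])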

D. Task function

A classical approach in robotics is to consider the process of achieving a task, such as tracking or positioning, as a problem of regulation of an output function [9]. The application of the task function approach to vision-based control is straightforward: the task is described as desired values for a certain set of elementary visual signals. In mathematical terms, a visual task function vector e is:

e(r, t) = M (s(r, t) - s*)    (6)

where s(r, t) is the value of the visual features currently observed by the camera, s* is the desired value of s, and M is a combination matrix.

For example, assuming that the carrier of the camera can be classed as a pure integrator (such an assumption is realistic for hand-eye manipulators), we can design a very simple velocity control scheme using visual feedback:

[V_cam; Ω_cam] = -λ e,  with λ > 0    (7)

If we assume that e depends only on r (motionless target, ∂e/∂t = 0), we obtain:

ė = -λ M L^T e    (8)

An exponential convergence will thus be ensured under the sufficient condition:

M L^T > 0    (9)

in the sense that an n × n matrix A is positive if x^T A x > 0 for any nonzero x ∈ R^n.

A good and simple way to satisfy this convergence condition in the neighborhood of the desired position is to choose for the matrix M the pseudo-inverse of the interaction matrix associated to s*:

M = L^T+|_{s=s*}    (10)

where L^T+ is the pseudo-inverse of the matrix L^T.

Unfortunately, as was shown previously, the dynamics of the airship cannot be classed as a pure integrator, and a more sophisticated control scheme must be used.

III. Controller design

A. Optimal control design

The control scheme is conceived as a regulator setup where a desired image feature s* is to be tracked with a PD error feedback. The output to be considered is thus composed of the camera signals along with their time derivatives. These output signals are related to the state variables through a full-rank transformation matrix C, composed by the Interaction Matrix L^T along with a frame transformation R_a^c from the airship frame velocity to the camera frame velocity:

[V_cam; Ω_cam] = R_a^c x_vel    (11)

From Eq. (3) and the frame transformation (11), the feature derivatives may be related to the airship velocity:

ṡ = L^T R_a^c x_vel    (12)

Then, assuming that L^T and R_a^c are constant, integration of the above equation gives:

∫_t^{t_s} ṡ dt = L^T R_a^c ∫_t^{t_s} x_vel dt    (13)

The feature error may also be related to the airship position error, and thus:

s - s* = L^T R_a^c (x_pos - x*_pos)    (14)

or, as x*_pos = 0,

s - s* = L^T R_a^c x_pos    (15)

From Eqs. (12) and (13), the output signal may be written as y = [e, ṡ]^T. Then, the open loop system to be controlled may be rewritten as:

ẋ = A x + B u
y = C x    (16)

where the output matrix is the block diagonal matrix:

C = [ L^T R_a^c      0
      0              L^T R_a^c ]    (17)

In order to design the controller gain, an optimal error feedback is sought using the standard LQR formulation, with a performance index defined by:

J = ∫_0^∞ [x^T Q x + u^T R u] dt    (18)

where the weighting matrices Q ≥ 0, R > 0 are to be selected according to the characteristics of the system. The discrete-time optimal output feedback gain K is obtained by discretizing the continuous-time system with a sampling rate of 10 Hz and then solving a stationary Riccati equation. Recalling that the system has large time constants, this relatively slow sampling rate has proven to be sufficient.

The digital control law resulting from the optimal design is given by:

u(k) = -K y(k)    (19)
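Before moving to the results, two short sketches summarize the control laws discussed above; both are generic illustrations rather than the project code. The first implements the pure-integrator law of Eqs. (7)-(10), with M chosen as the pseudo-inverse of the interaction matrix at the goal:

import numpy as np

def ibvs_velocity_command(s, s_star, L_star, lam=0.5):
    """Velocity control of Eqs. (6)-(10): the commanded camera twist is
    [V_cam; Omega_cam] = -lam * pinv(L_star) @ (s - s_star)."""
    e = np.linalg.pinv(L_star) @ (s - s_star)   # task function, Eqs. (6) and (10)
    return -lam * e                             # Eq. (7)

The second sketches the design step of Eqs. (16)-(19): discretize the LTI model at 10 Hz, solve a discrete Riccati equation, and apply u(k) = -K y(k). Mapping the state-feedback gain through the pseudo-inverse of C is a simplification of the paper's output-feedback computation, and the weighting matrices are placeholders to be chosen by the designer.

import numpy as np
from scipy.signal import cont2discrete
from scipy.linalg import solve_discrete_are

def hovering_gain(A, B, C, Q, R, dt=0.1):
    """Discretize at 10 Hz, solve the stationary Riccati equation and return a
    gain for the output feedback u(k) = -K y(k) of Eq. (19)."""
    n = A.shape[0]
    D = np.zeros((n, B.shape[1]))
    Ad, Bd, _, _, _ = cont2discrete((A, B, np.eye(n), D), dt, method="zoh")
    P = solve_discrete_are(Ad, Bd, Q, R)                     # stationary Riccati
    Kx = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)   # state-feedback LQR gain
    return Kx @ np.linalg.pinv(C)                            # approximate output-feedback gain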

IV. Results

In order to show the tracking and robustness properties of the vision based hovering control, we have run three different simulation cases: i) no wind and no gust, to show the zero-dynamics properties; ii) weak wind and gust, to show the tracking features; and iii) strong wind and gust, to show the robustness properties.

The vision-based hovering application was launched with the airship initially located at the position (N,E,D) = (North, East, Down) = (-15, 0, 25) (Fig. 4). The center of the target is located at (N,E,D) = (0, 20, 0), as depicted by the inner circle. Note that all the North-East plots shown here correspond to the position of the CV (the center of volume, which is also the center of the airship reference frame), and not to the camera position, which is located 1.14 m below and 1.25 m ahead of the CV. The objective is to bring the vector of visual signals to s* = [0, 0.239, 0, 0.282, 6.535]^T, which corresponds to staying on the dashed circle in Fig. 4.

The North-East plot for the first simulation, with no wind or gust perturbation, is shown in Fig. 4. In this case, the airship exhibits an initial circular movement around the target before attaining a steady final position pointing to its center. This behavior is explained by the zero dynamics, or "pendulum"-like motion, which is only damped by the low aerodynamic forces, as discussed at the end of Section II.C. The image path of the center of the target for this case is shown at the right of the same figure.

Fig. 4. Left: Airship CV position in Earth-fixed frame. Right: Target center path in image plane (case: no wind and no gust).

In the second case we have introduced a weak wind of 2 m/s from North with an additive gust of 1 m/s, to point out the tracking performance of the hovering. The evolution of the airship motion towards the target is shown in Fig. 5. The corresponding altitude plot is depicted in Fig. 6. Note that, with the given wind direction, the desired final airship position would be (N,E,D) = (-5, 20, 25). Figure 5 shows the CV somewhat away from that point, which is explained by the fact that the camera is indeed located ahead of the CV. The image path plot at the right of this figure still shows some offset error, which is explained by the Proportional-Derivative control configuration (Eq. (15)) and could easily be eliminated with the addition of an integral term. The small altitude error in Fig. 6 is explained by the same reason. The comparison of the North-East plot and the target path in the image plane in Fig. 5 shows the stability properties of the visual feedback, as the oscillatory movement of the airship along the trajectory does not correspond to strong oscillations in the image plane. The image feature parameters for this simulation are depicted in Fig. 7, and the signals sent to the actuators (main thrust, tail thrust, and vectoring angle) are shown in Fig. 8.

Fig. 5. Left: Airship CV position in Earth-fixed frame. Right: Target center path in image plane (case: wind = 2 m/s, gust = 1 m/s).

Fig. 6. Evolution of airship altitude, with the altitude reference in dashed line (case: wind = 2 m/s, gust = 1 m/s).

Fig. 7. Error signals in the image features: x_c (dotted), x_b (solid), y_c (dashed), y_b (dash-dot) (case: wind = 2 m/s, gust = 1 m/s).

Fig. 8. Actuation signals: thrusts δ_T and δ_tail, and vectoring angle (case: wind = 2 m/s, gust = 1 m/s).

In a final simulation, we have introduced a strong wind incidence of 4 m/s from North with a gust of 2 m/s. The North-East plot for this simulation is shown in Fig. 9. The erratic movement exhibited at the final configuration is explained by the presence of the heavy gust perturbation. Despite this perturbation and the strong wind incidence, the final configuration shows a good agreement with the desired final position.

Fig. 9. Left: Airship CV position in Earth-fixed frame. Right: Target center path in image plane (case: wind = 4 m/s, gust = 2 m/s).
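The offset attributed above to the PD structure suggests the usual remedy of integral action; a minimal sketch of augmenting the control law of Eq. (19) with an integral of the feature error is shown below. The gain Ki, the anti-windup limit and the class interface are illustrative assumptions, not part of the paper's controller.

import numpy as np

class HoveringPID:
    """Output feedback of Eq. (19) augmented with an integral term on the
    feature error e, a common way to remove steady-state offsets caused by a
    constant wind.  K and Ki are gain matrices; dt is the 10 Hz period."""
    def __init__(self, K, Ki, dt=0.1, i_limit=1.0):
        self.K, self.Ki, self.dt, self.i_limit = K, Ki, dt, i_limit
        self.e_int = None

    def command(self, y, e):
        if self.e_int is None:
            self.e_int = np.zeros_like(e)
        self.e_int = np.clip(self.e_int + e * self.dt, -self.i_limit, self.i_limit)
        return -self.K @ y - self.Ki @ self.e_int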

V. Discussion and conclusion

In this paper we have presented a purely vision-based control scheme applied to an outdoor underactuated robotic airship for hovering tasks. Previous similar works have dealt only with small indoor blimps [14], [15].

The image-based visual servoing scheme uses an Interaction Matrix related to the parameters in the image plane, which are defined by the projection of the target used in the task. The reference image used was the projection of a circle on the ground with a small ball floating over the center of the circle. This has proven to be an interesting reference image for two main reasons: i. the resulting Interaction Matrix preserves the decoupling of the longitudinal and lateral dynamics, simplifying the control design process; ii. the reference image permits multiple possible configurations for task completion, since the heading angle is not defined and the airship may align with the wind to assure robustness and task accomplishment. This free motion of the airship in a horizontal circle at the final configuration is associated to a zero dynamics of the blimp. The presence of the wind therefore acts as an external control input, stabilizing this zero dynamics and constraining the airship to be aligned with the wind direction.

Simulation results have been presented which correspond to fairly severe conditions, with both a constant wind component and atmospheric turbulence; such disturbances are to be expected in outdoor environments. The results show that the optimal controller is able to stabilize and control the airship with a good performance despite the level of disturbances.

The visual servo hovering will be implemented in the AURORA airship control system. Other vision based control techniques, such as automatic landing and path following, are also under development.

Acknowledgment

This work is partially sponsored by FAPESP under grant 97/13384-7, by the CNPq/CTPETRO agreement under grant 466713/00-2, by the Brazil/Portugal CNPq/ICCTI agreement under grant 910094/99-3, and by the Brazil/France CNPq/INRIA agreement under grant 480429/01-4.

References

[1] A. Elfes, S. S. Bueno, M. Bergerman, and J. G. Ramos. A semi-autonomous robotic airship for environment monitoring missions. In IEEE International Conference on Robotics and Automation, Leuven, Belgium, May 1998.
[2] B. Espiau, F. Chaumette, and P. Rives. A new approach to visual servoing in robotics. IEEE Transactions on Robotics and Automation, 8:313-326, 1992.
[3] S. B. V. Gomes and J. G. Ramos. Airship dynamic modeling for autonomous operation. In IEEE International Conference on Robotics and Automation, Leuven, Belgium, May 1998.
[4] J. R. Azinheira, E. C. de Paiva, J. G. Ramos, and S. S. Bueno. Extended dynamic model for AURORA robotic airship. In 14th AIAA Lighter-Than-Air Technical Committee Convention and Exhibition, Akron, USA, July 2001.
[5] Y. Ma, J. Kosecka, and S. S. Sastry. Vision guided navigation for a nonholonomic mobile robot. IEEE Transactions on Robotics and Automation, 15(3):521-537, June 1999.
[6] R. Pissard-Gibollet and P. Rives. Applying visual servoing techniques to control a mobile hand-eye system. In IEEE Int. Conference on Robotics and Automation, Nagoya, Japan, May 1995.
[7] P. Rives and J. J. Borrelly. Visual servoing techniques applied to an underwater vehicle. In IEEE Int. Conf. on Robotics and Automation, Albuquerque, USA, April 1997.
[8] P. Rives and H. Michel. Visual servoing based on ellipse features. In SPIE Conference, Boston, MA, USA, September 1993.
[9] C. Samson, B. Espiau, and M. le Borges. Robot Control: the Task Function Approach. Oxford University Press, USA, 1990.
[10] Y. Shirai and H. Inoue. Guiding a robot by visual feedback in assembling tasks. Pattern Recognition, 5:99-108, 1973.
[11] F. R. Schell and E. D. Dickmanns. Autonomous landing of airplanes by dynamic machine vision. Machine Vision and Applications, 7:127-134, 1994.
[12] C. S. Sharp, O. Shakernia, and S. S. Sastry. A vision system for landing an unmanned aerial vehicle. In IEEE International Conference on Robotics and Automation, Seoul, Korea, 2001.
[13] O. Amidi. An Autonomous Vision-Guided Helicopter. Doctoral dissertation, Robotics Institute, Carnegie Mellon University, 1996.
[14] S. van der Zwaan, A. Bernardino, and J. Santos-Victor. Vision based station keeping and docking for an aerial blimp. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, 2000.
[15] H. Zhang and J. P. Ostrowski. Visual servoing with dynamics: control of an unmanned blimp. In Proceedings of the IEEE International Conference on Robotics and Automation, pages 618-623, Michigan, USA, May 1999.
