Proceedings of the 7th International Symposium on Machinery and Mechatronics for Agriculture and Biosystems Engineering (ISMAB) 21-23 May 2014, Yilan, Taiwan

BUILDING A ROBOT FOR THE 2013 FIELD ROBOT COMPETITION

Ping-Tsung Hsu, Dennis Chen, Yan-Fu Kuo*, Ta-Te Lin

Department of Bio-Industrial Mechatronics Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Road, Taipei 106, Taiwan

* Corresponding Author -- Email: [email protected]

Abstract: This article presents the design and development of a wheeled robot for the 2013 Field Robot Competition (FRC). The FRC, hosted by the Taiwan Institute of Biological Mechatronics, is an outdoor competition. Compared with indoor competitions, the environmental conditions of the FRC vary greatly and can be harsh on robots; the robot and its control therefore had to be developed for robustness. The main tasks of the competition were (1) autonomous navigation, (2) precision irrigation, and (3) traffic light recognition in an open field. The developed robot comprised a vision module, an inertial measurement unit, ultrasonic sensors, and infrared distance sensors. Two DC motors and a pump were fitted for driving and watering. A microcontroller was programmed to provide the robot with the functions required to fulfill the tasks. A frame was also designed and manufactured to mount the components. The robot completed all the tasks posed in the competition and successfully took first prize.

Key Words: Field robot competition, Robot

INTRODUCTION

Robotic competitions have emerged as an efficient way to foster interest in learning beyond school (Almeida et al., 2000; Zechel et al., 2011). These competitions give students chances to encounter less-than-ideal operating conditions and to develop skills for solving real-world problems. The 2013 Field Robot Competition (FRC), hosted by the Taiwan Institute of Biological Mechatronics, is the only outdoor competition of its kind held in eastern Asia. It aims to encourage students to integrate and implement engineering skills learned in school (The 6th Field Robot Competition: Competition Rules, 2013). The 2013 FRC presented three tasks: navigation along the field trail, precision irrigation, and traffic light recognition. The tasks had to be fulfilled autonomously, with no interference from the contestants. Figure 1 shows the 2013 FRC field. In the competition, the robot had to move from section [1] through section [9], irrigate round pots of the assigned color, and pass into section [10] only at the correct traffic signal. There were eight checkpoints throughout the field, located at the boundaries between sections, where the contestants could choose to reset the robot once it had passed through. Previous research has addressed techniques and approaches for robot design and control in competitions. Noguchi et al. (2001) and Inoue et al. (2009) demonstrated autonomous navigation using an inertial measurement unit (IMU). Xue et al. (2012) demonstrated path finding using machine vision. Anderson and Cervo (2013, pp. 129-142) showed that accurate robot speed control could be achieved through PID control. This work presents the strategies of the first-prize winner of the 2013 FRC.

Figure 1. Field map of the 2013 FRC.

MATERIALS AND METHODS

A wheeled robot was designed for the competition. Figure 2 shows the design of the robot, and Fig. 3 shows the actual construction. The robot consists of four major systems: body frame, control, sensing, and actuating. The robot was powered by a lithium-polymer battery (11.1 V, 2,200 mAh) and a universal serial bus (USB) power bank (5 V, 5,000 mAh). The details of the systems are described below.

Figure 2. SolidWorks drawing of the robot.

Figure 3. The completed field robot.


BODY FRAME

The body frame was designed to provide space for mounting sensors and actuators while remaining small enough to navigate within the boundaries of the field path. As the path width was 60 cm, the robot mainframe was designed to be 40 by 40 cm. Hollow aluminum extrusion bars (20 by 20 mm cross section) were used as the skeleton, and two acrylic plates (5 mm thick) formed the top and bottom of the body.

The bottom acrylic plate housed the drive module. Two drive wheels were placed at the front, and two swivel wheels at the back. The drive wheels were connected directly to the shafts of the DC motors, which were actuated by H-bridge motor driver boards and Arduino Uno controllers mounted nearby. A distance measurement sensor was mounted on the bottom frame, facing forward, for obstacle detection.

The top acrylic plate hosted the main controller, the vision module, and the human interface controls. The vision module was placed at the front of the robot on a pan/tilt head built from two RC servo motors; the pan/tilt design gave the robot a wide field of view. The human interface controls were placed near the back of the vehicle, which made changing settings easier, especially while the robot was moving. They consisted of a power switch, state selection switches, and buttons for special functions. An LCD screen next to the controls displayed the current status, settings, and other information (e.g., speed and heading). The top acrylic plate was cut out in the middle to house the irrigation module. A hollow aluminum extrusion bar was mounted vertically on the right-hand side of the robot. The bar supported the nozzle of the irrigation system and also hosted the inertial measurement unit (IMU) module, which was placed on the bar to reduce interference from the motors.

CONTROL

Robot behaviors were controlled by a finite-state machine (Wagner, Schmuki, Wagner, & Wolstenholme, 2006). The machine consisted of six states, each representing the action the robot performed in one part of the field. Referring to Fig. 1, the first state directed the robot through sections [2] and [3]; the second controlled irrigation; the third and fourth covered sections [4] through [5] and [6] through [7], respectively; the fifth navigated section [8]; and the final state handled the traffic light. If the robot needed to be reset in the middle of the field, the appropriate state of the program could be selected manually.

The robot was controlled by an Arduino Mega microcontroller, based on the ATmega2560 (256 KB flash memory, 16 MHz). It was chosen because the board provides sufficient input/output pins and universal asynchronous receiver/transmitters (UARTs) to receive signals from the sensors and to command the actuators, and because it offers a user-friendly program development environment. A USB power bank supplied stable power directly to the main controller.
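As an illustration, such a state machine can be realized as a switch-based Arduino loop. The sketch below is a minimal, hypothetical outline; the state names and helper functions are placeholders, not the actual competition firmware.

// Minimal sketch of the six-state machine (hypothetical names).
// Each state is one action in one part of the field, and the active
// state advances at section boundaries.
enum RobotState {
  NAV_2_3,        // sections [2]-[3]
  IRRIGATE,       // precision irrigation
  NAV_4_5,        // sections [4]-[5]
  NAV_6_7,        // sections [6]-[7]
  NAV_8,          // section [8]
  TRAFFIC_LIGHT   // wait for the correct signal, then finish
};

RobotState state;

// Placeholder stubs standing in for the real behaviors and sensors.
RobotState readStateSwitches() { return NAV_2_3; }
void followTrail()        {}
void runIrrigation()      {}
bool sectionDone()        { return false; }
bool lightIsCorrect()     { return false; }
void enterFinalSection()  {}

void setup() {
  // After a checkpoint reset, the operator picks the resume state
  // with the state selection switches on the interface panel.
  state = readStateSwitches();
}

void loop() {
  switch (state) {
    case NAV_2_3:       followTrail();   if (sectionDone()) state = IRRIGATE;      break;
    case IRRIGATE:      runIrrigation(); if (sectionDone()) state = NAV_4_5;       break;
    case NAV_4_5:       followTrail();   if (sectionDone()) state = NAV_6_7;       break;
    case NAV_6_7:       followTrail();   if (sectionDone()) state = NAV_8;         break;
    case NAV_8:         followTrail();   if (sectionDone()) state = TRAFFIC_LIGHT; break;
    case TRAFFIC_LIGHT: if (lightIsCorrect()) enterFinalSection();                 break;
  }
}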


SENSORS

Machine Vision Unit

A CMUcam4 (Carnegie Mellon University) served as the machine vision module, used for path finding and traffic light recognition through its built-in image processing algorithms. In color tracking, the camera captured an image, computed the centroid of the pixels matching a predetermined target color, and reported the centroid coordinates to the main controller. The main controller then checked whether the target was centered in the image along the x-axis; if not, it drove the two wheels with PID control to correct the heading.
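As an illustration, this heading correction can be sketched as a steering loop around the tracked centroid's x-coordinate. The sketch below is hedged: readTargetX() stands in for the CMUcam4 tracking query, the image width, gain, and wheel-command scale are assumptions, and only the proportional term of the PID controller is shown for brevity.

// Hedged sketch of vision-based heading correction (assumed values).
const int IMAGE_CENTER_X = 80;   // assumes a 160-pixel-wide tracking image
const float KP = 0.8;            // proportional steering gain (assumed)
const int BASE_SPEED = 120;      // nominal wheel command (assumed units)

// Stubs: replace with the CMUcam4 centroid query and the commands
// sent to the local Arduino Uno motor controllers.
int readTargetX() { return IMAGE_CENTER_X; }
void setWheelSpeeds(int left, int right) { (void)left; (void)right; }

void correctHeading() {
  // Positive error: target lies to the right of the image center.
  int error = readTargetX() - IMAGE_CENTER_X;
  int steer = (int)(KP * error);
  // Differential drive: speed one wheel up and slow the other down,
  // turning the robot back toward the tracked color.
  setWheelSpeeds(BASE_SPEED + steer, BASE_SPEED - steer);
}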

For traffic light recognition, the robot first moved to a position in front of the traffic light, and a region of interest (ROI) containing only the color signal was defined. Figure 4 shows the original field of view (outlined in black) and the ROI (outlined in red). Only color information within the ROI was processed, which reduced the chance of the robot being misled by stray colors in the background.

Figure 4. Field of view (black) and region of interest (red).

Distance Measurement Sensor

Distance measurement sensors were needed to keep the robot from deviating off the paths. Ultrasonic distance sensors (SRF-05) were chosen because they are insensitive to ambient light. Their maximum measuring distance of about 5 m and resolution of 1 cm were sufficient for this competition.

Inertial Measurement Unit

An IMU module (HMC5883L, Parallax) measured the robot's heading direction. The module measures magnetic field intensity along the x, y, and z axes of the chip. Calibration was performed as an initialization step at startup to ensure uniform sensitivity along the three axes. A heading relative to an arbitrary reference was then calculated from the readings and fed to the controller, enabling the robot to hold straight-line motion and execute 90-degree turns.
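For a level robot, such a heading can be computed from the two horizontal magnetometer axes with atan2. The sketch below is a minimal illustration, assuming the offsets and scale factors come from the startup calibration routine (the values shown are placeholders), with no tilt compensation:

#include <math.h>

// Calibration offsets and scale factors found at startup
// (placeholder values; the real ones come from the calibration step).
float offsetX = 0.0, offsetY = 0.0;
float scaleX  = 1.0, scaleY  = 1.0;

// Heading from the horizontal magnetometer components, assuming the
// robot is level. The zero reference is arbitrary, as in the paper.
float headingDegrees(float rawX, float rawY) {
  float x = (rawX - offsetX) * scaleX;
  float y = (rawY - offsetY) * scaleY;
  float heading = atan2(y, x) * 180.0 / M_PI;  // -180..180 degrees
  if (heading < 0) heading += 360.0;           // wrap to 0..360
  return heading;
}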


ACTUATORS

Two DC servo motors (IG-32RGM, Shayang Ye) powered the drive wheels. The rated output of each motor was 7 W, and the built-in encoder had a resolution of 378 counts per revolution at 2x decoding. Closed-loop control was implemented to achieve precise velocity over the various terrains in the field. The motors were modeled as first-order systems, and their dynamic models were constructed using system identification techniques (Ljung, 1999). A proportional-integral-derivative (PID) algorithm was then applied to control motor speed. The PID gains were determined from predetermined criteria on settling time and percentage overshoot; the calculation was performed in MATLAB (MathWorks). An Arduino Uno coupled with an H-bridge motor driver board implemented the control algorithm for each motor. These Arduino Uno boards served as local controllers and accepted target velocity commands from the main controller.

The irrigation module consisted of a water tank, a submersible pump, and a nozzle. The water tank was made from a 2-liter water bottle. A submersible pump (2-50W, Javtop) provided the pressure needed to irrigate the plants. Water traveled through a hose and sprayed from a nozzle mounted on the vertical hollow extrusion bar. Small drainage holes were cut in the bottom deck, and all electrical components were raised on plastic standoffs to minimize damage in the event of a leak.
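As an illustration, each local speed loop can be sketched as a discrete PID over encoder counts, with the motor approximated as a first-order system G(s) = K/(τs + 1) for gain tuning. Everything below (pin assignments, gains, sample period) is an assumption for illustration, not the competition firmware; in the real robot the target velocity arrives from the main controller rather than being hard-coded.

// Hedged sketch of one local wheel-speed controller (Arduino Uno +
// H-bridge). Pins, gains, and sample period are illustrative.
const byte ENC_A = 2;               // encoder channel A (interrupt-capable)
const byte ENC_B = 4;               // encoder channel B
const byte PWM_PIN = 9;             // H-bridge PWM (speed) input
const byte DIR_PIN = 8;             // H-bridge direction input

const float TICKS_PER_REV = 378.0;  // encoder counts per rev at 2x decoding
const unsigned long PERIOD_MS = 20; // control period (assumed)
const float DT = PERIOD_MS / 1000.0;
const float KP = 1.2, KI = 4.0, KD = 0.01;  // placeholder gains

volatile long ticks = 0;
float targetRpm = 200.0;            // would come from the main controller
float integral = 0.0, prevError = 0.0;

// 2x decoding: count both edges of channel A, using channel B for sign.
void onEncoderEdge() {
  ticks += (digitalRead(ENC_A) == digitalRead(ENC_B)) ? -1 : 1;
}

void setup() {
  pinMode(ENC_A, INPUT_PULLUP);
  pinMode(ENC_B, INPUT_PULLUP);
  pinMode(PWM_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(ENC_A), onEncoderEdge, CHANGE);
}

void loop() {
  static unsigned long last = 0;
  if (millis() - last < PERIOD_MS) return;   // run at a fixed period
  last = millis();

  noInterrupts();
  long n = ticks;                   // counts accumulated this period
  ticks = 0;
  interrupts();

  float rpm = (n / TICKS_PER_REV) / DT * 60.0;
  float error = targetRpm - rpm;
  integral += error * DT;
  float derivative = (error - prevError) / DT;
  prevError = error;

  float u = KP * error + KI * integral + KD * derivative;
  digitalWrite(DIR_PIN, u >= 0 ? HIGH : LOW);
  analogWrite(PWM_PIN, constrain((int)fabs(u), 0, 255));
}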

RESULTS & DISCUSSION

The performance of the motor speed control was evaluated. Figure 5 shows the velocity profile of the motor without load: the motor took 120 ms to reach its maximum velocity of approximately 200 rpm, with no overshoot and a steady-state error band of ±5 rpm.

Figure 5. Velocity profile of the motor with a 200-rpm step input.

The field performance of the robot was also evaluated. In actual runs, the robot attained a maximum speed of 1.26 m/s; with 12.5-cm drive wheels, the 200-rpm no-load motor speed corresponds to π × 0.125 m × (200/60) s⁻¹ ≈ 1.31 m/s, so the measured speed is consistent with the motors running near their maximum under load. Over a 6-m straight-line run, the robot deviated from the planned line by only approximately 5 cm. The robot could navigate 90-degree turns accurately, and could track any given colored object with its vision module. In the 2013 FRC, the robot finished the first run in 3 min 20 s. In the second run, the robot fell off the course in the last section and was unable to finish. Nevertheless, with its stable, high-level performance, the robot was awarded the champion trophy of the competition.

CONCLUSIONS

A wheeled robot was designed and built for the 2013 FRC. The robot consisted of four major systems: body frame, control, sensing, and actuating. Automatic control algorithms were applied to achieve high precision in path navigation. A machine vision module was implemented for path finding and traffic signal recognition. The robot successfully took first place in the competition.

REFERENCES

Almeida, L., Azevedo, J., Cardeira, C., and Costa, P. 2000. Mobile robot competitions: Fostering advances in research, development and education in robotics. In CONTROLO'2000, 592-597. Guimarães, Portugal: The 4th Portuguese Conference on Automatic Control.

Anderson, R., and Cervo, D. 2013. Pro Arduino. USA: Apress.

Inoue, K., Nii, K., Zhang, Y., and Atanasov, A. 2009. Tractor guidance system for field work. In Energy Efficiency & Agricultural Engineering, 280-295. Rousse, Bulgaria: Proceedings of the International Scientific Conference.

Ljung, L. 1999. System Identification: Theory for the User. USA: Prentice Hall.

Noguchi, N., Reid, J. F., Zhang, Q., Will, J. D., and Ishii, K. 2001. Development of robot tractor based on RTK-GPS and gyroscope. ASAE Paper 01-1195. California, USA: ASAE.

The 6th Field Robot Competition. 2013. Competition Rules. Taipei, Taiwan: Taiwan Institute of Biological Mechatronics. Available at: http://fieldrobotrace2013.tibm.org.tw/?page_id=19. Accessed 1 March 2014.

Wagner, F., Schmuki, R., Wagner, T., and Wolstenholme, P. 2006. Modeling Software with Finite State Machines: A Practical Approach. Florida, USA: Auerbach Publications.

Xue, J., Zhang, L., and Grift, T. E. 2012. Variable field-of-view machine vision based row guidance of an agricultural robot. Computers and Electronics in Agriculture 84: 85-91.

Zechel, G., Stier, J., and Beitelschmidt, M. 2011. A robot competition to encourage first-year students in mechatronic sciences. Communications in Computer and Information Science 161: 288-299.
