The Modular Design and Production of an Intelligent Robot Based on a Closed-Loop Control Strategy

Published: October 14, 2017
doi: 10.3791/56422

Summary

We present a protocol for the modular design and production of intelligent robots, intended to help scientific and technical workers design intelligent robots for specific production tasks according to their individual needs.

Abstract

Intelligent robots are part of a new generation of robots that are able to sense the surrounding environment, plan their own actions and eventually reach their targets. In recent years, reliance upon robots in both daily life and industry has increased. The protocol proposed in this paper describes the design and production of a handling robot with an intelligent search algorithm and an autonomous identification function.

First, the various working modules are mechanically assembled to complete the construction of the work platform and the installation of the robotic manipulator. Then, we design a closed-loop control system and a four-quadrant motor control strategy and, with the aid of debugging software, set the steering gear identity (ID), baud rate and other working parameters to ensure that the robot achieves the desired dynamic performance and low energy consumption. Next, we debug the sensors to achieve multi-sensor fusion and accurately acquire environmental information. Finally, we implement the relevant algorithms and verify that the robot successfully performs its intended function in a given application.

The advantages of this approach are its reliability and flexibility: users can develop a variety of hardware configurations and use the comprehensive debugger to implement an intelligent control strategy. This allows users to meet personalized requirements with high efficiency and robustness.

Introduction

Robots are complex, intelligent machines that combine knowledge from several disciplines, including mechanics, electronics, control, computer science, sensors and artificial intelligence1,2. Increasingly, robots are assisting or even replacing humans in the workplace, especially in industrial production, due to the advantages robots possess in performing repetitive or dangerous tasks. The design of the intelligent robot protocol in the current study is based on a closed-loop control strategy, specifically path planning based on a genetic algorithm. Furthermore, the functional modules have been strictly divided3,4, which may provide a solid foundation for future optimization work and gives the robot a strong capacity for upgrades.

The modular implementation of the robotic platform is based primarily on the following methods: a multi-dimensional combined control strategy in the motor control module5,6, and intelligent exploration based on a genetic algorithm in the optimization algorithm module.

We use double closed-loop control of the DC motor and four-quadrant motor operation in the motor control module. In double closed-loop speed control, the output of the speed regulator serves as the input of the current regulator, allowing it to control the current and torque of the motor. The advantage of this system is that the torque of the motor can be controlled in real time based on the difference between the given speed and the actual speed. When this difference is relatively large, the motor torque increases and the speed changes faster, driving the motor speed toward the given value as quickly as possible and yielding rapid speed regulation7,8,9. Conversely, when the speed is relatively close to the given value, the system automatically reduces the motor torque to avoid overshooting, allowing the speed to settle at the given value relatively quickly and with no steady-state error6,10. Since the equivalent time constant of the current loop is relatively small, the four-quadrant motor11,12 can respond quickly to suppress disturbances when the system is subject to external interference, which improves the stability and anti-jamming ability of the system. A minimal code sketch of such a cascaded controller is given below.
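To make the cascaded structure concrete, the following sketch simulates an outer speed loop whose output is the current reference for an inner current loop. All gains, limits and the first-order motor model are illustrative assumptions made for this sketch; they are not the parameters of the robot or of its debugging software.

```python
# Minimal sketch of double closed-loop (cascaded) PI speed/current control.
# Gains, limits and the simplified motor model are illustrative assumptions.

class PI:
    def __init__(self, kp, ki, out_min, out_max):
        self.kp, self.ki = kp, ki
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        out = self.kp * error + self.ki * self.integral
        # Clamp the output (and roll back the integral) to emulate regulator saturation.
        if out > self.out_max:
            out, self.integral = self.out_max, self.integral - error * dt
        elif out < self.out_min:
            out, self.integral = self.out_min, self.integral - error * dt
        return out

def simulate(speed_ref=100.0, t_end=2.0, dt=1e-3):
    """Outer loop: speed PI -> current reference. Inner loop: current PI -> armature voltage."""
    speed_pi = PI(kp=0.8, ki=2.0, out_min=-5.0, out_max=5.0)      # output: current reference [A]
    current_pi = PI(kp=6.0, ki=40.0, out_min=-14.8, out_max=14.8)  # output: voltage [V]

    # Simplified DC motor: L di/dt = u - R i - ke*w,  J dw/dt = kt*i - load
    R, L, ke, kt, J, load = 1.0, 5e-3, 0.05, 0.05, 1e-3, 0.0
    i, w = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        i_ref = speed_pi.update(speed_ref - w, dt)   # speed loop sets the torque (current) demand
        u = current_pi.update(i_ref - i, dt)         # current loop tracks that demand
        i += (u - R * i - ke * w) / L * dt
        w += (kt * i - load) / J * dt
    return w

if __name__ == "__main__":
    print("final speed:", simulate())
```

Saturating the speed regulator's output directly limits the current demand, which is what gives the double closed-loop scheme its built-in torque limiting.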

We choose a genetic intelligent optimization algorithm, which showed the highest efficiency in a simulation run in MATLAB. A genetic algorithm is a stochastic parallel search algorithm based on the theory of natural selection in genetics. It is an efficient method for finding the global optimal solution in the absence of any initial information: it regards the solution set of the problem as a population and improves the quality of the solutions through repeated selection, crossover, mutation and other genetic operations. Path planning for intelligent robots is difficult because of insufficient initial information, complicated environments and nonlinearity. Genetic algorithms are well suited to this problem because they offer global optimization ability, strong adaptability and robustness for nonlinear problems, place no specific restrictions on the problem, involve a simple calculation process, and impose no special requirements on the search space13,14. A simplified code sketch of such a planner follows.
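The sketch below evolves a sequence of grid moves from a start cell to a goal cell while avoiding obstacle cells. The grid, the fitness function and the genetic-algorithm parameters are illustrative assumptions; this is not the MATLAB simulation referred to above.

```python
# Minimal sketch of a genetic algorithm for grid path planning.
# Grid layout, fitness and GA parameters are illustrative assumptions.
import random

GRID = 10
OBSTACLES = {(3, 3), (3, 4), (3, 5), (6, 2), (6, 3), (7, 7)}
START, GOAL = (0, 0), (9, 9)
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]
GENOME_LEN, POP, GENS = 30, 60, 200

def walk(genome):
    """Follow the move sequence, skipping blocked moves; return end point and steps used."""
    x, y = START
    for k, g in enumerate(genome):
        dx, dy = MOVES[g]
        nx, ny = x + dx, y + dy
        if not (0 <= nx < GRID and 0 <= ny < GRID) or (nx, ny) in OBSTACLES:
            continue  # invalid move is skipped (penalized implicitly as a wasted step)
        x, y = nx, ny
        if (x, y) == GOAL:
            return (x, y), k + 1
    return (x, y), len(genome)

def fitness(genome):
    (x, y), steps = walk(genome)
    dist = abs(x - GOAL[0]) + abs(y - GOAL[1])  # Manhattan distance to the goal
    return -(10 * dist + steps)                 # closer and shorter is better

def evolve():
    pop = [[random.randrange(4) for _ in range(GENOME_LEN)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]                  # selection: keep the best half
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENOME_LEN)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # mutation
                child[random.randrange(GENOME_LEN)] = random.randrange(4)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", fitness(best))
```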

Protocol

1. Construction of the Machine

  1. Assemble the chassis as illustrated, securing mechanical components using appropriate fasteners. (Figure 1)
    NOTE: The chassis, which comprises the baseboard, motor, wheels, etc., is the primary component of the robot responsible for its motion. Thus, during assembly, keep the bracket straight.
  2. Tin the wire lead and both the positive and negative electrodes. Solder two wire leads onto the two ends of the motor, connecting the red lead to the positive electrode and the black lead to the negative electrode.
  3. Assemble the shaft sleeve, the motors and the wheels.
    1. Connect the motor to the shaft sleeve and secure it with a screw.
    2. Insert the shaft sleeve into the center of the wheel hub.
    3. Install the completed structure onto the chassis.
  4. Drill two holes, 3 mm in diameter, in the center of the chassis to allow for installation of the motor driving module. Connect the motor to the motor driving module.
  5. Drill one hole 1 cm from each of the left and right edges of the chassis for installation of the bracket for the infrared sensors on the underside.
  6. Install two fasteners onto the center of the two sides of the chassis.
    NOTE: To ensure normal operation of the infrared sensors, ensure that the connecting piece is perpendicular to the chassis.
  7. Drill a hole, 18 mm in diameter, through each of the two structural components for the installation of sensors. (Figure 2A)
  8. Install the motor driving module on the underside of the chassis. (Figure 2B) Install one infrared sensor facing in each of the four directions of the chassis. (Figure 2C)
  9. Install the steering gears symmetrically. Because of the large torque generated during steering gear operation, ensure that the bolts are tightened so that the joint is firm and secure.
  10. Install four infrared sensors on the center of the machine.
  11. Place the 14.8 V power supply in the center of the machine, and affix the Microcontroller Unit (MCU) to the battery pack.
  12. Affix four range sensors to the upper part of the machine. Adjust the angle between each sensor and the ground to 60° to guarantee detection accuracy relative to the working table.
  13. Install the dual-axis tilt sensor, which is used to detect cases when the machine fails to reach its target in the working area.
  14. Use a screwdriver to attach the robot arm to the front of the machine. (Figure 3)

2. Debugging the Steering Engine and Driver Module

  1. Double-click to open the debugging software (e.g., Robot Servo Terminal 2010). Connect the computer to the debug board with a Universal Serial Bus (USB) converter cable. (Figure 4)
  2. In the working interface, set the steering engine's baud rate to 9600 bits/s, the speed limit to 521 rad/min, the angular limit to 300° and the voltage limit to 9.6 V.
  3. Set the working mode of the robot steering gear to "steering engine mode."
  4. Use asynchronous half-duplex communication for the connection between the controller and the steering engines. This way, the controller can address up to 255 steering engines from a single Universal Asynchronous Receiver/Transmitter (UART) interface. (Figure 5)
    Caution: Connect at most 6 steering engines to a single wire. Too many steering engines will cause overheating and a large voltage drop, resulting in abnormal behavior such as resetting and faulty data communication. (Figure 6)
  5. Apply asynchronous half-duplex communication as the connection between the controller and the motor driving module. (Figure 7)
  6. Set the ID number of the two driving modules and the four steering engines. ID3 and ID4 are left blank for future updating purposes. (Figure 8)
    NOTE: ID1: leftward driving module; ID2: rightward driving module; ID5: left-front steering engine; ID6: right-front steering engine; ID7: left-rear steering engine; ID8: right-rear steering engine.
  7. Cascade the steering engines one by one and connect the cascade to the controller.
  8. Connect the sensors to their respective controller interfaces. Note that the sensor wire whose connector bears a triangular mark is ground (GND).
    NOTE: AD1: front infrared photoelectric sensor on underside; AD2: right infrared photoelectric sensor on underside; AD3: rear infrared photoelectric sensor on underside; AD4: left infrared photoelectric sensor on underside; AD5: front infrared distance measuring sensor; AD6: right infrared distance measuring sensor; AD7: rear infrared distance measuring sensor; AD8: left infrared distance measuring sensor; AD9: left-front anti-fall infrared photoelectric sensor; AD10: right-front anti-fall infrared photoelectric sensor; AD11: right-rear anti-fall infrared photoelectric sensor; AD12: left-rear anti-fall infrared photoelectric sensor.
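For reference when writing control code, the ID and channel assignments above can be kept as plain data. The constant names in the sketch below are assumptions made for illustration only; they are not part of the steering gear vendor's software.

```python
# Illustrative record of the bus IDs (step 2.6) and AD channels (step 2.8).
# The constant names are assumptions for this sketch only.

SERVO_IDS = {
    1: "leftward driving module",
    2: "rightward driving module",
    # IDs 3 and 4 are intentionally left unassigned for future updates.
    5: "left-front steering engine",
    6: "right-front steering engine",
    7: "left-rear steering engine",
    8: "right-rear steering engine",
}

AD_CHANNELS = {
    1: "front infrared photoelectric sensor (underside)",
    2: "right infrared photoelectric sensor (underside)",
    3: "rear infrared photoelectric sensor (underside)",
    4: "left infrared photoelectric sensor (underside)",
    5: "front infrared distance measuring sensor",
    6: "right infrared distance measuring sensor",
    7: "rear infrared distance measuring sensor",
    8: "left infrared distance measuring sensor",
    9: "left-front anti-fall infrared photoelectric sensor",
    10: "right-front anti-fall infrared photoelectric sensor",
    11: "right-rear anti-fall infrared photoelectric sensor",
    12: "left-rear anti-fall infrared photoelectric sensor",
}
```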

3. Debugging the Sensors

  1. Rotate the regulating knob on the tail of the infrared sensors to adjust the detection range of the sensors. When the robot is positioned in the center of the working table, the logic level of the top four infrared sensors is 1. When the machine moves to the edge of the working table, the logic level of the infrared sensor on the corresponding side will be 0. (Figure 9A)
    NOTE: The robot can determine its location on the working table by analyzing the logic levels of the infrared sensors. For example, if the logic levels of the left and front sensors are 0, the robot must be in the upper-left region of the working table (see the sketch at the end of this section).
  2. Compare the measured values of the distance sensor to their baseline values for calibration. (Figure 9B)
    NOTE: The distance sensor is an analog sensor. As the distance varies, the strength of the sensor's feedback signal and the corresponding measured values also vary. The measured values are relayed to the host machine in digital form so that the robot can identify changes in its surroundings.
  3. Debug the tilt-angle sensor.
    1. Position the tilt-angle sensor horizontally and record its measured values.
    2. Incline the sensor in two different directions and record its measured values. If the measured values are within the error range, the sensor can be regarded as operating normally.
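The region-detection logic described in the note to step 3.1 can be written very compactly. In the sketch below, the levels dictionary stands in for however the controller exposes the four top AD channels; the function name and data layout are illustrative assumptions.

```python
# Sketch of locating the robot on the working table from the logic levels of the
# four top infrared sensors (step 3.1). The levels dict is a stand-in for the
# controller's actual sensor interface.

def locate_on_table(levels):
    """levels maps 'front'/'rear'/'left'/'right' to 1 (table detected) or 0 (edge reached)."""
    at_edge = [side for side, level in levels.items() if level == 0]
    if not at_edge:
        return "center region"
    # Example from the note: front == 0 and left == 0 -> upper-left region of the table.
    return "near the " + " and ".join(sorted(at_edge)) + " edge(s)"

if __name__ == "__main__":
    print(locate_on_table({"front": 0, "rear": 1, "left": 0, "right": 1}))
    # -> near the front and left edge(s), i.e. the upper-left region
```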

4. Control Scheme

  1. Construct the simulation model of the DC motor, based on the DC motor voltage balance equation, flux linkage equation and torque balance equation.
    1. Establish the voltage balance equation, given by
      $u_d = R_d i_d + \dfrac{d\psi_d}{dt} - \omega\psi_q$, $u_q = R_q i_q + \dfrac{d\psi_q}{dt} + \omega\psi_d$
      where $u_d$ is the direct-axis voltage and $u_q$ is the quadrature-axis voltage; $R_d$ and $R_q$ denote the direct-axis and quadrature-axis resistance, respectively; $i_d$, $i_q$, $\psi_d$ and $\psi_q$ represent the direct-axis current, quadrature-axis current, direct-axis flux linkage and quadrature-axis flux linkage; and $\omega$ is the angular velocity of the motor.
    2. Establish the flux linkage equation, given by
      $\psi_d = L_d i_d + M_{dq} i_q$, $\psi_q = L_q i_q + M_{qd} i_d$
      where $L_d$ and $L_q$ denote the direct-axis and quadrature-axis self-inductance coefficients, respectively, and $M_{dq}$ and $M_{qd}$ are the mutual-inductance coefficients. $T_e$ and $T_L$ (used in the next step) represent the electromagnetic torque and the load torque.
    3. Establish the torque balance equation, calculated as $T_e - T_L = J\dfrac{d\omega}{dt}$, where $J$ is the moment of inertia of the motor and load.
    4. Build simulation model of the DC motor. (Figure 10)
  2. Apply double closed-loop control of the DC motor. Utilize the output of the speed regulator as the input to the current regulator to regulate the motor's torque and current.
    NOTE: The structure of the current regulation system is diagrammed in Figure 11.
    The transfer function of the PI current regulator is $W_{ACR}(s) = K_i\dfrac{\tau_i s + 1}{\tau_i s}$, where $K_i$ is the proportional coefficient of the current regulator and $\tau_i$ is its lead time constant. These can be obtained from the proportional coefficient $K_p$ and the integral coefficient $K_I$ of the regulator as $K_i = K_p$ and $\tau_i = K_p / K_I$.
    1. Build the simulation model of the double closed-loop control of the DC motor. (Figure 12)
  3. Apply four-quadrant motion control of the DC motor. (Figure 13)
    1. Utilize an H-bridge driving circuit to achieve four-quadrant motion of the DC motor by modulating the on-off of Metal-Oxide-Semiconductor Field-Effect Transistor (MOSFET). (Figure 14)
      NOTE: When Q1 and Q4 are turned on, the motor is in the forward electric state and the first-quadrant running state. When Q3 and Q4 are turned on, the motor is in the energy braking state and the second-quadrant running state. When Q2 and Q3 are turned on, the motor is in the reverse electric state and the third-quadrant running state. When Q1 and Q2 are turned on, the motor is in the reverse energy braking state and the fourth-quadrant running state.
  4. Apply pulse width modulation (PWM) to regulate the speed of the DC motor. With the DC supply voltage essentially unchanged, modulate the width (duty cycle) of the voltage pulses applied to the motor armature by switching the electronic switches on and off, thereby adjusting the average armature voltage and hence the rotation speed of the motor. A sketch of the switch-state mapping and the duty-cycle relation is given below.
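The following sketch records the switch-state-to-quadrant mapping from the note in step 4.3.1 and the average-voltage relation behind PWM speed regulation. The function and constant names are illustrative assumptions, not a vendor API; the 14.8 V figure in the usage example is simply the battery pack voltage from step 1.11.

```python
# Sketch of the four-quadrant H-bridge switch states (step 4.3) and the PWM
# averaging relation (step 4.4). Names are illustrative, not a vendor API.

# (switches turned on) -> operating state, per the note in step 4.3.1
QUADRANTS = {
    frozenset({"Q1", "Q4"}): "I: forward electric (motoring)",
    frozenset({"Q3", "Q4"}): "II: energy braking",
    frozenset({"Q2", "Q3"}): "III: reverse electric (motoring)",
    frozenset({"Q1", "Q2"}): "IV: reverse energy braking",
}

def average_armature_voltage(supply_v: float, duty_cycle: float) -> float:
    """Average voltage seen by the armature for a given PWM duty cycle (0..1)."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return supply_v * duty_cycle

if __name__ == "__main__":
    print(QUADRANTS[frozenset({"Q1", "Q4"})])    # -> I: forward electric (motoring)
    print(average_armature_voltage(14.8, 0.6))   # -> about 8.88 V with the 14.8 V pack
```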

5. Write the Program

  1. Use the USB download cable to load the binary (BIN) file generated by KEIL5 into the controller.
  2. Select the program to be executed.

6. Application Scenario

  1. Apply color recognition to categorize cargo in a factory. (Figure 15)
    1. Use an optical camera to collect images and verify the detected color from the values returned in the image's two-dimensional array (see the sketch at the end of this section).
    2. Lift the object with the mechanical arms.
    3. Issue a command to transport the object to the designated location using the camera and driving motor of the robot.
  2. Search quickly to clear the designated areas. (Figure 16)
    1. Use the four optical sensors on the robot to detect the locations of surrounding obstacles.
    2. Command the steering engine to lift the mechanical shovel and clear obstacles in the designated areas.
    3. Use the genetic algorithm to determine the most effective search path.
  3. Use self-recognition to prevent the robot from falling off the workbench, so that workers remain separated from the machine's working area and worker safety is ensured.
    1. Monitor the signals from the four upper optical sensors, which change with the difference in height between the workbench and the ground.
    2. Analyze the changing signals to determine the location of the edges of the workbench.
    3. Command the machine to avoid the edges of the workbench.
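As one way to realize the color recognition in step 6.1, the sketch below classifies the dominant cargo color in a camera frame using HSV thresholds with OpenCV. The article does not specify the image-processing library or the threshold values, so both, along with the function names, are illustrative assumptions.

```python
# Sketch of camera-based color classification for step 6.1, assuming an OpenCV
# pipeline with HSV thresholds. Library choice and thresholds are assumptions.
import cv2
import numpy as np

# Approximate HSV ranges (OpenCV hue scale 0-179) for two example cargo colors.
COLOR_RANGES = {
    "yellow": (np.array([20, 80, 80]), np.array([35, 255, 255])),
    "red": (np.array([0, 80, 80]), np.array([10, 255, 255])),
}

def classify_color(frame_bgr, min_fraction=0.05):
    """Return the color whose mask covers the largest share of the frame, if any."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    best, best_frac = None, min_fraction
    for name, (lo, hi) in COLOR_RANGES.items():
        mask = cv2.inRange(hsv, lo, hi)           # two-dimensional array of matching pixels
        frac = cv2.countNonZero(mask) / mask.size
        if frac > best_frac:
            best, best_frac = name, frac
    return best

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                     # e.g., the camera listed in Materials
    ok, frame = cap.read()
    if ok:
        print("detected:", classify_color(frame))
    cap.release()
```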

Representative Results

In the diagram of the double closed-loop motion control program (Figure 17), purple represents the given speed signal and yellow represents the control system output. Figure 17 clearly shows that the double closed-loop control system is significantly more effective than an open-loop system: the overshoot of the double closed-loop system's output is relatively small and the dynamic performance of the system is better.

Figure 18 shows the robot's color recognition accuracy under the influence of reflected light at different wavelengths. In practice, due to different lighting conditions, the wavelength of light reflected from the target object fluctuates within a certain range. To inspect the recognition accuracy of the machine, a test was conducted over the wavelength ranges of yellow light (565-595 nm) and red light (625-740 nm). If the value returned by the camera is 1, the color recognition is accurate. In the range of 585-593 nm, the yellow light recognition accuracy rate of the camera exceeds 90%, whereas the rate outside that range decreases rapidly. Similarly, within the 660-700 nm range, the red light recognition accuracy rate exceeds 90%, while the rate outside that range decreases rapidly. The test results demonstrate that, under appropriate illumination, the robot achieves color recognition with a small margin of error.

Figure 19 illustrates the relationship between the camera's color recognition accuracy and distance; the recognition accuracy is inversely correlated with the distance. As illustrated by the experimental results, when the distance is between 0 and 30 cm, the color recognition accuracy of the camera is greater than 80%. The results demonstrate that this program has strong practical utility.

Figure 1: Construction of the chassis.

Figure 2: Installation of the infrared sensors.

Figure 3: The effect of installation.

Figure 4: Debugging work screen.

Figure 5: Connection of robot steering gear.

Figure 6: Electrical connection principles.

Figure 7: Electrical connection principles.

Figure 8: Set ID number.

Figure 9: Two sensors.

Figure 10: Simulation model of DC motor.

Figure 11: Current regulatory system.

Figure 12: Simulation model of double closed-loop control.

Figure 13: Diagram of four-quadrant operation of the motor.

Figure 14: H-bridge circuit.

Figure 15: The workflow of color recognition.

Figure 16: The workflow of quick search.

Figure 17: Simulink diagram.

Figure 18: Color recognition accuracy under the influence of reflected light at different wavelengths.

Figure 19: Relationship between the camera's color recognition accuracy and the distance.

Discussion

In this paper, we designed a type of intelligent robot that users can build themselves. We implemented the proposed intelligent search algorithm and autonomous recognition by integrating several software programs with the hardware. In the protocol, we introduced basic approaches for configuring the hardware and debugging the intelligent robot, which may help users design a suitable mechanical structure for their own robot. However, during actual operation, it is necessary to pay attention to the stability of the structure, its operating range, its degrees of freedom and its space utilization, to ensure that these parameters meet the requirements. A reasonable mechanical structure ensures high precision, high flexibility and high robustness of the robot. To design complicated mechanical structures, the user can employ software such as Adams to construct a simulation model and apply virtual prototyping technology. This may allow them to exclude designs that do not satisfy the technical requirements or that are not mechanically feasible.

One potential issue is the inability of the robot to accurately achieve its desired functions, which may stem from two main causes. The first is that the sensors fail to meet the requirements. For example, during the first test, the cleaning robot in this study was unable to successfully push obstacles out of the working area. This was because the range of the infrared sensor on the equipment was somewhat narrow, so the robot could not reach the requisite acceleration by the time it detected an obstacle. This issue could be solved by increasing the detection range of the infrared sensor; in general, an additional round of sensor debugging may be necessary, depending on the situation or application. The second cause is that the selected motor fails to meet the performance requirements. When choosing a motor, priority must be given to one with suitable starting performance, operational stability and low noise within the budget.

To begin design and production of a new robot, the parameters for a manual configuration scheme must be defined to control the behavior of the robot, so that it may adapt to the demands of a new task. Simultaneously, all processes must follow the steps presented in the protocol. An advantage of the modular design of the robot lies in its clear division of work, which allows it to be developed via the collaboration of various engineers. Mechanical engineers design the structure of the hardware, electrical engineers design the motor control strategy, and controls engineers design the search algorithm. Thus, the work of each module can be developed independently to accomplish a specific task. We provide a basic design scheme for each module, to help users search for the optimal scheme for a particular application.

The range of potential applications will expand considerably as intelligent robot technology matures. It will prove to be an invaluable resource in fields such as ocean development, space exploration, industrial and agricultural production, social services and entertainment. This technology will gradually replace human beings in dangerous and unsanitary work environments. Intelligent robots will continue to develop toward multi-robot cooperation, greater intelligence and networked operation.

Disclosures

The authors have nothing to disclose.

Acknowledgements

The authors would like to express their gratitude to Mr. Yaojie He for his assistance in performing the experiments reported in this paper. This work was supported in part by the National Natural Science Foundation of China (No. 61673117).

Materials

structural parts UPTECMONYH HAR L1-1
structural parts UPTECMONYH HAR L2-1
structural parts UPTECMONYH HAR L3-1
structural parts UPTECMONYH HAR L4-1
structural parts UPTECMONYH HAR L5-1
structural parts UPTECMONYH HAR L5-2
structural parts UPTECMONYH HAR U3A
structural parts UPTECMONYH HAR U3B
structural parts UPTECMONYH HAR U3C
structural parts UPTECMONYH HAR U3F
structural parts UPTECMONYH HAR U3G
structural parts UPTECMONYH HAR U3H
structural parts UPTECMONYH HAR U3J
structural parts UPTECMONYH HAR I3
structural parts UPTECMONYH HAR I5
structural parts UPTECMONYH HAR I7
structural parts UPTECMONYH HAR CGJ
link component UPTECMONYH HAR LM1
link component UPTECMONYH HAR LM2
link component UPTECMONYH HAR LM3
link component UPTECMONYH HAR LM4
link component UPTECMONYH HAR LX1
link component UPTECMONYH HAR LX2
link component UPTECMONYH HAR LX3
link component UPTECMONYH HAR LX4
Steering gear structure component UPTECMONYH HAR KD
Steering gear structure component UPTECMONYH HAR DP
Infrared sensor UPTECMONYH HAR E18-B0 Digital sensor
Infrared Range Finder SHARP GP2D12
Gray level sensor SHARP GP2Y0A02YK0F
proMOTION CDS SHARP CDS 5516 The robot steering gear
motor drive module Risym HG7881
solder wire ELECALL 63A
terminal Bright wire 5264
motor BX motor 60JX
camera Logitech C270
Drilling machine XIN XIANG 16MM Please be careful
Soldering station YIHUA 8786D Be careful not to get burned
screwdriver EXPLOIT 043003
Tweezers R`DEER RST-12

References

  1. Charalampous, K., Kostavelis, I., Gasteratos, A. Robot navigation in large-scale social maps: An action recognition approach. Expert Syst Appl. 66 (1), 261-273 (2016).
  2. Huang, Y., Wang, Q. N. Disturbance rejection of Central Pattern Generator based torque-stiffness-controlled dynamic walking. Neurocomputing. 170 (1), 141-151 (2015).
  3. Tepljakov, A., Petlenkov, E., Gonzalez, E., Belikov, J. Digital Realization of Retuning Fractional-Order Controllers for an Existing Closed-Loop Control System. J Circuit Syst Comp. 26 (10), 32-38 (2017).
  4. Siluvaimuthu, C., Chenniyappan, V. A Low-cost Reconfigurable Field-programmable Gate Array Based Three-phase Shunt Active Power Filter for Current Harmonic Elimination and Power Factor Constraints. Electr Pow Compo Sys. 42 (16), 1811-1825 (2014).
  5. Brogardh, T., et al. Present and future robot control development – An industrial perspective. Annu Rev Control. 31 (1), 69-79 (2007).
  6. Wang, E., Huang, S. A Novel Double Closed Loops Control of the Three-phase Voltage-sourced PWM Rectifier. Proceedings of the CSEE. 32 (15), 24-30 (2012).
  7. Li, D. H., Chen, Z. X., Zhai, S. Double Closed-Loop Controller Design of Brushless DC Torque Motor Based on RBF Neural Network. 1351-1356 (2012).
  8. Tian, H. X., Jiang, P. L., Sun, M. S. Double-Loop DC Speed Regulation System Design Based on OCC. 889-890 (2014).
  9. Xu, G. Y., Zhang, M. Double Closed-Loop Feedback Controller Design for Micro Indoor Smart Autonomous Robot. 474-479 (2011).
  10. Chen, Y. N., Xie, B., Mao, E. R. Electric Tractor Motor Drive Control Based on FPGA. , 271-276 (2016).
  11. Zhang, J., Zhou, Y. J., Zhao, J. Study on Four-quadrant Operation of Brushless DC Motor Control Method. Proc. International Conference on Mechatronics, Robotics and Automation. (ICMRA 2013). , 1363-1368 (2013).
  12. Joice, C. S., Paranjothi, S. R., Kumar, V. J. S. Digital Control Strategy for Four Quadrant Operation of Three Phase BLDC Motor With Load Variations. Ieee T Ind Inform. 9 (2), 974-982 (2013).
  13. Drumheller, Z., et al. Optimal Decision Making Algorithm for Managed Aquifer Recharge and Recovery Operation Using Near Real-Time Data: Benchtop Scale Laboratory Demonstration. Ground Water Monit R. 37 (1), 27-41 (2017).
  14. Wang, X. S., Gao, Y., Cheng, Y. H., Ma, X. P. Knowledge-guided genetic algorithm for path planning of robot. Control Decis. 24 (7), 1043-1049 (2009).

Cite This Article
Zhang, L., Zhu, J., Ren, H., Liu, D., Meng, D., Wu, Y., Luo, T. The Modular Design and Production of an Intelligent Robot Based on a Closed-Loop Control Strategy. J. Vis. Exp. (128), e56422, doi:10.3791/56422 (2017).
