Article

Inline Inspection with an Industrial Robot (IIIR) for Mass-Customization Production Line †

1 Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 10607, Taiwan
2 Center for Cyber-Physical System Innovation, National Taiwan University of Science and Technology, Taipei 10607, Taiwan
* Author to whom correspondence should be addressed.
† Part of this paper was presented at the 16th International Conference on Automation Technology: Wu, Z.-G.; Lin, C.-Y.; Lin, P.T. Stereovision-Based Inline Inspection with a 6R Robot. Automation 2019, Taipei, Taiwan, 2019; p. 1106.
Sensors 2020, 20(11), 3008; https://doi.org/10.3390/s20113008
Submission received: 21 April 2020 / Revised: 17 May 2020 / Accepted: 25 May 2020 / Published: 26 May 2020

Abstract: Robots are essential to the rapid development of Industry 4.0. To truly achieve autonomous robot control in customizable production lines, robots need to be sufficiently accurate and capable of recognizing the geometry and orientation of arbitrarily shaped objects. This paper presents a method of inline inspection with an industrial robot (IIIR) for mass-customization production lines. A 3D scanner captured the geometry and orientation of the object to be inspected. As the object entered the working range of the robot, the end effector moved along with the object while the camera installed at the end effector performed the requested optical inspections. This paper details the developed methodology. The experiments showed that the relative speed between the moving object and the following camera was around 0.34 mm per second (around 0.94 mm per second in the worst case). For a camera operating at 60 frames per second, this corresponds to a relative movement of around 6 microns per frame (around 16 microns in the worst case), which was stable enough for most industrial production inspections.

1. Introduction

With the rapidly developing trend of Industry 4.0, industrial robots have been widely used in various applications, such as automotive manufacturing lines [1] and semiconductor production lines [2]. Toward the goal of autonomous control of industrial robots, various methodologies have been developed to achieve high-accuracy robot manipulation. An Industry 4.0 production line is expected to be massively customized: it carries various kinds of products with different geometrical shapes, so there may not be a single general solution for automation. Thus, sensing and manipulation with a high degree of freedom become essential for handling these mass-customized products.
In a traditional production line, additional workers or devices may be needed to align or position products in the conveying system. Inline inspection therefore needs to recognize the position and orientation of an inspected object without additional human workers or positioning devices. Moreover, the conveying system usually has to stop for a few seconds to perform stable optical inspections, and every stoppage decreases the overall production yield. It is therefore desirable to develop an inline system capable of inspecting a product on a moving conveyor.
Recently, researchers have developed several robotic systems for picking or inspecting target objects on a moving conveyor or a production line. Han et al. [3] determined the optimal path planning of a robot arm to pick randomly placed objects from a moving conveyor, minimizing the pick-and-place time within the reachable range of the robot arm. They found that the effectiveness of object picking is related to how the objects are distributed on the conveyor. Anwar et al. [4] developed visual servo methodologies to move a camera at the end effector of a robot arm to face the inspection surface on an object; the end effector position was determined with an image-based Jacobian approach. Abbood et al. [5] used a camera and image-processing techniques to identify the shape of an object on a moving conveyor. A low-cost system based on a simple robot arm with plastic gears and servo motors was built to demonstrate the concept of picking objects at a slow moving speed of less than 9 cm/s.
One of the most important features of an inline inspection system for mass-customization production lines is high-accuracy manipulation. Juan and his colleagues [6,7] measured the absolute position errors of the end effector of a 6-revolute-joint (6R) robot arm throughout the working space by laser. The error field of the end effector in the working space was then parametrically modeled by kriging and used to compute corrected coordinates of the end effector. With the high accuracy of the laser measurement (~0.001 mm), the absolute accuracy of the end effector was improved from ~1 mm to ~0.05 mm. However, the experimental setup for laser measurement was very time consuming.
To make the 3D position measurement of the end effector more efficient, Juan and his colleagues [8,9] measured the distances from the end effector to three different cable encoders. These three length measurements were then used to compute the 3D position of the end effector by trilateration. Even with the lower accuracy of the cable encoders (~0.1 mm), the absolute positioning accuracy of the end effector reached ~0.5 mm. Lin et al. [10] also showed how multilateration could be used to estimate the end effector position when more than three cable encoders measured their distances to the end effector. Shih and Lin [11] investigated how end effector positioning under various levels of payload could be improved using trilateration based on three cable encoder measurements together with coordinate correction based on kriging models of the error fields.
Cable encoder measurements show good potential for 3D positioning of industrial robots, but they are not suitable for in situ measurements. Shih et al. [12] developed a vision-based correction of end effector positioning in planar motions. A 2D camera was used to compute the transformation between the camera coordinates and the end effector coordinates in a given plane; any target position of the end effector in that plane could then be computed from its coordinate in the live image. This approach was good for in situ measurement and robot manipulation, but it only worked for positioning in a given plane and was not suitable for moving target positions.
Wu et al. [13] utilized a 3D scanner to capture the geometry and orientation of an object randomly placed on a moving conveyer. Since the moving speed and direction of the conveyer were known, a 6R robot was able to move along with the object as it passed through the working range of the robot. The developed stereoscopic object tracking could be utilized for inline inspection of randomly placed objects on a moving conveyer with known speed and direction. Wu et al. [14] presented the development and implementation of inline inspection based on stereoscopic object scanning and object tracking by a 6R robot.
In this paper, a method of inline inspection with an industrial robot (IIIR) and its applications in mass-customization production lines are presented. Section 2 introduces the experimental setup and the required methodologies for matching the coordinates of the industrial robot and the 3D scanner. Section 3 explains how the camera position could be estimated based on the computer-aided design (CAD) model of a given object and the 3D point cloud of the object captured under the 3D scanner. Section 4 shows the experimental process and the results. Finally, the conclusion is given in Section 5.

2. Experimental Setup and Initialization of the Inline Inspection with a 6R Robot

Figure 1 shows the experimental setup of the developed inline inspection system [13]. A 3D scanner (a stereovision camera with two pixel sensors at a resolution of 2 microns) was installed above the entrance of the conveyer. The moving speed of the conveyer, denoted as $V$, was 33.8 mm/s, and the moving direction was along the +X direction of the robot coordinate, as shown in Figure 1. As an arbitrarily shaped object entered the conveyer, its 3D geometry and orientation were captured by the 3D scanner. The acquired 3D point cloud was used to calculate the desired positions and orientations of the end effector of the industrial robot. The end effector could then move along with the object and perform optical inspections without stopping the conveyer. This paper presents the methodologies required to estimate the end effector coordinates for the proposed inline inspection.
To match the 3D scanner coordinate $S$ and the robot coordinate $R$, the transformation from $S$ to $R$, denoted as $T_S^R$, should be computed. The end effector of the robot was moved to $N$ different positions within the inspection range of the 3D scanner. In this paper, $N = 100$ and the points were uniformly distributed, shown as the red circles in Figure 2. The $i$-th end effector position observed in $R$ was denoted as $P_i^R$. The 3D scanner was used to capture these end effector positions, and the $i$-th position captured in $S$ was denoted as $Q_i^S$. The $i$-th transformed 3D scanned position, denoted as $Q_i^R$, should satisfy the following equation:
$$Q_i^R = T_S^R Q_i^S \quad (1)$$
Since the end effector coordinate $P_i^R$ should be close to the transformed 3D scanned coordinate $Q_i^R$, the transformation could be estimated by least square approximation (LSA).
Equation (1) can be rewritten in the following matrix form:
$$\begin{bmatrix} Q_{i,1}^R \\ Q_{i,2}^R \\ Q_{i,3}^R \\ 1 \end{bmatrix} = \begin{bmatrix} T_{S,11}^R & T_{S,12}^R & T_{S,13}^R & T_{S,14}^R \\ T_{S,21}^R & T_{S,22}^R & T_{S,23}^R & T_{S,24}^R \\ T_{S,31}^R & T_{S,32}^R & T_{S,33}^R & T_{S,34}^R \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} Q_{i,1}^S \\ Q_{i,2}^S \\ Q_{i,3}^S \\ 1 \end{bmatrix} \quad (2)$$
where each vector is formulated in homogeneous coordinates. To determine the unknown parameters $T_{S,pq}^R$ for $p = 1, 2, 3$ and $q = 1, 2, 3, 4$, the data from all $N$ experiments were assembled into the following equation:
$$\begin{bmatrix} Q_{1,p}^R \\ Q_{2,p}^R \\ \vdots \\ Q_{N,p}^R \end{bmatrix} = \begin{bmatrix} Q_{1,1}^S & Q_{1,2}^S & Q_{1,3}^S & 1 \\ Q_{2,1}^S & Q_{2,2}^S & Q_{2,3}^S & 1 \\ \vdots & \vdots & \vdots & \vdots \\ Q_{N,1}^S & Q_{N,2}^S & Q_{N,3}^S & 1 \end{bmatrix} \begin{bmatrix} T_{S,p1}^R \\ T_{S,p2}^R \\ T_{S,p3}^R \\ T_{S,p4}^R \end{bmatrix} \quad (3)$$
for $p = 1, 2, 3$. Equation (3) is rewritten as:
$$\omega_p = \Omega \tau_p \quad (4)$$
where $\omega_p$ is an $N \times 1$ vector that contains the $p$-th coordinate of every $Q_i^R$; $\Omega$ is an $N \times 4$ matrix that contains the coordinates of every $Q_i^S$; and $\tau_p$ is a $4 \times 1$ vector that contains the components of the $p$-th row of $T_S^R$. Based on LSA, the unknown parameters in $\tau_p$ could be estimated by:
$$\tau_p = (\Omega^T \Omega)^{-1} \Omega^T \omega_p \quad (5)$$
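For illustration, the following is a minimal numpy sketch of this row-wise least-squares fit. The paper does not publish its code, so the function and array names here are assumptions; numpy.linalg.lstsq is used in place of the explicit normal equations of Equation (5) for numerical stability, which yields the same solution.

```python
import numpy as np

def estimate_transform(P_R, Q_S):
    """Fit the 4x4 homogeneous transform T_S^R via Eqs. (3)-(5).

    P_R : (N, 3) end effector positions observed in the robot frame.
    Q_S : (N, 3) the same positions captured in the scanner frame.
    """
    N = P_R.shape[0]
    Omega = np.hstack([Q_S, np.ones((N, 1))])  # the N x 4 matrix of Eq. (3)
    T = np.eye(4)
    for p in range(3):
        # omega_p: targets for the p-th coordinate; since Q_i^R should be
        # close to P_i^R, the robot-frame measurements serve as targets.
        omega_p = P_R[:, p]
        # Least squares solve of Eq. (4); equivalent to Eq. (5).
        tau_p, *_ = np.linalg.lstsq(Omega, omega_p, rcond=None)
        T[p, :] = tau_p
    return T

# Usage sketch: map a scanner-frame point into the robot frame (Eq. (1)).
# T = estimate_transform(P_R, Q_S)
# q_R = T @ np.append(q_S, 1.0)
```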
The transformed coordinates of the 100 end effector positions measured by the 3D scanner are shown as the blue crosses in Figure 2. The errors (i.e., the coordinates of the end effector positions minus the transformed coordinates of the 3D scanned positions) along the X, Y and Z directions are shown in Figure 3, Figure 4 and Figure 5, respectively. The averages and standard deviations of the errors along the X, Y and Z directions are listed in Table 1. Since the 3D scanner is less accurate in the Z direction than in the X and Y directions, the standard deviation of the errors along the Z direction is greater than those along the X and Y directions. Figure 6 shows the Euclidean errors between the end effector positions and the transformed 3D scanned positions. Based on the average Euclidean error listed in Table 1, the error between $P_i^R$ and $Q_i^R$ was around 2 mm, mostly caused by the lower accuracy of the 3D scanner along the Z direction.

3. Estimation of the End Effector Coordinates for the Inline Inspection

Figure 7 illustrates the position of the camera for the desired inspection. Suppose the CAD model of the investigated object is known, as shown in Figure 7a. The camera is expected to take a picture facing a target point $P_t$ on the object along the direction of $n_t$, where $n_t$ stands for the normal vector of the object surface near $P_t$. The CAD model and the desired camera position were saved in the database as the cyber model for later object recognition and positioning of the camera. The distance from $P_t$ to the center of the camera lens, $P_c$, was given as $d$. Therefore, $P_c$ could be determined by:
$$P_c = P_t + d\,n_t \quad (6)$$
The transformation from $P_t$ to $P_c$ could be extended to more complicated conditions; this paper focuses only on the translation along the direction of $n_t$ by a distance $d$.
As the real object entered the conveyer shown in Figure 1, the geometry and orientation of the object were captured by the 3D scanner, shown as the blue points in Figure 7b. Iterative closest point (ICP) [15] determined the transformation $T_{ICP}$ from the known CAD model to the point cloud obtained from the 3D scanner. ICP, first introduced by Besl and McKay [15] in 1992, iteratively minimizes the distance between two sets of points, lines, or surfaces. Random sample consensus (RANSAC), proposed by Fischler and Bolles [16], has been used to improve the robustness of fitting models to sampled 3D data. Wahl et al. [17] captured geometrical features from the sampled 3D data and analyzed the histogram of the point features. Rusu et al. [18] then introduced the fast point feature histogram (FPFH) to speed up the point feature analysis. This paper uses the FPFH to estimate the transformation between the known CAD model and the captured 3D point cloud.
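The paper does not specify its registration software. As a hedged illustration, the sketch below uses the open-source Open3D library (assuming a recent version, ≥ 0.12), whose pipeline combines FPFH features, RANSAC-based coarse registration, and ICP refinement in the same spirit as the method described above; the function name and the voxel parameter are assumptions of this example.

```python
import open3d as o3d

def match_model_to_scan(model_pcd, scene_pcd, voxel=2.0):
    """Estimate T_ICP from the CAD-model point cloud to the scanned cloud
    using FPFH features, RANSAC global registration, and ICP refinement."""
    src = model_pcd.voxel_down_sample(voxel)
    dst = scene_pcd.voxel_down_sample(voxel)
    for pcd in (src, dst):
        pcd.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
    f_src, f_dst = (
        o3d.pipelines.registration.compute_fpfh_feature(
            pcd,
            o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
        for pcd in (src, dst))
    # Coarse alignment from FPFH correspondences filtered by RANSAC.
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, dst, f_src, f_dst, True, 1.5 * voxel,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    # Fine alignment with point-to-plane ICP, seeded by the coarse result.
    fine = o3d.pipelines.registration.registration_icp(
        src, dst, voxel, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return fine.transformation  # 4x4 homogeneous matrix T_ICP
```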
Based on the transformation of the model matching, the target point on the scanned object, denoted as $P_t'$, was then found near the point cloud captured by the 3D scanner and was given by:
$$P_t' = T_{ICP} P_t \quad (7)$$
The normal vector of the surface near $P_t'$ was determined by LSA and denoted as $n_t'$. Therefore, the position of the camera to perform the desired inspection could be computed by:
$$P_c' = P_t' + d\,n_t' \quad (8)$$
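As a hedged sketch of the LSA normal estimate (the paper does not give its exact procedure), the plane normal of a local patch can be recovered as the direction of least variance of the neighboring points, here via SVD; the helper name is hypothetical.

```python
import numpy as np

def surface_normal(neighbors):
    """Least-squares plane normal of a local patch: the right-singular
    vector with the smallest singular value of the centered neighbors."""
    centered = neighbors - neighbors.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]
    return n / np.linalg.norm(n)

# Camera position of Eq. (8): offset the target point along the normal.
# P_c = P_t + d * surface_normal(patch_points)
```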
Consider an object that entered the conveyer at time $t = 0$; the coordinate of $P_c'(t = 0)$ may not be inside the working range of the robot. It is necessary to estimate when the object moves into the working range of the robot so that the desired inspection can be performed. In our implementation, the conveyer moved along the +X direction of the robot coordinate. Therefore, the Y- and Z-components of $P_c'$ did not change with time; only the X-component, denoted as $P_{c,x}'(t)$, varied with time $t$ and could be written as:
$$P_{c,x}'(t) = P_{t,x}' + d\,n_{t,x}' + Vt \quad (9)$$
where $P_{t,x}'$ and $n_{t,x}'$ stand for the X-components of $P_t'$ and $n_t'$, respectively. Suppose $P_{c,x}'(t = t_1)$ enters the working range of the robot and $P_{c,x}'(t = t_2)$ moves out of it. The inspection should then be done within the time interval $[t_1, t_2]$. From Equation (9), the limits of the inspection time could be determined by:
$$t_1 = \frac{P_{x,\min} - P_{t,x}' - d\,n_{t,x}'}{V} \quad (10)$$
$$t_2 = \frac{P_{x,\max} - P_{t,x}' - d\,n_{t,x}'}{V} \quad (11)$$
where $P_{x,\min}$ and $P_{x,\max}$ represent the minimal and maximal X-coordinates of the working range of the robot, respectively. Finally, the coordinates of the end effector for following the object and performing the desired inspection were determined, i.e., $P_c'(t = t_1)$, $P_c'(t = t_1 + \Delta t)$, ..., $P_c'(t = t_2)$. At each time $t$, the desired position and orientation of the end effector were computed based on inverse kinematics in the motor control system. A minimal sketch of this scheduling step is given below.
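The following numpy sketch implements Equations (9)-(11), assuming the transformed target point and normal are available as 3-vectors; the function names and the sampling period dt are assumptions of this example.

```python
import numpy as np

def inspection_window(P_t, n_t, d, V, x_min, x_max):
    """Time window [t1, t2] (Eqs. (10)-(11)) during which the camera
    position stays inside the robot's working range along X."""
    c0 = P_t[0] + d * n_t[0]          # X-component of P_c at t = 0
    return (x_min - c0) / V, (x_max - c0) / V

def camera_waypoints(P_t, n_t, d, V, t1, t2, dt=0.05):
    """Sample the desired camera positions P_c(t) over [t1, t2]; only the
    X-component varies with time (Eq. (9))."""
    ts = np.arange(t1, t2 + dt, dt)
    base = np.asarray(P_t) + d * np.asarray(n_t)
    pts = np.tile(base, (len(ts), 1))
    pts[:, 0] += V * ts               # conveyer motion along +X
    return ts, pts
```

Each waypoint would then be passed to the robot's inverse kinematics to obtain the joint commands, as described above.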

4. Experiment and Result of the Proposed Inline Inspection

The experimental process of the presented methodology is shown in Figure 8. An object to be inspected entered the conveyer at the beginning of the experiment, as shown in Figure 8a. The geometry and orientation of the object were then captured by the 3D scanner. Assuming the CAD model of the object was known, the transformation between the captured point cloud and the CAD model in the database was estimated by ICP, as explained in Section 3. The target inspection point on the surface of the object, $P_t'$, was then calculated, as well as the normal vector of the inspected surface, $n_t'$. Given the desired distance $d$ between the camera and the target point, the desired positions of the camera, $P_c'$, could then be calculated.
Considering the known moving speed of the conveyer and the working range of the industrial robot, the camera could reach the desired positions between times $t_1$ and $t_2$, given by Equations (10) and (11), respectively. Figure 8b shows the moment $t_1$ at which the camera moved to the desired position $P_c'(t_1)$. The industrial robot then followed the movement of the object to create near-zero relative movement between the camera and the moving object, as shown in Figure 8c. The inline inspection was then performed before the object moved out of the reachable range at time $t_2$.
In the experiments of the inline inspection, a flat plate with a circular mark, as shown in Figure 9, was inspected. The center of the circular mark was the target point for inspection. The normal vector of the plate had a directional angle $\theta$ from the X axis and a tilting angle $\phi$ from the X-Y plane, as shown in Figure 10. The area of the circular mark was 301.69 mm². The area was measured during object following and inline inspection to evaluate the accuracy of the optical inspection, and the relative movement between the camera and the moving plate was investigated to evaluate the stability of the inline inspection system.
In this experiment, the directional angle $\theta$ of the inspection surface was 45° and the tilting angle $\phi$ was 28.7°. The object was placed at a random position on top of the conveyer. The scale factor of the inspected image was 0.1865 mm/pixel. From the beginning to the end of the inspection process, one inspection image of the flat plate was taken every 0.5 s. The relative movement of the center of the circular mark between consecutive images was measured and is shown in Figure 11. The average relative movement of the center was 0.2460 mm and the maximum was 0.6521 mm. Taking the time derivative, the absolute relative speed of the center is shown in Figure 12. The average speed was 0.3358 mm/s and the maximum speed was 0.9420 mm/s. For a camera of 60 frames per second (fps), the relative movement between the object and the camera was around 6 microns per frame (around 16 microns per frame in the worst case), which was stable enough for most industrial production inspections. During the inspection process, the area of the circular mark was measured and is shown in Figure 13. Compared to the true circular area, 301.69 mm², the error of the area measurement during the inspection process is shown in Figure 14. The average error was around 3.6695 mm² (1.21% error) and the maximum error was 5.2122 mm² (1.73% error). The results of the presented experiment are listed in Table 2.
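As a small illustration of how these statistics follow from the image measurements, the sketch below converts per-image centroid displacements (in pixels) into movement (mm) and speed (mm/s) using the scale factor and sampling interval reported above; the function name and input layout are assumptions of this example.

```python
import numpy as np

SCALE = 0.1865   # mm/pixel, the reported image scale factor
DT = 0.5         # s, interval between consecutive inspection images

def relative_motion(centroids_px):
    """Given the circular mark's center (in pixels) in each consecutive
    inspection image, return per-interval movement (mm) and speed (mm/s)."""
    c = np.asarray(centroids_px, dtype=float)          # shape (M, 2)
    steps_px = np.linalg.norm(np.diff(c, axis=0), axis=1)
    movement_mm = steps_px * SCALE
    speed_mm_s = movement_mm / DT
    return movement_mm, speed_mm_s

# e.g., averaging speed_mm_s over all intervals reproduces the reported
# 0.3358 mm/s figure; dividing by 60 fps gives microns per frame.
```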
A comparison between the presented IIIR and some other existing methods that were developed in the past few years is given in Table 3. The investigated systems were developed for either pick-and-place applications [5,12,19,20] or production inspections (i.e., [4] and this work). For planar manipulations [5,12], image-based object recognition would be enough; on the other hand, 3D scanning was utilized for recognizing the objects that were either randomly placed (i.e., [20] and this work) or had arbitrary shapes [19]. The challenges of the robot manipulations increased as the objects were placed in a moving platform (i.e., [5] and this work). The comparison in Table 3 showed that the presented IIIR delivered a unique ability to recognize randomly placed objects in a moving platform with good accuracy (i.e., relative speed between camera and object = 0.3358 mm/s = 5.6 micron/frame for a 60-fps camera), which is important for the inline inspection in a mass-customization production line.

5. Conclusions

This paper has presented a method of inline inspection with an industrial robot (IIIR), which carries a camera that moves along with a moving object and performs the desired optical inspections in mass-customization production lines. The developed inline inspection system integrated multiple technologies, including 3D scanning, embedded systems, coordinate transformation, robot control, and machine vision. In our implementation, the relative speed between the moving object and the camera was around 0.34 mm per second (around 0.94 mm per second in the worst case). For a camera of 60 frames per second, this corresponds to a relative movement of around 6 microns per frame (around 16 microns in the worst case), which was stable enough for most industrial production inspections. Furthermore, the average error of the inspection measurement was around 1.21% (1.73% in the worst case). The developed system could be extended to various kinds of inline operations for automation applications in Industry 4.0.
The main contribution of this paper is the integrated system: a 3D scanner for recognizing the position and orientation of a moving object, an industrial robot for moving the camera to the desired position to perform inline inspection without stopping the conveyor, and a series of numerical methods for matching the coordinates of the 3D scanner and the robot arm and for calculating the desired end effector positions. The developed system was able to perform accurate object tracking without costly measurement devices such as a high-speed camera or a laser tracker. Therefore, it can not only be widely applied to various kinds of automated production lines but is also highly applicable to small- and medium-sized enterprises.

Author Contributions

P.T.L. is the corresponding author of this paper and an associate professor in the Department of Mechanical Engineering at National Taiwan University of Science and Technology (NTUST). Z.-G.W. is the main contributor of this paper; he was a master's student under the advisory of P.T.L. in the Department of Mechanical Engineering at NTUST and graduated in July 2019. The coordinate matching and end effector positioning were based on Z.-G.W.'s master's thesis. C.-Y.L. and H.-W.C. are currently studying for their master's degrees under the advisory of P.T.L. in the Department of Mechanical Engineering at NTUST and expect to graduate in July 2021. They contributed to executing some experiments and demonstrating the experimental results in various exhibitions and conference meetings. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Ministry of Science and Technology (MOST), Taiwan (grant numbers MOST 108-2634-F-003-004 and MOST 108-2221-E-011-129-MY3) and by the Center for Cyber-Physical System Innovation, a Featured Areas Research Center in the Higher Education Sprout Project of the Ministry of Education (MOE), Taiwan (since 2018).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lopez, C.-P.; Santórum, M.; Aguilar, J. Autonomous Cycles of Collaborative Processes for Integration Based on Industry 4.0; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2019; pp. 177–186.
  2. Chen, P.P.; Trappey, A.I.; Lin, B.H.; Trappey, C. Patent Analytics of Robotics Technology for Intelligent Manufacturing in the Semiconductor Industry. In Proceedings of the 2018 IEEE 22nd International Conference on Computer Supported Cooperative Work in Design (CSCWD), Nanjing, China, 9–11 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 213–217.
  3. Han, S.D.; Feng, S.W.; Yu, J. Toward Fast and Optimal Robotic Pick-and-Place on a Moving Conveyor. IEEE Robot. Autom. Lett. 2020, 5, 446–453.
  4. Anwar, A.; Lin, W.; Deng, X.; Qiu, J.; Gao, H. Quality Inspection of Remote Radio Units Using Depth-Free Image-Based Visual Servo With Acceleration Command. IEEE Trans. Ind. Electron. 2018, 66, 8214–8223.
  5. Abbood, W.T.; Abdullah, O.I.; Khalid, E.A. A real-time automated sorting of robotic vision system based on the interactive design approach. Int. J. Interact. Des. Manuf. (IJIDeM) 2020, 14, 201–209.
  6. Lian, W.-L.; Lin, S.-P.; Juan, P.-C.; Lu, W.-H.; Lin, P.T. A Kriging-based Absolute Error Correction in Robot Control. In Proceedings of the Asian Congress of Structural and Multidisciplinary Optimization (ACSMO 2016), Nagasaki, Japan, 22–26 May 2016; p. 190.
  7. Juan, P.-C.; Lu, W.-H.; Lin, S.-P.; Lin, P.T. Accuracy Improvement of Cyber-Physical Robots (CPR). In Proceedings of the 12th World Congress of Structural and Multidisciplinary Optimization (WCSMO 2017), Braunschweig, Germany, 5–9 June 2017.
  8. Juan, P.-C.; Lu, W.-H.; Lin, S.-P.; Wu, Z.-G.; Chang, C.-Y.; Kuo, C.-H.; Lin, P.T. Fast Measurements of the End Effector Absolute Positions Using Improved Cable Encoder Measurements. In Proceedings of the International Symposium on Optomechatronic Technology 2017 (ISOT 2017), Tainan, Taiwan, 5–9 November 2017; p. AM-05.
  9. Lin, P.T.; Juan, P.-C.; Lin, S.-P.; Lu, W.-H.; Wu, Z.-G. How End Effector Absolute Accuracy Plays A Role in Industry 4.0. In Proceedings of the 14th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications (MESA 2018), Oulu, Finland, 2–4 July 2018; p. 1570405807.
  10. Lin, P.T.; Shih, H.-C.; Chen, Y.-J.; Lin, S.-P.; Lu, W.-H. Six-Degree-of-Freedom Measurement and Correction of End Effector Absolute Positions Based on Multilateration and Kriging Modeling. In Proceedings of the Asian Congress of Structural and Multidisciplinary Optimization 2018 (ACSMO 2018), Dalian, China, 21–24 May 2018; p. A040220.
  11. Shih, H.-C.; Lin, P.T. End Effector Absolute Positioning Under Various Loading Conditions Based on Cable Measurement Trilateration and Kriging Modeling. In Proceedings of the International Conference on Advanced Technology Innovation 2019 (ICATI2019), Sapporo, Japan, 15–18 July 2019; p. SCG9002.
  12. Shih, H.-C.; Hsueh, T.-H.; Lu, W.-H.; Lin, S.-P.; Chuang, C.-H.; Lin, P.T. Planar Absolute Positioning of End Effector Using Image-Based Error Correction. In Proceedings of the 21st National Conference on Mechanism and Machine Design (CSMMT 2018), Keelung, Taiwan, 26–27 October 2018; p. 70.
  13. Wu, Z.-G.; Lin, S.-P.; Lu, W.-H.; Lin, P.T. In-Line Automatic Optical Inspection (AOI) in A Non-Stop Manufacturing Line Aided by 6R Robot Arm. In Proceedings of the International Conference on Advanced Technology Innovation 2019 (ICATI2019), Sapporo, Japan, 15–18 July 2019; p. SCF9001.
  14. Wu, Z.-G.; Lin, C.-Y.; Lin, P.T. Stereovision-Based Inline Inspection with a 6R Robot. In Proceedings of the 16th International Conference on Automation Technology (Automation 2019), Taipei, Taiwan, 22–24 November 2019; p. 1106.
  15. Besl, P.J.; McKay, N.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–256.
  16. Fischler, M.; Bolles, R. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395.
  17. Wahl, E.; Hillenbrand, U.; Hirzinger, G. Surflet-pair-relation histograms: A statistical 3D-shape representation for rapid classification. In Proceedings of the Fourth International Conference on 3-D Digital Imaging and Modeling, Banff, AB, Canada, 6–10 October 2003; IEEE: Piscataway, NJ, USA, 2003; pp. 474–481.
  18. Rusu, R.B.; Blodow, N.; Beetz, M. Fast Point Feature Histograms (FPFH) for 3D registration. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 3212–3217.
  19. Jørgensen, T.B.; Jensen, S.H.N.; Aanæs, H.; Hansen, N.W.; Krüger, N. An adaptive robotic system for doing pick and place operations with deformable objects. J. Intell. Robot. Syst. 2019, 94, 81–100.
  20. Le, T.-T.; Lin, C.-Y. Bin-Picking for Planar Objects Based on a Deep Learning Network: A Case Study of USB Packs. Sensors 2019, 19, 3602.
Figure 1. Experimental setup of the inline inspection (X-Y-Z is the origin of the robot coordinate).
Figure 2. The coordinates of the end effector positions and the transformed coordinates of the 3D scanned positions.
Figure 3. Histogram of errors along the X direction between the end effector coordinates and transformed coordinates of the 3D scanned positions.
Figure 4. Histogram of errors along the Y direction between the end effector coordinates and transformed coordinates of the 3D scanned positions.
Figure 5. Histogram of errors along the Z direction between the end effector coordinates and transformed coordinates of the 3D scanned positions.
Figure 6. Histogram of Euclidean errors between the end effector coordinates and transformed coordinates of the 3D scanned positions.
Figure 7. Position of the camera for the desired inspection: (a) relative to the CAD model of the investigated object; (b) relative to the point cloud of the object captured by the 3D scanner.
Figure 8. Demonstration of the experimental process: (a) an object was placed underneath the 3D scanner at a random position on the conveyer; (b) the industrial robot moved the camera to the desired position to face perpendicularly to the inspected surface on the object; (c) the camera moved with the movement of the object without stopping the conveyer.
Figure 9. The circular mark on the flat plate.
Figure 10. The directional angle $\theta$ and the tilting angle $\phi$ of the normal vector $n_t$ of the object.
Figure 11. The relative movement of the center of the circular mark between each consecutive inspection image: (a) analysis with respect to time; (b) histogram of the analysis.
Figure 12. The absolute relative speed of the center of the circular mark between each consecutive inspection image: (a) analysis with respect to time; (b) histogram of the analysis.
Figure 13. The measured area of the circular mark during the inspection process: (a) analysis with respect to time; (b) histogram of the analysis.
Figure 14. Error of the measured circular area during the inspection process: (a) analysis with respect to time; (b) histogram of the analysis.
Table 1. Averages and standard deviations of the errors between the end effector coordinates and transformed coordinates of the 3D scanned positions.

Errors                   | Average (mm) | Standard Deviation (mm)
Errors along X direction | −0.0012      | 0.9844
Errors along Y direction | 0.0002       | 0.8312
Errors along Z direction | −0.0002      | 1.9227
Euclidean errors         | 2.0041       | 1.1577
Table 2. Experimental results of the presented process of inline inspection with an industrial robot (IIIR).

Analyzed Data                  | Average  | Standard Deviation | Worst Case                                        | Unit
Relative movement              | 0.2460   | 0.1532             | 0.6521 (max. value)                               | mm
Absolute relative speed        | 0.3358   | 0.2261             | 0.9420 (max. value)                               | mm/s
Area of circular mark          | 305.3595 | 0.7186             | 306.9022 (measured with max. error of 5.2122 mm²) | mm²
Error of area of circular mark | 3.6695   | 0.7186             | 5.2122 (max. value)                               | mm²
Table 3. Comparison between the proposed IIIR and other existing methods/technologies.

Methodologies/Technologies | Manipulation Types | Positioning Methods | Object Conditions | Accuracy/Performance
Automated sorting of a robotic vision system [5] | Pick and place using a 4-degrees-of-freedom (DOF) robot arm | Image-based shape recognition | Randomly placed in a moving conveyor with a known and constant speed (<9 cm/s) | 92% success rate of shape sorting
Pick and place of deformable objects [19] | Pick and place with a 6R robot arm | 3D scanning and parameter optimization | Randomly placed in a container | 98% success rate of picking the pork loins
Vision-based end effector positioning [12] | Pick and place with a 6R robot arm | Vision-based planar positioning | Randomly placed on a planar surface | Positioning error = 0.42 mm
Planar object picking based on a deep learning network [20] | Pick and place with a 6R robot arm | 3D scanning and coordinate matching (based on deep learning) | Randomly placed in a container | Positioning error = 3.6 mm
Quality inspection using depth-free image-based visual servo [4] | Target tracking with a 6R robot arm and performing inspections | Image-based visual servo | Placed on a fixed platform | Positioning error = 5.5 micron
Presented IIIR in this work | Object following with a 6R robot arm and performing inspections | 3D scanning and coordinate matching | Randomly placed in a moving conveyor with a known and constant speed (33.8 mm/s) | Relative speed between a 60-fps camera and object = 5.6 micron/frame
