### Abstract

In many practical situations, robots may encounter objects moving in their workspace, resulting in undesirable consequences for either the robots or the moving objects. Such situations often call for sensing arrangements that can produce planar images along with depth measurements, e.g., Kinect sensors, to estimate the position of the moving object in 3-D space. In this paper, we aim to estimate the relative distance of a moving object along the axis orthogonal to the camera lens plane, thus relaxing the need to rely on depth measurements, which are often noisy when the object is too close to the sensor. Specifically, multiple images of an object are first captured at distinct orthogonal distances. In this step, the object's distance from the camera is measured and the object's normalized area, i.e., the normalized sum of its pixels, is computed. Both the computed normalized area and the measured distance are filtered using a Gaussian smoothing filter (GSF). Next, a Bayesian statistical model is developed to map the computed normalized area to the measured distance. The resulting Bayesian linear model predicts the distance between the camera sensor (or robot) and the object given the object's normalized area computed from the 2-D images. To evaluate the performance of the relative distance estimation process, a test stand was built consisting of a robot equipped with a camera. During the learning process of the statistical model, an ultrasonic sensor was used to measure the distance corresponding to the captured images. After the model was learned, the ultrasonic sensor was removed, and the developed model achieved excellent performance in estimating the distance of an object, a human hand carrying a measuring tape, moving back and forth along the axis normal to the camera plane.
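As a rough illustration of the pipeline described in the abstract (smooth the normalized-area and distance signals with a Gaussian filter, then fit a Bayesian linear regression mapping area to distance), the sketch below uses synthetic data. The data values, kernel width, and priors are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Synthetic stand-in for the paper's data: normalized object areas from
# 2-D images and distances (cm) from an ultrasonic sensor. The linear
# relation and noise level are illustrative assumptions.
rng = np.random.default_rng(0)
area = np.linspace(0.1, 0.9, 60)
dist = 60.0 - 50.0 * area + rng.normal(0.0, 1.0, size=area.shape)

def gaussian_smooth(x, sigma=2.0, radius=6):
    """1-D Gaussian smoothing filter (GSF) applied to a signal."""
    k = np.exp(-0.5 * (np.arange(-radius, radius + 1) / sigma) ** 2)
    k /= k.sum()
    return np.convolve(x, k, mode="same")

# Smooth both signals, then drop the filter's edge region where the
# truncated kernel biases the output.
r = 6
area_s = gaussian_smooth(area)[r:-r]
dist_s = gaussian_smooth(dist)[r:-r]

# Bayesian linear regression with a Gaussian prior N(0, alpha^-1 I) on
# the weights and Gaussian observation noise of precision beta; the
# posterior over the weights is Gaussian with closed-form mean m and
# covariance S.
alpha, beta = 1e-3, 1.0
Phi = np.column_stack([np.ones_like(area_s), area_s])  # features [1, area]
S = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m = beta * S @ Phi.T @ dist_s  # posterior mean of [intercept, slope]

def predict(a):
    """Posterior predictive mean and variance of distance at area a."""
    phi = np.array([1.0, a])
    return phi @ m, 1.0 / beta + phi @ S @ phi
```

At prediction time only the camera image is needed: compute the object's normalized area and pass it to `predict`; the predictive variance additionally provides a confidence measure that could feed a collision-avoidance threshold.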

Original language | English (US) |
---|---|
Title of host publication | 2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Electronic) | 9781538693063 |
DOIs | https://doi.org/10.1109/AIPR.2018.8707380 |
State | Published - May 6 2019 |
Event | 2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018 - Washington, United States. Duration: Oct 9 2018 → Oct 11 2018 |

### Publication series

Name | Proceedings - Applied Imagery Pattern Recognition Workshop |
---|---|
Volume | 2018-October |
ISSN (Print) | 2164-2516 |

### Conference

Conference | 2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018 |
---|---|
Country | United States |
City | Washington |
Period | 10/9/18 → 10/11/18 |

### ASJC Scopus subject areas

- Engineering (all)

### Cite this

Ghalyan, I. F. J., Jaydeep, A., & Kapila, V. (2019). Learning Robot-Object Distance Using Bayesian Regression with Application to A Collision Avoidance Scenario. In *2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018* [8707380] (Proceedings - Applied Imagery Pattern Recognition Workshop; Vol. 2018-October). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/AIPR.2018.8707380

**Learning Robot-Object Distance Using Bayesian Regression with Application to A Collision Avoidance Scenario.** / Ghalyan, Ibrahim F.J.; Jaydeep, Aneesh; Kapila, Vikram.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Ghalyan, IFJ, Jaydeep, A & Kapila, V 2019, Learning Robot-Object Distance Using Bayesian Regression with Application to A Collision Avoidance Scenario. in *2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018*, 8707380, Proceedings - Applied Imagery Pattern Recognition Workshop, vol. 2018-October, Institute of Electrical and Electronics Engineers Inc., 2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018, Washington, United States, 10/9/18. https://doi.org/10.1109/AIPR.2018.8707380


TY - GEN

T1 - Learning Robot-Object Distance Using Bayesian Regression with Application to A Collision Avoidance Scenario

AU - Ghalyan, Ibrahim F.J.

AU - Jaydeep, Aneesh

AU - Kapila, Vikram

PY - 2019/5/6

Y1 - 2019/5/6

N2 - In many practical situations, robots may encounter objects moving in their workspace, resulting in undesirable consequences for either the robots or the moving objects. Such situations often call for sensing arrangements that can produce planar images along with depth measurements, e.g., Kinect sensors, to estimate the position of the moving object in 3-D space. In this paper, we aim to estimate the relative distance of a moving object along the axis orthogonal to the camera lens plane, thus relaxing the need to rely on depth measurements, which are often noisy when the object is too close to the sensor. Specifically, multiple images of an object are first captured at distinct orthogonal distances. In this step, the object's distance from the camera is measured and the object's normalized area, i.e., the normalized sum of its pixels, is computed. Both the computed normalized area and the measured distance are filtered using a Gaussian smoothing filter (GSF). Next, a Bayesian statistical model is developed to map the computed normalized area to the measured distance. The resulting Bayesian linear model predicts the distance between the camera sensor (or robot) and the object given the object's normalized area computed from the 2-D images. To evaluate the performance of the relative distance estimation process, a test stand was built consisting of a robot equipped with a camera. During the learning process of the statistical model, an ultrasonic sensor was used to measure the distance corresponding to the captured images. After the model was learned, the ultrasonic sensor was removed, and the developed model achieved excellent performance in estimating the distance of an object, a human hand carrying a measuring tape, moving back and forth along the axis normal to the camera plane.

AB - In many practical situations, robots may encounter objects moving in their workspace, resulting in undesirable consequences for either the robots or the moving objects. Such situations often call for sensing arrangements that can produce planar images along with depth measurements, e.g., Kinect sensors, to estimate the position of the moving object in 3-D space. In this paper, we aim to estimate the relative distance of a moving object along the axis orthogonal to the camera lens plane, thus relaxing the need to rely on depth measurements, which are often noisy when the object is too close to the sensor. Specifically, multiple images of an object are first captured at distinct orthogonal distances. In this step, the object's distance from the camera is measured and the object's normalized area, i.e., the normalized sum of its pixels, is computed. Both the computed normalized area and the measured distance are filtered using a Gaussian smoothing filter (GSF). Next, a Bayesian statistical model is developed to map the computed normalized area to the measured distance. The resulting Bayesian linear model predicts the distance between the camera sensor (or robot) and the object given the object's normalized area computed from the 2-D images. To evaluate the performance of the relative distance estimation process, a test stand was built consisting of a robot equipped with a camera. During the learning process of the statistical model, an ultrasonic sensor was used to measure the distance corresponding to the captured images. After the model was learned, the ultrasonic sensor was removed, and the developed model achieved excellent performance in estimating the distance of an object, a human hand carrying a measuring tape, moving back and forth along the axis normal to the camera plane.

UR - http://www.scopus.com/inward/record.url?scp=85065966905&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85065966905&partnerID=8YFLogxK

U2 - 10.1109/AIPR.2018.8707380

DO - 10.1109/AIPR.2018.8707380

M3 - Conference contribution

AN - SCOPUS:85065966905

T3 - Proceedings - Applied Imagery Pattern Recognition Workshop

BT - 2018 IEEE Applied Imagery Pattern Recognition Workshop, AIPR 2018

PB - Institute of Electrical and Electronics Engineers Inc.

ER -