
International Journal
of Computer Engineering in Research Trends (IJCERT)

Scholarly, Peer-Reviewed, Platinum Open Access and Multidisciplinary


ISSN (Online): 2349-7084


Real-Time Indoor Floor Detection for Mobile Robots using Iterative Approach

Honnaraju B, S Murali
Dept. of Computer Science and Engineering, Maharaja Institute of Technology, Mysore, Karnataka, India.

This paper introduces an iterative technique for segmenting the indoor floor from a single 2-D camera mounted on a mobile robot. Floor segmentation is an essential function for efficient mobile robot navigation, and unlike previous approaches, the proposed method does not rely on geometric indications or edges. A robot-mounted 2-D camera captures floor images over long indoor sequences, which pose several challenges: floor and non-floor regions may share a similar color when illumination varies across a single image; floor texture also varies within an image, and features may be lost as lighting changes; as the robot rotates, considerable information may be lost in the presence of small obstacles; and some floor areas are highly reflective under artificial light, so glare leads to wrong segmentation output. The proposed approach does not require multiple images, camera calibration, edge cues, or geometric cues. Instead, floor patterns are selected dynamically, and segmentation is performed based on the chosen patterns. Extensive experiments were conducted on a broad set of real indoor corridor floor images.
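The abstract does not spell out the pattern-selection step, but the iterative idea it describes can be illustrated with a small sketch. The code below is a hypothetical simplification, not the authors' algorithm: it seeds a floor mask from a strip at the bottom of the image (where the floor is typically visible to a robot-mounted camera) and iteratively relabels pixels using the mask's mean intensity and standard deviation, two of the statistics named in the keywords.

```python
import numpy as np

def segment_floor(gray, seed_rows=20, k=2.0, iters=5):
    """Iteratively grow a floor mask from a seed strip at the image bottom.

    Hypothetical sketch: the bottom `seed_rows` rows are assumed to be floor.
    Each iteration re-estimates the floor's mean intensity and standard
    deviation from the current mask, then relabels pixels that lie within
    k standard deviations of the mean.
    """
    h, w = gray.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[h - seed_rows:, :] = True           # seed: bottom strip is floor
    for _ in range(iters):
        mu = gray[mask].mean()
        sigma = gray[mask].std() + 1e-6      # avoid zero std on flat floors
        new_mask = np.abs(gray - mu) <= k * sigma
        new_mask[h - seed_rows:, :] = True   # keep the seed fixed
        if np.array_equal(new_mask, mask):   # stop once the mask converges
            break
        mask = new_mask
    return mask

# Synthetic corridor: dark floor (intensity 50) below row 60, bright walls
# (intensity 200) above, plus mild sensor noise.
img = np.full((100, 80), 200.0)
img[60:, :] = 50.0
img += np.random.default_rng(0).normal(0, 3, img.shape)
floor = segment_floor(img)
```

A real implementation would follow the thresholding with a morphological closing using a structuring element (another keyword of the paper) to fill small holes left by texture and glare; that step is omitted here to keep the sketch dependency-free.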

Honnaraju B, S Murali, "Real-Time Indoor Floor Detection for Mobile Robots using Iterative Approach", International Journal of Computer Engineering in Research Trends (IJCERT), ISSN: 2349-7084, Vol. 7, Issue 04, pp. 23-27, April 2020.

Keywords: segmentation, pattern, floor, structuring element, standard deviation.

References:

1.	J. Adorno et al., "Smartphone-based Floor Detection in Unstructured and Structured Environments", IEEE, 2016.

2.	R. Deepu, B. Honnaraju, and S. Murali, "Path Generation for Robot Navigation using a Single-Camera", Procedia Computer Science, vol. 46, pp. 1425–1432, 2015, DOI: 10.1016/j.procs.2015.02.061.

3.	S. Bhowmick and A. Pant, "A novel floor segmentation algorithm for mobile robot navigation", IEEE, 2015.

4.	S. Kumar and K. Madhava Krishna, "Fusing Appearance and Geometric Cues for Adaptive Floor Segmentation over Images", Centre for Robotics, International Institute of Information Technology, 2013.

5.	X.-N. Cui, Y.-G. Kim, and H. Kim, "Floor segmentation by computing plane normal from image motion fields for visual navigation", International Journal of Control, Automation, and Systems, 7(5):788–798, 2009.

6.	Y.-G. Kim and H. Kim, "Layered ground floor detection for vision-based mobile robot navigation", in Proceedings of the IEEE International Conference on Robotics and Automation, 2004.

7.	J. Zhou and B. Li, "Robust ground plane detection with normalized homography in monocular sequences from a robot platform", in Proceedings of the International Conference on Image Processing, 2006.

