Keynote Speaker
Prof. You-Fu Li (IEEE Fellow)
City University of Hong Kong, China
Biography: You-Fu Li received the PhD degree in robotics from the Department of Engineering Science, University of Oxford, in 1993. From 1993 to 1995 he was a research staff member in the Department of Computer Science at the University of Wales, Aberystwyth, UK. He joined City University of Hong Kong in 1995 and is currently a professor in the Department of Mechanical Engineering. His research interests include robot sensing, robot vision, and visual tracking. In these areas, he has published over 400 papers, including over 200 SCI-listed journal papers. Dr Li has received many awards in robot sensing and vision, including the IEEE Sensors Journal Best Paper Award from the IEEE Sensors Council, the Second Prize of the Natural Science Research Award from the Ministry of Education, the First Prize of the Natural Science Research Award of Hubei Province, and the First Prize of the Natural Science Research Award of Zhejiang Province, China. He has been listed among the top 2% of the world's most highly cited scientists by Stanford University (2020 onward and career-long). He has served as an Associate Editor for the IEEE Transactions on Automation Science and Engineering (T-ASE), Associate Editor and Guest Editor for the IEEE Robotics and Automation Magazine (RAM), and Editor for the Conference Editorial Board (CEB) of the IEEE International Conference on Robotics and Automation (ICRA). He is a Fellow of the IEEE.
Speech Title: 3D Visual Sensing and Tracking for Robots
Abstract: Visual sensing and tracking are needed in many engineering applications, including robotics. In this talk, I will present our research on visual sensing, for automated 3D sensing in general and for motion tracking in robotics in particular. Several approaches from our investigations in 3D vision will be reported, including an active vision approach to 3D visual sensing. Robotic applications often require visual sensing in 3D, but calibration remains tedious and inflexible with traditional approaches. To this end, we have investigated the relevant issues for different visual sensing systems. A flexible calibration method requires that the vision system parameters be recalibrated automatically, or with minimal operator intervention, whenever the configuration of the system changes, but in practice this is often hard to achieve. In our previous work we made various attempts to enhance the flexibility of visual sensing calibration. I will present some of them, including our work on omni-directional visual sensing and tracking. Another case to be presented is gaze tracking, where the modeling and calibration issues are addressed with a new calibration method we have developed.
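For context, the "tedious and inflexible" traditional calibration the abstract contrasts against is typified by target-based intrinsic calibration: an operator captures many views of a known pattern and must repeat the whole procedure whenever the camera setup changes. Below is a minimal sketch of that classical checkerboard procedure using OpenCV; it is a generic illustration, not Prof. Li's method, and the pattern size, square size, and image folder are assumed values.

import glob
import cv2
import numpy as np

# Interior-corner count and square size of the checkerboard target
# (assumed for illustration; match these to the actual printed target).
PATTERN = (9, 6)
SQUARE_SIZE = 0.025  # metres

# 3D coordinates of the corners in the target's own frame (z = 0 plane).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.png"):  # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        # Refine corner locations to sub-pixel accuracy.
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Estimate the intrinsic matrix K and lens distortion from all views.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)

The requirement to physically present a target and re-run this capture-and-optimize loop after any change to the camera or lens is exactly the operator interference that the flexible calibration methods discussed in the talk aim to reduce.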