{"id":92,"date":"2016-11-27T19:06:27","date_gmt":"2016-11-28T00:06:27","guid":{"rendered":"https:\/\/cecas.clemson.edu\/~yue6\/?page_id=92"},"modified":"2017-05-20T22:58:10","modified_gmt":"2017-05-21T02:58:10","slug":"speakers","status":"publish","type":"page","link":"https:\/\/cecas.clemson.edu\/~yue6\/speakers\/","title":{"rendered":"Speakers"},"content":{"rendered":"\t\t\t\t\t\t<style>\r\n\t\t\t\t<style>\r\n#wpsm_accordion_339 .wpsm_panel-heading{\r\n\tpadding:0px !important;\r\n}\r\n#wpsm_accordion_339 .wpsm_panel-title {\r\n\tmargin:0px !important; \r\n\ttext-transform:none !important;\r\n\tline-height: 1 !important;\r\n}\r\n#wpsm_accordion_339 .wpsm_panel-title a{\r\n\ttext-decoration:none;\r\n\toverflow:hidden;\r\n\tdisplay:block;\r\n\tpadding:0px;\r\n\tfont-size: 18px !important;\r\n\tfont-family: Open Sans !important;\r\n\tcolor:#000000 !important;\r\n\tborder-bottom:0px !important;\r\n}\r\n\r\n#wpsm_accordion_339 .wpsm_panel-title a:focus {\r\noutline: 0px !important;\r\n}\r\n\r\n#wpsm_accordion_339 .wpsm_panel-title a:hover, #wpsm_accordion_339 .wpsm_panel-title a:focus {\r\n\tcolor:#000000 !important;\r\n}\r\n#wpsm_accordion_339 .acc-a{\r\n\tcolor: #000000 !important;\r\n\tbackground-color:#e8954c !important;\r\n\tborder-color: #ddd;\r\n}\r\n#wpsm_accordion_339 .wpsm_panel-default > .wpsm_panel-heading{\r\n\tcolor: #000000 !important;\r\n\tbackground-color: #e8954c !important;\r\n\tborder-color: #e8954c !important;\r\n\tborder-top-left-radius: 0px;\r\n\tborder-top-right-radius: 0px;\r\n}\r\n#wpsm_accordion_339 .wpsm_panel-default {\r\n\t\tborder:1px solid transparent !important;\r\n\t}\r\n#wpsm_accordion_339 {\r\n\tmargin-bottom: 20px;\r\n\toverflow: hidden;\r\n\tfloat: none;\r\n\twidth: 100%;\r\n\tdisplay: block;\r\n}\r\n#wpsm_accordion_339 .ac_title_class{\r\n\tdisplay: block;\r\n\tpadding-top: 12px;\r\n\tpadding-bottom: 12px;\r\n\tpadding-left: 15px;\r\n\tpadding-right: 15px;\r\n}\r\n#wpsm_accordion_339  .wpsm_panel 
{\r\n\toverflow:hidden;\r\n\t-webkit-box-shadow: 0 0px 0px rgba(0, 0, 0, .05);\r\n\tbox-shadow: 0 0px 0px rgba(0, 0, 0, .05);\r\n\t\tborder-radius: 4px;\r\n\t}\r\n#wpsm_accordion_339  .wpsm_panel + .wpsm_panel {\r\n\t\tmargin-top: 5px;\r\n\t}\r\n#wpsm_accordion_339  .wpsm_panel-body{\r\n\tbackground-color:#ffffff !important;\r\n\tcolor:#000000 !important;\r\n\tborder-top-color: #e8954c !important;\r\n\tfont-size:16px !important;\r\n\tfont-family: Open Sans !important;\r\n\toverflow: hidden;\r\n\t\tborder: 2px solid #e8954c !important;\r\n\t}\r\n\r\n#wpsm_accordion_339 .ac_open_cl_icon{\r\n\tbackground-color:#e8954c !important;\r\n\tcolor: #000000 !important;\r\n\tfloat:left !important;\r\n\tpadding-top: 12px !important;\r\n\tpadding-bottom: 12px !important;\r\n\tline-height: 1.0 !important;\r\n\tpadding-left: 15px !important;\r\n\tpadding-right: 15px !important;\r\n\tdisplay: inline-block !important;\r\n}\r\n\r\n\t\t\t #wpsm_accordion_339 .wpsm_panel-heading {\r\n\t\t\t\tbackground-image: url(https:\/\/cecas.clemson.edu\/~yue6\/wp-content\/plugins\/responsive-accordion-and-collapse\/img\/style-soft.png);\r\n\t\t\t\tbackground-position: 0 0;\r\n\t\t\t\tbackground-repeat: repeat-x;\r\n\t\t\t}\r\n\t\t\t#wpsm_accordion_339 .ac_open_cl_icon{\r\n\t\t\t\tbackground-image: url(https:\/\/cecas.clemson.edu\/~yue6\/wp-content\/plugins\/responsive-accordion-and-collapse\/img\/style-soft.png);\r\n\t\t\t\tbackground-position: 0 0;\r\n\t\t\t\tbackground-repeat: repeat-x;\r\n\t\t\t}\r\n\t\t\t<\/style>\t\r\n\t\t\t<\/style>\r\n\t\t\t<div class=\"wpsm_panel-group\" id=\"wpsm_accordion_339\" >\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse1\" 
onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tAamir Ahmad (Max Planck, Germany): Human-Multirobot Interaction in Cooperative Perception-based Search and Rescue Missions\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse1\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"https:\/\/ps.is.tuebingen.mpg.de\/employees\/aahmad\"> https:\/\/ps.is.tuebingen.mpg.de\/employees\/aahmad<\/a><\/p>\r\n<p>\r\nAbstract: In this talk I will present some new insights into the effects of shared autonomy on the success of human-robot collaborative search and rescue (SAR) missions within a multi-robot cooperative perception scenario. Multiple robots in a team can generate richer information of the surrounding in comparison to a single robot, therefore, increasing the chances of locating the survivors. However, the actual task of locating the survivors depends on two crucial factors, \r\ni) The operator's ability to correctly detect and classify survivors assuming that such classification is the human operator's job in the collaborative SAR mission, and ii) whether the robot team is searching for the survivors in the right locations. The second factor itself depends on the human operator's ability to control the robots either as a team or individually each robot, unless the robots survey the accident site fully autonomously. A novel and systematic psychophysical experiment for human-multi robot interaction, involving simulated search and rescue, is designed to find the dependency of success rate of SAR missions on the human operator's level of control within a shared-autonomy control scheme. 
We present results from 30 human operators and over 100 hours of human-multi-robot interaction experiments. \r\n<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse2\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tJoey Durham (Amazon Robotics, USA): Assembling Orders in Amazon\u2019s Robotic Warehouses\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse2\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"http:\/\/amazonpickingchallenge.org\/\"> http:\/\/amazonpickingchallenge.org\/<\/a><\/p>\r\n<p>\r\nAmazon Robotics builds the world\u2019s largest mobile robotic fleet, in which many thousands of robots deliver inventory shelves to pick operators in e-commerce warehouses. Each Amazon warehouse holds millions of items of inventory, most customer orders represent a unique combination of several items, and many orders need to be shipped within a couple of hours of being placed to meet delivery promises. 
This talk will describe how mobile robots and human operators collaborate to solve this challenging problem and enable Amazon to ship millions of orders every day.\r\n<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse3\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tAntonio Franchi (LAAS, France): What if my hand was a flying multi-robot system? Towards a natural and physical human-robot interaction\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse3\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"http:\/\/homepages.laas.fr\/afranchi\/\"> http:\/\/homepages.laas.fr\/afranchi\/<\/a><\/p>\r\n<p>\r\nIn this seminar, I will introduce the flying hand concept, a system conceived and developed together with Prof. Domenico Prattichizzo from the University of Siena. The flying hand is a visionary concept that sits at the crossroads of several disciplines in robotics, in particular: human-robot interfaces, hand grasping, multi-robot systems, and aerial robotics.\r\n\r\nIn its essence, the flying hand is a multi-robot system composed of two or more aerial (flying) robots. 
Each robot is endowed with a rigid tool that acts in the same way a finger of a real hand would, i.e., by exerting a unilateral force on an object to be grasped and transported.\r\nThe group of robots can be collectively tele-operated by a real human hand using either a haptic interface or any other gesture-capturing device.\r\nIn this way, the flying hand represents a unique concept in the panorama of human-multi-robot interfacing systems.\r\n\r\nIn this talk, I will briefly introduce the control, human-interface, and experimental-implementation aspects of this innovative robotic concept.\r\n<\/p>\r\n\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse4\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tShuzhi Sam Ge and Yanan Li (NUS, Singapore): Intelligent Control and Learning in Physical Human Robot Interaction\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse4\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"https:\/\/robotics.nus.edu.sg\/sge\/\"> https:\/\/robotics.nus.edu.sg\/sge\/<\/a><\/p>\r\n<p>\r\nRobots are expected to participate in and learn from intuitive, long-term interaction with humans, and be safely deployed in myriad social applications 
ranging from elderly care and entertainment to education. They are also envisioned to collaborate and co-work with human beings in the foreseeable future for productivity, service, and operations with guaranteed quality. In all these applications, robots that are stiff and tightly position-controlled will face problems such as saturation, instability, and physical failure when they interact with unknown environments. In this talk, I will present some of our recent work on the control of robots interacting with unknown environments and on physical human-robot interaction.\r\n<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse5\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tPaolo Robuffo Giordano (Lagadic team, France): Collective Control, State Estimation and Human Interaction for Quadrotors in Unstructured Environments\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse5\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"http:\/\/www.irisa.fr\/lagadic\/team\/Paolo.Robuffo_Giordano.html\"> http:\/\/www.irisa.fr\/lagadic\/team\/Paolo.Robuffo_Giordano.html<\/a><\/p>\r\n<p>\r\nThis talk will give an overview of some recent theoretical and experimental results in the field of collective 
control for multiple quadrotor UAVs under partial control of a human operator. In particular, we will discuss some possible shared-control frameworks in which the operator provides motion inputs and receives visual-force feedback informative of the group status. The talk will illustrate the nature and kind of problems addressed within this research line by focusing on both theoretical analyses and experimental implementations, and will then discuss some future research directions.\r\n<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse6\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tAndreas Kolling (The University of Sheffield, UK): Human Interaction with Robot Swarms - A Survey\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse6\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"https:\/\/www.sheffield.ac.uk\/acse\/staff\/ak\"> https:\/\/www.sheffield.ac.uk\/acse\/staff\/ak<\/a><\/p>\r\n<p>\r\nThe study of human-swarm interaction (HSI) is a novel research field that is motivated by the difficulty of effectively controlling large robot swarms. This talk aims to contribute to the foundations of HSI by proposing and discussing core concepts and open questions. 
In particular, we will discuss the properties of swarms that are most relevant for HSI, such as the cognitive complexity of solving tasks with swarm systems, challenges and solutions relating to human-swarm communication, state estimation and visualization, and effective human control of swarms. For the latter, we present a taxonomy of swarm control methods. We will present selected results from the state of the art, as well as highlight remaining challenges and open problems for human-swarm interaction.<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse7\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tValeria Villani (University of Modena and Reggio Emilia, Italy): Hands-free and infrastructure-less human-MRS interaction systems\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse7\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"http:\/\/www.arscontrol.unimore.it\/site\/home\/people\/valeria-villani.html\"> http:\/\/www.arscontrol.unimore.it\/site\/home\/people\/valeria-villani.html<\/a><\/p>\r\n<p>\r\nIn this talk, we will describe methodologies for letting a human operator interact with a multi-robot system in a natural manner, without the need for any dedicated infrastructure, and 
while keeping his\/her hands free. This is achieved by means of a wearable device, namely a smartwatch or a sensorized wristband, used to recognize wrist motion. The recognized motion is then translated, in a natural manner, into commands for the MRS. Feedback to the user is provided in terms of modulated frequency.\r\nWe will characterize, in a systematic manner, the features that make an interaction system \"natural\" and how they enhance the user's interaction experience.<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse8\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tYue Wang (Clemson, USA): Trust based Control and Motion Planning for Multi-Robot Systems with Human-in-the-Loop\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse8\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"http:\/\/yue6.people.clemson.edu\/\"> http:\/\/yue6.people.clemson.edu\/<\/a><\/p>\r\n<p>\r\nHuman-robot collaboration integrates the best of human intelligence with the advantages of autonomous robotic systems. This talk will begin with a discussion of human trust in robots and of modeling and measurement approaches for trust in real-time robotic operations. 
We consider the scenario where a human operator supervises a multi-robot team by teleoperating a selected robot while the other robots coordinate with each other autonomously. Based on these results, we present our recent works on trust-based motion planning, scheduling, and control of multi-robot systems. We will first introduce trust-based symbolic motion planning for robots, which is the process of specifying and planning robot tasks in a discrete space, then carrying them out in a continuous space in a manner that preserves the discrete-level task specifications. We then present co-design approaches for the scheduling and control of multi-robot systems such that human trust in each robot is kept within an acceptable range. Last but not least, we show a trust-based leader selection strategy for multi-robot bilateral haptic teleoperation.\r\n<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t<!-- Inner panel Start -->\r\n\t\t\t\t\t<div class=\"wpsm_panel wpsm_panel-default\">\r\n\t\t\t\t\t\t<div class=\"wpsm_panel-heading\" role=\"tab\" >\r\n\t\t\t\t\t\t  <h4 class=\"wpsm_panel-title\">\r\n\t\t\t\t\t\t\t<a  class=\"collapsed\"  data-toggle=\"collapse\" data-parent=\"\" href=\"javascript:void(0)\" data-target=\"#ac_339_collapse9\" onclick=\"do_resize()\">\r\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"ac_open_cl_icon fa fa-plus\"><\/span>\r\n\t\t\t\t\t\t\t\t\t\r\n\t\t\t\t\t\t\t\t \r\n\t\t\t\t\t\t\t\t<span class=\"ac_title_class\">\r\n\t\t\t\t\t\t\t\t\tFumin Zhang (Georgia Tech, USA): Interaction Between Human and A Swarm of Miniature Autonomous Blimps\t\t\t\t\t\t\t\t<\/span>\r\n\t\t\t\t\t\t\t<\/a>\r\n\t\t\t\t\t\t  <\/h4>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t\t<div id=\"ac_339_collapse9\" class=\"wpsm_panel-collapse collapse \"  >\r\n\t\t\t\t\t\t  <div class=\"wpsm_panel-body\">\r\n\t\t\t\t\t\t\t<p> Website : <a href= \"http:\/\/users.ece.gatech.edu\/~fumin\/\"> 
http:\/\/users.ece.gatech.edu\/~fumin\/<\/a><\/p>\r\n<p>\r\nSignificant recent advances in human-robot interaction call for convenient mobile platforms that move in three-dimensional space to support experiments and demonstrations.\r\nUnmanned aerial vehicles such as quad-rotors and multi-copters have become popular for this purpose. However, the indoor usage of these unmanned aerial vehicles (UAVs) is limited by flight duration per battery charge (typically less than 20 minutes) and by safety concerns for humans sharing the same lab space. Safety nets or cages provide human protection, but sacrifice the potential for human-robot interaction experiments.\r\nWe have developed the Georgia Tech Miniature Autonomous Blimp (GT-MAB) as a flying vehicle for indoor experiments that supports safe interaction between humans and robot swarms. The GT-MAB has a relatively long flight duration of up to two hours per battery charge. Furthermore, the blimps are naturally cushioned and do not cause any pain when they collide with humans. They offer a fun experience that often encourages physical contact with humans. We have developed vision-based feedback control laws that enable the GT-MAB to detect and track humans. An online learning algorithm is developed to select proper robot movements that generate interesting motion patterns in reaction to human movements.\r\n<\/p>\t\t\t\t\t\t  <\/div>\r\n\t\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<\/div>\r\n\t\t\t\t\t<!-- Inner panel End -->\r\n\t\t\t\t\t\r\n\t\t\t\t\t\t\t<\/div>\r\n\t\t\t\r\n<script type=\"text\/javascript\">\r\n\t\r\n\t\t\/\/ Toggle state lives outside do_resize() so it persists across clicks\r\n\t\t\/\/ and the iframe size actually alternates between its current size and 640x360.\r\n\t\tvar toggleSize = true;\r\n\t\tfunction do_resize(){\r\n\r\n\t\t\tvar width=jQuery( '.wpsm_panel .wpsm_panel-body iframe' ).width();\r\n\t\t\tvar height=jQuery( '.wpsm_panel .wpsm_panel-body iframe' ).height();\r\n\r\n\t\t\tjQuery('iframe').animate({\r\n\t\t\t    width: toggleSize ? width : 640,\r\n\t\t\t    height: toggleSize ? 
height : 360\r\n\t\t\t  }, 250);\r\n\r\n\t\t\t  toggleSize = !toggleSize;\r\n\t\t}\r\n\t\t\r\n<\/script>\t\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>&nbsp;<\/p>\n","protected":false},"author":19,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-92","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/pages\/92","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/users\/19"}],"replies":[{"embeddable":true,"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/comments?post=92"}],"version-history":[{"count":11,"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/pages\/92\/revisions"}],"predecessor-version":[{"id":341,"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/pages\/92\/revisions\/341"}],"wp:attachment":[{"href":"https:\/\/cecas.clemson.edu\/~yue6\/wp-json\/wp\/v2\/media?parent=92"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}