UBC Theses and Dissertations
Statistically-informed multimodal domain adaptation in industrial human-robot collaboration environments
Mukherjee, Debasmita
Abstract
Increased global competition has placed great demand on manufacturers to be flexible with their products and services. This can be addressed by introducing robots, which are effective at carrying out repetitive, non-ergonomic tasks, in partnership with human operators, who typically excel at precise tasks requiring dexterity, flexibility, and cognitive decision-making. This paradigm of humans and robots working together forms the motivation behind the field of human-robot collaboration (HRC). This dissertation begins by introducing a novel taxonomy of HRC to better articulate the possible interactions between humans and robots based on levels of robot intelligence and autonomy.

Cohesive HRC can be achieved through communication between human and robot partners. The field of human-robot communication (HRCom) finds its roots in human communication and aims to achieve the “naturalness” inherent in the latter. This dissertation posits that system design can take inspiration from human communication to create more intuitive systems that truly leverage the presence of the human as a collaborating agent, so that the human's role is more meaningful than that of a mere command centre. However, this goal must be met at no additional effort to the human.

HRCom can be achieved through a robust robot perception system developed using machine learning. The challenge is the dearth of comprehensive, labelled datasets, while standard, publicly available ones do not generalize well to domain- and application-specific scenarios. Furthermore, models also fail to generalize under domain shifts stemming from changes in the robot's environment. Keeping in mind these challenges and the complexities inherent in HRCom, a framework, SIMLea, is presented. Statistically-Informed Multimodal (Domain Adaptation by Transfer) Learning takes inspiration from human communication, using human feedback to auto-label data for domain adaptation.

The strength of the contribution lies in the use of incommensurable multimodal decision-level inputs for personalizing with user-specific data, leading to statistically informed extension of datasets, greater safety, enhanced monitoring of the model's continuous learning, and judicious use of resources. The framework is validated with facial expression and hand gesture recognition for involuntary and voluntary communication, but is also applicable to other combinations of multimodal inputs in HRC applications.
Item Metadata

Title: Statistically-informed multimodal domain adaptation in industrial human-robot collaboration environments
Creator: Mukherjee, Debasmita
Supervisor:
Publisher: University of British Columbia
Date Issued: 2023
Genre:
Type:
Language: eng
Date Available: 2023-04-19
Provider: Vancouver : University of British Columbia Library
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
DOI: 10.14288/1.0431170
URI:
Degree:
Program:
Affiliation:
Degree Grantor: University of British Columbia
Graduation Date: 2023-05
Campus:
Scholarly Level: Graduate
Rights URI:
Aggregated Source Repository: DSpace