Contreras, Cesar Alan, Chiou, Manolis, Rastegarpanah, Alireza, Szulik, Michal and Stolkin, Rustam (2026). Probabilistic Human Intent Prediction for Mobile Manipulation: An Evaluation with Human-Inspired Constraints. Journal of Intelligent & Robotic Systems, 112.
Abstract
We present GUIDER (Global User Intent Dual-phase Estimation for Robots), a dual-phase probabilistic framework for intent inference in mobile manipulation that operates without predefined goals. A Synergy Map fuses motion evidence with an occupancy grid to rank likely interaction areas during navigation. After arrival, perception merges U-Net and FastSAM saliency with three geometric grasp-feasibility tests; an end-effector kinematics-aware update then evolves object probabilities in real time. In 100 teleoperation trials (20 participants × 5 tasks) in Isaac Sim, GUIDER outperformed baselines. During navigation, median stability was 100% across tasks (BOIR, the baseline, had an overall median of 89.85%), with large gains under redirection (BOIR 59.67–63.49% in T2/T5). During manipulation, median stability was 100% in all tasks, while Trajectron (the manipulation baseline) dropped to 62.68% for tool grasping (T4). GUIDER yielded earlier confident object predictions in geometry-constrained settings (T5: 20.31 s remaining vs 3.89 s). Ablations confirm the need for the multi-horizon synergy map, the grasp-feasibility checks, and temporal end-effector probability evolution. GUIDER provides a unified probabilistic backbone spanning base and arm, supporting future variable-autonomy controllers.
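The abstract describes evolving object (goal) probabilities in real time from motion evidence. A minimal sketch of that general idea — a recursive Bayesian update over candidate goals, where goals aligned with the user's current motion gain probability — is shown below. The likelihood model, `beta` gain, and all names here are illustrative assumptions, not GUIDER's actual formulation (which fuses occupancy, saliency, and grasp-feasibility evidence as described above).

```python
import numpy as np

def update_goal_beliefs(priors, goals, position, velocity, beta=2.0):
    """One recursive Bayesian step over candidate goal positions.

    Goals whose direction aligns with the current velocity receive higher
    likelihood (a common directional model in intent inference; illustrative
    only, not the paper's method).
    """
    directions = goals - position                       # vectors to each goal
    norms = np.linalg.norm(directions, axis=1)
    speed = np.linalg.norm(velocity)
    # cosine alignment between the motion direction and each goal direction
    align = directions @ velocity / (norms * speed + 1e-9)
    likelihood = np.exp(beta * align)                   # exponential evidence model
    posterior = priors * likelihood
    return posterior / posterior.sum()                  # normalize to a distribution

# Three hypothetical goals; the user steadily moves toward the first one.
goals = np.array([[5.0, 0.0], [0.0, 5.0], [-5.0, 0.0]])
beliefs = np.full(3, 1 / 3)
for _ in range(5):
    beliefs = update_goal_beliefs(beliefs, goals, np.zeros(2), np.array([1.0, 0.0]))
print(beliefs.argmax())  # → 0 (belief concentrates on the goal being approached)
```

In this toy run the belief mass concentrates on the goal the motion points at after a few steps; "stability" in the abstract can be read as how consistently such a top-ranked goal stays correct over a trial.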
| Publication DOI: | https://doi.org/10.1007/s10846-026-02362-4 |
|---|---|
| Divisions: | College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies > Applied AI & Robotics; College of Engineering & Physical Sciences; College of Engineering & Physical Sciences > Aston Centre for Artificial Intelligence Research and Application; College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies; Aston University (General) |
| Funding Information: | This work was funded by the Nuclear Decommissioning Authority (NDA) and supported by the United Kingdom National Nuclear Laboratory (UKNNL). In addition, it was supported by the UK Research and Innovation (UKRI) project “REBELION” under Grant 10079049. |
| Additional Information: | This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |
| Uncontrolled Keywords: | Variable Autonomy, Intent Inference, Logical Constraints, Human-Robot Interaction, Mobile Manipulation, Human Intent |
| Data Access Statement: | The datasets generated and analysed during the current study are available from the corresponding author upon reasonable request. |
| Last Modified: | 31 Mar 2026 14:18 |
| Date Deposited: | 31 Mar 2026 14:18 |
| Full Text Link: | |
| Related URLs: | https://link.sp ... 846-026-02362-4 (Publisher URL) |
| PURE Output Type: | Article |
| Published Date: | 2026-03-05 |
| Accepted Date: | 2026-01-22 |
| Authors: | Contreras, Cesar Alan; Chiou, Manolis; Rastegarpanah, Alireza (0000-0003-4264-6857); Szulik, Michal; Stolkin, Rustam |