Rendering of Constraints with Underactuated Haptic Devices
Several previous works have studied the application of proxy-based rendering algorithms to underactuated haptic devices. However, these works make oversimplifying assumptions about the configuration of the haptic device, and they ignore the user's intent. In this work, we lift those assumptions and carry out a theoretical study that unveils the existence of unnatural ghost forces under typical proxy-based rendering. We characterize and quantify these ghost forces. In addition, we design a novel rendering strategy with anisotropic coupling between the device and the proxy. With this strategy, the forces rendered by an underactuated device best match those that a fully actuated device would render. We demonstrate our findings in synthetic experiments and a simple real-world experiment.
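The ghost-force idea can be illustrated with a minimal sketch (all names and numbers here are illustrative, not the paper's formulation): with an isotropic proxy-device spring, the coupling requests force along directions the device cannot actuate, whereas an anisotropic stiffness restricted to the actuated subspace never does.

```python
import numpy as np

# Hypothetical 2-D example: a device actuated only along the x axis.
# Columns of A span the actuated subspace.
A = np.array([[1.0], [0.0]])

def coupling_force(proxy, device, K):
    """Spring coupling between proxy and device with stiffness matrix K."""
    return K @ (proxy - device)

proxy = np.array([0.0, 0.0])
device = np.array([0.1, 0.2])   # displaced along both axes

# Isotropic coupling: requests a force component along the unactuated
# y axis; projecting it away leaves a mismatch ("ghost force") w.r.t.
# what a fully actuated device would render.
K_iso = 100.0 * np.eye(2)
f_iso = coupling_force(proxy, device, K_iso)

# Anisotropic coupling: stiffness restricted to the actuated subspace,
# so the commanded force already lies in the renderable directions.
K_aniso = 100.0 * (A @ A.T)
f_aniso = coupling_force(proxy, device, K_aniso)

print(f_iso)    # has an unrenderable y component
print(f_aniso)  # lies entirely in the actuated subspace
```

Both couplings agree on the actuated direction; they differ only in the force requested along the unactuated one.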
Soft Hand Simulation for Smooth and Robust Natural Interaction
Natural hand-based interaction should feature hand motion that adapts smoothly to the tracked user's motion, reacts robustly to contact with objects in a virtual environment, and enables dexterous manipulation of these objects. In our work, we enable all these properties thanks to an efficient soft hand simulation model. This model integrates an articulated skeleton, nonlinear soft tissue, and frictional contact to provide the realism necessary for natural interaction. As a result, we deliver hand simulation as an asset that can be connected to diverse input tracking devices and seamlessly integrated into game engines for fast deployment in VR applications.
Wearable devices are driven by a soft hand simulation integrated in Unreal Engine, allowing interaction with arbitrary objects and arbitrary scenes, like this Jenga game. Multiple components are involved: tactile devices + Oculus + Leap Motion + Unreal + CLAP.
Proxy-Based Haptic Rendering for Underactuated Haptic Devices
Standard haptic rendering algorithms are not well suited for underactuated haptic devices. They compute forces oblivious to underactuation, and then simply project the resulting forces onto the actuated subspace. We propose instead a proxy-based haptic rendering method that computes displacements in the actuated subspace only, and then translates these displacements into force commands using regular controllers. Our method is well behaved in two important ways: it is locally passive with respect to the motion of the haptic device, and the displayed impedance can be easily controlled regardless of the mapping between device and virtual configuration spaces.
In Proceedings of IEEE World Haptics 2017, pp. 48–53.
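The displacement-first idea can be sketched as follows (a toy model with assumed names, not the paper's exact method): project the proxy-device displacement into the actuated subspace before any force is computed, and feed only that projected displacement to a standard controller, rather than computing a full force and projecting it afterwards.

```python
import numpy as np

# Hypothetical 3-D device with two actuated directions; the columns
# of J (orthonormal here) span the actuated subspace.
J = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

def actuated_displacement(proxy, device):
    # Project the proxy-device displacement into the actuated subspace
    # *before* computing any force.
    d = proxy - device
    return J @ (J.T @ d)    # orthogonal projection since J is orthonormal

def force_command(proxy, device, kp=200.0):
    # Regular proportional controller acting on the projected displacement.
    return kp * actuated_displacement(proxy, device)

f = force_command(np.array([0.0, 0.0, 0.0]),
                  np.array([0.01, 0.02, 0.05]))
# The third component is zero by construction: no force is ever
# requested along unactuated directions.
```

A damping term on the projected velocity would extend this to a full PD controller in the same way.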
Novel wearable tactile interfaces offer the possibility to simulate tactile interactions with virtual environments directly on our skin. But, unlike kinesthetic interfaces, for which haptic rendering is a well-explored problem, they pose new questions about the formulation of the rendering problem. In this work, we propose a formulation of tactile rendering as an optimization problem, which is general for a large family of tactile interfaces. Based on an accurate simulation of contact between a finger model and the virtual environment, we pose tactile rendering as the optimization of the device degrees of freedom, such that the contact surface between the device and the actual finger matches the contact surface in the virtual environment as closely as possible. We describe the optimization formulation in general terms, and we also demonstrate its implementation on a thimble-like wearable device. We validate the tactile rendering formulation by analyzing the contact surface produced by the device on the finger, and we show that it outperforms other approaches.
IEEE Transactions on Haptics 10(2):254–264, 2017.
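The optimization view can be illustrated with a deliberately crude one-degree-of-freedom sketch (the device model, sample values, and names below are all assumptions, not the paper's): choose the device configuration that minimizes the squared deviation between the contact surface it induces on the finger and the target contact surface computed in the virtual environment.

```python
import numpy as np

# Target contact depths sampled from the virtual-environment simulation
# (illustrative values at five points on the fingerpad).
target = np.array([0.0, 0.2, 0.5, 0.2, 0.0])

def device_contact(q):
    # Toy device model: a platform pressed to height q produces an
    # indentation clamped by per-sample limits of the device workspace.
    return np.minimum(q, np.array([0.1, 0.4, 0.8, 0.4, 0.1]))

def objective(q):
    # Squared deviation between rendered and target contact surfaces.
    return np.sum((device_contact(q) - target) ** 2)

# Brute-force search over the single degree of freedom; the papers use
# proper optimization, but a 1-D grid makes the idea visible.
qs = np.linspace(0.0, 1.0, 1001)
q_star = qs[np.argmin([objective(q) for q in qs])]
```

With multiple device degrees of freedom the same objective is minimized with a gradient-based solver instead of a grid search.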
Efficient Nonlinear Skin Simulation for Multi-Finger Tactile Rendering
Recent advances in tactile rendering span wearable cutaneous interfaces, tactile rendering algorithms, and nonlinear soft skin models, among others. However, the adoption of these advances for multi-finger tactile rendering of dexterous grasping and manipulation is hampered by the computational cost incurred by nonlinear skin models when applied to the full hand. We have observed that classic constrained dynamics solvers, typically designed for contact mechanics, fail to perform efficiently on deformation constraints of nonlinear skin models.
In this paper, we propose a novel constrained dynamics solver designed to perform well with highly nonlinear deformation constraints. In practice, we achieve more than 10× speed-up over previous approaches, and as a result we enable multi-finger tactile rendering of manipulation actions that capture the nonlinearity of skin.
In Proceedings of IEEE Haptics Symposium 2016, pp. 155–160.
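Why nonlinear constraints trouble classic solvers can be seen in a toy example (illustrative only, not the paper's solver): a single linearized projection, of the kind contact-oriented solvers apply once per step, leaves a large residual on a strongly nonlinear constraint, while re-linearizing Newton-style drives the residual to zero.

```python
import numpy as np

# Nonlinear deformation-style constraint on a point x: C(x) = |x|^2 - L^2.
L = 1.0
def C(x):      return x @ x - L * L
def gradC(x):  return 2.0 * x

def project_once(x):
    # One linearized projection: x <- x - C(x) * gradC / |gradC|^2
    g = gradC(x)
    return x - (C(x) / (g @ g)) * g

x0 = np.array([1.6, 1.2])          # strongly violating configuration

# Single linearization (contact-solver style): residual remains large.
x_lin = project_once(x0)

# Repeated re-linearization (Newton-style): residual vanishes quickly.
x_nl = x0.copy()
for _ in range(10):
    x_nl = project_once(x_nl)

print(abs(C(x_lin)), abs(C(x_nl)))
```

For the nearly linear constraints typical of contact, the first projection is already accurate, which is why classic solvers get away with it there.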
Soft Finger Tactile Rendering for Wearable Haptics
This paper introduces a tactile rendering algorithm for wearable cutaneous devices that stimulate the skin through local contact surface modulation. The first step in the algorithm simulates contact between a skin model and virtual objects, and computes the contact surface to be rendered. The accuracy of this surface is maximized by simulating soft skin with its characteristic nonlinear behavior. The second step takes the desired contact surface as input, and computes the device configuration by solving an optimization problem, i.e., minimizing the deviation between the contact surface in the virtual environment and the contact surface rendered by the device. The method is implemented on a thimble-like wearable device.
In Proceedings of IEEE World Haptics 2015, pp. 327–332.