How to define the correct guidelines for enhanced telepresence and task embodiment in remote palpation

Teleoperated robots have been widely accepted in several fields of medical practice, enhancing human abilities and allowing remote operation. However, such technology has not yet been able to permeate areas such as primary care and physical examination. These applications strongly rely on the quality of the interaction between doctor and patient, and on its multimodal nature. To achieve remote physical examination, it is thus mandatory to have a good doctor-robot interface, but what does good mean? Ultimately, the goal is for the user to achieve task embodiment, making the remote task feel like the in-person one. Several research groups have proposed a wide variety of interfaces, showcasing largely different methods of control and feedback, because of the absence of design guidelines. In this work, we argue that the ideal interface for a remote task should resemble as closely as possible the experience provided by the in-person equivalent, while keeping in consideration the nature of the target users. To support our claims, we analyze many remote interfaces and compare them with the respective in-person tasks. This analysis is not limited to the medical sector, with examples such as remote abdominal surgery, but extends to all forms of teleoperation, up to nuclear waste handling and avionics.


1. Introduction
1.1. Teleoperation in healthcare
Teleoperation is the ability to perform a task remotely by controlling a robot [1]. Teleoperation is extremely valuable when the working site is unreachable or when the presence of the human operator in the remote environment would be dangerous for the operator or for other individuals, as in nuclear material handling, aeronautics, and medical practice [2,3,4]. Teleoperated systems comprise a user, a leader interface, a follower robot, and a remote environment; in medical applications, the user is a doctor and the remote environment is a patient. Teleoperation has been well accepted in the healthcare system for more than two decades, with a large number of applications from neurosurgery to endoscopic interventions [5,6]. The combination of enhanced precision, tremor filtering, and miniaturization of the access port to the human body, whilst keeping the doctor in charge of the operation, has enabled medical performance beyond human capabilities [7]. All the mentioned fields share the intrinsic challenge of reaching the operation site: teleoperated robots can easily reach where the surgeon's hands cannot. The most representative example of such technology is the Da Vinci Surgical System (Intuitive Surgical Inc.), a teleoperated architecture used for remote surgery. The ideal remote system features both high teleoperation, the ability to act in the remote environment, and high telepresence, the feeling of being in the remote environment (see Fig. 1). Through telepresence, the doctor fully embodies the task, perceiving it as if it were performed in person rather than remotely. This feature of remotely controlled systems is the key to achieving the ideal synergy between the doctor and the machine, referred to in the literature as 'human-robot orchestration' [8]. Although no commercially available system features haptic feedback, implementing haptic feedback has been proven more than feasible [9,10] and easily implementable in existing commercial systems [11,12]. Driven by the desire to expand robotic technologies to other aspects of medical practice, several research groups have started developing systems to achieve remote palpation [13].

Figure 1. Schematized example of generic medical teleoperation featuring a doctor, a leader, a follower, and a patient. In an intuitive system, the doctor would behave in the same way as in the in-person task.
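The leader-follower loop described above can be sketched in a few lines. The model below is a minimal, illustrative sketch, not taken from any commercial system: the tissue is assumed to be a linear spring and the motion scaling is assumed to be unity. The leader streams a position command to the follower (teleoperation), while the follower's contact force is returned to the user's hand as kinaesthetic feedback (telepresence).

```python
# Minimal sketch of one cycle of a bilateral leader-follower loop.
# ContactModel, teleoperation_step, and all gains are illustrative
# assumptions, not from any commercial system.

from dataclasses import dataclass

@dataclass
class ContactModel:
    """Hypothetical remote environment: a linear-spring tissue surface."""
    surface_z: float = 0.0      # tissue rest height (m)
    stiffness: float = 300.0    # assumed tissue stiffness (N/m)

    def force(self, follower_z: float) -> float:
        penetration = max(0.0, self.surface_z - follower_z)
        return self.stiffness * penetration  # normal reaction force (N)

def teleoperation_step(leader_z: float, env: ContactModel,
                       scale: float = 1.0) -> tuple[float, float]:
    """One control cycle: forward scaled motion, return haptic feedback."""
    follower_z = scale * leader_z      # position command (teleoperation)
    feedback = env.force(follower_z)   # force rendered at the leader (telepresence)
    return follower_z, feedback

env = ContactModel()
# Hovering above the tissue: no contact, hence no force feedback.
_, f_free = teleoperation_step(leader_z=0.05, env=env)
# Pressing 1 cm into the tissue: spring-like resistance is fed back.
_, f_contact = teleoperation_step(leader_z=-0.01, env=env)
print(round(f_free, 6), round(f_contact, 6))  # → 0.0 3.0
```

A real bilateral controller would also have to address scaling, time delay, and stability, but even this sketch shows where telepresence enters: the feedback path from follower to leader.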

1.2. Palpation and Remote Palpation
The most common test run by primary care doctors is the physical examination [14]. Not only does it lead to the most suitable follow-up exam, but in some cases also to a correct diagnosis of abdominal pathologies [15]. Requiring no additional tools besides the doctor's hands and expertise, it represents a key element of the healthcare system's workflow. Physical examination is composed of four parts: auscultation, inspection, percussion, and palpation [16,17]. The latter has been studied both in simulations [18] and in real scenarios [19] because of its high dexterity and multimodal feedback. Given the pivotal role of such a practice, what happens when it cannot be performed in person? Teleoperated systems seem to offer a possible solution, both enabling the diagnosis of unreachable patients and slowing down the ever-growing shortage of physicians [20]. When analyzing state-of-the-art teleoperated systems for remote palpation, it can be observed that the absence of standard design guidelines has led to a wide variety of different implementations of the doctor-robot human-machine interface (HMI), none of which showcases the telepresence required to achieve a good performance. Generally, the main direction taken by this field is the implementation of haptic systems able to record and convey tactile feedback from the remote environment to the user intuitively and naturally [21].
Moreover, even the delivery modality of the tactile feedback itself has multiple implementations in the literature. In some cases, the tactile information is shown either directly [22,23] or through a simulation of the interaction with the soft tissues [24,25,26]: this feedback interface can be considered an add-on for commercial systems that introduces no stability problems. However, the ultimate result lacks both fidelity and transparency, because the feedback is delivered visually and the motion required to control the trajectory of the robotic hand feels unnatural, relying on the same open kinematic chain used in surgical robots. In fact, intuitiveness plays a fundamental role in achieving telepresence and effectiveness [27,28]. By intuitiveness, we mean the ability of the system to achieve high telepresence with minimal learning required. Note that high telepresence is effectively achieved when the remote system is embodied by the user and becomes part of her/his body schema, allowing her/him to perceive the remote task as if it were performed in person. In other cases, fingertip tactile haptic devices have been implemented and used together with a traditional control interface. The simplest solutions feature vibrating stimulation devices, placed on top of the user's fingers [29] or over the control interface [30]. The most complex interfaces reproduce the entire surface's morphology changes, acting as physical twins of the touched surface in the remote environment [31,32,33]. Those devices are able to reproduce any shape touched by the follower and convey both kinaesthetic and tactile haptic information, but they are solely feedback interfaces and cannot be used for trajectory control of the follower. This negatively impacts the telepresence of the overall system, as the user still needs to learn how to use a system that inevitably feels very different from using her/his own hand to perform in-person palpation. Once again, the interface may feel unnatural and unintuitive and promote estrangement in the user, because the task no longer requires palpating a soft interface, but controlling a joystick.
Such a problem has been tackled by using the physical phantom also as a control input [34,35], producing more intuitive interfaces for human-machine interaction. In the work by Li et al. [36], the same interface is used both for sending control commands and for delivering tactile feedback. However, state-of-the-art implementations of said technology have extremely limited dexterity, with discrete predetermined trajectories, strongly impairing the user's ability to perform any desired motion and once again promoting estrangement in the user.
2. Comparison of remote and in-person tasks in successful teleoperated applications

2.1. Medical field
One great example of remote teleoperation in the medical field is RAMIS: Robot-Assisted Minimally Invasive Surgery. A user, usually a surgeon, uses a leader interface to control the movements of the follower, which is composed of an endoscopic camera and a set of minimally invasive instruments. Concerning the benefits, RAMIS not only provides low infection rates, shorter recovery time, and reduced blood loss for the patient [37], but also enhanced precision, dexterity, and vision for the doctor [38,39]. Hence, RAMIS has radically transformed surgery [40], going from just over 100,000 robot-assisted operations in 2008 [41] to over a million in 2018 [42]. These numbers are bound to increase even further as new systems approach the market [43,44,45]. As teleoperated systems permeate surgery, they showcase numerous benefits compared to traditional methods [46,47]. As a result, many procedures have been converted from laparoscopic surgery to RAMIS, from cancer removal to partial organ resection [48,49,50,51,52,53,54,55,56,57,58,59,60]. As emerges from all these studies, RAMIS represents a significant upgrade to laparoscopic surgery. It is also very important to note that surgeons performing RAMIS are laparoscopic surgeons with many years of experience using laparoscopic instruments, as RAMIS is a direct successor of minimally invasive surgery, or laparoscopy, and not of traditional open surgery. Because of that, all commercial devices present the user with an interface that resembles as much as possible the handles of a laparoscopic instrument (see Fig. 2). Because of the specific target user, reproducing the interface that characterizes a task they are already proficient in can effectively lower the time needed to learn the new tool and avoid the user's estrangement from the task. Moreover, a closer look at commercially available systems such as the Da Vinci or the Raven [61] reveals that these systems do not implement any haptic feedback. Such feedback is instead present, even if limited, in laparoscopic surgery, and its absence complicates the handling of soft tissues [62]. However, as already mentioned in Section 1.2, several solutions explored in the literature are able to add such a feature. Note that even in laparoscopic surgery, point-wise kinaesthetic haptic feedback is the only form of mechanical feedback available to doctors, given the extremely small contact area between the tool and the patient's tissues.

2.2. External fields
Nuclear power plants (NPPs) are a prime application for teleoperation, mainly because of the high level of radiation produced by the nuclear reactor and the radioactive waste [2]. In such a challenging environment, many tasks cannot be executed in person by operators without a great health hazard or potential danger. In NPPs, teleoperation can be used for a wide range of tasks: from remotely cleaning the reactor's walls [63] to safely performing maintenance [64] and handling radioactive waste [65]. The last two examples are particularly relevant, representing high-dexterity handling tasks in which objects need to be visually inspected and correctly placed. This type of teleoperation does not radically differ from teleoperated surgery, in the sense that the aim is to correctly follow a trajectory through space and perform pick-up and placement. In fact, such a task can be performed in person through a glovebox (see Fig. 3a): a radiation-shielded cage into which an operator can insert her/his hands thanks to protective gloves. However, performing the same task remotely could not only reduce even further the level of radiation experienced by the user but also prevent any possible damage in case of failure (see Fig. 3c). Intuitively, teleoperated systems developed for such applications have to withstand radiation, and every component needs to be tested accordingly [66]. Concerning the system's architecture, because the only requirement is to check whether the correct object was grasped and placed correctly, an interface consisting of an open kinematic chain with clamps on the handle and visual feedback has been shown to achieve acceptable results in reactors such as the Joint European Torus (JET). However, the limited dexterity and the lack of multimodal feedback, with only visual feedback available, led to increased operation times when compared with the in-person task. When designing the teleoperated system for the maintenance of the experimental fusion plant ITER, an extensive study analyzed and optimized all the procedures so as to save as much time as possible and reduce the delays caused by the low telepresence of the teleoperated systems [67]. Moreover, following a trend already explored in teleoperated surgery, enabling shared control for trajectory optimization and obstacle avoidance [68] and adding multimodal feedback, such as combined haptic and visual feedback [69], significantly improves performance and operation time. Note that the implemented haptic feedback is kinaesthetic and not tactile, assuming a point-like contact between the follower and the remote environment. This assumption works relatively well with clamps and a rigid environment, but falls short in reproducing extended soft contact surfaces, such as the human abdomen inspected during palpation.
Another field in which teleoperation has been heavily involved is aerospace, specifically in the development of Unmanned Aerial Vehicles (UAVs) and drones. Any surveillance task that requires monitoring of large or dangerous zones is a prime target for avionic teleoperation: from road surveillance [70] to applications in hazardous areas such as mines [71] and volcanic landscapes [72], on top of large-scale natural-area monitoring both on land and at sea [73,74,75,76]. In this case, teleoperation has multiple advantages: on one side, the devices are much smaller and more compact, reducing the costs of purchase and maintenance; on the other, in case of a fatal crash there is no human casualty, because the operator is not physically on the vehicle. In contrast to NPP teleoperation, this type of teleoperation is radically different from what is requested by medical teleoperation, in the sense that it does not concern manipulation and is not performed with a robotic manipulator, but rather with an autonomous flying vehicle. Historically, the interface presented to the user, or remote pilot, has been as similar as possible to the actual interface of a human-operated vehicle [77] (see Fig. 3b and 3d). In practice, these control interfaces are fundamentally flight simulators used to control a real vehicle rather than simulate one. This solution is ideal for already trained pilots, because they have already learned how to use such an interface and the skill transfer required is minimal, given the high similarity to the in-person case. However, since aerial devices such as drones became available to the general public, new control interfaces were needed to make the technology usable for people without pilot training. Because of their wide presence in today's devices, interfaces such as joy-pads and console controllers are the most common commercially available control interfaces for drones [78] (see Fig. 3e). They feature an easy encoding of the user's inputs and are more intuitive to most people because they are already widely used in other applications, such as the gaming industry. However, many research studies have proven that it is possible to achieve an even more intuitive control interface by exploiting the human's body motions [79,80,81,82] or brain-machine interfaces [83]. Because the average person does not know how to pilot an aircraft, allowing them to simulate flying with their bodies, or just by thinking about flying, promotes the embodiment of the user in the task, making it more intuitive and natural. This paradigm shift represents a key point in the concept of remote control interfaces for aerospace applications and highlights a very important feature of remote interfaces in general: the interface needs to be intuitive for the target users that are supposed to use it. For this reason, when the target shifted from trained pilots to the general public, the interfaces changed from reproducing the cockpit to body- and brain-guided control, avoiding the user's estrangement from the target task of flying.
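The separation between input device and vehicle command can be sketched as follows. Both front-ends and their mapping constants are hypothetical: the point is that intuitiveness lives entirely in the input side, while the velocity command reaching the drone is identical.

```python
# Sketch: two hypothetical control front-ends producing the same drone
# velocity command. The mapping gains are illustrative assumptions.

def gamepad_to_velocity(stick_x: float, stick_y: float,
                        max_speed: float = 2.0) -> tuple[float, float]:
    """Console-controller mapping: stick deflection in [-1, 1] to m/s."""
    return (max_speed * stick_x, max_speed * stick_y)

def body_tilt_to_velocity(roll_deg: float, pitch_deg: float,
                          max_tilt: float = 30.0,
                          max_speed: float = 2.0) -> tuple[float, float]:
    """Body-motion mapping: torso tilt angle (deg) to the same m/s command."""
    return (max_speed * roll_deg / max_tilt,
            max_speed * pitch_deg / max_tilt)

# A half-deflected stick and a 15-degree lean command the same motion:
print(gamepad_to_velocity(0.5, 0.0))     # → (1.0, 0.0)
print(body_tilt_to_velocity(15.0, 0.0))  # → (1.0, 0.0)
```

Choosing between such mappings is therefore a question of matching the target user's existing skills, not of vehicle capability, which is exactly the design principle this section argues for.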

3. Conclusion
Overall, all the analyzed examples of successful teleoperation have the same common ground: they are highly tuned to be natural and intuitive for the target user. Task embodiment is obtained through high telepresence. In order to achieve telepresence, the user needs to feel as if she/he is performing the task in person, rather than remotely. In other words, a good interface allows the user to move from the in-person task to the remote one without requiring her/him to undergo any learning process. However, this highly depends on the nature of the users for whom the teleoperated system is intended. Highly specialized professionals such as doctors or pilots would prefer a complicated setting as similar as possible to the one they have been trained to work with, whereas the general public would prefer a simpler interface. The same simpler setup would feel unnatural and limiting to trained professionals, who have already mastered the ability to control a more powerful and more complex tool. Therefore, we must focus on the needs of general practitioners when trying to define the guidelines for remote palpation and remote physical examination. Unlike surgery, during palpation the patient is awake and her/his feedback is a fundamental part of the in-person task. Literature studies have indeed proven that feedback signals such as facial expressions [84] and the patient's voice [85] strongly improve users' performance during palpation, further confirming the importance of reproducing the same experience that the target user would have during the in-person task. Therefore, it is mandatory to further study physical examination and palpation and to identify and isolate the features important to general practitioners. Those features will then be needed when designing the teleoperated system in order to obtain an intuitive and natural interface with enhanced telepresence. Focusing on the target user will allow us to better understand the important features dictating the effectiveness of the system and to define the correct guidelines for the evaluation of remote palpation teleoperated architectures.

Figure 2. Example of a commercial Da Vinci system within hospital settings. Note how the leader's handles reproduce the handles of a laparoscopic instrument, so as to enhance the overall telepresence. Source: Wikimedia Commons.

Figure 3. Examples of in-person and remote tasks from NPPs and avionics. On the left, (a) a standard glovebox to handle radioactive waste, and (c) a proposed robotic interface to remotely achieve the same action. On the right, (b) the cockpit of an airplane, (d) a professional control station for UAVs, and (e) a joystick to control commercially available drones. Source: Wikimedia Commons.