RCS has a Sensable Technologies PHANToM Desktop haptic (force feedback) device that forms part of a ReachIn immersive environment. The PHANToM can also be used as a standalone device, and is driven by a dual-processor machine. A number of projects have made use of the PHANToM in several different areas: integration with existing visualization software, simulation, modelling and, of course, haptic visualization.
Projects and Research
This work came out of the IERAPSI project, when a desire arose to add haptic support to a medical visualization tool that had been written using AVS/Express. Haptic/Express was written, in C++, to provide AVS/Express with a device-independent haptic interface: for a new device to be supported by the system, only the functions required by the Haptic/Express API need to be written. Currently Haptic/Express supports only solid, non-deformable objects, mainly because providing visual feedback of surface deformation would require modification of the AVS/Express OpenGL renderer, which was beyond the scope of IERAPSI. Any triangle-based mesh is supported; images and co-planar orthoslices are replaced with two triangles of the same extents. Objects can be nested hierarchically (GroupObject) and dynamically added to, removed from or changed in the scene. The haptic device can also be used to translate and rotate objects.
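To give a flavour of what a device-independent haptic interface of this kind looks like, here is a minimal C++ sketch. The class and function names are purely illustrative, not the actual Haptic/Express API: the idea is simply that each backend (GHOST, a remote proxy, and so on) implements a small set of virtual functions and the rest of the system stays device-agnostic.

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };

// Hypothetical device-independent interface: one concrete subclass per device.
class HapticDevice {
public:
    virtual ~HapticDevice() {}
    virtual bool initialise() = 0;              // open and calibrate the device
    virtual Vec3 probePosition() const = 0;     // stylus tip in world space
    virtual void setForce(const Vec3& f) = 0;   // force to render this servo tick
    virtual void shutdown() = 0;
};

// A trivial stand-in backend, e.g. for testing without hardware attached.
class NullDevice : public HapticDevice {
    Vec3 pos_{0.0, 0.0, 0.0};
public:
    bool initialise() override { return true; }
    Vec3 probePosition() const override { return pos_; }
    void setForce(const Vec3&) override {}      // forces are silently discarded
    void shutdown() override {}
};
```

Supporting a new device then amounts to writing one such subclass, exactly as the framework requires only the functions of its API to be implemented.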
Two devices have had Haptic/Express interfaces written for them:
- A PHANToM Desktop device using the GHOST API. The PHANToM interface also enables AVS/Express to be used with the ReachIn immersive environment.
- A CORBA server object that provides the PHANToM Desktop device as a service to remote machines. This makes it possible to perform the visualization on one machine and interact using a PHANToM device located on another, which is useful on single-processor machines where there might not be enough CPU cycles to drive both the visualization and the haptic device. Another example, which we have demonstrated, is in the VIPL, which is driven by an SGI Onyx 3000 in our machine room. The PHANToM device is located in the VIPL, attached to a Windows(TM) PC; using Express/MPE, the PHANToM could be used to interact with visualizations running on the Onyx.
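The remote set-up boils down to exchanging a small amount of state each servo tick: device pose out, force command back. The sketch below shows the kind of message a client and server might exchange; the field names and the flat double[] wire format are assumptions for illustration, not the actual CORBA IDL used.

```cpp
#include <array>
#include <cassert>

// State sent from the haptic server to the visualization client each tick.
struct HapticState {
    double px, py, pz;     // stylus position
    bool   buttonDown;     // stylus switch state
};

// Command sent back from the client to the server.
struct ForceCommand {
    double fx, fy, fz;     // force to render at the stylus
};

// Flatten for transport; a real system would rely on CORBA marshalling.
inline std::array<double, 4> pack(const HapticState& s) {
    return { s.px, s.py, s.pz, s.buttonDown ? 1.0 : 0.0 };
}
inline HapticState unpack(const std::array<double, 4>& w) {
    return { w[0], w[1], w[2], w[3] != 0.0 };
}
```

Keeping this exchange small matters because the haptic servo loop runs at around 1 kHz, far faster than the visual frame rate.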
In 2004 some work was done to assess the possibility of porting or rewriting this framework for the Amira visualization package, with a view to a future project.
Touching Ancient Egypt
As part of National Science Week 2005, MVC was asked by Manchester Museum if we could provide a haptic demonstration to allow the public to "touch" some artefacts that were usually kept under lock and key in display cases.
The work to achieve this involved using a laser scanner, normally used in dentistry to record dentures, to acquire the objects in 3D. An ancient Egyptian canopic jar lid, from a vessel that would originally have contained mummified internal organs, was chosen as it scanned the most successfully in the short time we had access to the equipment. The data was converted to VRML, and parameters and settings were then adjusted using the ReachIn API to bring the ancient limestone object to life.
[Figure: Textured VRML model of the canopic jar lid]
The demonstration ran for two days, 18-19 March, and was a great success with young and old alike. The whole project was an interesting learning process for all involved, and there is a desire to take this further, making more delicate and unusual artefacts available in this way for study and to the public.
The majority of this work was carried out by our PhD Student Yuan Wang.
Student Projects (MVC/Computer Science)
A Haptic Sailing Simulator, 2007 - (MVC/MSc), Eleni Georgiou, Completed
The aim of this project is the creation of a real-time, educational haptic sailing simulator that will be able to demonstrate how wind and current affect the course of a sailing boat as it travels through water, allow the user to interact with the sail and observe how the boat's speed and direction are affected when the orientation of the sail changes, and provide suitable feedback to help the user gain a better understanding of sailing. To create the sailing simulator, a Sensable Technologies PHANToM Desktop haptic device was used along with the H3D haptic scene-graph API.
A Novel Haptic Rendering Mode for Scientific Visualization, 2003 - 2007 (MVC/PhD), Yuan Wang, Completed
As technology for modelling complex physical processes has advanced, simulated data has become multidimensional, which causes serious visual clutter when displayed using common visualization techniques. Recent developments in combined visual/haptic visualization modes aim to relieve this problem. Within thermal sensing, force is mapped to a haptic vibration sensation, and I am interested in the use of vibration within scientific visualization. I seek to address how to build a psychophysical vibration model based on the PHANToM haptic device, and how to build an effective mapping function relating data to vibration.
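The mapping function in question might, at its simplest, take a normalised data sample and choose a vibration frequency and amplitude for the device to render. The sketch below illustrates that shape of mapping; the linear ranges and constants are invented for illustration and are not the thesis's actual psychophysical model.

```cpp
#include <cassert>
#include <cmath>

const double kPi = 3.14159265358979323846;

struct Vibration { double freqHz, amplitudeN; };

// Map a data value in [0,1] onto illustrative frequency/amplitude ranges
// that a PHANToM-class device could plausibly render.
inline Vibration mapToVibration(double v) {
    double t = v < 0.0 ? 0.0 : (v > 1.0 ? 1.0 : v);  // clamp to [0,1]
    return { 50.0 + t * 250.0,                       // 50-300 Hz
             0.1  + t * 0.4 };                       // 0.1-0.5 N
}

// Instantaneous force magnitude at time t (seconds): a simple sinusoid.
inline double vibrationForce(const Vibration& vib, double t) {
    return vib.amplitudeN * std::sin(2.0 * kPi * vib.freqHz * t);
}
```

A psychophysical model would replace the linear ramps with curves fitted to what users can actually discriminate, which is precisely the research question described above.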
The Effects of Network Delay on Human Performance in Collaborative Virtual Environments, 2003 - (CS/PhD), Caroline Jay, Running
At the moment the PHANToM haptic device is being used to conduct a series of experiments that examine how task performance is affected if the representation of a person's actions in the virtual world is delayed.
Haptic Visualization of Scalar and Vector Fields, Summer 2003 (MVC/MSc), Yuan Wang, Complete
Haptic devices provide force-feedback tools for investigating 3D visualizations. Proprietary haptic systems come with their own APIs, e.g. GHOST and ReachIn. These are in general based around polygonal scenes, i.e. surfaces. This is fine for data that can be intuitively rendered as a surface, such as a medical CT volume used to visualize bone. However, many types of data are better thought of as continuous fields. Scalar data, such as density and viscosity, is generally rendered using volume techniques, e.g. ray casting or back-to-front textures. Vector data, such as velocity or flow, is visualized using advection methods, e.g. streamlines. Neither of these methods produces surfaces, yet both are suitable for visualization through haptics. There is an opportunity to use the full potential of the haptic device, since surfaces are 2D entities in 3D space whereas the fields themselves are 3D.
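One common way to render a continuous scalar field haptically, without any intermediate surface, is to push the stylus down the field's gradient: F = -k * grad(s), with the gradient estimated by central differences at the probe position. The sketch below is a minimal illustration of that idea; the gain k and step h are illustrative tuning parameters, not values from the project.

```cpp
#include <cassert>
#include <cmath>
#include <functional>

struct Vec3 { double x, y, z; };

// Haptic force for a scalar field s at probe position p:
// F = -k * grad(s), gradient estimated by central differences.
inline Vec3 gradientForce(const std::function<double(double, double, double)>& s,
                          Vec3 p, double k = 1.0, double h = 1e-4) {
    double gx = (s(p.x + h, p.y, p.z) - s(p.x - h, p.y, p.z)) / (2.0 * h);
    double gy = (s(p.x, p.y + h, p.z) - s(p.x, p.y - h, p.z)) / (2.0 * h);
    double gz = (s(p.x, p.y, p.z + h) - s(p.x, p.y, p.z - h)) / (2.0 * h);
    return { -k * gx, -k * gy, -k * gz };  // force opposes increasing density
}
```

The effect is that denser regions of the field feel like resistance, so the user can explore the volume directly rather than a surface extracted from it; vector fields can be treated even more simply, by scaling the sampled vector into a force.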
Haptic Feedback for Training a Liver Biopsy Procedure, Spring 2003 (MVC/BSc), Thomas Hughes, Complete
The Manchester Visualization Centre has been researching the application of virtual environments to medical training. This project involved the implementation of a liver biopsy medical procedure within the ReachIn environment. Key to the project was researching and developing realistic force-feedback effects for the insertion of medical instruments through the human body. Invaluable feedback was obtained from clinicians at the Manchester Royal Infirmary in order to achieve this.
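A needle-insertion effect of this kind is often modelled as a stack of tissue layers, each with its own thickness and stiffness, so that the resisting force ramps up within a layer and drops when the needle punctures through into the next. The sketch below shows that shape of model; the layer values and the linear ramp are invented for illustration and are not the forces developed in the project.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Layer { double thicknessMm, stiffness; };  // stiffness in N/mm

// Resisting force at a given insertion depth (mm) through ordered layers.
// Within each layer the force ramps linearly from zero; crossing a layer
// boundary resets the ramp, giving the characteristic puncture "pop".
inline double insertionForce(const std::vector<Layer>& layers, double depthMm) {
    double start = 0.0;
    for (const Layer& L : layers) {
        if (depthMm < start + L.thicknessMm)
            return (depthMm - start) * L.stiffness;
        start += L.thicknessMm;
    }
    return 0.0;  // past all modelled layers: free movement
}
```

Clinician feedback of the kind described above is what turns such a toy profile into something that feels right: the layer parameters are tuned until the simulated puncture matches the real sensation.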
Touch and See, October 2002 - May 2003 (CS/BSc), Ben Lyons, Complete
The project explored how best to integrate haptic feedback into the Gnu/MAVERIK VR system developed within the Advanced Interfaces Group. MAVERIK has the ability to manage exceptionally large models, such as complete offshore oil and gas platforms, and our goal was to see how these could be managed using the GHOST libraries. The solution adopted was to use a caching scheme to load parts of massive models into GHOST upon demand. We discovered experimentally that GHOST is only able to handle around 500 primitive shapes, whilst our models contain hundreds of thousands. By using MAVERIK's fast spatial searching, those parts of a massive model lying within "arm's reach" of the user are dynamically loaded into the PHANToM's workspace. The resulting system works in real time and requires no additional programming by the user, other than a couple of initialisation calls. Existing MAVERIK programs can be changed straightforwardly to add a sense of touch to the user's interaction with models.
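The core of the caching scheme can be sketched very simply: each time the probe moves, select the primitives within reach and make only those resident in the haptic API. The code below is a minimal illustration of that selection step, assuming a linear scan; the real system uses MAVERIK's fast spatial search instead, and the names and radius are invented.

```cpp
#include <cassert>
#include <vector>

struct Primitive { double x, y, z; int id; };

// Return ids of primitives within `reach` of the probe position; these
// are the ones to keep loaded in GHOST, staying under its ~500 shape limit.
inline std::vector<int> withinReach(const std::vector<Primitive>& scene,
                                    double px, double py, double pz,
                                    double reach) {
    std::vector<int> resident;
    for (const Primitive& p : scene) {
        double dx = p.x - px, dy = p.y - py, dz = p.z - pz;
        if (dx * dx + dy * dy + dz * dz <= reach * reach)
            resident.push_back(p.id);
    }
    return resident;
}
```

Diffing this resident set against the previous frame's gives the primitives to add to and remove from the haptic workspace as the user moves through the model.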
Modelling NURBS Surfaces Using A Stereoscopic Display With Haptic Feedback, Summer 2001 (MVC/MSc), Alex Daltas. Complete
This project developed a system for editing NURBS surfaces that uses a combination of software techniques and virtual reality hardware to provide the user with a natural, intuitive interface. The focus of the project is to remove some of the layers that, in traditional NURBS modelling systems, stand between the user and the model and make the process indirect. To this end a hardware system is used that provides co-located three-dimensional stereoscopic graphics and haptic rendering. A technique for the direct manipulation of the surface is provided to replace the indirect process of control-point manipulation, and support for closed surfaces is included to give the surface the feel of a solid object.
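One standard way to achieve direct surface manipulation of this kind is to invert the usual control-point relationship: when the user drags the surface point at parameters (u,v), the displacement is distributed to the control points in proportion to their basis-function weights there, so the touched point follows the stylus exactly. The sketch below illustrates the least-norm version of that weighting; it is a generic technique shown for illustration, not necessarily the method used in the project.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Pt { double x, y, z; };

// Distribute a drag `d` of the surface point p(u,v) = sum_i w_i * P_i
// over the control points. Moving each P_i by (w_i / sum(w^2)) * d is the
// least-norm solution, so the surface point moves by exactly d.
inline void dragSurface(std::vector<Pt>& ctrl,
                        const std::vector<double>& w, Pt d) {
    double s = 0.0;
    for (double wi : w) s += wi * wi;        // sum of squared basis weights
    for (size_t i = 0; i < ctrl.size(); ++i) {
        double a = w[i] / s;
        ctrl[i].x += a * d.x;
        ctrl[i].y += a * d.y;
        ctrl[i].z += a * d.z;
    }
}
```

Here w would hold the NURBS basis values N_i(u) * N_j(v) at the grabbed point; nearby control points (large weights) move the most, giving the smooth, local edit the user expects.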