People Involved

Ryan Lilien (Professor, DCS & CCBR)
Christian Muise (Graduate Student)
Ronald Fung (Undergraduate Student)
Laptin Doan (Undergraduate Student)

Additional Thanks To

Izhar Wallach
Marcus Brubaker
Neil Fraser
Gail Carmichael
Clifton Forlines


Comments, Questions, Suggestions, and other Feedback can be sent to:
molviz ((at)) googlegroups (dot) com.

Download Available!

A version of our PyMol extension and head tracker (both Wiimote-based and webcam (iSight)-based) is now available here. Note that this is an open-source project and is still under development. We welcome any feedback, as well as your help in improving the code. Check out our PyMolWiki page.

To get started, install a head-tracking driver (most likely the Face Tracker) and the Immersive-Viz application. Both are required.

Join Us!

MolViz is an open-source community project. We are participating in the Google Summer of Code. More information here.

Sponsor Our Lab!

Are you a member of industry or a non-profit organization interested in sponsoring ongoing research in our lab at UofT? Check out our current projects and contact Ryan Lilien.



Head Tracking with and WITHOUT the Wiimote

We adapted an open-source molecular visualization package (PyMol) to integrate two forms of head tracking: Wiimote IR-based and webcam-based. By rotating the molecule in the direction opposite to the motion of the user's head, we provide a 3D experience; to the user, it appears as if they are 'peeking' around the side of the object. In the demonstration video (full video below), we first illustrate active tracking using the Wiimote and IR-emitting eyewear, and then show passive tracking using only Apple's built-in webcam. The webcam-enabled version uses generic head detection and tracking, and we smooth the tracking results to provide a more fluid user experience. The video does not do the effect justice, so we encourage you to try it for yourself! Our software requires no additional hardware (beyond a Mac with a built-in iSight webcam) and will be freely available at the end of March.
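The smoothing step mentioned above can be illustrated with a minimal sketch. An exponential moving average is one common way to steady a jittery head-position signal; the function name, sample data, and smoothing constant here are illustrative, not taken from the MolViz source.

```python
def smooth(samples, alpha=0.3):
    """Exponential moving average over (x, y) head positions.

    alpha controls responsiveness: higher values follow the raw
    detections more closely, lower values give a smoother track.
    """
    sx, sy = samples[0]
    out = [(sx, sy)]
    for x, y in samples[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

# A jittery sequence of detected head positions settles toward
# the underlying position instead of jumping frame to frame.
track = [(100, 50), (104, 48), (98, 52), (101, 50)]
print(smooth(track))  # second point is (101.2, 49.4)
```

The smoothed position, rather than the raw per-frame detection, would then drive the camera rotation, trading a small amount of lag for a much more fluid effect.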

Best Demonstration of Effect:

Full Video:

Molecular Visualization

A key component of structural biology (the study of protein structure and function) is the ability to visualize proteins. In the 1960s, Cyrus Levinthal and his group at MIT built one of the first computer systems for the visualization and manipulation of protein structures. Since that time, well over a hundred molecular visualization applications have been written. These applications converged on the 'modern desktop visualization tool', characterized by high-resolution graphics and mouse-based navigation. An excellent example of such an application is PyMol.

In parallel with the development of these desktop tools, a number of research groups designed high-end, virtual-reality-style immersive environments that allowed the user to visualize and navigate in a true 3D setting. These systems were generally several orders of magnitude more expensive than desktop computers and often required specialized hardware, facilities, and support.

The Nintendo Wiimote

The Nintendo Wiimote is an inexpensive, high-degree-of-freedom wireless (Bluetooth) input device. This suggests that the Wiimote might find use in a wide range of visualization applications, including the manipulation of complex scientific data. We developed a series of inexpensive Wiimote-based interfaces for molecular visualization. These applications bring features previously found only in immersive environments to the low-cost desktop. We found Wiimote-based molecular manipulation particularly useful for presentations: when delivering a seminar, the speaker can interact directly with the projected image rather than alternating between the presentation computer and the screen. In a classroom setting, an instructor can pass the Wiimote around the room, encouraging participation and exciting the students through the "wow" factor.

Head Tracking

The notion of manipulating a three-dimensional image in response to the tracked position of a user's head has been an integral part of virtual-reality environments for over fifteen years. By rotating the visualization in the direction exactly opposite to the user's head motion, a three-dimensional effect is achieved: in essence, the user's visual system is tricked into believing that he or she is looking 'around' the object. Most recently, Johnny Lee, a graduate student at CMU, showed that this effect can be achieved using Wiimote-based infrared head tracking. In this system, the user wears a pair of infrared-emitting glasses; the user's head is then tracked by locating the IR emitters with the Wiimote's infrared camera. This simple solution works surprisingly well. We adapted this technique for molecular visualization using the open-source application PyMol.
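The geometry behind this IR-based tracking can be sketched briefly: the pixel separation of the two IR dots gives the head's distance (the closer the head, the farther apart the dots appear), and the dots' midpoint gives its horizontal and vertical offset. This is a simplified model with illustrative constants (approximate camera field of view, emitter spacing), not our actual implementation.

```python
import math

CAMERA_FOV = math.radians(45)   # approx. horizontal FOV of the Wiimote IR camera
SENSOR_W = 1024                 # Wiimote IR camera horizontal resolution (pixels)
SENSOR_H = 768                  # vertical resolution (pixels)
EMITTER_SPACING = 0.15          # metres between the two IR LEDs on the glasses

def head_position(dot1, dot2):
    """Estimate (x, y, z) of the head, in metres, from two IR dot
    positions given in sensor pixels."""
    # Angular size of the emitter pair, from its pixel separation.
    sep_px = math.hypot(dot1[0] - dot2[0], dot1[1] - dot2[1])
    rad_per_px = CAMERA_FOV / SENSOR_W
    angle = sep_px * rad_per_px
    # Distance from the camera: half the known LED spacing over
    # the tangent of half the subtended angle.
    z = (EMITTER_SPACING / 2) / math.tan(angle / 2)
    # Midpoint of the two dots, as an offset from the sensor centre,
    # converted to a lateral offset at that distance.
    mx = (dot1[0] + dot2[0]) / 2 - SENSOR_W / 2
    my = (dot1[1] + dot2[1]) / 2 - SENSOR_H / 2
    x = math.sin(mx * rad_per_px) * z
    y = math.sin(my * rad_per_px) * z
    return x, y, z
```

In a PyMol extension, the lateral offsets would then drive a rotation opposite to the head motion, for instance via `cmd.turn("y", -gain * x_degrees)` for some illustrative gain, producing the 'peeking around' effect described above.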