Apr, 2018: Zukunftstag 2018. Two pupils from the Gymnasium Horn visited the University of Bremen and created their own website with WordPress.

Apr, 2018: We were invited by KUKA to showcase our demo at the European Robotics Forum 2018 in Tampere, Finland.

Feb, 2018: The House of Science in Bremen is hosting an exhibition about local scientists and science projects with collaborators around the world. One of the featured exhibits is a demo of our Autonomous Surgical Lamps, developed by Jörn Teuber of the Computer Graphics and Virtual Reality group. The exhibition will be open until the 21st of April (photos).

Feb, 2018: The University of Bremen participates in the opening of a research laboratory in Bangkok.

Nov, 2017: 2017 VRST Best Poster Award Winner. Michael Bonfert, Melina Cahnbley, Inga Lehne, Ralf Morawe, Gabriel Zachmann, and Johannes Schöning won the award for their poster titled "Augmented Invaders: A Mixed Reality Multiplayer Outdoor Game."

Nov, 2017: Organizers of the French VR conference and trade show Laval Virtual immersed themselves in a variety of different virtual environments and learned about current projects of the Computer Graphics & Virtual Reality lab at the University of Bremen (full report, in German).

Sep, 2017: Founding of Everyday Activity Science and Engineering (EASE). EASE is an interdisciplinary research center at the University of Bremen that investigates the science and engineering of everyday activities. For more information, click here.

Jun 17, 2017: Haptic and hand tracking demos at the Open Campus 2017.

Feb-Apr 2017: David Vilela (Mechanical Engineering Laboratory, University of Coruna, Spain) visited our lab. He is working on benchmarks to compare different methods for computing intersections in collisions, as well as different force models.

Feb 2017: G. Zachmann and J. Teuber visited Mahidol University in Bangkok, Thailand as part of a delegation from the University of Bremen. The goal of the visit was to foster cooperation between the two universities and lay the groundwork for future collaborations.

Jun 2016: Radio Bremen visited our lab to film the work of the Creative Unit "Intra-Operative Information" for a news magazine on the local TV station. Click here for the film at Radio Bremen, or here for the same film on our website.

May 16, 2016: Patrick Lange was honored with the SIGSIM Best PhD Award at the ACM SIGSIM PADS Conference 2016.

Jun 19-21, 2015: G. Zachmann gives invited talk at the DAAD-Stipendiatentreffen in Bremen, Germany.

Jun 2015: Haptic and hand tracking demos at the Open Campus 2015.

Dec 08-10, 2014: ICAT-EGVE 2014 and EuroVR 2014 conferences at the University of Bremen organized by G. Zachmann.

Sep 25-26, 2014: GI VR/AR 2014 conference at the University of Bremen organized by G. Zachmann.

Sep 24-25, 2014: VRIPHYS 2014 conference at the University of Bremen organized by G. Zachmann.

Feb 4, 2014: G. Zachmann gives invited talk on Interaction Metaphors for Collaborative 3D Environments at Learntec.

Jan 2014: G. Zachmann was invited to be a member of the review panel in the Human Brain Project for the competitive call for additional project partners.

Nov 2013: Invited talk at the "Cheffrühstück 2013".

Oct 2013: PhD thesis of Rene Weller published in the Springer Series on Touch and Haptic Systems.

Jun 2013: G. Zachmann participated in the Dagstuhl Seminar Virtual Realities (13241).

Jun 2013: Haptic and hand tracking demos at the Open Campus 2013.

Jun 2013: Invited talk at Symposium für Virtualität und Interaktion 2013 in Heidelberg by Rene Weller.

Apr 2013: Rene Weller was honored with the EuroHaptics Ph.D. Award at the IEEE World Haptics Conference 2013.

Jan 2013: Talk at the graduation ceremony of the University of Bremen by Rene Weller.

Oct 2012: Invited talk by G. Zachmann at the DLR VROOS Workshop "Servicing im Weltraum – Interaktive VR-Technologien zum On-Orbit Servicing" (Servicing in Space: Interactive VR Technologies for On-Orbit Servicing) in Oberpfaffenhofen near Munich, Germany.

Oct 2012: Daniel Mohr earned his doctorate in the field of vision-based pose estimation.

Sept 2012: G. Zachmann gave the keynote talk at ICEC 2012, the 11th International Conference on Entertainment Computing.

Sep 2012: "Best Paper Award" at GI VR/AR Workshop in Düsseldorf.

Sep 2012: Rene Weller earned his doctorate in the field of collision detection.

Aug 2012: GI-VRAR-Calendar 2013 is available!



Haptesha - A Collaborative Multi-User Haptic Workspace

Haptic feedback is an essential and emerging technology for many applications, ranging from virtual assembly simulation to mobile computing. It can improve human-computer interaction as well as, in multi-user scenarios, human-human interaction in many fields such as industrial applications, entertainment, education, medicine, and the arts.

We present a haptic workspace that allows high-fidelity, two-handed, multi-user interactions in scenarios containing a large number of dynamically simulated rigid objects and a polygon count that is limited only by the capabilities of the graphics card.

The main challenge in haptic rendering is the extremely high update rate it requires: while the temporal resolution of the human eye is limited to approximately 30 Hz, the bandwidth of the human tactile system is about 1000 Hz. In most haptic scenarios, collision detection remains the computational bottleneck, whereas the force computation can be done relatively fast.
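To illustrate this rate gap, here is a minimal sketch (hypothetical code, not the actual Haptesha implementation) of the decoupled update scheme such a workspace implies: the haptic loop must tick at about 1000 Hz while the graphics loop renders at roughly 30 Hz, so the collision detection has only about one millisecond per haptic frame.

```python
# Hypothetical sketch: counting update steps for decoupled haptic and
# graphics loops over a fixed wall-clock duration. The rates come from the
# text above; everything else is illustrative.
HAPTIC_HZ = 1000    # bandwidth of the human tactile system (~1 kHz)
GRAPHICS_HZ = 30    # approximate temporal resolution of the human eye

def simulate(duration_s=1.0):
    """Return (haptic_steps, graphics_steps) for `duration_s` seconds,
    showing the ~33x rate gap the collision detection must keep up with."""
    haptic_steps = int(duration_s * HAPTIC_HZ)      # ~1 ms time budget each
    graphics_steps = int(duration_s * GRAPHICS_HZ)  # ~33 ms time budget each
    return haptic_steps, graphics_steps

haptic, graphics = simulate()
print(haptic, graphics)  # 1000 haptic updates vs. 30 rendered frames
```

In practice this means the haptic loop runs in its own thread and must never block on the (much slower) rendering or physics update.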

Thus, the heart of our haptic workspace is our new geometric data structure, called Inner Sphere Trees (ISTs), that not only allows us to detect collisions between pairs of massive objects at haptic rates but also enables us to define a novel type of contact information that guarantees stable forces and torques.
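As a rough illustration of the idea (a simplified sketch under our own assumptions, not the actual IST algorithm, which is described on the project website): the leaves of such a tree are spheres filling the object's interior, and a recursive traversal of two trees can accumulate the overlap volume of intersecting leaf spheres as a penetration measure.

```python
import math

# Simplified, hypothetical sketch of a sphere-tree penetration-volume query.
# Class and function names are our own; the real IST traversal differs.
class SphereNode:
    def __init__(self, center, radius, children=None):
        self.center = center            # (x, y, z)
        self.radius = radius
        self.children = children or []  # empty => leaf (an inner sphere)

def spheres_overlap(a, b):
    d2 = sum((ca - cb) ** 2 for ca, cb in zip(a.center, b.center))
    r = a.radius + b.radius
    return d2 < r * r

def penetration_volume(a, b):
    """Recursively traverse two sphere trees; for overlapping leaf pairs,
    accumulate the lens-shaped sphere-sphere intersection volume as a
    (simplified) penetration measure."""
    if not spheres_overlap(a, b):
        return 0.0
    if not a.children and not b.children:
        d = math.dist(a.center, b.center)
        ra, rb = a.radius, b.radius
        if d <= abs(ra - rb):  # one sphere entirely inside the other
            r = min(ra, rb)
            return 4.0 / 3.0 * math.pi * r ** 3
        # closed-form volume of the intersection (lens) of two spheres
        return (math.pi * (ra + rb - d) ** 2 *
                (d * d + 2 * d * (ra + rb) - 3 * (ra - rb) ** 2)) / (12 * d)
    # descend into whichever node has children
    if a.children:
        return sum(penetration_volume(ca, b) for ca in a.children)
    return sum(penetration_volume(a, cb) for cb in b.children)
```

Because the overlap volume varies continuously with object motion, a penalty force derived from it (and its gradient) behaves more stably than forces based on a single deepest penetration point.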

For further information about ISTs, please visit our project website.

Picture of our Haptic Workspace

Our haptic workspace set up for two users and four Novint Falcons.

Haptesha - The Game

Based on our haptic workspace, we have developed a haptic multiplayer game that requires complex bi-manual interactions from the players, simultaneously in the same environment. In addition to a user survey, we included several mechanisms to track the quality of the users' actions in the game for a quantitative and qualitative analysis.

The results show that 6-DOF force-feedback devices outperform 3-DOF devices significantly, both in user perception and in user performance.


The two-player setup with four haptic devices for our user study (left). The playing field of our haptic game (right).


Videos on Youtube

Haptesha: A Collaborative Multi-User Haptic Workspace

The scene contains more than 20 objects with a total polygon count of 3 million, running on a simple consumer PC with an Intel Core 2 Duo E6700. The haptic devices allow only three DOF, but the architecture of our workspace supports full six-DOF force rendering.

3-DOF vs. 6-DOF - Playful Evaluation of Complex Haptic Interactions

Based on our workspace, we present a novel multiplayer game that supports qualitative as well as quantitative evaluation of different haptic devices in demanding haptic interaction tasks.

This video was accepted for the video session of the IEEE VR 2011 conference in Singapore.




This work was partially supported by DFG grant ZA292/1-1 and by the research project Avilus, funded by the Federal Ministry of Education and Research (BMBF) under grant 01 IM 08 001 U.