Jun 17, 2017: Haptic and hand tracking demos at the Open Campus 2017.
Feb-Apr 2017: David Vilela (Mechanical Engineering Laboratory, University of Coruna, Spain) visited our workgroup from February to April.
His main work was to compare different intersection computation methods for collision detection, as well as different force models.
Feb 2017: G. Zachmann and J. Teuber visited the Mahidol University in Bangkok,
Thailand as part of a delegation from the University of Bremen.
The goal of the visit was to foster the cooperation between the two universities and lay the groundwork for future collaborations.
Jun 2016: Radio Bremen visited our lab to film the work of the Creative Unit "Intra-Operative Information" for a news magazine on the local TV station. Click here for the film
at Radio Bremen, and click here for the same film on our website.
May 16, 2016: Patrick Lange was honored with the SIGSIM Best PhD Award at the ACM SIGSIM
PADS Conference 2016.
Jun 19-21, 2015: G. Zachmann gives invited talk at the
DAAD-Stipendiatentreffen in Bremen, Germany.
Jun 2015: Haptic and hand tracking demos at the Open Campus 2015.
Dec 08-10, 2014:
ICAT-EGVE 2014 and EuroVR 2014
conferences at the University of Bremen organized by G. Zachmann.
Sep 25-26, 2014:
GI VR/AR 2014 conference at the University of Bremen organized by G. Zachmann.
Sep 24-25, 2014: VRIPHYS 2014 conference at the University of Bremen organized by G. Zachmann.
Feb 4, 2014: G. Zachmann gives invited talk on Interaction Metaphors for Collaborative 3D Environments
Jan 2014: G. Zachmann was invited to be a member of the Review Panel in the Human Brain Project for the Competitive Call for additional project partners.
Nov 2013: Invited Talk at the "Cheffrühstück 2013"
Oct 2013: Dissertation of Rene Weller published in the Springer Series on Touch and Haptic Systems.
Jun 2013: G. Zachmann participated in the Dagstuhl Seminar Virtual Realities (13241)
Jun 2013: Haptic and hand tracking demos at the Open Campus 2013.
Jun 2013: Invited talk at Symposium für Virtualität und Interaktion 2013 in Heidelberg by Rene Weller.
Apr 2013: Rene Weller was honored with the EuroHaptics Ph.D. Award at the IEEE World Haptics Conference 2013.
Jan 2013: Talk at the graduation ceremony of the University of Bremen by Rene Weller.
Oct 2012: Invited Talk by G. Zachmann at the DLR VROOS Workshop Servicing im Weltraum -- Interaktive VR-Technologien zum On-Orbit Servicing in Oberpfaffenhofen, Munich, Germany.
Oct 2012: Daniel Mohr earned his doctorate in the field of vision-based pose estimation.
Sep 2012: Keynote talk by G. Zachmann at ICEC 2012, the 11th International Conference on Entertainment Computing.
Sep 2012: "Best Paper Award" at GI VR/AR Workshop in Düsseldorf.
Sep 2012: Rene Weller earned his doctorate in the field of collision detection.
Aug 2012: GI-VRAR-Calendar 2013 is available!
The master project Kinaptic is developing a video game for blind and sighted people using the Microsoft Kinect, haptic feedback
devices, and stereo rendering for different 3D devices: a multimodal gaming approach for blind and sighted players.
The motivation of the project is to explore how blind and sighted players can be included in a shared virtual environment.
This will be approached through the development of a game in which the sighted player forms a tunnel with his body by moving in front of
the Kinect, while the blind player flies through this tunnel with an object that provides haptic feedback. To win the game, the blind player has
to reach his sighted opponent, who is trying to form the tunnel in a way that makes it difficult to reach him.
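The core mechanic above can be sketched in a few lines. This is purely our own illustration, not the project's actual code: the function names, the idea of modeling one tunnel segment as a line between two tracked joints, and the simple spring-like force model are all assumptions for the sake of the example.

```python
import math

# Illustrative sketch (not the project's implementation): one ring of the
# tunnel is modeled as the segment between two tracked joints of the sighted
# player, and the blind player's probe receives a spring-like penalty force
# whenever it drifts outside the tunnel radius.

def point_to_segment_distance(p, a, b):
    """Distance from 3D point p to the segment a-b (all tuples of floats)."""
    ab = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    ap = (p[0] - a[0], p[1] - a[1], p[2] - a[2])
    denom = sum(c * c for c in ab)
    # Clamp the projection parameter to stay on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(c * d for c, d in zip(ab, ap)) / denom))
    closest = (a[0] + t * ab[0], a[1] + t * ab[1], a[2] + t * ab[2])
    return math.dist(p, closest)

def haptic_force(p, a, b, radius, k=50.0):
    """Magnitude of a spring-like penalty force (a common haptic force
    model) pushing the probe back inside the tunnel once it leaves it."""
    penetration = point_to_segment_distance(p, a, b) - radius
    return k * penetration if penetration > 0 else 0.0
```

In a real setup, `a` and `b` would come from the Kinect skeleton stream and the force would be sent to the haptic device each frame; here they are plain tuples so the sketch stays self-contained.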
The official launch of the project was on October 6, and currently a team of 10 computer science graduate students
is working on the project goals on a daily basis. Besides creating an awesome video game, our team's goal is to evaluate the
outcomes of the project in a user study and to publish a scientific paper.
Alongside the main project, we developed a side project called Sculpting. It is about designing objects or sculptures in a virtual three-dimensional
environment using virtual sculpting tools. The outcome of the project is a prototype of a sculpting game.
Our approach is based on the zSpace, a 3D virtual reality tablet. The zSpace system comes with a high-definition stereoscopic display which renders
full-resolution images. In addition, it enables the user to experience smooth parallax by tracking the movements of the user's head. To interact with
the virtual environment, the zSpace offers a stylus with 6 degrees of freedom. The stylus is tracked by the system and used to design
3D objects. Lightweight glasses are used for head tracking and quad-buffer stereo. Overall, the zSpace can be seen as a virtual sculpting tool. Like
the main project, this game also aims at the inclusion of blind and sighted players, which makes it necessary to think about non-visual tools
as well. Therefore, we decided to use a haptic device (PHANTOM Omni) in addition.
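The stylus-driven sculpting described above can be illustrated with a minimal voxel sketch. Again, this is our own assumption-laden example, not the project's code: we assume material is stored in a boolean voxel grid and that each tracked stylus sample carves out a sphere of voxels at the tip position.

```python
# Illustrative sketch only (not the project's implementation): a boolean
# voxel grid represents the block of material, and each stylus sample
# removes all voxels within a spherical tool radius around the tip.

def make_block(n):
    """Solid n x n x n block of material (True = material present)."""
    return [[[True] * n for _ in range(n)] for _ in range(n)]

def carve(grid, tip, radius):
    """Remove all voxels within `radius` of the stylus tip position."""
    n = len(grid)
    tx, ty, tz = tip
    r2 = radius * radius
    for x in range(n):
        for y in range(n):
            for z in range(n):
                if (x - tx) ** 2 + (y - ty) ** 2 + (z - tz) ** 2 <= r2:
                    grid[x][y][z] = False

def material_count(grid):
    """Number of voxels still filled with material."""
    return sum(v for plane in grid for row in plane for v in row)
```

In the actual prototype, the tip position and orientation would come from the zSpace stylus tracking (6 degrees of freedom), and the haptic device would render resistance while carving; the grid here stands in for whatever surface or volume representation the prototype uses.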
For more information about the technical side of the project, please follow this link or visit the Sculpting section.