Apr, 2018: Zukunftstag 2018. Two pupils from the Gymnasium Horn visited the University of Bremen and created their own website with WordPress.

Apr, 2018: We were invited by KUKA to showcase our demo at the European Robotics Forum 2018 in Tampere, Finland.

Feb, 2018: The House of Science in Bremen hosts an exhibition about local scientists and science projects with collaborators around the world. One of the featured exhibits is a demo of our Autonomous Surgical Lamps, developed by Jörn Teuber of the Computer Graphics and Virtual Reality group. The exhibition will be open until the 21st of April (photos).

Feb, 2018: The University of Bremen participates in the opening of a research laboratory in Bangkok.

Nov, 2017: 2017 VRST Best Poster Award Winner. Michael Bonfert, Melina Cahnbley, Inga Lehne, Ralf Morawe, Gabriel Zachmann and Johannes Schöning won the award for their poster "Augmented Invaders: A Mixed Reality Multiplayer Outdoor Game."

Nov, 2017: Organizers of the French VR conference and trade show Laval Virtual immersed themselves into a variety of different virtual environments where they learned about current projects of the Computer Graphics & Virtual Reality lab at the University of Bremen (full report, in German).

Sep, 2017: Founding of Everyday Activity Science and Engineering (EASE). EASE is an interdisciplinary research center at the University of Bremen that investigates the science and engineering of everyday activities. For more information, click here.

Jun 17, 2017: Haptic and hand tracking demos at the Open Campus 2017.

Feb-Apr 2017: David Vilela (Mechanical Engineering Laboratory, University of Coruna, Spain) visited our lab. He worked on benchmarks comparing different intersection calculation methods for collisions, as well as different force models.

Feb 2017: G. Zachmann and J. Teuber visited Mahidol University in Bangkok, Thailand as part of a delegation from the University of Bremen. The goal of the visit was to foster the cooperation between the two universities and lay the groundwork for future collaborations.

Jun 2016: Radio Bremen visited our lab to film the work of the Creative Unit "Intra-Operative Information" for a news magazine on the local TV station. Click here for the film at Radio Bremen, and here for the same film on our website.

May 16, 2016: Patrick Lange was honored with the SIGSIM Best PhD Award at the ACM SIGSIM PADS Conference 2016.

Jun 19-21, 2015: G. Zachmann gives invited talk at the DAAD-Stipendiatentreffen in Bremen, Germany.

Jun 2015: Haptic and hand tracking demos at the Open Campus 2015.

Dec 08-10, 2014: ICAT-EGVE 2014 and EuroVR 2014 conferences at the University of Bremen organized by G. Zachmann.

Sep 25-26, 2014: GI VR/AR 2014 conference at the University of Bremen organized by G. Zachmann.

Sep 24-25, 2014: VRIPHYS 2014 conference at the University of Bremen organized by G. Zachmann.

Feb 4, 2014: G. Zachmann gives invited talk on Interaction Metaphors for Collaborative 3D Environments at Learntec.

Jan 2014: G. Zachmann was invited to be a member of the review panel in the Human Brain Project for the competitive call for additional project partners.

Nov 2013: Invited talk at the "Cheffrühstück 2013".

Oct 2013: PhD thesis of Rene Weller published in the Springer Series on Touch and Haptic Systems.

Jun 2013: G. Zachmann participated in the Dagstuhl Seminar Virtual Realities (13241).

Jun 2013: Haptic and hand tracking demos at the Open Campus 2013.

Jun 2013: Invited talk by Rene Weller at the Symposium für Virtualität und Interaktion 2013 in Heidelberg.

Apr 2013: Rene Weller was honored with the EuroHaptics Ph.D. Award at the IEEE World Haptics Conference 2013.

Jan 2013: Talk at the graduation ceremony of the University of Bremen by Rene Weller.

Oct 2012: Invited talk by G. Zachmann at the DLR VROOS Workshop "Servicing im Weltraum -- Interaktive VR-Technologien zum On-Orbit Servicing" in Oberpfaffenhofen near Munich, Germany.

Oct 2012: Daniel Mohr earned his doctorate in the field of vision-based pose estimation.

Sep 2012: G. Zachmann: Keynote talk at ICEC 2012, the 11th International Conference on Entertainment Computing.

Sep 2012: "Best Paper Award" at GI VR/AR Workshop in Düsseldorf.

Sep 2012: Rene Weller earned his doctorate in the field of collision detection.

Aug 2012: GI-VRAR-Calendar 2013 is available!

Autonomous Surgical Lamps

As part of the Creative Unit "Intra-Operative Information", we are developing algorithms for the autonomous positioning of surgical lamps in open surgery. These algorithms work solely on the input of a single depth camera positioned above the patient during surgery. They identify the operation site (the so-called situs) and all potential occluders, and then move the lamps to avoid occlusions and collisions while keeping lamp movement over time to a minimum.

The basic idea is to take the point cloud delivered by the depth camera and render it from the perspective of the situs towards the working space of the lamps above the operating table. From this rendering we directly obtain which parts of the lamps' workspace are occluded and which are not. To minimize movement over time, we also use information about past occlusions and movements to place the lamps in areas that are likely to remain unoccluded in the future. We arranged the algorithms in a pipeline that takes the depth image of the depth camera as input, analyzes it to find the situs, and finally outputs the current optimal positions for a given set of lamps.
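To make the occlusion idea concrete, here is a minimal toy sketch in Python (our own construction for illustration, not the lab's implementation): directions from the situs to all cloud points are binned into an azimuth/elevation grid over the upper hemisphere, and every bin containing a point is marked as occluded. Grid resolution, function name, and the hemisphere parameterization are assumptions.

```python
import numpy as np

def occlusion_map(points, situs, n_az=36, n_el=9):
    """Toy occlusion map: mark which directions from the situs toward
    the lamp workspace (upper hemisphere) are blocked by cloud points."""
    d = points - situs                       # vectors situs -> points
    r = np.linalg.norm(d, axis=1)
    valid = r > 1e-6                         # drop points at the situs itself
    d, r = d[valid], r[valid]
    az = np.arctan2(d[:, 1], d[:, 0])        # azimuth in [-pi, pi]
    el = np.arcsin(np.clip(d[:, 2] / r, -1.0, 1.0))  # elevation
    occluded = np.zeros((n_az, n_el), dtype=bool)
    upper = el > 0.0                         # only directions above the situs
    ia = ((az[upper] + np.pi) / (2.0 * np.pi) * n_az).astype(int) % n_az
    ie = np.clip(el[upper] / (np.pi / 2.0) * n_el, 0, n_el - 1).astype(int)
    occluded[ia, ie] = True                  # bin each potential occluder
    return occluded
```

Unoccluded bins of such a map would be candidate directions for placing a lamp; the real pipeline obtains the same information by actually rendering the point cloud from the situs' perspective.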


Left: our pipeline with illustrations of the outputs of some stages. Right: a screenshot of our testing and visualization environment with input data from a real surgery.


Videos on Youtube

Download the video: ASuLa_inDepth.mp4

Video on local TV station

Download the video: ButenUnBinnen_CU-IOI.mp4
Source: Radio Bremen, Buten un Binnen

Towards Physically Correct Lighting Simulation

One important part of improving the lighting of the surgical site is being able to estimate the expected illumination of the site for different lamp positions. In computer graphics, ray tracing is used to produce photo-realistic renderings of arbitrary scenes; it can simulate light in a physically correct manner, which is why we chose it for our project. However, our only representation of the scene is a point cloud, which is harder to ray-trace. To be able to quickly evaluate different configurations of lamp positions, we therefore developed fast ray-tracing methods for point clouds on the basis of NVIDIA's OptiX ray tracing engine. A video of this in action can be seen below.
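The core operation behind ray tracing a point cloud can be illustrated with a brute-force sketch that treats each point as a tiny sphere ("splat"), one common way to give a point cloud a surface for intersection tests. This is a conceptual stand-in, not the OptiX-based implementation; the splat radius and function name are assumptions.

```python
import numpy as np

def ray_hits_splats(origin, direction, points, radius=0.01):
    """Intersect one ray with a point cloud whose points are treated as
    small spheres of the given radius.  Returns the distance to the
    nearest hit, or None.  Brute force over all points; a real ray
    tracer would use an acceleration structure such as a BVH."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    oc = np.asarray(points, dtype=float) - origin   # origin -> each point
    t = oc @ d                                      # projection onto the ray
    perp2 = np.einsum('ij,ij->i', oc, oc) - t**2    # squared distance ray-point
    mask = (t > 0) & (perp2 <= radius**2)           # in front of us and close enough
    if not mask.any():
        return None
    # step back from the projected point to the sphere's front surface
    th = t[mask] - np.sqrt(radius**2 - perp2[mask])
    return float(th.min())
```

Shooting many such rays from a candidate lamp position toward the situs gives a rough estimate of how much light actually reaches it.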

Download the video: Raytraced_Pointclouds.mp4

RGB-D Data

The following downloadable data is a recording of a complete open abdominal surgery. It was recorded with a Microsoft Kinect v2 mounted directly above the patient, using a custom recording program. The recordings are stored in HDF5 files, which can be read with the appropriate HDF5 library and the following C++ class: header, source. They are also compressed with gzip to save server space. Each file contains at most 27000 frames, which corresponds to roughly 16-17 minutes of recording.
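Apart from the provided C++ class, the files can be opened with any generic HDF5 binding. The sketch below uses Python with h5py (our assumption; the actual dataset names and layout inside the recordings are not documented here) and simply enumerates all datasets with their shape, dtype, and compression, which is a reasonable first step when exploring an unfamiliar HDF5 file.

```python
import h5py

def inspect_recording(path):
    """List every dataset in an HDF5 file with shape, dtype and
    compression filter (e.g. 'gzip'), keyed by its full path."""
    info = {}
    with h5py.File(path, 'r') as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                info[name] = (obj.shape, str(obj.dtype), obj.compression)
        f.visititems(visit)   # walks all groups/datasets recursively
    return info
```

For a depth recording one would expect a large three-dimensional integer dataset (frames x height x width) among the results.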

This data may only be used for scientific purposes. If you publish research that uses this material, please cite the above paper. Also, please contact us; we are always interested to hear what others are doing with this data.

Since this is real-life data, there are long stretches of the recording where a surgical lamp obstructs most of the Kinect's field of view. As a rule of thumb: the smaller the compressed file size, the bigger and longer the obstruction of the field of view.


This work was partially supported by the grant Creative Unit - Intra-Operative Information.