IEEE VR 2007 Workshop on
"Trends and Issues in Tracking for Virtual Environments"
Time and Location
Date: 11 March 2007 (half day; 1:30-5:30 PM)
Location: Hilton Charlotte Center City, Charlotte, North Carolina, USA
Web: http://conferences.computer.org/vr/2007/
Program
Time | Authors | Title |
---|---|---|
1:30 | Eigil Samset, Simon P. DiMaio | Hybrid Tracking: A new trend in Image-Guided Therapy |
1:45 | Greg Welch, Michael Noland, Gary Bishop | Complementary Tracking and Two-Handed Interaction for Remote 3D Medical Consultation with a PDA |
1:55 | Shih-Ching Yeh, Chiao Wang, Albert Rizzo, Alexander A. Sawchuk | A Home-Based Stereo Webcam-LED Tracking System with Multiple Degrees of Freedom Supporting for Tele-Rehabilitation [Video] |
2:10 | Thomas Pintaric, Hannes Kaufmann | Affordable Infrared-Optical Pose-Tracking for Virtual and Augmented Reality |
2:25 | Florin Duca, Jonas Fredriksson, Morten Fjeld | Real-Time 3D Hand Interaction: Single Webcam Low-Cost Approach |
2:45 | Christian Jansen, Frank Steinicke, Klaus Hinrichs, Jan Vahrenhold, Bernd Schwald | Performance Improvement for Optical Tracking by Adapting Marker Arrangements |
3:00 | | Coffee Break |
3:30 | Greg Welch, B. Danette Allen, Adrian Ilie, Gary Bishop | Measurement Sample Time Optimization for Human Motion Tracking/Capture Systems |
3:45 | Steven Maesen, Philippe Bekaert | Low-Cost, Wide-Area Tracking for Virtual Environments [Video] |
4:00 | Joseph Newman, Alexander Bornik, Daniel Pustka, Florian Echtler, Manuel Huber, Dieter Schmalstieg, Gudrun Klinker | Tracking for Distributed Mixed Reality Environments |
4:15 | Javier I. Girado, Tom Peterka, Robert L. Kooima, Jinghua Ge, Daniel J. Sandin, Andrew Johnson, Jason Leigh, Thomas A. DeFanti | Real Time Neural Network-based Face Tracker for VR Displays [Video] |
4:30 | J. Sánchez, Diego Borro | Non Invasive 3D Tracking for Augmented Video Applications |
4:45 | Schmalstieg, Welch, Samset | Panel (as time permits) |
The Proceedings
The proceedings were published by Shaker Verlag, Aachen (ISBN 978-3-8322-5967-9). Alternatively, you can send me an email at zach in.tu-clausthal.de; I might still have some copies left, for only 20€.
The BibTeX entry is
@PROCEEDINGS{Zach07b,
  editor       = "Gabriel Zachmann",
  title        = "Proc. IEEE VR 2007 Workshop on 'Trends and Issues in Tracking for Virtual Environments'",
  month        = mar # "11",
  year         = 2007,
  address      = "Charlotte, NC, USA",
  organization = "IEEE",
  publisher    = "Shaker Verlag, Aachen, Germany",
  isbn         = "978-3-8322-5967-9"
}
Important Dates
- Deadline for submissions: 15 January 2007
- Notification of acceptance: 21 January 2007
- Camera-ready due: 26 January 2007
- Workshop: 11 March 2007 (half day; 1:30-5:30 PM)
Aims and Scope of the Workshop
This workshop will be part of the IEEE VR 2007 conference, co-located with the IEEE Symposium on 3D User Interfaces (3DUI). The goal of this half-day workshop is to bring together researchers and industry practitioners working in the area of tracking and to talk about making tracking actually work. To that end, the workshop aims to provide a broad picture of the current state of the art, the various technologies available, and the open issues for further research and development.
Within this context, "tracking" is meant to include all kinds of techniques, methods, devices, and algorithms that are used to make the computer "sense" the location or configuration of users, parts of users, or objects. Applications are in the areas of VR, AR, animation, etc.
The workshop will consist of two parts: the first and main one will comprise presentations, and the second one will be used for panel and plenary discussions.
The list of presentation topics includes:
- tracking utilizing electro-magnetic, inertial, or optical technologies
- novel / unconventional technologies for tracking
- hybrid tracking
- wide-area / large workspace tracking (e.g., GPS, Galileo, or others)
- high-speed or high-precision tracking
- accuracy, repeatability, registration issues
- correction and other post-processing of tracking data
- camera-based tracking (with markers or markerless)
- user interface issues that are directly related to tracking
- ...
Workshop Format & Intended Audience
The workshop addresses a wide range of topics; therefore, we welcome participants from a wide range of disciplines and industries, e.g., HCI, VR/AR hardware/software/applications, usability, entertainment, algorithms, etc.
The workshop will be divided into two parts. In the first and main part, there will be a number of presentations pertaining to the relevant topics. In the second, somewhat shorter part, we would like to discuss open issues, current trends, long-term visions, further avenues of research, etc. This discussion will probably be a combined panel and plenary discussion.
Submissions
Participants are invited to submit novel research results, state-of-the-art reports, work in progress, experience reports, publicly presentable ideas for unimplemented and/or unusual systems, or, possibly, substantiated visions for future work. Workshop paper submissions will be reviewed by the workshop committee.
In addition, they will provide the basis for the plenary discussion.
Papers should not exceed 6 pages (short papers, i.e., <= 4 pages, are welcome, too). They should be formatted according to the VGTC publication format and submitted as a high-quality PDF. Guidelines and style files can be found at http://www.cs.sfu.ca/~vis/Tasks/camera.html.
Supplemental materials (e.g., videos) are welcome to support the submission. (Videos should be in a common format, e.g., MPEG-1, MPEG-4, XviD, DivX, or QuickTime.)
Submissions should include Name, Affiliation, and Contact Address.
All accepted workshop papers will be published with ISBN as workshop proceedings (by Shaker). In addition, they will be published on the conference DVD (including the supplemental material).
Submissions should be made via Email to: zach_REMOVE_ME@in.tu-clausthal.de
We will ask authors to commit to personally attend the workshop to present the papers.
Workshop Organizer and Contact
Gabriel Zachmann, Clausthal University, Germany
Web: http://cg.in.tu-clausthal.de or http://zach.in.tu-clausthal.de
Email: zach in.tu-clausthal.de