About me

My name is Martin Feick, and I am a PhD student in the Cognitive Assistants Department at the German Research Center for Artificial Intelligence (DFKI). I am also a member of the Ubiquitous Media Technology Lab (UMTL) at Saarland University.

I completed my Master's thesis research in Human-Computer Interaction (HCI) at University College London, UCLIC (United Kingdom). During my Master's studies, I also worked part-time as an HCI researcher in the HCI lab at Saarland University. Before that, I spent six months in the Interactions Lab at the University of Calgary (Canada) writing my Bachelor's thesis. I hold a Master of Science and a Bachelor of Science in Applied Computer Science from Saarland University of Applied Sciences.

My HCI research interests are in designing and developing technology to support object-centred interaction and collaboration across different modalities. I combine Virtual, Augmented, and Mixed Reality, fabrication, and robotics to create novel interfaces that enable more efficient and natural interaction.

Recent Projects

Tactlets: Adding Tactile Feedback to 3D Objects Using Custom Printed Controls: Rapid prototyping of haptic output on 3D objects promises to enable more widespread use of the tactile channel for ubiquitous, tangible, and wearable computing. Existing prototyping approaches, however, have limited tactile output capabilities, require advanced skills for design and fabrication, or are incompatible with curved object geometries. In this paper, we present a novel digital fabrication approach for printing custom, high-resolution controls for electro-tactile output with integrated touch sensing on interactive objects, including the curved geometries of everyday objects. We contribute a design tool for modeling, testing, and refining tactile input and output at a high level of abstraction, based on parameterized electro-tactile controls. We further contribute an inventory of 10 parametric Tactlet controls that integrate sensing of user input with real-time electro-tactile feedback. We present two approaches for printing Tactlets on 3D objects, using conductive inkjet printing or FDM 3D printing. Empirical results from a psychophysical study and findings from two practical application cases confirm the functionality and practical feasibility of the Tactlets approach.
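
The design tool's key abstraction is the parameterized control. As a rough illustration only, the Python sketch below shows what one such parametric control could look like; every name here (TactletControl, electrode_pitch_mm, pattern_for_touch) is a hypothetical stand-in, not the actual Tactlets software.

    # A rough, hypothetical sketch of a parameterized electro-tactile control.
    # All names and parameters are illustrative assumptions, not the actual
    # Tactlets design tool or API.
    from dataclasses import dataclass

    @dataclass
    class TactletControl:
        """One printable control: a strip of electrodes with touch sensing."""
        name: str
        length_mm: float           # extent of the control on the object surface
        electrode_pitch_mm: float  # spacing between electro-tactile electrodes
        intensity: float           # normalized stimulation intensity, 0..1

        def electrode_count(self) -> int:
            # How many discrete electrodes fit at the chosen pitch.
            return max(1, int(self.length_mm // self.electrode_pitch_mm))

        def pattern_for_touch(self, position_mm: float) -> list:
            # Drive only the electrode nearest the sensed touch position.
            pattern = [0.0] * self.electrode_count()
            index = min(self.electrode_count() - 1,
                        int(position_mm / self.electrode_pitch_mm))
            pattern[index] = self.intensity
            return pattern

    slider = TactletControl("volume-slider", length_mm=60.0,
                            electrode_pitch_mm=5.0, intensity=0.8)
    print(slider.electrode_count())        # 12 electrodes
    print(slider.pattern_for_touch(22.0))  # feedback under the touch point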

Mixed-Reality for Object-Focused Remote Collaboration: In this paper, we outline the design of a mixed-reality system to support object-focused remote collaboration, where being able to adjust collaborators' perspectives on the object, and to understand one another's perspective, is essential for effective collaboration over distance. We propose a low-cost mixed-reality system that allows users to: (1) quickly align and understand each other's perspective; (2) explore objects independently from one another; and (3) render gestures in the remote collaborator's workspace. In this work, we focus on the expert's role and introduce an interaction technique that allows users to quickly manipulate 3D virtual objects in space.
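
To make point (1) concrete: a common way to align perspectives in object-focused setups is to preserve the expert's camera pose relative to the object and re-apply it at the remote site. The NumPy sketch below is a minimal illustration under assumed conventions (4x4 homogeneous poses, function and variable names of my choosing); it is not the system's actual implementation.

    # A minimal sketch of perspective alignment, assuming 4x4 homogeneous
    # poses expressed in each site's world frame. Names and conventions are
    # assumptions, not the system's implementation.
    import numpy as np

    def align_remote_camera(local_object, expert_camera, remote_object):
        """Place the remote camera so it sees the object as the expert does."""
        # Expert's camera expressed in the object's own frame...
        expert_in_object = np.linalg.inv(local_object) @ expert_camera
        # ...re-applied to the remote site's copy of the object yields the
        # same relative viewpoint, however that copy is currently oriented.
        return remote_object @ expert_in_object

    local_obj = np.eye(4)      # object pose at the expert's site
    expert_cam = np.eye(4)
    expert_cam[2, 3] = 0.5     # expert views the object from 0.5 m along +z
    remote_obj = np.eye(4)     # proxy pose at the remote site
    print(align_remote_camera(local_obj, expert_cam, remote_obj))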

The Way You Move: The Effect of a Robot Surrogate Movement in Remote Collaboration: In this paper, we discuss the role of the movement trajectory and velocity produced by our tele-robotic system (ReMa) in remote collaboration on physical tasks. Our system reproduces changes in object orientation and position at a remote location using a humanoid robotic arm. However, even minor kinematic differences between the robot and the human arm can result in awkward or exaggerated robot movement. As a result, user communication with the robotic system can become less efficient, less fluent, and more time-intensive.
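
One simple mitigation in this spirit, shown purely as an illustrative sketch (the function, units, and limit below are assumptions, not part of ReMa), is to time-scale a reproduced trajectory so the robot never exceeds its own velocity limit instead of replaying human-speed motion verbatim.

    # An illustrative sketch, not part of ReMa: time-scale a reproduced
    # trajectory so the robot never exceeds its own velocity limit, rather
    # than replaying human-speed motion verbatim.
    def time_scale(waypoints, timestamps, max_speed):
        """Stretch segment durations whose implied speed exceeds max_speed.

        waypoints  -- positions along one axis (e.g., a joint angle per step)
        timestamps -- matching, strictly increasing times in seconds
        max_speed  -- the robot's velocity limit, in the same units per second
        """
        scaled = [timestamps[0]]
        for i in range(1, len(waypoints)):
            dt = timestamps[i] - timestamps[i - 1]
            speed = abs(waypoints[i] - waypoints[i - 1]) / dt
            if speed > max_speed:
                dt *= speed / max_speed  # slow this segment down to the limit
            scaled.append(scaled[-1] + dt)
        return scaled

    # A fast human flick (2 rad in 0.5 s) replayed under a 1 rad/s limit
    # takes 2 s instead, keeping the motion within the robot's capabilities.
    print(time_scale([0.0, 2.0], [0.0, 0.5], max_speed=1.0))  # [0.0, 2.0]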

Perspective on and Re-Orientation of Physical Proxies in Object-Focused Remote Collaboration: Remote collaborators working together on physical objects have difficulty building a shared understanding of what each person is talking about. Conventional video chat systems are insufficient for many situations because they present a single view of the object in a flattened image. To understand how this limited perspective affects collaboration, we designed the Remote Manipulator (ReMa), which can reproduce orientation manipulations on a proxy object at a remote site. We conducted two studies with ReMa, with two main findings. First, a shared perspective is more effective than, and preferred to, the opposing perspective offered by conventional video chat systems. Second, the physical proxy and video chat complement one another in a combined system: people used the physical proxy to understand the object, and used video chat to perform gestures and confirm remote actions.
