
Projects


The following is a sample list of my past projects. For more, please do not hesitate to contact me.

PhD


The Virtual Reality Questionnaire Toolkit

Authors: Martin Feick, Niko Kleer, Anthony Tang, and Antonio Krüger

pdf | video | poster | Conference: UIST’20 (website)

Abstract: In this work, we present the VRQuestionnaireToolkit, which enables the research community to easily collect subjective measures within virtual reality (VR). We contribute a highly customizable and reusable open-source toolkit which can be integrated into existing VR projects rapidly. The toolkit comes with a pre-installed set of standard questionnaires such as the NASA TLX, SSQ, and SUS presence questionnaire. Our system aims to lower the entry barrier to using questionnaires in VR and to significantly reduce the development time and cost needed to run pre-, in-between-, and post-study questionnaires.
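To give a sense of the kind of standard instrument the toolkit ships with, the following Python sketch scores a NASA TLX response, both raw and weighted. It is purely illustrative and not the toolkit's API; the function names and data layout are assumptions.

# Illustrative only (not the VRQuestionnaireToolkit API): scoring a NASA TLX response.
# The six subscales are rated 0-100; weights come from the 15 pairwise comparisons
# and therefore sum to 15. Raw TLX is simply the unweighted mean.

SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings: dict[str, float]) -> float:
    """Unweighted (raw) TLX: mean of the six subscale ratings (0-100)."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def weighted_tlx(ratings: dict[str, float], weights: dict[str, int]) -> float:
    """Weighted TLX: each rating is weighted by how often its subscale was
    chosen in the 15 pairwise comparisons (so the weights sum to 15)."""
    assert sum(weights.values()) == 15, "pairwise-comparison weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 30, "effort": 60, "frustration": 40}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(f"raw TLX: {raw_tlx(ratings):.1f}, weighted TLX: {weighted_tlx(ratings, weights):.1f}")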



Master


TanGi: Tangible Proxies For Embodied Object Exploration And Manipulation In Virtual Reality

Authors: Martin Feick, Scott Bateman, Anthony Tang, André Miede, and Nicolai Marquardt

pdf | video | Conference: ISMAR’20 (website)

Abstract: Exploring and manipulating complex virtual objects is challenging due to limitations of conventional controllers and free-hand interaction techniques. We present the TanGi toolkit which enables novices to rapidly build physical proxy objects using Composable Shape Primitives. TanGi also provides Manipulators allowing users to build objects including movable parts, making them suitable for rich object exploration and manipulation in VR. With a set of different use cases and applications we show the capabilities of the TanGi toolkit and evaluate its use. In a study with 16 participants, we demonstrate that novices can quickly build physical proxy objects using the Composable Shape Primitives and explore how different levels of object embodiment affect virtual object exploration. In a second study with 12 participants we evaluate TanGi’s Manipulators and investigate the effectiveness of embodied interaction. Findings from this study show that TanGi’s proxies outperform traditional controllers and were generally favored by participants.
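To make the idea of Composable Shape Primitives more concrete, the hypothetical Python sketch below describes a proxy object as a small tree of primitives, some of which carry a manipulator such as a hinge. The data model is purely illustrative and not TanGi's actual implementation.

# Hypothetical data model for a proxy built from composable shape primitives;
# not TanGi's actual implementation.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Manipulator:
    kind: str                          # e.g. "hinge" or "slider"
    axis: tuple[float, float, float]   # axis of movement
    limits: tuple[float, float]        # allowed range (degrees or millimetres)

@dataclass
class Primitive:
    shape: str                         # e.g. "cube", "cylinder", "sphere"
    size_mm: tuple[float, float, float]
    manipulator: Optional[Manipulator] = None
    children: list["Primitive"] = field(default_factory=list)

# A spray-bottle-like proxy: a cylindrical body with a trigger on a hinge.
bottle = Primitive("cylinder", (40, 40, 120), children=[
    Primitive("cube", (20, 10, 40),
              manipulator=Manipulator("hinge", (1, 0, 0), (0, 25))),
])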


Tactlets: Adding Tactile Feedback to 3D Objects Using Custom Printed Controls

Authors: Daniel Groeger, Martin Feick, Anusha Withana, and Jürgen Steimle

pdf | video | poster | Conference: UIST’19 (website)

Abstract: Rapid prototyping of haptic output on 3D objects promises to enable a more widespread use of the tactile channel for ubiquitous, tangible, and wearable computing. Existing prototyping approaches, however, have limited tactile output capabilities, require advanced skills for design and fabrication, or are incompatible with curved object geometries. In this paper, we present a novel digital fabrication approach for printing custom, high-resolution controls for electro-tactile output with integrated touch sensing on interactive objects. It supports curved geometries of everyday objects. We contribute a design tool for modeling, testing, and refining tactile input and output at a high level of abstraction, based on parameterized electro-tactile controls. We further contribute an inventory of 10 parametric Tactlet controls that integrate sensing of user input with real-time electro-tactile feedback. We present two approaches for printing Tactlets on 3D objects, using conductive inkjet printing or FDM 3D printing. Empirical results from a psychophysical study and findings from two practical application cases confirm the functionality and practical feasibility of the Tactlets approach.
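As a purely hypothetical illustration of what a parameterized control might look like (not the Tactlets design tool or its API): a slider-style control can be described by its length and electrode count, and a sensed touch position mapped to the electrode to pulse.

# Hypothetical sketch: a parametric slider control rendered with n electro-tactile
# electrodes along its length; map a sensed touch position to an electrode index.

def electrode_for_touch(position_mm: float, length_mm: float, n_electrodes: int) -> int:
    """Discretise a touch position along the slider into the electrode to activate."""
    position_mm = max(0.0, min(length_mm, position_mm))
    index = int(position_mm / length_mm * n_electrodes)
    return min(index, n_electrodes - 1)

# A 60 mm slider with 10 electrodes (assumed parameters): a touch at 33 mm hits electrode 5.
print(electrode_for_touch(33.0, length_mm=60.0, n_electrodes=10))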


Mixed-Reality for Object-Focused Remote Collaboration

Authors: Martin Feick, Anthony Tang, and Scott Bateman

pdf | video | poster | Conference: UIST’18 (website)

Abstract: In this paper we outline the design of a mixed-reality system to support object-focused remote collaboration. Here, being able to adjust collaborators’ perspectives on the object, as well as understand one another’s perspective, is essential to support effective collaboration over distance. We propose a low-cost mixed-reality system that allows users to: (1) quickly align and understand each other’s perspective; (2) explore objects independently from one another; and (3) render gestures in the remote collaborator’s workspace. In this work, we focus on the expert’s role and introduce an interaction technique allowing users to quickly manipulate 3D virtual objects in space.


Fundamentals of Real-Time Data Processing Architectures Lambda and Kappa

Authors: Martin Feick, Niko Kleer, and Marek Kohn (course-project: all authors contributed equally)

Supervisor: Prof. Dr. Markus Esch

pdf | Conference: SKILL 2018 (website)

Abstract: The amount of data, and the importance of simple, scalable, and fault-tolerant architectures for processing it, keep increasing. Big Data, a highly influential topic in numerous businesses, has generated comprehensive interest in this data. The Lambda as well as the Kappa Architecture represent state-of-the-art real-time data processing architectures for coping with massive data streams. This paper investigates and compares both architectures with respect to their capabilities and implementation. Moreover, a case study is conducted to gain more detailed insights concerning their strengths and weaknesses.
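As a minimal, framework-free illustration of the difference between the two architectures (an assumed toy example in Python, not code from the paper): a Lambda architecture maintains a batch view and a speed view that are merged at query time, while a Kappa architecture treats everything as one replayable stream.

# Toy word count over an event log; real systems would use e.g. Hadoop/Spark plus a
# stream processor (Lambda) or a single stream processor such as Kafka Streams (Kappa).
from collections import Counter

events = ["a", "b", "a", "c", "a", "b"]      # full event log
batch, recent = events[:4], events[4:]       # the last two events arrived after the batch run

# Lambda: batch view over historical data + speed view over recent data, merged at query time.
batch_view = Counter(batch)
speed_view = Counter(recent)
lambda_result = batch_view + speed_view

# Kappa: one streaming code path; "reprocessing" simply replays the full log.
kappa_result = Counter()
for event in events:
    kappa_result[event] += 1

assert lambda_result == kappa_result         # same answer, very different operational complexity
print(lambda_result)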



Bachelor


The Way You Move: The Effect of a Robot Surrogate Movement in Remote Collaboration

Authors: Martin Feick, Lora Oehlberg, Anthony Tang, André Miede, and Ehud Sharlin

Conference: HRI’18 (website)

Abstract: In this paper, we discuss the role of the movement trajectory and velocity enabled by our tele-robotic system (ReMa) for remote collaboration on physical tasks. Our system reproduces changes in object orientation and position at a remote location using a humanoid robotic arm. However, even minor kinematic differences between the robot and the human arm can result in awkward or exaggerated robot movement. As a result, user communication with the robotic system can become less efficient, less fluent, and more time-intensive.


Implementing a Humanoid Tele-Robotic Prototype for Investigating Issues in Remote Collaboration

Author: Martin Feick (Bachelor’s Thesis)

Supervisors: Prof. Dr. Anthony Tang and Prof. Dr. André Miede

Code:

Abstract: We designed and developed a novel system, ReMa (Remote Manipulator), for supporting remote collaboration on physical tasks through a physical-telepresence humanoid robot. The system captures and reproduces object manipulations on a proxy object at a remote location. The prototype combines the latest robotics and motion-capture technologies, exploring their capabilities and limitations. We found that directly mapping human and robot action is problematic due to the arrangement and limits of the robot joints. We applied ReMa to investigate how the limited perspective in current video-mediated systems affects remote collaboration. Our main findings are: (1) a shared perspective is more effective and preferred compared to the opposing perspective offered by conventional video chat systems, and (2) the physical proxy and video chat complement one another in a combined system; people used the physical proxy to understand objects, and used video chat to perform gestures and confirm remote actions. These findings validate both the design and implementation of ReMa as an effective research platform.
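As a toy illustration of why a direct mapping breaks down (hypothetical joint names and limits, not ReMa's actual controller): clamping captured human joint angles to the robot's narrower joint limits keeps the commands valid but distorts the reproduced pose.

# Hypothetical sketch: naive 1:1 mapping of captured human joint angles onto a robot arm
# whose joints have narrower limits; clamping is what makes the motion look awkward.
ROBOT_LIMITS_DEG = {"shoulder": (-90, 90), "elbow": (0, 135), "wrist": (-60, 60)}  # assumed limits

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def map_pose(human_angles_deg: dict[str, float]) -> dict[str, float]:
    """Direct mapping with per-joint clamping to the robot's limits."""
    return {joint: clamp(angle, *ROBOT_LIMITS_DEG[joint])
            for joint, angle in human_angles_deg.items()}

human_pose = {"shoulder": 120.0, "elbow": 150.0, "wrist": -75.0}   # easy for a human arm
robot_pose = map_pose(human_pose)
error = {joint: human_pose[joint] - robot_pose[joint] for joint in human_pose}
print(robot_pose)   # {'shoulder': 90.0, 'elbow': 135.0, 'wrist': -60.0}
print(error)        # the per-joint error that accumulates into exaggerated or awkward movement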


Perspective on and Re-Orientation of Physical Proxies in Object-Focused Remote Collaboration

Authors: Martin Feick, Terrance Mok, Anthony Tang, Lora Oehlberg, and Ehud Sharlin

Conference: CHI’18 (website)

Abstract: Remote collaborators working together on physical objects have difficulty building a shared understanding of what each person is talking about. Conventional video chat systems are insufficient for many situations because they present a single view of the object in a flattened image. To understand how this limited perspective affects collaboration, we designed the Remote Manipulator (ReMa), which can reproduce orientation manipulations on a proxy object at a remote site. We conducted two studies with ReMa, with two main findings. First, a shared perspective is more effective and preferred compared to the opposing perspective offered by conventional video chat systems. Second, the physical proxy and video chat complement one another in a combined system: people used the physical proxy to understand objects, and used video chat to perform gestures and confirm remote actions.


TossCraft was a cooperative study project at the Saarland University of Applied Sciences (htw saar).

Authors: Martin Feick, Niko Kleer, and Marek Kohn (All authors contributed equally to this work)

Supervisor: Prof. Dr. André Miede

Project report: | Code:

Abstract: The project TossCraft explores the capabilities and limitations of the Augmented Reality (AR) Leap Motion device. Our goal was to simulate a “natural” throwing movement within a game environment. In TossCraft, the player interacts with a variety of spheres representing real-world objects such as a football, bowling ball, or bouncy ball. To proceed in the game, the spheres have to be physically moved into a given target by the player; in some cases, hitting a target is sufficient as well. This can be done by either throwing or punching the sphere. Reaching the target is accomplished by overcoming different challenges. The game starts off easy, but every level becomes more challenging, demanding that the player be even more careful about their actions. To finish the game, the player has to solve all maps.
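A rough sketch of the kind of computation behind a "natural" throw with tracked hands (illustrative only, not the project's code): estimate the release velocity from the most recent tracked palm positions and hand that velocity to the game's physics engine.

# Illustrative only: estimating a throw's release velocity from tracked palm positions.
# A Leap-Motion-style tracker delivers samples at a fixed rate; here we finite-difference
# the last two samples, while a real game would smooth over several frames.

def release_velocity(positions, dt):
    """positions: list of (x, y, z) palm samples in metres; dt: seconds between samples."""
    (x0, y0, z0), (x1, y1, z1) = positions[-2], positions[-1]
    return ((x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt)

samples = [(0.00, 0.20, 0.00), (0.02, 0.22, -0.05), (0.05, 0.25, -0.12)]  # assumed tracking data
velocity = release_velocity(samples, dt=1 / 90)    # ~90 Hz tracking assumed
print(velocity)    # applied to the sphere's rigid body at the moment of release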


Twitter-Monitor was a cooperative study project at the Saarland University of Applied Sciences (htw saar).

Authors: Marius Backes, Martin Feick, Niko Kleer, Marek Kohn, Stefan Schlösser, Philipp Schäfer, and Oliver Seibert (All authors contributed equally to this work)

Supervisor: Prof. Dr. Klaus Berberich

Project report: | Code:

Abstract: Twitter-Monitor allows users to filter Twitter messages based on their personal interests. The intelligent web application continuously listens to the Twitter stream and stores relevant tweets in a database. The user enters keywords that describe topics they are interested in. Based on this input, our algorithm calculates a rating for each tweet, which is then potentially displayed in the user interface. Users can then evaluate the displayed tweets by choosing the most interesting ones, providing a feedback mechanism for the algorithm. The application also offers settings such as e-mail notifications, topic categories, and a blacklist for topics that are not relevant to the user.
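A minimal sketch of the kind of keyword-based rating and feedback loop described above (assumed logic in Python, not the project's actual algorithm):

# Assumed illustration: score tweets by keyword hits and let user feedback boost the
# weights of keywords that appear in tweets the user marked as interesting.
from collections import defaultdict

weights = defaultdict(lambda: 1.0)      # one weight per user keyword

def rate(tweet: str, keywords: list[str]) -> float:
    words = tweet.lower().split()
    return sum(weights[k] for k in keywords if k in words)

def feedback(tweet: str, keywords: list[str], boost: float = 0.5) -> None:
    """The user marked this tweet as interesting: boost the keywords it contains."""
    words = tweet.lower().split()
    for k in keywords:
        if k in words:
            weights[k] += boost

keywords = ["vr", "haptics", "robotics"]
tweets = ["new vr haptics paper out", "weather is nice", "robotics demo at the lab"]
feedback(tweets[0], keywords)           # user liked the first tweet
ranked = sorted(tweets, key=lambda t: rate(t, keywords), reverse=True)
print([(t, rate(t, keywords)) for t in ranked])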


Towards a critical perspective on mHealth applications was a second-year (undergraduate) project at the Saarland University of Applied Sciences (htw saar).

Author: Martin Feick

Supervisor: Prof. Dr. André Miede

Tech report:

Abstract: Today’s world is influenced by technologies which create opportunities to support people in their daily lives. Companies promote and market products in different ways to reach everybody and to maximize their turnover, which is especially relevant for the healthcare sector. People with diseases could find relief in dealing with their illness, but the provided applications and information come with a huge responsibility for patients. How far can these applications support people without risk, and what about the usability of medical treatment and diagnosis? This paper is primarily concerned with these questions and relates them to current problems in the development process of mHealth applications.