We introduce an augmented reality application that allows the representation of planned real-world objects (e.g. wind turbines or power poles) at their actual geographic location. The application uses GPS for positioning, which is then supplemented by augmented reality feature tracking to obtain a constant and stable positional and rotational reading. In addition to the visualization, we use sensor data gathered on the fly to identify foreground objects. Thus, for practical scenarios, our application depicts images with mostly correct occlusion between real and virtual objects. The application will be used to support urban and landscape planners in their process, especially for the purpose of public information and acceptance. It provides an advantage over current planning processes, where the representation of objects is limited to positions on maps, miniature models, or at best a photo montage in which the object is placed into a still camera image.
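The abstract gives no implementation details, but the core step of anchoring a geo-referenced object relative to a GPS-positioned device can be illustrated with a minimal sketch. The function names and the flat-earth approximation (valid only over short distances) are assumptions for illustration, not the paper's method:

```python
# Hypothetical sketch: place a geo-referenced virtual object relative to the
# viewing device. Uses a simple equirectangular approximation; the AR feature
# tracking mentioned in the abstract would refine this pose between GPS fixes.
import math

EARTH_RADIUS_M = 6_371_000.0

def geodetic_to_local_enu(obj_lat, obj_lon, dev_lat, dev_lon):
    """Approximate east/north offset (metres) of an object from the device."""
    d_lat = math.radians(obj_lat - dev_lat)
    d_lon = math.radians(obj_lon - dev_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(dev_lat))
    return east, north

def place_virtual_object(obj_lat, obj_lon, dev_lat, dev_lon, dev_heading_deg):
    """Rotate the east/north offset into the device frame (x right, z forward)."""
    east, north = geodetic_to_local_enu(obj_lat, obj_lon, dev_lat, dev_lon)
    heading = math.radians(dev_heading_deg)
    x = east * math.cos(heading) - north * math.sin(heading)
    z = east * math.sin(heading) + north * math.cos(heading)
    return x, z

if __name__ == "__main__":
    # Example: a planned wind turbine roughly 500 m north-east of the viewer.
    print(place_virtual_object(52.285, 8.025, 52.281, 8.020, dev_heading_deg=30.0))
```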
In physics education, electromagnetic induction is an important but also challenging topic for many students. The early introduction of formulae, e.g. Faraday's law of induction, seems to hinder rather than foster the understanding of the topic's underlying principles. In this paper, we present the basic idea for a teaching concept that can be helpful for a qualitative understanding of electromagnetic induction. To support this teaching concept further, we have developed an application for a tablet computer following the augmented reality approach. The tablet measures the magnetic field strength of a Helmholtz coil and superimposes the corresponding number of virtual field lines on the induction coil.
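The paper itself provides no code, but the mapping from a measured field strength to a number of displayed field lines can be sketched. The flux density at the centre of an ideal Helmholtz coil pair, B = (4/5)^(3/2) · μ0 · n · I / R, is standard physics; the proportionality constant for the line count is a hypothetical visualisation choice, not taken from the paper:

```python
# Minimal sketch (not the paper's implementation): map a measured magnetic
# flux density to a number of virtual field lines for the AR overlay.
MU_0 = 4e-7 * 3.141592653589793  # vacuum permeability in T*m/A

def helmholtz_field_mT(current_A, turns, radius_m):
    """Flux density at the centre of an ideal Helmholtz coil pair, in mT."""
    b_tesla = (4 / 5) ** 1.5 * MU_0 * turns * current_A / radius_m
    return b_tesla * 1e3

def field_lines_for(measured_mT, lines_per_mT=10):
    """Draw a number of field lines proportional to the measured field."""
    return max(1, round(measured_mT * lines_per_mT))

if __name__ == "__main__":
    # Example coil: 124 turns, radius 0.15 m, 1 A -> roughly 0.74 mT.
    b = helmholtz_field_mT(current_A=1.0, turns=124, radius_m=0.15)
    print(b, field_lines_for(b))
```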
With human motion capture being used in various research fields and in the entertainment industry, suitable systems need to be selected based on individual use cases. In this paper we propose a novel software framework that is capable of simulating, comparing, and evaluating any motion capturing system in a purely virtual way. Given an avatar as input character, a user can create an individual tracking setup by simply placing trackers on the avatar's skin. The physical behavior of the placed trackers is configurable and extendable to simulate any existing tracking device. Thus it is possible, e.g., to add or modify drift, noise, latency, frequency, or any other parameter of the virtual trackers. Additionally, it is possible to integrate an individual inverse kinematics (IK) solving system which is steered by the placed trackers. This allows comparing not only different tracker setups, but also different IK solving systems. Finally, users can plug in custom error metrics for comparison of the calculated body poses against ground truth poses. To demonstrate the capabilities of our proposed framework, we present a proof of concept by implementing a simplified simulation model of the HTC Vive tracking system to control the VRIK solver from the FinalIK plugin and calculate error metrics for positional, angular, and anatomic differences.
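The framework's actual API is not given in the abstract; a minimal sketch of the configurable tracker behaviour (drift, noise, latency) and a plug-in style positional error metric might look as follows, with all class names and parameters being illustrative assumptions:

```python
# Illustrative sketch only: a virtual tracker that perturbs ground-truth
# positions with drift, noise, and latency, plus a simple error metric.
import random
from collections import deque

class VirtualTracker:
    def __init__(self, noise_std=0.002, drift_per_step=(0.0001, 0.0, 0.0),
                 latency_steps=3):
        self.noise_std = noise_std            # metres of Gaussian jitter
        self.drift = list(drift_per_step)     # metres accumulated per step
        self.offset = [0.0, 0.0, 0.0]
        self.buffer = deque(maxlen=latency_steps + 1)  # models latency

    def sample(self, true_pos):
        """Return a delayed, drifted, noisy reading of the true position."""
        self.offset = [o + d for o, d in zip(self.offset, self.drift)]
        noisy = [p + o + random.gauss(0.0, self.noise_std)
                 for p, o in zip(true_pos, self.offset)]
        self.buffer.append(noisy)
        return self.buffer[0]  # oldest buffered sample = latency

def positional_error(estimated, ground_truth):
    """Euclidean distance between an IK-solved joint and the ground truth."""
    return sum((e - g) ** 2 for e, g in zip(estimated, ground_truth)) ** 0.5

if __name__ == "__main__":
    tracker = VirtualTracker()
    for step in range(10):
        true_pos = (0.0, 1.0 + 0.01 * step, 0.0)   # e.g. a hand moving upward
        reading = tracker.sample(true_pos)
        print(step, round(positional_error(reading, true_pos), 4))
```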
The paper addresses the dilemma that spatial planning requires valid visualizations of planned projects at an early stage, while reliable statements about their later appearance are often not yet possible. Using wind energy as an example, typical problems and challenges are outlined in the introduction. Building on this, the current state of the use of geographic information systems and visualization techniques is summarized and relevant problems are pointed out. Against this background, approaches for flexible visualization through the coupling of GIS technologies and game engines from the computer games domain are sketched. The future potential of AR/VR technologies in this field is discussed.
A conceptual understanding of the motion of charged particles in electric and magnetic fields is an important goal of upper secondary physics education. However, a suitable test instrument for assessing such learning goals has so far been lacking in the German-speaking countries. This paper describes the development of a multiple-choice test for evaluating corresponding instruction. In three studies, arguments were collected that support the validity of such an interpretation of the test scores. In Study 1, the performance of 283 students was assessed, and a Rasch analysis showed that the two targeted dimensions (understanding of the motion of charged particles in the electric and in the magnetic field, respectively) can be statistically separated. This can primarily be regarded as an argument for structural validity. In Study 2, cognitive validity was examined with a subsample of 18 students from Study 1. The results were used to modify the test accordingly. In Study 3, 55 students were instructed by means of a relevant computer-based learning game. The large gain in test scores is an indication of the instructional sensitivity of the test, an important prerequisite for consequential validity. The test is made available as online material.
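As background to the Rasch analysis mentioned above (not the authors' specific multidimensional analysis), the dichotomous Rasch model relates the probability of a correct response to person ability and item difficulty; a tiny sketch:

```python
# Background sketch: dichotomous Rasch model, P(correct) as a function of
# person ability theta and item difficulty b (both in logits).
import math

def rasch_probability(theta, b):
    """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

if __name__ == "__main__":
    # A student whose ability matches the item difficulty answers correctly
    # with probability 0.5; one logit above raises this to about 0.73.
    print(rasch_probability(0.0, 0.0), rasch_probability(1.0, 0.0))
```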
Over the last years, drones have become a more and more popular solution for inspection and survey tasks. Controlling these drones, especially in tight spaces, using ‘line of sight’ or a ‘first person’ view from the perspective of the drone can be a difficult task. Users often experience an increased level of difficulty that can be traced back to their limited situational overview. To investigate whether a different form of visualization and interaction might result in a higher level of usability, an experimental workspace was set up with the goal of exploring the possibility of implementing a ‘third person view’ metaphor, like the one used in video games. To further allow the user to experience the environment, virtual reality was used to stream the follower's perspective directly to the user's headset. This allowed the user to fly inside a simulated environment, providing a controllable and repeatable testing ground for the software. The workspace consisted of a simulation in which a proof of concept was developed. In this simulation, a drone used a conventional GPS sensor to follow a human-controlled drone, offering its view, from a static camera, as a third-person perspective to the controller via a virtual reality headset. Within the framework of the project, two aspects in particular were investigated: the performance of the technical system, and the basic user experience and ergonomics of this form of interaction. To evaluate the performance of the follower system, the GPS position as well as execution times and latencies were recorded. The user experience was evaluated based on personal interviews. The results show that the developed system can in fact follow a drone based on the GPS position alone and calculate the desired positions in a timely manner. However, the delay in movement introduced by the controller execution, as well as the drone's own inertia, did not allow for continuous camera tracking of the drone using a static camera. This introduced several issues regarding tracking and impacted the user experience, but still showed that such a metaphor could in principle be implemented and further refined. The personal interviews showed that users would try to track the drone by moving their head, as they are used to in virtual reality games. Ultimately, it was deduced that introducing vector-based drone movement, additional range detection sensors, and a movable camera controlled via head movement would be the next steps to improve the overall system. Since the prototype created in this paper contained only a bare-bones user interface and experience, a usability study was foregone in favor of a more stable software solution. This gives further research into this topic the possibility of evaluating possible types of spatial user interfaces, which could improve user immersion.
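The follower logic itself is not spelled out in the abstract; a hedged sketch of the core step (computing a desired follower position a fixed distance behind the target drone from its GPS track) could look like this. The function names, the follow distance, and the flat-earth conversion are assumptions for illustration:

```python
# Hypothetical sketch of a GPS-based follower: place the follower drone a
# fixed distance behind the target's direction of travel.
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_metres(lat, lon, ref_lat, ref_lon):
    """Convert lat/lon to local east/north metres relative to a reference."""
    north = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    east = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    return east, north

def desired_follower_position(target_now, target_before, ref, follow_distance_m=5.0):
    """Desired follower position: follow_distance_m behind the target's heading."""
    ex1, ny1 = gps_to_metres(*target_now, *ref)
    ex0, ny0 = gps_to_metres(*target_before, *ref)
    dx, dy = ex1 - ex0, ny1 - ny0
    norm = math.hypot(dx, dy) or 1.0          # avoid division by zero when hovering
    return ex1 - follow_distance_m * dx / norm, ny1 - follow_distance_m * dy / norm

if __name__ == "__main__":
    ref = (52.2799, 8.0472)                                  # arbitrary local origin
    now, before = (52.28001, 8.04720), (52.28000, 8.04720)   # target moving north
    print(desired_follower_position(now, before, ref))       # ~5 m behind the target
```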