004 Informatik
In this paper, we evaluate the application of Bayesian Optimization (BO) to discrete event simulation (DES) models. As a first step, we create a simple model for which we know the optimal set of parameter values in advance. We implement the model in SimPy, a framework for DES written in Python. We then interpret the simulation model as a black box function subject to optimization. We show that it is possible to find the optimal set of parameter values using the open source library GPyOpt. To extend our evaluation, we create a second, more complex model. To better handle the complexity of this model, and to add a visual component, we build it in Simio, a commercial off-the-shelf simulation modeling tool. To apply BO to a model in Simio, we use the Simio API to write an extension for optimization plug-ins. This extension encapsulates the logic of the BO algorithm, which we deploy as a web service in the cloud.
The fact that simulation models are black box functions with regard to their behavior and the influence of their input parameters makes them a natural candidate for Bayesian Optimization (BO). Simulation models are multivariable and stochastic, and their behavior is to a large extent unpredictable. In particular, we do not know for sure which input parameters to adjust to maximize (or minimize) the model's outcome. In addition, complex models can take a substantial amount of time to run.
Bayesian Optimization is a sequential, self-learning algorithm for optimizing black box functions like those we find in simulation models: they contain a set of parameters for which we want to identify the optimal values, they are expensive to evaluate, and they exhibit stochastic noise. BO has proven to efficiently optimize black box functions from various disciplines. Most notably, it is successfully applied in machine learning to optimize hyperparameters.
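The workflow described above — a small simulation model treated as a black box whose parameters are tuned by a sequential optimizer — can be sketched in plain Python. The queueing model and its cost function below are hypothetical stand-ins (not the paper's SimPy/GPyOpt code), and a simple exhaustive sequential search replaces the actual BO acquisition step:

```python
import random

def simulate_queue(num_servers, n_customers=500, seed=0):
    """Toy multi-server queue with Poisson arrivals; a stand-in for a DES
    model. Returns mean waiting time plus a per-server operating cost, so
    the total cost has an interior optimum in num_servers."""
    rng = random.Random(seed)
    busy_until = [0.0] * num_servers
    t, total_wait = 0.0, 0.0
    for _ in range(n_customers):
        t += rng.expovariate(1.0)                      # inter-arrival, mean 1
        s = min(range(num_servers), key=busy_until.__getitem__)
        start = max(t, busy_until[s])                  # wait if server is busy
        total_wait += start - t
        busy_until[s] = start + rng.expovariate(0.5)   # service time, mean 2
    return total_wait / n_customers + 1.5 * num_servers

def optimize(candidates):
    """Sequentially evaluate the black box at each candidate point.
    A BO library such as GPyOpt would instead choose each next point
    adaptively from a surrogate model and an acquisition function."""
    return min(candidates, key=simulate_queue)

best = optimize(range(1, 7))   # too few servers -> long waits; too many -> high cost
```

The simulator is only ever called through its parameter-to-cost interface, which is exactly the black box view that makes BO applicable.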
SimBO is a flexible framework for optimizing discrete event-driven simulations (DES) using sequential optimization algorithms. While specifically designed for Bayesian Optimization (BO) in the context of DES, SimBO can be applied to any black-box problem with other optimization algorithms. The framework consists of four encapsulated components - the black-box problem, the sequential optimization algorithm, a database for experiment configuration and results, and a web-based graphical user interface - that communicate via well-defined interfaces. Each component can be run in a different environment, allowing cooperation between different hardware and software configurations. In our research context, SimBO's architecture enabled BO algorithms to be run on a high-performance cluster with GPU support, while the simulation was executed on a local Windows machine using the Simio simulation software. The framework's flexibility also makes it suitable for evolving from a research-focused tool into a production-ready, cloud-based optimization tool for modern algorithms.
This paper investigates four different mobile robots with respect to their driving characteristics and soil preservation properties in an agricultural environment. Thereby, robots of classical design from agriculture as well as systems from space robotics with advanced locomotion concepts are considered to determine the individual advantages of each rover concept with respect to the application domain. Locomotion experiments were conducted to analyze the general driving behavior, tensile force, obstacle-surmounting capability, and ground interaction of each robot. Various soil conditions typical for the area of application are taken into account, which are varied in terms of moisture and density. The presented work covers the specification of the conducted experiments, documentation of the implementation, as well as analysis and evaluation of the collected data. In the evaluation, particular attention is paid to the change in driving characteristics under different soil conditions, as well as to the soil stress caused by driving, since soil quality is of critical importance for agricultural applications. The analysis shows that the advanced locomotion concepts, as used in space robotics, also have positive implications for certain requirements in agricultural applications, such as maneuverability in wet conditions and soil conservation. The results show potential for design innovations in agricultural robotics that can be used to open up new fields of application, for instance in the context of precision farming.
The use of high-level synthesis (HLS) tools for FPGAs has increased significantly in recent years, as the tools have matured and allow software programmers to take advantage of reconfigurable hardware technology.
Most HLS tools employ methods to optimize for loops, e.g., by unrolling or pipelining them. But there is hardly any work on the optimization of while loops. This comes as no surprise, since most while loops have loop-carried dependences involving the loop condition, which result in large recurrence cycles in the dataflow graphs. Therefore, typical while loops cannot be parallelized or pipelined.
We propose a novel transformation that makes it possible to optimize while loops nested within a for loop. By interchanging the two loops, it becomes possible to pipeline (and thereby parallelize) the inner loop, resulting in a reduced execution time. We present two case studies on different hardware platforms and show the speedup factors - compared to a host processor and to an unoptimized hardware implementation - achieved by our while loop optimization method.
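The interchange can be illustrated with a small sketch (shown here in Python for brevity; in the HLS setting the code would be C/C++, and the per-element `step`/`done` functions are hypothetical). In the original nesting, the while loop's exit condition forms a recurrence, so nothing can be pipelined; after interchange, the inner for loop carries no cross-iteration dependence and is a pipelining candidate:

```python
def step(x):           # hypothetical per-element update
    return x / 2.0

def done(x):           # hypothetical convergence test
    return x < 1.0

def original(xs):
    """for loop outside, while loop inside: the while condition depends on
    the previous iteration's result, creating a recurrence cycle."""
    xs = list(xs)
    for i in range(len(xs)):
        while not done(xs[i]):
            xs[i] = step(xs[i])
    return xs

def interchanged(xs):
    """while loop moved outward: each sweep of the inner for loop touches
    independent elements, so it could be pipelined in hardware."""
    xs = list(xs)
    active = True
    while active:
        active = False
        for i in range(len(xs)):          # no cross-iteration dependence
            if not done(xs[i]):
                xs[i] = step(xs[i])
                active = True
    return xs
```

Both forms compute the same fixed points; the interchanged version simply reorders the work so that independent element updates are adjacent.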
Artificial intelligence (AI) promises transformative impacts on society, industry, and agriculture, while being heavily reliant on diverse, high-quality data. The resource-intensive "data problem" has initiated a shift to synthetic data. One downside of synthetic data is the so-called "reality gap", a lack of realism. Hybrid data, combining synthetic and real data, addresses this. The paper examines terminological inconsistencies and proposes a unified taxonomy for real, synthetic, augmented, and hybrid data. It aims to enhance AI training datasets in smart agriculture, addressing the challenges in the agricultural data landscape. Utilizing hybrid data in AI models offers improved prediction performance and adaptability.
This paper presents an optimized algorithm for estimating static and dynamic gait parameters. We use a markerless and contactless motion capture system that identifies 20 joints of a person walking along a corridor.
Based on the proposed gait cycle detection, basic metrics such as walking frequency, step/stride length, and support phases are estimated automatically. Applying a rigid body model, we are able to calculate static and dynamic gait stability metrics. We conclude with initial results of a clinical study evaluating orthopaedic technical support.
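As a rough illustration of this kind of metric extraction (a hypothetical sketch, not the paper's algorithm), heel strikes can be detected as local minima of a foot joint's vertical trajectory, from which stride timing and cadence follow:

```python
def heel_strikes(heights, threshold=0.05):
    """Frame indices where the foot height is a local minimum below
    threshold -- a naive stand-in for a proper gait cycle detector."""
    return [i for i in range(1, len(heights) - 1)
            if heights[i] < threshold
            and heights[i] <= heights[i - 1]
            and heights[i] < heights[i + 1]]

def cadence(strike_frames, fps):
    """Steps per minute from consecutive heel strikes of the same foot.
    Each interval between two strikes of one foot is one stride = 2 steps."""
    if len(strike_frames) < 2:
        return 0.0
    strides = len(strike_frames) - 1
    duration_s = (strike_frames[-1] - strike_frames[0]) / fps
    return 2 * strides / duration_s * 60.0
```

For example, a foot-height trace with two clear ground contacts yields two strike frames, and the elapsed time between them gives the stride duration and hence the cadence.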
The use of parallel hardware architectures concerns all software developers: multi- and manycore systems are now deployed everywhere from supercomputers to embedded systems. The challenges for software engineering are manifold. On the one hand, a stronger understanding of the hardware is (once again) necessary; without a scalable partitioning of the software and parallel algorithms, the available computing power goes unused. On the other hand, new programming languages that enable the execution of parallel statements come to the fore.
This book examines various aspects of the development of parallel systems and also takes embedded systems into account. It combines theory and practical application and is thus equally suited to students and to practitioners. Because the algorithms are presented independently of any programming language, they can easily be adapted for one's own applications. Many practical projects support self-study and reinforce what has been learned.
Background
Against the background of a steadily increasing degree of digitalization in health care, professional information management (IM) is required to successfully plan, implement, and evaluate information technology (IT). At its core, IM has to ensure a high quality of health data and health information systems to support patient care.
Objectives
The goal of the present study was to define what constitutes professional IM as a construct as well as to propose a reliable and valid measurement instrument.
Methods
To develop and validate the construct of professionalism of information management (PIM) and its measurement, a stepwise approach followed an established procedure from information systems and behavioral research. The procedure included an analysis of the pertaining literature, expert rounds on the construct and the instrument, two consecutive and comprehensive surveys at the national and international level, exploratory and confirmatory factor analyses, as well as reliability and validity testing.
Results
Professionalism of information management was developed as a construct consisting of the three dimensions of strategic, tactical, and operational IM, as well as of the regularity and cyclical phases of IM procedures as the two elements of professionalism.
The PIM instrument operationalized the construct by providing items that incorporated IM procedures along the three dimensions and cyclical phases. In the instrument, these procedures are evaluated with respect to their degree of regularity. The instrument proved to be reliable and valid in two consecutive measurement phases and across three countries.
Conclusion
It can be concluded that professionalism of information management is a meaningful construct that can be operationalized in a scientifically rigorous manner. Both science and practice can benefit from these developments in terms of improved self-assessment, benchmarking capabilities, and eventually, obtaining a better understanding of health IT maturity.
Facets of Website Content
(2019)
Content is of primary importance in the World Wide Web. In particular, subjective perceptions of content are known to influence a variety of user evaluations, thereby altering attitudes and behavioral outcomes. Thus, it is essential that individually experienced facets of content can be adequately assessed. In a series of seven studies, we create, validate, and benchmark a measure for users' subjective view on web content. In the first six studies, a total of 3106 participants evaluated 60 websites. The resulting Web-CLIC questionnaire is a 12-item measure based on prior research on web content. It encompasses four main facets of users' content experience: clarity, likeability, informativeness, and credibility – jointly representing a general factor of subjective content perception. Very high internal consistencies and high short- to medium-term retest reliabilities are demonstrated. Strong evidence for construct validity in terms of factorial, convergent, divergent, discriminative, concurrent, experimental, and predictive validity is found. In a seventh study, encompassing 7379 ratings of 120 websites, benchmarks for 10 different content domains and optimal cut points are provided. Overall, the present research suggests that the Web-CLIC is a sound measure of subjective content perception, of both practical and theoretical benefit.