Fakultät IuI
This article proposes the concept of a simulation framework for environmental sensors with multilevel abstraction in agricultural scenarios. The implementation case study is a simulation of a grain-harvesting scenario enabled by LiDAR sensors. Environmental sensor models, as well as the kinematics and dynamic behavior of the machines, are based on the robotics simulator Gazebo. Models for the powertrain, machine process aggregates, and peripheral simulation components are implemented with MATLAB/Simulink and with the robotics middleware Robot Operating System (ROS). This article covers the general concept of the multilevel simulation framework and, in particular, sensor and environmental modeling.
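To make the layering concrete, the following is a minimal sketch of how the lowest abstraction level might look on the ROS side: a node that consumes simulated LaserScan messages from Gazebo and forwards a reduced feature to higher-level simulation components. The topic names and the reduction to a closest-obstacle distance are illustrative assumptions, not the framework's actual interface.

```python
# Minimal sketch: a ROS node that consumes simulated LiDAR scans published by
# Gazebo and forwards a reduced feature (closest obstacle distance) to other
# simulation levels. Topic names are assumptions for illustration.
import rospy
from sensor_msgs.msg import LaserScan
from std_msgs.msg import Float32

def on_scan(scan, pub):
    # Keep only valid range readings within the sensor's limits.
    valid = [r for r in scan.ranges if scan.range_min <= r <= scan.range_max]
    if valid:
        pub.publish(Float32(min(valid)))  # closest obstacle in metres

if __name__ == "__main__":
    rospy.init_node("lidar_abstraction")
    pub = rospy.Publisher("/harvester/closest_obstacle", Float32, queue_size=1)
    rospy.Subscriber("/harvester/lidar/scan", LaserScan, on_scan, callback_args=pub)
    rospy.spin()
```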
The objective of this review is a global assessment of the economics of second-generation biorefineries, with a focus on the use of food waste and agricultural residues for chemical production by applying biotechnological processes. Analyses are conducted on feedstock and product distribution, applied economic models, and profitability figures for the period 2013–2018. In a study of 163 articles on different biorefinery systems, the production of chemicals is identified as the second major product class, after bioenergy. Bagasse and straw are frequently analyzed second-generation feedstocks. Based on the evaluation of 22 articles, second-generation biorefineries producing chemicals by applying biotechnological processes prove to be economically feasible. On average, both the internal rate of return (IRR) and the return on investment (ROI) are 20%, and the payback period (PP) is 6 years. The cost share of feedstock in biorefineries ranges from 0% to 50%. The price of the end product and the fermentation yields have the greatest impact on profitability. The processing of food waste of industrial and municipal origin appears more economical than the processing of agricultural residues. Scientists, policy makers, and entrepreneurs with an appropriate risk tolerance are advised to pay particular attention to municipal food waste and the potential economic production of carboxylic acids. For various economic issues related to biorefineries, dynamic-deterministic models are recommended, which can be extended by a stochastic model. This review provides an initial overview of the economic feasibility of second-generation biorefineries. Further techno-economic analyses are required to produce statistically significant statements on key profitability figures. © 2020 The Authors. Biofuels, Bioproducts, and Biorefining published by Society of Chemical Industry and John Wiley & Sons, Ltd.
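For readers unfamiliar with the profitability figures named above, the short sketch below evaluates the simple, undiscounted definitions of ROI and payback period on invented numbers; only the formulas are standard, and the cash flows are not taken from any of the reviewed studies. The IRR requires the full multi-year cash-flow series and is omitted here.

```python
# Hedged illustration of two of the profitability figures discussed above,
# using invented numbers; only the simple (undiscounted) formulas for ROI
# and payback period reflect standard definitions.
capex = 10_000_000.0                # total investment, EUR (assumed)
annual_net_cash_flow = 2_000_000.0  # net cash flow, EUR per year (assumed)

roi = annual_net_cash_flow / capex             # return on investment
payback_period = capex / annual_net_cash_flow  # years until break-even

print(f"ROI: {roi:.0%}")                              # -> ROI: 20%
print(f"Payback period: {payback_period:.0f} years")  # -> 5 years
```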
Artificial intelligence (AI) and human-machine interaction (HMI) are two keywords that rarely appear together in embedded applications. Among the steps needed before applying AI to solve a specific task, HMI is usually missing from AI architecture design and from the training of an AI model; the human-in-the-loop concept is well established in all other steps of AI development, from data analysis via data selection and cleaning to performance evaluation. During AI architecture design, HMI can immediately highlight unproductive layers of the architecture, so that lightweight network architectures for embedded applications can be created easily. We show that with this HMI, users can instantly distinguish which AI architecture should be trained and evaluated first, since a high accuracy on the task can be expected. This approach reduces the resources needed for AI development by avoiding the training and evaluation of AI architectures with unproductive layers, and it leads to lightweight AI architectures. These lightweight architectures in turn enable HMI while the AI runs on an edge device. By enabling HMI during inference, we introduce the AI-in-the-loop concept, which combines the strengths of AI and humans. In our AI-in-the-loop approach, the AI remains the workhorse and primarily solves the task; if the AI is unsure whether its inference solves the task correctly, it asks the user through an appropriate HMI. Consequently, AI will soon become available in many more applications, since HMI makes AI more reliable and explainable.
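A minimal sketch of the AI-in-the-loop decision rule described above: the model answers autonomously when its confidence exceeds a threshold and defers to the user otherwise. The softmax confidence measure, the threshold value, and the ask_human placeholder are assumptions for illustration, not the authors' implementation.

```python
# Sketch of the AI-in-the-loop idea: the model answers on its own when it is
# confident and defers to a human via some HMI otherwise. Threshold and
# ask_human() are placeholders, not part of the paper.
import numpy as np

CONFIDENCE_THRESHOLD = 0.9  # assumed; would be tuned per application

def softmax(logits):
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def ask_human(sample):
    # Stand-in for the HMI, e.g. a touch display on the edge device.
    return input(f"Please label sample {sample}: ")

def ai_in_the_loop(logits, sample):
    probs = softmax(np.asarray(logits, dtype=float))
    if probs.max() >= CONFIDENCE_THRESHOLD:
        return int(probs.argmax())  # AI is the workhorse
    return ask_human(sample)        # uncertain -> the human decides

print(ai_in_the_loop([4.0, 0.1, 0.2], "frame_001"))  # confident -> class 0
```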
This paper presents an optimized algorithm for estimating static and dynamic gait parameters. We use a marker- and contact-less motion capture system that identifies 20 joints of a person walking along a corridor.
Based on the proposed gait cycle detection, basic metrics such as walking frequency, step/stride length, and support phases are estimated automatically. Applying a rigid body model, we are able to calculate static and dynamic gait stability metrics. We conclude with initial results of a clinical study evaluating orthopaedic technical support.
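As an illustration of how such basic metrics can be derived from tracked joints, the sketch below estimates cadence and step length from per-frame ankle positions along the walking direction. The frame rate and the heel-strike heuristic (feet furthest apart) are assumptions; the paper's actual gait cycle detection may differ.

```python
# Hedged sketch: cadence and step length from left/right ankle x-coordinates
# along the corridor. Frame rate and heel-strike heuristic are assumed.
import numpy as np

FPS = 30.0  # capture frame rate in Hz (assumed)

def heel_strikes(left_x, right_x):
    """Approximate heel strikes as local maxima of the ankle-to-ankle
    distance: the feet are furthest apart at heel strike."""
    d = np.abs(np.asarray(left_x) - np.asarray(right_x))
    interior = (d[1:-1] > d[:-2]) & (d[1:-1] > d[2:])
    return np.nonzero(interior)[0] + 1  # shift back to full-signal indices

def gait_metrics(left_x, right_x):
    events = heel_strikes(left_x, right_x)
    if len(events) < 2:
        return None  # not enough steps detected
    d = np.abs(np.asarray(left_x) - np.asarray(right_x))
    step_times = np.diff(events) / FPS  # seconds between successive steps
    return {
        "cadence_steps_per_s": 1.0 / step_times.mean(),
        "mean_step_length_m": d[events].mean(),  # ankle separation at strike
    }
```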
Interpolation of data in smart city architectures is an essential task for the provision of reliable services. Furthermore, it is a key functionality for information validation between spatiotemporally related sensors. Nevertheless, many existing projects use a simplified geospatial model that does not take into account the infrastructure that shapes events and effects in the real world. Various algorithms are available for interpolation, for calculating routes on infrastructure-based graphs, and for computing distances on geospatial data. This work proposes a combined approach that interconnects detailed geospatial data while taking the underlying infrastructure model into account.
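A minimal sketch of the combined idea, using networkx: inverse-distance weighting in which the sensor-to-target distance is measured along the infrastructure graph (shortest path) rather than as a straight line. The toy street network and sensor values are invented for illustration.

```python
# Sketch: inverse-distance weighting over shortest-path distances on an
# infrastructure graph instead of Euclidean distances. Toy data only.
import networkx as nx

# Toy street network: nodes are junctions, edge weights are metres.
G = nx.Graph()
G.add_weighted_edges_from([("A", "B", 100), ("B", "C", 150),
                           ("A", "D", 400), ("D", "C", 120)])

sensors = {"A": 21.0, "C": 18.0}  # node -> measured value (e.g. temperature)

def graph_idw(G, sensors, target, power=2.0):
    num = den = 0.0
    for node, value in sensors.items():
        d = nx.shortest_path_length(G, node, target, weight="weight")
        if d == 0:
            return value  # target coincides with a sensor
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

print(graph_idw(G, sensors, "B"))  # interpolated estimate at junction B
```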
Knowledge of the small-scale nutrient status of a field is an important basis for decision-making when it comes to optimising fertiliser use in crop production. Currently, the traditional method involves soil sampling in the field and soil sample analysis in the laboratory as two separate working processes.
The previous research project "soil2data" developed a mobile field laboratory for different carrier vehicles. In the follow-up project "prototypes4soil2data", the results of soil2data are being developed further. A mixed soil sample is collected while driving across the field; the sample is then prepared wet-chemically and analysed. The overall soil sampling and analysis process is divided into the following process steps: soil sampling planning, soil sampling, soil preparation, soil analysis, and data management. These process steps are modified for the mobile field laboratory and run in parallel. The new soil extraction method is based on official German methods (VDLUFA) to ensure the interoperability of the analysis results with the VDLUFA fertiliser recommendations. An innovative key component is the NUTRISTAT analysis module (lab-on-chip with ISFET measurement technology). It can measure pH, the nutrients NO3-, H2PO4- and K+, and the electrical conductivity. In addition to the advantages of rapid data availability and no need to transport soil material to the laboratory, it provides a future basis for new applications, e.g. verifying current results against existing results during soil sampling, or dynamically adjusting the soil sampling while working in the field.
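As a hedged sketch, a record for one NUTRISTAT measurement might look as follows; the field names and units are assumptions for illustration, not the project's actual data model.

```python
# Hypothetical record type for one in-field measurement as described above;
# all field names and units are assumptions, not the project's data model.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SoilMeasurement:
    timestamp: datetime
    latitude: float            # WGS84 position of the mixed sample
    longitude: float
    ph: float                  # pH value of the extract
    nitrate_mg_l: float        # NO3- concentration in the extract
    phosphate_mg_l: float      # H2PO4- concentration in the extract
    potassium_mg_l: float      # K+ concentration in the extract
    conductivity_ms_cm: float  # electrical conductivity

sample = SoilMeasurement(datetime.now(), 52.28, 8.02,
                         6.4, 12.5, 3.1, 45.0, 0.8)
```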
Analysis of methods for prioritizing critical data transmissions in agricultural vehicular networks
(2020)
Applying wireless communication technologies to agricultural vehicular networks often results in high end-to-end delays and loss of packets due to intermittent or broken connectivity. This paper analyses methods for the successful delivery of vehicular data within acceptable delay times. The different kinds of data generated and transmitted in agricultural networks are considered first, followed by data prioritization methods that allow critical data to be prioritized over other data. In this regard, Enhanced Distributed Channel Access, Differentiated Services, and application-based data rate variation are discussed in conjunction with the Simple Network Management Protocol. These techniques are simulated or tested first separately and then together; the results show that even in poor network conditions, high-priority data is not lost or delayed.
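To illustrate application-level prioritization, the sketch below queues critical messages ahead of telemetry and tags each with a DiffServ code point before transmission. The traffic classes and the class-to-DSCP mapping are assumptions; EDCA itself operates at the MAC layer and is not modelled here.

```python
# Sketch of application-level data prioritization: critical messages are
# dequeued before telemetry and marked with a DiffServ code point.
import heapq
import itertools

# Lower number = higher priority; DSCP values follow common EF/AF defaults.
TRAFFIC_CLASSES = {"safety": (0, 46),    # Expedited Forwarding
                   "control": (1, 34),   # AF41
                   "telemetry": (2, 0)}  # best effort

class PriorityTx:
    def __init__(self):
        self._queue = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO order

    def enqueue(self, traffic_class, payload):
        prio, dscp = TRAFFIC_CLASSES[traffic_class]
        heapq.heappush(self._queue, (prio, next(self._seq), dscp, payload))

    def next_packet(self):
        prio, _, dscp, payload = heapq.heappop(self._queue)
        return dscp, payload  # hand to the socket layer with IP_TOS = dscp << 2

tx = PriorityTx()
tx.enqueue("telemetry", b"yield=7.2t/ha")
tx.enqueue("safety", b"obstacle detected")
print(tx.next_packet())  # the safety alert is sent first
```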
The development of context-aware applications is a difficult and error-prone task. The dynamics of the environmental context, combined with the complexity of the applications, creates a vast number of possibilities for mistakes during the creation of new applications. It is therefore important to test applications before they are deployed in a live system. For this reason, this paper proposes a testing tool that automatically generates various test cases from application description documents. Semantic annotations are used to create specific test data for context-aware applications. A test case reduction methodology based on test case diversity investigations ensures the scalability of the proposed automated testing approach.
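One common way to realise diversity-based reduction is greedy max-min selection: repeatedly keep the candidate test case that is furthest from everything already selected. The sketch below assumes each generated test case has been encoded as a numeric feature vector; the encoding and the Euclidean distance metric are illustrative assumptions, not necessarily those of the paper.

```python
# Sketch of diversity-based test case reduction via greedy max-min selection:
# keep the test case furthest from everything already selected, up to a budget.
import numpy as np

def reduce_by_diversity(test_vectors, budget):
    """test_vectors: (n, d) array, one feature vector per generated test case.
    Returns indices of a diverse subset of size `budget`."""
    X = np.asarray(test_vectors, dtype=float)
    selected = [0]  # seed with the first test case
    # Distance of every candidate to the closest already-selected case.
    dist = np.linalg.norm(X - X[0], axis=1)
    while len(selected) < min(budget, len(X)):
        nxt = int(dist.argmax())  # most dissimilar remaining candidate
        selected.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(X - X[nxt], axis=1))
    return selected

cases = [[0, 0], [0.1, 0], [5, 5], [5.1, 5], [0, 9]]
print(reduce_by_diversity(cases, 3))  # -> [0, 4, 3]
```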