High Performance and Privacy for Distributed Energy Management: Introducing PrivADE+ and PPPM
(2018)
Distributed Energy Management (DEM) will play a vital role in future smart grids. An important and often
overlooked factor in this concept is privacy. This paper presents two privacy-preserving DEM algorithms
called PrivADE+ and PPPM. PrivADE+ uses a round-based energy management procedure for switchable and
dynamically adaptable loads. PPPM builds on the market-based PowerMatcher approach. Both algorithms
apply homomorphic encryption to privately gather aggregated data and exchange commands. Simulations
show that PrivADE+ and PPPM achieve good energy management quality with low communication requirements
and without negative influences on robustness.
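The privacy-preserving aggregation both algorithms rely on can be sketched with textbook additively homomorphic (Paillier) encryption: multiplying ciphertexts yields an encryption of the sum, so an aggregator can total household loads without learning any individual reading. This is a minimal illustration with insecurely small demo parameters, not the protocol from the paper.

```python
import random
from math import lcm

# Toy Paillier cryptosystem -- tiny demo primes, NOT a secure key size.
p, q = 2147483647, 2147483629      # both prime
n = p * q
n2 = n * n
g = n + 1
lam = lcm(p - 1, q - 1)
mu = pow(lam, -1, n)               # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    r = random.randrange(1, n)     # assume gcd(r, n) == 1 for demo primes
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so the aggregator never sees a single household's load.
readings = [1200, 850, 430]        # hypothetical household loads in watts
aggregate = 1
for r_w in readings:
    aggregate = (aggregate * encrypt(r_w)) % n2

assert decrypt(aggregate) == sum(readings)
```

Real deployments would use full-size keys and a vetted library; the point here is only the ciphertext-multiplication-equals-plaintext-addition property.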
The Internet of Things (IoT) relies on sensor devices to measure real-world phenomena in order to provide IoT services. The sensor readings are shared with multiple entities, such as IoT services, other IoT devices or third parties. The collected data may be sensitive and include personal information. To protect the privacy of the users, the data needs to be protected by an encryption algorithm. Attribute-Based Encryption (ABE) is well suited to sharing cryptographic ciphertexts with a group of users, as it does not require the creation of group keys. However, the creation of ABE ciphertexts is slow when executed on resource-constrained devices such as IoT sensors. In this paper, we present a modification of an ABE scheme that not only allows data to be encrypted efficiently using ABE, but also reduces the size of the ciphertext that must be transmitted by the sensor. We also show how our modification can be used to realise an instantaneous key revocation mechanism.
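The ciphertext-size concern can be illustrated with the standard hybrid (key-encapsulation) pattern: the bulk payload is encrypted symmetrically and only a short content key would go through the expensive ABE step. Everything below is an illustrative stand-in, not the paper's scheme: the XOR keystream is a toy cipher, and `abe_wrap` is a placeholder for real ABE encryption.

```python
import os
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode as a keystream (demo only,
    # not a vetted cipher) -- stands in for e.g. AES in the hybrid pattern.
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hashlib.sha256(key + block.to_bytes(4, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

def abe_wrap(content_key: bytes, policy: str) -> bytes:
    # Placeholder for the actual ABE encryption of the short content key;
    # the expensive pairing arithmetic hidden here is what the paper's
    # modification targets.
    return b"ABE[" + policy.encode() + b"]" + content_key

reading = b'{"temp": 21.4, "room": "lab-1"}'
content_key = os.urandom(32)
bulk_ct = keystream_xor(content_key, reading)      # cheap, grows with data
key_ct = abe_wrap(content_key, "role:doctor OR dept:research")  # short, fixed

# Only bulk_ct plus the short key_ct leave the sensor; XOR is its own inverse.
assert keystream_xor(content_key, bulk_ct) == reading
```

The sensor thus pays the ABE cost only for a fixed-size key, regardless of how large the sensor reading is.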
The Internet of Things (IoT) is an enabler for new innovations in several domains. It allows digital services to be connected with real, physical entities. These entities are devices of different categories, ranging in size from large machinery to tiny sensors. In the latter case, devices are typically characterized by limited resources in terms of computational power, available memory and sometimes power supply. As a consequence, using security algorithms within these limited resources requires expert knowledge: a suitable configuration must be found that lets the algorithms perform properly on the device. On the other side, there is the desire to protect valuable assets as strongly as possible. Security goals are usually captured in security policies, but these do not consider resource availability on the involved devices or the resources consumed while executing security algorithms. This paper presents a resource-aware information exchange model and a generation tool that uses high-level security policies as input. The model forms the conceptual basis for an automated security configuration recommendation system.
The Internet of Things (IoT) is an enabler for new innovations in several domains. It allows digital services to be connected with physical entities in the real world. These entities are devices of different categories, whose sizes range from large machinery to tiny sensors. In the latter case, devices are typically characterized by limited resources in terms of computational power, available memory and sometimes limited power supply. As a consequence, security algorithms are required to work within these limited resources. This means finding a suitable implementation and configuration for a security algorithm that performs properly on the device, which may become a challenging task. On the other side, there is the desire to protect valuable assets as strongly as possible. Security goals are usually recorded in security policies, but these do not consider resource availability on the involved device or its power consumption while executing security algorithms. This paper presents an IoT security configuration tool that helps the designer of an IoT environment to experiment with the trade-off between maximizing security and extending the lifetime of a resource-constrained IoT device. The tool is controlled with a high-level description of security goals in the form of policies. It allows the designer to validate various (security) configurations for anything from a single IoT device up to a large sensor network.
Digital services increasingly support the operator in the cab of agricultural machinery. The display options, however, are limited by the size of the terminals in use. To avoid restricting the view out of the cab with additional terminals, the use of Augmented Reality is a sensible option: the available information can be displayed statically or dynamically in the farmer's field of view. Yet only the overlay display layer with integrated information presented in this contribution allows the potential of Augmented Reality to be exploited fully.
The economic pressure in agriculture to produce higher yields with fewer resources has led to increasing automation and industrialisation of agricultural processes. Networking cooperative agricultural processes holds extraordinary economic potential, but also poses great risks to data security. Data is often captured not by the data owner but by commissioned service providers (e.g. contractors). When data is captured by service providers, data access cannot be controlled and subsequent data manipulation cannot be ruled out. Data security solutions from other economic sectors transfer only inadequately to agricultural engineering. This contribution presents a basic concept for cross-domain data security in agricultural engineering. The goal of the concept is to guarantee the owner's data sovereignty at all times and to document selected process data in a tamper-proof way.
The use of sensor systems in site-specific field management increases yield as well as the profitability of crop production. However, more factors contribute to the optimal nutrient supply of a plant than such a locally operating system can capture. To extend the efficiency of these precision farming systems, the next step, carried out successfully here, is connecting the mobile agricultural machine via the Internet to a cross-regional data analysis platform and executing time-critical optimisation functions on the machine.
Protection and privacy of data in cooperative agricultural processes : the challenges of the future
(2016)
In agriculture, the growing usage of sensors, smart mobile machinery and information systems results in high volumes of data. The data differs in accuracy, frequency, volume, type and, most importantly, in the owner of the information. However, cooperative processes and big data analyses require access to comprehensive amounts of data for successful agricultural operation and reasoning. In some processes, commissioned contractors even gather data belonging to other owners and use it for machinery operation optimisation and accounting (e.g. yield in maize harvest). Today's approach to data handling has a high potential to conflict with European and national regulations for data protection and privacy. This article presents a concept for continuous data protection and privacy in cooperative agricultural processes. The concept aims at ensuring data sovereignty for the owner while at the same time making as much data as possible usable for process operation and big data research. Briefly explained, owners pick a collection of data and create usage licenses for other players. The licenses specify time-limited and/or position-bound access to the data collection. Privacy environments in software and/or hardware protect access rights on end-user devices, data share hubs and machinery devices such as agricultural terminals. In addition to access right configurations, digital signatures prevent data manipulation when cooperative players capture data during processes. So-called signature boxes represent certificated software or hardware components which are located close to data sources (e.g. as hardware attached to sensors on mobile machinery) and bind the captured data with digital signatures.
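The tamper-evidence idea behind the signature boxes can be sketched as follows. As a stand-in for the certificated digital signatures described above, this toy uses a keyed HMAC (symmetric, so verifier and signer share a key, unlike real public-key signatures); the key name and record fields are illustrative assumptions.

```python
import hashlib
import hmac
import json

# Stand-in for a "signature box": an HMAC keyed at the data source. The
# concept above uses certificated digital signatures, which the owner could
# verify with a public key instead of this shared demo key.
BOX_KEY = b"per-sensor-secret-provisioned-at-certification"

def sign_record(record: dict) -> dict:
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(BOX_KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "sig": tag}

def verify(signed: dict) -> bool:
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    tag = hmac.new(BOX_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, signed["sig"])

# A contractor captures data on behalf of the owner; the box binds it.
signed = sign_record({"field": "A-12", "yield_t": 8.3, "ts": "2016-09-01T10:00"})
assert verify(signed)

# Any subsequent manipulation of the captured value is detectable.
tampered = {"record": {**signed["record"], "yield_t": 9.9}, "sig": signed["sig"]}
assert not verify(tampered)
```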
The use of the ISOBUS shows that there is demand for data communication on agricultural tractor-implement combinations as well. However, it also becomes clear that the ISOBUS, with its relatively low data rate, offers no resource reserves for new applications. For this reason, a change of transmission technology is necessary to evolve the ISOBUS into a High-Speed ISOBUS. A suitable technology for this change, examined in more detail here, is Ethernet. The paper shows which potentials Ethernet opens up for the ISOBUS and which challenges have to be mastered along the way.
Reliable information processing is an indispensable task in Smart City environments. Heterogeneous sensor infrastructures of individual information providers and data portal vendors tend to offer information of hardly verifiable quality. This paper proposes a correlation-model-based monitoring approach to evaluate the plausibility of smart city data sources. The model is based on spatial, temporal, and domain-dependent correlations between individual data sources. A set of freely available datasets is used to evaluate the monitoring component and show the challenges of different spatial and temporal resolutions.
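The core of such a correlation model can be sketched with two spatially related data sources: if their histories correlate strongly, one can predict the other, and a reading far off the prediction is flagged as implausible. The traffic-count numbers and the 3-sigma threshold below are illustrative assumptions, not values from the paper.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    # Pearson correlation coefficient of two equally long series.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical co-located sources: vehicle counts from two nearby sensors.
a = [120, 135, 150, 210, 260, 240, 180]
b = [115, 130, 155, 205, 255, 235, 175]

r = pearson(a, b)
assert r > 0.95   # strongly correlated -> usable for cross-validation

# Plausibility check for a new pair of readings: predict a from b linearly
# (simple regression via the correlation) and flag large deviations.
slope = r * pstdev(a) / pstdev(b)
predicted_a = mean(a) + slope * (300 - mean(b))
implausible = abs(900 - predicted_a) > 3 * pstdev(a)
assert implausible  # a reported count of 900 vehicles gets flagged
```

Spatial, temporal and domain-dependent correlations would each contribute such pairwise checks; the different resolutions mentioned above are exactly what makes aligning the series `a` and `b` hard in practice.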
Interpolation of data in smart city architectures is an essential task for the provision of reliable services. Furthermore, it is a key functionality for information validation between spatiotemporally related sensors. Nevertheless, many existing projects use a simplified geospatial model that does not take into account the infrastructure that shapes events and effects in the real world. Various algorithms are available for interpolation, for the calculation of routes on infrastructure-based graphs and for distances on geospatial data. This work proposes a combined approach that interconnects detailed geospatial data whilst regarding the underlying infrastructure model.
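One way to combine the two ingredients is to run Inverse Distance Weighting (one of the interpolation methods the repository's keywords mention) over shortest-path distances on the road graph instead of straight-line distances. The graph, edge weights and sensor values below are invented for illustration.

```python
import heapq

# Hypothetical road graph: edge weights are travel distances in metres.
graph = {
    "A": {"B": 300, "C": 900},
    "B": {"A": 300, "C": 400, "D": 700},
    "C": {"A": 900, "B": 400, "D": 200},
    "D": {"B": 700, "C": 200},
}

def dijkstra(src):
    # Shortest-path distances from src to every reachable node.
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def idw(target, samples, power=2):
    # Inverse Distance Weighting along the infrastructure graph rather than
    # over Euclidean distance.
    dist = dijkstra(target)
    num = den = 0.0
    for node, value in samples.items():
        if dist[node] == 0:
            return value
        w = 1.0 / dist[node] ** power
        num += w * value
        den += w
    return num / den

# Sensor readings (e.g. NO2 in µg/m³) at three nodes; interpolate at "B".
estimate = idw("B", {"A": 40.0, "C": 60.0, "D": 55.0})
assert 40.0 < estimate < 60.0
```

The nearby sensor at "A" dominates the estimate, while "D", close as the crow flies but far along the road network, contributes little: precisely the effect of regarding the infrastructure model.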
In agricultural engineering, farmers and contractors have access to a growing number of digital services. Modelling, executing and controlling cooperative agricultural processes is only possible to a limited extent because of the various, mutually incompatible IT solutions; a uniform standard for describing these processes is missing. This contribution presents the description of agricultural processes with the Business Process Model and Notation (BPMN). Domain experts (e.g. farmers, contractors, digital service providers) can design cooperative process flows across platforms without having to share process internals with other actors. As a bridge between the cooperative process level and the executing machine level, the contribution employs Message Queue Telemetry Transport (MQTT): via MQTT, instructions and information (e.g. work orders, status data) can be exchanged and processed between both levels in real time.
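The MQTT bridge can be sketched with an in-memory stand-in for a broker that supports MQTT-style topic wildcards; a real deployment would use an actual MQTT client library and broker, and the topic scheme below is an assumption, not taken from the contribution.

```python
import json

subscriptions = {}   # topic filter -> list of received payloads

def topic_matches(filt, topic):
    # MQTT-style matching with the '+' (one level) and '#' (remainder)
    # wildcards, as defined by the MQTT specification.
    f, t = filt.split("/"), topic.split("/")
    for i, part in enumerate(f):
        if part == "#":
            return True
        if i >= len(t) or (part != "+" and part != t[i]):
            return False
    return len(f) == len(t)

def subscribe(filt):
    subscriptions[filt] = []

def publish(topic, payload):
    for filt, inbox in subscriptions.items():
        if topic_matches(filt, topic):
            inbox.append(payload)

# Machine level subscribes to all work orders addressed to one machine;
# the process level then pushes an order down (topic names are invented).
subscribe("farm1/machines/tractor-07/orders/#")
publish("farm1/machines/tractor-07/orders/slurry",
        json.dumps({"task": "slurry application", "field": "A-12"}))

assert len(subscriptions["farm1/machines/tractor-07/orders/#"]) == 1
```

Status data would flow the other way on a topic the process engine subscribes to, closing the loop between BPMN process level and machine level.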
For Delay-Tolerant Networks (DTNs), many routing algorithms have been suggested. However, their performance depends heavily on the applied scenario. Especially heterogeneous scenarios featuring known and unknown node movements as well as different kinds of data lead to either poor delivery ratios or exhausted network resources.
To overcome these problems, this paper introduces Data-Driven Routing for DTNs. Data is categorized according to its requirements into priority queues. Each queue applies the DTN routing algorithm that fits its data requirements best. Simulation results show that Data-Driven Routing achieves high delivery ratios for time-critical data while at the same time saving network resources during the transfer of less time-critical data.
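The queue-per-category dispatch can be sketched as below. Epidemic, PRoPHET and DirectDelivery are well-known DTN routing algorithms; which category maps to which algorithm is an illustrative assumption, not the paper's evaluated assignment.

```python
import heapq
from itertools import count

# Hypothetical mapping from data category to a DTN routing strategy.
STRATEGY = {
    "alarm":     "Epidemic",        # flood: maximise delivery ratio
    "telemetry": "PRoPHET",         # probabilistic: balanced cost/ratio
    "bulk_log":  "DirectDelivery",  # wait for the destination: cheapest
}
PRIORITY = {"alarm": 0, "telemetry": 1, "bulk_log": 2}

queue, tie = [], count()            # tie-breaker keeps FIFO order per class

def enqueue(category, payload):
    heapq.heappush(queue, (PRIORITY[category], next(tie), category, payload))

def dispatch():
    # Pop the most urgent bundle and select the routing algorithm that
    # fits its data requirements best.
    _, _, category, payload = heapq.heappop(queue)
    return STRATEGY[category], payload

enqueue("bulk_log", "night sensor dump")
enqueue("alarm", "frost warning")
enqueue("telemetry", "soil moisture 23%")

assert dispatch() == ("Epidemic", "frost warning")
assert dispatch() == ("PRoPHET", "soil moisture 23%")
assert dispatch() == ("DirectDelivery", "night sensor dump")
```

Time-critical data thus both leaves the queue first and is routed aggressively, while bulk data waits and consumes few network resources.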
A modular framework for modelling, configuring and controlling cooperative agricultural processes
(2016)
The complexity of many agricultural processes is steadily increasing due to technical progress, growing legal requirements and documentation obligations. Process chains are jointly handled, documented and audited in cooperation between different actors (farmer, contractor, service provider, digital intermediary, public authority). Economically and ecologically resource-efficient management of process execution is a challenge for all actors. Dynamic process changes frequently lead to manual interventions in process control that cause cost-intensive delays. The research project OPeRAte designs and evaluates newly devised concepts and mechanisms for the end-to-end organisation and control of cooperative agricultural processes. Configurable and reusable modules are identified that adapt to process parameters and can be reused in related processes. The OPeRAte framework enables all involved actors and resources (machines, sensors, actuators, end devices, servers, data, etc.) to be brought together via open interfaces. Process owners are to be relieved by autonomous process configurations and adaptations and empowered to make efficient decisions through visualisations. The concepts in this contribution serve as a basis for discussion in formulating flexible and extensible solution strategies for agricultural engineering.
Management of agricultural processes is often troubled by disconnections and data transfer failures. Limited cellular network coverage may prevent information exchange between mobile process participants.
The research projects KOMOBAR and ISOCom designed, implemented and field-tested a delay-tolerant platform for robust communication in rural areas and challenging environments. An adaptable combination of infrastructure-based cellular networks and infrastructure-free multihop ad hoc communication (WLAN) leads to a variety of new communication opportunities. Temporary storage and forwarding of data on mobile farm machinery as well as dynamic platform configurations during process runtime strongly enhance the reliability and robustness of data transfers.
Process modeling languages help to define and execute processes and workflows. The Business Process Model and Notation (BPMN) 2.0 is used for business processes in commercial areas such as banking, retail, production and the supply industry. Due to its flexible notation, BPMN is increasingly being used in non-traditional business process domains like the Internet of Things (IoT) and agriculture. However, BPMN does not fit well to scenarios taking place in environments featuring limited, delayed, intermittent or broken connectivity. Communication simply exists for BPMN: characteristics of message transfers, their priorities and connectivity parameters are not part of the model. No backup mechanism for communication issues exists, resulting in error-prone and failing processes. This paper introduces resilient BPMN (rBPMN), a valid BPMN extension for process modeling in unreliable communication environments. The meta-model addition of opportunistic message flows with Quality of Service (QoS) parameters and connectivity characteristics allows process robustness to be verified and enhanced at design time. Modeling of explicit or implicit, decision-based alternatives ensures optimal process operation even when connectivity issues occur. In case of no connectivity, locally moved functionality guarantees stable process operation. Evaluation using an agricultural slurry application showed significant robustness enhancements and prevented process failures due to communication issues.
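The runtime decision sketched below shows the idea of QoS-annotated message flows with alternatives: pick the modelled flow if current connectivity meets its QoS parameters, fall back to an alternative otherwise, and move functionality locally when there is no connectivity at all. The class names, QoS fields and thresholds are illustrative assumptions, not rBPMN's actual meta-model elements.

```python
from dataclasses import dataclass

@dataclass
class MessageFlow:
    target: str
    min_bandwidth_kbps: float   # QoS requirement annotated on the flow
    max_latency_ms: float

@dataclass
class Link:
    bandwidth_kbps: float
    latency_ms: float
    connected: bool

def select(flow, alternatives, link, local_fallback):
    # Prefer the modelled flow if connectivity satisfies its QoS parameters,
    # else the first viable alternative, else locally moved functionality
    # so the process keeps operating.
    for candidate in [flow] + alternatives:
        if (link.connected
                and link.bandwidth_kbps >= candidate.min_bandwidth_kbps
                and link.latency_ms <= candidate.max_latency_ms):
            return candidate.target
    return local_fallback

cloud = MessageFlow("cloud-optimiser", 500, 200)
edge = MessageFlow("edge-optimiser", 50, 1000)

good = Link(1000, 80, True)
weak = Link(100, 600, True)
none_ = Link(0, 0, False)

assert select(cloud, [edge], good, "on-machine") == "cloud-optimiser"
assert select(cloud, [edge], weak, "on-machine") == "edge-optimiser"
assert select(cloud, [edge], none_, "on-machine") == "on-machine"
```

Because the QoS parameters are part of the model, such viability checks can already be run at design time to verify that every connectivity state leaves at least one workable path.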
Our world and our lives are changing in many ways. Communication, networking, and computing technologies are among the most influential enablers that shape our lives today. Digital data and connected worlds of physical objects, people, and devices are rapidly changing the way we work, travel, socialize, and interact with our surroundings, and they have a profound impact on different domains, such as healthcare, environmental monitoring, urban systems, and control and management applications, among several other areas. Cities currently face an increasing demand for providing services that can have an impact on people's everyday lives. The CityPulse framework supports smart city service creation by means of a distributed system for semantic discovery, data analytics, and interpretation of large-scale (near-)real-time Internet of Things data and social media data streams. The goal is to break away from silo applications and enable cross-domain data integration. The CityPulse framework integrates multimodal, mixed-quality, uncertain and incomplete data to create reliable, dependable information and continuously adapts data processing techniques to meet the quality-of-information requirements of end users. Unlike existing solutions that mainly offer unified views of the data, the CityPulse framework is also equipped with powerful data analytics modules that perform intelligent data aggregation, event detection, quality assessment, contextual filtering, and decision support. This paper presents the framework, describes its components, and demonstrates how they interact to support easy development of custom-made applications for citizens. The benefits and the effectiveness of the framework are demonstrated in a use-case scenario implementation presented in this paper.
The development of context-aware applications is a difficult and error-prone task. The dynamics of the environmental context combined with the complexity of the applications poses a vast number of possibilities for mistakes during the creation of new applications. Therefore, it is important to test applications before they are deployed in a live system. For this reason, this paper proposes a testing tool that allows for the automatic generation of various test cases from application description documents. Semantic annotations are used to create specific test data for context-aware applications. A test case reduction methodology based on test case diversity investigations ensures the scalability of the proposed automated testing approach.
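The generation-then-reduction idea can be sketched as follows: enumerate every combination of context values from a description, then greedily keep only the most mutually diverse cases. The context dimensions and the simple Hamming-style diversity metric are illustrative assumptions, not the paper's actual annotations or metric.

```python
from itertools import product

# Hypothetical context dimensions drawn from an application description.
context = {
    "location": ["indoor", "outdoor"],
    "battery":  ["low", "high"],
    "network":  ["wifi", "cellular", "offline"],
}

# Exhaustive generation: one test case per combination of context values.
cases = [dict(zip(context, combo)) for combo in product(*context.values())]
assert len(cases) == 2 * 2 * 3

def distance(a, b):
    # Diversity metric: number of context attributes in which two cases differ.
    return sum(a[k] != b[k] for k in a)

def reduce_cases(cases, budget):
    # Greedy reduction: repeatedly add the case that is most diverse with
    # respect to the cases already selected.
    selected = [cases[0]]
    while len(selected) < budget:
        best = max(cases, key=lambda c: min(distance(c, s) for s in selected))
        selected.append(best)
    return selected

subset = reduce_cases(cases, 4)
assert len(subset) == 4
assert all(distance(a, b) > 0
           for i, a in enumerate(subset) for b in subset[i + 1:])
```

The full combination space grows multiplicatively with each context dimension, so such a reduction is what keeps automated testing of context-aware applications scalable.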