The development of context-aware applications is a difficult and error-prone task. The dynamics of the environmental context combined with the complexity of the applications pose a vast number of possibilities for mistakes during the creation of new applications. It is therefore important to test applications before they are deployed in a live system. For this reason, this paper proposes a testing tool which allows for the automatic generation of various test cases from application description documents. Semantic annotations are used to create specific test data for context-aware applications. A test case reduction methodology based on test case diversity investigations ensures scalability of the proposed automated testing approach.
Our world and our lives are changing in many ways. Communication, networking, and computing technologies are among the most influential enablers that shape our lives today. Digital data and connected worlds of physical objects, people, and devices are rapidly changing the way we work, travel, socialize, and interact with our surroundings, and they have a profound impact on different domains, such as healthcare, environmental monitoring, urban systems, and control and management applications, among several other areas. Cities currently face an increasing demand for providing services that can have an impact on people's everyday lives. The CityPulse framework supports smart city service creation by means of a distributed system for semantic discovery, data analytics, and interpretation of large-scale (near-)real-time Internet of Things data and social media data streams. The goal is to break away from silo applications and enable cross-domain data integration. The CityPulse framework integrates multimodal, mixed-quality, uncertain, and incomplete data to create reliable, dependable information and continuously adapts data processing techniques to meet the quality-of-information requirements of end users. Unlike existing solutions that mainly offer unified views of the data, the CityPulse framework is also equipped with powerful data analytics modules that perform intelligent data aggregation, event detection, quality assessment, contextual filtering, and decision support. This paper presents the framework, describes its components, and demonstrates how they interact to support the easy development of custom-made applications for citizens. The benefits and the effectiveness of the framework are demonstrated in a use-case scenario implementation presented in this paper.
Management of agricultural processes is often troubled by disconnections and data transfer failures. Limited cellular network coverage may prevent information exchange between mobile process participants.
The research projects KOMOBAR and ISOCom designed, implemented, and field-tested a delay-tolerant platform for robust communication in rural areas and challenging environments. An adaptable combination of infrastructure-based cellular networks and infrastructure-free multi-hop ad hoc communication (WLAN) leads to a variety of new communication opportunities. Temporary storage and forwarding of data on mobile farm machinery, as well as dynamic platform configuration during process runtime, strongly enhance the reliability and robustness of data transfers.
Process modeling languages help to define and execute processes and workflows. Business Process Model and Notation (BPMN) 2.0 is used for business processes in commercial areas such as banking, retail, production, and the supply industry. Due to its flexible notation, BPMN is increasingly being used in non-traditional business process domains such as the Internet of Things (IoT) and agriculture. However, BPMN does not fit well with scenarios taking place in environments featuring limited, delayed, intermittent, or broken connectivity. For BPMN, communication simply exists: the characteristics of message transfers, their priorities, and connectivity parameters are not part of the model. No backup mechanism for communication issues exists, resulting in error-prone and failing processes. This paper introduces resilient BPMN (rBPMN), a valid BPMN extension for process modeling in unreliable communication environments. The meta-model addition of opportunistic message flows with Quality of Service (QoS) parameters and connectivity characteristics makes it possible to verify and enhance process robustness at design time. Modeling of explicit or implicit, decision-based alternatives ensures optimal process operation even when connectivity issues occur. In the absence of connectivity, locally moved functionality guarantees stable process operation. Evaluation using an agricultural slurry application showed significant robustness enhancements and prevented process failures due to communication issues.
For Delay-Tolerant Networks (DTNs) many routing algorithms have been suggested. However, their performance depends heavily on the applied scenario. Especially heterogeneous scenarios featuring known and unknown node movements as well as different kinds of data lead to either poor delivery ratios or exhausted network resources.
To overcome these problems, this paper introduces Data-Driven Routing for DTNs. Data is categorized according to its requirements into priority queues. Each queue applies the DTN routing algorithm that fits the data requirements best. Simulation results show that Data-Driven Routing achieves high delivery ratios for time-critical data while at the same time saving network resources during the transfer of less time-critical data.
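The queue-per-category dispatch described in this abstract can be sketched as follows. The category names and their mapping to concrete DTN routing algorithms (epidemic, PRoPHET, direct delivery) are illustrative assumptions, not the paper's exact configuration:

```python
from dataclasses import dataclass, field
import heapq

# Hypothetical mapping of data categories to DTN routing algorithms:
# time-critical data is flooded, bulk data waits for a direct contact.
ROUTING = {
    "time_critical": "epidemic",
    "normal": "prophet",
    "bulk": "direct_delivery",
}

@dataclass(order=True)
class Bundle:
    priority: int                       # lower value = higher priority
    name: str = field(compare=False)
    category: str = field(compare=False)

def dispatch(queue):
    """Pop bundles in priority order and select the routing algorithm
    that matches each bundle's data category."""
    plan = []
    while queue:
        b = heapq.heappop(queue)
        plan.append((b.name, ROUTING[b.category]))
    return plan

q = []
heapq.heappush(q, Bundle(2, "telemetry.log", "bulk"))
heapq.heappush(q, Bundle(0, "alarm.msg", "time_critical"))
heapq.heappush(q, Bundle(1, "status.json", "normal"))
print(dispatch(q))  # alarm.msg is dispatched first, via the flooding-style algorithm
```

This captures the key design choice: the priority queue decides *when* a bundle is handled, while the category decides *how* it is forwarded, so resource-hungry flooding is reserved for time-critical data.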
3GPP Release 16 integrates TSN functionality into 5G and standardizes various options for TSN time synchronization over 5G, such as transparent mode and bridge mode. The time domains of the TSN network and the 5G network are kept separate, with an option to synchronize either network to the other. TSN time synchronization over 5G is possible either by using the IEEE 1588 generalized Precision Time Protocol (gPTP) based on UDP/IP multicast or via IEEE 802.1AS based on Ethernet PDUs. The INET and Simu5G simulation frameworks, both based on the OMNeT++ discrete event simulator, are widely used for simulating TSN and 5G networks. The INET framework comprises the 802.1AS-based time synchronization mechanism, and Simu5G provides the 5G user plane carrying IP PDUs. We modified the 802.1AS-based synchronization model of INET so that it works over UDP/IP. With that, it is possible to synchronize TSN slaves connected to 5G UEs across a 5G network with a TSN master clock located in a TSN network attached to the 5G core network. Our simulation results show that a synchronization accuracy of 500 microseconds can be achieved with the corrected asymmetric propagation delay of uplink and downlink between the gNodeB (gNB) and the User Equipment (UE). Furthermore, the synchronization accuracy can be improved if the delay difference between uplink and downlink is known.
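The role of the uplink/downlink delay difference can be illustrated with the standard two-way offset computation plus an 802.1AS-style delayAsymmetry correction. This is a sketch with made-up timestamps, not the paper's simulation model:

```python
def mean_path_delay(t1, t2, t3, t4):
    # Two-way delay measurement: t1/t4 are taken on the master clock,
    # t2/t3 on the slave clock; the result assumes a symmetric path.
    return ((t4 - t1) - (t3 - t2)) / 2

def clock_offset(t1, t2, t3, t4, delay_asymmetry=0.0):
    """Slave clock offset relative to the master.
    delay_asymmetry is the downlink delay minus the mean path delay,
    in the spirit of 802.1AS's delayAsymmetry field; leaving it at 0
    assumes uplink and downlink delays are equal."""
    downlink = mean_path_delay(t1, t2, t3, t4) + delay_asymmetry
    return (t2 - t1) - downlink

# Made-up timestamps (microseconds): true offset 30, downlink 120, uplink 60.
t1, t2, t3, t4 = 0, 150, 200, 260
print(clock_offset(t1, t2, t3, t4))                      # symmetric assumption: 45.0
print(clock_offset(t1, t2, t3, t4, delay_asymmetry=15))  # corrected: 30.0
```

The 15 µs error introduced by the unmodeled asymmetry vanishes once the uplink/downlink delay difference is known, mirroring the abstract's observation that accuracy improves when that difference is available.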
Recent real-time networking developments have enabled ultra-high reliability, very low latency, and high data rates in wired networks. Wireless networking developments have also shown that they can achieve very high data rates consistently, but they still fall short of providing ultra-high reliability and extremely low latency. Time-Sensitive Networking (TSN) developments have brought these capabilities to industrial automation and the automotive industry as well. Although TSN has long been standardized for wired networks, for wireless networks it will be standardized in the near future within the IEEE 802.11be standard for Wi-Fi and 3GPP Release 17 for 5G. This paper provides an overview of TSN in wired and wireless networks with the aim of comparing different simulators and presenting their offered functionality and shortcomings. These tools can be used to familiarize oneself with TSN algorithms and standards, and for the development and testing of time-sensitive networks. Afterwards, the paper discusses open research questions for using TSN over wireless networks.
Analysis of methods for prioritizing critical data transmissions in agricultural vehicular networks
(2020)
Applying wireless communication technologies to agricultural vehicular networks often results in high end-to-end delays and packet loss due to intermittent or broken connectivity. This paper analyses methods for the successful delivery of vehicular data within acceptable delay times. Different kinds of data that are generated and transmitted in agricultural networks are considered, followed by data prioritization methods that allow critical data to be prioritized over other data. In this regard, Enhanced Distributed Channel Access, Differentiated Services, and application-based data rate variation are discussed in conjunction with the Simple Network Management Protocol. These techniques are simulated or tested separately and then together, and the results show that even in poor network conditions, high-priority data is not lost or delayed.
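As a minimal illustration of the Differentiated Services part, an application can mark its time-critical UDP traffic with a DSCP codepoint so that DiffServ-aware routers can prioritize it. The choice of the Expedited Forwarding codepoint here is an assumption for the sketch, not a value taken from the paper:

```python
import socket

# DSCP Expedited Forwarding (EF, value 46) for time-critical machine data;
# the IP TOS byte carries the DSCP in its upper six bits.
DSCP_EF = 46 << 2

def critical_socket():
    """UDP socket whose packets carry the EF codepoint, so DiffServ-aware
    routers can place them in a priority queue (DiffServ part only)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)
    return s

s = critical_socket()
print(s.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))
```

Marking alone does not guarantee delivery; it only tells each DiffServ hop which per-hop behaviour to apply, which is why the paper combines it with channel access (EDCA) and data rate adaptation.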
This paper presents a framework for OMNeT++ that includes a time synchronization model for WLANs. Synchronization is based on the generalized Precision Time Protocol (gPTP) standard, which aims to achieve an accuracy of less than 100 nanoseconds. The presented model is developed and implemented in OMNeT++, a discrete event network simulator, using its INET library. A new type of WLAN node is modeled that supports time synchronization at the link layer. A clock module for WLAN nodes is also modeled, which implements variable clock drift to simulate noise interference in clock frequency oscillators. Simulations with these WLAN nodes show that, using gPTP-based time synchronization in wireless networks, an accuracy of ±3 ns can be achieved.
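A clock module with variable drift, as described in the abstract, can be sketched as a local clock whose frequency error performs a bounded random walk around a nominal offset. All parameter values below are illustrative, not the model's actual configuration:

```python
import random

class DriftingClock:
    """Local clock whose frequency drifts randomly around a nominal
    error, mimicking noise in a crystal oscillator (toy parameters)."""
    def __init__(self, drift_ppm=50.0, noise_ppm=2.0, seed=0):
        self.rng = random.Random(seed)
        self.drift = drift_ppm * 1e-6    # current frequency error
        self.noise = noise_ppm * 1e-6    # per-step random perturbation
        self.local = 0.0                 # local notion of time, seconds

    def advance(self, true_dt):
        # Random walk of the frequency error, then advance local time
        # at the (slightly wrong) local rate.
        self.drift += self.rng.uniform(-self.noise, self.noise)
        self.local += true_dt * (1.0 + self.drift)
        return self.local

clk = DriftingClock()
for _ in range(1000):
    clk.advance(0.001)   # 1 ms of true time per simulation step
print(f"local-time error after 1 s: {clk.local - 1.0:+.6e} s")
```

Without periodic gPTP corrections the error grows with elapsed time; a synchronization protocol repeatedly measures and cancels it, which is what brings the residual error down to the nanosecond range reported in the paper.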
Interpolation of data in smart city architectures is an essential task for the provision of reliable services. Furthermore, it is a key functionality for information validation between spatiotemporally related sensors. Nevertheless, many existing projects use a simplified geospatial model that does not take into account the infrastructure that affects events and effects in the real world. Various algorithms are available for interpolation, for the calculation of routes on infrastructure-based graphs, and for distances on geospatial data. This work proposes a combined approach that interconnects detailed geospatial data while taking the underlying infrastructure model into account.
Reliable information processing is an indispensable task in smart city environments. Heterogeneous sensor infrastructures of individual information providers and data portal vendors tend to offer information quality that is hard to verify. This paper proposes a correlation-model-based monitoring approach to evaluate the plausibility of smart city data sources. The model is based on spatial, temporal, and domain-dependent correlations between individual data sources. A set of freely available datasets is used to evaluate the monitoring component and to show the challenges of different spatial and temporal resolutions.
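The core idea of correlation-based plausibility checking can be sketched in a few lines: a source is considered plausible if its readings correlate with those of spatially or temporally related sources. The Pearson coefficient and the fixed threshold are illustrative choices, not the paper's exact model:

```python
def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def plausible(candidate, neighbours, threshold=0.6):
    """Flag a data source as plausible if its readings correlate, on
    average, with related sources (threshold is illustrative)."""
    scores = [pearson(candidate, n) for n in neighbours]
    return sum(scores) / len(scores) >= threshold

good = plausible([1, 2, 3, 4, 5], [[2, 4, 6, 8, 10]])  # tracks its neighbour
bad = plausible([1, 2, 3, 4, 5], [[5, 1, 4, 2, 3]])    # uncorrelated readings
print(good, bad)  # → True False
```

A real deployment would weight neighbours by spatial distance, temporal alignment, and domain relationship, which is exactly the role of the correlation model the paper proposes.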
Protection and privacy of data in cooperative agricultural processes : the challenges of the future
(2016)
In agriculture, the growing usage of sensors, smart mobile machinery, and information systems results in high volumes of data. The data differs in accuracy, frequency, volume, type and, most importantly, owner of the information. However, cooperative processes and big data analyses require access to comprehensive amounts of data for successful agricultural operation and reasoning. In some processes, instructed contractors even gather data belonging to other owners and use it for machinery operation optimisation and accounting (e.g. yield in maize harvest). Today's approach to data handling has a high potential to conflict with European and national regulations for data protection and privacy. This article presents a concept for continuous data protection and privacy in cooperative agricultural processes. The concept aims at ensuring data sovereignty for the owner while at the same time making as much data as possible usable for process operation and big data research. Briefly explained, owners pick a collection of data and create usage licenses for other players. The licenses specify time-limited and/or position-bound access to the data collection. Privacy environments in software and/or hardware protect access rights on end-user devices, data share hubs, and machinery devices such as agricultural terminals. In addition to access right configurations, digital signatures prevent data manipulation when cooperative players capture data during processes. So-called signature boxes represent certified software or hardware components that are located close to data sources (e.g. as hardware attached to sensors on mobile machinery) and bind the captured data with digital signatures.
Long Range Wide Area Network (LoRaWAN) operates in the 868 MHz ISM band, where the Time on Air (ToA) is regulated in the EU to 1 %. LoRaWAN nodes use the Adaptive Data Rate (ADR) algorithm to adapt their data rates during operation. The standard ADR algorithm works well with stationary nodes but is very slow to adapt for mobile nodes. This paper introduces a new ADR algorithm for LoRaWAN that is supported by higher-level meta-data for sensor streams, namely Quality of Information (QoI). With the help of QoI it is possible to provide additional information to the new ADR algorithm, reducing the convergence time and thus improving the Packet Delivery Ratio (PDR) of data from mobile sensor nodes. The new algorithm requires modifications only on the network server side and keeps backwards compatibility with LoRaWAN nodes. Results show a significantly better PDR compared to the standard ADR in scenarios with a limited number of mobile nodes.
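The convergence problem and the QoI idea can be sketched as follows. The stock network-server ADR only acts after collecting a history of uplinks, while a QoI-informed variant can react immediately when the stream meta-data flags the node as mobile. The SNR-to-spreading-factor thresholds below are illustrative assumptions, not the paper's algorithm:

```python
# LoRaWAN spreading factors SF7 (fast, short range) .. SF12 (slow, robust).
SF_RANGE = (7, 12)

def pick_sf(snr, floor_snr=-20.0, step=2.5):
    # Map link margin to a spreading factor (thresholds are illustrative).
    lo, hi = SF_RANGE
    sf = hi - int((snr - floor_snr) / step)
    return max(lo, min(hi, sf))

def standard_adr(snr_history, required=20):
    # The stock ADR acts only once `required` uplinks have been collected.
    if len(snr_history) < required:
        return None
    return pick_sf(max(snr_history))

def qoi_adr(snr_history, mobile):
    """QoI-assisted ADR (sketch): if the stream's Quality of Information
    meta-data flags the node as mobile, adapt from the most recent SNR
    instead of waiting for a full history."""
    if mobile and snr_history:
        return pick_sf(snr_history[-1])
    return standard_adr(snr_history)

print(standard_adr([-5.0]))          # stock ADR: not enough history yet
print(qoi_adr([-5.0], mobile=True))  # QoI-assisted: reacts immediately
```

Because the decision logic lives entirely on the network-server side, the node itself only ever sees ordinary ADR downlink commands, which is what preserves backwards compatibility.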
The Internet of Things (IoT) is the enabler for new innovations in several domains. It allows the connection of digital services with real, physical entities. These entities are devices of different categories and range in size from large machinery to tiny sensors. In the latter case, devices are typically characterized by limited resources in terms of computational power, available memory, and sometimes limited power supply. As a consequence, the use of security algorithms requires expert knowledge in order for them to work within the limited resources; that means finding a suitable configuration for the algorithms to perform properly on the device. On the other side, there is the desire to protect valuable assets as strongly as possible. Usually, security goals are captured in security policies, but these do not consider resource availability on the involved devices or their consumption while executing security algorithms. This paper presents a resource-aware information exchange model and a generation tool that uses high-level security policies as input. The model forms the conceptual basis for an automated security configuration recommendation system.
The Internet of Things (IoT) is the enabler for new innovations in several domains. It allows the connection of digital services with physical entities in the real world. These entities are devices of different categories, with sizes ranging from large machinery to tiny sensors. In the latter case, devices are typically characterized by limited resources in terms of computational power, available memory, and sometimes limited power supply. As a consequence, the use of security algorithms requires them to work within the limited resources. This means finding a suitable implementation and configuration for a security algorithm that performs properly on the device, which may become a challenging task. On the other side, there is the desire to protect valuable assets as strongly as possible. Usually, security goals are recorded in security policies, but they do not consider resource availability on the involved device or its power consumption while executing security algorithms. This paper presents an IoT security configuration tool that helps the designer of an IoT environment to experiment with the trade-off between maximizing security and extending the lifetime of a resource-constrained IoT device. The tool is controlled with a high-level description of security goals in the form of policies. It allows the designer to validate various (security) configurations for a single IoT device up to a large sensor network.
The Internet of Things (IoT) relies on sensor devices to measure real-world phenomena in order to provide IoT services. The sensor readings are shared with multiple entities, such as IoT services, other IoT devices, or other third parties. The collected data may be sensitive and include personal information. To protect the privacy of the users, the data needs to be protected by an encryption algorithm. For sharing cryptographic cipher-texts with a group of users, Attribute-Based Encryption (ABE) is well suited, as it does not require the creation of group keys. However, the creation of ABE cipher-texts is slow when executed on resource-constrained devices such as IoT sensors. In this paper, we present a modification of an ABE scheme that not only allows data to be encrypted efficiently using ABE, but also reduces the size of the cipher-text that must be transmitted by the sensor. We also show how our modification can be used to realise an instantaneous key revocation mechanism.
High Performance and Privacy for Distributed Energy Management: Introducing PrivADE+ and PPPM
(2018)
Distributed Energy Management (DEM) will play a vital role in future smart grids. An important and often overlooked factor in this concept is privacy. This paper presents two privacy-preserving DEM algorithms called PrivADE+ and PPPM. PrivADE+ uses a round-based energy management procedure for switchable and dynamically adaptable loads. PPPM builds on the market-based PowerMatcher approach. Both algorithms apply homomorphic encryption to privately gather aggregated data and exchange commands. Simulations show that PrivADE+ and PPPM achieve good energy management quality with low communication requirements and without negative influences on robustness.
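The additive-homomorphic aggregation that both algorithms rely on can be illustrated with a toy Paillier cryptosystem, in which the product of ciphertexts decrypts to the sum of the plaintexts. This is a minimal sketch with insecure demo parameters; the abstract does not specify which homomorphic scheme the paper uses:

```python
import math
import random

def L(x, n):
    # Paillier's L function, defined on values congruent to 1 mod n.
    return (x - 1) // n

def keygen(p, q):
    # Toy key generation from two given primes (demo-sized, insecure).
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    mu = pow(L(pow(g, lam, n * n), n), -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return (L(pow(c, lam, n * n), n) * mu) % n

def add(pub, c1, c2):
    # Homomorphic addition: multiplying ciphertexts adds plaintexts.
    return (c1 * c2) % (pub[0] ** 2)

pub, priv = keygen(293, 433)   # toy primes; real keys are >= 2048 bits
loads = [60, 42, 17]           # three households' load values
ciphers = [encrypt(pub, m) for m in loads]
agg = ciphers[0]
for c in ciphers[1:]:
    agg = add(pub, agg, c)
print(decrypt(pub, priv, agg))  # → 119, without revealing any single load
```

An aggregator can thus compute the total consumption needed for energy management decisions while never seeing an individual household's reading, which is the privacy property both PrivADE+ and PPPM depend on.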