Modeling and analysis of cyber-physical systems are still challenging. One reason is that cyber-physical systems involve many different parts (cyber or physical), of different nature (discrete or continuous), in constant interaction via sensing and actuating. For instance, consider a group of robots, each running a program that takes decisions based on a sequence of sensor readings. The sensors that equip a robot return the current position of the robot and the position of any adjacent obstacle. The interaction occurring between the robots in the group cannot be derived solely from the specification of the individual robots. If the field on which the robots roam changes its properties, the same group of robots might sense different values, and therefore take different actions. Also, the times at which a robot acts and senses will affect the decision of each controller and will change the resulting collective behavior. This thesis proposes a compositional approach to the design and programming of interacting cyber-physical components. We present an algebra that provides a novel perspective on modeling the interaction of cyber-physical components. Using our algebraic framework, one can design a complex cyber-physical system by first designing each part, and then specifying how the parts interact. We formalize the relation between our abstract semantic model and an implementation written in Maude, a programming language based on rewriting logic. We present several applications, including safety and liveness properties of a system consisting of a set of robots, each equipped with a battery, running on a shared field.
Radiography is an important technique to inspect objects, with applications in airports and hospitals. X-ray imaging is also essential in industry, for instance in food safety checks for the presence of foreign objects. Computed tomography (CT) enables more accurate visualizations of an object in 3D, but requires more computation time. Spectral X-ray imaging is an important recent development to optimize these conflicting goals of speed and accuracy. This technique enables separation of detected X-ray photons in terms of energy. More information can be extracted from spectral images, which allows for better separation of materials. Deep learning is another important recent technique enabling machines to quickly carry out processing tasks, by training them with large volumes of data for these specific tasks. In this dissertation we present new processing methods that use spectral imaging and machine learning, with a special focus on industrial processes. We design a workflow using CT to efficiently generate large volumes of machine learning training data. In addition, we develop a compression method for efficient processing of large volumes of spectral data and two new spectral CT methods to produce more accurate reconstructions. The presented methods are designed for effective use in industry.
With the advent of multicore processors and data centers, computer hardware has become increasingly parallel, allowing one to run multiple pieces of software at the same time on different machines. Coordination of these pieces is best expressed in a coordination language as an explicit interaction protocol that clearly defines the interactions among all components in the software. An explicit interaction protocol not only improves code structure but also enables automated analysis of the protocol to improve execution efficiency of the software. In this thesis, we focus in particular on improving execution efficiency by means of scheduling, which concerns the allocation of (computing) resources to software tasks. Almost always, scheduling is the responsibility of a general-purpose operating system that makes no assumptions about the software and thereby ignores all relevant scheduling information in that software. As a result, the operating system alone cannot ensure optimally scheduled execution of the software. In this thesis, we propose a solution that changes the software such that it will be efficiently scheduled by the general-purpose operating system. The main idea is to take advantage of the duality between scheduling and coordination. To be precise, we analyze the protocol of the software to determine an optimal scheduling strategy for this software. Then, we enforce this optimal schedule by incorporating the strategy in the original protocol. As a result, we force the otherwise-oblivious operating system scheduler to follow our precomputed optimal schedule.
This thesis manuscript explores the use of video games as tools for conceptual exploration and academic research. The research question of how video games facilitate exploration is investigated through nine chapters, including empirical studies such as user surveys, video game design artifacts, and user studies. Chapter 2 introduces relevant terminology and related work. Chapter 3 describes the creation of CURIO, a video game developed for classroom use that requires players to ask critical and original questions about topics a teacher defines, revealing the need to highlight information gaps to stimulate curiosity for conceptual exploration. Chapter 4 investigates what types of video games elicit curiosity for exploration through a survey, while Chapter 5 formulates a hypothesis on design patterns for exploration based on the survey results. Chapters 6 and 7 focus on implementing and validating design patterns for exploration and how they influence player behavior and experience. Chapter 8 reflects on the use of video games in research efforts. The final chapter summarizes insights and contributions, and provides directions for future research. Overall, the manuscript presents evidence that video games can effectively facilitate exploration and can be used as tools for academic research.
The focus of this thesis is on the technical methods which help promote the movement towards Trustworthy AI, specifically within the Inspectorate of the Netherlands. The goal is to develop and assess the technical methods which are required to shift the actions of the Inspectorate to a data-driven paradigm, concretely under a supervised classification framework of machine learning. The aspect of reliability is addressed as a data quality concern, viz. missingness and noise. The aspect of fairness is addressed as a counter to bias in the selection process of inspections. The conclusion is that, whilst no complete solution has yet been suggested, it is possible to address the concerns related to data quality and data bias, culminating in well-performing classification models that are reliable and fair.
Predictive maintenance (PdM) is a maintenance policy that uses the past, current, and prognosticated health condition of an asset to predict when timely maintenance should occur. PdM overcomes challenges of more conservative policies, such as corrective or scheduled maintenance. The remaining useful life (RUL) is a critical notion in PdM that determines the time remaining until a system is no longer useful and requires maintenance. Among the approaches employed to estimate the RUL, data-driven PdM methods have proven to be a good candidate due to their (mostly) domain-agnostic nature and their broad applicability to the asset's generated data. Nevertheless, there are various challenges to consider in data-driven PdM, such as algorithm selection, hyperparameter optimization, and uncertainty of the RUL estimation. This thesis proposes solutions and frameworks for these challenges using simulated datasets. We furthermore dive into scheduling optimization, which is the next step in PdM, and point towards the importance of understanding the data-generating process in PdM using real-world data. Finally, we show how a method originally developed for PdM in the automotive industry can lend itself to the medical domain, exhibiting the significance of knowledge transfer and the versatility of data-driven methods.