
Publications Overview

The only constant in our world is change. Why is there not a field of science that explicitly studies continuous change? We propose the establishment of process science, a field that studies processes: coherent series of changes, both man-made and naturally occurring, that unfold over time and occur at various levels. Process science is concerned with understanding and influencing change. It entails discovering and understanding processes as well as designing interventions to shape them into desired directions. Process science is based on four key principles; it (1) puts processes at the center of attention, (2) investigates processes scientifically, (3) embraces perspectives of multiple disciplines, and (4) aims to create impact by actively shaping the unfolding of processes. The ubiquitous availability of digital trace data, combined with advanced data analytics capabilities, offers new and unprecedented opportunities to study processes through multiple data sources, which makes process science very timely.

Francesca Zerbato, Ronny Seiger, Gemma Di Federico, Andrea Burattin, Barbara Weber
Conference or Workshop Item
Process mining techniques rely on the availability of event logs, where events have a certain granularity that is deemed appropriate for representing business activities. In this paper, we discuss why choosing a proper granularity level during preprocessing can be challenging and reflect on the implications that such a “fixed” view over the process bears for the analysis. Then, inspired by use cases in the context of user behavior analysis, we envision possible solutions that allow exploring and mining multiple granularity levels of process activities.
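As a toy illustration of the granularity problem described above (the event names and the time-gap heuristic are our own assumptions, not the paper's method), low-level events can be lifted to coarser activities by splitting the stream at large time gaps:

```python
# Toy event stream: (timestamp in seconds, low-level event name).
events = [
    (0, "open_file"), (5, "scroll"), (9, "scroll"),
    (120, "run_query"), (124, "inspect_result"),
    (400, "open_file"), (402, "edit"),
]

def lift_granularity(events, gap=60):
    """Group low-level events into coarser activity segments: a new segment
    starts whenever consecutive events are more than `gap` seconds apart."""
    segments, current = [], [events[0]]
    for prev, curr in zip(events, events[1:]):
        if curr[0] - prev[0] > gap:
            segments.append(current)
            current = []
        current.append(curr)
    segments.append(current)
    # Label each segment by its first event, one of many possible abstractions.
    return [(seg[0][0], seg[-1][0], seg[0][1]) for seg in segments]

print(lift_granularity(events))
# [(0, 9, 'open_file'), (120, 124, 'run_query'), (400, 402, 'open_file')]
```

Changing `gap` changes the granularity of the resulting log, which is exactly the "fixed view" the paper argues should remain explorable.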

The Fourth Industrial Revolution is in full progress. An increasing number of manufacturing companies are using information technology to digitize their products and services by integrating production machines and production processes with enterprise information systems and digital processes. These interconnected smart machines, products, and processes build the Industrial Internet of Things (IIoT). However, if we want to integrate the functionality and data provided by the production machines and resources with information systems such as Enterprise Resource Planning (ERP) systems and business process management systems (BPMS), several interoperability, abstraction, and interaction issues must be addressed: machines have different non-standardized programming interfaces; are programmed in low-level programming languages (e.g., G code); are very complex; and operate mostly in isolation. Nevertheless, this bi-directional integration of the IIoT devices with information systems shows a lot of potential and benefits for both areas. On the one hand, live status data about production machines and processes can be fed directly into the ERP systems, e.g., to optimize the production plans or adapt the production plans in case of machine failures or downtimes. On the other hand, the BPMS can monitor, analyze, and control production processes, e.g., to adapt or reconfigure the production processes in case of exceptions on the shop floor. Realizing these flexible and dynamic processes and production lines is among the main goals of Industry 4.0 developments. In this blog post, we present a novel software stack to reduce the gap between the production shop floor and enterprise-level management systems. First, we discuss how to abstract and integrate production machines via web services that can be called from BPMS. 
Then, we showcase this software stack by using the Camunda Modeler and Camunda Platform to automate and execute exemplary production processes in our own smart factory.
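As a hedged sketch of how a BPMS can drive shop-floor services, the snippet below polls the Camunda Platform's external-task REST API using only the Python standard library; the base URL, worker id, and topic name are illustrative assumptions, not the setup used in this post:

```python
import json
import urllib.request

CAMUNDA = "http://localhost:8080/engine-rest"  # assumed default REST base URL

def fetch_and_lock_body(worker_id, topic, lock_ms=10000):
    """Request body for Camunda's POST /external-task/fetchAndLock, used by a
    machine-side worker to poll for service tasks published under `topic`."""
    return {
        "workerId": worker_id,
        "maxTasks": 1,
        "topics": [{"topicName": topic, "lockDuration": lock_ms}],
    }

def poll_once(worker_id, topic):
    """One poll cycle: lock a task, drive the machine, report completion.
    Requires a running Camunda engine, so it is only sketched here."""
    body = json.dumps(fetch_and_lock_body(worker_id, topic)).encode()
    req = urllib.request.Request(
        f"{CAMUNDA}/external-task/fetchAndLock", data=body,
        headers={"Content-Type": "application/json"})
    for task in json.load(urllib.request.urlopen(req)):
        # ... call the production machine's wrapped web service here ...
        done = json.dumps({"workerId": worker_id}).encode()
        urllib.request.urlopen(urllib.request.Request(
            f"{CAMUNDA}/external-task/{task['id']}/complete", data=done,
            headers={"Content-Type": "application/json"}))

print(fetch_and_lock_body("drill-1", "drill-hole"))
```

The external-task pattern keeps the machine integration outside the engine: the BPMS only publishes work items, and any machine wrapper that speaks HTTP can pick them up.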

Lukas Malburg, Manfred-Peter Rieder, Ronny Seiger, Patrick Klein, Ralph Bergmann
Conference or Workshop Item
The production industry is in a transformation towards more autonomous and intelligent manufacturing. In addition to more flexible production processes to dynamically respond to changes in the environment, it is also essential that production processes are continuously monitored and completed in time. Video-based methods such as object detection systems are still in their infancy and rarely used as basis for process monitoring. In this paper, we present a framework for video-based monitoring of manufacturing processes with the help of a physical smart factory simulation model. We evaluate three state-of-the-art object detection systems regarding their suitability to detect workpieces and to recognize failure situations that require adaptations. In our experiments, we are able to show that detection accuracies above 90% can be achieved with current object detection methods.
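For readers unfamiliar with how such detection accuracies are typically judged, the standard matching criterion between a predicted and a ground-truth bounding box is intersection over union (IoU); the following is a minimal sketch, not the paper's evaluation code:

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes, the standard
    criterion for deciding whether a predicted workpiece box matches ground truth."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

# A prediction usually counts as correct when IoU exceeds a threshold (often 0.5).
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.3333333333333333
```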

In the field of Business Process Management (BPM), modeling business processes and related data is a critical issue since process activities need to manage data stored in databases. The connection between processes and data is usually handled at the implementation level, even if modeling both processes and data at the conceptual level should help designers in improving business process models and identifying requirements for implementation. Especially in data- and decision-intensive contexts, business process activities need to access data stored both in databases and data warehouses. In this paper, we complete our approach for defining a novel conceptual view that bridges process activities and data. The proposed approach allows the designer to model the connection between business processes and database models and define the operations to perform, providing interesting insights on the overall connected perspective and hints for identifying activities that are crucial for decision support.

Our everyday lives are increasingly pervaded by digital assistants and smart devices forming the Internet of Things (IoT). While user interfaces to directly monitor and control individual IoT devices are becoming more sophisticated and end-user friendly, applications to connect standalone IoT devices and create more complex IoT processes for automating and assisting users with repetitive tasks still require a high level of technical expertise and programming knowledge. Related approaches for process modelling in IoT mostly suggest extensions to complex modelling languages, require high levels of abstraction and technical knowledge, and rely on unintuitive tools. We present a novel approach for end-user-oriented, no-code IoT process modelling using Mixed Reality (MR) technology: HoloFlows. Users are able to explore the IoT environment and model processes among sensors and actuators as first-class citizens by simply "drawing" virtual wires among physical IoT devices. MR technology hereby facilitates the understanding of the physical contexts and relations among the IoT devices and provides a new and more intuitive way of modelling IoT processes. The results of a user study comparing HoloFlows with classical modelling approaches show an improved user experience and a decrease in the modelling knowledge and technical expertise required to create IoT processes.

In the past decade, brain and autonomic nervous system activity measurement received increasing attention in the study of software engineering (SE). This paper presents a systematic literature review (SLR) to survey the existing NeuroSE literature. Based on a rigorous search protocol, we identified 89 papers (hereafter denoted as NeuroSE papers). We analyzed these papers to develop a comprehensive understanding of who had published NeuroSE research and classified the contributions according to their type. The 47 articles presenting completed empirical research were analyzed in detail. The SLR revealed that the number of authors publishing NeuroSE research is still relatively small. The thematic focus so far has been on code comprehension, while code inspection, programming, and bug fixing have been less frequently studied. NeuroSE publications primarily used methods related to brain activity measurement (particularly fMRI and EEG), while methods related to the measurement of autonomic nervous system activity (e.g., pupil dilation, heart rate, skin conductance) received less attention. We also present details of how the empirical research was conducted, including stimuli and independent and dependent variables, and discuss implications for future research. The body of NeuroSE literature is still small. Yet, high quality contributions exist constituting a valuable basis for future studies.

This work has been motivated by the needs we discovered when analyzing real-world processes from the healthcare domain that have revealed high flexibility demands and complex temporal constraints. When trying to model these processes with existing languages, we learned that none of the latter was able to fully address these needs. This motivated us to design TConDec-R, a declarative process modeling language enabling the specification of complex temporal constraints. Enacting business processes based on declarative process models, however, introduces a high complexity due to the required optimization of objective functions, the handling of various temporal constraints, the concurrent execution of multiple process instances, the management of cross-instance constraints, and complex resource allocations. Consequently, advanced user support through optimized schedules is required when executing the instances of such models. In previous work, we suggested a method for generating an optimized enactment plan for a given set of process instances created from a TConDec-R model. However, this approach was not applicable to scenarios with uncertain demands in which the enactment of newly created process instances starts continuously over time, as in the considered healthcare scenarios. Here, the process instances to be planned within a specific timeframe cannot be considered in isolation from the ones planned for future timeframes. To be able to support such scenarios, this article significantly extends our previous work by generating optimized enactment plans under a rolling planning horizon. We evaluate the approach by applying it to a particularly challenging healthcare process scenario, i.e., the diagnostic procedures required for treating patients with ovarian carcinoma in a Woman Hospital. 
The application of the approach to this sophisticated scenario allows avoiding constraint violations and effectively managing shared resources, which contributes to reducing the length of patient stays in the hospital.
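The rolling-planning-horizon idea can be illustrated with a deliberately tiny sketch: tasks arrive over time, only the tasks known so far are (re)planned, and unplaced work is carried into the next window. The task names, durations, and the shortest-first rule are our own stand-ins for the optimized enactment plans of the article:

```python
def rolling_plan(arrivals, horizon=3):
    """Toy rolling-horizon scheduler for a single shared resource. `arrivals`
    maps a timeframe to the (task, duration) pairs that become known in it.
    In each timeframe we (re)plan only the tasks known so far, commit the
    current window, and carry unplaced work into the next planning window."""
    schedule, backlog, clock = [], [], 0
    for frame in sorted(arrivals):
        backlog += arrivals[frame]
        backlog.sort(key=lambda t: t[1])  # shortest first: a simple stand-in
        used = 0
        while backlog and used + backlog[0][1] <= horizon:
            task, duration = backlog.pop(0)
            schedule.append((clock + used, task))
            used += duration
        clock += horizon
    return schedule, backlog

plan, leftover = rolling_plan({0: [("MRI", 1), ("surgery", 2)], 1: [("lab", 1)]})
print(plan, leftover)
# [(0, 'MRI'), (1, 'surgery'), (3, 'lab')] []
```

The key property mirrored here is that the plan for one window never pretends to know about instances that only arrive later, which is what distinguishes rolling planning from one-shot optimization.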

The use of neurophysiological measurements to advance the design, development, use, acceptance, influence and adaptivity of information systems is receiving increasing attention. Within the field of education, neurophysiological measurements have commonly been used to capture a learner’s psychological constructs such as cognitive load, attention and emotion, which play an important role in student learning. This paper systematically examines the literature on the use of neurophysiological measurements in higher education. In particular, using a well-established Systematic Literature Review (SLR) method, we identified 83 papers reporting empirical evidence about the outcome of employing neurophysiological measurements within educational technologies in higher education. The findings of the SLR are divided into three main themes discussing the employed measurements, experimental settings and constructs and outcomes. Our findings identify that (1) electroencephalography and facial expression recognition are the dominantly employed types of measurement, (2) the majority of the experiments used a pre-experimental design, (3) attention and emotion are the two foremost cognitive and non-cognitive constructs under investigation, while less emphasis is paid to meta-cognitive constructs and (4) the reported results mostly focus on monitoring learners’ states, which are not always the same as the intended purpose, such as developing an adaptive system. In broader terms, the review of the literature provides evidence of the effective use of neurophysiological measurements by educational technologies to enhance learning; however, a number of challenges and concerns related to the accuracy and validity of the captured construct, the intrusiveness of the employed instruments as well as ethical and privacy considerations have surfaced that need to be addressed before such technologies can be employed and adopted at scale.

Christian Janisch, Agnes Koschmider, Massimo Mecella, Barbara Weber, Andrea Burattin, Claudio Di Ciccio, Giancarlo Fortino, Avigdor Gal, Udo Kannengiesser, Francesco Leotta, Felix Mannhardt, Andrea Marrella, Jan Mendling, Andreas Oberweis, Manfred Reichert, Stefanie Rinderle-Ma, Estefania Serral, WenZhan Song, Jianwen Su, Victoria Torres, Matthias Weidlich, Mathias Weske, Liang Zhang
Journal paper
The Internet of Things (IoT) refers to a network of connected devices collecting and exchanging data over the Internet. These things can be artificial or natural and interact as autonomous agents forming a complex system. In turn, Business Process Management (BPM) was established to analyze, discover, design, implement, execute, monitor and evolve collaborative business processes within and across organizations. While the IoT and BPM have been regarded as separate topics in research and practice, we strongly believe that the management of IoT applications will strongly benefit from BPM concepts, methods and technologies on the one hand; on the other hand, the IoT poses challenges that will require enhancements and extensions of the current state-of-the-art in the BPM field. In this paper, we question to what extent these two paradigms can be combined and we discuss the emerging challenges and intersections from a research and practitioner’s point of view in terms of complex software systems development.

This demo paper presents an infrastructure to enable real-time monitoring of process events (i.e., telemetry). The infrastructure relies on the MQTT protocol, which ensures minimal logging overhead. The paper presents a Java library for producing (i.e., logging) and consuming events, built on top of HiveMQ. Additionally, a prototype dashboard to display basic statistics is presented and described.
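A minimal sketch of what such a telemetry event might look like on the wire; the JSON fields and topic layout are our assumptions for illustration, while the paper's own library is written in Java on top of HiveMQ:

```python
import json
from datetime import datetime, timezone

def process_event(case_id, activity, lifecycle="complete"):
    """Serialize one process event as it could be published to an MQTT topic
    such as 'log/<process>/<case>/<activity>' (topic layout assumed here)."""
    return json.dumps({
        "case": case_id,
        "activity": activity,
        "lifecycle": lifecycle,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

# Publishing is then a single call with any MQTT client, e.g. paho-mqtt:
#   client.publish("log/order/case-42/ship", process_event("case-42", "ship"))
payload = json.loads(process_event("case-42", "ship"))
print(payload["case"], payload["activity"], payload["lifecycle"])
# case-42 ship complete
```

Because MQTT publishing is fire-and-forget at QoS 0, instrumented applications pay almost no latency cost, which is the "minimal logging overhead" claim above.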

Ronny Seiger, Uwe Aßmann, Dominik Grzelak, Mikhail Belov, Paul Riedel, Ariel Podlubne, Wanqi Zhao, Jens Kerber, Jonas Mohr, Fabio Espinosa, Tim Schwartz
Journal paper
In the future Tactile Internet, the physical and virtual worlds will increasingly converge, even across multiple locations. Robotic co-working, the collaboration of robots and humans, is gaining importance in the context of the Internet of Things (IoT) and cyber-physical systems (CPS). In this work, we present a case study conducted on the occasion of the 50th anniversary of computer science education in Germany. In this study, humans and robots distributed across several locations collaborate in a virtual co-working cell to construct a "physical and virtual Informatik-Deutschland-Campus 2069". Using sensors, actuators, and software, the cyber-physical campus is erected step by step, controlled by a workflow. With the help of a mixed-reality application, this process can be experienced immersively, i.e., in an interactive and explorable way, independent of any specific location.

The Internet of Things (IoT) enables software-based access to vast amounts of data streams from sensors measuring physical and virtual properties of smart devices and their surroundings. While sophisticated means for the control and data analysis of single IoT devices exist, a more process-oriented view of IoT systems is often missing. Such a lack of process awareness hinders the development of process-based systems on top of IoT environments and the application of process mining techniques for process analysis and optimization in IoT. We propose a framework for the stepwise correlation and composition of raw IoT sensor streams with events and activities on a process level based on Complex Event Processing (CEP). From this correlation we derive refined process event logs (possibly with ambiguities) that can be used for process analysis at runtime (i.e., online). We discuss the framework using examples from a smart factory.
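The correlation step can be illustrated with a deliberately simple stand-in for a CEP rule: a power reading crossing a threshold is lifted to process-level start/complete events. The sensor, activity name, and threshold are assumed for illustration and are not taken from the paper:

```python
def lift_to_events(readings, threshold=50.0):
    """Correlate a raw power-sensor stream with process-level events: a rising
    edge through `threshold` signals 'start' of an activity, a falling edge
    'complete' (a toy stand-in for a Complex Event Processing rule)."""
    events, active = [], False
    for ts, watts in readings:
        if watts >= threshold and not active:
            events.append((ts, "milling", "start"))
            active = True
        elif watts < threshold and active:
            events.append((ts, "milling", "complete"))
            active = False
    return events

stream = [(0, 3.0), (1, 80.0), (2, 95.0), (3, 4.0), (4, 2.0)]
print(lift_to_events(stream))
# [(1, 'milling', 'start'), (3, 'milling', 'complete')]
```

Real CEP engines express such rules declaratively over event windows, but the essence is the same: many raw readings collapse into a few process-level events suitable for an event log.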

Process design artifacts have been increasingly used to guide the modeling of business processes. To support users in designing and understanding process models, different process artifacts have been combined in several ways leading to the emergence of the so-called “hybrid process artifacts”. While many hybrid artifacts have been proposed in the literature, little is known about how they can actually support users in practice. To address this gap, this work investigates the way users engage with hybrid process artifacts during comprehension tasks. In particular, we focus on a hybrid representation of DCR Graphs (DCR-HR) combining a process model, textual annotations and an interactive simulation. Following a qualitative approach, we conduct a multi-granular analysis exploiting process mining, eye-tracking techniques, and verbal data analysis to scrutinize the reading patterns and the strategies adopted by users when being confronted with DCR-HR. The findings of the coarse-grained analysis provide important insights about the behavior of domain experts and IT specialists and show how users’ backgrounds and task types change the use of hybrid process artifacts. As for the fine-grained analysis, users’ behavior was classified into goal-directed and exploratory, and different strategies of using the interactive simulation were identified. In addition, a progressive switch from an exploratory behavior to a goal-directed behavior was observed. These insights pave the way for an improved development of hybrid process artifacts and delineate several directions for future work.

Process modeling plays a central role in the development of today’s process-aware information systems both on the management level (e.g., providing input for requirements elicitation and fostering communication) and on the enactment level (providing a blue-print for process execution and enabling simulation). The literature comprises a variety of process modeling approaches proposing different modeling languages (i.e., imperative and declarative languages) and different types of process artifact support (i.e., process models, textual process descriptions, and guided simulations). However, the use of an individual modeling language or a single type of process artifact is usually not enough to provide a clear and concise understanding of the process. To overcome this limitation, a set of so-called “hybrid” approaches combining languages and artifacts have been proposed, but no common grounds have been set to define and categorize them. This work aims at providing a fundamental understanding of these hybrid approaches by defining a unified terminology, providing a conceptual framework and proposing an overarching overview to identify and analyze them. Since no common terminology has been used in the literature, we combined existing concepts and ontologies to define a “Hybrid Business Process Representation” (HBPR). Afterwards, we conducted a Systematic Literature Review (SLR) to identify and investigate the characteristics of HBPRs combining imperative and declarative languages or artifacts. The SLR resulted in 30 articles which were analyzed. The results indicate the presence of two distinct research lines and show common motivations driving the emergence of HBPRs, a limited maturity of existing approaches, and diverse application domains. Moreover, the results are synthesized into a taxonomy classifying different types of representations. Finally, the outcome of the study is used to provide a research agenda delineating the directions for future work.

Conference or Workshop Item
Process models provide a blueprint for process execution and an indispensable tool for process management. Bearing in mind their trending use for requirement elicitation, communication and improvement of business processes, the need for understandable process models becomes a must. In this paper, we propose a research model to investigate the impact of modularization on the understandability of declarative process models. We design a controlled experiment supported by eye-tracking, electroencephalography (EEG) and galvanic skin response (GSR) to appraise the understandability of hierarchical process models through measures such as comprehension accuracy, response time, attention, cognitive load and cognitive integration.

Imperative process models have become immensely popular. However, their use is usually limited to rigid and repetitive processes. Considering the inherent flexibility in most processes in the real world and the increased need for managing knowledge-intensive processes, the adoption of declarative languages becomes more pertinent than ever. While the quality of imperative models has been extensively investigated in the literature, little is known about the dimensions affecting the quality of declarative models. This work takes a significant stride toward investigating the quality of declarative models. Following the theory of Personal Construct Psychology (PCT), our research introduces a novel method within the Business Process Management (BPM) field to explore quality in the eyes of expert modelers. The findings of this work summarize the dimensions defining the quality of declarative models. The outcome shows the potential of PCT as a basis to discover quality dimensions and advances our understanding of quality in declarative process models.

Data visualizations are versatile tools for gaining cognitive access to large amounts of data and for making complex relationships in data understandable. This paper proposes a method for assessing data visualizations according to the purposes they fulfill in domain-specific data analysis settings. We introduce a framework that gets configured for a given analysis domain and allows choosing data visualizations in a methodically justified way, based on analysis questions that address different aspects of the data to be analyzed. Based on the concepts addressed by the analysis questions, the framework provides systematic guidance for determining which data visualizations are able to serve which conceptual analysis interests. In a second step of the method, we propose to follow a data-driven approach and to experimentally compare alternative data visualizations for a particular analytical purpose. More specifically, we propose to use eye tracking to support justified decisions about which of the data visualizations selected with the help of the framework are most suitable for assessing the analysis domain in a cognitively efficient way. We demonstrate our approach of how to come from analytical purposes to data visualizations using the example domain of Process Modeling Behavior Analysis. The analyses are performed against the background of representative analysis questions from this domain.

Thomas Hildebrandt, Amine Abbad Andaloussi, Lars Rune Christensen, Søren Debois, Nicklas Pape Healy, Hugo A. López, Morten Marquard, Naja L. Holten Møller, Anette Chelina Møller Petersen, Tijs Slaats, Barbara Weber
Conference or Workshop Item
We report on a new approach to co-creating adaptive case management systems jointly with end-users, developed in the context of the Effective co-created and compliant adaptive case Management Systems for Knowledge Workers (EcoKnow.org) research project. The approach is based on knowledge from prior ethnographic field studies and research in the declarative Dynamic Condition Response (DCR) technology for model-driven design of case management systems. The approach was tested in an operational environment jointly with the Danish municipality of Syddjurs by conducting a service-design project and implementing an open-source case manager tool and a new highlighter tool for mapping between textual specifications and the DCR notation. The design method and technologies were evaluated in understandability studies with end-users. The study showed that the development could be completed in just six months, and that the new highlighter tool, in combination with the traditional design and simulation tools, supports domain experts in formalising textual specifications and provides traceability between their interpretations and the formal models.

Understanding how developers interact with different software artifacts when performing comprehension tasks has a potential to improve developers' productivity. In this paper, we propose a method to analyze eye-tracking data using process mining to find distinct reading patterns of how developers interacted with the different artifacts. To validate our approach, we conducted an exploratory study using eye-tracking involving 11 participants. We applied our method to investigate how developers interact with different artifacts during domain and code understanding tasks. To contextualize the reading patterns and to better understand the perceived benefits and challenges participants associated with the different artifacts and their choice of reading patterns, we complemented the eye-tracking data with the data obtained from think aloud. The study used behavior-driven development, a development practice that is increasingly used in Agile software development contexts, as a setting. The study shows that our method can be used to explore developers' behavior at an aggregated level and identify behavioral patterns at varying levels of granularity.
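The core preprocessing step, turning fixation data into an event log that process mining tools can consume, can be sketched as follows; the artifact names are illustrative and this is not the study's actual pipeline:

```python
def fixations_to_visits(fixations):
    """Collapse consecutive fixations on the same artifact into one 'visit'
    event (start, end, artifact), yielding event-log entries suitable for
    process mining over reading behavior."""
    visits = []
    for ts, artifact in fixations:
        if visits and visits[-1][2] == artifact:
            visits[-1] = (visits[-1][0], ts, artifact)  # extend current visit
        else:
            visits.append((ts, ts, artifact))
    return visits

gaze = [(0, "scenario"), (1, "scenario"), (2, "code"), (3, "tests"), (4, "tests")]
print(fixations_to_visits(gaze))
# [(0, 1, 'scenario'), (2, 2, 'code'), (3, 4, 'tests')]
```

Each participant's visit sequence then forms one trace, so standard discovery techniques can surface reading patterns such as "scenario before code" across participants.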

Conference or Workshop Item
The production and manufacturing industries are currently transitioning towards more autonomous and intelligent production lines within the Fourth Industrial Revolution (Industry 4.0). Learning factories, small-scale physical models of real shop floors, are realistic platforms for conducting research in the smart manufacturing area without depending on expensive real-world production lines or completely simulated data. In this work, we propose to use learning factories for conducting research in the context of Business Process Management (BPM) and the Internet of Things (IoT), as this combination promises to be mutually beneficial for both research areas. We introduce our physical Fischertechnik factory models simulating a complex production line and three exemplary use cases of combining BPM and IoT, namely the implementation of a BPM abstraction stack on top of a learning factory, the experience-based adaptation and optimization of manufacturing processes, and the stream-processing-based conformance checking of IoT-enabled processes.

Josep Sanchez-Ferreres, Luis Delicado, Amine Abbad Andaloussi, Andrea Burattin, Guillermo Calderon-Ruiz, Barbara Weber, Josep Carmona, Lluís Padró
Journal paper
The creation of a process model is primarily a formalization task that faces the challenge of constructing a syntactically correct entity which accurately reflects the semantics of reality, and is understandable to the model reader. This paper proposes a framework called Model Judge, focused towards the two main actors in the process of learning process model creation: novice modelers and instructors. For modelers, the platform enables the automatic validation of the process models created from a textual description, providing explanations about quality issues in the model. Model Judge can provide diagnostics regarding model structure, writing style, and semantics by aligning annotated textual descriptions to models. For instructors, the platform facilitates the creation of modeling exercises by providing an editor to annotate the main parts of a textual description, which is empowered with natural language processing (NLP) capabilities so that the annotation effort is minimized. So far, around 300 students in process modeling courses at five different universities around the world have used the platform. The feedback gathered from some of these courses shows good potential in helping students to improve their learning experience, which might, in turn, impact process model quality and understandability. Moreover, our results show that instructors can benefit from getting insights into the evolution of modeling processes, including arising quality issues of single students, but also discovering tendencies in groups of students. Although the framework has been applied to process model creation, it could be extrapolated to other contexts where the creation of models based on a textual description plays an important role.

Process design artifacts (e.g., process models, textual process descriptions and simulations) are increasingly used to provide input for requirements elicitation and to facilitate the design of business processes. To support the understandability of process models and make them accessible for end-users with different backgrounds, several hybrid representations combining different design artifacts have been proposed in the literature. This paper investigates the understandability of DCR-HR, a new hybrid process design artifact based on DCR graphs. Using eye-tracking and think-aloud techniques, this paper explores the benefits and challenges associated with the use of different design artifacts and investigates the way end-users engage with them. The results motivate the use of DCR-HR and provide insights about the support it provides to end-users with different backgrounds.

Conceptual models play an important role in many organizations. They serve as tools for communication and documentation, are often a central part in process improvement initiatives, and are key to the development and evolution of information systems. Existing modeling tools typically support end users in a rather generic and non-personalized manner. However, users not only differ in their modeling expertise and the challenges they encounter while modeling, but also in their preferences. Therefore, they would benefit from a new generation of modeling environments that are highly personalized and adapt themselves to users’ needs. This keynote presents a vision of such modeling environments with a focus on process modeling. It highlights this potential with several examples from our research and touches upon challenges that come with the development of next generation modeling environments.

The creation of a process model is a process consisting of five distinct phases, i.e., problem understanding, method finding, modeling, reconciliation, and validation. To enable a fine-grained analysis of process model creation based on phases or the development of phase-specific modeling support, an automatic approach to detect phases is needed. While approaches exist to automatically detect modeling and reconciliation phases based on user interactions, the detection of phases without user interactions (i.e., problem understanding, method finding, and validation) is still a problem. Exploiting a combination of user interactions and eye tracking data, this paper presents a two-step approach that is able to automatically detect the sequence of phases a modeler is engaged in during model creation. The evaluation of our approach shows promising results both in terms of quality as well as computation time demonstrating its feasibility.
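A drastically simplified stand-in for such phase detection: segments without user interactions are labeled by their dominant gaze target, while segments with interactions map to modeling/reconciliation. The labels and rules below are our assumptions for illustration, not the paper's two-step approach:

```python
def label_phases(segments):
    """Assign a modeling phase to each (has_interaction, gaze_target) segment
    using interaction presence and the dominant gaze target."""
    phases = []
    for has_interaction, gaze_target in segments:
        if has_interaction:
            phases.append("modeling/reconciliation")
        elif gaze_target == "task description":
            phases.append("problem understanding")
        elif gaze_target == "model":
            phases.append("validation")
        else:
            phases.append("method finding")
    return phases

session = [(False, "task description"), (False, "blank"), (True, "model"),
           (False, "model")]
print(label_phases(session))
# ['problem understanding', 'method finding', 'modeling/reconciliation', 'validation']
```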

Constantina Ioannou, Ekkart Kindler, Per Bækgaard, Shazia Sadiq, Barbara Weber
Conference or Workshop Item
Despite their wide adoption for conducting experiments in numerous domains, neurophysiological measurements often are time consuming and challenging to interpret because of the inherent complexity of deriving measures from raw signal data and mapping measures to theoretical constructs. While significant efforts have been undertaken to support neurophysiological experiments, the existing software solutions are non-trivial to use because often these solutions are domain specific or their analysis processes are opaque to the researcher. This paper proposes an architecture for a software platform that supports experiments with multi-modal neurophysiological tools through extensible, transparent and repeatable data analysis and enables the comparison between data analysis processes to develop more robust measures. The identified requirements and the proposed architecture are intended to form a basis of a software platform capable of conducting experiments using neurophysiological tools applicable to various domains.

New types of smart devices are emerging every day providing end-users with new ways of interacting with the IoT. Setup and configuration of these devices are, however, rather complex and require technical expertise. Especially in smart homes users often lack these skills and need assistance with setting up devices, which makes them hesitant to adopt new technologies. We present a mixed reality application to support users with the setup of IoT devices. The setup tasks are modelled and executed as formal interactive workflows. An intuitive wizard guides users and presents additional task-related multimedia information in mixed reality. A semantically enriched IoT middleware provides data and control of the IoT devices. We conducted a brief user study to evaluate the application's usability.
