50 result(s) for "event-sequence analysis"
Debugging experiment machinery through time‐course event sequence analysis
This application note describes an open-source web application for viewing and analysing time-course event sequences in the form of log files containing timestamps. Web pages allow time-course event sequences to be visualised as time curves and compared against each other to reveal deviations between their timings. Selected sections of a sequence can also be parsed by a support vector machine model that heuristically scores the likelihood of an error based on the textual output in the log files, enabling quick error screening in files with large numbers of log events. The software is written in ASP.NET with Visual Basic code-behind so that it can be hosted on servers and integrated into web application frameworks.
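The comparison step this abstract describes, parsing timestamped log lines into an event sequence and measuring timing deviations between two runs, can be sketched as follows. The function names and the tab-separated log format are assumptions for illustration, not the package's actual API:

```python
from datetime import datetime

def parse_log(lines, fmt="%Y-%m-%d %H:%M:%S"):
    """Parse 'timestamp<TAB>event' lines into (event, seconds-since-start) pairs."""
    stamped = []
    for line in lines:
        stamp, event = line.split("\t", 1)
        stamped.append((event.strip(), datetime.strptime(stamp, fmt)))
    t0 = stamped[0][1]  # measure every event relative to the first one
    return [(event, (t - t0).total_seconds()) for event, t in stamped]

def timing_deviations(seq_ref, seq_run):
    """Per-event timing offset of a run relative to a reference sequence."""
    run_times = dict(seq_run)
    return {event: run_times[event] - t
            for event, t in seq_ref if event in run_times}
```

Plotting each sequence's cumulative times gives the "time curves" mentioned in the abstract; large positive offsets flag the events where a run fell behind its reference.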
Capabilities Unveiled: The Role of Ordinary Activities in the Evolution of Product Development Processes
In contrast to the prevailing interpretation of capabilities as collectives, this inductive study of product development in a leading design firm highlights the centrality of the myriad ordinary activities that may shape the evolution of capabilities. A detailed comparison of 90 diverse product development processes over a 15-year period shows, first, that mindful microactivities carried out by individuals in and around the organization and at all levels of the organizational hierarchy are central in shaping the content of the product development capability and its dynamic adaptation. Understanding organizational renewal and competitive advantage may hence require a partial shift in focus from capabilities as aggregate entities, to the practical realities of core organizational processes. Second, this more fine-grained perspective leads to a set of insights on how organizational renewal may be partially shaped by timely managerial interventions aimed at encoding successful experiments into higher-level organizational capabilities. Third, higher-level capabilities resulting from the conversion of heterogeneous experiences display higher process homogeneity and a permanent increase in performance, because of stabilization of managerial attention. My findings contribute to unveiling the concept of capabilities, extending prior research on dynamic capabilities and organizational renewal and providing a lens for research on the microfoundations of capability evolution and organizational advantage.
Learning While Innovating
This paper examines processes of trial-and-error learning during the development of a technological innovation by an interorganizational joint venture created expressly for developing and commercializing products from the new technology. We develop a model of adaptive learning, which incorporates elements from laboratory models of learning and applies them to the field research setting. The learning model focuses on relationships between the goals, actions, and outcomes of an innovation team within the joint venture as it develops the innovation over time, and the influences that environmental events and external interventions by resource controllers in parent companies have on the learning process. The model is tested based on a real-time longitudinal study of the development of a biomedical innovation (therapeutic apheresis) from 1983 to 1988. Different patterns of learning were observed in different periods of innovation development. Event time series analyses clearly contradict the learning model during an initial expansion period, but strongly support the model during a subsequent contraction period. Explanations for why these different patterns of organizational learning occurred over time are provided, and focus on a set of organizational structures and practices which are commonly used to manage innovation development, but which inhibit learning.
An Empirical Taxonomy of Implementation Processes Based on Sequences of Events in Information System Development
A widely accepted and usable taxonomy is a fundamental element in the development of a scientific body of knowledge. However, the creation of good empirical taxonomies of implementation processes is complicated by the need to consider the dynamics of the implementation process. This paper addresses this difficulty by using an optimal matching procedure to measure the pairwise distances among event sequences occurring in 53 computer-based information system (IS) implementation projects. Cluster analysis based on these inter-sequence distances is used to generate the empirical taxonomy of implementation processes. The resulting taxonomy includes six distinct archetypical processes. One of the process types is labeled textbook life cycle (type 4) due to its close resemblance to the detailed, rational approach commonly prescribed in IS textbooks. The logical minimalist process (type 1) follows some of the basic steps of the textbook approach, but is characterized by little project definition and infrequent assignment of personnel. Whereas both the textbook life cycle and logical minimalist approaches use external vendors and consultants to some extent, external dependence is much greater in traditional off-the-shelf (type 2) and outsourced cooperative (type 5) processes. The traditional off-the-shelf process simply involves purchasing the system from an external vendor, with little system construction or assignment of personnel. In contrast, the outsourced cooperative process consists of joint system development by internally assigned personnel and external vendors. The remaining two process types, problem-driven minimalist (type 3) and in-house trial-and-error (type 6), are both considerably influenced by performance problems. The problem-driven minimalist process is initiated by such problems, with little project definition, and results in a reassignment of organizational roles. The in-house trial-and-error process begins like the textbook life cycle, with a clear project definition, but involves frequent system modifications to respond to the performance problems encountered during the project. The paper demonstrates how an empirical taxonomy that incorporates the dynamics of event sequences may be developed. The archetypes comprising the taxonomy are related to other implementation process models available in the literature. Some limitations of the study are acknowledged and its implications for future research and practice are discussed.
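The optimal matching procedure used here to measure pairwise distances between event sequences is, at its core, an edit distance computed by dynamic programming. A minimal sketch, where the indel and substitution costs are illustrative defaults rather than the study's calibrated values:

```python
def optimal_matching(seq_a, seq_b, indel=1.0, sub=2.0):
    """Edit distance between two event sequences via dynamic programming."""
    m, n = len(seq_a), len(seq_b)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * indel
    for j in range(1, n + 1):
        d[0][j] = j * indel
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,      # delete an event
                          d[i][j - 1] + indel,      # insert an event
                          d[i - 1][j - 1] + cost)   # match or substitute
    return d[m][n]
```

Computing this distance for every pair of the 53 project sequences yields the inter-sequence distance matrix that the cluster analysis then partitions into the six archetypes.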
Prognostic Properties of Instantaneous Amplitudes Maxima of Earth Surface Tremor
A method is proposed for analyzing the tremor of the earth’s surface, measured by GPS, in order to highlight prognostic effects. The method is applied to the analysis of daily time series of vertical displacements in Japan. The network of 1047 stations is divided into 15 clusters. The Huang Empirical Mode Decomposition (EMD) is applied to the time series of the principal components from the clusters, with subsequent calculation of instantaneous amplitudes using the Hilbert transform. To ensure the stability of estimates of the waveforms of the EMD decomposition, 1000 independent additive realizations of white noise of limited amplitude were averaged before the Hilbert transform. Using a parametric model of the intensities of point processes, we analyze the connections between the sequences of times of the largest local maxima of instantaneous amplitudes, averaged over the clusters, and the times of earthquakes in the vicinity of Japan with a minimum magnitude threshold of 5.5 for the time interval 2012–2023. It is shown that the sequence of the largest local maxima of instantaneous amplitudes significantly more often precedes the moments of time of earthquakes (roughly speaking, has an “influence”) than the reverse “influence” of earthquakes on the maxima of amplitudes.
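The amplitude-extraction step, an analytic signal computed via the Hilbert transform, can be sketched with a standard FFT-based implementation. This is the textbook construction, not the authors' code, and the EMD and noise-averaging stages are omitted:

```python
import numpy as np

def instantaneous_amplitude(x):
    """Envelope of a real signal via the FFT-based analytic signal."""
    n = len(x)
    spectrum = np.fft.fft(x)
    weights = np.zeros(n)
    weights[0] = 1.0
    if n % 2 == 0:
        weights[n // 2] = 1.0
        weights[1:n // 2] = 2.0
    else:
        weights[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spectrum * weights)  # negative frequencies zeroed
    return np.abs(analytic)                     # |analytic| = instantaneous amplitude
```

For a pure sinusoid the envelope is flat; in the method above, the prognostic point process is formed from the times of the largest local maxima of this envelope.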
Interactive log-delta analysis using multi-range filtering
Process mining is a family of analytical techniques that extract insights from an event log and present them to an analyst. A key analysis task is to understand the distinctive features of different variants of the process and their impact on process performance. Techniques for log-delta analysis (or variant analysis) put a strong emphasis on automatically extracting explanations for differences between variants. A weakness of these techniques, however, is their limited support for interactively exploring the dividing line between typical and atypical behavior. In this paper, we address this research gap by developing and evaluating an interactive technique for log-delta analysis, which we call InterLog. This technique is based on the idea that the analyst can interactively define filter ranges, and that these filters are used to partition the log L into sub-logs L1 for the selected cases and L2 for the deselected cases. In this way, the analyst can explore the log step by step and manually separate the typical behavior from the atypical. We prototypically implement InterLog and demonstrate its application on a real-world event log. Furthermore, we evaluate it for usefulness and ease of use in a preliminary design study with process mining experts.
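The partitioning idea behind InterLog, splitting log L into sub-logs L1 and L2 by interactively chosen attribute ranges, reduces to a simple predicate over case attributes. A minimal sketch, where the dict-of-attributes case representation is an assumption rather than InterLog's actual data model:

```python
def partition_log(log, filter_ranges):
    """Split an event log into (L1 selected, L2 deselected) sub-logs.

    log: list of cases, each a dict mapping attribute name -> numeric value.
    filter_ranges: dict mapping attribute name -> (low, high), inclusive.
    """
    l1, l2 = [], []
    for case in log:
        selected = all(
            attr in case and low <= case[attr] <= high
            for attr, (low, high) in filter_ranges.items()
        )
        (l1 if selected else l2).append(case)
    return l1, l2
```

Tightening or widening a range and re-partitioning is what lets the analyst probe the dividing line between typical and atypical behavior step by step.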
Risk Assessment of Hydrogen Fuel Cell Electric Vehicles in Tunnels
The need to understand the risks and implications of traffic incidents involving hydrogen fuel cell electric vehicles in tunnels is becoming more important as these vehicles are deployed in greater numbers. A risk analysis was performed to capture potential scenarios that could occur in the event of a crash and provide a quantitative calculation of the probability of each scenario occurring, with a qualitative categorization of possible consequences. The risk analysis was structured using an event sequence diagram with probability distributions on each event, and random sampling was used to estimate resulting probability distributions for each end-state scenario. The most likely consequence of a crash is no additional hazard from the hydrogen fuel (98.1–99.9% probability) beyond the existing hazards in a vehicle crash, although some factors require additional data and study to be validated. These scenarios include minor crashes with no release or ignition of hydrogen. When the hydrogen does ignite, it is most likely a jet flame from the pressure relief device release due to a hydrocarbon fire (0.03–1.8% probability). This work represents a detailed assessment of the state of knowledge of the likelihood associated with various vehicle crash scenarios, used in an event sequence framework with uncertainty propagation to estimate uncertainty around the probability of each scenario occurring.
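The quantitative backbone, random sampling over an event sequence diagram to obtain end-state probabilities, can be sketched as a small Monte Carlo simulation. The branch probabilities below are illustrative placeholders, not the study's estimates:

```python
import random

def simulate_end_states(n=100_000, seed=7):
    """Monte Carlo over a toy crash event sequence; returns end-state frequencies."""
    rng = random.Random(seed)
    counts = {"no_hydrogen_hazard": 0, "jet_flame": 0, "other_release": 0}
    for _ in range(n):
        if rng.random() < 0.99:        # branch 1: no hydrogen release
            counts["no_hydrogen_hazard"] += 1
        elif rng.random() < 0.5:       # branch 2: release ignites at the PRD
            counts["jet_flame"] += 1
        else:                          # branch 3: release without jet flame
            counts["other_release"] += 1
    return {state: c / n for state, c in counts.items()}
```

In the study itself each branch probability is a distribution rather than a point value, so repeated sampling propagates that uncertainty into a probability interval for each end state.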
EHRchitect: An open-source software tool for medical event sequences data extraction from Electronic Health Records
Electronic Health Records (EHR) analysis is pivotal in advancing medical research. Numerous real-world EHR data providers offer data access through exported datasets. While enabling profound research possibilities, exported EHR data require quality control and restructuring before meaningful analysis. Challenges arise in the analysis of medical event sequences (e.g., diagnoses or procedures), which provide critical insights into the progression of conditions, treatments, and outcomes. Identifying causal relationships, patterns, and trends requires a more complex approach to data mining and preparation. This paper introduces EHRchitect, an application written in Python that addresses these quality control challenges by automating dataset transformation: it creates a clean, formatted, and optimized MySQL database (DB) and extracts sequential event data according to the user's study configuration. Event sequences encompass patients' medical events in specified orders and time intervals. The extracted data are delivered as distributed Parquet files incorporating events, event transitions, patient metadata, and event metadata, and the concurrent design scales readily on multi-processor systems. EHRchitect streamlines the processing of large EHR datasets for research purposes, offering a highly flexible framework for configuring event and timeline parameters and delivering temporal characteristics, patient demographics, and event metadata to support comprehensive analysis. By automating data quality control and simplifying event extraction, the tool significantly reduces the time required for dataset acquisition and preparation.
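The central extraction step, ordering a patient's events in time and emitting transitions that fall within a configured interval, can be sketched as follows. The function name and tuple format are assumptions for illustration; EHRchitect's actual configuration options are richer:

```python
from datetime import datetime

def extract_transitions(patient_events, max_gap_days=90):
    """Sort (event, timestamp) pairs and emit (event_a, event_b, gap_days) transitions."""
    ordered = sorted(patient_events, key=lambda pair: pair[1])
    transitions = []
    for (ev_a, t_a), (ev_b, t_b) in zip(ordered, ordered[1:]):
        gap = (t_b - t_a).days
        if gap <= max_gap_days:  # keep only transitions within the study window
            transitions.append((ev_a, ev_b, gap))
    return transitions
```

Aggregating such transitions across patients gives the event-transition tables that the tool writes out alongside patient and event metadata.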
EventFormer: a hierarchical neural point process framework for spatio-temporal clustering events prediction
In real-world scenarios, event data often exhibit inherent randomness, complex historical dependencies, and hierarchical spatio-temporal clustering. However, existing neural point process models typically overlook the hierarchical nature of spatial information, or treat temporal and spatial relevance as separate factors. This oversight results in suboptimal performance when handling data with prominent spatio-temporal clustering features. Additionally, current models have not explicitly considered the interdependencies between event types. To remedy these limitations, we introduce a novel neural point process framework named EventFormer. Leveraging time-oriented and type-oriented multi-head attention modules, along with a Ladder Attention mechanism that progressively refines spatial embeddings across hierarchical levels, EventFormer adeptly captures the nuanced dynamics of event occurrences. Furthermore, EventFormer incorporates a type-aware conditional intensity function to explicitly model interactions between event types, enhancing both predictive accuracy and interpretability. Extensive experiments on real-world datasets demonstrate the outstanding performance of EventFormer in event likelihood modeling and prediction tasks.
The use of operational event sequence diagrams and work domain analysis techniques for the specification of the crewing configuration of a single-pilot commercial aircraft
Aircraft manufacturers and avionics systems suppliers are developing technologies for airliners that will be operated by just a single crew member. An alternative approach to using a large amount of on-board computing proposes the utilisation of extant technology derived from single-seat military aircraft and Uninhabited Air Systems where control is distributed in real time across the aircraft flight deck and ground stations (which supervise several aircraft simultaneously). Using a combination of operational event sequence diagrams and work domain analysis techniques, the allocation of tasks and requirements for the development of supporting technologies for such an operational architecture are identified in a low visibility taxi scenario. These analyses show that many of the functions undertaken by a second pilot in this situation are associated with checking, surveillance and monitoring activities. These must be undertaken either by automated aircraft systems or the monitoring personnel in the ground station. This analytical approach can successfully provide the necessary information underpinning the design requirements for such an aircraft concept.