Catalogue Search | MBRL
Explore the vast range of titles available.
41,372 result(s) for "workflow"
The Log Skeleton Visualizer in ProM 6.9
by Verbeek, H. M. W.
in Workflow
2022
Process discovery is an important area in the field of process mining. To help advance this area, a process discovery contest (PDC) has been set up, which allows us to compare different approaches. At the moment of writing, there have been three instances of the PDC: in 2016, in 2017, and in 2019. This paper introduces the winning contribution to the PDC 2019, called the Log Skeleton Visualizer. This visualizer uses a novel type of process models called log skeletons. In contrast with many workflow net-based discovery techniques, these log skeletons do not rely on the directly follows relation. As a result, log skeletons offer circumstantial information on the event log at hand rather than only sequential information. Using this visualizer, we were able to classify 898 out of 900 traces correctly for the PDC 2019 and to win this contest.
Journal Article
Introducing Microsoft Flow : automating workflows between apps and services
"Use Microsoft Flow in your business to improve productivity through automation with this step-by-step introductory text ... You'll see the prerequisites to get started with this cloud-based service, including how to create a flow and how to use different connectors. [It] takes you through connecting with SharePoint, creating approval flows, and using mobile apps. ... The second half of the book continues with managing connections and gateways, where you'll cover the configuration, creation, and deletion of connectors and how to connect to a data gateway. The final topic is Flow administration and techniques to manage the environment."--Back cover.
Workflow automation and performance improvement based on PostgreSQL
2023
This article discusses the development of an automated information system for improving the efficiency of a cinema. This is achieved by automating the submission of service requests and monitoring the quality and quantity of the solutions to such requests. The system is designed to provide access to the list of services offered and to keep it up to date and optimized; to generate all types of reports; and to give managers a tool that automates most of the routine work of registering the cinema's results.
Journal Article
A Survey of Data-Intensive Scientific Workflow Management
2015
Nowadays, more and more computer-based scientific experiments need to handle massive amounts of data. Their data processing consists of multiple computational steps and the dependencies within them. A data-intensive scientific workflow is useful for modeling such a process. Since the sequential execution of data-intensive scientific workflows may take much time, Scientific Workflow Management Systems (SWfMSs) should enable the parallel execution of data-intensive scientific workflows and exploit the resources distributed in different infrastructures such as grid and cloud. This paper provides a survey of data-intensive scientific workflow management in SWfMSs and their parallelization techniques. Based on an SWfMS functional architecture, we give a comparative analysis of the existing solutions. Finally, we identify research issues for improving the execution of data-intensive scientific workflows in a multisite cloud.
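The group-wise parallel execution of independent workflow steps that SWfMSs perform can be sketched with Python's standard library; the four-step DAG and the step names here are hypothetical, not drawn from the survey:

```python
import graphlib
from concurrent.futures import ThreadPoolExecutor

# Hypothetical workflow DAG: each key depends on the steps in its set.
dag = {
    "clean": {"fetch"},
    "align": {"fetch"},
    "report": {"clean", "align"},
}

results = {}

def run_step(name):
    # Placeholder for the real computation performed by a workflow step.
    results[name] = f"output of {name}"

# TopologicalSorter hands back batches of steps whose dependencies are met;
# steps within a batch are independent, so they can run concurrently.
sorter = graphlib.TopologicalSorter(dag)
sorter.prepare()
with ThreadPoolExecutor() as pool:
    while sorter.is_active():
        ready = sorter.get_ready()
        list(pool.map(run_step, ready))  # run independent steps in parallel
        sorter.done(*ready)
```

A real SWfMS would dispatch each batch to distributed grid or cloud resources rather than to local threads, but the scheduling structure is the same.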
Journal Article
Sustainable data analysis with Snakemake [version 2; peer review: 2 approved]
2021
Data analysis often entails a multitude of heterogeneous steps, from the application of various command line tools to the usage of scripting languages like R or Python for the generation of plots and tables. It is widely recognized that data analyses should ideally be conducted in a reproducible way. Reproducibility enables technical validation and regeneration of results on the original or even new data. However, reproducibility alone is by no means sufficient to deliver an analysis that is of lasting impact (i.e., sustainable) for the field, or even just one research group. We postulate that it is equally important to ensure adaptability and transparency. The former describes the ability to modify the analysis to answer extended or slightly different research questions. The latter describes the ability to understand the analysis in order to judge whether it is not only technically, but methodologically valid.
Here, we analyze the properties needed for a data analysis to become reproducible, adaptable, and transparent. We show how the popular workflow management system Snakemake can be used to guarantee this, and how it enables an ergonomic, combined, unified representation of all steps involved in data analysis, ranging from raw data processing, to quality control and fine-grained, interactive exploration and plotting of final results.
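As a rough illustration of the unified representation the abstract describes, a minimal Snakefile chains raw data to a final result; the file names and shell command below are invented for the example, not taken from the paper:

```python
rule all:
    input:
        "results/summary.tsv"

rule summarize:
    input:
        "data/raw.csv"
    output:
        "results/summary.tsv"
    shell:
        "cut -f1 {input} | sort | uniq -c > {output}"
```

Snakemake infers the execution order from the file dependencies between rules, so adding a plotting or quality-control step is a matter of declaring another rule whose input is an existing output.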
Journal Article
Sapporo: A workflow execution service that encourages the reuse of workflows in various languages in bioinformatics [version 2; peer review: 2 approved, 2 approved with reservations]
by Ishii, Manabu; P. Kinoshita, Bruno; Kodama, Yuichi
in Application programming interface, Bioinformatics, Biology
2022
The increased demand for efficient computation in data analysis encourages researchers in biomedical science to use workflow systems. Workflow systems, or so-called workflow languages, are used for the description and execution of a set of data analysis steps. Workflow systems increase the productivity of researchers, specifically in fields that use high-throughput DNA sequencing applications, where scalable computation is required. As systems have improved the portability of data analysis workflows, research communities are able to share workflows to reduce the cost of building ordinary analysis procedures. However, having multiple workflow systems in a research field has resulted in the distribution of efforts across different workflow system communities. As each workflow system has its unique characteristics, it is not feasible to learn every single system in order to use publicly shared workflows. Thus, we developed Sapporo, an application to provide a unified layer of workflow execution upon the differences of various workflow systems. Sapporo has two components: an application programming interface (API) that receives the request of a workflow run and a browser-based client for the API. The API follows the Workflow Execution Service API standard proposed by the Global Alliance for Genomics and Health. The current implementation supports the execution of workflows in four languages: Common Workflow Language, Workflow Description Language, Snakemake, and Nextflow. With its extensible and scalable design, Sapporo can support the research community in utilizing valuable resources for data analysis.
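A sketch of what a run request to a WES-compliant service such as Sapporo might look like, using field names from the GA4GH Workflow Execution Service standard that the abstract cites; the base URL and the workflow itself are hypothetical placeholders:

```python
import json

# Assumed local service endpoint; adjust to the actual deployment.
wes_base = "http://localhost:1122/ga4gh/wes/v1"

# Form fields for submitting a run, per the WES standard.
run_request = {
    "workflow_url": "https://example.org/workflows/qc.cwl",  # hypothetical
    "workflow_type": "CWL",
    "workflow_type_version": "v1.0",
    "workflow_params": json.dumps({"input_fastq": "sample.fastq.gz"}),
}

# With the `requests` library, one would submit and poll roughly like this:
#   resp = requests.post(f"{wes_base}/runs", data=run_request)
#   run_id = resp.json()["run_id"]
#   state = requests.get(f"{wes_base}/runs/{run_id}/status").json()["state"]
```

Because the API layer is standardized, the same request shape works whether the underlying engine runs CWL, WDL, Snakemake, or Nextflow.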
Journal Article