Hadoop in the datacentre is a popular analytical platform for enterprises. Cloud vendors host Hadoop clusters in the datacentre to provide high-performance analytical computing facilities to their customers, who demand a parallel programming model to deal with huge volumes of data. Effective cost/time management and judicious resource consumption among concurrent users must be the primary concern, without which...
In a Cloud Computing environment, the e-assessment process becomes an orchestration of a set of dedicated cloud services. In this paper, we propose an architecture for an e-assessment environment based on cloud services. This environment implements an approach that we have proposed for developing a generic e-assessment process that is adapted to a learner profile. The e-assessment process activities...
A workflow management system (WfMS) should support self-managed non-functional attributes so that it remains resilient to changes in its runtime environment in the Business Process Management (BPM) domain. With this motivation, we propose a resilience mechanism for WfMSs, combined with the corresponding methods proposed in our earlier research work, to achieve our final aim,...
The need to apply complex algorithms to large volumes of data is boosting the development of technological solutions able to satisfy big data analytics needs in Cloud and HPC environments. In this context, Ophidia is a big data analytics framework for eScience that offers a cross-domain solution for managing scientific, multi-dimensional data. It also exploits an in-memory-based distributed data...
We describe the use of our in-transit workflow infrastructure to run an ensemble of HYDRA [1] [2] Inertial Confinement Fusion (ICF) simulations in support of experiments conducted using the National Ignition Facility (NIF) laser. We discuss how our approach can be used to gain deeper insight into NIF experiments. We ran over 60,000 2D HYDRA simulations and generated over a billion synthetic x-ray images...
Task-Based Access Control models have recently been progressing in workflow management systems. This paper describes an extension of the well-known secure role-based workflow model (RBWM), which allows flexible management of permissions to system objects through tasks. However, RBWM does not address the issues of versioning of securable objects and dependence on the state of linked...
In experimental research using computation, a workflow is a sequence of steps involving data processing or analysis, where the output of one step may be used as the input of another. The processing steps may involve user-supplied parameters that, when modified, result in a new version of the input to the downstream steps, in turn generating new versions of their own output. As more experimentation...
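The versioning behaviour this abstract describes can be illustrated with a minimal, hypothetical sketch (not the system from the paper): derive a content-addressed version id from a step's input versions and parameters, so that changing one parameter propagates new versions to downstream steps.

```python
import hashlib
import json

def version_id(inputs, params):
    """Content-addressed version id from input versions and parameters."""
    payload = json.dumps({"inputs": sorted(inputs), "params": params},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

def run_step(name, inputs, params):
    """Pretend to run a step; return the version id of its output."""
    return f"{name}-{version_id(inputs, params)}"

# Two-step pipeline: changing a parameter of step A yields a new
# version of A's output and, transitively, of B's output.
a_v1 = run_step("A", inputs=[], params={"threshold": 0.5})
b_v1 = run_step("B", inputs=[a_v1], params={"bins": 10})

a_v2 = run_step("A", inputs=[], params={"threshold": 0.7})  # parameter changed
b_v2 = run_step("B", inputs=[a_v2], params={"bins": 10})    # same params, new input
```

Because the id is derived deterministically, re-running an unchanged step reproduces the same version, which is what lets a system skip unchanged upstream work.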
A fully automatic translation of unified modeling language (UML) models to complete source code has not been reported so far, because some implementation details are not present in the model, a single UML model is not enough for complete code generation, or some model elements may not be directly convertible to source code. These issues are addressed in this study. The authors take workflow modelling...
Examination affairs management is a bottleneck of the examination management information system because of its complex process, many participants and its confidentiality requirements. To solve this problem, an automated management information system for examination affairs is presented in this paper. In the system, the examination affairs are controlled by a form-based workflow, and its security...
In the scientific community, one of the most vital challenges is the reproducibility of a workflow execution. The necessary parameters of the execution (we call them descriptors) can be external, depending for example on the computing infrastructure (grids, clusters and clouds) or on third-party resources, or internal, belonging to the code of the workflow, such as variables. Consequently,...
In almost all research fields, scientific studies can be implemented as in silico experiments. They are modelled by scientific workflows, which describe the data or control flow between consecutive computational tasks. Since these experiments are data- and compute-intensive, they need parallel and distributed infrastructures (grids, clusters, clouds and supercomputers) to be enacted. The complexity...
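The data/control flow between consecutive tasks that such workflows describe can be sketched as a small dependency graph; the example below (illustrative only, with hypothetical task names, using Python's standard library) enacts tasks in an order that respects their dependencies.

```python
from graphlib import TopologicalSorter

# Task graph: each task maps to the set of tasks it depends on.
deps = {
    "fetch":   set(),
    "clean":   {"fetch"},
    "analyze": {"clean"},
    "plot":    {"analyze"},
    "report":  {"analyze", "plot"},
}

def enact(deps):
    """Return an execution order respecting every dependency edge."""
    return list(TopologicalSorter(deps).static_order())

order = enact(deps)
```

On a parallel infrastructure, tasks whose dependencies are already satisfied could run concurrently instead of in this single linear order.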
There are many software tools designed to support research activities such as data analysis and in silico experiments in the biotechnology field. This paper presents an adaptation of the BioVel portal, called "Laboratorio Virtual de Biotecnología" (LVB), that can be used to introduce students to scientific workflows. The paper reports the preliminary results of a usability case study carried...
This paper proposes the system design of a real-time collaboration platform created as an alternative solution for producing e-books using the collaborative writing framework by Lowry et al. The system is designed to assist a group of authors in collaboratively writing, editing, reviewing, and revising a document as a digital book across multiple devices, interacting in real time over the internet...
Based on the state of the art of process mining, we can conclude that quality characteristics (failure rate metrics or loops) are poorly represented or absent in most predictive models found in the literature. The main goal of the present research work is to analyze how to learn a prediction model that defines failure as the response variable. A model of this type can be used for active real-time control...
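As a toy illustration of failure as a quality characteristic in event logs (not the paper's prediction model), one can compute per-activity failure rates from traces; a predictive model with failure as the response variable would consume features like these.

```python
from collections import Counter

def failure_rates(traces):
    """Per-activity failure rate: failed executions / total executions."""
    runs, fails = Counter(), Counter()
    for trace in traces:
        for activity, failed in trace:
            runs[activity] += 1
            if failed:
                fails[activity] += 1
    return {a: fails[a] / runs[a] for a in runs}

# Hypothetical event log: each trace is a list of (activity, failed?) events.
log = [
    [("register", False), ("check", True)],
    [("register", False), ("check", False)],
    [("register", False), ("check", True), ("repair", False)],
]
rates = failure_rates(log)
```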
The reproducibility of an in-silico experiment is a great challenge because of the parallel and distributed environment and the complexity of scientific workflows. To solve such problems, on the one hand, provenance data has to be captured about the dataflow, the ancestry of the results and the execution environment; on the other hand, description data has to be collected from the scientist...
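A minimal sketch of the first requirement, capturing provenance about a task's inputs, execution environment and result, might look like the following (illustrative; the record fields are assumptions, not the schema of any particular system):

```python
import platform
import time

def run_with_provenance(task, name, *args, **kwargs):
    """Run a task and capture a provenance record alongside its output."""
    record = {
        "task": name,
        "inputs": {"args": args, "kwargs": kwargs},
        "environment": {                      # where the task actually ran
            "python": platform.python_version(),
            "machine": platform.machine(),
        },
        "started": time.time(),
    }
    record["output"] = task(*args, **kwargs)  # the result's ancestry is its inputs
    record["finished"] = time.time()
    return record

rec = run_with_provenance(lambda x, y: x + y, "add", 2, 3)
```

Persisting such records for every task gives the dataflow and result ancestry needed to re-enact the experiment later.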
Scientific workflows (SWFs) are commonly used to model and execute large-scale scientific experiments. From the scientist's perspective, the workflow execution is a black box: the scientist submits the workflow and, at the end, the result or a notification of failed completion is returned. For long-running experiments, or when workflows are in an experimental phase, it may not be acceptable...
Workflows are an established IT concept to achieve business goals in a reliable and robust manner. However, the dynamic nature of modern information systems, the upcoming Industry 4.0, and the Internet of Things increase the complexity of modeling robust workflows significantly as various kinds of situations, such as the failure of a production system, have to be considered explicitly. Consequently,...
A large and diverse group of computational scientific research efforts deals with parameterized studies, in which the same or similar computational tools are applied to different sets of data. Such uniform and well-defined analysis efforts can be encapsulated as parameter-sweep workflows. Due to their computation- and data-intensive nature, resources that span multiple domains may be needed for timely...
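A parameter-sweep workflow in this sense applies one tool to every combination of parameter values; a minimal Python sketch (the tool and parameter names are hypothetical):

```python
from itertools import product

def sweep(tool, grid):
    """Apply the same tool to every combination of parameter values."""
    keys = sorted(grid)
    results = []
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        results.append((params, tool(**params)))
    return results

# Hypothetical tool whose output depends on two swept parameters.
results = sweep(lambda alpha, n: alpha * n,
                {"alpha": [0.1, 0.2], "n": [10, 20]})
```

Each (params, result) pair is independent of the others, which is exactly why such sweeps parallelize well across multiple resource domains.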
Scientific workflow management systems (SWFMSs) are facing unprecedented challenges from the big data deluge. As revising all existing workflow applications to fit the Cloud computing paradigm is impractical, migrating SWFMSs into the Cloud to leverage the functionalities of both Cloud computing and SWFMSs may provide a viable approach to big data processing. In this paper, we first discuss...
Scientific workflow systems aim to provide user-friendly, end-to-end solutions for automating and simplifying computational or data-intensive tasks. A number of workflow environments have been developed in recent years to support the specification and execution of scientific workflows. Conventional static workflows cope poorly with the ever-changing status of existing distributed systems...