Scientific simulations on high performance computing (HPC) platforms generate large quantities of data. To bridge the widening gap between compute and I/O, and enable data to be more efficiently stored and analyzed, simulation outputs need to be refactored, reduced, and appropriately mapped to storage tiers. However, a systematic solution to support these steps has been lacking on the current HPC...
Model-Based Testing raises project teams' hopes of meeting tight time and budget constraints while achieving better system quality through thorough testing. However, the toolchain and the method impose a certain skill set on the project engineer. This paper presents a possible way forward and introduces the resulting constraints on the system architecture.
In situ workflows contain tasks that exchange messages composed of several data fields. However, a consumer task may not necessarily need all the data fields from its producer. For example, a molecular dynamics simulation can produce atom positions, velocities, and forces; but some analyses require only atom positions. The user should decide whether to specialize the output of a producer task for...
Applications in computer network security, social media analysis, and other areas rely on analyzing a changing environment. The data is rich in relationships and lends itself to graph analysis. Traditional static graph analysis cannot keep pace with network security applications analyzing nearly one million events per second and social networks like Facebook collecting 500 thousand comments per second...
The pursuit of innovation in the mobile software industry frequently requires coming up with new features, and not just any features, but surprising and unexpectedly delightful ones. Despite the potential of context awareness to make a system delightful, current requirements elicitation techniques do not cope with an essential aspect: comprehension of the relationships among the numerous...
In high-performance computing (HPC), end-to-end workflows are typically used to gain insights from scientific simulations. An end-to-end workflow consists of a scientific simulation and data analysis, and can be executed in situ, in transit, or offline. Existing studies on end-to-end workflows have largely focused on high-performance execution approaches. However, the emerging heterogeneous...
Decision making is now largely data-driven, which gives Data Warehousing a crucial role: it provides a reliable and efficient tool to meet analytical needs and supplies end users and decision makers with useful insights. However, several surveys show that a significant percentage of Data Warehouse (DW) projects fail to meet their business goals. This is mainly due to the lack of any...
Modeling and Simulation (M&S) has become an essential tool in the development, testing, and verification of operational software in complex multi-domain, multi-threaded, heterogeneous system-of-systems environments. The complex systems of today encompass a mix of hardware sub-systems (with varying degrees of capability) and software environments (comprising a plethora of development environments, operating...
Design models are widespread as core artifacts in software engineering. Yet a key problem is how to correctly fulfill these blueprint specifications when code components are developed. The best possible scenario occurs when a source modeling language can be perfectly linked to a chosen target language; namely, a well-defined mapping bridges the gap between the source and the target language...
Modeling helps explain the fundamental physics hidden behind experimental data. In the case of material modeling, running one simulation rarely results in output that reproduces the experimental data. Often one or more of the force field parameters are not precisely known and must be optimized for the output to match that of the experiment. Since the simulations require high performance computing...
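The calibration loop described above can be sketched in a minimal form. Everything here is illustrative: the "simulation" is a stand-in analytic Lennard-Jones-style energy curve rather than an HPC run, the single parameter `epsilon` plays the role of an unknown force-field parameter, and the optimizer is a simple golden-section search over a least-squares loss.

```python
# Hypothetical sketch of force-field parameter calibration: adjust a
# parameter until simulated output matches "experimental" data.

def simulate(epsilon, distances):
    # Stand-in for an expensive MD run: a Lennard-Jones-like energy curve,
    # linear in the well-depth parameter epsilon (sigma fixed at 1.0).
    sigma = 1.0
    return [4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
            for r in distances]

def loss(epsilon, distances, observed):
    # Sum-of-squares mismatch between simulation output and observations.
    predicted = simulate(epsilon, distances)
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

def calibrate(distances, observed, lo=0.1, hi=5.0, iters=60):
    # Golden-section search over the single unknown parameter epsilon;
    # the loss is a convex quadratic in epsilon, so this converges.
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c = b - phi * (b - a)
        d = a + phi * (b - a)
        if loss(c, distances, observed) < loss(d, distances, observed):
            b = d
        else:
            a = c
    return (a + b) / 2

distances = [0.9, 1.0, 1.1, 1.3, 1.6, 2.0]
true_epsilon = 2.5
observed = simulate(true_epsilon, distances)   # pretend experimental data
fit = calibrate(distances, observed)
```

In a real workflow each `loss` evaluation would launch a batch of HPC simulations, which is why derivative-free, few-evaluation optimizers are attractive for this kind of matching.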
This work is devoted to determining the relationship between two different approaches to wireless link quality assessment: 1) using the continuous error-rate parameter of the bit flow — the bit error rate (BER); and 2) using the quality parameters defined in the ITU-T G.826 standard (12/2002) for digital paths — errored block (EB), errored second (ES), severely errored second (SES), and the respective ratio parameters...
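One simple bridge between the two approaches (not taken from this paper, but a standard first-order model) assumes independent bit errors: a block of n bits is then errored with probability 1 - (1 - BER)^n. The block size and BER below are illustrative.

```python
# Hedged sketch: map BER to a G.826-style errored-block probability
# under the assumption of independent, uniformly distributed bit errors.

def errored_block_ratio(ber, bits_per_block):
    """Probability that a block of `bits_per_block` bits contains at
    least one bit error, given independent errors at rate `ber`."""
    return 1.0 - (1.0 - ber) ** bits_per_block

# Example: BER of 1e-6 with 8000-bit blocks.
ebr = errored_block_ratio(1e-6, 8000)
```

Real links exhibit error bursts, which violate the independence assumption; that gap is precisely why block-based G.826 parameters and BER can disagree in practice.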
The purpose of this work is to establish a numerical framework capable of generating simulation data with the same statistical properties as the represented electrical reverberation chamber (ERC). While an ERC whose resonating volume is geometrically varied produces randomly distributed fields, a numerical simulation is a deterministic system. However, the statistical distributions of electrical fields...
A potential formalization of a factor management and assessment algorithm, an automated modeling system architecture, and a performance evaluation of major infrastructural transport and logistics projects and processes are suggested, based on numerical and analytical methods of the digital economy, qualitative (verbal) and quantitative assessment indicators and criteria, an economic logistics model, and efficient...
This paper studies a statistical dataset describing submissions to municipalities in the Czech Republic. The dataset contains five submission-specific subgroups as interdependent time series. The research purpose is to build a suitable model for describing the process. In this work, autoregressive and vector autoregressive models are used to fit the data. The obtained results proved to...
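The univariate building block of such a model can be sketched briefly. This is not the paper's dataset or its vector model: it fits only an AR(1) process, x_t = c + a·x_{t-1} + noise, by ordinary least squares on a synthetic noise-free series, so the coefficients are recovered exactly.

```python
# Illustrative sketch: fit an AR(1) model by ordinary least squares.

def fit_ar1(series):
    x = series[:-1]          # lagged values x_{t-1}
    y = series[1:]           # current values x_t
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var            # AR coefficient (OLS slope)
    c = my - a * mx          # intercept
    return c, a

# Synthetic series generated with c = 1.0, a = 0.5 and no noise.
series = [5.0]
for _ in range(50):
    series.append(1.0 + 0.5 * series[-1])
c, a = fit_ar1(series)
```

A vector autoregressive (VAR) model generalizes this by replacing the scalar coefficient with a matrix, so each of the five subgroup series can depend on the lagged values of all the others.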
Objectives. The article proposes a modification of the quantitative risk assessment method (Monte Carlo simulation) that takes into account the multifactorial relationships between the key parameters and the risk factors of an investment project, making it possible to obtain a more relevant and efficient sample of simulations. Methods. The investigations described in the article are...
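The core idea of sampling interrelated risk factors can be sketched as follows. This is not the article's model: the two factors (price and demand), their volatilities, the correlation value, and the NPV formula are all hypothetical, and the correlation is induced with a 2x2 Cholesky construction.

```python
# Hedged sketch: Monte Carlo estimate of expected NPV where two risk
# factors are drawn with a given correlation rather than independently.

import random

def correlated_normals(rho, rng):
    # 2x2 Cholesky construction: z2' = rho*z1 + sqrt(1 - rho^2)*z2.
    z1 = rng.gauss(0, 1)
    z2 = rng.gauss(0, 1)
    return z1, rho * z1 + (1 - rho ** 2) ** 0.5 * z2

def simulate_npv(n=100_000, rho=0.6, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        eps_price, eps_demand = correlated_normals(rho, rng)
        price = 100 * (1 + 0.10 * eps_price)     # 10% price volatility
        demand = 1000 * (1 + 0.20 * eps_demand)  # 20% demand volatility
        total += price * demand - 95_000         # revenue minus fixed cost
    return total / n

mean_npv = simulate_npv()
```

Sampling the factors independently would miss the covariance term in E[price x demand], which is exactly the kind of bias a multifactorial-relationship correction is meant to remove.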
An approach to the development of an intelligent modeling system is proposed in this paper. Such a system should be used for the identification, classification, and determination of the motion parameters of marine facilities. The modeling system structure, a conceptual model of the modeling system, and the intelligent methods used for data processing are described. An ontology model is used for the modeling system's data...
This paper proposes a principle of classifier construction for software reliability models and assessment techniques. The classifier is built using a facet-hierarchical approach and allows systematizing scientific publications in the field of software reliability. Publications spanning more than 50 years have been analyzed and classified using a hierarchy of software reliability attributes...
In the practice of time series analysis there is a specific problem: how to estimate the tendencies of nonlinear dynamics evolution when several periodic processes alternate (so-called intermittency phenomena). Such alternations relate to bifurcations, but the particular regularities and mechanisms of intermittency remain insufficiently understood due to the high sensitivity of these phenomena...
In this paper, water quality is explored using fuzzy cognitive analysis. Silov's fuzzy cognitive maps are used as the tool for analysis and modeling. The proposed method is based on the iterative calculation and analysis of the system indicators of Silov's fuzzy cognitive maps over a certain period of time. A fuzzy cognitive map is presented, its system parameters are calculated, and the analysis of the...
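The iterative calculation behind a fuzzy cognitive map can be sketched generically. The three concepts, the weight matrix, and the sigmoid steepness below are invented for illustration and are not taken from Silov's maps or this paper's water-quality model; the update rule is the common form x_{t+1}[i] = f(sum_j W[j][i] * x_t[j]).

```python
# Illustrative sketch of the iterative fuzzy cognitive map (FCM) update.

import math

def sigmoid(v, lam=1.0):
    # Squashing function keeping every concept value in (0, 1).
    return 1.0 / (1.0 + math.exp(-lam * v))

def fcm_step(state, weights, lam=1.0):
    # weights[j][i] is the causal influence of concept j on concept i.
    n = len(state)
    return [sigmoid(sum(weights[j][i] * state[j] for j in range(n)), lam)
            for i in range(n)]

def fcm_run(state, weights, steps=50, tol=1e-9):
    # Iterate until the map settles into a fixed point (or steps run out).
    for _ in range(steps):
        nxt = fcm_step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Toy 3-concept map (hypothetical): pollution suppresses oxygen,
# promotes algae; algae feed back weakly on pollution.
W = [[0.0, -0.8,  0.6],
     [0.0,  0.0, -0.4],
     [0.3,  0.0,  0.0]]
final = fcm_run([0.7, 0.5, 0.2], W)
```

The "system indicators" analyzed in such methods (e.g. consonance and influence measures) are then computed from the weight matrix and the settled concept values.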
Verifying attacks against cyber physical systems can be a costly and time-consuming process. By using a simulated environment, attacks can be verified quickly and accurately. By combining the simulation of a cyber physical system with a hybrid attack graph, the effects of a series of exploits can be accurately analysed. Furthermore, the use of a simulated environment to verify attacks may uncover...