The paper presents a mathematical model, based on a modified particle-dynamics method, describing the movement of different kinds of avalanche snow mass and its interaction with obstacles. The authors then introduce their algorithm for calculating avalanche impact on buildings and structures; this algorithm underlies a computer program that allows the user to set the basic parameters of a building,...
The article substantiates a scientific and methodological approach to modeling the dynamics of tourist flows for one of the regions of the Southern Federal District. The factors affecting the tourism industry are identified and analyzed. Cyclic dependencies determined by internal industry patterns are investigated. The proposed approach was tested on the region.
A new method of constructing a nonparametric dynamic model of the human oculomotor system on the basis of experimental input–output data is developed, taking into account the nonlinear and inertial properties of the rectus muscles of the eye. Eye movement is tracked using a video-based technique. It is possible to determine the dynamic characteristics of the oculomotor system, such as its transition...
Modeling and Simulation (M&S) has become an essential tool in the development, testing, and verification of operational software in a complex multi-domain, multi-threaded, heterogeneous system-of-systems environment. The complex systems of today encompass a mix of hardware sub-systems (with varying degrees of capability) and software environments (comprising a plethora of development environments, operating...
Modeling helps explain the fundamental physics hidden behind experimental data. In the case of material modeling, running one simulation rarely results in output that reproduces the experimental data. Often one or more of the force field parameters are not precisely known and must be optimized for the output to match that of the experiment. Since the simulations require high performance computing...
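The parameter-tuning loop this abstract alludes to can be sketched with a toy stand-in for the simulation. Everything here (the harmonic "simulation", the single parameter `k`, the golden-section search) is an illustrative assumption, not the authors' actual force-field workflow:

```python
# Hypothetical illustration: tune one force-field parameter so a toy
# model's output matches "experimental" data by minimizing squared error.

def simulate(k, xs):
    # Toy surrogate for an expensive simulation: harmonic energies E = k * x^2.
    return [k * x * x for x in xs]

def sse(k, xs, observed):
    # Sum of squared errors between simulated and observed values.
    return sum((s - o) ** 2 for s, o in zip(simulate(k, xs), observed))

def fit_parameter(xs, observed, lo=0.0, hi=10.0, iters=60):
    # Golden-section search over the single unknown parameter k;
    # sse(k) is convex in k for this toy model, so the search converges.
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if sse(c, xs, observed) < sse(d, xs, observed):
            b = d
        else:
            a = c
    return (a + b) / 2

xs = [0.5, 1.0, 1.5, 2.0]
observed = simulate(2.5, xs)     # pretend the experiment measured these
k_fit = fit_parameter(xs, observed)
print(round(k_fit, 3))
```

In a real setting each `simulate` call is an expensive HPC job, which is why the abstract stresses high performance computing: the optimizer's cost is dominated by the number of simulations it requests.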
Batch and stream processing represent the two main approaches implemented by big data systems such as Apache Spark and Apache Flink. Although only streaming applications are intended to satisfy real-time requirements, both approaches must meet certain response-time constraints. In addition, cluster architectures continuously expand, and computing resources represent significant investments and expenses...
Solving inverse problems is central in geosciences and remote sensing. Very often a mechanistic physical model of the system exists that solves the forward problem. Inverting the implied radiative transfer model (RTM) equations numerically, however, poses challenging and computationally demanding problems. Statistical models tackle the inverse problem and predict the biophysical parameter of interest...
A new method of building models of technological processes is presented in the article. The method allows modeling to be performed in a dynamic mode and unites several types of modeling to describe the same process. The method developed by the authors has been effectively applied in the design of modern automated control systems to optimize the load on controllers, the algorithm management...
The problem of monitoring the current technical condition of various vehicle subsystems from operational data is a subject matter for many modern armies. Given the requirements on the reliability of military vehicles and their failure-free operation during a mission, it would be advantageous to have a tool that enables quick diagnostics of a vehicle's most important components. The present article shows one of...
The design and optimization of complex systems implies the availability of efficient models of each single device that is part of the system. Among these, hysteretic devices can become a bottleneck of an optimization-by-simulation process, due to the complexity of the models available in the technical literature: the definition of a computationally efficient model is an aspect that can play a central...
Energy efficiency in high performance computing (HPC) systems is a relevant issue nowadays, and it is approached from multiple angles and components (network, I/O, resource management, etc.). The HPC industry has turned its focus towards embedded and low-power computational infrastructures (based on RISC-architecture processors) to improve energy efficiency; therefore, we use an ARM-based cluster, known as millicluster,...
Microstructural information plays a key role in governing the dominant physics for various applications involving fracture networks. Resolving the interactions of thousands of interconnected sub-micron scale fractures is computationally intensive, and is intractable with current technologies. Coarsening of the domain and simplification of the physics are two commonly used workarounds, but these methods...
We present a fully automated method to predict the full radiation patterns and S-parameters of antennas over a large frequency bandwidth using the knowledge of the simulated results at a few frequency points. The frequency points are adaptively selected in regions of fast variations of radiation pattern and S-parameters, as well as in large unsampled regions. The method has a built-in absolute error...
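The adaptive selection strategy described here, refining where the response varies fastest and where sampling gaps are largest, can be sketched with a toy solver. The resonance model, the scoring rule, and the weight `w` are all illustrative assumptions, not the paper's actual error estimator:

```python
# Illustrative sketch: adaptively choose frequency points for an
# expensive solver, refining intervals with fast response variation
# (first score term) and large unsampled gaps (second score term).

def simulate_s11(f):
    # Stand-in for a full-wave solver: a sharp resonance near f = 2.4.
    return 1.0 / (1.0 + 100.0 * (f - 2.4) ** 2)

def adaptive_sample(f_lo, f_hi, budget=15, w=0.5):
    pts = {f: simulate_s11(f) for f in (f_lo, 0.5 * (f_lo + f_hi), f_hi)}
    while len(pts) < budget:
        fs = sorted(pts)
        def score(i):
            a, b = fs[i], fs[i + 1]
            # Variation of the response across the interval + interval width.
            return abs(pts[b] - pts[a]) + w * (b - a)
        i = max(range(len(fs) - 1), key=score)
        mid = 0.5 * (fs[i] + fs[i + 1])
        pts[mid] = simulate_s11(mid)   # one extra solver run per refinement
    return pts

pts = adaptive_sample(1.0, 4.0)
# Points should cluster near the 2.4 resonance while still covering the band.
near = sum(1 for f in pts if abs(f - 2.4) < 0.4)
print(len(pts), near)
```

The practical payoff is the same as in the abstract: a fixed budget of solver runs is spent where the S-parameters change quickly, while the width term keeps large regions from going unsampled.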
The employment of five distinct benchmarks on the Distributed Environment for Academic Computing (DEAC) Cluster at Wake Forest University provides meaningful metrics of cluster processor and memory performance. Given the heterogeneous nature of the DEAC Cluster, the benchmarks taken consider the specific processor architectures comprising the cluster. The data obtained will be assessed via two modeling...
Dagger is a modeling and visualization framework that addresses the challenge of representing knowledge and information for decision-makers, enabling them to better comprehend the operational context of network security data. It allows users to answer critical questions such as “Given that I care about mission X, is there any reason I should be worried about what is going on in cyberspace?” or “If...
With the progressive development of information and communication technologies, we are now forming a new world called the hyperworld, composed of the cyber world and the physical world, shaped by various digital explosions of data, connectivity, services, and intelligence. Accordingly, Cyber-I has been proposed: a real individual's counterpart in cyberspace, intended to create a unique, digital,...
In this paper, we present the "Slow Start Problem" in participatory sensing applications where a service is provided based on data collected by participants. The slow start problem refers to the initial stage in participatory sensing service deployment, during which service adoption remains sparse and, hence, the collected data does not offer adequate coverage. Predictive models, learned...
With the increasing number of deaths from cardiovascular disease, it is important to study ECG signals at meridian acupoints in order to develop new alternative and complementary therapies for chronic cardiovascular diseases. Therefore, an ECG measurement experiment at acupoints of the human meridian is first carried out to obtain information-transmission data of the meridian system. Then, according to...
"Cold Start" in participatory sensing applications refers to the initial stage in service deployment, during which service adoption remains sparse and, hence, the collected data does not offer adequate coverage. Predictive models, learned from data, offer a way to generalize from sparse observations, but the models themselves need to be statistically reliable to offer a reliable service...
The paper describes the modeling design of data security in Cloud Computing. Data security can be defined as the maintenance of the confidence and integrity of data processed by an organization. The paper discusses dealing with data security at all layers of cloud computing. Standard cloud storage uses a three-level data security model in cloud computing, which can be extended by a fourth level responsible...