Exploiting the continuity and monotonicity of industrial real-time data, an autoregression compression method (ARCM for short) is proposed. First, an autoregression model is fitted to a group of sampled data. Second, the next sample is predicted by the model. If the error between the actual data and the predicted data is within the allowable range, we save the parameters of...
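The prediction-gated step this abstract describes can be sketched in Python. This is only an illustrative reading, not the paper's implementation; the window size, model order p, and tolerance tol are hypothetical parameters.

    import numpy as np

    def ar_fit(window, p=2):
        # Least-squares fit of an AR(p) model to a 1-D window of samples.
        X = np.column_stack([window[i:len(window) - p + i] for i in range(p)])
        y = window[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef

    def arcm_compress(samples, window=16, p=2, tol=0.5):
        # Hypothetical ARCM sketch: while the one-step prediction error
        # stays within tol, only the model parameters are kept; otherwise
        # the raw sample is stored and the model is refitted.
        history = list(samples[:window])
        stored = [('raw', x) for x in history]
        coef = ar_fit(np.asarray(history), p)
        stored.append(('model', coef))
        for x in samples[window:]:
            pred = float(np.dot(np.asarray(history[-p:]), coef))
            if abs(x - pred) <= tol:
                history.append(pred)       # reconstructable from the model
            else:
                stored.append(('raw', x))  # prediction failed: keep the sample
                history.append(x)
                coef = ar_fit(np.asarray(history[-window:]), p)
                stored.append(('model', coef))
        return stored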
Data centers consume a significant amount of energy in the US and worldwide, much of which is consumed by the cooling infrastructure, particularly the chiller plant and the computer room air conditioners and air handlers. To enable energy-efficient data center designs, ASHRAE added two new IT environmental classes, A3 and A4, with associated allowable inlet air temperatures of 40°C and 45°C, respectively...
Socially aware services often have a large user base, and user data have to be partitioned and replicated over multiple geographically distributed clouds. Choosing in which cloud to place data, however, is difficult. Effective data placements entail meeting multiple system objectives, including reducing the usage of cloud resources, providing good service quality to users, and even minimizing the...
Recent years have seen rapid growth in data storage, magnifying the importance of ensuring data safety by performing regular backups. However, the traffic created by such backups can be a significant burden on the underlying communication network. In the present paper we study the tradeoff between frequent backups (increased safety) and a reduced network peak load, addressing the problem of shifting...
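As one toy reading of this scheduling problem (not the paper's algorithm), a greedy heuristic can shift backup jobs into the time slots that currently carry the least load, keeping the aggregate peak low; the job sizes and per-slot base loads below are made-up inputs.

    def schedule_backups(job_sizes, base_load):
        # Greedy largest-job-first heuristic: place each backup job in the
        # time slot with the smallest current load.
        load = list(base_load)
        assignment = {}
        for job, size in sorted(enumerate(job_sizes), key=lambda j: -j[1]):
            slot = min(range(len(load)), key=load.__getitem__)
            load[slot] += size
            assignment[job] = slot
        return assignment, max(load)  # placements and resulting peak load

    # Example: three backups of sizes 5, 3, 2 over four hourly slots.
    print(schedule_backups([5, 3, 2], base_load=[1, 0, 4, 0]))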
Out-of-sample fusion is a computational method in which real data are fused with independent, computer-generated data. The method hinges on a density ratio model whereby the distributions of the real and the generated data are related. The method is applied to the interval estimation of very small binomial proportions from moderately large samples.
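For readers unfamiliar with density ratio models, a common two-sample form (an assumed reading, with a tilt function h chosen by the analyst, e.g. h(x) = x) links the density g of the real data to the density g_0 of the generated data by an exponential tilt:

    g(x) = \exp\{\alpha + \beta^{\top} h(x)\}\, g_0(x)

Estimating the parameters (α, β) from the combined sample then calibrates the generated data against the real data.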
Supervisory control of complex teams of autonomous systems may itself require levels of autonomous decision making. A single human operator, or a small number of operators, attempting to maintain situational awareness of and control over a large number of autonomous units may require automated assistance in overseeing such a team. This assistance may range from attention management services to outright...
Parallel Secondo scales up the capability of processing extensible data models in Secondo. It combines Hadoop with a set of Secondo databases, providing almost all existing Secondo data types and operators. It is therefore possible for the user to convert large-scale sequential queries into parallel queries without learning the MapReduce programming details. This paper demonstrates such a procedure...
Government and commercial customers are increasingly interested in robust, reusable flight software (FSW). For many spacecraft, the Attitude Determination and Control Subsystem (ADCS) contributes a significant portion of the FSW. Thus, refinements to ADCS code pay dividends in code development and reuse. Sierra Nevada Corporation (SNC) has recently developed an ADCS model and code set that follows model-based...
Over the past years, large amounts of structured and unstructured data have been collected from various sources. These huge amounts of data are difficult to handle on a single machine, which requires the work to be distributed across a large number of computers. Hadoop is one such distributed framework; it processes data in a distributed manner using the MapReduce programming model. In order for MapReduce...
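As a concrete illustration of the MapReduce model, the classic word-count job can run on Hadoop via Hadoop Streaming with two small Python scripts (the file names mapper.py and reducer.py are our own choice); Hadoop sorts the mapper output by key before it reaches the reducer.

    # mapper.py -- emit (word, 1) for every word read from stdin
    import sys
    for line in sys.stdin:
        for word in line.split():
            print(word + "\t1")

    # reducer.py -- sum the counts per word (input arrives sorted by key)
    import sys
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current and current is not None:
            print(current + "\t" + str(count))
            count = 0
        current = word
        count += int(n)
    if current is not None:
        print(current + "\t" + str(count))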
The music industry is one of the traditional industries deeply influenced by the new computing strategies for big data. This paper aims to discuss a new solution for digital music dissemination in the big data era. By analyzing the features of big data, the key points of digital music dissemination, and the participants in this industry, we put forward a dissemination model for digital music and...
Conceptual modeling is the abstraction of a simulation model from the real-world system being modeled; in other words, choosing what to model and what not to model. This is generally agreed to be the most difficult, least understood, and most important task in a simulation study. In this tutorial the problem of conceptual modeling is first illustrated through an example of...
Model verification and validation are defined, and the reasons they are important are discussed. The three approaches to deciding model validity are described. A graphical paradigm showing how verification and validation relate to the model development process is presented, along with a flowchart showing where verification and validation fit within that process...
Data mining is the process of analyzing data from different viewpoints and summarizing it into useful information. Clustering is the process of grouping similar objects together. Such a group of objects is called a cluster; its objects are more similar to one another than to the objects of other clusters. Different clustering algorithms can be used according to the behavior of the data. The farthest-first algorithm...
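A minimal sketch of the farthest-first traversal (assuming numeric feature vectors; k and the seed below are arbitrary choices): pick one center, repeatedly add the point farthest from all centers chosen so far, then assign each point to its nearest center.

    import numpy as np

    def farthest_first(points, k, seed=0):
        # points: (n, d) array; returns k centers and a cluster label per point.
        rng = np.random.default_rng(seed)
        centers = [points[rng.integers(len(points))]]
        dist = np.linalg.norm(points - centers[0], axis=1)
        for _ in range(k - 1):
            nxt = points[int(np.argmax(dist))]   # farthest from current centers
            centers.append(nxt)
            dist = np.minimum(dist, np.linalg.norm(points - nxt, axis=1))
        d2c = np.stack([np.linalg.norm(points - c, axis=1) for c in centers])
        return np.array(centers), d2c.argmin(axis=0)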
Large-scale battlefield simulations involve a great many operational entities. Entities may interact with each other in a real-time and frequent manner, which may constrain the performance of the simulation system. Research over recent decades indicates that Data Distribution Management (DDM) is an effective solution to this issue. Although the main focus in realizing DDM is Pub/Sub...
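The core of DDM matching can be pictured as a region-overlap test between publishers' update regions and subscribers' subscription regions. The brute-force sketch below is our simplification (production DDM implementations typically use grid- or sort-based matching); a region is modeled as one (lo, hi) interval per routing-space dimension.

    def overlaps(r1, r2):
        # Axis-aligned overlap: every dimension's intervals must intersect.
        return all(lo1 < hi2 and lo2 < hi1
                   for (lo1, hi1), (lo2, hi2) in zip(r1, r2))

    def ddm_match(update_regions, subscription_regions):
        # Route each update to every subscriber whose region overlaps it.
        return [(pub, sub)
                for pub, ur in update_regions.items()
                for sub, sr in subscription_regions.items()
                if overlaps(ur, sr)]

    # Example: 2-D routing space, one tank publishing near one sensor.
    print(ddm_match({'tank': ((0, 10), (0, 10))},
                    {'sensor': ((5, 20), (8, 30))}))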
This paper attempts to introduce the concept of cloud computing into simulation and proposes a multi-user collaborative simulation mechanism based on cloud computing. This mechanism differs from the ideology of integration (using the same language, on the same platform, etc.) and instead embodies the ideology of distribution and collaboration. The mechanism can be implemented in one of the many cloud...
We consider the problem of designing tools for knowledge bases in the Semantic Web environment. The dynamics of software tool development and design architecture are considered in the context of Semantic Web software technology. A short description is also given of the software platform for operating knowledge bases as projects of virtual reality in the Semantic Web.
Model-based testing (MBT) primarily involves test case generation, execution, and evaluation. Executing model-based test cases without introducing significant overhead in resource-constrained embedded systems is not yet supported by existing tools and methodologies. This paper outlines a test framework for executing MBT in embedded systems with minimal overhead on the target. The main scope of...
Geospatial data that exhibit time-varying patterns are being captured faster than we are able to process them. We thus need machines to assist us in these tasks. One such problem is the automatic understanding of the behavior of moving objects in order to find higher-level information such as goals, intentions, etc. We propose a system that can solve one part of this complex task: the automatic classification...
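One plausible, entirely hypothetical pipeline for such behavior classification is to reduce each trajectory to a few summary features and feed them to a standard classifier; the features and the RandomForestClassifier choice below are ours, not the paper's.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def trajectory_features(traj):
        # traj: (T, 2) array of (x, y) positions sampled at a fixed rate.
        steps = np.diff(traj, axis=0)
        speed = np.linalg.norm(steps, axis=1)
        heading = np.arctan2(steps[:, 1], steps[:, 0])
        turn = np.abs(np.diff(heading))
        return [speed.mean(), speed.std(), turn.mean(), turn.max()]

    def train_classifier(trajectories, labels):
        # labels: one behavior class per trajectory (e.g. 'patrol', 'transit').
        X = np.array([trajectory_features(t) for t in trajectories])
        return RandomForestClassifier(n_estimators=100).fit(X, labels)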
Recent studies have shown that the use of educational games during the learning process has increased dramatically. Furthermore, researchers suggest adding adaptive features in order to motivate students and assess their knowledge level on a specific educational subject. In this paper, we present an educational browser-based game with coins that contributes to a better understanding of addition...
The betweenness centrality of a node is a measure related to the number of shortest paths the node is involved in. It is, indeed, a measure of the node's importance in the network, and in recent years it has been used intensively for network analysis. The major drawback of this measure is its high computational cost, and thus several works in the literature provide ways of approximating...
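With NetworkX, for example, the exact measure and a pivot-sampling approximation (shortest paths are counted from only k sampled source nodes) differ by a single argument; the random graph below is a toy example.

    import networkx as nx

    G = nx.erdos_renyi_graph(200, 0.05, seed=1)          # toy random graph

    exact = nx.betweenness_centrality(G)                 # all n sources: costly
    approx = nx.betweenness_centrality(G, k=50, seed=1)  # 50 sampled pivots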