With the rapid development of software-defined networking and network function virtualization, researchers have proposed a new cloud networking model called Network-as-a-Service (NaaS), which enables both in-network packet processing and application-specific network control. In this paper, we revisit the problem of achieving network energy efficiency in data centers and identify some new optimization...
The purpose of this paper is to propose a new model for emotional interaction that uses learning styles and the student's emotional state to adapt the user interface and learning path. This aims to reduce the difficulty and emotional strain that students encounter while interacting with learning platforms. To this end, Affective Computing techniques that can capture the student's emotional state will be used...
K-means is the most widely used clustering algorithm due to its fairly straightforward implementation in various problems. Meanwhile, when the number of clusters increases, the number of iterations also tends to increase. However, there are still opportunities for improvement, as some studies in the literature indicate. In this study, improved implementations of the k-means algorithm with a centroid...
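As background for this abstract, the standard Lloyd iteration that such improvements build on can be sketched as follows. This is a minimal 1-D illustration, not the paper's improved variant; the function name and seeding are assumptions for the example.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's k-means on 1-D points: alternately assign each
    point to its nearest centroid, then move each centroid to the
    mean of its assigned points, until the centroids stop moving."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[i].append(p)
        # Empty clusters keep their previous centroid.
        new_centroids = [sum(c) / len(c) if c else centroids[i]
                         for i, c in enumerate(clusters)]
        if new_centroids == centroids:  # converged
            break
        centroids = new_centroids
    return sorted(centroids)
```

On well-separated data, e.g. points near 1.0 and points near 10.0 with k = 2, the returned centroids converge to the two group means regardless of which distinct points are sampled as the initial centroids.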
Simulation has been demonstrated to be a powerful tool for mimicking processes and activities in emergency departments. However, most applications rely only on data manually entered by department staff. First, this practice does not guarantee that the data required to build the simulation models are captured in the computer system, as some information about the processes of emergency...
Computing models provide parallel and distributed algorithms for the cloud. The ability to estimate the performance of parallel computing models for efficient resource scheduling is critical. Current techniques for predicting performance are mostly based on analysis and simulation. The behavior of each parallel computing model directly determines the diversity of its mathematical model. Without a general...
A method is developed for the computer analysis of microparticle motion parameters along trajectories, based on A. A. Vavilov's proposed principle of the consistent disclosure of structural, parametric, and signal uncertainty. Signal uncertainty is caused by instrumental and methodological errors in localizing microparticle positions along the trajectories obtained from image processing. The difference...
Web-based computing technologies and services have proliferated exponentially over the last decade, with mammoth growth in the way people interact with systems across the globe. Nowadays the user community has turned into a contributor community, and interactive social web platforms have empowered users for global content creation and consumption. This content explosion continues and is ever increasing...
Desktop Grid (DG) systems use a combination of geographically heterogeneous distributed resources to execute jobs from science and engineering projects. The organization of the distributed resources is administered by scheduling policies. To evaluate and demonstrate the effectiveness of a DG scheduling policy, a simulator is necessary, since a DG is an unpredictable and unrepeatable environment. Hence, the goal...
The Technology Leaders Program (TLP) is an interdisciplinary undergraduate program at the University of Virginia. The program has recently received a grant to purchase materials that will support its mission of fostering a maker culture. The TLP faculty advisors recognize that their current pencil-and-paper method of tracking assets would not work effectively or efficiently with a larger inventory,...
When faced with dirty data fused from different sources, modern data management systems generally trace its origins using data provenance techniques in order to repair it at the root. Unfortunately, state-of-the-art data provenance approaches can only deal with small amounts of data, using annotation or inversion. Worse, these approaches work only in stand-alone mode, resulting in low efficiency. In this...
The purpose of this study is to design a computational thinking curriculum standard for K-12 education. The Delphi technique was employed to collect different views and derive consensus from a panel of thirteen experts, including computer scientists, computer science educators, K-12 computer teachers, and industry experts. The first draft of the Delphi survey questionnaire, consisting of nine themes (problem...
Equipping classrooms with inexpensive sensors can provide students and teachers with the opportunity to interact with the classroom in a smart way. In this paper, an approach to acquiring contextual data from a classroom environment using inexpensive sensors is presented. We present our approach to formalising the usage data. Further, we demonstrate how the data were used to model specific room usage...
K-modes is a typical categorical clustering algorithm. Firstly, we improve the process of K-modes: when allocating categorical objects to clusters, the count of each attribute value in each cluster is updated, so that the new modes of the clusters can be computed after reading the whole dataset only once. In order to make K-modes capable of handling large-scale categorical data, we then implement K-modes on Hadoop using...
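The one-pass assignment step this abstract describes can be sketched as follows: while each object is allocated to its nearest mode, per-cluster counts of every attribute value are updated, so the new modes fall out of the counts without a second scan. This is a minimal single-node sketch under assumed names (`kmodes_one_pass`, `hamming`), not the paper's Hadoop implementation.

```python
from collections import defaultdict

def hamming(obj, mode):
    # Dissimilarity between a categorical object and a cluster mode:
    # the number of attributes on which they differ.
    return sum(a != b for a, b in zip(obj, mode))

def kmodes_one_pass(data, modes):
    """Assign each object to its nearest mode while incrementally
    updating per-cluster attribute-value counts; derive the new
    modes from those counts after a single pass over the data."""
    k = len(modes)
    n_attrs = len(modes[0])
    # counts[c][j][v] = occurrences of value v on attribute j in cluster c
    counts = [[defaultdict(int) for _ in range(n_attrs)] for _ in range(k)]
    labels = []
    for obj in data:
        c = min(range(k), key=lambda i: hamming(obj, modes[i]))
        labels.append(c)
        for j, value in enumerate(obj):
            counts[c][j][value] += 1
    # New mode of each cluster: the most frequent value per attribute.
    new_modes = []
    for c in range(k):
        if counts[c][0]:
            new_modes.append(tuple(max(counts[c][j], key=counts[c][j].get)
                                   for j in range(n_attrs)))
        else:
            new_modes.append(modes[c])  # empty cluster keeps its old mode
    return labels, new_modes
```

Because the counts are additive, this is also the shape that maps naturally onto MapReduce: mappers emit (cluster, attribute, value) counts and reducers sum them to produce the new modes.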
Data mining technology is used to extract the useful knowledge concealed in large databases. However, some negative aspects of data mining have emerged, among them potential privacy invasion and potential discrimination. The latter consists of unfairly treating individuals on the basis of their belonging to a specific group. Data mining...
The current analytics tools and models available in the market are very costly, unable to handle Big Data, and insufficiently secure. Traditional analytics systems take a long time to produce results, so they are not suitable for real-time analytics. The proposed work resolves all these problems by combining Apache open-source platforms, which solves the issues of real-time analytics...
This article presents the foundations of an integrated theory of analytically defined and multifunctional data structuring, based on correlative, entropic, spectral, logical-statistical, and matrix models of data movement.
As modern simulations involve large inputs and outputs over the network, there is an increasing need to store, manage, and analyze the massive datasets efficiently. In this paper, we present ARLS (After action Reviewer for Large-Scale simulation data), a Hadoop-based output analysis tool for large-scale simulation datasets. ARLS clusters distributed storage using Hadoop and analyzes the large-scale...
With the cloud paradigm and the concept of everything as a service (XaaS), our ability to leverage the potential of distributed computing resources seems greater than ever. On the other hand, data farming is a methodology based on the idea that by repeatedly running a simulation model over a vast parameter space, enough output data can be gathered to provide a meaningful insight into relations between...
Computer forensics investigators aim to analyse and present facts through the examination of digital evidence within a short time. As the volume of suspicious data becomes large, it grows difficult to obtain the digital evidence within a legally acceptable time. This paper proposes an effective method for reducing redundant investigation time by normalizing the data on a hard disk...