Robustness is one of the primary characteristics of a real system and directly impacts the system's function and performance. Many real-world systems can be formulated as complex networks, which makes it feasible to estimate their robustness from a network perspective. Robustness evaluation is one of the fundamental and active research topics in the field of complex...
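A common way to quantify network robustness (a generic sketch, not this paper's specific measure) is to remove nodes one by one and track the fraction of nodes remaining in the largest connected component. The path graph and removal order below are illustrative assumptions.

```python
# Robustness curve of a network under targeted node removal:
# after each removal, measure the largest-component fraction.
from collections import deque

def largest_component(adj, removed):
    """Size of the largest connected component, ignoring removed nodes."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

def robustness_curve(adj, order):
    """Largest-component fraction after each removal in `order`."""
    n, removed, curve = len(adj), set(), []
    for node in order:
        removed.add(node)
        curve.append(largest_component(adj, removed) / n)
    return curve

# A small path graph 0-1-2-3-4; removing the central node first
# fragments it fastest.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(robustness_curve(adj, [2, 1]))  # → [0.4, 0.4]
```

A curve that drops quickly under targeted removal indicates a fragile topology; robustness measures in the literature are typically summaries of such curves.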
To ensure the scalability of big data analytics, approximate MapReduce platforms emerge to explicitly trade off accuracy for latency. A key step to determine optimal approximation levels is to capture the latency of big data jobs, which is long deemed challenging due to the complex dependency among data inputs and map/reduce tasks. In this paper, we use matrix analytic methods to derive stochastic...
Predicting resource requirements for cloud services is critical for dimensioning, anomaly detection and service assurance. We demonstrate a system for real-time estimation of the needed amount of infrastructure resources, such as CPU and memory, for a given service. Statistical learning methods on server statistics and load parameters of the service are used for learning a resource prediction model...
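A minimal sketch of the idea of learning a resource-prediction model from server statistics (not the demonstrated system): fit a linear map from service load features to observed CPU utilization, then predict the CPU needed for a new load point. The feature names and training values are illustrative assumptions.

```python
# Learn a linear resource-prediction model with least squares,
# then estimate CPU demand for an unseen load point.
import numpy as np

# Rows: [request_rate, avg_payload_kb]; target: CPU utilization (%).
X = np.array([[100, 2.0], [200, 2.0], [300, 4.0], [400, 4.0]], dtype=float)
y = np.array([12.0, 22.0, 35.0, 45.0])

# Add an intercept column and solve the least-squares problem.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_cpu(rate, payload_kb):
    """Predicted CPU utilization for a given load point."""
    return float(np.dot([rate, payload_kb, 1.0], coef))

print(round(predict_cpu(250, 3.0), 1))  # → 28.5
```

In a real system, such a model would be retrained online from streaming server statistics and used for dimensioning and anomaly detection.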
Existing research on the theoretical analysis of Proportional Fair (PF) scheduling is mostly based on the full-buffer assumption, whereas users in reality have bursty traffic. In this paper, an analytical framework for PF scheduling in an Orthogonal Frequency Division Multiple Access (OFDMA) wireless network is proposed, and the user traffic is set to follow a Poisson arrival process....
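A toy simulation of the setting described above (an illustrative assumption, not the paper's analytical framework): users receive Poisson packet arrivals instead of full buffers, and a proportional-fair scheduler picks, each slot, the backlogged user maximizing instantaneous rate over exponentially averaged throughput. Arrival rates and the fading model are made up for demonstration.

```python
# PF scheduling under Poisson (bursty) traffic rather than full buffers.
import numpy as np

rng = np.random.default_rng(0)
NUM_USERS, SLOTS, ALPHA = 3, 5000, 0.05
lam = np.array([0.1, 0.2, 0.3])          # mean Poisson arrivals per slot
backlog = np.zeros(NUM_USERS)            # queued packets per user
avg_tput = np.full(NUM_USERS, 1e-6)      # EWMA of served rate
served = np.zeros(NUM_USERS)

for _ in range(SLOTS):
    backlog += rng.poisson(lam)          # bursty traffic arrivals
    rate = rng.rayleigh(1.0, NUM_USERS)  # illustrative fading channel rates
    # PF metric: instantaneous rate over average throughput;
    # only users with queued traffic compete for the slot.
    metric = np.where(backlog > 0, rate / avg_tput, -np.inf)
    avg_tput *= 1.0 - ALPHA              # decay everyone's average
    if np.isfinite(metric).any():
        u = int(np.argmax(metric))
        tx = min(backlog[u], rate[u])
        backlog[u] -= tx
        served[u] += tx
        avg_tput[u] += ALPHA * tx

print([round(s / SLOTS, 3) for s in served])
```

Because queues empty out under light load, served rates track the arrival rates rather than the full-buffer PF shares, which is exactly the regime the full-buffer assumption misses.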
In active learning for Automatic Speech Recognition (ASR), a portion of data is automatically selected for manual transcription. The objective is to improve ASR performance with retrained acoustic models. The standard approaches are based on confidence of individual sentences. In this study, we look into an alternative view on transcript label quality, in which Gaussian Supervector Distance (GSD)...
Predictive analysis methods offer the possibility of estimating the impact of design decisions before system deployment, which may help achieve optimal operational results while minimizing the required effort and cost. However, current predictive methods cannot be used in cloud environments because of their complexity and dynamic nature. The main goal of this thesis...
Any kind of smart testing technique must be very efficient to be competitive with random fuzz testing. State-of-the-art test generators are largely inferior to random testing in real-world applications. This work proposes to gather and evaluate lightweight analyses that can enable the creation of an efficient and sufficiently effective analysis-assisted fuzz tester. The analyses shall leverage information...
Modelers face multiple challenges in their work. In this paper, we focus on two of them. First, multiple modeling methods and tools are currently available. Modelers are sometimes limited by their tools or paradigms. Second, when multiple models are proposed for the same case, a decision maker needs criteria to decide which model to choose for his/her objective.
Background modeling is critical for background-subtraction-based approaches and for a wide range of applications. Background generation becomes difficult when the scene is complex or an object stays in the scene for more than half of the time. In this paper, we propose block-based scene background initialization and modeling with low computational cost, making it feasible for...
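A minimal illustration of the baseline the abstract builds on (not the proposed block-based method): the per-pixel temporal median recovers the background whenever a foreground object occupies a pixel in fewer than half of the frames. The synthetic frames below are assumptions for demonstration.

```python
# Background initialization via temporal median over a frame stack.
import numpy as np

H = W = 8
background = np.full((H, W), 100.0)
frames = [background.copy() for _ in range(9)]
# A foreground object covers a 4x4 block in 4 of 9 frames,
# i.e. less than half of the time.
for f in frames[:4]:
    f[0:4, 0:4] = 200.0

def temporal_median_background(frames):
    """Per-pixel temporal median; robust to transient foreground."""
    return np.median(np.stack(frames), axis=0)

est = temporal_median_background(frames)
assert np.allclose(est, background)
```

The failure mode motivating the paper is visible here: if the object stayed for more than half of the frames, the median would lock onto the object instead of the true background, which is where smarter block-level initialization is needed.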
The paper focuses on modeling the transmission channel along a car roof. The frequencies of the analysis are 2.4 GHz and 5.8 GHz, to be used for in-car wireless services. The issues discussed are the propagation of electromagnetic energy and the modeling of the transmission channel along a well-conductive plate. For the analytical description, we used the Norton model of electromagnetic wave propagation. Numerical...
Today, REST APIs have become established as a means for realizing distributed systems and are expected to gain even more importance in the context of Cloud Computing, the Internet of Things, and Microservices. Nevertheless, many existing REST APIs are known to be poorly designed, resulting in the absence of desirable quality attributes that truly RESTful systems entail. Although existing analyses show that...
The application of Partial Membership Latent Dirichlet Allocation (PM-LDA) for hyperspectral endmember estimation and spectral unmixing is presented. PM-LDA provides a model for a hyperspectral image analysis that accounts for spectral variability and incorporates spatial information through the use of superpixel-based “documents.” In our application of PM-LDA, we employ the Normal Compositional Model...
Building program analysis tools is hard. A recurring development task is the implementation of the meta-model around which a tool is usually constructed. The XCORE prototype supports this task by generating the implementation of the meta-model. For this purpose, developers add meta-information describing the desired meta-model directly into the source code of the tool under construction...
Exploring the design space of the memory hierarchy requires the use of effective methodologies, tools, and models to evaluate different parameter values. Reuse distance is one of the locality models used in design exploration and permits analytical cache miss estimation, program characterization, and synthetic trace generation. Unfortunately, the reuse distance is limited to a single locality...
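The reuse-distance model mentioned above can be sketched directly: for each access, the reuse distance is the number of distinct addresses touched since the previous access to the same address (infinity on first use), and a fully associative LRU cache of capacity C misses exactly when the distance is at least C. The trace below is illustrative.

```python
# Reuse distances and analytical LRU miss counting for a trace.
from collections import OrderedDict

def reuse_distances(trace):
    lru = OrderedDict()  # address -> None, kept in recency order
    out = []
    for addr in trace:
        if addr in lru:
            # Distance = number of distinct addresses more recent than addr.
            keys = list(lru)
            out.append(len(keys) - 1 - keys.index(addr))
            lru.move_to_end(addr)
        else:
            out.append(float("inf"))
            lru[addr] = None
    return out

def lru_misses(trace, capacity):
    """Analytical miss count for a fully associative LRU cache."""
    return sum(1 for d in reuse_distances(trace) if d >= capacity)

trace = ["a", "b", "c", "a", "b", "d", "a"]
print(reuse_distances(trace))  # → [inf, inf, inf, 2, 2, inf, 2]
print(lru_misses(trace, 3))    # → 4 (only the cold misses)
```

This is what makes the model attractive for design exploration: one pass over the trace yields the miss count for every cache capacity at once.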
Much prior work has studied cache replacement, but a large gap remains between theory and practice. The design of many practical policies is guided by the optimal policy, Belady's MIN. However, MIN assumes perfect knowledge of the future that is unavailable in practice, and the obvious generalizations of MIN are suboptimal with imperfect information. What, then, is the right metric for practical cache...
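Belady's MIN, the optimal offline policy referenced above, is standard: on a miss with a full cache, evict the line whose next use is farthest in the future. The sketch below uses an illustrative cyclic trace on which MIN beats LRU.

```python
# Belady's MIN replacement: evict the line reused farthest ahead.
def min_misses(trace, capacity):
    cache, misses = set(), 0
    for i, addr in enumerate(trace):
        if addr in cache:
            continue
        misses += 1
        if len(cache) == capacity:
            # Evict the cached line whose next use is farthest in the
            # future (or that is never used again).
            def next_use(a):
                for j in range(i + 1, len(trace)):
                    if trace[j] == a:
                        return j
                return float("inf")
            cache.remove(max(cache, key=next_use))
        cache.add(addr)
    return misses

print(min_misses(["a", "b", "c", "a", "b", "c"], 2))  # → 4
```

On this cyclic trace LRU misses all 6 accesses with capacity 2, while MIN misses only 4, illustrating why MIN is the usual yardstick even though its future knowledge is unavailable in practice.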
The availability of open source software projects has created an enormous opportunity for software engineering research. However, this availability requires that researchers judiciously select an appropriate set of evaluation targets and properly document this rationale. After all, the choice of targets may have a significant effect on evaluation. We developed a tool called RepoGrams to support researchers...
Software project artifacts such as source code, requirements, and change logs represent a gold mine of actionable information. As a result, software analytics solutions have been developed to mine repositories and answer questions such as "who is the expert?,'' "which classes are fault prone?,'' or even "who are the domain experts for these fault-prone classes?'' Analytics often require...
Just-In-Time (JIT) defect prediction models aim to predict the commits that will introduce defects in the future. Traditionally, JIT defect prediction models are trained using metrics that are primarily derived from aspects of the code change itself (e.g., the size of the change, the author’s prior experience). In addition to the code that is submitted during a commit, authors write commit messages,...
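An illustrative sketch of the kind of change-level metrics JIT defect prediction uses (not this paper's model): compute a few features for a commit and score defect risk with a toy logistic model. The feature set, weights, and bias are assumptions chosen only to show the mechanics.

```python
# Toy JIT defect-risk scoring from commit-level change metrics.
import math

def commit_features(lines_added, lines_deleted, files_touched,
                    author_prior_commits):
    churn = lines_added + lines_deleted
    return [math.log1p(churn), files_touched, math.log1p(author_prior_commits)]

WEIGHTS = [0.8, 0.3, -0.6]  # assumed: churn and spread raise risk,
BIAS = -2.0                 # author experience lowers it

def defect_risk(features):
    """Logistic score in (0, 1): probability-like defect risk."""
    z = BIAS + sum(w * f for w, f in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

small = commit_features(5, 1, 1, 200)    # tiny change, veteran author
large = commit_features(400, 120, 9, 3)  # sprawling change, newcomer
assert defect_risk(large) > defect_risk(small)
```

A real JIT model would learn such weights from labeled historical commits; the paper's contribution is to add features derived from the commit message on top of these code-change metrics.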
The evaluation of dependability or performance of general systems usually relies on the assistance of stochastic modeling and simulation tools. Such software packages enable the creation of models and the computation of metrics quickly and accurately. This paper introduces the Mercury tool, an integrated software package that enables creating and evaluating Reliability Block Diagrams, Stochastic Petri...
One of the challenges in Long Term Evolution (LTE) and next-generation cellular networks (5G) is the traffic overload and network congestion caused by massive Machine Type Communication (MTC) devices accessing an Evolved Node B (eNodeB) simultaneously. In this paper, a novel method is proposed, named Advanced Traffic Scattering for Group Paging (A-TSFGP), aiming to alleviate the system overload...