The aim of this work is to present the results of a comparative study of two different least squares (LS) approaches to the estimation of non-integer order models. The first one uses repeated fractional integration and a modal approach, while the second one is based on the Grünwald approximation. Both methods are explained in detail and the identification algorithms are elaborated.
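As background for the second approach, the Grünwald–Letnikov scheme approximates a fractional derivative of order α by a weighted backward sum whose binomial weights obey a simple recursion. The sketch below is purely illustrative (it is not the identification algorithm of the paper, and the function names are hypothetical):

```python
# Grünwald–Letnikov approximation of a fractional derivative of order alpha:
# D^alpha x(t_k) ≈ h**(-alpha) * sum_j c_j * x(t_k - j*h), where the weights
# c_j = (-1)**j * C(alpha, j) follow the recursion c_j = c_{j-1}*(1-(alpha+1)/j).

def gl_coefficients(alpha, n):
    """Return c_0..c_{n-1} with c_0 = 1 and the binomial recursion above."""
    c = [1.0]
    for j in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / j))
    return c

def gl_derivative(samples, alpha, h):
    """Approximate D^alpha at the last sample of a uniformly spaced signal."""
    c = gl_coefficients(alpha, len(samples))
    acc = sum(cj * samples[-1 - j] for j, cj in enumerate(c))
    return acc / h**alpha

# Sanity check: for alpha = 1 the scheme reduces to the backward difference
# (x_k - x_{k-1}) / h, so a ramp x(t) = t sampled with h = 0.5 gives slope 1.
xs = [0.0, 0.5, 1.0, 1.5]
print(gl_derivative(xs, 1.0, 0.5))  # → 1.0
```

For non-integer α (e.g., α = 0.5) the weights decay slowly, which is why such approximations need long memory in practice.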
We present general principles for the design and analysis of unbiased Monte Carlo estimators for quantities such as α = g(E (X)), where E (X) denotes the expectation of a (possibly multidimensional) random variable X, and g(·) is a given deterministic function. Our estimators possess finite work-normalized variance under mild regularity conditions such as local twice differentiability of g(·) and...
Estimates of software development effort are frequently inaccurate and over-optimistic. In this paper we describe how changes in the granularity of the unit of estimation, e.g., work-days instead of work-hours, affect the effort estimates. We describe four psychological mechanisms and how they interact, and discuss the expected total effect of higher-granularity units on effort estimates. We argue that...
Traditional choice models assume that consumers have well-defined preferences and are not influenced by additional information. However, consumers do not always behave rationally in making purchase decisions. To capture consumers' irrational behaviors and make a more accurate market demand prediction, this paper reports the development of a behavioral choice model that covers reference dependence,...
The Compressed Sensing (CS) framework outperforms the sampling rate limits given by Shannon's theory. This gap is possible because the signal of interest is assumed to admit a sparse linear decomposition over a few vectors of a given sparsifying basis (Fourier, wavelet, …). Unfortunately, in realistic operating systems, uncertain knowledge of the CS model is inevitable and must be evaluated. Typically, this...
We compute the Weiss-Weinstein bound in the context of change-point estimation in a multivariate time series, regardless of the assumed distribution of the data and of the prior. Closed-form expressions are then given for Gaussian observations with a change of mean and variance, and for a parameter change in a Poisson distribution. The proposed bound is shown to be tighter than the...
In this paper, we analyze the binary arithmetic coding of High Efficiency Video Coding (HEVC) and of the second-generation audio and video coding standard (AVS2). An optimized probability estimation scheme is then proposed for the arithmetic coder. The proposed scheme is incorporated into the HEVC reference software (HM 16.0) and the AVS2 reference software (RD 10.1). Experimental results demonstrate that...
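For context, probability estimation inside a binary arithmetic coder typically tracks the probability of each bin adaptively. The fixed-point estimator below is a generic illustration of this idea only; it is not the optimized scheme proposed in the paper, and the constants are assumptions:

```python
# A minimal sketch of an adaptive binary probability estimator of the kind
# used in binary arithmetic coders: the probability of the "1" bin is kept
# as a 15-bit fixed-point integer and nudged toward each observed bin,
# giving an exponentially weighted estimate.

PROB_BITS = 15
ONE = 1 << PROB_BITS          # fixed-point representation of 1.0

def update(p, bit, rate=5):
    """Move p toward the observed bin; larger `rate` adapts more slowly."""
    if bit:
        return p + ((ONE - p) >> rate)
    return p - (p >> rate)

p = ONE // 2                  # start from p(1) = 0.5
for b in [1, 1, 1, 0, 1, 1]:
    p = update(p, b)
print(p / ONE)                # running estimate of p(1) after six bins
```

Real codecs refine this basic idea, e.g., by mixing estimators with different adaptation rates; evaluating such refinements is exactly where reference software experiments come in.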
Key–value data sets of the form {(x, w_x)} where w_x > 0 are prevalent. Common queries over such data are segment f-statistics Q(f, H) = Σ_{x∈H} f(w_x), specified for a segment H of the keys and a function f. Different choices of f correspond to count, sum, moments, capping, and threshold statistics. When the data set is large, we can compute a smaller sample from which we can quickly estimate statistics...
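The definition of a segment f-statistic is easy to make concrete. The toy data and names below are illustrative, not taken from the paper:

```python
# Segment f-statistics: Q(f, H) = sum over keys x in H of f(w_x),
# for a key–weight data set {(x, w_x)} and a segment H of the keys.

data = {"a": 3.0, "b": 1.0, "c": 4.0, "d": 2.0}   # {(x, w_x)}
H = {"a", "c", "d"}                                # a segment of the keys

def Q(f, H):
    return sum(f(w) for x, w in data.items() if x in H)

count = Q(lambda w: 1, H)              # f(w) = 1       → count
total = Q(lambda w: w, H)              # f(w) = w       → sum
second_moment = Q(lambda w: w * w, H)  # f(w) = w^2     → 2nd moment
capped = Q(lambda w: min(w, 2.0), H)   # cap statistic with cap T = 2
print(count, total, second_moment, capped)  # → 3 9.0 29.0 6.0
```

The sampling question the abstract raises is how to draw one weighted sample that supports accurate estimates of Q(f, H) across many such choices of f and H.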
This paper proposes a new non-parametric test to decide whether to transfer from source data to target data in order to improve the performance of predictive models on target domains. The test is based on the conformity framework. It statistically tests whether the target data and source data have been generated from the target distribution under the exchangeability assumption. The source data is...
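To illustrate the general idea of testing exchangeability between two samples, here is a plain permutation test on a difference-of-means statistic. This is a generic stand-in sketch only; the paper's test is built on the conformal framework, not on this statistic:

```python
# A generic permutation test of exchangeability between source and target
# samples: if the pooled data are exchangeable, the observed difference of
# means should not be extreme among random re-splits of the pool.

import random

def perm_test(source, target, n_perm=2000, seed=0):
    rng = random.Random(seed)
    pooled = source + target
    n = len(source)
    observed = abs(sum(source) / len(source) - sum(target) / len(target))
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:n], pooled[n:]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # p-value; small → likely not exchangeable

same = perm_test([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])          # similar samples
shifted = perm_test([1, 2, 3, 4, 5], [11, 12, 13, 14, 15])  # shifted sample
print(same, shifted)
```

A small p-value for the shifted pair suggests transferring from that source would be unsafe, which is the decision the proposed test automates.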
The shift of Internet usage from host-to-host interconnection to content retrieval has become an indisputable fact. The Information Centric Networking (ICN) paradigm is proposed as a Future Internet architecture adapted to this change. By identifying content rather than hosts, this new design enables many interesting features such as in-network data storage and caching. ICN network components...
Context: The pool of papers published in ESEM. Objective: To use citation analysis and automated topic analysis to characterize the SE research literature over the years, focusing on papers published in ESEM. Method: We collected data from the Scopus database, consisting of 513 ESEM papers. For thematic analysis, we used topic modeling to automatically generate the most probable topic distributions...
Current technical debt management approaches mainly address specific types of technical debt. This paper introduces a framework to aid decision making in technical debt management. It includes the elements considered in technical debt management in the available literature, classified into three groups and mapped onto three stakeholders' points of view. The research method was systematic...
Recently, technical debt investigations have become more and more important in the software development industry. In this paper we show that the same challenges apply to automated test systems. We present an internal quality analysis of standardized test software developed by ETSI and 3GPP, performed on the systems publicly available at www.ttcn-3.org.
Estimation of data veracity is recognized as one of the grand challenges of big data. Typically, the goal of truth discovery is to determine the veracity of multi-source, conflicting data and return, as outputs, a veracity label and a confidence score for each data value, along with the trustworthiness score of each source claiming it. Although a plethora of methods has been proposed, it is unlikely...
Social networking sites such as Twitter provide more opportunities for people to express what they think or intend in short texts. In short texts, abbreviations such as "ASAP" or "joinus" and emoticons are often used. Because these expressions are not registered in existing dictionaries, they are analyzed as unknown expressions, which can be a bottleneck for improving the accuracy of reputation...
Pattern of Life (POL) analysis constitutes a subset of Activity-based Intelligence (ABI): understanding the complex spatiotemporal contexts within which entities (e.g., cancer cells, people, etc.) move about and interact, normally, but not always, with a type of recognizable regularity. POL analysis methods are particularly important when attempting to detect and track complex behaviors in stochastic...
In this paper, we present a method to estimate the quality (trustfulness) of the solutions of the classical optimal data association (DA) problem associated with a given source of information (also called a criterion). We also present a method to solve the multi-criteria DA problem and to estimate the quality of its solution. Our approach is novel and combines classical algorithms (typically Murty's approach...
The influence of missing values on the classification of incomplete patterns mainly depends on the context. In this paper, we present a fast classification method for incomplete patterns based on the fusion of belief functions, in which the missing values are selectively (adaptively) estimated. At first, it is assumed that the missing information is not crucial for the classification, and the object...
This paper introduces a novel approach to robust tracking that combines Particle Filters (PFs) with the estimation of physical constraints using Bayesian Networks (BNs). Heterogeneous Context Data (CD), describing the environment in which tracked objects move, is fused with the help of BNs. The resulting uncertain constraints are incorporated into the filtering process through a modification of the importance...
This paper presents apprenticeship learning based on inconsistent demonstrations. We address a problem where the given demonstrations are not directly applicable to reward function estimation due to the non-stationarity of the environment or the difference between the dynamics of the robot and those of the demonstrator. The basic idea of the proposed method is to use a subset of the trajectories sampled from the...