In this study, Bayesian analysis of the reliability of the two-parameter Lomax distribution has been considered, and the estimation has been obtained under the logarithmic loss function for three different prior distributions (quasi, exponential, and gamma). The estimates have been made under complete data. A simulation study has been conducted to compare by mean squared...
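The simulation-and-MSE comparison mentioned in this snippet can be illustrated with a minimal sketch. This is not the paper's Bayesian estimator: for simplicity it uses the closed-form maximum-likelihood plug-in for the Lomax shape parameter (scale assumed known), and all parameter values are illustrative.

```python
import random
import math

def lomax_sample(alpha, lam, n, rng):
    # Inverse-CDF sampling from Lomax: F(x) = 1 - (1 + x/lam)^(-alpha)
    return [lam * ((1.0 - rng.random()) ** (-1.0 / alpha) - 1.0) for _ in range(n)]

def reliability(t, alpha, lam):
    # Lomax survival (reliability) function: R(t) = (1 + t/lam)^(-alpha)
    return (1.0 + t / lam) ** (-alpha)

def mle_alpha(sample, lam):
    # With the scale lam known, the MLE of the shape alpha has a closed form:
    # alpha_hat = n / sum(log(1 + x_i/lam))
    return len(sample) / sum(math.log(1.0 + x / lam) for x in sample)

def mse_of_plugin(alpha, lam, t, n, reps, seed=0):
    # Monte Carlo estimate of the mean squared error of the plug-in
    # reliability estimate R(t; alpha_hat) against the true R(t; alpha).
    rng = random.Random(seed)
    true_r = reliability(t, alpha, lam)
    errs = []
    for _ in range(reps):
        a_hat = mle_alpha(lomax_sample(alpha, lam, n, rng), lam)
        errs.append((reliability(t, a_hat, lam) - true_r) ** 2)
    return sum(errs) / reps
```

Comparing different estimators (e.g. Bayes estimates under different priors) would simply mean repeating `mse_of_plugin` with each estimator substituted for `mle_alpha` and ranking the resulting MSE values.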
Forensic Voice Comparison (FVC) increasingly uses the likelihood ratio (LR) to indicate whether the evidence supports the prosecution (same-speaker) or defence (different-speakers) hypothesis. Nevertheless, the LR is subject to some practical limitations, due both to its estimation process itself and to a lack of knowledge about the reliability of this (practical) estimation process. It is...
This paper presents an effective design space exploration strategy for the development of dependable systems using software-based selective hardening techniques. Rather than relying on brute-force design space exploration or time-consuming fault injection experiments, this strategy is grounded in an early estimation of the register file criticality in microprocessor-based systems. This...
Counterfeit electronics have become a major concern in the globalized semiconductor industry, where chips may be recycled, remarked, cloned, or overproduced. In this work, we advance the state of the art in counterfeit detection for flash memory, which is widely used in electronic systems. Fake memories may be used in critical systems, such as missiles, military aircraft, and helicopters, thus diminishing...
Video summarization is useful for finding a concise representation of the original video, but its evaluation is somewhat challenging. This paper proposes a simple and efficient method for precisely evaluating the video summaries produced by existing techniques. This method includes two steps. The first step is to establish a set of matched frames between the automatic summary (AT) and the ground...
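Once a set of matched frames between an automatic summary and a ground-truth summary is established, summary quality is commonly scored with precision, recall, and F-measure over the matched-frame counts. A minimal sketch (the exact matching criterion used by the paper is not given in this snippet, so only the scoring step is shown):

```python
def summary_fmeasure(n_matched, n_auto, n_ground):
    # n_matched: frames of the automatic summary matched to ground truth
    # n_auto:    total frames in the automatic summary
    # n_ground:  total frames in the ground-truth summary
    precision = n_matched / n_auto
    recall = n_matched / n_ground
    f = (2 * precision * recall / (precision + recall)
         if precision + recall > 0 else 0.0)
    return precision, recall, f
```

For example, a 10-frame summary with 8 frames matched against a 16-frame ground truth yields precision 0.8 and recall 0.5.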
Despite being an essential prerequisite at the basis of many applications ranging from surveillance to computational photography, the problem of initial background estimation seems to be marginally investigated. In this paper, we present a reliable CNN-based solution to estimate the initial background (BG) of a scene, given not necessarily a whole sequence but just a small set of frames containing...
Understanding where people's attention focuses is a challenging and extremely valuable task that can be solved using computer vision technologies. In this paper we address this problem in surveillance-like scenarios, where head and body imagery are usually low resolution. We propose a method to profile the attention of people moving in a known space. We exploit coarse gaze estimation and a novel model...
The ubiquitous hand gesture plays an important role in natural human–machine interaction (HMI). Recently, consumer color and depth cameras have been used to estimate hand shapes and postures for mid-air HMI. Based on the observation that 3D hand contours carry much information about hand postures, we estimate 3D hand contours from infrared images with limited computational complexity for the...
A new efficient measure for predicting estimation accuracy is proposed and successfully applied to multistream-based unsupervised adaptation of ASR systems to address data uncertainty when the ground-truth is unknown. The proposed measure is an extension of the M-measure, which predicts confidence in the output of a probability estimator by measuring the divergences of probability estimates spaced...
Warping-based image stitching methods often suffer from perspective variations among multiple images, leading to shape and perspective distortions in stitching results. Moreover, they quickly lose their efficiency on low-textured images, due to the lack of reliable point correspondences. To solve these problems, this paper presents a locally warping-based image stitching method that imposes line constraints...
We introduce a powerful technique to make classifiers more reliable and versatile. Background Check equips classifiers with the ability to assess the difference of unlabelled test data from the training data. In particular, Background Check gives classifiers the capability to (i) perform cautious classification with a reject option, (ii) identify outliers, and (iii) better assess the confidence in...
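Capability (i) above, cautious classification with a reject option, can be illustrated in isolation. This is not the Background Check mechanism itself (which compares test data against the training distribution); it is only the generic reject-option idea, with an illustrative confidence threshold:

```python
def classify_with_reject(probs, labels, threshold=0.7):
    # probs:  class-posterior estimates for one test point (sum to 1)
    # labels: class label for each position in probs
    best = max(range(len(probs)), key=lambda i: probs[i])
    if probs[best] < threshold:
        return "REJECT"  # abstain: no class is confident enough
    return labels[best]
```

Background Check additionally lets the classifier distinguish *why* it should abstain, e.g. because the point looks like an outlier rather than merely ambiguous.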
Forensic Voice Comparison (FVC) increasingly uses the likelihood ratio (LR) to indicate whether the evidence supports the prosecution (same-speaker) or defence (different-speakers) hypothesis. In addition to supporting one hypothesis, the LR provides a theoretically founded estimate of the relative strength of its support. Despite this nice theoretical aspect, the LR is subject to some practical...
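A common way to obtain an LR from a voice-comparison score is to model same-speaker and different-speaker score populations separately and take the ratio of their densities at the observed score. A minimal sketch, assuming each population is Gaussian with parameters fitted beforehand on background data (the numbers in the usage line are illustrative placeholders):

```python
import math

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) at x
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_ratio(score, ss_mu, ss_sigma, ds_mu, ds_sigma):
    # LR = p(score | same speaker) / p(score | different speakers):
    # LR > 1 supports the prosecution hypothesis, LR < 1 the defence hypothesis
    return normal_pdf(score, ss_mu, ss_sigma) / normal_pdf(score, ds_mu, ds_sigma)
```

For example, `likelihood_ratio(0.9, ss_mu=1.0, ss_sigma=0.2, ds_mu=0.0, ds_sigma=0.2)` yields a value far above 1. The practical limitations the abstract refers to arise precisely because these density estimates are themselves uncertain.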
Introducing random interruptions in the cooperation of the sensing nodes with the fusion center is a new approach to increase the efficiency of the centralized cooperative spectrum sensing in cognitive radio (CR) networks. In this paper, we extend this approach to the multiband cooperative spectrum sensing. Specifically, first, we formally show how to incorporate the random interruptions in a multiband...
Estimation of worker reliability on microtask crowdsourcing platforms has gained attention from many researchers. On microtask platforms, no worker is fully reliable for a task, and it is likely that some workers are spammers, in the sense that they provide random answers to collect the financial reward. The existence of spammers is harmful, as they increase the cost of microtasking and will negatively...
We present a scene depth map generation method based on light field cameras. From the plenoptic function, the angular information about each image point under different sizes of aperture is extracted, which could be used for confocal stereo. Considering confocal constancy and gradient constancy, we take into account two constraints: (1) When a pixel is in focus, its relative intensities across aperture...
In this paper, we aim to develop an efficient speculation framework for a heterogeneous cluster. Speculation is a common mechanism that identifies ‘slow’ nodes in a cluster and starts redundant tasks on other nodes to guarantee reliability. We consider MapReduce/Hadoop as a representative computing platform, and our general goal is to accurately and quickly identify the straggler nodes during the...
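The straggler-identification step can be sketched in its simplest form: flag nodes whose task progress rate falls well below the cluster-wide median. This is a toy illustration, not Hadoop's actual speculative-execution logic, and the `slow_factor` threshold is an illustrative choice:

```python
import statistics

def find_stragglers(progress_rates, slow_factor=0.5):
    # progress_rates: {node_name: tasks (or progress units) per second}
    # A node is a straggler candidate if its rate is below
    # slow_factor * (median rate across all nodes).
    median = statistics.median(progress_rates.values())
    return sorted(node for node, rate in progress_rates.items()
                  if rate < slow_factor * median)
```

The challenge the abstract alludes to is that on a *heterogeneous* cluster a fixed threshold like this misfires: a node may be legitimately slower by design rather than faulty.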
A novel bio-inspired strategy, the Hybrid Speeding Up Slowing Down (Hybrid SUSD) strategy, is introduced to achieve distributed control of a multi-agent system for the localization of multiple sources in a search space. Hybrid SUSD switches between bio-inspired exploration algorithms and exploitation algorithms. The exploration algorithms provide coverage of the workspace with non-zero probability...
The growth of high-cost, high-precision manufacturing processes underlines the importance of reliability estimation from Bogey test data. To estimate the failure probability of a Bogey test, Bayesian approaches often focus on the choice of the prior distribution. This paper, however, develops a new method that makes use of the concavity of the lifetime distribution function to construct a non-informative...
Software effort estimation influences almost every software development process, such as bidding, planning, and budgeting. Hence, delivering an accurate estimate in the early stages of the software life cycle may be the key to the success of any project. To this end, many solo techniques have been proposed to predict the effort required to develop a software system. Nevertheless, none of them has proved...
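Competing effort-estimation techniques are conventionally compared with accuracy criteria such as the Mean Magnitude of Relative Error (MMRE), which this snippet's comparison would rely on. A minimal sketch of that criterion (the project figures in the test are illustrative):

```python
def mmre(actuals, predictions):
    # Mean Magnitude of Relative Error:
    # average of |actual - predicted| / actual over all projects.
    # Lower is better; MMRE <= 0.25 is a commonly cited target.
    return sum(abs(a - p) / a
               for a, p in zip(actuals, predictions)) / len(actuals)
```

For example, predictions of 110 and 180 person-hours against actuals of 100 and 200 give an MMRE of 0.1.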
Performing statistical inference on massive data sets may not be computationally feasible using the conventional statistical inference methodology. In particular, there is a need for methods that are scalable to large volume and variability of data. Moreover, veracity of the inference is crucial. Hence, there is a need to produce quantitative information on the statistical correctness of parameter...