Researchers need to know the historical transitions in researchers and research topics. Although Web search engines can be used to obtain such information, collecting it across a long time period is difficult and laborious. Thus, we propose a method for automatically extracting historical transitions in researchers and research topics by using co-occurrence information...
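The abstract does not specify the authors' pipeline, but the core co-occurrence idea can be sketched as follows (hypothetical records and counts, purely for illustration): tally how often researcher and topic terms co-occur per year, so that per-year topic frequencies reveal the transition.

```python
from collections import defaultdict

# Hypothetical records: (year, researcher, topics mentioned in the paper).
records = [
    (2001, "Tanaka", ["information retrieval"]),
    (2001, "Suzuki", ["information retrieval", "web mining"]),
    (2005, "Tanaka", ["web mining"]),
    (2005, "Suzuki", ["web mining", "social networks"]),
]

# Count researcher-topic co-occurrences per year.
cooc = defaultdict(int)  # (year, researcher, topic) -> count
for year, researcher, topics in records:
    for topic in topics:
        cooc[(year, researcher, topic)] += 1

# Aggregating over researchers gives topic popularity per year,
# i.e. the transition of research topics over time.
topic_by_year = defaultdict(int)
for (year, _, topic), n in cooc.items():
    topic_by_year[(year, topic)] += n

print(topic_by_year[(2001, "information retrieval")])  # 2
print(topic_by_year[(2005, "web mining")])             # 2
```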
The estimation of a program's reliability is an essential part of the software development process. Existing methods for software reliability analysis are based on run-time data, program metrics, and properties of the development process or program architecture. The disadvantage of these methods is that they use indirect information about the errors that are the main cause of program unreliability...
In this paper we consider the problem of approximate dynamic matching for homogeneous symmetric publish/subscribe systems. In such applications, besides one-to-one swaps, an exchange can occur among more than two subscriptions, which is called cycle matching. Since the number of possible cycle matchings increases exponentially with the number of subscriptions and the cycle length, one challenge is...
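The abstract leaves the matching algorithm unspecified; as a toy sketch of the cycle-matching idea (hypothetical data, and simplified so that each subscriber holds one item and wants exactly one item), an exchange is feasible along any cycle in the directed graph linking each held item to the wanted one:

```python
# Hypothetical symmetric exchange: each key is an item a subscriber holds,
# mapped to the item that subscriber wants. With exactly one want per
# holder, the graph is functional, and every cycle is a feasible exchange.
wants = {"A": "B", "B": "C", "C": "A", "D": "E", "E": "D"}

def exchange_cycles(wants):
    seen, cycles = set(), []
    for start in wants:
        if start in seen:
            continue
        path, node = [], start
        while node not in seen:       # walk until we revisit a node
            seen.add(node)
            path.append(node)
            node = wants[node]
        if node in path:              # the walk closed a cycle
            cycles.append(path[path.index(node):])
    return cycles

print(exchange_cycles(wants))  # [['A', 'B', 'C'], ['D', 'E']]
```

The ['D', 'E'] cycle is a one-to-one swap; ['A', 'B', 'C'] is a three-way cycle matching. Real systems allow several wants per subscriber, which is where the exponential blow-up mentioned in the abstract arises.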
This paper focuses on the theory of estimation of distribution algorithms (EDAs). We first elaborate the idea of EDAs; then, to address their limitations in solving complex optimization problems, we propose a Q-learning-based estimation of distribution algorithm. The Q-learning algorithm is introduced into evolutionary computation, and through interaction between the agent and the population we achieve...
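This is not the paper's Q-learning variant, but a plain univariate EDA (UMDA) on the OneMax problem illustrates the core loop the abstract refers to: fit a distribution to the best individuals, then sample a new population from it. All parameters below are illustrative choices.

```python
import random

random.seed(0)

N, POP, ELITE, GENS = 20, 60, 20, 40  # bits, population, selected, generations

def onemax(x):
    # Fitness: number of ones in the bit string.
    return sum(x)

# Start from the uniform distribution over bit strings.
p = [0.5] * N
for _ in range(GENS):
    # Sample a population from the current marginal distribution.
    pop = [[1 if random.random() < p[i] else 0 for i in range(N)]
           for _ in range(POP)]
    pop.sort(key=onemax, reverse=True)
    elite = pop[:ELITE]
    # Estimate a new marginal distribution from the selected individuals.
    p = [sum(x[i] for x in elite) / ELITE for i in range(N)]
    # Keep probabilities away from 0/1 so sampling can still explore.
    p = [min(max(q, 0.02), 0.98) for q in p]

best = max(pop, key=onemax)
print(onemax(best))  # should be at or near N
```

The paper's contribution replaces the fixed model-update rule with one guided by Q-learning; the sampling/selection/estimation skeleton stays the same.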
Recently, a new type of interacting multiple model (IMM) method was introduced based on the relatively new smooth variable structure filter (SVSF), and is referred to as the IMM-SVSF. The SVSF is a type of sliding mode estimator that is formulated in a predictor-corrector fashion. This strategy keeps the estimated state bounded within a region of the true state trajectory, thus creating a stable and...
Modeling the nonstationarity of image signals is one of the challenging issues in image interpolation. In this paper, we propose a similarity probability model to faithfully characterize the nonstationarity of image signals, and present a novel image interpolation algorithm based on the proposed model. The missing pixels are estimated in groups by weighted block estimation. The weight of each...
This paper considers problems in the simulation modeling of the operating algorithm of a computing device with programmed logic. The operating algorithm of the computing device is analyzed from the standpoint of queuing theory and presented as a simulation model in the GPSS language.
Although the World Wide Web has evolved to be an enormous resource over the last decade, sifting through all the information available poses a significant challenge to users as there is simply too much information available. Therefore, systems that support users by performing personalized information filtering or search are becoming increasingly important. This paper presents a mechanism which provides...
Tool outsourcing faces many risks. If an outsourcing enterprise cannot sufficiently analyze, appraise, and control the risks of tool outsourcing, it will not only fail to obtain benefits but may also suffer inestimable losses. In this paper, we analyze the risk factors of tool outsourcing and review the risk matrix; then we calculate the Borda count...
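As an illustration of the risk-matrix Borda approach (hypothetical risks and scores, not the paper's data): each risk is ranked separately by probability and by impact, and the Borda count aggregates the two rankings into one priority order.

```python
# Hypothetical tool-outsourcing risks with (probability, impact) scores.
risks = {
    "supplier delay": (0.7, 0.5),
    "quality defect": (0.4, 0.9),
    "cost overrun":   (0.6, 0.6),
    "IP leakage":     (0.2, 0.8),
}

def borda_counts(risks):
    names = list(risks)
    counts = {name: 0 for name in names}
    # Under each criterion, a risk earns one Borda point per risk it beats.
    for crit in (0, 1):  # 0 = probability, 1 = impact
        ordered = sorted(names, key=lambda n: risks[n][crit])
        for rank, name in enumerate(ordered):  # rank 0 = lowest score
            counts[name] += rank
    return counts

counts = borda_counts(risks)
ranking = sorted(counts, key=counts.get, reverse=True)
print(ranking[0])  # "quality defect": moderate probability, highest impact
```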
This paper mainly studies context-based adaptive binary arithmetic coding (CABAC) and the related implementation strategies in H.264. The improved CABAC adopts a finite-state machine and lookup tables to implement its arithmetic coding engine. Compared with standard CABAC, the improved version is conceptually simpler and outputs a lower bit rate with better efficiency. Finally, its test model is evaluated...
This paper addresses the estimation of the symmetric χ2-divergence between two point processes. We propose a novel approach by first mapping the space of spike trains into an appropriate functional space, and then estimating the divergence in this functional space using a least-squares regression approach. We compare the proposed approach with other available methods on simulated data, and discuss its...
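The paper's estimator is regression-based; for reference, the quantity itself has a simple plug-in form on discrete distributions. Under the common symmetrization D(p, q) = Σ (p_i − q_i)² / (p_i + q_i) (variants exist), a sketch:

```python
def symmetric_chi2(p, q, eps=1e-12):
    # Symmetric chi-square divergence between two discrete distributions:
    # sum over bins of (p_i - q_i)^2 / (p_i + q_i).
    # eps guards against empty bins where p_i + q_i = 0.
    return sum((pi - qi) ** 2 / (pi + qi + eps) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(symmetric_chi2(p, p))  # 0.0 for identical distributions
print(symmetric_chi2(p, q))
```

For point processes, the difficulty the paper tackles is that spike trains do not come pre-binned into such distributions, hence the mapping into a functional space.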
This paper proposes an elaborated approach for fast mode decision by using the probabilities estimated through context models. In H.264, the binary arithmetic coder is adopted to convert encoding modes into binary bins and assign each bin a context to estimate the probability of the bin as “1” or “0”. By collecting the probabilities of all bins, the probability of encoding modes can be calculated...
Current surveillance systems, especially the enhanced en-route system and the advanced surface movement guidance and control system (A-SMGCS), integrate various techniques: primary surveillance radar (PSR), secondary surveillance radar (SSR), multilateration (MLAT), and Automatic Dependent Surveillance–Broadcast (ADS-B). The performance of these systems is often evaluated based on...
Collaborative risk is the key factor in allocating the surplus profits of a coordinated supply chain, and it affects the stability of the supply chain coordination mechanism. To solve the surplus-profit allocation problem, we analyze the causes and categories of collaborative risk, and then measure the risk of supply chain members using a two-factor evaluation method. On this basis, we establish the...
Weerahandi introduced the concept of generalized confidence intervals, which extends interval estimation. However, it is difficult to use this method to construct generalized confidence regions for vector parameters. In this paper we give a new method, based on a generalized bootstrap variable, to construct asymptotically correct confidence regions for vector parameters.
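This is not Weerahandi's generalized method or the paper's bootstrap variable, but the plain percentile bootstrap illustrates the underlying idea of resampling-based interval estimation that the paper builds on (illustrative data and parameters):

```python
import random

random.seed(1)

# Simulated sample from a distribution with unknown mean.
data = [random.gauss(5.0, 1.0) for _ in range(200)]

def mean(xs):
    return sum(xs) / len(xs)

# Percentile bootstrap: resample with replacement, recompute the statistic,
# and read the confidence limits off the empirical distribution.
B = 2000
boot = sorted(
    mean([random.choice(data) for _ in range(len(data))]) for _ in range(B)
)
lo, hi = boot[int(0.025 * B)], boot[int(0.975 * B)]
print(lo, hi)  # an approximate 95% confidence interval for the mean
```

Extending this from a scalar interval to a joint confidence region for a vector parameter is exactly where the construction becomes nontrivial.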
A number of applications in social networks, telecommunications, and mobile computing create massive streams of graphs. In many such applications, it is useful to detect structural abnormalities which are different from the “typical” behavior of the underlying network. In this paper, we will provide first results on the problem of structural outlier detection in massive network streams. Such problems...
In this paper, we consider a layered learning routing algorithm for WSNs with randomly deployed nodes. A node routing algorithm is derived by combining a node hierarchy with a learning method. When computing a routing path with this algorithm, only a little information about a node and its adjacent nodes is needed. By using a depth-first search strategy combined with the nodes' historical routing experience,...
We propose a method based on partial least squares (PLS) regression to estimate the probability density function of the critical path delay. The method works on a reduced problem facilitated by PLS regression and requires only 102 samples to achieve satisfactory accuracy. The method is verified by simulations on ISCAS'85 benchmark circuits.
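The abstract does not detail the PLS setup, but a generic one-component PLS regression (NIPALS-style, on toy data that is not the authors' circuit data) shows the mechanism: project correlated inputs onto a single latent direction chosen for its covariance with the response, then regress on that score.

```python
# One-component PLS regression (NIPALS), pure Python for illustration.
def pls1_fit(X, y):
    n, m = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(m)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(m)] for row in X]  # center X
    yc = [yi - ym for yi in y]                               # center y
    # Weight vector: the direction in X most covariant with y.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(m)]
    norm = sum(wj * wj for wj in w) ** 0.5
    w = [wj / norm for wj in w]
    # Scores, and the regression coefficient on the latent component.
    t = [sum(Xc[i][j] * w[j] for j in range(m)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return w, b, xm, ym

def pls1_predict(model, x):
    w, b, xm, ym = model
    t = sum((x[j] - xm[j]) * w[j] for j in range(len(x)))
    return ym + b * t

# Toy data: a response growing with two correlated "process parameters".
X = [[0.0, 0.1], [1.0, 0.9], [2.0, 2.1], [3.0, 3.2], [4.0, 3.9]]
y = [1.0, 3.1, 5.0, 7.2, 8.9]
model = pls1_fit(X, y)
print(round(pls1_predict(model, [2.0, 2.0]), 2))  # close to 5.0
```

In the paper's setting, X would collect process-variation parameters and y the critical path delay; the dimensionality reduction is what lets a small sample size suffice.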
Many problems in machine learning and computer vision consist of predicting multi-dimensional output vectors given a specific set of input features. In many of these problems, there exist inherent temporal and spatial dependencies between the output vectors, as well as repeating output patterns and input-output associations, that can provide more robust and accurate predictors when modelled properly...
Current 802.11 networks do not typically achieve the maximum potential throughput despite link adaptation and cross-layer optimization techniques designed to alleviate many causes of packet loss. A primary contributing factor is the difficulty in distinguishing between various causes of packet loss, including collisions caused by high network use, co-channel interference from neighboring networks,...