Hepatitis C virus is a major health issue affecting a significant portion of the world's population. Data preprocessing, feature reduction, and generating classification rules from the selected features are considered important steps in knowledge discovery in databases. This paper highlights a Rough-Granular Neural Networks model that incorporates Rough Sets...
Graph theory, as an important approach in data mining, can be applied to dimensionality reduction. This paper proposes a new graph-theoretic method that reduces data dimensionality more effectively and efficiently than traditional methods. The proposed method, namely the related family, is based on a hypergraph information system. The method not only computes all reducts of dimension...
With the ubiquitous nature and sheer scale of data collection, the problem of data summarization has become critical for effective data management. Classical matrix decomposition techniques have often been used for this purpose, and have been the subject of much study. In recent years, several other forms of decomposition, including Boolean Matrix Decomposition, have become of significant practical interest...
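As general background (not the specific algorithm of the paper above), Boolean Matrix Decomposition expresses a binary matrix A as the Boolean product of two binary factor matrices B and C, where the product replaces summation with logical OR. A minimal sketch of the Boolean product, using a hypothetical example matrix:

```python
import numpy as np

def boolean_product(B, C):
    """Boolean matrix product: (B o C)[i, j] = OR over k of (B[i, k] AND C[k, j])."""
    # Ordinary matrix multiplication counts matching positions;
    # any positive count corresponds to a logical OR result of 1.
    return (B.astype(int) @ C.astype(int) > 0).astype(int)

# Hypothetical 4x4 binary data matrix built from two Boolean rank-2 factors.
B = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [0, 1]])
C = np.array([[1, 1, 0, 0],
              [0, 1, 1, 1]])
A = boolean_product(B, C)
print(A)
```

Finding good low-rank Boolean factors B and C for a given A is the hard (NP-hard) direction that decomposition methods address; the sketch only shows the forward product.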
This paper proposes a despeckling method using non-quadratic regularization. The auto-binomial model (ABM) is used as a prior, and the texture parameters of the ABM are estimated using the evidence maximization framework. The regularization parameters are kept constant during despeckling. The experimental results showed that SLC images can be despeckled using non-quadratic regularization, because it is an iterative...
Understanding how topics within a document evolve over its structure is an interesting and important problem. In this paper, we address this problem by presenting a novel variant of Latent Dirichlet Allocation (LDA): Sequential LDA (SeqLDA). This variant directly considers the underlying sequential structure, i.e., a document consists of multiple segments (e.g., chapters, paragraphs), each of which...
A simple and more concrete granular computing model may be developed using partitions of interval set-values in a decision system. The natural intervals of attribute values are transformed into multiple sub-intervals by normalization. Some characteristics of interval set-values in decision systems under fuzzy rough set theory are also discussed. The correctness and effectiveness of the approach...
Rough set theory is a technique of granular computing. As a generalization of classical rough set theory, covering-based rough sets have been used for attribute reduction in data mining. Linguistic dynamic systems are dynamic processes that mainly involve computing with words instead of numbers for modeling and analysis of complex systems and human-machine interfaces. They are all potential methods for...
Precision and grade are two important quantitative indexes. The purpose of this paper is to combine precision and grade and to explore a new extended rough set model. Based on the logical difference operation of grade and precision, this paper proposes a model of the logical difference operation of the grade upper approximation operator and the variable precision lower approximation operator. In the new model, fundamental...
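For context on the operators that graded and variable-precision models generalize, the classical (Pawlak) rough set approximations can be sketched as follows. The universe, equivalence classes, target set, and threshold below are hypothetical illustrations, not taken from the paper:

```python
def lower_approximation(classes, X):
    # Union of equivalence classes fully contained in the target set X.
    result = set()
    for c in classes:
        if c <= X:
            result |= c
    return result

def upper_approximation(classes, X):
    # Union of equivalence classes that intersect X.
    result = set()
    for c in classes:
        if c & X:
            result |= c
    return result

def precision(c, X):
    # Rough membership: fraction of the class that falls inside X.
    return len(c & X) / len(c)

def vp_lower(classes, X, beta):
    # Variable precision lower approximation (Ziarko-style sketch):
    # relax full inclusion to precision >= beta.
    result = set()
    for c in classes:
        if precision(c, X) >= beta:
            result |= c
    return result

# Hypothetical universe partitioned into equivalence classes by some attributes.
classes = [{1, 2}, {3, 4}, {5}]
X = {1, 2, 3}  # target concept
print(lower_approximation(classes, X))   # classes fully inside X
print(upper_approximation(classes, X))   # classes touching X
print(vp_lower(classes, X, 0.5))         # classes at least half inside X
```

Graded (grade) operators instead threshold the absolute overlap |c ∩ X|; the paper studies logical combinations of the two families, which this background sketch does not attempt to reproduce.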
This paper proposes a statistical method to approximate the distributions of the computational and communicational abilities of resources on computational grids. The well-known gamma and normal distributions are employed to approximate these two distributions for the computational resources. The result of the preliminary experiment shows that this study...
In credibility theory, variance is usually used as a measure of the variation of a possibility distribution about the expected value. Since the variance is defined via nonlinear fuzzy integral, its computation is difficult for general fuzzy variables. To avoid this difficulty, this paper defines the spread of a fuzzy variable based on Lebesgue-Stieltjes (L-S) integral. Our approach is to find an "average...
Optimization with a computational fluid dynamics (CFD) simulator has recently gained attention in various fields. Such optimization is hampered by long computing times: many localized solutions complicate the calculation and require a vast amount of space for solution generation and distribution. Researchers therefore hope to speed up the process and make the analytical time highly effective...
Precision and grade are two important quantitative indexes. The purpose of this paper is to combine precision and grade and to explore a new extended rough set model. Based on the logical difference operation of grade and precision, this paper proposes a model of the logical difference operation of the grade lower approximation operator and the variable precision upper approximation operator. In the new model, fundamental...
Constraints are commonly used in both simulation and formal verification in order to specify expected input conditions and state transitions. Constraint solving is a process to determine input vectors which satisfy the set of constraints during constrained random simulation. Even though constraints are used in formal property checking to restrict the search space, constraint solving has never had...
Precision and grade are two important quantitative indexes. The purpose of this paper is to combine precision and grade and to explore a new extended rough set model. Based on the logical OR operation of precision and grade, this paper proposes a model of the logical OR operation of the variable precision lower approximation operator and the grade upper approximation operator. In the new model, fundamental structure and...
Cooperative agents often need to reason about the states of a large and complex uncertain domain that evolves over time. Since exact calculation is usually impractical, we aim at providing a modeling tool that supports approximate online monitoring in such settings. Our proposed framework, the multi-agent dynamic Bayesian networks (MA-DBNs), models the dynamics of a group of cooperative agents approximately...
It is theoretically well known that PVL model reduction is guaranteed to be effective only locally, yet in practice it may perform quite well globally. This phenomenon is not well understood. In this paper, we give a solid theoretical analysis of the convergence of PVL model reduction. We transform it to the solution of parametrized linear systems by the nonsymmetric Lanczos method. We establish a result on its local...
SIFT (scale-invariant feature transform) features have been among the most effective descriptors for object recognition. However, the excessive number of key points and their high dimensionality have limited their capacity in object recognition. In this paper we present a novel method based on SIFT features for reliable object recognition. First, a matching tree is constructed to eliminate non-essential...
Shape discretization through union of weighted points or balls appears as a common representation in different fields of computer graphics and geometric modeling. Among others, it has been very successful for implicit surface reconstruction with radial basis functions, molecular atomic models, fluid simulation from particle systems and deformation tracking with particle filters. These representations...
Motivated by performance evaluation of a computer communication system, we consider a renewal-input, general-service-time, single-server, infinite-capacity queuing system with a generally distributed time-out threshold. We obtain two-moment approximate formulas for the mean system performance measures (including the mean number of customers in the system and the mean response time) by using the...
The generalization ability of a learning model is one of the key elements of machine learning and data mining. Cross-validation is a common technique by which to evaluate the generalization error and to select the optimal model. However, the calculation required for sequential data processing by cross-validation is expensive in some generative models, such as hidden Markov models, stochastic context-free...
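As general background on the evaluation technique this abstract refers to, k-fold cross-validation repeatedly holds out one fold for testing and trains on the remaining folds; the cost noted above arises because the generative model must be retrained once per fold. A minimal index-level sketch (the data size and fold count are hypothetical):

```python
def kfold_indices(n, k):
    """Yield (train_indices, test_indices) pairs for k-fold cross-validation."""
    # Distribute n items over k folds as evenly as possible.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

# Each of the k splits requires fitting the model from scratch, which is
# what makes cross-validation expensive for generative models such as HMMs.
for train, test in kfold_indices(10, 5):
    pass  # fit model on `train`, estimate generalization error on `test`
```

Approaches like the one this paper pursues typically try to avoid that per-fold retraining by reusing computation across splits.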