By investigating the current state of university libraries' digital and informational development, this paper analyzes the opportunities and challenges that cloud computing technology brings to the development of libraries. Two strategies, renting cloud services and building a cloud platform, are put forward to realize cloud computing services for digital libraries in a big data environment.
The elastic provisioning of Virtual Infrastructures (VIs) enables a dynamic management of cloud resources (computing and communication) in order to meet the hosted application's requirements. Thus, to perform elasticity requests, providers usually rely on reallocation mechanisms and policies. The concerns regarding the environment and the operational costs indicate energy consumption of the data centers...
One of the most important and challenging problems in recommendation systems is that of modeling temporal behavior. Typically, modeling temporal behavior increases the cost of parameter inference and estimation. It also imposes the constraint of requiring a large amount of data to reliably learn the parameters of the model. Therefore, it is often difficult to model temporal behavior...
We present a novel approach for estimating conditional probability tables, based on a joint, rather than independent, estimate of the conditional distributions belonging to the same table. We derive exact analytical expressions for the estimators and we analyse their properties both analytically and via simulation. We then apply this method to the estimation of parameters in a Bayesian network. Given...
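For context on what the abstract's joint estimator improves upon, the standard baseline fills each column of a conditional probability table independently, typically with Dirichlet (Laplace-style) smoothing. The sketch below shows only that independent baseline, not the paper's joint method; the function and parameter names are ours.

```python
from collections import Counter

def cpt_independent(samples, parent_states, child_states, alpha=1.0):
    """Estimate P(child | parent) one column at a time with Dirichlet
    smoothing (pseudo-count alpha per child state).
    `samples` is a list of (parent_value, child_value) pairs.
    Returns a dict {parent: {child: probability}}."""
    counts = Counter(samples)
    cpt = {}
    for p in parent_states:
        total = sum(counts[(p, c)] for c in child_states)
        denom = total + alpha * len(child_states)
        cpt[p] = {c: (counts[(p, c)] + alpha) / denom for c in child_states}
    return cpt
```

Each column sums to one by construction; the abstract's contribution is to couple the estimates across columns of the same table rather than computing them independently as done here.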
Sparse Discriminant Analysis (SDA) has been widely used to improve the performance of classical Fisher's Linear Discriminant Analysis in supervised metric learning, feature selection and classification. With the increasing needs of distributed data collection, storage and processing, enabling the Sparse Discriminant Learning to embrace the Multi-Party distributed computing environments becomes an...
Analysis of spatio-temporal data is a common research topic that requires the interpolations of unknown locations and the predictions of feature observations by utilizing information about where and when the data were observed. One of the most difficult problems is to make predictions of unknown locations. Tensor factorization methods are popular in this field because of their capability of handling...
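To make the tensor-factorization idea concrete, the sketch below fits a rank-1 CP (PARAFAC) approximation to a generic 3-way array via alternating least squares. It is a minimal illustration of the family of methods the abstract refers to, not the paper's model; the function name and update scheme are ours.

```python
def cp_rank1(T, iters=20):
    """Rank-1 CP decomposition of a 3-way tensor T (nested lists) by
    alternating least squares: fix two factor vectors, solve for the
    third in closed form, and repeat. Returns factors a, b, c such
    that T[i][j][k] is approximated by a[i] * b[j] * c[k]."""
    I, J, K = len(T), len(T[0]), len(T[0][0])
    a, b, c = [1.0] * I, [1.0] * J, [1.0] * K
    for _ in range(iters):
        nb = sum(x * x for x in b)
        nc = sum(x * x for x in c)
        a = [sum(T[i][j][k] * b[j] * c[k] for j in range(J) for k in range(K))
             / (nb * nc) for i in range(I)]
        na = sum(x * x for x in a)
        b = [sum(T[i][j][k] * a[i] * c[k] for i in range(I) for k in range(K))
             / (na * nc) for j in range(J)]
        nb = sum(x * x for x in b)
        c = [sum(T[i][j][k] * a[i] * b[j] for i in range(I) for j in range(J))
             / (na * nb) for k in range(K)]
    return a, b, c
```

In the spatio-temporal setting the three modes would typically be location, time, and feature, and missing entries in the tensor (unknown locations) are predicted from the recovered factors; real systems use higher ranks and handle missing data explicitly.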
Latent Dirichlet Allocation (LDA) has been widely used in text mining to discover topics from documents. One major approach to learn LDA is Gibbs sampling. The basic Collapsed Gibbs Sampling (CGS) algorithm requires O(NZ) computations to learn an LDA model with Z topics from a corpus containing N tokens. Existing approaches that improve the complexity of CGS focus on reducing the factor Z. In this...
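The O(NZ) cost of basic CGS comes from its inner loop: each of the N tokens is resampled from a categorical distribution whose Z unnormalized weights must all be evaluated. A minimal pure-Python sketch (hyperparameter defaults and names are ours, not from the paper):

```python
import random

def lda_cgs(docs, Z, V, iters=20, alpha=0.1, beta=0.01, seed=0):
    """Basic collapsed Gibbs sampling for LDA. Each sweep visits all N
    tokens and scores all Z topics per token, hence O(N*Z) per sweep.
    `docs` is a list of documents, each a list of word ids in [0, V)."""
    rng = random.Random(seed)
    ndz = [[0] * Z for _ in docs]       # document-topic counts
    nzw = [[0] * V for _ in range(Z)]   # topic-word counts
    nz = [0] * Z                        # total tokens per topic
    assign = [[rng.randrange(Z) for _ in doc] for doc in docs]
    for d, doc in enumerate(docs):
        for z, w in zip(assign[d], doc):
            ndz[d][z] += 1; nzw[z][w] += 1; nz[z] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                z = assign[d][i]        # remove the current assignment
                ndz[d][z] -= 1; nzw[z][w] -= 1; nz[z] -= 1
                # the O(Z) step: one unnormalized weight per topic
                weights = [(ndz[d][k] + alpha) * (nzw[k][w] + beta)
                           / (nz[k] + V * beta) for k in range(Z)]
                k = rng.choices(range(Z), weights=weights)[0]
                assign[d][i] = k
                ndz[d][k] += 1; nzw[k][w] += 1; nz[k] += 1
    return assign, ndz, nzw
```

The approaches the abstract contrasts itself with reduce the Z factor in this inner loop (e.g. via sparsity or alias sampling); this sketch is the unoptimized baseline.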
Given a collection of basic customer demographics (e.g., age and gender) and their behavioral data (e.g., item purchase histories), how can we predict sensitive demographics (e.g., income and occupation) that not every customer makes available? This demographics prediction problem is modeled as a classification task in which a customer's sensitive demographic y is predicted from his feature vector x. So...
In this paper, we focus on developing a novel mechanism to preserve differential privacy in deep neural networks, such that: (1) The privacy budget consumption is totally independent of the number of training steps; (2) It has the ability to adaptively inject noise into features based on the contribution of each to the output; and (3) It could be applied in a variety of different deep neural networks...
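For context, the building block behind most differentially private noise injection is the Laplace mechanism: perturb a value with noise whose scale grows with the query's sensitivity and shrinks with the privacy budget epsilon. The sketch below shows only this generic mechanism, not the paper's adaptive per-feature scheme for deep networks; names and defaults are ours.

```python
import random

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Return `value` plus Laplace(0, sensitivity / epsilon) noise.
    Uses the fact that the difference of two i.i.d. exponential
    variables with rate 1/scale is Laplace-distributed with that
    scale."""
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    return value + rng.expovariate(1 / scale) - rng.expovariate(1 / scale)
```

A smaller epsilon (tighter privacy budget) yields a larger noise scale and therefore stronger privacy at the cost of accuracy; the paper's adaptive scheme instead distributes the budget across features according to their contribution to the output.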
The methodology of community detection can be divided into two principles: imposing a network model on a given graph, or optimizing a designed objective function. The former provides guarantees on theoretical detectability but falls short when the graph is inconsistent with the underlying model. The latter is model-free but fails to provide quality assurance for the detected communities. In this paper,...
Hospital readmissions within 30 days after discharge are costly, and identifying patients at risk of early readmission has been a priority for researchers. Most reported hospital readmission prediction models have been built with historical data and can therefore become outdated over time. In this work, a self-adaptive 30-day diabetic hospital readmission prediction model has been developed. A diabetic inpatient...
Software composition using high-granularity entities is nowadays a common practice. The process of software composition is supported by various CASE tools. The first tools were built on very simple formalisms (e.g. intuitionistic propositional logic). Over the years the tools evolved into more capable ones, able to deal with concurrency, multiparty sessions and other advanced...
This article analyzes existing models of access control in information systems and explores an extended model of access control for an automated information and library system. Operators and their functions were chosen as the operating factor of the proposed model. In addition, the generalized structure and algorithm of the proposed model are given.
Modern drug discovery organizations generate large volumes of SAR data. A promising methodology that can be used to mine this chemical data to identify novel structure-activity relationships is the matched molecular pair (MMP) methodology. However, before the full potential of the MMP methodology can be utilized, an MMP identification method that is capable of identifying all MMPs in large chemical...
Most of the available robot programming by demonstration (PbD) approaches focus on learning a single task, in a given environmental situation. In this paper, we propose to learn multiple tasks together, within a common environment, using one of the available PbD approaches. Task-parameterized Gaussian mixture model (TP-GMM) is used at the core of the proposed approach. A database of TP-GMMs will be...
[Background]: Developing conceptual models is an integral part of the requirements engineering (RE) process. Goal models are requirements engineering conceptual models that allow diagrammatic representation of stakeholder intentions and how they affect each other. A specific goal modeling language construct, the contribution of goal satisfaction of one goal to another, plays a central role in supporting...
Unified Modeling Language (UML) is a modeling standard that has been commonly used in the software industry. However, students face difficulties while learning how to model complete and correct UML diagrams. One of the reasons is the way UML has been taught. In order to improve the effectiveness of learning, it is necessary to employ methods in which the students actively take part in the learning...
Educational approaches for computer science proposing the use of complete online courses or traditional courses employing some kind of online material have received much attention recently. The integration of online materials into traditional courses or the replacement of entire courses offer huge possibilities, including increased teaching quality and better study and work alignment. However, researchers...
This paper presents the development of a real-time Supervision Hardware-in-the-Loop (HIL) emulator of shaded PV systems. This study focuses on shaded conditions because of the impact of shadows on the final energy production and the overall structural health. In this context, we propose a methodology to emulate the behavior of a shaded PV system in real time. The proposed methodology is intended for evaluation...
The Air Force is shifting its cybersecurity paradigm from an information technology (IT)-centric toward a mission-oriented approach. Instead of focusing on how to defend its IT infrastructure, it seeks to provide mission assurance by defending mission-relevant cyber terrain, enabling mission execution in a contested environment. In order to actively defend a mission in cyberspace, efforts must be taken...