Nowadays more and more organisations use collaborative environments, such as social networks, to identify profiles of competencies, which are usually declared by the users themselves. We postulate that the analysis of computer-supported collaborative activities may provide information about the users' competencies in specific domains. In this research work, we present a trace-based approach...
Top-k join is an essential tool for data analysis, since it enables selective retrieval of the k best combined results that come from multiple different input datasets. In the context of Big Data, processing top-k joins over huge datasets requires a scalable platform, such as the widely popular MapReduce framework. However, such a solution does not necessarily imply efficient processing, due to inherent...
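The selective retrieval this abstract describes can be illustrated with a minimal in-memory sketch of a top-k join (not the paper's MapReduce method): assuming two inputs of (join_key, score) pairs and a combined score defined as the sum, the k best joined pairs can be found as follows.

```python
import heapq
from collections import defaultdict

def top_k_join(left, right, k):
    """Naive top-k join: combine records that share a join key and
    keep the k pairs with the highest summed score."""
    by_key = defaultdict(list)
    for key, score in right:
        by_key[key].append(score)
    # Generate every joinable (combined_score, key) pair lazily.
    candidates = (
        (ls + rs, key)
        for key, ls in left
        for rs in by_key[key]
    )
    return heapq.nlargest(k, candidates)

left = [("a", 5), ("b", 3), ("c", 9)]
right = [("a", 2), ("b", 8), ("c", 1)]
print(top_k_join(left, right, 2))
```

A scalable implementation avoids materializing all candidate pairs — which is exactly the efficiency concern the abstract raises for MapReduce — but the input/output contract is the same.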
Online marketplaces are e-commerce websites where thousands of products are provided by multiple third parties. There are dozens of these differently structured marketplaces that need to be visited by end users to reach their targets. This searching process consumes a lot of time and effort; moreover, it negatively affects the user experience. In this paper, extensive analysis and evaluation of...
Resolving ambiguous and unknown identities is crucial to intelligence analysis in which fraud and deceptive names are frequently used by criminals and terrorists to make their activities unnoticeable. Typical approaches rely on the similarity measure of textual and other content-based characteristics, which are usually not applicable in the case of falsely-defined and unknown instances. This barrier...
We describe our web-based system for the analysis of students' results on the course Fundamentals of Electrical Engineering by applying the method of Formal Concept Analysis. We have focused on the students' answers and constructed their concept lattices or taxonomies of the subject matter. Finally, we have shown that this approach corresponds well with the actual students' overall results and final...
Positive emotions have been proven to be a key factor for successful learning. In modern personalized learning environments, informal learning takes a prominent role, and with it the use of computer-mediated communication. Communication data, such as chat logs, can be harvested for sentiments. Most sentiment analyses process only verbal information. But the messages exchanged in...
This paper explores recent research done into the philosophy of data. The research utilizes experimental philosophy ideas combined with Information Technology methodologies to assess participants' philosophies of data. Reusing the concept of the Data Flow Diagram, I suggest a methodology of experimental philosophy that allows participants to categorize flows into data, information, and knowledge in...
The Web has been flooded with highly heterogeneous data sources that freely offer their data to the public. Careful design and compliance to standards is a way to cope with the heterogeneity. However, any agreement and compliance is practically hard to achieve across different communities. In this work we describe a framework that enables the exploitation of content across different scientific disciplines...
The concept lattice (Galois lattice) is an efficient tool for data analysis and rule extraction from multidimensional space. This paper introduces some definitions of the concept lattice, and compares two methods of data induction: AOI and the concept lattice. After introducing the context, actual medical data is discretized, and the concept lattice and Hasse diagram are constructed to generate the concept hierarchy,...
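The construction this abstract outlines — deriving formal concepts from a binary (object × attribute) context — can be sketched with a naive closure enumeration. The medical-flavoured context below is invented for illustration; real FCA tools use faster algorithms such as NextClosure.

```python
from itertools import combinations

def concepts(objects, attrs, incidence):
    """Enumerate all formal concepts (extent, intent) of a binary
    context by closing every subset of attributes (naive, small data only)."""
    def extent(B):  # objects possessing every attribute in B
        return frozenset(g for g in objects
                         if all((g, m) in incidence for m in B))
    def intent(A):  # attributes shared by every object in A
        return frozenset(m for m in attrs
                         if all((g, m) in incidence for g in A))
    found = set()
    for r in range(len(attrs) + 1):
        for B in combinations(sorted(attrs), r):
            A = extent(B)
            found.add((A, intent(A)))  # (extent, closed intent) pair
    return found

# Invented toy context: patients × symptoms.
objects = {"p1", "p2", "p3"}
attrs = {"fever", "cough"}
incidence = {("p1", "fever"), ("p1", "cough"),
             ("p2", "fever"), ("p3", "cough")}
lattice = concepts(objects, attrs, incidence)
print(len(lattice), "concepts")
```

Ordering these concepts by extent inclusion yields the Hasse diagram the abstract mentions.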
In this paper, a similarity evaluation model based on rough formal concept analysis and information content similarity is proposed to evaluate the degree of similarity between concepts. We use the information content approach to automatically obtain part of the similarity scores of two concepts, which complements the usual featural and structural evaluation models. Then, through our model, the similarity...
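As an illustration of the information-content idea this abstract builds on (not the authors' own model), Resnik-style similarity scores two concepts by the information content of their most informative common ancestor. The toy taxonomy and frequency counts below are invented.

```python
import math

# Invented toy taxonomy (child -> parent) and corpus frequencies.
parent = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
          "bird": "animal", "animal": None}
freq = {"dog": 30, "cat": 20, "bird": 25, "mammal": 5, "animal": 1}

def ancestors(c):
    """Concept plus all of its ancestors up to the root."""
    out = []
    while c is not None:
        out.append(c)
        c = parent[c]
    return out

def count(c):
    """Corpus count of a concept includes all of its descendants."""
    return sum(f for node, f in freq.items() if c in ancestors(node))

TOTAL = count("animal")

def ic(c):
    """Information content: -log p(c)."""
    return -math.log(count(c) / TOTAL)

def resnik(c1, c2):
    """IC of the most informative common ancestor."""
    common = set(ancestors(c1)) & set(ancestors(c2))
    return max(ic(c) for c in common)
```

Under this scheme "dog" and "cat" are more similar than "dog" and "bird", because their most informative shared ancestor ("mammal") is rarer, hence more informative, than the root.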
We have considered the task to discover similarities in modern music by applying methods of Formal Concept Analysis, specifically those of Contextual Topology. For this, we investigate the Music Genome, a project started on the 6th of January 2000 by a group of musicians and music-loving technologists who came together with the idea of creating the most comprehensive analysis of music ever. We define...
XML (eXtensible Markup Language) documents are the main format for publishing and interchanging data on the Web. Integrity constraints are essential in data design. Functional dependencies are the most important semantic constraints. Functional dependencies satisfied by XML data have been introduced recently. Formal Concept Analysis (FCA) is a mathematical theory of concept hierarchies which is based...
Change impact analysis plays an important role in the maintenance and enhancement of software systems, especially for defect prevention. In our previous work we have developed approaches to detect logical dependencies among artifacts in repositories and calculated different metrics. But that is not enough, because in order to use change impact analysis a detailed process with guidelines on the one...
The goal of this work is to develop and test a new software archetype to aid the competence management process in post-graduate Production Engineering courses. This system will be designed using the JADE agent framework to read and analyze XML data. These technologies have been used to build an innovative environment for software building. The research methodology used in this scientific work is...
Data as a Service (DaaS) emerges as a new trend for exchanging data between independent data owners and data users so that data can be acquired on demand through standard protocols across heterogeneous platforms. It is usually a user-interactive and iterative process to compose the services into various data-driven business scenarios of data acquisition, analysis, and other processing activities....
With the rapid increase of information, driven by the development of massive application systems, more and more people urgently need a simple and fast technology to integrate data stored in various data sources. The integration of heterogeneous data sources has become a central problem of modern computing. After analyzing and researching general methodologies for heterogeneous...
We propose a solution to the problem of exploring large, complex data sets in a (relational) database by a human user. In a nutshell, our proposed solution is to develop the tools to support data exploration and browsing in an organized manner. We introduce: (a) an organization of data around the idea of defining distances on data sets, to reflect the intuitive notion of data that is (closely) related...
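The idea of organizing data around distances can be illustrated with a simple sketch (this is an assumption-laden stand-in, not the paper's definition): treating each data set as a set of attribute values and using Jaccard distance to rank which other data sets are "closely related" for browsing. The table names below are invented.

```python
def jaccard_distance(a, b):
    """Distance between two sets: 1 - |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

def nearest(target, tables, k=2):
    """Rank the other data sets by closeness to `target` for exploration."""
    return sorted((t for t in tables if t != target),
                  key=lambda t: jaccard_distance(tables[target], tables[t]))[:k]

# Invented toy schemas: each table is described by its column names.
tables = {
    "sales":   {"id", "date", "amount"},
    "refunds": {"id", "date", "amount", "reason"},
    "staff":   {"id", "name"},
}
print(nearest("sales", tables, 1))
```

Any metric on data sets would slot into the same browsing loop; the point is that "related data" becomes a nearest-neighbour query.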