The future of food innovation lies in the art and science of designing interactive, connected, intelligent devices that can detect how we feel and display content suited to individual consumers. We designed a smart dining table and chairs that can detect, sense, and analyze consumer satisfaction and interact with consumers. A team of furniture designers, software engineers, mechanical engineers,...
The aim of this paper is to introduce a semantic methodology that uses an ontology to improve the results of data mining in a judicial decisions database. An intelligent, automatic method for finding sentences in lawsuits related to the one on trial is presented. A judicial ontology is built both with and without rules from experts. The method can improve judicial celerity, seeking to solve the yearning...
Stack Overflow is one of the most popular question-and-answer sites for programmers. However, it contains a great number of duplicate questions that should be detected automatically and quickly. In this paper, we introduce two approaches to improve detection accuracy: splitting the question body into different types of data and using word embeddings to handle word ambiguities that are not contained...
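The embedding idea in this abstract can be illustrated with a minimal sketch: two questions with almost no word overlap still score as near-duplicates when compared via the cosine similarity of averaged word vectors. The tiny embedding table below is a made-up placeholder standing in for pre-trained vectors, and the scoring function is an assumption, not the paper's actual pipeline.

```python
import math

# Toy 3-d word vectors; a real system would load pre-trained embeddings.
EMBEDDINGS = {
    "sort":   [0.9, 0.1, 0.0],
    "order":  [0.8, 0.2, 0.1],   # near-synonym of "sort"
    "list":   [0.1, 0.9, 0.0],
    "array":  [0.2, 0.8, 0.1],   # near-synonym of "list"
    "python": [0.0, 0.1, 0.9],
}

def sentence_vector(text):
    """Average the vectors of known words in the text."""
    vecs = [EMBEDDINGS[w] for w in text.lower().split() if w in EMBEDDINGS]
    if not vecs:
        return [0.0, 0.0, 0.0]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def cosine(a, b):
    """Cosine similarity between two vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

q1 = "sort list python"
q2 = "order array python"  # shares only "python" with q1, yet is a duplicate
q3 = "python"

# Embeddings bridge the vocabulary gap: q1/q2 score much higher than q1/q3.
print(cosine(sentence_vector(q1), sentence_vector(q2)))
print(cosine(sentence_vector(q1), sentence_vector(q3)))
```

A bag-of-words comparison would rate q1 and q2 as mostly dissimilar; the averaged-embedding comparison captures that "sort/order" and "list/array" are near-synonyms, which is the kind of word ambiguity the abstract targets.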
Developers summarize their changes to code in commit messages. When a message seems "unusual", however, this casts doubt on the quality of the code contained in the commit. We trained n-gram language models on over 120,000 commits from open source projects and used cross-entropy as an indicator of commit message "unusualness". Build statuses collected from Travis-CI were used as a proxy for code quality...
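The scoring step this abstract describes can be sketched in a few lines: train an n-gram model on past commit messages and score a new message by its per-token cross-entropy, so messages unlike the corpus score higher. This is a minimal bigram model with add-one smoothing; the corpus and messages are illustrative placeholders, not the paper's data.

```python
import math
from collections import Counter

def train_bigram_model(messages):
    """Count unigrams and bigrams over tokenized commit messages."""
    unigrams, bigrams = Counter(), Counter()
    for msg in messages:
        tokens = ["<s>"] + msg.lower().split() + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def cross_entropy(msg, unigrams, bigrams, vocab_size):
    """Average negative log2 probability per token, with add-one smoothing."""
    tokens = ["<s>"] + msg.lower().split() + ["</s>"]
    log_prob = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        p = (bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab_size)
        log_prob += math.log2(p)
    return -log_prob / (len(tokens) - 1)

corpus = [
    "fix typo in readme",
    "fix broken build on ci",
    "add unit tests for parser",
    "update dependencies",
]
uni, bi = train_bigram_model(corpus)
V = len(uni)

# A message similar to the corpus gets a lower score (less "unusual")
# than one the model has never seen anything like.
print(cross_entropy("fix typo in parser", uni, bi, V))
print(cross_entropy("wip asdf temporary hack zzz", uni, bi, V))
```

In the study's setting, scores like these would then be correlated with an external quality signal such as build status; here the sketch only shows the language-model half.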
Originally developed for purely functional verification of software, native or host-compiled simulation [6] has gained momentum thanks to its considerable speedup compared to instruction set simulation (ISS). To obtain a performance model of the software, non-functional information is computed from the target binary code using low-level analysis and back-annotated into the high-level code used to...
Component-based software engineering is an evolving branch of software engineering. Advances in data mining and information retrieval techniques form the basis of approaches to component retrieval. This has paved the way for new techniques for efficient storage, retrieval, and management of component repositories and storage systems. Such information retrieval caters to the needs...
Although it is possible to design and manufacture MPSoCs with hundreds of processors, there is still a gap in the ability to debug hardware, software, and applications for such chips. Current state-of-the-art work on MPSoC debugging suffers from poor integration, limited scalability of data storage, and simplistic graphical data representation. This work proposes a modular debugging framework to aid the...
Online reviews are nowadays an important source of information for consumers evaluating online services and products before deciding which product and which provider to choose. Online reviews therefore have significant power to influence consumers' purchase decisions. Aware of this, an increasing number of companies have organized spam review campaigns in order to promote their products...
This paper describes a decision support system (DSS) built on knowledge extraction using simulation-based optimization and data mining. The paper starts with a requirements analysis based on a survey conducted with a number of industrial companies about their practices of using simulations for decision support. Based upon the analysis, a new, interactive DSS that can fulfill the industrial requirements,...
The web has become a huge repository of data and information. To deal with such large quantities of data, users need intelligence-based tools and methods to access data, process it, and make it useful for a variety of purposes. Proposed as part of the Semantic Web, the Resource Description Framework (RDF) is an important way of representing information. Its intrinsic feature of high connectivity creates...
In recent years, the presence of embedded devices in everyday life has grown exponentially. The market for these devices imposes conflicting requirements such as cost, performance, and energy. The use of Multiprocessor Systems-on-Chip (MPSoCs) is a widely accepted solution providing a trade-off between these demands. However, programming MPSoCs is still a cumbersome task. Several research efforts...
The paper presents an analysis of the directions of development of cognitive information technologies for mathematical education. Two aspects of the problem are highlighted: the simulation of a student's mental processes and the modeling of a subject area. The proposed concept of the development of cognitive information technologies is confirmed by a retrospective analysis of information technologies...
Tool-based code review is growing in popularity and has become a standard part of the development process at Microsoft. Adoption of these tools makes it possible to mine data from code reviews and provide access to it. In this paper, we present an experience report for CodeFlow Analytics, a system that collects code review data, generates metrics from this data, and provides a number of ways for...
Component-based software development (CBSD) has proved to be a highly useful way of developing software from reusable components, especially within a short time frame. The biggest challenge faced during development (especially testing) of component-based software is that the source code of components is not available. Due to this, the traditional testing techniques cannot be applied directly while...
The effort and expertise required for manually crafting the models for model-based testing (MBT) is a major obstacle slowing down its industrial adoption. For implemented and executable systems, there are approaches to automate some part or even the whole process of creating the models for MBT. Recently, using extracted models for testing graphical user interface (GUI) applications has been a popular...
Computer-controlled machinery can be viewed as a cyber-physical system. Such systems are vulnerable to cyber attacks of ever-growing sophistication, with potentially severe consequences. To address this threat, advanced security/status monitoring measures must be developed and deployed within the framework of industrial control systems. A system diagnostic approach based on modeling and assessment...
Accumulated provenance data about correctly constructed and successfully executed scientific workflows provides a valuable source of knowledge, which may be used and reused in the design of new scientific experiments. In this paper we propose a provenance-based workflow composition approach, the opposite of the package-based one presented earlier. Processing semantic-compliant provenance data makes available...
Software effort estimation requires high accuracy, but accurate estimates are difficult to achieve. Increasingly, data mining is used to improve an organization's software process quality, e.g. the accuracy of effort estimations. A large number of different method combinations exist for software effort estimation, so selecting the most suitable combination becomes a subject of research in...
The ever-increasing number of platforms and languages available to software developers means that the software industry is reaching high levels of complexity. Model Driven Architecture (MDA) presents a solution to the problem of improving software development processes in this changing and complex environment. MDA-driven development is based on model definition and transformation. Design patterns...
Model-driven engineering, an emerging trend in software engineering, has enabled the application of refactoring to UML models. One of the important steps in refactoring is the identification of refactoring opportunities within the model, also referred to as model smells. An object-oriented system modeled in UML is built up from many different views. Model refactoring, in recent proposals, is applied...