The success rate of IT system development projects has remained as low as 30 %, and more than 60 % of workers have suffered from problems on such projects. This paper proposes a software agent to replace rudimentary-level human counselors. The agent uses a counseling knowledge base whose domain is limited to IT workers and students. Counseling differs from learning support, where the problems are clear and well understood...
The paper introduces a "lazy" Data Mining technology, which models students' learning characteristics from real data instead of deriving ("guessing") those characteristics explicitly. In earlier work, the authors developed a modeling system for university learning processes, which aims at evaluating and refining university curricula to achieve optimal learning success...
The paper presents a very general technique for representing human needs and offers, along with a technology for finding optimal matches. Moreover, the system is able to learn from its use by collecting user feedback and adjusting its parameters accordingly. In this way, the system adapts itself to users' expectations and desires and even follows trends in those desires and expectations.
Schema versioning is an indispensable feature for applications that use temporal databases and require an entire history of data and schema. τXSchema [7] is an infrastructure for constructing and validating temporal XML documents, but no explicit support for XML schema versioning is offered. A τXSchema schema is composed of a conventional XML Schema document annotated with physical and logical annotations...
Many approaches have been proposed to find correspondence points between two images. They generally perform well when used to find correspondences between two images of the same object, such as in a video sequence or from a stereo camera. However, they fail when the number of true matches between two images is small compared to all the potential correspondence points found, which can happen...
This paper describes a new approach to modeling epidemic dynamics with complex networks. The infection-spreading model considers the links of each infected node and the probability of infecting healthy nodes. Moreover, a fitness parameter is assigned to each node of the network to simulate the individual reaction against an infectious process. The dynamics of infection have been evaluated on different kinds of...
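A spreading process of the kind sketched in this abstract can be illustrated with a minimal discrete-time SI simulation on a graph, where a per-node fitness value dampens the infection probability. The function name, the multiplicative use of fitness, and the graph representation are illustrative assumptions, not the paper's exact model.

```python
import random

def simulate_si(adjacency, beta, fitness, steps, seed_node, rng=None):
    """Minimal SI spreading sketch.

    adjacency: dict mapping node -> list of neighbour nodes
    beta:      base per-contact infection probability
    fitness:   dict mapping node -> resistance in [0, 1]
               (assumption: higher fitness lowers infection probability)
    Returns the set of infected nodes after `steps` rounds.
    """
    rng = rng or random.Random(0)
    infected = {seed_node}
    for _ in range(steps):
        newly = set()
        for u in infected:
            for v in adjacency[u]:
                if v in infected:
                    continue
                # Effective infection probability scaled by the target's fitness.
                if rng.random() < beta * (1.0 - fitness[v]):
                    newly.add(v)
        infected |= newly
    return infected
```

With fitness 0 everywhere and beta = 1, the infection deterministically sweeps a connected graph; with fitness 1 everywhere it never leaves the seed node, which makes the role of the fitness parameter easy to see.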
The diagnostic signature of many microvasculature diseases, such as diabetes mellitus, hypertension and arteriosclerosis, includes changes in the diameter of retinal blood vessels. Precise estimation of vascular widths is therefore a critical and demanding step in automated retinal image analysis. This paper proposes an automated system to measure vessel caliber in retinal images...
The paper introduces the multiscale spatial Weber local descriptor (MSWLD) for a robust face recognition system. In the proposed method, the WLD is calculated over neighborhoods of different sizes (multiscale), and WLD histograms are obtained from blocks of an image to preserve spatial information. The WLD histograms from the different blocks are then concatenated to produce the final feature set of a face image. The Fisher ratio is...
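The block-wise histogram concatenation described in this abstract can be sketched as follows. The per-pixel "excitation" below is a simplified stand-in for the true WLD differential excitation, and the grid and bin counts are illustrative assumptions; only the spatial split-and-concatenate structure is the point.

```python
import math

def differential_excitation(img, x, y):
    """Simplified WLD-style excitation: arctangent of the summed differences
    between a pixel's 4-neighbours and the pixel itself (an approximation
    of the true WLD formulation, used here for illustration only)."""
    c = img[y][x]
    diff = sum(img[ny][nx] - c
               for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)))
    return math.atan2(diff, c + 1e-6)  # in (-pi/2, pi/2) for non-negative pixels

def block_histogram(img, x0, y0, bw, bh, bins=8):
    """Histogram of excitation values over one block (interior pixels only)."""
    hist = [0] * bins
    for y in range(max(1, y0), min(len(img) - 1, y0 + bh)):
        for x in range(max(1, x0), min(len(img[0]) - 1, x0 + bw)):
            e = differential_excitation(img, x, y)
            b = min(bins - 1, int((e + math.pi / 2) / math.pi * bins))
            hist[b] += 1
    return hist

def spatial_descriptor(img, grid=(2, 2), bins=8):
    """Split the image into grid blocks and concatenate their histograms,
    preserving where in the face each local pattern occurs."""
    h, w = len(img), len(img[0])
    bh, bw = h // grid[0], w // grid[1]
    feat = []
    for gy in range(grid[0]):
        for gx in range(grid[1]):
            feat.extend(block_histogram(img, gx * bw, gy * bh, bw, bh, bins))
    return feat
```

A multiscale variant would repeat this over neighborhoods of different radii and concatenate those descriptors as well; the sketch keeps a single scale for brevity.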
We describe an abstract model for the definition and dynamic evolution of "communities of actants" originating from a given reference society of roles. Multiple representations are provided, showing how communities evolve with respect to their reference societies. In particular, we show how such representations are self-similar and factorisable into "prime" constituents...
The Open Information Extraction Project is one of the most ambitious attempts in the area of automatically constructing ontologies by harvesting information from the web. What we will call their Know-It-All Ontology contains about 6 billion items, consisting of triples and rules. The downside of such automatically constructed ontologies is that they contain a vast number of errors: some arising from...
We propose a composite centrality measure for general weighted and directed complex networks, based on measure standardisation and statistical normalization, whereby the composite measure is expected to follow a standard log-normal distribution. This offers a natural and absolute scale to measure node and edge centralities for complex networks. Considering snapshots of the world trade web, we demonstrate...
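The standardise-then-combine idea behind such a composite measure can be sketched as follows: z-score each raw centrality across all nodes, then sum the standardised values per node. The equal weighting and the plain (rather than log-space) standardisation are illustrative assumptions, not the paper's exact recipe.

```python
import math

def zscore(values):
    """Standardise a list of values to zero mean and unit variance."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(var) or 1.0  # guard against a constant measure
    return [(v - mean) / std for v in values]

def composite_centrality(measures):
    """measures: list of per-node centrality lists, all in the same node order.
    Returns one composite score per node, on a common standardised scale."""
    standardised = [zscore(m) for m in measures]
    return [sum(col) for col in zip(*standardised)]
```

Standardising first puts heterogeneous measures (degree, strength, betweenness, ...) on one scale before combining, which is what gives the composite score an absolute interpretation across networks.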
Whenever we give a model description for simulation purposes, we face the problem of how to describe behaviors correctly in terms of objects. Clearly, a sophisticated diagram approach would facilitate matters greatly. The PetriNet is one of the nicest vehicles for discrete modeling. In this investigation, we introduce the Well-Formed PetriNet (WFPN) and propose how to construct reachable PetriNet models by...
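The kind of reachable PetriNet model this abstract refers to can be illustrated with a tiny place/transition net and a breadth-first reachability search. The dictionary-based representation and the bounded search are illustrative assumptions, not the WFPN construction itself.

```python
from collections import deque

def enabled(marking, pre):
    """A transition is enabled when every pre-place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from pre-places, produce in post-places."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m.get(p, 0) - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable_markings(initial, transitions, limit=1000):
    """Breadth-first enumeration of reachable markings.

    transitions: list of (pre, post) pairs, each a dict place -> token count.
    Returns the set of reachable markings as sorted (place, tokens) tuples;
    `limit` bounds the search since Petri nets may be unbounded.
    """
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue and len(seen) < limit:
        m = queue.popleft()
        for pre, post in transitions:
            if enabled(m, pre):
                nxt = fire(m, pre, post)
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen
```

For a net with one transition moving a token from p1 to p2, the search finds exactly two markings, which is the reachability graph a diagram of that net would show.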