The paper presents a definition of Big Data and a description of its main characteristics. A model of the associations between entities and characteristics is constructed. A method was created for sharing heterogeneous data and mapping it to the relational "entity-characteristic" data model. Testing results for the developed methods and algorithms are presented.
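The "entity-characteristic" relational model described in this abstract resembles the well-known entity-attribute-value (EAV) pattern. The following is only an illustrative sketch of that general idea, with hypothetical names, not the paper's actual method:

```python
# Sketch of an "entity-characteristic" (EAV-style) mapping: records from
# heterogeneous sources, each with different fields, are flattened into
# uniform (entity, characteristic, value) rows that fit one relational table.

def to_entity_characteristic(source_records):
    """Flatten heterogeneous dict records into (entity, characteristic, value) triples."""
    rows = []
    for entity_id, record in source_records.items():
        for characteristic, value in record.items():
            rows.append((entity_id, characteristic, value))
    return rows

# Two records with different schemas describing the same kind of entity.
records = {
    "sensor-1": {"type": "thermometer", "unit": "C"},
    "sensor-2": {"model": "TX-9", "range_max": 120},
}
triples = to_entity_characteristic(records)
# All fields from both schemas now live in a single uniform relation.
```

Because every source schema collapses into the same three columns, new kinds of records can be stored without altering the table definition, which is the usual motivation for this pattern.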
Curriculum design and implementation in higher medical education can be a great challenge. Although there are well-defined standards, such as the Curriculum Inventory and Competency Framework by MedBiquitous Consortium, existing systems are incapable of a visual representation of the various components, attributes, and relations. In this paper, we present the MEDCIN platform, a pilot tool which uses...
A software package has been developed to bridge the R analysis model with the conceptual analysis environment typical of radiation physics experiments. The new package has been used in the context of a project for the validation of simulation models, where it has demonstrated its capability to satisfy typical requirements pertinent to the problem domain.
Scientific workflow systems are used to integrate existing software components (actors) into larger analysis pipelines to perform in silico experiments. Current approaches for handling data in nested-collection structures, as required in many scientific domains, lead to many record-management actors (shims) that make the workflow structure overly complex, and as a consequence hard to construct, evolve...
Reports are an important component of enterprise application systems: a well-designed report can filter, merge, summarize, and analyze essential data, providing an intuitive form of presentation and a sound basis for decision-making. At present, the industry has no mature pattern for developing report systems based on the B/S architecture,...
Pre-computing is a common way to improve response time in an OLAP application. As main memory grows, establishing a cache structure and pre-loading data into the free memory can further speed up query response. The paper presents an implementation of a cache structure in LE-OLAP and describes the cache's query and elimination (eviction) algorithms in detail.
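The abstract does not specify LE-OLAP's elimination algorithm; purely as an illustration of what a cache with an eviction policy looks like, here is a minimal LRU sketch (all names hypothetical, not the LE-OLAP implementation):

```python
from collections import OrderedDict

class QueryCache:
    """Toy LRU cache for pre-computed OLAP query results.

    On a hit, the entry moves to the most-recently-used end; when
    capacity is exceeded, the least-recently-used entry is eliminated
    (evicted) to free memory for newer results.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self._entries = OrderedDict()

    def get(self, query_key):
        if query_key not in self._entries:
            return None  # cache miss: caller recomputes from the warehouse
        self._entries.move_to_end(query_key)
        return self._entries[query_key]

    def put(self, query_key, result):
        self._entries[query_key] = result
        self._entries.move_to_end(query_key)
        if len(self._entries) > self.capacity:
            self._entries.popitem(last=False)  # evict the LRU entry

cache = QueryCache(capacity=2)
cache.put("sales_by_region_2023", 42)
cache.put("sales_by_month_2023", 17)
cache.get("sales_by_region_2023")       # hit; now most recently used
cache.put("sales_by_product_2023", 99)  # exceeds capacity; evicts the month query
```

Pre-loading, as the abstract describes, would simply be a series of `put` calls performed with warehouse results before queries arrive.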
Cyberworlds in the era of 'cloud' computing are being created on the Web where data and its dependencies are constantly changing and evolving. The problem of combinatorial explosion in system development inevitably arises when dealing with cyberworlds. To solve the problem, we have developed a data processing system called the Cellular Data System (CDS), based on the Incrementally Modular Abstraction...
In this article, we first study the system architecture of heterogeneous data integration based on XML; on this basis, we then discuss a data integration model for mobile environments and propose a mobile data management platform based on WML. Finally, taking an ERP system as an example, we discuss the architecture of a data integration platform in ERP systems.
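The core idea of XML-based heterogeneous data integration is wrapping records from differently structured sources in one common XML format. A minimal sketch of that idea, with hypothetical schemas and element names (not the article's actual design):

```python
import xml.etree.ElementTree as ET

def wrap_as_xml(source_name, records):
    """Wrap rows from one heterogeneous source in a common <record> XML format."""
    root = ET.Element("source", name=source_name)
    for record in records:
        rec_el = ET.SubElement(root, "record")
        for field, value in record.items():
            ET.SubElement(rec_el, "field", name=field).text = str(value)
    return root

def integrate(*sources):
    """Merge per-source XML trees under one integration root."""
    root = ET.Element("integrated")
    for src in sources:
        root.append(src)
    return root

# Two sources with unrelated schemas, unified under one XML document.
erp = wrap_as_xml("erp", [{"order_id": 1, "amount": 250}])
crm = wrap_as_xml("crm", [{"customer": "ACME", "contact": "a@acme.example"}])
merged = integrate(erp, crm)
xml_text = ET.tostring(merged, encoding="unicode")
```

A downstream consumer (such as the mobile platform the abstract mentions) then needs to understand only the common `<record>`/`<field>` format rather than every source schema.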
A technical framework and related key techniques for realizing a report system based on a three-layer computing architecture are proposed, including metadata mapping and its application, the functions of the engine, the ETL module, the data warehouse, etc., which were designed for this report system. All the techniques studied in this paper have been put into practice and realized in an actual project of data...
In this paper, our research objective is to develop a database virtualization technique so that data analysts and other users who apply data mining methods in their jobs can use all ubiquitous databases on the Internet as if they were a single database, thereby helping to reduce workloads such as collecting data from Internet databases and data cleansing work. In this study,...
Data(base) reverse engineering is the process through which the missing technical and/or semantic schemas of a database (or, equivalently, of a set of files) are reconstructed. If carefully performed, this process allows legacy databases to be safely maintained, extended, migrated to modern platforms or merged with other, possibly heterogeneous, databases. Although this process is mostly pertinent...
Every organization needs to create clearly formatted reports using reporting software. The created reports can be used within the organization either as a basis for further analysis and research, or as a set of data formatted as a document that can be delivered to employees, customers, and partners. Since reporting software is usually not used only by IT professionals, it ought to have a simple and easily...
This paper presents an integrated intelligent system capable of automatically estimating and updating a large matrix. In theoretical economics, the input-output model uses a matrix representation of a nation's (or a region's) economy to predict the effect of changes in one industry on others, and of changes by consumers, government, and foreign suppliers on the economy. The system...
The Web is a dynamic information resource. At the same time, given the complexity of spatial data and the diversity of its application fields, obtaining implicit and useful spatial knowledge requires studying the integration of spatial data and choosing suitable technologies for data processing and analysis. In this paper, the features of spatial data were analyzed; they were...
With the rapid increase of information driven by the development of massive application systems, more and more people urgently need a simple and fast technology for integrating data stored in various data sources. The integration of heterogeneous data sources has become a central problem of modern computing. Based on an analysis of general methodologies for heterogeneous...
To analyze a school's education, the first step is to analyze its entire teaching staff. It becomes very necessary to measure the quality of the team and the personal qualities of its members, and to understand the development trends and potential of the whole team. A well-established data analysis system for teachers can provide both macro-data and micro-data analysis for...
In order to develop an interoperable set of simulators, scenario generation and analysis tools, a common data model and format is needed that can describe network devices, network topology, node movement, and packet information. NetDMF provides a foundation, a common data model and format, to advance development of a flexible software environment for simulations and data analysis from experiments...
Engineering data exchange, integration, and sharing between different CAD/CAE systems is currently very difficult because of the lack of a uniform data specification. Visualization, as an effective tool, plays an important role in engineering data processing and analysis. In this paper, a visualization interface for engineering data based on XML, characterized by structured definition, generality, and extensibility, was built,...
In view of the status quo in which large numbers of heterogeneous enterprise applications cannot interoperate, share resources, or link their data, a data grid platform is built on the OGSA architecture using the latest GT4 grid development tools to integrate existing applications.
Several vulnerability analysis techniques in web-based applications detect and report on different types of vulnerabilities. However, no single technique provides a generic technology-independent handling of Web-based vulnerabilities. In this paper we present our experience with and experimental exemplification of using the application vulnerability description language (AVDL) to realize a unified...