As digital life became more intertwined with our daily activities, humans developed an increasing reliance on cloud computing to sync data across all their devices, gained 24/7 access to information through the internet, and acquired the ability to share data from anywhere using a variety of methods. Predictably, technology began to exert its influence in education, with new practices of large-scale...
The components and systems involved in railway operation are subject to stringent reliability and safety requirements, but up until now the cyber security of those same systems has been largely under-explored. In this work, we examine a widely-used railway technology, track beacons or balises, which provide a train with its position on the track and often assist with accurate stopping at stations...
Although database schemas are an integral part of information systems, the use of software product lines has been studied mainly for the production of executable code. The impact on data management, and in particular on database schemas, is poorly documented and little studied in the literature. This paper is an attempt to explore some of the issues of the modeling and implementation of the variability...
The development of large and complex simulation models often requires teams to collaborate. One approach is to break a large model into independently developed partial models that, when combined, capture the overall behavior. However, maintaining consistent world state across independently developed simulations is a challenge. In this paper, we introduce the Collaborative Aspect-Oriented Distributed...
A multi-year research project focused on a global aerospace company's design-to-production transition, and in particular on how to answer production-related questions much earlier in a program's design cycle than is possible today. A fundamental difficulty is that the time and expertise required to formulate appropriate analysis models prevents their routine use, especially in new program development...
Challenges such as understanding sustainable urban development require modeling interdependencies and interactions among systems. The High Level Architecture (HLA) provides an approach to studying these aspects by integrating separately developed simulations in a distributed computing environment. These applications require coupling interdependent simulations and sequencing their execution to ensure...
In this paper, we present an abstract model of visualization and inference processes, and describe an information-theoretic measure for optimizing such processes. In order to obtain such an abstraction, we first examined six classes of workflows in data analysis and visualization, and identified four levels of typical visualization components, namely disseminative, observational, analytical and model-developmental...
OntoUML is an ontologically well-founded conceptual modelling language that distinguishes various types of classifiers and relations, providing precise meaning to the modelled entities. Efforts are underway to incorporate OntoUML into the Model-Driven Development approach as a conceptual modelling language for the PIM of application data. In a prequel paper, we introduced and outlined our approach for...
An important phase of reengineering a data-oriented software system is the database reengineering process and, in particular, its subprocess, database reverse engineering. In this paper we present one of the model-to-model transformations from a chain of transformations aimed at transforming a generic relational database schema into a form type data model. The transformation is a step...
This paper describes a framework for migrating relational databases to other types of databases (OO, OR, XML), based on a meta-model that plays the core role in the framework by establishing the migration and exploiting different object concepts, including inheritance, aggregation, and composition. The realization of the meta-model is based on the principle of semantic enrichment and...
Successful traffic speed prediction is of great importance for the benefit of both road users and traffic management agencies. To solve the problem, traffic scientists have developed a number of time-series speed prediction approaches, including traditional statistical models and machine learning techniques. However, existing methods are still unsatisfactory due to the difficulty of reflecting the stochastic...
In this work we present a synergic integration of the Functional Mock-Up Interface (FMI) and Business Process Model and Notation (BPMN) standards aimed at managing coupled system simulations. The expressiveness of BPMN diagrams enables us to define the relationships between the involved systems and guarantees a one-to-one correspondence with an XML file, which is the starting point for the automation...
Far beyond its industrial roots, robotics has evolved into a highly interdisciplinary field with a variety of applications in a smart world. The eRobotics methodology addresses this evolution by providing platforms where roboticists can exchange ideas and collaborate with experts from other disciplines to develop complex technical systems and automated solutions. Virtual Testbeds are the central method...
Due to the tremendous advances in GPS and location-based web services, the high availability of spatiotemporal trajectory data poses an important challenge: knowledge discovery from trajectories. Knowledge discovery tasks on trajectory big data, such as classification, clustering and outlier detection, require a dedicated data model, which can support various utility functions and provide a robust object-relational...
In software development we are faced with the problem of comprehending and taking over source code from other developers. The key challenge is to understand the underlying specification implemented by the software system. Regaining this understanding is more difficult when the source code is the only reliable source of information, documentation is outdated or present only in fragments, and original developers...
Respect for privacy and data confidentiality are two fundamentals that a company must protect. However, a data warehouse can be used as a very powerful mechanism for discovering crucial information, hence the importance of implementing security measures that guarantee data confidentiality by establishing an access control policy. In this direction, several propositions have been made; however,...
The objective of this paper is to introduce a new constraint model for testing the conformity of overriding methods during inheritance operations in an object-oriented (OO) system. This model is based on formal specification techniques and can be used to generate test data in derived classes. The key idea of this conformity approach is to use an optimal constraint and a partitioning technique based...
Modularization is considered one enabler for flexible and highly reconfigurable process plants. These characteristics are needed to overcome deficiencies regarding market volatility and shorter product innovation cycles. Current standardization activities aim at the specification of Module Type Package (MTP) files as a semantic description of modules for fast and efficient integration into process...
Several leading research groups name hardware generation as the next disruptive productivity improvement after IP reuse. Metamodeling and code generation have already demonstrated a speedup by a factor of 3× for the complete implementation phase of a chip. Furthermore, a code size reduction by a factor of 3× was achieved with the hardware generation language (HGL) Chisel.
The most significant issue in finance in recent times is finding systematic ways to summarize and visualize stock market data. Stock market analysis gives entrepreneurs, individuals and institutions useful information about the behavior of the market, helping with investment decisions. Many prediction models have been developed during the last decade. As a comparative study, we will be...