The design of today's systems-on-chip (SoCs) raises difficult issues, in particular regarding verification. In their early design phases, hardware/software embedded systems are commonly described as ESL (Electronic System Level) models, so that their functional and transactional behavior can be analyzed by simulation. To enhance this validation process, we have previously developed a framework...
A major challenge for the European electronic industry is to enhance productivity while reducing costs and ensuring quality in development, integration and maintenance. Model-Driven Engineering (MDE) principles and techniques have already shown promising capabilities but still need to scale to support real-world scenarios implied by the full deployment and use of complex electronic components and...
Two emerging architectural paradigms, Software Defined Networking (SDN) and Network Function Virtualization (NFV), enable the deployment and management of Service Function Chains (SFCs). An SFC is an ordered sequence of abstract Service Functions (SFs), e.g., firewalls, VPN gateways, and traffic monitors, that packets have to traverse on the route from source to destination. While this appealing...
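The core idea of an SFC, an ordered sequence of service functions that every packet must traverse, can be illustrated with a minimal sketch. The function names, the dictionary-based packet format, and the blocked address are assumptions for exposition, not drawn from any specific SDN/NFV framework:

```python
# A Service Function Chain (SFC) modeled as an ordered list of functions.
# Each service function takes a packet and returns it (possibly modified),
# or None if the packet is dropped.

def firewall(packet):
    # Drop packets from a blocked source address (illustrative rule).
    if packet.get("src") == "10.0.0.66":
        return None
    return packet

def traffic_monitor(packet, counters={}):
    # Count packets per source; the mutable default holds simple state.
    counters[packet["src"]] = counters.get(packet["src"], 0) + 1
    packet["monitored"] = True
    return packet

def vpn_gateway(packet):
    # Mark the payload as encapsulated (placeholder for real encryption).
    packet["payload"] = "encap(" + packet["payload"] + ")"
    return packet

# The chain itself: just an ordered sequence of service functions.
SFC = [firewall, traffic_monitor, vpn_gateway]

def traverse(packet, chain):
    """Pass a packet through every service function in order."""
    for sf in chain:
        packet = sf(packet)
        if packet is None:  # dropped by some service function
            return None
    return packet

result = traverse({"src": "10.0.0.5", "payload": "data"}, SFC)
```

The ordering matters: placing the firewall first means dropped packets never reach the monitor or gateway, which is exactly the property SFC deployment and management systems must preserve.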
Software reuse in the early stages is a key issue in the rapid development of applications. This article introduces a metaprocess-oriented methodology based on the reuse of models as software assets, starting from the domain specification and analysis phases. The approach includes the definition of a conceptual level to adequately represent the domain and a reuse process to specify the metaprocess as...
The flexibility and programmability offered by Network Functions Virtualization (NFV) are expected to catalyze the upcoming “softwarization” of the network through software implementation of networking functionalities on virtual machines (VMs). While looking into the different issues raised by NFV, numerous works have demonstrated how performance, power consumption and, consequently,...
The current stage in the development of information systems is characterized by the use of multi-component distributed architectures and methods for the intelligent processing and analysis of large amounts of heterogeneous data. At present, the problem of ensuring the reliability and safety of such systems has not been sufficiently investigated. The purpose of this work is to develop a method for computer...
The emergence of power as a first-class design constraint has fueled the proposal of a growing number of run-time power optimizations. Many of these optimizations trade off power-saving opportunity against a variable performance loss that depends on application characteristics and program phase. Furthermore, the potential benefits of these optimizations are sometimes non-additive, and it can be difficult...
The study deals with the design of an information system for saddlery and harness manufacturers in Venezuela, intended to enable effective and efficient performance for its users. To this end, the paper analyzes Porter's Generic Strategies and the theoretical foundations of the Balanced Scorecard (Kaplan and Norton) as strategic management...
The increasing number of complex embedded systems used in safety-relevant tasks produces major challenges in the field of safety analysis. This paper presents a simulation-based safety analysis that addresses these challenges. The presented approach consists of two parts: an Error Effect Simulation (EES) and a graph-based specification. The EES is composed of a system simulation with fault injection...
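The principle behind an error effect simulation, running a system model with and without an injected fault and comparing the traces, can be sketched in a few lines. The sensor model, the stuck-at fault type, and the simulation loop are illustrative assumptions, not the paper's EES implementation:

```python
# Fault injection into a simple system simulation: wrap a component so
# that a "stuck-at" fault activates at a chosen time step, then compare
# the faulty trace against the nominal one.

def sensor(t):
    """Nominal sensor model: a simple ramp signal (assumed for exposition)."""
    return 2.0 * t

def inject_stuck_at(component, stuck_value, active_from):
    """Wrap a component so its output freezes at stuck_value from a given step."""
    def faulty_component(t):
        if t >= active_from:
            return stuck_value  # fault active: output stuck
        return component(t)
    return faulty_component

def simulate(component, steps):
    """Run the component over discrete time steps and record its outputs."""
    return [component(t) for t in range(steps)]

nominal = simulate(sensor, 5)
faulty = simulate(inject_stuck_at(sensor, 0.0, active_from=3), 5)

# The error effect is the divergence between the two traces.
effect = [abs(a - b) for a, b in zip(nominal, faulty)]
```

Comparing per-step divergence like this is the simplest way to localize when a fault starts to affect system behavior; a real EES would propagate such effects through a full system model.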
E-learning is a promising research area, as e-learning systems are expected to increase enrollment and improve the quality of education. Adaptive e-learning systems, traditionally focused on content personalization, need to cope with continuously changing requirements and a changing environment. Indeed, the specification and management of quality attributes of such systems, supported throughout the whole lifecycle...
Technical Debt (TD) is a metaphor for decisions and events that degrade code quality. A large volume of Technical Debt in an application may cause losses for the organisation that maintains it. For organisations that outsource software development, monitoring TD in incoming development packages may be an efficient strategy for ensuring the quality of the acquired product. This work brings...
Self-adaptive software systems (SASS) are equipped with feedback loops to adapt autonomously to changes in the software or its environment. In established fields, such as embedded software, sophisticated approaches have been developed to systematically study feedback loops early in the development. In order to cover the particularities of feedback, techniques like one-way and in-the-loop simulation...
This paper examines knowledge construction in a distributed learning environment supported by social web tools. Research data was gathered from online asynchronous discussions in a first-year master's degree course in multimedia in education. Our analysis was modelled on a validated analysis model and results indicate that, despite a significant percentage in the phase of sharing and comparing information,...
With the proliferation of digital measurement devices, such as smart meters on distribution systems and phasor measurement units on transmission systems, power companies find themselves inundated with ever-growing volumes of data and long for efficient tools and analytical techniques to identify, digest and utilize critical information to improve the efficiency and reliability of grid operations...
Fluent linear temporal logic is a formalism for specifying properties of event-based systems, based on propositions called fluents, defined in terms of activating and deactivating events. In this paper, we propose complementing the notion of fluent with the related concept of counting fluent. As opposed to the boolean nature of fluents, counting fluents are numerical values that enumerate event occurrences,...
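The distinction between a boolean fluent and a counting fluent can be illustrated with a minimal sketch over an event trace. The class-based API and the buffer example are assumptions for exposition, not the formalism's concrete syntax:

```python
# A fluent is a boolean proposition set true by activating events and
# false by deactivating events; a counting fluent is a numeric value
# incremented and decremented by event occurrences.

class Fluent:
    """Boolean fluent: true between an activating and a deactivating event."""
    def __init__(self, activating, deactivating, initially=False):
        self.activating = set(activating)
        self.deactivating = set(deactivating)
        self.value = initially

    def observe(self, event):
        if event in self.activating:
            self.value = True
        elif event in self.deactivating:
            self.value = False

class CountingFluent:
    """Counting fluent: a number that enumerates event occurrences."""
    def __init__(self, incrementing, decrementing, initially=0):
        self.incrementing = set(incrementing)
        self.decrementing = set(decrementing)
        self.value = initially

    def observe(self, event):
        if event in self.incrementing:
            self.value += 1
        elif event in self.decrementing:
            self.value -= 1

# Example: a buffer that is "non_empty" (boolean fluent) and holds
# "size" items (counting fluent), driven by enqueue/dequeue events.
non_empty = Fluent(activating={"enqueue"}, deactivating={"became_empty"})
size = CountingFluent(incrementing={"enqueue"}, decrementing={"dequeue"})

for ev in ["enqueue", "enqueue", "dequeue", "enqueue"]:
    non_empty.observe(ev)
    size.observe(ev)
```

The fluent can only say whether the buffer holds anything; the counting fluent additionally tracks how many items it holds, which is exactly the expressiveness gap the abstract describes.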
Open-source repository data can be automatically mined using sequence mining methods to provide high-level feedback on project status. GitHub.com projects are acquired, sequence-mined, clustered, and regressed to analyze project characteristics. Such results can be presented to project managers, as part of a display generated by an automated monitoring system. Such monitoring systems provide high-level...
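The pipeline the abstract outlines, turning per-project event sequences into features and then grouping similar projects, can be sketched in miniature. The event names, the bigram features, and the toy similarity-based grouping are illustrative assumptions standing in for the paper's sequence-mining, clustering, and regression steps:

```python
# Per-project repository event sequences are converted to bigram-count
# feature vectors; projects are then compared by feature similarity.

from collections import Counter
from itertools import tee

def bigrams(seq):
    """All adjacent event pairs in a sequence."""
    a, b = tee(seq)
    next(b, None)
    return list(zip(a, b))

def features(sequence, vocab):
    """Fixed-order vector of bigram counts for one project."""
    counts = Counter(bigrams(sequence))
    return [counts.get(v, 0) for v in vocab]

# Toy data: per-project sequences of mined repository events.
projects = {
    "repo_a": ["commit", "commit", "issue", "commit"],
    "repo_b": ["commit", "commit", "commit", "issue"],
    "repo_c": ["issue", "issue", "commit", "issue"],
}

# Vocabulary = all observed bigrams, in a fixed sorted order.
vocab = sorted({bg for s in projects.values() for bg in bigrams(s)})
vectors = {name: features(seq, vocab) for name, seq in projects.items()}

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def nearest(name):
    """Most similar other project by raw inner product (toy grouping step)."""
    others = [n for n in vectors if n != name]
    return max(others, key=lambda n: dot(vectors[name], vectors[n]))
```

A real system would replace the inner-product grouping with proper clustering and add a regression over project characteristics, but the shape of the pipeline, sequences to features to groups, is the same.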
Parameter studies are a widespread scientific research methodology used with modern High Performance and High Throughput Computing infrastructures such as Clouds. Increasingly often, parameter study experiments are oriented towards generating large amounts of data describing complicated processes and phenomena. It is becoming clear that new software for supporting such large-scale experiments...
Current systems for enacting scientific experiments, and in particular simulation workflows, do not support multi-scale and multi-field problems unless they are coupled at the level of the mathematical model. We present a life cycle that utilizes the notion of choreographies to enable the trial-and-error modeling and execution of multi-scale and/or multi-field simulations. The life cycle exhibits...
Production environments in large enterprises are expensive to maintain, support and enhance. Organisations take extreme care to provision and configure their environments and are very wary of change. This is because a small change may trigger a domino effect of failures, resulting in costly system downtime. Consequently, engineers are looking for reliable ways of testing environmental changes...
The inherent dynamic nature of Cyber Physical Systems (CPS) requires novel mechanisms to support their evolution over their operational lifetime. Though the development of CPS typically integrates the software (cyber) design with the physical domain, this contribution concentrates mainly on another essential integration plane: the software design level. This paper presents an approach to support the...