We present the results of a survey of tool use in software modeling education conducted from December 2016 to March 2017. The survey was conducted among 150 professors who taught modeling in 30 countries from all regions of the world. Professors reported using 32 modeling tools. Top motivations for choosing tools are simplicity of learning and installing, as well as the tools being free and supporting...
The purpose of this research study is to investigate the design problems of, and requirements for, automated diagnostic systems based on image processing. To realize this purpose, the authors designed a conceptual model, a use case diagram, and a prototype of the automated system's object model.
A 10-year working partnership between engineering instructors and communication instructors for a course called “Computer-Aided Engineering: Applications to Biomedical Process” aims to enrich the professional skills of the engineering students overall. This paper provides a window into the ways the course helps students improve their engineering/technical writing and presentation skills. This year,...
We present a framework for comprehensive model-based systematic design and staged verification of Human-in-the-Loop Cyber-Physical Systems (HiLCPS) to handle their inherent complexity. HiLCPS are systems in which humans are in the middle of the feedback loop between the cyber and the physical components. HiLCPS designers require modern tools and simpler model-based approaches for design and verification...
Software requirement selection aims to find a subset of requirements (a so-called optimal set) that delivers the highest customer value for a software release while keeping the cost within the budget. Several industrial studies, however, have demonstrated that the requirements of software projects are intricately interdependent and that these interdependencies impact the values of requirements. Furthermore, the...
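The selection problem described above is, at its core, a budget-constrained value maximization (a knapsack-style problem). A minimal sketch, using hypothetical requirement names, values, and costs chosen only for illustration (the abstract's interdependency modeling is not included):

```python
from itertools import combinations

# Hypothetical requirements: (name, customer value, cost) -- illustrative only
requirements = [("R1", 10, 4), ("R2", 6, 3), ("R3", 8, 5), ("R4", 4, 2)]
budget = 9

# Exhaustive search over all subsets; fine for a handful of requirements,
# though real projects need heuristics or integer programming.
best_value, best_set = 0, ()
for r in range(len(requirements) + 1):
    for subset in combinations(requirements, r):
        cost = sum(c for _, _, c in subset)
        value = sum(v for _, v, _ in subset)
        if cost <= budget and value > best_value:
            best_value = value
            best_set = tuple(name for name, _, _ in subset)

print(best_set, best_value)  # the optimal set within budget
```

Interdependencies (which the abstract highlights) would turn the additive value sum into a more complex objective, which is what makes the industrial problem hard.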
A distributed system is considered that carries out computational tasks according to the master-worker paradigm. A master has a set of computational tasks to resolve. She assigns each task to a set of workers over the Internet, instead of computing the task locally. For each task, each worker replies to the master with the task result. Since the task was not computed locally, the master cannot trust...
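A common way to handle untrusted replies in this setting is redundant assignment with majority voting: the master gives the same task to several workers and accepts a result only if a strict majority of workers agree. A minimal sketch of that cross-checking step (the function name and example values are hypothetical, not from the paper):

```python
from collections import Counter

def majority_result(replies):
    """Accept the reply returned by a strict majority of workers.

    The master cannot trust any single worker, so it cross-checks
    redundant answers; returns None if no strict majority exists.
    """
    tally = Counter(replies)
    result, votes = tally.most_common(1)[0]
    return result if votes > len(replies) // 2 else None

# Hypothetical task: three workers compute the same task; one is faulty.
replies = [42, 42, 41]
print(majority_result(replies))  # 42
```

The trade-off is redundancy cost: every extra worker per task increases trustworthiness but multiplies the total work the system performs.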
It is well known that not all intrusions can be prevented and that additional lines of defense are needed to deal with intruders. However, most current approaches use honeynets, relying on the assumption that simply attracting intruders into honeypots would thwart the attack. In this paper, we propose a different and more realistic approach, which aims at delaying intrusions, so as to control the probability...
Analog-/Mixed-Signal (AMS) design verification is one of the most challenging and time-consuming tasks in today's complex system-on-chip (SoC) designs. In contrast to digital system design, AMS designers have to deal with a continuous state space of conservative quantities, highly nonlinear relationships, non-functional influences, etc., enlarging the number of possibly critical scenarios to infinity...
Because modern engineering products require more and more functionality, the models used in the design of these products grow larger and more complex. A suitable mechanism for modularizing models would help handle this complexity. However, current approaches in the Model-Driven Engineering field have limited support for modularity. This is the gap that our research addresses. We want to tackle...
Service-oriented computing has been successfully adopted by industry. However, this raises new challenges, especially with respect to service selection and ranking in dynamic environments. Current solutions for service selection and ranking lack the flexibility to handle dynamic environments. This paper proposes integrating algorithms based on Formal Concept Analysis theory to extend service-oriented...
In this paper we present an advanced software tool designed for the multi-criteria optimization of self-organizing maps (SOMs) for their effective implementation in hardware. The problems we have to deal with in this type of implementation are radically different from those that occur in purely software realizations. Therefore, although many systems are available to simulate NNs,...
Software effort estimation is a primary requisite in the software development life cycle. Many software projects fail due to inaccurate effort estimation. To overcome this shortcoming, various researchers have introduced many techniques in the past. Many techniques exist for estimating software project effort, such as learning-oriented, model-based, and expert-based techniques. This...
Tailoring is the mechanism of adapting a software process to the needs of a project. Model-Driven Engineering (MDE) provides a formal basis and tool infrastructure for automatic software process tailoring. However, using an MDE approach can become awkward for most process engineers, because it requires knowledge of MDE concepts and formalisms to implement the required models and tailoring transformations...
Climate simulation and weather forecasting codes are among the most complex examples of scientific software. Moreover, many of them are written in Fortran, making them some of the largest and most complex Fortran codes ever developed. For companies and researchers creating Fortran development tools -- IDEs, static analyzers, refactoring tools, etc. -- it is helpful to study these codes to understand...
Simple Function Point is a functional size measurement method that can be used in place of IFPUG Function Point, but requires a much simpler (hence less time- and effort-consuming) measurement process. Simple Function Point was designed to be equivalent to IFPUG Function Point in terms of numerical results. This paper reports an empirical study aimed at verifying the effectiveness of Simple Function...
Enterprise Architecture (EA) is a strategy employed by enterprises to align their business and Information Technology (IT). In an EA project, the EA Implementation Methodology (EAIM) plays a critical role in managing, developing, and maintaining the project. There are complexities in current EAIMs' methods, practices, and modelling, which cause ineffectiveness in EA implementation. This research aims...
We employ model-based software (MBS) techniques to assess the quality of adaptation in a network system S in the presence of uncontrollable external environment conditions. The lack of complete knowledge about the I/O behavior of S arises from the large dimensionality of the input parameter space and its interactions with the various components of S. The MBS techniques adapt the operations of S over...
In this paper we present an interpreter framework designed for measuring static and dynamic characteristics of a Scade model [1]. While some software metrics have become industrial standards in software development, and a variety of software measurement tools exist for popular languages, no such tools exist for Scade. Our main achievement is that we developed an interpreter for metrics,...
The quality of the battle damage location process (BDLP) is very important for the efficiency of damage location, and it also reflects a person's location ability. However, measuring the complexity of the BDLP is a rather new area of research with only a small number of contributions. The complexity of the BDLP is therefore put forward to assess the BDLP's quality. The diagram entropy model, which is applied to the program...
Software evolution is an essential activity that adapts existing software to changes in requirements. It has played a central role in the overall software lifecycle in recent years. It is generally acknowledged that the software employed in real-world environments must continuously evolve and adapt. Many studies, on the other hand, have suggested that software evolution consumes a large part of development...