This paper presents the results of introducing Virtual Reality into the production line of a jewelry manufacturer in the state of Bahia, in a case study conducted over an eleven-month period between 2008 and 2009 by a startup consisting of two professional designers and one information-technology professional. The production flow of this jewelry factory suffered losses due to continuous communication difficulties...
Context: A growing number of software organizations have been adopting Continuous Delivery (CDE) and Continuous Deployment (CD) practices. Researchers have started investing significant effort in studying different aspects of CDE and CD. Many studies refer to CDE (i.e., where an application is potentially capable of being deployed) and CD (i.e., where an application is automatically deployed to production...
As the demand for more bandwidth, agile infrastructures, and services grows, it becomes challenging for service providers like GEANT to manage the proprietary underlay while keeping costs low. In such a scenario, Software Defined Networking (SDN), open hardware, and open source software prove to be key components in addressing those challenges. After one year of development, SDX-L2 and BoD, the SDN-ization...
Today's research projects propose a modular manufacturing environment for production sites, one that adapts itself autonomously and makes manufacturing decisions without human interaction. It is therefore necessary that the next generation of production lines, especially intralogistics transportation systems, be designed to be more adaptable and flexible. The object of this paper is a cyber-physical...
In modern manufacturing, large data sets from different sources are permanently generated along the production chain. These data are supposed to be used to optimize products and production chains. However, in most cases, process participants focus only on acquiring the data and then leave it to decision makers for interpretation. While there is nothing wrong with that in principle, comparable results might...
The increasing amount of sensor data gathered in Industry 4.0 enables comprehensive data analysis software that creates value-adding opportunities. As companies often cannot implement such software themselves, and as they typically do not want to hand their data to external scientists, they commission the scientists to build the required software so that it can be executed locally. However, installing, configuring,...
By using a virtual campus, users can not only immerse themselves in the campus scene but also interact with the system, which helps them understand the campus quickly and thoroughly. However, the virtual campus itself does not provide the relevant data about the campus scene, which severely limits its application. To solve this problem, oriented to Beijing Institute of Petrochemical Technology...
This work presents the major characteristics of, and lessons learned from, the development and prototype implementation of an event-oriented, cloud-based SCADA system constructed using a microservice architecture. The microservices are then utilized to propose an approach for implementing product-driven production systems under the RAMI4.0 specification.
Deadlocks are critical problems afflicting parallel applications, causing software to hang with no further progress. Existing detection tools suffer not only from significant recording performance overhead, but also from excessive memory and/or storage overhead. In addition, they may generate numerous false alarms. Subsequently, after problems have been reported, tremendous manual effort is required...
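The hang described above can be sketched minimally (a hypothetical illustration, not the paper's tool): two threads acquire the same pair of locks in opposite order, the classic lock-order inversion. Using a timeout on the second acquisition turns a potential hang into a reportable event rather than a silent freeze.

```python
import threading

lock_a, lock_b = threading.Lock(), threading.Lock()
events = []  # records what each thread managed to do

def worker(first, second, name):
    # Each thread takes its "first" lock, then tries the other one.
    with first:
        events.append(f"{name} holds first lock")
        # Opposite acquisition order across threads can deadlock;
        # a timeout makes the cycle observable instead of hanging.
        if second.acquire(timeout=0.2):
            events.append(f"{name} holds both locks")
            second.release()
        else:
            events.append(f"{name} timed out: potential deadlock")

t1 = threading.Thread(target=worker, args=(lock_a, lock_b, "T1"))
t2 = threading.Thread(target=worker, args=(lock_b, lock_a, "T2"))
t1.start(); t2.start()
t1.join(); t2.join()
print(events)
```

Whether the cycle actually forms depends on scheduling, which is exactly why detection tools must record acquisition histories rather than rely on reproducing the hang.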
The need to improve the quality of management at minimal cost, together with the complexity of the managed object's structure and the functions it performs, leads to an increase in uncertainties that must be taken into account. Applying typification and unification makes it possible to reduce the cost of producing new products and to increase the level of production automation.
The current requirements for personalized customization drive the manufacturing industry to make constant adjustments, which means a longitudinally unobstructed and transversely reconfigurable system is crucial for enterprises to remain competitive. However, launching a new system or reconfiguring an existing one is cost-intensive and time-consuming. This paper proposes a new structure...
The aim of this paper is to show the strengths and weaknesses of process mining tools in post-delivery validation. This is illustrated with two use cases from a real-world system. We also indicate what kind of research must be done to make process mining tools more usable for validation purposes.
The article is devoted to the experience of the Russian and Soviet Fund of Algorithms and Programs. The fund, created more than half a century ago, played a notable role in establishing a regulatory framework of requirements for replicable software. The fund's experience also contributed to the development of copyright for software. Currently, due to the widespread use of free software, the fund performs...
Requirements engineering provides several practices for analyzing how a user wants to interact with future software. Mockups, prototypes, and scenarios are suitable for understanding usability issues and user requirements early. Nevertheless, users are often dissatisfied with the usability of the resulting software. Apparently, previously explored information was lost or no longer accessible during the development...
Scaling clusters is no longer the only struggle in moving towards exascale in HPC. While scaling components such as the network and file systems is a widely accepted need, monitoring is often left behind in the procurement of these large systems. Monitoring tends to be an afterthought that is expected to be incorporated into existing infrastructure. While that often works for...
As defense budgets have decreased in recent years while equipment has become more complex, Automatic Test Systems have also become more complex. At the same time, systems-engineering requirements demand more and better requirements verification. One recent program had over 1000 requirements that needed some form of verification. This paper discusses the techniques and statistics used to capture, evaluate,...
Automatic testing is a widely adopted technique for improving software quality. Software developers add, remove, and update test methods and test classes as part of the software development process, as well as during the evolution phase following the initial release. In this work we conduct a large-scale study of 61 popular open source projects and report the relationships we have established between...
ATEs are used in production test to decrease test design time and test setup time. ATEs can be classified as General Purpose ATE (GP-ATE) and Specific Purpose ATE (SP-ATE). While general-purpose ATEs shorten test design time, they are expensive because they have many interfaces to test different types of connections. If the planned production volume is high, SP-ATE is preferred...
The article presents ontology-oriented tools and applications for creating an intelligent environment for engineering interaction in enterprise resource planning systems. It is noted that the use of modular ontologies to provide data sustainability is a key factor in the "cross-linking" of local systems and heterogeneous applications within the environment of enterprise resource planning systems.
In this paper we present a security analysis of electronic devices that considers the lifecycle properties of embedded systems. We first define a generic model of the electronic device lifecycle, showing the complex interactions between the numerous assets and actors. The method is illustrated through a case study: a connected insulin pump. The lifecycle-induced vulnerabilities are analyzed using...