There are many software-based tools that can be used to evaluate radio-ecological impacts. The Gaussian plume model is a simple dispersion model that is easy to implement and to develop further. This paper describes the application of the Gaussian plume model to simulate the dispersion of I-131 released around the Serpong Nuclear Area (Kawasan Nuklir Serpong, KNS). The software development consists of: mathematical...
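The core of a Gaussian plume dispersion code is the standard steady-state plume equation with ground reflection. The sketch below is a minimal, generic implementation of that textbook equation, not the paper's software; the function name, parameter names, and the illustrative numbers are assumptions (they are not KNS source-term data).

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration of a continuous point release at a receptor,
    using the standard Gaussian plume equation with ground reflection.

    Q        source strength (e.g. Bq/s of I-131)
    u        mean wind speed at release height (m/s)
    y, z     crosswind and vertical receptor coordinates (m)
    H        effective release height (m)
    sigma_y, sigma_z  dispersion coefficients (m), evaluated at the
                      receptor's downwind distance for a given
                      atmospheric stability class
    """
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative ground-level concentration on the plume centreline:
c = gaussian_plume(Q=1.0e6, u=3.0, y=0.0, z=0.0,
                   H=60.0, sigma_y=80.0, sigma_z=40.0)
```

In a full model the dispersion coefficients would be computed from the downwind distance and a stability classification (e.g. Pasquill-Gifford curves) rather than passed in directly.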
A novel academic recommendation algorithm, MTAR (multi-type academic recommendation), which integrates resource content and user behaviors, is proposed to process five types of academic resources simultaneously and to recommend academic resources of interest to users quickly and precisely. The MTAR algorithm profiles the five kinds of academic resources using four features, including resource type,...
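Profiling heterogeneous resources over a shared feature space and scoring them against a user profile is the basic mechanism such content-based recommenders build on. The sketch below is a generic cosine-similarity ranker under that assumption, not the MTAR algorithm itself; the resource names and feature vectors are hypothetical.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical catalog: resources of different types profiled over the
# same feature dimensions (e.g. topic weights derived from content).
resources = {
    "paper_A":   [1.0, 0.8, 0.0, 0.2],
    "dataset_B": [0.9, 0.7, 0.1, 0.0],
    "slides_C":  [0.0, 0.1, 1.0, 0.9],
}

def recommend(user_profile, catalog, k=2):
    """Return the k resources most similar to the user's profile."""
    ranked = sorted(catalog.items(),
                    key=lambda kv: cosine(user_profile, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

top = recommend([1.0, 0.9, 0.0, 0.1], resources, k=1)
```

A multi-type recommender like MTAR would additionally weight these content scores with behavioral signals; this sketch shows only the content-similarity half.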
Kapok is a Python library created to estimate forest height using repeat-pass polarimetric synthetic aperture radar interferometry (PolInSAR). The library can import data collected by NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) sensor. The library includes functions for data visualization, coherence region plotting, coherence optimization, and inversion of the random volume...
A new radiometry and design framework has been introduced in the latest Digital Imaging and Remote Sensing Image Generation model (DIRSIG5) that allows for faster simulations while streamlining the generation of high-fidelity radiometric data. The same framework that allows for improved computational performance has also modularized simulation components to allow for extensive interchangeability based...
Computer software continues to grow in size, but it is difficult to collect the information needed to support software development and maintenance. Data mining technology can be used to automatically discover knowledge from software testing data, which helps improve the software development process and software quality. First, correlation analysis is adopted to study the relevance among the...
Cyber-Physical Systems (CPS) are a key technology enabling the development of highly automated and autonomous maritime systems. The development of new, complex, and distributed safety-critical systems increases the challenges of testing due to the variety of Verification and Validation (V&V) methods, the strictly required confidence in the functional correctness of heterogeneous cooperating systems, and the management...
Modern Cyber-Physical Systems are often driven by a plethora of controllers that are connected with each other and their environment. To guarantee safe and robust execution of these systems, their control units have to strictly fulfill certain properties, which calls for the use of formal analysis methods in the software development process. We present the combination of the model-based engineering technique...
A new software reliability model based on a mixed Poisson process, in which the failure rate follows an Inverse Gaussian distribution, is proposed. By using the Empirical Bayes estimate of the failure rate, our estimate depends only on the number of failures and the total past time; it is not necessary to know the exact instants at which the failures occurred. We simulate the failure detection...
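The failure-generation process this model assumes can be simulated with the standard library alone: draw each system's rate from an Inverse Gaussian distribution, then draw a Poisson count given that rate. The sketch below illustrates only that generative process under assumed parameterization (mean mu, shape lam); it is not the paper's Empirical Bayes estimator, and all names are hypothetical.

```python
import math
import random

def sample_inverse_gaussian(mu, lam, rng):
    """One Inverse Gaussian draw via the Michael-Schucany-Haas (1976)
    transformation method."""
    y = rng.gauss(0.0, 1.0) ** 2
    x = (mu + mu * mu * y / (2.0 * lam)
         - mu / (2.0 * lam) * math.sqrt(4.0 * mu * lam * y + (mu * y) ** 2))
    if rng.random() <= mu / (mu + x):
        return x
    return mu * mu / x

def sample_poisson(rate, rng):
    """One Poisson draw via Knuth's multiplicative method
    (adequate for moderate rates)."""
    L = math.exp(-rate)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def mixed_poisson_failures(mu, lam, T, n_systems, seed=0):
    """Failure counts over observation time T for n_systems software
    instances, each with its own Inverse-Gaussian-distributed rate."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_systems):
        rate = sample_inverse_gaussian(mu, lam, rng)
        counts.append(sample_poisson(rate * T, rng))
    return counts

counts = mixed_poisson_failures(mu=1.0, lam=2.0, T=10.0, n_systems=200, seed=1)
```

The mixing of rates is what produces the overdispersion (variance exceeding the mean) that distinguishes this model from a plain Poisson process.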
Testing software-intensive systems, for us, has traditionally focused on verifying and validating compliance and conformance to specification, as well as some general non-functional requirements such as performance of different components. In recent years, we have seen a strong move towards more data intensive systems. We have found that these types of systems require a different approach for testing...
Software modelling involves multiple problems such as defect detection, synchronization, and authorship assessment. These problems are usually addressed by complex tools for static and manual analysis of models. However, the origin of these problems lies in the dynamic part of the software modelling process: the activities of developers, which can be processed incrementally right in developers' environments. Therefore, methods...
A workflow management system (WfMS) should possess self-managing non-functional attributes so that it remains resilient to changes in the runtime environment in the Business Process Management (BPM) domain. With this motivation, we propose a resilience mechanism for WfMS that builds on the methods proposed in our earlier research work, aiming to achieve our final goal,...
Mathematical models are the foundation of numerical simulation of optoelectronic devices. We present a concept for a machine-actionable as well as human-understandable representation of the mathematical knowledge they contain and the domain-specific knowledge they are based on. We propose to use theory graphs to formalize mathematical models and model pathway diagrams to visualize them. We illustrate...
Human-computer interfaces (HCI) in pervasive computing environments (PvCE) face many kinds of devices and different user preferences. Creating user interfaces in the traditional way for every device and user preference in each user interaction is unacceptable, especially when the user interface has to be generated dynamically. This paper proposes a so-called type model for simple user interaction in PvCE...
Implementing cryptography on Internet-of-Things (IoT) devices in a way that is resilient against side-channel analysis has so far been a task suitable only for specialist software designers with access to a sophisticated testing facility. Recently, a novel tool, ELMO, has been developed, which offers the potential to enable non-specialist software developers to evaluate their code w.r.t. power...
Too often, capacity planning activities that are crucial to software performance are being pushed to late development phases where trivial measurement-based assessment techniques can be employed on enterprise applications that are nearing completion. This procedure is highly inefficient, time consuming, and may result in disproportionately high correction costs to meet existing service level agreements...
This work addresses the automatic generation of the resources required for the assisted creation of domain models according to specialized views of their meta-model. The task of a designer who builds models compliant with a complex domain meta-model is eased if the model editor requests the information according to a specific view of the meta-model based on the conceptualization or the specific construction...
Virtual worlds and avatar-based interactive computer games have been a hype among consumers and researchers for many years now. In recent years, such games on mobile devices have also become increasingly important. However, most virtual worlds require proprietary clients and authoring environments and lack portability, which limits their usefulness for targeting wider audiences, e.g. in consumer...
Test automation in distributed systems requires new methods of signal simulation for stimulating the distributed system. The increasing complexity of electric/electronic (E/E) systems increases the testing effort. The main challenge is reducing time-consuming manual stimulation while improving the quality of testing. Currently used systems for test automation with a software-based...
Nowadays, the adoption of renewable energy sources distributed across the city is crucial for planning and developing the future Smart City. Accurate simulation and modelling of energy sources, such as photovoltaic (PV) panels, is necessary to evaluate both economic and environmental benefits. With the growth of renewable sources in the city, simulations of energy production have become crucial for...
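A first-order PV production estimate, often used before detailed simulation, multiplies panel area, module efficiency, annual plane-of-array irradiation, and a performance ratio covering system losses. The sketch below implements that simplified yield formula as an assumption; it is not the paper's model, and the function name and example numbers are illustrative only.

```python
def pv_annual_energy(area_m2, efficiency, irradiation_kwh_m2,
                     performance_ratio=0.75):
    """Simplified annual PV yield: E = A * eta * H * PR, where
    A  panel area (m^2)
    eta  module efficiency (fraction)
    H  annual irradiation on the panel plane (kWh/m^2/year)
    PR performance ratio accounting for inverter, wiring,
       temperature and soiling losses (fraction)
    Returns energy in kWh/year."""
    return area_m2 * efficiency * irradiation_kwh_m2 * performance_ratio

# e.g. 20 m^2 of 18%-efficient panels under 1500 kWh/m^2/year insolation:
energy = pv_annual_energy(20.0, 0.18, 1500.0)  # about 4050 kWh/year
```

A city-scale simulation would replace the single irradiation figure with time-resolved irradiance per rooftop, but the same energy balance underlies both.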
As the most active project in the Hadoop ecosystem these days (Zaharia, 2014), Spark is a fast, general-purpose engine for large-scale data processing. Thanks to its advanced Directed Acyclic Graph (DAG) execution engine and in-memory computing mechanism, Spark runs programs up to 100x faster than Hadoop MapReduce in memory, or 10x faster on disk (Apache, 2016). However, Spark performance is impacted...