As technology advances, Computer Vision plays an important role in enhancing smart computing systems that help people overcome obstacles in their daily lives. One common and troublesome problem is the limits of human memory, especially for keeping track of personal items. It is frustrating for people to waste time searching for lost items manually, relying on recall or notes. This motivates...
Feature extraction and matching are two crucial components in person Re-Identification (ReID). The large pose deformations and complex view variations exhibited by captured person images significantly increase the difficulty of learning and matching features from person images. To overcome these difficulties, in this work we propose a Pose-driven Deep Convolutional (PDC) model to learn...
To solve deep metric learning problems and produce feature embeddings, current methodologies commonly use a triplet model to minimise the relative distance between samples from the same class and maximise the relative distance between samples from different classes. Though successful, the training convergence of this triplet model can be compromised by the fact that the vast majority of the...
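The triplet objective described in this abstract can be sketched as follows. This is a minimal illustration of the standard margin-based triplet loss, not the paper's implementation; the function name and the default margin of 0.2 are assumptions for the example.

```python
def triplet_loss(anchor, positive, negative, margin=0.2):
    """Margin-based triplet loss over embedding vectors (plain lists of floats).

    Pulls the anchor towards a same-class (positive) sample and pushes it
    away from a different-class (negative) sample by at least `margin`.
    """
    # Squared Euclidean distance to the same-class sample.
    d_pos = sum((a - p) ** 2 for a, p in zip(anchor, positive))
    # Squared Euclidean distance to the different-class sample.
    d_neg = sum((a - n) ** 2 for a, n in zip(anchor, negative))
    # Hinge: zero loss once the negative is farther than the positive by margin.
    return max(d_pos - d_neg + margin, 0.0)
```

When the negative is already far enough away, the loss is zero, which is exactly why most randomly sampled triplets contribute no gradient and convergence can stall, as the abstract hints.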
This study aimed to aid the enormous effort required to analyze phraseological writing competence by developing an automatic evaluation tool for texts. We attempted to measure both second language (L2) writing proficiency and text quality. In our research, we adapted the CollGram technique that searches a reference corpus to determine the frequency of each pair of tokens (bi-grams) and calculates...
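The CollGram-style step described here, looking up each bigram of a text in a reference corpus, can be sketched as below. This is a simplified illustration under the assumption that both corpus and text are pre-tokenised lists; function names are hypothetical.

```python
from collections import Counter
from itertools import tee


def bigrams(tokens):
    """Return the list of adjacent token pairs (bigrams)."""
    a, b = tee(tokens)
    next(b, None)  # offset the second iterator by one token
    return list(zip(a, b))


def bigram_frequencies(reference_corpus, text):
    """Map each bigram of `text` to its frequency in the reference corpus."""
    ref_counts = Counter(bigrams(reference_corpus))
    return {bg: ref_counts[bg] for bg in bigrams(text)}
```

From these raw frequencies, CollGram-type measures then derive association scores (e.g. mutual information or t-scores) per bigram; that scoring step is truncated in the abstract and omitted here.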
The cloud computing ecosystem comprises hundreds of providers, offering diverse computing services, incompatible APIs, and significantly different pricing models. Cloud application management platforms hide the heterogeneity of the services and APIs, allowing, to varying degrees, portability between providers. These tools remove technical barriers to switching providers, but they do not provide a...
With the constant increase in the number of interconnected devices in today's networks, and the high demand for adaptiveness, more and more computations can be designed according to self-organisation principles. In this context, a key building block for large-scale system coordination, called gradient, is used to estimate distances in a fully-distributed way: it is the basis for a vast variety of higher...
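The gradient building block mentioned above is classically computed as a distributed Bellman-Ford-style iteration: every node repeatedly takes the minimum over its neighbours of (neighbour's estimate + link distance), with source nodes pinned to zero. A minimal synchronous-round sketch, with assumed data structures:

```python
def gradient_round(values, neighbors, sources):
    """One synchronous round of the gradient building block.

    values:    dict node -> current distance estimate
    neighbors: dict node -> list of (neighbor, link_distance) pairs
    sources:   set of source nodes (held at distance 0)
    """
    new = {}
    for node, nbrs in neighbors.items():
        if node in sources:
            new[node] = 0.0  # sources are the zero of the distance field
        else:
            # Minimum over neighbours of their estimate plus the link cost.
            candidates = [values[n] + d for n, d in nbrs]
            new[node] = min(candidates, default=float("inf"))
    return new
```

Iterating this round to a fixed point yields each node's (metric) distance to the nearest source, in a fully distributed way; real implementations run it asynchronously and self-stabilise under topology changes.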
As demands on new software increase, new approaches are needed to help developers ensure Quality of Service (QoS) for their offered service. In this paper we present a QoS modeling approach that complements and extends the standard microservice and component-based software engineering tools by giving the software engineer information on what Non-Functional Requirements (NFRs) and quality constraints...
A software quality model is a well-accepted way of assessing high-level quality characteristics (e.g., maintainability) by aggregation from low-level metrics. The aggregation method in a software quality model denotes how low-level metrics are aggregated into high-level quality characteristics. Most existing quality models adopt the weighted linear aggregation method. The main drawback of weighted linear...
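The weighted linear aggregation the abstract refers to is simply a weighted sum of normalised metric values. A minimal sketch (the metric names and weights below are made up for illustration, not taken from any particular quality model):

```python
def aggregate_quality(metrics, weights):
    """Weighted linear aggregation of low-level metrics into one
    high-level quality characteristic.

    metrics: dict metric_name -> normalised value in [0, 1]
    weights: dict metric_name -> weight (typically summing to 1)
    """
    assert set(metrics) == set(weights), "every metric needs a weight"
    return sum(weights[m] * metrics[m] for m in metrics)
```

One known drawback of this scheme (plausibly the one the truncated sentence goes on to name) is compensation: a very poor score on one metric can be masked by good scores elsewhere, since the sum has no way to express "all metrics must be acceptable".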
This work presents an approximate model of spatially distributed Markov processes for a GIS-based real-time disaster Decision Support System. The proposed model uses possibility measures based on a rough- or fuzzy-set approach. The transition distribution of a vague Markov jump-type process can be determined using the possibility values. The dynamics of the processes are represented as the motion of...
In this paper, the calculation of network robustness is addressed. After a brief description of the most relevant metrics, our Network Robustness Simulator (NRS) is presented, together with its structure and working model. The NRS computes robustness in a dynamic scenario and copes with multiple failures and different types of attack. In particular, the addition of the epidemic-based model...
In this work, an artificial-intelligence approach to predicting crude oil price is presented. Decision Trees (DT) are utilized in the modeling and prediction of crude oil prices from a dataset covering 24 years. The input attributes to the decision tree are key economic indicators that are believed to affect crude oil price, and the system has as its output the numerical value of the predicted crude oil price...
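A regression decision tree of the kind used here recursively splits the input space on thresholds of the economic indicators and predicts the mean target value in each leaf. A one-split "stump" is enough to show the core mechanism (this is a generic sketch, not the paper's model; feature and threshold handling are simplified to a single numeric attribute):

```python
def fit_stump(xs, ys):
    """Fit a one-split regression tree (a stump) on one numeric feature.

    Tries every observed value as a threshold, keeps the split that
    minimises total squared error, and predicts the leaf mean on each side.
    """
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue  # a split must leave samples on both sides
        mean_l, mean_r = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - mean_l) ** 2 for y in left)
               + sum((y - mean_r) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, mean_l, mean_r)
    _, t, mean_l, mean_r = best
    return lambda x: mean_l if x <= t else mean_r
```

A full decision tree repeats this split selection recursively inside each leaf, and with several input attributes it also searches over which attribute to split on.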
Memristors have extended their influence beyond memory to logic and in-memory computing. Memristive logic design, the methodology of designing logic circuits using memristors, is an emerging concept whose growth is fueled by the quest for energy efficient computing systems. As a result, many memristive logic families have evolved with different attributes, and a mature comparison among them is needed...
Wireless communication networks are crucial to the operations of all sectors of modern society, such as commercial mobile communication and military communication. High-altitude electromagnetic pulse (HEMP) poses severe threats to the survivability and robustness of communication networks. Due to the high complexity and hard-to-prove properties of large-scale networks, network-level HEMP effect evaluation...
Early design-space evaluation of computer systems is usually performed using performance models such as detailed simulators and RTL-based models. Unfortunately, it is very challenging (often impossible) to run many emerging applications on detailed performance models owing to their complex application software stacks, significantly long run times, system dependencies and the limited speed/potential...
Recently, wireless technology has experienced rapid growth to meet user demand and push toward the limits of system performance. Simulation and verification frameworks play an important role in accelerating the investigation of technology proofs of concept, field trials, and large-scale commercial prototyping. In this paper, we present a system-level simulation of a heterogeneous model and unified HW/SW...
Previous models based on Deep Convolutional Neural Networks (DCNN) for face verification focused on learning face representations. The face features extracted from the models are fed into additional metric learning to improve verification accuracy. The models extract high-dimensional face features to solve a multi-class classification problem. This results in a dependency of a model on specific training...
Cloud federation allows interconnected Cloud Computing environments of different Cloud Service Providers (CSPs) to share their resources and deliver more efficient service performance. However, each CSP provides a different level of security in terms of cost and performance. Instead of consuming the whole set of Cloud services that are required to deploy an application through a single CSP, consumers...
Performance is one of the main aspects that should be taken into consideration during the design, development, tuning and optimisation of computer networks supported by cloud computing platforms (CCPs). Queueing network models (QNMs) of CCPs constitute essential quantitative tools of investigation towards identifying acceptable levels of quality-of-service (QoS), whether for upgrading an existing...
Image quality assessment (IQA) plays a crucial role in monitoring quality control in image communication systems, and in benchmarking and optimizing parameters in enhancement algorithms. The full-reference IQA metrics require a good-quality reference image, obtaining which may not be practical in real-life applications. This paper, therefore, proposes a no-reference IQA metric based on the hypothesis...
Several applications in numerical scientific computing involve very large sparse matrices with a regular or irregular sparse structure. These matrices can be stored using special compression formats (storing only non-zero elements) to reduce memory space and processing time. The choice of the optimal format is a critical process that involves several criteria. The general context of this work is to...
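As an illustration of the compression formats mentioned here, the widely used CSR (Compressed Sparse Row) format stores only the non-zero values, their column indices, and per-row offsets. A minimal conversion sketch (one possible format among the several this work compares; the helper name is made up):

```python
def to_csr(dense):
    """Convert a dense 2-D list into CSR arrays.

    Returns (values, col_idx, row_ptr) where row i's non-zeros occupy
    values[row_ptr[i]:row_ptr[i+1]] with columns col_idx[row_ptr[i]:row_ptr[i+1]].
    """
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)   # keep only non-zero entries
                col_idx.append(j)  # remember which column each came from
        row_ptr.append(len(values))  # row boundary in the values array
    return values, col_idx, row_ptr
```

Other formats (COO, ELL, diagonal, blocked variants) trade memory footprint against access patterns differently, which is exactly why choosing the optimal one is the non-trivial, criteria-driven process the abstract describes.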