We study the descriptive complexity of summation problems in Abelian groups and semigroups. In general, an input to the summation problem consists of an Abelian semigroup G, explicitly represented by its multiplication table, and a subset X of G. The task is to determine the sum over all elements of X.
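The input representation described above can be sketched concretely: the semigroup is given by an explicit multiplication table, and the sum over X is a fold of the operation. The example semigroup below (Z_4 under addition mod 4), the subset, and the function names are illustrative choices for this sketch, not taken from the paper.

```python
from functools import reduce

def semigroup_sum(table, X):
    """Fold the semigroup operation over the non-empty list X.

    table[a][b] encodes the product of elements a and b; because the
    semigroup is Abelian, the result does not depend on the order of X.
    """
    return reduce(lambda a, b: table[a][b], X)

# Multiplication table of Z_4 under addition modulo 4 (an Abelian group,
# hence in particular an Abelian semigroup).
Z4 = [[(a + b) % 4 for b in range(4)] for a in range(4)]

print(semigroup_sum(Z4, [1, 2, 3]))  # 1 + 2 + 3 = 6 ≡ 2 (mod 4), prints 2
```

The descriptive-complexity question is about how much logical power is needed to define such a fold; the fold itself, as shown, is trivial to compute.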
We consider the problem of counting the number of answers to a first-order formula on a finite structure. We present and study an extension of first-order logic in which algorithms for this counting problem can be naturally and conveniently expressed, in senses that are made precise and that are motivated by the wish to understand tractable cases of the counting problem.
A successor-invariant first-order formula is a formula that has access to an auxiliary successor relation on a structure's universe, but whose truth value is independent of the particular interpretation of this relation. It is well known that successor-invariant formulas are more expressive on finite structures than plain first-order formulas without a successor relation. This naturally raises the...
This paper presents a comparative analysis of the complexity-accuracy tradeoff in state-of-the-art RF MIMO transmitter mitigation models. The complexity and accuracy of the candidate models depend on the basis functions considered in these models. Therefore, a brief description of the mitigation models is presented, accompanied by derivations of the model complexities in terms of the number of FLOPs...
Shah, Rashmi and Ramchandran recently considered a model for Private Information Retrieval (PIR) where a user wishes to retrieve one of several R-bit messages from a set of n non-colluding servers. Their security model is information-theoretic. Their paper is the first to consider a model for PIR in which the database is not necessarily replicated, thereby allowing distributed storage techniques to be...
A structure enjoys the Herbrand property if, whenever it satisfies an equality between some terms, these terms are unifiable. On such structures the expressive power of equalities becomes trivial, as their semantic satisfiability is reduced to a purely syntactic check.
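As an illustration of the purely syntactic check the abstract refers to, here is a toy Robinson-style unification procedure. The term encoding (variables as capitalized strings, compound terms as tuples headed by a function symbol) is an assumption made for this sketch, not the paper's formalism.

```python
def is_var(t):
    """A variable is a string starting with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Chase variable bindings in the substitution."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, arg, subst) for arg in t[1:])
    return False

def unify(s, t, subst=None):
    """Return a unifying substitution (dict) or None if none exists."""
    if subst is None:
        subst = {}
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        return None if occurs(s, t, subst) else {**subst, s: t}
    if is_var(t):
        return unify(t, s, subst)
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        for a, b in zip(s[1:], t[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(('f', 'X'), ('f', ('g', 'a'))))  # {'X': ('g', 'a')}
print(unify('X', ('f', 'X')))                # None (occurs check fails)
```

On a structure with the Herbrand property, checking whether an equality between two terms can hold reduces exactly to such a unifiability test.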
This paper presents a methodology allowing the execution of Bounded Model Checking (via CBMC) and Abstract Interpretation (via Frama-C) analyses on large, real case, C codebases. Then the paper shows some of the results that can nowadays be achieved with relatively new tools like Clang Static Analyzer and Facebook Infer. Finally, a brief introduction on SonarQube and how it can be used to display...
A configurable Business Process (BP) is an abstract BP that engineers customize with respect to specific requirements. To keep track of the multiple and recurrent customizations that lead to a set of derived BPs, this paper proposes a knowledge-based approach that uses a new Process Structure Tree called configurable PST (cPST). A cPST abstracts a separate variability option of the configurable BP...
Detecting anomalous behaviors of cloud platforms is one of the critical tasks for cloud providers. Every anomalous behavior can potentially cause incidents, especially unnoticed and/or unknown issues, which severely harm the SLA (Service Level Agreement). Existing solutions generally monitor the cloud platform at different layers and then detect anomalies based on rules or learning algorithms on monitoring...
Motivated by the question of whether the recently introduced Reduced Cutset Coding (RCC) [1], [2] offers rate-complexity performance benefits over conventional context-based conditional lossless coding for sources with two-dimensional Markov structure, this paper compares several row-centric coding strategies that vary in the amount of conditioning as well as whether a model or an empirical table...
It was first observed by John Bell that quantum theory predicts correlations between measurement outcomes that lie beyond the explanatory power of local hidden variable theories. These correlations have traditionally been studied extensively in the probabilistic framework. A drawback of this perspective is that one is then forced to use in a single argument the outcomes of mutually-exclusive measurements...
We review recent progress on the definition of randomness with respect to conditional probabilities and a generalization of van Lambalgen's theorem (Takahashi 2006, 2008, 2009, 2011). In addition, we generalize Kjos-Hanssen's theorem (2010) to the case when the consistency of the posterior distributions holds. Finally, we propose a definition of random sequences with respect to conditional probabilities as the...
Recently, the separated fragment (SF) has been introduced and proved to be decidable. Its defining principle is that universally and existentially quantified variables may not occur together in atoms. The known upper bound on the time required to decide SF's satisfiability problem is formulated in terms of quantifier alternations: Given an SF sentence ∃z⃗∀x⃗1∃y⃗1…∀x⃗n∃y⃗n.ψ in which ψ is quantifier...
Probabilistic systems that accumulate quantities such as energy or cost are naturally modelled by cost chains, which are Markov chains whose transitions are labelled with a vector of numerical costs. Computing information on the probability distribution of the total accumulated cost is a fundamental problem in this model. In this paper, we study the so-called cost problem, which is to compute quantiles...
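A cost chain and the distribution of its total accumulated cost can be illustrated with a small Monte Carlo sketch. The two-state chain, its (scalar) costs, and the threshold below are made-up values for illustration; the paper studies exact algorithms for such quantile questions, not sampling.

```python
import random

# Each entry: state -> list of (probability, successor, transition cost).
# "done" is absorbing (no outgoing transitions).
CHAIN = {
    "s0": [(0.5, "s0", 2.0), (0.5, "done", 1.0)],
    "done": [],
}

def sample_total_cost(chain, start, rng):
    """Run the chain from `start` until absorption; return the total cost."""
    state, total = start, 0.0
    while chain[state]:
        r, acc = rng.random(), 0.0
        for p, nxt, cost in chain[state]:
            acc += p
            if r < acc:
                break
        total += cost
        state = nxt
    return total

def prob_cost_at_most(chain, start, threshold, n=100_000, seed=0):
    """Monte Carlo estimate of P(total accumulated cost <= threshold)."""
    rng = random.Random(seed)
    hits = sum(sample_total_cost(chain, start, rng) <= threshold
               for _ in range(n))
    return hits / n

# Here the total cost is 1 + 2k with probability 2^-(k+1),
# so P(cost <= 3) = 1/2 + 1/4 = 0.75; the estimate should be close.
print(prob_cost_at_most(CHAIN, "s0", 3.0))
```

The cost problem asks for such probabilities (and the corresponding quantiles) exactly, which is what makes it algorithmically interesting.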
We present new data structures for quasistrict higher categories, in which associativity and unit laws hold strictly. Our approach has low axiomatic complexity compared to traditional algebraic approaches, and gives a practical method for performing calculations in quasistrict 4-categories. It is amenable to computer implementation, and we exploit this to give a machine-verified algebraic proof that...
We study the complexity of the inference problem for propositional circumscription (the minimal inference problem) over arbitrary finite domains. The problem is of fundamental importance in nonmonotonic logics and commonsense reasoning. The complexity of the problem for the two-element domain has been completely classified [Durand, Hermann, and Nordh, Trichotomy in the complexity of minimal inference,...
Prefrontal cortex (PFC) is thought to support the ability to focus on goal-relevant information by filtering out irrelevant information, a process akin to dimensionality reduction. Here, we find direct evidence of goal-directed data compression within medial PFC during learning, such that the degree of neural compression predicts an individual’s ability to selectively attend to concept-specific information...
Small-scale clouds (SCs) often suffer from resource under-provisioning during peak demand, leading to inability to satisfy service level agreements (SLAs) and consequent loss of customers. One approach to address this problem is for a set of autonomous SCs to share resources among themselves in a cost-induced cooperative fashion, thereby increasing their individual capacities (when needed) without...
Evaluating the spectral radius (the largest absolute eigenvalue of the graph adjacency matrix), an extremely useful graph property, requires excessive computing resources for large graphs. This problem becomes especially challenging, for instance with distributed or remote storage, when accessing the whole graph itself is expensive in terms of memory or bandwidth. One approach to tackle this challenge...
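For contrast with the access-limited setting the abstract describes, here is the conventional full-access baseline: power iteration over the entire adjacency matrix, which touches every edge on every pass. The small example graph is an illustrative stand-in, not from the paper.

```python
def spectral_radius(adj, iters=200):
    """Estimate the spectral radius of a symmetric adjacency matrix
    (given as a list of rows) by power iteration.

    Each pass multiplies the full matrix by the current vector, so the
    whole graph must be accessible -- exactly the cost one would like
    to avoid for large or remotely stored graphs.
    """
    n = len(adj)
    v = [1.0] * n
    radius = 0.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w) or 1.0
        radius = norm  # infinity-norm growth factor -> |largest eigenvalue|
        v = [x / norm for x in w]
    return radius

# Complete graph K4: the spectral radius is n - 1 = 3.
K4 = [[0 if i == j else 1 for j in range(4)] for i in range(4)]
print(spectral_radius(K4))  # → 3.0
```

Any scheme that samples or streams only part of the graph must approximate what this baseline computes exactly.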
The paper formulates methods to measure the trustworthiness of a network system S under hostile environment conditions incident on S. How well the system S meets the QoS expectations of applications (i.e., the QoS capability of S) is quantitatively measured, say on a [0,1] scale. We employ model-based assessment tools (e.g., PO-MDP) to benchmark the QoS capability by stress-testing S with...