We study model networks of spiking neurons where synaptic inputs interact in terms of nonlinear functions. These nonlinearities are used to represent the spatial grouping of synapses on the dendrites and to model the computations performed at local branches. We analyze the complexity of learning in these networks in terms of the VC dimension and the pseudo dimension. Polynomial upper bounds on these...
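The grouping of synapses onto nonlinear dendritic branches can be sketched as follows (a minimal illustration; the function names, the choice of tanh as the branch nonlinearity, and the thresholded soma are assumptions for the sketch, not taken from the paper):

```python
import numpy as np

def dendritic_unit(x, branch_weights, soma_weights, g=np.tanh):
    """Each dendritic branch applies a nonlinearity g to its own weighted
    group of synaptic inputs; the soma sums the branch outputs and
    thresholds the result (spike = 1.0, no spike = 0.0)."""
    branch_outputs = np.array([g(w @ x) for w in branch_weights])
    return float(soma_weights @ branch_outputs >= 0.0)

x = np.array([0.5, -1.0, 2.0, 0.3])
branches = [np.array([1.0, 0.5, 0.0, 0.0]),   # branch 1 groups synapses 1-2
            np.array([0.0, 0.0, -0.7, 1.2])]  # branch 2 groups synapses 3-4
print(dendritic_unit(x, branches, np.array([1.0, 1.0])))  # → 0.0
```

With a linear g this reduces to an ordinary perceptron; the branch nonlinearities are what the paper's VC-dimension and pseudo-dimension analysis accounts for.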
It has remained an open question whether there exist product unit networks with constant depth that have superlinear VC dimension. In this paper we give an answer by constructing two-hidden-layer networks with this property. We further show that the pseudo dimension of a single product unit is linear. These results bear witness to the cooperative effects on the computational capabilities of product...
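For reference, a product unit computes a weighted product rather than a weighted sum, with the exponents as the trainable parameters (a minimal sketch for positive inputs; the helper name is illustrative):

```python
import numpy as np

def product_unit(x, w):
    """A product unit computes prod_i x_i ** w_i, equivalently
    exp(sum_i w_i * log x_i) for positive inputs; the exponents w
    are the trainable parameters."""
    return float(np.prod(np.power(x, w)))

# With exponents (2, -1) the unit computes x1**2 / x2:
print(product_unit(np.array([3.0, 2.0]), np.array([2.0, -1.0])))  # → 4.5
```

Networks built from such units are the objects whose VC dimension and pseudo dimension the paper analyzes.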
This paper presents new results on confidence bounds for the generalization performance of perceptrons in regression problems. It is shown that the probability of obtaining a generalization error greater than the empirical error plus a precision ε depends on the number of inputs and on the magnitude of the coefficients of the combination. The result presented does not require bounding...
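A bound of the kind described typically takes the following form (a hedged sketch of the generic shape only; the exact dependence δ(·) and its constants are not given in this excerpt):

```latex
\Pr\Big[\, R(f) > \hat{R}_m(f) + \varepsilon \,\Big] \;\le\; \delta\big(m,\, d,\, \|w\|,\, \varepsilon\big)
```

where \(R(f)\) is the generalization (true) risk, \(\hat{R}_m(f)\) the empirical risk on \(m\) samples, \(d\) the number of inputs, and \(\|w\|\) the magnitude of the perceptron's coefficients; the abstract's claim is that δ depends on \(d\) and \(\|w\|\).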
Recent theoretical work applying the methods of statistical learning theory has highlighted the interest of old, well-known learning paradigms such as Bayesian inference and Gibbs algorithms. Sample complexity bounds have been given for such paradigms in the zero-error case. This paper studies the behavior of these algorithms without this assumption. Results include uniform convergence of Gibbs...
Based on a statistical mechanics approach, we develop a method for approximately computing average case learning curves and their sample fluctuations for Gaussian process regression models. We give examples for the Wiener process and show that universal relations (that are independent of the input distribution) between error measures can be derived.
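The Gaussian process regression setting, with the Wiener process covariance k(s, t) = min(s, t) mentioned in the abstract, can be sketched as follows (a minimal posterior-computation sketch; function names, the noise level, and the data are illustrative, not from the paper):

```python
import numpy as np

def gp_posterior(train_t, train_y, test_t, kernel, noise=1e-2):
    """Posterior mean and variance of a GP regression model at the
    test points, given noisy observations (train_t, train_y)."""
    K = kernel(train_t[:, None], train_t[None, :]) + noise * np.eye(len(train_t))
    k_star = kernel(test_t[:, None], train_t[None, :])
    alpha = np.linalg.solve(K, train_y)
    mean = k_star @ alpha
    # Diagonal of the posterior covariance: prior variance minus explained part.
    var = kernel(test_t, test_t) - np.einsum('ij,ij->i', k_star,
                                             np.linalg.solve(K, k_star.T).T)
    return mean, var

wiener = lambda s, t: np.minimum(s, t)  # Wiener process covariance k(s,t) = min(s,t)
t = np.array([0.2, 0.5, 0.9]); y = np.array([0.1, -0.3, 0.4])
mean, var = gp_posterior(t, y, np.array([0.4, 0.7]), wiener)
```

Averaging the resulting test error over random draws of the training inputs is what produces the learning curves whose mean and sample fluctuations the paper approximates.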
Complexity of neural networks, measured by the number of hidden units, is studied in terms of rates of approximation. Limitations on improving upper bounds of order O(n^{-1/2}) on such rates are investigated for perceptron networks with some periodic and sigmoidal activation functions.
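For context, upper bounds of this order typically take the Maurey–Jones–Barron form (a standard sketch, not the paper's exact statement):

```latex
\big\| f - f_n \big\| \;\le\; \frac{c_f}{\sqrt{n}}
```

where \(f_n\) is a network with \(n\) hidden units and \(c_f\) depends on the target function \(f\) (e.g. on a variation norm with respect to the activation-function dictionary); the question studied is to what extent this \(O(n^{-1/2})\) order can be improved.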