Limited by: Author = Pearlmutter, Barak A. (133 items found)
Displaying Results 126–133 of 133 on page 6 of 6
Transformations of Gaussian Process Priors
(2003)
Murray-Smith, Roderick; Pearlmutter, Barak A.
Abstract:
Gaussian process prior systems generally consist of noisy measurements of samples of the putatively Gaussian process of interest, where the samples serve to constrain the posterior estimate. Here we consider the case where the measurements are instead noisy weighted sums of samples. This framework incorporates measurements of derivative information and of filtered versions of the process, thereby allowing GPs to perform sensor fusion and tomography; it allows certain group invariances (i.e. symmetries) to be weakly enforced; it can be used to model heteroskedasticity in output variance; and under certain conditions it allows the dataset to be dramatically reduced in size. The method is applied to a sparsely sampled image, where each sample is taken using a broad and non-monotonic point spread function.
http://mural.maynoothuniversity.ie/8164/
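The weighted-sum observation model in this abstract can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's code: the kernel, the Gaussian point spread function, and all noise levels below are invented for the demo. With prior f ~ N(0, K) and observations y = W f + ε, the posterior mean over the samples is K Wᵀ(W K Wᵀ + σ²I)⁻¹ y.

```python
import numpy as np

def rbf_kernel(x, lengthscale=0.5):
    # Squared-exponential covariance between all pairs of sample points.
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def posterior_mean(K, W, y, noise_var):
    # Posterior mean of f given y = W f + noise, with prior f ~ N(0, K).
    S = W @ K @ W.T + noise_var * np.eye(W.shape[0])
    return K @ W.T @ np.linalg.solve(S, y)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
K = rbf_kernel(x)
# Small jitter keeps the sampled covariance numerically positive definite.
f_true = rng.multivariate_normal(np.zeros(50), K + 1e-9 * np.eye(50))

# Each measurement is a noisy weighted sum: a broad Gaussian "point spread
# function" centred at one of ten locations (all values invented).
centres = np.linspace(0.1, 0.9, 10)
W = np.stack([np.exp(-0.5 * ((x - c) / 0.1) ** 2) for c in centres])
W /= W.sum(axis=1, keepdims=True)
y = W @ f_true + 0.01 * rng.standard_normal(len(centres))

f_post = posterior_mean(K, W, y, noise_var=1e-4)
```

Because the observation operator W is just a matrix, the same machinery covers derivatives, filters, or tomographic projections by changing its rows.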
Tricks from Deep Learning
(2016)
Baydin, Atilim Gunes; Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
The deep learning community has devised a diverse set of methods to make gradient optimization, using large datasets, of large and highly complex models with deeply cascaded nonlinearities, practical. Taken as a whole, these methods constitute a breakthrough, allowing computational structures which are quite wide, very deep, and with an enormous number and variety of free parameters to be effectively optimized. The result now dominates much of practical machine learning, with applications in machine translation, computer vision, and speech recognition. Many of these methods, viewed through the lens of algorithmic differentiation (AD), can be seen as either addressing issues with the gradient itself, or finding ways of achieving increased efficiency using tricks that are AD-related, but not provided by current AD systems. The goal of this paper is to explain not just those methods of most relevance to AD, but also the technical constraints and mindset which led to their discovery. Af...
http://mural.maynoothuniversity.ie/8115/
Use your powers wisely: resource allocation in parallel channels
(2006)
Pearlmutter, Barak A.; Jaramillo, Santiago
Abstract:
This study evaluates various resource allocation strategies for simultaneous estimation of two independent signals from noisy observations. We focus on strategies that make use of the underlying dynamics of each signal, exploiting the difference in estimation uncertainty between them. This evaluation is done empirically, by exploring the parameter space through computer simulations. Two cases are studied: one in which an initial allocation is maintained during estimation of the variables, and one in which allocation can be dynamically changed at each time step according to the uncertainty of the estimate from each channel. The results suggest that there are conditions in which it is advantageous to assign a high signal-to-noise ratio (SNR) to only one of the signals and guess the other one. Furthermore, comparison between the two allocation strategies shows that the dynamic strategy significantly improves estimation performance in low SNR scenarios when the signals have similar dyna...
http://mural.maynoothuniversity.ie/1361/
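The two allocation strategies compared in this abstract can be mocked up with a pair of scalar Kalman filters. All parameters here are invented, not taken from the paper; the sketch only shows the mechanics of the dynamic rule, which hands the high-SNR observation to the channel with the larger posterior variance, versus a static allocation fixed on one channel.

```python
import numpy as np

def kalman_step(x_est, p, y, a, q, r):
    # One predict/update cycle of a scalar Kalman filter.
    x_pred, p_pred = a * x_est, a * a * p + q
    k = p_pred / (p_pred + r)
    return x_pred + k * (y - x_pred), (1.0 - k) * p_pred

a, q = 0.95, 0.1        # shared AR(1) dynamics and process noise (invented)
r_hi, r_lo = 0.01, 1.0  # observation noise under high / low allocated power

def run(dynamic, steps=2000, seed=1):
    rng = np.random.default_rng(seed)
    x, est, p = np.zeros(2), np.zeros(2), np.ones(2)
    total = 0.0
    for _ in range(steps):
        x = a * x + rng.normal(0.0, np.sqrt(q), 2)
        # Dynamic: the more uncertain channel gets the high-SNR observation;
        # static: channel 0 keeps it throughout.
        hi = int(p[1] > p[0]) if dynamic else 0
        r = np.where(np.arange(2) == hi, r_hi, r_lo)
        y = x + rng.normal(0.0, np.sqrt(r))
        for i in range(2):
            est[i], p[i] = kalman_step(est[i], p[i], y[i], a, q, r[i])
        total += np.sum((x - est) ** 2)
    return total / steps

static_err, dynamic_err = run(dynamic=False), run(dynamic=True)
```

With identical dynamics in both channels, the dynamic rule alternates the high-SNR observation and yields a lower mean squared error than starving one channel permanently, consistent with the similar-dynamics regime described above.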
Using Backpropagation with Temporal Windows to Learn the Dynamics of the CMU Direct-Drive Arm II
(1988)
Goldberg, Kenneth Y.; Pearlmutter, Barak A.
Abstract:
Computing the inverse dynamics of a robot arm is an active area of research in the control literature. We hope to learn the inverse dynamics by training a neural network on the measured response of a physical arm. The input to the network is a temporal window of measured positions; output is a vector of torques. We train the network on data measured from the first two joints of the CMU Direct-Drive Arm II as it moves through a randomly-generated sample of "pick-and-place" trajectories. We then test generalization with a new trajectory and compare its output with the torque measured at the physical arm. The network is shown to generalize with a root mean square error/standard deviation (RMSS) of 0.10. We interpreted the weights of the network in terms of the velocity and acceleration filters used in conventional control theory.
http://mural.maynoothuniversity.ie/10248/
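A toy version of the setup: a one-hidden-layer network trained by plain backpropagation to map a temporal window of positions to torque. The trajectory and the plant (a made-up mass-spring-damper) are synthetic stand-ins, since the CMU arm data is not available here; this illustrates only the input/output structure, not the reported RMSS.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 200.0, 2000)
dt = t[1] - t[0]
pos = np.sin(t) + 0.3 * np.sin(2.7 * t)   # synthetic joint trajectory
vel = np.gradient(pos, dt)
acc = np.gradient(vel, dt)
tau = 1.5 * acc + 0.4 * vel + 2.0 * pos   # invented plant: tau = m*a + b*v + k*x

Wlen = 5                                  # temporal window of past positions
X = np.stack([pos[i - Wlen:i] for i in range(Wlen, len(pos))])
y = tau[Wlen:]

# One hidden layer trained by full-batch gradient-descent backprop.
W1 = rng.normal(0.0, 0.5, (Wlen, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, 16); b2 = 0.0
h = np.tanh(X @ W1 + b1)
mse0 = float(np.mean((h @ W2 + b2 - y) ** 2))  # error at initialization
for _ in range(1000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - y
    dh = np.outer(err, W2) * (1.0 - h ** 2)    # backprop through tanh
    W2 -= 0.05 * h.T @ err / len(y); b2 -= 0.05 * err.mean()
    W1 -= 0.05 * X.T @ dh / len(y); b1 -= 0.05 * dh.mean(axis=0)
h = np.tanh(X @ W1 + b1)
mse = float(np.mean((h @ W2 + b2 - y) ** 2))
```

As in the paper's interpretation, a network that succeeds here must in effect learn finite-difference velocity and acceleration filters across the window.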
Using Polyvariant Union-Free Flow Analysis to Compile a Higher-Order Functional-Programming Language with a First-Class Derivative Operator to Efficient Fortran-like Code
(2008)
Siskind, Jeffrey Mark; Pearlmutter, Barak A.
Abstract:
We exhibit an aggressive optimizing compiler for a functional-programming language which includes a first-class forward automatic differentiation (AD) operator. The compiler’s performance is competitive with FORTRAN-based systems on our numerical examples, despite the potential inefficiencies entailed by support of a functional-programming language and a first-class AD operator. These results are achieved by combining (1) a novel formulation of forward AD in terms of a reflexive mechanism that supports first-class nestable nonstandard interpretation with (2) the migration to compile-time of the conceptually run-time nonstandard interpretation by whole-program interprocedural flow analysis. Categories and Subject Descriptors: G.1.4 [Quadrature and Numerical Differentiation]: Automatic differentiation; D.3.2 [Language Classifications]: Applicative (functional) languages; D.3.4 [Processors]: Code generation, Compilers, Optimization; F.3.2 [Semantics of Programming Languages]: Partial eva...
http://mural.maynoothuniversity.ie/8151/
Using programming language theory to make automatic differentiation sound and efficient
(2008)
Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
This paper discusses a new Automatic Differentiation (AD) system that correctly and automatically accepts nested and dynamic use of the AD operators, without any manual intervention. The system is based on a new formulation of AD as highly generalized first-class citizens in a λ-calculus, which is briefly described. Because the λ-calculus is the basis for modern programming-language implementation techniques, integration of AD into the λ-calculus allows AD to be integrated into an aggressive compiler. We exhibit a research compiler which does this integration. Using novel analysis techniques, it accepts source code involving free use of a first-class forward AD operator and generates object code which attains numerical performance comparable to, or better than, the most aggressive existing AD systems.
http://mural.maynoothuniversity.ie/1730/
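The forward AD operator that these papers make first-class can be illustrated with the standard dual-number construction. This sketch is generic textbook forward AD, not the authors' λ-calculus formulation, and it omits the tagging needed to make nested use of the operator sound (the very problem their system addresses).

```python
import math

class Dual:
    """A number paired with its tangent (derivative) component."""
    def __init__(self, primal, tangent=0.0):
        self.primal, self.tangent = primal, tangent
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.primal + o.primal, self.tangent + o.tangent)
    __radd__ = __add__
    def __mul__(self, o):
        # Product rule carried on the tangent component.
        o = self._lift(o)
        return Dual(self.primal * o.primal,
                    self.primal * o.tangent + self.tangent * o.primal)
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.primal), math.cos(self.primal) * self.tangent)

def derivative(f, x):
    # Seed the input's tangent with 1.0; the output's tangent is df/dx.
    return f(Dual(x, 1.0)).tangent

# d/dx [x sin(x) + 3x] = sin(x) + x cos(x) + 3
g = derivative(lambda x: x * x.sin() + 3 * x, 2.0)
```

Running this interpretation at run time is exactly the "nonstandard interpretation" that the compilers described above migrate to compile time.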
VC Dimension of an Integrate-and-Fire Neuron Model
(1996)
Zador, Anthony M.; Pearlmutter, Barak A.
Abstract:
We compute the VC dimension of a leaky integrate-and-fire neuron model. The VC dimension quantifies the ability of a function class to partition an input pattern space, and can be considered a measure of computational capacity. In this case, the function class is the class of integrate-and-fire models generated by varying the integration time constant T and the threshold θ, the input space they partition is the space of continuous-time signals, and the binary partition is specified by whether or not the model reaches threshold at some specified time. We show that the VC dimension diverges only logarithmically with the input signal bandwidth N. We also extend this approach to arbitrary passive dendritic trees. The main contributions of this work are (1) it offers a novel treatment of computational capacity of this class of dynamic system; and (2) it provides a framework for analyzing the computational capabilities of the dynamic systems defined by networks of spiking neurons.
http://mural.maynoothuniversity.ie/8131/
VC Dimension of an Integrate-and-Fire Neuron Model
(1999)
Zador, Anthony M.; Pearlmutter, Barak A.
Abstract:
We find the VC dimension of a leaky integrate-and-fire neuron model. The VC dimension quantifies the ability of a function class to partition an input pattern space, and can be considered a measure of computational capacity. In this case, the function class is the class of integrate-and-fire models generated by varying the integration time constant τ and the threshold θ, the input space they partition is the space of continuous-time signals, and the binary partition is specified by whether or not the model reaches threshold and spikes at some specified time. We show that the VC dimension diverges only logarithmically with the input signal bandwidth N, where the signal bandwidth is determined by the noise inherent in the process of spike generation. For reasonable estimates of the signal bandwidth, the VC dimension turns out to be quite small (<10). We also extend this approach to arbitrary passive dendritic trees. The main contributions of this work are (1) ...
http://mural.maynoothuniversity.ie/8132/
Item Type
Book chapter (15)
Conference item (29)
Journal article (66)
Report (23)
Peer Review Status
Peer-reviewed (107)
Non-peer-reviewed (26)
Year
2019 (1)
2018 (3)
2017 (2)
2016 (6)
2015 (3)
2014 (4)
2013 (8)
2012 (2)
2011 (2)
2009 (3)
2008 (15)
2007 (8)
2006 (9)
2005 (9)
2004 (7)
2003 (6)
2002 (9)
2001 (2)
2000 (6)
1999 (3)
1998 (2)
1996 (4)
1995 (3)
1994 (4)
1993 (2)
1991 (2)
1990 (2)
1989 (1)
1988 (2)
1987 (1)
1986 (2)