Limited By: Author = Pearlmutter, Barak A.
133 items found
Displaying Results 51–75 of 133 on page 3 of 6
Divide-and-conquer checkpointing for arbitrary programs with no user annotation
(2018)
Siskind, Jeffrey Mark; Pearlmutter, Barak A.
Abstract:
Classical reverse-mode automatic differentiation (AD) imposes only a small constant-factor overhead in operation count over the original computation, but has storage requirements that grow, in the worst case, in proportion to the time consumed by the original computation. This storage blowup can be ameliorated by checkpointing, a process that reorders application of classical reverse-mode AD over an execution interval to trade off space versus time. Application of checkpointing in a divide-and-conquer fashion to strategically chosen nested execution intervals can break classical reverse-mode AD into stages which can reduce the worst-case growth in storage from linear to sublinear. Doing this has been fully automated only for computations of particularly simple form, with checkpoints spanning execution intervals resulting from a limited set of program constructs. Here we show how the technique can be automated for arbitrary computations. The essential innovation is to apply the technique...
http://mural.maynoothuniversity.ie/10241/
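The divide-and-conquer idea in the abstract above can be sketched for the simplest case: reverse-mode differentiation through a chain of n identical steps, recursing on execution-interval halves so that only O(log n) checkpoints are live at once, at the cost of O(log n)-fold recomputation. This is a minimal illustrative sketch with a toy scalar step function, not the paper's mechanism for arbitrary programs.

```python
# Binary checkpointing for reverse mode over a chain of n steps.
# `step` and `step_vjp` are toy stand-ins for one primal step and its
# vector-Jacobian product; they are not from the paper.

def step(x):
    """One primal step of the computation (toy example)."""
    return x * x + 1.0

def step_vjp(x, xbar):
    """Vector-Jacobian product of `step` at primal input x."""
    return 2.0 * x * xbar

def reverse_checkpoint(x0, n, ybar):
    """Backpropagate ybar through n steps starting from x0, keeping
    O(log n) live checkpoints instead of all n intermediates."""
    if n == 1:
        return step_vjp(x0, ybar)
    mid = n // 2
    # Forward sweep to the midpoint; retain only that one checkpoint.
    x_mid = x0
    for _ in range(mid):
        x_mid = step(x_mid)
    # Recurse on the second half first, then on the first half.
    mbar = reverse_checkpoint(x_mid, n - mid, ybar)
    return reverse_checkpoint(x0, mid, mbar)
```

Naive reverse mode would store all n intermediates; here each recursion level stores one checkpoint and recomputes the rest, which is the linear-to-sublinear storage trade the abstract refers to.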
Doing the twist: diagonal meshes are isomorphic to twisted toroidal meshes
(1996)
Pearlmutter, Barak A.
Abstract:
We show that a k × n diagonal mesh is isomorphic to an (n+k)/2 × (n+k)/2 twisted toroidal mesh, i.e., a network similar to a standard (n+k)/2 × (n+k)/2 toroidal mesh, but with opposite-handed twists of (n−k)/2 in the two directions, which results in a loss of ((n−k)/2)² nodes.
http://mural.maynoothuniversity.ie/1419/
Dreams, mnemonics, and tuning for criticality
(2013)
Pearlmutter, Barak A.; Houghton, Conor J.
Abstract:
According to the tuning-for-criticality theory, the essential role of sleep is to protect the brain from supercritical behaviour. Here we argue that this protective role determines the content of dreams and any apparent relationship to the art of memory is secondary to this.
http://mural.maynoothuniversity.ie/6551/
Dynamic recurrent neural networks
(1990)
Pearlmutter, Barak A.
Abstract:
We survey learning algorithms for recurrent neural networks with hidden units and attempt to put the various techniques into a common framework. We discuss fixpoint learning algorithms, namely recurrent backpropagation and deterministic Boltzmann Machines, and non-fixpoint algorithms, namely backpropagation through time, Elman's history cutoff nets, and Jordan's output feedback architecture. Forward propagation, an online technique that uses adjoint equations, is also discussed. In many cases, the unified presentation leads to generalizations of various sorts. Some simulations are presented, and at the end, issues of computational complexity are addressed.
http://mural.maynoothuniversity.ie/5505/
Efficient Implementation of a Higher-Order Language with Built-In AD
(2016)
Siskind, Jeffrey Mark; Pearlmutter, Barak A.
Abstract:
We show that Automatic Differentiation (AD) operators can be provided in a dynamic language without sacrificing numeric performance. To achieve this, general forward and reverse AD functions are added to a simple high-level dynamic language, and support for them is included in an aggressive optimizing compiler. Novel technical mechanisms are discussed, which have the ability to migrate the AD transformations from run time to compile time. The resulting system, although only a research prototype, exhibits startlingly good performance. In fact, despite the potential inefficiencies entailed by support of a functional-programming language and a first-class AD operator, performance is competitive with the fastest available preprocessor-based Fortran AD systems. On benchmarks involving nested use of the AD operators, it can even dramatically exceed their performance.
http://mural.maynoothuniversity.ie/8113/
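The classic way to expose a forward-AD operator in a dynamic language, absent the compiler machinery the abstract describes, is run-time overloading on dual numbers. The hand-rolled sketch below is illustrative only and supports just `+`, `*`, and `sin`; it is in no way the paper's optimizing-compiler system, only the semantics such a system accelerates.

```python
# Forward-mode AD via dual numbers: each value carries a primal and a
# tangent component, and arithmetic propagates both.
import math

class Dual:
    def __init__(self, primal, tangent=0.0):
        self.primal, self.tangent = primal, tangent
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.primal + o.primal, self.tangent + o.tangent)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.primal * o.primal,
                    self.primal * o.tangent + self.tangent * o.primal)
    __rmul__ = __mul__

def sin(x):
    """sin lifted to duals: chain rule gives cos(x) times the tangent."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.primal), math.cos(x.primal) * x.tangent)
    return math.sin(x)

def derivative(f, x):
    """Forward-mode derivative of f at x: push a unit tangent through."""
    return f(Dual(x, 1.0)).tangent
```

For example, `derivative(lambda x: x * x + sin(x), 1.5)` yields 2·1.5 + cos(1.5). The run-time tagging this requires is exactly the overhead the paper's compile-time migration of the AD transformations is designed to eliminate.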
Equivalence Proofs for Multi-Layer Perceptron Classifiers and the Bayesian Discriminant Function
(1990)
Hampshire, John B.; Pearlmutter, Barak A.
Abstract:
This paper presents a number of proofs that equate the outputs of a Multi-Layer Perceptron (MLP) classifier and the optimal Bayesian discriminant function for asymptotically large sets of statistically independent training samples. Two broad classes of objective functions are shown to yield Bayesian discriminant performance. The first class are “reasonable error measures,” which achieve Bayesian discriminant performance by engendering classifier outputs that asymptotically equate to a posteriori probabilities. This class includes the mean-squared error (MSE) objective function as well as a number of information theoretic objective functions. The second class are classification figures of merit (CFM_mono), which yield a qualified approximation to Bayesian discriminant performance by engendering classifier outputs that asymptotically identify the maximum a posteriori probability for a given input. Conditions and relationships for Bayesian discriminant functional equivalence are given f...
http://mural.maynoothuniversity.ie/5504/
Evolving the Incremental Lambda Calculus into a Model of Forward Automatic Differentiation (AD)
(2016)
Kelly, Robert; Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
Formal transformations somehow resembling the usual derivative are surprisingly common in computer science, with two notable examples being derivatives of regular expressions and derivatives of types. A newcomer to this list is the incremental λ-calculus, or ILC, a "theory of changes" that deploys a formal apparatus allowing the automatic generation of efficient update functions which perform incremental computation. The ILC is not only defined, but given a formal machine-understandable definition, accompanied by mechanically verifiable proofs of various properties, including in particular correctness of various sorts. Here, we show how the ILC can be mutated into propagating tangents, thus serving as a model of Forward Accumulation Mode Automatic Differentiation. This mutation is done in several steps. These steps can also be applied to the proofs, resulting in machine-checked proofs of the correctness of this model of forward AD.
http://mural.maynoothuniversity.ie/8116/
Fast accurate MEG source localization using a multilayer perceptron trained with real brain noise
(2002)
Jun, Sung Chan; Pearlmutter, Barak A.; Nolte, Guido
Abstract:
Iterative gradient methods such as Levenberg–Marquardt (LM) are in widespread use for source localization from electroencephalographic (EEG) and magnetoencephalographic (MEG) signals. Unfortunately, LM depends sensitively on the initial guess, necessitating repeated runs. This, combined with LM's high per-step cost, makes its computational burden quite high. To reduce this burden, we trained a multilayer perceptron (MLP) as a real-time localizer. We used an analytical model of quasi-static electromagnetic propagation through a spherical head to map randomly chosen dipoles to sensor activities according to the sensor geometry of a 4D Neuroimaging Neuromag-122 MEG system, and trained an MLP to invert this mapping in the absence of noise or in the presence of various sorts of noise such as white Gaussian noise, correlated noise, or real brain noise. An MLP structure was chosen to trade off computation and accuracy. This MLP was trained four times, with each type of noise. We measured...
http://mural.maynoothuniversity.ie/8121/
Fast Exact Multiplication by the Hessian
(1994)
Pearlmutter, Barak A.
Abstract:
Just storing the Hessian H (the matrix of second derivatives ∂²E/∂wᵢ∂wⱼ of the error E with respect to each pair of weights) of a large neural network is difficult. Since a common use of a large matrix like H is to compute its product with various vectors, we derive a technique that directly calculates Hv, where v is an arbitrary vector. To calculate Hv, we first define a differential operator R_v{f(w)} = (∂/∂r) f(w + rv) |_{r=0}, note that R_v{∇_w} = Hv and R_v{w} = v, and then apply R_v{·} to the equations used to compute ∇_w.
http://mural.maynoothuniversity.ie/5501/
Fast robust MEG source localization using MLPs
(2002)
Jun, Sung Chan; Pearlmutter, Barak A.; Nolte, Guido
Abstract:
Source localization from MEG data in real time requires algorithms which are robust, fully automatic, and very fast. We present two neural network systems which are able to localize a single dipole to reasonable accuracy within a fraction of a millisecond, even when the signals are contaminated by considerable noise. The first network is a multilayer perceptron (MLP) which takes the sensor measurements as inputs, uses two hidden layers, and outputs source location in Cartesian coordinates. After training with random dipolar sources contaminated by real noise, localization of a single dipole could be performed within 300 microseconds on an 800 MHz Athlon workstation, with an average localization error of 1.15 cm. To improve the accuracy to 0.28 cm, one can apply a few iterations of conventional Levenberg–Marquardt (LM) minimization using the MLP output as the initial guess. The combined method is about twenty times faster than multistart LM localization with comparable accuracy. In a ...
http://mural.maynoothuniversity.ie/1422/
Fast Robust Subject-Independent Magnetoencephalographic Source Localization Using an Artificial Neural Network
(2005)
Jun, Sung Chan; Pearlmutter, Barak A.
Abstract:
We describe a system that localizes a single dipole to reasonable accuracy from noisy magnetoencephalographic (MEG) measurements in real time. At its core is a multilayer perceptron (MLP) trained to map sensor signals and head position to dipole location. Including head position overcomes the previous need to retrain the MLP for each subject and session. The training dataset was generated by mapping randomly chosen dipoles and head positions through an analytic model and adding noise from real MEG recordings. After training, a localization took 0.7 ms with an average error of 0.90 cm. A few iterations of a Levenberg–Marquardt routine using the MLP output as its initial guess took 15 ms and improved accuracy to 0.53 cm, which approaches the natural limit on accuracy imposed by noise. We applied these methods to localize single dipole sources from MEG components isolated by blind source separation and compared the estimated locations to those generated by standard manually assisted co...
http://mural.maynoothuniversity.ie/563/
Filtered Gaussian Processes for Learning with Large Data-Sets
(2005)
Shi, Jian Qing; Murray-Smith, Roderick; Titterington, D. Mike; Pearlmutter, Barak A.
Abstract:
Kernel-based nonparametric models have been applied widely over recent years. However, the associated computational complexity imposes limitations on the applicability of those methods to problems with large datasets. In this paper we develop a filtering approach based on a Gaussian process regression model. The idea is to generate a small-dimensional set of filtered data that keeps a high proportion of the information contained in the original large dataset. Model learning and prediction are based on the filtered data, thereby decreasing the computational burden dramatically.
http://mural.maynoothuniversity.ie/2511/
First-Class Nonstandard Interpretations by Opening Closures
(2007)
Siskind, Jeffrey Mark; Pearlmutter, Barak A.
Abstract:
We motivate and discuss a novel functional programming construct that allows convenient modular run-time nonstandard interpretation via reflection on closure environments. This map-closure construct encompasses both the ability to examine the contents of a closure environment and to construct a new closure with a modified environment. From the user’s perspective, map-closure is a powerful and useful construct that supports such tasks as tracing, security logging, sandboxing, error checking, profiling, code instrumentation and metering, run-time code patching, and resource monitoring. From the implementor’s perspective, map-closure is analogous to call/cc. Just as call/cc is a non-referentially-transparent mechanism that reifies the continuations that are only implicit in programs written in direct style, map-closure is a non-referentially-transparent mechanism that reifies the closure environments that are only implicit in higher-order programs. Just as CPS conversion is a non-local ...
http://mural.maynoothuniversity.ie/2051/
Fusion of functional brain imaging modalities via linear programming
(2004)
Halchenko, Yaroslav O.; Pearlmutter, Barak A.; Hanson, Stephen Jose; Zaimi, Adi
Abstract:
The proposed method makes a number of simplifying assumptions which convert the EEG/fMRI integration problem into optimization of a convex function, of a form amenable to efficient solution as a very sparse linear programming (LP) problem. The assumptions made in doing this are, surprisingly, in general somewhat more robust than those generally used to cast EEG/fMRI integration as optimization of a non-convex function not amenable to efficient global optimization. This is because the L1 norm used here corresponds to a more robust statistical estimator than the L2 norm generally used. For this reason, even though this technique results in a tractable global optimization, it is more robust to non-Gaussian noise and outliers than approaches that make the Gaussian noise assumption [1]. The current poster presents a formulation of the problem together with results obtained on artificial data.
http://mural.maynoothuniversity.ie/12678/
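The L1-as-sparse-LP device in the abstract above is standard: minimize Σ|Mx − y| by introducing slack variables t with −t ≤ Mx − y ≤ t and minimizing Σt. The sketch below demonstrates it on a toy line fit with one gross outlier, not on the paper's EEG/fMRI formulation, and shows the robustness to outliers the abstract attributes to the L1 norm relative to least squares.

```python
# L1 (least-absolute-deviation) regression as a linear program.
import numpy as np
from scipy.optimize import linprog

xs = np.arange(10, dtype=float)
ys = 2.0 * xs                      # true line: slope 2, intercept 0
ys[5] = 100.0                      # one gross outlier

M = np.column_stack([xs, np.ones_like(xs)])   # columns: slope, intercept
m, n = M.shape
# Variables: [x (n params, free), t (m slacks, >= 0)]; objective sum(t).
c = np.concatenate([np.zeros(n), np.ones(m)])
A_ub = np.block([[M, -np.eye(m)],             #  Mx - y <= t
                 [-M, -np.eye(m)]])           # -(Mx - y) <= t
b_ub = np.concatenate([ys, -ys])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)] * m)
slope_l1 = res.x[0]

slope_l2 = np.linalg.lstsq(M, ys, rcond=None)[0][0]   # least squares
```

The L1 fit recovers slope ≈ 2 despite the outlier, while the least-squares slope is pulled noticeably away from 2, mirroring the Gaussian-vs-non-Gaussian noise argument made in the abstract.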
G-Maximization: an Unsupervised Learning Procedure for Discovering Regularities
(1986)
Pearlmutter, Barak A.; Hinton, Geoffrey
Abstract:
Hill climbing is used to maximize an information theoretic measure of the difference between the actual behavior of a unit and the behavior that would be predicted by a statistician who knew the first order statistics of the inputs but believed them to be independent. This causes the unit to detect higher order correlations among its inputs. Initial simulations are presented, and seem encouraging. We describe an extension of the basic idea which makes it resemble competitive learning and which causes members of a population of these units to differentiate, each extracting different structure from the input.
http://mural.maynoothuniversity.ie/5533/
Garbage Collection with Pointers to Individual Cells
(1996)
Pearlmutter, Barak A.
Abstract:
In the heap model in which garbage collectors usually operate, the heap is an array of cells. Each cell contains either a non-pointer, to be ignored, or a pointer to a block of cells somewhere in the heap, called an object. The objects do not overlap. In addition, there are a bunch of cells not in the heap, called the root set. It is possible to determine from a cell whether it contains a pointer or not, and it is possible to determine from a pointer how long the object pointed to is.
http://mural.maynoothuniversity.ie/8134/
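The heap model the abstract describes can be made concrete with a small mark-phase sketch: the heap is an array of cells, a pointer cell designates an object (a block identified here by its start address and a recorded length), and reachability is computed from the root set. All names and the representation are illustrative; the paper's actual concern, pointers into the middle of objects, is not addressed by this toy model.

```python
# Mark phase over the standard heap model: array of cells, non-pointer
# cells ignored, pointer cells written as ('ptr', start_address).

def mark(heap, objects, roots):
    """heap: list of cells; objects: dict start -> length;
    roots: iterable of root cells. Returns reachable object starts."""
    reachable, work = set(), []

    def follow(cell):
        if isinstance(cell, tuple) and cell[0] == 'ptr':
            start = cell[1]
            if start not in reachable:
                reachable.add(start)
                work.append(start)

    for cell in roots:
        follow(cell)
    while work:                     # scan each newly marked object
        start = work.pop()
        for i in range(start, start + objects[start]):
            follow(heap[i])
    return reachable

# Example: object at 0 (length 2) points to object at 2 (length 1);
# the object at 3 is unreferenced garbage.
heap = [('ptr', 2), 7, 9, 5]
objects = {0: 2, 2: 1, 3: 1}
```

Here `mark(heap, objects, [('ptr', 0)])` yields `{0, 2}`: the object at 3 is unreachable and would be collected.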
Gradient calculations for dynamic recurrent neural networks: A survey
(1995)
Pearlmutter, Barak A.
Abstract:
We survey learning algorithms for recurrent neural networks with hidden units, and put the various techniques into a common framework. We discuss fixed-point learning algorithms, namely recurrent backpropagation and deterministic Boltzmann Machines, and non-fixed-point algorithms, namely backpropagation through time, Elman's history cutoff, and Jordan's output feedback architecture. Forward propagation, an online technique that uses adjoint equations, and variations thereof, are also discussed. In many cases, the unified presentation leads to generalizations of various sorts. We discuss advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones, continue with some "tricks of the trade" for training, using, and simulating continuous-time and recurrent neural networks. We present some simulations, and at the end, address issues of computational complexity and learning speed.
http://mural.maynoothuniversity.ie/5490/
Gradient Descent: Second-Order Momentum and Saturating Error
(1991)
Pearlmutter, Barak A.
Abstract:
Batch gradient descent, Δw(t) = −η dE/dw(t), converges to a minimum of quadratic form with a time constant no better than ¼ λmax/λmin, where λmin and λmax are the minimum and maximum eigenvalues of the Hessian matrix of E with respect to w. It was recently shown that adding a momentum term, Δw(t) = −η dE/dw(t) + α Δw(t−1), improves this to ≈ √(λmax/λmin), although only in the batch case. Here we show that second-order momentum, Δw(t) = −η dE/dw(t) + α Δw(t−1) + β Δw(t−2), can lower this no further. We then regard gradient descent with momentum as a dynamic system and explore a nonquadratic error surface, showing that saturation of the error accounts for a variety of effects observed in simulations and justifies some popular heuristics.
http://mural.maynoothuniversity.ie/5539/
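The rate claim in the abstract above can be sketched numerically: on a quadratic with condition number κ = λmax/λmin, plain batch gradient descent contracts like 1 − O(1/κ) per step, while heavy-ball momentum with the standard optimal tuning for a quadratic improves this to 1 − O(1/√κ). The diagonal quadratic and step-size formulas below are textbook choices for the demonstration, not taken from the paper.

```python
# Plain vs momentum gradient descent on an ill-conditioned quadratic.
import numpy as np

lam = np.array([1.0, 100.0])          # Hessian eigenvalues, kappa = 100
grad = lambda w: lam * w              # gradient of E(w) = 0.5 sum(lam*w**2)

def run(eta, beta, steps=100):
    """Heavy-ball iteration w <- w - eta*grad(w) + beta*(w - w_prev)."""
    w, w_prev = np.ones(2), np.ones(2)
    for _ in range(steps):
        w, w_prev = w - eta * grad(w) + beta * (w - w_prev), w
    return float(np.linalg.norm(w))

sqmax, sqmin = np.sqrt(lam.max()), np.sqrt(lam.min())
plain = run(eta=2.0 / (lam.min() + lam.max()), beta=0.0)
momentum = run(eta=(2.0 / (sqmax + sqmin)) ** 2,
               beta=((sqmax - sqmin) / (sqmax + sqmin)) ** 2)
```

After 100 steps the momentum iterate is many orders of magnitude closer to the minimum than the plain one, consistent with the √(λmax/λmin) time constant the abstract cites; the paper's result is that a second momentum term β Δw(t−2) cannot improve on this.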
Hard-LOST: Modified k-Means for Oriented Lines
(2004)
O'Grady, Paul D.; Pearlmutter, Barak A.
Abstract:
Robust clustering of data into linear subspaces is a common problem. Here we treat clustering into one-dimensional subspaces that cross the origin. This problem arises in blind source separation, where the subspaces correspond directly to columns of a mixing matrix. We present an algorithm that identifies these subspaces using a modified k-means procedure, where line orientations and distances from a line replace the cluster centres and distance from cluster centres of conventional k-means. This method, combined with a transformation into a sparse domain and an L1-norm optimisation, constitutes a blind source separation algorithm for the underdetermined case.
http://mural.maynoothuniversity.ie/2052/
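The modified k-means in the abstract above can be sketched directly: each "centre" is a unit orientation vector for a line through the origin, assignment uses squared distance from the line, and the update step takes the principal eigenvector of each cluster's scatter matrix. The synthetic data and the initialization below are illustrative choices for the demo, not from the paper.

```python
# k-means over oriented lines through the origin.
import numpy as np

def line_kmeans(X, U, iters=10):
    """X: (n, d) data; U: (k, d) initial unit orientations (updated)."""
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Squared distance from line u: ||x||^2 - (x . u)^2 for unit u.
        proj = X @ U.T                                  # (n, k)
        d2 = (X ** 2).sum(axis=1, keepdims=True) - proj ** 2
        labels = d2.argmin(axis=1)
        for j in range(U.shape[0]):
            pts = X[labels == j]
            if len(pts):
                # New orientation: top eigenvector of the scatter matrix.
                _, vecs = np.linalg.eigh(pts.T @ pts)
                U[j] = vecs[:, -1]
    return U, labels

# Synthetic mixture: points along the two coordinate axes plus noise.
rng = np.random.default_rng(1)
t = rng.standard_normal(200)
X = np.concatenate([np.outer(t[:100], [1.0, 0.0]),
                    np.outer(t[100:], [0.0, 1.0])])
X += 0.02 * rng.standard_normal(X.shape)
U0 = np.array([[0.9, 0.1], [0.1, 0.9]])     # init near the true lines
U0 /= np.linalg.norm(U0, axis=1, keepdims=True)
U, labels = line_kmeans(X, U0)
```

In the blind-source-separation setting the abstract describes, the recovered orientations (here ≈ the coordinate axes, up to sign) correspond to columns of the mixing matrix.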
Hemodynamics for brain-computer interfaces: optical correlates of control signals
(2008)
Matthews, Fiachra; Pearlmutter, Barak A.; Ward, Tomas E.; Soraghan, Christopher John; Markham, Charles
Abstract:
This article brings together the various elements that constitute the signal processing challenges presented by a hemodynamics-driven functional near-infrared spectroscopy (fNIRS) based brain-computer interface (BCI). We discuss the use of optically derived measures of cortical hemodynamics as control signals for next-generation BCIs. To this end we present a suitable introduction to the underlying measurement principle, we describe appropriate instrumentation and highlight how and where performance improvements can be made to current and future embodiments of such devices. Key design elements of a simple fNIRS-BCI system are highlighted while in the process identifying signal processing problems requiring improved solutions and suggesting methods by which this might be accomplished.
http://mural.maynoothuniversity.ie/1360/
Illusory Percepts from Auditory Adaptation
(2007)
Parra, Lucas C.; Pearlmutter, Barak A.
Abstract:
Phenomena resembling tinnitus and Zwicker phantom tone are seen to result from an auditory gain adaptation mechanism that attempts to make full use of a fixed-capacity channel. In the case of tinnitus, the gain adaptation enhances internal noise of a frequency band otherwise silent due to damage. This generates a percept of a phantom sound as a consequence of hearing loss. In the case of Zwicker tone, a frequency band is temporarily silent during the presentation of a notched broadband sound, resulting in a percept of a tone at the notched frequency. The model suggests a link between tinnitus and the Zwicker tone percept, in that it predicts different results for normal and tinnitus subjects due to a loss of instantaneous nonlinear compression. Listening experiments on 44 subjects show that tinnitus subjects (11 of 44) are significantly more likely to hear the Zwicker tone. This psychoacoustic experiment establishes the first empirical link between the Zwicker tone percept and tin...
http://mural.maynoothuniversity.ie/1317/
Independent Component Analysis for Brain fMRI Does Indeed Select for Maximal Independence
(2013)
Calhoun, Vince D.; Potluru, Vamsi K.; Phlypo, Ronald; Silva, Rogers F.; Pearlmutter, Barak A.; Caprihan, Arvind; Plis, Sergey M.; Adali, Tulay
Abstract:
A recent paper by Daubechies et al. claims that two independent component analysis (ICA) algorithms, Infomax and FastICA, which are widely used for functional magnetic resonance imaging (fMRI) analysis, select for sparsity rather than independence. The argument was supported by a series of experiments on synthetic data. We show that these experiments fall short of proving this claim and that the ICA algorithms are indeed doing what they are designed to do: identify maximally independent sources.
http://mural.maynoothuniversity.ie/6548/
Independent Components of Magnetoencephalography: Localization
(2002)
Tang, Akaysha; Pearlmutter, Barak A.; Malaszenko, Natalie A.; Phung, Dan B.; Reeb, Bethany C.
Abstract:
We applied second-order blind identification (SOBI), an independent component analysis (ICA) method, to MEG data collected during cognitive tasks. We explored SOBI’s ability to help isolate underlying neuronal sources with relatively poor signal-to-noise ratios, allowing their identification and localization. We compare localization of the SOBI-separated components to localization from unprocessed sensor signals, using an equivalent current dipole (ECD) modeling method. For visual and somatosensory modalities, SOBI preprocessing resulted in components that can be localized to physiologically and anatomically meaningful locations. Furthermore, this preprocessing allowed the detection of neuronal source activations that were otherwise undetectable. This increased probability of neuronal source detection and localization can be particularly beneficial for MEG studies of higher level cognitive functions, which often have greater signal variability and degraded signal-to-noise ratios tha...
http://mural.maynoothuniversity.ie/5506/
Independent Components of Magnetoencephalography: Single-Trial Response Onset Times
(2002)
Tang, Akaysha; Pearlmutter, Barak A.; Malaszenko, Natalie A.; Phung, Dan B.
Abstract:
We recently demonstrated that second-order blind identification (SOBI), an independent component analysis (ICA) method, can separate the mixture of neuronal and noise signals in magnetoencephalographic (MEG) data into neuroanatomically and neurophysiologically meaningful components. When the neuronal signals had relatively higher trial-to-trial variability, SOBI offered a particular advantage in identifying and localizing neuronal source activations with increased source detectability (A. C. Tang et al., 2002, Neural Comput. 14, 1827–1858). Here, we explore the utility of SOBI in the analysis of temporal aspects of neuromagnetic signals from MEG data. From SOBI components, we were able to measure single-trial response onset times of neuronal populations in visual, auditory, and somatosensory modalities during cognitive and sensory activation tasks, with a detection rate as high as 96% under optimal conditions. Comparing the SOBI-aided detection results with those obtained directly f...
http://mural.maynoothuniversity.ie/5531/
Isolating endogenous visuospatial attentional effects using the novel visual-evoked spread spectrum analysis (VESPA) technique
(2007)
Lalor, Edmund C.; Kelly, Simon P.; Pearlmutter, Barak A.; Reilly, Richard B.; Foxe, John J.
Abstract:
In natural visual environments, we use attention to select between relevant and irrelevant stimuli that are presented simultaneously. Our attention to objects in our visual field is largely controlled endogenously, but is also affected exogenously through the influence of novel stimuli and events. The study of endogenous and exogenous attention as separate mechanisms has been possible in behavioral and functional imaging studies, where multiple stimuli can be presented continuously and simultaneously. It has also been possible in electroencephalogram studies using the steady-state visual-evoked potential (SSVEP); however, it has not been possible in conventional event-related potential (ERP) studies, which are hampered by the need to present suddenly onsetting stimuli in isolation. This is unfortunate as the ERP technique allows for the analysis of human physiology with much greater temporal resolution than functional magnetic resonance imaging or the SSVEP. While ERP studies of end...
http://mural.maynoothuniversity.ie/1312/
Item Type
Book chapter (15)
Conference item (29)
Journal article (66)
Report (23)
Peer Review Status
Peer-reviewed (107)
Non-peer-reviewed (26)
Year
2019 (1)
2018 (3)
2017 (2)
2016 (6)
2015 (3)
2014 (4)
2013 (8)
2012 (2)
2011 (2)
2009 (3)
2008 (15)
2007 (8)
2006 (9)
2005 (9)
2004 (7)
2003 (6)
2002 (9)
2001 (2)
2000 (6)
1999 (3)
1998 (2)
1996 (4)
1995 (3)
1994 (4)
1993 (2)
1991 (2)
1990 (2)
1989 (1)
1988 (2)
1987 (1)
1986 (2)