Limited By:
Author = Pearlmutter, Barak A.;
132 items found
Displaying Results 1 - 25 of 132 on page 1 of 6
A 12-Channel, real-time near-infrared spectroscopy instrument for brain-computer interface applications
(2008)
Soraghan, C.; Matthews, F.; Markham, Charles; Pearlmutter, Barak A.; O'Neill, R.; Ward, Tomas E.
Abstract:
A continuous wave near-infrared spectroscopy (NIRS) instrument for brain-computer interface (BCI) applications is presented. In the literature, experiments have been carried out on subjects with motor degenerative diseases such as amyotrophic lateral sclerosis, which have demonstrated the suitability of NIRS for accessing intentional functional activity, which could be used in a BCI as a communication aid. Specifically, a real-time, multiple-channel NIRS tool is needed to realise access to even a few different mental states at reasonable baud rates. The 12-channel instrument described here has a spatial resolution of 30 mm and employs a flexible software demodulation scheme. A temporal resolution of ~100 ms is maintained because typical topographic imaging is not needed: we are only interested in exploiting the vascular response for BCI control. A simple experiment demonstrates the ability of the system to report on haemodynamics during single trial mental arithmetic tasks. Multiple tr...
http://mural.maynoothuniversity.ie/1441/
A Context-Sensitive Generalization of ICA
(1996)
Pearlmutter, Barak A.; Parra, Lucas C.
Abstract:
Source separation arises in a surprising number of signal processing applications, from speech recognition to EEG analysis. In the square linear blind source separation problem without time delays, one must find an unmixing matrix which can detangle the result of mixing n unknown independent sources through an unknown n x n mixing matrix. The recently introduced ICA blind source separation algorithm (Baram and Roth 1994; Bell and Sejnowski 1995) is a powerful and surprisingly simple technique for solving this problem. ICA is all the more remarkable for performing so well despite making absolutely no use of the temporal structure of its input! This paper presents a new algorithm, contextual ICA, which derives from a maximum likelihood density estimation formulation of the problem. cICA can incorporate arbitrarily complex adaptive history-sensitive source models, and thereby make use of the temporal structure of its input. This allows it to separate in a number of situations where sta...
http://mural.maynoothuniversity.ie/5491/
A Dual-Channel Optical Brain-Computer Interface In A Gaming Environment
(2006)
Soraghan, Christopher John; Matthews, Fiachra; Kelly, Dan; Ward, Tomas E.; Markham, Charles; Pearlmutter, Barak A.; O'Neill, Ray
Abstract:
This paper explores the viability of using a novel optical Brain-Computer Interface within a gaming environment. We describe a system that incorporates a 3D gaming engine and an optical BCI. This made it possible to classify activation in the motor cortex within a synchronous experimental paradigm. Detected activations were used to control the arm movement of a human model in the graphical engine.
http://mural.maynoothuniversity.ie/1293/
A Dual-Channel Optical Brain-Computer Interface In A Gaming Environment
(2006)
Soraghan, Christopher John; Matthews, Fiachra; Kelly, Dan; Markham, Charles; Pearlmutter, Barak A.; O'Neill, Ray
Abstract:
This paper explores the viability of using a novel optical Brain-Computer Interface within a gaming environment. We describe a system that incorporates a 3D gaming engine and an optical BCI. This made it possible to classify activation in the motor cortex within a synchronous experimental paradigm. Detected activations were used to control the arm movement of a human model in the graphical engine.
http://mural.maynoothuniversity.ie/1280/
A New Hypothesis for Sleep: Tuning for Criticality
(2009)
Pearlmutter, Barak A.; Houghton, Conor J.
Abstract:
We propose that the critical function of sleep is to prevent uncontrolled neuronal feedback while allowing rapid responses and prolonged retention of short-term memories. Through learning, the brain is tuned to react optimally to environmental challenges. Optimal behavior often requires rapid responses and the prolonged retention of short-term memories. At a neuronal level, these correspond to recurrent activity in local networks. Unfortunately, when a network exhibits recurrent activity, small changes in the parameters or conditions can lead to runaway oscillations. Thus, the very changes that improve the processing performance of the network can put it at risk of runaway oscillation. To prevent this, stimulus-dependent network changes should be permitted only when there is a margin of safety around the current network parameters. We propose that the essential role of sleep is to establish this margin by exposing the network to a variety of inputs, monitoring for erratic behavior, ...
http://mural.maynoothuniversity.ie/1653/
A normative model of attention: receptive field modulation
(2004)
Jaramillo, Santiago; Pearlmutter, Barak A.
Abstract:
When sensory stimuli are encoded in a lossy fashion for efficient transmission, there are necessarily tradeoffs between the represented fidelity of various aspects of the stimuli. In the model of attention presented here, a top-down signal informs the encoder of these tradeoffs. Given the stimulus ensemble and tradeoff requirements, our system learns an optimal encoder. This general model is instantiated in a simple network: an autoencoder with a bottleneck, innervated by a top-down attentional signal, and trained using backpropagation. The modulation of neural activity learned by this model qualitatively matches that measured in animals during visual attention tasks.
http://mural.maynoothuniversity.ie/8120/
A Slow Axon Antidromic Blockade Hypothesis for Tremor Reduction via Deep Brain Stimulation
(2013)
García, Míriam R.; Pearlmutter, Barak A.; Wellstead, Peter E.; Middleton, Richard H.
Abstract:
Parkinsonian and essential tremor can often be effectively treated by deep brain stimulation. We propose a novel explanation for the mechanism by which this technique ameliorates tremor: a reduction of the delay in the relevant motor control loops via preferential antidromic blockade of slow axons. The antidromic blockade is preferential because the pulses more rapidly clear fast axons, and the distribution of axonal diameters, and therefore velocities, in the involved tracts, is sufficiently long-tailed to make this effect quite significant. The preferential blockade of slow axons, combined with gain adaptation, results in a reduction of the mean delay in the motor control loop, which serves to stabilize the feedback system, thus ameliorating tremor. This theory, without any tuning, accounts for several previously perplexing phenomena, and makes a variety of novel predictions.
http://mural.maynoothuniversity.ie/6546/
A Spectrum of Colors: Investigating the Temporal Frequency Characteristics of the Human Visual System Using a System Identification Approach
(2006)
Lalor, Edmund C.; Reilly, Richard B.; Pearlmutter, Barak A.; Foxe, John J.
Abstract:
Noise input signals are commonly used in both linear and nonlinear system identification of physiological systems. This method can be applied to electrophysiological analysis of the human visual system by controlling the modulation of the contrast of a checkerboard stimulus using a pre-computed noise waveform. In this study we describe how one can obtain an estimate of the linear impulse response of the visual system using noise waveforms. Furthermore, we examine the impulse responses obtained using noise signals with different frequency characteristics, in an attempt to investigate the temporal frequency characteristics of the human visual system. We show that noise signals with frequency content greater than 15 Hz are more effective at evoking these responses than those with little or no power at high frequencies.
http://mural.maynoothuniversity.ie/2048/
AD in Fortran Part 1: Design
(2012)
Radul, Alexey Andreyevich; Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
We propose extensions to FORTRAN which integrate forward and reverse Automatic Differentiation (AD) directly into the programming model. Irrespective of implementation technology, embedding AD constructs directly into the language extends the reach and convenience of AD while allowing abstraction of concepts of interest to scientific-computing practice, such as root finding, optimization, and finding equilibria of continuous games. Multiple different subprograms for these tasks can share common interfaces, regardless of whether and how they use AD internally. A programmer can maximize a function F by calling a library maximizer, XSTAR=ARGMAX(F, X0), which internally constructs derivatives of F by AD, without having to learn how to use any particular AD tool. We illustrate the utility of these extensions by example: programs become much more concise and closer to traditional mathematical notation. A companion paper describes how these extensions can be implemented by a program that g...
http://mural.maynoothuniversity.ie/6554/
AD in Fortran Part 2: Implementation via Prepreprocessor
(2013)
Radul, Alexey; Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
We describe an implementation of the FARFEL FORTRAN AD extensions (Radul et al., 2012). These extensions integrate forward and reverse AD directly into the programming model, with attendant benefits to flexibility, modularity, and ease of use. The implementation we describe is a “prepreprocessor” that generates input to existing FORTRAN-based AD tools. In essence, blocks of code which are targeted for AD by FARFEL constructs are put into subprograms which capture their lexical variable context, and these are closure-converted into top-level subprograms and specialized to eliminate EXTERNAL arguments, rendering them amenable to existing AD preprocessors, which are then invoked, possibly repeatedly if the AD is nested.
http://mural.maynoothuniversity.ie/6555/
Algorithmic Differentiation through Convergent Loops
(2002)
Pearlmutter, Barak A.; Link, Hamilton E.
Abstract:
We consider an explicit iterate-to-fixedpoint operator and derive associated rules for both forward and reverse mode algorithmic differentiation. Like other AD transformation rules, these are exact and efficient. In this case, they generate code which itself invokes the iterate-to-fixedpoint operator. Loops which iterate until a variable changes less than some tolerance should be regarded as approximate iterate-to-fixedpoint calculations. After a convergence analysis, we contend that it is best both pragmatically and theoretically to find the approximate fixedpoint of the adjoint system of the actual desired fixedpoint calculation, rather than find the adjoint of the approximate primal fixedpoint calculation. Our exposition unifies and formalizes a number of techniques already known to the AD community, introduces a convenient and powerful notation, and opens the door to fully automatic efficient AD of a broadened class of codes.
http://mural.maynoothuniversity.ie/10237/
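The abstract above argues for differentiating the converged fixed point rather than the individual primal iterations. As a minimal pure-Python sketch of the forward-mode idea (a hypothetical illustration under simple scalar assumptions, not the paper's actual transformation rules): the tangent dx of a fixed point x* = f(x*, theta) satisfies dx = (df/dx)(x*) * dx + (df/dtheta)(x*), which is itself a fixed-point problem.

```python
import math

def fixedpoint(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x <- f(x) until convergence; assumes f is a contraction."""
    x = x0
    for _ in range(max_iter):
        x_new = f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def tangent_fixedpoint(df_dx, df_dtheta, xstar, tol=1e-12, max_iter=1000):
    """Solve the tangent system dx = df_dx(x*) * dx + df_dtheta(x*)
    by its own fixed-point iteration (the forward-mode rule)."""
    a, b = df_dx(xstar), df_dtheta(xstar)
    dx = 0.0
    for _ in range(max_iter):
        dx_new = a * dx + b
        if abs(dx_new - dx) < tol:
            return dx_new
        dx = dx_new
    return dx

# Hypothetical example: x*(theta) solves x = theta + 0.5*sin(x), at theta = 1
theta = 1.0
xstar = fixedpoint(lambda x: theta + 0.5 * math.sin(x), 0.0)
dxdtheta = tangent_fixedpoint(lambda x: 0.5 * math.cos(x),  # df/dx
                              lambda x: 1.0,                # df/dtheta
                              xstar)
# Implicit-function-theorem check: dx*/dtheta = 1 / (1 - 0.5*cos(x*))
assert abs(dxdtheta - 1.0 / (1.0 - 0.5 * math.cos(xstar))) < 1e-9
```

The tangent iteration converges at the same geometric rate as the primal one, which is the sense in which differentiating the adjoint/tangent system of the true fixed point is cheaper and cleaner than differentiating a truncated primal loop.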
Algorithmic Differentiation, Functional Programming, and Iterate-to-Fixedpoint
(2002)
Pearlmutter, Barak A.
Abstract:
Abstract included in text.
http://mural.maynoothuniversity.ie/10249/
An Analysis of Publication Venues for Automatic Differentiation Research
(2014)
Baydin, Atilim Gunes; Pearlmutter, Barak A.
Abstract:
We present the results of our analysis of publication venues for papers on automatic differentiation (AD), covering academic journals and conference proceedings. Our data are collected from the AD publications database maintained by the autodiff.org community website. The database is purpose-built for the AD field and is expanding via submissions by AD researchers. Therefore, it provides a relatively noise-free list of publications relating to the field. However, it does include noise in the form of variant spellings of journal and conference names. We handle this by manually correcting and merging these variants under the official names of corresponding venues. We also share the raw data we get after these corrections.
http://mural.maynoothuniversity.ie/6277/
An Investigation of Feasibility and Safety of Bi‐Modal Stimulation for the Treatment of Tinnitus: An Open‐Label Pilot Study
(2016)
Hamilton, Caroline; D'Arcy, Shona; Pearlmutter, Barak A.; Crispino, Gloria; Lalor, Edmund C.; Conlon, Brendan J.
Abstract:
Objectives: Tinnitus is the perception of sound in the absence of an external auditory stimulus. It is widely believed that tinnitus, in patients with associated hearing loss, is a neurological phenomenon primarily affecting the central auditory structures. However, there is growing evidence for the involvement of the somatosensory system in this form of tinnitus. For this reason it has been suggested that the condition may be amenable to bi‐modal stimulation of the auditory and somatosensory systems. We conducted a pilot study to investigate the feasibility and safety of a device that delivers simultaneous auditory and somatosensory stimulation to treat the symptoms of chronic tinnitus. Methods: A cohort of 54 patients used the stimulation device for 10 weeks. Auditory stimulation was delivered via headphones and somatosensory stimulation was delivered via electrical stimulation of the tongue. Patient usage, logged by the device, was used to classify patients as compliant or nonc...
http://mural.maynoothuniversity.ie/10229/
An MEG study of response latency and variability in the human visual system during a visual-motor integration task
(2000)
Tang, Akaysha; Pearlmutter, Barak A.; Hely, Tim A.; Zibulevsky, Michael; Weisend, Michael P.
Abstract:
Human reaction times during sensory-motor tasks vary considerably. To begin to understand how this variability arises, we examined neuronal populational response time variability at early versus late visual processing stages. The conventional view is that precise temporal information is gradually lost as information is passed through a layered network of mean-rate "units." We tested in humans whether neuronal populations at different processing stages behave like mean-rate "units". A blind source separation algorithm was applied to MEG signals from sensory-motor integration tasks. Response time latency and variability for multiple visual sources were estimated by detecting single-trial stimulus-locked events for each source. In two subjects tested on four visual reaction time tasks, we reliably identified sources belonging to early and late visual processing stages. The standard deviation of response latency was smaller for early rather than late processing stages. This su...
http://mural.maynoothuniversity.ie/8133/
Attention and Optimal Sensory Codes
(2004)
Jaramillo, Santiago; Pearlmutter, Barak A.
Abstract:
Neuronal activity can be modulated by attention even while the sensory stimulus is held fixed. This modulation implies changes in the tuning curve (or receptive field) of the neurons involved in sensory processing. We propose an information-theoretic hypothesis for the purpose of this modulation, and show using computer simulation that a similar modulation emerges in a system that is optimally encoding a sensory stimulus when the system is informed about the changing relevance of different features of the input. We present a simple model that learns a covert attention mechanism, given input patterns and tradeoff requirements. After optimization, the system gains the ability to reorganize its computational resources (or coding strategy) depending on the incoming covert attentional signal, using only threshold shifts in neurons throughout the network. The modulation of activity of the encoding units for different attentional states qualitatively matches that observed in animal selec...
http://mural.maynoothuniversity.ie/1882/
Automatic Differentiation in Machine Learning: a Survey
(2018)
Baydin, Atilim Gunes; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark
Abstract:
Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD), also called algorithmic differentiation or simply “auto-diff”, is a family of techniques similar to but more general than backpropagation for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. AD is a small but established field with applications in areas including computational fluid dynamics, atmospheric sciences, and engineering design optimization. Until very recently, the fields of machine learning and AD have largely been unaware of each other and, in some cases, have independently discovered each other’s results. Despite its relevance, general-purpose AD has been missing from the machine learning toolbox, a situation slowly changing with its ongoing adoption under the names “dynamic computational graphs” and “differentiable programming”. We survey the intersection of AD and machine learning, cov...
http://mural.maynoothuniversity.ie/10227/
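The survey above covers forward-mode AD, which augments every value with a tangent and propagates both through the program. A minimal pure-Python sketch using dual numbers (an illustration only, not any of the systems the survey discusses):

```python
import math

class Dual:
    """Forward-mode AD value: a (primal, tangent) pair, as in dual numbers."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._lift(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # product rule: (u v)' = u' v + u v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def derivative(f, x):
    """Seed the tangent with 1.0 and read off f'(x), exact to machine precision."""
    return f(Dual(x, 1.0)).dot

# f(x) = x^2 + 3x  =>  f'(2) = 7
assert derivative(lambda x: x * x + 3 * x, 2.0) == 7.0
```

Because the derivative is propagated symbolically through each elementary operation rather than approximated by differencing, there is no truncation error, which is the core advantage of AD that the survey stresses.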
Automatic differentiation in machine learning: a survey
(2015)
Baydin, Atilim Gunes; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Siskind, Jeffrey Mark
Abstract:
Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD) is a technique for calculating derivatives of numeric functions expressed as computer programs efficiently and accurately, used in fields such as computational fluid dynamics, nuclear engineering, and atmospheric sciences. Despite its advantages and use in other fields, machine learning practitioners have been little influenced by AD and make scant use of available tools. We survey the intersection of AD and machine learning, cover applications where AD has the potential to make a big impact, and report on some recent developments in the adoption of this technique. We aim to dispel some misconceptions that we contend have impeded the use of AD within the machine learning community.
http://mural.maynoothuniversity.ie/8145/
Automatic Differentiation of Algorithms for Machine Learning
(2014)
Baydin, Atilim Gunes; Pearlmutter, Barak A.
Abstract:
Automatic differentiation --- the mechanical transformation of numeric computer programs to calculate derivatives efficiently and accurately --- dates to the origin of the computer age. Reverse mode automatic differentiation both antedates and generalizes the method of backwards propagation of errors used in machine learning. Despite this, practitioners in a variety of fields, including machine learning, have been little influenced by automatic differentiation, and make scant use of available tools. Here we review the technique of automatic differentiation, describe its two main modes, and explain how it can benefit machine learning practitioners. To reach the widest possible audience our treatment assumes only elementary differential calculus, and does not assume any knowledge of linear algebra.
http://mural.maynoothuniversity.ie/6276/
Automatic Learning Rate Maximization by On-Line Estimation of the Hessian's Eigenvectors
(1993)
Lecun, Yann; Simard, Patrice Y.; Pearlmutter, Barak A.
Abstract:
We propose a very simple and well-principled way of computing the optimal step size in gradient descent algorithms. The on-line version is very efficient computationally, and is applicable to large backpropagation networks trained on large data sets. The main ingredient is a technique for estimating the principal eigenvalue(s) and eigenvector(s) of the objective function's second derivative matrix (Hessian), which does not even require calculating the Hessian. Several other applications of this technique are proposed for speeding up learning, or for eliminating useless parameters.
http://mural.maynoothuniversity.ie/8137/
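The key ingredient named in the abstract, estimating the principal Hessian eigenpair without ever forming the Hessian, can be sketched with Hessian-vector products obtained by finite-differencing the gradient, combined with power iteration. This is a hedged illustration of the general idea; the paper's on-line procedure differs in detail.

```python
def hvp(grad, x, v, eps=1e-6):
    """Hessian-vector product via finite differences of the gradient:
    H v ~= (g(x + eps*v) - g(x)) / eps; the Hessian itself is never formed."""
    g0 = grad(x)
    g1 = grad([xi + eps * vi for xi, vi in zip(x, v)])
    return [(a - b) / eps for a, b in zip(g1, g0)]

def principal_eigenvalue(grad, x, n, iters=50):
    """Power iteration on the Hessian using only Hessian-vector products.
    Assumes a dominant eigenvalue exists (e.g. a positive-definite Hessian)."""
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = hvp(grad, x, v)
        lam = max(abs(c) for c in w)  # infinity-norm estimate of the eigenvalue
        v = [c / lam for c in w]      # renormalize to avoid over/underflow
    return lam

# Toy objective f(x) = 1.5*x0^2 + 0.5*x1^2, so grad = (3*x0, x1)
# and the Hessian is diag(3, 1) with principal eigenvalue 3.
grad = lambda x: [3.0 * x[0], x[1]]
lam = principal_eigenvalue(grad, [0.2, -0.4], 2)
assert abs(lam - 3.0) < 1e-3
```

A step size of roughly 1/lam then keeps gradient descent stable along the stiffest direction, which is the use the abstract proposes.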
Binomial Checkpointing for Arbitrary Programs with No User Annotation
(2016)
Siskind, Jeffrey Mark; Pearlmutter, Barak A.
Abstract:
Heretofore, automatic checkpointing at procedure-call boundaries, to reduce the space complexity of reverse mode, has been provided by systems like Tapenade. However, binomial checkpointing, or treeverse, has only been provided in Automatic Differentiation (AD) systems in special cases, e.g., through user-provided pragmas on DO loops in Tapenade, or as the nested taping mechanism in adol-c for time integration processes, which requires that user code be refactored. We present a framework for applying binomial checkpointing to arbitrary code with no special annotation or refactoring required. This is accomplished by applying binomial checkpointing directly to a program trace. This trace is produced by a general-purpose checkpointing mechanism that is orthogonal to AD.
http://mural.maynoothuniversity.ie/8112/
Biophotonic Methods for Brain-Computer Interfaces
(2007)
Soraghan, C.; Matthews, F.; Markham, Charles; Pearlmutter, Barak A.; Ward, Tomas E.
Abstract:
Poster
http://mural.maynoothuniversity.ie/1371/
Blind separation of sources with sparse representations in a given signal dictionary
(2000)
Zibulevsky, Michael; Pearlmutter, Barak A.
Abstract:
The blind source separation problem is to extract the underlying source signals from a set of linear mixtures, where the mixing matrix is unknown. We consider a two-stage separation process. First, a priori selection of a possibly overcomplete signal dictionary (e.g. wavelet frame, learned dictionary, etc.) in which the sources are assumed to be sparsely representable. Second, unmixing the sources by exploiting their sparse representability. We consider the general case of more sources than mixtures, but also derive a more efficient algorithm in the case of a non-overcomplete dictionary and equal numbers of sources and mixtures. Experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.
http://mural.maynoothuniversity.ie/8125/
Blind Source Separation by Sparse Decomposition
(1999)
Zibulevsky, Michael; Pearlmutter, Barak A.
Abstract:
The blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. This situation is common, e.g. in acoustics, radio, and medical signal processing. We exploit the property of the sources of having a sparse representation in a corresponding (possibly overcomplete) signal dictionary. Such a dictionary may consist of wavelets, wavelet packets, etc., or be obtained by learning from a given family of signals. Starting from the maximum a posteriori framework, which is applicable to the case of more sources than mixtures, we derive a few other categories of objective functions, which provide faster and more robust computations when there are an equal number of sources and mixtures. Our experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.
http://mural.maynoothuniversity.ie/8166/
Blind Source Separation by Sparse Decomposition in a Signal Dictionary
(2001)
Zibulevsky, Michael; Pearlmutter, Barak A.
Abstract:
The blind source separation problem is to extract the underlying source signals from a set of linear mixtures, where the mixing matrix is unknown. This situation is common in acoustics, radio, medical signal and image processing, hyperspectral imaging, and other areas. We suggest a two-stage separation process: a priori selection of a possibly overcomplete signal dictionary (for instance, a wavelet frame or a learned dictionary) in which the sources are assumed to be sparsely representable, followed by unmixing the sources by exploiting their sparse representability. We consider the general case of more sources than mixtures, but also derive a more efficient algorithm in the case of a non-overcomplete dictionary and equal numbers of sources and mixtures. Experiments with artificial signals and musical sounds demonstrate significantly better separation than other known techniques.
http://mural.maynoothuniversity.ie/5485/
Item Type
Book chapter (14)
Conference item (29)
Journal article (66)
Report (23)
Peer Review Status
Peer-reviewed (106)
Non-peer-reviewed (26)
Year
2019 (1)
2018 (2)
2017 (2)
2016 (6)
2015 (3)
2014 (4)
2013 (8)
2012 (2)
2011 (2)
2009 (3)
2008 (15)
2007 (8)
2006 (9)
2005 (9)
2004 (7)
2003 (6)
2002 (9)
2001 (2)
2000 (6)
1999 (3)
1998 (2)
1996 (4)
1995 (3)
1994 (4)
1993 (2)
1991 (2)
1990 (2)
1989 (1)
1988 (2)
1987 (1)
1986 (2)