Limited By: Author = Pearlmutter, Barak A.
133 items found
Displaying Results 26–50 of 133 on page 2 of 6
Blind source separation of multichannel neuromagnetic responses
(2000)
Tang, Akaysha; Pearlmutter, Barak A.; Zibulevsky, Michael; Carter, Scott A.
Abstract:
Magnetoencephalography (MEG) is a functional brain imaging technique with millisecond temporal resolution and millimeter spatial sensitivity. The high temporal resolution of MEG compared to fMRI and PET (milliseconds vs. seconds and tens of seconds) makes it ideal for measuring the precise time of neuronal responses, thereby offering a powerful tool for studying temporal dynamics. We applied blind source separation (BSS) to continuous 122-channel human magnetoencephalographic data from two subjects and five tasks. We demonstrate that without using any domain-specific knowledge and without making the common assumption of single or multiple current-dipole sources, BSS is capable of separating non-neuronal noise sources from neuronal responses, and also of separating neuronal responses from different sensory modalities and from different processing stages within a given modality.
http://mural.maynoothuniversity.ie/5535/
Blind source separation via multinode sparse representation
(2002)
Zibulevsky, Michael; Kisilev, Pavel; Zeevi, Yehoshua Y.; Pearlmutter, Barak A.
Abstract:
We consider the problem of blind source separation from a set of instantaneous linear mixtures, where the mixing matrix is unknown. It was discovered recently that exploiting the sparsity of sources in an appropriate representation, according to some signal dictionary, dramatically improves the quality of separation. In this work we use the property of multiscale transforms, such as wavelets or wavelet packets, to decompose signals into sets of local features with various degrees of sparsity. We use this intrinsic property to select the best (most sparse) subsets of features for further separation. The performance of the algorithm is verified on noise-free and noisy data. Experiments with simulated signals, musical sounds and images demonstrate significant improvement in separation quality over previously reported results.
http://mural.maynoothuniversity.ie/5507/
Block Coordinate Descent for Sparse NMF
(2013)
Potluru, Vamsi K.; Le Roux, Jonathan; Calhoun, Vince D.; Plis, Sergey M.; Pearlmutter, Barak A.; Hayes, Thoms P.
Abstract:
Nonnegative matrix factorization (NMF) has become a ubiquitous tool for data analysis. An important variant is the sparse NMF problem, which arises when we explicitly require the learnt features to be sparse. A natural measure of sparsity is the L0 norm; however, its optimization is NP-hard. Mixed norms, such as the L1/L2 measure, have been shown to model sparsity robustly, based on intuitive attributes that such measures need to satisfy. This is in contrast to computationally cheaper alternatives such as the plain L1 norm. However, present algorithms designed for optimizing the mixed norm L1/L2 are slow, and other formulations for sparse NMF have been proposed, such as those based on L1 and L0 norms. Our proposed algorithm allows us to solve the mixed-norm sparsity constraints while not sacrificing computation time. We present experimental evidence on real-world datasets that shows our new algorithm performs an order of magnitude faster compared to the current state-of-the-art solvers opt...
http://mural.maynoothuniversity.ie/6556/
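The sparse-NMF work above builds on the classic multiplicative-update algorithm. As a point of reference only, here is a minimal sketch of plain (unconstrained) Lee–Seung NMF in Python; it is not the paper's block coordinate descent method with the L1/L2 constraint, and the function and variable names are our own.

```python
import numpy as np

def nmf_multiplicative(V, rank, iters=200, eps=1e-9, seed=0):
    """Classic Lee-Seung multiplicative updates for V ~ W @ H under the
    Frobenius objective. Both factors stay entrywise nonnegative because
    each update multiplies by a nonnegative ratio."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps   # nonnegative feature matrix
    H = rng.random((rank, m)) + eps   # nonnegative activations
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: factorise a random nonnegative matrix.
V = np.random.default_rng(1).random((20, 30))
W, H = nmf_multiplicative(V, rank=5)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Sparse variants add a penalty or projection step between these updates; the multiplicative form shown here is only the common starting point.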
Bounds on Query Convergence
(2005)
Pearlmutter, Barak A.
Abstract:
The problem of finding an optimum using noisy evaluations of a smooth cost function arises in many contexts, including economics, business, medicine, experiment design, and foraging theory. We derive an asymptotic bound E[ (x_t - x*)^2 ] = Omega(1/sqrt(t)) on the rate of convergence of a sequence (x_0, x_1, ...) generated by an unbiased feedback process observing noisy evaluations of an unknown quadratic function maximised at x*. The bound is tight, as the proof leads to a simple algorithm which meets it. We further establish a bound on the total regret, E[ sum_{i=1..t} (x_i - x*)^2 ] = Omega(sqrt(t)). These bounds may impose practical limitations on an agent's performance, as O(eps^-4) queries are made before the queries converge to x* with eps accuracy.
http://mural.maynoothuniversity.ie/8163/
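The setting of the abstract above — optimising an unknown smooth function from noisy evaluations only — can be illustrated with a Kiefer–Wolfowitz-style stochastic approximation sketch. This is an illustration of the problem, not the paper's algorithm or its bound; the gain schedules and names are our own choices.

```python
import random

def kiefer_wolfowitz(noisy_f, x0, steps=2000, seed=0):
    """Seek the maximiser of an unknown smooth function from noisy
    evaluations only, using central finite differences with shrinking
    Robbins-Monro style gains."""
    rng = random.Random(seed)
    x = x0
    for t in range(1, steps + 1):
        a = 1.0 / (t + 1)        # step size: sums to infinity, squares summable
        c = 0.5 * t ** -0.25     # finite-difference half-width, shrinks slowly
        g = (noisy_f(x + c, rng) - noisy_f(x - c, rng)) / (2 * c)
        x += a * g               # ascend the estimated gradient
    return x

# Unknown quadratic maximised at x* = 2, observed with Gaussian noise.
noisy_quadratic = lambda x, rng: -(x - 2.0) ** 2 + rng.gauss(0.0, 0.05)
x_hat = kiefer_wolfowitz(noisy_quadratic, x0=0.0)
```

The shrinking difference width is what drives the query cost: resolving the optimum to eps accuracy from noisy evaluations requires on the order of eps^-4 queries, matching the abstract's remark.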
Brightness Illusions as Optimal Percepts – Technical Report NUIM-CS-TR-2006-02
(2006)
Jaramillo, Santiago; Pearlmutter, Barak A.
Abstract:
We show that Mach bands and a number of other low-level brightness illusions can be accounted for by assuming that the perceptual system performs simple Bayesian inference using a Gaussian image prior with noisy retinal ganglion cells. This theory accounts for phenomena which have proven problematic for simple energy-based and lateral-interaction models, while avoiding the complexities of mid-level vision theories that involve the estimation of structure and albedo.
http://mural.maynoothuniversity.ie/2315/
Chaitin–Kolmogorov Complexity and Generalization in Neural Networks
(1991)
Pearlmutter, Barak A.; Rosenfeld, Ronald
Abstract:
We present a unified framework for a number of different ways of failing to generalize properly. During learning, sources of random information contaminate the network, effectively augmenting the training data with random information. The complexity of the function computed is therefore increased, and generalization is degraded. We analyze replicated networks, in which a number of identical networks are independently trained on the same data and their results averaged. We conclude that replication almost always results in a decrease in the expected complexity of the network, and that replication therefore increases expected generalization. Simulations confirming the effect are also presented.
http://mural.maynoothuniversity.ie/5536/
CometCloudCare (C3): Distributed Machine Learning Platform-as-a-Service with Privacy Preservation
(2014)
Potluru, Vamsi K.; DiazMontes, Javier; Sarwate, Anand D.; Plis, Sergey M.; Calhoun, Vince D.; Pearlmutter, Barak A.; Parashar, Manish
Abstract:
The growth of data sharing initiatives in neuroscience and genomics [14, 16, 19, 25] represents an exciting opportunity to confront the “small N” problem plaguing contemporary studies [20]. When possible, open data sharing provides the greatest benefit. However, some data cannot be shared at all due to privacy concerns and/or risk of re-identification. Sharing other data sets is hampered by the proliferation of complex data use agreements (DUAs), which preclude truly automated data mining. These DUAs arise because of concerns about privacy and confidentiality for subjects; though many do permit direct access to data, they often require a cumbersome approval process that can take months. Additionally, some researchers have expressed doubts about the efficiency and scalability of centralized data storage and analysis for large-volume datasets [18]. In response, distributed cloud solutions have been suggested [23]; however, the task of transferring large volumes of imaging data (p...
http://mural.maynoothuniversity.ie/10230/
Comments on 'Hebbian learning is jointly controlled by electrotonic and input structure'
(1994)
Pearlmutter, Barak A.
Abstract:
It is argued that simulations presented by Tsai, Carnevale and Brown do not agree with their theoretical predictions, and that their mathematical derivation contains a major flaw. The origin of these misunderstandings is traced to the application of a special case of an equation whose general version is given here.
http://mural.maynoothuniversity.ie/12679/
Comments on: Dynamic programming approach to optimal weight selection in multilayer neural networks [with reply]
(1989)
Pearlmutter, Barak A.; Saratchandran, P.
Abstract:
Abstract included in text.
http://mural.maynoothuniversity.ie/10238/
Concurrent Robin Hood Hashing
(2018)
Kelly, Robert; Pearlmutter, Barak A.; Maguire, Phil
Abstract:
In this paper we examine the issues involved in adding concurrency to the Robin Hood hash table algorithm. We present a non-blocking, obstruction-free K-CAS Robin Hood algorithm which requires only a single-word compare-and-swap primitive, thus making it highly portable. The implementation maintains the attractive properties of the original Robin Hood structure, such as a low expected probe length, the capability to operate effectively under a high load factor, and good cache locality, all of which are essential for high performance on modern computer architectures. We compare our data structure to various other lock-free and concurrent algorithms, as well as a simple hardware transactional variant, and show that our implementation performs better across a number of contexts.
http://mural.maynoothuniversity.ie/14238/
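The abstract above concerns a concurrent version of Robin Hood hashing; the underlying sequential Robin Hood discipline it builds on can be sketched as follows. This is only the classic single-threaded scheme, not the paper's lock-free K-CAS variant, and the helper names are our own.

```python
def rh_insert(table, key, value):
    """Robin Hood insertion into an open-addressed table (a list whose
    slots hold (key, value, probe_distance) or None). On collision, the
    entry further from its home slot keeps the slot; the displaced
    'richer' entry continues probing. Assumes spare capacity."""
    dist = 0
    while True:
        idx = (hash(key) + dist) % len(table)
        slot = table[idx]
        if slot is None:
            table[idx] = (key, value, dist)
            return
        skey, sval, sdist = slot
        if skey == key:
            table[idx] = (key, value, dist)  # overwrite existing key
            return
        if sdist < dist:
            # Steal the slot from the entry with the shorter probe
            # distance, then re-seat the displaced entry from here.
            table[idx] = (key, value, dist)
            key, value, dist = skey, sval, sdist
        dist += 1

def rh_get(table, key):
    """Lookup; the Robin Hood invariant lets us stop as soon as we pass
    a slot whose occupant is closer to home than our probe distance."""
    dist = 0
    while True:
        slot = table[(hash(key) + dist) % len(table)]
        if slot is None or slot[2] < dist:
            return None
        if slot[0] == key:
            return slot[1]
        dist += 1
```

The displacement rule keeps the variance of probe lengths low, which is what makes early termination in `rh_get` sound; the paper's contribution is making these multi-slot displacements safe under concurrency.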
Confusion of Tagged Perturbations in Forward Automatic Differentiation of Higher-Order Functions
(2012)
Manzyuk, Oleksandr; Pearlmutter, Barak A.; Radul, Alexey Andreyevich; Rush, David R.; Siskind, Jeffrey Mark
Abstract:
Forward Automatic Differentiation (AD) is a technique for augmenting programs to both perform their original calculation and also compute its directional derivative. The essence of Forward AD is to attach a derivative value to each number, and propagate these through the computation. When derivatives are nested, the distinct derivative calculations, and their associated attached values, must be distinguished. In dynamic languages this is typically accomplished by creating a unique tag for each application of the derivative operator, tagging the attached values, and overloading the arithmetic operators. We exhibit a subtle bug, present in fielded implementations, in which perturbations are confused despite the tagging machinery.
http://mural.maynoothuniversity.ie/6552/
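The tagging discipline described in the abstract above can be sketched in a few lines. This is a hypothetical minimal reimplementation (supporting only + and *), not any of the fielded systems the paper examines; it shows how tags keep nested derivative calculations distinct, the machinery whose subtle misuse produces the perturbation-confusion bug the paper exhibits.

```python
import itertools

_tag = itertools.count()  # a fresh tag for each application of the derivative operator

def _parts(x, tag):
    """Primal and tangent of x with respect to perturbations of `tag`.
    Values carrying a different tag (or plain numbers) are constants
    as far as this tag is concerned."""
    if isinstance(x, Dual) and x.tag == tag:
        return x.p, x.t
    return x, 0.0

class Dual:
    """A number carrying a tagged perturbation (forward-mode AD)."""
    def __init__(self, p, t, tag):
        self.p, self.t, self.tag = p, t, tag

    def _dominant(self, other):
        # When two tags meet, the more recently created (inner) tag
        # governs the arithmetic; the other value acts as a coefficient.
        if isinstance(other, Dual) and other.tag > self.tag:
            return other.tag
        return self.tag

    def __add__(self, other):
        tag = self._dominant(other)
        (ap, at), (bp, bt) = _parts(self, tag), _parts(other, tag)
        return Dual(ap + bp, at + bt, tag)
    __radd__ = __add__

    def __mul__(self, other):
        tag = self._dominant(other)
        (ap, at), (bp, bt) = _parts(self, tag), _parts(other, tag)
        return Dual(ap * bp, ap * bt + at * bp, tag)
    __rmul__ = __mul__

def deriv(f, x):
    """Derivative of f at x: seed a unit perturbation under a fresh tag,
    then extract the tangent belonging to that tag only."""
    tag = next(_tag)
    return _parts(f(Dual(x, 1.0, tag)), tag)[1]
```

Dropping the tag comparison in `_parts` (treating every perturbation as one's own) is the kind of confusion the paper is about: a nested `deriv` would then pick up the outer perturbation and return a wrong derivative.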
Constructing Time-Frequency Dictionaries for Source Separation via Time-Frequency Masking and Source Localisation
(2004)
de Fréin, Ruairí; Rickard, Scott T.; Pearlmutter, Barak A.
Abstract:
We describe a new localisation and source separation algorithm which is based upon the accurate construction of time-frequency spatial signatures. We present a technique for constructing time-frequency spatial signatures with the required accuracy. This algorithm for multichannel source separation and localisation allows arbitrary placement of microphones yet achieves good performance. We demonstrate the efficacy of the technique using source location estimates, and compare estimated time-frequency masks with the ideal 0 dB mask.
http://mural.maynoothuniversity.ie/8118/
Convolutive nonnegative matrix factorisation with a sparseness constraint
(2006)
Pearlmutter, Barak A.; O'Grady, Paul D.
Abstract:
Discovering a representation which allows auditory data to be parsimoniously represented is useful for many machine learning and signal processing tasks. Such a representation can be constructed by nonnegative matrix factorisation (NMF), a method for finding parts-based representations of nonnegative data. We present an extension to NMF that is convolutive and includes a sparseness constraint. In combination with a spectral magnitude transform, this method discovers auditory objects and their associated sparse activation patterns.
http://mural.maynoothuniversity.ie/1375/
Coordinate Descent for Mixed-norm NMF
(2013)
Potluru, Vamsi K.; Le Roux, Jonathan; Pearlmutter, Barak A.; Hershey, John R.; Brand, Matthew E.
Abstract:
Nonnegative matrix factorization (NMF) is widely used in a variety of machine learning tasks involving speech, documents and images. Being able to specify the structure of the matrix factors is crucial in incorporating prior information. The factors correspond to the feature matrix and the learnt representation. In particular, we allow a user-friendly specification of sparsity on groups of features using the L1/L2 measure. We also propose a pairwise coordinate descent algorithm to minimize the objective. Experimental evidence of the efficacy of this approach is provided on the ORL faces dataset.
http://mural.maynoothuniversity.ie/6553/
Correction: Independent Component Analysis for Brain fMRI Does Indeed Select for Maximal Independence
(2013)
Calhoun, Vince D.; Potluru, Vamsi K.; Phlypo, Ronald; Silva, Rogers F.; Pearlmutter, Barak A.; Caprihan, Arvind; Plis, Sergey M.; Adali, Tulay
Abstract:
This article was republished on October 23, 2013 because of missing equations.
http://mural.maynoothuniversity.ie/10231/
Deep brain stimulation may reduce tremor by preferential blockade of slower axons via antidromic activation
(2011)
García, Míriam R.; Verwoerd, Mark; Pearlmutter, Barak A.; Wellstead, Peter; Middleton, Richard H.
Abstract:
Deep brain stimulation (DBS) has been used to ameliorate essential and Parkinsonian tremor; however, the detailed mechanism by which tremor reduction is achieved remains unclear. We hypothesize that DBS works by reducing time delays in the feedback paths of the motor control loops. In particular, we suggest that antidromic activation of axonal pathways induced by stimulation will preferentially block axons with longer propagation times, reducing time delays in neuronal motor circuits in a stabilising manner. We demonstrate the plausibility of this hypothesis using two simple computational models which account for a variety of experimental results, and allow us to make a number of testable predictions.
http://mural.maynoothuniversity.ie/8117/
Detecting intrusions using system calls: alternative data models
(1999)
Warrender, Christina; Forrest, Stephanie; Pearlmutter, Barak A.
Abstract:
Intrusion detection systems rely on a wide variety of observable data to distinguish between legitimate and illegitimate activities. In this paper we study one such observable—sequences of system calls into the kernel of an operating system. Using system-call data sets generated by several different programs, we compare the ability of different data modeling methods to represent normal behavior accurately and to recognize intrusions. We compare the following methods: simple enumeration of observed sequences, comparison of relative frequencies of different sequences, a rule induction technique, and Hidden Markov Models (HMMs). We discuss the factors affecting the performance of each method, and conclude that for this particular problem, methods weaker than HMMs are likely sufficient.
http://mural.maynoothuniversity.ie/1418/
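The "simple enumeration of observed sequences" baseline compared in the abstract above can be sketched directly: record every short window of system calls seen in normal traces, then flag windows never seen before. A minimal sketch under our own naming and toy call names, not the paper's evaluation harness:

```python
def train_normal(trace, k=3):
    """Enumerate every length-k sliding window of system calls observed
    in a normal trace (the simple-enumeration baseline)."""
    return {tuple(trace[i:i + k]) for i in range(len(trace) - k + 1)}

def anomaly_score(trace, normal_db, k=3):
    """Fraction of length-k windows in a new trace never seen during
    training; a high score flags a possible intrusion."""
    windows = [tuple(trace[i:i + k]) for i in range(len(trace) - k + 1)]
    if not windows:
        return 0.0
    return sum(w not in normal_db for w in windows) / len(windows)

# Toy traces: a benign reordering scores 0, an injected execve scores high.
normal = ["open", "read", "write", "close"] * 3
db = train_normal(normal)
benign_score = anomaly_score(["read", "write", "close", "open"], db)
attack_score = anomaly_score(["open", "read", "execve", "write", "close"], db)
```

The paper's richer models (relative frequencies, rule induction, HMMs) replace the set membership test with probabilistic scores over the same window data.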
Differentiating Functions of the Jacobian with Respect to the Weights
(2000)
Flake, Gary William; Pearlmutter, Barak A.
Abstract:
For many problems, the correct behavior of a model depends not only on its input-output mapping but also on properties of its Jacobian matrix, the matrix of partial derivatives of the model's outputs with respect to its inputs. We introduce the Jprop algorithm, an efficient general method for computing the exact partial derivatives of a variety of simple functions of the Jacobian of a model with respect to its free parameters. The algorithm applies to any parametrized feedforward model, including nonlinear regression, multilayer perceptrons, and radial basis function networks.
http://mural.maynoothuniversity.ie/5484/
DiffSharp: An AD Library for .NET Languages
(2016)
Baydin, Atilim Gunes; Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
DiffSharp is an algorithmic differentiation or automatic differentiation (AD) library for the .NET ecosystem, which is targeted by the C# and F# languages, among others. The library has been designed with machine learning applications in mind, allowing very succinct implementations of models and optimization routines. DiffSharp is implemented in F# and exposes forward and reverse AD operators as general nestable higher-order functions, usable by any .NET language. It provides high-performance linear algebra primitives (scalars, vectors, and matrices, with a generalization to tensors underway) that are fully supported by all the AD operators, and which use a BLAS/LAPACK backend via the highly optimized OpenBLAS library. DiffSharp currently uses operator overloading, but we are developing a transformation-based version of the library using F#'s "code quotation" metaprogramming facility. Work on a CUDA-based GPU backend is also underway.
http://mural.maynoothuniversity.ie/8114/
DiffSharp: Automatic Differentiation Library
(2015)
Baydin, Atilim Gunes; Pearlmutter, Barak A.; Siskind, Jeffrey Mark
Abstract:
In this paper we introduce DiffSharp, an automatic differentiation (AD) library designed with machine learning in mind. AD is a family of techniques that evaluate derivatives at machine precision with only a small constant factor of overhead, by systematically applying the chain rule of calculus at the elementary operator level. DiffSharp aims to make an extensive array of AD techniques available, in convenient form, to the machine learning community. These include arbitrary nesting of forward/reverse AD operations, AD with linear algebra primitives, and a functional API that emphasizes the use of higher-order functions and composition. The library exposes this functionality through an API that provides gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products. Bearing the performance requirements of the latest machine learning techniques in mind, the underlying computations are run through a high-performance BLAS/LAPACK backend...
http://mural.maynoothuniversity.ie/8144/
Discovering Convolutive Speech Phones Using Sparseness and Nonnegativity
(2007)
O'Grady, Paul D.; Pearlmutter, Barak A.
Abstract:
Discovering a representation that allows auditory data to be parsimoniously represented is useful for many machine learning and signal processing tasks. Such a representation can be constructed by Nonnegative Matrix Factorisation (NMF), which is a method for finding parts-based representations of nonnegative data. Here, we present a convolutive NMF algorithm that includes a sparseness constraint on the activations and has multiplicative updates. In combination with a spectral magnitude transform of speech, this method extracts speech phones that exhibit sparse activation patterns, which we use in a supervised separation scheme for monophonic mixtures.
http://mural.maynoothuniversity.ie/10244/
Discovering Convolutive Speech Phones using Sparseness and Non-Negativity Constraints
(2007)
O'Grady, Paul D.; Pearlmutter, Barak A.
Abstract:
Discovering a representation that allows auditory data to be parsimoniously represented is useful for many machine learning and signal processing tasks. Such a representation can be constructed by Nonnegative Matrix Factorisation (NMF), which is a method for finding parts-based representations of nonnegative data. Here, we present an extension to convolutive NMF that includes a sparseness constraint. In combination with a spectral magnitude transform of speech, this method extracts speech phones (and their associated sparse activation patterns), which we use in a supervised separation scheme for monophonic mixtures.
http://mural.maynoothuniversity.ie/1313/
Discovering speech phones using convolutive nonnegative matrix factorisation with a sparseness constraint
(2008)
O'Grady, Paul D.; Pearlmutter, Barak A.
Abstract:
Discovering a representation that allows auditory data to be parsimoniously represented is useful for many machine learning and signal processing tasks. Such a representation can be constructed by Nonnegative Matrix Factorisation (NMF), a method for finding parts-based representations of nonnegative data. Here, we present an extension to convolutive NMF that includes a sparseness constraint, where the resultant algorithm has multiplicative updates and utilises the beta divergence as its reconstruction objective. In combination with a spectral magnitude transform of speech, this method discovers auditory objects that resemble speech phones, along with their associated sparse activation patterns. We use these in a supervised separation scheme for monophonic mixtures, finding improved separation performance in comparison to classic convolutive NMF.
http://mural.maynoothuniversity.ie/1697/
Discovering speech phones using convolutive nonnegative matrix factorisation with a sparseness constraint
(2008)
O'Grady, Paul D.; Pearlmutter, Barak A.
Abstract:
Discovering a representation that allows auditory data to be parsimoniously represented is useful for many machine learning and signal processing tasks. Such a representation can be constructed by Nonnegative Matrix Factorisation (NMF), a method for finding parts-based representations of nonnegative data. Here, we present an extension to convolutive NMF that includes a sparseness constraint, where the resultant algorithm has multiplicative updates and utilises the beta divergence as its reconstruction objective. In combination with a spectral magnitude transform of speech, this method discovers auditory objects that resemble speech phones, along with their associated sparse activation patterns. We use these in a supervised separation scheme for monophonic mixtures, finding improved separation performance in comparison to classic convolutive NMF.
http://mural.maynoothuniversity.ie/1685/
Dissecting the cellular contributions to early visual sensory processing deficits in schizophrenia using the VESPA evoked response
(2008)
Lalor, Edmund C.; Yeap, Sherlyn; Reilly, Richard B.; Pearlmutter, Barak A.; Foxe, John J.
Abstract:
Electrophysiological research has shown clear dysfunction of early visual processing mechanisms in patients with schizophrenia. In particular, the P1 component of the visual evoked potential (VEP) is substantially reduced in amplitude in patients. A novel visual evoked response known as the VESPA (Visual Evoked Spread Spectrum Analysis) was recently described. This response has a notably different scalp topography from that of the traditional VEP, suggesting preferential activation of a distinct subpopulation of cells. As such, this method constitutes a potentially useful candidate for investigating cellular contributions to early visual processing deficits. In this paper we compare the VEP and VESPA responses between a group of healthy control subjects and a group of schizophrenia patients. We also introduce an extension of the VESPA method to incorporate nonlinear processing in the visual system. A significantly reduced P1 component was found in patients using the VEP (with a large ...
http://mural.maynoothuniversity.ie/5534/
Item Type
Book chapter (15)
Conference item (29)
Journal article (66)
Report (23)
Peer Review Status
Peer-reviewed (107)
Non-peer-reviewed (26)
Year
2019 (1)
2018 (3)
2017 (2)
2016 (6)
2015 (3)
2014 (4)
2013 (8)
2012 (2)
2011 (2)
2009 (3)
2008 (15)
2007 (8)
2006 (9)
2005 (9)
2004 (7)
2003 (6)
2002 (9)
2001 (2)
2000 (6)
1999 (3)
1998 (2)
1996 (4)
1995 (3)
1994 (4)
1993 (2)
1991 (2)
1990 (2)
1989 (1)
1988 (2)
1987 (1)
1986 (2)