WorldCat Identities

Smith, Leslie S.

Works: 11 works in 24 publications in 1 language and 344 library holdings
Genres: Encyclopedias, Conference papers and proceedings, Dictionaries, Academic theses
Roles: Editor, Contributor, Author
Classifications: SF442.2, 636.8003
Most widely held works by Leslie S Smith
The international encyclopedia of cats by G. N Henderson( Book )

1 edition published in 1973 in English and held by 306 WorldCat member libraries worldwide

Brain inspired cognitive systems 2008 by A Hussain( )

7 editions published in 2010 in English and held by 19 WorldCat member libraries worldwide

Brain Inspired Cognitive Systems 2008 (June 24-27, 2008; São Luís, Brazil) brought together leading scientists and engineers who use analytic, synthetic and computational methods both to understand the prodigious processing properties of biological systems, and specifically of the brain, and to exploit such knowledge to advance computational methods towards ever higher levels of cognitive competence. This book includes the papers presented at four major symposia: Part I, Cognitive Neuroscience; Part II, Biologically Inspired Systems; Part III, Neural Computation; and Part IV, Models of Consciousness.
From brains to systems : brain-inspired cognitive systems 2010 by Carlos Hernández( )

6 editions published in 2011 in English and held by 9 WorldCat member libraries worldwide

Brain Inspired Cognitive Systems - BICS 2010 aims to bring together leading scientists and engineers who use analytic and synthetic methods both to understand the astonishing processing properties of biological systems, and specifically of the brain, and to exploit such knowledge to advance engineering methods for building artificial systems with higher levels of cognitive competence. BICS is a meeting point for brain scientists and cognitive systems engineers where cross-domain ideas are fostered, in the hope of gaining emerging insights into the nature, operation and extractable capabilities of brains. This multiple approach is necessary because progressively more accurate data about the brain is producing a growing need for a quantitative understanding, together with an associated capacity to manipulate this data and translate it into engineering applications rooted in sound theories. BICS 2010 is intended both for researchers who aim to build brain-inspired systems with higher cognitive competences, and for life scientists who use and develop mathematical and engineering approaches to better understand complex biological systems such as the brain. Four major interlaced focal symposia are planned for this conference, organized into patterns that encourage cross-fertilization across the symposia topics. This emphasizes the role of BICS as a major meeting point for researchers and practitioners in the areas of biological and artificial cognitive systems. Debates across disciplines will enrich researchers with complementary perspectives from diverse scientific fields. BICS 2010 will take place July 14-16, 2010, in Madrid, Spain.
A system for the design of networks of computing processes with some applications by Leslie S Smith( Book )

2 editions published in 1980 in English and held by 2 WorldCat member libraries worldwide

Cepstrum of bispectrum spike detection on extracellular signals with concurrent intracellular signals by Shahjahan Shahid( )

1 edition published in 2009 in English and held by 2 WorldCat member libraries worldwide

CARMEN: an e-science virtual laboratory supporting collaboration in neuroinformatics by Colin D Ingram( )

1 edition published in 2009 in English and held by 2 WorldCat member libraries worldwide

Brain inspired cognitive systems : Stirling, UK, 29 August - 1 September 2004 by International Conference on Brain Inspired Cognitive Systems (1st : 2004 : Stirling)( )

1 edition published in 2004 in English and held by 1 WorldCat member library worldwide

A framework for neural net specification by Leslie S Smith( )

1 edition published in 1992 in English and held by 1 WorldCat member library worldwide

Novel computationally intelligent machine learning algorithms for data mining and knowledge discovery by Iffat A Gheyas( )

1 edition published in 2009 in English and held by 1 WorldCat member library worldwide

This thesis addresses three major issues in data mining: feature subset selection in large-dimensionality domains, plausible reconstruction of incomplete data in cross-sectional applications, and forecasting of univariate time series. For the automated selection of an optimal subset of features in real time, we present an improved hybrid algorithm, SAGA. SAGA combines the ability of simulated annealing to avoid being trapped in local minima with the very high convergence rate of the crossover operator of genetic algorithms, the strong local search ability of greedy algorithms, and the high computational efficiency of generalized regression neural networks (GRNNs). For imputing missing values and forecasting univariate time series, we propose a homogeneous neural network ensemble. The proposed ensemble consists of a committee of GRNNs trained on different subsets of features generated by SAGA; the predictions of the base classifiers are combined by a fusion rule. This approach makes it possible to discover all important interrelations between the values of the target variable and the input features. The proposed ensemble scheme has two innovative features which make it stand out amongst ensemble learning algorithms: (1) the ensemble makeup is optimized automatically by SAGA; and (2) GRNNs are used both as base classifiers and as the top-level combiner classifier. Because of the GRNN, the proposed ensemble is a dynamic weighting scheme, in contrast to existing ensemble approaches that rely on simple voting or static weighting strategies. The basic idea of the dynamic weighting procedure is to give a higher reliability weight to those scenarios that are similar to the new ones. The simulation results demonstrate the validity of the proposed ensemble model.
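The "dynamic weighting" idea behind the GRNN can be illustrated with a minimal sketch. At its core a GRNN computes a kernel-weighted (Nadaraya-Watson style) average of training targets, so training samples similar to the query receive higher weight. This is a generic illustration under that assumption, not code from the thesis; the function name and toy data are invented for the example:

```python
import numpy as np

def grnn_predict(X_train, y_train, x_query, sigma=0.5):
    # GRNN-style prediction: a Gaussian-kernel-weighted average of the
    # training targets. Samples close to the query dominate the weights,
    # which is the "dynamic weighting" described in the abstract.
    d2 = np.sum((X_train - x_query) ** 2, axis=1)   # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))            # kernel weights
    return float(np.dot(w, y_train) / np.sum(w))

# Toy usage: noisy samples of y = x^2 on [0, 1]; prediction at x = 0.5
# should land near 0.25.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.01, size=200)
pred = grnn_predict(X, y, np.array([0.5]), sigma=0.05)
print(pred)
```

The bandwidth `sigma` controls how local the weighting is; in an ensemble like the one described, each committee member would see only its SAGA-selected feature subset.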
Improving associative memory in a network of spiking neurons by Russell I Hunter( )

1 edition published in 2011 in English and held by 1 WorldCat member library worldwide

In this thesis we use computational neural network models to examine the dynamics and functionality of the CA3 region of the mammalian hippocampus. The emphasis of the project is to investigate how the dynamic control structures provided by inhibitory circuitry and cellular modification may affect the CA3 region during the recall of previously stored information. The CA3 region is commonly thought to work as a recurrent auto-associative neural network, owing to neurophysiological characteristics such as recurrent collaterals, strong and sparse synapses from external inputs, and plasticity between coactive cells. Associative memory models have been developed using various configurations of mathematical artificial neural networks, first developed over 40 years ago. Within these models we can store information via changes in the strength of connections between simplified two-state model neurons. These memories can be recalled when a cue (noisy or partial) is instantiated upon the net. The type of information they can store is quite limited, owing to restrictions imposed by the simplicity of the hard-limiting nodes, which are commonly associated with a binary activation threshold. We build a much more biologically plausible model, with complex spiking cell models and realistic synaptic properties between cells. This model is based upon some of the many details we now know of the neuronal circuitry of the CA3 region. We implemented the model in computer software using NEURON and Matlab and tested it by running simulations of storage and recall in the network. By building this model we gain new insights into how different types of neurons, and the complex circuits they form, actually work. The mammalian brain consists of complex resistive-capacitive electrical circuitry formed by the interconnection of large numbers of neurons.
A principal cell type is the pyramidal cell within the cortex, the main information processor in our neural networks. Pyramidal cells are surrounded by diverse populations of interneurons, proportionally far fewer in number, which form connections with pyramidal cells and with other inhibitory cells. By building detailed computational models of recurrent neural circuitry we explore how these microcircuits of interneurons control the flow of information through pyramidal cells and regulate the efficacy of the network. We also explore the effect of cellular modification due to neuronal activity, and the effect of incorporating spatially dependent connectivity, on the network during recall of previously stored information. In particular we implement a spiking neural network proposed by Sommer and Wennekers (2001). We consider methods for improving associative memory recall inspired by the work of Graham and Willshaw (1995), who applied mathematical transforms to an artificial neural network to improve recall quality. The networks tested contain either 100 or 1000 pyramidal cells, with 10% connectivity applied, a partial cue instantiated, and a global pseudo-inhibition. We investigate three methods. Firstly, applying localised disynaptic inhibition, which will proportionalise the excitatory postsynaptic potentials and provide a fast-acting reversal potential; this should help to reduce the variability in signal propagation between cells and provide further inhibition to help synchronise the network activity. Secondly, implementing a persistent sodium channel in the cell body, which acts to non-linearise the activation threshold: above a given membrane potential the amplitude of the excitatory postsynaptic potential (EPSP) is boosted, pushing cells which receive slightly more excitation (most likely high units) over the firing threshold.
Finally, implementing spatial characteristics of the dendritic tree allows a greater probability that a modified synapse exists after 10% random connectivity has been applied throughout the network. We apply spatial characteristics by scaling the conductance weights of excitatory synapses to simulate the loss of potential in synapses in the outer dendritic regions due to increased resistance. To further increase the biological plausibility of the network we remove the pseudo-inhibition and apply realistic basket cell models in differing configurations of a global inhibitory circuit. The networks are configured with: a single basket cell providing feedback inhibition; 10% basket cells providing feedback inhibition, with 10 pyramidal cells connecting to each basket cell; and 100% basket cells providing feedback inhibition. These networks are compared and contrasted for efficacy of recall quality and for their effect on network behaviour. We have found promising results from applying biologically plausible recall strategies and network configurations, which suggests that the roles of inhibition and cellular dynamics are pivotal in learning and memory.
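The two-state associative nets the abstract refers to (storage via clipped Hebbian connection changes, recall from a partial cue against a threshold) can be sketched generically. This is the classic Willshaw-style binary net, not the thesis's spiking NEURON/Matlab model; all names and parameters below are illustrative:

```python
import numpy as np

def store(patterns):
    # Clipped Hebbian storage: a weight is set to 1 if the two units
    # were ever co-active in any stored pattern (two-state weights).
    n = patterns.shape[1]
    W = np.zeros((n, n), dtype=int)
    for p in patterns:
        W |= np.outer(p, p)
    return W

def recall(W, cue):
    # Each unit sums its input from the active cue units; units whose
    # dendritic sum reaches the number of active cue units fire.
    dendritic_sum = W @ cue
    return (dendritic_sum >= cue.sum()).astype(int)

rng = np.random.default_rng(1)
n, k = 100, 10                      # 100 units, 10 active per pattern
pats = np.zeros((5, n), dtype=int)
for p in pats:
    p[rng.choice(n, k, replace=False)] = 1

W = store(pats)
cue = pats[0].copy()
cue[np.flatnonzero(cue)[:5]] = 0    # partial cue: half the stored pattern
out = recall(W, cue)
print(np.array_equal(out, pats[0]))
```

With sparse patterns and light loading the full pattern is usually recovered from the half cue; the thesis's contribution is replacing these hard-limiting nodes and the fixed threshold with spiking cells, realistic inhibition and dendritic structure.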
Brain Inspired Cognitive Systems 2008( )

2 editions published in 2010 in English and held by 0 WorldCat member libraries worldwide

Audience Level
Audience level: 0.34 (from 0.30 for CARMEN: an ... to 1.00 for Brain insp ...)

Languages: English (24)