Current Search: Hahn, William
- Title
- Imaging Alzheimer’s Disease in vivo with Diffusion Tensor MRI.
- Creator
- Hahn, William E., Graduate College
- Date Issued
- 2012-03-30
- PURL
- http://purl.flvc.org/fcla/dt/3342374
- Format
- Document (PDF)
- Title
- White matter networks indicative of Alzheimer's disease from diffusion MRI.
- Creator
- Hahn, William E., Fuchs, Armin, Graduate College
- Date Issued
- 2013-04-12
- PURL
- http://purl.flvc.org/fcla/dt/3361307
- Subject Headings
- Alzheimer's disease, Diffusion tensor imaging, Diffusion magnetic resonance imaging
- Format
- Document (PDF)
- Title
- DEVELOPING A DEEP LEARNING PIPELINE TO AUTOMATICALLY ANNOTATE GOLD PARTICLES IN IMMUNOELECTRON MICROSCOPY IMAGES.
- Creator
- Jerez, Diego Alejandro, Hahn, William, Florida Atlantic University, Department of Mathematical Sciences, Charles E. Schmidt College of Science
- Abstract/Description
- Machine learning has been utilized in bio-imaging in recent years; however, as it is relatively new and evolving, some researchers who wish to use machine learning tools have limited access because of a lack of programming knowledge. In electron microscopy (EM), immunogold labeling is commonly used to identify target proteins, but manual annotation of the gold particles in the images is a time-consuming and laborious process. Conventional image processing tools can provide semi-automated annotation, but they require users to make manual adjustments at every step of the analysis. To create a new high-throughput image analysis tool for immuno-EM, I developed a deep learning pipeline designed to deliver completely automated annotation of immunogold particles in EM images. The program was made accessible to users without prior programming experience and was also extended to work on different types of immuno-EM images.
- Date Issued
- 2020
- PURL
- http://purl.flvc.org/fau/fd/FA00013628
- Subject Headings
- Electron microscopy, Immunogold labeling, Image analysis, Deep learning
- Format
- Document (PDF)
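The semi-automated baseline that the abstract above contrasts with deep learning (thresholding plus connected-component counting) can be illustrated with a minimal sketch. The threshold and size values here are hypothetical; the thesis itself replaces this approach with a learned pipeline.

```python
import numpy as np

def count_gold_particles(image, threshold=80, min_size=4):
    """Count dark blobs (candidate gold particles) via threshold + flood fill."""
    mask = image < threshold                 # gold particles image darker than tissue
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                stack, size = [(y, x)], 0    # flood-fill one connected component
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if size >= min_size:         # ignore single-pixel noise
                    count += 1
    return count
```

In practice every new image class would need its own threshold and size tuning, which is exactly the per-step manual adjustment the abstract identifies as the bottleneck.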
- Title
- Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections.
- Creator
- Hahn, William E., Barenholtz, Elan, Florida Atlantic University, Charles E. Schmidt College of Science, Center for Complex Systems and Brain Sciences
- Abstract/Description
- For an 8-bit grayscale image patch of size n x n, the number of distinguishable signals is 256^(n^2). Natural images (e.g., photographs of a natural scene) comprise a very small subset of these possible signals. Traditional image and video processing relies on band-limited or low-pass signal models. In contrast, we explore the observation that most signals of interest are sparse, i.e., in a particular basis most of the expansion coefficients will be zero. Recent developments in sparse modeling and L1 optimization have enabled extraordinary applications such as the single-pixel camera, as well as computer vision systems that can exceed human performance. Here we present a novel neural network architecture combining a sparse filter model and locally competitive algorithms (LCAs), and demonstrate the network's ability to classify human actions from video. Sparse filtering is an unsupervised feature learning algorithm designed to optimize the sparsity of the feature distribution directly, without needing to model the data distribution. LCAs are defined by a system of differential equations in which the initial conditions define an optimization problem and the dynamics converge to a sparse decomposition of the input vector. We applied this architecture to train a classifier on categories of motion in human action videos. Inputs to the network were small 3D patches taken from frame differences in the videos. Dictionaries were derived for each action class, and activation levels for each dictionary were assessed during reconstruction of a novel test patch. We discuss how this sparse modeling approach provides a natural framework for multi-sensory and multimodal data processing, including RGB video, RGBD video, hyper-spectral video, and stereo audio/video streams.
- Date Issued
- 2016
- PURL
- http://purl.flvc.org/fau/fd/FA00004713
- Subject Headings
- Artificial intelligence, Expert systems (Computer science), Image processing -- Digital techniques -- Mathematics, Sparse matrices
- Format
- Document (PDF)
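The locally competitive algorithm described in the abstract above can be sketched in a few lines: membrane potentials evolve under a driving input and lateral inhibition, and a soft-threshold activation yields the sparse code. This is a generic, minimal LCA; the parameter values are illustrative and not taken from the dissertation.

```python
import numpy as np

def soft_threshold(u, lam):
    # a_i = T_lambda(u_i): zero inside [-lam, lam], shrunk toward zero outside
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(signal, dictionary, lam=0.1, tau=10.0, dt=1.0, n_steps=200):
    """Sparse-code `signal` over `dictionary` (columns = unit-norm atoms)."""
    b = dictionary.T @ signal                                    # driving input Phi^T s
    G = dictionary.T @ dictionary - np.eye(dictionary.shape[1])  # lateral inhibition
    u = np.zeros_like(b)                                         # membrane potentials
    for _ in range(n_steps):
        a = soft_threshold(u, lam)         # currently active coefficients
        u += (dt / tau) * (b - u - G @ a)  # LCA dynamics: leak + input - competition
    return soft_threshold(u, lam)
```

The initial condition u = 0 and the threshold lam together define the L1-regularized optimization problem; the dynamics converge to a sparse decomposition of the input vector, as the abstract states.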
- Title
- FINANCIAL TIME-SERIES ANALYSIS WITH DEEP NEURAL NETWORKS.
- Creator
- Rimal, Binod, Hahn, William Edward, Florida Atlantic University, Department of Mathematical Sciences, Charles E. Schmidt College of Science
- Abstract/Description
- Financial time-series data are noisy, volatile, and nonlinear. Classic statistical linear models may not capture those underlying structures of the data. The rapid advancement in artificial intelligence and machine learning techniques, the availability of large-scale data, and increased computational capabilities open the door to developing sophisticated deep learning models that capture the nonlinearity and hidden information in the data. Creating a robust model by unlocking the power of a deep neural network and using real-time data is essential in this tech era. This study constructs a new computational framework to uncover the information in financial time-series data and better inform the related parties. It carries out a comparative analysis of the performance of deep learning models on stock price prediction with a well-balanced set of factors from fundamental data, macroeconomic data, and technical indicators responsible for stock price movement. We further build a novel computational framework through a merger of recurrent neural networks and random compression for time-series analysis. The performance of the model is tested on a benchmark anomaly time-series dataset. This new computational framework in a compressed paradigm leads to improved computational efficiency and data privacy. Finally, this study develops a custom trading simulator and an agent-based hybrid model by combining gradient and gradient-free optimization methods; in particular, we explore the use of simulated annealing with stochastic gradient descent. The model trains a population of agents to predict appropriate trading behaviors such as buy, hold, or sell by optimizing portfolio returns. Experimental results on the S&P 500 index show that the proposed model outperforms the baseline models.
- Date Issued
- 2022
- PURL
- http://purl.flvc.org/fau/fd/FA00014009
- Subject Headings
- Neural networks (Computer science), Deep learning (Machine learning), Time-series analysis, Stocks, Simulated annealing (Mathematics)
- Format
- Document (PDF)
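The gradient/gradient-free hybrid mentioned in the abstract above (simulated annealing combined with gradient descent) can be sketched on a toy one-dimensional loss. The function names and hyperparameters here are illustrative assumptions, not the dissertation's actual trading model.

```python
import math
import random

def hybrid_optimize(loss, grad, w, lr=0.05, temp=1.0, cooling=0.95, n_iter=200):
    """Alternate a gradient step with a simulated-annealing perturbation."""
    best_w, best_l = w, loss(w)
    for _ in range(n_iter):
        w = w - lr * grad(w)                # gradient descent step
        cand = w + random.gauss(0.0, temp)  # annealing proposal
        delta = loss(cand) - loss(w)
        # Metropolis criterion: always accept improvements, sometimes accept worse
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
            w = cand
        temp *= cooling                     # cool the temperature each iteration
        if loss(w) < best_l:                # track the best point seen so far
            best_w, best_l = w, loss(w)
    return best_w
```

Early on, the high temperature lets the annealing proposals escape poor basins; as the temperature cools, the gradient steps dominate and the search settles into a local minimum.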
- Title
- CRACKING THE SPARSE CODE: LATERAL COMPETITION FORMS ROBUST V1-LIKE REPRESENTATIONS IN CONVOLUTIONAL NEURAL NETWORKS.
- Creator
- Teti, Michael, Barenholtz, Elan, Hahn, William, Florida Atlantic University, Center for Complex Systems and Brain Sciences, Charles E. Schmidt College of Science
- Abstract/Description
- Although state-of-the-art Convolutional Neural Networks (CNNs) are often viewed as a model of biological object recognition, they lack many computational and architectural motifs that are postulated to contribute to robust perception in biological neural systems. For example, modern CNNs lack lateral connections, which greatly outnumber feed-forward excitatory connections in primary sensory cortical areas and mediate feature-specific competition between neighboring neurons to form robust, sparse representations of sensory stimuli for downstream tasks. In this thesis, I hypothesize that CNN layers equipped with lateral competition better approximate the response characteristics and dynamics of neurons in the mammalian primary visual cortex, leading to increased robustness under noise and/or adversarial attacks relative to current robust CNN layers. To test this hypothesis, I develop a new class of CNNs called LCANets, which simulate recurrent, feature-specific lateral competition between neighboring neurons via a sparse coding model termed the Locally Competitive Algorithm (LCA). I first perform an analysis of the response properties of LCA and show that sparse representations formed by lateral competition more accurately mirror response characteristics of primary visual cortical populations and are more useful for downstream tasks like object recognition than previous sparse CNNs, which approximate competition with winner-take-all mechanisms implemented via thresholding.
- Date Issued
- 2022
- PURL
- http://purl.flvc.org/fau/fd/FA00014050
- Subject Headings
- Neural networks (Computer science), Machine learning, Computer vision
- Format
- Document (PDF)
- Title
- Mechanisms of Selective Attention in Working Memory, Modeled from Human Alpha Band Oscillations.
- Creator
- Nouri, Asal, Ester, Edward, Hahn, William, Florida Atlantic University, Center for Complex Systems and Brain Sciences, Charles E. Schmidt College of Science
- Abstract/Description
- Working memory (WM) enables the flexible representation of information over short intervals. It is established that WM performance can be enhanced by a retrospective cue presented during storage, yet the neural mechanisms responsible for this benefit are unclear. Here, we tested several explanations for retrospective cue benefits by quantifying changes in spatial WM representations reconstructed from alpha-band (8-12 Hz) EEG activity recorded from human participants before and after the presentation of a retrospective cue. This allowed us to track cue-related changes in WM representations with high temporal resolution. Our findings suggest that retrospective cues engage several different mechanisms, such as recovery of information previously decreased to baseline after being cued as relevant and protecting the cued item from temporal decay, to mitigate information loss during WM storage. Our EEG findings suggest that participants can supplement active memory traces with information from other memory stores. We next sought to better understand these additional store(s) by asking whether they are subject to the same temporal degradation seen in active memory representations during storage. We observed a significant increase in the quality of location representations following a retrocue, but the magnitude of this benefit was linearly and inversely related to the timing of the retrocue, such that later cues yielded smaller increases.
- Date Issued
- 2023
- PURL
- http://purl.flvc.org/fau/fd/FA00014192
- Subject Headings
- Working memory, Short-term memory, Attention, Alpha Rhythm
- Format
- Document (PDF)
- Title
- PRESERVING KNOWLEDGE IN SIMULATED BEHAVIORAL ACTION LOOPS.
- Creator
- St.Clair, Rachel, Barenholtz, Elan, Hahn, William, Florida Atlantic University, Center for Complex Systems and Brain Sciences, Charles E. Schmidt College of Science
- Abstract/Description
- One basic goal of artificial learning systems is the ability to continually learn throughout that system’s lifetime. Transitioning between tasks and re-deploying prior knowledge is thus a desired feature of artificial learning. However, in deep-learning approaches, the problem of catastrophic forgetting of prior knowledge persists. As a field, we want to solve the catastrophic forgetting problem without requiring exponential computation or time, while demonstrating real-world relevance. This work proposes a novel model that uses an evolutionary algorithm with a meta-learning-style objective fitted with resource-constraint metrics. Four reinforcement learning environments are considered with the shared concept of depth, although the collection of environments is multi-modal. This system shows preservation of some knowledge in sequential task learning and protection against catastrophic forgetting in deep neural networks.
- Date Issued
- 2022
- PURL
- http://purl.flvc.org/fau/fd/FA00013896
- Subject Headings
- Artificial intelligence, Deep learning (Machine learning), Reinforcement learning, Neural networks (Computer science)
- Format
- Document (PDF)
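The evolutionary algorithm with a resource-constraint term described in the abstract above can be sketched as a minimal truncation-selection loop; the fitness penalty, population size, and mutation scale here are illustrative assumptions, not the thesis's actual model.

```python
import random

def evolve(fitness, dim=5, pop_size=20, n_gen=50, sigma=0.3, penalty=0.01):
    """Minimal evolutionary loop whose objective includes a resource-cost penalty."""
    pop = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(pop_size)]

    def score(ind):
        # task fitness minus a cost proportional to parameter magnitude ("resources")
        return fitness(ind) - penalty * sum(abs(w) for w in ind)

    for _ in range(n_gen):
        pop.sort(key=score, reverse=True)
        parents = pop[: pop_size // 2]                              # truncation selection
        children = [[w + random.gauss(0, sigma) for w in p] for p in parents]
        pop = parents + children                                    # elitism + mutation
    return max(pop, key=score)
```

Because selection acts on the penalized score rather than raw task fitness, the population is pushed toward solutions that solve the task while using fewer "resources", one simple way to operationalize the resource constraint the abstract mentions.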