Current Search: info:fedora/islandora:personCModel (x) » info:fedora/fau:legacyETDs (x) » Department of Computer and Electrical Engineering and Computer Science (x)
- Title
- Design of a high signal to noise ratio electrical impedance plethysmograph.
- Creator
- Urso, Alessio Francesco., Florida Atlantic University, Shankar, Ravi, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
We have developed a high signal-to-noise ratio, automatically resetting electrical impedance plethysmograph for noninvasive determination of blood pressure in pigeons. Pigeons are finding increased use as an economical and appropriate animal model for the study of human atherosclerosis. The impedance plethysmograph obtains the pulsatile arterial volume change as an impedance pulse. Nyboer's equation may then be used to extract the arterial volume change from the impedance pulse. The designed impedance plethysmograph has a sensitivity of 430 mV/mΩ and a noise level of 0.12 mΩ peak-to-peak, significantly better than systems reported earlier. Refinements to further enhance the performance are also presented.
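Nyboer's equation, mentioned in the abstract, relates the measured impedance pulse to the pulsatile arterial volume change. A minimal sketch of that relation (sign conventions vary, and the resistivity, segment length, and baseline impedance values below are hypothetical illustrations, not figures from the thesis):

```python
def nyboer_volume_change(rho, length, z0, dz):
    """Nyboer's equation: pulsatile volume change (cm^3) from an
    impedance pulse dz (ohm), given blood resistivity rho (ohm*cm),
    segment length (cm), and baseline impedance z0 (ohm)."""
    return rho * (length / z0) ** 2 * dz

# Hypothetical example values (not from the thesis):
dv = nyboer_volume_change(rho=150.0, length=5.0, z0=100.0, dz=0.1)
```

Because an increase in segment volume lowers its impedance, many presentations carry a minus sign on the right-hand side; the magnitude is the same either way.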
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14609
- Subject Headings
- Impedance plethysmography, Atherosclerosis--Animal models, Blood pressure--Measurement
- Format
- Document (PDF)
- Title
- Design of a power management model for a solar/fuel cell hybrid energy system.
- Creator
- Melendez, Rosana., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis proposes a Power Management Model (PMM) for optimization of several green power generation systems. A Photovoltaic/Fuel cell Hybrid Energy System (PFHES) consisting of solar cells, electrolyzer and fuel cell stack is utilized to meet a specific DC load bank for various applications. The Photovoltaic system is the primary power source to take advantage of renewable energy. The electrolyzer-fuel cell integration is used as a backup and as a hydrogen storage system with the different energy sources integrated through a DC link bus. An overall power management strategy is designed for the optimization of the power flows among the different energy sources. Extensive simulation experiments have been carried out to verify the system performance under the PMM governing strategy. The simulation results indeed demonstrate the effectiveness of the proposed approach.
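The power-flow strategy the abstract describes (PV serves the load first, surplus drives the electrolyzer for hydrogen storage, and the fuel cell covers any deficit over the DC link) can be sketched as a simple rule-based dispatcher. This is only an illustration of that idea under assumed names and units, not the thesis's actual PMM:

```python
def dispatch(pv_power_w, load_w):
    """Rule-based DC-bus power balance: PV serves the load first;
    surplus charges hydrogen storage via the electrolyzer, and any
    deficit is supplied by the fuel cell."""
    surplus = pv_power_w - load_w
    return {
        "to_electrolyzer_w": max(surplus, 0.0),
        "from_fuel_cell_w": max(-surplus, 0.0),
    }
```

A fuller model would also cap each flow at the electrolyzer and fuel cell power ratings.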
- Date Issued
- 2010
- PURL
- http://purl.flvc.org/FAU/2705074
- Subject Headings
- Electric power systems, Building-integrated photovoltaic systems, Renewable energy sources, Hydrogen as fuel, Research
- Format
- Document (PDF)
- Title
- Design of a Test Framework for the Evaluation of Transfer Learning Algorithms.
- Creator
- Weiss, Karl Robert, Khoshgoftaar, Taghi M., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A traditional machine learning environment is characterized by the training and testing data being drawn from the same domain, therefore having similar distribution characteristics. In contrast, a transfer learning environment is characterized by the training data having different distribution characteristics from the testing data. Previous research on transfer learning has focused on the development and evaluation of transfer learning algorithms using real-world datasets. Testing with real-world datasets exposes an algorithm to a limited number of data distribution differences and does not exercise an algorithm's full capability and boundary limitations. In this research, we define, implement, and deploy a transfer learning test framework to test machine learning algorithms. The transfer learning test framework is designed to create a wide range of distribution differences that are typically encountered in a transfer learning environment. By testing with many different distribution differences, an algorithm's strong and weak points can be discovered and evaluated against other algorithms. This research additionally performs case studies that use the transfer learning test framework. The first case study focuses on measuring the impact of exposing algorithms to the Domain Class Imbalance distortion profile. The next case study uses the entire transfer learning test framework to evaluate both transfer learning and traditional machine learning algorithms. The final case study uses the transfer learning test framework in conjunction with real-world datasets to measure the impact of the base traditional learner on the performance of transfer learning algorithms. Two additional experiments are performed that are focused on using unique real-world datasets. The first experiment uses transfer learning techniques to predict fraudulent Medicare claims. The second experiment uses a heterogeneous transfer learning method to predict phishing webpages.
These case studies will be of interest to researchers who develop and improve transfer learning algorithms. This research will also be of benefit to machine learning practitioners in the selection of high-performing transfer learning algorithms.
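The Domain Class Imbalance distortion profile named in the abstract means source and target domains share classes but have different class priors. A minimal sketch of generating such a synthetic source/target pair (the class ratios, Gaussian feature model, and sample sizes are illustrative assumptions, not the framework's actual parameters):

```python
import random

def make_domain(n, pos_ratio, shift=0.0, seed=0):
    """Synthetic binary-class domain: one Gaussian feature per class,
    with a controllable positive-class prior (pos_ratio) and an
    optional mean shift to mimic covariate differences."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        label = 1 if rng.random() < pos_ratio else 0
        x = rng.gauss(label + shift, 1.0)
        data.append((x, label))
    return data

source = make_domain(1000, pos_ratio=0.5, seed=1)  # balanced source domain
target = make_domain(1000, pos_ratio=0.1, seed=2)  # imbalanced target domain
```

Training on `source` and evaluating on `target` then exposes a learner to exactly the prior-shift condition the case study measures.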
- Date Issued
- 2017
- PURL
- http://purl.flvc.org/fau/fd/FA00005925
- Subject Headings
- Dissertations, Academic--Florida Atlantic University, Machine learning, Algorithms, Machine learning--Development
- Format
- Document (PDF)
- Title
- Design of analog building blocks useful for artificial neural networks.
- Creator
- Renavikar, Ajit Anand., Florida Atlantic University, Shankar, Ravi, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Software simulations of a scalable, VLSI-implementable architecture and algorithm for character recognition by a research group at Florida Atlantic University (FAU) have shown encouraging results. We address here hardware implementation issues pertinent to the classification phase of character recognition. Using the digit classification techniques developed at FAU as a foundation, we have designed and simulated general-purpose building blocks useful for a possible implementation of a digital and analog CMOS VLSI chip that is suitable for a variety of artificial neural network (ANN) architectures. HSPICE was used to perform circuit-level simulations of the building blocks. We present here the details of implementation of the recognition chip, including the architecture, circuit design, and the simulation results.
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/15328
- Subject Headings
- Neural networks (Computer science), Artificial intelligence, Optical character recognition devices, Pattern recognition systems
- Format
- Document (PDF)
- Title
- THE DESIGN OF HIGH FREQUENCY OSCILLATORS: NOISE CHARACTERIZATION, DESIGN THEORY, AND MEASUREMENTS.
- Creator
- VICTOR, ALAN MICHAEL., Florida Atlantic University, Gazourian, Martin G., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A design theory for high frequency oscillators is presented. Emphasis is placed on oscillator design techniques which are applicable to the electrical tuning of LC and transmission line resonators. Attention is paid to design approaches which yield an oscillator with high spectral purity and a large signal-to-noise ratio. Theory and measurements demonstrate, for the oscillator configurations investigated, that a small L/C ratio is desirable for improved oscillator signal-to-noise ratio. Equations are developed which define the noise figure of the oscillator due to the additive noise of the active device. This analysis demonstrates the need for a high device starting transconductance, which should be subsequently reduced during oscillation to minimize the device noise contribution. A relationship is developed between the receiver dynamic range and the oscillator signal-to-noise ratio. Oscillator designs in the region 20 MHz - 200 MHz verify the analysis. A unified approach to large signal oscillator design is investigated, and relationships to oscillator signal-to-noise ratio using the previously developed theory are noted.
- Date Issued
- 1980
- PURL
- http://purl.flvc.org/fcla/dt/14043
- Subject Headings
- Oscillators, Audio-frequency
- Format
- Document (PDF)
- Title
- The design of reliable decentralized computer systems.
- Creator
- Wu, Jie., Florida Atlantic University, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
With the increase in the applications of computer technology, there are more and more demands for the use of computer systems in the area of real-time applications and critical systems. Reliability and performance are fundamental design requirements for these applications. In this dissertation, we develop some specific aspects of a fault-tolerant decentralized system architecture. This system can execute concurrent processes and it is composed of processing elements that have only local memories with point-to-point communication. A model using hierarchical layers describes this system. Fault tolerance techniques are discussed for the applications, software, operating system, and hardware layers of the model. Scheduling of communicating tasks to increase performance is also addressed. Some special problems such as the Byzantine Generals problem are considered. We have shown that, by combining reliable techniques on different layers and with consideration of system performance, one can provide a system with a very high level of reliability as well as performance.
- Date Issued
- 1989
- PURL
- http://purl.flvc.org/fcla/dt/12237
- Subject Headings
- Electronic digital computers--Reliability, Fault-tolerant computing, System design, Computer software--Reliability
- Format
- Document (PDF)
- Title
- THE DESIGN OF SWITCHED-CAPACITOR HIGHPASS FILTERS.
- Creator
- LEE, KING FU., Florida Atlantic University, Gazourian, Martin G., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The design of high order switched-capacitor highpass filters is presented. Emphasis is placed on the design procedures of cascaded biquadratic sections and ladder network realizations of switched-capacitor highpass filters. The stability problem of the doubly terminated switched-capacitor ladder highpass filter is discussed. Design examples are presented to illustrate the design procedures. The sensitivities of the realization methods are discussed. An analytical equation of the gain deviation for the cascaded biquadratic sections realization is derived. Monte Carlo analysis is performed for the design examples. The results of the analyses are compared to reveal the differences in sensitivities in terms of the order of the filters and the type of realizations.
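The building block behind such filters is the switched capacitor itself, which emulates a resistor: a capacitor C toggled between two nodes at switching rate f_s transfers charge like a resistance R ≈ 1/(f_s·C). This is the standard textbook relation, not a formula quoted from the thesis, and it is easy to check numerically:

```python
def sc_equivalent_resistance(f_switch_hz, c_farads):
    """Equivalent resistance of a switched capacitor: R = 1 / (f_s * C)."""
    return 1.0 / (f_switch_hz * c_farads)

# A 1 pF capacitor switched at 100 kHz emulates a 10 megohm resistor,
# which is why switched-capacitor filters achieve large RC time
# constants on-chip without large physical resistors.
r = sc_equivalent_resistance(100e3, 1e-12)
```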
- Date Issued
- 1983
- PURL
- http://purl.flvc.org/fcla/dt/14169
- Subject Headings
- Switched capacitor circuits, Digital filters (Mathematics)
- Format
- Document (PDF)
- Title
- DESIGN/IMPLEMENTATION OF AN EXPERT SYSTEM USING A DATA BASE MANAGEMENT SYSTEM ARCHITECTURE IN DEVELOPMENT OF ITS INFERENCE ENGINE.
- Creator
- LABOONE, PERRY ALLEN., Florida Atlantic University, Hoffman, Frederick, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The subject of this thesis is the design and implementation of an expert system from a standard data base management software language. The advantages and limitations of such a system design are discussed and supported by an accompanying implementation. Both the design and implementation demonstrate what gives the expertise or personification of human reasoning to a machine and why this type of reasoning is well suited to certain types of problems. This fundamental departure from traditional deterministic analytical problem solving is accomplished by developing a system that is heuristic in nature. This heuristic implementation provides for a system that assists in the development of an emerging solution, rather than a deterministic solution in and of itself (i.e., a system that is programmed with a set of meta-knowledge rules that governs the decision making process and acts upon a second set of knowledge rules).
- Date Issued
- 1986
- PURL
- http://purl.flvc.org/fcla/dt/14349
- Subject Headings
- Expert systems (Computer science), Systems design
- Format
- Document (PDF)
- Title
- Designing non-proprietary and practical ISDN terminal equipment for basic access.
- Creator
- Mitchell, Todd., Florida Atlantic University, Ilyas, Mohammad, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis illustrates the design of non-proprietary and practical ISDN terminal equipment for basic access. Terminal compatibility with the AT&T 5ESS, Northern Telecom DMS100, and Siemens EWSD Class 5 switches is considered. General compliance is dictated by all applicable CCITT recommendations and BELLCORE technical references. A consolidation of the useful information in the CCITT and BELLCORE technical documents is provided from a design engineer's perspective. A practical, cost effective implementation is outlined and considerations for flexibility due to the changing requirements are explored. The objective is to specify simplified guidelines which can be followed to create generic ISDN terminal equipment which is non-proprietary and practical for ISDN subscribers today.
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14662
- Subject Headings
- Integrated services digital networks, Data transmission systems--Equipment and supplies
- Format
- Document (PDF)
- Title
- Detection and classification of marine mammal sounds.
- Creator
- Esfahanian, Mahdi, Zhuang, Hanqi, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The ocean is home to a large population of marine mammals such as dolphins and whales, and concerns over anthropogenic activities in the regions close to their habitats have increased. Therefore, the ability to detect the presence of these species in the field, and to analyze and classify their vocalization patterns for signs of distress and distortion of their communication calls, will prove to be invaluable in protecting these species. The objective of this research is to investigate methods that automatically detect and classify vocalization patterns of marine mammals. The first work performed is the classification of bottlenose dolphin calls by type. The extraction of salient and distinguishing features from recordings is a major part of this endeavor. To this end, two strategies are evaluated with real datasets provided by the Woods Hole Oceanographic Institution: the first strategy is to use contour-based features such as Time-Frequency Parameters and Fourier Descriptors, and the second is to employ texture-based features such as Local Binary Patterns (LBP) and Gabor Wavelets. Once dolphin whistle features are extracted from spectrograms, selection of classification procedures is crucial to the success of the process. For this purpose, the performances of classifiers such as K-Nearest Neighbor, Support Vector Machine, and Sparse Representation Classifier (SRC) are assessed thoroughly, together with those of the underlying feature extractors.
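Of the classifiers named, K-Nearest Neighbor is the simplest to sketch: a whistle's feature vector takes the majority label of its k closest training examples. The toy feature vectors and call-type labels below are hypothetical, purely to show the mechanics; the actual features in this work are the contour- and texture-based descriptors listed above:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote among the k
    nearest training examples (Euclidean distance)."""
    by_dist = sorted(train, key=lambda item: math.dist(item[0], query))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D whistle features with call-type labels:
train = [((0.1, 0.2), "whistle_A"), ((0.2, 0.1), "whistle_A"),
         ((0.9, 0.8), "whistle_B"), ((0.8, 0.9), "whistle_B")]
```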
- Date Issued
- 2014
- PURL
- http://purl.flvc.org/fau/fd/FA00004282
- Subject Headings
- Acoustic phenomena in nature, Marine mammals -- Effect of noise on, Marine mammals -- Vocalization, Signal processing -- Mathematics, Underwater acoustics, Wavelets (Mathematics)
- Format
- Document (PDF)
- Title
- Detection of change-prone telecommunications software modules.
- Creator
- Weir, Ronald Eugene., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Accurately classifying the quality of software is a major problem in any software development project. Software engineers develop models that provide early estimates of quality metrics which allow them to take actions against emerging quality problems. The use of a neural network as a tool to classify programs as a low, medium, or high risk for errors or change is explored using multiple software metrics as input. It is demonstrated that a neural network, trained using the back-propagation supervised learning strategy, produced the desired mapping between the static software metrics and the software quality classes. The neural network classification methodology is compared to the discriminant analysis classification methodology in this experiment. The comparison is based on two and three class predictive models developed using variables resulting from principal component analysis of software metrics.
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/15183
- Subject Headings
- Computer software--Evaluation, Software engineering, Neural networks (Computer science)
- Format
- Document (PDF)
- Title
- Determination of receptive fields in neural networks using Alopex.
- Creator
- Shah, Gaurang G., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This research aims at proposing a model for visual pattern recognition inspired by the neural circuitry in the brain. Our attempt is to propose a few modifications to the Alopex algorithm and use it for the calculation of the receptive fields of neurons in the trained network. We have developed a small-scale, four-layered neural network model for simple character recognition as well as complex image patterns, which can recognize patterns transformed by affine conversion. Here the Alopex algorithm is presented as an iterative and stochastic processing method, which was proposed for optimization of a given cost function over hundreds or thousands of iterations. In this case the receptive fields of the neurons in the output layers are obtained using the Alopex algorithm.
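Alopex, as the abstract says, is an iterative, stochastic optimizer: each parameter takes a small random step whose direction is biased by the correlation between the previous parameter change and the previous cost change. Below is a generic textbook-style sketch of that update rule for minimization; the step size, temperature, and quadratic test cost are assumptions, not the settings used in this thesis:

```python
import math
import random

def alopex_minimize(cost, params, steps=500, delta=0.05, temp=1.0, seed=0):
    """Minimize cost(params) with the basic Alopex rule: each parameter
    moves by -delta with probability p = 1/(1 + exp(-dW*dE/T)), where
    dW is its previous change and dE is the previous cost change."""
    rng = random.Random(seed)
    prev = list(params)
    curr = [p + delta * rng.choice((-1, 1)) for p in params]  # random first move
    prev_cost, curr_cost = cost(prev), cost(curr)
    best = min(prev_cost, curr_cost)
    for _ in range(steps):
        d_cost = curr_cost - prev_cost
        nxt = []
        for p_prev, p_curr in zip(prev, curr):
            d_w = p_curr - p_prev
            # If the last move in this parameter raised the cost
            # (d_w * d_cost > 0), a downward step becomes more likely.
            p_down = 1.0 / (1.0 + math.exp(-d_w * d_cost / temp))
            nxt.append(p_curr - delta if rng.random() < p_down else p_curr + delta)
        prev, curr = curr, nxt
        prev_cost, curr_cost = curr_cost, cost(curr)
        best = min(best, curr_cost)
    return curr, best
```

The temperature T controls how strongly the correlation biases the walk; annealing T over the run is the usual refinement.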
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13298
- Subject Headings
- Pattern recognition systems, Neural networks (Computer science), Computer algorithms, Neuroanatomy, Image processing
- Format
- Document (PDF)
- Title
- Determining the Effectiveness of Human Interaction in Human-in-the-Loop Systems by Using Mental States.
- Creator
- Lloyd, Eric, Huang, Shihong, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A self-adaptive software system is developed to predict the stock market. Its Stock Prediction Engine functions autonomously when its skill set suffices to achieve its goal, and it includes the human-in-the-loop when it recognizes conditions benefiting from more complex, expert human intervention. Key to the system is a module that decides on human participation. It works by monitoring three mental states unobtrusively and in real time with electroencephalography (EEG). The mental states are drawn from the Opportunity-Willingness-Capability (OWC) model. This research demonstrates that the three mental states are predictive of whether the Human Computer Interaction System functions better autonomously (human with low scores on opportunity and/or willingness, capability) or with the human-in-the-loop, with willingness carrying the largest predictive power. This transdisciplinary software engineering research exemplifies the next step of self-adaptive systems, in which human and computer benefit from optimized autonomous and cooperative interactions, and in which neural inputs allow for unobtrusive pre-interactions.
- Date Issued
- 2016
- PURL
- http://purl.flvc.org/fau/fd/FA00004764
- Subject Headings
- Cognitive neuroscience, Neural networks (Computer science), Pattern recognition systems, Artificial intelligence, Self-organizing systems, Human-computer interaction, Human information processing
- Format
- Document (PDF)
- Title
- Developing a photovoltaic MPPT system.
- Creator
- Bennett, Thomas, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Many issues related to the design and implementation of a maximum power point tracking (MPPT) converter as part of a photovoltaic (PV) system are addressed. To begin with, variations of the single diode model for a PV module are compared to determine whether the simplest variation may be used for MPPT PV system modeling and analysis purposes. As part of this determination, four different DC/DC converters are used in conjunction with these different PV models. This is to verify consistent behavior across the different PV models, as well as across the different converter topologies. Consistent results across the different PV models will allow a simpler model to be used for simulation and analysis. Consistent results with the different converters will verify that MPPT algorithms are converter independent. Next, MPPT algorithms are discussed. In particular, the differences between the perturb and observe and the incremental conductance algorithms are explained and illustrated. A new MPPT algorithm is then proposed based on the deficiencies of the other algorithms. The proposed algorithm's parameters are optimized, and the results for different PV modules obtained. Realistic system losses are then considered, and their effect on the PV system is analyzed, especially in regard to the MPPT algorithm. Finally, a PV system is implemented, and the theoretical results, as well as the behavior of the newly proposed MPPT algorithm, are verified.
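The perturb-and-observe algorithm mentioned above fits in a few lines: perturb the operating voltage, observe the change in extracted power, keep the perturbation direction if power rose, otherwise reverse it. This is the generic textbook algorithm (the step size and variable names are assumptions), not the improved algorithm this dissertation proposes:

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.1):
    """One P&O iteration: return the next voltage reference given the
    current and previous (voltage, power) operating points."""
    dp = p - p_prev
    dv = v - v_prev
    if dp == 0:
        return v  # at (or oscillating around) the maximum power point
    if (dp > 0) == (dv > 0):
        return v + step  # power rose as voltage rose: keep climbing
    return v - step      # otherwise reverse direction
```

The well-known deficiency this exposes (and which motivates alternatives like incremental conductance) is that the reference oscillates around the maximum power point and can be misled by rapidly changing irradiance.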
- Date Issued
- 2012
- PURL
- http://purl.flvc.org/FAU/3356887
- Subject Headings
- Photovoltaic power systems, Design, Electronic circuits, Electric current converters, Power (Mechanics), Renewable energy sources
- Format
- Document (PDF)
- Title
- Developing accurate software quality models using a faster, easier, and cheaper method.
- Creator
- Lim, Linda., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Managers of software development need to know which components of a system are fault-prone. If this can be determined early in the development cycle then resources can be more effectively allocated and significant costs can be reduced. Case-Based Reasoning (CBR) is a simple and efficient methodology for building software quality models that can provide early information to managers. Our research focuses on two case studies. The first study analyzes source files and classifies them as fault-prone or not fault-prone. It also predicts the number of faults in each file. The second study analyzes the fault removal process, and creates models that predict the outcome of software inspections.
- Date Issued
- 2001
- PURL
- http://purl.flvc.org/fcla/dt/12746
- Subject Headings
- Computer software--Development, Computer software--Quality control, Software engineering
- Format
- Document (PDF)
- Title
- Development and simulation of vertical profiling capability for FAU autonomous underwater vehicles.
- Creator
- Lin, Huaying., Florida Atlantic University, Hsu, Sam, An, Edgar, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis describes the development of a simulation environment for Autonomous Underwater Vehicles (AUVs) on UNIX platforms. AUV missions can therefore be carried out without going to sea. The Yoyo controller, a component for AUVs, is also described in this thesis. The main function of Yoyo is to control the vertical profile of an AUV while it is navigating underwater performing data collection missions. The development of the controller is done in the simulation environment. Several test cases have been performed, and the test results have clearly demonstrated the successful development of the controller.
Show less - Date Issued
- 1998
- PURL
- http://purl.flvc.org/fcla/dt/15599
- Subject Headings
- Oceanographic submersibles--Computer simulation, Underwater navigation
- Format
- Document (PDF)
- Title
- Development of A Portable Impedance Based Flow Cytometer for Diagnosis of Sickle Cell Disease.
- Creator
- Dieujuste, Darryl, Zhuang, Hanqi, Du, Sarah, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Sickle cell disease is an inherited blood cell disorder that affects about 100,000 people in the US and results in high medical costs exceeding $1.1 billion annually. Sickle cell patients suffer from unpredictable, painful vaso-occlusive crises. Portable, cost-effective approaches for diagnosing and monitoring sickle blood activity are important for better management of the disease and for reducing medical costs. In this research, a mobile-application-controlled, impedance-based flow cytometer is developed for the diagnosis of sickle cell disease. Calibration of the portable device is performed using a component of known impedance value. The preliminary test results are then compared to those obtained by a commercial benchtop impedance analyzer for further validation. With the developed portable flow cytometer, experiments are performed on two sickle cell samples and a healthy cell sample. The acquired results are subsequently analyzed with MATLAB scripts to extract single-cell-level impedance information as well as statistics of different cell conditions. Significant differences in cell impedance signals are observed between sickle cells and normal cells, as well as between sickle cells under hypoxia and normoxia conditions.
- Date Issued
- 2018
- PURL
- http://purl.flvc.org/fau/fd/FA00013145
- Subject Headings
- Sickle cell disease, Sickle cell anemia--Diagnosis, Flow cytometry--Diagnostic use, Mobile Applications
- Format
- Document (PDF)
- Title
- DEVELOPMENT OF A WEARABLE DEVICE FOR MONITORING PHYSIOLOGICAL PARAMETERS RELATED TO HEART FAILURE.
- Creator
- Iqbal, Sheikh Muhammad Asher, Asghar, Waseem, Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
Heart failure (HF) is a chronic cardiovascular disease caused by the heart's inability to supply sufficient blood, which leads to accumulation of fluid in the thoracic region. Traditionally, implantable cardioverter defibrillators (ICDs) are used to treat HF and to monitor its parameters. Healthcare wearable devices (HWDs) are devices that can be worn on or attached to the skin; they offer a non-invasive, low-cost means of providing healthcare at the point of care (POC). This dissertation discusses the development of an HWD for monitoring the parameters of HF, including thoracic impedance, electrocardiogram (ECG), heart rate, blood oxygen saturation, and the activity status of the subject; these are similar to the parameters monitored by an ICD. The dissertation discusses the development, design, and results of the HWD.
- Date Issued
- 2023
- PURL
- http://purl.flvc.org/fau/fd/FA00014349
- Subject Headings
- Wearable technology--Design and construction, Wearable devices, Heart failure
- Format
- Document (PDF)
- Title
- Development of a Wearable Device to Detect Epilepsy.
- Creator
- Khandnor Bakappa, Pradeepkumar, Agarwal, Ankur, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This paper evaluates the effectiveness of a wearable device, developed by the author, to detect different types of epileptic seizures and monitor epileptic patients. The device uses GSR, pulse, EMG, body temperature, and 3-axis accelerometer sensors to detect epilepsy. The device first learns the signal patterns of the epileptic patient under ideal conditions. The signal patterns generated during an epileptic seizure, which are distinct from other signal patterns, are then detected and analyzed by algorithms developed by the author. Based on this analysis, the device successfully detected different types of epileptic seizures. The author conducted an experiment on himself to determine the effectiveness of the device and the algorithms. Based on the simulation results, the algorithms are 100 percent accurate in detecting different types of epileptic seizures.
- Date Issued
- 2017
- PURL
- http://purl.flvc.org/fau/fd/FA00004937
- Subject Headings
- Epilepsy--Diagnosis--Technological innovations, Patient monitoring, Signal processing--Digital techniques, Wearable computers--Industrial applications
- Format
- Document (PDF)
- Title
- DEVELOPMENT OF AN ALGORITHM TO GUIDE A MULTI-POLE DIAGNOSTIC CATHETER FOR IDENTIFYING THE LOCATION OF ATRIAL FIBRILLATION SOURCES.
- Creator
- Ganesan, Prasanth, Ghoraani, Behnaz, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Atrial Fibrillation (AF) is a debilitating heart rhythm disorder affecting over 2.7 million people in the US and over 30 million people worldwide annually. It is strongly associated with stroke and several other risk factors, resulting in increased mortality and morbidity. Currently, the non-pharmacological therapy used to control AF is catheter ablation, in which the tissue surrounding the pulmonary veins (PVs) is cauterized (the PV isolation, or PVI, procedure) to block the ectopic triggers originating from the PVs from entering the atrium. However, the success rate of PVI, with or without other anatomy-based lesions, is only 50%-60%. A major reason for this suboptimal success rate is the failure to eliminate patient-specific non-PV sources present in the left atrium (LA), namely reentry sources (a.k.a. rotor sources) and focal sources (a.k.a. point sources). Several animal and human studies have shown that locating and ablating these sources significantly improves the long-term success rate of the ablation procedure. However, current technologies for locating these sources have limitations in resolution and often require additional or special hardware. In this dissertation, the goal is to develop an efficient algorithm to locate AF reentry and focal sources using electrograms recorded from a conventionally used high-resolution multi-pole diagnostic catheter.
- Date Issued
- 2019
- PURL
- http://purl.flvc.org/fau/fd/FA00013310
- Subject Headings
- Atrial Fibrillation--diagnosis, Algorithm, Catheter ablation
- Format
- Document (PDF)