Current Search: College of Engineering and Computer Science
- Title
- Effects of bio-diesel fuel blends on the performance and emissions of diesel engine.
- Creator
- Bastiani, Sergio., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
This study presents an experimental investigation into the effects of running biodiesel fuel blends on conventional diesel engines. Biofuels provide a way to produce fuels without redesigning any of today's engine technology while allowing greenhouse emissions to decrease. Biodiesel is one of these emerging biofuels; it can serve as an immediate alternative fuel, reduces greenhouse emissions, and offers a way to recycle used waste vegetable oils that are otherwise discarded. This study shows that blending biodiesel with petroleum diesel at levels of B5, B10, B15, and B20 can significantly decrease greenhouse emissions while maintaining performance output and efficiency similar to 100% petroleum diesel.
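The blend codes in the abstract above (B5 through B20) conventionally denote the volume percent of biodiesel in the blend. A minimal sketch, assuming that standard BXX notation (the helper name and volumes are illustrative, not from the thesis):

```python
# Hypothetical helper: interpret "BXX" blend codes (B5, B10, B15, B20),
# where XX is the volume percent of biodiesel in the blend.

def blend_volumes(code: str, total_liters: float) -> tuple[float, float]:
    """Return (biodiesel_liters, petroleum_diesel_liters) for a BXX code."""
    pct = int(code.lstrip("Bb")) / 100.0
    return total_liters * pct, total_liters * (1.0 - pct)

for code in ("B5", "B10", "B15", "B20"):
    bio, petro = blend_volumes(code, 100.0)
    print(f"{code}: {bio:.0f} L biodiesel + {petro:.0f} L petroleum diesel")
```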
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166446
- Subject Headings
- Biodiesel fuels, Research, Biodiesel fuels, Environmental aspects, Diesel motor, Alternative fuels, Testing, Greenhouse effect, Atmospheric
- Format
- Document (PDF)
- Title
- Fault tolerance and reliability patterns.
- Creator
- Buckley, Ingrid A., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The need to achieve dependability in critical infrastructures has become indispensable for government and commercial enterprises. This need has grown more pressing with the proliferation of malicious attacks on critical systems, such as healthcare, aerospace, and airline applications. Additionally, due to the widespread use of web services in critical systems, the need to ensure their reliability is paramount. We believe that patterns can be used to achieve dependability. We conducted a survey of fault tolerance, reliability, and web service products and patterns to better understand them. One objective of our survey was to evaluate the state of these patterns and to investigate which standards are being used in products and what tool support they have. Our survey found that these patterns are insufficient and that many web services products do not use them. In light of this, we wrote several fault tolerance and web services reliability patterns and present an analysis of them.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166447
- Subject Headings
- Fault-tolerant computing, Computer software, Reliability, Reliability (Engineering), Computer programs
- Format
- Document (PDF)
- Title
- Low complexity H.264 video encoder design using machine learning techniques.
- Creator
- Carrillo, Paula., Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
H.264/AVC encoder complexity is mainly due to the variable block sizes used in Intra and Inter frames. This makes H.264/AVC very difficult to implement, especially for real-time applications and mobile devices. The current technological challenge is to preserve the compression capacity and quality that H.264 offers while reducing the encoding time and, therefore, the processing complexity. This thesis applies machine learning techniques to video encoding mode decisions and investigates ways to improve the process of generating more general low-complexity H.264/AVC video encoders. The proposed H.264 encoding method decreases the complexity of the mode decision for Inter frames. Results show at least a 150% average reduction in complexity and at most a 0.6 average increase in PSNR for different kinds of videos and formats.
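The core idea the abstract describes can be sketched as follows: instead of an exhaustive rate-distortion search over all Inter coding modes, a classifier trained on cheap macroblock statistics predicts the mode directly. This toy stand-in uses a single hand-set variance threshold in place of a learned decision tree; the features and threshold are illustrative assumptions, not the thesis's actual model.

```python
# Toy sketch: predict the Inter mode from cheap macroblock statistics
# rather than an exhaustive rate-distortion search over every mode.

def mb_features(mb):
    """Cheap statistics for a macroblock given as a flat list of residual values."""
    n = len(mb)
    mean = sum(mb) / n
    var = sum((x - mean) ** 2 for x in mb) / n
    return mean, var

def predict_mode(mb, var_threshold=50.0):
    """Stand-in for a learned decision tree: low-variance residual blocks
    are coded with large partitions (SKIP/16x16), detailed ones as 8x8."""
    _, var = mb_features(mb)
    return "SKIP/16x16" if var < var_threshold else "8x8"

flat_block = [10] * 256                            # uniform residual
busy_block = [(i * 37) % 40 for i in range(256)]   # high-variance residual
print(predict_mode(flat_block), predict_mode(busy_block))
```

The complexity saving comes from the classifier being a handful of comparisons per macroblock, versus encoding the block under every candidate mode.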
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166448
- Subject Headings
- Code division multiple access, Digital media, Technological innovations, Image transmission, Technological innovations, Coding theory, Data structures (Computer science)
- Format
- Document (PDF)
- Title
- Video transcoding using machine learning.
- Creator
- Holder, Christopher., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The field of video transcoding has been evolving throughout the past ten years. The need to transcode video files has greatly increased because new standards are incompatible with old ones. This thesis takes the method of using machine learning for video transcoding mode decisions and discusses ways to improve the process of generating the algorithm for implementation in different video transcoders. The transcoding methods used decrease the complexity of the mode decision inside the video encoder. Methods that automate and improve the results are also discussed and implemented in two different transcoders: H.263 to VP6, and MPEG-2 to H.264. Both of these transcoders have shown a complexity reduction of almost 50%. Video transcoding is important because the number of video standards keeps increasing while devices usually can decode only one specific codec.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166451
- Subject Headings
- Coding theory, Image transmission, Technological innovations, File conversion (Computer science), Data structures (Computer science), MPEG (Video coding standard), Digital media, Video compression
- Format
- Document (PDF)
- Title
- Experiments and modeling on resistivity of multi-layer concrete with and without embedded rebar.
- Creator
- Liu, Yanbo., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
Factors such as water-to-cement ratio, moisture, mixture, presence and depth of rebar, and specimen dimensions, all of which affect the apparent resistivity of concrete, were analyzed by experimental and modeling methods. Cylindrical and rectangular prism concrete specimens were used in the experiments and exposed to high-moisture-room, laboratory room-temperature, high-humidity, and outdoor weather environments. Single-rebar and four-rebar specimens were used to study the effect of rebar on the apparent resistivity. Modeling analysis was employed to verify and explain the experimental results. Based on the results, concrete with fly ash showed higher resistivity than concrete with only ordinary Portland cement. The presence of rebar had a significant effect on the measured apparent resistivity at some of the locations. The results could be used as a guide for field apparent resistivity measurements and provide a quick, precise, and easy way to estimate concrete quality.
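Field apparent-resistivity measurements of the kind the abstract discusses are commonly made with a Wenner four-probe array, for which the standard relation is rho_a = 2*pi*a*V/I, with electrode spacing a. A minimal sketch of that formula (the probe geometry and example numbers are assumptions, not taken from the thesis):

```python
import math

def wenner_apparent_resistivity(spacing_m: float, voltage_v: float,
                                current_a: float) -> float:
    """Apparent resistivity (ohm-m) from a Wenner four-probe measurement:
    rho_a = 2 * pi * a * V / I, with electrode spacing a in meters."""
    return 2.0 * math.pi * spacing_m * voltage_v / current_a

# e.g., 50 mm electrode spacing, 0.5 V potential drop at 0.1 mA:
rho = wenner_apparent_resistivity(0.05, 0.5, 1e-4)
print(f"{rho:.0f} ohm-m")
```

Rebar near the probe line shunts current and lowers the reading, which is why the study's single-rebar and four-rebar specimens matter for interpreting field surveys.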
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166452
- Subject Headings
- Reinforced concrete, Corrosion, Testing, Reinforcing bars, Properties, Concrete, Quality control
- Format
- Document (PDF)
- Title
- Development of functional relationships between radar and rain gage data using inductive modeling techniques.
- Creator
- Peters, Delroy., College of Engineering and Computer Science, Department of Civil, Environmental and Geomatics Engineering
- Abstract/Description
-
Traditional methods such as distance weighting, correlation, and data-driven methods have been used to estimate missing precipitation data. Also common is the use of radar (NEXRAD) data to provide better spatial distribution of precipitation as well as to infill missing rain gage data. Conventional regression models are often used to capture the highly variant, nonlinear spatial and temporal relationships between NEXRAD and rain gage data. This study aims to understand and model the relationships between radar (NEXRAD) estimated rainfall data and the data measured by conventional rain gages. The study is also an investigation into the use of emerging computational data modeling (inductive) techniques and mathematical programming formulations to develop new optimal functional approximations. Radar-based rainfall data and rain gage data are analyzed to understand the spatio-temporal associations, as well as the effect of changes in the length or availability of data on the models. The upper and lower Kissimmee basins of south Florida form the test bed used to evaluate the proposed approaches and to check the validity and operational applicability of these functional relationships between NEXRAD and rain gage data for infilling missing data.
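The conventional regression baseline the abstract mentions can be sketched as an ordinary least-squares fit between co-located NEXRAD and gage values, used to infill a missing gage record. The rainfall numbers below are made up for illustration; the thesis itself explores richer inductive models than this straight line.

```python
# Sketch: fit gage ~ a * nexrad + b by least squares, then infill a gap.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx        # (a, b) in gage ~ a*nexrad + b

nexrad = [0.0, 2.1, 5.0, 7.8, 12.3]      # made-up daily rainfall (mm)
gage   = [0.1, 1.8, 4.6, 8.2, 11.9]      # co-located gage readings (mm)
a, b = fit_line(nexrad, gage)
infilled = a * 6.4 + b                   # estimate gage for a missing day
print(round(infilled, 2))
```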
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166454
- Subject Headings
- Weather control, Mathematical models, Radar meteorology, Technological innovations, Precipitation (Meteorology), Measurement, Weather forecasting, Technological innovations
- Format
- Document (PDF)
- Title
- Design & performance of a wind and solar-powered autonomous surface vehicle.
- Creator
- Rynne, Patrick Forde., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
The primary objective of this research is the development of a wind- and solar-powered autonomous surface vehicle (WASP) for oceanographic measurements. This thesis presents the general design scheme, detailed aerodynamic and hydrodynamic aspects, sailing performance theory, and dynamic performance validation measurements obtained from a series of experiments. The WASP consists of a 4.2-meter-long sailboat hull, a low-Reynolds-number composite wing, a 2000 watt-hour battery reservoir, a system of control actuators, a control system running on an embedded microprocessor, a suite of oceanographic sensors, and power regeneration from solar energy. The vehicle has a maximum speed of five knots and weighs approximately 350 kilograms. Results from four oceanographic missions conducted in the Port Everglades Intracoastal Waterway in Dania Beach, Florida are presented. Water temperature, salinity, and oxidation-reduction measurements recorded during these missions are also discussed. The combination of a mono-hull and a solid wing in an autonomous system is a viable design for a long-range ocean observation platform. The results of four near-shore ocean observation missions illustrate the initial capabilities of the design. Future work aimed at further reducing both the mass of the wing design and the power requirements of the system would increase performance in all operating conditions and should be considered. Furthermore, the development of the legal framework governing ocean vehicles must be pursued with respect to unmanned autonomous systems.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166455
- Subject Headings
- Hydrographic surveying, Instruments, Evaluation, Aids to navigation, Equipment and supplies, Testing, Sailboats, Design and construction, Robots, Control systems, Oceanographic instruments, Evaluation
- Format
- Document (PDF)
- Title
- The effects of nitric acid and silane surface treatments on carbon fibers and carbon/vinyl ester composites before and after seawater exposure.
- Creator
- Langston, Tye A., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
This research focuses on carbon fiber treatment with nitric acid and 3-(trimethoxysilyl)propyl methacrylate silane, and how it affects carbon/vinyl ester composites. These composites offer great benefits, but it is difficult to bond the fiber and matrix together, and without a strong interfacial bond, composites fall short of their potential. Silanes work well with glass fiber but do not bond directly to carbon fiber because its surface is not reactive to liquid silanes. Oxidizing surface treatments are often prescribed for improved wetting and bonding to carbon, but good results are not always achieved; furthermore, there is the unanswered question of environmental durability. This research aimed to form a better understanding of oxidizing carbon fiber treatments, to determine whether silanes can be bonded to oxidized surfaces, and to establish how these treatments affect composite strength and durability before and after seawater exposure. Nitric acid treatments on carbon fibers were found to improve their tensile strength to a constant level by smoothing surface defects and chemically modifying their surfaces through increased carbonyl and carboxylic acid concentrations. Increasing these surface group concentrations raises fiber polar energy and causes the fibers to cohere. This impedes wetting, resulting in poor-quality, high-void-content composites, even though there appeared to be improved adhesion between the fibers and the matrix. Silane was found to bond to the oxidized carbon fiber surfaces, as evidenced by changes in both fiber and composite properties. The fibers exhibited low polarity and cohesion, while the composites displayed excellent resin wetting, low void content, and low seawater weight gain and swelling. On the contrary, the oxidized fibers that were not treated with silane exhibited high polarity and fiber cohesion; their composites displayed poor wetting, high void content, high seawater weight gain, and low swelling. Both fiber treatment types resulted in great improvements in dry transverse tensile strength over the untreated fibers, but the oxidized fiber composites lost strength as the acid treatment time was extended, due to poor wetting. The acid/silane-treated composites lost some transverse tensile strength after seawater exposure, but the nitric acid oxidized fiber composites appeared to be more seawater durable.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/172669
- Subject Headings
- Silane compounds, Testing, Surface chemistry, Composite materials, Biodegradation, Carbon compounds, Testing
- Format
- Document (PDF)
- Title
- Web accessibility for the hearing impaired.
- Creator
- Pasmore, Simone., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
With the exponential increase in Internet usage and the embedding of multimedia content on the Web, some Internet resources remain inaccessible to people with disabilities. In particular, people who are deaf or Hard of Hearing (HOH) encounter inaccessible Web sites due to a lack of Closed Captioning (CC) for multimedia content on the Web, the absence of sign language equivalents for Web content, and an insufficient evaluation framework for determining whether a Web page is accessible to the Hearing Impaired community. Several barriers to accessing content need to be removed in order for the Hearing Impaired community to enjoy the full benefits of the information repository that is the Internet. The research contributions of this thesis address some of the Web accessibility problems faced by the Hearing Impaired community. The objectives are to create automated CC for multimedia content on the Web, to embed sign language equivalents for content available on the Web, to create a framework for evaluating Web accessibility for the Hearing Impaired community, and to create a social network for the Deaf community. To demonstrate the feasibility of fulfilling these objectives, several prototypes were implemented. These prototypes have been used in real-life scenarios in order to obtain an objective evaluation of the proposed framework. Further, the implemented prototypes have had an impact on both the academic community and industry.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fcla/dt/177011
- Subject Headings
- Computers and people with disabilities, Interactive multimedia, Hearing impaired, Services for, Communication devices for people with disabilities, User interfaces (Computer systems), Web sites, Design
- Format
- Document (PDF)
- Title
- The adhesive effects in dental restoration.
- Creator
- Vargas, Raul., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
The dental field has seen a proliferation of new adhesives on the market. The purpose of this study is to evaluate the mechanical properties of a total restoration, based on the manufacturers' technical specifications and on experimental mechanical test results. The dentist's optimal selection occurs when the most appropriate adhesive can be chosen for a specific restoration, avoiding wasted time and material and exposure to marginal infections from a failed restoration. This research was developed in stages. The first step is the study of the tooth's morphological information, followed by identification of the structure type and chemical composition of six different pure adhesives using X-ray diffraction (XRD), energy dispersive spectroscopy (EDS), and scanning electron microscopy (SEM). The final step is to perform the mechanical tests and computer simulation and to discuss the results in order to identify the best dental adhesive. Results: The XRD, SEM, and EDS experiments show an amorphous structure and the chemical composition of the samples. The mechanical tests provide the mechanical properties under tension and shear rupture stress; the Poisson ratio, strain, and other relationships are used in the computer simulation. Significance: The first conclusion is that an amorphous structure is present in all six adhesives, which suggests strong possibilities of bonding with neighboring molecules; the discussion extends to the bonding advantages of this type of structure in total dental restoration. Findings: First, the delay time of photopolymerization was controlled through the water evaporation of the etching treatment. In addition, varying the wavelength of the curing light yielded better molecular organization and avoided internal stress and bonding defects. Lastly, the chemical composition provided the opportunity to predict the type and strength of the bond.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fcla/dt/177014
- Subject Headings
- Dental adhesives, Fillings (Dentistry), Polymers in medicine, Dental bonding
- Format
- Document (PDF)
- Title
- Fuzzycuda: interactive matte extraction on a GPU.
- Creator
- Gibson, Joel, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Natural matte extraction is a difficult and generally unsolved problem. Generating a matte from a nonuniform background traditionally requires a tediously hand-drawn matte. This thesis studies recent methods that require the user to place only modest scribbles identifying the foreground and the background. This research demonstrates a new GPU-based implementation of the recently introduced FuzzyMatte algorithm. Interactive matte extraction was achieved on a CUDA-enabled G80 graphics processor. Experimental results demonstrate improved performance over the previous CPU-based version. An in-depth analysis of experimental data from the GPU and CPU implementations is provided, and the design challenges of porting a variant of Dijkstra's shortest-distance algorithm to a parallel processor are considered.
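The scribble-propagation step underlying this family of methods can be sketched with a plain Dijkstra on the pixel grid: from user-scribbled seed pixels, a shortest-path distance spreads outward, with each step between neighboring pixels costing their intensity difference, and each unknown pixel labeled by whichever seed set it is "closer" to. The cost function and tiny image below are illustrative assumptions, not the FuzzyMatte authors' exact formulation.

```python
import heapq

def grid_dijkstra(img, seeds):
    """Shortest-path distance from seed pixels over a grayscale grid,
    where moving between neighbors costs their intensity difference."""
    h, w = len(img), len(img[0])
    dist = {(r, c): float("inf") for r in range(h) for c in range(w)}
    pq = [(0.0, s) for s in seeds]
    for s in seeds:
        dist[s] = 0.0
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist[(r, c)]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + abs(img[nr][nc] - img[r][c])
                if nd < dist[(nr, nc)]:
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist

img = [[0, 0, 200],
       [0, 10, 210],
       [0, 20, 220]]
d_fg = grid_dijkstra(img, [(0, 2)])   # scribble on the bright region
d_bg = grid_dijkstra(img, [(0, 0)])   # scribble on the dark region
label = "FG" if d_fg[(2, 2)] < d_bg[(2, 2)] else "BG"
print(label)
```

The thesis's GPU challenge is that this priority-queue propagation is inherently sequential, which is what makes porting it to a parallel processor non-trivial.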
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/186288
- Subject Headings
- Computer graphics, Scientific applications, Information visualization, High performance computing, Real-time data processing
- Format
- Document (PDF)
- Title
- Enabling access for mobile devices to the web services resource framework.
- Creator
- Mangs, Jan Christian., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The increasing availability of Web services and grid computing has made it easier to access and reuse different types of services. Web services provide network-accessible interfaces to application functionality in a platform-independent manner. Developments in grid computing have led to the efficient distribution of computing resources and power through the use of stateful web services. At the same time, mobile devices have become a ubiquitous, inexpensive, and powerful computing platform. Concepts such as cloud computing have pushed the trend toward using grid concepts in the Internet domain and are ideally suited for Internet-supported mobile devices. Currently, there are few complete implementations that leverage mobile devices as members of a grid or virtual organization. This thesis presents a framework that enables mobile devices to access stateful Web services on a Globus-based grid. To illustrate the presented framework, a user-friendly mobile application has been created that utilizes the framework libraries to demonstrate the various functionalities accessible from any mobile device that supports Java ME.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/186290
- Subject Headings
- User interfaces (Computer systems), Data structures (Computer science), Mobile computing, Security measures, Mobile communication systems, Computational grids (Computer systems)
- Format
- Document (PDF)
- Title
- Collaborative filtering using machine learning and statistical techniques.
- Creator
- Su, Xiaoyuan., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Collaborative filtering (CF), a very successful recommender system, is one of the applications of data mining for incomplete data. The main objective of CF is to make accurate recommendations from highly sparse user rating data. Our contributions to this research topic include proposing the frameworks of imputation-boosted collaborative filtering (IBCF) and imputed neighborhood-based collaborative filtering (INCF). We also proposed a model-based CF technique, TAN-ELR CF, and two hybrid CF algorithms, sequential mixture CF and joint mixture CF. Empirical results show that our proposed CF algorithms have very good predictive performance. In investigating the application of imputation techniques to mining incomplete data, we proposed imputation-helped classifiers and VCI predictors (voting on classifications from imputed learning sets), both of which resulted in significant improvements in classification performance on incomplete data over conventional machine-learned classifiers, including kNN, neural network, one rule, decision table, SVM, logistic regression, decision tree (C4.5), random forest, decision list (PART), and the well-known Bagging predictors. The main imputation techniques involved in these algorithms are EM (expectation maximization) and BMI (Bayesian multiple imputation).
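The imputation-boosted idea can be sketched in two steps: first fill the missing entries of the sparse rating matrix with a cheap imputation (item means below; the thesis uses richer imputers such as EM and Bayesian multiple imputation), then run a standard user-neighborhood CF on the densified matrix. The similarity function and data here are simplified illustrations, not the IBCF algorithms themselves.

```python
def item_mean_impute(R):
    """R: list of user rows; None marks a missing rating.
    Replace each missing entry with its item's mean rating."""
    n_items = len(R[0])
    means = []
    for j in range(n_items):
        col = [row[j] for row in R if row[j] is not None]
        means.append(sum(col) / len(col))
    return [[row[j] if row[j] is not None else means[j]
             for j in range(n_items)] for row in R]

def predict(R_dense, user, item):
    """Similarity-weighted average of other users' ratings for `item`,
    using a simple inverse-distance similarity between rating rows."""
    def sim(u, v):
        return 1.0 / (1.0 + sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5)
    num = den = 0.0
    for i, row in enumerate(R_dense):
        if i == user:
            continue
        s = sim(R_dense[user], row)
        num += s * row[item]
        den += s
    return num / den

R = [[5, 4, None, 1],
     [4, None, 2, 1],
     [1, 2, 5, None]]
R_dense = item_mean_impute(R)
print(round(predict(R_dense, user=0, item=2), 2))
```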
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/186301
- Subject Headings
- Filters (Mathematics), Machine learning, Data mining, Technological innovations, Database management, Combinatorial group theory
- Format
- Document (PDF)
- Title
- Spectral refinement to speech enhancement.
- Creator
- Charoenruengkit, Werayuth., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The goal of a speech enhancement algorithm is to remove noise and recover the original signal with as little distortion and residual noise as possible. Most successful real-time algorithms operate in the frequency domain, where the frequency amplitude of clean speech is estimated for each short-time frame of the noisy signal. State-of-the-art short-time spectral amplitude estimator algorithms estimate the clean spectral amplitude in terms of the power spectral density (PSD) function of the noisy signal. The PSD has to be computed from a large ensemble of signal realizations; in practice, however, it may only be estimated from a finite-length sample of a single realization of the signal. Estimation errors introduced by these limitations drive the solution away from the optimum. Various spectral estimation techniques, many with added spectral smoothing, have been investigated for decades to reduce these estimation errors, but such algorithms do not significantly address the quality of speech as perceived by a human listener. This dissertation presents analysis and techniques that offer spectral refinements toward speech enhancement. We present an analytical framework for the effect of spectral estimate variance on the performance of speech enhancement, using the variance quality factor (VQF) as a quantitative measure of estimated spectra. We show that reducing the spectral estimator VQF significantly reduces the VQF of the enhanced speech. The autoregressive multitaper (ARMT) spectral estimate is proposed as a low-VQF spectral estimator for use in speech enhancement algorithms. An innovative method of incorporating a speech production model using multiband excitation is also presented as a technique to emphasize the harmonic components of the glottal speech input. Preconditioning the noisy estimates by exploiting other avenues of information, such as pitch estimation and the speech production model, effectively increases the localized narrow-band signal-to-noise ratio (SNR) of the noisy signal, which is subsequently denoised by the amplitude gain. Combined with voicing structure enhancement, the ARMT spectral estimate delivers enhanced speech with the sound clarity desirable to human listeners. The resulting improvements in the enhanced speech are observed to be significant by both objective and subjective measurements.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186327
- Subject Headings
- Adaptive signal processing, Digital techniques, Spectral theory (Mathematics), Noise control, Fuzzy algorithms, Speech processing systems, Digital techniques
- Format
- Document (PDF)
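The variance issue central to the abstract above can be illustrated with a classical remedy: averaging per-frame periodograms (Bartlett's method). This is not the ARMT estimator the dissertation proposes, only a minimal sketch of why frame averaging lowers a spectral estimator's variance, the quantity the VQF measures:

```python
import cmath
import random
import statistics

def periodogram(x):
    """Periodogram PSD estimate of one frame (naive DFT; fine for small N)."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n in range(N))) ** 2 / N
            for k in range(N)]

def averaged_psd(x, frame_len):
    """Bartlett-style PSD: average the periodograms of K non-overlapping
    frames. Averaging cuts the per-bin estimator variance by roughly 1/K."""
    frames = [x[i:i + frame_len]
              for i in range(0, len(x) - frame_len + 1, frame_len)]
    K = len(frames)
    psds = [periodogram(f) for f in frames]
    return [sum(p[k] for p in psds) / K for k in range(frame_len)]

# White noise has a flat true PSD, so bin-to-bin spread is pure estimator
# variance; the averaged estimate fluctuates far less than a single frame.
random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(256)]
single_frame = periodogram(noise[:16])
averaged = averaged_psd(noise, 16)
```

The ARMT estimator pursues the same goal (a low-VQF spectrum) by different means; this sketch only demonstrates the variance-reduction principle.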
- Title
- Gene selection for sample sets with biased distribution.
- Creator
- Kamal, Abu Hena Mustafa., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Microarray expression data, which contain the expression levels of a large number of simultaneously observed genes, have been used in many scientific and clinical studies. Because of their high dimensionality, selecting a small number of genes has been shown to be beneficial for tasks such as building prediction models from microarray expression data or discovering gene regulatory networks. Traditional gene selection methods, however, fail to take the class distribution into account during selection. In biomedical science it is very common for microarray expression data to be severely biased, with one class of examples (e.g., diseased samples) significantly smaller than the others (e.g., normal samples). Sample sets with such biased distributions require special attention when identifying the genes responsible for a particular disease. In this thesis, we propose three filtering techniques, Higher Weight ReliefF, ReliefF with Differential Minority Repeat, and ReliefF with Balanced Minority Repeat, to identify genes responsible for fatal diseases from biased microarray expression data. Our solutions are evaluated on five well-known microarray datasets: Colon, Central Nervous System, DLBCL Tumor, Lymphoma, and ECML Pancreas. Experimental comparisons with the traditional ReliefF filtering method demonstrate the effectiveness of the proposed methods in selecting informative genes from microarray expression data with biased sample distributions.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186330
- Subject Headings
- Gene expression, Research, Methodology, Medical informatics, Apoptosis, Molecular aspects, DNA microarrays, Research
- Format
- Document (PDF)
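The idea behind the abstract above can be sketched as standard ReliefF weight updates with an extra multiplier on minority-class samples. The `minority_boost` parameter and the single-nearest-neighbor (k=1) simplification are illustrative assumptions, not the thesis' exact Higher Weight ReliefF algorithm:

```python
import random

def relieff_weighted(X, y, n_iters=100, minority_boost=2.0, seed=0):
    """Simplified ReliefF (k=1) that up-weights updates coming from
    minority-class samples, so genes separating the rare class gain weight."""
    rng = random.Random(seed)
    n_features = len(X[0])
    counts = {}
    for label in y:
        counts[label] = counts.get(label, 0) + 1
    minority = min(counts, key=counts.get)
    w = [0.0] * n_features

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    for _ in range(n_iters):
        i = rng.randrange(len(X))
        xi, yi = X[i], y[i]
        # nearest hit (same class) and nearest miss (other class)
        hit = min((j for j in range(len(X)) if j != i and y[j] == yi),
                  key=lambda j: dist(xi, X[j]))
        miss = min((j for j in range(len(X)) if y[j] != yi),
                   key=lambda j: dist(xi, X[j]))
        boost = minority_boost if yi == minority else 1.0
        # a feature that differs at the miss but not at the hit is informative
        for f in range(n_features):
            w[f] += boost * (abs(xi[f] - X[miss][f]) - abs(xi[f] - X[hit][f]))
    return w

# Toy data: feature 0 separates the classes, feature 1 is noise;
# class 0 is the minority class.
X = [[0.0, 0.5], [0.1, 0.4],
     [1.0, 0.5], [0.9, 0.6], [1.1, 0.45]]
y = [0, 0, 1, 1, 1]
weights = relieff_weighted(X, y, n_iters=50)
```

The informative feature ends up with a much larger weight than the noise feature; ranking genes by these weights gives the filtered selection.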
- Title
- Scheduling for composite event detection in wireless sensor networks.
- Creator
- Ambrose, Arny Isonja, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Wireless sensor networks are used in areas that are inaccessible or inhospitable, or for continuous monitoring. The main use of such networks is event detection: monitoring a particular environment for an event such as fire or flooding. Composite event detection breaks the detection of an event down into the specific conditions that must be present for the event to occur. With this method, each sensor node does not need to carry every sensing component necessary to detect the event. Since energy efficiency is important, the sensor nodes need to be scheduled so that they consume as little energy as possible, extending the network lifetime. In this thesis, a solution to the sensor Scheduling for Composite Event Detection (SCED) problem is presented as a way to improve the network lifetime when using composite event detection.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fcla/dt/186333
- Subject Headings
- Sensor networks, Wireless communication systems, Embedded computer systems, Computer systems, Reliability
- Format
- Document (PDF)
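The scheduling idea in the abstract above can be sketched as a greedy partition of nodes into disjoint cover sets, where each set jointly carries every sensing component the composite event requires; activating one set per round lets the others sleep and stretches the network lifetime. This greedy routine is an illustrative assumption, not the thesis' actual SCED algorithm:

```python
def greedy_sced(nodes, required):
    """Greedily build disjoint cover sets of sensor nodes.

    nodes    -- dict: node id -> set of sensing components it carries
    required -- set of components needed to detect the composite event
    Returns a list of cover sets; its length is the achievable number of
    rounds (a proxy for network lifetime).
    """
    remaining = dict(nodes)
    covers = []
    while required:
        cover, sensed = [], set()
        for nid, comps in sorted(remaining.items()):
            if comps - sensed:          # node contributes a new component
                cover.append(nid)
                sensed |= comps
            if required <= sensed:      # this set detects the full event
                break
        if not required <= sensed:      # leftovers cannot form another set
            break
        for nid in cover:
            del remaining[nid]          # disjointness: each node used once
        covers.append(cover)
    return covers

# Fire detection needs both temperature and smoke sensing.
nodes = {1: {"temp"}, 2: {"smoke"}, 3: {"temp", "smoke"},
         4: {"temp"}, 5: {"smoke"}}
schedule = greedy_sced(nodes, {"temp", "smoke"})
```

Here the five nodes split into three disjoint cover sets, so the network survives three duty-cycle rounds instead of one with all nodes always on.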
- Title
- Automated nursing knowledge classification using indexing.
- Creator
- Chinchanikar, Sucharita Vijay., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Promoting healthcare and wellbeing requires the dedication of a multi-tiered health service delivery system comprising specialists, medical doctors, and nurses. A holistic view of patient care involves emotional, mental, and physical healthcare needs, in which caring is understood as the essence of nursing. Properly and efficiently capturing and managing nursing knowledge is essential to advocating health promotion and illness prevention. This thesis proposes a document-indexing framework for automating the classification of nursing knowledge based on a nursing theory and practice model. The documents defining the numerous categories in the nursing care model are structured with the help of expert nurse practitioners and professionals. These documents are indexed and used as a benchmark for automatically mapping each expression in a patient's assessment form to the corresponding category in the nursing theory model. As an illustration of the proposed methodology, a prototype application is developed using the Latent Semantic Indexing (LSI) technique and tested in a nursing practice environment to validate the accuracy of the proposed algorithm. The results are also compared with an application using the Lucene indexing technique, which internally uses a modified vector space model for indexing. The comparison shows that the LSI strategy achieves 87.5% accuracy versus 80% for the Lucene indexing technique; both indexing methods maintain 100% consistency in their results.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186677
- Subject Headings
- Nursing, Computer-assisted instruction, Data transmission systems, Outcome assessment (Medical care), Nursing assessment, Digital techniques
- Format
- Document (PDF)
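The mapping step described above can be sketched as a plain term-vector cosine match between an assessment expression and the benchmark category documents. Full LSI would first project both vectors through a truncated SVD of the term-document matrix; that projection is omitted here, and the category vocabulary is invented for illustration:

```python
import math
from collections import Counter

def cosine(u, v):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(u[t] * v.get(t, 0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def classify(expression, category_docs):
    """Map an assessment expression to the nursing category whose benchmark
    document is most similar (bag-of-words cosine; real LSI adds an SVD
    projection step before comparing)."""
    query = Counter(expression.lower().split())
    scores = {cat: cosine(query, Counter(doc.lower().split()))
              for cat, doc in category_docs.items()}
    return max(scores, key=scores.get)

# Hypothetical benchmark documents for two nursing-care categories.
categories = {
    "comfort":   "pain comfort rest relief position",
    "nutrition": "diet food appetite nutrition intake",
}
result = classify("patient reports pain and needs rest", categories)
```

The LSI projection mainly helps when the expression and the benchmark share no literal terms but are semantically related, which a raw bag-of-words match misses.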
- Title
- Traffic congestion detection using VANET.
- Creator
- Padron, Francisco M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
We propose a distributed, collaborative traffic congestion detection and dissemination system using VANET that makes efficient use of the communication channel, maintains location privacy, and provides drivers with real-time information on traffic congestion over long distances. The system uses the vehicles themselves, equipped with simple, inexpensive devices, as gatherers and distributors of information, without the need for costly road infrastructure such as sensors, cameras, or external communication equipment. Additionally, we present a flexible simulation and visualization framework, designed and developed to validate our system by showing its effectiveness in multiple scenarios and to aid in the research and development of this and future VANET applications.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186684
- Subject Headings
- Vehicular ad-hoc networks (Computer networks), Traffic congestion, Mathematical models, Mobile communication systems, Evaluation, Traffic congestion, Prevention
- Format
- Document (PDF)
- Title
- Object detection in low resolution video sequences.
- Creator
- Pava, Diego F., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
With growing security concerns and decreasing costs of surveillance and computing equipment, research on automated systems for object detection has been increasing, but the majority of studies focus on sequences in which high-resolution objects are present. The main objective of this work is the detection and extraction of information about low-resolution objects (e.g., objects so far from the camera that they occupy only tens of pixels), in order to provide a base for higher-level operations such as classification and behavioral analysis. The proposed system is composed of four stages (preprocessing, background modeling, information extraction, and post-processing) and uses context-based region-of-importance selection, histogram equalization, background subtraction, and morphological filtering. The result is a system capable of detecting and tracking low-resolution objects against a controlled background scene, which can serve as a base for systems of higher complexity.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186685
- Subject Headings
- Computer systems, Security measures, Remote sensing, Image processing, Digital techniques, Imaging systems, Mathematical models
- Format
- Document (PDF)
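Two stages of the pipeline above, background subtraction and morphological filtering, can be sketched on raw grayscale frames represented as lists of pixel rows. The threshold value and the 3x3 erosion element are illustrative choices, not the thesis' parameters:

```python
def detect_objects(frame, background, thresh=30):
    """Background subtraction followed by a 3x3 morphological erosion.

    Stage 1 flags pixels that differ from the background model by more than
    `thresh`; stage 2 keeps a pixel only if its whole 3x3 neighbourhood is
    foreground, suppressing isolated noise pixels. (A real pipeline would
    pair the erosion with a dilation, i.e. an opening, to restore blob size.)
    """
    h, w = len(frame), len(frame[0])
    # 1. background subtraction -> binary foreground mask
    mask = [[1 if abs(frame[r][c] - background[r][c]) > thresh else 0
             for c in range(w)] for r in range(h)]
    # 2. erosion with a 3x3 structuring element (borders left as background)
    out = [[0] * w for _ in range(h)]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            if all(mask[r + dr][c + dc]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)):
                out[r][c] = 1
    return out

# 5x5 test scene: a 3x3 bright object plus one isolated noisy pixel.
background = [[0] * 5 for _ in range(5)]
frame = [row[:] for row in background]
for r in range(1, 4):
    for c in range(1, 4):
        frame[r][c] = 100
frame[0][4] = 100          # sensor noise, should be filtered out
detected = detect_objects(frame, background)
```

The erosion shrinks the 3x3 object to its center pixel while removing the lone noise pixel entirely, which is the intended trade-off at this stage of the pipeline.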
- Title
- Transport and dispersion of fire extinguishing agents downstream from clutter elements of aircraft engine nacelles.
- Creator
- Zbeeb, Khaled., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
The combination of highly turbulent airflow, flammable fluids, and numerous ignition sources makes aircraft engine nacelles a difficult fire zone to protect. A better understanding of nacelle airflow, and of how it influences the spread of fires and fire extinguishing agents, is needed to improve the efficiency of fire suppression. The first objective was to establish a CFD model of a flow-field test section to analyze the transport and dispersion of fire extinguishing agents in the presence of various clutter elements; to validate the model, its simulation results were compared with, and show agreement with, the experimental data. The second objective was to present parametric studies showing the effects of coflow speed, turbulence intensity, and agent droplet size on the transport and dispersion of agent particles downstream from the clutter elements.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186687
- Subject Headings
- Airplanes, Nacelles, Safety measures, Airplanes, Fires and fire prevention, Fire extinguishing agents, Testing, Airplanes, Fluid dynamics, Mathematical models
- Format
- Document (PDF)