Current Search: info:fedora/islandora:sp_large_image_cmodel » Department of Computer and Electrical Engineering and Computer Science » Roth, Zvi S.
- Title
- YACAD: Yet Another Congestion Avoidance Design for ATM-based networks.
- Creator
- Hsu, Sam, Florida Atlantic University, Ilyas, Mohammad, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This dissertation proposes YACAD (Yet Another Congestion Avoidance Design for ATM-based Networks), a congestion prevention model that includes admission control, traffic shaping, and link-by-link flow control for ATM-based networks. Network traffic in this model is composed of real-time traffic and data traffic. As real-time traffic is delay-sensitive and connection-oriented, its call acceptance is based upon the effective bandwidth at all nodes. Effective bandwidth is defined as a vector of bandwidth and maximum node delay. As data traffic can be either connection-oriented or connectionless, it is subject to link-by-link flow control based on a criterion known as effective buffer, which is defined as a scalar of buffer size. Data traffic is not delay-sensitive but is loss-sensitive. Traffic shaping is imposed on real-time traffic to ensure a smooth inflow of real-time cells. YACAD also allocates a large buffer (fat bucket) to data traffic to accommodate sudden long bursts of data cells. The absence of data cell loss is a major feature of YACAD. Two simulation studies of the model's performance are conducted. Analyses of the simulation results show that the proposed congestion avoidance model can achieve congestion-free networking and bounded network delays for real-time traffic at high levels of channel utilization. The maximum buffer requirements for loss-free cell delivery for data traffic and the cell loss probabilities for real-time traffic are also obtained. In addition, performance comparisons show that YACAD outperforms several other leaky-bucket-based congestion control methods in terms of cell loss probability for real-time traffic. The simulation source program has also been verified against existing queueing theory and with the paired-t confidence interval method, with satisfactory results at the 99% confidence level.
- Date Issued
- 1993
- PURL
- http://purl.flvc.org/fcla/dt/12336
- Subject Headings
- Integrated services digital networks, Broadband communications systems, Packet switching (Data transmission), Computer networks--Management
- Format
- Document (PDF)
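The "fat bucket" idea in the YACAD abstract above (a large buffer absorbing sudden data bursts so that no cells are lost, while the outflow stays smooth) can be illustrated with a minimal leaky-bucket shaper. This is a sketch under my own assumptions; the class and parameter names are illustrative and not taken from the dissertation.

```python
from collections import deque

class LeakyBucketShaper:
    """Toy leaky-bucket shaper: cells drain at a fixed rate per tick,
    and a large ("fat") buffer absorbs bursts instead of dropping cells.
    Names and parameters are illustrative, not from the dissertation."""

    def __init__(self, drain_rate, bucket_size):
        self.drain_rate = drain_rate      # cells released per tick
        self.bucket_size = bucket_size    # buffer capacity in cells
        self.bucket = deque()
        self.dropped = 0

    def arrive(self, cells):
        """Queue arriving cells; overflow beyond the buffer is counted as lost."""
        for c in cells:
            if len(self.bucket) < self.bucket_size:
                self.bucket.append(c)
            else:
                self.dropped += 1

    def tick(self):
        """Release at most drain_rate cells this tick (the smooth outflow)."""
        out = []
        for _ in range(min(self.drain_rate, len(self.bucket))):
            out.append(self.bucket.popleft())
        return out

# A sudden burst of 10 cells is smoothed into a 2-cells-per-tick stream,
# and the oversized buffer means nothing is dropped.
shaper = LeakyBucketShaper(drain_rate=2, bucket_size=64)
shaper.arrive(range(10))
released = [len(shaper.tick()) for _ in range(6)]
```

With a buffer sized for the worst-case burst, the loss-free property the abstract emphasizes holds by construction; a small `bucket_size` would instead trade delay for loss.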
- Title
- XYZ: A scalable, partially centralized lookup service for large-scale peer-to-peer systems.
- Creator
- Zhang, Jianying., Florida Atlantic University, Wu, Jie, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Peer-to-Peer (P2P) systems are characterized by direct access between peer computers, rather than through a centralized server. File sharing is the dominant P2P application on the Internet, allowing users to easily contribute, search, and obtain content. The objective of this thesis was to design XYZ, a partially centralized, scalable, and self-organizing lookup service for wide-area P2P systems. The XYZ system is based on a distributed hash table (DHT). A unique ID and a color are assigned to each node and each file. The author uses a clustering method to create the system backbone by connecting the cluster heads together, and a color clustering method to create color overlays. Any lookup for a file of a given color is forwarded only within the color overlay of the same color, so that the search space is minimized. Simulations and analysis are also provided in this thesis.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13263
- Subject Headings
- Wireless communication systems, Peer-to-peer architecture (Computer networks), Computational grids (Computer systems)
- Format
- Document (PDF)
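The color-overlay lookup described in the abstract above can be sketched in a few lines: hash every node and file to a color, and answer a query by consulting only the overlay of the file's color. All identifiers here are hypothetical, and the thesis's actual protocol details (cluster heads, overlay maintenance, message routing) are not modeled.

```python
import hashlib

NUM_COLORS = 4  # illustrative; the thesis does not fix a count here

def color_of(key: str) -> int:
    """Hash a node or file ID deterministically to one of a few colors."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % NUM_COLORS

class ColorOverlay:
    """Toy partially centralized lookup: one registry per color.
    A query touches only the overlay whose color matches the file's
    color, which is how the search space shrinks."""

    def __init__(self):
        self.overlays = {c: {} for c in range(NUM_COLORS)}  # color -> {file: node}
        self.hops = 0  # overlays consulted, to show the reduced search space

    def publish(self, file_id: str, node: str):
        self.overlays[color_of(file_id)][file_id] = node

    def lookup(self, file_id: str):
        self.hops += 1  # only the matching color overlay is searched
        return self.overlays[color_of(file_id)].get(file_id)

overlay = ColorOverlay()
overlay.publish("song.mp3", "node-17")
owner = overlay.lookup("song.mp3")
```

A lookup always consults exactly one of the `NUM_COLORS` overlays, so on average only `1/NUM_COLORS` of the index is in scope for any query.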
- Title
- XYZ Video Compression: An algorithm for real-time compression of motion video based upon the three-dimensional discrete cosine transform.
- Creator
- Westwater, Raymond John., Florida Atlantic University, Furht, Borko, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
XYZ Video Compression denotes a video compression algorithm that operates in three dimensions, without the overhead of motion estimation. The smaller overhead of this algorithm as compared to MPEG and other "standards-based" compression algorithms using motion estimation suggests the suitability of this algorithm for real-time applications. The demonstrated results of compression of standard motion video benchmarks suggest that XYZ Video Compression is not only a faster algorithm, but also achieves superior compression ratios. The algorithm is based upon the three-dimensional Discrete Cosine Transform (DCT). Pixels are organized as 8 x 8 x 8 cubes by taking 8 x 8 squares out of 8 consecutive frames. A fast three-dimensional transform is applied to each cube, generating 512 DCT coefficients. The energy-packing property of the DCT concentrates the energy in the cube into few coefficients. The DCT coefficients are quantized to maximize the energy concentration at the expense of introducing a user-determined level of error. A method of adaptive quantization that generates optimal quantizers based upon statistics gathered for the 8 consecutive frames is described. The sensitivity of the human eye to various DCT coefficients is used to modify the quantizers to create a "visually equivalent" cube with still greater energy concentration. Experiments are described that justify the choice of Human Visual System factors to be folded into the quantization step. The quantized coefficients are then encoded into a data stream using a method of entropy coding based upon the statistics of the quantized coefficients. The bitstream generated by entropy coding represents the compressed data of the 8 motion video frames, and typically achieves 50:1 compression at 5% error.
The decoding process is the reverse of the encoding process: the bitstream is decoded to generate blocks of quantized DCT coefficients, the DCT coefficients are dequantized, and the Inverse Discrete Cosine Transform is performed on the cube to recover pixel data suitable for display. The elegance of this technique lies in its simplicity, which lends itself to inexpensive implementation of both encoder and decoder. Finally, real-time implementation of the XYZ Compressor/Decompressor is discussed. Experiments are run to determine the effectiveness of the implementation.
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/12450
- Subject Headings
- Digital video, Data compression (Telecommunication), Image processing--Digital techniques, Coding theory
- Format
- Document (PDF)
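The energy-packing property of the 3-D DCT that the abstract above relies on can be demonstrated directly. The sketch below builds an orthonormal DCT-II matrix from its definition (a plain matrix product, not the thesis's fast transform) and applies it separably to a smooth 8 x 8 x 8 cube; nearly all of the energy lands in a handful of the 512 coefficients. The test cube is invented for illustration.

```python
import numpy as np

def dct2_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    mat = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    mat[0, :] = np.sqrt(1.0 / n)  # DC row normalization
    return mat

def dct3d(cube: np.ndarray) -> np.ndarray:
    """Separable 3-D DCT: apply the 1-D transform along each axis."""
    d = dct2_matrix(cube.shape[0])
    return np.einsum('ia,jb,kc,abc->ijk', d, d, d, cube)

# A smooth "video cube": 8 consecutive 8x8 patches with slow gradients
# in space (x, y) and time (t), mimicking low-motion video content.
x, y, t = np.meshgrid(np.arange(8), np.arange(8), np.arange(8), indexing='ij')
cube = 100 + 2.0 * x + 1.0 * y + 0.5 * t

coeffs = dct3d(cube)
energy = coeffs ** 2
# Energy packing: the largest few coefficients carry almost all the energy,
# so aggressive quantization of the rest costs little.
top8 = np.sort(energy.ravel())[-8:].sum()
ratio = top8 / energy.sum()
```

Because the transform is orthonormal, total energy is preserved exactly; the compression win comes entirely from how unevenly that energy is distributed across coefficients.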
- Title
- An XML-based data exchange model.
- Creator
- Sreenivasan, Sridhar., Florida Atlantic University, Hsu, Sam, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Data are the backbone of any organization. In some organizations there are applications that operate on a distributed platform. These applications must often communicate with applications on a different platform that structure data in a different format. The manner in which data are transferred between such applications can become complex, because many platform-dependent applications need to process the transferred data. Hence, a mechanism for exchanging data between such applications in a simple, effective manner is a basic necessity. In this thesis, such a mechanism for platform-independent data exchange is discussed. The proposed model is XML-based, and data between the applications are exchanged in the form of XML documents. XML is text-based and can be processed by any application on any platform. The model has an interface that processes the XML documents transferred between the client applications and the underlying database systems. The model is implemented in a system administration application, a Web application that transfers data in XML format. The data are processed by the interface and transferred to the database; data retrieved from the database are converted to XML documents by the interface and transferred to the client (Web) applications.
- Date Issued
- 2002
- PURL
- http://purl.flvc.org/fcla/dt/12926
- Subject Headings
- XML (Document markup language)
- Format
- Document (PDF)
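A minimal version of the XML-based exchange described above (a record serialized to an XML document, shipped across a platform boundary as text, and parsed back) can be sketched with Python's standard library. The tag names are hypothetical, and the thesis's database interface layer is not modeled.

```python
import xml.etree.ElementTree as ET

def dict_to_xml(tag: str, record: dict) -> str:
    """Serialize a flat record as an XML document (all values as text)."""
    root = ET.Element(tag)
    for key, value in record.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding='unicode')

def xml_to_dict(xml_text: str) -> dict:
    """Parse the document back into a flat record on the receiving side."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

# A hypothetical account record round-trips through plain XML text,
# which any platform's XML parser can consume.
record = {"user": "alice", "role": "admin"}
doc = dict_to_xml("account", record)
roundtrip = xml_to_dict(doc)
```

The point of the exercise is that the wire format is plain text: neither side needs to know the other's platform, only the agreed tag vocabulary.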
- Title
- Web-based wireless sensor network monitoring using smartphones.
- Creator
- Marcus, Anthony M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis describes the development of a web-based wireless sensor network (WSN) monitoring system using smartphones. Typical WSNs consist of networks of wireless sensor nodes dispersed over predetermined areas to acquire, process, and transmit data from these locations. Often the WSNs are located in areas too hazardous or inaccessible for humans. We focus on the need to access this sensed data remotely and present our reference architecture to solve this problem. We developed this architecture for web-based wireless sensor network monitoring and have implemented a prototype that uses Crossbow Mica sensors and Android smartphones to bridge the wireless sensor network with web services for data storage and retrieval. Our application can retrieve sensed data directly from a wireless sensor network composed of Mica sensors and from a smartphone's onboard sensors. The data are displayed on the phone's screen and then forwarded, via an Internet connection, to a remote database for manipulation and storage. The attributes sensed and stored by our application are temperature, light, acceleration, GPS position, and geographical direction. Authorized personnel are able to retrieve and observe this data, both textually and graphically, from any browser with Internet connectivity or through a native Android application. Web-based wireless sensor network architectures using smartphones provide a scalable and expandable solution with applicability in many areas, such as healthcare, environmental monitoring, infrastructure health monitoring, border security, and others.
- Date Issued
- 2011
- PURL
- http://purl.flvc.org/FAU/3171682
- Subject Headings
- Smartphones, Wireless communication systems, Security measures, Wireless communication systems, Technological innovations, Computer networks, Security measures, Ad hoc networks (Computer networks), Security measures
- Format
- Document (PDF)
- Title
- A web-based automated classification system for nursing language based on nursing theory.
- Creator
- Dass, Subhomoy D., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Health care systems consist of various individuals and organizations that aim to meet the health care needs of people and provide a complete and responsive health care solution. One of the important aspects of a health care delivery system is nursing. The use of technology is vital for delivering optimum and complete nursing care to individuals, and also for improving the quality and delivery mechanism of nursing care. The Nursing Knowledge Management System model proposed in this thesis is a novel knowledge-based decision support system for nurses to capture and manage nursing practice and, further, to monitor nursing care quality, as well as to test aspects of an electronic health record for recording and reporting nursing practice. As part of a collaborative research effort between the Christine E. Lynn College of Nursing and the Department of Computer Science, a prototype toolset was developed to capture and manage nursing practice in order to improve the quality of care. This thesis focuses on implementing a web-based SOA solution for Automated Classification of Nursing Care Categories, based on the knowledge gained from the prototype for nursing care practice.
- Date Issued
- 2011
- PURL
- http://purl.flvc.org/FAU/3332184
- Subject Headings
- Nursing, Quality control, Outcome assessment (Medical care), Nursing assessment, Digital techniques, Nursing, Computer-assisted instruction, Nursing informatics
- Format
- Document (PDF)
- Title
- Web services cryptographic patterns.
- Creator
- Hashizume, Keiko., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Data security has been identified as one of the most important concerns where sensitive messages are exchanged over the network. In web service architecture, multiple distributed applications communicate with each other over the network by sending XML messages. How can we protect these sensitive messages? Some web services standards have emerged to tackle this problem. The XML Encryption standard defines the process of encrypting and decrypting all of an XML message, part of an XML message, or even an external resource. Like XML Encryption, the XML Signature standard specifies how to digitally sign an entire XML message, part of an XML message, or an external object. WS-Security defines how to embed security tokens, XML encryption, and XML signature into XML documents. It does not define new security mechanisms, but leverages existing security technologies such as encryption and digital signature.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/216413
- Subject Headings
- Computer networks, Access control, Data encryption (Computer science), XML (Document markup language), Digital signatures, Computer network architectures
- Format
- Document (PDF)
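As a concept-level illustration of the signing idea in the abstract above (not the actual XML Signature standard, which canonicalizes the XML and embeds a `Signature` element per the W3C specification), a detached MAC over the message bytes shows how integrity and tamper detection work. Key and message are invented for illustration.

```python
import hashlib
import hmac

def sign_message(key: bytes, xml_message: str) -> str:
    """Detached signature over the raw message bytes (concept sketch only;
    real XML Signature operates on a canonicalized form of the XML)."""
    return hmac.new(key, xml_message.encode(), hashlib.sha256).hexdigest()

def verify_message(key: bytes, xml_message: str, signature: str) -> bool:
    """Recompute and compare in constant time."""
    return hmac.compare_digest(sign_message(key, xml_message), signature)

key = b"shared-secret"             # hypothetical shared key
msg = "<order><item>book</item></order>"
sig = sign_message(key, msg)

ok = verify_message(key, msg, sig)                          # untouched message
tampered_ok = verify_message(key, msg.replace("book", "car"), sig)  # modified
```

Any change to the message invalidates the signature, which is the property WS-Security leverages when it embeds signatures into XML documents.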
- Title
- Web log analysis: Experimental studies.
- Creator
- Yang, Zhijian., Florida Atlantic University, Zhong, Shi, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
With the rapid growth of the World Wide Web, web performance becomes increasingly important for modern businesses, especially for e-commerce. Web server logs contain potentially useful empirical data for improving web server performance. In this thesis, we discuss topics related to the analysis of a website's server logs for enhancing server performance, which will benefit business applications. Markov chain models are used, allowing us to dynamically model page sequences extracted from server logs. My experimental studies contain three major parts. First, I present a workload characterization study of the website used for my research. Second, Markov chain models are constructed for both page request and page-visiting sequence prediction. Finally, I carefully evaluate the constructed models using an independent test data set from server logs on a different day. The research results demonstrate the effectiveness of Markov chain models for characterizing page-visiting sequences.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13202
- Subject Headings
- Markov processes, Operations research, Business enterprises--Computer networks, Electronic commerce--Data processing
- Format
- Document (PDF)
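A first-order Markov chain over page-visiting sequences, as used in the abstract above, can be sketched as transition counts plus a most-likely-successor prediction. The sessions below are invented for illustration and are not the thesis's data set.

```python
from collections import Counter, defaultdict

def train_markov(sessions):
    """First-order Markov model: count page-to-page transitions in
    visiting sequences extracted from server logs."""
    transitions = defaultdict(Counter)
    for pages in sessions:
        for current, nxt in zip(pages, pages[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, page):
    """Most frequent successor of `page`, or None if the page was
    never observed with an outgoing transition."""
    if page not in transitions:
        return None
    return transitions[page].most_common(1)[0][0]

# Hypothetical page-visiting sequences (one list per user session).
sessions = [
    ["/home", "/products", "/cart"],
    ["/home", "/products", "/products/42"],
    ["/home", "/about"],
]
model = train_markov(sessions)
guess = predict_next(model, "/home")
```

Such a model can drive prefetching or caching: the server speculatively prepares the most probable next page while the current one is being viewed.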
- Title
- Web accessibility for the hearing impaired.
- Creator
- Pasmore, Simone., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
With the exponential increase of Internet usage and the embedding of multimedia content on the Web, some Internet resources remain inaccessible to people with disabilities. In particular, people who are deaf or Hard of Hearing (HOH) experience inaccessible Web sites due to a lack of Closed Captioning (CC) for multimedia content on the Web, the absence of sign language equivalents for Web content, and an insufficient evaluation framework for determining whether a Web page is accessible to the Hearing Impaired community. Several barriers to accessing content needed to be rectified in order for the Hearing Impaired community to gain the full benefits of the information repository on the Internet. The research contributions of this thesis resolve some of the Web accessibility problems faced by the Hearing Impaired community. The objectives are to create automated CC for multimedia content on the Web, to embed sign language equivalents for Web content, to create a framework to evaluate Web accessibility for the Hearing Impaired community, and to create a social network for the Deaf community. To demonstrate the feasibility of these objectives, several prototypes were implemented. These prototypes have been used in real-life scenarios in order to obtain an objective evaluation of the proposed framework. Further, the implemented prototypes have had an impact on both the academic community and the industry.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fcla/dt/177011
- Subject Headings
- Computers and people with disabilities, Interactive multimedia, Hearing impaired, Services for, Communication devices for people with disabilities, User interfaces (Computer systems), Web sites, Design
- Format
- Document (PDF)
- Title
- A wavelet-based detector for underwater communication.
- Creator
- Petljanski, Branko., Florida Atlantic University, Erdol, Nurgun, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The need for reliable underwater communication at Florida Atlantic University is critical in transmitting data to and from Autonomous Underwater Vehicles (AUVs) and remote sensors. Since a received signal is corrupted with ambient ocean noise, the nature of such noise is investigated, and we establish a connection between ambient ocean noise and fractal noise. Since the matched filter is designed under the assumption that the noise is white, performance degradation of the matched filter due to non-white noise is investigated. We show empirically that the wavelet transform provides an approximate Karhunen-Loeve expansion for 1/f-type noise. Since whitening can improve only broadband signals, a new method for synchronization signal design in wavelet subspaces with increased energy-to-peak-amplitude ratio is presented. A wavelet detector with whitening of fractal noise and detection in wavelet subspaces is presented. Results show that the wavelet detector improves detectability; however, the improvement is below expectation due to differences between fractal noise and ambient ocean noise.
- Date Issued
- 2001
- PURL
- http://purl.flvc.org/fcla/dt/12778
- Subject Headings
- Wavelets (Mathematics), Underwater acoustics
- Format
- Document (PDF)
- Title
- Wavelet transform-based digital signal processing.
- Creator
- Basbug, Filiz., Florida Atlantic University, Erdol, Nurgun, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This study deals with applying the wavelet transform to two main areas of signal processing: adaptive signal processing and signal detection. It starts with background information on the theory of wavelets, with an emphasis on the multiresolution representation of signals by the wavelet transform, in Chapter 1. Chapter 2 begins with an overview of adaptive filtering in general and extends it to transform-domain adaptive filtering. Later in the chapter, a novel adaptive filtering architecture using the wavelet transform is introduced. The performance of this new structure is evaluated by using the LMS algorithm with variations in step size. As a result of this study, the wavelet-transform-based adaptive filter is shown to reduce the eigenvalue ratio, or condition number, of the input signal. Consequently, the new structure is shown to have faster convergence, implying an improvement in the ability to track rapidly changing signals. Chapter 3 deals with signal detection with the help of the wavelet transform. One scheme studies signal detection by projecting the input signal onto different scales. The relationship between this approach and that of matched filtering is established. Then the effect of different factors on signal detection with the wavelet transform is examined. It is found that the method is robust in the presence of white noise. Also, the wavelets are analyzed as eigenfunctions of a certain random process, and it is shown how this leads to optimal receiver design. It is further demonstrated that the design of an optimum receiver leads to the wavelet-transform-based adaptive filter structure described in Chapter 2.
- Date Issued
- 1993
- PURL
- http://purl.flvc.org/fcla/dt/12354
- Subject Headings
- Wavelets (Mathematics), Signal processing--Digital techniques
- Format
- Document (PDF)
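The LMS algorithm evaluated in the abstract above can be sketched in a standard system-identification setting: adapt FIR taps until the filter reproduces an unknown system's output, with the error shrinking as the weights converge. This is generic textbook LMS, not the thesis's wavelet-domain architecture; the step size and tap values are illustrative.

```python
import numpy as np

def lms_identify(x, d, num_taps=4, step=0.05):
    """LMS adaptation of an FIR filter toward the unknown system that
    produced d from x; returns learned taps and the error history."""
    w = np.zeros(num_taps)
    errors = []
    for n in range(num_taps - 1, len(x)):
        window = x[n - num_taps + 1:n + 1][::-1]  # x[n], x[n-1], ...
        y = w @ window                             # filter output
        e = d[n] - y                               # instantaneous error
        w = w + 2 * step * e * window              # LMS weight update
        errors.append(e)
    return w, np.array(errors)

rng = np.random.default_rng(0)
x = rng.standard_normal(4000)                      # white input
true_taps = np.array([0.6, -0.3, 0.1, 0.05])       # "unknown" system
d = np.convolve(x, true_taps)[:len(x)]             # its observed output

w, errors = lms_identify(x, d)
```

With white input the autocorrelation matrix is well conditioned and LMS converges quickly; the thesis's point is that a wavelet-domain front end improves exactly this conditioning (eigenvalue ratio) for correlated inputs.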
- Title
- Wavelet de-noising applied to vibrational envelope analysis methods.
- Creator
- Bertot, Edward Max, Khoshgoftaar, Taghi M., Beaujean, Pierre-Philippe, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In the field of machine prognostics, vibration analysis is a proven method for detecting and diagnosing bearing faults in rotating machines. One popular method for interpreting vibration signals is envelope demodulation, which allows a technician to clearly identify an impulsive fault source and its severity. However, incipient faults (faults in early stages) are masked by in-band noise, which can make the associated impulses difficult to detect and interpret. In this thesis, Wavelet De-Noising (WDN) is implemented after envelope demodulation to improve the accuracy of bearing fault diagnostics. This contrasts with the typical approach of de-noising as a preprocessing step. When manually measuring time-domain impulse amplitudes, the algorithm shows varying improvements in Signal-to-Noise Ratio (SNR) relative to background vibrational noise. A frequency-domain measure of SNR agrees with this result.
- Date Issued
- 2014
- PURL
- http://purl.flvc.org/fau/fd/FA00004080
- Subject Headings
- Fluid dynamics, Signal processing, Structural dynamics, Wavelets (Mathematics)
- Format
- Document (PDF)
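Wavelet de-noising by coefficient thresholding, as applied in the thesis above, can be sketched with a single-level Haar transform and soft thresholding: noise spreads thinly over all detail coefficients while an impulse concentrates in a few large ones, so shrinking small coefficients removes mostly noise. This is a generic sketch; the thesis's wavelet choice, decomposition depth, and threshold rule are not specified here, and the threshold value below is an assumption.

```python
import numpy as np

def haar_level(signal):
    """One level of the orthonormal Haar wavelet transform."""
    even, odd = signal[0::2], signal[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

def inverse_haar_level(approx, detail):
    """Exact inverse of haar_level."""
    even = (approx + detail) / np.sqrt(2.0)
    odd = (approx - detail) / np.sqrt(2.0)
    out = np.empty(2 * len(approx))
    out[0::2], out[1::2] = even, odd
    return out

def soft_threshold(coeffs, thr):
    """Shrink coefficients toward zero; small (noise-only) ones vanish."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
clean = np.where((t > 0.45) & (t < 0.55), 5.0, 0.0)   # an impulsive "envelope"
noisy = clean + 0.4 * rng.standard_normal(t.size)      # in-band noise added

approx, detail = haar_level(noisy)
denoised = inverse_haar_level(approx, soft_threshold(detail, 0.8))
```

A practical implementation would iterate over several decomposition levels and pick the threshold from a noise estimate (e.g. a multiple of the detail coefficients' median absolute deviation), but even this one-level sketch reduces the mean squared error.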
- Title
- VoIP Network Security and Forensic Models using Patterns.
- Creator
- Pelaez, Juan C., Fernandez, Eduardo B., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Voice over Internet Protocol (VoIP) networks are becoming the most popular telephony systems in the world. However, studies of the security of VoIP networks are still in their infancy. VoIP devices and networks are commonly attacked, and it is therefore necessary to analyze the threats against the converged network and the techniques that exist today to stop or mitigate these attacks. We also need to understand what evidence can be obtained from the VoIP system after an attack has occurred. Many of these attacks occur in similar ways in different contexts or environments. Generic solutions to these issues can be expressed as patterns. A pattern can be used to guide the design or simulation of VoIP systems as an abstract solution to a problem in this environment. Patterns have shown their value in developing good-quality software, and we expect that their application to VoIP will also prove valuable for building secure systems. This dissertation presents a variety of patterns (architectural, attack, forensic, and security patterns). These patterns will help forensic analysts as well as secure-systems developers, because they provide a systematic approach to structuring the required information and help in understanding system weaknesses. The patterns will also allow us to specify, analyze, and implement network security investigations for different architectures. The pattern system uses object-oriented modeling (Unified Modeling Language) as a way to formalize the information and dynamics of attacks and systems.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012576
- Subject Headings
- Internet telephony--Security measures, Computer network protocols, Global system for mobile communications, Software engineering
- Format
- Document (PDF)
- Title
- Voice activity detection over multiresolution subspaces.
- Creator
- Schultz, Robert Carl., Florida Atlantic University, Erdol, Nurgun, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Society's increased demand for communications requires searching for techniques that preserve bandwidth. It has been observed that much of the time spent during telephone communications is actually idle time with no voice activity present. Detecting these idle periods and preventing transmission during these idle periods can aid in reducing bandwidth requirements during high traffic periods. While techniques exist to perform this detection, certain types of noise can prove difficult at best...
Society's increased demand for communications requires searching for techniques that conserve bandwidth. It has been observed that much of the time spent during telephone communication is actually idle time, with no voice activity present. Detecting these idle periods and suppressing transmission during them can help reduce bandwidth requirements during high-traffic periods. While techniques exist to perform this detection, certain types of noise can make reliable detection difficult. The use of wavelets with multiresolution subspaces can aid detection by providing noise whitening and signal matching. This thesis explores this approach and proposes a detection technique.
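As a rough illustration of the subband idea only (not the thesis's actual detector), a voice activity check can compare the energy in the lowest-frequency subbands of a Haar multiresolution decomposition against a threshold. The frame length, level count, and fixed threshold below are illustrative assumptions; a practical detector would estimate the noise floor adaptively:

```python
def haar_step(x):
    """One level of a (non-normalized) Haar transform: split a signal
    into a low-pass (approximation) and a high-pass (detail) subband."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    return approx, detail

def subband_energies(frame, levels=3):
    """Energy per subband: one detail energy for each level, then the
    energy of the coarsest approximation, coarsest last."""
    energies = []
    current = list(frame)
    for _ in range(levels):
        current, detail = haar_step(current)
        energies.append(sum(d * d for d in detail))
    energies.append(sum(a * a for a in current))
    return energies

def is_speech(frame, threshold=0.01, levels=3):
    """Declare voice activity when the energy in the lowest-frequency
    subbands (where voiced speech concentrates) exceeds a threshold.
    The fixed threshold is purely illustrative."""
    e = subband_energies(frame, levels)
    return e[-1] + e[-2] > threshold
```

With a 64-sample frame, three levels leave an 8-sample approximation covering roughly the lowest sixteenth of the band; a low-frequency tone trips the detector while near-silence does not.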
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/15740
- Subject Headings
- Speech processing systems, Signal processing--Digital techniques, Wavelets (Mathematics)
- Format
- Document (PDF)
- Title
- A VLSI implementation of a hexagonal topology CCD image sensor.
- Creator
- Madabushi, Vasudhevan., Florida Atlantic University, Shankar, Ravi, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In this thesis we report a VLSI design implementation of an application-specific, full-frame-architecture CCD image sensor for a handwritten Optical Character Recognition system. The design is targeted to the MOSIS 2 μm, 2-poly/2-metal n-buried-channel CCD/CMOS technology. The front-side-illuminated CCD image sensor uses a transparent polysilicon gate structure and comprises 84 (H) x 100 (V) pixels arranged in a hexagonal lattice structure. The sensor has unit pixel dimensions of 18λ (H) x 16λ (V). A second layer of metal is used for shielding certain areas from incident light, and the effective pixel photosite area is 8λ x 8λ. The imaging pixels use a 3-phase structure (with an innovative addressing scheme for the hexagonal lattice) for image sensing and horizontal charge shift. Columns of charge are shifted into the vertical 2-phase CCD shift registers, which shift the charge out serially at high speed. The chip has been laid out on the 'tinychip' (2250 μm x 2220 μm) pad frame, and fabrication through MOSIS is planned next.
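The serial readout described above can be pictured with a toy behavioral model, assuming an idealized register where each clock moves every charge packet one stage toward the output; the optional `cti` parameter is a crude stand-in for charge-transfer inefficiency, not a device-level model of this chip:

```python
def ccd_readout(pixel_charges, cti=0.0):
    """Behavioral sketch of serial CCD shift-register readout.
    On each clock, the packet nearest the output node is sampled;
    a fraction `cti` of it is assumed left behind and smears into
    the trailing packet. Purely illustrative -- real charge-transfer
    loss occurs at every stage transfer, not only at the output."""
    register = list(pixel_charges)
    samples = []
    for _ in range(len(pixel_charges)):
        samples.append(register[0] * (1 - cti))
        leftover = register[0] * cti      # charge left behind
        register = register[1:]
        if register:
            register[0] += leftover       # joins the trailing packet
    return samples
```

With `cti=0` the readout reproduces the pixel charges in order; a nonzero `cti` shows the characteristic trailing smear.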
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/15123
- Subject Headings
- Integrated circuits--Very large scale integration, Optical character recognition devices, Pattern recognition systems, Imaging systems
- Format
- Document (PDF)
- Title
- A VLSI implementable thinning algorithm.
- Creator
- Zhang, Wei, Florida Atlantic University, Shankar, Ravi, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Thinning is a very important step in a character recognition system. This thesis evolves a thinning algorithm that can be implemented in hardware to improve the speed of the process. The software thinning algorithm features a simple set of rules that can be applied to both hexagonal and orthogonal character images. The hardware architecture features an SIMD structure, simple processing elements, and near-neighbor communications. The algorithm was simulated against the U.S. Postal Service Character Database. The architecture, evolved with consideration of both the software constraints and the physical layout limitations, was simulated using the VHDL hardware description language. Subsequent to VLSI design and simulation, the chip was fabricated. The project provides a feasibility study of utilizing a parallel-processor architecture for the implementation of a parallel image-thinning algorithm. It is hoped that such a hardware implementation will speed up the processing and lead eventually to a real-time system.
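The abstract does not reproduce the thesis's rule set, so as a point of comparison only, the classic Zhang-Suen parallel thinning pass on an orthogonal grid (one of the standard rule-based skeletonization algorithms, and likewise amenable to SIMD near-neighbor hardware) can be sketched as:

```python
def zhang_suen_thin(image):
    """Zhang-Suen thinning on a binary image (list of lists of 0/1).
    Two alternating parallel sub-iterations peel boundary pixels
    until a roughly one-pixel-wide skeleton remains. Shown for
    comparison only; the thesis defines its own rules, applicable
    to hexagonal as well as orthogonal lattices."""
    img = [row[:] for row in image]
    h, w = len(img), len(img[0])

    def neighbours(y, x):
        # P2..P9, clockwise starting from the pixel above
        return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
                img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_clear = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if img[y][x] != 1:
                        continue
                    p = neighbours(y, x)
                    b = sum(p)  # number of object neighbours
                    # a = number of 0->1 transitions around the pixel
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1
                            for i in range(8))
                    if not (2 <= b <= 6 and a == 1):
                        continue
                    if step == 0:
                        ok = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        ok = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if ok:
                        to_clear.append((y, x))
            for y, x in to_clear:
                img[y][x] = 0
                changed = True
    return img
```

Because each sub-iteration computes all deletions before applying any, every pixel's decision depends only on its eight neighbours from the previous state, which is what makes the pass map naturally onto an SIMD array with near-neighbour links.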
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/14837
- Subject Headings
- Optical character recognition devices--Computer simulation, Algorithms, Integrated circuits--Very large scale integration
- Format
- Document (PDF)
- Title
- A VLSI implementable learning algorithm.
- Creator
- Ruiz, Laura V., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A top-down design methodology using hardware description languages (HDLs) and powerful design, analysis, synthesis, and layout software tools for electronic circuit design is described and applied to the design of a single-layer artificial neural network that incorporates on-chip learning. Using the perceptron learning algorithm, these simple neurons learn a classification problem in 10.55 microseconds in one application. The objective is to describe a methodology by following the design of a simple network. This methodology is later applied to the design of a novel architecture, a stochastic neural network. All issues related to algorithmic design for VLSI implementability are discussed, and results of layout and timing analysis from software simulations are given. A top-down design methodology is presented, including a brief introduction to HDLs and an overview of the software tools used throughout the design process. These tools now make it possible for a designer to complete a design in a relatively short period of time. In-depth knowledge of computer architecture, VLSI fabrication, electronic circuits, and integrated circuit design is no longer essential to accomplish a task that a few years ago would have required a large team of specialists in many fields. This may appeal to researchers from a wide range of backgrounds, including computer scientists, mathematicians, and psychologists experimenting with learning algorithms. It is only in a hardware implementation of artificial neural network learning algorithms that the true parallel nature of these architectures can be fully tested. Most applications of neural networks are basically software simulations of the algorithms run on a single CPU executing sequential simulations of a parallel, richly interconnected architecture.
This dissertation describes a methodology whereby a researcher experimenting with a known or new learning algorithm will be able to test it as it was intended to run: on a parallel hardware architecture.
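For reference, the perceptron learning rule mentioned above is simple enough to state in a few lines. This is the textbook software form of the rule, not the dissertation's on-chip circuit; the learning rate and epoch cap are illustrative defaults:

```python
def perceptron_train(samples, labels, lr=1.0, epochs=100):
    """Textbook single-layer perceptron learning rule: for each
    misclassified sample, move the weights toward (or away from)
    its input vector. Converges for linearly separable problems
    such as AND or OR, but not XOR."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        errors = 0
        for x, t in zip(samples, labels):
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != t:
                errors += 1
                for i in range(n):
                    w[i] += lr * (t - y) * x[i]  # weight update
                b += lr * (t - y)                # bias update
        if errors == 0:   # converged: whole pass with no mistakes
            break
    return w, b

def perceptron_predict(w, b, x):
    """Threshold unit: fire iff the weighted sum exceeds zero."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

The per-sample update is local to each weight, which is precisely the property that makes the rule attractive for on-chip, fully parallel learning hardware.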
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/12453
- Subject Headings
- Integrated circuits--Very large scale integration--Design and construction, Neural networks (Computer science)--Design and construction, Computer algorithms, Machine learning
- Format
- Document (PDF)
- Title
- A VLSI implementable handwritten digit recognition system using artificial neural networks.
- Creator
- Agba, Lawrence C., Florida Atlantic University, Shankar, Ravi, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A VLSI-implementable feature extraction scheme and two VLSI-implementable algorithms for feature classification, which should lead to a practical handwritten digit recognition system, are proposed. The feature extraction algorithm exploits the concept of holon dynamics. Holons can be regarded as a group of cooperative processors with a self-organizing property. Two types of artificial neural network-based classifiers have been evolved to classify these features. The United States Post Office handwritten digit database was used to train and test these networks. The first type of classifier system used limited-interconnect multi-layer perceptron (LIMP) modules in a hierarchical configuration. Each classifier in this system was independently trained and designated to recognize a particular digit. A maximum of sixty-one digits were used for training, and 464 digits, which included the training set, were used to test the classifiers. A cumulative performance of 93.75% (correctly recognized digits) was recorded. The second classifier system consists of a cluster of small multi-layer perceptron (CLUMP) networks. Each cell in this system was independently trained to trace the boundary between two or more digits in the recognition plane. A combination of these cells distinguishes a digit from the rest. This system was trained with 1796 digits and tested on a different set of 1918 digits. On the training set a performance of 95.55% was recorded, while the test data yielded 79.35%. These results, which are expected to improve further, are superior to those obtained by other researchers on the same database. This technique of digit recognition is general enough for application in the development of a universal alphanumeric recognition system. A hybrid VLSI system, consisting of both analog and digital circuitry and utilizing both Bi-CMOS and switched-capacitor technologies, has been designed.
The design is intended for implementation in the current MOSIS 2 $\mu$m, double-poly, double-metal, p-well CMOS technology. The integrated circuit is such that both classifier systems can be realized using the same chip.
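One generic way to combine per-digit classifiers of the kind described (one module designated per digit) is to take the strongest response and reject ambiguous inputs. This argmax-with-margin scheme is a sketch of the general idea, not the dissertation's hierarchical LIMP arbitration; the callable interface and margin value are assumptions:

```python
def classify_digit(feature_vec, classifiers, reject_margin=0.1):
    """Run one independently trained classifier per digit and pick
    the digit whose classifier responds most strongly. If the top
    two responses are closer than `reject_margin`, reject rather
    than guess. `classifiers` maps digit -> callable returning a
    score in [0, 1]; at least two classifiers are expected."""
    scores = sorted(((clf(feature_vec), d)
                     for d, clf in classifiers.items()), reverse=True)
    (best_score, best_digit), (second_score, _) = scores[0], scores[1]
    if best_score - second_score < reject_margin:
        return None  # ambiguous input -- reject
    return best_digit
```

Rejecting near-ties trades a little coverage for accuracy, which is often preferable in postal-address reading where a wrong digit is costlier than a deferred one.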
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/12260
- Subject Headings
- Optical character recognition devices--Computer simulation, Pattern recognition systems--Computer simulation
- Format
- Document (PDF)
- Title
- Visualization tool for molecular dynamics simulation.
- Creator
- Garg, Meha., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The study of molecular dynamics using computational methods and modeling provides understanding of the interactions, properties, structure, and motion of atoms, and of the phenomena being modeled. Numerous commercial tools are available for simulation, analysis, and visualization; however, no single tool provides all of the needed functionality. The main objective of this work is the development of a visualization tool customized to our research needs: viewing the three-dimensional orientation of the atoms, processing simulation results offline, handling large volumes of data, displaying complete frames and atomic trails, and responding to researchers' queries at runtime with low processing time. This thesis forms the basis for the development of such an in-house tool for the analysis and display of simulation results, based on OpenGL and MFC. Advantages, limitations, capabilities, and future aspects are also discussed. The result is a system capable of processing a large amount of simulation result data in 11 minutes, with query response and display in less than 1 second.
- Date Issued
- 2010
- PURL
- http://purl.flvc.org/FAU/1927308
- Subject Headings
- Molecular dynamics, Computer simulation, Condensed matter, Computer simulation, Intermolecular forces, Computer simulation, Molecules, Mathematical models
- Format
- Document (PDF)
- Title
- Visualization of search engine query result using region-based document model on XML documents.
- Creator
- Parikh, Sunish Umesh., Florida Atlantic University, Horton, Thomas, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Information access systems have traditionally focused on retrieval of documents consisting of titles and abstracts. The underlying assumptions of such systems are not necessarily appropriate for full-text, structured documents. Context and structure should play an important role in information access from full-text document collections. When a system retrieves a document in response to a query, it is important to indicate not only how strong the match is (e.g., how many terms from the query are present in the document), but also how frequent each term is, how each term is distributed in the text, and where the terms overlap within the document. This information is especially important in long texts, since it is less clear how the terms in the query contribute to the ranking of a long text than of a short abstract. This thesis investigates the application of information visualization techniques to the problem of navigating and finding information in XML files, which are becoming available in increasing quantities on the World Wide Web (WWW). It provides a methodology for presenting detailed information about a specific topic while also presenting a complete overview of all the information available. A prototype has been developed for the visualization of search query results. Limitations of the prototype and directions for future work are also discussed.
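The per-term distribution information described above (how often and where each query term occurs in the text) is the raw material for a tile-style visualization. As a generic sketch of that computation only, not the prototype's region-based XML model, one might bucket term occurrences into equal segments:

```python
def term_distribution(text, query_terms, n_segments=10):
    """Divide a document into equal word segments and count the
    occurrences of each query term per segment -- the per-term
    'density map' a tile-style query-result visualization would
    render as shaded cells. Whitespace tokenization and equal-size
    segments are simplifying assumptions."""
    words = text.lower().split()
    seg_len = max(1, len(words) // n_segments)
    dist = {t: [0] * n_segments for t in query_terms}
    for i, word in enumerate(words):
        seg = min(i // seg_len, n_segments - 1)  # clamp the tail
        for t in query_terms:
            if word == t.lower():
                dist[t][seg] += 1
    return dist
```

Rendering each count as a shaded tile per segment lets a user see at a glance whether query terms cluster in one region of a long document or are spread throughout it.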
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/12694
- Subject Headings
- XML (Document markup language), Web search engines
- Format
- Document (PDF)