- Title
- A feedback-based multimedia synchronization technique for distributed systems.
- Creator
- Ehley, Lynnae Anne., Florida Atlantic University, Ilyas, Mohammad, Furht, Borko, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Multimedia applications incorporate the use of more than one type of media, i.e., voice, video, data, text and image. With the advances in high-speed communication, the ability to transmit multimedia is becoming widely available. One of the means of transport for multimedia in distributed networks is Broadband Integrated Services Digital Network (B-ISDN). B-ISDN supports the transport of large volumes of data with a low error rate. It also handles the burstiness of multimedia traffic by providing dynamic bandwidth allocation. When multimedia is requested for transport in a distributed network, different Quality of Service (QOS) may be required for each type of media. For example, video can withstand more errors than voice. In order to provide the most efficient form of transfer, different QOS media are sent using different channels. By using different channels for transport, jitter can impose skews on the temporal relations between the media. Jitter is caused by errors and buffering delays. Since B-ISDN uses Asynchronous Transfer Mode (ATM) as its transfer mode, the jitter that is incurred can be assumed to be bounded if traffic management principles such as admission control and resource reservation are employed. Another network that can assume bounded buffering is the 16 Mbps token-ring LAN when the LAN Server (LS) Ultimedia(TM) software is applied over the OS/2 LAN Server(TM) (using OS/2(TM)). LS Ultimedia(TM) reserves critical resources such as disk, server processor, and network resources for multimedia use. In addition, it also enforces admission control(1). Since jitter is bounded on the networks chosen, buffers can be used to realign the temporal relations in the media. This dissertation presents a solution to this problem by proposing a Feedback-based Multimedia Synchronization Technique (FMST) to correct and compensate for the jitter that is incurred when media are received over high speed communication channels and played back in real time.
FMST has been implemented at the session layer for the playback of the streams. A personal computer was used to perform their synchronized playback from a 16 Mbps token-ring and from a simulated B-ISDN network.
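The buffer-based realignment this abstract describes can be sketched as a playout buffer whose fill level feeds back into the playback rate; the class name, target depth, and proportional gain below are illustrative assumptions, not details taken from the dissertation.

```python
from collections import deque

class PlayoutBuffer:
    """Illustrative playout buffer for jittered media streams.

    Frames arrive with bounded jitter; playback consumes them in order,
    and a feedback signal derived from the buffer depth nudges the
    playout rate so the stream stays aligned with its target timing.
    """

    def __init__(self, target=3):
        self.target = target          # desired buffer depth (frames)
        self.frames = deque()
        self.rate_adjust = 0.0        # feedback signal: positive = play faster

    def arrive(self, frame):
        self.frames.append(frame)

    def play(self):
        """Return the next frame and update the feedback signal."""
        if not self.frames:
            return None               # underrun: jitter exceeded the bound
        # Proportional feedback: deviation from the target depth drives
        # a small correction to the playout rate.
        error = len(self.frames) - self.target
        self.rate_adjust = 0.1 * error
        return self.frames.popleft()

buf = PlayoutBuffer(target=3)
for f in range(5):                    # 5 frames arrive in a burst
    buf.arrive(f)
first = buf.play()                    # buffer is 2 over target -> speed up
print(first, buf.rate_adjust)
```

A real FMST-style controller would act on timestamps and inter-stream skew rather than a single stream's depth; this sketch only shows the feedback loop shape.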
- Date Issued
- 1994
- PURL
- http://purl.flvc.org/fcla/dt/12382
- Subject Headings
- Multimedia systems, Broadband communication systems, Data transmission systems, Integrated services digital networks, Electronic data processing--Distributed processing
- Format
- Document (PDF)
- Title
- The finite element method as a parametric tool in the design and analysis of a pressure vessel having a threaded closure.
- Creator
- Merkl, Garrett Andrew., Florida Atlantic University, Case, Robert O., Tsai, Chi-Tay, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
The finite element method is a very powerful tool used to analyze a variety of problems in engineering. This thesis looks at the finite element method as a tool and several important modeling features of concern. A well-known finite element software package, ANSYS, will be used to demonstrate a diverse number of its capabilities, and several procedures followed in solving a specific engineering problem. The subject matter involves a nonlinear contact analysis of a pressure vessel having a threaded closure. The choice of this application is prompted by an interest in better understanding how the finite element method is implemented in the design and analysis of different pressure vessel parameters. A parametric finite element analysis was performed. Load and stress distributions along the threaded region of the vessel were examined for parameters including number of threads, thread pitch, diameter ratio, closure plug length, and thread profile.
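As a toy illustration of parametric finite element analysis (not the ANSYS thread-contact model of the thesis), the sketch below assembles the stiffness matrix for a one-dimensional axially loaded bar and sweeps the mesh density, watching the tip displacement against the exact answer PL/AE; all dimensions and loads are invented unit values.

```python
def bar_tip_displacement(n_elems, length=1.0, area=1.0, modulus=1.0, load=1.0):
    """Tip displacement of a fixed-free bar under an end load,
    computed with n_elems linear finite elements (exact answer: PL/AE)."""
    n = n_elems + 1                          # number of nodes
    k = area * modulus / (length / n_elems)  # stiffness of one element
    # Assemble the global stiffness matrix (tridiagonal for a 1-D bar).
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elems):
        K[e][e] += k
        K[e][e + 1] -= k
        K[e + 1][e] -= k
        K[e + 1][e + 1] += k
    # Fixed boundary at node 0: drop its row/column, load only at the tip.
    A = [row[1:] for row in K[1:]]
    f = [0.0] * (n - 1)
    f[-1] = load
    m = len(A)
    for i in range(m):                       # Gaussian elimination
        for j in range(i + 1, m):
            if A[j][i]:
                r = A[j][i] / A[i][i]
                A[j] = [a - r * b for a, b in zip(A[j], A[i])]
                f[j] -= r * f[i]
    u = [0.0] * m
    for i in reversed(range(m)):             # back substitution
        u[i] = (f[i] - sum(A[i][j] * u[j] for j in range(i + 1, m))) / A[i][i]
    return u[-1]

# Parametric sweep over mesh density; linear elements are exact for this case.
for n in (1, 2, 4):
    print(n, bar_tip_displacement(n))
```

A real parametric study, as in the thesis, varies geometric parameters (thread count, pitch, diameter ratio) rather than mesh density, but the assemble-solve-sweep loop has the same shape.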
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/15243
- Subject Headings
- Finite element method, Pressure vessels--Design and construction, Strains and stresses--Mathematical models
- Format
- Document (PDF)
- Title
- A fault-tolerant memory architecture for storing one hour of D-1 video in real time on long polyimide tapes.
- Creator
- Monteiro, Pedro Cox de Sousa., Florida Atlantic University, Glenn, William E., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Research is under way to fabricate large-area thin-film transistor arrays produced on a thin polyimide substrate. The polyimide substrate is available in long thirty centimeter wide rolls of tape, and lithography hardware is being developed to expose hundreds of meters of this tape with electrically addressable light modulators which can resolve 2 µm features. A fault-tolerant memory architecture is proposed that is capable of storing one hour of D-1 component digital video (almost 10^12 bits) in real-time, on eight two-hundred meter long tapes. Appropriate error correcting codes and error concealment are proposed to compensate for drop-outs resulting from manufacturing defects so as to yield video images with error rates low enough to survive several generations of copies.
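The drop-out compensation idea can be illustrated with the classic Hamming(7,4) code, which corrects any single-bit error per 7-bit block; the abstract does not specify which codes the architecture actually uses, so this is only a generic example of the principle.

```python
def hamming74_encode(d):
    """Encode 4 data bits as 7 bits with 3 parity bits (Hamming(7,4))."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                     # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                     # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                     # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # bit positions 1..7

def hamming74_correct(c):
    """Locate and flip a single corrupted bit, then return the data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 1-based index of the bad bit, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = hamming74_encode([1, 0, 1, 1])
word[3] ^= 1                              # simulate a tape drop-out
print(hamming74_correct(word))            # recovers [1, 0, 1, 1]
```

Video memories of this kind typically combine such block codes with interleaving and error concealment so that defects clustered along the tape do not overwhelm any single code block.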
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/14869
- Subject Headings
- Polyimides, Computer architecture, Memory hierarchy (Computer science), Fault-tolerant computing
- Format
- Document (PDF)
- Title
- The human face recognition problem: A solution based on third-order synthetic neural networks and isodensity analysis.
- Creator
- Uwechue, Okechukwu A., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Third-order synthetic neural networks are applied to the recognition of isodensity facial images extracted from digitized grayscale facial images. A key property of neural networks is their ability to recognize invariances and extract essential parameters from complex high-dimensional data. In pattern recognition an input image must be recognized regardless of its position, size, and angular orientation. In order to achieve this, the neural network needs to learn the relationships between the input pixels. Pattern recognition requires the nonlinear subdivision of the pattern space into subsets representing the objects to be identified. Single-layer neural networks can only perform linear discrimination. However, multilayer first-order networks and high-order neural networks can both achieve this. The most significant advantage of a higher-order net over a traditional multilayer perceptron is that invariances to 2-dimensional geometric transformations can be incorporated into the network and need not be learned through prolonged training with an extensive family of exemplars. It is shown that a third-order network can be used to achieve translation-, scale-, and rotation-invariant recognition with a significant reduction in training time over other neural net paradigms such as the multilayer perceptron. A model based on an enhanced version of the Widrow-Hoff training algorithm and a new momentum paradigm are introduced and applied to the complex problem of human face recognition under varying facial expressions. Arguments for the use of isodensity information in the recognition algorithm are put forth and it is shown how the technique of coarse-coding is applied to reduce the memory required for computer simulations. The combination of isodensity information and neural networks for image recognition is described and its merits over other image recognition methods are explained.
It is shown that isodensity information coupled with the use of an "adaptive threshold strategy" (ATS) yields a system that is relatively impervious to image contrast noise. The new momentum paradigm produces much faster convergence rates than ordinary momentum and renders the network behaviour independent of its training parameters over a broad range of parameter values.
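The built-in geometric invariance of a third-order network comes from letting the weight of each pixel triple depend only on properties of the triangle those three pixels form, such as its interior angles, which are unchanged by translation, scaling, and rotation. The sketch below demonstrates that invariant quantity on its own, without a full network; the points and transform parameters are arbitrary.

```python
import math

def triangle_angles(p, q, r):
    """Interior angles of triangle (p, q, r), sorted ascending.

    A third-order network can key each triple's weight on these angles,
    so the response is invariant to translation, scale, and rotation.
    """
    def angle(a, b, c):                   # angle at vertex a
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - a[0], c[1] - a[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        return math.acos(dot / (math.hypot(*v1) * math.hypot(*v2)))
    return sorted((angle(p, q, r), angle(q, p, r), angle(r, p, q)))

def transform(pt, scale, theta, shift):
    """Scale, rotate by theta, then translate a 2-D point."""
    x, y = scale * pt[0], scale * pt[1]
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y + shift[0], s * x + c * y + shift[1])

tri = [(0, 0), (4, 0), (1, 3)]
moved = [transform(p, 2.0, 0.7, (5, -2)) for p in tri]
a1 = triangle_angles(*tri)
a2 = triangle_angles(*moved)
print(all(abs(x - y) < 1e-9 for x, y in zip(a1, a2)))   # True
```

Because the invariance is built into the feature rather than learned, the network needs no extensive family of shifted, scaled, and rotated exemplars, which is the training-time advantage the abstract describes.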
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/12464
- Subject Headings
- Image processing, Face perception, Neural networks (Computer science)
- Format
- Document (PDF)
- Title
- iVESTA: Interactive Data Visualization and Analysis for Drive Test Data Evaluation.
- Creator
- Lee, Yongsuk, Zhu, Xingquan, Pandya, Abhijit S., Hsu, Sam, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In this thesis, a practical solution for drive test data evaluation and a real application are studied. We propose a system framework to project high dimensional Drive Test Data (DTD) to well-organized web pages, such that users can visually review phone performance with respect to different factors. The proposed application, iVESTA (interactive Visualization and Evaluation System for driven Test dAta), employs a web-based architecture which enables users to upload DTD and immediately visualize the test results and observe phone and network performances with respect to different factors such as dropped call rate, signal quality, vehicle speed, handover and network delays. iVESTA provides practical solutions for mobile phone manufacturers and network service providers to perform comprehensive studies of their products from the real-world DTD.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012532
- Subject Headings
- Information visualization--Data processing, Object-oriented programming (Computer science), Information technology--Management, Application software--Development
- Format
- Document (PDF)
- Title
- VoIP Network Security and Forensic Models using Patterns.
- Creator
- Pelaez, Juan C., Fernandez, Eduardo B., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Voice over Internet Protocol (VoIP) networks are becoming the most popular telephony system in the world. However, studies of the security of VoIP networks are still in their infancy. VoIP devices and networks are commonly attacked, and it is therefore necessary to analyze the threats against the converged network and the techniques that exist today to stop or mitigate these attacks. We also need to understand what evidence can be obtained from the VoIP system after an attack has occurred. Many of these attacks occur in similar ways in different contexts or environments. Generic solutions to these issues can be expressed as patterns. A pattern can be used to guide the design or simulation of VoIP systems as an abstract solution to a problem in this environment. Patterns have shown their value in developing good quality software and we expect that their application to VoIP will also prove valuable to build secure systems. This dissertation presents a variety of patterns (architectural, attack, forensic and security patterns). These patterns will help forensic analysts as well as secure systems developers, because they provide a systematic approach to structure the required information and help understand system weaknesses. The patterns will also allow us to specify, analyze and implement network security investigations for different architectures. The pattern system uses object-oriented modeling (Unified Modeling Language) as a way to formalize the information and dynamics of attacks and systems.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012576
- Subject Headings
- Internet telephony--Security measures, Computer network protocols, Global system for mobile communications, Software engineering
- Format
- Document (PDF)
- Title
- Visualization of Impact Analysis on Configuration Management Data for Software Process Improvement.
- Creator
- Lo, Christopher Hoi-Yin, Huang, Shihong, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The software development process is an incremental and iterative activity. Source code is constantly altered to reflect changing requirements, to respond to testing results, and to address problem reports. Proper software measurement that derives meaningful numeric values for some attributes of a software product or process can help in identifying problem areas and development bottlenecks. Impact analysis is the evaluation of the risks associated with change requests or problem reports, including estimates of effects on resources, effort, and schedule. This thesis presents a methodology called VITA for applying software analysis techniques to configuration management repository data with the aim of identifying the impact on file changes due to change requests and problem reports. The repository data can be analyzed and visualized in a semi-automated manner according to user-selectable criteria. The approach is illustrated with a model problem concerning software process improvement of an embedded software system in the context of performing high-quality software maintenance.
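One minimal flavor of impact analysis over configuration-management data (a generic sketch, not the VITA methodology itself) is to mine which files have historically changed together with the file named in a change request; the commit history and file names below are invented.

```python
from collections import Counter
from itertools import combinations

def co_change_counts(commits):
    """Count how often each pair of files changes in the same commit."""
    pairs = Counter()
    for files in commits:
        for a, b in combinations(sorted(set(files)), 2):
            pairs[(a, b)] += 1
    return pairs

def likely_impact(commits, changed_file, min_support=2):
    """Files that co-changed with `changed_file` at least min_support times,
    most frequent first -- a crude estimate of the change's ripple effect."""
    hits = {}
    for (a, b), n in co_change_counts(commits).items():
        if n >= min_support:
            if a == changed_file:
                hits[b] = n
            elif b == changed_file:
                hits[a] = n
    return sorted(hits, key=hits.get, reverse=True)

# Invented commit history: each entry is the file set of one commit.
history = [
    ["parser.c", "ast.h"],
    ["parser.c", "ast.h", "docs.txt"],
    ["lexer.c"],
    ["parser.c", "ast.h"],
]
print(likely_impact(history, "parser.c"))   # ['ast.h']
```

Visualizing these counts (for instance as a heat map over the file set) is the kind of semi-automated, criteria-driven view the abstract describes.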
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012535
- Subject Headings
- Software measurement, Software engineering--Quality control, Data mining--Quality control
- Format
- Document (PDF)
- Title
- Visualization of search engine query result using region-based document model on XML documents.
- Creator
- Parikh, Sunish Umesh., Florida Atlantic University, Horton, Thomas, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Information access systems have traditionally focused on retrieval of documents consisting of titles and abstracts. The underlying assumptions of such systems are not necessarily appropriate for full text, structured documents. Context and structure should play an important role in information access from full text document collections. When a system retrieves a document in response to a query, it is important to indicate not only how strong the match is (e.g., how many terms from the query are present in the document), but also how frequent each term is, how each term is distributed in the text and where the terms overlap within the document. This information is especially important in long texts, since it is less clear how the terms in the query contribute to the ranking of a long text than a short abstract. This thesis investigates the application of information visualization techniques to the problem of navigating and finding information in XML files, which are becoming available in increasing quantities on the World Wide Web (WWW). It provides a methodology for presenting detailed information about a specific topic while also presenting a complete overview of all the information available. A prototype has been developed for visualization of search query results. Limitations of the prototype developed and future directions of work are also discussed.
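The "where do the query terms fall in a long document" idea can be sketched as a per-segment term-distribution map, in the spirit of TileBar-style visualizations; the segmentation scheme and the sample document below are illustrative choices, not the thesis's region-based model.

```python
def term_distribution(text, terms, segments=8):
    """Map each query term to per-segment hit counts across the text.

    Returns {term: [hits_in_segment_0, hits_in_segment_1, ...]},
    a crude textual version of the term-distribution bars drawn in
    query-result visualizations for long documents.
    """
    words = text.lower().split()
    seg_len = max(1, len(words) // segments)
    dist = {t: [0] * segments for t in terms}
    for i, w in enumerate(words):
        seg = min(i // seg_len, segments - 1)   # clamp the tail overflow
        if w in dist:
            dist[w][seg] += 1
    return dist

doc = "xml documents use tags . search engines index xml . users query xml"
d = term_distribution(doc, ["xml", "query"], segments=4)
print(d["xml"], d["query"])
```

Rendering each list as a row of shaded cells gives both the detail (which passages match) and the overview (how matches spread through the document) that the abstract calls for.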
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/12694
- Subject Headings
- XML (Document markup language), Web search engines
- Format
- Document (PDF)
- Title
- Visualization as a Qualitative Method for Analysis of Data from Location Tracking Technologies.
- Creator
- Mani, Mohan, VanHilst, Michael, Pandya, Abhijit S., Hsu, Sam, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
One of the biggest factors in the quest for better wireless communication is cellular call handoff, which in turn is a function of geographic location. In this thesis, our fundamental goal was to demonstrate the value added by spatial data visualization techniques for the analysis of geo-referenced data from two different location tracking technologies: GPS and cellular systems. Through our efforts, we unearthed some valuable and surprising insights from the data being analyzed that led to interesting observations about the data itself, as opposed to the entity, or entities, that the data is supposed to describe. In doing so, we underscored the value added by spatial data visualization techniques even in the incipient stages of analysis of geo-referenced data from cellular networks. We also demonstrated the value of visualization techniques as a verification tool to verify the results of analysis done through other methods, such as statistical analysis.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fau/fd/FA00012536
- Subject Headings
- Mobile communication systems, Algorithms--Data analysis, Radio--Transmitters and transmissions, Code division multiple access
- Format
- Document (PDF)
- Title
- An intelligent neural network forecaster to predict the Standard & Poor's 500 index.
- Creator
- Shah, Sulay Bipin., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In this thesis we present an intelligent forecaster based on neural network technology to capture the future path of the market indicator. This thesis is about the development of a new methodology in financial forecasting. An effort is made to develop a neural network forecaster using the financial indicators as the input variables. A complex recurrent neural network is used to capture the behavior of the nonlinear characteristics of the S&P 500. The main outcome of this research is a systematic way of constructing a forecaster for the nonlinear and non-stationary data series of the S&P 500 that leads to very good out-of-sample prediction. The results of the training and testing of the network are presented along with conclusions. The tool used for the validation of this research is "Brainmaker". This thesis also contains a brief survey of available tools for financial forecasting.
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/15741
- Subject Headings
- Neural networks (Computer science), Stock price forecasting, Time-series analysis
- Format
- Document (PDF)
- Title
- The cochlea: A signal processing paradigm.
- Creator
- Barrett, Raymond L. Jr., Florida Atlantic University, Erdol, Nurgun, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The cochlea provides frequency selectivity for acoustic input signal processing in mammals. The excellent performance of human hearing for speech processing leads to examination of the cochlea as a paradigm for signal processing. The components of the hearing process are examined and suitable models are selected for each component's function. The signal processing function is simulated by a computer program and the ensemble is examined for behavior and improvement. The models reveal that the motion of the basilar membrane provides a very selective low pass transmission characteristic. Narrowband frequency resolution is obtained from the motion by computation of spatial differences in the magnitude of the motion as energy propagates along the membrane. Basilar membrane motion is simulated using the integrable model of M. R. Schroeder, but the paradigm is useful for any model that exhibits similar high selectivity. Support is shown for a hypothesis that good frequency discrimination is possible without highly resonant structure. The nonlinear magnitude calculation is performed on signals developed without highly resonant structure, and the differences in those magnitudes are shown to be signals with good narrowband selectivity. Simultaneously, good transient behavior is preserved due to the avoidance of highly resonant structure. The cochlear paradigm is shown to provide a power spectrum with serendipitously good frequency selectivity and good transient response.
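The central move described here, taking spatial differences of response magnitudes along a selective lowpass cascade to obtain narrowband "places", can be imitated with a toy filter bank; the stages below are ordinary one-pole lowpass filters with invented cutoffs, not Schroeder's cochlear model.

```python
import math

def cascade_magnitudes(freq, cutoffs, n=2000, fs=8000.0):
    """RMS output of each stage of a lowpass cascade driven by a sinusoid.

    Each stage is a one-pole lowpass filter; its output feeds the next
    stage, loosely mimicking energy propagating along the membrane.
    """
    x = [math.sin(2 * math.pi * freq * t / fs) for t in range(n)]
    mags = []
    for fc in cutoffs:
        a = math.exp(-2 * math.pi * fc / fs)      # one-pole coefficient
        y, prev = [], 0.0
        for s in x:
            prev = (1 - a) * s + a * prev
            y.append(prev)
        x = y                                     # feed the next stage
        mags.append(math.sqrt(sum(v * v for v in y) / n))
    return mags

cutoffs = [2000.0, 1000.0, 500.0, 250.0, 125.0]   # decreasing along the cascade
for f in (300.0, 1500.0):
    m = cascade_magnitudes(f, cutoffs)
    drops = [m[i] - m[i + 1] for i in range(len(m) - 1)]
    # The largest drop between adjacent stages marks the "place"
    # tuned to this frequency, without any highly resonant structure.
    print(f, drops.index(max(drops)))
```

Higher input frequencies die out earlier in the cascade, so their peak magnitude difference appears at an earlier stage, giving frequency resolution from non-resonant lowpass stages, which is the hypothesis the abstract supports.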
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/12251
- Subject Headings
- Engineering, Electronics and Electrical, Computer Science
- Format
- Document (PDF)
- Title
- A connectionist approach to adaptive reasoning: An expert system to predict skid numbers.
- Creator
- Reddy, Mohan S., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This project illustrates the neural network approach to constructing a fuzzy logic decision system. This technique employs an artificial neural network (ANN) to recognize the relationships that exist between the various inputs and outputs. An ANN is constructed based on the variables present in the application. The network is trained and tested. Various training methods are explored, some of which include auxiliary input and output columns. After successful testing, the ANN is exposed to new data and the results are grouped into fuzzy membership sets based on membership evaluation rules. This data grouping forms the basis of a new ANN. The network is now trained and tested with the fuzzy membership data. New data is presented to the trained network and the results form the fuzzy implications. This approach is used to compute skid resistance values from G-analyst accelerometer readings on open grid bridge decks.
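The fuzzy-membership grouping step can be sketched with triangular membership functions that place a numeric output, such as a predicted skid number, into overlapping linguistic sets; the set labels and boundaries below are invented for illustration and are not the project's actual evaluation rules.

```python
def triangular(x, left, peak, right):
    """Triangular fuzzy membership: 0 outside [left, right], 1 at peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def fuzzify(value, sets):
    """Degree of membership of `value` in each labeled fuzzy set."""
    return {name: triangular(value, *abc) for name, abc in sets.items()}

# Invented skid-number sets (left, peak, right) for illustration only.
skid_sets = {
    "low":    (0, 15, 30),
    "medium": (20, 35, 50),
    "high":   (40, 55, 70),
}
print(fuzzify(28, skid_sets))   # partly "low", mostly "medium"
```

Because the sets overlap, one network output can belong to two sets with different degrees, and it is these graded memberships, rather than crisp values, that feed the second ANN described in the abstract.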
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/15239
- Subject Headings
- Artificial intelligence, Fuzzy logic, Neural networks (Computer science), Pavements--Skid resistance
- Format
- Document (PDF)
- Title
- A communication protocol for wireless sensor networks.
- Creator
- Callaway, Edgar Herbert, Jr., Florida Atlantic University, Shankar, Ravi, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Many wireless network applications, such as wireless computing on local area networks, employ data throughput as a primary performance metric. The data throughput on such networks has therefore been increasing in recent years. However, there are other potential wireless network applications, such as industrial monitoring and control, consumer home automation, and military remote sensing, that have relaxed throughput requirements, often measured in bits/day. Such networks have power consumption and cost as primary performance metrics, rather than data throughput, and have been called wireless sensor networks. This work describes a physical layer, a data link layer, and a network layer design suitable for use in wireless sensor networks. To minimize node duty cycle and therefore average power consumption, while minimizing the symbol rate, the proposed physical layer employs a form of orthogonal multilevel signaling in a direct sequence spread spectrum format. Results of Signal Processing Worksystem (SPW, Cadence, Inc.) simulations are presented showing a 4-dB sensitivity advantage of the proposed modulation method compared to binary signaling, in agreement with theory. Since the proposed band of operation is the 2.4 GHz unlicensed band, interference from other services is possible; to address this, SPW simulations of the proposed modulation method in the presence of Bluetooth interference are presented. The processing gain inherent in the proposed spread spectrum scheme is shown to require the interferer to be significantly stronger than the desired signal before materially affecting the received bit error rate. The proposed data link layer employs a novel distributed mediation device (MD) technique to enable networked nodes to synchronize to each other, even when the node duty cycle is arbitrarily low (e.g., <0.1%).
This technique enables low-cost devices, which may employ only low-stability time bases, to remain asynchronous to one another, becoming synchronized only when communication is necessary between them. Finally, a wireless sensor network design is presented. A cluster-type architecture is chosen; the clusters are organized in a hierarchical tree to simplify the routing algorithm. Results of several network performance metrics simulations, including the effects of the distributed MD dynamic synchronization scheme, are presented, including the average message latency, node duty cycle, and data throughput. The architecture is shown to represent a practical alternative for the design of wireless sensor networks.
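Orthogonal multilevel signaling of the kind described can be sketched with Walsh codes: each symbol carries several bits by selecting one of 2^k mutually orthogonal chip sequences, and the receiver correlates against all of them and picks the strongest match. The code length and error here are illustrative; the dissertation's chip rates, sequence choices, and 4-dB figure are not reproduced.

```python
def walsh(k):
    """Generate the 2^k Walsh (Hadamard) rows; all rows are mutually orthogonal."""
    rows = [[1]]
    for _ in range(k):
        rows = [r + r for r in rows] + [r + [-c for c in r] for r in rows]
    return rows

def send_symbol(bits, codes):
    """Map k bits to one orthogonal sequence (one multilevel symbol)."""
    index = int("".join(map(str, bits)), 2)
    return codes[index]

def receive_symbol(chips, codes, k):
    """Correlate against every code; the strongest match decides the bits."""
    scores = [sum(a * b for a, b in zip(chips, c)) for c in codes]
    index = scores.index(max(scores))
    return [int(b) for b in format(index, f"0{k}b")]

k = 3
codes = walsh(k)                          # 8 orthogonal codes of length 8
tx = send_symbol([1, 0, 1], codes)
noisy = list(tx)
noisy[0] = -noisy[0]                      # corrupt one chip
print(receive_symbol(noisy, codes, k))    # still decodes [1, 0, 1]
```

Because each symbol carries k bits at one chip sequence per symbol, the symbol rate stays low for a given bit rate, which is the duty-cycle and power argument made above; real systems spread these symbols further with a direct sequence code.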
- Date Issued
- 2002
- PURL
- http://purl.flvc.org/fcla/dt/11991
- Subject Headings
- Wireless communication systems, Computer network protocols, Radio detectors
- Format
- Document (PDF)
- Title
- A critical comparison of three user interface architectures in object-oriented design.
- Creator
- Walls, David Paul., Florida Atlantic University, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Frameworks for the development of object-oriented, user-interactive applications have been examined. Three alternate approaches have been explored: the Model-View-Controller (MVC) approach, the MVC++ approach, and the Presentation-Abstraction-Control (PAC) approach. For the purpose of assessing the approaches, a simple engineering application was selected for object-oriented analysis using the three techniques. The utility of each technique was compared on the basis of complexity, extensibility, and reusability. While the approaches aim to provide reusable user interface components and extensibility through incorporation of an additional class, only MVC++ and PAC truly achieve this goal, although at the expense of introducing additional messaging complexity. It was also noted that, in general, decoupling of the GUI classes, while providing increased extensibility and reusability, increases the inter-object messaging requirement.
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/15747
- Subject Headings
- User interfaces (Computer systems), Object-oriented methods (Computer science)
- Format
- Document (PDF)
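The trade-off noted in the abstract above — decoupled GUI classes at the cost of extra inter-object messaging — can be seen in a minimal, hypothetical MVC sketch. This is not code from the thesis; all class and method names are illustrative. Each model change triggers an update message to every attached view, which is exactly the messaging overhead the decoupling buys.

```python
class Model:
    """Holds application state and notifies registered views on change."""
    def __init__(self):
        self._observers = []
        self.value = 0

    def attach(self, view):
        self._observers.append(view)

    def set_value(self, value):
        self.value = value
        for view in self._observers:  # extra messaging introduced by decoupling
            view.update(self)

class View:
    """Renders the model; knows nothing about how changes originate."""
    def update(self, model):
        self.rendered = f"value = {model.value}"

class Controller:
    """Translates user input into model updates; never touches the view."""
    def __init__(self, model):
        self.model = model

    def on_user_input(self, value):
        self.model.set_value(value)

model, view = Model(), View()
model.attach(view)
Controller(model).on_user_input(42)
print(view.rendered)  # value = 42
```

Because the controller and view communicate only through the model, either can be replaced independently — the reusability and extensibility the thesis compares across MVC, MVC++, and PAC.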
- Title
- A comparative study of attribute selection techniques for CBR-based software quality classification models.
- Creator
- Nguyen, Laurent Quoc Viet., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
To achieve high reliability in software-based systems, software metrics-based quality classification models have been explored in the literature. However, the collection of software metrics may be a hard and long process, and some metrics may not be helpful or may even be harmful to the classification models, deteriorating the models' accuracies. Hence, methodologies have been developed to select the most significant metrics in order to build accurate and efficient classification models. Case-Based Reasoning (CBR) is the classification technique used in this thesis. Since it does not provide any metric selection mechanisms, some metric selection techniques were studied. In the context of CBR, this thesis presents a comparative evaluation of metric selection methodologies, for raw and discretized data. Three attribute selection techniques have been studied: the Kolmogorov-Smirnov Two-Sample Test, the Kruskal-Wallis Test, and Information Gain. These techniques resulted in classification models that are useful for software quality improvement.
- Date Issued
- 2002
- PURL
- http://purl.flvc.org/fcla/dt/12944
- Subject Headings
- Case-based reasoning, Software engineering, Computer software--Quality control
- Format
- Document (PDF)
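Of the three attribute selection techniques named in the abstract above, Information Gain is the simplest to sketch. The discretized metric values and fault-proneness labels below are hypothetical, not data from the thesis; the gain formula is the standard entropy-based one from decision-tree induction, applied here to rank a software metric by how much it reduces class uncertainty.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(attribute, labels):
    """Gain = H(class) - weighted sum of H(class | attribute value)."""
    n = len(labels)
    remainder = 0.0
    for value in set(attribute):
        subset = [lbl for a, lbl in zip(attribute, labels) if a == value]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder

# Hypothetical discretized metric (e.g., binned cyclomatic complexity)
# and fault-proneness labels (1 = fault-prone module):
metric = ["low", "low", "high", "high", "high", "low"]
faulty = [0, 0, 1, 1, 0, 0]
print(round(information_gain(metric, faulty), 3))  # 0.459
```

Ranking all candidate metrics by this score and keeping the top few is the kind of selection step the thesis compares against the Kolmogorov-Smirnov and Kruskal-Wallis tests.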
- Title
- GENERALIZED PADE APPROXIMATION TECHNIQUES AND MULTIDIMENSIONAL SYSTEMS.
- Creator
- MESSITER, MARK A., Florida Atlantic University, Shamash, Yacov A., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Two algorithms for greatest common factor (GCF) extraction from two multivariable polynomials, based on generalized Padé approximation, are presented. The reduced transfer matrices for two-dimensional (2-D) systems are derived from two 2-D state-space models. Tests for product and sum separabilities of multivariable functions are also given.
- Date Issued
- 1983
- PURL
- http://purl.flvc.org/fcla/dt/14175
- Subject Headings
- Multivariate analysis, Padé approximant, Polynomials
- Format
- Document (PDF)
- Title
- GENERIC NETWORK EXECUTIVE.
- Creator
- SARMIENTO, JESUS LEOPOLDO., Florida Atlantic University, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A Generic Network Executive (GNE) package is presented in this thesis. It encompasses the strategy and methodology to follow when implementing data communication software. GNE was designed for portability and high utilization of available resources (efficiency). It does not impose implementation constraints because it does not include features specific to any system (hardware or operating system). It uses a highly concurrent process model with a pipelined structure. It is not protocol dependent; rather, it is meant to be used to implement low-level services for higher-level communication protocols. It is intended to provide interprocess communication in distributed systems by coupling application programs with a general-purpose packet delivery system, i.e., a datagram service.
- Date Issued
- 1986
- PURL
- http://purl.flvc.org/fcla/dt/14321
- Subject Headings
- Computer networks, Data transmission systems
- Format
- Document (PDF)
- Title
- Fuzzy identification of processes on finite training sets with known features.
- Creator
- Diaz-Robainas, Regino R., Florida Atlantic University, Huang, Ming Z., Zilouchian, Ali, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A methodology is presented to construct an approximate fuzzy-mapping algorithm that maps multiple inputs to single outputs given a finite training set of argument vectors functionally linked to corresponding scalar outputs. Its scope is limited to problems where the features are known in advance, or equivalently, where the expected functional representation is known to depend exclusively on the known selected variables. Programming and simulations to implement the methodology make use of Matlab Fuzzy and Neural toolboxes and a PC application of Prolog, and applications range from approximate representations of the direct kinematics of parallel manipulators to fuzzy controllers.
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/12487
- Subject Headings
- Fuzzy algorithms, Set theory, Logic, Symbolic and mathematical, Finite groups, Representations of groups
- Format
- Document (PDF)
- Title
- Fuzzy vault fingerprint cryptography: Experimental and simulation studies.
- Creator
- Kotlarchyk, Alex J., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The fuzzy vault scheme introduced by Juels and Sudan [Jue02] was implemented in a fingerprint cryptography system using COTS software. This system proved to be unsuccessful. Failure analysis led to a series of simulations to investigate the parameters and system thresholds necessary for such a system to perform adequately, and to provide guidance for constructing similar systems in the future. First, a discussion of the role of biometrics in data security and cryptography is presented, followed by a review of the key developments leading to the fuzzy vault scheme. The relevant mathematics and algorithms are briefly explained. This is followed by a detailed description of the implementation and simulation of the fuzzy vault scheme. Finally, conclusions drawn from analysis of the results of this research are presented.
- Date Issued
- 2006
- PURL
- http://purl.flvc.org/fcla/dt/13360
- Subject Headings
- Computer networks--Security measures, Computer security, Data encryption (Computer science)
- Format
- Document (PDF)
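The fuzzy vault scheme described in the abstract above can be illustrated with a toy sketch — not the COTS implementation studied in the thesis, and deliberately simplified. A secret is stored as the constant term of a polynomial over a small prime field; genuine feature points (standing in for fingerprint minutiae) lie on the polynomial, chaff points do not, and a query that matches enough genuine points recovers the secret by Lagrange interpolation. All parameters below (field size, polynomial, point values) are illustrative assumptions.

```python
import random

P = 97  # small prime field for the toy example; real systems use far larger fields

def poly_eval(coeffs, x):
    """Evaluate a polynomial (constant term first) at x, mod P."""
    result = 0
    for c in reversed(coeffs):
        result = (result * x + c) % P
    return result

def lock(secret_coeffs, genuine_xs, n_chaff):
    """Build a vault: genuine points lie on the polynomial, chaff points do not."""
    vault = [(x, poly_eval(secret_coeffs, x)) for x in genuine_xs]
    used = set(genuine_xs)
    while len(vault) < len(genuine_xs) + n_chaff:
        x = random.randrange(P)
        if x in used:
            continue
        y = random.randrange(P)
        if y != poly_eval(secret_coeffs, x):  # chaff must miss the polynomial
            vault.append((x, y))
            used.add(x)
    random.shuffle(vault)  # hide which points are genuine
    return vault

def unlock(vault, query_xs, degree):
    """Recover the secret p(0) by Lagrange interpolation on matching points."""
    pts = [(x, y) for x, y in vault if x in set(query_xs)][:degree + 1]
    secret = 0
    for i, (xi, yi) in enumerate(pts):
        num, den = 1, 1
        for j, (xj, _) in enumerate(pts):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

vault = lock([42, 7, 3], genuine_xs=[5, 11, 23, 31], n_chaff=20)  # secret p(0) = 42
print(unlock(vault, query_xs=[5, 11, 23], degree=2))  # 42
```

The thresholds the thesis investigates correspond to the knobs here: polynomial degree, chaff count, and how many query points must match before interpolation succeeds.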
- Title
- Generating formal models from UML class diagrams.
- Creator
- Shroff, Malcolm Keki., Florida Atlantic University, France, Robert B., Larrondo-Petrie, Maria M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The rich structuring mechanisms and abstract modeling constructs available in most graphical object-oriented modeling methods (OOMs) facilitate the creation of abstract, visually appealing, highly structured graphical models. On the other hand, lack of formal semantics for the modeling notation can severely limit the utility of OOMs. Formal specification techniques (FSTs) support the creation of precise and analyzable specifications, but they can be tedious to create and difficult to read, especially by system developers not trained in formal methods. The complementary strengths of OOMs and FSTs suggest that their integration can result in techniques that can be used to create precise and analyzable models. This thesis describes a technique for integrating analysis-level UML (Unified Modeling Language) Class Diagrams with the formal notation Object-Z.
- Date Issued
- 1997
- PURL
- http://purl.flvc.org/fcla/dt/15514
- Subject Headings
- UML (Computer science), Object-oriented methods (Computer science)
- Format
- Document (PDF)