Current Search: Software engineering
- Title
- Assessing Data Interoperability of UML Modeling Tools.
- Creator
- Gohel, Vaishali P., Huang, Shihong, Florida Atlantic University
- Abstract/Description
- In globalized software development environments, where development activities are distributed geographically and temporally, it is increasingly important for Computer Aided Software Engineering (CASE) tools to maintain the information (both syntactic and semantic) captured in design models. The Unified Modeling Language (UML) is the de facto standard for modeling software applications, and UML diagrams serve as graphical documentation of the software system. The interoperability of UML modeling tools is important for supporting model exchange and, in turn, design reuse. Tool interoperability is often implemented using XML Metadata Interchange (XMI). Unfortunately, fidelity of the design documentation is lost when transforming between UML and XMI, due to incompatibilities among different versions of UML and XMI and to add-on proprietary information, which hinders reuse. This thesis evaluates the interoperability of UML modeling tools by assessing the quality of the XMI documents representing the design. Case studies in this thesis demonstrate a framework for preserving the fidelity of a UML model's data when importing and exporting different UML models in a distributed, heterogeneous environment. (A minimal illustrative sketch of such a round-trip fidelity check follows this record.)
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012519
- Subject Headings
- UML (Computer science), Computer software--Development, Software engineering, Computer-aided design
- Format
- Document (PDF)
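The abstract above describes measuring fidelity loss across UML-to-XMI round trips. As a minimal sketch of what one such assessment could look like, assuming XMI files exported before and after a tool round trip (the file names, and the use of a `name` attribute as the element identifier, are illustrative assumptions, not the thesis's actual framework):

```python
# Sketch: diff the model-element inventories of an original XMI export and
# a round-tripped one to spot fidelity loss. Paths/attributes are placeholders.
from collections import Counter
import xml.etree.ElementTree as ET

def element_inventory(xmi_path):
    """Count model elements by (tag, name); namespace-qualified tags kept as-is."""
    root = ET.parse(xmi_path).getroot()
    return Counter((el.tag, el.get("name", "")) for el in root.iter())

def fidelity_report(original, round_tripped):
    before = element_inventory(original)
    after = element_inventory(round_tripped)
    lost = before - after      # elements dropped by the round trip
    added = after - before     # proprietary or tool-injected elements
    return lost, added

if __name__ == "__main__":
    lost, added = fidelity_report("model_toolA.xmi", "model_toolA_via_toolB.xmi")
    print(f"{sum(lost.values())} elements lost, {sum(added.values())} added")
```

Counter subtraction keeps only positive differences, so `lost` and `added` directly list the elements dropped or injected by the round trip.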
- Title
- Visualization of Impact Analysis on Configuration Management Data for Software Process Improvement.
- Creator
- Lo, Christopher Hoi-Yin, Huang, Shihong, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- The software development process is an incremental and iterative activity. Source code is constantly altered to reflect changing requirements, to respond to testing results, and to address problem reports. Proper software measurement that derives meaningful numeric values for some attributes of a software product or process can help in identifying problem areas and development bottlenecks. Impact analysis is the evaluation of the risks associated with change requests or problem reports, including estimates of effects on resources, effort, and schedule. This thesis presents a methodology called VITA for applying software analysis techniques to configuration management repository data with the aim of identifying the impact of change requests and problem reports on file changes. The repository data can be analyzed and visualized in a semi-automated manner according to user-selectable criteria. The approach is illustrated with a model problem concerning software process improvement of an embedded software system in the context of performing high-quality software maintenance. (A rough sketch of this style of repository analysis follows this record.)
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012535
- Subject Headings
- Software measurement, Software engineering--Quality control, Data mining--Quality control
- Format
- Document (PDF)
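As a rough, hypothetical illustration of the kind of repository analysis the VITA methodology automates (VITA's actual algorithms and visualizations are not reproduced here; the log format and IDs below are invented):

```python
# Sketch: link change-request / problem-report IDs to the files changed for
# them in the configuration management log, then rank files by change impact.
from collections import defaultdict

def impact_by_file(commits):
    """commits: iterable of (issue_id, [changed files]) pairs."""
    files_per_issue = defaultdict(set)
    for issue_id, files in commits:
        files_per_issue[issue_id].update(files)
    impact = defaultdict(int)
    for files in files_per_issue.values():
        for f in files:
            impact[f] += 1  # number of distinct change requests touching f
    return sorted(impact.items(), key=lambda kv: kv[1], reverse=True)

commits = [
    ("PR-101", ["scheduler.c", "timer.c"]),   # invented problem reports
    ("PR-102", ["scheduler.c"]),
    ("CR-7",   ["ui/panel.c", "scheduler.c"]),
]
for name, count in impact_by_file(commits):
    print(f"{name}: touched by {count} change requests / problem reports")
```

Files that keep reappearing across independent change requests are natural candidates for the "problem area" visualization the abstract describes.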
- Title
- A comparative study of attribute selection techniques for CBR-based software quality classification models.
- Creator
- Nguyen, Laurent Quoc Viet., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- To achieve high reliability in software-based systems, software metrics-based quality classification models have been explored in the literature. However, collecting software metrics can be a difficult and time-consuming process, and some metrics may be unhelpful or even harmful to the classification models, degrading the models' accuracy. Hence, methodologies have been developed to select the most significant metrics in order to build accurate and efficient classification models. Case-Based Reasoning (CBR) is the classification technique used in this thesis. Since it does not provide any metric selection mechanism, several metric selection techniques were studied. In the context of CBR, this thesis presents a comparative evaluation of metric selection methodologies for raw and discretized data. Three attribute selection techniques were studied: the Kolmogorov-Smirnov Two-Sample Test, the Kruskal-Wallis Test, and Information Gain. These techniques resulted in classification models that are useful for software quality improvement. (A hedged sketch of the three tests follows this record.)
- Date Issued
- 2002
- PURL
- http://purl.flvc.org/fcla/dt/12944
- Subject Headings
- Case-based reasoning, Software engineering, Computer software--Quality control
- Format
- Document (PDF)
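The three attribute selection techniques named above are standard statistical measures. A hedged sketch of applying them to a single software metric split by class (fault-prone vs. not), using SciPy for the two distribution tests and a median-split discretization for information gain (the data and the discretization choice are illustrative, not the thesis's setup):

```python
# Sketch: score one metric with the three selection techniques from the abstract.
import numpy as np
from scipy.stats import ks_2samp, kruskal

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(metric, labels):
    """Gain from a median split of the metric (one simple discretization)."""
    split = metric >= np.median(metric)
    h = entropy(labels)
    for side in (split, ~split):
        h -= side.mean() * entropy(labels[side]) if side.any() else 0.0
    return h

metric = np.array([3, 5, 2, 9, 12, 11, 1, 8, 10, 7], dtype=float)  # made-up metric
labels = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 0])                  # 1 = fault-prone
fp, nfp = metric[labels == 1], metric[labels == 0]
print("KS two-sample p-value:", ks_2samp(fp, nfp).pvalue)
print("Kruskal-Wallis p-value:", kruskal(fp, nfp).pvalue)
print("Information gain:", information_gain(metric, labels))
```

Metrics would be ranked by these scores, and only the top-ranked ones kept as inputs to the CBR classifier.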
- Title
- Fuzzy logic techniques for software reliability engineering.
- Creator
- Xu, Zhiwei., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- People are becoming more and more dependent on computers in their daily lives. Most industries, from automobile, avionics, oil, and telecommunications to banking, stocks, and pharmaceuticals, require computers to function. As the tasks required become more complex, the complexity of computer software and hardware has increased dramatically. As a consequence, the possibility of failure increases. As the requirements for and dependence on computers increase, so does the possibility of crises caused by computer failures. High reliability is an important attribute for almost any software system. Consequently, software developers are seeking ways to forecast and improve quality before release. Since many quality factors cannot be measured until after the software becomes operational, software quality models are developed to predict quality factors based on measurements collected earlier in the life cycle. Due to incomplete information in the early life cycle of software development, software quality models with fuzzy characteristics usually perform better, because fuzzy concepts deal with phenomena that are vague in nature. This study focuses on the use of fuzzy logic in software reliability engineering. The discussion includes fuzzy expert systems and their application to early risk assessment; interval prediction using fuzzy regression modeling; fuzzy rule extraction for fuzzy classification and its use in software quality models; and fuzzy identification, including the extraction of both rules and membership functions from fuzzy data and the application of the technique to software project cost estimation. The following methodologies were considered: nonparametric discriminant analysis, Z-test and paired t-test, neural networks, fuzzy linear regression, fuzzy nonlinear regression, fuzzy classification with the maximum matched method, fuzzy identification with fuzzy clustering, and fuzzy projection. Commercial software systems and the COCOMO database are used throughout this dissertation to demonstrate the usefulness of the concepts and to validate new ideas. (An illustrative fuzzy-membership sketch follows this record.)
- Date Issued
- 2001
- PURL
- http://purl.flvc.org/fcla/dt/11948
- Subject Headings
- Software engineering, Fuzzy logic, Computer software--Quality control, Fuzzy systems
- Format
- Document (PDF)
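As a toy illustration of the fuzzy-expert-system idea for early risk assessment mentioned above, the sketch below maps a vague early complexity estimate onto overlapping linguistic sets; the membership breakpoints and the rule base are invented for illustration only:

```python
# Sketch: triangular fuzzy memberships over an early complexity estimate,
# with Mamdani-style rules of the form "IF complexity IS high THEN risk IS high".
def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def early_risk(complexity):
    # Overlapping linguistic sets over a 0-100 complexity estimate (invented).
    return {
        "low-risk":    tri(complexity, -1, 25, 50),
        "medium-risk": tri(complexity, 30, 50, 70),
        "high-risk":   tri(complexity, 50, 75, 101),
    }

print(early_risk(62))  # partial membership in both medium- and high-risk
```

The point of the fuzzy formulation is visible even in this toy: an estimate of 62 is simultaneously somewhat medium-risk and somewhat high-risk, rather than being forced into one crisp bucket from incomplete early-life-cycle data.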
- Title
- Software decomposition for multicore architectures.
- Creator
- Jain, Ankit., Florida Atlantic University, Shankar, Ravi
- Abstract/Description
- Current multicore processors attempt to optimize the consumer experience via task partitioning and concurrent execution of these (sub)tasks on the cores. Conversion of sequential code to parallel and concurrent code is neither easy, nor feasible with current methodologies. We have developed a mapping process that synergistically uses top-down and bottom-up methodologies. This process is amenable to automation. We use bottom-up analysis to determine decomposability and estimate computation and communication metrics. The outcome is a set of proposals for software decomposition. We then build abstract concurrent models that map these decomposed (abstract) software modules onto candidate multicore architectures; this resolves concurrency issues. We then perform a system-level simulation to estimate concurrency gain and/or cost, and QoS (Quality-of-Service) metrics. Different architectural combinations yield different QoS metrics; the requisite system architecture may then be chosen. We applied this 'middle-out' methodology to optimally map a digital camera application onto a processor with four cores.
- Date Issued
- 2006
- PURL
- http://purl.flvc.org/fcla/dt/13349
- Subject Headings
- Optimal designs (Statistics), Software architecture, Software engineering, Computer architecture, System design, Computer networks--Security measures
- Format
- Document (PDF)
- Title
- Software reliability engineering with genetic programming.
- Creator
- Liu, Yi., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Software reliability engineering plays a vital role in managing and controlling software quality. As an important method of software reliability engineering, software quality estimation modeling is useful in defining a cost-effective strategy to achieve a reliable software system. By predicting the faults in a software system, software quality models can identify high-risk modules, and thus these high-risk modules can be targeted for reliability enhancements. Strictly speaking, software quality modeling not only aims at lowering the misclassification rate, but also takes into account the costs of different misclassifications and the available resources of a project. As a new search-based algorithm, Genetic Programming (GP) can build a model without assuming the size, shape, or structure of the model. It can flexibly tailor the fitness functions to the objectives chosen by the customers. Moreover, it can optimize several objectives simultaneously in the modeling process, and thus a set of multi-objective optimization solutions can be obtained. This research focuses on building software quality estimation models using GP. Several GP-based models for predicting the class membership of each software module and ranking the modules by a quality factor were proposed. The first model, categorizing the modules into fault-prone or not fault-prone, was proposed by considering the distinguishing features of the software quality classification task and of GP. The second model provided quality-based ranking information for fault-prone modules. A decision tree-based software classification model was also proposed by considering accuracy and simplicity simultaneously. This new technique provides a new multi-objective optimization algorithm to build decision trees for real-world engineering problems, in which several trade-off objectives usually have to be taken into account at the same time. The fourth model was built to find multi-objective optimization solutions by considering both the expected cost of misclassification and the available resources. Also, a new goal-oriented technique for building module-order models was proposed by directly optimizing several goals chosen by project analysts. The GP issues of bloat and overfitting were also addressed in our research. Data were collected from three industrial projects and used to validate the performance of the models. Results indicate that the proposed methods achieve useful performance. Moreover, some proposed methods can simultaneously optimize several different objectives of a software project management team. (A toy sketch of a cost-sensitive fitness function follows this record.)
- Date Issued
- 2003
- PURL
- http://purl.flvc.org/fau/fd/FADT12047
- Subject Headings
- Computer software--Quality control, Genetic programming (Computer science), Software engineering
- Format
- Document (PDF)
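A central point of the abstract is that GP fitness functions can encode project-specific misclassification costs rather than raw accuracy. The sketch below shows only such a fitness function scoring one candidate classifier; the GP search loop itself, the metric vectors, and the cost values are omitted or invented:

```python
# Sketch: cost-sensitive fitness for a candidate fault-proneness classifier.
def cost_sensitive_fitness(candidate, modules, c_fp=1.0, c_fn=10.0):
    """Lower is better: weighted sum of the two misclassification types.

    candidate: callable mapping a module's metric vector to 0/1
    modules:   iterable of (metrics, actual_label) with 1 = fault-prone
    """
    cost = 0.0
    for metrics, actual in modules:
        predicted = candidate(metrics)
        if predicted == 1 and actual == 0:
            cost += c_fp            # false positive: wasted review effort
        elif predicted == 0 and actual == 1:
            cost += c_fn            # false negative: fault escapes to the field
    return cost

# Example: a trivial threshold "program" of the kind GP might evolve (invented).
def threshold_rule(m):
    return 1 if m[0] + m[1] > 8 else 0

modules = [((12, 3), 1), ((2, 1), 0), ((5, 2), 1), ((1, 0), 0)]
print(cost_sensitive_fitness(threshold_rule, modules))  # one missed fault: 10.0
```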
- Title
- Models and Implementations of Online Laboratories: A Definition of a Standard Architecture to Integrate Distributed Remote Experiments.
- Creator
- Zapata Rivera, Luis Felipe, Larrondo Petrie, Maria M., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Hands-on laboratory experiences are a key part of all engineering programs. Currently there is high demand for online engineering courses, but offering lab experiences online still remains a great challenge. Remote laboratories have been under development for more than 20 years and are part of a bigger category, called online laboratories, which also includes virtual laboratories. Development of remote laboratories in academic settings has been held back by the lack of standardization of technology, processes, and operation, and of their integration with formal educational environments. Remote laboratories can be used in educational settings for a variety of reasons: for instance, when the equipment is not available in the physical laboratory; when the physical laboratory space available is not sufficient to either set up the experiments or permit access to all on-site students in the course; or when the teacher needs to provide online laboratory experiences to students taking courses via distance education. This dissertation proposes a new approach for the development and deployment of online laboratories over online platforms. The research activities performed include: the design and implementation of an architecture of a system for Smart Adaptive Remote Laboratories (SARL), integrated with educational environments, that improves the remote laboratory user experience through a modular architecture and the use of context information about users and laboratory activities; the design pattern and implementation for the Remote Laboratory Management System (RLMS); the definition and implementation of an xAPI-based activity tracking system for online laboratories with support for both centralized and distributed architectures of Learning Record Stores (LRS); the definition of a Smart Laboratory Learning Object (SLLO) capable of being integrated into different educational environments, including the implementation of a Lab Authoring module; and finally, the definition of a reliability model to detect and report failures, their possible causes, and countermeasures, applying rule-based systems. The proposed architecture complies with the recently approved IEEE 1876 Standard for Networked Smart Learning for Online Laboratories and supports virtual, remote, hybrid and mobile laboratories. A full set of low-cost online laboratory experiment stations was designed and implemented to support the Introduction to Logic Design course, providing true hands-on lab experience to students through a low-cost, student-built mobile laboratory platform connected via USB to the SARL system. The SARL prototype has been successfully integrated with a Virtual Learning Environment (VLE), and a variety of configurations that can support the privacy and security requirements of different stakeholders have been tested. The prototype online laboratory experiments developed have contributed to and been featured in the IEEE 1876 standard, and have been integrated into an Industry Connections Actionable Data Book (ADB) that was featured at the Frankfurt Book Fair in 2017. SARL is being developed as the infrastructure to support a Latin American and Caribbean network of online laboratories. (A sketch of an xAPI activity statement follows this record.)
- Date Issued
- 2019
- PURL
- http://purl.flvc.org/fau/fd/FA00013282
- Subject Headings
- Remote laboratories, Online laboratories, Engineering Education, Software architecture
- Format
- Document (PDF)
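The xAPI-based activity tracking mentioned above records each lab event as a statement sent to a Learning Record Store. A sketch of one such statement, assuming the third-party `requests` library; the LRS endpoint, credentials, and activity IDs are placeholders, not part of the actual SARL system:

```python
# Sketch: record a completed lab experiment as an xAPI statement in an LRS.
import requests

statement = {
    "actor": {"mbox": "mailto:student@example.edu", "name": "Student A"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.edu/labs/logic-design/experiment-3",
        "definition": {"name": {"en-US": "Logic Design Experiment 3"}},
    },
}

resp = requests.post(
    "https://lrs.example.edu/xapi/statements",   # placeholder LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),           # placeholder credentials
)
print(resp.status_code)
```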
- Title
- Modular software design methodology in a social context: Its use on large projects.
- Creator
- Hayes, William D., Florida Atlantic University, Coulter, Neal S.
- Abstract/Description
- The history of software development reflects a continuous series of problems, crises and triumphs in the development of reliable software systems. Problems with comprehension of machine language led to assemblers and high level languages, and eventually to the discipline of structured programming. Problems with program and system size led to modularity and modular design. None of these solutions proved to be final because aspirations have risen along with competence. This thesis makes the argument that the increasing size of projects, in terms of their complexity and the numbers of persons required to bring them to fruition, gives rise to a set of problems caused by the social interaction of those persons. This social context is investigated. It is argued that solutions ignoring this social context are inadequate for solving the software crisis brought on by the increasing demand for larger software systems.
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14652
- Subject Headings
- Computer programming management, Software engineering--Management, Computer programmers
- Format
- Document (PDF)
- Title
- Software reliability engineering: An evolutionary neural network approach.
- Creator
- Hochman, Robert., Florida Atlantic University, Khoshgoftaar, Taghi M.
- Abstract/Description
- This thesis presents the results of an empirical investigation of the applicability of genetic algorithms to a real-world problem in software reliability: the fault-prone module identification problem. The solution developed is an effective hybrid of genetic algorithms and neural networks. This approach, evolutionary neural networks (ENNs), was found to be superior, in terms of time, effort, and confidence in the optimality of results, to the common practice of searching manually for the best-performing net. Comparisons were made to discriminant analysis. On fault-prone, not-fault-prone, and overall classification, the lower error proportions for ENNs were found to be statistically significant. The robustness of ENNs follows from their superior performance over many data configurations. Given these encouraging results, it is suggested that ENNs have potential value in other software reliability problem domains, where genetic algorithms have been largely ignored. For future research, several plans are outlined for enhancing ENNs with respect to accuracy and applicability. (A toy sketch of the GA-over-configurations idea follows this record.)
- Date Issued
- 1997
- PURL
- http://purl.flvc.org/fcla/dt/15474
- Subject Headings
- Neural networks (Computer science), Software engineering, Genetic algorithms
- Format
- Document (PDF)
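As a toy sketch of the ENN idea, a genetic algorithm searches neural-network configurations instead of a human tuning them by hand. The "train and score" step is stubbed out here; in the thesis it would train a net on software-metrics data and return classification error:

```python
# Sketch: GA over (hidden units, learning rate) configurations; scoring stubbed.
import random

random.seed(42)

def score(config):                      # stub: lower is better (invented optimum)
    hidden, lr = config
    return abs(hidden - 12) + abs(lr - 0.05) * 100

def mutate(config):
    hidden, lr = config
    return (max(1, hidden + random.choice([-2, -1, 1, 2])),
            max(0.001, lr + random.uniform(-0.02, 0.02)))

population = [(random.randint(1, 32), random.uniform(0.001, 0.2))
              for _ in range(20)]
for generation in range(30):
    population.sort(key=score)
    parents = population[:5]            # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best config (hidden units, learning rate):", min(population, key=score))
```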
- Title
- Development of a Mobile Mapping System for Road Corridor Mapping.
- Creator
- Sairam, Nivedita, Nagarajan, Sudhagar, Florida Atlantic University, College of Engineering and Computer Science, Department of Civil, Environmental and Geomatics Engineering
- Abstract/Description
- In any infrastructure project, managing the built assets is an important task. In the case of transportation asset inventories, significant cost and effort are spent on recording and storing the asset information. In order to reduce the time and cost involved in road corridor mapping, this thesis proposes a low-cost MMS (Mobile Mapping System) equipped with a laser scanner and cameras. The process of building the MMS, the components and sensors involved, and the calibration procedures are discussed. The efficiency of this Mobile Mapping System is evaluated by mounting it on a truck and a golf cart. The thesis also provides a framework to extract road assets both automatically and manually using state-of-the-art techniques. The efficiency of this method is compared with traditional field survey methods. The quality of collected data, data integrity, and process flow are evaluated with a sample asset management framework and a spatial database structure for mapping road corridor features.
- Date Issued
- 2016
- PURL
- http://purl.flvc.org/fau/fd/FA00004629
- Subject Headings
- Transportation engineering, Electronics in engineering, Geographic information systems--Software, Internetworking (Telecommunication), Geospatial data
- Format
- Document (PDF)
- Title
- Analyzing software repository data to synthesize and visualize relationships between development artifacts.
- Creator
- Mulcahy, James J., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- As computing technology continues to advance, it has become increasingly difficult to find businesses that do not rely, at least in part, upon the collection and analysis of data for the purpose of project management and process improvement. The cost of software tends to increase over time due to its complexity and the cost of employing humans to develop, maintain, and evolve it. To help control the costs, organizations often seek to improve the process by which software systems are developed and evolved. Improvements can be realized by discovering previously unknown or hidden relationships between the artifacts generated as a result of developing a software system. The objective of the work described in this thesis is to provide a visualization tool that helps managers and engineers better plan for future projects by discovering new knowledge gained by synthesizing and visualizing data mined from software repository records from previous projects.
- Date Issued
- 2011
- PURL
- http://purl.flvc.org/FAU/3333053
- Subject Headings
- Data mining, Mathematical models, Software engineering, Information visualization, Data processing, Application software, Development, Object-oriented programming (Computer science)
- Format
- Document (PDF)
- Title
- CBR-based software quality models and quality of data.
- Creator
- Xiao, Yudong., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- The performance accuracy of software quality estimation models is influenced by several factors, including the following two important factors: performance of the prediction algorithm and the quality of data. This dissertation addresses these two factors, and consists of two components: (1) a proposed genetic algorithm (GA) based optimization of software quality models for accuracy enhancement, and (2) a proposed partitioning- and rule-based filter (PRBF) for noise detection toward improvement of data quality. We construct a generalized framework of our embedded GA-optimizer, and instantiate the GA-optimizer for three optimization problems in software quality engineering: parameter optimization for case-based reasoning (CBR) models; module rank optimization for module-order modeling (MOM); and structural optimization for our multi-strategy classification modeling approach, denoted RB2CBL. Empirical case studies using software measurement data from real-world software systems were performed for the optimization problems. The GA-optimization approaches improved software quality prediction accuracy, highlighting the practical benefits of using GA for solving optimization problems in software engineering. The proposed noise detection approach, PRBF, was empirically evaluated using data categorized into two classes. Empirical studies on artificially corrupted datasets and datasets with known (natural) noise demonstrated that PRBF can effectively detect both artificial and natural noise. The proposed filter is a stable and robust technique, and always provided optimal or near-optimal noise detection results. In addition, it is applicable on datasets with nominal and numerical attributes, as well as those with missing values. The PRBF technique supports two methods of noise detection: class noise detection and cost-sensitive noise detection. The former is an easy-to-use method and does not need parameter settings, while the latter is suited for applications where each class has a specific misclassification cost. PRBF can also be used iteratively to investigate the two general types of data noise: attribute and class noise.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/12141
- Subject Headings
- Computer software--Quality control, Genetic programming (Computer science), Software engineering, Case-based reasoning, Combinatorial optimization, Computer network architecture
- Format
- Document (PDF)
- Title
- Ensemble-classifier approach to noise elimination: A case study in software quality classification.
- Creator
- Joshi, Vedang H., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- This thesis presents a noise handling technique that attempts to improve the quality of training data for classification purposes by eliminating instances that are likely to be noise. Our approach uses twenty-five different classification techniques to create an ensemble of classifiers that acts as a noise filter on real-world software measurement datasets. Using a relatively large number of base-level classifiers for the ensemble-classifier filter facilitates achieving the desired level of noise-removal conservativeness with several possible levels of filtering. It also provides a higher degree of confidence in the noise elimination procedure, as the results are less likely to be influenced by the (possibly) inappropriate learning bias of a few algorithms with twenty-five base-level classifiers than with a relatively smaller number of base-level classifiers. Empirical case studies of two different high-assurance software projects demonstrate the effectiveness of our noise elimination approach through the significant improvement achieved in classification accuracies at various levels of filtering. (A shrunken sketch of such an ensemble filter follows this record.)
- Date Issued
- 2004
- PURL
- http://purl.flvc.org/fcla/dt/13144
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Case studies, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)
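A sketch of the ensemble filter described above, shrunk from the thesis's twenty-five base-level classifiers to three for brevity and run on synthetic data with scikit-learn; cross-validated predictions stand in for whatever training protocol the thesis actually used:

```python
# Sketch: flag training instances that most of an ensemble misclassifies.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=200, random_state=0)
ensemble = [DecisionTreeClassifier(random_state=0), GaussianNB(),
            KNeighborsClassifier()]

votes = np.stack([cross_val_predict(clf, X, y, cv=5) != y
                  for clf in ensemble])      # True = this learner missed it
miss_fraction = votes.mean(axis=0)

# Consensus filtering (conservative): drop only unanimously missed instances;
# lowering the threshold toward 0.5 gives majority filtering (aggressive).
noisy = miss_fraction >= 1.0
print(f"flagged {noisy.sum()} of {len(y)} instances as likely noise")
```

Varying the vote threshold is exactly the "several possible levels of filtering" knob the abstract mentions, trading conservativeness against the amount of data removed.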
- Title
- Correcting noisy data and expert analysis of the correction process.
- Creator
- Seiffert, Christopher N., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- This thesis expands upon an existing noise cleansing technique, polishing, enabling it to be used in the software quality prediction domain, as well as any other domain where the data contain continuous values, as opposed to the categorical data for which the technique was originally designed. The procedure is applied to a real-world dataset with real (as opposed to injected) noise as determined by an expert in the domain. This, in combination with expert assessment of the changes made to the data, provides not only a more realistic dataset than one in which the noise (or even the entire dataset) is artificial, but also a better understanding of whether the procedure is successful in cleansing the data. Lastly, this thesis provides a more in-depth view of the process than previously available, in that it gives results for different parameters and classifier-building techniques. This allows the reader to gain a better understanding of the significance of both model generation and parameter selection. (A rough sketch of polishing for continuous data follows this record.)
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13223
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Computer programs, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)
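A rough sketch of polishing extended to continuous data, as described above: each attribute is predicted from the remaining attributes, and a value is replaced only when observation and prediction disagree badly. The linear regressor and the 2-sigma disagreement threshold are assumptions for illustration, not the thesis's choices:

```python
# Sketch: "polish" continuous attributes by cross-predicting each column.
import numpy as np
from sklearn.linear_model import LinearRegression

def polish(X, threshold=2.0):
    X = X.astype(float).copy()
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        model = LinearRegression().fit(others, X[:, j])
        predicted = model.predict(others)
        residual = X[:, j] - predicted
        suspicious = np.abs(residual) > threshold * residual.std()
        X[suspicious, j] = predicted[suspicious]   # replace likely-noisy values
    return X

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 4))
data[:, 3] = data[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=100)
data[5, 3] += 8.0                                   # inject one noisy value
print("corrected rows:", np.flatnonzero((polish(data) != data).any(axis=1)))
```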
- Title
- A procedure for evaluation and selection of computer-aided software engineering (CASE) tools for analysis and design.
- Creator
- Phillips, Steven David., Florida Atlantic University, Levow, Roy B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Due to the relative youth of the computer-aided software engineering (CASE) market and the lack of standards, evaluation of CASE tools is a difficult problem. This problem is made more difficult by the fact that no single CASE tool is able to satisfy the needs of all potential users. In addition, an incorrect choice is expensive in terms of money and time invested. In this thesis, the literature is surveyed and synthesized to produce procedures and criteria to be used in the evaluation and selection of CASE tools intended for the analysis and design phases of the software development life cycle.
- Date Issued
- 1991
- PURL
- http://purl.flvc.org/fcla/dt/14701
- Subject Headings
- Electronic data processing--Structured techniques, System analysis, Computer software--Development, Computer-aided software engineering
- Format
- Document (PDF)
- Title
- Fault tolerance and reliability patterns.
- Creator
- Buckley, Ingrid A., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- The need to achieve dependability in critical infrastructures has become indispensable for government and commercial enterprises. This need has become more necessary with the proliferation of malicious attacks on critical systems, such as healthcare, aerospace and airline applications. Additionally, due to the widespread use of web services in critical systems, the need to ensure their reliability is paramount. We believe that patterns can be used to achieve dependability. We conducted a survey of fault tolerance, reliability and web service products and patterns to better understand them. One objective of our survey is to evaluate the state of these patterns, and to investigate which standards are being used in products and their tool support. Our survey found that these patterns are insufficient, and many web services products do not use them. In light of this, we wrote some fault tolerance and web services reliability patterns and present an analysis of them.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166447
- Subject Headings
- Fault-tolerant computing, Computer software, Reliability, Reliability (Engineering), Computer programs
- Format
- Document (PDF)
- Title
- Towards a methodology for building reliable systems.
- Creator
- Buckley, Ingrid A., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Reliability is a key system characteristic that is an increasing concern for current systems. Greater reliability is necessary due to the new ways in which services are delivered to the public. Services are used by many industries, including health care, government, telecommunications, tools, and products. We have defined an approach to incorporate reliability along the stages of system development. We first did a survey of existing dependability patterns to evaluate their possible use in this methodology. We have defined a systematic methodology that helps the designer apply reliability in all steps of the development life cycle in the form of patterns. A systematic failure enumeration process to define corresponding countermeasures was proposed as a guideline to define where reliability is needed. We introduced the idea of failure patterns which show how failures manifest and propagate in a system. We also looked at how to combine reliability and security. Finally, we defined an approach to certify the level of reliability of an implemented web service. All these steps lead towards a complete methodology.
- Date Issued
- 2012
- PURL
- http://purl.flvc.org/FAU/3342037
- Subject Headings
- Computer software, Reliability, Reliability (Engineering), Computer programs, Fault-tolerant computing
- Format
- Document (PDF)
- Title
- A pattern-driven process for secure service-oriented applications.
- Creator
- Delessy, Nelly A., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- During the last few years, Service-Oriented Architecture (SOA) has been considered to be the new phase in the evolution of distributed enterprise applications. Even though there is common acceptance of this concept, a real problem hinders the widespread use of SOA: a methodology to design and build secure service-oriented applications is needed. In this dissertation, we design a novel process to secure service-oriented applications. Our contribution is original not only because it applies the MDA approach to the design of service-oriented applications but also because it allows their securing by dynamically applying security patterns throughout the whole process. Security patterns capture security knowledge and describe security mechanisms. In our process, we present a structured map of security patterns for SOA and web services and its corresponding catalog. At the different steps of a software lifecycle, the architect or designer needs to make some security decisions. An approach using a decision tree made of security pattern nodes is proposed to help in making these choices. We show how to extract a decision tree from our map of security patterns. Model-Driven Architecture (MDA) is an approach which promotes the systematic use of models during a system's development lifecycle. In the dissertation we describe a chain of transformations necessary to obtain secure models of the service-oriented application. A main benefit of this process is that it decouples the application domain expertise from the security expertise, both of which are needed to build a secure application. Security knowledge is captured by pre-defined security patterns, their selection is made easier by using the decision trees, and their application can be automated. A consequence is that the inclusion of security during the software development process becomes more convenient for the architects/designers. A second benefit is that the insertion of security is semi-automated and traceable. Thus, the process is flexible and can easily adapt to changing requirements. Given that SOA was developed in order to provide enterprises with modular, reusable and adaptable architectures, but that security was the principal factor that hindered its use, we believe that our process can act as an enabler for service-oriented applications. (A minimal sketch of the decision-tree idea follows this record.)
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/58003
- Subject Headings
- Computer network architectures, Web servers, Management, Software engineering, Expert systems (Computer science)
- Format
- Document (PDF)
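As a minimal sketch of the dissertation's decision-tree idea: interior nodes ask a design question, and leaves recommend security patterns from the catalog. The questions and pattern names below are illustrative only, not the actual map of security patterns:

```python
# Sketch: a decision tree whose leaves recommend security patterns.
class Node:
    def __init__(self, question=None, yes=None, no=None, patterns=()):
        self.question, self.yes, self.no, self.patterns = question, yes, no, patterns

tree = Node(
    question="Do services cross trust boundaries?",
    yes=Node(
        question="Is message-level confidentiality required?",
        yes=Node(patterns=("XML Encryption", "WS-Security")),
        no=Node(patterns=("Transport-layer security",)),
    ),
    no=Node(patterns=("Role-Based Access Control",)),
)

def recommend(node, answers):
    """Walk the tree using the designer's answers; return patterns at the leaf."""
    while node.question is not None:
        node = node.yes if answers[node.question] else node.no
    return node.patterns

answers = {"Do services cross trust boundaries?": True,
           "Is message-level confidentiality required?": True}
print(recommend(tree, answers))
```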
- Title
- A Comparison of Model Checking Tools for Service Oriented Architectures.
- Creator
- Venkat, Raghava, Khoshgoftaar, Taghi M., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Recently, most research pertaining to Service-Oriented Architecture (SOA) has been based on web services and how secure they are in terms of efficiency and effectiveness. This requires validation, verification, and evaluation of web services. Verification and validation should be collaborative when web services from different vendors are integrated together to carry out a coherent task. For this purpose, novel model checking technologies have been devised and applied to web services. Model checking is a promising technique for the verification and validation of software systems. WS-BPEL (Business Process Execution Language for Web Services) is an emerging standard language to describe web service composition behavior. The advanced features of BPEL, such as concurrency and hierarchy, make it challenging to verify BPEL models. Based on these factors, this thesis surveys a few important model checking technologies (tools) and compares them based on their "functional" and "non-functional" properties. The comparison is based on three case studies (small, medium, and large, respectively), for each of which we construct a synthetic web service composition (as there are not many publicly available compositions [1]). The first case study, the small case, is the "Enhanced Loan Approval Process". The second, of medium size, is the "Enhanced Purchase Order Process". The third and largest, the "Service Oriented Architecture Implementing BOINC Workflow", is based on a scientific workflow pattern and on the BOINC (Berkeley Open Infrastructure for Network Computing) architecture.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012565
- Subject Headings
- Computer network architectures, Expert systems (Computer science), Software engineering, Web servers--Management
- Format
- Document (PDF)
- Title
- Model-Driven Architecture and the Secure Systems Methodology.
- Creator
- Morrison, Patrick, Fernandez, Eduardo B., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- As a companion and complement to the work being done to build a secure systems methodology, this thesis evaluates the use of Model-Driven Architecture (MDA) in support of the methodology's lifecycle. The development lifecycle illustrated follows the recommendations of this secure systems methodology, while using MDA models to represent requirements, analysis, design, and implementation information. In order to evaluate MDA, we analyze a well-understood distributed systems security problem, remote access, as illustrated by the internet "secure shell" protocol, ssh. By observing the ability of MDA models and transformations to specify remote access in each lifecycle phase, MDA's strengths and weaknesses can be evaluated in this context. A further aim of this work is to extract concepts that can be contained in an MDA security metamodel for use in future projects.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012537
- Subject Headings
- Expert systems (Computer science), Software engineering, Computer-aided design, Computer network architectures
- Format
- Document (PDF)