Current Search: Computer engineering
- Title
- A METHOD FOR AUTOMATICALLY GENERATING AND ANIMATING THREE-DIMENSIONAL MODELS OF PLANAR LINKAGES.
- Creator
- KEIL, MITCHEL JASON., Florida Atlantic University, Myklebust, Arvid, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
This study presents a method for automatically generating and animating 3-D models of planar linkages. A computer program called ANIMEC is introduced, which serves as a link between two existing programs, KAPCA and MOVIE. The algorithms of ANIMEC are described in detail and a program listing is provided. (A kinematic position-analysis sketch follows this record.)
- Date Issued
- 1984
- PURL
- http://purl.flvc.org/fcla/dt/14220
- Subject Headings
- Computer engineering, Computer graphics
- Format
- Document (PDF)
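The thesis documents ANIMEC's algorithms in full; as a flavor of the per-frame kinematics any such linkage animator must solve, here is a minimal Python sketch. This is not ANIMEC itself, and the link lengths are invented; it sweeps a four-bar crank and solves the rocker angle in closed form via the Freudenstein equation.

```python
import math

def fourbar_theta4(a, b, c, d, th2, branch=-1):
    """Closed-form rocker angle of a planar four-bar linkage
    (Freudenstein equation): a=crank, b=coupler, c=rocker, d=ground."""
    K1, K2 = d / a, d / c
    K3 = (a*a - b*b + c*c + d*d) / (2 * a * c)
    A = math.cos(th2) - K1 - K2 * math.cos(th2) + K3
    B = -2 * math.sin(th2)
    C = K1 - (K2 + 1) * math.cos(th2) + K3
    disc = B*B - 4*A*C
    if disc < 0:
        return None                      # crank angle not reachable
    return 2 * math.atan2(-B + branch * math.sqrt(disc), 2 * A)

# Sweep the crank through one revolution to generate animation frames.
a, b, c, d = 2.0, 7.0, 9.0, 6.0          # invented Grashof crank-rocker
for i in range(0, 360, 30):
    th2 = math.radians(i)
    th4 = fourbar_theta4(a, b, c, d, th2)
    print(f"crank {i:3d} deg -> rocker {math.degrees(th4):7.2f} deg")
```

Each printed pair corresponds to one animation frame; a program like ANIMEC would pass the solved joint angles to the 3-D modeler for rendering.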
- Title
- Visions of engineering in the new global economy.
- Creator
- Furht, Borko
- Date Issued
- 2007-09
- PURL
- http://purl.flvc.org/fcla/dt/332869
- Subject Headings
Technological innovations, Computer engineering, Computer engineering--Research, Computer science--Study and teaching
- Format
- Document (PDF)
- Title
- Impact of best manufacturing engineering practices on software engineering practices.
- Creator
- Akhter, Shahina., Florida Atlantic University, Coulter, Neal S.
- Abstract/Description
-
This thesis involves original research in the area of semantic analysis of textual databases (content analysis). The main intention of this study is to examine how software engineering practices can benefit from the best manufacturing practices. There is a deliberate focus and emphasis on competitive effectiveness worldwide. The ultimate goal of the U.S. Navy's Best Manufacturing Practices Program is to strengthen the U.S. industrial base and reduce the cost of defense systems by solving manufacturing problems and improving quality and reliability. Best manufacturing practices can assist software engineering in that software companies which adopt them can: (1) improve both software quality and staff productivity; (2) determine the current status of the organization's software process; (3) set goals for process improvement; (4) create effective plans for reaching those goals; and (5) implement the major elements of those plans. (A toy content-analysis sketch follows this record.)
- Date Issued
- 1997
- PURL
- http://purl.flvc.org/fcla/dt/15445
- Subject Headings
Software engineering, Production engineering, Production management--Computer software
- Format
- Document (PDF)
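The record above centers on content analysis of textual databases. As a toy illustration only (the thesis's corpus and coding scheme are far richer, and the sentences below are invented), a term-frequency pass over a small corpus looks like this:

```python
from collections import Counter
import re

# Invented mini-corpus standing in for a textual database of practices.
corpus = [
    "reduce cost and improve quality through process improvement",
    "set goals for process improvement and measure productivity",
    "improve software quality and staff productivity",
]

# Count words longer than three letters to skip trivial stopwords.
counts = Counter(w for doc in corpus
                   for w in re.findall(r"[a-z]+", doc.lower())
                   if len(w) > 3)
print(counts.most_common(5))   # dominant themes: quality, improvement, ...
```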
- Title
- Detection of change-prone telecommunications software modules.
- Creator
- Weir, Ronald Eugene., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Accurately classifying the quality of software is a major problem in any software development project. Software engineers develop models that provide early estimates of quality metrics, which allow them to take action against emerging quality problems. The use of a neural network as a tool to classify programs as a low, medium, or high risk for errors or change is explored using multiple software metrics as input. It is demonstrated that a neural network, trained using the back-propagation supervised learning strategy, produced the desired mapping between the static software metrics and the software quality classes. The neural network classification methodology is compared to the discriminant analysis classification methodology in this experiment. The comparison is based on two- and three-class predictive models developed using variables resulting from principal component analysis of software metrics. (A minimal sketch of such a metrics-to-class network follows this record.)
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/15183
- Subject Headings
- Computer software--Evaluation, Software engineering, Neural networks (Computer science)
- Format
- Document (PDF)
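For readers unfamiliar with the approach this abstract describes, here is a minimal sketch, assuming synthetic data and invented metric ranges, of a back-propagation network mapping static software metrics to low/medium/high risk classes. scikit-learn's MLPClassifier stands in for the thesis's actual network and data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical static metrics per module:
# [lines of code, cyclomatic complexity, fan-out]
X = rng.uniform([50, 1, 1], [2000, 60, 30], size=(300, 3))
score = X[:, 0] / 2000 + X[:, 1] / 60 + X[:, 2] / 30   # crude risk score
y = np.digitize(score, [1.0, 2.0])                     # 0=low 1=medium 2=high

# Back-propagation net; scaling keeps the sgd training stable.
net = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), solver="sgd",
                  learning_rate_init=0.05, max_iter=3000, random_state=0))
net.fit(X, y)
print(f"training accuracy: {net.score(X, y):.2f}")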
- Title
- Modeling software quality with TREEDISC algorithm.
- Creator
- Yuan, Xiaojing, Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Software quality is crucial both to software makers and customers. However, in reality, improvement of quality and reduction of costs are often at odds. Software modeling can help us to detect fault-prone software modules based on software metrics, so that we can focus our limited resources on fewer modules and lower the cost while still achieving high quality. In the present study, a tree classification modeling technique, TREEDISC, was applied to three case studies. Several major contributions have been made. First, preprocessing of raw data was adopted to solve the computer memory problem and improve the models. Secondly, TREEDISC was thoroughly explored by examining the roles of important parameters in modeling. Thirdly, a generalized classification rule was introduced to balance misclassification rates and decrease Type II errors, which are considered more costly than Type I errors. Fourthly, certainty of classification was addressed. Fifthly, TREEDISC modeling was validated over multiple releases of the software product. (A sketch of the cost-balancing idea follows this record.)
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/15718
- Subject Headings
- Computer software--Quality control, Computer simulation, Software engineering
- Format
- Document (PDF)
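TREEDISC itself is a chi-squared (CHAID-style) tree algorithm in SAS; the sketch below uses scikit-learn's CART tree merely to illustrate the abstract's cost-balancing idea, with invented data and weights: weighting the fault-prone class trades more Type I errors (false alarms) for fewer, costlier Type II errors (missed fault-prone modules).

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))                 # hypothetical module metrics
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.8, size=500) > 1.5).astype(int)

for w in (1, 5, 10):                          # cost ratio for fault-prone class
    tree = DecisionTreeClassifier(class_weight={0: 1, 1: w},
                                  min_samples_leaf=20, random_state=0)
    pred = tree.fit(X, y).predict(X)
    type1 = np.mean(pred[y == 0] == 1)        # false alarms
    type2 = np.mean(pred[y == 1] == 0)        # missed fault-prone modules
    print(f"weight {w:2d}: Type I {type1:.2f}  Type II {type2:.2f}")
```

Raising the weight shifts errors from Type II to Type I, which is exactly the trade a generalized classification rule makes.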
- Title
- Modular software design methodology in a social context: Its use on large projects.
- Creator
- Hayes, William D., Florida Atlantic University, Coulter, Neal S.
- Abstract/Description
-
The history of software development reflects a continuous series of problems, crises and triumphs in the development of reliable software systems. Problems with comprehension of machine language led to assemblers and high level languages, and eventually to the discipline of structured programming. Problems with program and system size led to modularity and modular design. None of these solutions proved to be final because aspirations have risen along with competence. This thesis makes the argument that the increasing size of projects, in terms of their complexity and the numbers of persons required to bring them to fruition, gives rise to a set of problems caused by the social interaction of those persons. This social context is investigated. It is argued that solutions ignoring this social context are inadequate for solving the software crisis brought on by the increasing demand for larger software systems.
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14652
- Subject Headings
- Computer programming management, Software engineering--Management, Computer programmers
- Format
- Document (PDF)
- Title
- Modeling fault-prone modules of subsystems.
- Creator
- Thaker, Vishal Kirit., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In software engineering, software quality has become a topic of major concern. It has also been recognized that the role of a maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is essential to successful maintenance management. With the growing collection of software in organizations, this cost is becoming substantial. In this research we compared two software quality models: we examined whether a model built on the entire system that predicts a subsystem, and a model built on a subsystem that predicts the same subsystem, yield similar, better, or worse classification results. We used the Classification And Regression Tree (CART) algorithm to build the classification models. The case study is based on a very large telecommunication system.
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/12700
- Subject Headings
- Computer software--Quality control, Software engineering
- Format
- Document (PDF)
- Title
- The cochlea: A signal processing paradigm.
- Creator
- Barrett, Raymond L. Jr., Florida Atlantic University, Erdol, Nurgun, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The cochlea provides frequency selectivity for acoustic input signal processing in mammals. The excellent performance of human hearing for speech processing leads to examination of the cochlea as a paradigm for signal processing. The components of the hearing process are examined and suitable models are selected for each component's function. The signal processing function is simulated by a computer program and the ensemble is examined for behavior and improvement. The models reveal that the motion of the basilar membrane provides a very selective low-pass transmission characteristic. Narrowband frequency resolution is obtained from the motion by computation of spatial differences in the magnitude of the motion as energy propagates along the membrane. Basilar membrane motion is simulated using the integrable model of M. R. Schroeder, but the paradigm is useful for any model that exhibits similar high selectivity. Support is shown for a hypothesis that good frequency discrimination is possible without a highly resonant structure. The nonlinear magnitude calculation is performed on signals developed without a highly resonant structure, and differences in those magnitudes are shown to be signals with good narrowband selectivity. Simultaneously, good transient behavior is preserved due to the avoidance of a highly resonant structure. The cochlear paradigm is shown to provide a power spectrum that, serendipitously, offers good frequency selectivity and good transient response simultaneously. (A numerical sketch of the spatial-difference idea follows this record.)
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/12251
- Subject Headings
- Engineering, Electronics and Electrical, Computer Science
- Format
- Document (PDF)
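A rough numerical sketch of the abstract's central mechanism, under invented parameters (this is not Schroeder's integrable model): a cascade of first-order low-pass sections mimics propagation along the basilar membrane, and differencing response magnitudes between adjacent places yields narrowband place tuning without any highly resonant element.

```python
import numpy as np

freqs = np.logspace(2, 4, 200)            # 100 Hz .. 10 kHz probe tones
cutoffs = np.logspace(4, 2.3, 40)         # cutoff falls along the membrane

# |H| of each first-order low-pass section at every probe frequency.
sect = 1.0 / np.sqrt(1.0 + (freqs[None, :] / cutoffs[:, None]) ** 2)
place = np.cumprod(sect, axis=0)          # cascaded magnitude at each place

tuning = place[:-1] - place[1:]           # spatial magnitude differences
best = freqs[np.argmax(tuning, axis=1)]   # best frequency at each place
print(best[:5], best[-5:])                # high frequencies map to early places
```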
- Title
- Fault tolerance and reliability patterns.
- Creator
- Buckley, Ingrid A., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The need to achieve dependability in critical infrastructures has become indispensable for government and commercial enterprises. This need has become more necessary with the proliferation of malicious attacks on critical systems, such as healthcare, aerospace and airline applications. Additionally, due to the widespread use of web services in critical systems, the need to ensure their reliability is paramount. We believe that patterns can be used to achieve dependability. We conducted a survey of fault tolerance, reliability and web service products and patterns to better understand them. One objective of our survey is to evaluate the state of these patterns, and to investigate which standards are being used in products and their tool support. Our survey found that these patterns are insufficient, and many web services products do not use them. In light of this, we wrote some fault tolerance and web services reliability patterns and present an analysis of them.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/166447
- Subject Headings
- Fault-tolerant computing, Computer software, Reliability, Reliability (Engineering), Computer programs
- Format
- Document (PDF)
- Title
- Towards a methodology for building reliable systems.
- Creator
- Buckley, Ingrid A., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Reliability is a key system characteristic that is an increasing concern for current systems. Greater reliability is necessary due to the new ways in which services are delivered to the public. Services are used by many industries, including health care, government, telecommunications, tools, and products. We have defined an approach to incorporate reliability along the stages of system development. We first did a survey of existing dependability patterns to evaluate their possible use in this methodology. We have defined a systematic methodology that helps the designer apply reliability in all steps of the development life cycle in the form of patterns. A systematic failure enumeration process to define corresponding countermeasures was proposed as a guideline to define where reliability is needed. We introduced the idea of failure patterns, which show how failures manifest and propagate in a system. We also looked at how to combine reliability and security. Finally, we defined an approach to certify the level of reliability of an implemented web service. All these steps lead towards a complete methodology.
- Date Issued
- 2012
- PURL
- http://purl.flvc.org/FAU/3342037
- Subject Headings
- Computer software, Reliability, Reliability (Engineering), Computer programs, Fault-tolerant computing
- Format
- Document (PDF)
- Title
- An integrated component selection framework for system level design.
- Creator
- Calvert, Chad., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The increasing system design complexity is negatively impacting overall system design productivity by increasing the cost and time of product development. One key to overcoming these challenges is exploiting Component Based Engineering practices. However, it is a challenge to select an optimum component from a component library that will satisfy all system functional and non-functional requirements, due to varying performance parameters and quality of service requirements. In this thesis we propose an integrated framework for component selection. The framework is a two-phase approach that includes a system modeling and analysis phase and a component selection phase. Three component selection algorithms have been implemented for selecting components for a Network on Chip architecture. Two algorithms are based on a standard greedy method, with one being enhanced to produce more intelligent behavior. The third algorithm is based on simulated annealing. Further, a prototype was developed to evaluate the proposed framework and compare the performance of all the algorithms. (A simulated-annealing sketch follows this record.)
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/368608
- Subject Headings
- High performance computing, Computer architecture, Engineering design, Data processing, Computer-aided design
- Format
- Document (PDF)
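As a sketch of the third algorithm's core loop only (the component data, energy function, and cooling schedule below are invented, not taken from the thesis), simulated annealing over per-slot component choices might look like:

```python
import math, random

random.seed(0)
# library[slot] = candidate components as (cost, performance) pairs.
library = [[(random.uniform(1, 10), random.uniform(1, 10)) for _ in range(6)]
           for _ in range(8)]
REQUIRED_PERF = 45.0                  # invented aggregate QoS requirement

def energy(sel):
    cost = sum(library[s][c][0] for s, c in enumerate(sel))
    perf = sum(library[s][c][1] for s, c in enumerate(sel))
    return cost + 10.0 * max(0.0, REQUIRED_PERF - perf)   # penalize shortfall

sel = [0] * len(library)
best = list(sel)
T = 5.0
while T > 0.01:
    cand = list(sel)
    slot = random.randrange(len(cand))
    cand[slot] = random.randrange(len(library[slot]))   # try one swap
    d = energy(cand) - energy(sel)
    if d <= 0 or random.random() < math.exp(-d / T):    # Metropolis rule
        sel = cand
    if energy(sel) < energy(best):
        best = list(sel)
    T *= 0.995                                          # geometric cooling
print(f"best energy {energy(best):.2f} with selection {best}")
```

Accepting some uphill moves while the temperature is high is what lets the search escape the local optima that trap the purely greedy variants.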
- Title
- An efficient and scalable core allocation strategy for multicore systems.
- Creator
- Rani, Manira S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Multiple threads can run concurrently on multiple cores in a multicore system and improve the performance/power ratio. However, effective core allocation in multicore and manycore systems is very challenging. In this thesis, we propose an effective and scalable core allocation strategy for multicore systems to achieve optimal core utilization by reducing both internal and external fragmentation. Our proposed strategy helps to spread the servicing cores evenly across the chip to facilitate better heat dissipation. We introduce a multi-stage power management scheme to reduce the total power consumption by managing the power states of the cores. We simulate three multicore systems, with 16, 32, and 64 cores, respectively, using synthetic workloads. Experimental results show that our proposed strategy performs better than Square-shaped, Rectangle-shaped, L-shaped, and Hybrid (contiguous and non-contiguous) schemes in multicore systems in terms of fragmentation and completion time. Among these strategies, our strategy provides a better heat dissipation mechanism. (A toy spreading-allocator sketch follows this record.)
- Date Issued
- 2011
- PURL
- http://purl.flvc.org/FAU/3172698
- Subject Headings
- Modularity (Engineering), Multicasting (Computer networks), Convergence (Telecommunication), Computer architecture, Memory management (Computer science), Cache memory
- Format
- Document (PDF)
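The thesis's allocator is more involved, but the spreading heuristic the abstract describes can be caricatured in a few lines: among the free cores on the grid, pick the one farthest (in Manhattan distance) from every busy core, so early allocations scatter across the chip and the heat scatters with them. The grid size and greedy rule here are invented for illustration.

```python
import itertools

N = 8                                         # 8x8 = 64-core grid
busy = set()                                  # coordinates of busy cores

def allocate(k):
    """Greedily pick k free cores, each maximizing distance to busy ones."""
    chosen = []
    for _ in range(k):
        free = [c for c in itertools.product(range(N), repeat=2)
                if c not in busy and c not in chosen]
        def spread(c):
            others = busy | set(chosen)
            if not others:
                return 0
            return min(abs(c[0] - x) + abs(c[1] - y) for x, y in others)
        chosen.append(max(free, key=spread))
    busy.update(chosen)
    return chosen

print(allocate(4))   # first threads land spread out, not packed in a corner
```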
- Title
Assessing Data Interoperability of UML Modeling Tools.
- Creator
- Gohel, Vaishali P., Huang, Shihong, Florida Atlantic University
- Abstract/Description
-
In globalized software development environments, where development activities are distributed geographically and temporally, it is increasingly important for Computer Aided Software Engineering (CASE) tools to maintain the information (both syntactic and semantic) captured in design models. The Unified Modeling Language (UML) is the de facto standard for modeling software applications, and UML diagrams serve as graphical documentation of the software system. The interoperability of UML modeling tools is important in supporting model exchange, and it further supports design reuse. Tool interoperability is often implemented using XML Metadata Interchange (XMI). Unfortunately, there is a loss of fidelity in the design documentation when transforming between UML and XMI, due to the compatibility of different versions of UML and XMI and to add-on proprietary information, which hinders reuse. This thesis evaluates the interoperability of UML modeling tools by assessing the quality of XMI documents representing the design. Case studies in this thesis demonstrate a framework for preserving the fidelity of a UML model's data when importing and exporting different UML models in a distributed heterogeneous environment. (A round-trip comparison sketch follows this record.)
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012519
- Subject Headings
- UML (Computer science), Computer software--Development, Software engineering, Computer-aided design
- Format
- Document (PDF)
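One plausible way to quantify the fidelity loss the abstract mentions (a sketch under assumptions, not the thesis's framework; the file names are placeholders) is to diff the element-and-attribute "shape" of an XMI document before and after a tool round trip:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def xmi_signature(path):
    """Counter of (tag, sorted attribute names) pairs across the document."""
    root = ET.parse(path).getroot()
    return Counter((el.tag, tuple(sorted(el.attrib))) for el in root.iter())

# Placeholder file names: tool A's export vs. tool B's re-export of it.
a = xmi_signature("model_toolA.xmi")
b = xmi_signature("model_toolB.xmi")

lost, added = a - b, b - a                 # fidelity change in each direction
print(f"{sum(lost.values())} element shapes lost, "
      f"{sum(added.values())} added by the round trip")
```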
- Title
- Model-Driven Architecture and the Secure Systems Methodology.
- Creator
- Morrison, Patrick, Fernandez, Eduardo B., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
As a companion and complement to the work being done to build a secure systems methodology, this thesis evaluates the use of Model-Driven Architecture (MDA) in support of the methodology's lifecycle. The development lifecycle illustrated follows the recommendations of this secure systems methodology, while using MDA models to represent requirements, analysis, design, and implementation information. In order to evaluate MDA, we analyze a well-understood distributed systems security problem, remote access, as illustrated by the internet "secure shell" protocol, ssh. By observing the ability of MDA models and transformations to specify remote access in each lifecycle phase, MDA's strengths and weaknesses can be evaluated in this context. A further aim of this work is to extract concepts that can be contained in an MDA security metamodel for use in future projects.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012537
- Subject Headings
- Expert systems (Computer science), Software engineering, Computer-aided design, Computer network architectures
- Format
- Document (PDF)
- Title
- An improved neural net-based approach for predicting software quality.
- Creator
- Guasti, Peter John., Florida Atlantic University, Khoshgoftaar, Taghi M., Pandya, Abhijit S.
- Abstract/Description
-
Accurately predicting the quality of software is a major problem in any software development project. Software engineers develop models that provide early estimates of quality metrics, which allow them to take action against emerging quality problems. Most often the predictive models are based upon multiple regression analysis, which becomes unstable when certain data assumptions are not met. Since neural networks require no data assumptions, they are more appropriate for predicting software quality. This study proposes an improved neural network architecture that significantly outperforms multiple regression and other neural network attempts at modeling software quality. This is demonstrated by applying this approach to several large commercial software systems. After developing neural network models, we develop regression models on the same data. We find that the neural network models surpass the regression models in terms of predictive quality on the data sets considered.
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/15134
- Subject Headings
- Neural networks (Computer science), Computer software--Development, Computer software--Quality control, Software engineering
- Format
- Document (PDF)
- Title
- Rough Set-Based Software Quality Models and Quality of Data.
- Creator
- Bullard, Lofton A., Khoshgoftaar, Taghi M., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In this dissertation we address two significant issues of concern: software quality modeling and data quality assessment. Software quality can be measured by software reliability. Reliability is often measured in terms of the time between system failures. A failure is caused by a fault, which is a defect in the executable software product. The time between system failures depends both on the presence and the usage pattern of the software. Finding faulty components in the development cycle of a software system can lead to a more reliable final system and will reduce development and maintenance costs. The issue of software quality is investigated by proposing a new approach, the rule-based classification model (RBCM), which uses rough set theory to generate decision rules to predict software quality. The new model minimizes over-fitting by balancing the Type I and Type II misclassification error rates. We also propose a model selection technique for rule-based models called rule-based model selection (RBMS). The proposed rule-based model selection technique utilizes the complete and partial matching rule sets of candidate RBCMs to determine the model with the least amount of over-fitting. In the experiments that were performed, the RBCMs were effective at identifying faulty software modules, and the RBMS technique was able to identify RBCMs that minimized over-fitting. Good data quality is a critical component for building effective software quality models. We address the significance of the quality of data on the classification performance of learners by conducting a comprehensive comparative study. Several trends were observed in the experiments. Class and attribute noise had the greatest impact on the performance of learners when they occurred simultaneously in the data. Class noise had a significant impact on the performance of learners, while attribute noise had no impact when it occurred in less than 40% of the most significant independent attributes. Random Forest (RF100), a group of 100 decision trees, was the most accurate and robust learner in all the experiments with noisy data. (A rough-set rule-extraction sketch follows this record.)
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fau/fd/FA00012567
- Subject Headings
- Computer software--Quality control, Computer software--Reliability, Software engineering, Computer arithmetic
- Format
- Document (PDF)
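A toy view of the rough-set machinery behind an RBCM (the data is invented and the dissertation's model is far more elaborate): modules that are indiscernible on their discretized metrics form equivalence classes; classes with a single decision label lie in the lower approximation and yield certain rules, while mixed classes form the boundary region where classification is uncertain.

```python
from collections import defaultdict

# (discretized metrics) -> fault label, e.g. (LOC band, complexity band).
table = [((0, 1), 0), ((0, 1), 0), ((1, 2), 1), ((1, 2), 1),
         ((2, 2), 1), ((2, 2), 0), ((1, 0), 0)]

# Indiscernibility classes: group modules with identical attribute vectors.
groups = defaultdict(list)
for attrs, label in table:
    groups[attrs].append(label)

rules, boundary = [], []
for attrs, labels in groups.items():
    if len(set(labels)) == 1:        # consistent class -> certain rule
        rules.append((attrs, labels[0]))
    else:                            # inconsistent -> boundary region
        boundary.append(attrs)

print("certain rules:", rules)
print("boundary region:", boundary)
```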
- Title
- CAD design patterns.
- Creator
- Hayes, Joey C., Florida Atlantic University, Pandya, Abhijit S.
- Abstract/Description
-
Software reuse has been looked upon in recent years as a promising mechanism for achieving increased levels of software quality and productivity within an organization. A form of software reuse which has been gaining in popularity is the use of design patterns. Design patterns are a higher level of abstraction than source code and are proving to be a valuable resource for both software developers and new hires within a company. This thesis develops the idea of applying design patterns to the Computer Aided Design (CAD) software development environment. The benefits and costs associated with implementing a software reuse strategy are explained and the reasoning for developing design patterns is given. Design patterns are then described in detail and a potential method for applying design patterns within the CAD environment is demonstrated through the development of a CAD design pattern catalog.
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/15358
- Subject Headings
- Computer-aided design, Computer-aided software engineering, Computer software--Development, Computer software--Reusability
- Format
- Document (PDF)
- Title
- Remote Labs: A Method to Implement a Portable Logic Design Laboratory Infrastructure and to Provide Access to Modern Test Equipment.
- Creator
- Weinthal, Charles Perry, Petrie, Maria Mercedes Larrondo, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis explores building a low-cost and reliable portable laboratory infrastructure platform for Logic Design, methods for allowing access to modern test equipment via the internet, and issues related to academic integrity. A comprehensive engineering education, per ABET, requires an equal emphasis on both lecture and laboratory components. The laboratory experience builds and establishes a foundation of skills and experiences that the student cannot obtain through any other means. The laboratory must use modern, pertinent methods and techniques, including the use of appropriate tools. This is especially true when it comes to test equipment. Engineering students require and deserve training on and access to modern test equipment in order to obtain better career opportunities. However, providing access to modern and relevant labs requires a significant budget commitment. One way to extend current budgets is to adopt the growing concept of “remote labs.” This approach allows higher utilization of existing (and costly) equipment, improves an institution’s Return on Investment (ROI), and can also be used to meet the needs of students’ complicated schedules, especially in the case of a “commuter campus,” where a majority of students live off campus. By developing remote labs, both the institution and the students benefit: institutions increase equipment utilization and use space, budgets and support personnel more efficiently, while students can access a lab whenever and wherever they have internet access. Finally, academic integrity must be protected to realize the potential of remote laboratories in education. This thesis presents a design and implementation plan for a low-cost Logic Design laboratory infrastructure built and tested over 3 years by over 1,500 Logic Design students; a design and implementation of the infrastructure extended to include measurement with remote test equipment; the design of a case (3D-printed or laser-cut) to encapsulate a USB-enabled microcontroller; and a scheme to ensure that academic integrity is maintained for in-person, hybrid and fully online classes.
- Date Issued
- 2018
- PURL
- http://purl.flvc.org/fau/fd/FA00013177
- Subject Headings
- Logic design, Engineering laboratories, Logic design--Computer-assisted instruction
- Format
- Document (PDF)
- Title
- Classification of software quality using tree modeling with the SPRINT/SLIQ algorithm.
- Creator
- Mao, Wenlei., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Providing high-quality software products is the common goal of all software engineers. Finding faults early can produce large savings over the software life cycle. Therefore, software quality has become the main subject in our research field. This thesis presents a series of studies on a very large legacy telecommunication system. The system has significantly more than ten million lines of code written in a high-level language similar to Pascal. Software quality models were developed to predict the class of each module, either as fault-prone or as not fault-prone. We used the SPRINT/SLIQ algorithm to build the classification tree models. We found that SPRINT/SLIQ, as an improved CART algorithm, can give us tree models with more accuracy, more balance, and less overfitting. We also found that software process metrics can significantly improve the predictive accuracy of software quality models.
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/15767
- Subject Headings
- Computer software--Quality control, Software engineering, Software measurement
- Format
- Document (PDF)
- Title
- Applications of evolutionary algorithms in mechanical engineering.
- Creator
- Nelson, Kevin M., Florida Atlantic University, Huang, Ming Z., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
Many complex engineering designs have conflicting requirements that must be compromised to effect a successful product. Traditionally, the engineering approach breaks up the complex problem into smaller sub-components in known areas of study. Tradeoffs occur between the conflicting requirements and a sub-optimal design results. A new computational approach based on the evolutionary processes observed in nature is explored in this dissertation. Evolutionary algorithms provide methods to solve complex engineering problems by optimizing the entire system, rather than sub-components of the system. Three standard forms of evolutionary algorithms have been developed: evolutionary programming, genetic algorithms and evolution strategies. Mathematical and algorithmic details are described for each of these methods. In this dissertation, four engineering problems are explored using evolutionary programming and genetic algorithms. Exploiting the inherently parallel nature of evolution, a parallel version of evolutionary programming is developed and implemented on the MasPar MP-1. This parallel version is compared to a serial version of the same algorithm in the solution of a trial set of unimodal and multi-modal functions. The parallel version had significantly improved performance over the serial version of evolutionary programming. An evolutionary programming algorithm is developed for the solution of electronic part placement problems with different assembly devices. The results are compared with previously published results for genetic algorithms and show that evolutionary programming can successfully solve this class of problem using fewer genetic operators. The finite element problem is cast into an optimization problem and an evolutionary programming algorithm is developed to solve 2-D truss problems. A comparison to LU-decomposition showed that evolutionary programming can solve these problems and that it has the capability to solve the more complex nonlinear problems. Finally, ordinary differential equations are discretized using a finite difference representation and an objective function is formulated for the application of evolutionary programming and genetic algorithms. Evolutionary programming and genetic algorithms have the benefit of permitting over-constraining a problem to obtain a successful solution. In all of these engineering problems, evolutionary algorithms have been shown to offer a new solution method. (A mutation-only EP sketch follows this record.)
- Date Issued
- 1997
- PURL
- http://purl.flvc.org/fcla/dt/12514
- Subject Headings
- Mechanical engineering, Genetic algorithms, Evolutionary programming (Computer science)
- Format
- Document (PDF)
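To make the distinction the abstract draws concrete ("fewer genetic operators" than a GA), here is a mutation-only evolutionary programming sketch on an invented multimodal test function; classical EP uses stochastic tournament selection, which is simplified here to (mu + mu) truncation.

```python
import math, random

random.seed(2)

def f(x):                                   # Rastrigin: many local minima
    return sum(xi*xi - 10*math.cos(2*math.pi*xi) + 10 for xi in x)

MU, DIM, SIGMA = 20, 5, 0.3
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(MU)]

for gen in range(500):
    # EP's single operator: Gaussian mutation of every parent (no crossover).
    children = [[xi + random.gauss(0, SIGMA) for xi in p] for p in pop]
    pop = sorted(pop + children, key=f)[:MU]   # keep the best mu survivors
print(round(f(pop[0]), 4), [round(v, 3) for v in pop[0]])
```

Because each generation applies only mutation and selection, the loop body stays embarrassingly parallel across individuals, which is the property the dissertation exploits on the MasPar MP-1.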