Current Search: Quality control
- Title
- Potential Efficacy of the Monte Carlo Dose Calculations of 6MV Flattening Filter-Free Photon Beam of M6™ Cyberknife® System.
- Creator
- Neupane, Taindra, Shang, Charles, Leventouri, Theodora, Florida Atlantic University, Charles E. Schmidt College of Science, Department of Physics
- Abstract/Description
- MapCheck measurements for 50 retrospective patients' treatment plans suggested that MapCheck could be effectively employed in routine patient-specific quality assurance in the M6 Cyberknife with beams delivered at different treatment angles. However, these measurements also suggested that for highly intensity-modulated MLC plans, field segments of width < 8 mm should further be analyzed with a modified (-4%) correction factor. Results of MC simulations of the M6 Cyberknife using the EGSnrc program for 2-5 million incident particles in BEAMnrc and 10-20 million in DOSXYZnrc have shown dose uncertainties within 2% for open fields from 7.6 x 7.7 mm2 to 100 x 100 mm2. Energy and corresponding FWHM were optimized by comparing with water phantom measurements at 800 mm SAD, resulting in E = 7 MeV and FWHM = 2.2 mm. Good agreement of dose profiles (within 2%) and outputs (within 3%) was found between the MC simulations and water phantom measurements for the open fields.
- Date Issued
- 2018
- PURL
- http://purl.flvc.org/fau/fd/FA00013147
- Subject Headings
- Radiosurgery--Quality control, Monte Carlo method, Radiation dosimetry
- Format
- Document (PDF)
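The profile agreement reported above is, at its core, a point-by-point comparison of normalized dose profiles. A minimal Python sketch of such a check; the profiles and the 2% tolerance are illustrative placeholders, not data from the thesis:

    import numpy as np

    def profile_agreement(mc_dose, measured_dose, tolerance=0.02):
        # Compare a Monte Carlo profile against a measured one,
        # both normalized to their maxima
        mc = np.asarray(mc_dose, dtype=float).copy()
        meas = np.asarray(measured_dose, dtype=float).copy()
        mc /= mc.max()
        meas /= meas.max()
        rel_diff = np.abs(mc - meas) / meas   # relative local deviation
        return bool(np.all(rel_diff <= tolerance)), float(rel_diff.max())

    # Hypothetical normalized profiles (placeholders only)
    measured = [0.31, 0.62, 0.95, 1.00, 0.94, 0.60, 0.30]
    simulated = [0.309, 0.625, 0.945, 1.00, 0.945, 0.605, 0.302]
    ok, worst = profile_agreement(simulated, measured)
    print(f"within 2%: {ok}, worst deviation: {worst:.3%}")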
- Title
- Investigation of Rotational Deviations on Single Fiducial Tumor Tracking with Simulated Respiratory Motion using Synchrony® Respiratory Motion Tracking for Cyberknife® Treatment.
- Creator
- Christ, Zachary A., Shang, Charles, Leventouri, Theodora, Florida Atlantic University, Charles E. Schmidt College of Science, Department of Physics
- Abstract/Description
- It is hypothesized that the uncertainty of the Synchrony® model from the rotation of a geometrically asymmetrical single fiducial shall be non-zero during motion tracking. To validate this hypothesis, the uncertainty was measured for a Synchrony® model built for a respiratory motion phantom oriented at different yaw angles on a Cyberknife® treatment table. A Mini-ball Cube with three cylindrical GoldMark™ (1 mm x 5 mm Au) numbered fiducials was placed inside a respiratory phantom and used for all tests. The fiducial with the least artifact interference was selected for the motion tracking. A 2 cm periodic, longitudinal, linear motion of the Mini-ball Cube was executed and tested for yaw rotational angles of 0°-90°. The test was repeated over 3 nonconsecutive days. The uncertainty increased with the yaw angle, with the most noticeable changes seen between 20° and 60° yaw, where uncertainty increased from 23.5% to 57.9%. A similar test was performed using a spherical Gold Anchor™ fiducial. The uncertainties found when using the Gold Anchor™ were statistically lower than those found when using the GoldMark™ fiducial for all angles of rotation. For the first time, it is found that Synchrony® model uncertainty depends on fiducial geometry. In addition, this research has shown that tracking target rotation using a single fiducial can be accomplished with the Synchrony® model uncertainty as it is displayed on the treatment console. The results of this research could lead to decreased acute toxicity effects related to multiple fiducials.
- Date Issued
- 2018
- PURL
- http://purl.flvc.org/fau/fd/FA00013041
- Subject Headings
- Fiducial Markers, Radiosurgery--Quality control, Robotic radiosurgery
- Format
- Document (PDF)
- Title
- Perspectives of professional competence by newly licensed, registered nurses.
- Creator
- Bartolone, Priscilla Dunson., Christine E. Lynn College of Nursing
- Abstract/Description
- Professional competence is expected of all nurses in practice. Although new nurses have met the competency requirement for practice legally, opinions vary among new nurses and nurse administrators as to whether new nurses are indeed competent to practice nursing. The purpose of this phenomenological research study was to learn what new nurses think about professional competence. The research question guiding this study was, "What is professional competence from the perspective of newly licensed registered nurses?"
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fcla/dt/172666
- Subject Headings
- Clinical competence, Nursing, Standards, Nursing, Quality control, Nursing services, Administration
- Format
- Document (PDF)
- Title
- Classification of software quality using tree modeling with the SPRINT/SLIQ algorithm.
- Creator
- Mao, Wenlei., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Providing high-quality software products is the common goal of all software engineers. Finding faults early can produce large savings over the software life cycle. Therefore, software quality has become the main subject in our research field. This thesis presents a series of studies on a very large legacy telecommunication system. The system has significantly more than ten million lines of code written in a high-level language similar to Pascal. Software quality models were developed to predict the class of each module as either fault-prone or not fault-prone. We used the SPRINT/SLIQ algorithm to build the classification tree models. We found that SPRINT/SLIQ, as an improved CART algorithm, can give us tree models with more accuracy, more balance, and less overfitting. We also found that software process metrics can significantly improve the predictive accuracy of software quality models.
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/15767
- Subject Headings
- Computer software--Quality control, Software engineering, Software measurement
- Format
- Document (PDF)
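SPRINT and SLIQ are scalable decision-tree induction algorithms with no off-the-shelf scikit-learn implementation, but the modeling task itself, labeling modules fault-prone or not from software metrics with a classification tree, can be sketched with an ordinary CART-style learner. The metrics and data below are invented placeholders:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(0)
    # Hypothetical module metrics (e.g., size, complexity, churn)
    X = rng.normal(size=(500, 3))
    # Synthetic ground truth: larger/busier modules are more fault-prone
    y = (X @ np.array([0.8, 1.2, 0.6]) + rng.normal(scale=0.5, size=500)) > 0.7

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20)  # curb overfitting
    tree.fit(X_tr, y_tr)
    print(classification_report(y_te, tree.predict(X_te),
                                target_names=["not fault-prone", "fault-prone"]))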
- Title
- Modeling software quality with classification trees using principal components analysis.
- Creator
- Shan, Ruqun., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Software quality models often have raw software metrics as the input data for predicting quality. Raw metrics are usually highly correlated with one another and thus may result in unstable models. Principal components analysis is a statistical method to improve model stability. This thesis presents a series of studies on a very large legacy telecommunication system. The system has significantly more than ten million lines of code written in a high-level language similar to Pascal. Software quality models were developed to predict the class of each module as either fault-prone or not fault-prone. We found that the models based on principal components analysis were more robust than those based on raw metrics. We also found that software process metrics can significantly improve the predictive accuracy of software quality models.
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/15714
- Subject Headings
- Principal components analysis, Computer software--Quality control, Software engineering
- Format
- Document (PDF)
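A sketch of the pipeline this abstract describes: correlated raw metrics are transformed to orthogonal principal components before the tree is fit. The data, the component count, and the tree settings are assumptions for illustration:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(400, 2))
    # Eight raw metrics derived from two latent factors -> highly correlated
    X = np.hstack([latent + 0.1 * rng.normal(size=(400, 2)) for _ in range(4)])
    y = (latent[:, 0] + latent[:, 1] > 0).astype(int)

    model = make_pipeline(StandardScaler(), PCA(n_components=2),
                          DecisionTreeClassifier(max_depth=3))
    print(cross_val_score(model, X, y, cv=5).mean())  # stability via components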
- Title
- Modeling software quality with TREEDISC algorithm.
- Creator
- Yuan, Xiaojing, Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Software quality is crucial both to software makers and customers. However, in reality, improvement of quality and reduction of costs are often at odds. Software modeling can help us to detect fault-prone software modules based on software metrics, so that we can focus our limited resources on fewer modules and lower the cost while still achieving high quality. In the present study, a tree classification modeling technique, TREEDISC, was applied to three case studies. Several major contributions have been made. First, preprocessing of raw data was adopted to solve the computer memory problem and improve the models. Secondly, TREEDISC was thoroughly explored by examining the roles of important parameters in modeling. Thirdly, a generalized classification rule was introduced to balance misclassification rates and decrease type II error, which is considered more costly than type I error. Fourthly, certainty of classification was addressed. Fifthly, TREEDISC modeling was validated over multiple releases of a software product.
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/15718
- Subject Headings
- Computer software--Quality control, Computer simulation, Software engineering
- Format
- Document (PDF)
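TREEDISC is a chi-squared-based tree modeling technique without a standard Python implementation, but the generalized classification rule described above, penalizing type II misclassifications (missed fault-prone modules) more heavily than type I, can be sketched with class weights on any tree learner. The weights and data are illustrative assumptions:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(2)
    X = rng.normal(size=(600, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)   # fault-prone minority

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2, stratify=y)
    for w in (1, 5, 10):   # relative cost assigned to a type II error
        tree = DecisionTreeClassifier(class_weight={0: 1, 1: w}, max_depth=4)
        tree.fit(X_tr, y_tr)
        tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
        print(f"weight {w:2d}: type I (false alarms) = {fp}, type II (misses) = {fn}")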
- Title
- An empirical study of analogy-based software quality classification models.
- Creator
- Ross, Fletcher Douglas., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Time and cost are among the most important elements in a software project. By efficiently using time and resources we can reduce costs. Any program can potentially contain faults. If we can identify those program modules that have better quality and are less likely to be fault-prone, then we can reduce the effort and cost required in testing these modules. This thesis presents a series of studies evaluating the use of Case-Based Reasoning (CBR) as an effective method for classifying program modules based upon their quality. We believe that this is the first time that the Mahalanobis distance, a distance measure utilizing the covariance matrix of the independent variables which accounts for the multicollinearity of the data without the necessity for preprocessing, and data clustering, wherein the data was separated into groups based on a dependent variable, have been used as modeling techniques in conjunction with CBR.
- Date Issued
- 2001
- PURL
- http://purl.flvc.org/fcla/dt/12817
- Subject Headings
- Modular programming, Computer software--Quality control, Software measurement
- Format
- Document (PDF)
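The two ingredients the abstract highlights can be sketched compactly: a case library of module metrics, and classification of new modules by majority vote among the nearest cases under the Mahalanobis metric. The data and the choice k=5 are assumptions:

    import numpy as np
    from scipy.spatial.distance import cdist

    def cbr_classify(case_X, case_y, query, k=5):
        # Mahalanobis distance uses the inverse covariance of the case library,
        # so correlated metrics are not double-counted
        VI = np.linalg.inv(np.cov(case_X, rowvar=False))
        d = cdist(query, case_X, metric="mahalanobis", VI=VI)
        nearest = np.argsort(d, axis=1)[:, :k]
        # Majority vote among the k most similar historical cases
        return (case_y[nearest].mean(axis=1) > 0.5).astype(int)

    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 3))                 # stored cases (placeholder)
    y = (X[:, 0] - X[:, 2] > 0).astype(int)       # their known quality labels
    print(cbr_classify(X, y, rng.normal(size=(4, 3))))   # four new modules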
- Title
- A comprehensive comparative study of multiple classification techniques for software quality estimation.
- Creator
- Puppala, Kishore., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Reliability and quality are desired features in industrial software applications. In some cases, they are absolutely essential. When faced with limited resources, software project managers need to allocate such resources to the most fault-prone areas. The ability to accurately classify a software module as fault-prone or not fault-prone enables the manager to make an informed resource allocation decision. An accurate quality classification avoids wasting resources on modules that are not fault-prone. It also avoids missing the opportunity to correct faults relatively early in the development cycle, when they are less costly. This thesis introduces the classification algorithms (classifiers) that are implemented in the WEKA software tool. WEKA (Waikato Environment for Knowledge Analysis) was developed at the University of Waikato in New Zealand. An empirical investigation is performed using a case study of a real-world system.
- Date Issued
- 2003
- PURL
- http://purl.flvc.org/fcla/dt/13039
- Subject Headings
- Software engineering, Computer software--Quality control, Decision trees
- Format
- Document (PDF)
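WEKA is a Java workbench, so a faithful example would be Java; as a loose Python analogue, the core experiment, running several classifiers over one dataset and comparing cross-validated accuracy, looks like this (the classifier list and data are illustrative):

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    candidates = {
        "tree": DecisionTreeClassifier(max_depth=4),
        "naive bayes": GaussianNB(),
        "k-NN": KNeighborsClassifier(),
        "logistic": LogisticRegression(),
    }
    for name, clf in candidates.items():
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"{name:12s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")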
- Title
- Information theory and software measurement.
- Creator
- Allen, Edward B., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Development of reliable, high-quality software requires study and understanding at each step of the development process. A basic assumption in the field of software measurement is that metrics of internal software attributes somehow relate to the intrinsic difficulty in understanding a program. Measuring the information content of a program attempts to indirectly quantify the comprehension task. Information theory based software metrics are attractive because they quantify the amount of information in a well-defined framework. However, most information theory based metrics have been proposed with little reference to measurement theory fundamentals, and empirical validation of predictive quality models has been lacking. This dissertation proves that representative information theory based software metrics can be "meaningful" components of software quality models in the context of measurement theory. To this end, members of a major class of metrics are shown to be regular representations of Minimum Description Length or Variety of software attributes, and are interval scale. An empirical validation case study is presented that predicted faults in modules based on Operator Information. This metric is closely related to Harrison's Average Information Content Classification, which is the entropy of the operators. New general methods for calculating synthetic complexity at the system level and module level are presented, quantifying the joint information of an arbitrary set of primitive software measures. Since all kinds of information are not equally relevant to software quality factors, components of synthetic module complexity are also defined. Empirical case studies illustrate the potential usefulness of the proposed synthetic metrics. A metrics database is often the key to a successful ongoing software metrics program. The contribution of any proposed metric is defined in terms of measured variation using information theory, irrespective of the metric's usefulness in quality models. This is of interest when full validation is not practical. Case studies illustrate the method.
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/12412
- Subject Headings
- Software engineering, Computer software--Quality control, Information theory
- Format
- Document (PDF)
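Harrison's Average Information Content Classification, mentioned above as the entropy of the operators, has a direct computational form: the Shannon entropy of the empirical operator distribution. A minimal sketch; the operator stream is an invented example:

    from collections import Counter
    from math import log2

    def operator_entropy(operators):
        # Shannon entropy (bits) of the operator distribution in a module
        counts = Counter(operators)
        total = sum(counts.values())
        return -sum((c / total) * log2(c / total) for c in counts.values())

    ops = ["+", "=", "*", "=", "if", "+", "=", "while", "+", "="]
    print(f"{operator_entropy(ops):.3f} bits per operator")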
- Title
- Count models for software quality estimation.
- Creator
- Gao, Kehan, Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- The primary aim of software engineering is to produce quality software that is delivered on time, within budget, and fulfils all its requirements. A timely estimation of software quality can serve as a prerequisite in achieving high reliability of software-based systems. More specifically, software quality assurance efforts can be prioritized for targeting program modules that are most likely to have a high number of faults. Software quality estimation models are generally of two types: a classification model that predicts the class membership of modules into two or more quality-based classes, and a quantitative prediction model that estimates the number of faults (or some other software quality factor) that are likely to occur in software modules. In the literature, a variety of techniques have been developed for software quality estimation, most of which are suited for either prediction or classification but not for both, e.g., multiple linear regression (only for prediction) and logistic regression (only for classification).
- Date Issued
- 2003
- PURL
- http://purl.flvc.org/fcla/dt/12042
- Subject Headings
- Computer software--Quality control, Software engineering, Econometrics, Regression analysis
- Format
- Document (PDF)
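A representative count model of the kind contrasted above with linear and logistic regression is Poisson regression, which models the number of faults per module directly. A sketch using statsmodels on synthetic data (the coefficients are planted, then recovered):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    X = rng.normal(size=(400, 2))            # two software metrics
    lam = np.exp(0.3 + 0.8 * X[:, 0])        # true fault rate grows with metric 0
    faults = rng.poisson(lam)                # observed fault counts per module

    model = sm.GLM(faults, sm.add_constant(X), family=sm.families.Poisson())
    print(model.fit().params)                # approx. [0.3, 0.8, 0.0]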
- Title
- Three-group software quality classification modeling with TREEDISC algorithm.
- Creator
- Liu, Yongbin., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Maintaining superior quality and reliability of software systems is important nowadays. Software quality modeling detects fault-prone modules and enables us to achieve high quality in software systems by focusing on fewer modules, given limited resources and budget. Tree-based modeling is a simple and effective method that predicts fault proneness in software systems. In this thesis, we introduce the TREEDISC modeling technique with a three-group classification rule to predict the quality of software modules. A general classification rule is applied and validated. The three impact parameters, group number, minimum leaf size, and significance level, are thoroughly evaluated. An optimization procedure is conducted and empirical results are presented. Conclusions about the impact factors as well as the robustness of our research are drawn. TREEDISC modeling with three-group classification has proved to be an efficient and convincing method in software quality control.
- Date Issued
- 2003
- PURL
- http://purl.flvc.org/fcla/dt/13008
- Subject Headings
- Computer software--Quality control, Software measurement, Decision trees
- Format
- Document (PDF)
- Title
- Modeling Perceived Audio for Cellular Phones Based on Field Data.
- Creator
- Kitbamrung, Pateep, Pandya, Abhijit S., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- Even though the current cellular network provides the user with a wide array of services, voice communication is still the primary usage for a typical user. It has become increasingly important for a cellular network provider to provide customers with the clearest possible end-to-end speech during a call. However, this perceptually motivated QoS is hard to measure. While the main goal of this research has been the modeling of perceptual audio quality, this thesis focuses on the discovery of procedures for collecting audio and diagnostic data, the evaluation of the captured audio, and the mapping and visualization of the diagnostic and audio-related data. The correct application of these modified procedures should increase the productivity of the drive test team and provide a platform for the accurate assessment of the data collected.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012530
- Subject Headings
- Computer network protocols, Wireless communication systems--Quality control, Routers (Computer networks), Mobile communication systems--Quality control
- Format
- Document (PDF)
- Title
- QoS Driven Communication Backbone for NOC Based Embedded Systems.
- Creator
- Agarwal, Ankur, Shankar, Ravi, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- With the increasing complexity of system design, it has become very critical to enhance system design productivity to meet time-to-market demands. Real-time embedded system designers face extreme challenges in underlying architectural design selection. It involves the selection of a programmable, concurrent, heterogeneous multiprocessor architecture platform. Such a multiprocessor system on chip (MPSoC) platform has set new innovative trends for real-time systems and system on chip (SoC) designers. The consequences of this trend imply the shift in concern from computation and sequential algorithms to modeling concurrency, synchronization, and communication in every aspect of hardware and software co-design and development. Some of the main problems in the current deep sub-micron technologies, characterized by gate lengths in the range of 60-90 nm, arise from non-scalable wire delays, errors in signal integrity, and unsynchronized communication. These problems have been addressed by the use of packet-switched Network on Chip (NOC) architecture for future SoCs and thus, real-time systems. Such a NOC-based system should be able to support different levels of quality of service (QoS) to meet real-time system requirements. It will further help in enhancing system productivity by providing a reusable communication backbone. Thus, it becomes extremely critical to properly design a communication backbone (CommB) for NOC. Along with offering different levels of QoS, CommB is responsible for directing the flow of data from one node to another through routers, allocators, switches, queues, and links. In this dissertation I present a reusable, component-based design of CommB, suitable for embedded applications, which supports three types of QoS (real-time, multimedia, and control applications).
- Date Issued
- 2006
- PURL
- http://purl.flvc.org/fau/fd/FA00012566
- Subject Headings
- Computer networks--Quality control, Data transmission systems, Embedded computer systems--Quality control, Interconnects (Integrated circuit technology)
- Format
- Document (PDF)
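As a toy illustration of the multi-level QoS idea, here is a strict-priority output port with one queue per service class; the priority ordering (real-time above multimedia above control) and the whole structure are assumptions for illustration, not the CommB design itself:

    from collections import deque

    PRIORITY = ["real-time", "multimedia", "control"]

    class OutputPort:
        # Toy NOC output port: one FIFO per QoS class, strict-priority service
        def __init__(self):
            self.queues = {cls: deque() for cls in PRIORITY}

        def enqueue(self, cls, flit):
            self.queues[cls].append(flit)

        def dequeue(self):
            for cls in PRIORITY:             # drain the highest class first
                if self.queues[cls]:
                    return cls, self.queues[cls].popleft()
            return None

    port = OutputPort()
    port.enqueue("control", "c0")
    port.enqueue("real-time", "r0")
    port.enqueue("multimedia", "m0")
    while (item := port.dequeue()) is not None:
        print(item)    # real-time first: r0, m0, c0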
- Title
- Software quality modeling and analysis with limited or without defect data.
- Creator
- Seliya, Naeem A., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- The key to developing high-quality software is the measurement and modeling of software quality. In practice, software measurements are often used as a resource to model and comprehend the quality of software. The use of software measurements to understand quality is accomplished by a software quality model that is trained using software metrics and defect data of similar, previously developed systems. The model is then applied to estimate the quality of the target software project. Such an approach assumes that defect data is available for all program modules in the training data. Various practical issues can cause an unavailability or limited availability of defect data from the previously developed systems. This dissertation presents innovative and practical techniques for addressing the problem of software quality analysis when there is limited or completely absent defect data. The proposed techniques for software quality analysis without defect data include an expert-based approach with unsupervised clustering and an expert-based approach with semi-supervised clustering. The proposed techniques for software quality analysis with limited defect data include a semi-supervised classification approach with the Expectation-Maximization algorithm and an expert-based approach with semi-supervised clustering. Empirical case studies of software measurement datasets obtained from multiple NASA software projects are used to present and evaluate the different techniques. The empirical results demonstrate the attractiveness, benefit, and definite promise of the proposed techniques. The newly developed techniques presented in this dissertation are invaluable to the software quality practitioner challenged by the absence or limited availability of defect data from previous software development experiences.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/12151
- Subject Headings
- Software measurement, Computer software--Quality control, Computer software--Reliability--Mathematical models, Software engineering--Quality control
- Format
- Document (PDF)
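The limited-defect-data scenario described above can be sketched with self-training, a simple relative of the semi-supervised EM approach: a classifier fit on the few labeled modules iteratively labels the unlabeled ones it is confident about. The data and threshold are assumptions:

    import numpy as np
    from sklearn.semi_supervised import SelfTrainingClassifier
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(6)
    X = rng.normal(size=(500, 4))
    y_true = (X[:, 0] + X[:, 1] > 0).astype(int)

    # Defect labels known for only 10% of modules; -1 marks "unlabeled"
    y = np.full(500, -1)
    labeled = rng.choice(500, size=50, replace=False)
    y[labeled] = y_true[labeled]

    model = SelfTrainingClassifier(DecisionTreeClassifier(max_depth=3),
                                   threshold=0.9)
    model.fit(X, y)
    print((model.predict(X) == y_true).mean())   # accuracy over all modules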
- Title
- Object recognition by genetic algorithm.
- Creator
- Li, Jianhua., Florida Atlantic University, Han, Chingping (Jim), Zhuang, Hanqi, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
- Vision systems have been widely used for parts inspection in electronics assembly lines. In order to improve the overall performance of a visual inspection system, it is important to employ an efficient object recognition algorithm. In this thesis work, a genetic algorithm based correlation algorithm is designed for the task of visual electronic parts inspection. The proposed procedure is composed of two stages. In the first stage, a genetic algorithm is devised to find a sufficient number of candidate image windows. For each candidate window, the correlation is performed between the sampled template and the image pattern inside the window. In the second stage, local searches are conducted in the neighborhood of these candidate windows. Among all the searched locations, the one that has the highest correlation value with the given template is selected as the best matched location. To apply the genetic algorithm technique, a number of important issues, such as selection of a fitness function, design of a coding scheme, and tuning of genetic parameters, are addressed in the thesis. Experimental studies have confirmed that the proposed GA-based correlation method is much more effective in terms of accuracy and speed in locating the desired object, compared with the existing Monte-Carlo random search method.
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/15225
- Subject Headings
- Genetic algorithms, Robots--Control systems, Computer vision, Quality control--Optical methods
- Format
- Document (PDF)
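The two-stage procedure described above can be sketched end to end: a small GA proposes window positions scored by normalized cross-correlation, then a local search refines the best candidate. The image size, template size, and all GA parameters are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(7)

    def ncc(window, template):
        # Normalized cross-correlation between an image window and the template
        w = window - window.mean()
        t = template - template.mean()
        denom = np.sqrt((w * w).sum() * (t * t).sum())
        return (w * t).sum() / denom if denom else 0.0

    # Synthetic 128x128 scene with the 16x16 template planted at (40, 70)
    image = rng.random((128, 128))
    template = rng.random((16, 16))
    image[40:56, 70:86] = template

    def fitness(pos):
        x, y = pos
        return ncc(image[x:x + 16, y:y + 16], template)

    # Stage 1: GA over window positions (selection, uniform crossover, mutation)
    pop = rng.integers(0, 113, size=(30, 2))
    for _ in range(40):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-10:]]            # keep the 10 fittest
        moms = parents[rng.integers(0, 10, 20)]
        dads = parents[rng.integers(0, 10, 20)]
        children = np.where(rng.random((20, 2)) < 0.5, moms, dads)
        children += rng.integers(-8, 9, (20, 2))           # mutation
        pop = np.clip(np.vstack([parents, children]), 0, 112)

    # Stage 2: local search around the best GA candidate
    best = max(pop, key=fitness)
    for _ in range(20):
        neighbors = np.clip(best + rng.integers(-2, 3, (8, 2)), 0, 112)
        cand = max(neighbors, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand
    print("best window at", best, "(planted at [40 70])")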
- Title
- The bureaucratic system: A positive or negative effect on nursing home quality of care?.
- Creator
- Lipsman, Lisa A., Florida Atlantic University, Evans, Arthur S.
- Abstract/Description
- Over the last fifty years quality of care has been a consistent problem in nursing home facilities. The federal government implemented a bureaucratic system as an attempt to improve this standard. This thesis traces the emergence of this system in nursing homes and illustrates its failure to solve the problem. George Ritzer's four-point McDonaldization model of bureaucracy is applied to argue that the bureaucratic system for governing nursing homes has a negative effect on the quality of care. Although this hypothesis has proven to be accurate, additional factors were consistently cited as having detrimental effects on resident care. These include issues such as insufficient pay and lack of training/education for CNAs. Moreover, human greed and societal views of the elderly appear to be the true root of the problem.
- Date Issued
- 2004
- PURL
- http://purl.flvc.org/fcla/dt/13148
- Subject Headings
- Long-term care of the sick--Quality control, Nursing home care--Quality control, Outcome assessment (Medical care), Long-term care facilities--Standards
- Format
- Document (PDF)
- Title
- Distribution and migration of pesticide residues in mosquito control impoundments St. Lucie County, Florida, USA.
- Creator
- Parkinson, R. W., Wang, Tsen C., White, J. R., David, J. R., Hoffman, M. E., Harbor Branch Oceanographic Institute
- Date Issued
- 1993
- PURL
- http://purl.flvc.org/FCLA/DT/3318877
- Subject Headings
- Mosquitoes--Control--Florida, Pesticides--Environmental aspects--Florida, Water quality--Florida--Saint Lucie County, Water quality management--Florida
- Format
- Document (PDF)
- Title
- A genetic algorithm for non-constrained process and economic process optimization.
- Creator
- Chirdchid, Sangthen., Florida Atlantic University, Masory, Oren, Mazouz, Abdel Kader, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
- Improving the quality of a product and manufacturing processes at a low cost is an economic and technological challenge which quality engineers and researchers must contend with. In general, the quality of products and their cost are the main concerns for manufacturers. This is because improving quality is crucial for staying competitive and improving the organization's market position. However, some difficulty remains in determining where the standard of good quality lies. Customers' satisfaction is key for setting the quality target. One possible solution is to develop control limits, which are the limits for indicating the area of nonconforming product, on the basis of minimizing the total cost or loss to the customer as well as to the manufacturer. Therefore, the goal of this dissertation is to develop an effective tool to achieve high product quality while maintaining minimum cost.
- Date Issued
- 2004
- PURL
- http://purl.flvc.org/fau/fd/FADT12081
- Subject Headings
- Genetic algorithms, Quality of products--Cost effectiveness--Econometric models, Multivariate analysis, Taguchi methods (Quality control)
- Format
- Document (PDF)
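The optimization the abstract targets, control limits that minimize total loss to customer and manufacturer, can be sketched with a tiny real-coded GA over the limit half-width. The quadratic (Taguchi-style) loss, scrap cost, and process distribution are invented placeholders:

    import numpy as np

    rng = np.random.default_rng(8)
    x = rng.normal(0.0, 1.0, 100_000)    # simulated deviations from target
    K_LOSS, SCRAP = 1.0, 2.5             # loss coefficient, scrap cost (assumed)

    def total_cost(d):
        # Taguchi-style quadratic loss inside the limits, scrap cost outside
        inside = np.abs(x) <= d
        return np.where(inside, K_LOSS * x**2, SCRAP).mean()

    pop = rng.uniform(0.1, 4.0, 40)      # candidate half-widths
    for _ in range(60):
        fit = np.array([total_cost(d) for d in pop])
        parents = pop[np.argsort(fit)[:10]]                          # 10 lowest-cost
        children = rng.choice(parents, 30) + rng.normal(0, 0.1, 30)  # mutate
        pop = np.clip(np.concatenate([parents, children]), 0.05, 5.0)

    best = min(pop, key=total_cost)
    print(f"half-width ~ {best:.2f}, expected cost {total_cost(best):.3f}")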
- Title
- Ensemble-classifier approach to noise elimination: A case study in software quality classification.
- Creator
- Joshi, Vedang H., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- This thesis presents a noise handling technique that attempts to improve the quality of training data for classification purposes by eliminating instances that are likely to be noise. Our approach uses twenty-five different classification techniques to create an ensemble of classifiers that acts as a noise filter on real-world software measurement datasets. Using a relatively large number of base-level classifiers for the ensemble-classifier filter facilitates achieving the desired level of noise removal conservativeness with several possible levels of filtering. It also provides a higher degree of confidence in the noise elimination procedure, as the results are less likely to be influenced by the (possibly) inappropriate learning bias of a few algorithms with twenty-five base-level classifiers than with a relatively small number of base-level classifiers. Empirical case studies of two different high assurance software projects demonstrate the effectiveness of our noise elimination approach by the significant improvement achieved in classification accuracies at various levels of filtering.
- Date Issued
- 2004
- PURL
- http://purl.flvc.org/fcla/dt/13144
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Case studies, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)
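The filter itself, dropping an instance when most of the ensemble misclassifies it under cross-validation, can be sketched at small scale (three base classifiers instead of twenty-five; labels and noise are synthetic):

    import numpy as np
    from sklearn.model_selection import cross_val_predict
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(9)
    X = rng.normal(size=(400, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    flip = rng.choice(400, 40, replace=False)
    y[flip] ^= 1                          # inject 10% label noise

    ensemble = [DecisionTreeClassifier(max_depth=4), GaussianNB(),
                KNeighborsClassifier()]
    # Each learner votes, out-of-fold, on whether it contradicts the given label
    votes = sum(cross_val_predict(clf, X, y, cv=5) != y for clf in ensemble)
    # Majority filter; use votes < len(ensemble) for a conservative consensus filter
    keep = votes <= len(ensemble) // 2
    print(f"flagged {int((~keep).sum())} suspects, kept {int(keep.sum())}")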
- Title
- Correcting noisy data and expert analysis of the correction process.
- Creator
- Seiffert, Christopher N., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
- This thesis expands upon an existing noise cleansing technique, polishing, enabling it to be used in the Software Quality Prediction domain, as well as any other domain where the data contains continuous values, as opposed to categorical data for which the technique was originally designed. The procedure is applied to a real world dataset with real (as opposed to injected) noise as determined by an expert in the domain. This, in combination with expert assessment of the changes made to the data, provides not only a more realistic dataset than one in which the noise (or even the entire dataset) is artificial, but also a better understanding of whether the procedure is successful in cleansing the data. Lastly, this thesis provides a more in-depth view of the process than previously available, in that it gives results for different parameters and classifier building techniques. This allows the reader to gain a better understanding of the significance of both model generation and parameter selection.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13223
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Computer programs, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)
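Polishing for continuous attributes, as extended here, can be approximated in a few lines: predict each attribute from the others with out-of-fold models, flag values with large residuals, and substitute the predicted value. The data, the single polished column, and the 2-sigma threshold are assumptions:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(10)
    X = rng.normal(size=(300, 4))
    X[:, 3] = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)
    noisy = rng.choice(300, 15, replace=False)
    X[noisy, 3] += rng.normal(0, 3, 15)       # corrupt one attribute

    def polish(X, col, threshold=2.0):
        # Predict `col` from the remaining attributes, out-of-fold, and
        # replace values whose residual exceeds `threshold` std deviations
        others = np.delete(X, col, axis=1)
        pred = cross_val_predict(LinearRegression(), others, X[:, col], cv=5)
        resid = X[:, col] - pred
        suspect = np.abs(resid) > threshold * resid.std()
        cleaned = X.copy()
        cleaned[suspect, col] = pred[suspect]
        return cleaned, suspect

    cleaned, suspect = polish(X, col=3)
    print(f"replaced {suspect.sum()} values (15 were corrupted)")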