Current Search: Department of Computer and Electrical Engineering and Computer Science
- Title
- A transputer-based fault-tolerant robot controller.
- Creator
- Kulkarni, Shubhada R., Florida Atlantic University, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In recent years robots have become increasingly important in many areas. A robotic controller requires high speed and high reliability, and its design must consider these two aspects. This thesis presents a design for a Transputer-based fault-tolerant robot controller. For concreteness, we have designed this controller for a specific robot, the SEDAB, a prototype developed by IBM Corp. This design attempts to satisfy the two requirements of speed and reliability. Speed is achieved by the use of a concurrent structure composed of Transputers. Reliability is provided by a self-testing mechanism and a multiprocessor system architecture. The Occam implementation of the robot processes is described. We have evaluated the reliability of this controller. The reliability study shows that there is a significant increase in the reliability of this controller due to the new architecture and the proposed fault detection mechanism. While we have not been able to actually control this robot, we have shown that some scheduling heuristics can be effectively used to provide a higher level of performance.
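The reliability gain the abstract reports comes from redundancy: a duplicated unit fails only if every copy fails. A minimal sketch of that standard calculation, with purely illustrative failure-rate numbers (not taken from the thesis):

```python
import math

def parallel_reliability(rels):
    """Reliability of redundant units where one survivor suffices."""
    q = 1.0
    for r in rels:
        q *= (1.0 - r)
    return 1.0 - q

# Single controller with an exponential failure law, lambda = 1e-4 per hour,
# over a 1000-hour mission (illustrative values only).
lam, t = 1e-4, 1000.0
single = math.exp(-lam * t)
duplex = parallel_reliability([single, single])
print(single, duplex)  # redundancy moves the figure closer to 1
```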
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14616
- Subject Headings
- Robots--Control systems, Automatic control--Computer programs
- Format
- Document (PDF)
- Title
- A procedure for evaluation and selection of computer-aided software engineering (CASE) tools for analysis and design.
- Creator
- Phillips, Steven David., Florida Atlantic University, Levow, Roy B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Due to the relative youth of the computer-aided software engineering (CASE) market and the lack of standards, evaluation of CASE tools is a difficult problem. This problem is made more difficult by the fact that no single CASE tool is able to satisfy the needs of all potential users. In addition, an incorrect choice is expensive in terms of money and time invested. In this thesis, the literature is surveyed and synthesized to produce procedures and criteria to be used in the evaluation and selection of CASE tools intended for the analysis and design phases of the software development life cycle.
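Evaluation-and-selection procedures of this kind are commonly realized as a weighted scoring matrix. The criteria, weights, and ratings below are hypothetical, not the thesis's actual criteria; this is only a sketch of the mechanism:

```python
def score_tool(weights, ratings):
    """Weighted sum of per-criterion ratings (ratings on a 1-5 scale here)."""
    return sum(weights[c] * ratings[c] for c in weights)

# Hypothetical criteria and weights -- not taken from the thesis.
weights = {"analysis support": 0.4, "design support": 0.3,
           "vendor stability": 0.2, "cost": 0.1}
tool_a = {"analysis support": 4, "design support": 3, "vendor stability": 5, "cost": 3}
tool_b = {"analysis support": 3, "design support": 5, "vendor stability": 3, "cost": 4}
candidates = {"tool_a": tool_a, "tool_b": tool_b}
best = max(candidates, key=lambda n: score_tool(weights, candidates[n]))
print(best)  # tool_a
```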
- Date Issued
- 1991
- PURL
- http://purl.flvc.org/fcla/dt/14701
- Subject Headings
- Electronic data processing--Structured techniques, System analysis, Computer software--Development, Computer-aided software engineering
- Format
- Document (PDF)
- Title
- A recovery metaprogram for fault diagnosis in a network of processors.
- Creator
- Pendse, Sateesh V., Florida Atlantic University, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Recent advances in computer technology have increased the performance of computers, but application requirements will always exceed the performance level available today. This requires the use of multiprocessors. The importance of multiprocessor systems is increasing due to many reasons, one of which is reliability. Reliability is also an important aspect in any computer system design. For reliable operation the system should be able to detect and locate most of its faults. The idea of using a set of processes collectively known as a Recovery Metaprogram (RMP) is applied in this thesis to system diagnosis. Several error location algorithms are analyzed and compared. Most of them are comparison methods. A new algorithm, called the Duplication algorithm, is developed and analyzed. Primitives, oriented to the specific functions of error diagnosis, required by the RMP to coordinate recovery functions are also developed in this thesis.
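Comparison-based diagnosis of the kind the abstract describes runs the same task on pairs of processors and examines the agreement pattern. The sketch below assumes at most one faulty unit, so the faulty processor appears in every disagreeing pair; this is an illustration of the comparison idea, not the thesis's Duplication algorithm itself:

```python
def compare_syndrome(outputs, pairs):
    """Mark each compared processor pair as agree (0) or disagree (1)."""
    return {(a, b): int(outputs[a] != outputs[b]) for a, b in pairs}

def suspect_set(syndrome):
    """Intersect the disagreeing pairs: with at most one faulty unit
    (assumption for this sketch), it appears in every disagreement."""
    suspects = None
    for pair, disagree in syndrome.items():
        if disagree:
            s = set(pair)
            suspects = s if suspects is None else suspects & s
    return suspects or set()

# Processor 2 computes a wrong result; duplicated runs expose it.
outputs = {0: 42, 1: 42, 2: 41, 3: 42}
pairs = [(0, 1), (1, 2), (2, 3), (3, 0)]
syndrome = compare_syndrome(outputs, pairs)
print(suspect_set(syndrome))  # {2}
```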
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14656
- Subject Headings
- Fault location (Engineering)--Data processing, Multiprocessors
- Format
- Document (PDF)
- Title
- A practical methodology for strong LL(k) parsing.
- Creator
- Feriozi, Dan T., Florida Atlantic University, Levow, Roy B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A new method of analyzing and parsing the strong LL(k) class of the deterministic context-free grammars is presented. The method is shown to be efficient when applied to many typical parsing applications such as compilers for common programming languages. Another advantage of the method is that it uses the familiar LL(1) parse table as a base. Parse conflicts that are found in the LL(1) parse table are treated as exceptions, and are resolved by consulting another table. This extension to the traditional LL(1) methodology increases its power significantly. Both the space and the time efficiency of the new method are shown to be much greater than that of the standard strong LL(k) method, when used on common grammars. Since the recognition problem for the strong LL(k) grammars is known to be NP-complete, the worst-case complexity of the new method is exponential, as is the case with the standard method. However, the complexity of the new method is highly dependent on the form of the grammar that is being analyzed. Both the space requirements and the time complexity of the new method are polynomial functions of the size of the input to the problem for common grammars, such as those used for programming languages. This is in contrast to the standard method that has been used for parsing the strong LL(k) languages. That method always uses space and time that is an exponential function of the size of the input.
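The "conflicts as exceptions" idea can be sketched concretely: drive the parser from an ordinary LL(1) table, and only when a cell is marked as a conflict consult a second table keyed by k tokens of lookahead. The toy grammar and both tables below are hypothetical, built only to show the mechanism:

```python
# Toy grammar (hypothetical, not from the thesis):
#   S -> 'a' 'b' | 'a' 'c'
# Both productions start with 'a', so the LL(1) cell (S, 'a') conflicts.
LL1 = {("S", "a"): "CONFLICT"}
EXC = {("S", ("a", "b")): ["a", "b"],   # k = 2 exception table
       ("S", ("a", "c")): ["a", "c"]}

def parse(tokens):
    toks = tokens + ["$"]
    stack = ["$", "S"]
    i = 0
    while stack:
        top = stack.pop()
        if top == "$":
            return toks[i] == "$"
        if top not in ("S",):              # terminal: must match the input
            if toks[i] != top:
                return False
            i += 1
            continue
        entry = LL1.get((top, toks[i]))
        if entry == "CONFLICT":            # consult the 2-token exception table
            entry = EXC.get((top, tuple(toks[i:i + 2])))
        if entry is None:
            return False
        stack.extend(reversed(entry))      # leftmost body symbol on top
    return False

print(parse(["a", "b"]), parse(["a", "c"]), parse(["a", "a"]))  # True True False
```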
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/12488
- Subject Headings
- Parsing (Computer grammar), Formal languages
- Format
- Document (PDF)
- Title
- PATH PLANNING ALGORITHMS FOR UNMANNED AIRCRAFT SYSTEMS WITH A SPACE-TIME GRAPH.
- Creator
- Steinberg, Andrew, Cardei, Mihaela, Cardei, Ionut, Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
Unmanned Aircraft Systems (UAS) have grown in popularity due to their widespread potential applications, including efficient package delivery, monitoring, surveillance, search and rescue operations, agricultural uses, along with many others. As UAS become more integrated into our society and airspace, it is anticipated that the development and maintenance of a path planning collision-free system will become imperative, as the safety and efficiency of the airspace represents a priority. The dissertation defines this problem as the UAS Collision-free Path Planning Problem. The overall objective of the dissertation is to design an on-demand, efficient and scalable aerial highway path planning system for UAS. The dissertation explores two solutions to this problem. The first solution proposes a space-time algorithm that searches for shortest paths in a space-time graph. The solution maps the aerial traffic map to a space-time graph that is discretized on the inter-vehicle safety distance. This helps compute safe trajectories by design. The mechanism uses space-time edge pruning to maintain the dynamic availability of edges as vehicles move on a trajectory. Pruning edges is critical to protect active UAS from collisions and safety hazards. The dissertation compares the solution with another related work to evaluate improvements in delay, run time scalability, and admission success while observing up to 9000 flight requests in the network. The second solution to the path planning problem uses a batch planning algorithm. This is a new mechanism that processes a batch of flight requests with prioritization on the current slack time. This approach aims to improve the planning success ratio. The batch planning algorithm is compared with the space-time algorithm to ascertain improvements in admission ratio, delay ratio, and running time, in scenarios with up to 10000 flight requests.
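The space-time idea can be illustrated in a few lines: search over states (x, y, t) and prune any space-time cell already granted to an active vehicle, so every path found is collision-free by construction. This is a minimal grid sketch of the concept, not the dissertation's graph model or code:

```python
from collections import deque

def plan(grid_w, grid_h, start, goal, reserved, max_t=50):
    """BFS over a space-time grid: states are (x, y, t).

    `reserved` holds (x, y, t) cells granted to other vehicles; pruning
    them keeps new trajectories conflict-free at those cells."""
    start_state = (start[0], start[1], 0)
    parent = {start_state: None}
    q = deque([start_state])
    while q:
        x, y, t = q.popleft()
        if (x, y) == goal:                 # BFS layers = time, so earliest arrival
            path, s = [], (x, y, t)
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        if t >= max_t:
            continue
        for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):  # wait or move
            nxt = (x + dx, y + dy, t + 1)
            if (0 <= nxt[0] < grid_w and 0 <= nxt[1] < grid_h
                    and nxt not in reserved and nxt not in parent):
                parent[nxt] = (x, y, t)
                q.append(nxt)
    return None

# Cell (1, 0) is reserved at t = 1, forcing a wait or a detour.
path = plan(3, 3, (0, 0), (2, 0), reserved={(1, 0, 1)})
print(path)
```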
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013696
- Subject Headings
- Unmanned aerial vehicles, Drone aircraft, Drone aircraft--Automatic control, Space and time, Algorithms
- Format
- Document (PDF)
- Title
- ILLUMINATING CYBER THREATS FOR SMART CITIES: A DATA-DRIVEN APPROACH FOR CYBER ATTACK DETECTION WITH VISUAL CAPABILITIES.
- Creator
- Neshenko, Nataliia, Furht, Borko, Bou-Harb, Elias, Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
A modern urban infrastructure no longer operates in isolation but instead leverages the latest technologies to collect, process, and distribute aggregated knowledge to improve the quality of the provided services and promote the efficiency of resource consumption. However, the ambiguity of ever-evolving cyber threats and their debilitating consequences introduce new barriers for decision-makers. Numerous techniques have been proposed to address the cyber misdemeanors against such critical realms and increase the accuracy of attack inference; however, they remain limited to detection algorithms, omitting attack attribution and impact interpretation. The lack of the latter makes transitioning these methods into operation difficult, if not impossible. In this dissertation, we first investigate the threat landscape of smart cities, survey and reveal the progress in data-driven methods for situational awareness, and evaluate their effectiveness when addressing various cyber threats. Further, we propose an approach that integrates machine learning, the theory of belief functions, and dynamic visualization to complement available attack inference for ICS deployed in the realm of smart cities. Our framework offers an extensive scope of knowledge as opposed to solely evident indicators of malicious activity. It gives the cyber operators and digital investigators an effective tool to dynamically and visually interact, explore and analyze heterogeneous, complex data, and provide rich context information. Such an approach is envisioned to facilitate the cyber incident interpretation and support a timely evidence-based decision-making process.
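The theory of belief functions mentioned in the abstract centers on combining evidence from independent sources with Dempster's rule. A small sketch with illustrative masses over an {attack, normal} frame (the dissertation's actual frame and mass assignments are not shown here):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb            # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    k = 1.0 - conflict                     # normalize away the conflict
    return {s: v / k for s, v in combined.items()}

# Two detectors report evidence over {attack, normal} (illustrative masses).
A, N = frozenset({"attack"}), frozenset({"normal"})
theta = A | N                              # "don't know" mass on the whole frame
m1 = {A: 0.6, theta: 0.4}
m2 = {A: 0.5, N: 0.2, theta: 0.3}
fused = dempster_combine(m1, m2)
print(fused)
```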
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013813
- Subject Headings
- Smart cities, Cyber intelligence (Computer security), Visual analytics, Threats
- Format
- Document (PDF)
- Title
- SPACE-TIME GRAPH PATH PLANNING FOR UAS TRAFFIC MANAGEMENT SYSTEMS.
- Creator
- Papa, Rafael, Cardei, Mihaela, Cardei, Ionut, Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
The unmanned aerial vehicle (UAV) technology has evolved considerably in recent years and the global demand for package delivery is expected to grow even more during COVID-19 and the social distancing era. The low cost of acquisition, payload capacity, maneuverability, and the ability to fly at low altitude with a very low cost of operation, make UAVs a perfect fit to revolutionize the payload transportation of small items. The large-scale adoption of drone package delivery in high-density urban areas can be challenging and the Unmanned Aircraft Systems (UAS) operators must ensure safety, security, efficiency and equity of the airspace system. In order to address some of these challenges, FAA and NASA have developed a new architecture that will support a set of services to enable cooperative management of low-altitude operations between UAS operators. The architecture is still in its conceptual stage and designing a mechanism that ensures the fair distribution of the available airspace to commercial applications has become increasingly important. Considering that, the path planning is one of the most important problems to be explored. The objective is not only to find an optimal and shortest path but also to provide a collision-free environment to the UAVs. Taking into consideration all these important aspects and others such as serving on-demand requests, flight duration limitation due to energy constraints, maintaining the safety distance to avoid collisions, and using warehouses as starting and ending points in parcel delivery, this dissertation proposes: (i) an energy-constrained scheduling mechanism using a multi-source A* algorithm variant, and (ii) a generalized path planning mechanism using a space-time graph with multi-source multi-destination BFS generalization to ensure pre-flight UAV collision-free trajectories. This dissertation also uses the generalized path planning mechanism to solve the energy-constrained drone delivery problem. The experimental results show that the proposed algorithms are computationally efficient and scalable with the number of requests and graph size.
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013861
- Subject Headings
- Unmanned aerial vehicles, Drone aircraft, Space and time
- Format
- Document (PDF)
- Title
- SPATIAL NETWORK BIG DATABASE APPROACH TO RESOURCE ALLOCATION PROBLEMS.
- Creator
- Qutbuddin, Ahmad, Yang, KwangSoo, Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
Resource allocation for Spatial Network Big Database is challenging due to the large size of spatial networks, the variety of types of spatial data, and the fast update rate of spatial and temporal elements. It is challenging to learn, manage and process the collected data and produce meaningful information in a limited time. Produced information must be concise and easy to understand. At the same time, the information must be very descriptive and useful. My research aims to address these challenges through the development of fundamental data processing components for advanced spatial network queries that clearly and briefly deliver critical information. This thesis proposal studied two challenging Spatial Network Big Database problems: (1) Multiple Resource Network Voronoi Diagram and (2) Node-attributed Spatial Graph Partitioning. To address the challenge of query processing for multiple resource allocation in preparing for or after a disaster, we investigated the problem of the Multiple Resource Network Voronoi Diagram (MRNVD). Given a spatial network and a set of service centers from k different resource types, a Multiple Resource Network Voronoi Diagram (MRNVD) partitions the spatial network into a set of Service Areas that can minimize the total cycle-distances of graph-nodes to allotted k service centers with different resource types. The MRNVD problem is important for critical societal applications such as assigning essential survival supplies (e.g., food, water, gas, and medical assistance) to residents impacted by man-made or natural disasters. The MRNVD problem is NP-hard; it is computationally challenging due to the large size of the transportation network. Previous work proposed the Distance bounded Pruning (DP) approach to produce an optimal solution for MRNVD. However, we found that DP can be generalized to reduce the computational cost for the minimum cycle-distance. We extend our prior work and propose a novel approach that reduces the computational cost. Experiments using real-world datasets from five different regions demonstrate that the proposed approach creates MRNVD and significantly reduces the computational cost.
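The building block behind a network Voronoi diagram is a multi-source shortest-path search: every node is assigned to the service center that reaches it first. The sketch below shows that single-resource case with Dijkstra; the MRNVD studied in the dissertation additionally minimizes cycle distances over k resource types, which this sketch does not attempt:

```python
import heapq

def network_voronoi(adj, centers):
    """Assign each graph node to its nearest service center.

    `adj` maps node -> list of (neighbor, edge_length). Runs one Dijkstra
    seeded from all centers at distance 0 (multi-source)."""
    dist = {c: 0.0 for c in centers}
    owner = {c: c for c in centers}
    pq = [(0.0, c, c) for c in centers]
    heapq.heapify(pq)
    while pq:
        d, u, c = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):  # stale heap entry
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                owner[v] = c
                heapq.heappush(pq, (nd, v, c))
    return owner

# Tiny road network: a path 0-1-2-3-4 with unit edges, centers at 0 and 4.
adj = {0: [(1, 1)], 1: [(0, 1), (2, 1)], 2: [(1, 1), (3, 1)],
       3: [(2, 1), (4, 1)], 4: [(3, 1)]}
print(network_voronoi(adj, [0, 4]))
```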
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013854
- Subject Headings
- Spatial data infrastructures, Big data--Data processing, Resource allocation, Voronoi polygons
- Format
- Document (PDF)
- Title
- A REVIEW AND ANALYSIS OF BOT-IOT SECURITY DATA FOR MACHINE LEARNING.
- Creator
- Peterson, Jared M., Khoshgoftaar, Taghi M., Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
Machine learning is having an increased impact on the Cyber Security landscape. The ability for predictive models to accurately identify attack patterns in security data is set to overtake more traditional detection methods. Industry demand has led to an uptick in research in the application of machine learning for Cyber Security. To facilitate this research many datasets have been created and made public. This thesis provides an in-depth analysis of one of the newest datasets, Bot-IoT. The full dataset contains about 73 million instances (big data), 3 dependent features, and 43 independent features. The purpose of this thesis is to provide researchers with a foundational understanding of Bot-IoT, its development, its features, its composition, and its pitfalls. It will also summarize many of the published works that utilize Bot-IoT and will propose new areas of research based on the issues identified in the current research and in the dataset.
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013838
- Subject Headings
- Machine learning, Cyber security, Big data
- Format
- Document (PDF)
- Title
- Electrical characterization of an innovative pad array carrier package for application specific electronic modules (ASEMs).
- Creator
- Nagaraja, Padma S., Florida Atlantic University, Barrett, Raymond L. Jr., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Pad Array Carrier (PAC) packaging is used for surface mounting of modules on printed circuit cards. The package described in this study has an added feature that allows for the testing of the package through the holes laid along the periphery of the package board. MAGIC was used in the design and layout of an 8 x 8 array size PAC. Two key contributors to electrical noise in the package, viz., cross talk and signal reflections, were analyzed. Transmission line models were developed for analyzing these parameters. HSPICE and HP 85150B Microwave Design Systems (MDS) were used to simulate the transmission line models to evaluate the effects of cross talk and signal reflections on the package board. The performance of the package for the speed and maintenance of signal integrity was evaluated. Guidelines specifying the physical geometry limitations for line length, line width, line spacing, and layout configurations required to meet specific noise budget (cross talk and signal reflection considerations) were established.
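Of the two noise sources analyzed, the signal-reflection side reduces to a single standard formula: the voltage reflection coefficient at a termination. A sketch with illustrative impedance values (not the thesis's measured package parameters), covering reflections only:

```python
def reflection_coefficient(z_load, z0):
    """Voltage reflection coefficient at a line termination:
    Gamma = (ZL - Z0) / (ZL + Z0)."""
    return (z_load - z0) / (z_load + z0)

z0 = 50.0                                   # characteristic impedance, ohms
print(reflection_coefficient(50.0, z0))     # matched load: no reflection
print(reflection_coefficient(75.0, z0))     # mismatch reflects 20% of the wave
print(reflection_coefficient(25.0, z0))     # low-impedance load inverts the wave
```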
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/14856
- Subject Headings
- Electronic packaging, Multichip modules (Microelectronics)
- Format
- Document (PDF)
- Title
- Industrial-strength formalization of object-oriented real-time systems.
- Creator
- Raghavan, Gopalakrishna., Florida Atlantic University, Larrondo-Petrie, Maria M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The goal of this dissertation is to propose an industrial-strength formal model for object-oriented real-time systems that captures real-time constraints using industry standard notations and tools. A light-weight formalization process is proposed that is semi-formal, graphical and easier to read and understand. This process supports formal behavior analysis, verification and validation. It is very effective in early detection of incompleteness and ambiguities in the specifications. The proposed process uses industry standard tools and fits well within stringent industrial schedules. Formal requirements analysis is conducted using High Level Message Sequencing Chart (HMSC) and Message Sequencing Chart (MSC). In the formal analysis phase, the static structures are modeled using Unified Modeling Language (UML) and the constraints are formalized using Object Constraint Language (OCL). System behavior is formally modeled using Specification and Description Language (SDL) during the formal design phase. SDL is used for behavior modeling due to wide commercial availability of SDL-based tools for formal behavior analysis and validation. Transition rules mapping from UML Class Diagrams and Statecharts to SDL models are proposed. SDL models are formally simulated and validated during the formal validation phase. Using the proposed process real-time clock, timer, periodic process, aperiodic process, resource and precedence constraints were formalized. Different types of timers, such as periodic, aperiodic, one-shot, fixed-interval and variable-interval timers are derived using inheritance models. Semaphore wait and signal operations are formalized as part of the resource constraint. Pre-conditions, post-conditions and invariants for the real-time constraints were captured using OCL. Behavior of the proposed models were captured using Statecharts. The proposed mapping rules were used to translate the behavior models to SDL. The SDL models were formally simulated and validated using Telelogic Software Development Tool (SDT). The tools allowed extensive model analysis and helped uncover several design flaws. The real-time constraints were stereotyped and packaged into reusable formal components. These components can be easily imported by applications. Two case studies, Cruise Control System and Bottle Filling System, are included to illustrate the use of the proposed process and the real-time package. The "industrial-strength" of the process was validated by utilizing the proposed process in an industrial project where it was found to accelerate the development process.
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/12632
- Subject Headings
- Object-oriented programming (Computer science), Real-time data processing, Formal methods (Computer science)
- Format
- Document (PDF)
- Title
- Intrusion detection in wireless networks: A data mining approach.
- Creator
- Nath, Shyam Varan., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The security of wireless networks has gained considerable importance due to the rapid proliferation of wireless communications. While computer network heuristics and rules are being used to control and monitor the security of Wireless Local Area Networks (WLANs), mining and learning behaviors of network users can provide a deeper level of security analysis. The objective and contribution of this thesis is three fold: exploring the security vulnerabilities of the IEEE 802.11 standard for wireless networks; extracting features or metrics, from a security point of view, for modeling network traffic in a WLAN; and proposing a data mining-based approach to intrusion detection in WLANs. A clustering- and expert-based approach to intrusion detection in a wireless network is presented in this thesis. The case study data is obtained from a real-world WLAN and contains over one million records. Given the clusters of network traffic records, a distance-based heuristic measure is proposed for labeling clusters as either normal or intrusive. The empirical results demonstrate the promise of the proposed approach, laying the groundwork for a clustering-based framework for intrusion detection in computer networks.
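The pipeline the abstract outlines (cluster the traffic records, then label each cluster by a distance heuristic) can be sketched end to end. Both the k-means step and the labeling rule below are stand-ins with illustrative thresholds, not the thesis's actual algorithm or features:

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k=2, iters=20):
    """Plain k-means with naive deterministic seeding (k = 2 in this sketch)."""
    centers = [points[0], points[-1]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        centers = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

def label_clusters(centers, groups, far2=25.0, small_frac=0.2):
    """Assumed form of a distance-based heuristic: a small cluster whose
    center lies far from the centroid of all centers is flagged intrusive."""
    n = sum(len(g) for g in groups)
    centroid = tuple(sum(c[d] for c in centers) / len(centers) for d in range(2))
    return ["intrusive" if dist2(c, centroid) > far2 and len(g) < small_frac * n
            else "normal" for c, g in zip(centers, groups)]

# Dense "normal" traffic near the origin plus a few outlying attack records.
rng = random.Random(0)
normal = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(40)]
attack = [(10 + 0.1 * i, 10 - 0.1 * i) for i in range(4)]
centers, groups = kmeans(normal + attack)
print(label_clusters(centers, groups))
```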
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13246
- Subject Headings
- Wireless communication systems, Data warehousing, Data mining, Telecommunication--Security measures, Computer networks--Security measures, Computer security
- Format
- Document (PDF)
- Title
- Kinematic modeling, identification and compensation of robot manipulators.
- Creator
- Zhuang, Hanqi, Florida Atlantic University, Hamano, Fumio, Roth, Zvi S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Theoretical and practical issues of kinematic modeling, measurement, identification and compensation are addressed in this dissertation. A comprehensive robot calibration methodology using a new Complete and Parametrically Continuous (CPC) kinematic model is presented. The dissertation focuses on model-based robot calibration techniques. Parametric continuity of a kinematic model is defined and discussed to characterize model singularity. Irreducibility is defined to facilitate error model...
Show moreTheoretical and practical issues of kinematic modeling, measurement, identification and compensation are addressed in this dissertation. A comprehensive robot calibration methodology using a new Complete and Parametrically Continuous (CPC) kinematic model is presented. The dissertation focuses on model-based robot calibration techniques. Parametric continuity of a kinematic model is defined and discussed to characterize model singularity. Irreducibility is defined to facilitate error model reduction. Issues of kinematic parameter identification are addressed by utilizing generic forms of linearized kinematic error models. The CPC model is a complete and parametrically continuous kinematic model capable of describing geometry and motion of a robot manipulator. Owing to the completeness of the CPC model, the transformation from the base frame to the world frame and from the tool frame to the last link frame can be modeled with the same modeling convention as the one used for internal link transformations. Due to the parametric continuity of the CPC model, numerical difficulties in kinematic parameter identification using error models are reduced. The CPC model construction, computation of the link parameters from a given link transformation, inverse kinematics, transformations between the CPC model and the Denavit-Hartenberg model, and linearized CPC error model construction are investigated. New methods for self-calibration of a laser tracking coordinate-measuring-machine are reported. Two calibration methods, one based on a four-tracker system and the other based on three trackers with a precision plane, are proposed. Iterative estimation algorithms along with simulation results are presented. Linear quadratic regulator (LQR) theory is applied to design robot accuracy compensators. 
In the LQR algorithm, additive corrections of joint commands are found without explicitly solving the inverse kinematic problem for the actual robot; the weighting matrix and coefficients in the cost function can be chosen systematically to achieve specific objectives, such as emphasizing the positioning accuracy of the end-effector over its orientation accuracy (or vice versa) and taking into account joint travel limits as well as singularity zones of the robot. The results of the kinematic identification and compensation experiments using the PUMA robot have shown that the CPC modeling technique presented in this dissertation is a convenient and effective means for accuracy improvement of industrial robots.
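The idea of additive joint-command corrections without explicitly solving the inverse kinematics can be illustrated by a minimal sketch. The 2-link planar arm, its link lengths, and the damping weight `rho` below are hypothetical stand-ins, not the dissertation's actual formulation; the damped least-squares step merely shows how a regularized correction of joint commands (where `rho` plays the role of an LQR-style control-effort weight) can reduce end-effector error.

```python
import math

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a hypothetical 2-link planar arm."""
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return (x, y)

def jacobian(q, l1=1.0, l2=1.0):
    """Analytic Jacobian of the 2-link arm's end-effector position."""
    s1, c1 = math.sin(q[0]), math.cos(q[0])
    s12, c12 = math.sin(q[0] + q[1]), math.cos(q[0] + q[1])
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def correction(q, target, rho=1e-3):
    """One damped least-squares step: dq = (J^T J + rho*I)^-1 J^T dx.
    rho regularizes the correction, analogous to penalizing control
    effort in a quadratic cost."""
    x, y = fk(q)
    dx = (target[0] - x, target[1] - y)
    J = jacobian(q)
    # A = J^T J + rho*I and b = J^T dx; 2x2 system solved by Cramer's rule.
    a11 = J[0][0]**2 + J[1][0]**2 + rho
    a12 = J[0][0]*J[0][1] + J[1][0]*J[1][1]
    a22 = J[0][1]**2 + J[1][1]**2 + rho
    b1 = J[0][0]*dx[0] + J[1][0]*dx[1]
    b2 = J[0][1]*dx[0] + J[1][1]*dx[1]
    det = a11*a22 - a12*a12
    return ((a22*b1 - a12*b2) / det, (a11*b2 - a12*b1) / det)
```

One correction step on a nominal joint command moves the end-effector substantially closer to the target pose, without ever inverting the kinematics in closed form.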
- Date Issued
- 1989
- PURL
- http://purl.flvc.org/fcla/dt/12243
- Subject Headings
- Robotics, Manipulators (Mechanism)
- Format
- Document (PDF)
- Title
- A MACHINE LEARNING APPROACH FOR OCEAN EVENT MODELING AND PREDICTION.
- Creator
- Muhamed, Ali Ali Abdullateef, Zhuang, Hanqi, Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
In the last decade, deep learning models have been successfully applied to a variety of applications and have solved many tasks. The ultimate goal of this study is to produce deep learning models that improve the skill of forecasting ocean dynamic events in general and those of the Loop Current (LC) system in particular. A specific forecast target is to predict the geographic location of the LC extension and the duration of LC eddy shedding events for a long lead time with high accuracy. This study also aims to improve the predictability of velocity fields (or, more precisely, velocity volumes) of subsurface currents. In this dissertation, several deep learning based prediction models have been proposed. The core of these models is the Long Short-Term Memory (LSTM) network. This type of recurrent neural network is trained with Sea Surface Height (SSH) and LC velocity datasets. The hyperparameters of these models are tuned according to each model's characteristics and data complexity. Prior to training, SSH and velocity data are decomposed into their temporal and spatial counterparts. A model using Robust Principal Component Analysis is proposed first, which produces a six-week lead time in forecasting SSH evolution. Next, the Wavelet+EOF+LSTM (WELL) model is proposed to improve the forecasting capability of a prediction model. This model is tested on the prediction of two LC eddies, namely eddies Cameron and Darwin. It is shown that the WELL model can predict the separation of the two eddies 10 and 14 weeks ahead respectively, which is two more weeks than the DAC model. Furthermore, the WELL model overcomes the problem caused by the partitioning step involved in the DAC model. For subsurface current forecasting, a layer partitioning method is proposed to predict the subsurface field of the LC system. A weighted average fusion is used to improve the consistency of the predicted layers of the 3D subsurface velocity field.
The main challenge in forecasting the LC and its eddies is the small number of events that have occurred over time (only once or twice a year), which makes the training task difficult. Forecasting the velocity of subsurface currents is equally challenging because of the limited in situ measurements.
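The weighted average fusion step mentioned above can be sketched minimally. The inverse-error weighting scheme below is an assumption for illustration, not the dissertation's exact formulation: each layer model's prediction of the shared velocity field is weighted inversely to a hypothetical validation error.

```python
def fuse_layers(predictions, errors):
    """Weighted-average fusion of per-layer predictions of the same
    velocity field. Each model's weight is the inverse of its
    (hypothetical) validation error, so more reliable layer models
    dominate the fused estimate."""
    weights = [1.0 / e for e in errors]
    total = sum(weights)
    n = len(predictions[0])
    # Fuse element-wise across the predicted fields.
    return [sum(w * p[i] for w, p in zip(weights, predictions)) / total
            for i in range(n)]
```

With equal errors this reduces to a plain average; as one model's error grows, its influence on the fused field shrinks proportionally.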
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013727
- Subject Headings
- Machine learning, Loop Current, Oceanography--Forecasting
- Format
- Document (PDF)
- Title
- CYBER-PHYSICAL SYSTEMS: BUILDING A SECURITY REFERENCE ARCHITECTURE FOR CARGO PORTS.
- Creator
- Romero, Virginia Mendiola, Fernandez, Eduardo B., Florida Atlantic University, Department of Computer and Electrical Engineering and Computer Science, College of Engineering and Computer Science
- Abstract/Description
-
Cyber-Physical Systems (CPS) are physical entities whose operations are monitored, coordinated, and controlled by a computing and communication core. These systems are highly heterogeneous and complex. Their numerous components and cross-domain complexity make attacks easy to propagate and security difficult to implement. Consequently, to secure these systems, they need to be built in a systematic and holistic way, where security is an integral part of the development lifecycle and not just an activity after development. These systems present a multitude of implementation details in their component units, so it is fundamental to use abstraction in the analysis and construction of their architecture. In particular, we can apply abstraction through the use of patterns. Pattern-based architectural modeling is a powerful way to describe the system and analyze its security and other non-functional aspects. Patterns also have the potential to unify the design of a system's computational, communication, and control aspects. Architectural modeling can be performed through UML diagrams to show the interactions and dependencies between different components and their stakeholders. It can also be used to analyze security threats and describe the possible countermeasures to mitigate them. An important type of CPS is a maritime container terminal, a facility where cargo containers are transported between ships and land vehicles (for example, trains or trucks) for onward transportation, and vice versa. Every cargo port performs four basic functions: receiving, storing, staging and loading, for both import and export containers. We present here a set of patterns that describe the elements and functions of a cargo port system, and a Reference Architecture (RA) built using these patterns. We analyze and systematically enumerate the possible security threats to a container terminal in a cargo port using activity diagrams derived from selected use cases of the system.
We describe these threats using misuse patterns, and from them select security patterns as defenses. The RA provides a framework to determine where to add these security mechanisms to stop or mitigate the threats and build a Security Reference Architecture (SRA) for CPS. An SRA is an abstract architecture describing a conceptual model of security that provides a way to specify security requirements for a wide range of concrete architectures. The analysis and design are given using a cargo port as our example, but the approach can be used in other domains as well. This is the first work we know of in which patterns and RAs are used to represent cargo ports and analyze their security.
- Date Issued
- 2021
- PURL
- http://purl.flvc.org/fau/fd/FA00013737
- Subject Headings
- Cyber-physical systems, Cooperating objects (Computer systems), Container terminals
- Format
- Document (PDF)
- Title
- Studies on information-theoretics based data-sequence pattern-discriminant algorithms: Applications in bioinformatic data mining.
- Creator
- Arredondo, Tomas Vidal., Florida Atlantic University, Neelakanta, Perambur S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This research refers to studies on information-theoretic (IT) aspects of data-sequence patterns and on developing discriminant algorithms that enable distinguishing the features of underlying sequence patterns having characteristic, inherent stochastic attributes. The application potentials of such algorithms include bioinformatic data mining efforts. Consistent with the scope of the study as above, considered in this research are specific details on information-theoretics and entropy considerations vis-a-vis sequence patterns (having stochastic attributes) such as the DNA sequences of molecular biology. Applying information-theoretic concepts (essentially in Shannon's sense), the following distinct sets of metrics are developed and applied in the algorithms developed for data-sequence pattern-discrimination applications: (i) divergence or cross-entropy algorithms of the Kullback-Leibler type and of the general Csiszár class; (ii) statistical distance measures; (iii) ratio-metrics; (iv) a Fisher-type linear-discriminant measure; and (v) a complexity metric based on information redundancy. These measures are judiciously adopted in ascertaining codon-noncodon delineations in DNA sequences that consist of crisp and/or fuzzy nucleotide domains across their chains. The Fisher measure is also used in codon-noncodon delineation and in motif detection. Relevant algorithms are used to test DNA sequences of human and some bacterial organisms. The relative efficacy of the metrics and the algorithms is determined and discussed. The potentials of such algorithms in supplementing the prevailing methods are indicated. Scope for future studies is identified in terms of persisting open questions.
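As an illustration of the divergence-based discrimination idea, a minimal Kullback-Leibler sketch follows. The reference nucleotide profiles, the mononucleotide windowing, and the decision rule are all hypothetical simplifications; the dissertation's actual metrics and fuzzy-domain handling are considerably more elaborate.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; q must be
    strictly positive wherever p is."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def nucleotide_freqs(seq):
    """Empirical A/C/G/T frequency profile of a sequence window."""
    n = len(seq)
    return [seq.count(b) / n for b in "ACGT"]

def discriminate(window, coding_ref, noncoding_ref):
    """Assign a window to whichever reference profile it diverges
    from least (a toy codon/noncodon decision rule)."""
    p = nucleotide_freqs(window)
    d_cod = kl_divergence(p, coding_ref)
    d_non = kl_divergence(p, noncoding_ref)
    return "coding" if d_cod < d_non else "noncoding"
```

A GC-rich window, for instance, would be pulled toward whichever reference profile is closer in the divergence sense, mirroring how cross-entropy metrics flag compositional bias between codon and noncodon regions.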
- Date Issued
- 2003
- PURL
- http://purl.flvc.org/fau/fd/FADT12057
- Subject Headings
- Data mining, Bioinformatics, Discriminant analysis, Information theory in biology
- Format
- Document (PDF)
- Title
- Synchronization in digital wireless radio receivers.
- Creator
- Nezami, Mohamed Khalid., Florida Atlantic University, Sudhakar, Raghavan, Helmken, Henry, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Time Division Multiple Access (TDMA) architecture is an established technology for digital cellular, personal and satellite communications, as it supports variable data rate transmission and simplified receiver design. Due to transmission bandwidth restrictions, increasing user demands and the necessity to operate at lower signal-to-noise ratio (SNR), TDMA systems employ high order modulation schemes such as M-ary Quadrature Amplitude Modulation (M-QAM) and burst transmission. Use of such techniques in low SNR fading channels causes degradations of carrier frequency error, phase rotation error, and symbol timing jitter. To compensate for the severe degradation due to additive white Gaussian noise (AWGN) and channel impairments, precise and robust synchronization algorithms are required. This dissertation deals with the synchronization techniques for TDMA receivers using short burst mode transmission, with emphasis on preamble-less feedforward synchronization schemes. The objective is to develop new algorithms for symbol timing, carrier frequency offset acquisition, and carrier phase tracking using preamble-less synchronization techniques. To this end, the currently existing synchronization algorithms are surveyed and analyzed. The performance evaluation of the developed algorithms is conducted through Monte Carlo simulations and theoretical analyses. The statistical properties of the proposed algorithms in AWGN and fading channels are evaluated in terms of the mean and variance of the estimated synchronization errors and their Cramer-Rao lower bounds. Based on the investigation of currently employed feedforward symbol timing algorithms, two new symbol timing recovery schemes are proposed for 16-QAM land mobile signals operating in fading channels. Both schemes achieve better performance in fading channels compared to their existing counterparts without increasing the complexity of the receiver implementation.
Further, based on the analysis of currently employed carrier offset and carrier phase recovery algorithms, two new algorithms are proposed for carrier acquisition and carrier tracking of mobile satellite systems utilizing short TDMA bursts with large frequency offsets. The proposed algorithms overcome some of the conventional problems associated with currently employed carrier recovery schemes in terms of capture range, speed of convergence, and stability.
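The feedforward carrier-acquisition idea can be illustrated with a basic delay-and-multiply frequency-offset estimator. This is a textbook sketch, not the dissertation's proposed algorithm; a real burst receiver must first strip the modulation (e.g., by raising M-PSK samples to the M-th power) and contend with noise and fading, and its capture range is limited to half the sample rate.

```python
import cmath
import math

def freq_offset_estimate(samples, fs):
    """Estimate a residual carrier frequency offset (Hz) from the
    average phase increment between consecutive complex baseband
    samples, taken at sample rate fs. Summing the products
    s[k] * conj(s[k-1]) before extracting the angle averages out
    sample-to-sample noise in the phase increments."""
    acc = sum(samples[k] * samples[k - 1].conjugate()
              for k in range(1, len(samples)))
    return cmath.phase(acc) * fs / (2 * math.pi)
```

On a clean complex tone the estimator recovers the offset exactly; under AWGN its variance shrinks as the burst length grows, which is why such feedforward estimators suit short TDMA bursts without preambles.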
- Date Issued
- 2001
- PURL
- http://purl.flvc.org/fcla/dt/11947
- Subject Headings
- Radio--Receivers and reception, Digital communications, Time division multiple access
- Format
- Document (PDF)
- Title
- Software quality modeling and analysis with limited or without defect data.
- Creator
- Seliya, Naeem A., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The key to developing high-quality software is the measurement and modeling of software quality. In practice, software measurements are often used as a resource to model and comprehend the quality of software. The use of software measurements to understand quality is accomplished by a software quality model that is trained using software metrics and defect data of similar, previously developed systems. The model is then applied to estimate the quality of the target software project. Such an approach assumes that defect data is available for all program modules in the training data. Various practical issues can cause an unavailability or limited availability of defect data from the previously developed systems. This dissertation presents innovative and practical techniques for addressing the problem of software quality analysis when there is limited or completely absent defect data. The proposed techniques for software quality analysis without defect data include an expert-based approach with unsupervised clustering and an expert-based approach with semi-supervised clustering. The proposed techniques for software quality analysis with limited defect data include a semi-supervised classification approach with the Expectation-Maximization algorithm and an expert-based approach with semi-supervised clustering. Empirical case studies of software measurement datasets obtained from multiple NASA software projects are used to present and evaluate the different techniques. The empirical results demonstrate the attractiveness, benefit, and definite promise of the proposed techniques. The newly developed techniques presented in this dissertation are invaluable to the software quality practitioner challenged by the absence or limited availability of defect data from previous software development experiences.
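The semi-supervised Expectation-Maximization approach can be sketched in one dimension. The two-class Gaussian model with a fixed, assumed variance below is a toy stand-in for the dissertation's actual method, which operates on multi-dimensional software metrics: labeled modules anchor the class means, while unlabeled modules contribute through soft E-step responsibilities.

```python
import math

def em_semisupervised(labeled, unlabeled, iters=20, sigma=1.0):
    """Tiny 1-D sketch of semi-supervised EM with two classes
    (e.g., not-fault-prone = 0, fault-prone = 1). `labeled` is a list
    of (metric_value, class_id) pairs; `unlabeled` is a list of
    metric values. Gaussian likelihoods use a fixed, assumed sigma.
    Returns the two estimated class means."""
    mu = [sum(x for x, c in labeled if c == k) /
          max(1, sum(1 for _, c in labeled if c == k)) for k in (0, 1)]
    for _ in range(iters):
        # E-step: soft responsibility of class 1 for each unlabeled point.
        resp = []
        for x in unlabeled:
            p0 = math.exp(-(x - mu[0]) ** 2 / (2 * sigma ** 2))
            p1 = math.exp(-(x - mu[1]) ** 2 / (2 * sigma ** 2))
            resp.append(p1 / (p0 + p1))
        # M-step: means from labeled (hard) plus unlabeled (soft) assignments.
        for k in (0, 1):
            num = sum(x for x, c in labeled if c == k)
            den = sum(1 for _, c in labeled if c == k)
            for x, r in zip(unlabeled, resp):
                w = r if k == 1 else 1.0 - r
                num += w * x
                den += w
            mu[k] = num / den
    return mu
```

Even with only a handful of labeled modules, the unlabeled data sharpen the class means, which is the essential benefit when defect labels are scarce.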
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/12151
- Subject Headings
- Software measurement, Computer software--Quality control, Computer software--Reliability--Mathematical models, Software engineering--Quality control
- Format
- Document (PDF)
- Title
- Self-calibration of laser tracking measurement system with planar constraints.
- Creator
- Motaghedi, Shui Hu., Florida Atlantic University, Zhuang, Hanqi, Roth, Zvi S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Laser tracking coordinate measuring machines have the potential of continuously measuring three-dimensional target coordinates in a large workspace with a fast sampling rate and high accuracy. Proper calibration of a laser tracking measurement system is essential prior to use of such a device for metrology. In the absence of a more accurate instrument for system calibration, one has to rely on self-calibration strategies. In this dissertation, a kinematic model that describes not only the motion but also the geometric variations of a multiple-beam laser tracking system was developed. The proposed model has the following features: (1) Target positions can be computed from both distance and angular measurements. (2) Through error analysis it was proven that even rough angular measurement may improve the overall system calibration results. A self-calibration method was proposed to calibrate intelligent machines with planar constraints. The method is also applied to the self-calibration of the laser tracking system and a standard PUMA 560 robot. Various calibration strategies utilizing planar constraints were explored to deal with different system setups. For each calibration strategy, issues about the error parameter estimation of the system were investigated to find out under which conditions these parameters can be uniquely estimated. These conditions revealed the applicability of the planar constraints to the system self-calibration. The observability conditions can serve as a guideline for the experimental setup when a planar constraint is utilized in machine calibration, including the calibration of laser tracking systems. Intensive simulation studies were conducted to check the validity of the theoretical results. Realistic noise values were injected into the system models to statistically assess the behavior of the self-calibration system under real-world conditions.
Various practical calibration issues were also explored in the simulations to pave the way for experimental investigation. The calibration strategies were also applied experimentally to calibrate a laser tracking system constructed at the Robotics Center at Florida Atlantic University.
- Date Issued
- 1999
- PURL
- http://purl.flvc.org/fcla/dt/12599
- Subject Headings
- Robots--Kinematics, Robotics--Calibration--Measurement, Robots--Control systems
- Format
- Document (PDF)
- Title
- Software fault prediction using tree-based models.
- Creator
- Seliya, Naeem A., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Maintaining superior quality and reliability in software systems is of utmost importance in today's world. Early fault prediction is a proven method for achieving this. Tree-based modelling is a simple and effective method that can be used to predict the number of faults in a software system. In this thesis, we use regression tree based modelling to predict the number of faults in a software module. The goal of this study is four-fold. First, a comparative study of the tree-based modelling tools CART and S-PLUS is performed; CART yielded simpler regression trees than those built by S-PLUS. Second, the least squares and least absolute deviation methods of CART are compared; it is shown that the latter yielded better results than the former. Third, the possible benefits of using principal components analysis when performing regression tree modelling are studied. The fourth and final study is a comparison of tree-based modelling with other prediction techniques, namely Case-Based Reasoning, Artificial Neural Networks, and Multiple Linear Regression.
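The least squares versus least absolute deviation comparison can be illustrated with a single regression stump split. This toy exhaustive search over one feature is a sketch of the criterion difference only, not of CART itself, which grows and prunes full trees over many metrics.

```python
def best_split(xs, ys, criterion="lse"):
    """Find the best threshold for a one-feature regression stump.
    criterion 'lse' scores a partition by the sum of squared errors
    around each side's mean; 'lad' by the sum of absolute deviations
    around each side's median (more robust to outlying fault counts)."""
    def cost(vals):
        if not vals:
            return 0.0
        if criterion == "lse":
            m = sum(vals) / len(vals)
            return sum((v - m) ** 2 for v in vals)
        med = sorted(vals)[len(vals) // 2]
        return sum(abs(v - med) for v in vals)

    pairs = sorted(zip(xs, ys))
    best_cost, best_thr = float("inf"), None
    # Try every midpoint between consecutive sorted feature values.
    for i in range(1, len(pairs)):
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2
        c = cost(left) + cost(right)
        if c < best_cost:
            best_cost, best_thr = c, thr
    return best_thr
```

On data with a clean break, both criteria find the same threshold; they diverge when a few extreme fault counts pull the mean (and hence the squared-error split) away from the bulk of the modules.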
- Date Issued
- 2001
- PURL
- http://purl.flvc.org/fcla/dt/12782
- Subject Headings
- Software measurement, Computer software--Quality control
- Format
- Document (PDF)