Current Search: Expert systems (Computer science)
- Title
- DESIGN/IMPLEMENTATION OF AN EXPERT SYSTEM USING A DATA BASE MANAGEMENT SYSTEM ARCHITECTURE IN DEVELOPMENT OF ITS INFERENCE ENGINE.
- Creator
- LABOONE, PERRY ALLEN., Florida Atlantic University, Hoffman, Frederick, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The subject of this thesis is the design and implementation of an expert system from a standard data base management software language. The advantages and limitations of such a system design are discussed and supported by an accompanying implementation. Both the design and the implementation demonstrate what gives the expertise or personification of human reasoning to a machine and why this type of reasoning is well suited to certain types of problems. This fundamental departure from traditional deterministic analytical problem solving is accomplished by developing a system that is heuristic in nature. This heuristic implementation provides for a system that assists in the development of an emerging solution, rather than a deterministic solution in and of itself (i.e., a system that is programmed with a set of meta-knowledge rules that governs the decision-making process and acts upon a second set of knowledge rules).
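The two-tier architecture the abstract describes, meta-knowledge rules governing which knowledge rules fire, can be sketched as a small forward-chaining engine. This is a minimal illustration only; the rule names, facts, and conflict-resolution heuristic below are invented, not taken from the thesis.

```python
# Minimal two-tier rule engine: meta-knowledge selects which knowledge
# rules may fire; knowledge rules derive new facts until quiescence.
# All rules and facts here are hypothetical illustrations.

knowledge_rules = [
    # (name, premises, conclusion)
    ("r1", {"fever", "cough"}, "flu_suspected"),
    ("r2", {"flu_suspected"}, "recommend_rest"),
    ("r3", {"rash"}, "allergy_suspected"),
]

def meta_select(rules, facts):
    """Meta-knowledge layer: keep rules whose premises are all satisfied
    and whose conclusion is not yet known, ranked by specificity
    (a simple conflict-resolution heuristic)."""
    eligible = [r for r in rules if r[1] <= facts and r[2] not in facts]
    return sorted(eligible, key=lambda r: len(r[1]), reverse=True)

def infer(initial_facts):
    """Fire the highest-ranked eligible rule until none remain."""
    facts = set(initial_facts)
    while True:
        choices = meta_select(knowledge_rules, facts)
        if not choices:
            return facts
        facts.add(choices[0][2])

print(sorted(infer({"fever", "cough"})))
```

The meta-layer here only ranks and gates rules, but the same hook is where more elaborate meta-knowledge (rule priorities, resource limits, strategy switching) would live.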
- Date Issued
- 1986
- PURL
- http://purl.flvc.org/fcla/dt/14349
- Subject Headings
- Expert systems (Computer science), Systems design
- Format
- Document (PDF)
- Title
- An expert system for the selection and design of retaining walls.
- Creator
- Lee, Sunghoon., Florida Atlantic University, Arockiasamy, Madasamy, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
This thesis presents a procedure for the selection and design of retaining walls using an expert system. The selection part is formulated in the form of production rules by OPS5, a programming language for production systems, and the design part is written in the procedural language, BASIC. Nine different types of retaining walls are incorporated in the knowledge base of the selection part, and three types of walls in the design part of the expert system. The selection and design parts are combined using OPS5 support routines.
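The selection part, production rules matching site conditions to wall types, can be sketched in the spirit of an OPS5 recognize-act cycle. The conditions and wall types below are illustrative assumptions, not the thesis's actual nine-type rule base.

```python
# Illustrative production rules for retaining-wall selection.
# Conditions and wall types are hypothetical, not the thesis's rule base.

rules = [
    # (condition over site data, recommended wall type)
    (lambda s: s["height_m"] <= 3 and s["budget"] == "low", "gravity wall"),
    (lambda s: 3 < s["height_m"] <= 8, "cantilever wall"),
    (lambda s: s["height_m"] > 8, "counterfort wall"),
]

def select_wall(site):
    """Return every wall type whose rule premises match the site data."""
    return [wall for cond, wall in rules if cond(site)]

print(select_wall({"height_m": 5, "budget": "medium"}))
```

In a real system the matched recommendations would then be handed to the procedural design routines, mirroring the OPS5-to-BASIC hand-off the abstract describes.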
- Date Issued
- 1989
- PURL
- http://purl.flvc.org/fcla/dt/14542
- Subject Headings
- Retaining walls, Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- Modeling and analysis of security.
- Creator
- Ajaj, Ola, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Cloud Computing is a new computing model consisting of a large pool of hardware and software resources on remote datacenters that are accessed through the Internet. Cloud Computing faces significant obstacles to its acceptance, such as security, virtualization, and lack of standardization. There is a long-running debate about the role of Cloud standards, more demands for them keep being put on the table, and the Cloud standardization landscape remains ambiguous. To model and analyze security standards for Cloud Computing and web services, we surveyed Cloud standards, focusing on the standards for security, and classified them by groups of interest. Cloud Computing leverages a number of technologies such as Web 2.0, virtualization, and Service-Oriented Architecture (SOA). SOA uses web services to facilitate the creation of SOA systems by adopting different technologies despite their differences in formats and protocols. Several committees such as W3C and OASIS are developing standards for web services; their standards are rather complex and verbose. We have expressed web services security standards as patterns to make it easy for designers and users to understand their key points. We have written two patterns for two web services standards, WS-SecureConversation and WS-Federation, completing our earlier work on web services standards. We showed relationships between web services security standards and used them to solve major Cloud security issues, such as authorization and access control, trust, and identity management. Close to web services, we investigated the Business Process Execution Language (BPEL), and we addressed security considerations in BPEL and how to enforce them. To see how Cloud vendors look at web services standards, we took Amazon Web Services (AWS) as a case study; our review of the AWS documentation found that web services security standards are barely mentioned. We highlighted some areas where web services security standards could address some AWS limitations and improve the AWS security process. Finally, we studied the security guidance of two major Cloud-developing organizations, CSA and NIST. Both overlook the quality attributes offered by web services security standards. We expanded their work and added the benefits of adopting web services security standards in securing the Cloud.
- Date Issued
- 2013
- PURL
- http://purl.flvc.org/fau/fd/FA0004001
- Subject Headings
- Cloud Computing, Computational grids (Computer systems), Computer network architectures, Expert systems (Computer science), Web services -- Management
- Format
- Document (PDF)
- Title
- Knowledge based expert system for the analysis and rating of Florida bridges.
- Creator
- Sawka, Mark Joseph., Florida Atlantic University, Arockiasamy, Madasamy, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
The automation of bridge strength evaluation by utilizing artificial intelligence tools is presented herein. The study involved the development of a microcomputer-based expert system, REX (Rating EXpert). REX is an interactive menu-driven system combining the expert system shell EXSYS, the grillage program 'BRIDGES', and various other pre- and post-processing devices. The system is capable of analyzing and assessing the load-carrying capacity of solid and voided slab, and AASHTO girder and slab bridges.
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/14841
- Subject Headings
- Bridges--Florida, Bridges--Testing, Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- Transforming directed graphs into uncertain rules.
- Creator
- Lantigua, Jose Salvador., Florida Atlantic University, Hoffman, Frederick, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The intent of this thesis is to show how rule structures can be derived from influence diagrams and how these structures can be mapped to existing rule-based shell paradigms. We demonstrate this mapping with an existing shell having the Evidence (E) --> Hypothesis (H), Certainty Factor (CF) paradigm structure. Influence diagrams are graphical representations of hypothesis-to-evidence relationships, directed forms of Bayesian influence networks. These allow for inferencing about both diagnostic and predictive (or causal) behavior based on uncertain evidence. We show how this can be implemented through a Probability (P) to CF mapping algorithm and a rule-set conflict resolution methodology. The thesis discusses applying the probabilistic semantics of Bayesian networks and decision theory to derive qualitative assertions about the likelihood of an occurrence, the sensitivity of a conclusion, and other indicators of usefulness. We show an example of this type of capability by adding a probability range function for the premise clause in our shell's rule structure.
- Date Issued
- 1989
- PURL
- http://purl.flvc.org/fcla/dt/14570
- Subject Headings
- Decision-making--Mathematical models, Probabilities, Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- Model-Driven Architecture and the Secure Systems Methodology.
- Creator
- Morrison, Patrick, Fernandez, Eduardo B., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
As a companion and complement to the work being done to build a secure systems methodology, this thesis evaluates the use of Model-Driven Architecture (MDA) in support of the methodology's lifecycle. The development lifecycle illustrated follows the recommendations of this secure systems methodology, while using MDA models to represent requirements, analysis, design, and implementation information. In order to evaluate MDA, we analyze a well-understood distributed systems security problem, remote access, as illustrated by the internet "secure shell" protocol, ssh. By observing the ability of MDA models and transformations to specify remote access in each lifecycle phase, MDA's strengths and weaknesses can be evaluated in this context. A further aim of this work is to extract concepts that can be contained in an MDA security metamodel for use in future projects.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012537
- Subject Headings
- Expert systems (Computer science), Software engineering, Computer-aided design, Computer network architectures
- Format
- Document (PDF)
- Title
- Unifying the conceptual levels of network security through the use of patterns.
- Creator
- Kumar, Ajoy, Fernandez, Eduardo B., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Network architectures are described by the reference model of the International Organization for Standardization (ISO), which contains seven layers. The internet uses four of these layers, of which three are of interest to us: the Internet Protocol (IP) or Network Layer, the Transport Layer, and the Application Layer. We need to protect against attacks that may come through any of these layers. In the world of network security, systems are plagued by various attacks, internal and external, which can result in Denial of Service (DoS) and/or other damaging effects. Such attacks and loss of service can be devastating for the users of the system. The implementation of security devices such as Firewalls and Intrusion Detection Systems (IDS), the protection of network traffic with Virtual Private Networks (VPNs), and the use of secure protocols for the layers are important to enhance the security at each of these layers. We have done a survey of the existing network security patterns and have written the missing patterns. We have developed security patterns for an abstract IDS, behavior-based IDS, and rule-based IDS, as well as for the Internet Protocol Security (IPSec) and Transport Layer Security (TLS) protocols. We have also identified the need for a VPN pattern and have developed security patterns for an abstract VPN, an IPSec VPN, and a TLS VPN. We also evaluated these patterns with respect to some aspects that simplify their application by system designers. We have tried to unify the security of the network layers using security patterns by tying together security patterns for network transmission, network protocols, and network boundary devices.
- Date Issued
- 2014
- PURL
- http://purl.flvc.org/fau/fd/FA00004132
- Subject Headings
- Computer architecture, Computer network architectures, Computer network protocols, Computer networks -- Security measures, Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- Intelligent systems using GMDH algorithms.
- Creator
- Gupta, Mukul., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Design of intelligent systems that can learn from the environment and adapt to changes in the environment has been pursued by many researchers in this age of information technology. The Group Method of Data Handling (GMDH) algorithm implemented here is a multilayered neural network. A neural network consists of neurons that use information acquired in training to deduce relationships in order to predict future responses. With most software tools, simulating neural-network-based algorithms on a sequential, single-processor machine in languages like Pascal, C, or C++ takes several hours or even days. In this thesis, the GMDH algorithm was modified and implemented in a software tool written in Verilog HDL and tested with a specific application (XOR) to make the simulation faster. The tool is intended to be general enough to have a wide range of uses, yet robust enough to give accurate results for all of those uses. Most applications of neural networks are basically software simulations of the algorithms only, but this thesis also develops a hardware design of the algorithm so that it can be easily implemented on hardware using Field Programmable Gate Array (FPGA) type devices. The design is small enough to require a minimum amount of memory, circuit space, and propagation delay.
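A single GMDH layer is built from "partial description" neurons: quadratic polynomials over pairs of inputs, fitted by least squares. The sketch below shows one such neuron fitted to the XOR data the thesis used as its test application; it is a high-level Python illustration of the algorithm, not the thesis's Verilog implementation.

```python
import numpy as np

def fit_gmdh_neuron(x1, x2, y):
    """Fit one GMDH partial description by ordinary least squares:
    y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def eval_gmdh_neuron(coeffs, x1, x2):
    a0, a1, a2, a3, a4, a5 = coeffs
    return a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1**2 + a5*x2**2

# XOR truth table, the specific application mentioned in the abstract.
x1 = np.array([0., 0., 1., 1.])
x2 = np.array([0., 1., 0., 1.])
y  = np.array([0., 1., 1., 0.])
c  = fit_gmdh_neuron(x1, x2, y)
print(np.round(eval_gmdh_neuron(c, x1, x2)))
```

The cross-term a3*x1*x2 is what lets a single quadratic neuron represent XOR exactly (y = x1 + x2 - 2*x1*x2 on binary inputs); a full GMDH network would stack such neurons in layers, keeping only the best-performing pairs.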
- Date Issued
- 2010
- PURL
- http://purl.flvc.org/FAU/2976442
- Subject Headings
- GMDH algorithms, Genetic algorithms, Pattern recognition systems, Expert systems (Computer science), Neural networks (Computer science), Fuzzy logic, Intelligent control systems
- Format
- Document (PDF)
- Title
- Progress towards push button verification for business process execution language artifacts.
- Creator
- Vargas, Augusto., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Web Service Business Process Execution Language (BPEL) has become a standard language in the world of Service-Oriented Architecture (SOA) for specifying interactions between internet services. This standard frees developers from low-level concerns involving platform, implementation, and versioning. These freedoms risk the development of less robust artifacts that may even become part of a mission-critical system. Model checking a BPEL artifact for correctness with respect to temporal logic properties is computationally complex, since it requires enumerating all communication and synchronization of the various services with each other. This entails modeling BPEL features such as concurrency, hierarchy, interleaving, and non-deterministic choice. This thesis provides rules and procedures for translating these features to a verifiable model written in Promela. We use these rules to build a program which automates the translation process, bringing us one step closer to push-button verification. Finally, two BPEL artifacts are translated, manually edited, verified, and analyzed.
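The flavor of such a translation can be conveyed by a toy structure-directed mapping from a BPEL-like activity tree to Promela text. The activity encoding, channel naming, and emitted Promela here are invented for illustration; the thesis's actual translation rules are far more complete (fault handlers, links, correlation, etc.).

```python
# Toy sketch of translating a BPEL-like activity tree into Promela text.
# Representation and naming are hypothetical, not the thesis's rules.

def to_promela(activity, indent=0):
    pad = "  " * indent
    kind = activity[0]
    if kind == "invoke":                       # async call -> channel send
        return f"{pad}ch_{activity[1]} ! request;"
    if kind == "receive":                      # inbound message -> channel receive
        return f"{pad}ch_{activity[1]} ? msg;"
    if kind == "sequence":                     # children run in order
        return "\n".join(to_promela(a, indent) for a in activity[1])
    if kind == "flow":                         # concurrency -> spawn processes
        body = "\n".join(f"{pad}  run branch_{i}();"
                         for i, _ in enumerate(activity[1]))
        return f"{pad}atomic {{\n{body}\n{pad}}}"
    raise ValueError(f"unknown activity: {kind}")

bpel = ("sequence", [("receive", "order"),
                     ("flow", [("invoke", "ship"), ("invoke", "bill")]),
                     ("invoke", "reply")])
print(to_promela(bpel))
```

The interesting cases are exactly the ones the abstract lists: `flow` (concurrency) maps to spawned Promela processes whose interleavings SPIN then explores exhaustively.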
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/369386
- Subject Headings
- Electronic commerce, Computer programs, Computer network architectures, Expert systems (Computer science), Web servers, Management, Computer systems, Verification
- Format
- Document (PDF)
- Title
- A pattern-driven process for secure service-oriented applications.
- Creator
- Delessy, Nelly A., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
During the last few years, Service-Oriented Architecture (SOA) has been considered to be the new phase in the evolution of distributed enterprise applications. Even though there is a common acceptance of this concept, a real problem hinders the widespread use of SOA: a methodology to design and build secure service-oriented applications is needed. In this dissertation, we design a novel process to secure service-oriented applications. Our contribution is original not only because it applies the MDA approach to the design of service-oriented applications but also because it allows their securing by dynamically applying security patterns throughout the whole process. Security patterns capture security knowledge and describe security mechanisms. In our process, we present a structured map of security patterns for SOA and web services and its corresponding catalog. At the different steps of a software lifecycle, the architect or designer needs to make some security decisions. An approach using a decision tree made of security pattern nodes is proposed to help in making these choices. We show how to extract a decision tree from our map of security patterns. Model-Driven Architecture (MDA) is an approach which promotes the systematic use of models during a system's development lifecycle. In the dissertation we describe a chain of transformations necessary to obtain secure models of the service-oriented application. A main benefit of this process is that it decouples the application domain expertise from the security expertise, both of which are needed to build a secure application. Security knowledge is captured by pre-defined security patterns, their selection is made easier by using the decision trees, and their application can be automated. A consequence is that the inclusion of security during the software development process becomes more convenient for the architects/designers. A second benefit is that the insertion of security is semi-automated and traceable. Thus, the process is flexible and can easily adapt to changing requirements. Given that SOA was developed in order to provide enterprises with modular, reusable and adaptable architectures, but that security was the principal factor that hindered its use, we believe that our process can act as an enabler for service-oriented applications.
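A decision tree whose nodes lead to security patterns can be sketched as follows. The questions and pattern names below are hypothetical examples standing in for the dissertation's actual pattern map.

```python
# Sketch of a decision tree guiding security-pattern selection.
# Questions and pattern names are hypothetical illustrations.

tree = {
    "question": "Does the service cross trust boundaries?",
    "yes": {
        "question": "Is message-level security required?",
        "yes": {"patterns": ["WS-Security", "Security Token Service"]},
        "no":  {"patterns": ["Transport-Layer Security"]},
    },
    "no": {"patterns": ["Role-Based Access Control"]},
}

def choose_patterns(node, answers):
    """Walk the tree using a dict of pre-collected yes/no answers
    and return the security patterns at the reached leaf."""
    while "patterns" not in node:
        node = node["yes"] if answers[node["question"]] else node["no"]
    return node["patterns"]

print(choose_patterns(tree, {
    "Does the service cross trust boundaries?": True,
    "Is message-level security required?": True,
}))
```

In the process the abstract describes, the designer answers such questions at each lifecycle step, and the selected patterns are then applied to the MDA models via transformations.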
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/FAU/58003
- Subject Headings
- Computer network architectures, Web servers, Management, Software engineering, Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- A Comparison of Model Checking Tools for Service Oriented Architectures.
- Creator
- Venkat, Raghava, Khoshgoftaar, Taghi M., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Recently, most of the research pertaining to Service-Oriented Architecture (SOA) is based on web services and how secure they are in terms of efficiency and effectiveness. This requires validation, verification, and evaluation of web services. Verification and validation should be collaborative when web services from different vendors are integrated together to carry out a coherent task. For this purpose, novel model checking technologies have been devised and applied to web services. Model checking is a promising technique for the verification and validation of software systems. WS-BPEL (Business Process Execution Language for Web Services) is an emerging standard language to describe web service composition behavior. The advanced features of BPEL, such as concurrency and hierarchy, make it challenging to verify BPEL models. Based on these factors, this thesis surveys several important model checking technologies (tools) and compares them based on their functional and non-functional properties. The comparison rests on three case studies (small, medium, and large), for each of which we construct a synthetic web service composition, as not many compositions are publicly available [1]. The first case study, the small case, is the "Enhanced LoanApproval Process". The second, of medium size, is the "Enhanced Purchase Order Process". The third and largest is based on a scientific workflow pattern, the "Service Oriented Architecture Implementing BOINC Workflow", built on the BOINC (Berkeley Open Infrastructure for Network Computing) architecture.
- Date Issued
- 2007
- PURL
- http://purl.flvc.org/fau/fd/FA00012565
- Subject Headings
- Computer network architectures, Expert systems (Computer science), Software engineering, Web servers--Management
- Format
- Document (PDF)
- Title
- Simulation of autonomous knowledge-based navigation in unknown two-dimensional environment with polygonal obstacles.
- Creator
- McKendrick, John DeMilly., Florida Atlantic University, Cheng, Linfu, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The problem of finding optimal paths for a robot navigating in an environment where the position of each obstacle is precisely known has received much attention in the literature; however, the majority of application problems would require a robot to navigate in a completely unknown environment. This paper focuses on an approach to solving the problem of robot navigation in an unknown, unstructured, two-dimensional environment where the positions of the polygonal obstacles are fixed in time. Few studies have reported on the utilization of an expert system to govern robot motion. This study relied on a knowledge-based expert system that interacted with lower-level procedures to carry out path-finding and exploration functions. The expert-system shell used was OPS5, which ran on top of Lisp.
- Date Issued
- 1988
- PURL
- http://purl.flvc.org/fcla/dt/14496
- Subject Headings
- Robots--Motion, Expert systems (Computer science), Robots--Motion--Computer simulation
- Format
- Document (PDF)
- Title
- Finite safety models for high-assurance systems.
- Creator
- Sloan, John C., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Preventing bad things from happening to engineered systems demands improvements to how we model their operation with regard to safety. Safety-critical and fiscally-critical systems both demand automated and exhaustive verification, which is only possible if the models of these systems, along with the number of scenarios spawned from these models, are tractably finite. To this end, this dissertation addresses problems of a model's tractability and usefulness. It addresses the state space minimization problem by initially considering tradeoffs between state space size and level of detail or fidelity. It then considers the problem of human interpretation in model capture from system artifacts, by seeking to automate model capture. It introduces human control over level of detail, and hence state space size, during model capture. Rendering that model in a manner that can guide human decision making is also addressed, as is an automated assessment of system timeliness. Finally, it addresses state compression and abstraction using logical fault models like fault trees, which enable exhaustive verification of larger systems by subsequent use of transition fault models like Petri nets, timed automata, and process-algebraic expressions. To illustrate these ideas, this dissertation considers two very different applications: web service compositions and submerged ocean machinery.
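The logical fault models mentioned above can be illustrated with a minimal fault-tree evaluator: AND/OR gates over basic failure events determine whether the top event occurs. The tree below is a made-up example, not one of the dissertation's models.

```python
# Minimal fault-tree evaluation: AND/OR gates over basic events.
# The example tree is hypothetical, for illustration only.

def top_event(node, failed):
    """Evaluate whether the top event occurs given a set of failed
    basic events. Nodes are ('basic', name) or ('and'|'or', children)."""
    kind = node[0]
    if kind == "basic":
        return node[1] in failed
    children = (top_event(c, failed) for c in node[1])
    return all(children) if kind == "and" else any(children)

# Top event occurs if the pump fails, or if both redundant sensors fail.
tree = ("or", [("basic", "pump"),
               ("and", [("basic", "sensor_a"), ("basic", "sensor_b")])])

print(top_event(tree, {"sensor_a"}))              # one redundant sensor down
print(top_event(tree, {"sensor_a", "sensor_b"}))  # both sensors down
```

State compression in the dissertation's sense works in the other direction: a fault tree abstracts away transition detail, so only the scenarios it flags need be expanded into Petri-net or automata models for exhaustive checking.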
- Date Issued
- 2010
- PURL
- http://purl.flvc.org/FAU/2683206
- Subject Headings
- System failures (Engineering), Prevention, Sustainable engineering, Finite element method, Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- Patterns for web services standards.
- Creator
- Ajaj, Ola, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Web services intend to provide an application integration technology that can be successfully used over the Internet in a secure, interoperable and trusted manner. Policies are high-level guidelines defining the way an institution conducts its activities. The WS-Policy standard describes how to apply policies of security definition, enforcement of access control, authentication and logging. WS-Trust defines a security token service and a trust engine which are used by web services to authenticate other web services. Using the functions defined in WS-Trust, applications can engage in secure communication after establishing trust. BPEL is a language for web service composition that intends to provide convenient and effective means for application integration over the Internet. We address security considerations in BPEL and how to enforce them, as well as its interactions with other web services standards such as WS-Security and WS-Policy.
- Date Issued
- 2010
- PURL
- http://purl.flvc.org/FAU/1927300
- Subject Headings
- Computational grids (Computer systems), Computer systems, Verification, Expert systems (Computer science), Computer network architectures, Web servers, Management, Electronic commerce, Computer programs
- Format
- Document (PDF)
- Title
- Context-based Image Concept Detection and Annotation.
- Creator
- Zolghadr, Esfandiar, Furht, Borko, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Scene understanding attempts to produce a textual description of the visible and latent concepts in an image to describe the real meaning of the scene. Concepts are objects, events or relations depicted in an image. To recognize concepts, the decisions of an object detection algorithm must be enhanced from visual similarity to semantic compatibility. Semantically relevant concepts convey the most consistent meaning of the scene. Object detectors analyze visual properties (e.g., pixel intensities, texture, color gradient) of sub-regions of an image to identify objects. The initially assigned object names must be further examined to ensure they are compatible with each other and with the scene. By enforcing inter-object dependencies (e.g., co-occurrence, spatial and semantic priors) and object-to-scene constraints as background information, a concept classifier predicts the most semantically consistent set of names for the discovered objects. This additional background information describing concepts is called context. In this dissertation, a framework for context-based concept detection is presented that uses a combination of multiple contextual relationships to refine the results of underlying feature-based object detectors and produce the most semantically compatible concepts. Besides their inability to capture semantic dependencies, object detectors are impaired by the high dimensionality of the feature space. Variance in the image (i.e., quality, pose, articulation, illumination, and occlusion) can also yield low-quality visual features that reduce the accuracy of the detected concepts. The object detectors used in the context-based framework experiments of this study are based on state-of-the-art generative and discriminative graphical models, in which the relationships between model variables can be easily described and their dependencies precisely characterized.
The generative context-based implementations are extensions of Latent Dirichlet Allocation, a leading topic modeling approach that is very effective in reducing the dimensionality of the data. The discriminative context-based approach extends Conditional Random Fields, which allow efficient and precise model construction by specifying and including only the cases that are related to and influence the model. The dataset used for training and evaluation is MIT SUN397. The experiments show an overall 15% increase in annotation accuracy and a 31% improvement in the semantic saliency of the annotated concepts.
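The context-based re-scoring idea can be illustrated with a small sketch. Everything below (the labels, the confidence values, the co-occurrence prior, and the brute-force search over label assignments) is invented for illustration and stands in for the dissertation's graphical-model inference; it only shows how a context prior can override a slightly higher visual score.

```python
import itertools

# Toy detector outputs: per image region, candidate labels with visual confidence.
detections = [
    {"keyboard": 0.6, "piano": 0.55},
    {"monitor": 0.7, "tv": 0.65},
]

# Hypothetical pairwise co-occurrence prior (the "context"): higher = more compatible.
cooccur = {
    frozenset(["keyboard", "monitor"]): 0.9,
    frozenset(["piano", "monitor"]): 0.1,
    frozenset(["keyboard", "tv"]): 0.2,
    frozenset(["piano", "tv"]): 0.2,
}

def best_labeling(detections, cooccur, w=1.0):
    """Pick the label assignment maximizing visual confidence plus
    a weighted sum of pairwise co-occurrence scores."""
    best, best_score = None, float("-inf")
    for combo in itertools.product(*(d.items() for d in detections)):
        labels = [lab for lab, _ in combo]
        visual = sum(conf for _, conf in combo)
        context = sum(cooccur.get(frozenset(pair), 0.0)
                      for pair in itertools.combinations(labels, 2))
        score = visual + w * context
        if score > best_score:
            best, best_score = labels, score
    return best, best_score
```

Here the prior makes "keyboard" + "monitor" win even though "piano" and "tv" have competitive visual scores on their own.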
- Date Issued
- 2016
- PURL
http://purl.flvc.org/fau/fd/FA00004745
- Subject Headings
- Computer vision--Mathematical models., Pattern recognition systems., Information visualization., Natural language processing (Computer science), Multimodal user interfaces (Computer systems), Latent structure analysis., Expert systems (Computer science)
- Format
- Document (PDF)
- Title
- Correcting noisy data and expert analysis of the correction process.
- Creator
- Seiffert, Christopher N., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis expands upon an existing noise cleansing technique, polishing, enabling it to be used in the software quality prediction domain, as well as any other domain where the data contains continuous values, as opposed to the categorical data for which the technique was originally designed. The procedure is applied to a real-world dataset with real (as opposed to injected) noise as determined by an expert in the domain. This, in combination with expert assessment of the changes made to the data, provides not only a more realistic dataset than one in which the noise (or even the entire dataset) is artificial, but also a better understanding of whether the procedure is successful in cleansing the data. Lastly, this thesis provides a more in-depth view of the process than previously available, giving results for different parameters and classifier-building techniques. This allows the reader to gain a better understanding of the significance of both model generation and parameter selection.
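The core polishing move, adapted to continuous values, can be sketched as follows. This is a deliberately simplified illustration: a 1-nearest-neighbor prediction stands in for the thesis's learner-based prediction, and the data, the `flagged` list, and the threshold are invented. Suspicious values that deviate too far from the predicted value are replaced with the prediction.

```python
def polish(data, flagged, attr, threshold):
    """For each flagged instance, predict attribute `attr` from its nearest
    unflagged neighbor on the remaining features; replace the value if it
    deviates from the prediction by more than `threshold`."""
    cleaned = [row[:] for row in data]
    for i in flagged:
        # candidate "clean" instances to predict from
        others = [r for j, r in enumerate(data) if j != i and j not in flagged]
        # nearest neighbor by squared Euclidean distance on the other attributes
        nn = min(others, key=lambda r: sum((r[k] - data[i][k]) ** 2
                                           for k in range(len(r)) if k != attr))
        if abs(data[i][attr] - nn[attr]) > threshold:
            cleaned[i][attr] = nn[attr]
    return cleaned
```

For example, an instance whose second attribute is wildly out of line with otherwise-similar instances gets that value corrected while every other instance is left untouched.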
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13223
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Computer programs, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)
- Title
- An evaluation of machine learning algorithms for tweet sentiment analysis.
- Creator
- Prusa, Joseph D., Khoshgoftaar, Taghi M., Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Sentiment analysis of tweets is an application of mining Twitter, and is growing in popularity as a means of determining public opinion. Machine learning algorithms are used to perform sentiment analysis; however, data quality issues such as high dimensionality, class imbalance or noise may negatively impact classifier performance. Machine learning techniques exist for targeting these problems, but have not been applied to this domain, or have not been studied in detail. In this thesis we discuss research that has been conducted on tweet sentiment classification, its accompanying data concerns, and methods of addressing these concerns. We test the impact of feature selection, data sampling and ensemble techniques in an effort to improve classifier performance. We also evaluate the combination of feature selection and ensemble techniques and examine the effects of high dimensionality when combining multiple types of features. Additionally, we provide strategies and insights for potential avenues of future work.
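Filter-based feature selection, one of the techniques the thesis evaluates, can be sketched in a few lines. The scoring function below (absolute difference in per-class document frequency) is a simple stand-in for established rankers such as chi-squared, and the toy tweets are invented for illustration.

```python
from collections import Counter

def select_features(tweets, labels, k):
    """Rank terms by the absolute difference in their relative document
    frequency between the positive (1) and negative (0) classes,
    and keep the top k terms."""
    pos = Counter(w for t, y in zip(tweets, labels) if y == 1 for w in set(t.split()))
    neg = Counter(w for t, y in zip(tweets, labels) if y == 0 for w in set(t.split()))
    n_pos = sum(1 for y in labels if y == 1)
    n_neg = len(labels) - n_pos
    vocab = set(pos) | set(neg)
    score = {w: abs(pos[w] / n_pos - neg[w] / n_neg) for w in vocab}
    # sort by descending score, breaking ties alphabetically
    return sorted(vocab, key=lambda w: (-score[w], w))[:k]
```

Terms that occur equally often in both classes (e.g., "movie" below) score zero and are dropped, which is exactly the dimensionality reduction such filters aim for.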
- Date Issued
- 2015
- PURL
http://purl.flvc.org/fau/fd/FA00004460
- Subject Headings
- Social media., Natural language processing (Computer science), Machine learning., Algorithms., Fuzzy expert systems., Artificial intelligence.
- Format
- Document (PDF)
- Title
- Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections.
- Creator
- Hahn, William E., Barenholtz, Elan, Florida Atlantic University, Charles E. Schmidt College of Science, Center for Complex Systems and Brain Sciences
- Abstract/Description
-
For an 8-bit grayscale image patch of size n x n, the number of distinguishable signals is 256^(n^2). Natural images (e.g., photographs of a natural scene) comprise a very small subset of these possible signals. Traditional image and video processing relies on band-limited or low-pass signal models. In contrast, we explore the observation that most signals of interest are sparse, i.e., in a particular basis most of the expansion coefficients will be zero. Recent developments in sparse modeling and L1 optimization have allowed for extraordinary applications such as the single-pixel camera, as well as computer vision systems that can exceed human performance. Here we present a novel neural network architecture combining a sparse filtering model and locally competitive algorithms (LCAs), and demonstrate the network's ability to classify human actions from video. Sparse filtering is an unsupervised feature learning algorithm designed to optimize the sparsity of the feature distribution directly, without needing to model the data distribution. LCAs are defined by a system of differential equations in which the initial conditions define an optimization problem and the dynamics converge to a sparse decomposition of the input vector. We applied this architecture to train a classifier on categories of motion in human action videos. Inputs to the network were small 3D patches taken from frame differences in the videos. Dictionaries were derived for each action class, and activation levels for each dictionary were assessed during reconstruction of a novel test patch. We discuss how this sparse modeling approach provides a natural framework for multi-sensory and multimodal data processing, including RGB video, RGBD video, hyper-spectral video, and stereo audio/video streams.
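The LCA dynamics can be sketched directly from their standard formulation: internal states u follow du/dt = b - u - (G - I)a, where b = Φᵀx is the driving input, G = ΦᵀΦ is the inhibition matrix, and a = T_λ(u) is a soft threshold. The tiny dictionary, step size, and iteration count below are invented for illustration; this is a minimal Euler-integration sketch, not the thesis's implementation.

```python
def lca(phi, x, lam=0.1, dt=0.1, steps=200):
    """Minimal Locally Competitive Algorithm: evolve internal states u
    toward a sparse code a of input x over dictionary phi (columns = atoms)."""
    n, m = len(x), len(phi[0])
    # b = Phi^T x : how strongly each atom is driven by the input
    b = [sum(phi[i][j] * x[i] for i in range(n)) for j in range(m)]
    # G = Phi^T Phi : lateral inhibition between overlapping atoms
    G = [[sum(phi[i][j] * phi[i][k] for i in range(n)) for k in range(m)]
         for j in range(m)]
    u = [0.0] * m
    a = [0.0] * m
    for _ in range(steps):
        for j in range(m):
            # active units inhibit competitors (self term excluded, as in G - I)
            inhib = sum(G[j][k] * a[k] for k in range(m) if k != j)
            u[j] += dt * (b[j] - u[j] - inhib)
        # soft threshold: small states produce exactly zero activation
        a = [uj - lam if uj > lam else (uj + lam if uj < -lam else 0.0)
             for uj in u]
    return a
```

With an input aligned to the first atom, the redundant diagonal atom is driven initially but is inhibited back to exactly zero, which is the competitive behavior that yields sparsity.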
- Date Issued
- 2016
- PURL
http://purl.flvc.org/fau/fd/FA00004713
- Subject Headings
- Artificial intelligence, Expert systems (Computer science), Image processing -- Digital techniques -- Mathematics, Sparse matrices
- Format
- Document (PDF)
- Title
- An implementation of the IEEE 1609.4 wave standard for use in a vehicular networking testbed.
- Creator
- Kuffermann, Kyle, Mahgoub, Imad, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
We present an implementation of the IEEE WAVE (Wireless Access in Vehicular Environments) 1609.4 standard, Multichannel Operation. This implementation provides concurrent access to a control channel and one or more service channels, enabling vehicles to communicate among each other on multiple service channels while still being able to receive urgent and control information on the control channel. Also included is functionality that provides over-the-air timing synchronization, allowing participation in alternating channel access in the absence of a reliable time source. Our implementation runs on embedded Linux and is built on top of IEEE 802.11p, as well as a customized device driver. This implementation will serve as a key component in our IEEE 1609-compliant Vehicular Multi-technology Communication Device (VMCD) that is being developed for a VANET testbed under the Smart Drive initiative, supported by the National Science Foundation.
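The alternating-access schedule described above can be sketched from the standard's default timing: a 100 ms sync interval aligned to UTC, split into a 50 ms control channel (CCH) interval followed by a 50 ms service channel (SCH) interval, each starting with a guard interval (4 ms by default) for radio switching. This is a timing-calculation sketch only, not the thesis's driver code.

```python
SYNC_INTERVAL_MS = 100  # default sync interval (IEEE 1609.4)
CCH_INTERVAL_MS = 50    # control channel occupies the first half
GUARD_MS = 4            # default guard time at the start of each interval

def channel_state(utc_ms):
    """Given a UTC-aligned timestamp in milliseconds, return which
    channel interval is active and whether we are inside its guard time."""
    offset = utc_ms % SYNC_INTERVAL_MS
    if offset < CCH_INTERVAL_MS:
        return ("CCH", offset < GUARD_MS)
    return ("SCH", (offset - CCH_INTERVAL_MS) < GUARD_MS)
```

A radio using this schedule tunes to the CCH for the first half of every UTC 100 ms boundary and may switch to a negotiated SCH for the second half, which is why the over-the-air timing synchronization mentioned in the abstract is essential when no GPS time source is available.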
- Date Issued
- 2014
- PURL
http://purl.flvc.org/fau/fd/FA00004299
- Subject Headings
- Vehicular ad hoc networks (Computer networks)., Wireless sensor networks., Wireless communication systems., Wireless LANs., Linux., Expert systems (Computer science), Operating systems (Computers)
- Format
- Document (PDF)
- Title
- Ensemble-classifier approach to noise elimination: A case study in software quality classification.
- Creator
- Joshi, Vedang H., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis presents a noise handling technique that attempts to improve the quality of training data for classification purposes by eliminating instances that are likely to be noise. Our approach uses twenty-five different classification techniques to create an ensemble of classifiers that acts as a noise filter on real-world software measurement datasets. Using a relatively large number of base-level classifiers for the ensemble-classifier filter facilitates achieving the desired level of conservativeness in noise removal, with several possible levels of filtering. It also provides a higher degree of confidence in the noise elimination procedure, as with twenty-five base-level classifiers the results are less likely to be influenced by the (possibly) inappropriate learning bias of a few algorithms than with a relatively smaller number. Empirical case studies of two different high-assurance software projects demonstrate the effectiveness of our noise elimination approach through the significant improvement achieved in classification accuracies at various levels of filtering.
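The filtering step itself reduces to vote counting, which can be sketched briefly. The predictions below are invented; in the thesis they would come from the twenty-five trained base-level classifiers, and the `level` parameter realizes the "several possible levels of filtering" (a level equal to the ensemble size is a consensus filter, a level just over half is a majority filter).

```python
def ensemble_filter(labels, predictions, level):
    """Flag instance i as noise if at least `level` of the ensemble's
    classifiers misclassify it.  `predictions` is one prediction list
    per classifier; higher `level` means more conservative filtering."""
    flagged = []
    for i, label in enumerate(labels):
        errors = sum(1 for p in predictions if p[i] != label)
        if errors >= level:
            flagged.append(i)
    return flagged
```

Raising `level` shrinks the flagged set, so an instance survives unless nearly the whole ensemble agrees it is mislabeled.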
- Date Issued
- 2004
- PURL
- http://purl.flvc.org/fcla/dt/13144
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Case studies, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)