- Title
- Information theory and software measurement.
- Creator
- Allen, Edward B., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Development of reliable, high-quality software requires study and understanding at each step of the development process. A basic assumption in the field of software measurement is that metrics of internal software attributes somehow relate to the intrinsic difficulty of understanding a program. Measuring the information content of a program attempts to quantify the comprehension task indirectly. Information-theory-based software metrics are attractive because they quantify the amount of information in a well-defined framework. However, most information-theory-based metrics have been proposed with little reference to measurement-theory fundamentals, and empirical validation of predictive quality models has been lacking. This dissertation proves that representative information-theory-based software metrics can be "meaningful" components of software quality models in the context of measurement theory. To this end, members of a major class of metrics are shown to be regular representations of Minimum Description Length or Variety of software attributes, and to be interval scale. An empirical validation case study is presented that predicted faults in modules based on Operator Information. This metric is closely related to Harrison's Average Information Content Classification, which is the entropy of the operators. New general methods for calculating synthetic complexity at the system level and module level are presented, quantifying the joint information of an arbitrary set of primitive software measures. Since not all kinds of information are equally relevant to software quality factors, components of synthetic module complexity are also defined. Empirical case studies illustrate the potential usefulness of the proposed synthetic metrics. A metrics database is often the key to a successful ongoing software metrics program.
The contribution of any proposed metric is defined in terms of measured variation using information theory, irrespective of the metric's usefulness in quality models. This is of interest when full validation is not practical. Case studies illustrate the method.
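The entropy-of-operators metric mentioned in this abstract (Harrison's Average Information Content Classification) is the Shannon entropy of the operator frequency distribution in a program. As a rough illustration only, not code from the dissertation, and with the operator stream and function name invented for the sketch, it can be computed as:

```python
import math
from collections import Counter

def operator_entropy(operators):
    """Shannon entropy (in bits) of an operator stream.

    Each distinct operator contributes -p * log2(p), where p is its
    relative frequency in the stream; this is the quantity behind the
    entropy-of-operators metric described above.
    """
    counts = Counter(operators)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical operator stream extracted from a module's source:
ops = ["+", "*", "+", "=", "+", "if", "=", "+"]
print(round(operator_entropy(ops), 3))  # → 1.75
```

A module that uses a few operators very unevenly scores low (easy to compress, arguably easier to comprehend), while one spreading usage over many operators scores high.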
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/12412
- Subject Headings
- Software engineering, Computer software--Quality control, Information theory
- Format
- Document (PDF)
- Title
- Information-theoretics based analysis of hard handoffs in mobile communications.
- Creator
- Bendett, Raymond Morris., Florida Atlantic University, Neelakanta, Perambur S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The research proposed and elaborated in this dissertation concerns the development of new decision algorithms for hard handoff strategies in mobile communication systems. Specifically, the research tasks include: (1) use of information-theoretic statistical distance measures as a metric for hard handoff decisions; (2) a study evaluating the log-likelihood criterion as a basis for the hard handoff decision; (3) development of a statistical model to evaluate optimum instants of measurement of the metric used for the hard handoff decision. These objectives refer to a practical scenario in which a mobile station (MS) traveling away from a serving base station (BS-I) may suffer communications impairment due to interference and shadowing effects, especially in an urban environment. As a result, it will seek to switch over to another base station (BS-II) that provides a stronger signal level. This is called the handoff procedure. (Hard handoff refers to the specific case in which only one base station serves the mobile at the instant of handover.) Classically, the handoff decision is made on the basis of the difference between the received signal strengths (RSS) from BS-I and BS-II. The algorithms developed here, in contrast, base the decision criterion on the statistical divergence and/or log-likelihood ratio between the received signals. The purpose of the present study is to evaluate the relative efficacy of the conventional and proposed algorithms with reference to: (i) minimization of unnecessary handoffs ("ping-pongs"); (ii) minimization of delay in handing over; (iii) ease of implementation; and (iv) minimization of possible call dropouts due to ineffective handover. Simulated results with data commensurate with practical considerations are furnished and discussed.
Background literature is presented in the introductory chapter and scope for future work is identified via open questions in the concluding chapter.
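The classical RSS-difference criterion that this dissertation argues against can be stated very compactly. The sketch below is an illustration of that conventional baseline only (the hysteresis value and function name are invented for the example; the thesis's own divergence- and log-likelihood-based rules are not reproduced here):

```python
def should_handoff(rss_serving_dbm, rss_candidate_dbm, hysteresis_db=4.0):
    """Classical hard-handoff rule: switch to the candidate base station
    only when its received signal strength exceeds the serving station's
    by a hysteresis margin, which guards against "ping-pong" handoffs."""
    return rss_candidate_dbm - rss_serving_dbm > hysteresis_db

# A mobile moving away from BS-I toward BS-II (values are illustrative):
print(should_handoff(-88.0, -85.0))  # → False (3 dB gain: stay on BS-I)
print(should_handoff(-92.0, -85.0))  # → True  (7 dB gain: hand off to BS-II)
```

The trade-off the abstract evaluates falls directly out of the margin: a larger hysteresis suppresses ping-pongs but delays the handover and risks call dropout.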
- Date Issued
- 2000
- PURL
- http://purl.flvc.org/fcla/dt/12639
- Subject Headings
- Mobile communication systems, Information theory, Algorithms
- Format
- Document (PDF)
- Title
- Innovative video error resilient techniques for MBMS systems.
- Creator
- Sanigepalli, Praveen., Florida Atlantic University, Kalva, Hari, Furht, Borko, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
In the current communications age, the capabilities of mobile devices are increasing. Mobile devices can communicate at data rates of hundreds of Mbps on 4G networks. This enables playback of rich multimedia content comparable to Internet and television networks. However, mobile networks need to be spectrum-efficient to be affordable to users. Multimedia Broadcast Multicast Service (MBMS) is a wireless broadcasting standard being drafted to enable multimedia broadcast while remaining spectrum-efficient. Hybrid video coding techniques facilitate low-bitrate transmission but introduce dependencies across frames. With a mobile environment being error-prone, no error correction technique can guarantee error-free transmission. Such errors propagate, resulting in quality degradation. With numerous mobiles sharing the broadcast session, any error-resilient scheme should account for heterogeneous device capabilities and channel conditions. Current research on wireless video broadcasting focuses on network-based techniques such as FEC and retransmissions, which add bandwidth overhead. There is a need to design innovative error-resilient techniques that make the video codec robust with minimal bandwidth overhead. This dissertation introduces novel techniques in the area of MBMS systems. First, robust video structures are proposed in the Periodic Intra Frame based Prediction (PIFBP) and Periodic Anchor Frame based Prediction (PAFBP) schemes. In these schemes, the intra frames or anchor frames serve as reference frames for prediction during the GOP period. The intermediate frames are independent of one another; errors in such frames do not propagate, thereby providing error resilience. In prior art, the intra block rate is adapted based on channel characteristics for error resilience. This scheme has been generalized in multicasting to address a group of users sharing the same session.
The average packet loss is used to determine the intra block rate. This improves the performance of the overall group and strives for consistent performance. The inherent diversity of the broadcast session can also be used to its advantage: mobile devices capable of accessing a WLAN during the broadcast form an ad hoc network on the WLAN to recover lost packets. New schemes are proposed for error recovery, and their performance comparison is presented.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/12187
- Subject Headings
- Wireless communication systems, Signal processing, Digital video, Multimedia systems, Digital communications, Data transmission systems
- Format
- Document (PDF)
- Title
- Improved design methods for evaluating the performance of landfill double liner systems.
- Creator
- Shivashankar, Mirle R., Florida Atlantic University, Fluet, J. E. Jr., Reddy, Dronnadula V., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
Many modern landfills are constructed with double liner systems. Leachate leakage rates through double liner systems are calculated using recently developed formulations which are theoretically correct for leakage detection system (LDS) materials that have unrestricted lateral flow properties. However, their applicability to geonets, the most commonly used LDS material, has yet to be determined. In double liner systems, the leakage through the primary liner, the properties of the LDS material, and the slope of the LDS determine the flow patterns in the LDS. These flow patterns are then used to determine the amount of leachate, if any, which leaks through the bottom liner into the ground. This thesis describes the experimental determination of the flow patterns in geonets and their relationships to established design formulations.
- Date Issued
- 1995
- PURL
- http://purl.flvc.org/fcla/dt/15197
- Subject Headings
- Sanitary landfills--Leaching, Sanitary landfills--Linings, Geosynthetics
- Format
- Document (PDF)
- Title
- Interactive computer aided digital control design.
- Creator
- Yakali, Huseyin Hakan., Florida Atlantic University, Roth, Zvi S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis outlines the design philosophy and implementation aspects of a new interactive CAD tool, implemented in the BASIC language on an IBM PC/AT computer, for single-input single-output (SISO) digital control systems. The direct digital control design method presented is classical in nature. The program's main features are: (1) the use of the modified z-transform to model the effects of transport delay due to control computation time; (2) the use of windows on a split screen to allow the designer to observe the closed-loop step response while systematically shaping a root locus or synthesizing closed-loop pole/zero patterns; (3) display of the system response between sampling instants.
- Date Issued
- 1988
- PURL
- http://purl.flvc.org/fcla/dt/14488
- Subject Headings
- Computer-aided design, Digital control systems, Engineering design--Data processing
- Format
- Document (PDF)
- Title
- Integrating Multi-user Scheduling with Retransmission Diversity over Wireless Links.
- Creator
- Li, Irena, Zhuang, Hanqi, Wang, Xin, Florida Atlantic University, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Research presented in this thesis develops a mainly theoretical basis and computer models for enhancing the throughput of multi-user wireless communication networks. The cross-layer combination of an adaptive modulation and coding (AMC) scheme at the physical layer and automatic repeat request (ARQ) retransmissions at the data link layer is integrated into a scheduling framework for multi-user networks. Scheduling algorithms incorporating retransmission diversity are derived for three cases of typical network traffic: best-effort, non-realtime, and realtime. For each case, numeric computer simulations of wireless communications over Nakagami-m block fading channels are developed to examine the effectiveness of the formulated schemes.
- Date Issued
- 2008
- PURL
- http://purl.flvc.org/fau/fd/FA00012533
- Subject Headings
- Wireless communication networks, Code division multiple access, Modulation (Electronics), Signal processing (Digital techniques)
- Format
- Document (PDF)
- Title
- Information hiding: Digital watermarking techniques.
- Creator
- Sadicoff, Mauricio Levy., Florida Atlantic University, Larrondo-Petrie, Maria M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Digital watermarking is a multimedia technique recently developed for the purpose of enhancing copyright protection of multimedia files. This thesis presents a survey of digital watermark features and classifications. It also proposes a classification method that subsumes most previous classifications. The thesis then details two digital watermarking methods, Least Significant Bit Encoding and Spread Spectrum Encoding. Software is designed and implemented to show the capabilities and behavior of each method. The software also shows how each method reacts to four typical transformations (attacks). The results of applying the two methods and their survival rates against the typical transformations are discussed in detail. Finally, the source code for the software is made available.
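The LSB method this thesis surveys hides one watermark bit in the least-significant bit of each cover-image sample, so each pixel value changes by at most 1. The following is a minimal sketch of that general idea (not the thesis's software; the pixel values, bit pattern, and function names are invented for the example):

```python
def embed_lsb(pixels, bits):
    """Embed watermark bits into the least-significant bit of each pixel.

    `pixels` is a flat list of 0-255 grayscale values; each watermark bit
    replaces one pixel's LSB, changing that pixel's value by at most 1.
    """
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to `bit`
    return out

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits watermark bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]

# Hypothetical 8-pixel cover image and a 4-bit mark:
cover = [120, 121, 56, 57, 200, 201, 33, 34]
mark = [1, 0, 1, 1]
stego = embed_lsb(cover, mark)
print(extract_lsb(stego, 4))  # → [1, 0, 1, 1]
```

The fragility the thesis measures is visible in this sketch: any transformation that perturbs pixel values (compression, filtering, noise) scrambles the LSBs and destroys the mark, which is why LSB encoding is contrasted with the more robust spread-spectrum approach.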
- Date Issued
- 2002
- PURL
- http://purl.flvc.org/fcla/dt/12897
- Subject Headings
- Computer software--Development, Digital watermarking, Data encryption (Computer science)
- Format
- Document (PDF)
- Title
- Extensions to real-time object-oriented software design methodologies.
- Creator
- Woodcock, Timothy G., Florida Atlantic University, Fernandez, Eduardo B., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Real-time systems are systems in which time is considered a system resource that must be managed. Time is usually represented in these systems as a deadline to complete a task. Unfortunately, adding timing to even simple algorithms complicates them greatly. Real-time systems are by nature difficult and complex to understand. Object-oriented methodologies have attributes that allow real-time systems to be designed and implemented with fewer errors and some control over the resultant complexity. With object-oriented design, the system is modeled in the environment in which it will be used. Objects themselves are partitions of the system into logical, understandable units. In this dissertation, we start by surveying the current real-time object-oriented design methodologies. By comparing these methodologies and developing a set of criteria for evaluating them, we discover that certain aspects of these methodologies still need work. The most important aspects are understanding the effects of deadlines on statechart behavioral models and understanding the effects of deadlines when object models are inherited or undergo aggregation. The effects of deadlines on statecharts are then explored in detail. There are two basic ways that deadlines are added to statecharts. The first, and most popular, is adding timing as a condition on a state transition. The second is adding a countdown timer to a state and forcing a transition if the timer reaches zero. We show that these are equivalent and can be used interchangeably to simplify designs. Next, the effects of deadlines on behavior models when the corresponding object models undergo inheritance or aggregation are studied. We first analyze the effects on the behavior model when object inheritance is encountered. We found eight ways that the behavior model can be modified and still maintain the properties of inheritance. Finally, deadlines are added and the analysis is repeated.
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/12493
- Subject Headings
- Real-time data processing, Computer software--Development, Object-oriented programming (Computer science)
- Format
- Document (PDF)
- Title
- Mitigating worm propagation on virtual LANs.
- Creator
- Sun, Xiaoguang., Florida Atlantic University, Rajput, Saeed, Hsu, Sam, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Recent worms have used sophisticated propagation techniques to spread faster than patch distribution and have exploited previously unknown vulnerabilities. To prevent a repetition of such epidemics in the future, active defense mechanisms are needed that not only identify malicious activity but can also defend against a widespread outbreak. We provide a framework capable of reacting quickly to quarantine infections. The fundamental components of our framework are a detector and a VLAN switch. We provide a proof-of-concept implementation, using the Blaster worm as an example, and demonstrate that detection of worms is possible and that individual infected hosts can be isolated quickly. Furthermore, using Monte Carlo simulations, we show that such containment of future epidemics is possible. In addition, we compute the overhead of the detection and mitigation approaches and show that our approach has lower overhead than the others.
- Date Issued
- 2006
- PURL
- http://purl.flvc.org/fcla/dt/13369
- Subject Headings
- Wireless LANs--Security measures, Wireless communication systems--Security measures, Computer viruses--Prevention, Computer security
- Format
- Document (PDF)
- Title
- Modeling and measurement of the response of small antennas near multilayered two or three-dimensional dielectric bodies.
- Creator
- Ponce de Leon, Lorenzo Angel., Florida Atlantic University, Helmken, Henry, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A theory of the circular loop antenna constructed from finite-conductivity wire is developed via a Fourier series expansion of the currents in the loop. Models for a family of small loop antennas are also presented. A new high-sensitivity, high-selectivity heterodyne fiber-optic-based electromagnetic field detector is developed that is compatible with open antenna range measurements made at low signal levels and in the presence of strong interfering signals. A new analytical solution for the response of a disk-loaded dipole antenna, representing a dipole configured on a lossy dielectric medium, is developed using a field compensation theorem and the geometrical theory of diffraction. The multipole expansions for the scattered fields of a multilayered infinite cylinder illuminated by an obliquely incident plane wave are formulated and programmed for numerical analysis. The response of cylinders with constitutive parameters reflecting those used in human phantoms is calculated. The response of a small antenna proximal to a multilayered cylinder is analyzed. The scattered fields from multilayered bodies are coupled to a small wire antenna using a combined-methods induced electromotive force (EMF) technique. New results concerning the response of a loop antenna near a multilayered body, obtained via a zero and first phase current model, are presented. The new technique is applied in the analysis of human phantoms tested on an open-field antenna range. Validation of the theory of multilayered human phantoms with measurements using the new detector is demonstrated.
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/12294
- Subject Headings
- Antennas (Electronics)
- Format
- Document (PDF)
- Title
- Estimation of information-theoretics-based delay-bounds in ATM networks.
- Creator
- Wei, Liqun., Florida Atlantic University, Hsu, Sam, Neelakanta, Perambur S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis addresses a method of deducing the statistical upper and lower bounds associated with the cell-transfer delay variations (CDVs) encountered by cells transmitted in asynchronous transfer mode (ATM) networks due to cell losses. This study focuses on: (1) estimating the CDV arising from multiplexing/switching for both constant bit rate (CBR) and variable bit rate (VBR) services via simulations; (2) deducing a new information-theoretic technique to gain insight into the combined BER-induced and multiplexing/switching-induced CDVs in ATM networks. Algorithms for the CDV statistics are derived, and the lower and upper bounds of the statistics are obtained via simulations for CBR and VBR traffic. These bounds are useful in the call-admission control (CAC) strategies adopted in ATM transmissions. Inferential remarks indicating the effects of traffic parameters (such as bandwidth, burstiness, etc.) on the values of the statistical bounds are presented, and the scope for further work is indicated.
- Date Issued
- 1997
- PURL
- http://purl.flvc.org/fcla/dt/15444
- Subject Headings
- Asynchronous transfer mode, Telecommunication, Computer networks, Broadband communication systems
- Format
- Document (PDF)
- Title
- THE APPLICATION OF THE ENERGY ACCOUNTANCY EQUATION IN THE INVESTIGATION OF ENERGY TRANSFER IN A THIN WALLED SHELL STRUCTURE.
- Creator
- SCHAPLEY, RAMON FRANK, II., Florida Atlantic University, Dunn, Stanley E., Cuschieri, Joseph M., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
The Energy Accountancy method is used to describe the response of a system by accounting for the various energy components in the system, that is, components describing the input energy, the energy dissipated, and the energy transferred by the system. These components are functions of quantities that can be determined either through measurement or through finite element analysis of the system. This concept is used in this study to determine the response of a small-diameter pipe containing two different fluids, air and water. The results of this study show that the Energy Accountancy method can be used to describe the response of a thin-walled shell structure with good results. It is also shown that in small-diameter pipes the fluid contained by the system can be considered to act as a reactive medium in the response of the structure.
- Date Issued
- 1985
- PURL
- http://purl.flvc.org/fcla/dt/14277
- Subject Headings
- Force and energy--Analysis, Force and energy--Measurement
- Format
- Document (PDF)
- Title
- Course scheduling support system.
- Creator
- Khan, Jawad Ahmed., Florida Atlantic University, Levow, Roy B., Hsu, Sam, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The Course Scheduling Support System is designed to facilitate the manual generation of faculty course schedules. It aids in assigning faculty to courses and assigning each course section to its time block. It captures historic and current scheduling information in an organized manner, making the information needed to create new schedules more readily available. The interaction between user and database is made as friendly as possible so that managing, manipulating, populating, and retrieving scheduling data is simple and efficient. We have implemented an open-source, web-based prototype of the proposed system using PHP, MySQL, and the Apache web server. It can be invoked with a standard web browser and has an intuitive user interface. It provides tools for customizing web forms that can be easily used by non-technical users. Our department plans to deploy this system by Fall 2006.
- Date Issued
- 2006
- PURL
- http://purl.flvc.org/fcla/dt/13343
- Subject Headings
- Scheduling--Data processing, Constraints (Artificial intelligence), Electronic data processing--Distributed processing
- Format
- Document (PDF)
- Title
- DCVS logic synthesis.
- Creator
- Xiao, Kang., Florida Atlantic University, Barrett, Raymond L. Jr., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Implementation of CMOS combinational logic with Differential Cascode Voltage Switch (DCVS) logic may have many advantages over traditional CMOS logic approaches with respect to device count, layout density, and timing. DCVS is an ideal target technology for a logic synthesis system in that it provides a complete function cover by producing the function and its complement simultaneously; this also makes DCVS more testable. We have developed for IBM's DCVS technology a synthesis algorithm and a new test generation approach that are based on topologies rather than individual logic functions. We have found that 19 and 363 DCVS topologies can represent the 256 and 65,536 functions, respectively, for the 3- and 4-variable cases. Physical defect analysis was conducted with the aid of a building-block approach to analyze the n-type logic tree, providing a basis for evolving hierarchical test pattern generation for the topologies.
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/14850
- Subject Headings
- Integrated circuits--Very large scale integration--Data processing, Metal oxide semiconductors, Complementary, Computer-aided design, Electronic systems, Logic design--Data processing
- Format
- Document (PDF)
- Title
- Correcting noisy data and expert analysis of the correction process.
- Creator
- Seiffert, Christopher N., Florida Atlantic University, Khoshgoftaar, Taghi M., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
This thesis expands upon an existing noise-cleansing technique, polishing, enabling it to be used in the software quality prediction domain, as well as any other domain where the data contains continuous values, as opposed to the categorical data for which the technique was originally designed. The procedure is applied to a real-world dataset with real (as opposed to injected) noise, as determined by an expert in the domain. This, in combination with expert assessment of the changes made to the data, provides not only a more realistic dataset than one in which the noise (or even the entire dataset) is artificial, but also a better understanding of whether the procedure is successful in cleansing the data. Lastly, this thesis provides a more in-depth view of the process than previously available, in that it gives results for different parameters and classifier-building techniques. This allows the reader to gain a better understanding of the significance of both model generation and parameter selection.
- Date Issued
- 2005
- PURL
- http://purl.flvc.org/fcla/dt/13223
- Subject Headings
- Computer interfaces--Software--Quality control, Acoustical engineering, Noise control--Computer programs, Expert systems (Computer science), Software documentation
- Format
- Document (PDF)
- Title
- THE DESIGN OF HIGH FREQUENCY OSCILLATORS: NOISE CHARACTERIZATION, DESIGN THEORY, AND MEASUREMENTS.
- Creator
- VICTOR, ALAN MICHAEL., Florida Atlantic University, Gazourian, Martin G., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A design theory for high frequency oscillators is presented. Emphasis is placed on oscillator design techniques applicable to the electrical tuning of LC and transmission line resonators. Attention is paid to design approaches that yield an oscillator with high spectral purity and a large signal-to-noise ratio. Theory and measurements demonstrate that, for the oscillator configurations investigated, a small L/C ratio is desirable for improved oscillator signal-to-noise ratio. Equations are developed which define the noise figure of the oscillator due to the additive noise of the active device. This analysis demonstrates the need for a high device starting transconductance, which should subsequently be reduced during oscillation to minimize the device noise contribution. A relationship is developed between the receiver dynamic range and the oscillator signal-to-noise ratio. Oscillator designs in the 20 MHz to 200 MHz region verify the analysis. A unified approach to large-signal oscillator design is investigated, and relationships to oscillator signal-to-noise ratio using the previously developed theory are noted.
- Date Issued
- 1980
- PURL
- http://purl.flvc.org/fcla/dt/14043
- Subject Headings
- Oscillators, Audio-frequency
- Format
- Document (PDF)
- Title
- THE DESIGN OF SWITCHED-CAPACITOR HIGHPASS FILTERS.
- Creator
- LEE, KING FU., Florida Atlantic University, Gazourian, Martin G., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The design of high-order switched-capacitor highpass filters is presented. Emphasis is placed on the design procedures for cascaded biquadratic sections and ladder network realizations of switched-capacitor highpass filters. The stability problem of the doubly terminated switched-capacitor ladder highpass filter is discussed. Design examples are presented to illustrate the design procedures. The sensitivities of the realization methods are discussed. An analytical equation of the gain deviation for the cascaded-biquadratic-sections realization is derived. Monte Carlo analysis is performed for the design examples. The results of the analyses are compared to reveal the differences in sensitivities in terms of the order of the filters and the type of realization.
- Date Issued
- 1983
- PURL
- http://purl.flvc.org/fcla/dt/14169
- Subject Headings
- Switched capacitor circuits, Digital filters (Mathematics)
- Format
- Document (PDF)
- Title
- A very high-performance neural network system architecture using grouped weight quantization.
- Creator
- Karaali, Orhan., Florida Atlantic University, Shankar, Ravi, Gluch, David P., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
Recently, Artificial Neural Network (ANN) computing systems have become one of the most active and challenging areas of information processing. The successes of experimental neural computing systems in the fields of pattern recognition, process control, robotics, signal processing, expert systems, and functional analysis are most promising. However, due to a number of serious problems, only small fully connected neural networks have been implemented to run in real time. The primary problem is that the execution time of a neural network increases exponentially as the network's size increases, because of the corresponding increase in the number of multiplications and interconnections, which makes it extremely difficult to implement medium- or large-scale ANNs in hardware. The Modular Grouped Weight Quantization (MGWQ) presented in this dissertation is an ANN design which ensures that the number of multiplications and interconnections increases linearly as the network's size increases. The secondary problems are related to scale-up capability, modularity, memory requirements, flexibility, performance, fault tolerance, technological feasibility, and cost. The MGWQ architecture also resolves these problems. In this dissertation, neural network characteristics and existing implementations using different technologies are described. Their shortcomings and problems are addressed, and solutions to these problems using the MGWQ approach are illustrated. The theoretical and experimental justifications for MGWQ are presented. Performance calculations for the MGWQ architecture are given. Mappings of the most popular neural network models to the proposed architecture are demonstrated. System-level architecture considerations are discussed. The proposed ANN computing system is a flexible and realistic way to implement large fully connected networks, offering very high performance using currently available technology.
The performance of ANNs is measured in terms of interconnections per second (IC/S); the performance of the proposed system ranges from 10^11 to 10^14 IC/S. In comparison, SAIC's DELTA II ANN system achieves 10^7 IC/S, and a Cray X-MP achieves 5*10^7 IC/S.
- Date Issued
- 1989
- PURL
- http://purl.flvc.org/fcla/dt/12245
- Subject Headings
- Neural circuitry, Neural computers, Computer architecture
- Format
- Document (PDF)
- Title
- A visual perception threshold matching algorithm for real-time video compression.
- Creator
- Noll, John M., Florida Atlantic University, Pandya, Abhijit S., College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
A barrier to the use of digital imaging is the vast storage requirement involved. One solution is compression. Since imagery is ultimately subject to human visual perception, it is worthwhile to design and implement an algorithm which performs compression as a function of perception. The underlying premise of this thesis is that if the algorithm closely matches visual perception thresholds, then its coded images contain only the components necessary to recreate the perception of the visual stimulus. Psychophysical test results are used to map the thresholds of visual perception and to develop an algorithm that codes only the image content exceeding those thresholds. The image coding algorithm is simulated in software to demonstrate compression of a single-frame image, and the simulation results are provided. The algorithm is also adapted to real-time video compression for implementation in hardware.
- Date Issued
- 1992
- PURL
- http://purl.flvc.org/fcla/dt/14857
- Subject Headings
- Image processing--Digital techniques, Computer algorithms, Visual perception, Data compression (Computer science)
- Format
- Document (PDF)
- Title
- The role of monitoring wells in modern landfill designs.
- Creator
- Reddi, Vinod Jayasankar, Florida Atlantic University, Fluet, J. E. Jr., Scarlatos, Panagiotis (Pete) D., College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
Modern technology has led to a new generation of landfill liner systems that are highly efficient at intercepting and removing leachate. Many of the modern liner systems are so effective that little or no leakage occurs through them. What leakage may occur is so minimal that, although it can be theoretically predicted, it cannot be measured; i.e., the resulting groundwater concentrations are well beneath the minimum detection levels of available monitoring well technology. In addition to being highly effective, some modern liner systems are constructed with two liners separated by a drainage medium which detects and removes any leakage through the top liner. These significant improvements in liner system technology have led many landfill designers, operators, and regulators to question the necessity of current monitoring well practices. Currently, landfills are required to have a large number of monitoring wells, and the associated large installation, sampling, and testing costs are inevitably reflected in higher tipping fees or higher taxes. In either case, the costs are borne by the public. If the number of monitoring wells and the frequency of their sampling could be reduced, significant cost savings could be realized, and the money saved could perhaps be better spent elsewhere. This thesis reports the results of research conducted at eleven landfills constructed with modern landfill liner systems to determine the actual and probable efficacy of the role of monitoring wells, and presents a cost-saving analysis evaluating whether those funds would be better spent elsewhere.
- Date Issued
- 1994
- PURL
- http://purl.flvc.org/fcla/dt/15019
- Subject Headings
- Fills (Earthwork), Leachate, Sanitary landfills--Linings, Waste disposal in the ground
- Format
- Document (PDF)