Current Search: Probabilities
- Title
- Ballot problem with two and three candidates.
- Creator
- Muhundan, Arumugam, Florida Atlantic University, Niederhausen, Heinrich
- Abstract/Description
-
The classical ballot problem (the two-candidate ballot problem) was first introduced and solved by Whitworth. André attacked this problem with his famous tool, the reflection principle, and showed that the (classical) ballot problem is an application of lattice path enumeration. Kreweras formulated and solved the three-candidate ballot problem in 1965, but his proof was complicated. In 1983, Niederhausen provided a simple proof. The three-candidate ballot problem was also solved by Gessel in 1986 using a completely different approach, namely the probabilistic method. We study those methods carefully and make some additional observations in this thesis.
- Date Issued
- 1990
- PURL
- http://purl.flvc.org/fcla/dt/14636
- Subject Headings
- Lattice paths, Combinatorial probabilities
- Format
- Document (PDF)
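The classical two-candidate result in the abstract above has a closed form via the reflection principle: if candidate A receives p votes and B receives q < p, the number of tally orders in which A leads strictly throughout is ((p-q)/(p+q))·C(p+q, p). A minimal sketch checking the formula against brute-force enumeration:

```python
from math import comb
from itertools import combinations

def ballot_count_formula(p, q):
    """Tally orders in which A (p votes) strictly leads B (q votes) throughout:
    the reflection-principle formula (p-q)/(p+q) * C(p+q, p)."""
    return (p - q) * comb(p + q, p) // (p + q)

def ballot_count_bruteforce(p, q):
    """Enumerate vote orders as the positions of A's ballots among p+q slots."""
    n = p + q
    total = 0
    for a_positions in combinations(range(n), p):
        a_set = set(a_positions)
        lead, ok = 0, True
        for i in range(n):
            lead += 1 if i in a_set else -1
            if lead <= 0:     # A must be strictly ahead after every ballot
                ok = False
                break
        total += ok
    return total

# The two counts agree, e.g. for p = 5, q = 3 there are 14 such orders.
assert ballot_count_formula(5, 3) == ballot_count_bruteforce(5, 3)
```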
- Title
- Multivariate finite operator calculus applied to counting ballot paths containing patterns [electronic resource].
- Creator
- Sullivan, Shaun, Charles E. Schmidt College of Science, Department of Mathematical Sciences
- Abstract/Description
-
Counting lattice paths where the number of occurrences of a given pattern is monitored requires a careful analysis of the pattern. Not the length, but the characteristics of the pattern are responsible for the difficulties in finding explicit solutions. Certain features, like overlap and the difference in the number of → and ↑ steps, determine the recursion formula. In the case of ballot paths, that is, paths that stay weakly above the line y = x, the solutions to the recursions are typically polynomial sequences. The objects of Finite Operator Calculus are polynomial sequences, thus the theory can be used to solve the recursions. The theory of Finite Operator Calculus is strengthened and extended to the multivariate setting in order to obtain solutions, and to prepare for future applications.
- Date Issued
- 2011
- PURL
- http://purl.flvc.org/FAU/3174076
- Subject Headings
- Combinatorial probabilities, Lattice paths, Combinatorial enumeration problems, Generating functions
- Format
- Document (PDF)
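Without any pattern restriction, the ballot paths described above satisfy the basic recursion f(x, y) = f(x-1, y) + f(x, y-1) subject to y ≥ x, and the counts at (n, n) are the Catalan numbers. A small sketch of that recursion (the pattern-counting refinements in the thesis modify this recurrence):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ballot_paths(x, y):
    """Lattice paths from (0,0) to (x,y) using right (→) and up (↑) steps
    that stay weakly above the line y = x (i.e. y >= x at every point)."""
    if x > y or x < 0 or y < 0:   # path left the allowed region
        return 0
    if x == 0 and y == 0:
        return 1
    # Last step arrived either from the left or from below.
    return ballot_paths(x - 1, y) + ballot_paths(x, y - 1)

# Counts on the diagonal are Catalan numbers: 1, 1, 2, 5, 14, 42, ...
assert [ballot_paths(n, n) for n in range(6)] == [1, 1, 2, 5, 14, 42]
```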
- Title
- Transforming directed graphs into uncertain rules.
- Creator
- Lantigua, Jose Salvador., Florida Atlantic University, Hoffman, Frederick, College of Engineering and Computer Science, Department of Computer and Electrical Engineering and Computer Science
- Abstract/Description
-
The intent of this thesis is to show how rule structures can be derived from influence diagrams and how these structures can be mapped to existing rule-based shell paradigms. We demonstrate this mapping with an existing shell having the Evidence (E) --> Hypothesis (H), Certainty Factor (CF) paradigm structure. Influence diagrams are graphical representations of hypothesis-to-evidence relationships, directed forms of Bayesian influence networks. These allow for inferencing about both diagnostic and predictive (or causal) behavior based on uncertain evidence. We show how this can be implemented through a Probability (P) to CF mapping algorithm and a rule-set conflict resolution methodology. The thesis contains a discussion of the application of probabilistic semantics from Bayesian networks and of decision theory to derive qualitative assertions about the likelihood of an occurrence, the sensitivity of a conclusion, and other indicators of usefulness. We show an example of this type of capability by the addition of a probability range function for the premise clause in our shell's rule structure.
- Date Issued
- 1989
- PURL
- http://purl.flvc.org/fcla/dt/14570
- Subject Headings
- Decision-making--Mathematical models, Probabilities, Expert systems (Computer science)
- Format
- Document (PDF)
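The P-to-CF mapping mentioned above can be illustrated with the standard MYCIN-style definition of a certainty factor from a prior and a posterior probability; the thesis's exact algorithm may differ, so treat this as a generic sketch, not the author's mapping:

```python
def cf_from_probabilities(p_h, p_h_given_e):
    """Map prior P(H) and posterior P(H|E) to a certainty factor in [-1, 1],
    using the classic MYCIN-style definition (an assumption here, not
    necessarily the thesis's algorithm). Evidence that raises the probability
    of H yields a positive CF; evidence that lowers it yields a negative CF."""
    if not 0 < p_h < 1:
        raise ValueError("prior must be strictly between 0 and 1")
    if p_h_given_e >= p_h:
        return (p_h_given_e - p_h) / (1 - p_h)   # increased belief
    return (p_h_given_e - p_h) / p_h             # increased disbelief
```

For example, a prior of 0.3 raised to a posterior of 0.65 gives CF = 0.35/0.7 = 0.5.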
- Title
- Nonlinear properties of cardiac rhythm abnormalities.
- Creator
- Liebovitch, Larry S., Todorov, Angelo T., Zochowski, Michal, Scheurle, Daniela, Colgin, Laura, Wood, Mark A., Ellenbogen, Kenneth A., Herre, John M., Bernstein, Robert C.
- Date Issued
- 1999-03
- PURL
- http://purl.flvc.org/fau/165473
- Subject Headings
- Biophysics--Research, Medicine--Mathematics, Arrhythmia--Therapy, Fractals, Nonlinear systems, Probabilities--Mathematical models
- Format
- Document (PDF)
- Title
- Avoiding abelian squares in infinite partial words.
- Creator
- Severa, William., Harriet L. Wilkes Honors College
- Abstract/Description
-
Famous mathematician Paul Erdős conjectured the existence of infinite sequences of symbols where no two adjacent subsequences are permutations of one another. It can easily be checked that no such sequence can be constructed using only three symbols, but as few as four symbols are sufficient. Here, we expand this concept to include sequences that may contain 'do not know' characters, called holes. These holes make the undesired subsequences more common. We explore both finite and infinite sequences. For infinite sequences, we use iterating morphisms to construct the non-repetitive sequences with either a finite number of holes or infinitely many holes. We also discuss the problem of using the minimum number of different symbols.
- Date Issued
- 2010
- PURL
- http://purl.flvc.org/FAU/3335460
- Subject Headings
- Abelian groups, Mathematics, Study and teaching (Higher), Combinatorial analysis, Combinatorial set theory, Probabilities
- Format
- Document (PDF)
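An abelian square, as described above, is a pair of adjacent equal-length blocks that are permutations of one another. A minimal detector for ordinary words (ignoring the thesis's hole extension) just compares letter counts of adjacent blocks:

```python
from collections import Counter

def has_abelian_square(word):
    """True if some pair of adjacent equal-length blocks in `word` are
    permutations of one another (an abelian square). This sketch handles
    ordinary words only, not the 'hole' characters studied in the thesis."""
    n = len(word)
    for start in range(n):
        for half in range(1, (n - start) // 2 + 1):
            left = Counter(word[start:start + half])
            right = Counter(word[start + half:start + 2 * half])
            if left == right:
                return True
    return False

# "abcacb" contains the abelian square "abc|acb"; "abc" avoids abelian squares.
assert has_abelian_square("abcacb")
assert not has_abelian_square("abc")
```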
- Title
- Empirical likelihood method for segmented linear regression.
- Creator
- Liu, Zhihua., Charles E. Schmidt College of Science, Department of Mathematical Sciences
- Abstract/Description
-
For a segmented regression system with an unknown change-point over two domains of a predictor, a new empirical likelihood ratio test statistic is proposed to test the null hypothesis of no change. The proposed method is non-parametric and relaxes the assumption on the error distribution. Under the null hypothesis of no change, the proposed test statistic is shown empirically to be Gumbel distributed with robust location and scale parameters under various parameter settings and error distributions. Under the alternative hypothesis with a change-point, comparisons with two other methods (Chen's SIC method and Muggeo's SEG method) show that the proposed method performs better when the slope change is small. A power analysis is conducted to illustrate the performance of the test. The proposed method is also applied to analyze two real datasets: the plasma osmolality dataset and the gasoline price dataset.
- Date Issued
- 2011
- PURL
- http://purl.flvc.org/FAU/3332719
- Subject Headings
- Change-point problems, Regression analysis, Econometrics, Limit theory (Probability theory)
- Format
- Document (PDF)
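The segmented-regression setting above can be sketched with a simple least-squares scan over candidate change-points. Note this uses a residual-sum-of-squares criterion purely for illustration; the thesis replaces it with a non-parametric empirical likelihood ratio statistic:

```python
import numpy as np

def best_changepoint(x, y, min_seg=3):
    """Scan candidate change-points for a two-segment linear fit and return
    the x-location minimizing total residual sum of squares. A least-squares
    sketch only; the thesis's empirical likelihood ratio test replaces this
    RSS criterion."""
    order = np.argsort(x)
    x, y = np.asarray(x, float)[order], np.asarray(y, float)[order]
    best_x, best_rss = None, np.inf
    for k in range(min_seg, len(x) - min_seg):
        rss = 0.0
        for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
            coef = np.polyfit(xs, ys, 1)               # fit each segment
            rss += float(np.sum((ys - np.polyval(coef, xs)) ** 2))
        if rss < best_rss:
            best_x, best_rss = x[k], rss
    return best_x
```

On noiseless data with a slope break at x = 5, the scan recovers the break location.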
- Title
- Determination of probability density from statistical moments by neural network approach.
- Creator
- Zheng, Zhiyin., Florida Atlantic University, Cai, Guo-Qiang, College of Engineering and Computer Science, Department of Ocean and Mechanical Engineering
- Abstract/Description
-
It is known that response probability densities, although important in failure analysis, are seldom achievable for stochastically excited systems, except for linear systems under additive excitations of Gaussian processes. Most often, statistical moments are obtainable analytically or experimentally. It is proposed in this thesis to determine the probability density from the known statistical moments using artificial neural networks. A multi-layered feed-forward neural network with an error back-propagation training algorithm is proposed for the purpose, and the parametric method is adopted for identifying the probability density function. Three examples are given to illustrate the applicability of the approach. All three examples show that the neural network approach gives quite accurate results in comparison with either exact or simulated ones.
- Date Issued
- 1996
- PURL
- http://purl.flvc.org/fcla/dt/15330
- Subject Headings
- Distribution (Probability theory), Moments method (Statistics), Estimation theory, Structural failures--Investigation, Neural networks (Computer science)
- Format
- Document (PDF)
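The parametric idea above (moments in, density parameters out) has a closed form for simple families. A method-of-moments sketch for an assumed gamma density, shown only to illustrate the target of the moments-to-parameters map that the thesis learns with a neural network:

```python
import math

def gamma_from_moments(m1, m2):
    """Recover the (shape, scale) of an assumed gamma density from its first
    two raw moments by the method of moments. The thesis instead trains a
    neural network to perform this map for more general families."""
    var = m2 - m1 ** 2          # central second moment
    shape = m1 ** 2 / var       # k = m1^2 / var
    scale = var / m1            # theta = var / m1
    return shape, scale

def gamma_pdf(x, shape, scale):
    """Gamma density with the given shape and scale parameters."""
    return x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

# A gamma(2, 3) density has m1 = 6 and m2 = 54; the map recovers (2, 3).
assert gamma_from_moments(6.0, 54.0) == (2.0, 3.0)
```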
- Title
- Stochastic optimal impulse control of jump diffusions with application to exchange rate.
- Creator
- Perera, Sandun C., Charles E. Schmidt College of Science, Department of Mathematical Sciences
- Abstract/Description
-
We generalize the theory of stochastic impulse control of jump diffusions introduced by Øksendal and Sulem (2004) under milder assumptions. In particular, we assume that the original process is affected by the interventions. We also generalize the optimal central bank intervention problem including market reaction introduced by Moreno (2007), allowing the exchange rate dynamics to follow a jump diffusion process. We furthermore generalize the approximation theory of stochastic impulse control problems by a sequence of iterated optimal stopping problems, also introduced in Øksendal and Sulem (2004). We develop new results which allow us to reduce a given impulse control problem to a sequence of iterated optimal stopping problems even though the original process is affected by interventions.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/3174308
- Subject Headings
- Management, Mathematical models, Control theory, Stochastic differential equations, Distribution (Probability theory), Optimal stopping (Mathematical statistics), Economics, Mathematical
- Format
- Document (PDF)
- Title
- Influences of Climate variability on Rainfall Extremes of Different Durations.
- Creator
- Metellus, Wilord, Teegavarapu, Ramesh, Florida Atlantic University, College of Engineering and Computer Science, Department of Civil, Environmental and Geomatics Engineering
- Abstract/Description
-
The Intensity Duration Frequency (IDF) relationship curve has provided a crucial design contribution for several decades under the assumption of a stationary climate; nonetheless, the frequency and intensity of extreme rainfall appear to be increasing worldwide. Based on research conducted in recent years, the greatest increases are likely to occur in short-duration storms lasting less than a day, potentially leading to an increase in the magnitude and frequency of flash floods. A trend analysis of the precipitation influencing climate variability and extreme rainfall in the state of Florida is conducted in this study. Since these local changes are potentially or directly related to the surrounding oceanic-atmospheric oscillations, the following oscillations are analyzed or highlighted in this study: Atlantic Multi-Decadal Oscillation (AMO), El Niño Southern Oscillation (ENSO), and Pacific Decadal Oscillation (PDO). Collected throughout the state of Florida, the precipitation data from rainfall gages are grouped and analyzed by duration: short-term (minute), hourly, and daily periods. To assess statistical associations based on the ranks of the data, the non-parametric Kendall's tau and Spearman's rho correlation coefficients are used to determine the orientation of the trend, and the testing results are then used to determine the statistical significance of the analyzed data. The outcome confirms with confidence whether there is an increasing or decreasing trend in precipitation depth in the State of Florida. The main emphasis is on the influence of rainfall extremes of short-term duration over a period of about 50 years. Results from both the Spearman and Mann-Kendall tests show that the greatest percentage of increase occurs during the short rainfall duration period.
The results highlight a tendency of increasing trends in three different regions, two of which are in the central and peninsular region of Florida and one in the continental region. Given its topography and the nature of its surface water, such as the Everglades and Lake Okeechobee, Florida experiences a wide range of weather patterns, resulting in frequent flooding during the wet season and drought in the dry season.
- Date Issued
- 2016
- PURL
- http://purl.flvc.org/fau/fd/FA00004787
- Subject Headings
- Climatic changes, Climate change mitigation, Ocean-atmosphere interaction, Rain and rainfall--Measurement, Rainfall probabilities, Rainfall intensity duration frequencies--Florida
- Format
- Document (PDF)
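The Mann-Kendall test used in the study above rests on a simple rank-based statistic: the sum of signs of all pairwise differences taken in time order. A minimal sketch of that S statistic (the full test additionally compares S against its variance under the null):

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: the sum of sign(y_j - y_i) over all pairs
    i < j. S > 0 suggests an increasing trend, S < 0 a decreasing one;
    significance testing compares S with its null-hypothesis variance."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            # sign of the later-minus-earlier difference; ties contribute 0
            s += (series[j] > series[i]) - (series[j] < series[i])
    return s

# A strictly rising series of length 4 has all 6 pairs positive.
assert mann_kendall_s([1, 2, 3, 4]) == 6
```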
- Title
- A min/max algorithm for cubic splines over k-partitions.
- Creator
- Golinko, Eric David, Charles E. Schmidt College of Science, Department of Mathematical Sciences
- Abstract/Description
-
The focus of this thesis is to statistically model violent crime rates against population over the years 1960-2009 for the United States. We approach this question as one of interest since the trend of population for individual states follows different patterns. We propose here a method which employs cubic spline regression modeling. First we introduce a minimum/maximum algorithm that will identify potential knots. Then we employ least squares estimation to find potential regression coefficients based upon the cubic spline model and the knots chosen by the minimum/maximum algorithm. We then utilize the best subsets regression method to aid in model selection, in which we find the minimum value of the Bayesian Information Criterion. Finally, we present the adjusted R-squared as a measure of overall goodness of fit of our selected model. We have found that among the fifty states and Washington D.C., 42 out of 51 showed an adjusted R-squared value greater than 90%. We also present an overall model of the United States, and we show additional applications of our algorithm for data which show a nonlinear association. It is hoped that our method can serve as a unified model for violent crime rates over future years.
- Date Issued
- 2012
- PURL
- http://purl.flvc.org/FAU/3342107
- Subject Headings
- Spline theory, Data processing, Bayesian statistical decision theory, Data processing, Neural networks (Computer science), Mathematical statistics, Uncertainty (Information theory), Probabilities, Regression analysis
- Format
- Document (PDF)
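The minimum/maximum knot-selection step above can be sketched as a scan for interior points where the series changes direction; the thesis's full method then couples such candidates with least-squares spline fitting and BIC-based subset selection:

```python
def local_extrema_knots(y):
    """Candidate knot indices for a cubic spline fit: interior points where
    the series switches from rising to falling or vice versa. A sketch of the
    min/max idea only; knot pruning via least squares and BIC is separate."""
    knots = []
    for i in range(1, len(y) - 1):
        # A local extremum is a sign change in the first difference.
        if (y[i] - y[i - 1]) * (y[i + 1] - y[i]) < 0:
            knots.append(i)
    return knots

# Rise-fall-rise data has a local max at index 2 and a local min at index 4.
assert local_extrema_knots([0, 1, 2, 1, 0, 1, 2]) == [2, 4]
```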
- Title
- Simplicial matter in discrete and quantum spacetimes.
- Creator
- McDonald, Jonathan Ryan., Charles E. Schmidt College of Science, Department of Physics
- Abstract/Description
-
A discrete formalism for General Relativity was introduced in 1961 by Tullio Regge in the form of a piecewise-linear manifold as an approximation to (pseudo-)Riemannian manifolds. This formalism, known as Regge Calculus, has primarily been used to study vacuum spacetimes, both as an approximation for classical General Relativity and as a framework for quantum gravity. However, there has been no consistent effort to include arbitrary non-gravitational sources into Regge Calculus or examine the structural details of how this is done. This manuscript explores the underlying framework of Regge Calculus in an effort to elucidate the structural properties of the lattice geometry most useful for incorporating particles and fields. Correspondingly, we first derive the contracted Bianchi identity as a guide towards understanding how particles and fields can be coupled to the lattice so as to automatically ensure conservation of source. In doing so, we derive a Kirchhoff-like conservation principle that identifies the flow of energy and momentum as a flux through the circumcentric dual boundaries. This circuit construction arises naturally from the topological structure suggested by the contracted Bianchi identity. Using the results of the contracted Bianchi identity, we explore the generic properties of the local topology in Regge Calculus for arbitrary triangulations and suggest a first-principles definition that is consistent with the inclusion of source. This prescription for extending vacuum Regge Calculus is sufficiently general to be applicable to other approaches to discrete quantum gravity. We discuss how these findings bear on a quantized theory of gravity in which the coupling to source provides a physical interpretation for the approximate invariance principles of the discrete theory.
- Date Issued
- 2009
- PURL
- http://purl.flvc.org/FAU/186691
- Subject Headings
- Special relativity (Physics), Space and time, Distribution (Probability theory), Global differential geometry, Quantum field theory, Mathematical physics
- Format
- Document (PDF)
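In Regge Calculus, as described above, curvature lives on the lattice rather than in a smooth metric: in the simplest two-dimensional case it is concentrated at vertices as a deficit angle, 2π minus the sum of the triangle angles meeting there. A minimal illustration of that core quantity:

```python
import math

def deficit_angle(angles_at_vertex):
    """Deficit angle at a vertex of a 2D simplicial surface: 2*pi minus the
    sum of the angles of the triangles meeting there. Zero deficit means the
    neighborhood is flat; a positive deficit signals concentrated curvature."""
    return 2 * math.pi - sum(angles_at_vertex)

# Six equilateral triangles around a vertex tile the plane: deficit 0.
flat = deficit_angle([math.pi / 3] * 6)
# Five equilateral triangles (an icosahedron vertex) leave a deficit of pi/3.
curved = deficit_angle([math.pi / 3] * 5)
```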