http://www.scholarpedia.org/article/Algorithmic_randomness
"Algorithmic randomness is the study of random individual elements in sample spaces, mostly the set of all infinite binary sequences. An algorithmically random element passes all effectively devised tests for randomness."
An algorithm is a set of instructions that will solve a problem. (I learnt this in my Further Maths AS level, in a topic called Decision Maths 1.) Algorithms are mainly used by computers.
I've begun to realise that a lot of the information about random mechanisms is very complex. I have two choices:
1. Don't go into much detail about the random mechanisms, because it is a very complicated and vast subject. This will make writing that part of my dissertation much easier.
2. Research random mechanisms IN DEPTH so that my dissertation will be more informed. I will then learn a lot more, which is what this project is all about.
Number 2 it is.
This is an extract from the same webpage as the above quote:
"The theory of algorithmic randomness tries to clarify what it means for an individual element of a sample space, e.g. a sequence of coin tosses, represented as a binary string, to be random. While Kolmogorov's formalization of classical probability theory assigns probabilities to sets of outcomes and determines how to calculate with such probabilities, it does not distinguish between individual random and non-random elements. For example, under a uniform distribution, the outcome "000000000000000....0" (n zeros) has the same probability as any other outcome of n coin tosses, namely 2^(-n). However, there is an intuitive feeling that a sequence of all zeros is not very random. This is even more so when looking at infinite sequences. It seems desirable to clarify what we mean when we speak of a random object. The modern view of algorithmic randomness proposes three paradigms to distinguish random from non-random elements.
Unpredictability: It should be impossible to win against a random sequence in a fair betting game when using a feasible betting strategy.
Incompressibility: It should be impossible to feasibly compress a random sequence.
Measure theoretical typicalness: Random sequences pass every feasible statistical test.
It is the characteristic feature of algorithmic randomness that it interprets feasible as algorithmically feasible. "
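The incompressibility paradigm from the extract above can actually be seen in practice. As a rough illustration (using zlib, a practical everyday compressor, not the idealised algorithmic compressor the theory is really about), a highly regular sequence like "all zeros" shrinks dramatically when compressed, whereas an unpredictable sequence barely shrinks at all:

```python
import os
import zlib

# A highly regular sequence: 1000 zero bytes.
regular = b"\x00" * 1000

# An unpredictable sequence: 1000 bytes from the operating system's
# cryptographic random source.
random_bytes = os.urandom(1000)

# Regular data compresses to a tiny fraction of its original size;
# random data stays roughly the same size (or even grows slightly).
print(len(zlib.compress(regular)))       # far smaller than 1000
print(len(zlib.compress(random_bytes)))  # close to 1000
```

This is only an analogy for the theory: algorithmic randomness uses idealised, uncomputable measures of compressibility (Kolmogorov complexity), but the intuition that "random means incompressible" is the same.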
This indicates that there are certain properties that a sequence must have for it to seem random. The fact that people are aware of these properties and can recreate them in something like a random number generator shows that, although one cannot predict a random sequence, a sequence can be identified as random, and it is possible to produce sequences that easily pass as random.
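To give a flavour of what "passing a statistical test" means, here is a minimal sketch of one of the simplest such tests (a frequency or "monobit" test; the 5-percentage-point cutoff is an arbitrary choice for illustration, not part of any standard):

```python
import random

def monobit_test(bits):
    """Crude frequency test: in a random bit sequence, roughly half the
    bits should be 1. Returns True if the proportion of ones is within
    5 percentage points of 0.5 (an arbitrary illustrative cutoff)."""
    ones = sum(bits)
    return abs(ones / len(bits) - 0.5) < 0.05

all_zeros = [0] * 1000
coin_flips = [random.randint(0, 1) for _ in range(1000)]

print(monobit_test(all_zeros))   # False: all zeros fails even this simple test
print(monobit_test(coin_flips))  # very likely True for genuine coin flips
```

Real test suites apply many tests like this at once (runs of identical bits, patterns, and so on), which is how a sequence can be "identified as random" in practice: not by prediction, but by failing to find any detectable regularity.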
Wednesday, 8 September 2010