Because all my quotes and sources are scattered around this blog, I think it would be helpful to have one post with all my sources so far, each linked to a section of my dissertation. This will make it easier when it comes to writing each section because I won't miss anything out.
The history of randomness - why is it important?:
Randomness by Deborah J. Bennett: "The first atomist, Leucippus (circa 450 B.C.), said, 'Nothing happens at random; everything happens out of reason and by necessity'. The atomic school contended that chance could not mean uncaused, since everything is caused. Chance must instead mean hidden cause."
Another quote from Randomness:"[...] Newtonian physics - a system of thought which represented the full bloom of the Scientific Revolution in the late seventeenth century. [...] a belief developed among scientists that everything about the natural world was knowable through mathematics. And if everything conformed to mathematics, then a Grand Designer must exist. Pure chance or randomness had no place in this philosophy."
Another quote: "Though not always recognized or acknowledged as such, chance mechanisms have been used since antiquity: to divide property, delegate civic responsibilities or privileges, settle disputes among neighbors, choose which strategy to follow in the course of battle, and drive the play in games of chance."
Wikipedia - "Some theologians have attempted to resolve the apparent contradiction between an omniscient deity, or a first cause, and free will using randomness. Discordians have a strong belief in randomness and unpredictability. Buddhist philosophy states that any event is the result of previous events (karma), and as such, there is no such thing as a random event or a first event.
A quote by Jim French (physics PhD) : "The concept of being able to predict and describe the future behaviour of anything we wanted dates back to the 18th century and is typically associated with the French mathematician Pierre-Simon Laplace and something called Laplace's Demon, which is some hypothetical creature that he thought of as (in principle and possibly way into the future) possessing sufficient knowledge of all the positions and speeds of all particles in the universe as to be able to perfectly predict the entire future evolution of the universe. This concept of causal or Newtonian determinism seemed unavoidable at the time, though it had some uncomfortable questions for the nature of free will, until the beginning of the twentieth century, when it ran into the twin problems of quantum theory and chaos theory. The latter presents no conflict with the principle of determinism, but it does in a practical sense. The physics of chaotic systems are still governed by underlying deterministic processes, but what was realised in the middle of the last century was that putting any knowledge of initial conditions into an actual prediction was far harder than previously realised. A small lack of precision in our knowledge of an initial state can quickly lead into huge uncertainty in the state of the system at some later time."
A quote from wikipedia: "In ancient history, the concepts of chance and randomness were intertwined with that of fate. Many ancient peoples threw dice to determine fate, and this later evolved into games of chance. Most ancient cultures used various methods of divination to attempt to circumvent randomness and fate.
The Chinese were perhaps the earliest people to formalize odds and chance 3,000 years ago. The Greek philosophers discussed randomness at length, but only in non-quantitative forms. It was only in the sixteenth century that Italian mathematicians began to formalize the odds associated with various games of chance. The invention of the calculus had a positive impact on the formal study of randomness. In the 1888 edition of his book The Logic of Chance John Venn wrote a chapter on "The conception of randomness" which included his view of the randomness of the digits of the number Pi by using them to construct a random walk in two dimensions.
The early part of the twentieth century saw a rapid growth in the formal analysis of randomness, as various approaches to the mathematical foundations of probability were introduced. In the mid to late twentieth century, ideas of algorithmic information theory introduced new dimensions to the field via the concept of algorithmic randomness.
Although randomness had often been viewed as an obstacle and a nuisance for many centuries, in the twentieth century computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases such randomized algorithms outperform the best deterministic methods."
Probability:
Laura Wherity: "Rolling a die may appear to be random, but in fact it depends on your starting conditions. For example, if you could control the experiment such that the die is always rolled from the same height, at the same angle with the same forces etc. then it should be possible to achieve the same outcome each time. What would appear to be random actually depends on the starting state. Extending this idea, it may be possible to control the starting conditions of other events aswell, so in this sense events that appear 'random' at present may become more predictable in the future as we understand the conditions in more detail. Another good example of this are the weather models used for predicting the weather in forecasts. The better we can get at determining the initial conditions, the better our models will become. Of course in certain situations there may be a limit to the accuracies involved, and thus exact predictions or modelling may not be possible."
Jonathan Wright: "As an applied mathematician, all physical situations can be modelled mathematically, and as such we can predict all possible outcomes. If we roll a dice in exactly the same way 100 times, 100 times it would give us the same result. If we model the roll of the die, given the starting conditions we could predict the outcome every time."
Randomness - Deborah J Bennett
Also Chance and Reckoning with Risk
Short Introduction to Chaos and Quantum Mechanics:
Jim French: "However, in quantum theory (at least in the Copenhagen interpretation), it is meaningless to speak of a property of a particle (such as its position) before we go in and measure it. The particle is not sitting there, waiting for us to shine a light on it, revealing its location. All we can talk of is the probability of observing it in one place and not another. The Heisenberg Uncertainty Principle is related to this concept and it states that our certainty in predicting a particle's velocity is limited by our certainty in measuring its position (and vice versa). The more precisely we know where a particle is, the less precision we can have in knowing how fast it is moving. To be clear, this is not an engineering limitation, something that will be overcome in a hundred years’ time with improved technology; it is a fundamental property of nature. Before we have made a particular measurement, it is meaningless to talk of a particle’s position etc, since such properties simply do not exist. This has implications for predictability and randomness, since if a particle’s position (or velocity etc) does not objectively exist, it is impossible to predict precisely what that position will be measured to be and what the subsequent evolution of a system of particles will be."
Another quote: "...quantum mechanics gave predictions which couldn’t be explained by any locally real theory (that is, any theory which pictured particles as having objectively real, well-defined properties). Various experiments have demonstrated that nature does indeed obey the rules of quantum physics and we must therefore adopt this peculiar view of nature based probability and abstraction, rather than concrete realism."
Jonathan Wright: "Quantum theory on the other hand, may also appear to be random, but similarly I think it is just not fully understood. We may not know the exact position of electrons in an atom, so instead we give electrons a 'probability' of being in certain positions or states. This doesn't mean that the electrons are in a random place, just that we are unable to observe their exact position. (In fact, and here is where you should ask a physicist, I think the very process of looking into an atom changes the states of the electrons..So we dont know.) But does this make it random?"
Randomness: "Chaos theory, the science which predicts that the future state of most systems is unpredictable due to even small initial uncertainties, holds new meaning for the notion of randomness, and simulating these systems requires huge numbers of random digits. It has been shown that with even small deterministic systems, initial observational error and tiny disturbances grown exponentially and create enormous problems with predictability in the long run"
Wikipedia: "According to several standard interpretations of quantum mechanics, microscopic phenomena are objectively random. That is, in an experiment where all causally relevant parameters are controlled, there will still be some aspects of the outcome which vary randomly. An example of such an experiment is placing a single unstable atom in a controlled environment; it cannot be predicted how long it will take for the atom to decay; only the probability of decay within a given time can be calculated. Thus, quantum mechanics does not specify the outcome of individual experiments but only the probabilities. Hidden variable theories are inconsistent with the view that nature contains irreducible randomness: such theories posit that in the processes that appear random, properties with a certain statistical distribution are somehow at work "behind the scenes" determining the outcome in each case."
Chaos by James Gleick:
"The rotation of the waterwheel shares some of the properties of the rotating cylinders of the fluid in the process of convection. [...] Water pours in from the top at a steady rate. If the flow of the water in the waterwheel is slow, the top bucket never fills up enough to overcome friction, and the wheel never starts turning. [...]
If the flow is faster, the weight of the top bucket sets the wheel in motion (left). The waterwheel can settle into a rotation that continues at a steady rate (center).
But if the flow is faster still (right), the spin can become chaotic, because of the nonlinear effects built into the system. As buckets pass under the flowing water, how much they fill depends on the speed of the spin. If the wheel is spinning rapidly, the buckets have little time to fill up. [...] Also, if the wheel is spinning rapidly, buckets can start up the other side before they have time to empty. As a result, heavy buckets on the side moving upward can cause the spin to slow down and then reverse."
John Polkinghorne - Quantum Theory: A very Short Introduction
Introduction to random time and quantum randomness - Kai Lai Chung
Quantum: A guide for the perplexed - Jim Al-Khalili.
Quantum - Manjit Kumar
Random number generators:
Robert R. Coveyou (American mathematician) - "The generation of random numbers is too important to be left to chance."
Randomness - "Within any sequence generated by the computer through a programmed algorithm or formula, the next digit is a completely deterministic choice, not random in the sense that a dice throw, a spinning disc, an electronic pulse or even the infinite digits of the mysterious pi are random. The very notion that a deterministic formula could generate a random sequence seemed like a contradiction".
http://www.scholarpedia.org/article/Algorithmic_randomness: "Algorithmic randomness is the study of random individual elements in sample spaces, mostly the set of all infinite binary sequences. An algorithmically random element passes all effectively devised tests for randomness."
Uncertainty and Unpredictability:
A quote from Jim French: "Essentially, yes, in principle, if we knew enough about all the degrees of freedom (at its finest point, the positions and momenta of all particles in the system, though I suspect this could be done on a classical level, without recourse to quantum considerations), we could predict the result. Practically speaking, though, no. We would either need to set up such a precisely controlled system or know so much about the system under consideration, that it would be impractical and/or would require the power of a supercomputer that has better things to do with its time. At the very least, predicting the behaviour of a die with well-measured properties would actually be pretty trivial and it tells you little about whether or not truly random things do actually exist."
Jonathan Wright: "However, with both the roll of the dice or in predicting the weather, it is this 'knowing' of the starting conditions which creates the randomness that we experience I every day life. In the weather models, if your temperature measurement is off by 0.01 degrees, eventually, perhaps in hours, days or weeks time, the predictions made by the model will become drastically different from those you experience. In fact, this was how chaos was discovered; a seemingly well understood piece of theory, when run on a computer on two occasions, gave two drastically different answers with seemingly the same starting values. The difference was attributed to a difference in the 6th decimal place of the starting values.."
Randomness: "Chance is a fair way to determine moves in some games and in certain real-life situations; the random element allows each participant to believe, 'I have an oppurtunity equal to that of my opponent.'"
Reckoning with Risk
Conclusion
Okay, I didn't manage to include every single relevant quote in this, but I have written the titles of books that I may need to refer to.
Making this plan has made me feel a lot more confident about writing my dissertation.
Things to do this week:
Update GANTT Chart
Write Random number generator section of dissertation!!!
Tuesday, 28 September 2010
Final reply to the email
Yesterday, Jim French replied to my email:
"Hi Zainab,
Randomness is difficult to define. With regard to your first question, I think it's fair to say it's not clear that it is a proper question - that is to say, if we equate randomness with unpredictability, then of course we wouldn't be able to predict any random event (by definition). If you mean could we predict the behaviour of systems typically described as random (such as the roll of a die), the answer would be... it depends. Essentially, yes, in principle, if we knew enough about all the degrees of freedom (at its finest point, the positions and momenta of all particles in the system, though I suspect this could be done on a classical level, without recourse to quantum considerations), we could predict the result. Practically speaking, though, no. We would either need to set up such a precisely controlled system or know so much about the system under consideration, that it would be impractical and/or would require the power of a supercomputer that has better things to do with its time. At the very least, predicting the behaviour of a die with well-measured properties would actually be pretty trivial and it tells you little about whether or not truly random things do actually exist.
The concept of being able to predict and describe the future behaviour of anything we wanted dates back to the 18th century and is typically associated with the French mathematician Pierre-Simon Laplace and something called Laplace's Demon, which is some hypothetical creature that he thought of as (in principle and possibly way into the future) possessing sufficient knowledge of all the positions and speeds of all particles in the universe as to be able to perfectly predict the entire future evolution of the universe. This concept of causal or Newtonian determinism seemed unavoidable at the time, though it had some uncomfortable questions for the nature of free will, until the beginning of the twentieth century, when it ran into the twin problems of quantum theory and chaos theory. The latter presents no conflict with the principle of determinism, but it does in a practical sense. The physics of chaotic systems are still governed by underlying deterministic processes, but what was realised in the middle of the last century was that putting any knowledge of initial conditions into an actual prediction was far harder than previously realised. A small lack of precision in our knowledge of an initial state can quickly lead into huge uncertainty in the state of the system at some later time. Plenty of physical systems are not chaotic and (fortunately) perfectly predictable, but this is not so for many other systems (such as the weather, where even with huge computing power at our disposal, we struggle to make good, accurate and specific predictions more than a week ahead).
The most important concept at play in the existence (or lack thereof) of true randomness is quantum theory. Causal determinism assumes a realist view of the world - objects in it have definite, objective properties that are true regardless of our having measured or observed those properties (the moon does exist and it is it not made of cheese and this remains true whether or not I choose to taste a mouthful). However, in quantum theory (at least in the Copenhagen interpretation), it is meaningless to speak of a property of a particle (such as its position) before we go in and measure it. The particle is not sitting there, waiting for us to shine a light on it, revealing its location. All we can talk of is the probability of observing it in one place and not another. The Heisenberg Uncertainty Principle is related to this concept and it states that our certainty in predicting a particle's velocity is limited by our certainty in measuring its position (and vice versa). The more precisely we know where a particle is, the less precision we can have in knowing how fast it is moving. To be clear, this is not an engineering limitation, something that will be overcome in a hundred years’ time with improved technology; it is a fundamental property of nature. Before we have made a particular measurement, it is meaningless to talk of a particle’s position etc, since such properties simply do not exist. This has implications for predictability and randomness, since if a particle’s position (or velocity etc) does not objectively exist, it is impossible to predict precisely what that position will be measured to be and what the subsequent evolution of a system of particles will be.
Of course, it could be objected that this is only according to quantum theory and that theory may be incorrect. Indeed, the theory was not (and I suppose is still not) uncontroversial and its most famous detractor was Albert Einstein. He helped found the subject, but came to reject the theory as it was developed and pursued his own independent (and ultimately unsuccessful) line of research. He disliked the interpretation of nature as probabilistic at heart and famously declared “God does not play dice”. He developed various different thought experiments to try to show that quantum mechanics, as formulated at the time, was incomplete and led to contradictions and paradoxes. None of these convinced the mainstream, but one of the most intriguing was called the Einstein-Podolsky-Rosen (EPR) paradox. The details of the proposal aren’t important here, but they led a British physicist called John Bell to formulate Bell’s theorem. This showed that quantum mechanics gave predictions which couldn’t be explained by any locally real theory (that is, any theory which pictured particles as having objectively real, well-defined properties). Various experiments have demonstrated that nature does indeed obey the rules of quantum physics and we must therefore adopt this peculiar view of nature based on probability and abstraction, rather than concrete realism.
In answer to your questions, then: yes, randomness does exist in nature and it is found in quantum processes. Radioactivity, for example, is governed by quantum physics. There is simply no way to predict when a radioactive nucleus will decay and it may be considered genuinely random.
There are all sorts of books out there that deal with the subjects of randomness, chaos theory and quantum physics. The standard popular exposition of chaos theory is Chaos by James Gleick. A good recent book that deals with randomness is The Drunkard’s Walk: How Randomness Rules Our Lives by Leonard Mlodinow. Having not read any pop science books about quantum theory for years, I can’t give many recommendations, but any library or book shop will be stuffed with many that all cover the same ground. One that was particularly important to me as a teenager (though it is about a particular aspect of quantum physics rather than a general introduction) was QED: The Strange Theory of Light and Matter by Richard Feynman. You just need to google things like the EPR paradox and the uncertainty principle to find out about them (and they are fascinating subjects). Wikipedia has large articles on them.
If you have any more questions, I’d be happy to answer them."
So, to conclude, all the graduates believe that for most random events it is possible, at least in principle, to predict the outcome. However, there are concepts such as chaos and quantum theory in which the outcome cannot be predicted, and so pure randomness is present.
"Hi Zainab,
Randomness is difficult to define. With regard to your first question, I think it's fair to say it's not clear that it is a proper question - that is to say, if we equate randomness with unpredictability, then of course we wouldn't be able to predict any random event (by definition). If you mean could we predict the behaviour of systems typically described as random (such as the roll of a die), the answer would be... it depends. Essentially, yes, in principle, if we knew enough about all the degrees of freedom (at its finest point, the positions and momenta of all particles in the system, though I suspect this could be done on a classical level, without recourse to quantum considerations), we could predict the result. Practically speaking, though, no. We would either need to set up such a precisely controlled system or know so much about the system under consideration, that it would be impractical and/or would require the power of a supercomputer that has better things to do with its time. At the very least, predicting the behaviour of a die with well-measured properties would actually be pretty trivial and it tells you little about whether or not truly random things do actually exist.
The concept of being able to predict and describe the future behaviour of anything we wanted dates back to the 18th century and is typically associated with the French mathematician Pierre-Simon Laplace and something called Laplace's Demon, which is some hypothetical creature that he thought of as (in principle and possibly way into the future) possessing sufficient knowledge of all the positions and speeds of all particles in the universe as to be able to perfectly predict the entire future evolution of the universe. This concept of causal or Newtonian determinism seemed unavoidable at the time, though it had some uncomfortable questions for the nature of free will, until the beginning of the twentieth century, when it ran into the twin problems of quantum theory and chaos theory. The latter presents no conflict with the principle of determinism, but it does in a practical sense. The physics of chaotic systems are still governed by underlying deterministic processes, but what was realised in the middle of the last century was that putting any knowledge of initial conditions into an actual prediction was far harder than previously realised. A small lack of precision in our knowledge of an initial state can quickly lead into huge uncertainty in the state of the system at some later time. Plenty of physical systems are not chaotic and (fortunately) perfectly predictable, but this is not so for many other systems (such as the weather, where even with huge computing power at our disposal, we struggle to make good, accurate and specific predictions more than a week ahead).
The most important concept at play in the existence (or lack thereof) of true randomness is quantum theory. Causal determinism assumes a realist view of the world - objects in it have definite, objective properties that are true regardless of our having measured or observed those properties (the moon does exist and it is it not made of cheese and this remains true whether or not I choose to taste a mouthful). However, in quantum theory (at least in the Copenhagen interpretation), it is meaningless to speak of a property of a particle (such as its position) before we go in and measure it. The particle is not sitting there, waiting for us to shine a light on it, revealing its location. All we can talk of is the probability of observing it in one place and not another. The Heisenberg Uncertainty Principle is related to this concept and it states that our certainty in predicting a particle's velocity is limited by our certainty in measuring its position (and vice versa). The more precisely we know where a particle is, the less precision we can have in knowing how fast it is moving. To be clear, this is not an engineering limitation, something that will be overcome in a hundred years’ time with improved technology; it is a fundamental property of nature. Before we have made a particular measurement, it is meaningless to talk of a particle’s position etc, since such properties simply do not exist. This has implications for predictability and randomness, since if a particle’s position (or velocity etc) does not objectively exist, it is impossible to predict precisely what that position will be measured to be and what the subsequent evolution of a system of particles will be.
Of course, it could be objected that this is only according to quantum theory and that theory may be incorrect. Indeed, the theory was not (and I suppose is still not) uncontroversial and its most famous detractor was Albert Einstein. He helped found the subject, but came to reject the theory as it was developed and pursued his own independent (and ultimately unsuccessful) line of research. He disliked the interpretation of nature as probabilistic at heart and famously declared “God does not play dice”. He developed various different thought experiments to try to show that quantum mechanics, as formulated at the time, was incomplete and led to contradictions and paradoxes. None of these convinced the mainstream, but one of the most intriguing was called the Einstein-Podolsky-Rosen (EPR) paradox. The details of the proposal aren’t important here, but they led a British physicist called John Bell to formulate Bell’s theorem. This showed that quantum mechanics gave predictions which couldn’t be explained by any locally real theory (that is, any theory which pictured particles as having objectively real, well-defined properties). Various experiments have demonstrated that nature does indeed obey the rules of quantum physics and we must therefore adopt this peculiar view of nature based probability and abstraction, rather than concrete realism.
In answer to your questions, then: yes, randomness does exist in nature and it is found in quantum processes. Radioactivity, for example, is governed by quantum physics. There is simply no way to predict when a radioactive nucleus will decay and it may be considered genuinely random.
There are all sorts of books out there that deal with the subjects of randomness, chaos theory and quantum physics. The standard popular exposition of chaos theory is Chaos by James Gleick. A good recent book that deals with randomness is The Drunkard’s Walk: How Randomness Rules Our Lives by Leonard Mlodinow. Having not read any pop science books about quantum theory for years, I can’t give many recommendations, but any library or book shop will be stuffed with many that all cover the same ground. One that was particularly important to me as a teenager (though it is about a particular aspect of quantum physics rather than a general introduction) was QED: The Strange Theory of Light and Matter by Richard Feynman. You just need to google things like the EPR paradox and the uncertainty principle to find out about them (and they are fascinating subjects). Wikipedia has large articles on them.
If you have any more questions, I’d be happy to answer them."
So, to conclude, all the graduates believe that for most random events, it is possible to predict the outcome. However, there are concepts such as chaos and quantum theory in which the outcome cannot be predicted and so pure randomness is present.
Sunday, 26 September 2010
Great news! And some bad news.
I've finally finished reading Randomness! The final chapter is called "Paradoxes in Probability" and was about probability-related problems that I have come across before. These included the probability of two people sharing the same birthday and the Monty Hall problem.
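Since the birthday problem came up, I tried a quick Python sketch of it (my own, not from the book) to check the surprising answer that 23 people are enough for a better-than-even chance of a shared birthday. It works out the exact probability and also estimates it by simulation.

    import random

    def birthday_prob_exact(n):
        """Exact probability that at least two of n people share a birthday (365 equally likely days)."""
        p_all_different = 1.0
        for i in range(n):
            p_all_different *= (365 - i) / 365
        return 1 - p_all_different

    def birthday_prob_simulated(n, trials=50000):
        """Estimate the same probability by simulating lots of groups of n people."""
        hits = 0
        for _ in range(trials):
            birthdays = [random.randint(1, 365) for _ in range(n)]
            if len(set(birthdays)) < n:
                hits += 1
        return hits / trials

    print(birthday_prob_exact(23))      # about 0.507 - more likely than not
    print(birthday_prob_simulated(23))  # should come out close to the exact value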
The next book that I am going to read is Quantum by Manjit Kumar. This is the first book that I've read about quantum mechanics so it should be interesting.
Unfortunately, my laptop is broken and I'm not sure when it will be fixed. Luckily I can blog via my mobile, but a lot of the work that I do now will have to be done in a library. This is the main reason why I haven't finished writing the "random number generators" section of my dissertation yet. This will be a priority for this week though.
Thursday, 23 September 2010
Another reply to my email.
Yesterday Laura Wherity, a maths graduate, replied to the email I sent out:
"Question 1:
Rolling a die may appear to be random, but in fact it depends on your starting conditions. For example, if you could control the experiment such that the die is always rolled from the same height, at the same angle with the same forces etc. then it should be possible to achieve the same outcome each time. What would appear to be random actually depends on the starting state. Extending this idea, it may be possible to control the starting conditions of other events as well, so in this sense events that appear 'random' at present may become more predictable in the future as we understand the conditions in more detail. Another good example of this is the weather models used for predicting the weather in forecasts. The better we can get at determining the initial conditions, the better our models will become. Of course in certain situations there may be a limit to the accuracies involved, and thus exact predictions or modelling may not be possible.
Question 2:
I liked your comment that randomness is subjective - this may well be true, mainly depending on people's understanding of the models. This links into the comments in the previous paragraph - the idea of the randomness of rolling a die depends on how well informed a person might be. To some it may be random, to others it may be predictable.
An area of maths associated with randomness is chaos theory - a book that has been recommended (although I have not read it) is James Gleick - Chaos. I am afraid I do not know much about chaos, although the basics of it are that a small change to the starting conditions can start to spiral out of control and lead to large changes. The book appears to be an overview of ideas regarding chaos, with some maths in it although only a small amount.
I hope some of this helps.
Good luck with the project,
Laura Wherity"
The "Question 1" section to Laura Wherity's email is similar to that of Jonathan Wright's. They both believe that a random event can be calculated if we know all the conditions of the situation. This seems very logical. In the Mechanics 1 module of my Further Maths AS level, I learnt a number of equations that involve the conditions of an object in which you can find out things about it such as its mass or speed. If we have enough information, we can mathematically figure out the way in which an event will happen.
Laura Wherity then recommends the book Chaos by James Gleick. I have already bought that book, so at least I know that my bibliography is on the right track. Basically all of my books will be relevant to my project, which is reassuring.
"Question 1:
Rolling a die may appear to be random, but in fact it depends on your starting conditions. For example, if you could control the experiment such that the die is always rolled from the same height, at the same angle with the same forces etc. then it should be possible to achieve the same outcome each time. What would appear to be random actually depends on the starting state. Extending this idea, it may be possible to control the starting conditions of other events aswell, so in this sense events that appear 'random' at present may become more predictable in the future as we understand the conditions in more detail. Another good example of this are the weather models used for predicting the weather in forecasts. The better we can get at determining the initial conditions, the better our models will become. Of course in certain situations there may be a limit to the accuracies involved, and thus exact predictions or modelling may not be possible.
Question 2:
I liked your comment that randomness is subjective - this may well be true, mainly depending on people's understanding of the models. This links into the comments in the previous paragraph - the idea of the randomness of rolling a die depends on how well informed a person might be. To some it may be random, to others it may be predictable.
An area of maths associated with randomness is chaos theory - a book that has been recommended (although I have not read it) is James Gleick - chaos. I am afraid I do not know much about chaos, although the basics of it are that a small change to the starting conditions can start to spiral out of control and lead to large changes. The book appears to be an overview of ideas regarding to chaos, with some maths in it although only a small amount.
I hope some of this helps.
Good luck with the project,
Laura Wherity"
The "Question 1" section to Laura Wherity's email is similar to that of Jonathan Wright's. They both believe that a random event can be calculated if we know all the conditions of the situation. This seems very logical. In the Mechanics 1 module of my Further Maths AS level, I learnt a number of equations that involve the conditions of an object in which you can find out things about it such as its mass or speed. If we have enough information, we can mathematically figure out the way in which an event will happen.
Laura Wherity then recommends the book Chaos by James Gleick. I have already bought that book, so at least I know that my bibliography is on the right track. Basically all of my books will be relevant to my project, which is reassuring.
Wednesday, 22 September 2010
What have I done today?
I have just discovered that I can blog via my mobile phone! This is the first blog that I am doing in this way and I can tell that this will definitely make it easier for me to document my project.
Today I gave Mr Wright my grant proposal. He talked to the librarians in my school and they are going to buy the books that I listed so that I can borrow them and after that, they will be available to all students in my school. The books should be arriving next week, which will mean that I will be behind on my GANTT chart once again. Hopefully I can get some more stuff done this week to make up for waiting for the remaining books.
I began writing the "random number generators" section to my dissertation today. So far so good, should be done by Friday if I really push myself.
Updated Bibliography
BOOKS
Oxford English Dictionary
Quantum Theory: A Very Short Introduction - John Polkinghorne
Chance - Amir Aczel
Reckoning with Risk - Gerd Gigerenzer
Chaos - James Gleick
Randomness - Deborah J. Bennett
Does God Play Dice? - Ian Stewart
Introduction to Random Time and Quantum Randomness - Kai Lai Chung
Quantum: A Guide for the Perplexed - Jim Al-Khalili
Quantum - Manjit Kumar
WEBSITES
http://en.wikipedia.org/wiki/Randomness
http://www.igs.net/~cmorris/index_subject.htm
http://en.wikipedia.org/wiki/Pseudorandom_number_generator
RANDOM.ORG
http://www.fortunecity.com/emachines/e11/86/random.html
http://t1.gstatic.com/images?q=tbn:ANd9GcQk5Bw3BdGRlboekIxXWw8YvWhsHzVQFK8tM8s7vSRCWEUaOsE&t=1&usg=__CxFNmGp1uPBw86HSTTmeF9oBhEw=
http://www.goodreads.com/book/show/441215.Chance
http://ezinearticles.com/?Book-Review---Chance,-by-Amir-D-Aczel&id=3507603
http://www.faqs.org/docs/qp/chap01.html
http://www.scholarpedia.org/article/Algorithmic_randomness
PEOPLE
Ben Green and Imre Leader
Einstein
Tony Hillerman
Robert R. Coveyou
Jonathan Wright
Improved structure of my dissertation
After having my supervision with Ms Caroussis, I realised that I need to change my dissertation slightly so that it includes a part on the history of the debate about randomness and why randomness is important.
My previous structure was:
Introduction: Includes definitions of random and examples of random in everyday life. (approximately 500 words)
Probability: Probability explained and why it implies that random does not exist, and how random events are not as unpredictable as people think. Possible topics to include : reference to Chance by Amir Aczel, gambling, dice throws. (approximately 1000 words)
Short Introduction to Chaos and Quantum Mechanics: Proof of why pure randomness does exist. May also include other topics that I may discover. (approximately 1000 words)
How do some random mechanisms work?: Random number generators and other objects that have been programmed to be "random". (1000 words)
Uncertainty and Unpredictability. How should we cope with random events? How should one go about handling the subject of random? (1000 words)
Conclusion: Do I believe that random really exists? What have I learnt about random by doing this investigation? (500 words)
My new structure will be:
Introduction: Includes definitions of random and examples of random in everyday life. (approximately 500 words)
The history of randomness - why is it important? (approximately 500 words)
Probability: Probability explained and why it implies that random does not exist, and how random events are not as unpredictable as people think. Possible topics to include : reference to Chance by Amir Aczel, gambling, dice throws. (approximately 1000 words)
Short Introduction to Chaos and Quantum Mechanics: Proof of why pure randomness does exist. May also include other topics that I may discover. (approximately 1000 words)
How do some random mechanisms work?: Random number generators and other objects that have been programmed to be "random". (1000 words)
Uncertainty and Unpredictability. How should we cope with random events? How should one go about handling the subject of random? (500 words)
Conclusion: Do I believe that random really exists? What have I learnt about random by doing this investigation? (500 words)
I have reduced the "Uncertainty and Unpredictability" section from 1000 words to 500 words, so that the section about the history of randomness and its importance can be included, at 500 words. I chose to cut down the Uncertainty and Unpredictability part because, in comparison to the other "chapters", I don't think it is as important.
A reply to the email
Last night I got a reply from the email I sent out to the maths and physics graduates. Jonathan Wright, a maths graduate, said:
"In my opinion, there is no such thing as a random event. As an applied mathematician, all physical situations can be modelled mathematically, and as such we can predict all possible outcomes. If we roll a dice in exactly the same way 100 times, 100 times it would give us the same result. If we model the roll of the die, given the starting conditions we could predict the outcome every time.
Predictions are made using models such as this all the time, a classic example being those used by weathermen every day to make their forecasts. Granted, the models can always be improved, but in theory if we had every piece of information (all temperatures, all pressures all over the world etc), and the perfect model, we could exactly predict the future weather.
However, with both the roll of the dice or in predicting the weather, it is this 'knowing' of the starting conditions which creates the randomness that we experience in everyday life. In the weather models, if your temperature measurement is off by 0.01 degrees, eventually, perhaps in hours, days or weeks time, the predictions made by the model will become drastically different from those you experience. In fact, this was how chaos was discovered; a seemingly well understood piece of theory, when run on a computer on two occasions, gave two drastically different answers with seemingly the same starting values. The difference was attributed to a difference in the 6th decimal place of the starting values.
And here lies the problem, at some point you reach the limit of your accuracy. If you know the starting position of your dice to 1000 decimal places, the difference in the 1001st decimal place coupled with a similar error in your understanding of the spin the die is thrown with etc may result in a different outcome. A good book on this is by Ian Stewart, Does God Play Dice? The Mathematics of Chaos.
Quantum theory on the other hand, may also appear to be random, but similarly I think it is just not fully understood. We may not know the exact position of electrons in an atom, so instead we give electrons a 'probability' of being in certain positions or states. This doesn't mean that the electrons are in a random place, just that we are unable to observe their exact position. (In fact, and here is where you should ask a physicist, I think the very process of looking into an atom changes the states of the electrons. So we don't know.) But does this make it random?
Similarly, is an earthquake a random event? To someone capable of studying the inner fluid dynamics of the earth, and properties of its crust etc, then it is not random, in fact with the right models predictable. But will humans ever be able to actually predict such events precisely? Probably not, because we will never be able to know the starting conditions to the required accuracy."
So, to conclude, Jonathan Wright believes that pure randomness doesn't exist, because with mathematical models we could, in principle, accurately predict so-called "random" events. Also, he discusses the fact that a random event could be subjective - it really depends on how knowledgeable a person is. Therefore, things that seem random to us right now may just be things that are too mathematically complex for us to understand, or things that we do not have much information about.
This reply has helped me look at randomness from a mathematician's point of view. Jonathan Wright touches upon the different areas of randomness that I will be covering in my dissertation, so all of this will be very helpful to me.
"In my opinion, there is no such thing as a random event. As an applied mathematician, all physical situations can be modelled mathematically, and as such we can predict all possible outcomes. If we roll a dice in exactly the same way 100 times, 100 times it would give us the same result. If we model the roll of the die, given the starting conditions we could predict the outcome every time.
Predictions are made using models such as this all the time, a classic example being those used by weathermen every day to make their forecasts. Granted, the models can always be improved, but in theory if we had every piece of information (all temperatures, all pressures all over the world etc), and the perfect model, we could exactly predict the future weather.
However, with both the roll of the dice or in predicting the weather, it is this 'knowing' of the starting conditions which creates the randomness that we experience I every day life. In the weather models, if your temperature measurement is off by 0.01 degrees, eventually, perhaps in hours, days or weeks time, the predictions made by the model will become drastically different from those you experience. In fact, this was how chaos was discovered; a seemingly well understood piece of theory, when run on a computer on two occasions, gave two drastically different answers with seemingly the same starting values. The difference was attributed to a difference in the 6th decimal place of the starting values..
And here lies the problem, at some point you reach the limit of your accuracy. If you know the starting position of your dice to 1000 decimal places, the difference in the 1001st decimal place coupled with a similar error in your understanding of the spin the die is thrown with etc may result in a different outcome. A good book on this is by Ian Stewart, Does God Play Dice? The Mathematics of Chaos
Quantum theory on the other hand, may also appear to be random, but similarly I think it is just not fully understood. We may not know the exact position of electrons in an atom, so instead we give electrons a 'probability' of being in certain positions or states. This doesn't mean that the electrons are in a random place, just that we are unable to observe their exact position. (In fact, and here is where you should ask a physicist, I think the very process of looking into an atom changes the states of the electrons..So we dont know.) But does this make it random?
Similarly, is an earthquake a random event?..To someone capable of studying the inner fluid dynamics of the earth, and properties of it's crust etc, then it is not random, in fact with the right models predictable. But will humans ever be able to actually predict such events precisely?..Probably not, because we will never be able to know the starting conditions to the required accuracy.."
So, to conclude, Jonathan Wright believes that pure randomness doesn't exist because probability has allowed us to be able to accurately predict so-called "random" events. Also, he discusses the fact that a random event could be subjective - it really depends on how knowledgeable a person is. Therefore, things that seem random for us right now may just be things that are too mathematically complex for us to understand, or things that we do not have much information about.
This reply has helped me look at randomness for a mathematicians point of view. Jonathan Wright touches upon the different areas of randomness that I will be covering in my dissertation and so all of this will be very helpful to me.
Monday, 20 September 2010
More books.
Today, I wrote up my proposal for a grant. I couldn't find Mr Wright today, so I need to make sure that I give it to him tomorrow. Time is of the essence!
I went to Muswell Hill Library after school to look for any of the books that I need to buy. None of the books that I need were in that library, however, I found two books on quantum theory that I borrowed:
1. Quantum: A guide for the perplexed - Jim Al-Khalili. Chapter 2 is called "Probability and Chance" and I am sure that this will be useful to me.
2. Quantum - Manjit Kumar. The 4th part of the book is named "Does God Play Dice?" so this will most probably be relevant to my project.
Sunday, 19 September 2010
GANTT chart
Tomorrow will be September 20th. The main task that I need to complete in order to keep progressing through my GANTT chart is to buy the remaining books. I have written out a grant proposal for Mr Wright so I must make sure that I print it out at school tomorrow and give it to him. All that is left for me to do is finish reading Randomness and I will be back on track.
So, priorities for this week are:
FINISH READING RANDOMNESS.
Get the remaining books.
And perhaps start researching quantum mechanics and chaos theory, to prepare for writing that part of my dissertation. One way of doing this is to talk to physics teachers in my school.
More quotes from Randomness
I have read quite a lot of Randomness, but haven't had time to blog. Instead, I noted down the pages of the quotes I came across and waited until I had time (now) to blog and type them all up.
"The first atomist, Leucippus (circa 450 B.C.), said, 'Nothing happens at random; everything happens out of reason and by necessity'. The atomic school contended that chance could not mean uncaused, since everything is caused. Chance must instead mean hidden cause."
"[...] Newtonian physics - a system of thought which represented the full bloom of the Scientific Revolution in the late seventeenth century. [...] a belief developed among scientists that everything about the natural world was knowable through mathematics. And if everything conformed to mathematics, then a Grand Designer must exist. Pure chance or randomness had no place in this philosophy."
The above quotes are reasons why randomness cannot exist. When I read these passages, I thought of the domino effect and how every event has a cause, and that cause has its own cause, and so on. It doesn't seem correct to say that an event has no cause and that there is no way to understand how it came to be.
Robert R. Coveyou (American mathematician) - "The generation of random numbers is too important to be left to chance."
"Within any sequence generated by the computer through a programmed algorithm or formula, the next digit is a completely deterministic choice, not random in the sense that a dice throw, a spinning disc, an electronic pulse or even the infinite digits of the mysterious pi are random. The very notion that a deterministic formula could generate a random sequence seemed like a contradiction".
This is what I have briefly mentioned before. Surely pure randomness doesn't truly exist if it is possible to produce a sequence that is indistinguishable from a random one.
"Today, primarily three types of generators are in use: (1) congruential generators, which are based on modular arithmetic, or remainders after division,(2) generators which use the binary (bit) structure of computer-stored information, and (3) generators based on number theory."
(1)"Congruential generators use modular arithmetic, or the remainder after division, as the next digit in the sequence. For example, a number mod 7 is replace by the number's remiander after dividing the number by 7."
I have come across modular arithmetic very briefly when I attended one of the mathematics taster days at Queen Mary's University.
More research on congruential generators:
"An LCG generates pseudorandom numbers by starting with a value called the seed, and repeatedly applying a given recurrence relation to it to create a sequence of such numbers. At a glance, the graphs will always look random (except in trivial cases, such as when the modulus is a multiple of the multiplier), but there is actually a sophisticated study of how closely pseudorandom number generators approximate processes that are truly random." - http://demonstrations.wolfram.com/LinearCongruentialGenerators/
"The linear congruential generator (LCG) was proposed by Lehmer in 1949" - http://random.mat.sbg.ac.at/results/karl/server/node3.html
(2) "A new class of number theoretic generators has recently been developed by George Marsaglia and Arif Zaman. [...] Called add-with-carry and subtract-with-borrow generators, their technique relies on the Fibonacci sequence and the so-called lagged-Fibonacci sequence [...]"
I have come across the Fibonacci sequence many times before. The sequence begins with 0,1, and the next number in the sequence is found by adding together the two previous numbers. So, the sequence starts off like so: 0,1,1,2,3,5,8 etc.
"A logged-Fibonacci sequence begins with two enormously large starting numbers, or seeds, instead of 0 and 1.[...] As in the Fibonacci sequence, in the add-with-carry method each new number will be obtained by summing up the two digits previous to it. If the sum is 10 or more, we use the right-most digit only and carry the 1 (to be used in obtaining the next digit.) [...] For instance, beginning with the two initial seeds 0 and 1, we obtain the same beginning of the Fibonacci sequence 0,1,1,2,3,5,8 until we reach 13. Here, the 3 is used and the 1 is carried. The next number in the sequence is obtained by summing the previous two, 8 and 3, and the 1 that was carried, 8+3+1 is 12, so the 2 is used and the 1 is carried."
(3) Number Theory. Deborah J. Bennett does not discuss the number theory itself in the book, so I did a bit of research. There are many branches of number theory, but I think the most relevant one to my project is probabilistic number theory:
"In probabilistic number theory statistical limit theorems are established in problems involving "almost independent" random variables. Methods used include a combination of probabilistic, elementary and analytic ideas.
One of the first achievements in this area was the Erdos-Kac theorem, which asserts that properly normalized values of a rather general additive arithmetical function have a Gaussian limit distribution. The determination of necessary and sufficient conditions for such functions to have a limit distribution is an outstanding problem."
I'm not going to do any deeper research into number theory because the topic has many different branches and I don't think it would be worth the time.
All the above research will help me when I write the "random number generator" part of my dissertation, because I now understand how these generators work and can see their advantages and disadvantages. I feel as though I am almost ready to write the chapter, which is good because I have been meaning to write it for the past week.
I found a passage about Chaos in the book:
"Chaos theory, the science which predicts that the future state of most systems is unpredictable due to even small initial uncertainties, holds new meaning for the notion of randomness, and simulating these systems requires huge numbers of random digits. It has been shown that with even small deterministic systems, initial observational error and tiny disturbances grown exponentially and create enormous problems with predictability in the long run".
I think this quote summarises chaos theory really well and will be useful to me when I write my chaos chapter of my dissertation.
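To see the "tiny disturbances grow exponentially" point for myself, here is a small Python experiment of my own (not from the book) using the logistic map with r = 4, a standard chaotic example. Two starting values that differ by only one part in a billion end up far apart within a few dozen steps.

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x), with r = 4.
# This is my own illustration of the chaos quote, not an example from Bennett.

def logistic_orbit(x, r=4.0, steps=40):
    orbit = []
    for _ in range(steps):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.200000000)
b = logistic_orbit(0.200000001)   # a tiny "observational error" in the starting value

for step in (5, 15, 25, 35):
    print(step, abs(a[step] - b[step]))  # the gap between the two orbits grows rapidly
```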
"The first atomist, Leucippus (circa 450 B.C.), said, 'Nothing happens at random; everything happens out of reason and by necessity'. The atomic school contended that chance could not mean uncaused, since everything is caused. Chance must instead mean hidden cause."
"[...] Newtonian physics - a system of thought which represented the full bloom of the Scientific Revolution in the late seventeenth century. [...] a belief developed among scientists that everything about the natural world was knowable through mathematics. And if everything conformed to mathematics, then a Grand Designer must exist. Pure chance or randomness had no place in this philosophy."
The above quotes are reasons to way random cannot exist. When I read these passages, I thought of the domino effect and how every event has a cause, and that cause has its own cause and so on. It doesn't seem correct to say that an event has no cause, and there is no way to understand how it came to be.
Robert R. Coveyou (american mathematician) - "The generation of random numbers is too important to be left to chance."
"Within any sequence generated by the computer through a programmed algorithm or formula, the next digit is a completely deterministic choice, not random in the sense that a dice throw, a spinning disc, an electronic pulse or even the infinite digits of the mysterious pi are random. The very notion that a deterministic formula could generate a random sequence seemed like a contradiction".
This is what I have breifly mentioned before. Surely pure random doesn't truly exist if it is possible to produce a sequence that is identical to a random one.
"Today, primarily three types of generators are in use: (1) congruential generators, which are based on modular arithmetic, or remainders after division,(2) generators which use the binary (bit) structure of computer-stored information, and (3) generators based on number theory."
(1)"Congruential generators use modular arithmetic, or the remainder after division, as the next digit in the sequence. For example, a number mod 7 is replace by the number's remiander after dividing the number by 7."
I have come across modular arithmetic very breifly when I attended one of the mathematics taster days at Queen Mary's University.
More research on congruential generators:
"An LCG generates pseudorandom numbers by starting with a value called the seed, and repeatedly applying a given recurrence relation to it to create a sequence of such numbers. At a glance, the graphs will always look random (except in trivial cases, such as when the modulus is a multiple of the multiplier), but there is actually a sophisticated study of how closely pseudorandom number generators approximate processes that are truly random." - http://demonstrations.wolfram.com/LinearCongruentialGenerators/
"The linear congruential generator (LCG) was proposed by Lehmer in 1949" - http://random.mat.sbg.ac.at/results/karl/server/node3.html
(2) "A new class of number theoretic generators has recently been developed by George Marsaglia and Arif Zaman. [...] Called add-with-carry and subtract-with-borrow generators, their technique relies on the Fibonacci sequence and the so-called lagged-Fibonacci sequence [...]"
I have come across the Fibonacci sequence many times before. The sequence begins with 0,1, and the next number in the sequence is found by adding together the two previous numbers. So, the sequence starts off like so: 0,1,1,2,3,5,8 etc.
"A logged-Fibonacci sequence begins with two enormously large starting numbers, or seeds, instead of 0 and 1.[...] As in the Fibonacci sequence, in the add-with-carry method each new number will be obtained by summing up the two digits previous to it. If the sum is 10 or more, we use the right-most digit only and carry the 1 (to be used in obtaining the next digit.) [...] For instance, beginning with the two initial seeds 0 and 1, we obtain the same beginning of the Fibonacci sequence 0,1,1,2,3,5,8 until we reach 13. Here, the 3 is used and the 1 is carried. The next number in the sequence is obtained by summing the previous two, 8 and 3, and the 1 that was carried, 8+3+1 is 12, so the 2 is used and the 1 is carried."
(3) Number Theory. Deborah J. Bennett does not discuss number theory in the book but I did a bit of research: There are many branches of number theory but I think that the most relevant one to my project is probabilistic number theory:
"In probabilistic number theory statistical limit theorems are established in problems involving "almost independent" random variables. Methods used include a combination of probabilistic, elementary and analytic ideas.
One of the first achievements in this area was the Erdos-Kac theorem, which asserts that properly normalized values of a rather general additive arithmetical function have a Gaussian limit distribution. The determination of necessary and sufficient conditions for such functions to have a limit distribution is an outstanding problem."
I'm not going to do any deeper research into the number theory because the topic has many different branches and I don't think that it will be extremely worth it.
All the above research will help me for when I write my "random number generator" part of my dissertation, because I now understand how these generators work. I can see the advantages and disadvantages of them. I feel as though I am almost ready to write the chapter, which is good because I have been meaning to write it for the past week.
I found a passage about Chaos in the book:
"Chaos theory, the science which predicts that the future state of most systems is unpredictable due to even small initial uncertainties, holds new meaning for the notion of randomness, and simulating these systems requires huge numbers of random digits. It has been shown that with even small deterministic systems, initial observational error and tiny disturbances grown exponentially and create enormous problems with predictability in the long run".
I think this quote summarises chaos theory really well and will be useful to me when I write my chaos chapter of my dissertation.
Wednesday, 15 September 2010
More research
http://en.wikipedia.org/wiki/Randomness
About the history of randomness:
"In ancient history, the concepts of chance and randomness were intertwined with that of fate. Many ancient peoples threw dice to determine fate, and this later evolved into games of chance. Most ancient cultures used various methods of divination to attempt to circumvent randomness and fate.
The Chinese were perhaps the earliest people to formalize odds and chance 3,000 years ago. The Greek philosophers discussed randomness at length, but only in non-quantitative forms. It was only in the sixteenth century that Italian mathematicians began to formalize the odds associated with various games of chance. The invention of the calculus had a positive impact on the formal study of randomness. In the 1888 edition of his book The Logic of Chance John Venn wrote a chapter on "The conception of randomness" which included his view of the randomness of the digits of the number Pi by using them to construct a random walk in two dimensions.
The early part of the twentieth century saw a rapid growth in the formal analysis of randomness, as various approaches to a mathematical foundation of probability were introduced. In the mid to late twentieth century ideas of algorithmic information theory introduced new dimensions to the field via the concept of algorithmic randomness.
Although randomness had often been viewed as an obstacle and a nuisance for many centuries, in the twentieth century computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases such randomized algorithms outperform the best deterministic methods."
This extract implies that randomness has been a part of our world for a very long time. People have learnt to accept randomness and there are many uses for it now.
About Quantum Mechanics:
"According to several standard interpretations of quantum mechanics, microscopic phenomena are objectively random. That is, in an experiment where all causally relevant parameters are controlled, there will still be some aspects of the outcome which vary randomly. An example of such an experiment is placing a single unstable atom in a controlled environment; it cannot be predicted how long it will take for the atom to decay; only the probability of decay within a given time can be calculated. Thus, quantum mechanics does not specify the outcome of individual experiments but only the probabilities. Hidden variable theories are inconsistent with the view that nature contains irreducible randomness: such theories posit that in the processes that appear random, properties with a certain statistical distribution are somehow at work "behind the scenes" determining the outcome in each case."
About Religion:
"Some theologians have attempted to resolve the apparent contradiction between an omniscient deity, or a first cause, and free will using randomness. Discordians have a strong belief in randomness and unpredictability. Buddhist philosophy states that any event is the result of previous events (karma), and as such, there is no such thing as a random event or a first event.
Martin Luther, the forefather of Protestantism, believed that there was nothing random based on his understanding of the Bible. As an outcome of his understanding of randomness, he strongly felt that free will was limited to low-level decision making by humans. Therefore, when someone sins against another, decision making is only limited to how one responds, preferably through forgiveness and loving actions. He believed, based on Biblical scripture, that humans cannot will themselves faith, salvation, sanctification, or other gifts from God. Additionally, the best people could do, according to his understanding, was not sin, but they fall short, and free will cannot achieve this objective. Thus, in his view, absolute free will and unbounded randomness are severely limited to the point that behaviors may even be patterned or ordered and not random. This is a point emphasized by the field of behavioral psychology."
While searching through Google, I came across a film called "Chaos Theory".
This may be a useful film to watch. I read some summaries of the film and it doesn't seem especially relevant to my project, but perhaps I could watch it if I have the time.
I emailed the maths and physics graduates that my supervisor introduced me to:
"Thanks.
Hi everyone.
First question: Do you think that one day, with enough research, we will be able to confidently predict a random event (for example, the rolling of a dice)?
Also: Do you believe that pure randomness exists? It can be argued that randomness doesn't exist because it is subjective - something that may seem random to one person may seem like an obvious pattern to another. On the other hand, there are some mathematical (and physics-related) concepts, such as chaos and quantum theory, that suggest randomness is very much a part of our world.
To conclude, I would just like your opinions on the subject of randomness and any suggestions for books or certain topics that may be useful to me.
Thanks,
Zainab Kwaw-Swanzy"
Tuesday, 14 September 2010
Supervision
I had my first supervision with Ms Caroussis since last term. We discussed the work I did over the summer. I feel a lot more confident about my progress now, but I really need to make sure that I don't fall behind on my GANTT chart.
I talked about possibly obtaining a grant and my supervisor told me that I need to write a proposal to Mr Wright. This must include:
Project Brief
Detailed costs
Why I need the money to pay for these things
Why it is important to get the grant for my project
I asked how large my range of sources should be, so next supervision I must bring a detailed, up-to-date bibliography and we can discuss other sources that I may need. I haven't managed to get opinions from any professionals yet, so my supervisor started up an email with a few maths and physics undergraduates that she knows. I could also talk to physics and maths teachers in the school.
Possible questions that I should ask them:
- Whether they believe that randomness really exists
- Will we ever be able to predict the next number to show on a dice?
- Any book recommendations?
I can't think of any other questions at the moment but I need to come up with more so I can make the most out of having maths and physics related contacts.
During the supervision I thought about other things that I may want to put in my dissertation that I had not thought about before. My supervisor suggested that I write a bit about the importance of the topic - why should people be interested in randomness? - and possibly a history of the debate about randomness.
The supervision helped me find out more about getting different types of sources to add to my list. I am now confident that my research will become very detailed and varied.
Tasks for this week:
FINISH READING RANDOMNESS!!
Write up random generators chapter.
Sunday, 12 September 2010
GANTT Chart Update
This week I need to focus on finishing Randomness and writing the "Random Generators/mechanisms" chapter of the dissertation. This means I must complete my research on that topic because, at the moment, I may not have as many sources as I would like to help me write that chapter.
I also need to enquire about possibly receiving a grant so I can buy the remaining books.
Useful quote
"From where we stand the rain seems random. If we would stand somewhere else, we would see the order in it."
Tony Hillerman - American author.
Suggests that randomness is subjective - doesn't truly exist.
Wednesday, 8 September 2010
Random generators
http://www.scholarpedia.org/article/Algorithmic_randomness
"Algorithmic randomness is the study of random individual elements in sample spaces, mostly the set of all infinite binary sequences. An algorithmically random element passes all effectively devised tests for randomness."
An algorithm is a set of instructions that will solve a problem. (I have learnt this from my Further Maths AS level, in a topic called Decision Maths 1.) Algorithms are mainly used by computers.
I've begun to realise that a lot of the information about random mechanisms is very complex. I have two choices:
1. Don't go into much detail about the random mechanisms, because it is a very complicated and vast subject. This will make writing that part of my dissertation much easier.
2. Research random mechanisms IN DEPTH so that my dissertation will be better informed. I will then learn a lot more, which is what this project is all about.
Number 2 it is.
This is an extract from the same webpage as the above quote:
"The theory of algorithmic randomness tries to clarify what it means for an individual element of a sample space, e.g. a sequence of coin tosses, represented as a binary string, to be random. While Kolmogorov's formalization of classical probability theory assigns probabilities to sets of outcomes and determines how to calculate with such probabilities, it does not distinguish between individual random and non-random elements. For example, under a uniform distribution, the outcome "000000000000000....0" (n zeros) has the same probability as any other outcome of n coin tosses, namely 2-n. However, there is an intuitive feeling that a sequence of all zeros is not very random. This is even more so when looking at infinite sequences. It seems desirable to clarify what we mean when we speak of a random object. The modern view of algorithmic randomness proposes three paradigms to distinguish random from non-random elements.
Unpredictability: It should be impossible to win against a random sequence in a fair betting game when using a feasible betting strategy.
Incompressibility: It should be impossible to feasibly compress a random sequence.
Measure theoretical typicalness: Random sequences pass every feasible statistical test.
It is the characteristic feature of algorithmic randomness that it interprets feasible as algorithmically feasible. "
This indicates that there are certain properties a sequence must have for it to seem random. The fact that people are aware of these properties and can recreate them in something like a random number generator shows that, although one cannot predict a random sequence, a sequence can easily be identified as random, and it is possible to produce sequences that easily pass as random.
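As a very rough, informal way of seeing the incompressibility idea, here is a sketch of my own (not from the Scholarpedia article, and only a crude proxy for the formal definition): a general-purpose compressor like zlib struggles to shrink a random-looking string but easily compresses a highly patterned one.

```python
# Crude illustration of incompressibility: compare how well zlib compresses
# OS-supplied "random" bytes versus an all-"0" patterned string.
# This is only an informal stand-in for algorithmic incompressibility.

import os
import zlib

random_bytes = os.urandom(10_000)     # bytes from the operating system's randomness source
patterned_bytes = b"0" * 10_000       # the "all zeros" style sequence from the quote

for name, data in [("random", random_bytes), ("patterned", patterned_bytes)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(name, "compresses to", round(ratio, 3), "of its original size")
```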
"Algorithmic randomness is the study of random individual elements in sample spaces, mostly the set of all infinite binary sequences. An algorithmically random element passes all effectively devised tests for randomness."
An algorithm is a set of instructions that will solve a problem.(I have learnt this from my further maths AS level in a topic called Decision Maths 1). Algorithms are mainly used for computers.
I've begun to realise that a lot of the information about random mechanisms is very complex. I have two choices:
1. Don't go into much detail about the random mechanisms and it is very complicated and a very vast subject. This will make writing that part of my dissertation much easier.
2. Research random mechanisms IN DEPTH so that my dissertation will be more informed. I will then learn a lot more which is what this project is all about.
Number 2 it is.
This is an extract from the same webpage as the above quote:
"The theory of algorithmic randomness tries to clarify what it means for an individual element of a sample space, e.g. a sequence of coin tosses, represented as a binary string, to be random. While Kolmogorov's formalization of classical probability theory assigns probabilities to sets of outcomes and determines how to calculate with such probabilities, it does not distinguish between individual random and non-random elements. For example, under a uniform distribution, the outcome "000000000000000....0" (n zeros) has the same probability as any other outcome of n coin tosses, namely 2-n. However, there is an intuitive feeling that a sequence of all zeros is not very random. This is even more so when looking at infinite sequences. It seems desirable to clarify what we mean when we speak of a random object. The modern view of algorithmic randomness proposes three paradigms to distinguish random from non-random elements.
Unpredictability: It should be impossible to win against a random sequence in a fair betting game when using a feasible betting strategy.
Incompressibility: It should be impossible to feasibly compress a random sequence.
Measure theoretical typicalness: Random sequences pass every feasible statistical test.
It is the characteristic feature of algorithmic randomness that it interprets feasible as algorithmically feasible. "
This indicates that there are certain properties that a sequence must have for it to seem random. The fact that people are aware of these properties and can recreate them in things such as a random number generator shows that although one cannot predict a random sequence, a sequence can easily be indentified as random and it is possible to produce sequences that can easily pass as random.
I haven't blogged in a few days so here we go...
Bibliography so far:
BOOKS
Chance by Amir D. Aczel
Reckoning with Risk by Gerd Gigerenzer
Chaos by James Gleick
Randomness by Deborah J. Bennett
Introduction to Random Time and Quantum Randomness by Kai Lai Chung
Quantum Theory: A Very Short Introduction by John Polkinghorne
Does God Play Dice? by Ian Stewart
WEBSITES
http://www.fortunecity.com/emachines/e11/86/random.html
RANDOM.ORG
http://www.scientificamerican.com/article.cfm?id=how-randomness-rules-our-world
http://en.wikipedia.org/wiki/Pseudorandom_number_generator
http://www.igs.net/~cmorris/index_subject.htm
http://ezinearticles.com/?Book-Review---Chance,-by-Amir-D-Aczel&id=3507603
http://www.goodreads.com/book/show/441215.Chance
http://www.faqs.org/docs/qp/chap01.html
I should be having a meeting with my supervisor very soon, now that I am back at school. I need to ask if this is a good amount of sources considering my stage in the project.
Reading through Randomness:
The next chapter is about dice rolls and how the probability of each number coming up will change depending on the dice (e.g. having 2,2,3,4,5,6 instead of 1,2,3,4,5,6)
Bennett then talks about rolling two dice.
"[...] let's imagine using coloured dice, one red and one green. For each of the 6 possible throws on the red die, 6 are possible on the green die, for a total of 6 x 6 - 36 equally possible throws. But many of those yield the same sum. To make things even more complicated, different throws can result in the same two numbers. For example, a sum of 3 can occur when the red dice shows 1 and the green die shows 2, or when the red die shows 2 and the green die shows 1. Thus the probability of throwing a total of 3 is 2 out of 36 possibilities, or 2/36. A sum of 7, on the other hand, can be thrown 6 different ways - when red is 1 and green is 6; red is 6 and green is 1; red is 2 and green is 5; red is 5 and green is 2; red is 3 and green is 4; red is 4 and green is 3. Therefore the probability of throwing a 7 is 6/36"
This extract implies that if you throw two dice at random many times and write down the total each time, one can predict that a total of 7 will occur more often than a total of 3. This quote, like many others that I have found on the subject of probability, shows that it isn't absolutely impossible to predict a random event - we have a rough idea of what the occurrence of each total should be when a die (or two) is thrown a certain number of times.
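Bennett's counting argument is easy to check by enumerating all 36 equally likely outcomes; here is a short Python sketch of my own, just to confirm the 2/36 and 6/36 figures.

```python
# Enumerate all 36 equally likely (red, green) outcomes and count each total.

from collections import Counter
from fractions import Fraction

totals = Counter(red + green for red in range(1, 7) for green in range(1, 7))

for total in (3, 7):
    ways = totals[total]
    print(f"total {total}: {ways} out of 36 ways = {Fraction(ways, 36)}")
# total 3: 2 out of 36 ways (= 1/18); total 7: 6 out of 36 ways (= 1/6), agreeing with the quote
```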
Monday, 6 September 2010
Conclusion of Reckoning with Risk
While surfing Google I came across this website which may help me later on, when I am writing my section about quantum mechanics:
http://www.faqs.org/docs/qp/chap01.html
Here is an extract of a Reckoning with Risk review, written by Helen Joyce:
"Gerd Gigerenzer is not a mathematician or statistician per se, but primarily a psychologist, working across disciplines to understand how human beings make decisions in the face of uncertainty. What he offers here is nothing less than a prescription for how to think, how to choose, and how to live, when the information on which we base our decisions is necessarily incomplete and flawed. For example - how worried should you be if you have a positive mammogram as part of a screening programme for breast cancer, or a positive HIV test despite the fact that you are in a low-risk group? You may be surprised to learn that the answer may well be "not too worried" - what should really worry you is that not many medical personnel know this!
The book also looks at the way courts deal with uncertainty, and offers some suggestions for improving the handling of statistical evidence such as DNA testimony"
The fact that Gerd Gigerenzer is not a mathematician or statistician may make the book less reliable. The book in general does not seem to talk about randomness, but more about uncertainty. There is a clear difference between the two because something that is uncertain may not be random. However, I did find a lot of information that will help me with my probability section of my dissertation. For example, at the end of the book, there is a chapter called "Fun Problems". The Monty Hall Problem is included in the chapter, along with many problems that I have never come across:
"The First Night in Paradise.
It is the night after Adam and Eve's first day in paradise. Together, they watched the sun rise and illuminate the marvelous trees, flowers, and birds. At some point the air got cooler, and the sun sank below the horizon. Will it stay dark forever? Adam and Eve wonder, What is the probability that the sun will rise again tomorrow?
[...] If Adam and Eve had never seen the sun rising, they would assign equal probabilities to both possible outcomes. Adam and Eve represent this initial belief by placing one white marble (a sun that rises) and one black marble (a sun that does not rise) into a bag. Because they have seen the sun rise once, they put another white marble in the bag. [...] Their degree of belief that the sun will rise tomorrow has increased from 1/2 to 2/3. [...] According to the rule of succession, introduced by the French mathematician Pierre-Simon Laplace in 1812, your degree of belief that the sun will rise again after you have seen the sun rise n times should be (n+1)/(n+2)."
This quote shows a different, unusual way to use probability which may be very useful to my dissertation.
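The marble-bag reasoning is simple enough to turn into a short Python sketch (my own illustration of the (n+1)/(n+2) rule described in the quote):

```python
# Laplace's rule of succession via the marble-bag picture: one white and one
# black marble to start, plus one white marble for each sunrise seen so far.

from fractions import Fraction

def belief_sun_rises_again(n):
    """Degree of belief after seeing the sun rise n times."""
    white = 1 + n    # "rises" marbles in the bag
    total = 2 + n    # all marbles in the bag
    return Fraction(white, total)

for n in range(4):
    print(n, belief_sun_rises_again(n))   # 1/2, 2/3, 3/4, 4/5
```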
Reckoning with Risk was a very interesting book, but I don't think it gave me much information that directly helped my project. It showed me a more psychological side of mathematics, though I am not sure how relevant that is to randomness. There was barely any discussion of the concept of randomness itself, but the book has shown me that randomness and uncertainty are two very different things.
Thursday, 2 September 2010
Reading through Randomness...
The book begins by discussing the misconceptions that occur when probability is used. The cases are similar to the ones used in Reckoning with Risk (i.e. false positives and negatives), so I won't use them.
She then talks about the use of randomness in decision making:
"Chance is a fair way to determine moves in some games and in certain real-life situations; the random element allows each participant to believe, 'I have an oppurtunity equal to that of my opponent.'"
I haven't really thought about the role of randomness in something like decision making. The quote suggests that randomness may be the only unbiased way of making decisions in certain situations.
Bennett then moves on to the randomizers that were used in ancient history.
"Though not always recognized or acknowledged as such, chance mechanisms have been used since antiquity: to divide property, delegate civic responsibilities or privileges, settle disputes among neighbors, choose which strategy to follow in the course of battle, and drive the play in games of chance."
It seems as though randomness has always been around, and people took advantage of its unpredictability to make quick decisions about certain events.
Deborah J. Bennett then talks through the various randomizers that were used in ancient times, many resembling the dice we use today. Chapter three discusses religion and the role of God in the concept of randomness:
"The purpose of randomizers such as lots or dice was to eliminate the possibility of human manipulation and thereby to give the gods a clear channel through which to express their divine will. Even today, some people see a chance outcome as fate or destiny, that which was 'meant to be'"
This quote coincides with one I found in March: "God does not play dice" - Einstein
Both quotes seem to indicate that randomness does exist, but that it is an act of God. Randomness, like many other things in the world that humans do not fully understand, is a concept that we must accept and put faith in because it is a part of God.