The Illusion of Freedom

  There seems to be a great deal of confusion surrounding the term ‘freedom’, something I feel should be addressed before future discussion. There will certainly be some subjectivity involved, however I hope the topic can be treated as objectively as possible ;). I will certainly try my best to keep it objective (which in itself is somewhat impossible given the conversation).


   The image above is a good visual aid for what I’m about to explain. Think of it as describing the freedom of a choice between two options given their outcomes (assuming you know them). While I’m not a fan of digging into religious discussions (mainly because they tend to get bitter even when you don’t want them to), I feel one particular ‘choice’ is very useful for illustrating my point.

  For a choice to be free it must not, in the case of a fully and properly functioning organism, contradict the very nature of self-preservation. A male praying mantis may sacrifice its own being, but it does so to feed its female and to improve the likelihood that its seed/genes (and in essence its very being) will be passed on. In many cases the actions of an organism that appear to contradict its natural inclination for survival are in fact actions required for it. Drinking from a stream that has a predator in it is a necessity when the danger of the predator does not outweigh the danger of dying of thirst.

  This is where the image comes into play. It works on a negative correlation (of sorts): the further you move in either direction (to the left toward option A or to the right toward option B), the less likely the opposing option becomes. If you enter the red range of the Freedometer you have essentially left the realm of freedom. It’s not that passing 50% means you are no longer free in all cases; I’m using the simplest example to explain my point with the least amount of thought (so you can use your extra resources to expand the concept).

  In essence, if you place a gazelle in a situation where it must drink or it will die, the odds of dying while drinking are outweighed; if they were not, it would not do what it does. This is why some animals do indeed starve or dehydrate (I imagine there is a better term) to death: the odds of being eaten far exceed the odds of dying at the current moment. It’s a sad state of affairs.

  If you are presented with a ‘simple choice’ with two outcomes (either you believe in a single entity, or you spend eternity damned to a torturous nothingness), you are no longer being offered a free choice; you are being given an ultimatum. It is a choice only in the most literal of terms, but it is not free. An ultimatum places you in the extreme red zones, a point at which no properly thinking and functioning being would choose the other option. Whenever one of the options is not something the organism would ever choose, it is no longer a choice, and the illusion of freedom is the only freedom that is truly present.

  I am not sure if this has made anything clearer, but at the very least I’d hope that the next time someone reads about a ‘free choice’ they remember the Freedometer, and remember that at a certain point even a choice is not a choice.

February 13th, 2009 | Journal | The Illusion of Freedom


Indeterminism

  Dipping back into the wonderful world of Determinism, we have a slightly different take on the idea. We’ll begin by defining Indeterminism:

Indeterminism: Not every event that occurs is completely determined by previous conditions.

  Now you might be saying, “Well that’s obvious, Captain Jack.” To which I’d remind you I’m not Jack; regardless, we’ll continue. The next important thing to know is what it means for something to be undetermined.

e is undetermined =df e is not completely determined by previous conditions.

  This is important because it leads us into what becomes of Indeterminism. It’s a fascinating philosophy because, as with other philosophies, it gets taken to some very unusual extremes. One such extreme is called “Simple Indeterminism,” which looks as follows:

Simple Indeterminism
1) There are some undetermined actions.
2) People act freely whenever they perform undetermined actions.
3) People are morally responsible for their own undetermined actions.

    At first glance this appears to be a pretty reasonable setup. The first tenet says that there are ‘some’ undetermined actions. It is difficult to say that something never happens, so it’s safe to say that there are ‘some’ undetermined actions. Likewise it feels like common sense when you look back on your life: some events did not come about merely through the blowing of the wind or the force of gravity.

  Likewise, tenet two makes a general statement that seems difficult to argue with. A person is acting freely when performing an action that is undetermined. In essence they had options A and B (at the very least) and decided to go with one or the other. That choice was their freedom in action.

  Finally, the third tenet states that when you act freely you are morally responsible. The idea is that if you choose to murder someone then you are responsible for that act. However, this particular version of Indeterminism has a large loophole, and it has to do with the global nature of the third tenet. A seizure is indeed an undetermined action: in the moment before the seizure, both the event you desired and the possibility of a seizure were open. Yet if you killed someone while having a seizure, many would be hard pressed to file charges against you. There are more complicated examples, but this simple situation leaves us either denouncing Simple Indeterminism or deciding that a seizure victim is indeed fully responsible for the repercussions of their seizure.

  If we were to make some minor modifications to Simple Indeterminism we could clean up most situations.

Rico’s Indeterminism
1) There are some undetermined actions.
2) People act freely whenever they perform undetermined and desired actions.
3) People are morally responsible for their own undetermined and desired actions.

  The introduction of ‘desired’ to the aforementioned ‘undetermined’ situations fixes the issue of involuntary actions. In this case, when presented with (for simplicity’s sake) two options, if you choose to do one but are unable to because of uncontrollable variables (i.e. seizures or similar situations), then the resulting act is not something you are morally responsible for.

  Now, this philosophy has the possibility for abuse, but only in the judgement of the person by other persons. If we accept morality as something of a natural law (which it isn’t, but bear with me), then the person IS morally responsible; they are merely attempting to mask the fact that they are. Likewise, lying about whether it was their desired action is itself another situation in which they are responsible.

  Morality is certainly a discussion for another day; however, taken in the context of Indeterminism, I feel it’s somewhat self-explanatory. Likewise, I’m relatively hard pressed to think of a situation where your desired action is not one you are morally responsible for, or even one that isn’t an act of freedom. However, anyone feeling frisky can take a shot at providing a situation where you are acting out in a desired fashion and are not morally responsible (or even should not be).

February 12th, 2009 | Journal | Indeterminism

Mean, Median, and Mode.

  As people who have been keeping up with these daily updates know, I would like to become a teacher. However, there are some things I think require dramatic overhauling, mainly because we are at an unusual crossroads where I feel misinformation is causing a dramatic (and ill-directed) change in the education system.

  For further discussion later, I will note that I am strongly opposed to most (if not all) of the premises behind the No Child Left Behind Act. The logic of taking money away from worse-off schools and shoveling it into better-off schools until the worse-off schools implode is terrible at best. In Seattle quite a few schools closed down because of the policy and the funding cuts, and it makes me wonder how, when you shove five schools’ worth of children into fewer than five schools, you are not leaving children behind.

  However it’s painfully obvious that if you wrote a bill that legalized the beating of babies and called it “The don’t beat babies bill.” people would vote for it for fear of voting otherwise leaving them labeled baby beater.

  We’ll leave that there for now and I’ll come back to it on another day. For today I’d like to discuss why all exams should be looked over with the mean, median, and mode together, or shouldn’t be looked over at all.

  In most of my psychology courses the professor would explain that they felt the exam tested well because the average grade was a low B. This sounded cute and fulfilling until you looked at all the possible situations where the mean (or average) gives you no idea of the ‘average’ performance in a course.

  Say you have an exam that, for simplicity’s sake, has 10 questions. We’ll assume each class has the same number of students.

7 – 7 – 7 – 7 – 7
Class Average: 7 (Or 70%)

  When we think of the average grade in a class we think of it like this (or perhaps a more direct 9 – 8 – 7 – 6 – 5 setup). If the average was 7 then everyone got roughly a C and passed the class. That’s fine and dandy in theory; however, the average really tells us nothing about how the class overall performed.

10 – 10 – 5 – 5 – 5
Class Average: 7 (Or 70%)

  As you can see in this example, over half the class failed the exam. If you walked into a course and the professor told you that over half of you would fail, would you stay? What if they told you the class average was 70%? The latter would likely trick you into assuming the course was doable, when for all intents and purposes the people passing appear (when looking at the performance of the whole) to have had the knowledge necessary to pass regardless of how the professor or the book educates them.
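  The mean trick above is easy to demonstrate in a few lines of Python (my own sketch, not part of the original post; the two lists are the hypothetical five-student classes from the examples):

```python
# Two hypothetical classes of five students, scored out of 10.
uniform_class = [7, 7, 7, 7, 7]     # everyone earns a C
split_class = [10, 10, 5, 5, 5]     # two aces, three failures

def mean(scores):
    """Arithmetic mean: total points divided by number of students."""
    return sum(scores) / len(scores)

print(mean(uniform_class))  # 7.0
print(mean(split_class))    # 7.0 -- identical average, very different class
```

Two classes that could hardly perform more differently report the exact same average, which is the whole problem with quoting the mean alone.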

  I’ve noticed that in courses with much larger numbers of students you’ll have a small group that performs exceptionally well and a vast majority that perform just below par, or quite far below it. This offsets to some ‘theoretically’ comfortable average, and when seen by other faculty or staff, the average alone gives the illusion of proper examination and instructional procedures.

  Likewise, I feel the next source of information is by itself relatively worthless: the median essentially tells us nothing about the performance of a class. I’ll again provide two examples that return the same result but are dramatically different. The median, for those curious, is the middle number when all numbers are placed in numerical order (i.e. from least to greatest or vice versa). With an even count of numbers you generally take the average of the two middle values: if you meet in the middle at 7 and 6 the median is 6.5, whereas meeting at 6 and 6 gives 6.

  10 10 7 1 1
  Class Median: 7 (or 70%)

  This seems appropriate for what the median does by itself. Indeed, when you lined up the grades of all the students in your class, the median grade was 70%. That is a pretty respectable performance; however, you still have just barely under half of your course failing, 40% to be exact. This again is offset by the fact that, generally speaking, the people who do exceptionally well on an exam that the majority perform on par or sub-par on would have performed at such high rankings regardless of the professor or the book.

8 – 8 – 7 – 7 – 3
  Class Median: 7 (or 70%)

  The reason this bothers me is that, as you can see, the performance of these two classes is dramatically different. In the bottom set 80% of the class has passed; likewise, the top performance is not perfect, which may hint at an examination more accurately matched to the teaching style. I’ll come back to why this example is better than the first (of the median examples) in a short while.
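  The median procedure described above (sort, take the middle, average the two middle values when the count is even) can be sketched like so; again, this is my own illustration using the score lists from the two examples:

```python
def median(scores):
    """Middle value of the sorted scores; average the two middle
    values when the count is even."""
    s = sorted(scores)
    n = len(s)
    mid = n // 2
    if n % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

print(median([10, 10, 7, 1, 1]))  # 7 -- yet 40% of the class failed
print(median([8, 8, 7, 7, 3]))    # 7 -- yet 80% of the class passed
print(median([7, 6]))             # 6.5, the even-count case from above
```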

  We finally move on to the mode: the mode is the most common value in a set of values. For example, if you have 3 numbers and two of them are the same, then the mode is whatever that repeated number is. However, much like its cousins (or brothers/sisters, what have you), the mode is utterly meaningless by itself.

10 – 7 – 7 – 2 – 1
  Class Mode: 7 (or 70%)

  A 70% is essentially the bare minimum you can receive in a course while still pseudo-passing it. When you pseudo-pass something you receive a ‘passing’ grade; however, you are strongly requested to retake the course. It’s essentially failing without failing, and I know in the one case where it happened to me it was treated as worse than failing (which I found odd).

  Modes can get far more hokey when you get into larger groups of people. On an exam with 50 possible outcomes you could end up with only one outcome occurring more than once, leaving it as the ‘mode’ when really the only thing it has over the others is a count one higher. The mode is a support function when looking at grading and really means nothing by itself.

7 – 7 – 7 – 7 – 7
  Class Mode: 7 (or 70%)

  This case is dramatically different from the first mode case: your entire class passed the test, which is a good thing, yet nobody performed better or worse than anyone else. This tends to show failure on the part of the examiner, whether through misleading study suggestions, poorly worded questions, or some other all-too-common mistake. The main reason for pulling this out again (it matches up with the very first example) is that it leads into my main point (took a while to get here… perhaps unnecessarily so… but I’m rarely as succinct as I want to be).
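  For completeness, here is the mode in the same sketch style (my own illustration; `Counter` simply tallies how often each score occurs, and ties are broken arbitrarily, which is exactly the 50-outcome weakness described above):

```python
from collections import Counter

def mode(scores):
    """Most frequent score in the list; with a tie, one winner is
    picked arbitrarily -- the 'merely one larger' problem above."""
    return Counter(scores).most_common(1)[0][0]

print(mode([10, 7, 7, 2, 1]))   # 7 -- earned by two students of five
print(mode([7, 7, 7, 7, 7]))    # 7 -- earned by the entire class
```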

  The only case in which statistics for a course are acceptable without the full printout is when the mean, median, and mode are all fairly close to one another. If any of them is dramatically different from the others, an investigation should be conducted. Not necessarily by the FBI, but someone should look into the teaching or testing style of the professor/teacher. Such a gap is usually a good sign that something is wrong, and generally, when an entire classroom is affected, it’s not all the students (people aren’t quite that homogenous yet).

  A few examples are as follows.

8 – 8 – 8 – 8 – 7
  Class Mean: 7.8 (or 78%)
  Class Median: 8 (or 80%)
  Class Mode: 8 (or 80%)

  So in the above case you have a class where everyone passed, we don’t see a ceiling or floor effect (neither did everyone get a 10 nor did everyone get a 1), and roughly speaking all three M’s are very close to one another. It is difficult to create a situation where all three are the same and yet you don’t have an accurate picture (without seeing the individual performance of all the students on an Excel printout); I don’t have one handy, but I’ll give it a shot. (Update: In retrospect I feel I failed. Feel free to comment if you have a working example.)

9 – 7 – 7 – 6 – 3
  Class Mean: 6.4 (or 64%)
  Class Median: 7 (or 70%)
  Class Mode: 7 (or 70%)

  I believe in this case we do see a relatively broad range of grades (as high as 90% and as low as 30%); however, the overall performance is so poor that it doesn’t matter. These kinds of cases should always spark curiosity in the institutions where they unfold. Maybe I’m being close-minded and there is a grand example of a mean, median, and mode all showing great performance while most of the class fails, but I’m not convinced that is entirely possible (I will not say it isn’t, though).
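  The “report all three or report none” rule can itself be sketched as a small check. This is my own illustration: the one-point `tolerance` for flagging is an arbitrary assumption for the demo, not any real grading standard.

```python
from collections import Counter

def summarize(scores, tolerance=1.0):
    """Report (mean, median, mode) together and flag the exam for a
    closer look when any pair differs by more than `tolerance` points.
    The tolerance value is an illustrative assumption, not a standard."""
    s = sorted(scores)
    n = len(s)
    mean = sum(s) / n
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    mode = Counter(s).most_common(1)[0][0]
    stats = (mean, median, mode)
    flagged = max(stats) - min(stats) > tolerance
    return stats, flagged

print(summarize([8, 8, 8, 8, 7]))    # ((7.8, 8, 8), False) -- all three agree
print(summarize([10, 10, 5, 5, 5]))  # ((7.0, 5, 5), True)  -- mean is hiding failures
```

The second class is the split class from the mean example: its mean looks like a C while its median and mode reveal that most students scored a 5, which is exactly the divergence the post argues should trigger a review.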

  The perfect case is obviously when everyone in the course gets the exact same grade; that’s the point of popping out the lucky 7’s scenario. Ideally I would hope everyone would get ‘lucky’ 8’s or 9’s, but it seems odd that we immediately assume all people are ‘above average’… I would think that if everyone is above average then they are not above average, they are average.

  These three functions are almost meaningless by themselves; each can, in certain (and numerous) situations, provide dramatically misleading information supporting all sorts of flimsy or hokey ideals. The use of any one of these three functions by itself when fashioning policies or judging the performance of an entity is likely to end in misinformation and failure. I propose that professors and teachers either produce all three pieces of information or provide none, because it has become all too apparent, at the very least at Western Washington University, that the mean is being abused worse than the proverbial red-headed stepchild.

  The psychology department routinely fails large portions of its students (or gives them D’s), and yet the average makes it appear that people are performing at or slightly above average. This is unacceptable, and an additional reason I’m not the least bit troubled that the college has lost 35 million dollars (possibly more).

  If you can’t see the flimsy nature of the mean function, then you probably cannot see the danger of living dramatically far beyond your means (pun intended, and in at least one sense it’s not even a pun).

For next time: I’ll likely discuss the idea of Indeterminism.

February 11th, 2009 | Journal | Mean, Median, and Mode.


Determinism

  As I sit here realizing that it would cost me 100 dollars to recover my corrupt OneNote files, I feel that (time permitting) the experience may at least be good for the topic of determinism (Note: time was not permitting). Otherwise it’s an incredibly crushing experience that gives me a very negative view of Microsoft at the moment (which is a shame, because Windows 7 had piqued my interest in them). So, to anyone out there: make backups of your OneNote files; there is NO way to recover them (as far as I know) without paying some random asshole 100 dollars. That may sound harsh, but charging that much for file recovery is preposterous.

  So without further delay:

Determinism: Every event that occurs is completely determined by previous conditions.

e is physically necessary =df e is required by the laws of nature.

e is completely determined by previous conditions =df the combination of previous events and the laws of nature makes e physically necessary.

  Now this is important information. For those that do not know, ‘=df’ means “equals by definition”; likewise, if you’ve seen it before, ‘iff’ means “if and only if.” e is simply a variable, much like *.

  The idea is that our history is a singular line, each event directed by the previous event, which is in turn governed by its previous event, traveling all the way back to the big bang (be that for you a cosmic blast or just some deity doing the ole bang solo). In essence it says that all actions were predetermined by events that happened before the agents of those actions were even alive.

  Some go so far as to say that because we cannot control our actions (and instead are merely reacting), we should not be held responsible for those actions. This has sparked a few thoughts in my head that make it a very sketchy system.


  The second image is an example of the life of a simple system. In the beginning it’s as simple as a coin toss; then, in this particular case, it becomes a three-way outcome. However, if we look back historically, the actions of this organism would look like a single straight line, because there would be only one outcome to every one of its actions.

  Essentially, Determinism says that because natural law is unable to make decisions and must act directly on the world around it, living organisms act entirely in the same way. In effect, determinism says the following:

If every single variable in the past happened exactly in the same way the exact same history would transpire.

  This explains everything and in the same breath states nothing (a very popular tactic in philosophy). When an organism is born, the actions it makes are at the simplest level a coin toss between two possibilities; the response is so quick to the outside viewer that it can be mistaken for a mere reaction to the environment, not unlike how the waves move with the turning of the Earth, the gravitation of the moon and other bodies, and the sheer orgy of shoving between the forces in the sea. In fact, random events like coin tosses aren’t even negated in the universe of determinism: if the coin were tossed with the exact same force, from the exact same point, with the exact same wind resistance and gravitational pull, landing on exactly the same spot on a surface in exactly the same condition as the first attempt, with every other variable I cannot even fathom being exactly the same, then the results would be exactly the same. However, this requires universal knowledge (omniscience, I think it’s called) for it to be a form of understanding that has any purpose in life.

  Even the coin example is instructive when rendered on a PC. If you created a simple program to flip a coin (return a result of 0 or 1 with a 50% chance each), you would receive different results on each run. To be fair, that is only because the generator seeds itself from outside conditions such as the system clock; hand it the exact same starting state and it will replay the exact same sequence of flips. The digital coin only looks undetermined because we never hand it the same previous conditions twice.
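  The coin-flip program described above can be sketched like this (my own illustration; `random.Random` is Python’s standard pseudorandom generator, and the seed value 1234 is arbitrary):

```python
import random

def flip_coins(n, seed=None):
    """Simulate n coin flips (0 or 1, each 50%). With seed=None the
    generator draws its starting state from outside conditions, so
    successive runs differ; a fixed seed replays the exact same
    sequence -- the digital coin is deterministic given its inputs."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

run_a = flip_coins(20, seed=1234)
run_b = flip_coins(20, seed=1234)
print(run_a == run_b)   # True: same 'previous conditions', same history
print(flip_coins(20))   # unseeded: varies from run to run
```

Replaying the same seed is the software equivalent of tossing the coin with the exact same force, from the exact same point, in the exact same conditions.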

  There are many different forms of determinism, some harsher than the one I’ve described and some weaker. However, in all cases determinism appears very weak. Organisms, much like personal computers, take in information and run it through a series of scripts (or a single script if it’s a really small program >_>), a process that nothing else in the universe does (to my knowledge). There is a point where the outside senses of an organism do directly influence it in a way that trumps the looping processes, but that appears to happen only in instances of errors. If you place a person in a room far hotter than their body temperature you will find a direct influence on their activities (producing a small set of responses); however, if you place someone in a room that is 0.12 percent warmer than their body, you will find a multitude of responses. In either case, when you look back, it will only look like a straight line.

  It is impossible for history to have more than one result when looking back; philosophy aside, it wouldn’t make any sense to expect anything but single incidents in history. One of the simpler forms of logic, known as modal logic, says (essentially) the following:

If X can exist without Y then X is not identical to Y.

  What this principle captures is that two things cannot be identical if one can exist without the other; likewise, in history, an event cannot both happen and not happen. This means that even if an event had 90 trillion possible outcomes, only one of them can have happened, and I feel this is something even a child grasps. Which is what bothers me about Determinism: it essentially says that since history only happened one way, all of time has only one possible outcome. This seems like quite a bold statement; it would be like saying that if I ran a program to flip a coin and it returned heads, then in all future cases I’d see heads. It seems foolish to take an obvious property of history and attempt to use it to explain the future.

  I’d have dipped more into this but spending 5 hours attempting to recover my notes has all but crushed my writing spirit… Goodnight all :). I should be more chipper next update.

February 10th, 2009 | Journal | Determinism

Future Dreams

  Today’s post will be more personal than usual. Partially because I had to walk from my home to the college today (which is a very tiring operation).

  I am certain I know what job I’d like to do. I want to be a teacher, and eventually a professor. But why?! you might ask… oh, it’s quite simple!

  Well, first we look at the schedule. I realize that teachers work extra hours before and after the average school day; likewise, I realize that ‘short days’ are not short for teachers. However, anytime I think to myself “what days am I working next week?”, the answer will be pretty simple: “Oh right! Monday through Friday!”

  I like that sort of consistency; it provides me with a very manageable life schedule, which will make my writing a plausible activity (seeing as I need consistent time frames to keep my mind in check). Likewise, I love helping people, and I can see few better ways to help people than to help them learn the most that they possibly can. The consistent breaks each year would also be very pleasant. I know most of my professors go to other countries to enjoy things related (and not so related) to their studies.

  The pay might not be the greatest in the world, but that’s not necessarily a terrible thing. The US is an amazingly interesting place, if only for the fact that across our country there are millionaire CEOs who have more money than they can even figure out what to do with, yet they’ll die in obscurity just like all the people around them. Short of Steve Jobs and Bill Gates (two people who are more famous for what they invented than for their finances), I can think of no CEO’s name. However, I can name most of my favorite teachers and professors.

  The almost entrancing freedom it would give me in terms of my writing also excites me greatly. So I think I’m set. As for what I’ll teach, that’s up in the air, but just about anything interests me. Which might be another thing to chalk up on the list.

  So I suppose that’s it for today. I apologize for the brevity as well as the somewhat random nature of this post, but my face feels like it’s swallowing itself (I’m … really … tired) and I don’t think I could manage anything much better tonight.

  For the sake of foresight: there is a great chance you’ll see an article about Determinism on here tomorrow, mostly because I think it’s an interesting topic… and I hope you will too.

February 9th, 2009 | Journal | Future Dreams