The United States, and the rest of the developed world for that matter, no longer has a stranglehold on the global economy. This isn't to say there isn't more growth to be found in developed markets, but the rest of the world has been catching up. Rapidly.
Those of us who analyze educational trends and realities aren't surprised. The onslaught on the public school system by deranged conservative forces has long since rendered the public schools intellectual bloodbaths.
Just as Alex, in Stanley Kubrick's A Clockwork Orange, is forced to view subject matter continuously, so American students are presumed guilty of ignorance and forced to regurgitate the basics for their own good until they are suffocating from the educational process.
And the public, or more specifically John and Jane Taxpayer, justifies this pedagogy in the holy name of school accountability, closing achievement gaps, and giving our kids a dose of social equality that they will never forget.
The umbrella program under which all this is administered is called No Child Left Behind, or NCLB (pronounced "nickel-bee"). It is the vehicle for redundant and relentless high-stress testing of all students, without consideration for their learning disabilities, emotional frailties, or intellectual development patterns. The message is as clear as a Nazi torturer's instruction: "We have ways of making you learn!"
Indeed we do.
NCLB claims to be a data-driven approach to teaching, yet the data that has accumulated points more to its failure than to any success. By measuring the results of random student populations in the same school, it asserts that schools are either passing or failing to teach the prescribed dogmas. Its results more accurately reinforce a truism that is well known with or without high-stress testing: schools in poor, urban, minority neighborhoods have difficulty reaching the same achievement levels as schools in middle- and upper-class neighborhoods.
NCLB sells the idea that raising the test scores in these schools is a remedy for the poverty, isolation, and racism those results point to. By eliminating these symptoms, presumably, we eliminate the disease. Theories like this are brought to us with a straight face by the same people who think democracy will stabilize the Middle East.
But NCLB claims that gathering data, and lots of it, is a sufficiently scientific activity to justify the education theories being applied.
Many years ago, Richard Feynman used his Cargo Cult Science analogy to illuminate this kind of scientific behavior. During World War II, Pacific islands whose people had been isolated from the outside world before the war suddenly became locations for air bases. After the war, the planes stopped coming.
The renewal of NCLB is being debated by a closed forum of well-paid NCLB cronies and enablers. They control the money, the process, the invitation lists, and all the power. You and I are little more than a nuisance. They are turning our children into uniform croquets. Anyone who has read Kurt Vonnegut's Mother Night will know what croquets are.

As Feynman put it, "...we really ought to look into theories that don't work, and science that isn't science."
I think the educational and psychological studies I mentioned are examples of what I would like to call cargo cult science. In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas -- he's the controller -- and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land.
Now it behooves me, of course, to tell you what they're missing. But it would be just about as difficult to explain to the South Sea islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones. But there is one feature I notice that is generally missing in cargo cult science. That is the idea that we all hope you have learned in studying science in school -- we never say explicitly what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It's a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty -- a kind of leaning over backwards. For example, if you're doing an experiment, you should report everything that you think might make it invalid -- not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you've eliminated by some other experiment, and how they worked -- to make sure the other fellow can tell they have been eliminated.
Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can -- if you know anything at all wrong, or possibly wrong -- to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.
In summary, the idea is to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgement in one particular direction or another.
The easiest way to explain this idea is to contrast it, for example, with advertising. Last night I heard that Wesson oil doesn't soak through food. Well, that's true. It's not dishonest; but the thing I'm talking about is not just a matter of not being dishonest; it's a matter of scientific integrity, which is another level. The fact that should be added to that advertising statement is that no oils soak through food, if operated at a certain temperature. If operated at another temperature, they all will -- including Wesson oil. So it's the implication which has been conveyed, not the fact, which is true, and the difference is what we have to deal with.
We've learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature's phenomena will agree or they'll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven't tried to be very careful in this kind of work. And it's this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in cargo cult science.
A great deal of their difficulty is, of course, the difficulty of the subject and the inapplicability of the scientific method to the subject. Nevertheless, it should be remarked that this is not the only difficulty. That's why the planes don't land -- but they don't land.
We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It's a little bit off because he had the incorrect value for the viscosity of air. It's interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan's, and the next one's a little bit bigger than that, and the next one's a little bit bigger than that, until finally they settle down to a number which is higher.
Why didn't they discover the new number was higher right away? It's a thing that scientists are ashamed of -- this history -- because it's apparent that people did things like this: when they got a number that was too high above Millikan's, they thought something must be wrong -- and they would look for and find a reason why something might be wrong. When they got a number close to Millikan's value they didn't look so hard. And so they eliminated the numbers that were too far off, and did other things like that. We've learned those tricks nowadays, and now we don't have that kind of a disease.
But this long history of learning how to not fool ourselves -- of having utter scientific integrity -- is, I'm sorry to say, something that we haven't specifically included in any particular course that I know of. We just hope you've caught on by osmosis.
The first principle is that you must not fool yourself -- and you are the easiest person to fool. So you have to be very careful about that. After you've not fooled yourself, it's easy not to fool other scientists. You just have to be honest in a conventional way after that.
I would like to add something that's not essential to the science, but something I kind of believe, which is that you should not fool the layman when you're talking as a scientist. I am not trying to tell you what to do about cheating on your wife, or fooling your girlfriend, or something like that, when you're not trying to be a scientist, but just trying to be an ordinary human being. We'll leave those problems up to you and your rabbi. I'm talking about a specific, extra type of integrity that is not lying, but bending over backwards to show how you're maybe wrong, that you ought to have when acting as a scientist. And this is our responsibility as scientists, certainly to other scientists, and I think to laymen.
For example, I was a little surprised when I was talking to a friend who was going to go on the radio. He does work on cosmology and astronomy, and he wondered how he would explain what the applications of his work were. "Well", I said, "there aren't any". He said, "Yes, but then we won't get support for more research of this kind". I think that's kind of dishonest. If you're representing yourself as a scientist, then you should explain to the layman what you're doing -- and if they don't support you under those circumstances, then that's their decision.
One example of the principle is this: If you've made up your mind to test a theory, or you want to explain some idea, you should always decide to publish it whichever way it comes out. If we only publish results of a certain kind, we can make the argument look good. We must publish BOTH kinds of results.
But on the outside, along the watchtowers of what's left of civilization, a growing chorus of voices is offering new ideas, hope, and a shining optimism that insists we can give our children a better learning environment and a brighter future. They need you to listen to their message and give it a fair hearing.
Here are just a few of the most important voices in education today.