![Backhoe](https://wiki.opensourceecology.org/wiki/File:Backhoe.png)
He asks himself two questions about broad objectives as a teacher:
- What do I want my students to remember five years from now about this class?
- What do I want my students to know how to do after they have taken this class?
- What is our question?
- How would you go about discovering the best answer to it, at least to your satisfaction?
- How would you then persuade somebody else that the answer you have come up with is the most likely answer?
Could that serve as a scaffolding for education in the age of AI? I think it could. By focusing on 'how would you discover?' and then 'how would you persuade?', we could avoid all of the traps involved in the overuse of LLMs.
Let _a_ be a new thing you're encountering: something you're reading about, or an action or phenomenon you're observing in real life.
Our short-term memory holds 5 to 9 things - let's say 7, for convenience.
Our long-term memory doesn't seem to have a limit. Or rather, thanks to the efficiencies achieved by "chunking" related things together, it is expandable enough that the functional constraint is not how much information we can meaningfully hold, but how much time we have available to do the work of moving things into long-term memory.
But not only is short-term memory minuscule compared to long-term; it also draws on the same mental resources we use to make connections among things - that is, to think.
So you can jam 7 small pieces of info into your short-term memory, and then have no cognitive capacity left to think about them.
Or you can put 4 small pieces of info into your short-term memory and have some cognitive capacity left over.
Or you can use just 2 pieces of info, but now the possible thoughts are extremely limited, because you have only a and b to compare.
When you bring the long-term memory into the game, it's fundamentally different.
Instead of moving among small pieces of info a, b, and c, you're picking up a, and maybe b as well, and comparing them to the large information sets
A, B, C, ... Z, AA, AB, AC, ... AZ, BA, BB, BC, ..., CA, CB, CC, ...
When you're playing that game, thoughts are available to you that are impossible for a person somehow trying to use only short-term memory. There are far more possible useful connections than when you're only playing with a, b, c, and d.
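The arithmetic behind this can be made concrete. As a rough sketch - under the deliberately simplified assumption that a "thought" is a pairwise comparison between items - the number of available connections grows quickly with the number of items in play:

```python
from math import comb

# Connections available when working purely within short-term memory:
# each "thought" is a pairing of two of the n items held in mind.
for n in [2, 4, 7]:
    print(n, "items ->", comb(n, 2), "possible pairings")
# 2 items -> 1; 4 items -> 6; 7 items -> 21

# Now bring long-term memory into the game: 2 items held in short-term
# memory, each comparable against a (hypothetical) store of 700 chunks.
short, long_term = 2, 700
print(short * long_term)  # 1400 candidate connections
```

The numbers 700 and the pairwise-comparison model are illustrative assumptions, not a claim about cognition; the point is only that the count of possible connections scales with the size of the store you can reach into.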
The person drawing on the long-term memory described above will have more thoughts than a person whose long-term memory is
A, B, C, ... Z, BA, BB, BC, ..., BZ, CA, CB, CC, ...
And this second person will have different thoughts than a person whose long-term memory is
A, B, C, ... Z, AA, AB, AC, ... AZ, CA, CB, CC, ...
I suppose it's possible to have lots of stuff in your long-term memory and be bad at thinking.
But I don't see how it's possible to have little in your long-term memory and be good at thinking.
It follows that a key part of learning to think is simply building up your long-term memory.
And now back to AI.
When LLMs interact with our conventional ways of teaching and of evaluating learning, it's often easy for the machines to replace the work that students (and the rest of us) need to do to move things into long-term memory. And so we undermine the development of the ability to think broadly.
The calculator example comes up often in these discussions, and I have a speculation about that. The claim when they were introduced was that they would remove the burden of doing low-level stuff like arithmetic, freeing up our mental energies for the real math of algebra or higher-level mathematical thinking.
On one hand, it's true that arithmetic takes time, and that if my mind is occupied figuring out 45 x 1,325,409, or working out an approximation of 78^(0.5), it can't be doing other, higher-order things, like solving the algebra of an economic model.
On the other, I wonder whether the _ability_ to solve those arithmetic problems is a substrate that makes higher-order mathematical thinking easier.
Confronted with a*(b + c), many students have a hard time not turning it into ab + c. And the idea that a^b * a^c = a^(b+c) is like witchcraft or gobbledygook, rather than something that becomes intuitive once demonstrated.
And I wonder about childhood practice doing sums in the head, memorizing times tables up to 12x12, working larger multiplication or division problems by hand on paper: does that activity build up a feel for quantitative relationships that then helps one feel the rightness of a*(b + c) = ab + ac?
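Those two identities can be checked with small concrete numbers - the kind of check a child practiced at mental arithmetic might do instinctively (the particular numbers here are just illustrative):

```python
# The exponent rule a^b * a^c = a^(b+c), with small numbers:
# 2^3 * 2^2 is 8 * 4, which is 32, which is 2^5.
a, b, c = 2, 3, 2
assert a**b * a**c == a**(b + c)

# The distributive law a*(b + c) = a*b + a*c, likewise:
# 7 * (10 + 2) is 7 * 12, which is 84, which is 70 + 14.
a, b, c = 7, 10, 2
assert a * (b + c) == a*b + a*c
```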
Calculators are still great, because doing arithmetic _does_ take mental effort that then is not available for doing other, higher-order things.
But if we use calculators in ways that prevent learning how to do arithmetic, does that impair the ability to do higher order things that we now have the mental space for?
And are LLMs poised to do the same thing?
If your long-term memory is already well stocked, and your short-term memory is well practiced at reaching into the long-term memory to find things that relate to items a and b that are currently being held in short-term memory, then an LLM might be a tool that allows you to extend your reach.
But if you encounter LLMs early in the educational process, do they take the place of developing your own long-term memory and facility at reaching into that memory?
I keep returning to a metaphor involving a backhoe and a bench-press.
If you want a foundation hole dug in a construction project, you care about the end result: is there a hole of the size that you require?
If I show up with a shovel and a wheelbarrow, while Kate shows up with a backhoe and a dump-truck, you're going to hire Kate. It's true that my way of doing the job is going to do a lot more for my muscular development than Kate's approach, and perhaps the reliance on my own strength will also put a premium on finding clever ways to do things (though Kate's backhoe also rewards her skill in knowing how to control it).
But you're not interested in how much my muscles improve. You want there to be a big hole in the ground, and Kate with her backhoe will do that job far more effectively than I will with my shovel and wheelbarrow.
It's different if you're the coach of a sports team. You assign us to bench-press a certain amount of weight. I go over to a normal bench-press. Kate goes to one that has a motor assist. She will get her "reps" done much quicker than I will, but she will get almost no muscular development out of it.
In this case, you don't care in the least about having x pounds go up y inches, a total of z times. You've assigned us that task because of the effect you want it to have on our muscles. If we use the motor-assisted bench-press, the "work" gets "done," but nothing useful is accomplished.
Do LLMs make it harder to learn how to think?
I've definitely seen them used in ways that are as pointless as a motorized bench-press in a weight room.
What are the applications that are more like a backhoe on a construction site?
And how do we move students - and ourselves - more in that direction?