Here's a post from Keith Devlin working through some thoughts about the tension between calculation and mathematical thinking.

https://devlinsangle.blogspot.com/2018/05/calculation-was-price-we-used-to-have.html

For any mathematician alive today, mathematics is a subject that studies formally-defined concepts, with a focus on the establishment of truth (based on accepted axioms), with various forms of calculation (numerical, algebraic, set-theoretic, logical, etc.) being tools developed and used in the pursuit of those goals. That’s the only kind of mathematics we have known.
Except, that is, when we were at school. By and large, the 19th Century revolution in mathematics did not permeate the world’s school systems, which remained firmly in the “mathematics is about calculation” mindset. The one attempt to bring the school system into the modern age (in the US, the UK, and a few other countries), was the 1960s “New Math”. Though well-intentioned, its rollout was disastrous, in large part because very few teachers understood what it was about – and hence could not teach it well. The confusion caused to parents (other than mathematician parents) was nicely encapsulated by the satirical songwriter and singer Tom Lehrer (who taught mathematics at Harvard, and did understand New Math), in his hilarious, and pointedly accurate, song New Math.
As a result of the initial chaos, the initiative was quickly dropped, and school math remained largely unchanged while real-world uses of mathematics kept steadily changing, leaving the schools increasingly separated from the way people did math in their jobs. Eventually, the separation blew up into a full-fledged divorce. That occurred in the late 1980s. The divorce was finalized on June 23, 1988. That was the date when Steve Wolfram released his mammoth software package Mathematica.[...]
Devlin is really good on matters pedagogical, and always worth the read.
I do tend to think, though, that students will have a very hard time understanding math (or written communication) if they have not had enough experience doing the work, and have not seen enough examples to get a sense of the possible range of approaches to it. Early in my teaching I tended not to give enough examples, figuring that teaching the conceptual side would lead students to sort through their own database of examples to see the underlying principles. I've since learned that most students come in having seen and understood too few examples, and having no idea that there is more than one approach to the tasks they have been called upon to do.
I do a lot more modeling of approaches, and evaluation of those approaches, now that I'm finally starting to figure out this whole teaching thing.
2025-07-19 22:24:17
I'm puzzled by this. I'm not good at languages, relative to my other skills, but switching alphabets - Cyrillic, Greek, Georgian... - is trivial.
It's not onerous, no, but it is a factor on at least two levels in my experience.
First off, it can create some noise when particular letters look similar to letters in the other alphabet that are not phonetically equivalent, which usually triggers a bit of recursion in the reading process. It's not a lot of load on the system, but it is processing power that is not being used to make sense of the meaning. Writing English in the Greek alphabet barely affects reading comprehension for someone fluent in English, because the deciphering is cheap. Combine a lack of fluency with the need to decipher, though, and the effects compound.
Second of all, it messes with the pattern recognition that one relies upon when skimming a text. When I'm reading Swedish or Spanish, I can skim the text fairly easily and a lot of the language has enough root-equivalency to make those reading skills transfer. That sort of whole-word pattern recognition doesn't fire the same way when I am faced with another alphabet.
All of these things tax your language processing in the same way an unfamiliar topic does: when a student is asked to write about an unfamiliar topic with its own technical vocabulary, they often produce more grammar and spelling errors than when writing about familiar topics. The familiar has a much simpler processing economy.
And again, with functional and transactional language, these difficulties are much less pronounced than when dealing with more complex and nuanced subjects.
At least that is my experience, and it seems to match with my observations of how my non-native student writers interact with texts. Actual linguists would likely have a lot to say about the places where I'm wallpapering over some complex topics, or missing the boat entirely.
2025-07-19 19:10:30
Instant translation is fine for functional and transactional language, but it hits its limits pretty quickly as language complexity increases, and it becomes problematic for understanding as soon as there is an intertextual element at work. I see this a lot with my international students when they are working their way through English texts with the help of translation software. They miss a lot of the features the authors are using to communicate - parallelisms, homophones, puns, etc.
To be fair, a lot of my native-speaking domestic students miss those things too, but the international students have the reading skills to catch those elements in their own languages, and would notice them if they were actually working with the original text.
One thing I can add that speaks to lj's first point. Language-wise I've studied Spanish, French, Swedish, and Ancient Greek. I can muddle through in Spanish, and would probably be able to attain fluency in any of the first three in a few months with immersion. Greek, however, never sticks particularly well, and the alphabet contributes somewhat to that difficulty. It's one more unfamiliar element (deciphering) that takes up processing power that would otherwise be used for linguistic sense-making.
2025-07-19 17:58:17
In case anyone is interested in the subject (and in lieu of fraught AI summaries): https://en.wikipedia.org/wiki/English_as_a_lingua_franca
It's entirely possible that English will become the lingua franca for international communications, but if it does, I'd expect, like Hartmut, that it will continue to shed irregular constructions and colloquialisms, and that native dialects will be treated as quaint variants with charming local color. I also predict that both Americans and Brits will complain bitterly that ELF is "not proper English" when that happens, and will resent any standard that treats ELF as the paradigm.