Commenter Archive

Comments by Hartmut*

On “The law of the letter”

Michael, this is just way cool!
When you feel like the software is mostly together, is it something you would be willing to share? Sell? (I hesitate to suggest beta test. ;-)

"

As we've gotten around to archives...
TL;DR version: I've started playing with a toy version of the beginnings of software that will eventually be a tool for my archival project. In a couple of years it ought to be interesting :^)
I'm starting to play with toy versions of software I'll be using eventually in my role as extended family archivist building a digitized record from the hundreds/thousands of pictures and document pages that have accumulated. Everything I'm doing right now is grayscale, just so that's not a surprise to anyone who goes so far as to look at the images. Most of the images are large; you'll have to do whatever tricks your browser requires to see them at full resolution.
Text documents first. A JPEG image of a document page I snapped with my iPad is here. The original image is somewhat sharper than the one shown, since JPEG is not as good with details as Apple's HEIC format. For the time being, I use ImageMagick to convert HEICs to uncompressed grayscale.
Right now the toy assumes the document is a rectangle lying flat, and I'm taking a picture of it that's out of alignment. That makes it a linear transform problem. The first step is to find the corners of the document. I'm doing something not entirely simple-minded. The accuracy of the toy corner-finding code is illustrated here.
It's been a long time since I did anything with linear transforms and the matrix calculations that go with them. After some online reading to refresh my memory, and finding simple versions of code for 3x3 matrices, the toy code can do a perspective transform and produce an approximate equivalent of a 300 dot-per-inch scan (or more, or less). The page in the picture is actually a pile of several sheets, stapled, so it doesn't quite meet the flat-rectangle assumption. The result is shown here.
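For anyone curious, the corner-to-rectangle step can be sketched as a direct linear solve for the eight unknowns of the 3x3 perspective matrix. This is not the actual toy code, and the pixel coordinates below are invented for illustration, assuming an 8.5x11-inch page rendered at 300 dpi:

```python
import numpy as np

def find_homography(src, dst):
    """Solve for the 3x3 perspective matrix H mapping four src corners
    to four dst corners, with H @ [x, y, 1] proportional to [u, v, 1]."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # fix the scale with h33 = 1

def apply_homography(H, pt):
    """Map one (x, y) point through H, dividing out the projective factor."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical detected corners of a tilted page in the photo,
# mapped to a 300 dpi output rectangle (2550 x 3300 pixels).
corners = [(312, 180), (2200, 260), (2150, 2900), (280, 2840)]
target = [(0, 0), (2550, 0), (2550, 3300), (0, 3300)]
H = find_homography(corners, target)
```

Producing the output image is then just iterating over output pixels and sampling the input through the inverse of H.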
In some cases, I will want to do OCR on the images. I'm using the tesseract open-source OCR program for now. Tesseract is not a toy. When I converted the photo to an estimated 600 dpi scan and ran it through tesseract: (a) tesseract estimated the resolution as 607 dpi; (b) on a quick pass through the output, all the text in the actual text fields is correct; and (c) flat text output is shown here.
The same toy software works on pictures of photographs if there's a white border so that the toy can identify corners. An approximate equivalent to a 300 dpi scan of an old Polaroid picture of my wife-to-be from before I knew her is shown here.

"

Perhaps the greatest calculator ... was Kepler
Perhaps. But the ladies who did all the calculations for the Mercury and Apollo Projects were no slouches either. Men's lives hung on their work. As the story goes, John Glenn asked explicitly for Katherine Johnson to do the calculations for his flight. He wouldn't trust anyone else with his safety.

"

Thanks, Tony P. The two most underrated scientists in popular science history are Kepler and James Clerk Maxwell. (Galileo is the most overrated.)

"

I love maps. All kinds: atlases, road maps, city maps, world maps. Give me a map to use, read, ponder, and I can be absorbed in it for hours.
I remember, when I was in Athens (Greece, not Georgia), getting my hands on a fold-out street map, assuming I'd be able to make some sense out of it based on knowing the general names of streets, and where things like markets and bridges were. And I remember the delightful/terrifying feeling of not, in fact, being able to do that even a little bit. Thanks to the different alphabet, I was unable to tease out any meaning at all. Terrifying for obvious reasons, but delightful because it was rather fun to see one of my most-cherished objects - a map! - manifest as incomprehensible.
Michael - The issue of records being kept, preserved, and accessible for more than one generation is one I think about a LOT. Just seeing how quickly electronic media become obsolete makes me shake my head in bleak wonder.
We can read the direct writing of people from thousands of years ago - multiple thousands of years - right up to, what, a couple generations ago? When did people stop writing letters or keeping written journals?
It just seems like humanity, or at least the industrialized portions, is engaged in a headlong rush to erase itself from the record. (Which, considering where we are right now as a species, is kind of understandable, though no less alarming.)

"

Pro Bono: Perhaps the greatest calculator ... was Kepler
PB, if you haven't seen this 3Blue1Brown video, you really should take a look. The mind-boggling explanation of what Kepler accomplished starts at around 18:20, but the whole thing is great.
--TP

"

Why spend minutes (at minimum) on the screen when a sketch on paper takes seconds?
Tablets are getting better at imitating what paper and pencil do. But it's still a pale imitation. And, from what I've seen, the rate of progress towards duplicating it has slowed markedly.

"

Don't know about memory but in my case writing by hand clearly improves quality (and is also quicker).
And doing math (not calculating, but solving a math problem by hand) I essentially can't do any other way. Chemistry the same. Why spend minutes (at minimum) on the screen when a sketch on paper takes seconds? I am not a tablet guy, admittedly.

"

There seems to be a consistent body of work showing that taking notes during a lecture reinforces memory
The more ways you can engage with a body of material, the more it will stick with you.
The greater the degree of attention required of you as part of that engagement, even more so.
Humans should think, machines should work, as the saying goes. That said, thinking *is* work, and attempts to find short cuts around that just make us stupid.

"

At the rate software is improving, I suppose computers will be able to read to us, and write down what we say as well.
During the early 1990s I had lunch regularly with a librarian. We discussed archiving on a regular basis. Ken Burns's Civil War documentary was still pretty new. She used to say, "You want to write the source material for someone to use in 120 years to make a documentary like Burns's? Acid-free paper and pigment-based ink, my friend. And descendants willing to keep your writings in a trunk somewhere dark."

"

...why and whether kids still have to learn to write by hand in our modern age.
There seems to be a consistent body of work showing that taking notes during a lecture reinforces memory, and taking notes longhand reinforces more than typing on a keyboard. That's the pseudo-academic in me speaking, of course.
For the last twelve years or so I've been using a little note-taking application that I wrote myself. There were just too many cases where pasting in an image, or having a live URL, or even just searching for a keyword seemed to justify it. Recently I've been considering going back to paper and pen.
I thought about using an iPad with an Apple Pencil, which has gotten very much like paper and pen (so long as you don't use the eraser much*). Unfortunately, Apple has seen fit to put handwriting recognition into the OS, and insists on putting a little line under anything you write/draw that it thinks might be a date or time. I've seen many complaints about it, and people asking why Apple can't make it optional.
* One of the reasons I always took notes in ink while I was doing research work was because sometimes I wrote down something that I thought was right, and two days later discovered I was mistaken. With ink, you have to grab a different color pen and put in a dated bit with the correction.

"

Doing recursions on the calculator was also among my first experiences. I did not yet understand, though, why certain functions would yield the same result, independent of the starting value, after pressing the key repeatedly (converging on x = f(x)).
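That calculator behavior is plain fixed-point iteration: repeatedly pressing a function key converges to a solution of x = f(x) whenever |f'(x)| < 1 near the fixed point, because each press shrinks the distance to it. The cos key is the classic example; a minimal sketch (the press count and starting values are arbitrary):

```python
import math

def iterate_key(f, x, presses):
    """Simulate pressing the same function key `presses` times."""
    for _ in range(presses):
        x = f(x)
    return x

# Any starting value ends up at the fixed point x = cos(x) ~ 0.739085,
# since |cos'(x)| = |sin(x)| < 1 near that point.
a = iterate_key(math.cos, 0.5, 100)
b = iterate_key(math.cos, 5.0, 100)
print(a, b)
```

Keys like x² behave differently: away from 1, |f'(x)| = 2|x| > 1, so repeated presses run off to zero or infinity instead of settling.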

"

All this math talk has me celebrating pi, but not exactly.
Hmmm. I'm thinking exactly, but not precisely.

"

I played with an electronic calculator when I was a kid. It allowed me to start recognizing patterns in numbers, particularly when performing the same calculation recursively. I'm reasonably sure I wasn't typical in that regard, but I wanted to mount some meager defense of electronic calculators.

"

First we smash all the (electronic*) calculators.
Get the logarithm table and/or the slide rule back.
That's how one teaches the basics!
In all seriousness, I work as a tutor, by now mainly for math. Before one can teach them abstract concepts, they need to get the basics right, and that means calculations (preferably without electronic help). I see no justification for teaching set theory** when they are still puzzled by "Is 123 divisible by 3 without remainder?" or "What's the square root of 81?". If that's 19th century, then Gott erhalte Franz den Kaiser!
To me that sounds like the unfortunate discussion about why and whether kids still have to learn to write by hand in our modern age. Why learn orthography when there are spellcheckers? And what by the way is the use of kids reading fictional literature of guys long dead or ancient (pre-1990) history?
Yes, understanding should have priority over rote learning, but everyday math is still mostly basics. And I am cynical enough to say that we are failing there already. Imo math at school should concentrate on practical problem solving, not Zermelo's theorem.
*abaci will be tolerated
**exception: the difference between natural, integer, rational and real numbers.
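The divisibility question above has the classic pencil-and-paper shortcut: a number is divisible by 3 exactly when its digit sum is, because 10 ≡ 1 (mod 3). A minimal check, purely for illustration:

```python
def digit_sum(n):
    """Sum of the decimal digits of a non-negative integer."""
    return sum(int(d) for d in str(n))

# 123 -> 1 + 2 + 3 = 6, and 6 is divisible by 3, so 123 is too.
print(digit_sum(123))      # 6
print(123 % 3 == 0)        # True
```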

"

All this math talk has me celebrating pi, but not exactly.

"

Perhaps the greatest calculator, unmentioned in Devlin's article, was Kepler, who worked out his laws of planetary motion from Brahe's observations.

"

For any mathematician alive today, mathematics is a subject that studies formally-defined concepts, with a focus on the establishment of truth (based on accepted axioms)
Absolutely; however, "formally-defined concepts, axioms, proofs, etc." are taught in PLANE GEOMETRY, not so much in calculation/algebra-type classes.

"

I’m on my phone, so can’t give links, but I encourage a dive into how the Japanese teach math vs. US methods. A couple of points I remember:
-in the US, people who are good at math get pushed into teaching math, so they often don’t understand why students make the mistakes they do. A large component of Japanese math education is predictive, so a good teacher should know where students are likely to go off the rails and adjust their teaching
-a passing grade in Japan is 60, which is good for math; if you understand 60% of some concepts, that’s not too bad, and the bulk of math education happens in high school. I recall I was involved with an exchange program that sent selected prefectural students to BC. One student was from one of the lower-ranking schools, and was considered the weakest candidate academically. He wasn’t a bad kid, but he was on the baseball team, so 95% of his effort went to the baseball field. He went to a BC high school where classes were in mid-term, had to take a math test because that was what was scheduled, and everyone was astonished because he had a perfect score.
-students in Japan still aren’t permitted to use calculators
-they also don’t give partial credit, which is how I got thru my math courses.
Will try and toss some links tomorrow.

"

Here's a post from Keith Devlin working through some thoughts about the tension between calculation and mathematical thinking.
https://devlinsangle.blogspot.com/2018/05/calculation-was-price-we-used-to-have.html
For any mathematician alive today, mathematics is a subject that studies formally-defined concepts, with a focus on the establishment of truth (based on accepted axioms), with various forms of calculation (numerical, algebraic, set-theoretic, logical, etc.) being tools developed and used in the pursuit of those goals. That’s the only kind of mathematics we have known.
Except, that is, when we were at school. By and large, the 19th Century revolution in mathematics did not permeate the world’s school systems, which remained firmly in the “mathematics is about calculation” mindset. The one attempt to bring the school system into the modern age (in the US, the UK, and a few other countries), was the 1960s “New Math”. Though well-intentioned, its rollout was disastrous, in large part because very few teachers understood what it was about – and hence could not teach it well. The confusion caused to parents (other than mathematician parents) was nicely encapsulated by the satirical songwriter and singer Tom Lehrer (who taught mathematics at Harvard, and did understand New Math), in his hilarious, and pointedly accurate, song New Math.
As a result of the initial chaos, the initiative was quickly dropped, and school math remained largely unchanged while real-world uses of mathematics kept steadily changing, leaving the schools increasingly separated from the way people did math in their jobs. Eventually, the separation blew up into a full-fledged divorce. That occurred in the late 1980s. The divorce was finalized on June 23, 1988. That was the date when Steve Wolfram released his mammoth software package Mathematica.[...]

Devlin is really good on matters pedagogical, and always worth the read.
I do tend to think, though, that students will have a very hard time understanding math (or written communication) if they have not had enough experience doing the work, and have not seen enough examples to get an idea of the possible range of approaches to it. Early in my teaching I tended not to give enough examples, figuring that teaching the conceptual side would lead students to sort through their own database of examples to see the underlying principles. I've since learned that most students come in having seen and understood too few examples, and having no idea of more than one approach to the tasks they have been called upon to do.
I do a lot more modeling of approaches, and evaluation of those approaches, now that I'm finally starting to figure out this whole teaching thing.

"

I suppose I can see how, if everybody who knows how to read** has a phone/computer in their hip pocket, knowing basic arithmetic might be less critical than it once was. I'm not convinced, mind, but I can see that it might be.
** At the rate software is improving, I suppose computers will be able to read to us, and write down what we say as well. The reactionaries will no doubt be delighted if illiteracy once again becomes the norm. /snark

"

I was taught using the School Mathematics Project, which seemed OK to me. But I may not be one of the "normal people".
I suggest that being able to divide accurately with pen and paper is now almost useless, whereas being able to divide approximately in one's head is useful for avoiding fat-finger errors. That is, the underpinnings have turned out to be more important than the algorithms.

"

New Math was the same sort of thing. It pushed a much broader view of what math was than just the algorithms. Look, long division is done the way it is because hundreds of years of experience informs us that it's the best way to get the right answers when you have to do a hundred division problems a day, day after day. New Math failed when the teachers pushed the broader view but didn't teach the mechanics.
The trouble with New Math was that it was (apparently) designed by mathematicians. Mathematicians who had forgotten that a) you have to build the foundations (mechanics, as Michael says) first. And that b) normal people are not mathematicians -- and that's 99.99% (or more) of the population. They neither care nor need to know the theoretical underpinnings. They just need to know how to do basic arithmetic reliably.

"

Camel notation* from computer programming would possibly be better: InternalCombustionEngine.
It is interesting that it is widely used in domain names, e.g. KaiserPermanente.org. Clearly the sales and marketing folks think it will be easier to parse that way.
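One reason camel notation parses so easily is that the capitals mark the word boundaries a space would otherwise provide, so splitting a name back into words is a one-line pattern match. A hypothetical sketch:

```python
import re

def split_camel(name):
    """Split a CamelCase name at its capital letters."""
    return re.findall(r'[A-Z][a-z]*', name)

print(split_camel("InternalCombustionEngine"))  # ['Internal', 'Combustion', 'Engine']
print(split_camel("KaiserPermanente"))          # ['Kaiser', 'Permanente']
```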

*Comment archive for non-registered commenters assembled by email address as provided.