And even if they do remember, the chances are they’d struggle with the mental arithmetic.
I remember media weather reports used to be given in Fahrenheit, and I struggled when they changed over to Centigrade only (or Celsius as the BBC would have it now!); ‘20 degrees’ just didn’t mean anything to me, whereas ‘the 70s’ did (newspapers still use headlines like the ‘sizzling seventies’).
It was many, many years after my schooldays that I learnt of a simpler ‘near enough’ method:
Just double it and add 30
(and the converse: take away 30 and halve it).
I was amazed at how easy it was to do in your head, and ever since then I’ve mentally converted our typical UK weather temperatures and found the results to be within a degree or two of the correct answer.
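To see how close the rule gets, here’s a rough Python sketch that tabulates the exact formula, F = 9C/5 + 32, against the rule of thumb; the -5 to 30 range is just my assumption of ‘typical UK weather’:

    # Compare the exact C-to-F conversion with the "double it and add 30" rule
    # over an assumed range of typical UK weather temperatures (-5 to 30 C).
    for c in range(-5, 31, 5):
        exact = c * 9 / 5 + 32   # exact formula: F = 9C/5 + 32
        approx = 2 * c + 30      # rule of thumb: double it and add 30
        print(f"{c:>3} C -> exact {exact:5.1f} F, approx {approx:3d} F, error {approx - exact:+.1f}")

Run like that, the rule is spot on at C=10 and within a couple of degrees between roughly 0 and 20, with the error only creeping up towards the cold and warm extremes.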
I drew a graph of 'C to F' to help me understand what was happening; the dark blue line uses the exact formula, and the purple line uses the approximate formula.

You can see they coincide when C=10 and F=50, but drift apart either side, so it’s only a good match over a limited range.
This graph shows the actual error between the two formulae: at C=25, for example, the approximate formula gives a value of F which is 3 degrees too high.
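For anyone curious about the algebra behind that error graph: with the exact formula F = 9C/5 + 32, the difference is

    (2C + 30) - (9C/5 + 32) = C/5 - 2

which is zero at C=10 (hence the crossing at F=50), grows by 1 degree for every further 5 degrees C, and reaches +3 at C=25, just as the graphs show.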

So although it’s no good for scientific use, it’s not bad for getting a gut feel for everyday weather temperatures (in the UK, at least).

I would be interested to know if this short-cut is common knowledge, and if kids today are told about it. I see no harm as long as its limitations are explained; it would certainly have helped me ‘when I were a lad’.
Phil.