When I talk to people about code, it’s in terms of the high-level concepts of what’s happening, not the literal code. I might say “we take the summation of …”; I would never say “we take the Sigma” or “sum symbol”. The names of the symbols never come up; our IDE snippet for Σ is mapped to “sum”, not “sigma”, etc. But anyway, I do believe you if you say you personally find it a bit confusing, though my team and I do not, so I think the advice “avoid symbols” applies on a per-team and per-symbol basis, not as a general principle. I think we agree at this point.
But similar research has definitely been done:
https://dl.acm.org/doi/abs/10.1145/2534973
Yeah, I’ve seen this before; it’s relevant but not exactly what I’m talking about. I’m not talking about which words or syntaxes are more intuitive, but rather which ones take less physical effort for the eyeballs and the visual system in our brain to process, before any sort of comprehension or analysis comes into play. Physiological. For example, I’m quite certain that
hello                    world

is slightly harder to visually process (not harder to understand) than

hello world

due to eye saccade distance,
and
((3) + ((4) / (5)))
is harder than
3 + 4 / 5
because there are now 10 extra elements to register in your vision, and parentheses are such sharp, jagged shapes that they make everything claustrophobic; your eyes have to jump over them.
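As a sanity check that the two forms really are interchangeable (a throwaway Python snippet, just to make it concrete): the language’s precedence rules already encode that grouping, so the ten extra parentheses carry zero information.

```python
# Division binds tighter than addition, so the fully parenthesized
# form and the bare form denote exactly the same expression tree.
noisy = ((3) + ((4) / (5)))
clean = 3 + 4 / 5
assert noisy == clean
print(clean)  # 3.8
```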
This leads to conclusions like: the typical C-style if statement requiring parentheses around the condition is a bad idea, because it makes negations harder to process, etc. In

if (!(n > 0))

the ! is squished against the (; it’s a lot easier to see if we just have

if !(n > 0)

or if you simply had a less vertical, more horizontal sign, like ¬:

if ¬(n > 0)

again, easier for the eyes to discern because ¬ is less similarly shaped to (
and you could conclude that curly-brace languages are harder on the eyes than whitespace-delimited languages. I think that is true.
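For what it’s worth, some existing languages already check both boxes. A minimal Python sketch (the function name check is mine, purely for illustration): no parentheses are required around the condition, negation is the word not rather than a symbol squished against a paren, and blocks are whitespace-delimited.

```python
def check(n: int) -> str:
    # C style:     if (!(n > 0)) { ... }
    # Paren-free:  if !(n > 0) ...
    # Python needs neither the outer parens nor a symbolic negation:
    if not n > 0:
        return "non-positive"
    return "positive"

print(check(-3))  # non-positive
print(check(5))   # positive
```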
As far as I know, no research on this sort of visual optimization is being done at all (except maybe in adjacent fields like typography and graphic design). Most programmers have never considered that this could matter in the code they write, and when they hear about it they usually just laugh and dismiss it immediately (“who cares?”), since they fundamentally don’t view source code as something that deserves to look good, or as something that’s primarily meant to be read by humans and only secondarily by computers.