am I bad at arithmetic for the same reason I’m good at software engineering? 

I am, in a sense, a professional mathematician: computer programming is one of the more obvious forms of applied mathematics. I am also absolutely awful at arithmetic and algebra; I swap numbers, make sign errors, fail to evaluate things in the right order, fail to group things correctly, lose track of what I’m doing halfway between lines (this happens a lot), etc.

I have a diagnosis of “mild dyscalculia” from my childhood psychologist, which earned me the very helpful support from my elementary school in the form of being clearly instructed to suck it up and get over it, the generous benefit extended to many students with diagnosed learning disabilities.

Meanwhile, I am thirteen years into an apparently successful career as a programmer. My title has been unnecessarily inflated to “senior software engineer” and I am given authority and responsibility over important things (important enough to make tech news headlines if I screw up, as I have achieved once before). My inability to reliably perform middle school algebra has not impaired my ability to design a low-overhead load balancing algorithm based on an approximate but adequate understanding of probability, or dozens of other math-flavored things. My software runs on thousands of computers my company owns and millions of computers it doesn’t, and it usually works, so I’m probably doing something right.

When I design a software system, I start with this “lump of idea” - every system starts out as a formless black box and a spec for what that box does. But I immediately start inferring, from the spec, what that box must do: the groups and clusters of data read and written, the shape of the main loop (if it’s a loop), and then start imagining the abstract machines that would transform the input to the output, each one broken down as its own lump of idea. This is both a skill that I have refined with practice and an intuitive ability that I cannot describe and cannot stop. People who have tried to talk about pretty much anything to me have probably noticed me breaking things down into interacting systems.

And math is like that! A mathematical expression is a machine - a group of interlocking systems that process an input in a strict way. So that should be easy for me, right?

Well, no. There’s the shape of the problem, and then there’s the execution. I am an average coder *at best*. In practice, I am atypically slow, which is salvaged by my software stabilizing very quickly (it takes me a while to get to the first version, but that first version is unreasonably close to the final version). I have this entire machine living in my head - and poorly described across thirty pages of notes - but then I have to focus on individual pieces enough to represent them as a program.

But I can do that. I can take as many editing passes as I need. I can run a debugger and send numbers through my code to watch what it does and see if I made a mistake. I can describe what I’m doing and fill in the blanks. I can take the assumptions I make about the program and write those into the program, too, so the program will immediately crash and tell me what’s wrong if my assumptions or beliefs about the program that I myself wrote were incorrect. (As they often are.) Writing a program to perform a task also tells me whether my approach to the task is usable. Ultimately, I never have to do the final step: I write the algorithm, but it’s up to the computer to run it.
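That “write the assumptions into the program” trick is just assertions. Here’s a minimal sketch of the idea; the function and its parameters are hypothetical, not something from my actual work:

```python
def moving_average(values, window):
    # Assumptions written into the program itself: if my beliefs
    # about how this function gets called are wrong, it crashes
    # here with a clear message instead of silently misbehaving.
    assert window > 0, "window must be positive"
    assert window <= len(values), "window larger than the data"

    averages = []
    for i in range(len(values) - window + 1):
        chunk = values[i : i + window]
        averages.append(sum(chunk) / window)

    # Post-condition: one average per valid window position.
    assert len(averages) == len(values) - window + 1
    return averages
```

The computer checks my beliefs on every run; I never have to hold them in my head while also executing the steps.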

Arithmetic is nothing like this. I have to do the calculations. I have been taught the algorithms — I have my own moist and squishy software suite, in a way — but I must execute them myself. Perfectly specifying the steps is inadequate when I must also perfectly perform them.

Unfortunately, my attention doesn’t work that way. The cloud of components I break a problem down into never goes away. I look at smaller parts of a mathematical expression and they’re still a cloud of relationships; there is no natural sequence to them, they aren’t steps. They have dependencies, which imply a sequence of steps, but I am very nearly incapable of maintaining such a sequence in my mind alongside the model of the math problem. So I can’t really do “the next” step: I see all these relationships between numbers and I can’t tell which to elaborate on next.

When I do pin one down, to get room to perform the algorithm to process it, I have to forget most of my model of the math problem. But that’s okay, I can always read it again, right?

Well, sort of. If my handwriting is bad - as it often is when it’s trying to keep up with my thoughts - I make sign errors, exponents fall out of place and become multipliers, logarithms lose their bases, and so on. I also have the difficult task of slotting the piece I just solved back into the expression without duplicating or dropping anything! So I have to reread the original expression (correctly), identify the floating abstract part I just re-transcribed differently, and re-transcribe the entire remainder of the expression around it, correctly. That is extraordinarily difficult for me.

So I think I’m good at software for the same reason I’m bad at math. My concept of abstract systems is clear and fast, and I can systematically convert them to software - but I cannot *avoid* using the same model for mathematical expressions. It’s a great model for writing a program to do math and utterly impractical for performing the actual steps.

Is this ADHD? I dunno. An ADHD trait is in there — it takes extraordinary effort for me to keep a sequence of steps in mind and follow it — but caffeine only sort of helps me with math. (It’s critical for programming, of course.) In the end, I can’t freeze part of my thoughts to use a little bit of them - all those connections are alive and moving, and if I need to use a subset of those, I have to choose something to stop thinking about. Hence thirty pages of notes, comments scattered throughout code describing the obvious (because the comments were there long before the code), and dozens of explicit to-do lists.

I can do math with all those tools. Left to my own devices, I do. But it turns out that when a teacher says “show your work”, they mean “write out an exact list of steps with the full expression transformed step by step, perfectly, within an acceptable amount of paper in clear double-spaced format” and not “draw a sequence of mind maps with circles and arrows that look like a conspiracy theorist trying to explain how the lizard people ate Queen Elizabeth Warren and replaced her with a very convincing robot”. Which is what my software notes look like. I know _from experience_ that teachers don’t accept this, even teachers who are explicitly teaching “gifted” students. By the time I made it to college, I was desperately anxious about math and barely able to do it. The professor who explicitly encouraged me to break down expression pieces separately across multiple pages (or tabbed sections in a binder) single-handedly turned around my ability to do math, and much to my frustration I’ve forgotten his name, so I can’t even thank him properly. I’m certain I’d do better now - now that I realize no professor would stop me from taking notes like this, and that I can work backwards and turn my notes into linear steps later, just like I do for computers.

For people who feel comfortable with arithmetic, how does the process work for you? What is the cognitive experience? I suspect I would find it beautifully alien to me.

re: am I bad at arithmetic for the same reason I’m good at software engineering? 

@kistaro I used to think I was "good at math", but really what I'm good at is abstract symbol manipulation. I learned that doing predicate logic in college philosophy - it was like super-algebra, all these terms with no definitions needed, just relationships stated. It's very pure, and as long as you translate things correctly from regular language into the predicate logic symbol set, you're almost as good as done without even starting.
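A toy example of that translation step (my own, not from the class):

```latex
% English: "Every programmer drinks coffee; k is a programmer;
% therefore k drinks coffee."
\forall x\,\bigl(P(x) \rightarrow C(x)\bigr),\quad P(k) \;\vdash\; C(k)
% Once the sentences are in symbols, the conclusion follows by pure
% manipulation (instantiate x := k, then modus ponens) - you never
% need to know what P, C, or k actually mean.
```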

Doing arithmetic is like... seeing a rainbow-colored cloud, but also seeing that there's something in the cloud that requires you to remove the blue, so you do, and the cloud starts to have tantalizing hints of what it might become. You apply steps and rules and relationships to things in this abstract system, and eventually what was a cloud of colorful possibilities has collapsed into a set of answers that are solid, beautiful, and gleaming.
