A Computationally Rational Basis for Stress
Stress, Frame Lag, and Norepinephrine
What is stress?
I think of the mind as a prediction machine: a recurrent system taking the current state and feeding it forward to estimate what the next frame might look like. We can do this on multiple time horizons: Am I in immediate danger? How might my meeting go this afternoon? What will family life look like in a year? This kind of prediction is a computational process, and this idea stemmed from thinking about the race conditions of that process running on dedicated hardware.
Because it's a computational process, it has some kind of execution time. I can predict, say, the location of a thrown projectile 5 seconds from now, in about 0.1 seconds. In 5 seconds I can compare the predicted location to the actual location and evaluate the accuracy of the prediction. But say a scenario arises wherein things are simply moving too quickly, tasks are building up, some physical disaster (a car crash, for example) is looming imminently, or the problem is so multivariate or stochastic that it's hard to predict. The problem's complexity class is such that it is not computable within one frame. My prediction mechanism thus gets no feedback, and I'm left uncertain about my own security.
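To make the frame-lag idea concrete, here's a toy Python sketch. The function name, rates, and numbers are purely illustrative choices of mine, not a claim about neural timing:

```python
def frame_lag(world_rate_hz: float, predict_rate_hz: float, seconds: float = 10.0) -> float:
    """Toy model: how many world-frames go unpredicted (and hence unvalidated)
    when the predictor can only produce predict_rate_hz forecasts per second
    while the world updates at world_rate_hz. A positive lag means the
    feedback loop has broken down: predictions never get checked."""
    return max(0.0, (world_rate_hz - predict_rate_hz) * seconds)

# A calm scene: 2 events/s against a 10-prediction/s predictor -> no backlog.
print(frame_lag(world_rate_hz=2, predict_rate_hz=10))    # 0.0
# A car-crash moment: 50 events/s against the same predictor -> 400 frames behind.
print(frame_lag(world_rate_hz=50, predict_rate_hz=10))   # 400.0
```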
It's somewhat common wisdom that uncertainty drives stress. But I think we can go further and talk about uncertainty in this computational sense. Stress is felt when our prediction mechanism a) predicts harm, b) makes invalid predictions, c) _is outmatched by the situational complexity_, or d) gets no feedback.
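Stated as a toy predicate, with every argument name and threshold an arbitrary placeholder rather than anything measurable:

```python
def stress_signal(predicts_harm: bool,
                  prediction_error: float,
                  situational_complexity: float,
                  compute_per_frame: float,
                  feedback_received: bool,
                  error_tolerance: float = 0.2) -> bool:
    """Stress fires if the predictor (a) foresees harm, (b) keeps being wrong,
    (c) faces more complexity per frame than it can compute, or (d) never
    hears back about its guesses."""
    return (predicts_harm
            or prediction_error > error_tolerance
            or situational_complexity > compute_per_frame
            or not feedback_received)
```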
This kind of psychological overwhelm is similar to what the newbie at work experiences: not yet practised at a skill, they have to spend more cognitive resource solving things and so get overwhelmed. It's the same issue that plagues the hypersensitive or the autistic - there's just too much information to make sense of before the next batch hits.
So how does the brain deal with this issue? Many are quick to assume stress is just bad, annoying, and without function; I think making that claim about an evolved trait is generally quite foolish. What purpose does stress serve, then, neurochemically speaking?
I think one plausible answer is that norepinephrine (NE) has a computational function (or that some other compound I'm not aware of acts in the manner I'm about to describe). The human body is optimized for resource usage within its evolutionary environment. The vast majority of the time, the system can run at a low baseline; in times where it needs to move a little faster, it will compute within its ordinary bounds. I would suggest there's probably some Resource Attribution Policy (RAP), similar to the Kubernetes scheduler for example, that is responsible for doling out compute resource. The RAP probably varies on temperamental and genetic bases. My thinking is that NE might be the 'Compute Chemical' (CC): oodles of it get distributed in high-intensity moments and in flow. Similar for epinephrine. These CCs are either a signal to, or a direct product of, the RAP mechanism, and allow for faster, more distributed, or longer-duration compute. That is to say, the upper limit of tractable complexity class is temporarily elevated. This is highly functional and resolves our third factor in the stress conditions above.
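Here's a toy sketch of what such a policy might look like; the class name and the baseline, gain, and ceiling numbers are all illustrative placeholders of mine, not physiological parameters:

```python
from dataclasses import dataclass

@dataclass
class ResourceAttributionPolicy:
    """Toy RAP: hands out per-frame compute as a function of a circulating
    'compute chemical' level."""
    baseline_budget: float = 1.0   # compute units per frame at rest
    ne_gain: float = 4.0           # how strongly NE scales the budget
    ceiling: float = 6.0           # burnout limit: can't run here indefinitely

    def budget(self, ne_level: float) -> float:
        """ne_level in [0, 1]: 0 = calm baseline, 1 = full burst."""
        return min(self.ceiling, self.baseline_budget + self.ne_gain * ne_level)

    def can_solve(self, task_complexity: float, ne_level: float) -> bool:
        # The upper limit of tractable complexity rises with NE,
        # resolving factor (c) in the stress conditions above.
        return task_complexity <= self.budget(ne_level)

rap = ResourceAttributionPolicy()
print(rap.can_solve(task_complexity=3.0, ne_level=0.0))  # False: overwhelmed at baseline
print(rap.can_solve(task_complexity=3.0, ne_level=0.7))  # True: the NE burst buys headroom
```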
Of course, we cannot constantly be in this elevated state or the machinery overworks and burns itself out. In this model, stress is functional for short-term use but causes both temporary and lasting systemic damage, whether through residual compounds in the system or degradation of the neurons themselves.
A few additional tidbits that I feel justify this thinking:
Firstly, we can consider the RAPs of other species through flicker fusion thresholds (FFT). Some animals can see at a faster frame rate than others. This almost certainly has computational impacts and has almost certainly evolved to be optimal for predator/prey and stalker/ambush dynamics, hence we tend to see a trade-off between lifespan and FFT. One interesting and pertinent counterexample is amphibians, whose body temperature is variable and whose FFT varies in accordance with that temperature - thus reducing the wear and tear of compute.
Secondly, we would expect (much like in relativistic physics) speed to determine chronometry, i.e. the passing of time would seem slower compared to an external frame if compute per second increased. Well, when epinephrine and norepinephrine flow, typically in moments of shock or sudden importance, we get a time-slowing-down effect. I would wager more compute is being dedicated to guarantee the prediction rate.
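The arithmetic behind that expectation, as a toy sketch with made-up frame rates:

```python
def felt_duration(external_seconds: float,
                  baseline_fps: float,
                  boosted_fps: float) -> float:
    """Toy chronometry: if felt time tracks frames processed rather than the
    wall clock, a CC-driven jump in frame rate stretches the same external interval."""
    frames_processed = boosted_fps * external_seconds
    return frames_processed / baseline_fps  # how long that many frames would 'normally' feel

# A 2-second skid processed at 3x the usual frame rate feels ~6 seconds long.
print(felt_duration(external_seconds=2.0, baseline_fps=30.0, boosted_fps=90.0))  # 6.0
```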
Thirdly, I think there is a plausible neurological pathway for this: the Locus Coeruleus (LC)-Norepinephrine System (LCNS). We know that when presented with novel, stressful, or salient stimuli, we get bursts of LC recruitment. These bursts cause releases of NE throughout the cortex, leading to some interesting results: 1. NE and glutamate mutually enhance key pathways to increase neuronal speed and efficiency, i.e. high-priority functions are given enhanced and dedicated compute. This is called adaptive gain. 2. We see cortical desynchronization, i.e. more distributed and parallelized compute. 3. Both the LC and NE are known to mediate circadian rhythm and time perception. Entertainingly, the LC is thought to be, in and of itself, quite energy intensive.
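Adaptive gain is usually modelled as a gain term on a unit's activation function; here's a minimal sketch of that idea, with the NE-to-gain mapping purely illustrative:

```python
import math

def unit_response(net_input: float, gain: float) -> float:
    """Gain-modulated sigmoid: higher gain (a stand-in for NE release)
    sharpens the response curve, amplifying strong, task-relevant inputs
    while flattening weak ones."""
    return 1.0 / (1.0 + math.exp(-gain * net_input))

weak, strong = 0.2, 1.5
for gain in (0.5, 2.0, 5.0):   # rising NE ~ rising gain (an illustrative mapping)
    print(f"gain={gain}: weak input -> {unit_response(weak, gain):.2f}, "
          f"strong input -> {unit_response(strong, gain):.2f}")
```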
I've focused on a particular aspect here, primarily because I tend to find that if I try to write about everything, nothing ever gets published. But it's worth throwing out some other aspects that I think are related but haven't thought about as much: 1. There will of course be other neurotransmitters at play for different aspects of the prediction mechanism: dopamine, I suppose, would be involved in reward for correct prediction, hence the depressant effects of instability. Likewise serotonin, which I've read mediates confidence and calm, and whose disruption would thus track anxiety and instability. 2. Glucose level will likely play a role, being something of the fundamental energy source for a hard-working brain. I'd be curious to see how the RAP changes with available glucose supply. 3. I've focused on acute stress. My suspicion is that, much like NE, the chemistry of chronic stress isn't a byproduct or an unhealthy thing in itself; it is itself a CC distributed over longer timescales, which thus portends the usual harms of long-term stress. 4. High-energy systems tend to be more prone to errors, so a follow-up question might be "What happens when this system breaks?". I think it might be reasonable to assume that this could be part of the source of a psychic analogue of the Clean Homes Hypothesis - a lack of training of the predictive mechanism on real stressors creates a twitchy stress response.
Experimentally, how could one validate this? Well, I'm not a medic so I can't speak to the safety of it, but a pre/post-NE-injection experimental design that measures FFT, or some objective metric of the ability to solve problems of particular complexity classes, could be interesting. One could run a similar experiment with blood glucose level. I think there are some other slightly Milgram-esque methods involving electric shocks and complex puzzles and seeing how long someone lasts, but I doubt those sorts of experiments are considered ethical anymore!
I can see a few application areas for the idea. Clinically, one might identify disorders related to abnormal recruitment of the RAP mechanism (possibly the LC), or develop coaching treatments focused on the management and reduction of complexity in the environment. That final point also brings utility to design theory more generally: how do we objectively design the least complex visual and ergonomic systems, so as to reduce the stress of their users?
