I bumped today into a 2017 article published on the US Naval Institute website entitled "Hyperwar". The term refers to #war waged with #AI, including all the things you would see in a 2026 Anduril video: swarms of drones and intelligent helmets, as well as less photogenic military tools such as LLMs.

It goes without saying that the article is very pro-AI. What is interesting is its initial justification for the use of AI in war. It invokes "System 1", a well-known trope in dual-process theories of cognition, in connection with Daniel Kahneman's popular science book "Thinking, Fast and Slow". System 1, a.k.a. "fast thinking", is the kind of automatic, instinctive (and mostly dumb) thinking that characterises a good deal of human cognition. The hyperwar article argues that humans get tired and, when tired, 'revert' to System 1 rather than using their more evolved System 2: the system of slow, logical thinking that supports well thought-out decisions. Since AI never gets tired, the article claims, it will not suffer from the same issues.

Now, I have bad news all around. First, System 1 is actually our default mode of cognition -- we are thoughtless most of the time because it is easier and faster for our brains. System 2 requires more energy and slows us down, with the upshot that it delivers more rationality and better thought-out decisions. So we don't 'revert' to System 1: we have to make the effort of moving to System 2, and that effort is naturally harder when we are tired, stressed or under time pressure. Following the hyperwar line of argumentation, this might even count in favour of systems that are not subject to such biological constraints. Except that... /1