Stand and the Liver
Pre-flight chickens, post-mortem entrails, the ancient art of keeping large systems moving, and why modern AI research feels familiar
There were a few Roman augurs who would look at the weather and say “not a good day to start an invasion” based on sensible things like seeing lightning on the pass that they would have to march over, or looking at birds getting blown by the wind in the opposite direction to where the navy was going to have to sail. There was probably also the occasional haruspex who looked at the insides of a bird sacrifice and said “that bird is not healthy at all, let’s just feast on some grain offerings at the temple for a while, OK?”
Likewise, there are genuine people doing real machine learning / AI research, and people who genuinely think about what’s going wrong in AI and how to direct the future. This post is not about such people.
How forecasting, prediction and decision making worked in Ancient Rome
Before you were about to do something, you looked up in the sky, and you looked at whatever the birds were doing. Well, of course, you didn’t do that, you had people whose job it was. That was the job of an augur — and we still have that in language like “it doesn’t augur well”.
If the thing you were doing didn’t survive contact with reality very well, you then got a specialist to look at the entrails of a sacrifice to see what it said about which god had been offended. That was the job of the haruspex.
A typical augur or typical haruspex just followed some tradition that had been handed down to them, with no process that we would recognise as being scientific. There wasn’t a sense that they needed to confirm whether it genuinely was Demeter that had been offended. You just declared it Demeter, and off everyone marched to the relevant temple. If the crops still failed, you found another angry god to appease.
Truth — in the way we think about it — wasn’t really a requirement, nor particularly relevant. Roman divination was about following the proper rituals as a social coordination mechanism.
Nobody wanted to say “no” out loud too much, particularly to Caesar. It’s much easier to do what Bibulus did. Caesar tries to legislate his agricultural land reforms; his co-consul Bibulus starts looking into the sky and issuing announcements of bad omens to make the proceedings technically invalid.
Augury
You didn’t study to be an augur; you cultivated political trust and connections, and when an augur’s place became available, if you had been safe, loyal and useful, you might get co-opted into it.
You also ended up in charge of a portable chicken coop, which you would take with you if you went out to battle, in case there were no convenient eagles or thunderstorms on the day. This is true, but I mention it mostly because I really wanted to put a line here about how you wouldn’t start any procedure that wasn’t passing its pre-flight chicks.
Once you had this respectable role, you had to follow proper process afterwards. Cicero tells a story about Tiberius Gracchus (an augur himself) who violated augural procedure — he crossed the pomerium before finishing the auspices — and then ran the election anyway. It was a big deal. Lots of angry words were said about what to us seems like the most trivial of activity ordering changes.
Generally, you were called on to support whatever dimwitted idea the consuls wanted, and about the best you could do was to delay the dumbest ideas by a few days and hope that everyone returned to their senses.
This whole process of getting the job and then what you did in it is of course nothing like the equivalent of a full-time academic ML research role.
Haruspex
A haruspex was different. You did study to be a haruspex, but it helped if you were Etruscan; it was generally an immigrant’s job. You had to train via a long process of making your way up through different levels. Here’s a sample study guide: the Liver of Piacenza, a convenient map showing which part of the liver corresponds to which displeased Etruscan god.
(I am hoping it was kept in a black box. Otherwise it would have been an explainable model.)
If you were a working haruspex, you got called in when everything went badly, and then you had to do the grunt work of identifying what happened and figuring out how to fix it, even though you had nothing to do with the original situation. When you did give your advice, there was every chance that it would be taken completely out of context and used for political ends. Cicero took the warnings from the haruspices about impiety, polluted rites, desecrated sacred places and the earthquake at Potentia, and blamed it all on Clodius.
As a haruspex sometimes you got paid. Sometimes you just got some prestige from a patron, which if you were lucky you got to cash in as a kind of equity later if the patron became extremely successful.
It was standard military administration that you had to have a haruspex. The centurion probably never called on you to do anything, because, well, it’s the Roman military and they were extremely good at not having disasters that you would need a haruspex to review. In any case, there were only a handful of gods that a military haruspex would deal with: Jupiter (the big one that dominated everything), Mars (the aggressive one that you sort of had to deal with), Victoria (the god you really liked) and maybe some local models. Nonetheless, you couldn’t launch your military startup without a haruspex on the payroll.
Which is nothing like today’s ML engineers. Today you study for years learning how to train models. Maybe your internship after you’ve moved to San Francisco turns out just to be writing a program that calls out to the big three LLM API vendors and a few local models and never involves any of the skills you learned, but that’s OK, you get a better job afterwards. In your better, serious job, you do real stuff. When you see a benchmark go down or a failure to generalise, you wouldn’t just declare that the optimizer is angry or that the inductive bias displeased the gods, or say that we should add a regulariser, tweak a loss, and resubmit. No, you would have a clear-eyed understanding of how the system worked. It’s much easier now, because there’s no supernatural super-intelligence that you need to align.
The point
I like this analogy, probably because I’ve been spending too much time writing programs to process classical texts with AI lately.
Roman soothsaying served a purpose. That purpose wasn’t “generating new knowledge” or “validating good forward-looking plans”. You didn’t have to believe in it. Cicero quotes Cato saying he’s amazed one soothsayer doesn’t laugh when he sees another. Rome didn’t need soothsayers, it was just convenient to have them around so that there was a process that could be followed that let you coordinate with your allies, return favours and keep politics generally peaceful. The alternatives were brutal (as Brutus himself would have agreed).
A lot of people are writing about how the incredible number of machine generated scientific papers is overwhelming the big machine learning / AI conferences: NeurIPS, ICML, ICLR, etc. There is a mix of:
Wonderment: AI is taking over all science and is doing a good job. This is an obvious extrapolation from the data.
Disillusionment: AI is taking over all science, and it is doing a terrible job. Reviewing for NeurIPS is a game of wading through AI-generated slop.
Panic: AI is taking over all science, and it is changing our jobs. How do we keep the institutions we knew and loved?
Yes, we’re rushing through a phase transition in white-collar labour, so of course it is going to be a rough time for the computer scientists and scientific researchers. They just got hit like the translators did in 2024. (I’m predicting accountants will be next.)
But I think that’s missing the point. The big AI conferences were only peripherally about research and science. From the outside, ICLR, NeurIPS, etc. really look like augur / haruspex conventions. It’s about maintaining a shared fiction that lets thousands of people act as if what they are doing is bringing about progress while the Bitter Lesson plays out.