Winter Wargaming tomorrow. Don’t forget to check back and vote on the direction for our fictional France—I typically play over the weekend.
I have to write the 2019 audience report at some point. The very short version is: slightly reduced traffic relative to 2018, but counting Discord, massively increased audience engagement. To our regulars, both the commentariat and the lurkers, we’re happy to have you around.
Books
- Me: Castles of Steel, by Robert Massie. I’m reminded of an era when the government owned the design of its warships.
- Parvusimperator: Samurai!, by Saburo Sakai.
Defense
- Soleimani bites it thanks to good old American Hellfire, Iran retaliates, US perhaps to retaliate back? – I don’t really need to link anything for this, do I?
- Navy Act restricts LCS fleet to 35 hulls
- 8 F-15EXes for the USAF in 2020 – That’s 176 AMRAAMs, or the internal loadout of 29⅓ F-35A/Cs, or 44 F-35Bs, counting 22 missiles per F-15EX against six internal per F-35A/C and four per F-35B; the arithmetic is sketched after this list.
- The Diplomat on the Chinese influence campaign in the US – I still belong to the ‘Overheard at …’ Facebook group, and it’s been utterly fascinating watching the pro-China Chinese students square off against the anti-China Westerners. For one thing, the Chinese are using the language of the social justice movement, which is a masterstroke—very few of the Western students have any idea how to argue if that rug is pulled out from beneath them.
- Iran has better missiles than it used to – In particular, these ones can actually hit point targets.
- Coast Guard to face narco-sub epidemic in 2020? – Granted that most of them are semi-submersibles rather than full-on submarines, but still. In related news, have you heard about the transatlantic narco-submarine trade? I hadn’t.
- On the next world war – The article misses the fact that latent industrial capacity is not really enough to make modern military equipment. You could convert a 1943 Ford factory to make B-24s. It would be much, much harder to convert a 2020 Ford factory to make F-35s.
- Back to H. I. Sutton: how will the balance of power in world submarine fleets change in the 2020s?
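A quick back-of-the-envelope for the F-15EX item above. The loadout figures are my assumptions rather than anything official: 22 AMRAAMs per F-15EX in the maximum air-to-air fit, six internal per F-35A/C (counting the planned Sidekick rack), and four internal per F-35B.

```python
# Back-of-the-envelope AMRAAM arithmetic for the F-15EX item above.
# Loadout assumptions (mine, not official): 22 AMRAAMs per F-15EX,
# 6 internal per F-35A/C (with the Sidekick rack), 4 per F-35B.
from fractions import Fraction

F15EX_LOAD = 22
F35AC_LOAD = 6
F35B_LOAD = 4

total = 8 * F15EX_LOAD              # 176 missiles across eight F-15EXes
print(total)                        # 176
print(Fraction(total, F35AC_LOAD))  # 88/3, i.e. 29 1/3 F-35A/Cs' worth
print(Fraction(total, F35B_LOAD))   # 44 F-35Bs' worth
```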
Science and Technology
- A post from 2005: why your company isn’t hiring the best N% of employees in its field – A sampling-bias story, as I read it; there’s a toy simulation below.
- SlateStarCodex reports on GPT-2 playing chess – GPT-2, you may recall, is the text-generation AI that took the world by storm last year. It doesn’t play chess very well, but the fact that it can play chess at all by generating plausible continuations of a PGN game record is pretty wild¹. A sketch of the trick also appears below.
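On the hiring post: the core argument, as I read it, is sampling bias. The best people get hired quickly and stay hired, so they barely appear in anyone’s applicant pile. A toy simulation, with every parameter invented purely for illustration:

```python
# Toy model of the hiring-pool argument: strong candidates land jobs in
# few applications, so applicant pools skew weak. All numbers invented.
import random

random.seed(0)
workers = [random.uniform(0.05, 1.0) for _ in range(10_000)]  # skill levels

applications = []
for skill in workers:
    # The chance of an offer per application equals skill, so each worker
    # files a geometric number of applications before being hired.
    apps = 1
    while random.random() > skill:
        apps += 1
    applications.extend([skill] * apps)

print(f"mean skill of the workforce:      {sum(workers) / len(workers):.3f}")
print(f"mean skill of the applicant pool: {sum(applications) / len(applications):.3f}")
```

Every worker appears in both populations, but the applicant pool averages well below the workforce, which is the post’s point.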
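As for the chess trick itself: the model never sees a board, only the text of the game so far, and it emits whatever text plausibly comes next. A minimal sketch of that loop, with a random-move function standing in for GPT-2 (generate() here is my stand-in, not anything from the actual experiment; python-chess supplies the rules):

```python
# Sketch of 'chess as text continuation'. generate() is a stand-in for a
# language model that continues a text prompt; here it just picks a random
# legal move, where the real experiment sampled from GPT-2.
import random
import chess  # pip install python-chess

def generate(prompt: str, board: chess.Board) -> str:
    """Stand-in for GPT-2: produce the next chunk of 'game text'."""
    return board.san(random.choice(list(board.legal_moves)))

board = chess.Board()
moves = []
while not board.is_game_over() and len(moves) < 40:
    prompt = " ".join(moves)       # the game so far, as plain text
    san = generate(prompt, board)  # the model 'continues' the text
    board.push_san(san)            # push the move to keep the game legal
    moves.append(san)

print(" ".join(moves))
```

The real model has no board to lean on, of course; the legality check here just keeps the toy loop honest.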
Games
- The former CEO of Nissan-Renault-Mitsubishi flees house arrest in Japan for his native Lebanon – Oh, sorry for the misfiling. I thought this was about a new Shadowrun sourcebook.
- OpenTafl, my hnefatafl game engine, is getting some updates – A new piece type, some bugfixes, perhaps a new flavor of AI.
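On the ‘new flavor of AI’ front, for readers who haven’t met game AI of this type: OpenTafl’s searcher belongs to the classical alpha-beta family. Here’s the technique in miniature, demonstrated on Nim (take one to three stones; taking the last stone wins) so the example is self-contained. This is an illustration of the general idea, not OpenTafl’s actual code, which is Java and considerably more elaborate.

```python
# Negamax with alpha-beta pruning, the skeleton of a classical game AI,
# shown on Nim so it runs standalone. Not OpenTafl's actual code.
def negamax(stones, alpha, beta):
    """Score the position for the side to move: +1 is a win, -1 a loss."""
    if stones == 0:
        return -1  # the previous player took the last stone and won
    best = -1
    for take in (1, 2, 3):
        if take > stones:
            break
        score = -negamax(stones - take, -beta, -alpha)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:
            break  # prune: the opponent will never allow this line
    return best

# Nim theory says multiples of four are losses for the side to move.
for n in range(1, 9):
    print(n, "win" if negamax(n, -2, 2) == 1 else "loss")
```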
Grab Bag
- In 1974-1975, bushfires burned across 254 million acres of Australia – This year so far? 26 million acres.
- Cognitive bias in California’s ill-advised anti-Uber bill
- How Phineas and Ferb did the impossible – The impossible being ‘building a successful show where the main characters have no real personal arc or serious conflict’, not ‘building crazy inventions in a day’.
¹ That said, I don’t find credible the claims that GPT-2 is a revolutionary advancement, for a variety of reasons that I should probably explain in an article. Here’s one thought, though, from the SSC comments: “The scary version of [artificial general intelligence] is supposed to learn and improve faster and faster, but GPT-2 is the opposite, because the more things it knows in a domain, the larger the chunk of data it needs to learn the next thing. And the more complex a domain is, the worse this problem gets.” Granted, yes, but that’s also not how humans learn. We’re more like an S-curve, I would say—we start out slow as we familiarize ourselves with the vocabulary of a field, go into a zoom climb where the limit on our learning is our pace of reading, rather than our pace of deep comprehension, then level off as we near our peak achievement.
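Since I’ve put an S-curve at the center of that story, here’s the toy math: a logistic curve has exactly the shape I mean, slow at the start, steepest in the middle, flat near the ceiling. The parameters below are arbitrary, chosen only to make the shape visible.

```python
# A logistic curve as a toy model of the learning S-curve: slow start,
# zoom climb in the middle, plateau near peak achievement.
# Parameters are arbitrary, for illustration only.
import math

def mastery(t, peak=1.0, rate=1.0, midpoint=5.0):
    """Fraction of peak achievement after t units of study."""
    return peak / (1.0 + math.exp(-rate * (t - midpoint)))

for t in range(0, 11):
    bar = "#" * round(40 * mastery(t))
    print(f"t={t:2d}  {mastery(t):.2f}  {bar}")
```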