Wednesday What We’re Reading (Jan. 8, 2020)

Winter Wargaming tomorrow. Don’t forget to check back and vote on the direction for our fictional France—I typically play over the weekend.

I have to write the 2019 audience report at some point. The very short version is: slightly reduced traffic over 2019, but counting Discord, massively increased audience engagement. To our regulars, both the commentariat and the lurkers, we’re happy to have you around.

Books

  • Me: Castles of Steel, by Robert Massie. I’m reminded of an era when the government owned the design of its warships.
  • Parvusimperator: Samurai!, by Saburo Sakai.

Defense

Science and Technology

Games

Grab Bag


  1. That said, I don’t find the claims that GPT-2 is a revolutionary advance credible, for a variety of reasons I should probably explain in an article. Here’s one thought, though, from the SSC comments: “The scary version of [artificial general intelligence] is supposed to learn and improve faster and faster, but GPT-2 is the opposite, because the more things it knows in a domain, the larger the chunk of data it needs to learn the next thing. And the more complex a domain is, the worse this problem gets.” Granted, yes, but that’s also not how humans learn. Our learning follows more of an S-curve, I would say: we start out slow as we familiarize ourselves with the vocabulary of a field, go into a zoom climb where the limit on our learning is our pace of reading rather than our pace of deep comprehension, then level off as we near our peak achievement.
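
For concreteness, here is a toy sketch of the two trajectories that footnote contrasts. The function names, parameters, and curve shapes are my own illustration, not anything measured from GPT-2 or from the post: a log-like diminishing-returns curve stands in for a learner whose every new fact costs more data, and a logistic S-curve stands in for the human trajectory described above (slow start, zoom climb, plateau near peak achievement).

```python
# Illustrative only: two stylized learning curves, with made-up units.
import math

def diminishing_returns(data_seen: float) -> float:
    """Knowledge grows ever more slowly as data accumulates (log-like)."""
    return math.log1p(data_seen)

def s_curve(time_studying: float, midpoint: float = 5.0, steepness: float = 1.0) -> float:
    """Logistic curve: slow start, rapid middle, level-off near the ceiling."""
    return 1.0 / (1.0 + math.exp(-steepness * (time_studying - midpoint)))

if __name__ == "__main__":
    # Compare the shapes over an arbitrary span of effort/data.
    for t in range(0, 11):
        print(f"t={t:2d}  machine={diminishing_returns(t):5.2f}  human={s_curve(t):4.2f}")
```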
