Confessions of an Ivy League Dumbass
Not to brag, but I went to Yale with Amanda Claybaugh, Harvard University’s dean of undergraduate education and an English professor. Seeing her quoted in a recent New Yorker article about the decline in college humanities majors got me thinking about my misspent youth. Amanda said, “The last time I taught The Scarlet Letter, I discovered that my students were really struggling to understand the sentences as sentences — like, having trouble identifying the subject and the verb. Their capacities are different, and the nineteenth century is a long time ago.”
I can hear a bit of Amanda’s wry humor in that last sentence. We both studied literature, but she was a far more rigorous reader and critic than I was, although I didn’t realize it at the time. I was too busy trying to ape the gnomic, arch style of the “literary theorists” then in fashion to worry about tedious fundamentals.
I could identify a sentence’s subject and verb, so in that sense I’m better prepared than some of today’s students. But what would a Yale undergrad of 100 years ago have thought of my utter ignorance of Latin and Greek, to name just two of my academic deficiencies? Maybe the ability to read 19th-century English prose, like the ability to read Homer in the original, has become one of those skills we simply no longer value outside of certain specialist communities. Maybe the same fate awaits reading and studying literature in general.
Me, I love the humanities, but I’ve learned much more as an Internet-empowered autodidact than I ever did at school. That says more about me than about my teachers, however. Still, I imagine a Harvard degree is worth it; the name opens doors and attracts the kind of rich and socially prominent people it’s useful to befriend while young. But even there I’m not sure I’d indulge in something as impractical as a “Comparative Literature” major in this day and age. Statistics is a better bet, if only because data (which, when properly manipulated, can evoke all the emotion of a Shakespeare sonnet) is fast replacing English as the lingua franca of our educated elite.

Provisions
The Ivy League used to have significant influence on our idea of what an American “gentleman” should be. That was long ago, and today the very notion of masculinity is up for grabs. Ben Davis is having his say via the men’s grooming industry: first with his Gents Place barbershop franchise, and now with his new line of skin and haircare products, Rascal.
Davis’s commitment to American entrepreneurship, service, and good, old-fashioned fun makes him just the kind of businessman Align exists to promote. Check him out in conversation with New Founding’s Bart Lomont in the second episode of the Align podcast. And use this link to purchase fine Rascal products.

Training
Back in my days of pretending I understood Jacques Derrida, I would’ve regarded an accounting course as unforgivably vocational. In truth, given my leisurely study habits at the time, I probably would’ve flunked. But then double-entry bookkeeping, which records every transaction twice, as a debit in one account and a matching credit in another, is notoriously difficult to master, even for motivated students.
According to Stephen Elderkin, a computer industry veteran based in Bend, Oregon, it doesn’t have to be this way. Inspired by a 14th-century Italian treatise, his Addictive Accounting app teaches a common-sense approach via a simulator designed to be used in short spurts while on the go. As Elderkin says, “If you take a bit of time here and there to read the material and do the practice problems, your skills as a business owner will dramatically improve!”

Culture
The British royal family is another standard-setting institution that continues to lose credibility. Perhaps the most prudent course when it comes to the degrading spectacle of the Duke of Sussex clamoring to become a mere influencer is to ignore it altogether. But if you’d like to understand why Meghan and Harry won’t go away — and what that has to do with our reigning cancel culture — Align’s Helen Roy offers her usual astute analysis in her latest dispatch.

Exemplar
Artificial intelligence pioneer John McCarthy was well acquainted with how quickly we take new technology for granted. “As soon as it works, no one calls it AI any more,” he once remarked. And while the splashy debuts of DALL-E, ChatGPT, and other OpenAI products seem to have changed everything overnight, the truth is that machines have been “thinking” for us since the dawn of the Internet. It was McCarthy who coined the term “artificial intelligence,” and without his many innovations (including the creation of the LISP programming language and the introduction of mainframe timesharing) the digital world as we know it would not exist.
McCarthy himself was skeptical of AI’s capacity to surpass us; he preferred to bet on mankind. And the atmosphere he cultivated as head of Stanford’s Artificial Intelligence Laboratory reveals a restless, wide-ranging creativity still well beyond our programming capabilities:
“Email, which hardly anyone outside of the community had heard about, was already the normal way of communicating…the Internet was taken for granted; everyone was using graphical displays and full-screen user interfaces; outside, robots were playing volley-ball (not very successfully, it must be said); the vending machines took no coins, but you entered your login name and received a bill at the end of the month, a setup which never failed to astonish visitors.”
McCarthy died in 2011 at 84. It’s a shame we no longer have him to unpack the recent AI hype; by all accounts he was also an excellent, patient teacher. Fortunately, Return has assembled a formidable panel of AI experts to take on the challenge. When will AI match or exceed most humans in every learning, reasoning, or intellectual domain — and what impact will this have on society? Anyone who cares about the answer will find this essential reading.
