Deep Learning is UnAmerican

Please stop.

The fundamental principle of Deep Learning is that truth, maybe even Truth, lies in a big puddle of unparsed data, and that a sufficiently speedy computer, zipping through and turning that data into other data composed of relationships, will eventuate in some reflection of this t/Truth.

Seems to make enough sense on the face of it, but nobody has asked the question: Can human-sized truths be extracted from planet-sized datasets? Or: What is the relationship between the dataset and the individual?

In the currently fashionable model, the individual does not exist except as, at best, a cluster of datapoints, and then only in and as a relationship. All well and good, maybe even quite metaphysically Buddhist, but since when did we have Buddhist metaphysician computers influencing public thinking? Let’s deabstractify it, dear reader: Do you feel as if you exist purely as relation? Because this is what you are being asked to accept when you accept Deep Learning as a paradigm.
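
If you want to see how literal that claim is, here is a minimal sketch. The “engagement” counts and feature names are invented for illustration and come from no real platform; the point is only that, once you are a row of numbers, the question “who are you?” gets answered purely by your relations to the other rows.

```python
import numpy as np

# Invented features: [dog pics liked, car pics liked, political rants shared]
people = {
    "you":      np.array([42.0, 3.0, 1.0]),
    "neighbor": np.array([40.0, 5.0, 0.0]),
    "stranger": np.array([1.0, 2.0, 37.0]),
}

def cosine(a, b):
    # A similarity that only exists in relation to another vector.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "You" are defined relationally: whoever you score closest to.
for name, vec in people.items():
    if name != "you":
        print(f"you vs {name}: {cosine(people['you'], vec):.3f}")
```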

Of course it’s quite true that words, for example, seem to have a similar property. We define them using one another, and the first time a 15-year-old kid smokes a spliff and realizes that words are, like, empty, man, it can be a powerful experience. But we Americans are not Zen Buddhists (cf. Koestler, The Lotus and the Robot), nor ought we be.
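
For the technically inclined, the same relational trick is the seed of modern word embeddings. Here is a toy co-occurrence sketch (the corpus and window size are made up, and this crude counting is only an ancestor of any real embedding model) in which a word has no definition beyond the company it keeps.

```python
from collections import Counter

# Invented toy corpus; a real model would chew through billions of words.
corpus = "the dog chased the cat and the cat chased the mouse".split()
window = 2  # how many neighbors on each side count as "context"

cooc = Counter()
for i, word in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            cooc[(word, corpus[j])] += 1

# "cat" is defined by nothing but the words it co-occurs with.
print({context: n for (w, context), n in cooc.items() if w == "cat"})
```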

Let’s say it’s all true: that you are only useful as a datapoint is absolutely true in the same sense that you are a meaningless, hairless monkey living a vain life of inanities. What does it feel like for a meaningless, hairless, vain, inane monkey to exist only as a relation to other parseable, brute, data-borne objects?

Well, it does not feel good, and it sure as hell does not feel American.

Before you accuse me of patriotic jingoism, let me say: Guilty as charged. I am one of those insufferables who believe that America, as an institution built on the bedrock of life, liberty, and the pursuit of happiness, is unique in world history, and that her liberational gifts to the world, even though they are in no way reflected by the shit-show of modern politics (created in part by exactly what this article is talking about, mind you), are still of incredible value: the individual is the atomic entity that creates reality. Sometimes it spits out Neil Armstrong, and sometimes it spits out GG Allin, but what it does not do is spit out clones.

The transition to humans as easily aggregatable dataset clones is already fully underway, and it can be seen throughout the “social” media spheres, where influencers, Instagrammers, Tweeters, and many more besides have become avatars of Pure Data. By this I mean that it is vitally incumbent upon all successful influencers (et al.) to stick to brand. A guy who has established himself as the purveyor of pictures of cute dogs MUST NECESSARILY maintain exactly and only that image as he presents himself to the world, and if he were to tweet a picture of a classic car, his followers would just be confused.

The flattening of the human is thus twofold: it constrains the content-producer to on-brand messaging and turns consumers into thirsters-after-flattened-individuals. Nobody wins.

Until we wrest control from the Data Scientists and their techbro admirers, life is going to keep getting flatter: Facebook, Google, Amazon, and the rest of the great hammers that keep smashing America into an unrecognizably flat and lifeless pancake of likes and viewers-who-viewed-thises and upvotes and all the other binary, colorless, insipid reflections of ugly Dataset-Truths will continue to extract wealth and life and data from the masses and reify them into ever stronger, ever more merciless versions of the panopticon.

Merry Christmas.