Good Taste Cannot Be Encoded

On the Media is a superb program, one of the very few whose analysis is consistently constructive, pithy and correct, in the sense that it draws the right conclusion from all of the available facts. The interview linked below is one of the rare instances where OTM gets close to the heart of the matter but stops short of stating or examining it:

Algorithms do not have taste. Algorithms cannot mimic taste. No guise of algorithm, whether AI, machine learning, or anything else, will ever serve as the purveyor of good taste.

Take, for example, KCRW.com. KCRW's rise to become the most powerful radio station in the world was founded on good taste. Since its reinvention in the late 1970s, its mission has been the opposite of commercial radio's. As commercial radio consolidated and converged on a seemingly singular playlist of "hits", legendary KCRW DJs like Tom Schnabel and Chris Douridas scoured the planet for music that was as rich in cultural expression as it was pleasant to hear. (This was not limited to highly refined acts. During Spaceland's heyday in the mid-1990s, Douridas obtained rough cuts of Beck's first album and played them on the air.) In the same way that the purveyor of good taste provides the chef with ingredients representing the best of the cumulative science and art of agriculture, the calling of a true DJ is to find and promote music that captures the best of the craft, culture and art of musical composition and performance. This essence of good taste cannot be encoded by algorithms, least of all by what is now termed artificial intelligence (AI).

The widespread lamentation that Kyle Chayka documents in Filterworld (link below) is captured wholly in this single word: taste. The only thing algorithms can capture is the shared ignorance of the masses, which they then impose upon the individual. This is the complete opposite of what Schnabel and Douridas do: they counter the ignorance of the masses by exposing individuals to new and foreign expressions that place the listener's life in a different context, creating a momentary awareness of the different ways people exist and think in the world.

AI can only guess the user's taste by comparing its analysis of the user's music to that of the cohort in which it places the user. Thus, the cohort never really grows, because AI merely amplifies and imposes the cohort's predispositions back on the cohort ad infinitum. AI has no awareness of what the music conveys or whether the user wants his or her boundaries expanded. It seeks only to keep the user "engaged", glued to the device to take in more of whatever the algorithm can serve up. Schnabel and Douridas had fixed two-hour sets: a limited window every day within which to argue to their listeners that there are more expressions of beauty outside their limited sphere of cultural awareness. Algorithms work 24 hours a day to keep the user confined to the cohort in which the algorithm has placed the user.
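To make the mechanism concrete, here is a minimal sketch in Python of the cohort logic described above. The listener data and names are entirely made up, and this is an illustration of the idea rather than any actual service's recommender. Notice that the only tracks it can ever surface are tracks the user's cohort already plays:

# Illustrative sketch only: a naive cohort-based recommender.
# It can recommend nothing outside the cohort's shared listening history.

from collections import Counter

# Hypothetical listening histories: user -> set of track titles.
histories = {
    "ana":   {"track_a", "track_b", "track_c"},
    "ben":   {"track_a", "track_b", "track_d"},
    "carla": {"track_b", "track_c", "track_d"},
    "dev":   {"track_x", "track_y"},          # a different cohort entirely
}

def assign_cohort(user, histories, min_overlap=2):
    """Place the user in a cohort: everyone whose history overlaps enough."""
    target = histories[user]
    return [
        other for other, tracks in histories.items()
        if other != user and len(target & tracks) >= min_overlap
    ]

def recommend(user, histories, k=3):
    """Recommend the cohort's most-played tracks the user hasn't heard.

    Note what is missing: nothing here models what the music conveys,
    only what the cohort already listens to.
    """
    cohort = assign_cohort(user, histories)
    counts = Counter(
        track
        for member in cohort
        for track in histories[member]
        if track not in histories[user]
    )
    return [track for track, _ in counts.most_common(k)]

if __name__ == "__main__":
    # "ana" is only ever offered what "ben" and "carla" already play;
    # "track_x" and "track_y" can never reach her.
    print(recommend("ana", histories))   # -> ['track_d']

However elaborate the real systems are, the shape of the loop is the same: the cohort defines what can be recommended, and the recommendations reinforce the cohort.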

Thus, fundamentally, KCRW fulfills a value proposition and AI does not. KCRW's value proposition is to grow rich in awareness and knowledge through new music. The AI model offers no value proposition to the user, because it promises only stagnation in exchange for the user's time. AI's value proposition accrues solely to the corporation that deploys it on the corpus of a captive audience.

Clearly, taste is not a simple matter. It is the fulcrum on which the value proposition in a bargain hinges, and it is not limited to the music domain. In any financial transaction, there is no bargain if one is not aware of the value of the purchase. AI cocoons users in ignorance and thus convinces them that the dross presented to them has value. AI-mediated transactions go far beyond a bad bargain; they represent no bargain at all. Amazon users are ripped off because they believe an Amazon search presents a realistic account of the choices they have. In truth, the Amazon search is just as much of a sham as the Spotify playlist: both are payola, perfected and shrouded by AI.

Caveat emptor must be heeded more now than ever before.

Algorithms cannot mimic human taste. They trap the user for the sake of payola.

Recommendation algorithms don’t know you.

Source: Micah Speaks To Kyle Chayka About The Filter World | On the Media | WNYC Studios

Bad Marketing Persists

Palantir is undoubtedly an artificial intelligence powerhouse, but the way it chooses to advertise itself in the Wall Street Journal is questionable. Palantir has yet to turn a profit 18 years after its founding, its clientele consists almost entirely of the Department of Defense, and it is entirely unknown whether its product line is broadly applicable to business, especially when so much of it is bound up with national security. What sense is there, then, in Palantir advertising that it is booking more AI sales than companies that are profitable? Asserting that one specializes in an area that is not profitable is hardly reassuring.

Where are the Borders in the Computing Cloud?

Nominally, the case (linked at the end) is about "privacy", but the underlying questions are far deeper and far more relevant to anyone who uses any form of "cloud" service: Facebook, Google, Amazon, Apple, Twitter, Microsoft, etc. The government insists that it can access data belonging to a suspect even if that data is stored on a server in another country, but the service company, Microsoft in this case, insists that it cannot provide that data because doing so would violate the terms under which it operates its servers in Ireland. The question, therefore, is where the virtual border is drawn. Is material belonging to an American subject, but stored on a server in a foreign country under an account created in that country, subject to US law or to the laws of the country in which the account was created? Answering that US law reaches it leads to the following conundrum.

“If U.S. law enforcement can obtain the emails of foreigners stored outside the United States, what’s to stop the government of another country from getting your emails even though they are located in the United States?” Brad Smith, Microsoft’s president and chief legal officer, said in a blog post on Monday.

Where is the line drawn? Does the account belong to the person, and is it thus subject to the laws of whichever country the person resides in, or is the data owned by the provider, and thus subject to the laws of the country in which that provider operates? If the former, then foreign countries can indeed claim access to data stored on American servers, which will please Chinese officials who want to identify dissidents. If the latter, then some country, perhaps Ireland, may well become a haven for data in the way that Switzerland is a haven for money. Neither branch of the dilemma is particularly satisfying. Not solving this problem is an invitation to disaster in the not-too-distant future, as our data slowly come to represent the totality of our existence.

What do we want as users? Do we want our data to be ours, or do we want to relinquish control to technology companies in order to relieve ourselves of the responsibility of living with the consequences of that data? The breakneck pace of technological progress doesn't leave much time for the deep discussion the subject demands. When the shit hits the fan, it's going to get really messy. Wear your best virtual rubbers.

Source: U.S. Supreme Court to decide major Microsoft email privacy fight