
Article of the Day: Facebook Is Not Your Friend


Sunrise in the San Joaquin Valley of California with smoke from forest fires obscuring the sun

An opinion article in the NYT Sunday Review, titled “Silicon Valley Is Not Your Friend,” reinforces the aphorism: “If you’re not paying for the product, you’re not the customer; YOU are the product.” In other words, those “free” things like Chrome, Facebook, Google, and Twitter are using you at least as much as you are using them.

The article mentioned a program that was in vogue when I was in college in the early 1970s: Eliza.  This was a simple text-based program that pretended to be a psychotherapist.  Its technique was to repeat part of your statement back to you as a question, or, if you had nothing to say, to suggest something.  The following quote illustrates how it could fool the user into thinking that a sympathetic entity was on the other end of the line:

Call it the Eliza problem. In 1966, Joseph Weizenbaum, a professor at M.I.T., unveiled a computer program, Eliza, which imitated a psychotherapist. It would, by rote, inquire about your feelings toward your parents or try to get you talking by rephrasing what you said in the form of a question. The program immediately touched a nerve, becoming a national phenomenon, to the surprise of Mr. Weizenbaum. For example, The New York Times swooned: “Computer Is Being Taught to Understand English.”

Eliza understood nothing, in truth, and could never reach any shared insight with a “patient.” Eliza mechanically responded to whatever appeared on the screen. A typical therapy session quickly devolved into a Monty Python sketch. (Patient: You are not very aggressive, but I think you don’t want me to notice that. Eliza: What makes you think I am not very aggressive? Patient: You don’t argue with me. Eliza: Why do you think I don’t argue with you? Patient: You are afraid of me. Eliza: Does it please you to believe I am afraid of you?)

Imagine Mr. Weizenbaum’s surprise when his secretary looked up from her computer and interrupted her exchanges with Eliza to say to him, “Would you mind leaving the room, please?” She wanted privacy for a conversation with a machine! Mr. Weizenbaum, appalled, suddenly saw the potential for mischief by programmers who could manipulate computers and potentially the rest of us. He soon switched gears and devoted his remaining years to protesting what he considered the amorality of his computer science peers, frequently referring to his experiences as a young refugee from Nazi Germany.
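The reflection trick described above can be sketched in a few lines of Python. To be clear, the word list and canned question frames below are my own illustration, not Weizenbaum’s original DOCTOR script; the point is only how mechanical the “understanding” really is:

```python
import re

# Swap first- and second-person words so a statement points back at the user.
# This tiny table is a hypothetical stand-in for Eliza's fuller substitutions.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "you": "I", "your": "my", "am": "are",
}

def reflect(text):
    """Reflect pronouns: 'you are afraid of me' -> 'I am afraid of you'."""
    words = text.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement):
    """Turn the user's statement into a question, Eliza-style."""
    cleaned = statement.lower().rstrip(".!?")
    m = re.match(r"you are (.*)", cleaned)
    if m:
        # Rephrase a "you are X" claim as a question about itself.
        return f"What makes you think I am {reflect(m.group(1))}?"
    # Fallback: echo the whole statement back as a question.
    return f"Why do you say: {reflect(cleaned)}?"

print(respond("You are afraid of me."))
# -> What makes you think I am afraid of you?
```

No model of the conversation, no memory, no insight: just regular-expression matching and word swaps, which is exactly why a session “quickly devolved into a Monty Python sketch.”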

As I mentioned before, I have “stopped using Facebook,” although my blog posts are still copied there.  I did so because I was unsettlingly conscious of the “Eliza effect” every time I tuned into Facebook, allowed it to trigger some reaction, and expressed that reaction in a comment.  The other “people” who left comments may or may not have been human; I had no way of knowing until I was contacted by Ebrima Jallow, a young African man who wanted to be “friends” with me on Facebook.  I “friended” him, but I quickly realized that Mr. Jallow bore little resemblance to an actual friend (although he is a nice young man, really).  I’m pretty sure he is human, but having a conversation with him made me feel bad.  Like nine-tenths of the world, he lives on far less than I do, even though I am impecunious by American standards.  So there are several moral lessons here.  It’s not all bad, but it can make you feel rather overprivileged.
