Team Human interview with Dennis Yi Tenen

Posted on
politics critique ethics

You can view this episode here, and you can download a transcript I made with whisper-medium here. I accept responsibility for errors in the transcript, alongside OpenAI, all people whose voices exist on the web, and the rest of humanity. :)

writing technology

DYT: You know, it took humans like centuries to perfect the technology of a dictionary and it took hundreds, thousands of people, probably like millions of hours to actually get to the point where you can easily look up a word.

DR: It wasn’t Mr. Webster who sat down for a couple of weeks?

DYT: Right. But so what’s interesting is you never think, like, when you look at a dictionary, you don’t think, wow, the dictionary is really smart. You don’t think of the book. We don’t make that kind of mistake, even though it is really smart, kind of, because it condenses all that thousand years of literacy.

DR: Exactly. It’s not just the smartness of knowing what all the words mean. It’s the smartness of the development of all those words by so many people. And I’ve often thought, you know, if we were going to develop language today, everybody would want IP on the words that they’ve created and there’d be no way to speak.

DYT: Yeah. Shakespeare would be like, stop saying that.

DYT: But yeah, but I guess the confusing part here is that those folks are now, well, they’re dead. They’re the people who contributed to this effort. Most of them are gone. And yet they’re able to help you. So you’re there just writing your essay, and from far away, they’re still present in some ways and they’re there participating in whatever it is you are doing. And it’s a collaborative effort, but it’s stretched across time and it’s stretched across space, which is difficult to sort of conceptualize. … And technology is the same.

AI labor

Whatever goes by the name of A.I. is really a type of labor. It’s always implicated with labor politics. Another good example is spam filtering, or filtering for violent images on Facebook or something. And it’s supposed to be this like automated thing. But if you actually look (and there are people who study this), that kind of work often gets outsourced to places like Moldova, where I’m from, or places like South Asia, where there are people watching porn and watching incredibly violent images. And they’re using some technology to like help them weed this out. But it’s still an immense human labor that goes into it. And it kind of gets hidden from us. So at the end of the day, we’re like, oh, OK, there’s a filter. It’s an A.I. filter that sort of is making the space more safe, but then the actual effort expended and the actual sort of trauma that this effort inflicts on people gets hidden, because labor politics are often like that. It’s always hidden.