Very interesting article! The technological details were illuminating, and the interviewee's confession of his own moral decay was fascinating. I really liked his concluding thoughts:
S: Do you think that in our society we delude ourselves into thinking we have more privacy than we really do?
M: Oh, absolutely. If you think about it, when I use a credit card, the security model is the same as that of handing you my wallet and saying, "Take out whatever money you think you want, and then give it back."
S: ...and yet it seems to be working.
M: Most things don't have to be perfect. In particular, things involving human interactions don't have to be perfect, because groups of humans have all these self-regulations built in. If you and I have an agreement and you screwed me over badly, you've always got in the back of your mind the nagging worry that I'm going to show up on your doorstep with a club and kill you. Because of that, people don't tend to screw each other too much, right? At least, they try not to. One danger, perhaps, of moving towards an algorithmically driven society is that the algorithms aren't scared of us showing up and beating them up. The algorithms will do whatever it is that they are designed to do. But mostly I'm not too worried about that.
When it comes down to it, society is built on the constraint that if one person screws over other people, eventually others are going to retaliate, either through formal channels (the legal system) or informal ones (showing up on doorsteps with clubs).