Mindful Cyborgs Ep 1: Data Exhaust, Augmented Dating and Fractalnoia


Mindful Cyborgs is a new podcast (or internet radio show, if you will) hosted by Chris Dancy and me. The tagline is: “Contemplative living in the age of quantification, augmentation and acceleration.” In our first episode we talk about data exhaust, augmented dating, fractalnoia and more. You can listen to it or download it from SoundCloud or iTunes.

Show notes and full transcript inside.

CONTENT OF THE WEEK:

TWEETS OF THE WEEK:

“Taking notes, putting appointments on a calendar, talking with friends, listening to songs: these are now achieved with identical gestures.” – Alan Jacobs

“People aren’t bad at computers. Computers are bad at people.” – Amber Case

“When you’re connected to everything, everything can seem connected. Confusing those things is dangerous.” –Joshua Topolsky

WORD OF THE WEEK:
Fractalnoia (Present Shock)

EVENTS:

CONNECT:

TRANSCRIPTION:

Mindful Cyborgs – Contemplative living in the age of quantification, augmentation and acceleration, with your hosts Chris Dancy and Klint Finley.

CD: Welcome to Mindful Cyborgs! Hey Klint, how are you doing?

KF: I’m doing good. How are you?

CD: Long week, I think we’ve been in conference overload.

KF: Are we ever not?

CD: That’s a good point. Listen, top stories. What have you got for me?

KF: Let’s see. One of the stories I read this week that was top of mind for me was a story Tim Maly did for The Verge about dating with Google Glass. There’s a woman here in Portland, Oregon, who has been – well, she actually hasn’t been using Google Glass for this. She brings a webcam with her on dates. She hides it in her purse and records the date, and then she has people on Mechanical Turk watching the date in real time, and they text message her advice about how the date is going – they get paid a few cents or something to give her these pointers. It’s mostly meant to be an art project, but towards the end of the article, she goes on a date without it and she kind of doesn’t know what to do because she’s used to getting this advice…

CD: Yes, it’s like a dating GPS.

KF: Yes, exactly.

CD: Like a crowdsourced dating GPS … should I go wash my hands now?

KF: Right. And it didn’t really sound like any of the advice these people were giving was actually useful. It was like … I can’t specifically remember any of the things. It was all really generic. It was nothing that seemed terribly helpful, but I can understand where, if you get used to somebody telling you, “Okay, at this point, ask him what he wants out of life.” That was one of the things. It was like, “What are you looking for?” Ask him, “What are you looking for?”

CD: Nice.

KF: Of course, you would ask something like that. But if you get used to being prompted, if you get used to working with a teleprompter basically…

CD: Yes, it’s like people with GPS. I’m constantly missing airport parking because I’m watching the GPS and not the street signs. I literally have to turn the GPS off when I get close to things and almost go into thruster or maneuvering mode, like a starship.

KF: The point of the article is: what happens when we have something like Google Glass and our whole lives are intertwined with this sort of thing? It’s not really a new question, but we’re actually starting to see it happen to people in real life. Though, as you said about GPS, that’s been happening for quite a while. I mean, the GPS units that people have in their cars really are augmented reality.

CD: GPS just seems to have handicapped people. They just can’t do anything without it, and if they do use it, they’re actually using it as a weapon so that everyone else in the car will be quiet and just let them drive. Or, “The machine said this…” You know. It’s interesting.

I thought there was a really, really interesting story that kind of combined quantified self and social with the tragic events that just happened in Boston. The story is called “We Know When Dzhokhar Tsarnaev Sleeps” (and I totally wrecked his name). Basically, they looked at his Twitter account and mapped his tweets to a graph of his sleeping patterns to try to make some type of determination of when he was sleeping, what he was doing, or what he was watching. I’d never really seen anything that represented someone kind of…

I mean, the idea of investigating someone digitally is not new, but using their digital fingerprint in this quantified way I thought was really interesting, because there’s a really unique colored graph that almost looks like a DNA pattern. I’ll put a link in the Show Notes. Again, it’s kind of creepy how some of that quantified stuff is starting to come into the everyday lexicon and stories of the news.

KF: That is interesting, because sometimes we can go out of our way trying to find a way to quantify something, to measure or track something that we’re interested in knowing about, but we leave behind trails and fingerprints – data exhaust – in other places, so you can find out when somebody sleeps from Twitter instead of keeping a detailed sleep log. I mean, obviously it’s not going to be accurate to the minute or even to the hour that way.

CD: Mine is.

KF: From Twitter, it’s not going to be accurate to the minute.

CD: Yes, mine isn’t accurate because I schedule tweets in the middle of the night.

KF: Right, yes. If you’re using Buffer or something…

CD: Yes, it’s kind of weird, but I just love, again, this idea of – are you enabled by technology or are you handicapped? But if you use automation tools to schedule data exhaust, are you somehow in a more interesting spot in life than someone who just kind of randomly uses these things, on and off?

KF: It shows how you might be able to jam certain types of surveillance. If your tweets are really sporadic, or if they’re just clockwork because of an automated system, then it’s harder to mine them for certain types of patterns.

CD: I like the idea of digital jamming of surveillance.

KF: What if somebody could look … if you’re using Buffer, for example, if somebody could get a look at Buffer’s logs, they could see when you’re adding stuff to Buffer, when you’re active there. But…

CD: No, no. When I add to Buffer, that’s actually the interesting quantified thing, because I’m adding to Buffer at a specific time and day when I have time to read stories. If anyone could actually see into the back of my Pocket or the back of my Buffer, I’d be in trouble.

What else have you got?

KF: There’s the article I wrote about that company Citizen, if you want to get into that now.

CD: Sure.

KF: So I was really interested in how people responded to that story. As a recap, a small mobile development company here in Portland called Citizen is doing an experimental pilot program where people can sign up and basically plug their quantified self stuff, like Fitbit and RunKeeper, into the company’s time tracking and metrics system. They can start to get a picture of whether healthy employees are more productive or have better morale, because they also have this sort of mood tracking thing that’s connected to their time tracking, so that when you do some work you can plug in: how did you feel about doing this work? Did you enjoy it? Was it drudgery? That sort of thing.

Their whole point in doing it is not very Orwellian, though I found that when I wrote it up, it sounded a lot more sensational than it really is.

CD: Not as sensational as an unemployable tech guy.

KF: They’re taking all their employees’ health data and seeing how productive they are. I mean, when you say it that way it sounds very Orwellian, but it’s really not so much. The employees are doing it voluntarily. There’s no mandate to do it. It’s an internal project – like Google’s 20% time thing, they have 15% time – so it’s a project that some employees put together because they wanted to track this stuff themselves. The comments were almost universally negative, though; everyone was sort of terrified of having to give up this information to their employers.

CD: Yes, but I think if I’m beholden to myself as my ultimate employer – I mean, we both know I’m a tracker – I use the data to create enhanced environments to create good work product. If I can actually use my biosensors and my environmental sensors to control the lighting and sound and temperature of my house, and I know that I’ve got a project due tomorrow, why not create the environment tomorrow that I know fits that? I don’t know. I love this piece, though. But they’re opting in to it, so I don’t think there was a problem with it.

KF: I think what concerns people, and what concerns me, is that at some point companies will start mandating it, and you can actually see that already happening in some places in different ways – like Whole Foods. It’s not mandatory, but you get a bigger employee discount on your groceries as an employee of Whole Foods if you have lower body fat. So you don’t have to participate in the program, but in order to get the discount you have to participate, so that’s part of your compensation essentially. You’re giving up part of your paycheck, in a way, if you don’t participate.

CD: Is that any different than someone who waits in line for five minutes at Costco for a paper cup full of food that they’re not going to buy? They’ve kind of given up their time for some food that may or may not be good for them. Is that any different than … my auto insurance company has me plug a tracker into my truck and I get a discount on my insurance. My employer actually has a program where you can be seen by a physician and then kind of keep track. But I think your point is very valid. What I don’t understand is people’s resistance to this, when they do it so freely in other areas of their life and don’t think about it.

KF: Yes, that’s fair. If you’re using a Fitbit or RunKeeper, you’re already sending somebody that data. I think there’s a pretty valid concern about abuse of the healthcare data by employers, though – prejudices that can be introduced, or a bad actor within the company who could choose to use your data against you. And you can see the same sort of thing happening with an employee of one of these companies who actually designs it.

CD: People already use your data against you in the company.

KF: Yes.

CD: They always say, “Did you see this email someone wrote? Did you see what he said?” I mean, I just would rather have it be quantified and say prove it.

KF: Sure.

CD: If you’re going to take me to HR, realize that everything was recorded. So let’s go there. Let’s do this. Yes, I love this story. I thought it was great.

One other interesting story – some people played it up, but Facebook’s kind of emotion … they call it Facebook emotion content. Basically, when you do an update it allows you to say: I’m feeling good, I’m feeling sad, I’m feeling happy. And people are saying that this is a great way for them to target better advertising to you. But to me, the one thing I haven’t seen in the media is that I think this is a great way for Facebook to make Facebook more sticky. So if I was to say, “I’m feeling sad,” and then liked a couple of updates directly after that, I think it’s easier for Facebook to start figuring out who brings me out of my bad moods and put those people at the top of my feed. I think this is [[adjuring]] play and this has nothing to do with marketing.

KF: I’m sure it has to do with marketing, I really don’t doubt that they’re going to find some way to tie that into – if you’re feeling sad, showing you ads for ice cream or something…

CD: All the sad people are just reaching for Dreyer’s right now.

KF: That doesn’t mean there’s not more use for it – mood tracking. People use things like Mood Panda already, so there’s definitely some potential value outside of Facebook or monetary gain, if you want to participate in that sort of thing.

CD: I’ve been using … I use Mood Panda, but I’ve also been using the new Facebook version, and I just set the setting to “only me,” right? That’s who I share it with. Because, unlike Mood Panda, it’s not 1 to 10, it’s a range of emotions. It’s just not available on mobile yet.

We’re getting close to the top of the show here. Real quick: there was a Google Glass unboxing online. I’ll put a link to that in the Show Notes. I thought it was kind of interesting to watch. They didn’t actually show them putting the glasses on, but it was the first unboxing that I’ve watched in a while that I thought was interesting.

And then there were three really great tweets I saw this week I wanted to share with you.

This guy named Alan Jacobs tweeted that taking notes, creating appointments in the calendar, talking with your friends, listening to songs – they’re all achieved now with identical gestures. And I thought that was really interesting from a cyborg kind of point of view, especially an anthropologist-cyborg kind of point of view, that doing all of those things now looks like doing the same thing.

Speaking of cyborgs and anthropology, Amber Case tweeted out yesterday: “People aren’t bad at computers, computers are bad at people,” which I loved.

And then kind of going back to the top of the show with the tragedy that happened in Boston, Joshua Topolsky tweeted out: “When you’re connected to everything, everything can seem connected. Confusing those things is dangerous.”

Any thoughts on those tweets?

KF: Right. I like the third one quite a bit, and Douglas Rushkoff calls that “Fractalnoia” in his new book, Present Shock.

CD: Love it.

KF: And what Amber said – I guess the issue there is that computers are designed by people, so we could really extend that to say that people still haven’t figured out how to make computers that we’re really good at and comfortable interacting with. We’re making progress, but it still feels dehumanizing to use technology a lot of the time.

CD: Completely.

KF: And I hope we can fix that someday.

CD: I totally agree with you. It is dehumanizing. And I think all the apps and systems I use always do a really good job of reflecting my relationship to them in some way, even if it’s just a notification. I love things like Timehop that say, “Hey, you did this a year ago.” That’s so empowering when you’re pumping that much data into a system – to have something return some value to you, I think, is so critical.

Ending the show on some events, are you going to be anywhere? Any place we can find you? Anything you know of, anything exciting?

KF: I will be at BDigital Global Congress, June 12-14. I’m speaking on the 12th on exactly the sort of thing we’ve been talking about today – employee tracking, quantified work – and that’s in Barcelona. I believe you’re going to be next month at Cyborg Camp in Vancouver. Is that right?

CD: Yup, I wanted to go. Man, I met you at Cyborg Camp. So many amazing things came out of that [00:14:58]. It’s kind of crazy how much happened at that one event. So yes, I’m going to take a trip to Vancouver; that’s May 11, if you’re in Vancouver. If you’re not, you can probably find somebody who will give you a ride to Vancouver. And then a couple of other events I’ll mention, if you guys are interested: in June, in New York City at Lincoln Center, there’s Global Future 2045, with that Russian billionaire who wants to create four avatars. I think that’s going to be wild, so if you’re interested in that, that’s coming up. If you have any events, send them over to Klint or me and we’ll make sure to mention them. And then Buddhist Geeks is here in Boulder, and that’s three days in August. That’s about it.

KF: All right. I guess we’ll talk again in two weeks.

CD: All right, Klint. This is Mindful Cyborgs, signing off. Thanks, Klint.

KF: Thank you. Bye.


© 2024 Technoccult
