1
00:00:00,000 --> 00:00:11,019
*preroll music*

2
00:00:11,019 --> 00:00:18,269
Herald: I am very happy to introduce this
year’s update on the “State of the Onion”!

3
00:00:18,269 --> 00:00:23,969
This is a talk with about 5 speakers,
so let’s introduce them one by one.

4
00:00:23,969 --> 00:00:28,529
First, Roger. He gave the last talk.
He is the founder of the Tor Project,

5
00:00:28,529 --> 00:00:35,979
*applause*
an MIT graduate and a Top 100 Global Thinker.

6
00:00:35,979 --> 00:00:39,059
Then we have Jake, a
humble PhD math student

7
00:00:39,059 --> 00:00:42,410
*applause*

8
00:00:42,410 --> 00:00:46,350
who is, in my opinion, not a
National Security threat

9
00:00:46,350 --> 00:00:51,190
but a post National Security promise.

10
00:00:51,190 --> 00:00:55,129
We have Mike Perry, and I think
it is enough to say about him

11
00:00:55,129 --> 00:00:58,700
that the NSA calls him a worthy adversary.

12
00:00:58,700 --> 00:01:04,909
*applause*

13
00:01:04,909 --> 00:01:09,250
He is also the lead dev
of the Tor Browser.

14
00:01:09,250 --> 00:01:14,220
And then we have Alison Macrina,
a radical, militant librarian.

15
00:01:14,220 --> 00:01:21,270
*applause*

16
00:01:21,270 --> 00:01:28,040
And last but not least: Shari Steele, the
new Executive Director of the Tor Project.

17
00:01:28,040 --> 00:01:35,500
*applause*

18
00:01:35,500 --> 00:01:40,220
So without further ado:
This year’s State of the Onion!

19
00:01:40,220 --> 00:01:45,230
*applause*

20
00:01:45,230 --> 00:01:49,490
Jacob: Alright, it’s a great
honor to be back here again.

21
00:01:49,490 --> 00:01:52,640
And we’re really happy to be able
to introduce so many more faces.

22
00:01:52,640 --> 00:01:56,770
It’s no longer the Roger and Jake
show. That’s very important to us.

23
00:01:56,770 --> 00:02:01,430
Hopefully next year, we won’t
be here, but we’ll still be alive.

24
00:02:01,430 --> 00:02:05,660
So 2015, if I were to express
it in a hand gesture

25
00:02:05,660 --> 00:02:10,310
or with a facial expression, it would
look something like “Ooouuw”.

26
00:02:10,310 --> 00:02:15,460
It was really a year of big changes. Not
all of them were really good changes.

27
00:02:15,460 --> 00:02:18,450
And there were a lot of heavy things
that happened throughout the year.

28
00:02:18,450 --> 00:02:22,020
We won’t even be able to cover all of
them because we only have an hour.

29
00:02:22,020 --> 00:02:25,760
So we want to focus on the
positive things. I would say that

30
00:02:25,760 --> 00:02:30,120
probably the nicest thing is that we are
growing. We’re really, really growing.

31
00:02:30,120 --> 00:02:33,200
Not only growing the network,
but we’re growing the community.

32
00:02:33,200 --> 00:02:37,030
And in some sense we’re expanding
throughout the whole world in terms of

33
00:02:37,030 --> 00:02:41,450
users who are using Tor, what Tor
users are using Tor for, which is

34
00:02:41,450 --> 00:02:45,200
of course extremely important that there
are more and more people just doing

35
00:02:45,200 --> 00:02:49,260
regular things with Tor, protecting
themselves. But then we have of course

36
00:02:49,260 --> 00:02:52,100
lots of specialized things that happen
with the Tor network as well.

37
00:02:52,100 --> 00:02:56,290
We have things like OnionBalance and
Ricochet. Really exciting developments.

38
00:02:56,290 --> 00:03:01,060
And we’ll talk a bit about all of those
things. One of the most unlikely things,

39
00:03:01,060 --> 00:03:05,990
at least when I imagine working
on TOR, say 10 years ago vs. now,

40
00:03:05,990 --> 00:03:09,750
is that we’ve worked with some really
unlikely partners. Some of you know

41
00:03:09,750 --> 00:03:17,190
that I’m not really a big fan of Silicon
Valley, even though I’m from there.

42
00:03:17,190 --> 00:03:21,860
So you know, I sometimes call Facebook
not so nice names, like Stasi-Book.

43
00:03:21,860 --> 00:03:24,190
And part of the reason for that is
because I think it is a little bit weird,

44
00:03:24,190 --> 00:03:28,250
that you report on all your friends
in order to go to parties.

45
00:03:28,250 --> 00:03:32,459
Previously it was to get into the party
and now it is to go to parties.

46
00:03:32,459 --> 00:03:35,860
And yet we worked with them on something.

47
00:03:35,860 --> 00:03:39,680
Because it turns out that sometimes
you have unlikely temporary alliances.

48
00:03:39,680 --> 00:03:43,490
And it turns out that while I personally
may think that they are evil incarnate

49
00:03:43,490 --> 00:03:48,470
in some sense, it is the case that
there is at least one good guy there.

50
00:03:48,470 --> 00:03:52,640
Alec worked on this fantastic RFC 7686,

51
00:03:52,640 --> 00:03:58,130
that actually allowed us to help all
Facebook users mitigate some harm.

52
00:03:58,130 --> 00:04:01,540
Which is that if they want to be able
to visit Facebook – and I guess

53
00:04:01,540 --> 00:04:05,280
the reality is that not using Facebook
for a lot of people is sort of like

54
00:04:05,280 --> 00:04:08,590
the “Kill your Television” bumper sticker
of the 90s. For those of you that ever

55
00:04:08,590 --> 00:04:13,470
visited rural America. You know that that
wasn’t like a really successful campaign.

56
00:04:13,470 --> 00:04:18,469
A lot of people have TVs these days
as well. So it’s a little bit like that,

57
00:04:18,469 --> 00:04:22,370
only here we actually built an alternative
where we can mitigate harm.

58
00:04:22,370 --> 00:04:25,400
And that’s really incredibly important
because it mitigates harm in all sorts

59
00:04:25,400 --> 00:04:29,129
of different pieces of software. It
makes it possible for us to talk to

60
00:04:29,129 --> 00:04:32,900
browser vendors, to DNS resolvers.
And part of this was motivated

61
00:04:32,900 --> 00:04:36,569
by some investigative journalism
that I actually did, where I revealed

62
00:04:36,569 --> 00:04:41,090
XKeyscore rules, where the US
Government’s National Security Agency

63
00:04:41,090 --> 00:04:45,159
was sifting through all of the internet
traffic to look for .onion addresses.

64
00:04:45,159 --> 00:04:49,169
So when they saw a DNS request
for .onion they were actually

65
00:04:49,169 --> 00:04:52,919
learning .onions by harvesting traffic.
And that really motivated me

66
00:04:52,919 --> 00:04:55,779
to want to make it, so that the DNS
resolvers didn’t do that anymore.

67
00:04:55,779 --> 00:05:00,819
It was very important, because one
of my core missions with Tor

68
00:05:00,819 --> 00:05:04,699
is to make that kind of stuff a
lot harder for the spies to do.

69
00:05:04,699 --> 00:05:08,980
And protecting everyday users, even
users who aren’t Tor users yet.

70
00:05:08,980 --> 00:05:12,300
And that’s very important. So working
with Alec on this has been great,

71
00:05:12,300 --> 00:05:16,169
because the IETF actually
supports this. And now

72
00:05:16,169 --> 00:05:20,190
ICANN will not sell
.onion to anyone.

73
00:05:20,190 --> 00:05:24,250
It’s a special-use reserved
name. And that’s incredible!

74
00:05:24,250 --> 00:05:31,269
*applause*

75
00:05:31,269 --> 00:05:34,599
Roger: OK, so. Is this
thing on? Yes it is, great!

76
00:05:34,599 --> 00:05:37,370
So there are a couple of interesting
graphs, that we’re going to give you,

77
00:05:37,370 --> 00:05:42,490
of usage scenarios, usage
instances over the past year.

78
00:05:42,490 --> 00:05:46,539
So pretty recently we were looking at
the number of people in Russia

79
00:05:46,539 --> 00:05:51,199
using Tor. Russia has been talking about
censoring, talking about all sorts of

80
00:05:51,199 --> 00:05:55,979
oppressive steps. And at
the beginning of November,

81
00:05:55,979 --> 00:06:01,219
we moved from 180k people in
Russia each day using Tor

82
00:06:01,219 --> 00:06:05,749
up to almost 400k people. And
this is probably a low estimate.

83
00:06:05,749 --> 00:06:10,159
So many hundreds of thousands
of people for that two week period,

84
00:06:10,159 --> 00:06:14,619
which started with a Russian bomber
getting shot down, were trying to get

85
00:06:14,619 --> 00:06:18,319
news from the rest of the world, rather
than news as Russia wanted to show it

86
00:06:18,319 --> 00:06:22,460
to them. So that’s
kind of a cool event.

87
00:06:22,460 --> 00:06:26,139
Another interesting event: Bangladesh
ended up censoring Facebook

88
00:06:26,139 --> 00:06:30,229
and some other websites and a whole
lot of people switched to using Tor.

89
00:06:30,229 --> 00:06:32,909
I was actually talking to one of the
Facebook people and they have their own

90
00:06:32,909 --> 00:06:37,819
internal statistics about the number of
people connecting over the Tor network

91
00:06:37,819 --> 00:06:42,279
to Facebook. And it would be super
cool to superimpose these two graphs.

92
00:06:42,279 --> 00:06:45,749
Our data is public and open
and we like sharing it.

93
00:06:45,749 --> 00:06:49,520
They don’t actually share their data.
But one day it would be really cool

94
00:06:49,520 --> 00:06:53,110
to be able to see both of these
graphs at once, to see users shifting

95
00:06:53,110 --> 00:06:57,259
from reaching Facebook
directly to going over Tor.

96
00:06:57,259 --> 00:07:00,050
The other interesting thing from the
Bangladesh side: I was looking at the

97
00:07:00,050 --> 00:07:04,499
Alexa top websites around the
world, and torproject.org is

98
00:07:04,499 --> 00:07:08,539
like 8000th in the global
rankings, but at least

99
00:07:08,539 --> 00:07:11,649
for the past couple of weeks
torproject.org has been

100
00:07:11,649 --> 00:07:16,849
300th in Bangladesh. So there are a
whole heck of a lot of people there,

101
00:07:16,849 --> 00:07:22,889
learning about these privacy things
that can get around local censorship.

102
00:07:22,889 --> 00:07:28,289
*applause*

103
00:07:28,289 --> 00:07:32,270
OK, and then an exciting
other story that we’re

104
00:07:32,270 --> 00:07:35,900
going to touch on briefly, but
it’s an entire talk on its own.

105
00:07:35,900 --> 00:07:40,439
So let me give you a couple
of facts and we’ll go from there.

106
00:07:40,439 --> 00:07:44,069
In January of 2014, a hundred
relays showed up

107
00:07:44,069 --> 00:07:47,699
in the Tor network and we weren’t sure
who was running them, but they weren’t

108
00:07:47,699 --> 00:07:52,159
exit relays, so they didn’t seem like
they were such a threat at the time.

109
00:07:52,159 --> 00:07:57,839
Fast forward a while later: The
CERT organization inside CMU

110
00:07:57,839 --> 00:08:01,929
submitted a presentation to
Blackhat on how cool they were

111
00:08:01,929 --> 00:08:05,939
for being able to attack Tor users. And
they talked about how they were going to

112
00:08:05,939 --> 00:08:09,610
talk about individual users
that they de-anonymized

113
00:08:09,610 --> 00:08:12,990
and how cool they were for that.
And I spent a while trying to extract

114
00:08:12,990 --> 00:08:17,479
details from them. And eventually
I learned what their attack was.

115
00:08:17,479 --> 00:08:21,169
And then Nick Mathewson, one of
the other Tor developers, decided

116
00:08:21,169 --> 00:08:25,050
to check the TOR network to see if
anybody was actually doing that attack.

117
00:08:25,050 --> 00:08:29,099
I mean it’s CERT, they are the
folks who publicised the phrase

118
00:08:29,099 --> 00:08:33,059
“responsible disclosure”. Surely,
they are not actually undermining

119
00:08:33,059 --> 00:08:36,679
the Tor network and attacking Tor users.
But then it turns out that somebody was

120
00:08:36,679 --> 00:08:40,880
doing the attack. And it was these
100 relays that looked kind of ordinary

121
00:08:40,880 --> 00:08:44,759
and innocuous before that. Then I sent
mail to the CERT people, saying:

122
00:08:44,759 --> 00:08:48,540
“Hey are those relays yours?” And they
went silent. They have never answered any

123
00:08:48,540 --> 00:08:54,269
of my mails since then. So that’s
what we know. It doesn’t look good.

124
00:08:54,269 --> 00:08:58,009
One of the key things that we,
Tor, have done from here is

125
00:08:58,009 --> 00:09:01,459
we’ve been working on strengthening
the Tor network and getting better

126
00:09:01,459 --> 00:09:05,389
at recognizing these things. So
the core of the attack was that

127
00:09:05,389 --> 00:09:09,150
they did what’s called a Sybil attack,
where you sign up a lot of relays

128
00:09:09,150 --> 00:09:13,449
and you become too large a fraction of the
Tor network. So we’ve been working on

129
00:09:13,449 --> 00:09:18,339
a lot of ways to recognize that
an attack like that is happening,

130
00:09:18,339 --> 00:09:22,139
and mitigate it, and get rid of it
early. For example Philipp Winter

131
00:09:22,139 --> 00:09:26,819
has a bunch of interesting research
areas on recognizing similarity

132
00:09:26,819 --> 00:09:30,670
between relays. So you can
automatically start detecting:

133
00:09:30,670 --> 00:09:33,920
“Wait a minute, this event
happened, where a lot of relays

134
00:09:33,920 --> 00:09:38,480
are more similar than they should
be.” Another example there is:

135
00:09:38,480 --> 00:09:41,610
We used to say: “Well I don’t
know who’s running them,

136
00:09:41,610 --> 00:09:45,399
but they don’t seem that dangerous. So
OK, it’s good to grow the Tor network.”

137
00:09:45,399 --> 00:09:48,940
Now we’re taking the other
approach of “Gosh, that’s weird,

138
00:09:48,940 --> 00:09:52,470
let’s get rid of them and then
we’ll ask questions after that.”

139
00:09:52,470 --> 00:09:56,009
So we’re trying to be more
aggressive, more conservative

140
00:09:56,009 --> 00:09:59,880
at keeping the Tor network
safe from large adversaries.

141
00:09:59,880 --> 00:10:04,620
Whether they’re government organizations
or corporations or individuals.

142
00:10:04,620 --> 00:10:12,029
Whoever might be attacking it.

143
00:10:12,029 --> 00:10:17,220
Jacob: We’ve had a few really big
changes in the Tor community.

144
00:10:17,220 --> 00:10:20,610
One of them is that we had
an Interim Executive Director

145
00:10:20,610 --> 00:10:25,930
come on in a sort of quick moment
and that’s Roger Dingledine.

146
00:10:25,930 --> 00:10:28,850
Some of you probably always thought he
was the Executive Director the whole time.

147
00:10:28,850 --> 00:10:33,279
That’s because for a while he was and then
he wasn’t. And then he was back again.

148
00:10:33,279 --> 00:10:37,490
And that change was quite a
huge change in that instead of

149
00:10:37,490 --> 00:10:41,190
working on a lot of anonymity stuff,
Roger was doing a lot of bureaucratic

150
00:10:41,190 --> 00:10:44,519
paperwork which was actually quite
sad for the anonymity world, I think.

151
00:10:44,519 --> 00:10:48,160
He probably reviewed fewer papers
and did fewer anonymity things

152
00:10:48,160 --> 00:10:51,790
this year than ever before.
Which is really, really sad.

153
00:10:51,790 --> 00:10:55,050
But that really lit a fire under us to
make sure that we would actually

154
00:10:55,050 --> 00:10:58,839
change that. To make sure that it was
possible to get someone else, who is

155
00:10:58,839 --> 00:11:02,399
really good at being an Executive Director
of the Tor Project, to really lead,

156
00:11:02,399 --> 00:11:06,459
so that we could have Roger return to
not only being an anonymity researcher,

157
00:11:06,459 --> 00:11:09,240
but also the true Spirit
Animal of the Tor Project.

158
00:11:09,240 --> 00:11:13,440
He doesn’t look like
an onion, but in spirit.

159
00:11:13,440 --> 00:11:19,540
Roger: Slide!
Jacob: *laughing*

160
00:11:19,540 --> 00:11:22,329
Another really big thing that happened
is working with Laura Poitras

161
00:11:22,329 --> 00:11:27,800
over the last many years.
She has followed the Tor Project

162
00:11:27,800 --> 00:11:31,129
– lots of people like to follow the
people on the Tor Project –

163
00:11:31,129 --> 00:11:35,639
but we consented to her following us.
And she made a film, “Citizenfour”,

164
00:11:35,639 --> 00:11:39,000
I think some of you… have
any of you seen this film?

165
00:11:39,000 --> 00:11:45,170
*applause*
Quite amazingly,

166
00:11:45,170 --> 00:11:48,499
she won an Oscar. Actually, she
basically won every film prize.

167
00:11:48,499 --> 00:11:57,269
*applause*

168
00:11:57,269 --> 00:12:01,170
One of the key things is that people
in this room that work on Free Software

169
00:12:01,170 --> 00:12:04,819
were explicitly thanked. If you work
on Tails, if you work on GnuPG,

170
00:12:04,819 --> 00:12:08,649
if you work on SecureDrop,
OTR, Tor, …

171
00:12:08,649 --> 00:12:11,459
She specifically said in
the credits of the film:

172
00:12:11,459 --> 00:12:15,490
This film wouldn’t have been
possible without that Free Software.

173
00:12:15,490 --> 00:12:18,939
Actually making her job and
the jobs of her source

174
00:12:18,939 --> 00:12:22,000
and other people involved…
making that possible.

175
00:12:22,000 --> 00:12:25,750
And so her winning that Oscar
in some sense feels like

176
00:12:25,750 --> 00:12:29,480
closing a really big loop that had
been open for a very long time.

177
00:12:29,480 --> 00:12:33,000
And it’s really great and she,
I think, would really wish that she

178
00:12:33,000 --> 00:12:37,660
could be here today, again. She
sends her regards, and she is really,

179
00:12:37,660 --> 00:12:42,470
really thankful for everybody here that
writes Free Software for freedom!

180
00:12:42,470 --> 00:12:47,909
*applause*

181
00:12:47,909 --> 00:12:51,639
Roger: So another exciting event
that happened in 2015 is that reddit

182
00:12:51,639 --> 00:12:55,660
gave us $83,000. They had some
extra profit and they decided

183
00:12:55,660 --> 00:13:00,839
that they would give it to 10 non-profits
chosen by the redditor community.

184
00:13:00,839 --> 00:13:03,839
And there were people who came to me
and said: “Hey Roger, you really have to,

185
00:13:03,839 --> 00:13:06,939
you know, start advocating, start
teaching everybody, why TOR should be

186
00:13:06,939 --> 00:13:10,290
one of them.” And I said: “Oh, I’m
busy. Those things never work.

187
00:13:10,290 --> 00:13:13,810
You know, they’ll choose somebody
else.” And so it turns out that we were

188
00:13:13,810 --> 00:13:18,550
the 10th out of 10 without doing
any advocacy work whatsoever

189
00:13:18,550 --> 00:13:22,509
to the reddit community, which is super
cool that they care about us so much.

190
00:13:22,509 --> 00:13:27,089
Also reddit divided the ten equally. So
even though we were the 10th out of 10,

191
00:13:27,089 --> 00:13:31,200
we got 10% of the donations
that they were giving out.

192
00:13:31,200 --> 00:13:37,870
*applause*

193
00:13:37,870 --> 00:13:41,149
Jacob: One of the really –
I would say one of the oddest things

194
00:13:41,149 --> 00:13:46,120
about working at the Tor Project for me
is that Tor has supported me through

195
00:13:46,120 --> 00:13:49,629
really crazy times. So when I was
being detained by the US Government

196
00:13:49,629 --> 00:13:54,550
or having my property stolen by fascist
pigs in the United States Government’s

197
00:13:54,550 --> 00:13:59,329
border checkpoints, Tor didn’t fire me.
Tor always backed me and always

198
00:13:59,329 --> 00:14:03,379
kept me safe. And many people often looked
like they wanted to kill me from stress,

199
00:14:03,379 --> 00:14:06,389
but often they didn’t, which was nice.
Or they didn’t get close enough

200
00:14:06,389 --> 00:14:10,669
and I could move fast enough. But
they were always very helpful. And

201
00:14:10,669 --> 00:14:14,949
they’ve really helped me to
go and do things to speak for

202
00:14:14,949 --> 00:14:18,430
anonymous users who can’t go
other places. And one of the places

203
00:14:18,430 --> 00:14:22,220
which I was most honored to go in the
last year – I was actually scheduled

204
00:14:22,220 --> 00:14:25,569
to go there with Caspar Bowden, but
unfortunately he was ill at the time.

205
00:14:25,569 --> 00:14:29,899
And as you know, Caspar
has since passed away.

206
00:14:29,899 --> 00:14:32,999
But we were scheduled to go together and
Tor was supporting us both, actually,

207
00:14:32,999 --> 00:14:38,319
to go to this. And it resulted, I believe,

208
00:14:38,319 --> 00:14:41,519
in a very amazing meeting in
Geneva at the United Nations,

209
00:14:41,519 --> 00:14:45,779
where the Special Rapporteur actually
endorsed Tor and Off-the-Record Messaging

210
00:14:45,779 --> 00:14:49,729
and encryption programs,
and privacy, and free software.

211
00:14:49,729 --> 00:14:54,680
Saying that they are absolutely essential.
And in fact their use should be encouraged

212
00:14:54,680 --> 00:14:59,629
from a human rights perspective. And in
fact the really amazing part about it is

213
00:14:59,629 --> 00:15:03,649
he didn’t do it only from the perspective
of free speech. And this is important,

214
00:15:03,649 --> 00:15:07,139
because actually there are other rights.
And we should think about them.

215
00:15:07,139 --> 00:15:10,370
So for example the right to form
and to hold an idea is a right

216
00:15:10,370 --> 00:15:14,079
that cannot be abridged. The right
to free speech can be abridged

217
00:15:14,079 --> 00:15:18,589
in many free societies, but what is
in your head and how you form it

218
00:15:18,589 --> 00:15:22,040
is something where… that is not
a right that can be abridged.

219
00:15:22,040 --> 00:15:25,579
And he wrote this in the report. And
he, when writing this report with

220
00:15:25,579 --> 00:15:29,899
many other people, made it very clear that
this is something we need to keep in mind.

221
00:15:29,899 --> 00:15:34,249
That when we talk about private spaces
online, where groups may collaborate

222
00:15:34,249 --> 00:15:37,850
to form ideas, to be able to create
a political platform for example,

223
00:15:37,850 --> 00:15:41,220
to be able to make democratic change,
they need to be able to use the internet

224
00:15:41,220 --> 00:15:46,319
to freely exchange those ideas in a secure
and anonymized, encrypted fashion.

225
00:15:46,319 --> 00:15:50,889
And that helps them to form and to hold
ideas. And obviously that helps them later

226
00:15:50,889 --> 00:15:55,470
to express free speech ideas. And that’s
a huge thing to have the United Nations

227
00:15:55,470 --> 00:16:02,409
endorse basically what many of us in this
room have been saying for, well… decades.

228
00:16:02,409 --> 00:16:05,459
Roger: So the UN thing is really cool.
We’ve also been doing some other

229
00:16:05,459 --> 00:16:09,879
policy angles. So Steven Murdoch, who
is a professor in England and also

230
00:16:09,879 --> 00:16:14,350
part of the TOR community, has worked
really hard at teaching the British folks,

231
00:16:14,350 --> 00:16:18,490
that their new backdoor laws and
their new terrible laws are actually

232
00:16:18,490 --> 00:16:23,240
not what any reasonable country wants.
So he’s put a huge amount of energy into

233
00:16:23,240 --> 00:16:27,680
basically advocating for freedom for
them. And similarly Paul Syverson,

234
00:16:27,680 --> 00:16:32,569
part of the Tor community, basically
ended up writing a POSTnote for the UK

235
00:16:32,569 --> 00:16:36,790
about how the dark web is
misunderstood. See previous talk.

236
00:16:36,790 --> 00:16:40,680
So we’ve been doing quite a bit
of education at the policy level

237
00:16:40,680 --> 00:16:44,910
to try to teach the world, that encryption
is good and safe and worthwhile

238
00:16:44,910 --> 00:16:50,070
and should be the default
around the world.

239
00:16:50,070 --> 00:16:54,050
Jacob: And there is a kind of interesting
thing here. Maybe a little contentious

240
00:16:54,050 --> 00:16:57,279
with some people in the Tor community.
But I just wanted to make it really clear.

241
00:16:57,279 --> 00:17:01,170
We have the Tor Project, which is
a non-profit in the United States.

242
00:17:01,170 --> 00:17:04,569
And we have a much wider Tor
community all around the world.

243
00:17:04,569 --> 00:17:07,950
And in Berlin we have a really,
really incredible Tor community.

244
00:17:07,950 --> 00:17:11,380
We have people like Donncha working
on OnionBalance. We have people like

245
00:17:11,380 --> 00:17:14,810
Leif Ryge working on bananaphone. We
have all of these different people working

246
00:17:14,810 --> 00:17:17,970
on all sorts of Free Software. And many
of those people don’t actually work

247
00:17:17,970 --> 00:17:21,240
for the Tor Project. They’re community
members, they’re volunteers,

248
00:17:21,240 --> 00:17:26,010
some of them are privacy students.
And so the Renewable Freedom Foundation

249
00:17:26,010 --> 00:17:30,050
actually funded the creation
of a sort of separate space

250
00:17:30,050 --> 00:17:33,980
in Berlin where people work on these
kinds of things, which is not affiliated

251
00:17:33,980 --> 00:17:38,100
with US Government money. It’s
not affiliated with the Tor Project

252
00:17:38,100 --> 00:17:41,360
as some sort of corporate thing.
It’s not a multinational thing.

253
00:17:41,360 --> 00:17:46,630
It’s really the peer-to-peer version in
some sense of what we’ve already had

254
00:17:46,630 --> 00:17:49,650
in other places. And it’s really great
and I wanted to just thank Moritz

255
00:17:49,650 --> 00:17:54,350
who made that happen and to all the
people like Aaron Gibson, and Juris

256
00:17:54,350 --> 00:17:57,900
who actually put that space together
and made it possible. So in Berlin,

257
00:17:57,900 --> 00:18:01,740
there is a space, not just c-base,
not just CCCB, but actually

258
00:18:01,740 --> 00:18:05,600
a place which is about anonymity.
It’s called Zwiebelraum.

259
00:18:05,600 --> 00:18:09,430
And this is a place in which people are
working on this Free Software. And they

260
00:18:09,430 --> 00:18:12,340
are doing it in an independent manner.
And we hope actually that people will

261
00:18:12,340 --> 00:18:16,400
come together and support that, because
we need more spaces like that, that

262
00:18:16,400 --> 00:18:20,670
are not directly affiliated with the Tor
Project, necessarily, but where we have

263
00:18:20,670 --> 00:18:24,280
an aligned mission about reproducible
builds in Free Software and also

264
00:18:24,280 --> 00:18:29,300
about anonymity and actually about caring
about Free Speech. And actually making

265
00:18:29,300 --> 00:18:33,110
it happen. And really building spaces
like that all around the world. So if you

266
00:18:33,110 --> 00:18:36,140
have a place in your town where you want
to work on those things, we would really

267
00:18:36,140 --> 00:18:40,340
hope that you will work on building that.
I called it “general cypherpunkery”.

268
00:18:40,340 --> 00:18:44,300
I feel like that’s a good description.
There’s lots of stuff to be done.

269
00:18:44,300 --> 00:18:48,940
And now for a Marxist joke: So we
discovered the division of labor,

270
00:18:48,940 --> 00:18:52,570
which was a really important discovery.
We’re about 180 years too late,

271
00:18:52,570 --> 00:18:58,310
but we started to split up. Where it didn’t
go very well, the Marxists asked why.

272
00:18:58,310 --> 00:19:02,410
Cheers, cheers!
So the Vegas Teams are really simple.

273
00:19:02,410 --> 00:19:06,620
Basically we have a bunch of people
who previously did everything.

274
00:19:06,620 --> 00:19:10,130
And this really doesn’t work. It’s very
stressful and it’s very frustrating

275
00:19:10,130 --> 00:19:14,470
and it leads to people doing lots and
lots of things in a very unfocused way.

276
00:19:14,470 --> 00:19:18,740
And so we split it up! And it actually
happened naturally, it was emergent.

277
00:19:18,740 --> 00:19:24,010
So e.g. Mike Perry, who’s gonna talk
about the Applications Team’s work

278
00:19:24,010 --> 00:19:28,280
in a second here, he was
already leading this,

279
00:19:28,280 --> 00:19:32,370
he was really making this happen. And
so we just made it more explicit. And,

280
00:19:32,370 --> 00:19:36,650
in fact we created a way of communicating
and reporting back so that

281
00:19:36,650 --> 00:19:39,850
you don’t have to, like, drink from the
fire hose about absolutely everything

282
00:19:39,850 --> 00:19:42,430
that’s happening everywhere, but you can
sort of tune in to those things, which

283
00:19:42,430 --> 00:19:46,970
means we get higher-level understandings
and that is a really, incredibly useful

284
00:19:46,970 --> 00:19:49,740
thing that has made us much more
productive. And what was part of the

285
00:19:49,740 --> 00:19:53,500
growing pains of the last year actually
was figuring out how to make that work

286
00:19:53,500 --> 00:19:57,210
because we’re a pretty flat group in terms
of a community and a pretty flat group

287
00:19:57,210 --> 00:20:02,060
in terms of an organization writing
Free Software and advocating.

288
00:20:02,060 --> 00:20:06,500
And so that’s a really incredibly good
thing which will come up all the time.

289
00:20:06,500 --> 00:20:09,770
You’ll hear people talking about the
Metrics Team or the Network Team or the

290
00:20:09,770 --> 00:20:13,650
Applications Team or the Community Team.
And that’s what we’re talking about.

291
00:20:13,650 --> 00:20:17,630
In that sense. So we tried to formalize it
and in some ways we may be moving in a

292
00:20:17,630 --> 00:20:23,840
sort of Debian model a little bit. And
we’ll see how that actually goes. So we

293
00:20:23,840 --> 00:20:28,470
have a really great person here to
explain the work of the Metrics Team.

294
00:20:28,470 --> 00:20:32,350
Roger: OK, so I’m gonna tell you a little
bit about what the Metrics Team has been

295
00:20:32,350 --> 00:20:36,570
working on lately to give you a
sense of some of the components

296
00:20:36,570 --> 00:20:40,890
of the Tor community. So there are 5 or
10 people who work on the Metrics Team.

297
00:20:40,890 --> 00:20:45,350
We actually only pay one-ish of them;
so most of them are volunteers

298
00:20:45,350 --> 00:20:48,980
and that’s… on the one hand that’s great.
It’s wonderful that there are researchers

299
00:20:48,980 --> 00:20:53,750
all around the world who are contributing
and helping to visualize and helping to do

300
00:20:53,750 --> 00:20:57,980
analysis on the data. On the other hand
it’s sort of sad that we don’t have a full

301
00:20:57,980 --> 00:21:02,530
team of full-time people who are working
on this all the time. So it’d be great

302
00:21:02,530 --> 00:21:07,710
to have your assistance
working on this. So,

303
00:21:07,710 --> 00:21:12,430
actually Metrics has been accumulating
all sorts of analysis tools

304
00:21:12,430 --> 00:21:16,990
over the past 5 years. So there are up to
30 different little tools. There’s Atlas

305
00:21:16,990 --> 00:21:22,410
and Globe and Stem and 20-something more,
which is a challenge to keep coordinated,

306
00:21:22,410 --> 00:21:26,690
a challenge to keep maintained. So
they’ve been working on how to integrate

307
00:21:26,690 --> 00:21:32,090
these things and make them more
usable and maintainable and extensible.

308
00:21:32,090 --> 00:21:36,370
So one example that they… so they wrote
some slides for me to present here.

309
00:21:36,370 --> 00:21:40,050
One example that they were looking
at, to give you an example of how

310
00:21:40,050 --> 00:21:45,540
this analysis works, is bad relays in the
Tor network. So maybe that’s an exit relay

311
00:21:45,540 --> 00:21:50,520
that runs, but it modifies traffic, or
it watches traffic or something.

312
00:21:50,520 --> 00:21:56,150
Maybe it’s a relay that signs up
as a Hidden Service directory

313
00:21:56,150 --> 00:21:59,970
and then when you publish your
onion address to it, it goes to visit it

314
00:21:59,970 --> 00:22:04,370
or it puts it on a big list or something
like that. Or maybe bad relays are Sybils

315
00:22:04,370 --> 00:22:09,580
who – we were talking earlier about
the 2014 attack where 100 relays

316
00:22:09,580 --> 00:22:14,750
showed up at once. And we, the directory
authorities, have a couple of ways of

317
00:22:14,750 --> 00:22:19,500
addressing those relays. One of them is
each of the directory authorities can say:

318
00:22:19,500 --> 00:22:22,670
“That relay needs to get out of the
network! We just cut it out of the

319
00:22:22,670 --> 00:22:27,900
network.” We can also say: “Bad exit!”,
meaning: “That relay is no longer

320
00:22:27,900 --> 00:22:33,240
gonna be used as an exit!” So even though
it advertises that it can reach Blockchain

321
00:22:33,240 --> 00:22:39,320
and other websites, clients choose not to
do it that way. So that’s the background.

322
00:22:39,320 --> 00:22:44,920
One of the tools that Damian wrote a while
ago is called Tor-Consensus-Health and it

323
00:22:44,920 --> 00:22:49,570
looks every hour at the new list of relays
in the network and it tries to figure out:

324
00:22:49,570 --> 00:22:53,000
“Is there something suspicious that
just happened at this point?” And in this

325
00:22:53,000 --> 00:22:57,920
case it looks for a bunch of new relays
showing up all at the same time with

326
00:22:57,920 --> 00:23:04,530
similar characteristics and it sends email
to a list. So that’s useful. The second

327
00:23:04,530 --> 00:23:08,910
piece of the analysis is “OK, what do you
do when that happens?” So we get an email

328
00:23:08,910 --> 00:23:13,960
saying “Hey, 40 new relays showed up,
what’s up with that?” So there’s a real

329
00:23:13,960 --> 00:23:18,790
challenge there to decide: do we allow
the TOR network to grow – sounds good –

330
00:23:18,790 --> 00:23:23,280
or do we wonder who these people are
and try to contact them or cut them out of

331
00:23:23,280 --> 00:23:29,600
the network or constrain what fraction
of the network they can become.
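
The detection step described above — a batch of relays appearing at the same time — can be sketched in a few lines. This is not the actual consensus-health code, just a minimal illustration of the idea, assuming Onionoo-style `first_seen` timestamps and an arbitrary threshold; the sample data is made up.

```python
from collections import Counter

def suspicious_groups(relays, threshold=20):
    """Group relays by the hour in which they first appeared and flag any
    hour in which an unusually large batch showed up at once.

    `relays` is a list of dicts with an Onionoo-style 'first_seen'
    timestamp ("YYYY-MM-DD HH:MM:SS").
    """
    # Truncate each timestamp to its hour, e.g. "2014-07-01 12:00".
    by_hour = Counter(r["first_seen"][:13] + ":00" for r in relays)
    return {hour: n for hour, n in by_hour.items() if n >= threshold}

# Hypothetical data: 40 relays appear within a single hour, 3 elsewhere.
relays = (
    [{"first_seen": "2014-07-01 12:%02d:00" % (i % 60)} for i in range(40)]
    + [{"first_seen": "2014-06-%02d 08:00:00" % d} for d in (1, 2, 3)]
)
print(suspicious_groups(relays))  # {'2014-07-01 12:00': 40}
```

A real checker would also compare software versions, ports and netblocks before sending that email.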

332
00:23:29,600 --> 00:23:35,150
So Philipp Winter also has a
visualization, in this case of basically

333
00:23:35,150 --> 00:23:41,310
which relays were around on a given month.
So the X axis is all of the different

334
00:23:41,310 --> 00:23:46,100
relays in the month and the Y axis is each
hour during that month. And they’ve sorted

335
00:23:46,100 --> 00:23:51,010
the relays here by how much they were
present in the given month. And you’ll

336
00:23:51,010 --> 00:23:55,120
notice the red blocks over there are
relays that showed up at the same time

337
00:23:55,120 --> 00:23:59,320
and they’d been consistently present at
the same time since then. So that’s kind

338
00:23:59,320 --> 00:24:03,070
of suspicious. That’s “Hey, wait a minute,
what’s that pattern going on there?”

339
00:24:03,070 --> 00:24:07,260
So this is a cool way of visualizing and
being able to drill down and say:

340
00:24:07,260 --> 00:24:10,780
“Wait a minute, that pattern right there,
something weird just happened.”

341
00:24:10,780 --> 00:24:14,470
So part of the challenge in general for
the Metrics Team is: they have a Terabyte

342
00:24:14,470 --> 00:24:18,350
of interesting data of what the network
has looked like over the years –

343
00:24:18,350 --> 00:24:23,650
how do you turn that into “Wait a minute,
that right there is something mysterious

344
00:24:23,650 --> 00:24:27,320
that just happened. Let’s look at it
more.” So you can look at it from

345
00:24:27,320 --> 00:24:31,650
the visualization side but you can also
– there’s a tool called Onionoo where

346
00:24:31,650 --> 00:24:35,290
you can basically query it – all sorts
of queries – and it dumps the data

347
00:24:35,290 --> 00:24:39,940
back on to you. So we’ve got a Terabyte
of interesting data out there, what

348
00:24:39,940 --> 00:24:44,810
relays are on the network, what
sort of statistics they’ve been reporting,

349
00:24:44,810 --> 00:24:48,930
when they’re up, when they’re down,
whether they change keys a lot,

350
00:24:48,930 --> 00:24:55,080
whether they change IP addresses a lot.
So we encourage you to investigate and

351
00:24:55,080 --> 00:24:59,410
look at these tools etc. So there’s
a new website we set up this year

352
00:24:59,410 --> 00:25:05,180
called CollecTor, collector.torproject.org
that has all of these different data sets

353
00:25:05,180 --> 00:25:09,270
and pointers to all these different
libraries and tools etc. that you too

354
00:25:09,270 --> 00:25:15,030
can use to investigate, graph-visualize
etc. So here’s another example.

355
00:25:15,030 --> 00:25:19,280
At this point we’re looking at the 9
directory authorities in the network.

356
00:25:19,280 --> 00:25:24,620
Each of them votes its opinion about
each relay. So whether the relay’s fast,

357
00:25:24,620 --> 00:25:31,060
or stable, or looks like a good exit or
maybe we should vote about “Bad Exit”

358
00:25:31,060 --> 00:25:35,850
for it. So the grey lines are: all of the
directory authorities thought that

359
00:25:35,850 --> 00:25:41,120
it didn’t deserve the flag and it’s very
clear. The green lines are: enough of the

360
00:25:41,120 --> 00:25:45,310
directory authorities said that the relay
should get the flag, also very clear.

361
00:25:45,310 --> 00:25:49,960
And all the brown and light green etc.
in the middle are contradictions.

362
00:25:49,960 --> 00:25:53,290
That’s where some of the directory
authorities said “Yes it’s fast” and some

363
00:25:53,290 --> 00:25:58,710
of them said “No, it’s not fast”. And this
gives us a visualization, a way to see

364
00:25:58,710 --> 00:26:02,800
whether most of the directory authorities
are agreeing with each other.
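
The per-flag voting described above can be sketched as a simple tally. This is a simplification — the real per-flag rules in the directory specification are more involved — using real authority nicknames but entirely made-up votes.

```python
def consensus_flags(votes):
    """Given each directory authority's flag vote for one relay, return the
    flags a majority of authorities assigned, plus the contested ones
    (voted for by some authorities but not a majority)."""
    n = len(votes)
    tally = {}
    for flags in votes.values():
        for f in flags:
            tally[f] = tally.get(f, 0) + 1
    granted = {f for f, c in tally.items() if c > n // 2}
    contested = {f for f, c in tally.items() if 0 < c <= n // 2}
    return granted, contested

# Hypothetical votes from 5 of the authorities about a single relay.
votes = {
    "moria1":   {"Fast", "Running", "Stable"},
    "tor26":    {"Fast", "Running"},
    "dizum":    {"Fast", "Running", "Stable"},
    "gabelmoo": {"Running", "Stable"},
    "longclaw": {"Running", "BadExit"},
}
granted, contested = consensus_flags(votes)
print(sorted(granted))    # ['Fast', 'Running', 'Stable']
print(sorted(contested))  # ['BadExit']
```

In the visualization described above, the `granted` and clear-rejection cases are the green and grey lines; the `contested` set is the brown middle.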

365
00:26:02,800 --> 00:26:06,290
We should look at this over time and if
suddenly there’s a huge brown area

366
00:26:06,290 --> 00:26:10,930
then we can say “Wait a minute,
something’s going on”, where maybe

367
00:26:10,930 --> 00:26:15,080
a set of relays are trying to look good to
these directory authorities and trying

368
00:26:15,080 --> 00:26:19,700
not to look good to these. So basically
it helps us to recognize patterns

369
00:26:19,700 --> 00:26:26,070
of weird things going on. So on CollecTor
you can find all sorts of data sets

370
00:26:26,070 --> 00:26:32,690
and you can fetch them and do your
analysis of them. And Tor Metrics

371
00:26:32,690 --> 00:26:38,280
– metrics.torproject.org – has a bunch of
examples of this analysis, where you can

372
00:26:38,280 --> 00:26:42,430
look at graphs of the number of people
connecting from different countries, the

373
00:26:42,430 --> 00:26:46,700
number of relays over time, the number
of new relays, the number of bridges,

374
00:26:46,700 --> 00:26:52,530
users connecting to bridges etc. There
are 3 different libraries that help you

375
00:26:52,530 --> 00:26:56,210
to parse these various data sets. So
there’s one in Python, one in Java,

376
00:26:56,210 --> 00:27:01,160
one in Go; so whichever one of those
you enjoy most you can grab and start

377
00:27:01,160 --> 00:27:07,860
doing analysis. They do weekly or so
IRC meetings, so the TOR Metrics Team

378
00:27:07,860 --> 00:27:11,950
invites you to show up on January 7th
and they would love to have your help.
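
To give a feel for the data those parsing libraries handle, here is a toy parse of a single “r” line from a network-status consensus, following the field order in the Tor directory specification. The real libraries (Stem in Python, metrics-lib in Java, zoossh in Go) do this properly; the values below are made up.

```python
def parse_router_line(line):
    """Parse an 'r' line from a Tor network-status consensus:
    nickname, base64 identity, descriptor digest, publication time,
    IP address, ORPort and DirPort."""
    kind, nick, identity, digest, date, time, ip, orport, dirport = line.split()
    assert kind == "r"
    return {
        "nickname": nick,
        "identity": identity,          # base64, '=' padding stripped
        "published": date + " " + time,
        "address": ip,
        "or_port": int(orport),
        "dir_port": int(dirport),
    }

# A made-up but well-formed consensus entry.
line = ("r ExampleRelay p1aag7VwarGxqctS7/fS0y5FU+s "
        "dGhpc2lzYWRpZ2VzdDEyMw 2015-12-28 06:30:57 198.51.100.7 9001 9030")
entry = parse_router_line(line)
print(entry["nickname"], entry["or_port"])  # ExampleRelay 9001
```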

379
00:27:11,950 --> 00:27:15,340
They have a bunch of really interesting
data, they have a bunch of really

380
00:27:15,340 --> 00:27:21,460
interesting analysis tools and they’re
missing curious people. So show up,

381
00:27:21,460 --> 00:27:25,240
start asking questions about the data, try
to learn what’s going on. And you can

382
00:27:25,240 --> 00:27:28,305
learn more about them, the
Metrics Team, there.

383
00:27:28,305 --> 00:27:32,055
And then I’m gonna pass it on to Mike.

384
00:27:32,055 --> 00:27:38,720
*applause*

385
00:27:38,720 --> 00:27:43,750
Mike: OK, so Hello everyone! So, I’ll be
telling you about the Applications Team part

386
00:27:43,750 --> 00:27:48,600
of the Vegas plan that
Jake introduced. Basically,

387
00:27:48,600 --> 00:27:54,060
the Applications Team was created to
bring together all the aspects of TOR

388
00:27:54,060 --> 00:27:58,500
and the extended community that are
working on anything that’s user facing.

389
00:27:58,500 --> 00:28:02,890
So anything with a user interface that
the user will directly interact with,

390
00:28:02,890 --> 00:28:08,550
that’s an application on
either Mobile or Desktop.

391
00:28:08,550 --> 00:28:13,020
So to start, obviously we had the
TOR Browser, that’s sort of like

392
00:28:13,020 --> 00:28:18,620
a flagship application that most people
are familiar with when they think of TOR.

393
00:28:18,620 --> 00:28:22,990
Recently we’ve added Orfox, which is a
project by the Guardian Project to port

394
00:28:22,990 --> 00:28:28,050
the TOR Browser patches to Android
and that’s currently in Alpha Status. But

395
00:28:28,050 --> 00:28:34,190
it’s available on the Guardian Project’s
F-Droid Repo. We also have 2 chat clients:

396
00:28:34,190 --> 00:28:39,020
TOR Messenger and Ricochet, both with
different security properties. I will be

397
00:28:39,020 --> 00:28:44,290
getting to it later. So I guess, first
off let’s talk about what happened

398
00:28:44,290 --> 00:28:51,070
in the TOR Browser world in 2015.
Basically most of the, or a good deal

399
00:28:51,070 --> 00:28:56,520
of our work is spent keeping up
with the Firefox release treadmill.

400
00:28:56,520 --> 00:29:01,620
That includes responding
to emergency releases,

401
00:29:01,620 --> 00:29:06,730
auditing changes in the Firefox code
base making sure that their features

402
00:29:06,730 --> 00:29:10,940
adhere to our privacy model and making
sure that our releases come out

403
00:29:10,940 --> 00:29:15,060
the same day as the official
Firefox releases so that there’s

404
00:29:15,060 --> 00:29:20,130
no exposure to known
vulnerabilities after they’re disclosed.

405
00:29:20,130 --> 00:29:24,870
That has been a little bit rough over
2015. I believe there was a solid 3 to 4

406
00:29:24,870 --> 00:29:29,500
months where it felt like we were doing
a release every 2 weeks. Due to either

407
00:29:29,500 --> 00:29:38,880
Logjam or a random unassessed
vulnerability or any arbitrary

408
00:29:38,880 --> 00:29:43,620
security issue with Firefox. But we did…
despite treading all that water we did

409
00:29:43,620 --> 00:29:48,710
manage to get quite a bit of work done.
As always our work on the browser focuses

410
00:29:48,710 --> 00:29:54,700
on 3 main areas: privacy, security
and usability. Our privacy work is

411
00:29:54,700 --> 00:30:00,330
primarily focused around making sure that
any new browser feature doesn’t enable

412
00:30:00,330 --> 00:30:05,720
new vectors for 3rd party tracking. So no
ways for a 3rd party content resource to

413
00:30:05,720 --> 00:30:12,570
store state or cookies or blob URIs
or some of the newer features.

414
00:30:12,570 --> 00:30:16,940
There’s a new cache API. These sorts
of things need to all be isolated

415
00:30:16,940 --> 00:30:20,840
to the URL bar domain to prevent 3rd
parties from being able to track you.

416
00:30:20,840 --> 00:30:25,180
From being able to recognize it’s the same
you when you log in to Facebook and

417
00:30:25,180 --> 00:30:31,730
when you visit CNN, and CNN loads
the Facebook Like buttons, e.g.
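
The isolation idea described above can be modeled as double-keyed storage: every piece of third-party state is keyed by the URL-bar (first-party) domain in addition to the third party's own origin. This is a toy sketch of the concept, not Tor Browser's implementation.

```python
class IsolatedStorage:
    """Toy model of first-party ("URL bar domain") isolation: a tracker
    embedded on two different sites gets two unrelated stores, so it
    cannot recognize that both visits are the same user."""

    def __init__(self):
        self._store = {}

    def set(self, first_party, origin, key, value):
        # State is keyed by (top-level site, third-party origin, key).
        self._store[(first_party, origin, key)] = value

    def get(self, first_party, origin, key):
        return self._store.get((first_party, origin, key))

s = IsolatedStorage()
# The tracker stores an ID while the user is on facebook.com itself...
s.set("facebook.com", "facebook.com", "uid", "alice")
# ...but when the same third party loads inside cnn.com, its store is empty:
print(s.get("facebook.com", "facebook.com", "uid"))  # alice
print(s.get("cnn.com", "facebook.com", "uid"))       # None
```

Without the first-party key, both lookups would hit the same entry and the Like button could link the two visits.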

418
00:30:31,730 --> 00:30:36,530
Additionally we have done a lot of work on
fingerprinting defences, the Alpha Release

419
00:30:36,530 --> 00:30:41,250
ships a set of fonts for
Linux users so that

420
00:30:41,250 --> 00:30:45,340
font fingerprinting can be normalized
since a lot of Linux users tend to have

421
00:30:45,340 --> 00:30:49,920
different fonts installed on their
systems. It also tries to normalize

422
00:30:49,920 --> 00:30:54,380
the font list that’s allowed for Windows
and Mac users where they often get

423
00:30:54,380 --> 00:30:59,670
additional fonts from 3rd party
applications that install them.

424
00:30:59,670 --> 00:31:05,120
On the security front the major exciting
piece is the security slider. So with iSEC

425
00:31:05,120 --> 00:31:11,810
Partners’ help we did a review of all the
Firefox vulnerabilities and categorized

426
00:31:11,810 --> 00:31:16,680
them based on the component that they were
in as well as their prevalence on the web.

427
00:31:16,680 --> 00:31:21,970
And came up with 4 positions that allow
you to choose, basically trade off,

428
00:31:21,970 --> 00:31:26,080
functionality for vulnerability surface
reduction. And this was actually quite

429
00:31:26,080 --> 00:31:31,870
successful. It turned out that
all of the Pwn2Own exploits

430
00:31:31,870 --> 00:31:39,990
against Firefox were actually blocked
for non-https sites at medium/high.

431
00:31:39,990 --> 00:31:46,270
And if you enable the high security
level they were blocked for everything.

432
00:31:46,270 --> 00:31:50,130
We additionally released address-
sanitizer-hardened builds. These,

433
00:31:50,130 --> 00:31:54,150
especially at the higher
security levels of the security slider,

434
00:31:54,150 --> 00:31:58,810
should protect against various memory
safety issues in the browser and also

435
00:31:58,810 --> 00:32:04,630
help us diagnose issues very rapidly.

436
00:32:04,630 --> 00:32:10,380
And of course we now sign our Windows
packages using a hardware security module

437
00:32:10,380 --> 00:32:16,850
from DigiCert. The usability improvements
were primarily focused around this UI and

438
00:32:16,850 --> 00:32:21,100
these new Onion menus, as you can see. If you
remember the old menu, there were quite a

439
00:32:21,100 --> 00:32:24,400
lot more options there. We sort of
condensed and consolidated options and

440
00:32:24,400 --> 00:32:29,490
eliminated and combined as much as we
could. And we additionally displayed the

441
00:32:29,490 --> 00:32:37,360
circuit for the current URL bar domain.
In 2016 we’ll be focusing mostly on again

442
00:32:37,360 --> 00:32:41,910
the same 3 areas. Our main goal for
privacy is to try and convince Mozilla

443
00:32:41,910 --> 00:32:48,160
that they want to adopt our idea of
isolating 3rd party identifiers at least

444
00:32:48,160 --> 00:32:52,150
to the point of if the user goes into the
Preferences and tries to disable 3rd party

445
00:32:52,150 --> 00:32:57,860
cookies, it will let you do the same thing
for DOM storage, cache, blob URIs,

446
00:32:57,860 --> 00:33:02,760
worker threads, and all these
other sources of shared state.

447
00:33:02,760 --> 00:33:07,910
We’re very excited about their work on a
multi-process sandbox, and additionally on

448
00:33:07,910 --> 00:33:13,580
application-level sandboxing. Even
without Mozilla’s sandbox,

449
00:33:13,580 --> 00:33:18,620
we should still be able to prevent the
browser from bypassing TOR using SecComp

450
00:33:18,620 --> 00:33:22,640
or AppArmor or SeatBelt or one of
these other sandboxing technologies.

451
00:33:22,640 --> 00:33:25,410
We’re looking forward to trying to
get that rolled out. And we’re doing

452
00:33:25,410 --> 00:33:30,500
exploit bounties! We’ll be
partnering with HackerOne,

453
00:33:30,500 --> 00:33:34,080
who’ll be announcing this shortly. The
program will start out invite-only

454
00:33:34,080 --> 00:33:37,200
and then… just, so we can get
used to the flow and scale up

455
00:33:37,200 --> 00:33:41,810
and then we’ll make it public later in the
year to basically provide people with

456
00:33:41,810 --> 00:33:46,560
incentive to review our code to look
for vulnerabilities that might be

457
00:33:46,560 --> 00:33:51,130
specific to our applications. And of
course the usual usability improving,

458
00:33:51,130 --> 00:33:57,470
security, improving installation. And we’d
like to improve the censorship and bridge

459
00:33:57,470 --> 00:34:02,780
usability flow as well, hoping to automate
the discovery of bridges and inform you

460
00:34:02,780 --> 00:34:08,639
if your bridges become unreachable.
So TOR Messenger

461
00:34:08,639 --> 00:34:13,230
is one of our 2 chat clients, also
part of the Applications Team.

462
00:34:13,230 --> 00:34:17,540
Basically, the goal there was to minimize
the amount of configuration that

463
00:34:17,540 --> 00:34:21,360
the user had to do if they wanted to
use one of their existing chat clients

464
00:34:21,360 --> 00:34:26,780
with TOR and OTR. Now this is based

465
00:34:26,780 --> 00:34:32,290
on another Mozilla platform – Instantbird
which is based on Thunderbird.

466
00:34:32,290 --> 00:34:38,300
This allows us to share a lot of the
TOR Browser configuration code

467
00:34:38,300 --> 00:34:42,120
for managing the TOR process and
configuring bridges. So the user has a

468
00:34:42,120 --> 00:34:47,270
very similar configuration
experience to the browser

469
00:34:47,270 --> 00:34:53,139
when they first start it up. It also has
some additional memory safety advantages

470
00:34:53,139 --> 00:34:58,770
– all the protocol parsers are written
in Javascript. This basically…

471
00:34:58,770 --> 00:35:03,660
one of the major things when we
were looking at candidates for

472
00:35:03,660 --> 00:35:08,470
a messaging client was we wanted to avoid
the problems of libpurple in the past

473
00:35:08,470 --> 00:35:11,980
where there’s been a lot of, like, remote
code execution vulnerabilities with

474
00:35:11,980 --> 00:35:16,860
protocol parsing. Now there are some
trade-offs here, obviously, when you’re

475
00:35:16,860 --> 00:35:22,560
dealing with a browser product. You
still have an HTML window rendering

476
00:35:22,560 --> 00:35:30,090
the messages. But it is XSS filtered and
even if an XSS exploit were to get through

477
00:35:30,090 --> 00:35:34,320
to run Javascript in your messaging
window that Javascript would still be

478
00:35:34,320 --> 00:35:40,030
unprivileged. So they need an additional
browser-style exploit. And that filter has

479
00:35:40,030 --> 00:35:44,270
been reviewed by Mozilla and additionally
we’re looking into removing Javascript

480
00:35:44,270 --> 00:35:48,740
from that messaging window entirely.
It should be completely possible to just

481
00:35:48,740 --> 00:35:54,950
display a reduced, slightly less sexy
version of the same window at perhaps

482
00:35:54,950 --> 00:36:00,670
another higher security level without
Javascript involved at all in that window.

483
00:36:00,670 --> 00:36:04,070
So we will hand off to Jake now to
describe some of the security properties

484
00:36:04,070 --> 00:36:06,090
and differences between TOR
Messenger and Ricochet.

485
00:36:06,090 --> 00:36:12,220
Jacob: Just to be clear about this: We
wanted to sort of echo what Phil Rogaway

486
00:36:12,220 --> 00:36:16,440
has recently said. He wrote a really
wonderful paper quite recently about the

487
00:36:16,440 --> 00:36:20,910
moral character of cryptographic work and
Phil Rogaway for those of you that don’t

488
00:36:20,910 --> 00:36:24,310
know is one of the sort of like amazing
cryptographers, very humble, really

489
00:36:24,310 --> 00:36:29,990
wonderful man who was really a little bit
sad that cryptographers and people

490
00:36:29,990 --> 00:36:34,890
working on security software don’t take
the adversaries seriously. So they use

491
00:36:34,890 --> 00:36:39,610
Alice and Bob, and Mallory and they have
cutie icons and they look very happy.

492
00:36:39,610 --> 00:36:44,620
We wanted to make it clear what we thought
the adversary was. Which is definitely not

493
00:36:44,620 --> 00:36:53,090
a cutie adversary. When anonymity fails
for Muslims that live in Pakistan, or e.g.

494
00:36:53,090 --> 00:36:56,580
the guys that are giving a talk later
today, the CAGE guys, when anonymity fails

495
00:36:56,580 --> 00:37:01,420
for them they get detained or they get
murdered or they end up in Guantanamo Bay

496
00:37:01,420 --> 00:37:05,480
or other things like that. So it’s a
serious thing. And we wanted to talk about

497
00:37:05,480 --> 00:37:11,400
what that looks like. So e.g. a lot of you
use jabber.ccc.de, I guess. Don’t raise

498
00:37:11,400 --> 00:37:16,530
your hands. You should decentralize. Stop
using jabber.ccc.de because we should

499
00:37:16,530 --> 00:37:20,960
decentralize. But that said if you do,
this is sort of what it looks like, right?

500
00:37:20,960 --> 00:37:24,090
There’s the possibility for targeted
attacks when you connect. There’s the

501
00:37:24,090 --> 00:37:29,080
possibility that the Social Graph e.g. of
your buddy list, that that would be on the

502
00:37:29,080 --> 00:37:32,740
server. It would be possible that there’s
a bug on any Jabber server anywhere.

503
00:37:32,740 --> 00:37:36,380
So of course you know that if you’re using
Gmail with Jabber, you know that they are

504
00:37:36,380 --> 00:37:40,100
PRISM providers. So you’ve got a pretty
big problem there and the attacker, again,

505
00:37:40,100 --> 00:37:44,410
is not a cutie attacker. You know,
I like the Grim Reaper image that

506
00:37:44,410 --> 00:37:48,820
Mike chose; if you like, that’s accurate.
Now, one of the protections

507
00:37:48,820 --> 00:37:51,770
you’ll have for communicating with your
peers is off-the-record messaging. That’s

508
00:37:51,770 --> 00:37:57,770
basically the thing. But that’s a very
slapped-together protocol in a sense. Because

509
00:37:57,770 --> 00:38:02,720
it’s hacks on top of hacks. Where you
know you compose TOR with Jabber and TLS

510
00:38:02,720 --> 00:38:05,860
and maybe you still have a certificate
authority in there somewhere. Or maybe you

511
00:38:05,860 --> 00:38:09,550
have a TOR Hidden Service but then your
status updates don’t have any

512
00:38:09,550 --> 00:38:16,430
encryption at all, for example. Or, again,
your roster is an actual thing that

513
00:38:16,430 --> 00:38:19,110
someone can see, including every time you
send a message to those people the server

514
00:38:19,110 --> 00:38:24,820
sees that. So, that said, TOR Messenger is
really great because it meets users where

515
00:38:24,820 --> 00:38:28,930
they already are. Right? So e.g. actually
one other point here is if you use a piece

516
00:38:28,930 --> 00:38:33,420
of software like Adium, there is actually
a bug filed against Adium where someone

517
00:38:33,420 --> 00:38:37,630
said “Please disable logging-by-default
because Chelsea Manning went to prison

518
00:38:37,630 --> 00:38:41,620
because of your logging policy”. And the
people working on Adium in this bug report

519
00:38:41,620 --> 00:38:48,710
basically said: “Good!” That’s horrifying!
Right? So what if we made it as reasonable

520
00:38:48,710 --> 00:38:54,590
as possible, as configuration-free as
possible using TOR, using OTR, trying to

521
00:38:54,590 --> 00:38:58,650
remove libpurple which is a whole like…
it’s a flock of zero-days flying in

522
00:38:58,650 --> 00:39:07,640
formation. Right? So we wanted to kill the
bird in a sense, but we also want to

523
00:39:07,640 --> 00:39:14,360
help provide an incentive for improving.
And so that’s where TOR Messenger fits.

524
00:39:14,360 --> 00:39:19,670
But we also want to experiment with next
generation stuff. And one of those things

525
00:39:19,670 --> 00:39:25,120
is written by a really great guy in our
community, almost single-handedly, without

526
00:39:25,120 --> 00:39:30,760
any funding at all, and his name is
“special”, that’s actually his name. He’s

527
00:39:30,760 --> 00:39:37,020
also special. But it’s really nice,
because actually, if you solve the problem

528
00:39:37,020 --> 00:39:40,810
of telling your friend your name, if
you’re familiar with the properties of

529
00:39:40,810 --> 00:39:44,940
Hidden Services where you have a self-
authenticating name you know that you’re

530
00:39:44,940 --> 00:39:47,690
talking to the person that you think you
are because you’ve already done a key

531
00:39:47,690 --> 00:39:51,780
exchange. The important part of the key
exchange. And so one of the things that

532
00:39:51,780 --> 00:39:58,790
you’ll see very clearly is that there is
no more server. Right? So there’s no more

533
00:39:58,790 --> 00:40:05,130
jabber.ccc.de in this picture. So this is
a really good example of how we might

534
00:40:05,130 --> 00:40:09,119
decentralize, actually. It’s an experiment
right now but it means no more servers. It

535
00:40:09,119 --> 00:40:14,500
uses the TOR network’s TOR Hidden Service
protocol and everybody actually becomes a

536
00:40:14,500 --> 00:40:18,720
TOR Hidden Service for chatting with their
buddies. And it’s end-to-end encrypted and

537
00:40:18,720 --> 00:40:23,360
it’s anonymized and of course this means
that your Social Graph is a traffic

538
00:40:23,360 --> 00:40:27,980
analysis problem, it’s no longer a list on
a server. And it means your metadata is

539
00:40:27,980 --> 00:40:32,790
as protected as we currently know how
to do in a low-latency anonymity network.
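
The “self-authenticating name” property mentioned above comes from how (v2-era) onion addresses are derived from the service's public key: the address is the first 80 bits of the SHA-1 hash of the DER-encoded key, base32-encoded. Knowing the address therefore lets you verify you are talking to the key holder. A sketch of the derivation, with a dummy byte string standing in for a real DER-encoded RSA key:

```python
import base64
import hashlib

def onion_address(public_key_der: bytes) -> str:
    """Derive a v2-style .onion address: the first 80 bits of the SHA-1
    digest of the service's DER-encoded public key, base32-encoded and
    lowercased. Because the name is a hash of the key, the name itself
    authenticates the key exchange."""
    digest = hashlib.sha1(public_key_der).digest()
    # 10 bytes = 80 bits = exactly 16 base32 characters, no padding.
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"

# Dummy stand-in for a real key, just to illustrate the derivation.
fake_key = b"-- not a real key, just illustrating the derivation --"
addr = onion_address(fake_key)
print(addr)        # 16 base32 characters followed by '.onion'
print(len(addr))   # 22
```

This is why Ricochet needs no server: your contact's address alone is enough to both find and authenticate them over the Hidden Service protocol.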

540
00:40:32,790 --> 00:40:36,480
And in the future one of the really nice
things about this is that it will be

541
00:40:36,480 --> 00:40:41,850
possible – or we think it will be
possible – to even make it better in a

542
00:40:41,850 --> 00:40:46,920
sense, e.g. multiple chats, sending
files, sending pictures, in other words,

543
00:40:46,920 --> 00:40:50,780
instead of a certainty,
everything moves towards probability. And the

544
00:40:50,780 --> 00:40:52,890
probability is in your favour.

545
00:40:52,890 --> 00:41:00,000
Mike: Yes, additionally, I’ll be working
on various forms of padding for cases like

546
00:41:00,000 --> 00:41:04,140
this, to basically increase
the probability that there will be

547
00:41:04,140 --> 00:41:10,000
concurrent traffic at the same time from
multiple TOR clients, which will further

548
00:41:10,000 --> 00:41:13,720
frustrate the discovery of the Social
Graph based on simple traffic analysis

549
00:41:13,720 --> 00:41:21,940
especially for low-traffic cases such as
Ricochet. So just to wrap up that

550
00:41:21,940 --> 00:41:29,230
TOR Applications piece: in 2016 we’re
trying to focus heavily on usability and

551
00:41:29,230 --> 00:41:34,950
get more people to be able to use TOR,
removing the barriers to finding TOR,

552
00:41:34,950 --> 00:41:40,110
downloading TOR – especially
for censored users – and being able to

553
00:41:40,110 --> 00:41:45,100
install TOR. There’s still some snags,
various difficulties that cause people to

554
00:41:45,100 --> 00:41:49,560
stop at various stages of that process and
we want to try and work to eliminate

555
00:41:49,560 --> 00:41:53,320
them. We also, of course, want to increase
coordination: share graphics, visual

556
00:41:53,320 --> 00:42:00,900
aesthetics and coordinate the ability to
share the TOR process. And we also want to

557
00:42:00,900 --> 00:42:04,540
create a space for more experimentation,
for more things like Ricochet. There’s

558
00:42:04,540 --> 00:42:08,810
probably a lot more ideas like Ricochet
out there. There could be ways of leveraging

559
00:42:08,810 --> 00:42:12,150
the TOR protocol and especially Hidden
Services in creative ways. So we’re

560
00:42:12,150 --> 00:42:16,130
looking to create an officially sanctioned
space as part of TOR to give them a home.

561
00:42:16,130 --> 00:42:21,280
So look for that in the coming
months on the TOR blog.

562
00:42:21,280 --> 00:42:26,600
Jacob: Alright, I just wanted to put in a
picture of a guy wearing a Slayer T-Shirt.

563
00:42:26,600 --> 00:42:31,380
So there it is. That’s Trevor Paglen. Some
of you may remember him from such things

564
00:42:31,380 --> 00:42:36,150
as helping to film Citizenfour, building
satellites that burn up in space, and artworks that

565
00:42:36,150 --> 00:42:41,030
are actually currently on other
satellites. And this on the left is

566
00:42:41,030 --> 00:42:45,550
Leif Ryge, he’s sort of the person that
taught me how to use computers. And he is

567
00:42:45,550 --> 00:42:49,050
an incredible Free Software developer.
Trevor Paglen and myself, and this is

568
00:42:49,050 --> 00:42:52,640
a cube, the Autonomy Cube which we talked
about last year. Because we think that

569
00:42:52,640 --> 00:42:57,220
culture is very important and we think
that it’s important to actually get people

570
00:42:57,220 --> 00:43:01,500
to understand the struggle that exists
right now. So this is installed in a

571
00:43:01,500 --> 00:43:06,470
museum right now in Germany, in the city
of Oldenburg, at the Edith-Russ-Haus. And

572
00:43:06,470 --> 00:43:10,810
it actually opened several months ago,
it’s filled with classified documents, it

573
00:43:10,810 --> 00:43:14,000
has really interesting things to go and
read. I highly encourage you to go and

574
00:43:14,000 --> 00:43:18,060
read. We built a reading room with
anonymity papers, about things that are

575
00:43:18,060 --> 00:43:22,990
happening. About how corporations track
you, and then the entire museum is an

576
00:43:22,990 --> 00:43:27,730
open WiFi network that routes you
transparently through TOR. So in Germany

577
00:43:27,730 --> 00:43:32,520
a free open WiFi network that isn’t run by
Freifunk – much respect to them – is rare. We

578
00:43:32,520 --> 00:43:36,869
wanted to make it possible for you to just
go and have the ability to bootstrap

579
00:43:36,869 --> 00:43:43,030
yourself anonymously if you needed to. And
also these four boards are Novena boards.

580
00:43:43,030 --> 00:43:47,730
And these Novena boards are Free and Open
Hardware devices made by Bunnie and Sean

581
00:43:47,730 --> 00:43:51,220
in Singapore where you could, if you
wanted to, download the schematics and

582
00:43:51,220 --> 00:43:55,990
fab it yourself. And it’s running the
Debian GNU/Linux universal operating

583
00:43:55,990 --> 00:44:01,350
system. And it’s an actual TOR exit node
with absolutely every port allowed. So the

584
00:44:01,350 --> 00:44:06,780
museum’s infrastructure itself on the
city’s internet connection actually is a

585
00:44:06,780 --> 00:44:13,619
TOR exit node for the whole world to be
able to use the internet anonymously.

586
00:44:13,619 --> 00:44:20,340
*applause*

587
00:44:20,340 --> 00:44:24,170
But the museum’s infrastructure is not
just helping people in Oldenburg, it’s

588
00:44:24,170 --> 00:44:28,830
helping people all around the world to be
able to communicate anonymously and it’s

589
00:44:28,830 --> 00:44:31,830
quite amazing actually because when
cultural institutions stand up for this

590
00:44:31,830 --> 00:44:35,960
we recognize it’s not just a problem of
over there. We have mass surveillance

591
00:44:35,960 --> 00:44:40,850
and corporate surveillance in the West
and we need to deal with that here, by

592
00:44:40,850 --> 00:44:45,550
creating spaces like this. But that said,
we also need to make sure that we create

593
00:44:45,550 --> 00:44:49,250
spaces in people’s minds all around the
world. And I want to introduce to you

594
00:44:49,250 --> 00:44:55,380
someone who’s incredibly awesome, the
most bad-ass radical librarian around,

595
00:44:55,380 --> 00:44:58,830
this is Alison.
Alison is going to talk about…

596
00:44:58,830 --> 00:45:03,130
Alison: …Library Freedom Project! Hi!
Thank you so much! I’m so excited

597
00:45:03,130 --> 00:45:09,290
to be here, it’s my first CCC and I’m on
stage, and it’s very… exciting. So I’m

598
00:45:09,290 --> 00:45:12,750
going to talk to you a little bit about my
organization, Library Freedom Project.

599
00:45:12,750 --> 00:45:18,400
I’m the director and what we do: we have
a partnership with TOR project to do

600
00:45:18,400 --> 00:45:23,440
community outreach around TOR and other
privacy-enhancing technologies. Making

601
00:45:23,440 --> 00:45:28,260
the TOR network stronger and making tools
like TOR Browser more ubiquitous and

602
00:45:28,260 --> 00:45:35,540
mainstream, all with the help of a
coalition of radical militant librarians.

603
00:45:35,540 --> 00:45:40,040
So we introduced you to the Library
Freedom Project back in February. We told

604
00:45:40,040 --> 00:45:43,520
you a little bit about the kind of work
that we do, mostly in US libraries,

605
00:45:43,520 --> 00:45:48,930
increasingly internationally. Where
essentially we teach them about tools like

606
00:45:48,930 --> 00:45:54,669
TOR Browser, how to install it on their
local computers, how to teach it in

607
00:45:54,669 --> 00:45:59,080
computer classes that they offer for free
in the library or one-on-one technology

608
00:45:59,080 --> 00:46:04,350
sessions for their community. And we’ve
had a really amazing year since then.

609
00:46:04,350 --> 00:46:08,470
In addition to working with the TOR
Project, we’re really fortunate to work

610
00:46:08,470 --> 00:46:12,470
with the American Civil Liberties Union
(ACLU). If you’re not familiar with them,

611
00:46:12,470 --> 00:46:16,480
they’re basically… they’re the badasses
who’ve been suing the US Intelligence

612
00:46:16,480 --> 00:46:22,710
Agencies and Police for about 100 years.
That is me with 2 people from the ACLU

613
00:46:22,710 --> 00:46:27,550
Massachusetts, Jessie Rossman, who is a
surveillance law expert, and Kade Crockford

614
00:46:27,550 --> 00:46:31,000
who is an activist for the ACLU. And
they’re here; if you see them, buy

615
00:46:31,000 --> 00:46:35,070
them a drink and ask them about the
surveillance capabilities of the US Police.

616
00:46:35,070 --> 00:46:37,980
*applause*

617
00:46:37,980 --> 00:46:43,300
So, it’s really cool! It’s a great
partnership with the ACLU because

618
00:46:43,300 --> 00:46:48,580
basically they can teach why we need to
use tools like TOR Browser. So how to use

619
00:46:48,580 --> 00:46:52,260
them is super-super important but you need
to know about the authorizations, the

620
00:46:52,260 --> 00:46:57,369
programs, all the bad laws and the uses of
them against ordinary people. So, why do

621
00:46:57,369 --> 00:47:01,770
we teach this stuff to librarians? It’s
basically for 2 big reasons. One of them

622
00:47:01,770 --> 00:47:06,470
is that libraries and librarians have an
amazing history of activism around

623
00:47:06,470 --> 00:47:11,450
privacy, fighting surveillance and
fighting censorship in the US where

624
00:47:11,450 --> 00:47:16,090
I live. Librarians were some of the
staunchest opponents of the USA Patriot

625
00:47:16,090 --> 00:47:20,350
Act from the beginning when it was
codified back in 2001. They made T-Shirts

626
00:47:20,350 --> 00:47:25,869
that said “Another hysterical librarian
for Privacy” because of the…

627
00:47:25,869 --> 00:47:29,720
The Attorney General at the time called
them “hysterical” because they

628
00:47:29,720 --> 00:47:33,400
didn’t want this awful authorization to go
through. And of course then after Snowden

629
00:47:33,400 --> 00:47:37,369
we learned many more things about just
how bad the Patriot Act was. So librarians

630
00:47:37,369 --> 00:47:40,800
were some of the first people to oppose
that. They also have fought back against

631
00:47:40,800 --> 00:47:45,060
National Security Letters, which are US
Government information requests that

632
00:47:45,060 --> 00:47:49,750
sometimes go to software providers and
other internet services. They have an

633
00:47:49,750 --> 00:47:53,060
attached gag order that basically says:
“You have to give this information about

634
00:47:53,060 --> 00:47:56,430
your users and you can’t tell anyone that
you got it.” Well, libraries got one of

635
00:47:56,430 --> 00:47:58,900
these and fought back against it, and won.
*applause*

636
00:47:58,900 --> 00:48:05,640
They also, all the way back in the 1950s
even, at the height of Anti-Communist

637
00:48:05,640 --> 00:48:10,790
Fervor and FUD, around the time of the
House Un-American Activities Committee,

638
00:48:10,790 --> 00:48:13,509
librarians came out with this amazing
statement, called the “Freedom to Read”

639
00:48:13,509 --> 00:48:18,910
Statement that I think really is a
beautiful text. It’s about 2 pages long

640
00:48:18,910 --> 00:48:26,080
and it is their commitment to privacy and
democratic ideals made manifest.

641
00:48:26,080 --> 00:48:29,310
And I have a little excerpt from it here.
I’m not gonna read the whole thing to you

642
00:48:29,310 --> 00:48:32,500
’cause I understand I’m all too
pressed for time. But the last line is

643
00:48:32,500 --> 00:48:37,600
my favourite. It says: “Freedom itself is
a dangerous way of life. But it is ours.”

644
00:48:37,600 --> 00:48:40,960
So everybody go and get that tattooed!
You know, on your forehead or whatever.

645
00:48:40,960 --> 00:48:44,150
*applause*

646
00:48:44,150 --> 00:48:49,490
So, the history of activism is one of the
big things. There’s a second part that

647
00:48:49,490 --> 00:48:52,420
is more practical. Libraries have an
amazing relationship to the local

648
00:48:52,420 --> 00:48:56,859
communities that doesn’t really exist
anywhere else, especially in this era of

649
00:48:56,859 --> 00:49:01,650
privatization and the destruction of
public commons. Libraries already offer

650
00:49:01,650 --> 00:49:05,520
free computer classes in many places,
sometimes the only free computer help that

651
00:49:05,520 --> 00:49:10,609
you can get anywhere. They offer free
computer terminals to many people who

652
00:49:10,609 --> 00:49:14,480
don’t have any other computer access.
They’re trusted community spaces, they

653
00:49:14,480 --> 00:49:18,400
already teach about a whole number of
things. So we think they’re really the

654
00:49:18,400 --> 00:49:24,310
ideal location for people to learn about
things like TOR Browser. So it’s been

655
00:49:24,310 --> 00:49:31,010
going really well. This year we have
visited hundreds of different locations.

656
00:49:31,010 --> 00:49:36,230
We’ve trained about 2300 librarians in the
US, in Canada and a few other countries,

657
00:49:36,230 --> 00:49:43,150
Australia, UK and Ireland. We held an
amazing conference, you might recognize

658
00:49:43,150 --> 00:49:47,630
this as Noisebridge. Any Noisebridge fans
here? I hope so. Come on, there’s got to

659
00:49:47,630 --> 00:49:50,470
be more Noisebridge fans than that!
Christ! We had an amazing conference in

660
00:49:50,470 --> 00:49:54,050
Noisebridge and actually my co-organizer
is also here, April Glaser, so you can buy

661
00:49:54,050 --> 00:49:58,540
her a drink, she’s right over there. There
has been a huge response from the library

662
00:49:58,540 --> 00:50:02,290
community. They wanna learn about TOR
Browser, they’re so excited that finally

663
00:50:02,290 --> 00:50:06,910
there’s a practical way for them to help
protect their patrons’ privacy. They’ve

664
00:50:06,910 --> 00:50:12,000
cared about this stuff from an ideological
and ethical standpoint for a really long

665
00:50:12,000 --> 00:50:15,980
time, and now they know that there are
tools that they can actually use and

666
00:50:15,980 --> 00:50:19,090
implement in their libraries and teach to
their community to help them take back

667
00:50:19,090 --> 00:50:25,400
their privacy. We’re really lucky that not
only do we get to teach librarians but

668
00:50:25,400 --> 00:50:29,590
occasionally we get invited to visit
the local communities themselves.

669
00:50:29,590 --> 00:50:33,770
So, here we teach how to teach privacy
classes with TOR as a big focus.

670
00:50:33,770 --> 00:50:37,460
But sometimes we get to meet the local
community members themselves. So I want to

671
00:50:37,460 --> 00:50:41,850
show you this picture of a recent visit
that I made to Yonkers, New York. It was

672
00:50:41,850 --> 00:50:46,050
a class just for teens. They’re all
holding TOR stickers if you can see that

673
00:50:46,050 --> 00:50:50,369
and Library Freedom Project stickers.
This is a great picture that is sort of

674
00:50:50,369 --> 00:50:54,130
emblematic of the kind of communities
that we get to visit. Yonkers is one of

675
00:50:54,130 --> 00:50:59,160
the poorest cities in the US. These kids
are… many of them are immigrants, their

676
00:50:59,160 --> 00:51:02,790
parents are immigrants, they face
surveillance and state violence as a

677
00:51:02,790 --> 00:51:07,970
matter of their regular everyday lives.
For them privacy is not just a human

678
00:51:07,970 --> 00:51:12,520
right but it’s sometimes a matter of life
and death. And these kids are just some

679
00:51:12,520 --> 00:51:16,820
of the amazing people that we get to see.
Also, just to give you an idea of how the

680
00:51:16,820 --> 00:51:21,230
public perception around privacy is
shifting in my anecdotal experience:

681
00:51:21,230 --> 00:51:25,890
we had 65 teenagers come to this class!
If you have a teenager or if you’ve been

682
00:51:25,890 --> 00:51:30,359
a teenager you know teenagers don’t show
up for stuff, they don’t do that. 65 kids

683
00:51:30,359 --> 00:51:34,340
came to this! And they were so excited!
This was just the group that was left over

684
00:51:34,340 --> 00:51:38,420
at the end that had so many questions and
wanted more stickers to bring back to

685
00:51:38,420 --> 00:51:44,300
their friends. So it’s pretty cool stuff.
Recently we embarked on a new project

686
00:51:44,300 --> 00:51:50,150
bringing TOR relays into libraries. This
is Nima Fatemi with me, when we set up

687
00:51:50,150 --> 00:51:55,390
our pilot at a library in New Hampshire
which is the state just above where I live

688
00:51:55,390 --> 00:52:02,040
in the United States. And we basically
decided to do this project because we

689
00:52:02,040 --> 00:52:05,500
thought it was a really great continuation
of the work that we were already doing,

690
00:52:05,500 --> 00:52:10,080
teaching and training librarians around
using TOR. We wanted to take a step

691
00:52:10,080 --> 00:52:13,690
further and take the infrastructure that
libraries already have; many of them are

692
00:52:13,690 --> 00:52:19,490
moving to really fast internet, they can
donate an IP address and some bandwidth.

693
00:52:19,490 --> 00:52:24,430
And they… many of them want to do kind
of the next thing to help protect privacy

694
00:52:24,430 --> 00:52:27,750
and not just in their local communities.
They want to help protect

695
00:52:27,750 --> 00:52:31,720
internet freedom everywhere. So we thought
it was a really great sort of next step to

696
00:52:31,720 --> 00:52:35,480
take. So we set up our pilot project in New
Hampshire. It went pretty well, we got a

697
00:52:35,480 --> 00:52:39,130
lot of great press attention, a lot of
really great local and global community

698
00:52:39,130 --> 00:52:44,550
support. We also got the attention of
the Department of Homeland Security.

699
00:52:44,550 --> 00:52:49,610
*applause*

700
00:52:49,610 --> 00:52:53,100
Basically they contacted the local Police
in this town in New Hampshire and they

701
00:52:53,100 --> 00:52:57,160
said: “You know, this is stupid, and bad,
and criminal and you should shut this

702
00:52:57,160 --> 00:53:02,640
down!” And the library was understandably
shaken by this and temporarily suspended

703
00:53:02,640 --> 00:53:09,210
the operation of the relay. So we
responded by writing a letter, an open

704
00:53:09,210 --> 00:53:13,440
letter from Library Freedom Project, from
the TOR Project, from the ACLU and a broad

705
00:53:13,440 --> 00:53:17,000
coalition of public interest groups and
luminary individuals including the

706
00:53:17,000 --> 00:53:21,109
Electronic Frontier Foundation (EFF), the
Freedom of the Press Foundation, the Free

707
00:53:21,109 --> 00:53:24,350
Software Foundation and all of our other
friends, many of whom are in this audience

708
00:53:24,350 --> 00:53:28,720
today. We wrote this letter to the library
basically affirming our commitment to

709
00:53:28,720 --> 00:53:32,359
them, how much we are proud of them for
participating in this project and how much

710
00:53:32,359 --> 00:53:36,830
we wanted them to continue. We put a lot
of nice, you know, ideological, why this

711
00:53:36,830 --> 00:53:41,520
is important, warm fuzzy stuff. We also
got EFF to start a petition for us and

712
00:53:41,520 --> 00:53:46,270
over a weekend we got about 4500
signatures from all over the world, the

713
00:53:46,270 --> 00:53:51,659
library was flooded with emails, calls.
Only one negative one. Just one out of

714
00:53:51,659 --> 00:53:55,770
hundreds. And that person was a little
confused, so I’m not even counting that

715
00:53:55,770 --> 00:54:03,230
necessarily. It was like a conspiracy type thing.
So we got this amazing support and this

716
00:54:03,230 --> 00:54:06,880
was all in anticipation of their board
meeting that was gonna happen a few days

717
00:54:06,880 --> 00:54:12,150
later where the board was gonna decide
what to do about the relay. So Nima and I

718
00:54:12,150 --> 00:54:16,270
show up to New Hampshire on a Tuesday
night, and you might imagine what a library

719
00:54:16,270 --> 00:54:20,770
board meeting in rural New Hampshire is
typically like. It was nothing like that.

720
00:54:20,770 --> 00:54:26,270
So we get outside and there’s a protest
happening already. Many people holding

721
00:54:26,270 --> 00:54:32,070
Pro-TOR signs. This was just a glimpse of
it. And the look on my face is because

722
00:54:32,070 --> 00:54:35,740
someone pointed to a very small child and
said: “Alison, look at that child over

723
00:54:35,740 --> 00:54:39,120
there”. This tiny little girl was holding
a sign that said “Dammit Big Brother” and

724
00:54:39,120 --> 00:54:45,650
I was like “I’m done, that’s it, I got to
go home!” So we went into the board

725
00:54:45,650 --> 00:54:52,980
meeting and we were met with about 4 dozen
people and media and a huge amount of

726
00:54:52,980 --> 00:54:57,859
support. Many of the community members
expressed how much they loved TOR, that

727
00:54:57,859 --> 00:55:03,790
this whole incident made them download TOR
and check it out for themselves. Basically

728
00:55:03,790 --> 00:55:07,590
it galvanized this community into a
greater level of support than we even had

729
00:55:07,590 --> 00:55:12,119
when we initially set it up about a month
earlier. People who had no idea that the

730
00:55:12,119 --> 00:55:15,660
library was doing this heard about it
because it got a huge amount of media

731
00:55:15,660 --> 00:55:20,859
attention thanks to a story by Julia
Angwin in ProPublica that broke the news

732
00:55:20,859 --> 00:55:26,130
to everybody and then it just went like
wildfire. So as you might imagine the

733
00:55:26,130 --> 00:55:29,920
relay went back online that night. We were
super-successful. Everybody in the

734
00:55:29,920 --> 00:55:34,920
community was incredibly excited about it
and supportive. And what has happened now

735
00:55:34,920 --> 00:55:41,099
is that this community has sort of… like
I said they’ve been galvanized to support

736
00:55:41,099 --> 00:55:46,520
TOR even more. The library has now allotted
some of their staff time and travel

737
00:55:46,520 --> 00:55:51,920
budget to help other libraries in the area
set up TOR relays. They’re speaking about

738
00:55:51,920 --> 00:55:57,010
TOR…
*applause*

739
00:55:57,010 --> 00:55:59,900
Thank you!
They’re speaking about TOR at conferences.

740
00:55:59,900 --> 00:56:05,300
And this has really caught on in the
greater library community as well. So I

741
00:56:05,300 --> 00:56:08,450
mentioned already the kind of success that
we’ve had at Library Freedom Project in

742
00:56:08,450 --> 00:56:12,520
teaching tools like TOR Browser and
getting folks to bring us in for trainings.

743
00:56:12,520 --> 00:56:17,630
This is even bigger than that! Libraries
are now organizing their, you know, staff

744
00:56:17,630 --> 00:56:21,920
training days around, you know, “Should we
participate in the TOR relay project?” or

745
00:56:21,920 --> 00:56:27,110
“How can we do this best?”, “What’s the
best angle for us?” So we’re really

746
00:56:27,110 --> 00:56:31,590
excited to announce that we’re gonna
be continuing the relay project at scale.

747
00:56:31,590 --> 00:56:35,270
Nima Fatemi, who is now also in this
picture again, I’m really sad that he

748
00:56:35,270 --> 00:56:38,930
can’t be here, he is wonderful and
essential to this project. But he will now

749
00:56:38,930 --> 00:56:45,680
be able to travel across the US and we
hope to go a little further, opening up

750
00:56:45,680 --> 00:56:49,380
more relays in libraries. We’re gonna
continue teaching, of course, about TOR

751
00:56:49,380 --> 00:56:53,780
Browser and other privacy-enhancing Free
Software. We’re now gonna incorporate some

752
00:56:53,780 --> 00:56:58,160
other TOR services, so we’re really
excited to bring “Let’s Encrypt” into

753
00:56:58,160 --> 00:57:01,489
libraries. And while we’re there, why not
run a Hidden Service on the library’s web

754
00:57:01,489 --> 00:57:06,280
server? Among many other things. The other
goals for Library Freedom Project: to take

755
00:57:06,280 --> 00:57:11,650
this to a much more international level.
So if you want to do this in your country,

756
00:57:11,650 --> 00:57:15,590
talk to your librarian and put them in touch
with us. You can follow our progress on

757
00:57:15,590 --> 00:57:19,690
LibraryFreedomProject.org or
@libraryfreedom on Twitter. And we’re

758
00:57:19,690 --> 00:57:22,950
always sort of posting on the TOR Blog about
stuff that’s going on with us, so…

759
00:57:22,950 --> 00:57:26,480
Thank you so much for letting me tell you
about it. It’s really a pleasure to be

760
00:57:26,480 --> 00:57:40,520
here!
*applause*

761
00:57:40,520 --> 00:57:45,060
Jacob: So, that’s a really tough act to
follow! But we’re very pressed for time

762
00:57:45,060 --> 00:57:48,740
now. And we want to make sure that we can
tell you two big things. And one of them

763
00:57:48,740 --> 00:57:52,040
is that, as you know, we were looking for
an Executive Director because our Spirit

764
00:57:52,040 --> 00:57:56,550
Animal, Roger,…
Roger: Slide…

765
00:57:56,550 --> 00:58:01,730
Jacob: Right… He couldn’t do it all. And
in fact we needed someone to help us. And

766
00:58:01,730 --> 00:58:05,869
we needed someone to help us who has the
respect not only of the community here but

767
00:58:05,869 --> 00:58:10,709
the community, basically, all around the
world. And we couldn’t think of a better

768
00:58:10,709 --> 00:58:15,380
person, in fact, when we came up with a
list of people, the person that we ended

769
00:58:15,380 --> 00:58:19,440
up with was the Dream Candidate for a
number of the people in the TOR Project

770
00:58:19,440 --> 00:58:24,260
and around the world. And so, I mean, I
have to say that I’m so excited, I’m so

771
00:58:24,260 --> 00:58:28,040
excited that we have her as our Executive
Director. I used to think that our ship

772
00:58:28,040 --> 00:58:32,300
was going to sink, that we would all go to
prison, and that may still happen, the

773
00:58:32,300 --> 00:58:39,609
second part. But the first part, for sure,
is not going to happen. We found someone

774
00:58:39,609 --> 00:58:44,379
who I believe will keep the TOR Project
going long after all of us are dead and

775
00:58:44,379 --> 00:58:50,510
buried. Hopefully, not in shallow graves.
So, this is Shari Steele!

776
00:58:50,510 --> 00:58:58,540
*applause*

777
00:58:58,540 --> 00:59:00,740
Shari: Hi!
*applause*

778
00:59:00,740 --> 00:59:05,400
Thanks! Thanks, it’s actually so fun to be
back in this community. And I wasn’t gone

779
00:59:05,400 --> 00:59:08,650
for very long. I had so much for
retirement. It didn’t work out for me.

780
00:59:08,650 --> 00:59:14,289
But that’s OK, I’m really excited. Since
we’re so tight on time, I want

781
00:59:14,289 --> 00:59:18,000
to just tell you there are 2 big mandates
that I was given when I first was hired.

782
00:59:18,000 --> 00:59:22,320
And one is: Help build a great
infrastructure so that TOR Project is

783
00:59:22,320 --> 00:59:27,330
sustainable. Working on that! The other
thing is: Money! We need to diversify our

784
00:59:27,330 --> 00:59:31,330
funding sources, as everybody knows here.
The Government funding has been really

785
00:59:31,330 --> 00:59:35,680
difficult for us specifically because it’s
all restricted. And so it limits the kinds

786
00:59:35,680 --> 00:59:41,430
of things we want to do. When you get the
developers in a room blue-skying about the

787
00:59:41,430 --> 00:59:44,900
things that they want to do, it’s
incredible! Really, really brilliant

788
00:59:44,900 --> 00:59:48,040
people who want to do great things but
they’re really limited when the funding

789
00:59:48,040 --> 00:59:52,960
says they have to do particular things. So
we happen to be doing our very first ever

790
00:59:52,960 --> 00:59:59,010
crowdfunding campaign right now. I want
to give a shout out to Katina Bishop who

791
00:59:59,010 --> 01:00:03,450
is here somewhere and who is running
the campaign for us and is just doing an

792
01:00:03,450 --> 01:00:09,779
amazing job. As of the last count, which was a
couple of days ago, we had over 3000

793
01:00:09,779 --> 01:00:15,090
individual donors and over 120,000 Dollars
which is incredible for our very first

794
01:00:15,090 --> 01:00:18,820
time when we didn’t even really have a
mechanism in place to be collecting this

795
01:00:18,820 --> 01:00:24,540
money. So, it’s really great! And I
wanna also say we have a limited number

796
01:00:24,540 --> 01:00:31,070
of these T-Shirts that I brought in a
suitcase from Seattle. And they’re

797
01:00:31,070 --> 01:00:36,160
gonna be available, if you come down to
the Wau Holland booth at the Noisy Square.

798
01:00:36,160 --> 01:00:39,619
Come talk with us! Give a donation!
We’re doing a special: it’s normally a

799
01:00:39,619 --> 01:00:46,310
100 Dollar donation to get a shirt, but
for the conference, for 60 Euro

800
01:00:46,310 --> 01:00:50,320
you can get a shirt, and it would be great
if you could show your support. And

801
01:00:50,320 --> 01:00:56,869
you can also donate online if you don’t
wanna do that here. That’s the URL. And

802
01:00:56,869 --> 01:01:01,109
to end, we’d like to have a
word from Down Under!

803
01:01:01,109 --> 01:01:05,079
*Video starts*

804
01:01:05,079 --> 01:01:09,859
*Video Intro Violin Music*

805
01:01:09,859 --> 01:01:15,030
Good Day to you! Fellow Members of the
Intergalactic Resistance against Dystopian

806
01:01:15,030 --> 01:01:20,550
bastardry! It is I, George Orwell, with an
urgent message from Planet Earth, as it

807
01:01:20,550 --> 01:01:25,670
embarks on a new orbit. Transmitting via
the Juice Channeling Portal. Our time is

808
01:01:25,670 --> 01:01:30,290
short. So let’s get straight to the point.
Shall we? This transmission goes out to

809
01:01:30,290 --> 01:01:35,420
all you internet citizens. Denizens of
the one remaining free frequency. In whose

810
01:01:35,420 --> 01:01:40,869
hands rests the fate of humanity.
Lord… f_ckin’ help us!

811
01:01:40,869 --> 01:01:42,869
*typewriter typing sounds*

812
01:01:42,869 --> 01:01:48,560
When I last appeared to you, I warned you
noobs: You must not lose the Internet! Now

813
01:01:48,560 --> 01:01:54,140
before I proceed, let us clarify one
crucial thing. The Internet is not Virtual

814
01:01:54,140 --> 01:02:00,450
Reality, it is actual Reality.
*typewriter typing sounds*

815
01:02:00,450 --> 01:02:05,420
Are you still with me? Good. Now ask
yourselves: Would you let some fascist

816
01:02:05,420 --> 01:02:09,180
dictate with whom you can and cannot
communicate? Because that’s what happens

817
01:02:09,180 --> 01:02:13,700
every time a government blacklists a
website domain. Would you let anyone force

818
01:02:13,700 --> 01:02:18,490
you to get all your information from cable
TV? That’s effectively the case if you

819
01:02:18,490 --> 01:02:24,800
allow corporations to kill Net Neutrality.
*typewriter typing sounds*

820
01:02:24,800 --> 01:02:29,160
Would you let the Thought Police install
telescreens in your house, monitor and

821
01:02:29,160 --> 01:02:34,010
record everything you do, every time you
move, every word you’ve read, to peer into

822
01:02:34,010 --> 01:02:37,880
the most private nook of all, your head?
BECAUSE THAT’S WHAT HAPPENS when

823
01:02:37,880 --> 01:02:42,540
you let your governments monitor the net
and enact mandatory data-retention laws!

824
01:02:42,540 --> 01:02:48,200
*smashing sounds*

825
01:02:48,200 --> 01:02:52,480
If you answered “No” to all those
questions, then we can safely deduce

826
01:02:52,480 --> 01:02:59,600
that terms like “Online”, “IRL” and “in
Cyberspace” are Newspeak. They confuse the

827
01:02:59,600 --> 01:03:05,040
truth: There is no “Cybersphere”. There
is only life. Here. It follows that if you

828
01:03:05,040 --> 01:03:09,380
have an oppressive Internet, you have
an oppressive society, too. Remember:

829
01:03:09,380 --> 01:03:11,490
online is real life…
*typewriter typing sounds*

830
01:03:11,490 --> 01:03:15,950
Your Digital Rights are no different from
everyday human rights! And don’t give me

831
01:03:15,950 --> 01:03:20,089
that BS that you don’t care about
Privacy because you have nothing to hide.

832
01:03:20,089 --> 01:03:24,570
That’s pure Doublethink. As comrade
Snowden clearly explained, that’s like

833
01:03:24,570 --> 01:03:28,730
saying you don’t care about Free Speech
because you have nothing to say!

834
01:03:28,730 --> 01:03:32,970
Stick that up your memory
holes and smoke it, noobs!

835
01:03:32,970 --> 01:03:37,650
Pig’s arse, the portal is closing, I’m
losing you! I’ll leave you with a new tool

836
01:03:37,650 --> 01:03:42,689
to use. I assume you’ve all been fitted
with one of these spying devices. Well,

837
01:03:42,689 --> 01:03:46,420
here’s an app you can use in spite of
this. It’s called Signal, and, yes, it’s

838
01:03:46,420 --> 01:03:50,660
free and simple. Install it and tell all
your contacts to mingle then all your

839
01:03:50,660 --> 01:03:54,520
calls and texts will be encrypted. So even
if Big Brother sees them the c_nt won’t be

840
01:03:54,520 --> 01:04:00,490
able to read them. Hahaa! Now that’s
a smartphone! Our time is up!

841
01:04:00,490 --> 01:04:04,230
*typewriter typing sounds*
Until the next transmission. Heed the

842
01:04:04,230 --> 01:04:09,740
words of George Orwell. Or
should I say: George TORwell?

843
01:04:09,740 --> 01:04:14,870
*typewriter typing sounds*

844
01:04:14,870 --> 01:04:19,609
Remember, just as I went to Spain to fight
the dirty fascists you can come to Onion

845
01:04:19,609 --> 01:04:24,089
land and fight Big Brother’s filthy
tactics. If you’re a Pro run a node and

846
01:04:24,089 --> 01:04:28,180
strengthen the code. Or if you’re in the
Outer Party and can afford it, send TOR

847
01:04:28,180 --> 01:04:33,720
some of your dough. A special salute to
all my comrades at the “State of the Onion”.

848
01:04:33,720 --> 01:04:38,109
Happy Hacking! Now go forth and
f_ck up Big Brother. That mendacious

849
01:04:38,109 --> 01:04:42,539
motherf_cking, c_ck-sucking bastard
son of a corporatist b_tch…

850
01:04:42,539 --> 01:04:52,910
*Video Outro Music*

851
01:04:52,910 --> 01:05:00,999
*applause*

852
01:05:00,999 --> 01:05:05,410
Jacob: So, I think that’s all the time
that we have. Thank you very much for

853
01:05:05,410 --> 01:05:08,760
coming. And thank you all
for your material support.

854
01:05:08,760 --> 01:05:35,370
*applause*

855
01:05:35,370 --> 01:05:41,720
Herald: Unfortunately we won’t have time
for a Q&A. But I heard that some of the

856
01:05:41,720 --> 01:05:49,940
crew will now go to the Wau Holland booth
at Noisy Square down in the Foyer and

857
01:05:49,940 --> 01:05:54,790
might be ready to answer
questions there, if you have any.

858
01:05:54,790 --> 01:05:59,330
*postroll music*

859
01:05:59,330 --> 01:06:05,881
Subtitles created by c3subtitles.de
in 2016. Join and help us!