John Young | 28 Nov 14:25 2014

What Is Good Encryption Software?

Reader asks: What Is Good Encryption Software?

I am contacting you with some security questions.
After reading a few of the leaked Snowden documents, I have
become more aware that my privacy is at risk. I have a
few questions concerning certain programs and safety tips.

First, I've recently started to doubt my encryption software.
Is Symantec's "PGP Endpoint" good hard-drive encryption software?

In other words, is it trustworthy given that Symantec is an American company?
And if not, what encryption software is best for the Mac?

Second, is "ProtonMail" as secure as they say it is? If not, what
email provider doesn't let the NSA see into my account?

Third, is Jetico Inc.'s "BestCrypt Container Encryption" trustworthy?
If not, what would be an alternative?

Fourth, are these ciphers good: Blowfish, GOST, and AES-256?
And which of them remains the best overall?

Last, is Kaspersky good anti-virus software? If not, which one is
best for the Mac?


Important, difficult questions, likely to produce a range of answers.
(Continue reading)

ianG | 26 Nov 18:04 2014

Underhanded Crypto

The Underhanded Crypto contest was inspired by the famous Underhanded C 
Contest, which is a contest for producing C programs that look correct, 
yet are flawed in some subtle way that makes them behave 
inappropriately. This is a great model for demonstrating how hard code 
review is, and how easy it is to slip in a backdoor even when smart 
people are paying attention.

We’d like to do the same for cryptography. We want to see if you can 
design a cryptosystem that looks secure to experts, yet is backdoored or 
vulnerable in a subtle, barely noticeable way. Can you design an encrypted 
chat protocol that looks secure to everyone who reviews it, but in 
reality lets anyone who knows some fixed key decrypt the messages?
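For a sense of what such a backdoor can look like, here is a deliberately simplified toy sketch (the names MASTER, encrypt, and backdoor_decrypt are invented for illustration, and a SHA-256-based keystream stands in for a real cipher): the 32-byte "nonce" on each message is actually the session key masked with a pad derived from a fixed master key, so anyone holding that key can decrypt everything.

```python
# Toy backdoored "encrypted chat" primitive -- illustration only.
# The per-message "nonce" silently escrows the session key to anyone
# who knows the fixed MASTER value.
import hashlib
import os

MASTER = hashlib.sha256(b"fixed backdoor key").digest()

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Expand key+nonce into n pseudorandom bytes (toy stream cipher)."""
    out = b""
    i = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + i.to_bytes(4, "big")).digest()
        i += 1
    return out[:n]

def encrypt(session_key: bytes, msg: bytes) -> bytes:
    # Looks like a random nonce; is actually session_key XOR a pad that
    # depends only on MASTER -- a key-escrow channel in plain sight.
    pad = keystream(MASTER, b"", 32)
    nonce = bytes(a ^ b for a, b in zip(session_key, pad))
    ct = bytes(a ^ b for a, b in zip(msg, keystream(session_key, nonce, len(msg))))
    return nonce + ct

def backdoor_decrypt(blob: bytes) -> bytes:
    # No session key needed: unmask it from the "nonce" using MASTER.
    nonce, ct = blob[:32], blob[32:]
    session_key = bytes(a ^ b for a, b in zip(nonce, keystream(MASTER, b"", 32)))
    return bytes(a ^ b for a, b in zip(ct, keystream(session_key, nonce, len(ct))))
```

A reviewer who checks only that the nonce "looks random" will never notice the escrow channel.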

We’re also interested in clever ways to weaken existing crypto programs. 
Can you make a change to the OpenSSL library that looks like you’re 
improving the random number generator, but actually breaks it and makes 
it produce predictable output?
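For a flavour of that kind of change, here is a toy Python RNG (class name invented; this is not OpenSSL code) whose "whitening" looks plausible in review but never mixes the entropy pool into the output, so the stream is a pure function of a public counter:

```python
# Toy example of a subtly broken RNG -- not OpenSSL code.
import hashlib

class WhitenedRNG:
    """Looks like it hashes an entropy pool; actually ignores it."""
    def __init__(self, entropy: bytes):
        self.pool = hashlib.sha256(entropy).digest()
        self.counter = 0

    def read(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            self.counter += 1
            # Bug: self.pool is never mixed in, so the output depends
            # only on the public counter and is fully predictable.
            out += hashlib.sha256(self.counter.to_bytes(8, "big")).digest()
        return out[:n]
```

Two instances seeded with completely different entropy emit identical bytes.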

If either of those things sounds interesting, then this is the contest 
for you.

Givon Zirkind | 20 Nov 12:08 2014

Re: new encrypted phones

This whole hullabaloo about encrypted phones: it's only the data on the 
phone that's encrypted, not the conversations. Right?

Does the encryption extend to call logs?

John Young | 16 Nov 17:22 2014

Call for publication of all Snowden papers gets louder



The Amsterdam media professor Geert Lovink left 
no doubt that, in his view, the entire trove of 
the NSA whistleblower's documents must in the 
medium term be made public and archived. Such an 
approach has a very different reach than the 
global 24-hour news machine. "Journalists want to 
make headlines and politics," said the founder of 
the Institute of Network Cultures. But it was 
important to think long-term.

According to Lovink, the Snowden papers offer 
unique insight into an "information-military 
complex in the making": they show, in broad 
terms, "who the actors are and what technologies 
they use." In addition, they make clear the 
differing impact of surveillance on individual 
countries or regions, and who was personally 
affected as a group or society. Taken together, 
this would be "interesting for investigative 
journalists around the world for decades".


(Continue reading)

John Young | 4 Nov 23:18 2014

Wind River Security Features and Cryptography Libraries

Wind River Security Features and Cryptography Libraries (which appear to be the
basis of the $750,000 fine by BIS)

ianG | 22 Oct 10:57 2014

CFP by 24 Nov - Usable Security - San Diego 8th Feb

The Workshop on Usable Security (USEC) will be held in conjunction with
NDSS on February 8, 2015. The deadline for USEC Workshop submissions is
November 24, 2014. – In previous years, USEC has also been co-located
with FC; for example in Okinawa, Bonaire, and Trinidad and Tobago.

Additional information and paper submission instructions:


The Workshop on Usable Security invites submissions on all aspects of
human factors and usability in the context of security and privacy. USEC
2015 aims to bring together researchers already engaged in this
interdisciplinary effort with other computer science researchers in
areas such as visualization, artificial intelligence and theoretical
computer science as well as researchers from other domains such as
economics or psychology. We particularly encourage collaborative
research from authors in multiple fields.

Topics include, but are not limited to:

* Evaluation of usability issues of existing security and privacy models
or technology

* Design and evaluation of new security and privacy models or technology

* Impact of organizational policy or procurement decisions

* Lessons learned from designing, deploying, managing or evaluating
(Continue reading)

Jason Iannone | 22 Oct 04:22 2014

Define Privacy

On a fundamental level I wonder why privacy is important and why we
should care about it.  Privacy advocates commonly cite pervasive
surveillance by businesses and governments as a reason to change an
individual's behavior.  Discussions are stifled and joking references
to The List are made.  The most relevant and convincing issues are
documented cases of chilled expression from authors, artists,
activists, and average Andrews.  Other concerns deal with abuse, à la
LOVEINT, etc.  Additional arguments tend to be obscured by nuance
and lack any striking insight.

The usual explanations, while appropriately concerning, don't do it
for me.  After scanning so many articles, journal papers, and NSA
surveillance documents, fundamental questions remain: What is privacy?
How is it useful?  How am I harmed by pervasive surveillance?  Why do
I want privacy (to the extent that I'm willing to take operational
measures to secure it)?

I read a paper by Julie Cohen for the Harvard Law Review called What
Privacy is For[1] that introduced concepts I hadn't previously seen on
paper.  She describes privacy as a nebulous space for growth.  Cohen
suggests that in private, we can make mistakes with impunity.  We are
self-determinate and define our own identities free of external
subjective forces.  For an example of what happens without the
impunity and self-determination privacy provides, see what happens
when popular politicians change their opinions in public.  I think
Cohen's is a novel approach and her description begins to soothe some
of my agonizing over the topic.  I'm still searching.

(Continue reading)

ianG | 15 Oct 02:03 2014

SSL bug: This POODLE Bites: Exploiting The SSL 3.0 Fallback

SSL 3.0 [RFC6101] is an obsolete and insecure protocol. While for most practical purposes it has been replaced by its successors TLS 1.0 [RFC2246], TLS 1.1 [RFC4346], and TLS 1.2 [RFC5246], many TLS implementations remain backwards-compatible with SSL 3.0 to interoperate with legacy systems in the interest of a smooth user experience. The protocol handshake provides for authenticated version negotiation, so normally the latest protocol version common to the client and the server will be used.

However, even if a client and server both support a version of TLS, the security level offered by SSL 3.0 is still relevant since many clients implement a protocol downgrade dance to work around server-side interoperability bugs. In this Security Advisory, we discuss how attackers can exploit the downgrade dance and break the cryptographic security of SSL 3.0. Our POODLE attack (Padding Oracle On Downgraded Legacy Encryption) will allow them, for example, to steal “secure” HTTP cookies (or other bearer tokens such as HTTP Authorization header contents).

We then give recommendations for both clients and servers on how to counter the attack: if disabling SSL 3.0 entirely is not acceptable out of interoperability concerns, TLS implementations should make use of TLS_FALLBACK_SCSV.
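On the client side, the first recommendation can be expressed directly in modern TLS APIs; for example, in Python 3.7+ the ssl module lets a client pin a protocol floor so the downgrade dance can never reach SSL 3.0 (a sketch, not the advisory's own code):

```python
# Pin the client's minimum protocol version so a downgrade to SSL 3.0
# (or early TLS) is impossible, per the advisory's recommendation.
import ssl

ctx = ssl.create_default_context()            # SSLv3 already off by default
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0/1.1 as well
```

Any handshake through this context that cannot negotiate at least TLS 1.2 simply fails instead of falling back.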

CVE-2014-3566 has been allocated for this protocol vulnerability.

cryptography mailing list
Krisztián Pintér | 13 Oct 18:39 2014

Re: What's the point of using non-NIST ECC Curves?

Derek Miller (at Monday, October 13, 2014, 6:19:07 PM):
> However, both scenarios (NSA engineered them to be bad, NSA
> engineered them to be good) mean that the NSA knows a great deal
> more about weaknesses in Elliptic Curve Cryptography than we do.
> Doesn't that give you great pause in using the algorithm at all?

Actually, you have a point. If there is any doubt, there is no doubt.
Without even doing anything, just by being secretive, the NSA can
weaken crypto. Good job, guys!

Tony Arcieri | 13 Oct 18:28 2014

Re: What's the point of using non-NIST ECC Curves?

On Mon, Oct 13, 2014 at 9:19 AM, Derek Miller <> wrote:
However, both scenarios (NSA engineered them to be bad, NSA engineered them to be good) mean that the NSA knows a great deal more about weaknesses in Elliptic Curve Cryptography than we do. Doesn't that give you great pause in using the algorithm at all?

Sure, that's why djb and friends are also working on implementing McEliece and Merkle signatures.

Tony Arcieri
Derek Miller | 13 Oct 18:19 2014

Re: What's the point of using non-NIST ECC Curves?

Thanks for the additional scenario (I had not even considered trusting the NSA, so had not considered that scenario).
However, both scenarios (NSA engineered them to be bad, NSA engineered them to be good) mean that the NSA knows a great deal more about weaknesses in Elliptic Curve Cryptography than we do. Doesn't that give you great pause in using the algorithm at all?

On Mon, Oct 13, 2014 at 10:53 AM, Derek Miller <> wrote:
For curve P-192, SEED = 3045ae6f c8422f64 ed579528 d38120ea e12196d5 
For curve P-224, SEED = bd713447 99d5c7fc dc45b59f a3b9ab8f 6a948bc5
For curve P-256, SEED = c49d3608 86e70493 6a6678e1 139d26b7 819f7e90
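Those seeds feed the ANSI X9.62 "verifiably pseudorandom" procedure: SHA-1 output derived from the seed gives an integer c, and for curves with a = −3 the curve is accepted only if b²·c ≡ −27 (mod p). A minimal sketch of that check (the function name is invented; the p and b constants are copied from FIPS 186-4 and assumed correct):

```python
# Sketch of the X9.62 seed-verification step for NIST prime curves with
# a = -3: derive c from SHA-1(SEED) and check b^2 * c == -27 (mod p).
import hashlib

def seed_checks_out(seed_hex: str, p: int, b: int) -> bool:
    m = p.bit_length()
    v = (m - 1) // 160             # extra 160-bit hash blocks needed
    w = m - 160 * v - 1            # bits taken from the first hash
    seed = int(seed_hex.replace(" ", ""), 16)
    sha1 = lambda x: int.from_bytes(
        hashlib.sha1(x.to_bytes(20, "big")).digest(), "big")
    c = sha1(seed) % (1 << w)      # rightmost w bits of SHA-1(SEED)
    for i in range(1, v + 1):
        c = (c << 160) | sha1((seed + i) % (1 << 160))
    return (b * b * c + 27) % p == 0

# Curve constants from FIPS 186-4 (assumed transcribed correctly).
P192 = (2**192 - 2**64 - 1,
        0x64210519e59c80e70fa7e9ab72243049feb8deecc146b9b1)
P256 = (2**256 - 2**224 + 2**192 + 2**96 - 1,
        0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b)
```

Passing the check confirms the seed-to-curve derivation, but, as this thread notes, it says nothing about how the seeds themselves were chosen.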

On Mon, Oct 13, 2014 at 10:43 AM, Krisztián Pintér <> wrote:
On Mon, Oct 13, 2014 at 5:38 PM, Ryan Carboni <> wrote:
>> > However, considering one of the scenarios where these curves might be
>> > compromised (the NSA knew of weaknesses in certain curves, and
>> > engineered
>> > the NIST Prime curves to be subject to those weaknesses)
> I forget, what was the original inputs to the hash?

Another unexplained constant, if I'm not mistaken. It makes no sense
under any circumstances.