
And some developers are working on this [1] new webpage.

[1] http://popcorn-time.tv/


http://popcorn.cdnjd.com/ seems to be down.


Right now, the website of the company (http://www.digitalglobeblog.com/) redirects to 127.0.0.1


My upvote, sir. HN needs more Simpsons quotes.


They only come out at night. Or, in this case, during the daytime.


README? A description?


Actually, there is no need for -k. I was trying to point out that even very common programs have (stack) buffer overflows, and nobody cares to fix them.
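For the unfamiliar, the bug class looks something like this (a minimal, made-up example, not code from any real program):

    #include <stdio.h>
    #include <string.h>

    /* Classic stack buffer overflow: any argument longer than 63
       bytes writes past buf, corrupting the stack (and, absent any
       protection, potentially the saved return address). */
    int main(int argc, char **argv) {
        char buf[64] = "";
        if (argc > 1)
            strcpy(buf, argv[1]);   /* no bounds check */
        printf("%s\n", buf);
        return 0;
    }

Feed it a long enough argument and it crashes or silently corrupts the stack; the point is how common this pattern still is.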


Someone should build a web platform that allows decoding it via crowdsourcing...


Crowdsourcing? Researchers already collaborate on work like this, so all we would get are inexperienced looky-loos and trolls.

(Or were you being silly :P)


Quantity is a quality of its own. I think getting a large number of people to try something is a good way to move forward. Sure, it's not efficient, but it's fairly reliable.


You mean like when a large number of people with no qualifications or experience went looking for the Boston bombing suspect?

I'll leave it here: http://www.bbc.co.uk/news/technology-22214511


I would argue that this was more of a mob lynching than a constructive combined effort.

To look for models that work, look no further than open source. Projects with a high amount of interest quite often do better.


i assure you it was intended to be a constructive combined effort. i only skimmed the article linked but it doesn't seem to reference the fact that the kid who was falsely identified was found dead, presumably by his own hand. "hell is full of good meanings, but heaven is full of good works", as they say.

on a lighter note, if you consider an open source project "crowdsourced" then i posit the current collaborative academic efforts to decode the manuscript are "crowdsourced" as well.


Which kid is that? The one that was already missing before the bombing?

http://www.abc.net.au/news/2013-04-26/falsely-accused-bombin...


yes. he was missing, then falsely accused, then found dead.


Your original post could be construed as suggesting that he killed himself due to the false accusations.


he was a depressive runaway and a racist internet lynch mob was threatening to kill his family for something he didn't do.

i suggested a correlation, i didn't propose it as fact. i'm curious as to what scenario you consider more probable?


It seems entirely possible that he was dead at the time that the lynch mob was hurling accusations at him, unless I'm missing something.


you're right. i was considering the events in the order they were revealed, not necessarily the order they occurred. i'm not sure which it is now.


Au contraire. It is efficient but fairly unreliable :)


I would beg to differ. It is not efficient from the standpoint of labor allocation: inexperienced people are rarely efficient at completing complex tasks. On the other hand, once a problem captures the imagination of a significant portion of the population, the amount of progress made goes way up.


It depends on the problem. There are problems that are highly parallelizable and problems that are not. There are problems where expertise is paramount and problems where it is not. What you said is true, but what I said is also true, if you stretch the definition of "fairly unreliable" enough :)


Something like _Voynich Genius_?

http://rapgenius.com


I wish all distros would start compiling all their packages with a consistent policy that enables at least some level of stack protection (like Ubuntu, and unlike Debian).


Can you elaborate? Debian has infrastructure for hardening their builds (look on their wiki), but each package has to make sure to use it. Unless Ubuntu patches each package source (which they don't), then they have the same hardening status as Debian. Where did I go wrong?


Doesn't Ubuntu take most of their packages from Debian?


Ubuntu uses the packages that have been maintained and integrated by Debian, but they don't necessarily take the binaries. They compile them themselves, and that means they can specify whatever compile options they want.
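To make the distinction concrete, here's a toy demonstration (the program is made up; the flags are standard GCC options, though the exact set each distro enables is a matter of policy):

    /* Build the same deliberately buggy code two ways:

         gcc -fno-stack-protector demo.c -o plain
         gcc -fstack-protector demo.c -o hardened

       ./hardened "$(perl -e 'print "A"x100')" aborts with
       "*** stack smashing detected ***": GCC placed a canary
       between buf and the saved return address and checks it
       before copy() returns. ./plain just corrupts the stack. */
    #include <stdio.h>
    #include <string.h>

    static void copy(const char *s) {
        char buf[16];
        strcpy(buf, s);             /* deliberate overflow */
        printf("%c\n", buf[0]);     /* keep buf live */
    }

    int main(int argc, char **argv) {
        if (argc > 1)
            copy(argv[1]);
        return 0;
    }

Debian's dpkg-buildflags can emit the same hardening flags; the disagreement above is about whether every package in the archive is guaranteed to actually pick them up.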


The (fake) lock manual is available here:

https://microcorruption.com/manual.pdf


Cool! Notably, it appears to be based on the TI MSP430, which is a real CPU with its own instruction set. At first I thought they had invented their own ISA, which would have been crazy.


MSP430 is a remarkably elegant instruction set. Wikipedia has a good summary: http://en.wikipedia.org/wiki/TI_MSP430#MSP430_CPU


Well, let's just say it's an architecture Square knows well. :)


That has less to do with it than the fact that it was the smallest ISA I could find that GCC would readily compile down to.


I would have loved for it to be some old ARM ISA, so it could serve as a test case for Avatar [0]. On the same topic, the FIE paper may be interesting reading for MSP430 lovers [1] (but it needs source code for symbolic execution, so it doesn't directly apply here).

[0] http://www.s3.eurecom.fr/tools/avatar/

[1] https://www.usenix.org/conference/usenixsecurity13/technical...


In particular, Square's credit card readers use an MSP430 chip to encrypt the stripe data before passing it on to the phone.

Their first credit card readers were entirely analog devices, which were very easy to use to skim cards.

Hopefully the latest batches have per-device unique keys (based on some centrally-known KDF) so a compromise of one doesn't re-enable such an exploit.
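That is, something along these lines (a sketch only: HMAC-SHA256 as the KDF is my assumption, the serial format is made up, and Square's actual provisioning scheme isn't public):

    /* Hypothetical per-device key derivation; build with
       gcc kdf.c -lcrypto. device_key = HMAC-SHA256(master, serial):
       the processor keeps the master key and can re-derive any
       reader's key from its serial, but extracting one reader's
       key reveals nothing about any other reader's. */
    #include <stdio.h>
    #include <string.h>
    #include <openssl/evp.h>
    #include <openssl/hmac.h>

    static int derive_device_key(const unsigned char *master, size_t mlen,
                                 const char *serial, unsigned char out[32]) {
        unsigned int outlen = 32;
        return HMAC(EVP_sha256(), master, (int)mlen,
                    (const unsigned char *)serial, strlen(serial),
                    out, &outlen) ? 0 : -1;
    }

    int main(void) {
        const unsigned char master[] = "not-a-real-master-key"; /* placeholder */
        unsigned char key[32];
        /* made-up serial number, for illustration only */
        if (derive_device_key(master, sizeof master - 1,
                              "READER-SN-000123", key) == 0) {
            for (int i = 0; i < 32; i++)
                printf("%02x", key[i]);
            putchar('\n');
        }
        return 0;
    }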


Just so I can be super clear here: none of the code in this challenge has anything whatsoever to do with anything Square ships. We deliberately made things less realistic to make the levels more fun, and easier to ramp up with.


> Hopefully the latest batches have per-device unique keys (based on some centrally-known KDF) so a compromise of one doesn't re-enable such an exploit.

Yes, that's how it works.


