bobivl's comments | Hacker News

There is a longer article [1] describing the attack, but it is in German and behind a paywall.

[1] https://www.heise.de/hintergrund/c-t-deckt-auf-Keylogger-nut...


To replace a laptop (for me), I need to be able to type. It does not look easily possible from what I have seen, but I missed the first part.


It was part of the keynote how it can integrate with the Magic Mouse/Keyboard and presumably other Bluetooth accessories.


They said that you can pair it with Bluetooth mice, keyboards, and game controllers.


/s, work is not supposed to involve too much hand manipulation; you are supposed to create and add value through fabrication and theft by showing happy faces in as many boys' club meetings as possible



Just out of curiosity, what in the world prompted you to submit the German link to an English news site, and then casually mention the English version, versus the opposite?


If you speak German you might stumble upon the German version of the article first, and realise _after sharing_ that the English one exists.

Happened to me although my German is much worse than my English.


That is exactly what happened. I wanted to share the original article and did not see that they also have an English version (because they usually do not have one).


Also the English version was only published 1.5 hours later, so it might have been overlooked initially.


Literally couldn't figure out how to get through the German cookie dialogues even though it's an English article.


Cut, sort, join, and awk can be pretty powerful and fast. If it becomes too tedious to write them manually, you can also use BigBash [1] to automatically convert a SQL query into a one-liner that only uses these tools to execute the query.
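
For instance, a quick sketch of the kind of thing I mean (sales.csv and its date;product;amount layout are made up for illustration), listing the five products with the largest totals:

    # sum the amounts per product, then show the five largest totals
    cut -d ';' -f2,3 sales.csv \
      | awk -F ';' '{ sum[$1] += $2 } END { for (p in sum) print sum[p] ";" p }' \
      | sort -t ';' -k1,1nr \
      | head -n 5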

[1] http://bigbash.it


Any experienced programmer learns not to use string processing on structured data, because that will bite them in the ass.

Meanwhile HN luddites: let me use awk, cut and whatnot despite the existence of a util that explicitly sidesteps this issue.


/me runs the example on bigbash.it, cleaned up a bit:

    (
      trap "kill 0" SIGINT;
      export LC_ALL=C;
      find movies.dat.gz -print0
        | xargs -0 -i sh -c "gzip -dc {} | tail -n +2"
        | sed "s/::/;/g"
        | cut -d $';' -f2
        | sort -t$';'  -k 1,1
        | head -n10
        | awk -F ';' '{print $1}'
    )
Yeah, how about no. That's a very neat site and a clever hack, but there are clear escaping flaws in there for valid movie names.

Bash and the standard Unix tools are terrible structured-data manipulators. It's part of why `jq` is so widely used and loved, despite being kinda slow and hard to remember at times - it does things correctly, unlike most glued-together tools.
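
For instance (made-up JSON, but the point is that embedded delimiters and quotes simply don't matter):

    printf '%s\n' '{"title": "Movie; with :: inside"}' \
      | jq -r '.title'
    # -> Movie; with :: inside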


Yep, pretty sure that this script doesn't handle quoted strings in any way, and would promptly mangle a title that contains a semicolon.
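
For instance, with a made-up row in the same ::-delimited format the pipeline expects:

    $ echo '1::Movie; The Sequel (1999)::Comedy' | sed 's/::/;/g' | cut -d ';' -f2
    Movie

The title gets silently truncated at the embedded semicolon.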


"structured data" usually means there are delimiting characters, states, etc. AWK can certainly handle this. It's a simple and powerful language.

I don't think I've ever used it to parse JSON, but I've definitely used it to output simple JSON.
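
Output is the easy direction, e.g. something like this (a naive sketch: totals.csv is a hypothetical product;amount file, the amount is assumed to be numeric, and quotes or backslashes in the values are not escaped):

    awk -F ';' '{ printf "{\"product\": \"%s\", \"amount\": %s}\n", $1, $2 }' totals.csv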


Are you telling me that awk can correctly identify delimiters inside quoted strings? Escaped quotes inside quoted strings? Newlines inside quoted strings? I.e. that awk actually has a csv parser? Very cool if so.


Yeah, you can implement a basic FSM and use `next` to handle fake `RS` (e.g. newlines).

I'm not necessarily recommending it, but it's certainly possible and could be portable and really fast to run with a low memory footprint.
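
Roughly along those lines, a minimal sketch (comma as delimiter, double quotes for quoting, doubled quotes as escapes; not full RFC 4180, and movies.csv is just a placeholder filename):

    awk '
    {
        line = $0
        if (buffered != "") line = buffered "\n" line
        # an odd number of quote characters means a quoted field continues
        # on the next physical line: buffer it and pull in more input via next
        if (gsub(/"/, "&", line) % 2) { buffered = line; next }
        buffered = ""
        n = 0; field = ""; inq = 0
        for (i = 1; i <= length(line); i++) {
            c = substr(line, i, 1)
            if (inq) {
                if (c != "\"") field = field c
                else if (substr(line, i + 1, 1) == "\"") { field = field "\""; i++ }
                else inq = 0                                  # closing quote
            }
            else if (c == "\"") inq = 1                       # opening quote
            else if (c == ",") { f[++n] = field; field = "" } # field separator
            else field = field c
        }
        f[++n] = field
        print f[2]   # demo action: print the second column of each record
    }' movies.csv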


Well, awk having a csv parser via the user implementing that parser is not quite what I have in mind when I turn to awk for some quick field splitting—and I don't think it's what others in the thread meant either, as evidenced by the linked site.

Personally I prefer using a ready-made and tested library in any language that I might touch, so I can just do my own thing on top. Or, on the command line, to use a util that employs such a library. Kind of hope that I'm never so constrained that only awk is available and I can't even spin up Lua.


Powerful and fast, and also *portable*. It'll run on your low-privileged tools box, 20-year-old beige box, vhost, you name it.


There is also BigBash [0], which converts a SQL statement to a bash one-liner (using sed, grep, awk, ...) that can then be run to execute the query. The advantage is that you can run it on very large file(s) because of the streaming nature of the standard Unix tools.

[0] http://bigbash.it or the corresponding GitHub repo.


I also used bash scripts a lot to get quick insights from CSV files. At some point I realized that these were mostly SQL queries that I had encoded into complex scripts. For the sake of trying, I implemented a simple SQL-to-bash transpiler that takes a SQL query and returns a bash one-liner that you can execute on CSV file(s).

Give it a try: http://bigbash.it


If you like to do data analysis in bash, you might also enjoy BigBash [1]. This tool generates quite performant bash one-liners from SQL SELECT statements that can easily crunch gigabytes of CSV data.

Disclaimer: I am the main author.

[1] http://bigbash.it


That's pretty cool.

Do you think you can get it to support Manta? I think a lot of people in that ecosystem could benefit from it if you could. I'd help, but I don't really know Java all that well :-(.


Java is only necessary for the "compilation" of the query. The resulting bash script uses only standard Unix tools and runs without Java.

