I used Digikam because I thought it was four years newer, but it was a complete and utter disaster: huge performance problems (very slow) with a lot of pictures, face detection all over the place, and a shitty interface for tagging people.
Picasa on the other hand has no problems with huge libraries and the tagging interface is actually fun to use.
The downside to Picasa is that the initial indexing is very slow, since it only uses one core.
Also, both Digikam and Picasa insist on writing metadata directly into the JPG file itself; sidecar files aren't an option.
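For anyone unfamiliar with the distinction: a sidecar is just a small .xmp file written next to the image, so the JPG bytes never change. Here's a minimal sketch of what writing one could look like; the function name, file naming, and the bare-bones XMP packet are illustrative assumptions, not what Digikam or Picasa actually produce.

```python
from pathlib import Path

# Minimal, illustrative XMP sidecar: a dc:subject bag holding person tags.
# Real tools write much richer packets (face regions, ratings, etc.).
XMP_TEMPLATE = """<?xpacket begin="" id="W5M0MpCehiHzreSzNTczkc9d"?>
<x:xmpmeta xmlns:x="adobe:ns:meta/">
 <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">
   <dc:subject>
    <rdf:Bag>
{items}
    </rdf:Bag>
   </dc:subject>
  </rdf:Description>
 </rdf:RDF>
</x:xmpmeta>
<?xpacket end="w"?>"""

def write_sidecar(image_path: str, tags: list[str]) -> Path:
    """Write person tags to photo.jpg.xmp instead of modifying photo.jpg."""
    items = "\n".join(f"     <rdf:li>{t}</rdf:li>" for t in tags)
    sidecar = Path(image_path + ".xmp")
    sidecar.write_text(XMP_TEMPLATE.format(items=items), encoding="utf-8")
    return sidecar

# write_sidecar("photo.jpg", ["Alice", "Bob"])  # the JPG itself stays untouched
```

The point is only that person tags can live outside the image; a tool that writes embedded metadata has to modify the original file every time a tag changes.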
Face detection is a critical and difficult part of face classification; it's not as if we're decrying the invention of the transistor here. It's more like working for a bullet manufacturer and hoping the bullets will be used for hunting deer.
It ends wherever you set your ethical boundaries, and it's certainly not black and white, but I'd say the man who builds fireworks from gunpowder has an easier time than the one who packs it into steel casings.
Xolo does feature vector extraction on face regions. Extracting those regions from larger images is something else’s job.
Zolo does similarity analysis on facial feature vectors. Extracting those vectors is something else's job.
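In other words, the pipeline splits cleanly into detection (finding face regions), embedding (turning a region into a feature vector), and matching (comparing vectors). Here's a minimal sketch of just the matching step, assuming the embeddings already exist; the function names, the 128-dimensional size, and the 0.6 threshold are illustrative assumptions, not anything from Xolo or Zolo.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(a: np.ndarray, b: np.ndarray, threshold: float = 0.6) -> bool:
    """Decide whether two embeddings likely belong to the same person.

    The threshold here is arbitrary; real systems calibrate it on a
    labelled validation set for a chosen false-match rate.
    """
    return cosine_similarity(a, b) >= threshold

# Toy usage with random 128-d vectors standing in for real embeddings.
rng = np.random.default_rng(0)
emb_a, emb_b = rng.normal(size=128), rng.normal(size=128)
print(cosine_similarity(emb_a, emb_b), same_person(emb_a, emb_b))
```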
Oh look, now every authoritarian government has free access to never-before-seen levels of data harvesting. But nobody has to feel any guilt, because each of them only contributed a third of the machine. Hooray!
Whether or not open source models are developed, governments are going to develop their own. At least this way everyone else is on equal footing rather than at a strict disadvantage. This tech could be used to identify police officers when they're in riot gear and cover their identification, for instance.
Well, you can often see police officers' faces through their visors, just as you can sometimes recognize people through fabric masks. Police don't wear gas masks all the time, only when they're about to deploy gas or immediately afterwards.
The police I saw at the Seattle protests did indeed wear de-identification equipment even when not actively deploying chemical weapons against civilians. I have photos.
The future is coming, and tech that can enable authoritarian behavior will arrive no matter what we do; these are just tools. They've been here, all around us, in ever-increasing forms for decades, and yet we haven't necessarily devolved into a Big Brother state.
What we should be worrying about are actual grand-scale citizen-control and monitoring projects enabled by technology. Think China, not the UK or the US.
We have to worry about it in the US and the UK too. I mean, monitoring is already widespread in the US, UK, Canada, etc. Filtering, and therefore control, was almost a reality in the UK last year, albeit limited to pornography. But it's a slippery slope in both cases.
Hmm, when I was visiting Japan, they had a serious problem with suicides related to work exhaustion. The workload and the pressure to perform at ever higher levels were too much, so people ended their lives. That was just a few years ago.
So, hearing that there is a new level of control for this already judgemental and "honor"-based society is just appalling.
Any asymmetric activity that involves a moderate amount of exertion and is repeated regularly would create a difference in strength between the two sides of the body, so this could actually be an explanation. ;)