Every year, for at least the last 5 years, I have been trying digiKam. Every year, like 99.9% of the reasonably complicated GUI-based, single-process, multi-functional software written for Linux, it would blow up while working with marginally large sets of files (~50k photos, roughly evenly split between 24-megapixel NEF files and random-resolution JPEGs).
It is just sad to see that the imaging/image-processing/image-organizing world is still stuck in a 1990s paradigm rather than embracing the fact that computers have gobs of memory, that a single application can be composed of a dozen individual components talking to each other over 127.0.0.1, and that background jobs are a thing.
(Not a dozen components, though; just four: a main/watchdog service, a web service, a volume-watchdog/directory-scanner service, and a cluster of sync-file jobs.)
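To make the 127.0.0.1 idea concrete, here is a minimal sketch (not this app's actual code) of one such component: a loopback-only HTTP service the main process can call without linking the scanner's code into its own address space. The endpoint, port, and payload shape are all made up for illustration.

```python
# Hypothetical sketch of a background component reachable only over loopback.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ScannerHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/scan":          # illustrative endpoint name
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # The main process never links this code in; it just POSTs here,
        # so a crash in the scanner cannot take the main process down.
        body = json.dumps({"status": "queued",
                           "root": request.get("root", "/")}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Bind to loopback only: components talk to each other, not the network.
    HTTPServer(("127.0.0.1", 8641), ScannerHandler).serve_forever()
```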
Related: it was a total PITA to handle real-time debugging until I wrote my own `tail -f` that sorted output by timestamp and automatically picked up the log files that new processes created.
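For the curious, a rough sketch of what such a merged tail could look like, assuming every log line starts with a sortable (e.g. ISO-8601) timestamp; the log directory and naming scheme are made up:

```python
# Poll a log directory, pick up files created by new processes as they
# appear, and interleave new lines by their leading timestamp.
import glob
import time

def follow_merged(log_dir, poll=0.5):
    offsets = {}  # path -> byte offset already consumed
    while True:
        pending = []
        for path in glob.glob(f"{log_dir}/*.log"):  # discovers new files too
            start = offsets.get(path, 0)
            with open(path, "rb") as f:
                f.seek(start)
                chunk = f.read()
            cut = chunk.rfind(b"\n") + 1            # only complete lines
            offsets[path] = start + cut
            pending += chunk[:cut].decode(errors="replace").splitlines()
        # Sort the new batch by its timestamp prefix before printing.
        for line in sorted(pending, key=lambda l: l.split(" ", 1)[0]):
            print(line)
        time.sleep(poll)

if __name__ == "__main__":
    follow_merged("/var/log/myapp")  # hypothetical log directory
```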
Thanks, I'm going to play with it, as it is the closest thing to what I think can work well, even though using a web browser as the interface is not what I had in mind.
Conversely, a single anecdote doesn't prove it doesn't work, either. I strongly suspect that when a largish body of people is using something without issue and one person is constantly experiencing problems, there is a reason why:
- Maybe you keep trying it on the same distro, whose build is consistently broken for some reason or other, for example because it ships outdated versions of the libraries the software depends on. That makes for a consistently bad experience for a smaller set of users, who incorrectly conclude that digiKam is broken.
- Maybe your configuration is unusual in some fashion and because of that consistently hits the same bug. If nobody reports it, it never gets fixed, unless it also affects a developer's machine.
Neither of these is an indication that the project's low-level architectural direction is problematic.
It is not unusual. Contrary to what people in the Linux world believe, its desktop software for visual image and video processing is atrocious. That's why no one (a rhetorical no one) uses it over Windows or Macs for their photo libraries. This is definitely the case for software that tries to manipulate RAW files from modern cameras.
What we actually have is a tiny number of desktop Linux users with pretty much identical use cases succeeding with small numbers of images. All other workflows flop.
> Neither of these is an indication that the projects low level architectural direction is problematic.
In 2020, having a project where processing a corrupted file can, under any conditions, crash the app means the project has a bad architecture.
I would guess more people use Windows and Mac because it is the path of least resistance and presumably they are more heavily invested in learning their art and the many complexities of the tools required.
Pixar is an interesting case, because as pioneers in their field they made a lot of their own tools, and they run them on Linux. Unlike a single user incentivized to select whatever they are used to, they are presumably liable to pick the best platform.
AfterShot Pro, Lightworks, Maya, and Bloom all seem to be pretty good.
What they have in common is that they charge money and thus have a budget. Pixar's choice seems to suggest Linux is a perfectly viable platform, and these capable tools suggest we can have good tools if we are willing to invest our money in them. This isn't to say that such tools must be commercial. They could well be FOSS if we change the way we choose to support FOSS: instead of heaping praise upon projects, we need to open our wallets, and regularly.
Right, computers do have loads of RAM. So just spin up a Chromium fork for the UI and a handful of HTTP servers for the different background tasks, pull in a bazillion dependencies, and... it's gone. The RAM, that is.
No, spin up a worker that listens on a queue and thumbnails an image on request, so that in the event the thumbnailer crashes on a corrupted image, the entire app does not die.
Do the same for all the other complicated tasks.
Handling runtime errors inside the main app has hardly been demonstrated to be a trivial task.
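To illustrate the isolation pattern being described, a minimal sketch: a worker pulls paths off a queue and runs each decode in a short-lived child process, so even a segfault in the decoder kills only that child. `vipsthumbnail` here is just a stand-in for whatever decoder you'd actually use.

```python
# Crash-isolated thumbnailing: one child process per decode.
import queue
import subprocess
import threading

jobs: "queue.Queue[str]" = queue.Queue()

def thumbnail_worker():
    while True:
        path = jobs.get()
        try:
            proc = subprocess.run(
                ["vipsthumbnail", path, "--size", "256"],
                capture_output=True, timeout=30,
            )
            if proc.returncode != 0:  # decoder crashed (segfault) or bailed
                print(f"skipping bad file {path!r} (rc={proc.returncode})")
        except (subprocess.TimeoutExpired, OSError) as exc:
            print(f"skipping {path!r}: {exc}")
        finally:
            jobs.task_done()

threading.Thread(target=thumbnail_worker, daemon=True).start()
for p in ["ok.nef", "corrupted.nef"]:  # a corrupt input can't kill this process
    jobs.put(p)
jobs.join()
```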
Realistically, you wouldn't want to run libparsealltheformats inside your main app anyway, because that's just obvious madness in case someone uses it on files from the Internet and not just files created by their non-evil camera. Of course, everyone does it anyway.
I mean, if you're a bad programmer, sure. But we've had multi-core systems for decades at this point; one should know at least basic multithreading techniques, or, short of that, know not to fuck with resources cross-thread unless you know what you're doing.
Since we know that programs have bugs, that photo formats evolve, and that some files are intentionally broken by "evil things", shouldn't we write complex programs that take this into account, rather than assume everyone else is an amateur and our code is so state-of-the-art that it will handle bad conditions flawlessly?
We are working on a multi-process, GPU-accelerated image viewer with the ability to seamlessly browse and organise hundreds of thousands of photos. Although it is multi-process, all the applications are embedded in a container application, and the processes communicate over 127.0.0.1. All done locally.
It was specifically designed to handle hundreds of thousands of images and is in the final stages of release.
We have a little bit more information and screenshots on the website: https://www.pixolage.com and would be grateful for any community feedback (or beta testers!).
Performance is great on a single thread/single CPU, but it is even better multi-threaded in most scenarios.
In terms of development, we prefer developing on lesser hardware so that we can be sure that Pixolage will run super smooth for most setups (although long compile waits can be frustrating).
For scalability, nothing beats multi-process, due to the way the OS manages communication between the GPU driver and each process using the GPU.
Completely agree - Picasa is/was a great application!
Yep, for the moment it's Windows-only; however, the vast majority of the core code is platform-agnostic, so after the initial Windows release we shall be targeting Mac and Linux.
I have had dozens of photographer friends try importing their multi-TB photo libraries into various Linux systems using all kinds of photo-management software. They all choke. Shotwell is the most stable, but even it will crash every few weeks, and that would periodically require reimporting all the photos again...
My library is only 50k raw files, and Lightroom is not exactly fast either. Browsing the library, scrolling, etc. is sluggish, and that's on a powerful workstation: 24 threads, 128 GB of RAM, etc.