Modern SQL dialects are Turing complete and, frankly, pretty rich languages. I know MS SQL Server best so can't speak in detail for others, but the community around that is certainly very substantial. Library support doesn't work quite the same way (yet!), but there are plenty of libraries of code samples available for adapting. Vendor lock-in is a tricky one; by the time you've got to a certain scale of application, I tend to think you're programming as much to the API (whether it's the provider's standard API or your own specific API layered over the underlying platform) as to the official 'language'; lock-in can creep up surprisingly easily. Facebook avoided vendor lock-in by writing in open PHP, and have since had to write their own PHP compiler to get the performance they needed from the solution they were locked into.
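To make the 'Turing complete' point concrete, here's a minimal sketch in T-SQL (the dialect I know best): a recursive CTE iterating the Collatz map, which is ordinary looping and branching rather than plain set retrieval. The starting value is arbitrary.

    -- Recursive CTE as general-purpose iteration: follow the Collatz
    -- sequence from 27 until it reaches 1.
    WITH collatz (step, n) AS (
        SELECT 0, CAST(27 AS BIGINT)
        UNION ALL
        SELECT step + 1,
               CASE WHEN n % 2 = 0 THEN n / 2 ELSE 3 * n + 1 END
        FROM collatz
        WHERE n > 1
    )
    SELECT step, n
    FROM collatz
    OPTION (MAXRECURSION 0);  -- lift the default 100-level recursion cap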
A former employer used to bulk-process EDI order lines in very large quantities: deduplicating them, dynamically rebatching them according to what was available and what wasn't, updating orders with newer product where the customer had specified 'this or better', and cross-referencing against multi-million-row datasets of cataloguing and tagging information to work out how to handle each item. It was a monster; I hate to think about the volume of data that touched each batch, and because it was processing orders it absolutely had to have transactional integrity. And yet, written in SQL and running on a very average commodity server, it was actually very fast. The data never left the server until it was ready to do so, and all the load stayed happily internal. The implications of trying to implement it as a NoDB solution - the volumes of data being passed around, the amount of data-specific library code the DBMS provides but the underlying language doesn't, which would all need writing... It's not pretty.
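To give a flavour of the style (heavily simplified, and with hypothetical table and column names rather than anything from the real system), the dedup and cross-reference steps looked roughly like this: set-based throughout, inside one transaction, with nothing leaving the server half-processed.

    BEGIN TRANSACTION;

    -- Keep only the latest copy of each (OrderNo, LineNo) pair.
    WITH ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY OrderNo, LineNo
                                  ORDER BY ReceivedAt DESC) AS rn
        FROM dbo.StagedOrderLines
    )
    DELETE FROM ranked WHERE rn > 1;

    -- Cross-reference the survivors against the catalogue to decide
    -- how each line should be handled downstream.
    UPDATE l
    SET    l.HandlingCode = c.HandlingCode
    FROM   dbo.StagedOrderLines AS l
    JOIN   dbo.Catalogue        AS c ON c.ProductCode = l.ProductCode;

    COMMIT TRANSACTION;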
I don't maintain SQL is the perfect language for everything; that's patently silly. But I do maintain it's a lot more powerful (and with good performance and reliability) than it's given credit for on some very complex operations, and that a lot of the reasons people prefer to work in alternatives boil down to lack of understanding. A little learning of what a modern DBMS is capable of can reap huge rewards in work saved in the 'real programming language', as you put it.
Like you, I believe in the right tool for the job. For many applications, an SQL server is awesome.
I'm sure the developer community for MS SQL Server is reasonably large, but it is much, much smaller than the Ruby community or the Java community. The same is true for library code for each environment.
I think the Facebook example cuts the other way. If they had implemented all their core application logic in MS SQL Server, they would have been well and truly screwed if the performance wasn't enough. With PHP, at least they could write their own compiler; trying to reverse-engineer MS SQL Server is orders of magnitude harder.
Regarding the "volumes of data being passed around" part, that works well with a hot-in-RAM system; no data is passed anywhere. As with doing it all in stored procedures, no data leaves the server.
I do agree that a lot of people don't get full value out of their database. Sometimes that's a very reasonable business choice: vendor lock-in is extremely expensive at scale. But it does often come from ignorance. On the other hand, almost every developer has written a few database-backed applications, but very few have written anything in the NoDB style. Many can't even conceive of coding something without a database. I'd love to see both sorts of ignorance reduced.