I've occasionally had things spontaneously break. Most recently the cryptography package just stopped installing on deployment. I had to add a pip upgrade step on deploy, which somehow prevented the AMI I was using from installing some of its own requirements, so I then had to add those packages to my project's requirements.
Also some of the data analysis packages don't work with virtualenv.
Most deploy scripts I've seen tend to do this, which is insane in my opinion, but the Python toolset encourages this kind of approach by making it the default, easy thing to do.
That's one of the things I prefer about Go: the compiler can cross-compile, so I can build the entire codebase on my OS X machine and produce a single statically linked Linux binary that I can just scp to the server. Deployment becomes infinitely simpler.
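For concreteness, a rough sketch of that workflow (the binary name and host below are placeholders, not from the original comment):

    # cross-compile a static Linux binary from an OS X machine;
    # CGO_ENABLED=0 avoids linking against the system C library
    GOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build -o myapp .
    # copy the single artifact to the server
    scp myapp deploy@example.com:/usr/local/bin/myapp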
> Most deploy scripts I've seen tend to do this, which is insane in my opinion [...]
Also, most programmers don't know a thing about administering servers.
Ubiquity doesn't mean it's the proper way, as you can clearly see yourself.
> [...] the Python toolset encourages this kind of approach by making it the default, easy thing to do.
It's not quite the fault of Python tooling in particular; it's really the easiest way in general. Similarly, the easiest way to deploy a Unix daemon is to run it as root, because the daemon will automatically have access to all the files and directories it needs (data storage, temporary files, sockets, pidfiles, log files). Anything else that is easier to manage requires putting in non-zero effort.
> That's one of the things I prefer about Go: [static linking]
Static linking has its own share of administrative problems. It's not all nice
and dandy.
For a number of reasons, I haven't bothered with a complex build/deploy process. I write in Python, I freeze the requirements into requirements.txt, and I type "eb deploy". Anything else is overkill for me.
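Roughly, that whole pipeline is two commands (assuming the EB CLI is already configured for the environment):

    # snapshot whatever is installed in the current environment
    pip freeze > requirements.txt
    # hand the app to Elastic Beanstalk, which pip-installs the requirements on the instances
    eb deploy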
And then you have different lists of installed packages in your development environment and in production, and even between two different production servers (which can silently and/or subtly break your software); you need to manually install all the C/C++ libraries, compilers, non-Python tools, and whatnot; the upgrade process takes a lot of time and can break on PyPI problems or packages that went missing; and you can't even sensibly downgrade your software to an earlier version.
Yes, avoiding learning how to properly package and deploy your software is
totally worth this cost.
Python does not really have a "build" step. _That_ is the problem.
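One common way to approximate a build step, sketched here with an arbitrary directory name, is to build wheels once on a machine matching the servers and then install offline from those artifacts:

    # "build": compile/collect every dependency into local wheel files
    pip wheel -r requirements.txt -w wheelhouse/
    # "deploy": install only from the pre-built wheels, never touching PyPI
    pip install --no-index --find-links=wheelhouse/ -r requirements.txt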
I think Heroku also played a role in proliferating the idea that you can just push your code (from a git repo) to a server, and the server will take care of deploying it, which usually means the server will install all the dependencies from pip during the process.