At work (Trapeze Media), we often want to dump the database from the production servers so that we can work with that data on our local development machines. As part of an internal app, we have two management commands to help us with this: load_devdata and dump_devdata. (Another developer has been working on open-sourcing parts of the app. UPDATE: the app is on GitHub; however, I no longer use it myself, and I'm not sure whether it's still used internally at Trapeze.)
We use management commands because we don't install Fabric on our production servers, and we don't use shell scripts either.
The database commands are PostgreSQL-specific and we use rsync to copy the media files.
The code hasn't been refactored because of all the work we have to do for clients, but it does work. Some code may be missing to run these snippets as-is, but I think that's okay: your own workplace or production environment might be different; you might be using a different database, not using geospatial database extensions, or not interested in copying the media files. Think of the snippets as inspiration.
Dumping the database
The dump command duplicates the database and the media files. You can then zip these up and transfer them to a local machine; 99% of the time we just commit the database and media files to our repository.
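A dump command along these lines can be sketched with stdlib subprocess calls to pg_dump and rsync. This is not the actual Trapeze implementation: the database name, user, host, and paths are all illustrative, and in the real management command they would come from settings.DATABASES and settings.MEDIA_ROOT.

```python
import subprocess


def pg_dump_command(dbname, user, host, outfile):
    # In a Django management command, dbname/user/host would come from
    # settings.DATABASES["default"]; the values passed in here are made up.
    return [
        "pg_dump",
        "--format=custom",   # compressed archive; restore with pg_restore
        "--no-owner",        # production role names usually differ locally
        "--host", host,
        "--username", user,
        "--file", outfile,
        dbname,
    ]


def rsync_media_command(media_root, dest):
    # The trailing slash on the source makes rsync copy the *contents*
    # of the media directory rather than the directory itself.
    return ["rsync", "-az", "--delete", media_root.rstrip("/") + "/", dest]


def dump_devdata():
    # Hypothetical names and paths, for illustration only.
    subprocess.check_call(
        pg_dump_command("myapp", "deploy", "localhost", "devdata.dump"))
    subprocess.check_call(
        rsync_media_command("/srv/myapp/media", "devdata_media/"))
```

The resulting devdata.dump file and devdata_media/ directory are what you would zip up (or commit) and move to your local machine.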
Loading the database dump
The load command creates a database, loads the database dump, and copies the media files back into the media directory.
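The load side can be sketched the same way: recreate the database, restore the dump, and rsync the media files back. Again, the names and paths here are assumptions for illustration, not the real command (dropdb's --if-exists flag requires PostgreSQL 9.2 or later).

```python
import subprocess


def create_db_commands(dbname, user):
    # Drop any stale local copy, then recreate the database fresh.
    return [
        ["dropdb", "--if-exists", dbname],
        ["createdb", "--owner", user, dbname],
    ]


def pg_restore_command(dbname, dumpfile):
    # Restore a custom-format pg_dump archive into the new database.
    return ["pg_restore", "--no-owner", "--dbname", dbname, dumpfile]


def load_devdata():
    # Hypothetical database name, user, and media path.
    for cmd in create_db_commands("myapp", "me"):
        subprocess.check_call(cmd)
    subprocess.check_call(pg_restore_command("myapp", "devdata.dump"))
    # Copy the dumped media files back into the local media directory.
    subprocess.check_call(["rsync", "-az", "devdata_media/", "media/"])
```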