Podcasts are great! They are like talk radio, but available on demand whenever you need them. And the plethora of content and formats covers every niche, from fan-driven formats like on-site theme park reviews, to scripted true-crime documentaries, to simple domain-specific talk shows.
Naturally, some of the podcasts I listen to are better than others. Some even stand the test of time, delivering timeless content that might well be worth a revisit in five or ten years. That might become a problem, though. Like all content on the internet, there is no guarantee that a podcast's feed or file hosting will still be around that far in the future. After all, continuous feed and file hosting costs money, and no one wants to foot that bill for all eternity.
Luckily, local offline storage keeps getting cheaper, enabling us to simply keep a local copy of our favorite podcasts. But how do you go about downloading your favorite podcast? Of course, you could simply use any podcast client (back in my day, these were called podcatchers) and then find the files it has downloaded, probably somewhere deep in your OS's library folders. But even then, you would still need to open that podcast client, add your feed, mark all the episodes for download, and wait…
There is only one problem though: if you have multiple podcasts you continuously want to keep an up-to-date local copy of, things turn tedious very quickly, because you end up having to run a downloader command like podcast-dl on a schedule, for every single podcast.
Which is where my small addition comes in: https://gitlab.com/JanGregor/podcasts-archiver is a dockerized version of Joshua's great tool, with a twist: now you simply need a folder structure like this:
```
podcasts/
├─ podcast-a/
│  ├─ URL
├─ podcast-b/
│  ├─ URL
│  ├─ OPTIONS
```
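For example, a minimal structure for a single show could be created like this; the folder name, the feed URL, and the extra flag in OPTIONS are all placeholders of my choosing, not values from the project:

```shell
# Create an archive folder for one (hypothetical) show:
mkdir -p podcasts/my-favorite-show

# URL holds the podcast's RSS feed URL (placeholder here):
echo "https://example.com/feed.xml" > podcasts/my-favorite-show/URL

# OPTIONS optionally holds extra flags passed along to the downloader:
echo "--threads 2" > podcasts/my-favorite-show/OPTIONS
```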
Then you simply point the Docker container at that folder structure, like so:
```shell
$ docker run -it -v "$(pwd)/podcasts:/podcasts" registry.gitlab.com/jangregor/podcasts-archiver /podcasts
```

and let the magic happen.
The included script will automatically go through every first-level subfolder and look for a URL file containing the podcast's feed URL. If an OPTIONS file is also present, its content is appended to the download command. It then simply runs podcast-dl for each of your podcast subfolders.
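The loop described above could be sketched roughly like this. This is a simplified sketch of the idea, not the repo's actual script: the `archive_all` name is mine, and I am assuming podcast-dl's `--url` and `--out-dir` parameters here:

```shell
# Sketch of the archiver loop: for each first-level subfolder of the
# given root, read URL (required) and OPTIONS (optional), then invoke
# podcast-dl with that subfolder as the output directory.
archive_all() {
  root="${1:-/podcasts}"
  for dir in "$root"/*/; do
    [ -f "${dir}URL" ] || continue        # skip folders without a URL file
    url=$(cat "${dir}URL")
    opts=""
    [ -f "${dir}OPTIONS" ] && opts=$(cat "${dir}OPTIONS")
    # OPTIONS content is intentionally left unquoted so it word-splits
    # into extra command-line flags.
    podcast-dl --out-dir "$dir" --url "$url" $opts
  done
}
```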
Now, do you really need a container for that? Could you not make do with individual cron jobs? Maybe, probably even. But for my particular setup (running this on a schedule on my NAS to permanently keep a copy there), this was the easiest and most portable solution. And since I had already built it in a portable way, I figured I might as well share it with all of you out there.
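For completeness, the schedule itself could be a single crontab entry invoking the container; the path `/volume1/podcasts` and the nightly 03:00 time are made-up examples for a NAS-style setup:

```shell
# Hypothetical crontab entry: refresh the archive every night at 03:00.
# /volume1/podcasts is a placeholder path to the folder structure above.
0 3 * * * docker run --rm -v /volume1/podcasts:/podcasts registry.gitlab.com/jangregor/podcasts-archiver /podcasts
```

Note the `--rm` instead of `-it`: a cron context has no TTY to attach to, and `--rm` cleans up the finished container after each run.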