linux-y question, boosts appreciated
So I'm trying to capture a bunch of websites as PNG or PDF repeatedly through a cronjob. So far the best luck I've had was with Chromium; I also tried CutyCapt and Firefox. However, none of them can reliably capture https://tagesschau.de. If anyone knows a good command-line website-capture tool, I'd appreciate you telling me about it!
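For context, something like this is what I mean — a minimal sketch of a headless-Chromium cron capture. The schedule, output path, and window size are made up, and the binary may be called `chromium-browser` or `google-chrome` depending on the distro:

```shell
# crontab entry: capture every 30 minutes (note: % must be escaped as \% in crontab)
# --virtual-time-budget gives JS-heavy pages a few seconds to settle before
# the screenshot; raise it if captures come out blank or half-rendered.
*/30 * * * *  chromium --headless --disable-gpu \
    --window-size=1920,4000 \
    --virtual-time-budget=10000 \
    --screenshot=/home/user/captures/tagesschau-$(date +\%Y\%m\%d-\%H\%M).png \
    https://tagesschau.de
```

Swapping `--screenshot=…` for `--print-to-pdf=…` gives a PDF instead of a PNG.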
re: linux-y question, boosts appreciated
@Nixfreak that doesn't actually render the page into png or pdf