linux-y question, boosts appreciated
So I'm trying to capture a bunch of websites as PNG or PDF repeatedly through a cronjob. So far, the best luck I've had has been with Chromium; I also tried CutyCapt and Firefox. However, none of them can reliably capture https://tagesschau.de. If anyone knows any good command-line website capturing tools, I'd appreciate you telling me about them!
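For reference, a headless-Chromium capture along these lines is roughly what I've been running (a sketch only; the binary may be called chromium, chromium-browser, or google-chrome depending on distro, and the output path is just an example):

```shell
# one-shot screenshot with headless Chromium
# --disable-gpu avoids issues on some headless setups
chromium --headless --disable-gpu --window-size=1920,1080 \
  --screenshot=/tmp/tagesschau.png https://tagesschau.de
```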
re: linux-y question, boosts appreciated
@Nixfreak I use wkhtmltoimage now and it seems to work great
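In case it helps anyone later, the crontab entry looks roughly like this (the schedule and paths are just examples; note that % must be escaped as \% inside a crontab):

```shell
# capture the page once an hour to a timestamped PNG
0 * * * * wkhtmltoimage https://tagesschau.de /home/user/captures/tagesschau-$(date +\%F-\%H).png
```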
re: linux-y question, boosts appreciated
@noiob oh I'm sorry, you wanted to render a web page into a PDF or PNG; I thought you were trying to scrape PDFs or PNG images, my bad. Could you use ImageMagick to convert HTML to PNG or PDF? I think you can, at least.