linux-y question, boosts appreciated 

So I'm trying to capture a bunch of websites as PNG or PDF repeatedly through a cron job. So far I've had the best luck with Chromium; I've also tried CutyCapt and Firefox. However, none of them can reliably capture tagesschau.de. If anyone knows any good command-line website capture tools, I'd appreciate you telling me about them!
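To be concrete, this is roughly the kind of headless Chromium invocation I mean; the URL, output paths, and window size here are just example values, not my exact command:

# Full-page screenshot with headless Chromium (example values only).
chromium --headless --disable-gpu --window-size=1920,1080 \
    --screenshot=/tmp/tagesschau.png https://www.tagesschau.de/

# Same idea, but rendering the page to PDF instead of PNG.
chromium --headless --disable-gpu \
    --print-to-pdf=/tmp/tagesschau.pdf https://www.tagesschau.de/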

more info:

Firefox only works about half the time; the rest of the time it idles indefinitely at low CPU usage. Chromium always idles, and CutyCapt produces blank files.
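For context, the cron side is something like the sketch below; wrapping the command in coreutils timeout at least keeps a hung browser from piling up, though it obviously doesn't fix the capture itself. The schedule, timeout value, and paths are placeholders:

# Example crontab entry: capture every 15 minutes, kill the run if it
# hasn't finished after 60 seconds (all values are placeholders).
*/15 * * * * timeout 60 chromium --headless --disable-gpu --screenshot=/home/me/caps/tagesschau.png https://www.tagesschau.de/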
