#cron


New blog post!

Backups are an easy task to put off but WOW do you feel the pain when a file is mistakenly deleted or a storage device fails!

I use a combination of ZFS tools + bash scripting + cron(8) to perform an automated daily backup of the contents of my home directory to the home server (both machines are running FreeBSD):

dwarmstrong.org/automate-zfs-b

www.dwarmstrong.org · ZFS Snapshots and Backups Part 3: Backups You Don't Have to Think About are Backups that Get Done · ☯ Daniel Wayne Armstrong · Libre all the things
#ZFS #Bash #Cron
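
For context, an automated routine like that can be sketched roughly as below. The dataset names, remote host, and paths are placeholders, not the author's actual setup; the real scripts are in the linked post.

```sh
#!/bin/sh
# Sketch: snapshot the home dataset daily and send the changes since
# yesterday's snapshot to a backup pool on the home server over SSH.
# Dataset names, SSH target, and paths are placeholders.
set -eu

DATASET="zroot/usr/home/daniel"      # local dataset to back up
REMOTE="backup@homeserver"           # SSH target on the home server
REMOTE_DATASET="tank/backups/home"   # destination dataset

TODAY="$(date +%Y-%m-%d)"
YESTERDAY="$(date -v-1d +%Y-%m-%d)"  # FreeBSD date(1) syntax

# Take today's snapshot, then send only what changed since yesterday.
# (Error handling and pruning of old snapshots are omitted; the
# incremental send assumes yesterday's snapshot exists on both ends.)
zfs snapshot "${DATASET}@${TODAY}"
zfs send -i "${DATASET}@${YESTERDAY}" "${DATASET}@${TODAY}" | \
    ssh "${REMOTE}" zfs receive -F "${REMOTE_DATASET}"
```

A root crontab entry along the lines of `0 2 * * * /usr/local/sbin/zfs-backup.sh` would then run it nightly.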

#Shaarli: Cron Expression Generator - Free online cron expression generator tool. Generate cron expressions by using human language. | Uptimia - A page that lets you build a cron expression from a plain-language sentence.
AI inside? : uptimia.com/cron-expression-ge #cron #expression #ia

Uptimia.com · Cron Expression Generator | Uptimia · Free online cron expression generator tool. Generate cron expressions by using human language.
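
For reference, what such generators produce is the standard five-field crontab expression. An illustrative entry (the command path is hypothetical):

```
# ┌──────── minute (0-59)
# │ ┌────── hour (0-23)
# │ │ ┌──── day of month (1-31)
# │ │ │ ┌── month (1-12)
# │ │ │ │ ┌ day of week (0-6, Sunday = 0)
# │ │ │ │ │
 30 2 * * 1-5 /usr/local/bin/backup.sh   # 02:30, Monday to Friday
```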

I could use the inverse of the chronic tool for use with systemd timers: a command's stdout and stderr should always be collected and sent by e-mail, essentially an equivalent of cron's MAILTO for systemd. Wrapping the command in a script every time somehow doesn't feel right. Or is there already an elegant solution for this? #Linux #systemd #cron #chronic #linuxadmin
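
One workaround that at least avoids a separate wrapper file, sketched below on the assumption that a working mail(1) command is configured on the host (unit name, command, and address are placeholders): inline the pipeline in the service's `ExecStart`.

```ini
# backup-mail.service (sketch): collect all output of the job and
# mail it, roughly what MAILTO does for cron jobs.
[Unit]
Description=Run backup job and mail its output

[Service]
Type=oneshot
ExecStart=/bin/sh -c '/usr/local/bin/backup.sh 2>&1 | mail -s "backup.sh output" admin@example.com'
```

Whether an empty run still produces a mail depends on the mail implementation, and since systemd already captures stdout/stderr in the journal, `journalctl -u backup-mail.service` may be enough without e-mail at all.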

Hey there, fediverse. Do we have any #bash #cron experts who can help me figure out how to fix things?

TL;DR - I need to reference the `$USER` variable in a shell script, where it forms part of a dynamic path in a backup routine (the location varies by host, so I'm trying to automate as much as possible). Running the script as a logged-in user over SSH works perfectly, but when I add a crontab entry under the same user to run the same script, the `$USER` part isn't expanded as expected, so the destination is incorrect and it throws an error.

I should also note that if I specify the path manually in the script it also works, but then it's a pain to automate across different hosts so I'm trying to avoid that if possible.

I've now been going round in circles trying to figure this out for a couple of days, so it's time to ask for help.
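
Background for the question: most cron implementations start jobs with a minimal environment; crontab(5) only promises SHELL, LOGNAME, and HOME, so `$USER` is frequently just unset under cron. One possible fix is to derive the user name inside the script instead; a minimal sketch, with a hypothetical destination path:

```sh
#!/bin/sh
# $USER is often not set in cron's minimal environment, so fall back
# to asking the system who is running the script.
USER="${USER:-$(id -un)}"

# Hypothetical destination built from the user and host names.
DEST="/backups/${USER}/$(hostname -s)"
echo "Backing up to ${DEST}"
```

With Vixie/ISC cron, environment variables can also be set at the top of the crontab itself (e.g. a `USER=...` line), which avoids touching the script.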

I've had my #crontab set up for ages to execute `trash-empty 30` (from the #TrashCLI package) each day, but I just realised it hasn't been working.

Running it manually shows a confirmation prompt, so I assume that's the reason. Apparently trash-empty can detect when it's running interactively, but that detection appears to be buggy.

Thankfully there's an `-f` flag to force the command with no prompt, so changing my crontab entry to `trash-empty -f 30` will hopefully work instead.
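
The resulting crontab entry might look like the line below (the time of day is illustrative, and depending on cron's PATH a full path to trash-empty may be needed):

```
# Purge trashed files older than 30 days every morning, without prompting.
0 7 * * * trash-empty -f 30
```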

I forgot you can't just run a shell script with rsync calls via cron on macOS because of... security. I think my workaround will do though.

I write a shell script and then create an Automator application that calls the shell script. I then add a cron job to open the application.

I need to check if it runs when the screen is locked. I'm pretty sure it does but I will test again.

#rsync #cron #macOS
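
A sketch of what that cron entry can look like, assuming the Automator application is saved as /Applications/RunBackup.app (the name and schedule are placeholders):

```
# Launch the Automator app, which in turn runs the rsync shell script.
0 1 * * * /usr/bin/open -a "/Applications/RunBackup.app"
```

The app may still need to be granted the relevant access (e.g. Full Disk Access) in System Settings for any protected folders it touches.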

So I needed to run some periodic backup jobs, for both personal and professional needs. If you were ever tasked with such a request, you probably looked at cron. But cron has shortcomings: it does not survive power-off events, it does not provide any logging, and you can't easily tell when, or whether, a job actually ran.

Meet systemd timers: a modern approach to cron-like job scheduling.

yieldcode.blog/post/working-wi

yield code(); · Working with systemd timers - Dmitry Kudryavtsev · The other day I thought to myself that it would be a good idea to have some backups of my data. So I was wondering, how would I execute a periodic backup task?
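
A minimal sketch of the timer/service pair the post is about, with placeholder unit names and script path:

```ini
# backup.service - the job itself
[Unit]
Description=Periodic backup job

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup.sh
```

```ini
# backup.timer - the schedule; Persistent=true catches up on runs
# missed while the machine was powered off.
[Unit]
Description=Run backup.service daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now backup.timer`; `systemctl list-timers` shows the last and next runs, and the job's output lands in the journal (`journalctl -u backup.service`).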