Hey there, fediverse. Do we have any #bash #cron experts who can help me figure out how to fix things?
TL;DR - I need to reference the `$USER` variable in a shell script, where it forms part of a dynamic path in a backup routine (the location varies by host, so I'm trying to automate as much as possible). Running the script as a logged-in user over SSH works perfectly, but when I add a crontab entry under the same user to run the same script, it fails to expand the `$USER` part, so the destination is incorrect and it throws an error.
I should also note that if I hard-code the path in the script it works too, but then it's a pain to automate across different hosts, so I'm trying to avoid that if possible.
I've now been going round in circles trying to figure this out for a couple of days, so it's time to ask for help.
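For what it's worth, cron starts jobs with a stripped-down environment, and some cron implementations set `LOGNAME` but not `USER`. A defensive sketch for the top of the backup script (the destination path here is hypothetical, not the poster's actual layout):

```shell
#!/bin/sh
# cron runs jobs with a minimal environment; some implementations set
# LOGNAME but not USER, so derive USER from id(1) when it's missing.
USER="${USER:-$(id -un)}"

# Hypothetical per-host destination built from the user name:
DEST="/backups/${USER}/$(hostname -s)"
echo "backing up to ${DEST}"
```

Alternatively, many crons accept variable assignments at the top of the crontab itself (e.g. a `USER=alice` line before the schedule entries), which avoids touching the script at all.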
I forgot you can't just run a shell script with rsync calls via cron on macOS because of... security. I think my workaround will do though.
I write a shell script and then create an Automator application that calls the shell script. I then add a cron job to open the application.
I need to check if it runs when the screen is locked. I'm pretty sure it does but I will test again.
So I needed to run some periodic backup jobs, both for personal and professional needs. If you were ever tasked with such a request, you probably looked at cron. But cron has shortcomings: it does not survive power-off events, it does not keep logs, and you can't easily tell when, or if, a job actually ran.
Meet systemd timers: a modern approach to cron-like job scheduling.
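As a minimal sketch of what that looks like, a hypothetical pair of units (the names `backup.service`/`backup.timer` and the script path are assumptions, not anything from the post):

```ini
# /etc/systemd/system/backup.service (hypothetical)
[Unit]
Description=Periodic backup job

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup.sh
```

```ini
# /etc/systemd/system/backup.timer (hypothetical)
[Unit]
Description=Run backup.service daily

[Timer]
OnCalendar=daily
# Run a missed job at next boot if the machine was off at the scheduled time:
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now backup.timer`; `systemctl list-timers` shows when it last ran and when it runs next, and `journalctl -u backup.service` has the logs, which covers all three cron complaints above.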
#question #bash #crontab #mysql :
I have a script that does
for db in database1 database2 database3;
do
echo "$db"
if [[ "$db" != "database2" ]]; then
echo "dumping database $db"
mysqldump -u "$USER" -p"$PASSWORD" --databases "$db" > "$db.sql"
fi
done
When I run it directly, no problem, it works. But as a #cron job: nothing, zip, zilch, nada.
And I can't see why.
Any ideas?
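The usual difference is cron's minimal environment: `PATH` may not include the directory holding `mysqldump`, and `$USER`/`$PASSWORD` may not be set at all under cron. A hedged sketch of the same loop with an explicit `PATH` and an absolute output directory (both paths are assumptions; `USER` and `PASSWORD` are assumed to be defined earlier in the script):

```shell
#!/bin/sh
# cron typically runs with PATH=/usr/bin:/bin, so set PATH explicitly
# and write dumps to an absolute path instead of cron's working directory.
PATH=/usr/local/bin:/usr/bin:/bin
BACKUP_DIR=/var/backups/mysql   # hypothetical destination

for db in database1 database2 database3; do
  if [ "$db" != "database2" ]; then
    echo "dumping database $db"
    mysqldump -u "$USER" -p"$PASSWORD" --databases "$db" \
      > "$BACKUP_DIR/$db.sql"
  fi
done
```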
@gerowen good story - I have to pull the trigger on a VPS and implement a martial-law study regime, even on weekends. The server will be a needed mini timesink. It may be a case of the new normal - study for certs until I get a few, then keep studying and taking tests indefinitely until the right money and the right situation magically show up #magical thinking #routine #cron
@jp I'm using #syncthing to sync the posts to my home server. Then a script runs regularly using #cron and updates the site.
I'm using #markor in #android as well.
A couple months back, I bought an external (Samsung) USB SSD. It's a bit aggressive about spinning down for power-saving. As a result, if I want Backblaze to be able to read it as part of its regular activities, I have to do (in #WSL2):
# Touch a file on the drive every 10 seconds so it never sits idle
# long enough to power itself down.
while true
do
touch /mnt/g/BB-backup.touch
printf .
sleep 10
done
The cron service won't run in the WSL instance (no systemd/dbus), but I could always replace it with a #PowerShell script that I put into my Windows laptop's scheduler… But I fucking hate writing PowerShell.

#Linux: How to Use #Cron to Schedule Regular Jobs
https://thenewstack.io/linux-how-to-use-cron-to-schedule-jobs/
I didn't know Linux had a tool to guarantee that only one instance of a cron job runs at a time ... https://linuxhandbook.com/flock-command/ #linux #cron #commandline #bash
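For anyone curious: `flock(1)` (util-linux) takes an exclusive lock on a file before running a command, so a crontab entry like `*/5 * * * * flock -n /tmp/job.lock /usr/local/bin/job.sh` (paths here are made up) skips a run while the previous one is still going rather than stacking them up. A minimal sketch of the behaviour:

```shell
#!/bin/sh
# Demonstrate flock's non-blocking mode: while one process holds the
# lock, a second attempt on the same file fails instead of waiting.
LOCK=/tmp/flock-demo.lock   # hypothetical lock file

(
  flock -n 9 && echo "got the lock"
  # fd 9 is still locked, so a second non-blocking attempt must fail:
  flock -n "$LOCK" true || echo "second attempt: already locked"
) 9>"$LOCK"
```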
I remembered this afternoon that I needed to turn off a #cron job. So I did. I edited root's crontab during half time of the #EnglandA vs #AustraliaA #rugby match. I don't use it often, but #Termius seems to have sprouted an "AI" predictive text tentacle so typing was much easier than it had any right to be. Knowing how to navigate in #vim really helped though. It was a good game too.
@linuxnerd yes, #systemd is a godsend, and everyone who doesn't see it that way hasn't done complex administration and configuration manually with #SysVinit, #daemons, #cron etc.
I dug a bit into #cron on Debian stable (bookworm) and it's quite messy imo. Especially if you're running systemd and also have anacron installed, which is the default.
1/
View all the crontabs on a system
sudo grep -RHEv '^\s*(#|$)' /var/spool/cron/crontabs/
I think I'm done. Ever since Cron became Notion Calendar it's been laggy and glitchy. I also don't like the design as much as I used to because they changed quite a few things ...
Does anyone have a recommendation for an open-source calendar that is able to sync with Google Calendar?
#Calendar #Cron #NotionCalendar
Anyone familiar with #Cron jobs running shell scripts? Please help!
I've got a simple shell script that runs fine manually from the command line but won't from a Cron job. When searching for an answer, it seems to be a common problem but without a common solution. Any ideas?
I have tried / ensured correct:
• #!/bin/bash or #!/bin/sh first line of script
• chmod +x or chmod 755 on script to ensure file permissions are rwx for user
• owner of file is same user as the Cron running the job
• Cron job created using crontab -e whilst logged in as owner of file
• added (and removed) /bin/bash or /bin/sh before the location of the script to run in Cron
• tried cd /home/mastodon && ./purge_cache.sh instead, same outcome
• I am running the latest version of Bookworm (#Debian) on a #RaspberryPi 4 (4GB RAM, 512GB SD)
Cron:
0 2 * * * /home/mastodon/purge_cache.sh
Shell script:
#!/bin/sh
RAILS_ENV=production /home/mastodon/live/bin/tootctl preview_cards remove --days 7
curl -fsS -m 10 --retry 5 -o /dev/null https://hc-ping.com/example1
RAILS_ENV=production /home/mastodon/live/bin/tootctl media remove --days 7 --remove-headers
curl -fsS -m 10 --retry 5 -o /dev/null https://hc-ping.com/example2
RAILS_ENV=production /home/mastodon/live/bin/tootctl media remove --days 7
curl -fsS -m 10 --retry 5 -o /dev/null https://hc-ping.com/example3
Note: The tootctl lines flush different media cache older than 7 days (taken from #Mastodon docs); the curl lines ping healthchecks.io (a Cron job monitor) to say the job ran successfully.
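Not an answer, but the usual suspect is cron's stripped environment (minimal `PATH`, no login-shell setup). One way to see what's going on is to capture the job's output in the crontab, and to re-run the script by hand under a cron-like environment; a sketch (the log path is an assumption):

```shell
#!/bin/sh
# In the crontab, redirect output so errors become visible, e.g.:
#   0 2 * * * /home/mastodon/purge_cache.sh >> /home/mastodon/cron.log 2>&1
# To debug interactively, reproduce cron's stripped-down environment
# instead of waiting for the next 02:00 run:
env -i HOME="$HOME" SHELL=/bin/sh PATH=/usr/bin:/bin \
    /bin/sh /home/mastodon/purge_cache.sh
```

If the script depends on things a login shell sets up (e.g. a Ruby version manager on the `PATH` for `tootctl`, as many Mastodon installs use), the fix is usually to set `PATH` explicitly at the top of the script.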