In our environment we have two DBAs who run a set of cron jobs defined in a cron_jobs.txt file. Frequently we need to change the timing and commands in that cron_jobs.txt file and reload it.
The problem is, if DBA_1 edits and reloads that file ($ crontab cron_jobs.txt), it is loaded under his user account. Then DBA_2 edits the same cron_jobs.txt file and reloads it, and the new set of jobs runs under DBA_2's user account. So essentially the same jobs are running twice.
For example, DBA_1 sets a job to run daily at 3 PM. A few days later, DBA_2 changes it to 2 PM and reloads. Now we have the same job running at both 2 PM and 3 PM.
Is there any way to merge the jobs? Like, no matter who loads the cron_jobs.txt file, only one set of jobs should be running? By the way, DBA_1 has admin access to UNIX; DBA_2 doesn't. We are using SunOS 5.10 (Solaris 10). A solution will be appreciated.
Set up a separate account for this purpose, and let the cron jobs run under that account only.
This is a best practice and offers many benefits beyond fixing your problem:
- Nothing to worry about if either DBA is hit by a bus or otherwise leaves
- The real users' account environments don't affect your production cron jobs
- Coupled with sudo, it provides good logging and accountability
At my previous company, I disabled cron and at for all real user accounts.
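A minimal sketch of this setup on Solaris 10, assuming sudo is installed and using an illustrative service account name (dbajobs) and paths; adjust to your environment:

```shell
# As root: create the dedicated service account (name is illustrative)
useradd -m -d /export/home/dbajobs -s /bin/bash -c "DBA cron jobs" dbajobs

# On Solaris, allow the account to use cron at all
echo dbajobs >> /etc/cron.d/cron.allow

# Via visudo, let both DBAs (and only them) install crontabs as dbajobs:
#   dba1, dba2 ALL = (dbajobs) /usr/sbin/crontab, /usr/bin/crontab

# Either DBA then loads the shared file under the one service account,
# replacing whatever was there before -- so the jobs never run twice:
sudo -u dbajobs crontab /path/to/cron_jobs.txt

# Verify what is actually installed:
sudo -u dbajobs crontab -l
```

Because `crontab file` replaces the account's entire crontab, whichever DBA reloads last wins, and there is only ever one active copy of the jobs.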