Rsync Error Code 23 Kills Script
Hi,
I am using an Automator workflow that runs an rsync shell script to back up multiple computers to a NAS server on the LAN. During this process rsync displays the error "rsync error: some files could not be transferred (code 23)" and it kills my workflow. Does anybody have advice on how I would go about modifying my shell script to tell rsync to ignore such errors, or log them to a text file?

Current script:
rsync -aE --delete ~/(home directory) /Volumes/(NAS Server Partition)/

I am using the bash shell in Mac OS 10.5.2. Any help would be much appreciated. Thanks |
is it over SMB or AFP to the NAS?
Both require authentication, so you may need to toss a username and password into your script. |
It is over AFP.
Yes, I have the username and password in the workflow. The problem is not the connection; it's just that rsync error code 23 kills my workflow when backing up certain computers. Some computers don't throw the error, so the workflow executes perfectly. Thanks |
Try putting afp://ip.of.nas/shared/folder
See if that helps |
Thanks tlarkin, but the problem doesn't lie with the afp connection. The connections are working properly and connecting to server doesn't cause the rsync error. The connection takes place before the shell script is executed.
Rsync displays the error mostly after or during a backup. Thanks |
From the man pages for rsync, under Exit Values:
Code:
23      Partial transfer due to error

This one seems to explain traps better: http://www.davidpashley.com/articles...l-scripts.html |
No worries - figured it out! After the rsync command, include this bit of code:
> (file_name) 2>&1

Where file_name is the name of your log file. The 2>&1 part redirects stderr into stdout, so all output, errors included, goes to the file. |
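A minimal sketch of that redirection, with an exit-status check bolted on so a code 23 doesn't have to kill the rest of a workflow. The log path and the fake_rsync stand-in are assumptions for illustration; substitute your real rsync command:

```shell
#!/bin/sh
# Sketch of the logging approach above, plus an exit-status check.
# fake_rsync stands in for the real rsync call so the sketch runs
# anywhere; the log path is an assumption.
LOG="${TMPDIR:-/tmp}/rsync_backup.log"

fake_rsync() {
    # Replace this function with your real command, e.g.:
    #   rsync -aE --delete ~/ /Volumes/NAS/
    echo "rsync error: some files could not be transferred (code 23)"
    return 23
}

fake_rsync > "$LOG" 2>&1   # stdout and stderr both go into $LOG
status=$?

if [ "$status" -eq 23 ]; then
    echo "partial transfer; see $LOG for skipped files"
elif [ "$status" -ne 0 ]; then
    echo "rsync failed with exit code $status"
else
    echo "backup completed cleanly"
fi
```

Because the status is checked explicitly, a partial transfer can be logged and ignored instead of aborting whatever runs the script.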
Hello hunzinker
Did you ever figure out why you were getting this error? I am having the same problem rsync'ing to a FireWire drive. I did send the output to a log file, but I can't find any error except the code 23 message at the end.
I have run rsync multiple times and it dies in exactly the same spot. Any suggestions would be appreciated. Thanks in advance. |
Hi Nuke,
To answer your question: no, I haven't narrowed the rsync error down to anything specific, but I have found a few clues. Here are a few possible explanations that may cause rsync to throw error code 23:

- Permission problems
- Using *NIX-unfriendly characters in directory / file names

I have tried using --delete-after and --ignore-errors to combat these issues. Sometimes they work and sometimes they don't. If you come across anything I haven't tried here, or you have any suggestions, please let me know.

Again, it's very strange to me because I successfully back up my machine daily via FireWire, while the scripts I use over AFP work some days and fail others. It would be nice to review the TCP dump on both nodes during the script, but I haven't figured out how to do that yet. If anything else comes up, I'll be sure to post it. Thanks |
Hi Hunzinker
Thank you for your reply. If I am running sudo rsync, would that not alleviate the "permissions problem"? Or are there other permissions that I might not know about/understand? I'm having the problem doing the daily backup to a FireWire drive. Since it works OK for you, I'll have to do some more testing. Is there another way to get a list of file errors other than > (file) 2>&1? I'm not getting a full list, as error 23 happens too quickly, but always at the same point. Would the last file in the list be the problem? Thanks again and regards. |
Hi Nuke,
What does your rsync command look like? Are your files backing up? Rsync will display error code 23 even if the rest of your files are being transferred. Give it a run using -vv and see what happens. |
Thanks, I'll try and run these with the additional options tonight when I get home.
The command I am running is: sudo rsync -avE /Users /Volumes/BKUP/ (the /Volumes/BKUP/Users folder is to be sync'd with the /Users folder). Thanks again. |
Hi!
I ran the command: Code:
sudo rsync -avvE /Users /Volumes/Files/ > logsync.txt 2>&1
This doesn't make sense. I feel like bashing my head against the wall. :mad: |
Hi Nuke,
Change this line of code to:
rsync -avvE /Users /Volumes/Files/<Some New Directory>

just to see if rsync will copy the files. If it works, great; if not, hmmmm. |
Thanks for your suggestions. I'll try it tonight when I get home.
I compiled rsync 3.0.4 and installed it in /usr/local/bin yesterday, to see if the built-in rsync 2.6.9 is part of the problem. I'll provide an update based on these tests. Regards |
Update to my last comment re use of rsync 3.0.4.
I tried to rsync the folders again but used rsync 3.0.4 and still end up with error code 23. Hmmmm. I have to ponder what to try next. |
rsync error
I just now happened upon this thread, and have been having the same problems.
I found the solution was to use a directory other than the NAS for rsync's temporary files, which it apparently creates in the destination directory by default. I'm guessing that the NAS drive can't handle the long file names or special characters used for the temporary files.

rsync -aE --delete --temp-dir="~/(home directory)/Desktop/" ~/(home directory) /Volumes/(NAS Server Partition)/ |
Thank you for the suggestion. I'll give it a try.
One question though. If you are using ~/Desktop as the temp directory, what keeps rsync from copying that temp directory over to the NAS? In the past I have used --exclude to ensure that I don't get Caches, /Volumes/, etc. rsync'ing over to the NAS. My initial thought is that using ~/Desktop as a temp directory would be a problem. Should this command also include --exclude=~/Desktop/(tempdir)? Thank you. |
Hi,
I stopped using AppleScript to execute rsync for my weekly backups. Instead, I wrote a bash script, and now everything works perfectly. Error code 23 is still present, but the backup executes 100%. I placed my .sh file in a directory that is not included in the backup path, since I only care about the home directory ~/. I keep my backup script in /backup and log my errors in /backup/errors. Maybe that will help. |
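For reference, a script along those lines might look like the sketch below. The directory layout mirrors the /backup/errors setup described above but points at a temp location here, and run_backup stands in for the real rsync call so the sketch runs anywhere:

```shell
#!/bin/bash
# Sketch of a standalone backup script that logs output and treats
# rsync's code 23 as a soft failure rather than a fatal error.
ERRDIR="${TMPDIR:-/tmp}/backup/errors"   # the post uses /backup/errors
mkdir -p "$ERRDIR"
LOG="$ERRDIR/backup-$(date +%Y%m%d).log"

run_backup() {
    # Replace with the real call, e.g.:
    #   rsync -aE --delete ~/ /Volumes/NAS/
    echo "sent 1234 files"
    return 23
}

run_backup > "$LOG" 2>&1
rc=$?

case "$rc" in
    0)  echo "backup OK" ;;
    23) echo "backup finished with some skipped files; see $LOG" ;;
    *)  echo "backup failed (rsync exit $rc)" >&2; exit "$rc" ;;
esac
```

The case statement is what keeps a code 23 from aborting the caller: only genuinely unexpected exit codes propagate.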
I don't believe these "temp" files are files that are being backed up, so whether you use the desktop or the default (the target directory itself) for rsync's temp files shouldn't matter for what gets copied. I'm not really certain that's what they are, but I can see no other explanation: I get errors on files that don't exist in either the source or the target when the target is a NAS with a 32-character filename limit, yet when I tell rsync to use a directory without that limit for its own temp files, there are no errors. |
As a rank amateur in Terminal, I kept getting the same errors, and this is what I did wrong: rsync didn't like the space in the folder name "Open Projects":

rsync -a -vv ~/Documents/PROJECTS/Open Projects/ /Volumes/Orangecicle/Backup
...
rsync: link_stat "~/Documents/PROJECTS/Open" failed: No such file or directory

I had to rename the folder to "Open_Projects", and the errors stopped and it synced. |
A \ character ahead of a space tells the shell not to treat it as an argument separator, so the space is passed correctly into the rsync command. |
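To see concretely why the unescaped space fails, here is a small self-contained experiment. The folder name comes from the post above; the count_args helper is just for counting how many arguments the shell actually passes:

```shell
#!/bin/sh
# Shows how the shell splits an unquoted path containing a space into
# two arguments, which is what triggers rsync's link_stat error above.
TOP=$(mktemp -d)
mkdir -p "$TOP/Open Projects"

count_args() { echo $#; }

# Unquoted: the shell hands over "$TOP/Open" and "Projects" separately.
unquoted=$(count_args $TOP/Open Projects)
# Quoted or backslash-escaped: one argument, as intended.
quoted=$(count_args "$TOP/Open Projects")
escaped=$(count_args $TOP/Open\ Projects)

echo "unquoted=$unquoted quoted=$quoted escaped=$escaped"
rm -rf "$TOP"
```

So renaming the folder works, but quoting the path or escaping the space achieves the same thing without renaming.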
In case someone is still having troubles with this and comes across this thread, I thought that rsync escaped the same way that I am used to normal files escaping, so I put in as my source:
blah/iTunes\ Library/ etc., and that simply wouldn't work no matter what I did (I am running Snow Leopard, btw). I even tried double quotes around the source; it didn't make a difference. Finally I tried single quotes AND used, instead of a backslash escape, a single ? in place of the space, like so: blah/iTunes?Library. Worked like a charm.

While working on this, I ran across a great posting here: http://forum.synology.com/enu/viewto...p?f=39&t=23124 that shows you how to log great information regarding your transfer. If you do this, you need to be familiar with scripting, because there are some variables you will need to change, and you will also want to single-quote the $1, $2, etc. variables instead of double-quoting them. An easy search and replace does the trick.

I put all my normal rsync modifiers in that script and then ran my script like so (assuming call-rsync.sh is in $PATH; if not, go to that directory and run ./call-rsync.sh as in the example below). Make sure that script is chmod +x. Works a charm:

call-rsync.sh '/Users/acepelon/Music/iTunes/iTunes?Library' /Volumes/ACE-iPod/MusicBackupFromLocalMusic

I really hope that helps someone. Rsync has always been a tiny bit of a mystery to me, but now I will be able to use it confidently, with logging too.

acepelon |
.Trashes can cause this error
Just in case this might help others: I was searching for this error, and after some thought I realized the problem was that rsync couldn't copy a .Trashes folder on a Mac (i.e., a permissions issue). It was difficult to notice at first because I had the -v (verbose) option on and the .Trashes error occurred as the first line of the output, so I never saw it until I turned off verbose.
So this failed: Code:
rsync -a /Volumes/ScottsVMs/ /Volumes/My\ Book/Virtual\ Machines/

but this worked: Code:
rsync -a --exclude='.Trashes' /Volumes/ScottsVMs/ /Volumes/My\ Book/Virtual\ Machines/ |
Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2014, vBulletin Solutions, Inc.
Site design © IDG Consumer & SMB; individuals retain copyright of their postings
but consent to the possible use of their material in other areas of IDG Consumer & SMB.