The macosxhints Forums

The macosxhints Forums (http://hintsforums.macworld.com/index.php)
-   UNIX - General (http://hintsforums.macworld.com/forumdisplay.php?f=16)
-   -   backups using tar: are there any limitations? (http://hintsforums.macworld.com/showthread.php?t=79820)

cocotu 10-23-2007 04:22 PM

backups using tar: are there any limitations?
 
I'm using this in a crontab:

00 03 * * * tar cvzf /Volumes/backupdisk/projects.bak/projects.tgz /Volumes/DataHD/projects 2>&1 | tee /Users/bhadmin/projects.log

But I'm not able to back up all the contents of the projects directory. I get this error about halfway through tarring the contents of the projects directory (234.55 GB):

tar(2209) malloc: *** vm_allocate(size=8421376) failed (error code=3)
tar(2209) malloc: *** error: can't allocate region
tar(2209) malloc: *** set a breakpoint in szone_error to debug
tar(2209) malloc: *** vm_allocate(size=8421376) failed (error code=3)
tar(2209) malloc: *** error: can't allocate region
tar(2209) malloc: *** set a breakpoint in szone_error to debug
tar: memory exhausted

So my question is: is there a size limit when doing this with tar? If so, can you point me to another method or options? thanks

baf 10-23-2007 07:11 PM

Hmm, half of 234 GB feels close to the 128 GB magic number for your file system?

Well, here comes an idea.
This is only tested on a small scale, and it doesn't work with the z flag, but the files could be compressed afterwards if it's the kind of data that benefits from compression.
See also this page

backup with:
tar -c -f ttt/ut1.tar -f ttt/ut2.tar -f ttt/ut3.tar -L 1024 Wcalc-2.3/

where 1024 is changed to a size suitable for you (kilobytes per file); with 1024 as the setting this gives 1 MB per file.
Make sure you give enough filenames to contain the whole archive. It doesn't hurt to have too many, but too few will stop the backup.

test archive with:
tar -tM -f ttt/ut1.tar -f ttt/ut2.tar -f ttt/ut3.tar

extract:
tar -xM -f ttt/ut1.tar -f ttt/ut2.tar -f ttt/ut3.tar

Good luck, and tell us if it works or not.
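The backup/test/extract commands above repeat the same list of -f flags three times; a small POSIX shell sketch can build that list once and reuse it. The DEST, COUNT, and SIZE_KB names below are illustrative, not from the thread:

```shell
# Build the repeated "-f DEST/utN.tar" argument list once.
# DEST, COUNT, and SIZE_KB are illustrative names; adjust to taste.
DEST=ttt
COUNT=3
SIZE_KB=1024

FARGS=""
i=1
while [ "$i" -le "$COUNT" ]; do
    FARGS="$FARGS -f $DEST/ut$i.tar"
    i=$((i + 1))
done

# The same list then serves all three invocations:
echo "tar -c$FARGS -L $SIZE_KB Wcalc-2.3/"
echo "tar -tM$FARGS"
echo "tar -xM$FARGS"
```

Raising COUNT is cheap, which matters given that too few volume files stops the backup.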

cocotu 10-23-2007 10:52 PM

thanks baf. So what you're saying is this:

tar -c -f ttt/Volumes/backupdisk/projects.bak/projects.tar -L 1024 Wcalc-2.3/

Why are you making 3 tar files? I'm not sure I'm understanding; I'm backing up to an external HD. thanks again.

baf 10-24-2007 04:23 AM

No, I was making three files so that I wouldn't hit a program or filesystem size limit; you do that by giving several -f flags. That ttt was the directory I backed up to.
So one way to make a backup of Wcalc-2.3 to three files in ttt is:
tar -c -f ttt/ut1.tar -f ttt/ut2.tar -f ttt/ut3.tar -L 1024 Wcalc-2.3/

And if you look at that link I gave, they show a script that can do the split/rename by itself. I will play with that and see if I can get it to compress the files too. Warning: that only works with GNU tar, but that's the one used by Mac OS X.

cocotu 10-24-2007 10:26 AM

I see you're a Major Leaguer; that is why I wasn't understanding. So you're saying that tar has a size limit and that I need to back up into 3 tar files? thanks for your help baf.

baf 10-24-2007 11:14 AM

Well, I'm saying it looks like a limit, so test it with three files. I don't know whether the limit is in tar, gzip, or the filesystem, or, possibly worse, whether it's a problem with one of your files. But test it and tell us if it works or not.

cocotu 10-24-2007 03:37 PM

baf, I'm running the command and I get this:

Prepare volume #4 for `/Volumes/backupdisk/test/p1.tar' and hit return:
Prepare volume #5 for `/Volumes/backupdisk/test/p2.tar' and hit return:
Prepare volume #6 for `/Volumes/backupdisk/test/p3.tar' and hit return:
Prepare volume #7 for `/Volumes/backupdisk/test/p1.tar' and hit return:
Prepare volume #8 for `/Volumes/backupdisk/test/p2.tar' and hit return:

not sure what to do, because I'm going to have this in a cron job. Also, why are we using the -L option? I was reading that this option is for when you're backing up to tape; I'm backing up to an external HD.

I just tried without the -L option and got:

tar: Multiple archive files require `-M' option
Try `tar --help' for more information.

This is the command I'm using:
tar -c -f /Volumes/backupdisk/test/p1.tar -f /Volumes/backupdisk/test/p2.tar -f /Volumes/backupdisk/test/p3.tar /Volumes/DataHD/projects 2>&1 | tee /Users/bhadmin/projects.log

then I included the -M option and got:

tar: /Volumes/backupdisk/test/p1.tar: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now

thanks!

baf 10-25-2007 03:30 AM

Quote:

backup with:
tar -c -f ttt/ut1.tar -f ttt/ut2.tar -f ttt/ut3.tar -L 1024 Wcalc-2.3/

where 1024 is changed to a size suitable for you (kilobytes per file); with 1024 as the setting this gives 1 MB per file.
Make sure you give enough filenames to contain the whole archive. It doesn't hurt to have too many, but too few will stop the backup.
You have to change the 1024, as I said, to a bigger number. And yes, -L is mostly used for tapes, but also for splitting to files.

-L 102400000 should give files around 100G
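GNU tar's -L value is in units of 1024 bytes, so the number for a given volume size can be computed rather than guessed. A quick sketch (TARGET_GIB and L_VALUE are just illustrative names):

```shell
# GNU tar's -L is in units of 1024 bytes (KiB), so a target volume
# size in GiB converts as GiB * 1024 * 1024. TARGET_GIB is illustrative.
TARGET_GIB=100
L_VALUE=$((TARGET_GIB * 1024 * 1024))
echo "$L_VALUE"    # 104857600, i.e. -L 104857600 for ~100 GiB volumes
```

The round 102400000 suggested above is the same idea, just rounded; it comes out to roughly 97.7 GiB per volume.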

cocotu 10-26-2007 12:32 PM

thanks baf. This has turned out to be a little complicated. I found an app called iBackup. It backs up the dir and zips each directory inside the projects folder, e.g.:

projects (main dir)

dir1.zip
dir2.zip
dir3.zip
and so on!

Now, the problem with iBackup is that I noticed it takes too long; not sure of the reason, maybe because it's the first time. iBackup has an option for erasing files/folders that were erased at the source. If we can do this at the command line I will abandon iBackup. iBackup uses ditto and cpio, and zip for compression. Can we do this? What would be the most practical way of doing this backup? The destination HD does not have enough room unless the data is compressed. thanks baf.

baf 10-26-2007 01:55 PM

Which version of tar do you use? Try:
tar --version

Do you have the Developer Tools installed?
The reason I ask is that it might work with a newer tar.

cocotu 10-26-2007 06:02 PM

tar --version is:

tar (GNU tar) 1.14 +CVE-2006-0300 +CVE-2006-6097
Copyright (C) 2004 Free Software Foundation, Inc.
This program comes with NO WARRANTY, to the extent permitted by law.
You may redistribute it under the terms of the GNU General Public License;
see the file named COPYING for details.
Written by John Gilmore and Jay Fenlason.
Modified to support extended attributes.

I don't have any developer tools installed, but I can get them. Are there any specific ones?
Thanks baf

baf 10-26-2007 06:15 PM

Well, the Developer Tools are a package on your OS X install disks; I think you just have to scroll down a bit after having mounted the CD. Then fetch ftp://ftp.gnu.org/gnu/tar/tar-1.18.tar.gz
and save it somewhere.
In Terminal, do:
tar zxf path/to/it/tar-1.18.tar.gz
cd tar-1.18
./configure --prefix=/usr/local
make
sudo make install

Now you should have a new tar in /usr/local/bin; you will have to use its full path in your crontab. Try that one and hope/pray or whatever fits your ideas.
Good luck. And if that doesn't work, I could whip up a shell script which mimics iBackup's workflow. But then you will have to answer some more questions:
Are there any files directly in /Volumes/DataHD/projects, or just directories?
Do you want one logfile or one per directory?
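A script along the lines described here, one .tgz per subdirectory with a single logfile, might look like the sketch below. It is untested against the poster's setup; the function name and the /tmp demo paths are invented for illustration:

```shell
#!/bin/sh
# Sketch: make one .tgz per subdirectory of $1 in directory $2,
# appending all tar output to the single logfile $3.
# backup_per_dir is an invented name, not from the thread.
backup_per_dir() {
    src=$1 dest=$2 log=$3
    : > "$log"                              # start a fresh logfile each run
    for dir in "$src"/*/; do
        name=$(basename "$dir")
        tar czf "$dest/$name.tgz" -C "$src" "$name" >> "$log" 2>&1
    done
}

# Demo on a throwaway tree; the real call would use the thread's paths,
# e.g. backup_per_dir /Volumes/DataHD/projects /Volumes/backupdisk/projects.bak /Users/bhadmin/projects.log
mkdir -p /tmp/bpd-demo/src/dir1 /tmp/bpd-demo/src/dir2 /tmp/bpd-demo/dest
echo hello > /tmp/bpd-demo/src/dir1/file.txt
backup_per_dir /tmp/bpd-demo/src /tmp/bpd-demo/dest /tmp/bpd-demo/projects.log
ls /tmp/bpd-demo/dest
```

Note this skips any loose files sitting directly in the source directory, which is exactly why the question above matters.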

cocotu 10-27-2007 04:58 PM

1. There are sub-directories containing folders and files under the projects directory.
2. I would like one logfile.

I will install tar 1.18 as soon as I can. thanks baf.

Mailman42 10-27-2007 09:52 PM

I thought that tar had a 2 GB limit on file size creation?

Has this changed?

bb5ch39t 10-28-2007 05:14 AM

Your crontab has:

00 03 * * * tar cvzf /Volumes/backupdisk/projects.bak/projects.tgz /Volumes/DataHD/projects 2>&1 | tee /Users/bhadmin/projects.log

Why are you using a pipe into "tee"? That is normally done to view output interactively as well as writing it to a file. I don't know why you'd do this in a crontab.

Why not do:

00 03 * * * tar cvzf /Volumes/backupdisk/projects.bak/projects.tgz /Volumes/DataHD/projects >/Users/bhadmin/projects.log 2>&1

Instead? BTW, I just used tar to create an 8 GB output file.
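The difference is easy to demonstrate: both forms capture stdout and stderr in the file, but tee additionally echoes to a terminal nobody is watching under cron. A quick sketch, with /tmp/cron-demo.log as an illustrative path:

```shell
# A stand-in for the tar job: one line to stdout, one to stderr.
# Plain redirection sends both to the logfile, no tee needed.
sh -c 'echo archived-ok; echo some-warning >&2' > /tmp/cron-demo.log 2>&1
cat /tmp/cron-demo.log    # both lines end up in the logfile
```

The order matters: `> file 2>&1` must come in that order, since `2>&1` duplicates wherever stdout currently points.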

cocotu 10-29-2007 02:54 PM

so what is the limit in the newest tar version, 4 GB or 8 GB? Can this be accomplished if I want to back up a 234.55 GB folder? Can you provide a link to these developer tools (C compiler & make command)? thanks for all the help.

bb5ch39t 10-29-2007 06:24 PM

IIRC, the developer tools are on the OS X installation disk. I know that I didn't download them! The tar that I used is also the one that came with my OS X installation. I'm not aware of any particular limit on the file size in tar itself; if there is one, I would think that it is due to a filesystem limitation. Modern filesystems should not have a limit. Well, I think it may be 2^64 for a 64-bit filesystem.

tar --version says:

tar (GNU tar) 1.14 +CVE-2006-0300 +CVE-2006-6097
Copyright (C) 2004 Free Software Foundation, Inc.
This program comes with NO WARRANTY, to the extent permitted by law.
You may redistribute it under the terms of the GNU General Public License;
see the file named COPYING for details.
Written by John Gilmore and Jay Fenlason.
Modified to support extended attributes.

It is in the /usr/bin directory.

cocotu 10-30-2007 10:59 AM

so what will be the conclusion of this topic? Possible or impossible? thanks

baf 10-30-2007 02:04 PM

Have you tried with a newer tar yet?

cocotu 10-30-2007 02:56 PM

It's running right now. When I installed the Developer Tools there was no need to download the newest tar from source. After installing the Developer Tools I ran tar --version and got:
tar-1.18. But I just ran the same command right now and I get tar-1.14 again; not sure what happened! thanks

baf 10-30-2007 04:21 PM

You seem to have at least two tars installed, so the result depends on which one it finds.
In Terminal, do:
sudo find / -name tar -print

and then try each with
/path/to/it/tar --version

Then use the full path in your crontab.

cocotu 10-30-2007 05:07 PM

running sudo find / -name tar -print
I got:

/usr/bin/tar

then I did: /usr/bin/tar --version, and got:

tar (GNU tar) 1.14 +CVE-2006-0300 +CVE-2006-6097
Copyright (C) 2004 Free Software Foundation, Inc.
This program comes with NO WARRANTY, to the extent permitted by law.
You may redistribute it under the terms of the GNU General Public License;
see the file named COPYING for details.
Written by John Gilmore and Jay Fenlason.
Modified to support extended attributes.

baf 10-30-2007 05:41 PM

Hmmm, then I don't understand how you could see tar-1.18, as you said in post #20?

OK, it looks like you will have to download the source and compile it like I showed in an earlier post. I still don't know if it will help, but it won't hurt at least.
Oh, use 1.19 instead, not 1.18 as I said earlier.

And if that doesn't help, I'll fix you a shell script that makes one tgz file from each directory.

But I won't do that until you have tried with a newer version, because now I'm curious.

And I tested with 1.19 on a Linux box, and it seemed OK with a 130 GB+ tar file. So it may be a problem only on OS X, or with that 1.14 version of tar.

hayne 10-30-2007 06:16 PM

I note that the error messages shown in post #1 seem to indicate that the problem is with the memory used by 'tar', not with the disk space taken up by the file tar is trying to produce. So the problem may have more to do with the structure of the files & folders being tar'ed than with the total size. That is, it looks like 'tar' is exceeding some maximum allowed amount of memory.

Hal Itosis 10-30-2007 08:35 PM

type -a tar

will find multiple occurrences, as well as aliases (where options might be set).

-HI-

cocotu 10-30-2007 09:57 PM

After running tar again I got:

tar(27702) malloc: *** vm_allocate(size=8421376) failed (error code=3)
tar(27702) malloc: *** error: can't allocate region
tar(27702) malloc: *** set a breakpoint in szone_error to debug
tar(27702) malloc: *** vm_allocate(size=8421376) failed (error code=3)
tar(27702) malloc: *** error: can't allocate region
tar(27702) malloc: *** set a breakpoint in szone_error to debug
tar: memory exhausted

hayne may be right. After typing type -a tar I got:

tar is /usr/bin/tar

thanks for all the help. Do you think I should use zip instead of tar?

cocotu 11-03-2007 08:38 AM

tar 1.18 worked!
 
baf, tar 1.18 worked fine! Why doesn't the tar website specify this? Or maybe I just didn't see that info. thanks for the help!

baf 11-03-2007 11:54 AM

Ah, interesting. Possibly it's a bug fix that even they didn't know would fix this? Sometimes fixing one thing solves other problems you didn't know about. Or it could be that the act of compiling it yourself solved it; perhaps Apple used some different settings?
But I'm glad we finally solved it.

P.S. In the process I found and fixed a bug in tar so now it has one less.

cocotu 11-06-2007 11:05 AM

baf, the command worked fine. I placed it in a cron job to run every day at 3 AM. It is 11:00 AM and tar is still running. When I ran it manually that day, it did not take 8 hours to complete. Do you think it's something to do with cron? thanks

