The macosxhints Forums > OS X Help Requests > UNIX - Newcomers



Old 05-30-2002, 02:41 AM   #21
osxpez
Major Leaguer
 
Join Date: May 2002
Location: Sweden
Posts: 282
I forgot about those resource forks. I think I might somehow have assumed that it was something only concerning that old stinking carcass of an OS, what was its name ... yes! OS 9. =) Can someone shed some light on this? Is it Carbonized apps that still need them or is OS X in itself or Cocoa dependent on them? In OS 9 I know things like document and folder icons resided in the resource fork. Is this still the case with OS X?

Anyway, I have read somewhere about a patched version of tar, called hfstar, that should honour resource forks. When I read it I thought it was for being able to tar Classic stuff, but seeing this thread develop I wonder if hfstar might be the answer? I mean since pyrogen seemed to be OK with the first tar solution if it wasn't for those resource forks...

As for periodic backups and preserving CPU and all that; I thought the original question was about making a backup to CD media, on demand, of a home folder. I say "do the simplest thing that could possibly work" and worry about CPU cycles and periodicity when (or if) it gets to be a real issue.
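For reference, the "simplest thing that could possibly work" tar approach is only a couple of commands. This is a sketch on throwaway paths, not the exact commands from earlier in the thread, and note that a stock 10.1-era tar copies data forks only — which is the crux of this whole discussion:

```shell
# Sketch: archive a folder with plain tar plus gzip compression.
# Paths are throwaway examples; stock tar of this era drops
# resource forks, which is the issue being debated in this thread.
mkdir -p /tmp/tar_demo/home
echo "some document" > /tmp/tar_demo/home/notes.txt
tar czf /tmp/tar_demo/home_backup.tar.gz -C /tmp/tar_demo home
# List the archive contents before trusting it:
tar tzf /tmp/tar_demo/home_backup.tar.gz
```

If hfstar really is a patched tar, the invocation would presumably be identical, just with the patched binary.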
__________________
/PEZ
osxpez is offline   Reply With Quote
Old 05-30-2002, 03:22 AM   #22
mervTormel
League Commissioner
 
Join Date: Jan 2002
Posts: 5,536
Exclamation rez forks

Quote:
Originally posted by osxpez
Is it Carbonized apps that still need them or is OS X in itself or Cocoa dependent on them? In OS 9 I know things like document and folder icons resided in the resource fork. Is this still the case with OS X?

mox nix. i don't want to have to know that a file i create or change has a resource fork or not, not in the past, present, or future. i want assurity that my backup has integrity. if a command doesn't facilitate it, i can't count on it, because i can't count on myself for making sure.

therefore, if tar/pax/cpio doesn't get rez forks, drive on and look for another facility.

[ i've got 15 year old data that may, or may not, have rez forks. i don't want to know. i want backup integrity. that is a plight of every sysadmin in the world. is my backup 100% reliable? ]

currently, i know that the -rsrc flag of ditto assures me i have a valid backup. i'm exploring hfspax and psync; if my tests indicate they are reasonable facilities, i will post.

i think what we are looking for here is a facility that will back up whatever with integrity and assurity (and compression when it makes sense).

as of yet, i don't think there is such a command line facility without some 'home grown' brains.

as for "do the simplest thing that could possibly work," well, if you've got rez forks in the source and you use anything that doesn't facilitate them, then, on restore, you've got less than 100% success, which is failure, which means it doesn't work.

apple should have produced iBackup way before releasing the monster. too many schisms.

Last edited by mervTormel; 05-30-2002 at 03:31 AM.
mervTormel is offline   Reply With Quote
Old 05-30-2002, 03:37 AM   #23
pyrogen
Prospect
 
Join Date: Feb 2002
Posts: 48
osxpez

error

well when I attempt:

% hdiutil create -sectors 2676896 /Users/Shared/pyrogen_backup

i get the message:

hdiutil: create failed - No such file or directory

i believe this is because i have no 'Shared' folder in '/Users'

is it acceptable to:

% hdiutil create -sectors 2676896 /Users/pyrogen_backup

forgoing the 'Shared' folder and creating the disk image in '/Users' ?

OR

Should I create a shared folder in '/Users'?
If so would I use 'mkdir' ?

Thanks,
Destin
pyrogen is offline   Reply With Quote
Old 05-30-2002, 03:47 AM   #24
osxpez
Major Leaguer
 
Join Date: May 2002
Location: Sweden
Posts: 282
merv: This was about an ad-hoc backup of a home folder. If it was me having that need, I would like to know whether I need to concern myself with resource forks or not. As for a sysadmin's need for 100% integrity, I could not agree more. But 15 year old data should already be properly backed up, if you ask me... =)
__________________
/PEZ
osxpez is offline   Reply With Quote
Old 05-30-2002, 04:17 AM   #25
sao
Moderator
 
Join Date: Jan 2002
Location: Singapore
Posts: 4,237
MervTormel,

Thanks for the great input, learned a lot from it.

Using 'psync' worked well for me. (I could easily restore some files I lost from my backup hard drive.)

I am eagerly awaiting your test results. Thanks again for all the work.


Cheers...
sao is offline   Reply With Quote
Old 05-30-2002, 04:17 AM   #26
mervTormel
League Commissioner
 
Join Date: Jan 2002
Posts: 5,536
Quote:
Originally posted by osxpez
merv: This was about an ad-hoc backup of a home folder. If it was me having that need, I would like to know whether I need to concern myself with resource forks or not. As for a sysadmin's need for 100% integrity, I could not agree more. But 15 year old data should already be properly backed up, if you ask me... =)

well, like i said, it's a case of situational awareness. you either know, or you don't. in the case of "don't know", you'll likely be caught with your pants down.

an ad-hoc backup is going to let you down when you need it most. you get comfortable thinking, "i'm assured, my backup is good", when clearly you need to run a few tests that are a little esoteric to the regular user.

as for my 15 year old data, it is backed up. but where? you come over here and hunt for it. i'd rather know it's safely backed up every day than to go wading in my cellar.

it's not a lot of data, but it's critical, ergo, the caution.

ever heard of tape decay? well, CDs decay as well, and who ever verifies/tests their backup? as in "restore it"? lo, many stingers in the retrospect parade, as i see it.

i predict the future will be filled with many frivolous lawsuits called "Your CD Media Ate My Balls"

--
"gimme hot backup or gimme death"
-- merv henry

Last edited by mervTormel; 05-30-2002 at 04:31 AM.
mervTormel is offline   Reply With Quote
Old 05-30-2002, 04:29 AM   #27
mervTormel
League Commissioner
 
Join Date: Jan 2002
Posts: 5,536
Quote:
Originally posted by sao
...Using 'psync' worked well for me. (I could easily restore some files I lost from my backup hard drive)...

sao,

thanks for the endorsement. it's appealing since psync does an incremental backup and can be instructed to not remove files from the target. i shall focus my investigation on psync.
mervTormel is offline   Reply With Quote
Old 05-30-2002, 05:20 AM   #28
JayBee
Major Leaguer
 
Join Date: Jan 2002
Location: Edinburgh, Scotland
Posts: 437
<embarrassment facecolour="scarlet" pantsstatus="down">
Thanks for the catch Merv - I've changed the disk references to diskXs1, diskXs2 etc so a copy&paste effort won't munge anyone's hardware.

That's what I get for posting a long HOWTO without proofreading just before I go to bed...
</embarrassment>

pyrogen: yes, sorry - once again my "completeness" was incomplete! The directory you're punting the image to has to exist. Anywhere you have write permissions will be fine, just as long as it's not contained in the directory you're trying to back up.
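So for pyrogen's error, creating the directory first is the fix, and yes, mkdir is the tool. A sketch on a throwaway path — the real-system commands are shown as comments, with the sector count from earlier in the thread:

```shell
# Sketch: hdiutil's "create failed - No such file or directory" just
# means the destination directory doesn't exist, so create it first.
# Demonstrated on a throwaway path.
mkdir -p /tmp/mkdir_demo/Users/Shared
# On the real system the equivalent would be:
#   sudo mkdir -p /Users/Shared
#   hdiutil create -sectors 2676896 /Users/Shared/pyrogen_backup
ls -d /tmp/mkdir_demo/Users/Shared
```

Equally, any other directory you have write permission for works as a destination, as noted above.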

Myself, all my backups live in a separate partition called Dump. So I write out the image to /Volumes/Dump/Backups/.

Erm, anything else? Yeah, merv, I totally agree about the backup integrity thing. I've always used disk images like this when possible - my CD burner used to live on my PC, so it was the easiest way to punt my backups cross-platform and be sure they'd be good. You don't even really have to muck about in the terminal as much to do this. Disk Copy and Get Info can happily put a GUI onto the image creation process. All you need the terminal for is to ditto the backup across.

<topicdrift distance="not very far">
Off-topic, has anyone had experience with restoring a backup from mysql using the "dump" command? I did this a while ago before my last reinstall, and although I can see what the sql file is supposed to be doing, it can't seem to restore users/permissions etc. I managed to munge a restore via some nifty copy&paste work, after setting up the various users in mysql by hand, but I was wondering what people's general restore pattern was for mysql.

As merv says, a backup you can't restore from is no backup
</topicdrift>
__________________
JayBee
--

It's all relative, you know
JayBee is offline   Reply With Quote
Old 05-30-2002, 06:59 AM   #29
sao
Moderator
 
Join Date: Jan 2002
Location: Singapore
Posts: 4,237
Jaybee,

Have you tried using 'psync' ?

I use it to mirror my primary drive to a second drive every night, and it works great. It copies both the resource forks and the Finder bits.

The way to get 'psync' is via the Perl module "MacOSX::File". It requires you to have installed the 10.1 Developer Tools. The module can be found at:

http://search.cpan.org/search?dist=MacOSX-File

To backup everything in your startup volume, all you have to say is:

sudo psync -d / /Volumes/<backup volume>

The resulting backup volume is a fully bootable copy of the original. Note that `sudo' or root privilege is necessary to preserve file ownership.
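psync itself is Mac-only, but one cheap sanity check after any backup run is to compare file counts on source and target. A sketch with throwaway directories standing in for the real volumes — it won't prove resource-fork fidelity, it just catches grossly incomplete runs:

```shell
# Sketch: crude post-backup sanity check -- compare file counts.
# Throwaway paths stand in for the real source and backup volumes.
mkdir -p /tmp/sync_demo/src /tmp/sync_demo/dst
echo a > /tmp/sync_demo/src/a
echo b > /tmp/sync_demo/src/b
cp /tmp/sync_demo/src/a /tmp/sync_demo/src/b /tmp/sync_demo/dst/
src_count=$(find /tmp/sync_demo/src -type f | wc -l)
dst_count=$(find /tmp/sync_demo/dst -type f | wc -l)
[ "$src_count" -eq "$dst_count" ] && echo "file counts match"
```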


Cheers...
sao is offline   Reply With Quote
Old 05-30-2002, 07:09 AM   #30
sao
Moderator
 
Join Date: Jan 2002
Location: Singapore
Posts: 4,237
Also, if you would like to use hfspax: I found that 'genecutl' wrote a nice Perl script for making incremental Mac OS X backups with hfspax.

Here is his web page, so you can check it out:

http://smalltime.com/gene/bax.html

I haven't tried it myself, but I know several people who have, and they say it respects the resource forks of files on HFS+ volumes.

Cheers...
sao is offline   Reply With Quote
Old 05-30-2002, 07:37 AM   #31
sao
Moderator
 
Join Date: Jan 2002
Location: Singapore
Posts: 4,237
Jaybee,

I know how to restore the whole NetInfo database by following this article; maybe you can adapt parts of it for restoring a backup from mysql:

--------------------------------------
1. Boot the computer into single-user mode by holding down Command-S as it begins to boot. It'll spew some information about the boot sequence, then drop you into a command line. At this point, the system is only partly started -- most important for our purposes, NetInfo hasn't been started yet. But also a lot of other things haven't been done that'll need to be taken care of by hand, like getting the boot disk checked and mounted for write access.

2. Use the command "fsck -y" to check the integrity of the boot disk's file structure. If it makes any repairs (it'll print "***** FILE SYSTEM WAS MODIFIED *****"), run it again. Keep running it until it stops finding problems.

3. Use the command "mount -uw /" to remount the boot disk with write access enabled.

4. "ls -l /var/backups" – this prints a list of everything in the backups directory. It should respond with something like:
  total 40
  -rw-r--r--  1 root  wheel  19001 Aug  4 03:15 local.nidump
The date on the file (in this case "Aug 4 03:15") indicates when the backup was made. If it's not from a time when the computer was working right, or if the response doesn't list a file named "local.nidump" (e.g. if it simply gives you the localhost# prompt without printing anything first), you don't have an appropriate backup, and these instructions won't work for your situation. Sorry.

5. "cd /var/db/netinfo" – this gets us to the directory where the live NetInfo databases are kept.

6. "mv local.nidb local.nibad" – inactivate the damaged database by giving it an invalid name.

7. "nicl -raw local.nidb -create" – build a nice clean (empty) replacement database in its place.

8. "nicl -raw local.nidb -create /users/root uid 0" – for one of the later steps, we need the root user to exist in NetInfo, so create it now.

9. "SystemStarter" – start up more of the system, including the NetInfo and lookupd infrastructure.

Note: on Mac OS versions 10.1.1 and 10.1.2 there appears to be a bug that prevents SystemStarter from operating properly. See below for a workaround.

10. "niload -r / . </var/backups/local.nidump" – load the contents of the backup into the new (live) database.

Notes: be careful to use "<", not ">". ">" will erase your backup file. Also, be patient; it can take a minute to rebuild the entire database. But if it takes more than 10 minutes or so, something probably went wrong, and you may need to start over.

11. "reboot" – restart the system, this time in a more normal fashion.

12. If all goes well, you can now delete the damaged NetInfo database, /var/db/netinfo/local.nibad (or whatever you renamed it to). If not, or if you don't trust the rebuilt database, hang onto it; you can always switch back to it if necessary.
------------------------------------------------------------------------

Manually starting NetInfo under Mac OS X 10.1.1-.2

The SystemStarter command doesn't seem to work under Mac OS X version 10.1.1-.2; you can start the relevant system components using the commands:

1. "/System/Library/StartupItems/Network/Network"
2. "/System/Library/StartupItems/DirectoryServices/DirectoryServices"

Then proceed to the niload step.
--------------------------------
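The "<" vs ">" warning in step 10 above deserves a concrete illustration: the shell truncates a ">"-redirected file before the command even runs, so getting the direction wrong destroys the backup instantly. A sketch on throwaway files:

```shell
# Sketch: why "<" vs ">" matters in step 10 -- the shell truncates
# a ">" target before the command runs, so ">" aimed at the backup
# file would destroy it. Throwaway files, not real nidump data.
echo "precious backup data" > /tmp/redir_demo.nidump
# Correct direction: feed the backup INTO the command with "<":
wc -c < /tmp/redir_demo.nidump
# What ">" does, demonstrated on a scratch file (NOT the backup):
echo "scratch data" > /tmp/redir_demo_scratch
: > /tmp/redir_demo_scratch   # now empty: ">" truncated it
[ -s /tmp/redir_demo.nidump ] && echo "backup intact"
```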


Cheers...

Last edited by sao; 05-30-2002 at 07:51 AM.
sao is offline   Reply With Quote
Old 05-30-2002, 10:33 AM   #32
osxpez
Major Leaguer
 
Join Date: May 2002
Location: Sweden
Posts: 282
Quote:
i predict the future will be filled with many frivolous lawsuits called "Your CD Media Ate My Balls"

Yes, sad but very probable.

I'm still curious about whether pure OS X uses resource forks or not, though.
__________________
/PEZ
osxpez is offline   Reply With Quote
Old 05-30-2002, 01:06 PM   #33
pyrogen
Prospect
 
Join Date: Feb 2002
Posts: 48
How long does it generally take to ditto the user info?

I've been waiting about ten minutes, and the terminal is just giving me the beachball.

Thanks destin
pyrogen is offline   Reply With Quote
Old 05-30-2002, 01:17 PM   #34
pyrogen
Prospect
 
Join Date: Feb 2002
Posts: 48
Wink perfect

A completely successful backup, thanks to Osxpez and Merv,
Regards,
Destin
pyrogen is offline   Reply With Quote
Old 05-30-2002, 01:19 PM   #35
JayBee
Major Leaguer
 
Join Date: Jan 2002
Location: Edinburgh, Scotland
Posts: 437
My backup operation usually takes no more than 6 minutes. This is on a G4 733, and my home directory is around 250 meg.

Depends on the size of your user directory, is the general answer. However, I've never had the terminal beachball on me while dittoing before...
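One way to tell a long ditto run from a hung one is to check up front how much data it has to move. A sketch on a throwaway directory; on the real system you would just run `du -sh ~`:

```shell
# Sketch: size up the source before a long copy, so a slow run
# isn't mistaken for a hang. Throwaway directory for illustration;
# the real-world equivalent is simply: du -sh ~
mkdir -p /tmp/du_demo
dd if=/dev/zero of=/tmp/du_demo/blob bs=1024 count=64 2>/dev/null
du -sk /tmp/du_demo
```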
__________________
JayBee
--

It's all relative, you know
JayBee is offline   Reply With Quote
Old 05-31-2002, 01:54 AM   #36
pyrogen
Prospect
 
Join Date: Feb 2002
Posts: 48
Jaybee

Hey Jaybee........


Thanks for the play-by-play breakdown of the CLI commands!
Without your patient instruction I would have been lost.

And to everyone.......... thanks for the help.

This is a place I often visit (lurk) between my classes
(studying for my BFA in digital media: video, animation, and design). The people of this forum provide by far the best constructive assistance to my queries.

Thanks again.
Destin
pyrogen is offline   Reply With Quote
Old 05-31-2002, 04:19 AM   #37
JayBee
Major Leaguer
 
Join Date: Jan 2002
Location: Edinburgh, Scotland
Posts: 437
Glad to be of help!

Most of what I've got here has been culled from either the main site or these very forums, so I can't take much credit!

The site/forum combo provides a fantastic resource. Ask, and ye shall receive
__________________
JayBee
--

It's all relative, you know
JayBee is offline   Reply With Quote
Old 05-31-2002, 10:09 AM   #38
Titanium Man
Guest
 
Posts: n/a
My 2 cents

This is good stuff to explore (creating disk images and backing up to them, etc.), but for me the most economical/easy way is backing up to an external FireWire hard drive. Psync is what I use, and I've been able to initialize my hard drive, reinstall, create a user with the same name as my previous user, and then use psync to get my backed-up home directory back onto my machine. This means all my prefs, etc. are in place and I have a minimal amount of tweaking to do once I'm up and running. I've (unfortunately) had to do this twice now, so I know to do things in this order:

1. Install developer tools
2. Install psync
3. Sync the directories

Anyway, I know there are other solutions, but psync works for me, and daily backups to an external drive make me feel much more secure
  Reply With Quote
Old 05-31-2002, 12:11 PM   #39
bluehz
MVP
 
Join Date: Jan 2002
Posts: 1,562
One little snippet that might come in handy here - I saw it on XLR8yourmac.com the other day. It was related to a user's inability to use Retrospect with an unsupported CD-R drive. Their solution was to segment the backup archive into 600 MB chunks for easy burning onto CD-R media. Might be useful in the scheme of things.

Story here:

http://www.xlr8yourmac.com/archives/...02.html#S13861

Code:
Snippet:
use "split -b 600m BackupFile BF" to split BackupFile into 600MB 
chunks automatically named BFaa, BFab, BFac, etc.
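If anyone goes this route, it's worth proving that the chunks reassemble byte-for-byte before burning them. A sketch with a small throwaway file and small chunk sizes — for real CD-sized pieces you'd split with `-b 600m` as in the snippet above:

```shell
# Sketch: split a file into chunks, then verify that concatenating
# the chunks reproduces the original exactly. Small sizes for the
# demo; use "-b 600m" for real CD-sized chunks.
mkdir -p /tmp/split_demo
cd /tmp/split_demo
dd if=/dev/zero of=BackupFile bs=1024 count=10 2>/dev/null
split -b 4096 BackupFile BF    # yields BFaa, BFab, BFac
cat BF* > Reassembled          # the shell glob sorts BFaa, BFab, ...
cmp BackupFile Reassembled && echo "chunks reassemble cleanly"
```

The reassembly works because the shell expands `BF*` in sorted order, which matches the order split wrote the chunks in.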

Last edited by mervTormel; 05-31-2002 at 12:46 PM.
bluehz is offline   Reply With Quote
Old 05-31-2002, 12:17 PM   #40
sao
Moderator
 
Join Date: Jan 2002
Location: Singapore
Posts: 4,237
Titanium Man,

Thanks, that's what I've been trying to say as well.

Psync works for me too. It's extremely easy to use and quite reliable.


Cheers...
sao is offline   Reply With Quote


Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2014, vBulletin Solutions, Inc.
Site design © IDG Consumer & SMB; individuals retain copyright of their postings
but consent to the possible use of their material in other areas of IDG Consumer & SMB.