The macosxhints Forums > OS X Help Requests > System



Old 04-18-2005, 05:29 PM   #1
ducasi
Prospect
 
Join Date: Apr 2005
Posts: 2
10.3.9 broke my hdiutil-based backup script!

Hi,

Wondering if anyone else has the same problem, or could try this out...

Every week I have a wee script that runs the command-line equivalent of a "New image from device" in Disk Utility. It boils down to this:
Code:
 sudo hdiutil convert /dev/rdisk0s9 -format UDSP -o Backup
This is supposed to create a sparse image copy of my main OS partition, and up until this weekend it worked. Then I upgraded to 10.3.9.

Now I get an error:
Code:
 hdiutil: convert failed - image/device is too large
Any ideas?

If you're going to try it, you'll maybe need to use something different from "/dev/rdisk0s9" to find your boot partition - basically it's the first entry in the "df" command's output, with an "r" stuck in before the "disk". Oh, and you'll need to have "cd"-ed to another disk - you can't copy a disk into itself. If it works, it'll create a very large file called "Backup.sparseimage".
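In case it helps anyone trying this, here's roughly how my script works out the device name (bash assumed; this just automates the "df" trick described above):

```shell
# Find the boot volume's device node from df, then make it the raw node
# by sticking an "r" in before "disk" (bash string substitution).
dev=$(df / | awk 'NR==2 {print $1}')   # e.g. /dev/disk0s9
rdev=${dev/disk/rdisk}                 # e.g. /dev/rdisk0s9
echo "$rdev"
```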

I've got to say, I'm a bit stumped as to what has caused this...

Cheers!
Old 04-19-2005, 06:05 PM   #2
derekhed
All Star
 
Join Date: Mar 2002
Location: Anchorage, AK
Posts: 762
How big is your disk? How much free space do you have where you are saving the image to? And what OS version did you upgrade from?

If you do have space problems, your system should be independent from your /Users/ directory, so you could just make backups of your personal stuff and reinstall the system from disk.
__________________
...if only you could see what I've seen with your eyes.
- Batty, Blade Runner

They all float down here...
Old 04-20-2005, 05:08 AM   #3
bramley
MVP
 
Join Date: Apr 2004
Location: Cumbria, UK
Posts: 2,461
Not really connected with the problem, but I would not use a sparse image with the convert command and rdisk for crucial backups. Sparse images will tend to exacerbate any file fragmentation (more probable the bigger the image) and since you are not doing a buffered device read with rdisk, corruption does become a distinct possibility.

Since it doesn't appear as if you actually need a sparse image - you are overwriting each backup - I would dispense with these. I recommend a non-sparse format, and use the 'disk' node (which is buffered) although you will need to unmount the volume/partition for this to work. This should also generate an image that requires less space to store.
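As a rough sketch of what I mean (device and volume names here are just examples, not from the OP's setup, and UDZO is hdiutil's compressed read-only format):

```shell
# Unmount the volume first so convert can read the buffered 'disk' node.
# /dev/disk0s9 and "Spare" are placeholder names - substitute your own.
sudo diskutil unmount /Volumes/Spare
sudo hdiutil convert /dev/disk0s9 -format UDZO -o Backup
```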

I don't know if there is a maximum image size that can be read, but you are possibly bumping up against it. As derekhed asks, is it really necessary to back up the whole system?
Old 04-20-2005, 01:54 PM   #4
StarDust
Prospect
 
Join Date: Apr 2005
Posts: 2
I had the same problem after upgrading to 10.3.9.
I use CCC on a FireWire drive.
I moved back to 10.3.5 and now the cloning works.
Old 04-20-2005, 02:15 PM   #5
derekhed
All Star
 
Join Date: Mar 2002
Location: Anchorage, AK
Posts: 762
Could you post the text of any message you got when your cloning didn't work? It would be good to know whether 10.3.9 was at fault or it was just a more mundane issue.
Old 04-21-2005, 10:01 AM   #6
danlunde
Prospect
 
Join Date: Apr 2005
Posts: 3
I'm so glad I found someone else with this problem; it's been driving me nuts. I was waiting for 10.3.9 to come out so that I could make a clean image for my lab computers. After trying to create an image of my hard drive, I kept getting the error "image/device too large". I tried using the NetRestore Helper app from Bombich, Disk Utility, and command-line hdiutil. All failed at the end with the aforementioned error. My hdiutil command was `hdiutil create -ov -srcfolder / -nocrossdev`

After doing extensive reinstalls, I found that the failure occurs only when 10.3.9 and the Xcode Tools are installed together. With either one installed on its own it works, but not with both. It doesn't matter in which order they were installed, either.

I don't know what in Xcode and the 10.3.9 update could be causing this. I'm going to do some more granular installs of Xcode to see exactly what causes the problem (i.e. Developer Tools, gcc compiler, SDKs).
Old 04-21-2005, 11:55 AM   #7
StarDust
Prospect
 
Join Date: Apr 2005
Posts: 2
That’s interesting!
My reference computer has 10.3.9 and Xcode installed.
The error occurred when I upgraded the boot partition on the FireWire drive.
“hdiutil: convert failed - image/device is too large”
The boot partition on the FireWire drive is a stripped-down installation - only the OS and CCC.
I restored from a clone and with 10.3.5 I have no problems.
I will move up to 10.3.8 as that was the last version that worked.
I wonder if 10.4 has similar “surprises”?
Old 04-23-2005, 12:45 PM   #8
faceface
Prospect
 
Join Date: Apr 2005
Posts: 3
CCC disk too large error

I had the same problem using CCC to create a compressed read-only disk image from a 12 GB 10.3.9 template. I tried it a few times to no avail. It would create a sparseimage but wouldn't convert it when booting from an external FireWire drive running 10.3.9. I then tried it from another FireWire drive running 10.3.5 and it worked.

It seems the drive that's running the cloner can't be running 10.3.9.

This is a real problem. Does anyone know how to downgrade, or possibly remove the 10.3.9-specific file updates? Or does anyone know a good command-line script that would convert the sparse image to a compressed read-only image? NetRestore doesn't read sparseimages. GAH!
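(I'd have guessed something like the command below, UDZO being hdiutil's compressed read-only format, but I haven't been able to verify it - and under 10.3.9 the convert step may well hit the same size bug:)

```shell
# Untested guess: convert a sparse image to a compressed read-only image.
hdiutil convert Backup.sparseimage -format UDZO -o Backup.dmg
```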

best
Old 04-26-2005, 05:28 PM   #9
danlunde
Prospect
 
Join Date: Apr 2005
Posts: 3
I did some more trial and error. After a clean install of OS X 10.3.7 and a Software Update to 10.3.9, I was able to make an image of the drive.

Code:
lando:~ root# hdiutil create -srcfolder / -nocrossdev /Volumes/Local\ HD/osbase-8.dmg
Initializing...
Creating...
Copying...
...................................................................................................
Converting...
Preparing imaging engine...
Reading DDM...
   (CRC32 $C6091E88: DDM)
Reading Apple_partition_map (0)...
   (CRC32 $924D00C9: Apple_partition_map (0))
Reading Apple_HFS (1)...
..................................................................................................
   (CRC32 $E977FF99: Apple_HFS (1))
Reading Apple_Free (2)...
   (CRC32 $00000000: Apple_Free (2))
Terminating imaging engine...
Adding resources...
...................................................................................................
Elapsed Time: 10m 36.013s
 (2 tasks, weight 100)
File size: 5453629419 bytes, Checksum: CRC32 $E845727C
Sectors processed: 16499794, 16364609 compressed
Speed: 12.6Mbytes/sec
Savings: 35.4%
Finishing...
created: /Volumes/Local HD/osbase-8.dmg
After installing the Xcode "Mac OS X SDK" package I was again able to make an image like the example above.

Finally, after installing the Xcode "Developer Tools" package, it fails as below:

Code:
lando:~ root# hdiutil create -srcfolder / -nocrossdev /Volumes/Local\ HD/osbase-9.dmg
Initializing...
Creating...
Copying...
.............................................................................................................................................
Converting...
Finishing...
hdiutil: create failed - image/device is too large
I was able to install the Developer Tools without the Mac OS X SDK and create a successful image, but those two packages plus 10.3.9 equals guaranteed failure.

Any ideas?
Old 04-27-2005, 07:21 AM   #10
bluehz
MVP
 
Join Date: Jan 2002
Posts: 1,562
Great advice, bramley - any way you could flesh out the points you mentioned below about the 'disk' node?

Quote:
Originally Posted by bramley
I would dispense with these. I recommend a non-sparse format, and use the 'disk' node (which is buffered) although you will need to unmount the volume/partition for this to work.

Old 04-27-2005, 09:37 AM   #11
bramley
MVP
 
Join Date: Apr 2004
Location: Cumbria, UK
Posts: 2,461
I should have woken up more fully before making that post because it's not very clear. I was also conscious of not really solving the OP's problem.

Essentially, I was warning against doing 'raw' disk reads (that is, using /dev/rdisk?s?) when backing up. Such reads are not via the disk cache - they read the disk directly. The problem with doing this is that previous write operations to the disk (which will usually go via the cache) may not yet have been flushed from the cache to disk when the raw read happens, i.e. you can end up with a disk image that isn't the copy of the disk you would have read via the cache. Also, a large file that spans several blocks on the drive may have some parts of it still in the cache if it is being updated, and any read of the disc's raw state will lead to that file being corrupt on the image.

Worse still, wholesale corruption might occur if you read the disc while the OS is defragging files. I'm not sure when defragging happens on OS X. I should imagine, though, that the cache is used to store sectors of the drive that are being moved around, and thus the raw disc image would miss those sectors.

Sparse images are supposed to exacerbate any fragmentation that may exist (it says so in hdiutil's man page), but on reflection I guess this might only mean that if files are deleted from an existing image (the OP was overwriting the image, so this wouldn't be relevant) which is then compacted, then there might be problems.

So using '/dev/disk?s?' ensures that you are reading the disk through the cache, and therefore anything in the cache ends up in the image. If you type 'sudo hdiutil convert /dev/disk?s? -format UDZO -o ~/Desktop/Backup' in the Terminal then you should get a 'device busy' error while the volume is still mounted. I assume this is to prevent the cache being used by other processes while the disc image is being created, possibly corrupting the disc image. Dismounting the volume and then typing the above command should work (except of course you can't back up the root drive).

I see other posters are also creating disc images by copying the file structure, i.e. '/Volumes/MyOtherDrive' etc., which should be OK (EDIT: actually I'm not certain - I guess it's OK if you know the folder structure is not going to be written to as the image is made, but otherwise I'm not so sure) - and it doesn't require dismounting.

None of the above is germane to the problem, of course! I use an external drive to back up without disc images, as I am suspicious of disc images beyond a few GBs, so this problem isn't affecting my routine - although experiments show that I do have the bug.

Has anybody tried finding out what the maximum disc image size that can be created with hdiutil now is? Also is it possible to create segmented (read-only I know) disc images as a workaround?

Last edited by bramley; 04-27-2005 at 09:42 AM.
Old 04-28-2005, 12:35 PM   #12
faceface
Prospect
 
Join Date: Apr 2005
Posts: 3
CCC and Creating Backups of User Directories

Hello,

Has anyone figured out if Apple knows about the CCC / hdiutil convert error problem?

Is it safe to create backups of drives using CCC? I use it to create disk images for storage, but is it enough? Is it safe? Is there an easier command-line script?

Does anyone know if this script is sufficient to create backups of user directories?

Code:
 hdiutil create -format UDRO -srcfolder /Users/username /Volumes/StorageDrive/Backups/username.dmg

Are there any problems in using this and then using it later to restore a User Directory?
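(For the restore, I'd guess something like the following - I'm assuming the image mounts under the source folder's name, which I haven't verified:)

```shell
# Untested guess at a restore: mount the image, copy back, detach.
hdiutil attach /Volumes/StorageDrive/Backups/username.dmg
sudo ditto /Volumes/username /Users/username
hdiutil detach /Volumes/username
```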

Best
Old 04-28-2005, 01:06 PM   #13
hayne
Site Admin
 
Join Date: Jan 2002
Location: Montreal
Posts: 32,473
Quote:
Originally Posted by faceface
Has anyone figured out if Apple knows about the CCC / hdiutil convert error problem?

If you want them to know:
http://www.apple.com/macosx/feedback/
Old 04-28-2005, 02:07 PM   #14
danlunde
Prospect
 
Join Date: Apr 2005
Posts: 3
The problem seems to come from a bug in the Disk Utilities framework that was updated with 10.3.9. This brought the version of Disk Utility up to 10.4.4 from 10.4.3. The bug occurs if the source volume you try to image is over 8 GB. Creating an image from such a drive is guaranteed to fail, regardless of whether you try a compressed or uncompressed image.
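In the meantime, a quick size check before imaging at least saves you the ten-minute wait for a guaranteed failure (the 8 GB figure is from my testing above; df's column layout is assumed):

```shell
# Warn if the source volume's used space exceeds the reported 8 GB
# failure threshold under 10.3.9 + Xcode.
SRC=/                                   # volume you intend to image
used_kb=$(df -k "$SRC" | awk 'NR==2 {print $3}')
limit_kb=$((8 * 1024 * 1024))           # 8 GB expressed in 1K blocks
if [ "$used_kb" -gt "$limit_kb" ]; then
  echo "warning: source over 8 GB - hdiutil will likely fail" >&2
fi
```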

There's a discussion of it going on at the Bombich forum, and the bug was filed with Apple by Bombich.
http://forums.bombich.com/viewtopic.php?t=5203
Old 04-28-2005, 02:49 PM   #15
faceface
Prospect
 
Join Date: Apr 2005
Posts: 3
hdiutil 10.4.3

Thanks for the info.

Do you know if there is a way to fix the bug without downgrading the entire operating system? Maybe just replacing the utility with the 10.4.3 version would do the trick?
Old 05-05-2005, 11:46 AM   #16
derekhed
All Star
 
Join Date: Mar 2002
Location: Anchorage, AK
Posts: 762
Thanks for that link danlunde. The fix outlined there did not work, however. Still no way around this unless you have an older 10.3.8 system to boot from.

