The macosxhints Forums

The macosxhints Forums (http://hintsforums.macworld.com/index.php)
-   UNIX - General (http://hintsforums.macworld.com/forumdisplay.php?f=16)
-   -   Terminal Memory Management? (http://hintsforums.macworld.com/showthread.php?t=7045)

kerim 11-12-2002 09:54 AM

Terminal Memory Management?
 
Anyone who has read this post:

http://forums.macosxhints.com/showth...3955#post33955

knows that I've been learning how to use ImageMagick to post-process raw TIFF files after scanning. (I just enter the commands on one line, one after another, separated by semicolons so that they run sequentially.)

It works great, but I like to run it at night and batch-process a whole bunch of scans while I sleep. The problem is, when I wake up I get errors about MacOSX not having enough memory and needing to clear out disk space!!!

The fact of the matter is that I have over 2GB free on my drive! And I have 640MB RAM. But look at what my VM_stats are in the morning (I didn't check them at night):

Quote:

Mach Virtual Memory Statistics: (page size of 4096 bytes)
Pages free: 1929.
Pages active: 98668.
Pages inactive: 49350.
Pages wired down: 13893.
"Translation faults": 20210498.
Pages copy-on-write: 88027.
Pages zero filled: 16886108.
Pages reactivated: 9653968.
Pageins: 349692.
Pageouts: 3082766.
Object cache: 30683 hits of 84013 lookups (36% hit rate)
That is a lot of pageouts! I was getting hardly any before.

ImageMagick gives me this kind of error:

Quote:

convert: Unable to open file (convert.pbm) [No space left on device].
"convert.pbm" is an intermediate file I create before creating a PDF file. These can become as large as 10MB. My final PDF files are usually between 1 and 4 MB.

So the question is, is there any way to get ImageMagick to behave better towards my memory? Or to clear out the VM cache between each command so that there is space for the next one? Or maybe I am wrong about where the problem is?

On the ImageMagick list a Unix person suggested that it might be the size of the "TMPDIR" that is a problem. Would this be relevant to OS X?

kerim 11-12-2002 10:30 AM

One possible solution
 
I realize that in order to save time I had been running "Convert" commands in two separate terminal windows at the same time. Perhaps this overloaded the memory? Maybe if I stick to just running the commands sequentially it will help...

mervTormel 11-12-2002 11:01 AM

yeah, i think you'll want to calm your script down a bit. two heavy processes going at it at once is often not better than them running serially.

i suspect that these processes exhaust real memory (with many inactive pages) which causes the memory manager to become paranoid and thrash and page and swap, creating new page files until the disk is full, then on process death, the swapfiles are released and mem mgr eventually cleans them up before you wake. or gremlins do it.

what you might want to do is to interleave some sustained disk I/O in between your convert commands.

it has been seen that a du / or a find / will peel inactive pages down to a scant minimum.

so try (and this is sh, not csh):

du -sx / >/dev/null 2>&1

convert ...

du -sx / >/dev/null 2>&1

convert ...

etc...

and see if that lets each convert perform without wild oscillations in memory mgt

mervTormel 11-12-2002 11:03 AM

Code:

nice +10 convert -gravity South -crop 1700x2200+0+0 -rotate "+90" \
  -level 10000,1,50000 -unsharp 6x1+100+0.05 -adjoin *.tif pbm:convert.pbm \
  ; convert -compress zip -page 792x612 convert.pbm pdf:document.pdf

btw, that second convert isn't nice

kerim 11-12-2002 11:20 AM

Thanks.

Do you think I should use "nice" at all? If I'm running at night I don't really need to use the computer for anything else, and perhaps not using "nice" will use less VM?

I'm beginning to realize that Aqua apps are about more than a pretty GUI! I've never had errors like this with Graphic Converter. (But unfortunately it produces messed up multi-page TIFF files ...)

kerim 11-12-2002 11:21 AM

Quote:

Originally posted by mervTormel

so try (and this is sh, not csh):

What do you mean by "sh" and "csh"?

mervTormel 11-12-2002 11:40 AM

you never did two at a time with GC, either. and there may have been contributing problems. let's not go making sweeping generalizations :D

nice isn't going to matter unless some other process needs cycles.

there are two families of shells, sh (sh, bash, zsh, ksh) and csh (csh, tcsh, ?)

they have different syntax for things like redirecting file descriptors (stdin, stdout, stderr) [ 2>&1 = sh, >& = csh ]

in sh, command >/dev/null 2>&1 redirects the messages that would come to stderr (2, the terminal) and stdout (1, the terminal) to /dev/null making them vanish. we don't want to see the output or errors of du -sx, we just want its effect on inactive memory pages.
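a tiny sh sketch of that, for anyone following along (the filename is made up and assumed not to exist):

```shell
# sh-family redirection sketch: order matters when combining redirects.
# "nosuchfile-xyz" is a hypothetical name assumed not to exist.

# stdout to /dev/null; the error message still reaches the terminal:
ls nosuchfile-xyz >/dev/null || true

# both streams silenced (stderr is pointed at stdout *after* stdout moved):
ls nosuchfile-xyz >/dev/null 2>&1 || true

# capture only the stderr text, by moving stdout away first:
msg=$(ls nosuchfile-xyz 2>&1 >/dev/null) || true
```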

anyhow, if you put your commands into a script, make it a sh script and not a csh script.

kerim 11-12-2002 12:17 PM

How would I do the same thing in the TCSH command line? I tried "du -sx / >/dev/null 2>&1 " but it gives me an error message: "Ambiguous output redirect."

I tried "du -sx / >/dev/null >& " but that didn't work either. Said I was missing an output name. I tried "1" but it said it was ambiguous ...

TiA!

mervTormel 11-12-2002 12:36 PM

c'mon! this is a hint site, not the magic answer wombat!

dig out yer tcsh manual and search for >&

oh, alright, alright. here you go...

command >& file

kerim 11-12-2002 01:17 PM

Thanks. If I type "man tcsh" I get pages and pages of text. Is there some way to quickly find instructions on use of a particular variable?

mervTormel 11-12-2002 01:21 PM

type 'h' to learn how to use the pager.

gets tedious, doesn't it? wanna hire me? rent's due.

kerim 11-12-2002 01:25 PM

Still not working:

I enter: sudo du -sx / > /dev/null & test

And it returns:
[4] 863

Then it asks for my "password" at the prompt, in a weird way (what I type is visible and it doesn't do anything).

???

pmccann 11-12-2002 05:19 PM

tcsh is a panus in the ain for redirection; as merv suggested, rewriting it as a simple sh script would endow your script with the ability to redirect stdout and stderror easily.

That said, here's some nonsense that I scrawled for myself a while back about how to redirect things using tcsh: merv's suggestion of "command >& filename" redirects *both* stdout and stderror to filename, but won't help if you only want stdout. Anyway, the following works for most purposes, but it really is ugly central. Sorry 'bout the messy notes, but they were for me (and I don't have time or the inclination for a rewrite just now!)

Cheers,
Paul


Redirection is bloody painful in tcsh: >& redirects both, and if you try something like:

grep prompt * | grep -v directory

to get rid of the annoying "blah is a directory" messages it doesn't work. (I know you can simply add in a "-d skip" to the grep call to suppress directories, but let's just pretend that didn't exist.) Problem being that those error messages are in the stderror stream, so grep doesn't whack them as you might hope. Probably the easiest way around this problem is to pipe the results of an initial process to the stdin of grep, so that it doesn't differentiate stdin and stderror.

Recall that > redirects stdout, >& also redirects stderror, and < redirects stdin. But you can't unhook what's in stderror from what's in stdout easily.

( grep prompt * ) |& grep -v directory

Hooray! but far from elegant. Similarly for find, with its permission denied errors.

Suppose that file1 and file2 exist, but not file3. You try to get a listing, but get the error message displayed also:

% ls -l file{1,2,3}
ls: file3: No such file or directory
-rw-r--r-- 1 pmccann staff 0 Jan 29 15:58 file1
-rw-r--r-- 1 pmccann staff 0 Jan 29 15:58 file2

% (ls -l file{1,2,3} > /tmp/pmccann ) > & /dev/null

kerim 11-13-2002 09:37 AM

Update
 
For one thing, running two commands at the same time is not the problem. It stopped last night for the same reason even though I was only running one task at a time!

I think ImageMagick just chokes on some folders with lots of files. I may need to restructure my command to first do the conversions before doing the Adjoin command. This might take up less space. I think the "-unsharp" command is probably the one that takes up the most memory, and not having it write the file to disk before it handles 60 other images is probably too much. I should create another intermediate file for each scan and then merge them afterwards. (I'll try this again.)

But there is also the possibility of clearing out my cache files using the "sudo periodic weekly" command. However, here I run into an issue: it prompts me for a password when I do this, and I need it to run automatically while I am asleep! Also, this command takes a *long* time to run ...

Any ideas?

mervTormel 11-13-2002 10:23 AM

the suggestion to run the weekly is merely born out of the effect of the commands contained within that perform sustained disk I/O which expire some inactive memory pages and release them to the free memory pool.

that command is find / # <- everywhere

and my suggestion was to use du -sx / as it can be lighter/faster

the redirection of stdout and stderr to /dev/null was because du is going to bang into files you don't have permissions to stat, and will spit up error messages.

that is, we don't want to see anything from du, just its effects on memory.

so, for tcsh, try:
Code:

% vm_stat | egrep free\|active\|wired
Pages free:                  210490.
Pages active:                  22334.
Pages inactive:              130497.
Pages wired down:              29895.

% du -sx / > & /dev/null

% vm_stat | egrep free\|active\|wired
Pages free:                  218266.
Pages active:                  22140.
Pages inactive:              118223.
Pages wired down:              34587.

to see the net effect of the du memory squisher. that put 32MB back on the free pool. not very dramatic, but i've seen more extreme examples.

kerim 11-13-2002 11:18 AM

How do you calculate 32MB from those stats?

mervTormel 11-13-2002 11:28 AM

well, if you look at vm_stat output, it tells you the units are in 4096 byte pages...
Code:

$ vm_stat
Mach Virtual Memory Statistics: (page size of 4096 bytes)
Pages free:                  230023.
...

the free page math is academic.

the net freepage gain of the above du: 8*4=32
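spelled out in sh, the arithmetic is just pages freed times the 4096-byte page size, using the numbers from the transcript above (merv rounds to 8k pages x 4KB = 32MB; exact integer math lands a little lower):

```shell
# "Pages free" before and after the du run, from the vm_stat transcript above
before=210490
after=218266
freed_pages=$((after - before))
# each page is 4096 bytes; convert to MB
freed_mb=$((freed_pages * 4096 / 1024 / 1024))
echo "$freed_pages pages ~= $freed_mb MB returned to the free pool"
```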

kerim 11-13-2002 11:30 AM

Well, this command worked for me! :D

But it didn't go much faster than doing the "sudo periodic weekly" :(

Still, the results seem to have done something, even though I just ran "sudo periodic weekly", so maybe it works better?

here are the stats:

Quote:

kerim% vm_stat | egrep free\|active\|wired
Pages free: 82674.
Pages active: 36245.
Pages inactive: 30037.
Pages wired down: 14883.

kerim% du -sx / > & /dev/null

kerim% vm_stat | egrep free\|active\|wired
Pages free: 89681.
Pages active: 35741.
Pages inactive: 23540.
Pages wired down: 14878.

How many MB would you make that?

kerim 11-13-2002 11:41 AM

So 7 x 4 = 28MB freed for me, even after running the weekly script! Not bad! :cool:

mervTormel 11-13-2002 11:42 AM

free before: 82k

free after: 89k

net: 7k * 4k = 28 MB

--

if you have one large partition, you can speed the du by confining it to a smaller hierarchy...

du -s /System

and if that's not enough to strip the inactive pages, add /usr

du -s /System /usr

kerim 11-13-2002 04:23 PM

I wonder if we aren't making too much of the memory gains. I tried running the vm_stat a few times in a row and found that the pages free can fluctuate quite a bit without doing anything (except running programs in the background)...

kerim 11-14-2002 10:22 AM

Batch operations in UNIX?
 
I think I found one place where memory is getting hogged and I have an idea about how to fix it, but I need UNIX help!

I tried to keep everything as simple as possible. First I changed my ImageMagick command to output multiple PBM (portable bitmap) files instead of one single file.

Then, I restarted my machine and ran the command:

Code:

nice +10 convert -compress zip *.pbm tiff:output%02d.tiff
(This converts them to individual Tiff files with numbers output01.tiff, output02.tiff, etc.)

These PBM files are about 500K each, and after it went through 29 of the 59 files, I got this error:

Quote:

convert: Unable to open file (output29.tiff) [No space left on device].
For all the remaining files.

Now, what I realize about using +adjoin in ImageMagick is that it seems to do all the conversions in memory (for all the files) first, and only write them to disk afterwards!!! What I would like to know is how to tell ImageMagick to do this one file at a time!

Something like:

For each file *.pbm
Convert to compressed Tiff
Renaming with the same name but changing the extension.

I imagine that there is a UNIX command for this but I don't know where to look - especially for the naming part!

TIA!

pmccann 11-14-2002 06:24 PM

What about a simple foreach loop: that might start/stop imagemagick once per file, thus leaving lighter footprints.

% foreach file (*.pbm)
foreach? nice +10 convert -compress zip $file (etc etc)
foreach? end

Worth a go!

Cheers,
Paul

(You can of course throw this into a file --say "scriptname" with

#!/bin/sh

on the first line (and no "foreach?" on lines 2 and 3; the shell adds these in when you do this interactively), then make it executable and just use

./scriptname

to run it)

mervTormel 11-14-2002 06:36 PM

better use #!/bin/csh with foreach

sh no got foreach

sh has:

for NAME [in WORDS ... ;] do COMMANDS; done

for file in /path/to/*.pbm;
do
nice...;
done
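filled out, a runnable sh sketch of that loop might look like this (the convert flags are the ones used earlier in the thread; ${file%.pbm} is POSIX parameter expansion that strips the suffix, doing the same job as the basename trick):

```shell
#!/bin/sh
# sh sketch of the per-file convert loop (flags taken from the thread;
# ${file%.pbm} strips the .pbm suffix via POSIX parameter expansion)
for file in *.pbm; do
    [ -e "$file" ] || continue   # skip the literal "*.pbm" when nothing matches
    nice +10 convert -compress zip "$file" "tiff:${file%.pbm}.tiff"
done
```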

kerim 11-14-2002 08:42 PM

Getting there!
 
Amazing. This is definitely the way to go! It was much faster, and it did all the files without error.

I want to sort out one more thing before I go trying to write a script for this (I'm still doing it manually), and that is get the process of file naming using the "foreach" command down. I tried reading the man (I really did), but what I came up with is this and it doesn't work:

Code:

% foreach file ( *.pbm )
foreach? nice +10 convert -compress zip $file $file:h.tiff
foreach? end

If the original file was file01.pbm, the output file should be file01.tif, but I get "file01.pbm.tif" instead. I thought the :h would strip the tail leaving only the header, but I guess I misunderstood the manual.

Thanks.

oldfogey 11-15-2002 02:19 PM

Re: Getting there!
 
Quote:

Originally posted by kerim

Code:

% foreach file ( *.pbm )
foreach? nice +10 convert -compress zip $file $file:h.tiff
foreach? end

If the original file was file01.pbm, the output file should be file01.tif, but I get "file01.pbm.tif" instead. I thought the :h would strip the tail leaving only the header, but I guess I misunderstood the manual.

Thanks.
Try replacing $file:h.tiff by "`basename $file .pbm`.tif"

And yes, that is a space after $file. (Is it '.tif' you want or '.tiff'? You say both :-) )

hth
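a quick sketch of how basename behaves with a suffix argument (filenames are hypothetical). Incidentally, the csh modifier that strips an extension is :r (root); :h strips the last path component, which is likely why :h gave the wrong result:

```shell
# basename with a second argument strips that suffix (and any leading path)
basename file01.pbm .pbm            # prints: file01
basename /some/dir/file01.pbm .pbm  # prints: file01

# so the output name can be built like this:
name=$(basename file01.pbm .pbm)
echo "$name.tif"                    # prints: file01.tif
```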

kerim 11-15-2002 02:36 PM

Great. In another thread someone suggested "basename" but I didn't notice that it required a space. Now it works! Such small things. Thanks about the tif/tiff thing too. Doesn't really matter as far as I can tell, but it is good to be consistent!

kerim 11-15-2002 02:39 PM

Now the question is how to get this script to work not on the top level folder, but for all the sub-folders in it. That is, I have a folder called "scans" and it contains multiple folders "scanfolder 1" "scanfolder 2" etc. How do I get the script to operate on the TIF files inside each of these folders, rather than just the top level?

Code:

#! /bin/csh

if ( $#argv == 0 ) then
  echo "Usage: $0 <path1> [<path2>...]"
  exit 1
endif

foreach dir ($*)
cd $dir
echo "converting all files to Tiff! in" $PWD
foreach file (*.pbm)

nice +10 convert -compress zip $file tif:`basename $file .pbm`.tif
end
end

I'm also having a problem with this code not accepting directories with spaces in the name, even if they are input in quotes or with backslashes or even both ...
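On the spaces problem: csh re-splits $* and $dir on whitespace after expansion, so quoting at the command line doesn't survive into the loop (the rough csh workaround is foreach dir ($argv:q), though csh quoting is famously brittle). In sh the safe pattern is "$@" plus a quoted "$dir", a sketch:

```shell
#!/bin/sh
# sh sketch: "$@" keeps each argument as one word, even with spaces in it
for dir in "$@"; do
    cd "$dir" || continue    # quoted, so "scanfolder 1" stays one word
    echo "working in: $dir"
    cd - >/dev/null          # return to where we started
done
```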

kerim 11-15-2002 04:08 PM

Full script (but problems remain)
 
I haven't solved the issues from my last post, but here is the full script as it now stands:

Code:

#! /bin/csh

if ( $#argv == 0 ) then
  echo "Usage: $0 <path1> [<path2>...]"
  exit 1
endif

foreach dir ($*)

echo "Changing directory to: " $dir
cd $dir

echo "Cropping, Rotating, and Filtering TIF to PBM!"
foreach file (*.tif)
nice +10 convert -gravity South -crop 1700x2200+0+0 -rotate "+90" \
  -level 10000,1,50000 -unsharp 6x1+100+0.05 \
  $file pbm:`basename $file .tif`.pbm
echo $file "done"
end

echo "Converting all files to compressed TIFF!"
foreach file (*.pbm)
nice +10 convert -compress zip $file tif:`basename $file .pbm`.tiff
echo $file "done"
end

echo "Converting all TIFF Files into a single landscape PDF!"
nice +10 convert -compress zip -page 792x612 -adjoin *.tiff pdf:document.pdf

echo "All Done!"

end

[edit: readability -mt]

kerim 11-15-2002 11:01 PM

Got it! (Almost)
 
I finally got the recursive directory thing! Now I just need to figure out why it isn't accepting spaces in directory paths...

Here is the final script. All I intend to do now is fix some of the feedback, and set it to work with fixed top-level directories. I want to have one directory called "portrait" and one called "landscape" - that way I can simply save scans to the appropriate directory and the script will treat them each correctly. When done I will post back here. But for now, this is what I have:

Code:

#! /bin/csh

if ( $#argv == 0 ) then
  echo "Usage: $0 <path1> [<path2>...]"
  exit 1
endif

foreach dir ($*)

echo "Changing directory to: " $dir
cd $dir

foreach dir ($dir/*)
cd $dir
echo "Cropping, Rotating, and Filtering TIF to PBM!"
foreach file (*.tif)
nice +10 convert -gravity South -crop 1700x2200+0+0 -rotate "+90" \
  -level 10000,1,50000 -unsharp 6x1+100+0.05 \
  $file pbm:`basename $file .tif`.pbm
echo $file "done"
end

echo "Converting all files to compressed TIFF!"
foreach file (*.pbm)
nice +10 convert -compress zip $file tif:`basename $file .pbm`.tiff
echo $file "done"
end

echo "Converting all TIFF Files into a single landscape PDF!"
nice +10 convert -compress zip -page 792x612 -adjoin *.tiff pdf:document.pdf

echo "All Done with directory: " $dir
cd ../
end

end

[edit: readability -mt]
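for comparison, here is a hedged sh rendition of the script above: same steps and same convert flags, but "$@", quoted expansions, and basename mean directory names with spaces survive (the remaining problem). A sketch only, not tested against real scans:

```shell
#!/bin/sh
# sh sketch of the recursive convert script; flags copied from the thread
for top in "$@"; do
    for dir in "$top"/*/; do
        [ -d "$dir" ] || continue
        cd "$dir" || continue
        echo "Cropping, Rotating, and Filtering TIF to PBM in $dir"
        for file in *.tif; do
            [ -e "$file" ] || continue
            nice +10 convert -gravity South -crop 1700x2200+0+0 -rotate "+90" \
                -level 10000,1,50000 -unsharp 6x1+100+0.05 \
                "$file" "pbm:$(basename "$file" .tif).pbm"
        done
        echo "Converting all files to compressed TIFF!"
        for file in *.pbm; do
            [ -e "$file" ] || continue
            nice +10 convert -compress zip "$file" "tif:$(basename "$file" .pbm).tiff"
        done
        echo "Converting all TIFF files into a single landscape PDF!"
        nice +10 convert -compress zip -page 792x612 -adjoin *.tiff pdf:document.pdf
        cd - >/dev/null || exit 1
    done
done
```

invoked as, say, ./convertall.sh ~/scans (hypothetical name), where ~/scans contains the "scanfolder 1", "scanfolder 2" subdirectories.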

Eckbert 07-30-2003 02:37 PM

would it make sense to run du on a system that's becoming sluggish, to head off a crash? I mean, if you catch it early enough?

tomholland 12-18-2003 10:56 PM

although not a unix solution, I just deleted my loginwindow plist files and the system seems to be handling memory much better.

