Quote:
Okay. Quote:
how to get awk to directly apply its pattern processing to a given command-line argument. Awk is hell-bent on reading files (pathnames), and more than willing to munch up its stdin. But forcing it to feed upon command-line args (for pattern processing) is nigh impossible. If anyone here can show me the trick, here's what I have so far...
Code:
#!/usr/bin/awk -f
For now I have to do:
Code:
echo 12.34.56.78 | okip3
instead of a simple arg:
Code:
okip3 12.34.56.78
I tried a bunch of stuff like:
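For readers skimming the thread, the trick that eventually emerges can be sketched like this. It is a minimal illustration, not the poster's actual okip3 script: grab the argument from ARGV inside BEGIN, then delete it so awk never treats it as a filename.

```shell
# Minimal sketch (not the actual okip3 script): read the command-line
# argument from ARGV in BEGIN, then delete it so awk does not try to
# open "12.34.56.78" as an input file.
awk 'BEGIN {
    ip = ARGV[1]            # the raw command-line argument
    delete ARGV[1]          # prevent awk from reading it as a file
    n = split(ip, f, ".")   # do the dot-splitting ourselves
    print (n == 4 ? "4 fields" : "not 4 fields")
}' 12.34.56.78
```

With no remaining ARGV entries and no main rules, awk exits after BEGIN instead of sitting there munching stdin.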
Yeah I tried to do that too :mad: and it didn't want to do it for me either.
How about this:
Code:
#! /usr/bin/awk -F. -f
Oh, and a new perl version:
Code:
baf@bengt-arne-fjellners-computer ~
Quote:
I think from reading various docs that variations of awk (such as gawk and mawk) may have that type of kluge condensed into a single option... but not awk itself. Though I forgot to mention it, I did play around with some (of the eight) variations of getline... but didn't hit upon the magic you found. I note that -- in addition to that line you added -- the "GNU Awk User's Guide" seems to suggest that such code should be shortly followed by:
Code:
close("/bin/echo ")
so -- all in all -- it costs almost 50 extra chars, just to read the darn argument. :D I'll need to run some tests to be sure... maybe it was that close() part that prevented my attempts from working. Thanks again.
And I could make it some 4 or 5 characters shorter IF I totally gave up readability: no newlines and fewer spaces, but that's going too far. I'm very happy to get below 100 chars (even if it's JUST below).
Quote:
After all... "readability" is Perl's strong point! :D :D :D
I totally agree with that. Once I wrote a program, went to lunch, came back, and almost wondered "who wrote this? What does it mean?" :D :D :D
Oh, and to use close() in awk you have to use the exact same sequence of characters as when you opened it. So for:
Code:
"/bin/echo " ARGV[1] | getline
you would have to do:
Code:
close("/bin/echo " ARGV[1])
so that's even more characters... Tip: variables!!
Code:
C = "/bin/echo " ARGV[1]
C | getline
close(C)
And if someone else wants to use that trick and has more code, then you should really do:
Code:
delete ARGV[1]
otherwise awk will try to read the file 10.1.1.10 (or whatever)... I have tested this, and it really seems to close the pipe when it should. Hmm, readability... what does that mean? :D:D
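The variable tip above, put together as one runnable piece. This is a sketch assuming the shell's plain echo is acceptable (the post uses /bin/echo):

```shell
# Sketch of the store-the-command-in-a-variable tip, so that close()
# receives the byte-for-byte identical string that opened the pipe.
awk 'BEGIN {
    C = "echo " ARGV[1]   # keep the exact command string in a variable
    delete ARGV[1]        # otherwise awk tries to open 10.1.1.10 as a file
    C | getline line      # run the command, read its one line of output
    close(C)              # close() must get the identical string: use C
    print "got:", line
}' 10.1.1.10
```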
Quote:
The built-in "echo" will do just fine, thanks! We needn't bother with /bin when Bash (or any shell, I imagine) has its own internal echo mechanism. [Or am I mistakenly ignoring scenarios where scripts run sans a terminal? Doesn't matter... right... or?] - BTW, I learned that my previous attempts with
Code:
"echo " ARGV[1] | getline
had failed because I was doing that part inside BEGIN { }... and then trying to use that read line inside a second set of { braces }. awk doesn't preserve that 'got line' across to the next group of braces (it seems). Also... since your version works *without* close(), I think it's okay to skip that part. Not sure, but -- since awk doesn't send an error about it -- I think it's fine to leave it out. Or wait...
-- Finally, what is the final consensus about leading zeros, in terms of a little tool like the one we have been building? Is it appropriate for our tool to reject *all* leading-zero attempts? Or should we allow just the 000 thru 007 range (like I did)? Or should we allow *all* leading-zero fields (as long as no other violation occurs), with the philosophy that the user must know what they want if they do that? Whaddya think? :confused:
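On the BEGIN-block point above: what actually vanishes is $0, not awk variables. getline with no target fills $0, which the first real input record then overwrites; reading into a named variable survives into the later { } rules. A small sketch (the echoed address and the dummy record are made up for illustration):

```shell
printf 'dummy-record\n' | awk '
BEGIN { "echo 12.34.56.78" | getline saved }   # read into a variable, not $0
{ print "record:", $0, "| saved:", saved }     # saved survives past BEGIN
'
```

This prints `record: dummy-record | saved: 12.34.56.78` — the value read in BEGIN is still there in the main rule.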
Use this code:
Code:
#! /usr/bin/awk -F. -f
and run it as:
Code:
xxxx 10.1.1.10 -
(don't miss that hyphen, and for this test it must be /bin/echo). Now it doesn't terminate until you press Enter. In another terminal/window do:
Code:
ps axco command | grep echo
and you will find a running echo process; press Enter and it dies. Now remove the comment mark before "close" and repeat: no extra process. That's what I mean. But really, in a short program like this it isn't needed; it's just good practice to always use it. In a bigger program, if you don't close() and later do echo again (with the same parameters), it would "reuse" the same process and fail. Also you could end up with a bunch of unnecessary processes.
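The "reuse the same process and fail" point can also be shown without a second terminal. This sketch is my own (not the xxxx script from the post), using a two-line printf in place of echo:

```shell
# Demonstration: an identical command string reuses the still-open pipe.
awk 'BEGIN {
    cmd = "printf \"a\\nb\\n\""
    cmd | getline x   # starts the command, reads "a"
    cmd | getline y   # same string, same open pipe: reads "b", not "a"
    print x, y
    close(cmd)
    cmd | getline z   # after close(), the command runs afresh: "a" again
    print z
}'
```

Without the close(), the third getline would just hit end-of-pipe and return 0 instead of restarting the command.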
Quote:
Code:
ronk@sexy:bin(0)$ cat ipok
About the new version: "OOps"
And about yours: "I zink we has a winner"
Quote:
I did a search through all the RFCs, looking for every occurrence of the word "dotted". The RFCs seem to be unanimous: IPv4 addresses are dotted decimal, while IPv6 addresses are always hex. Typical phraseology, repeated over and over again in the RFCs, is something like: Quote:
However, when an IP address is used as part of a URI, they're emphatically opposed to the notion that any reasonable format will do. One issue is the lack of standardization: some sites will interpret a leading zero to mean octal, and others will treat it still as decimal. That means that the same exact URI means different things at different hosts, making routing a nightmare. (Yeah, yeah, different DNS servers might map the same FQDN to different IP addresses, but at least they all agree about which FQDN they're mapping.) Worse, some sites will interpret a leading 0x to mean hexadecimal, while others will interpret the non-digit to mean it's not a numeric address at all, and try to resolve it as a Fully Qualified Domain Name.
All of which leads to some hairy security issues. The use of non-canonical dotted IP addresses (dotted octal, dotted hex, dotted decimal with leading zeroes) opens up some potential attack vectors. It seems to me best to nip the problem in the bud by rejecting non-canonical IP addresses outright. The worst that will happen is that our subroutine rejects the string as an IP address, forcing the caller to look it up using DNS, and that search will probably fail. (If it really could be interpreted as an IP address, its Top Level Domain will be invalid. None of the extant TLDs are numeric or begin with 0x.)
A case could be made for allowing the leading zeroes in situations where you know the string is intended to be an IP address... except that I can't think of many such situations. Utilities like ping will happily accept a domain name in lieu of an IP address. The only places I can think of where it has to be an IP address are in specifying a Domain Name Server, or in configuring your local system. In both of those cases, it's not onerous to require the user to use a canonical IP address. And besides, the octal/decimal ambiguity is enough to make me think that a person prepending extra zeroes to an IP address does not know what they're doing. Are they even aware of the ambiguity? Can they be certain how the address will be interpreted? If so, how? If not, why are they doing it?
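The reject-all-non-canonical policy argued for above, as a hedged sketch (ok_ip is a made-up name, and the shell wrapper is just for convenience): four plain-decimal fields, each 0-255, with no leading zeros (a lone "0" is fine).

```shell
# Hypothetical validator implementing the "canonical dotted decimal only"
# consensus: 4 fields, each 0-255, no leading zeros.
ok_ip() {
    echo "$1" | awk -F. '{
        if (NF != 4) { print "bad"; exit }
        for (i = 1; i <= NF; i++)
            # a field is a lone 0, or starts 1-9 with at most 3 digits
            if ($i !~ /^(0|[1-9][0-9]?[0-9]?)$/ || $i + 0 > 255) {
                print "bad"; exit
            }
        print "ok"
    }'
}

ok_ip 192.168.1.10      # -> ok
ok_ip 192.168.001.010   # -> bad: leading zeros (octal/decimal ambiguity)
ok_ip 256.1.1.1         # -> bad: field out of range
```

The interval-free regex (`[0-9]?[0-9]?` rather than `{0,2}`) is deliberate: older mawk builds don't support ERE interval expressions.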
Quote:
Working through the first part, I looked at the process status in more detail:
Code:
$ ps axcru | sed -E '1p;/.*(awk|echo)/!d'
As you mentioned, in this little script -- where we bail out virtually as soon as we "BEGIN", and also make no further attempts to access that process/pipe -- we can omit explicitly stating all the close and delete business. Super. Thanks for making it more understandable.
Quote:
Thanks for weighing in again. Since it makes more sense to either reject all or accept all leading-zero cases, I removed the awk line which accepted leading zeros on numbers less than 8. [Indeed, such code behavior could more easily cause confusion than add convenience.] I tried to reduce things a bit more by not calling length() repeatedly. So here then is my (unsexy, 200-character) latest version:
Code:
#!/usr/bin/awk -F. -f
Quote:
Typing afp://192.168.001.010/ in the server address field (plus a password) brings up a message saying "Connecting to afp://192.168.001.010/" :) It worked. [Though netstat still shows it as 192.168.1.10.]