Ben Norcutt wrote:
Hi,
After a few days of head scratching I thought I'd ask this one. Basically what I want to do is this:
I have 2 files: one called url containing URLs, one per line, and another called keywords containing words, one per line. I want to search the url file for each of the keywords in the keywords file and write any matches to the file output.
I can do this for individual keywords using awk '/word1/||/word2/||/word3/' url > output, but I need a way to do it for a whole file of keywords.
I believe the following would work from zsh or bash:
for K in $(cat keywords); do echo "---- $K ----" >> output; grep "$K" url >> output; done
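For what it's worth, here's that loop written out in full with some made-up sample data so you can try it end to end (the file names url and keywords are from the original post; the example URLs are invented). Also worth knowing: grep can read all its patterns from a file at once with -f, which sidesteps the loop and the long-command-line problem entirely, though you lose the ---- headers:

```shell
# Invented sample data, just for testing (file names from the original post)
printf '%s\n' http://example.com/apple http://example.com/pear > url
printf '%s\n' apple banana > keywords

# The loop: a "---- keyword ----" header, then the URLs matching that keyword
rm -f output
for K in $(cat keywords); do
    echo "---- $K ----" >> output
    grep "$K" url >> output
done
cat output

# Alternative: let grep read the patterns itself (no loop, no headers)
grep -f keywords url > output2
cat output2
```

A keyword with no matches (banana above) just produces an empty section, since grep writes nothing for it.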
If your keywords file is too huge then you might have a problem with the commandline being too long; if so, use Perl ;-)
perl -e 'foreach $K (split "\n", `cat keywords`) { `echo "---- $K ----" >> output; grep "$K" url >> output` }'
I haven't done Perl for a year, so that might be a little off. Those ` characters are backticks, by the way; they're coming up funny in the Mozilla mail editor (cringe, hide).
If it doesn't work the first time, remember to delete 'output' before you try again, or it will all get icky (the loop appends with >>).
HTH, Alexis