> You can do it in BBEdit using a Perl script, but in what form do you want the results?
John, thanks for your reply. I have spent my evenings attempting to write a pithy one-liner from the command line to do this, but the resolution is just not there with grep. All of the files to be searched contain paragraphs of soft-wrapped text (I don't know if that is the correct term, sorry). I have not written any Perl in over 10 years; time to break out the books!

What is needed? ;-) A frequency count of each word in the input file, for each file that was searched. For example, the first word of the input file is "it". Document one is searched for "it" and it shows up 248 times. Optimal output would be (in tabbed columns):

it	Document 1	248

I know the output is going to be huge (the input file is rather large), but that is fine--I just need to get to the analysis part at this point.

Cheers!
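For reference, here is a minimal Perl sketch of the kind of thing I'm after. The script name, the file names ("words.txt" and the documents on the command line), and the `word_counts` helper are all made up for illustration -- this is not a BBEdit text filter as-is, just the counting logic:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Tally case-insensitive word frequencies in a string of text.
# \w+ is a rough notion of "word"; adjust the pattern to taste.
sub word_counts {
    my ($text) = @_;
    my %count;
    $count{lc $1}++ while $text =~ /(\w+)/g;
    return \%count;
}

# Usage (hypothetical): perl wordfreq.pl words.txt doc1.txt doc2.txt ...
if (@ARGV >= 2) {
    my ($wordfile, @docs) = @ARGV;
    open my $wl, '<', $wordfile or die "$wordfile: $!";
    chomp(my @words = <$wl>);
    close $wl;

    for my $doc (@docs) {
        # Slurp the whole file; soft wrapping then doesn't matter.
        local $/;
        open my $fh, '<', $doc or die "$doc: $!";
        my $count = word_counts(<$fh>);
        close $fh;
        # One tab-separated line per word: word, document, count.
        print join("\t", $_, $doc, $count->{lc $_} // 0), "\n" for @words;
    }
}
```

Counting every word in a document once and then looking up each search term should be much faster than grepping each document once per word, which matters if the word list is large.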
