Any time something's not working as you expect, you should write to
[email protected]. They actually write back. (And in this case I'd
include a sample data file and your Text Factory.)
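
For the per-story "new words" lists described below, a short script may be simpler than a Text Factory anyway. Here is a minimal sketch, assuming Python 3 and the `##`-heading layout from the sample data (the inline `sample` string is a stand-in for the real stories file):

```python
import re

def new_words_per_story(md_text):
    """Split an md file on H2 headings and, for each story, list the
    words (case-sensitive) not seen in any earlier story."""
    seen = set()
    report = []
    # Split on lines starting with "##"; the chunk before the first
    # heading is empty, so skip it.
    for block in re.split(r"^##", md_text, flags=re.M)[1:]:
        lines = block.splitlines()
        title = lines[0].strip()
        body = " ".join(lines[1:])
        # Delete punctuation, then split on whitespace.
        words = re.sub(r"[^\w\s]", "", body).split()
        new = []
        for w in words:
            if w not in seen:       # case-sensitive comparison
                seen.add(w)
                new.append(w)
        report.append((title, new))
    return report

sample = """##101
Me a.

##102
Me le.
Me a.
Me mala a.
"""
for title, new in new_words_per_story(sample):
    print("##" + title)
    for i, w in enumerate(new):
        print(("1. " if i == 0 else "* ") + w)
```

Reading the real md file into `md_text` in place of `sample` should give lists like the ones sketched in the message below, ready for spotting stories with too much new material.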

--Kerri

On Tue, May 5, 2015 at 9:29 PM, Christopher Jones <[email protected]>
wrote:

> Hey all! Thanks for your attention!
>
> I am a linguist, and I've recently documented a BRAND NEW language (by NEW
> I mean 'ANCIENT'... but newly 'discovered'...). There are fewer than 1,000
> people on the planet that speak this language, but they really want an
> indigenous literacy program, and to be able to read and write their own
> language. I'm living with them, and will be helping them to realize this
> dream.
>
> That said... I'm using a syllable-based literacy lesson-plan, and have
> done lots of research and documentation, and figured out the order in which
> I need to teach the syllables in their language. Now, I have to put
> together little 'reading books' for them, and am having a bit of trouble...
>
> Basically, we'll teach a new 'syllable' at each class session/lesson, and
> then use that new syllable with the syllables taught in previous lessons to
> 'build new words' out of the syllables. We need to compose practice stories
> for our students to use their new skills on, but we can only use a limited
> number of NEW 'built words' in every little story, or we'll risk
> overwhelming our brand new readers.
>
> So I have an md document with a bunch of stories in it, separated by H2s:
>
>
> ##101
> Me a.
>
>
> ##102
> Me le.
> Me a.
> Me mala a.
>
>
> ##103
> Hahe.
> Hahe mele mehe.
> Hahe hele mehe.
> Hahe ale mehe.
>
>
> ##104
> Hele hahe.
> Mele hahe.
> Hahe le.
> Hahe a.
>
>
> and I want to be able to churn out something like:
>
> ##101
> 1. Me
> * a
>
> ##102
> 1. le
> * mala
>
> ##103
> 1. Hahe
> * mele
> * mehe
> * hele
> * ale
>
> ##104
> 1. Hele
> * hahe
> * Mele
>
> basically lists of all the 'new' material in each story, so that I can
> see which stories have too much 'new material', make appropriate tweaks,
> and then re-evaluate until all the 'new material' is down at
> acceptable levels for each progressive lesson.
>
> Does that make sense?
>
> I've been doing something like this already with some grep, individually:
>
> * deleting all punctuation
> * turning all whitespace into line breaks
> * processing/deleting all duplicate lines (case-sensitive)
>
> and this has been pretty helpful, but it's a pain to have to do this over
> and over again... Text Factories are perfect for this, right?
>
> WRONG.
>
> I think it is a bug or something, but upon 'Processing Duplicate Lines' in
> a text factory, it looks like maybe the first instance of duplication gets
> completely deleted or something.... You can try it on my sample data... the
> 'Me' never makes it through the text factory, but running the operations
> individually DOES work. WTF?
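
In the meantime, those three steps can be chained in one small script, keeping the FIRST instance of each duplicate — the part the Text Factory seems to be dropping. A minimal sketch, assuming Python 3:

```python
import re

def normalize(text):
    """Delete punctuation, put one word per line, and drop duplicate
    lines case-sensitively -- keeping the FIRST occurrence of each."""
    text = re.sub(r"[^\w\s]", "", text)   # delete all punctuation
    seen = set()
    unique = []
    for w in text.split():                # whitespace -> line breaks
        if w not in seen:                 # case-sensitive dedupe
            seen.add(w)
            unique.append(w)
    return "\n".join(unique)

# "Me" survives here, unlike in the factory run described above.
print(normalize("Me le. Me a. Me mala a."))
```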
>
> Please... hook me up with some wisdom on a killer way to do this?
> Thanks for any help you might be able to provide, guys... I'd greatly
> appreciate it!
>
> I'm running the latest version.
>
>
>  --
> This is the BBEdit Talk public discussion group. If you have a
> feature request or would like to report a problem, please email
> "[email protected]" rather than posting to the group.
> Follow @bbedit on Twitter: <http://www.twitter.com/bbedit>
>
> ---
> You received this message because you are subscribed to the Google Groups
> "BBEdit Talk" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
>
