Feeds are conduits for arbitrary text to flow into the lexer, and to be converted into wordings.

§1. Feed sessions. Each feed has a unique ID. At present only one is ever open at a time, but we don't want to assume that.

define feed_t int /* not a typedef only because it makes trouble for inweb */

§2. There are two ways to make a feed. One is simply to call one of the feed_text routines below and use its output. The other is a multi-stage process, for when multiple pieces of text need to go into the same feed: to start one, call Feeds::begin and keep the ID it returns; to end, call Feeds::end, passing that same ID back again.

feed_t Feeds::begin(void) {
    return (feed_t) lexer_wordcount;
}
wording Feeds::end(feed_t id) {
    return Wordings::new((int) id, lexer_wordcount-1);
}

§3. Feeding a feed. Some variations on a theme:

wording Feeds::feed_C_string(wchar_t *text) {
    return Feeds::feed_C_string_full(text, FALSE, NULL, FALSE);
}

wording Feeds::feed_text(text_stream *text) {
    return Feeds::feed_text_full(text, FALSE, NULL);
}

wording Feeds::feed_C_string_expanding_strings(wchar_t *text) {
    return Feeds::feed_C_string_full(text, TRUE, NULL, FALSE);
}

wording Feeds::feed_text_expanding_strings(text_stream *text) {
    return Feeds::feed_text_full(text, TRUE, NULL);
}

wording Feeds::feed_text_punctuated(text_stream *text, wchar_t *pmarks) {
    wording W = Feeds::feed_text_full(text, FALSE, pmarks);
    return W;
}

§4. ...all of which result in calls to these two, which are really the same function, written two ways:

wording Feeds::feed_C_string_full(wchar_t *text, int expand, wchar_t *nonstandard,
    int break_at_slashes) {
    ⟨Set up the lexer 4.1⟩;
    lexer_break_at_slashes = break_at_slashes;
    for (int i=0; text[i] != 0; i++) {
        int last_cr, cr, next_cr;
        if (i > 0) last_cr = text[i-1]; else last_cr = EOF;
        cr = text[i];
        if (cr != 0) next_cr = text[i+1]; else next_cr = EOF;
        Lexer::feed_triplet(last_cr, cr, next_cr);
    }
    ⟨Extract results from the lexer 4.2⟩;
}

wording Feeds::feed_text_full(text_stream *text, int expand, wchar_t *nonstandard) {
    ⟨Set up the lexer 4.1⟩;
    for (int i=0, L=Str::len(text); i<L; i++) {
        int last_cr, cr, next_cr;
        if (i > 0) last_cr = Str::get_at(text, i-1); else last_cr = EOF;
        cr = Str::get_at(text, i);
        if (cr != 0) next_cr = Str::get_at(text, i+1); else next_cr = EOF;
        Lexer::feed_triplet(last_cr, cr, next_cr);
    }
    ⟨Extract results from the lexer 4.2⟩;
}

§4.1. ⟨Set up the lexer 4.1⟩ =

    lexer_divide_strings_at_text_substitutions = expand;
    lexer_allow_I6_escapes = TRUE;
    if (nonstandard) {
        lexer_punctuation_marks = nonstandard;
        lexer_allow_I6_escapes = FALSE;
    } else
        lexer_punctuation_marks = STANDARD_PUNCTUATION_MARKS;

§4.2. ⟨Extract results from the lexer 4.2⟩ =

    wording LEXW = Lexer::feed_ends(FALSE, NULL);
    return LEXW;

§5. If we want to feed a wording, we could do that by printing it out to a text stream and then feeding that text; but this would be slow and rather circular, and would also lose the origin. It is much quicker to splice, and then no feed is needed at all:

wording Feeds::feed_wording(wording W) {
    return Lexer::splice_words(W);
}