• corbin@awful.systems · 3 months ago

    Calling it now: codepoint-level non-tokenization, with a remapping step that only recognizes the few thousand most popular codepoints, would outperform what OpenAI has forced themselves into using. The evidence is circumstantial but strong, e.g. arithmetic isn’t learned properly because BPE tokenizers obscure the individual Arabic digits. They can’t backpedal on this without breaking some of their API and re-pretraining a model, and they make a big deal about how expensive GPT pretraining is, so they’re stuck in their local minimum.
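    To make the contrast concrete, here’s a minimal sketch of the remapping idea; the toy corpus, the 4096 vocab size, and the single UNK bucket are illustrative assumptions, not anything OpenAI ships. A BPE tokenizer might split “12345” into chunks like “123” + “45” depending on its learned merges, while a codepoint-level scheme always yields one token per digit:

    ```python
    from collections import Counter

    def build_codepoint_vocab(corpus, vocab_size=4096):
        # Count every codepoint in the corpus and keep only the most popular.
        counts = Counter(ch for text in corpus for ch in text)
        # Reserve id 0 as the catch-all bucket for rare codepoints.
        return {ch: i + 1 for i, (ch, _) in enumerate(counts.most_common(vocab_size - 1))}

    def encode(text, vocab, unk_id=0):
        # One token per codepoint: no merges, so "12345" is always five tokens.
        return [vocab.get(ch, unk_id) for ch in text]

    corpus = ["2 + 2 = 4", "12345 + 67890 = 80235"]
    vocab = build_codepoint_vocab(corpus)
    print(encode("12345", vocab))  # five ids, one per digit
    ```

    Because each digit always maps to the same id, column-wise arithmetic patterns stay visible to the model instead of being scrambled by merge boundaries.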

      • UnseriousAcademic@awful.systems · 3 months ago

        The only viable use case, in my opinion, is to utilise its strong abilities in SolidGoldMagikarp to actualise our goals in the SolidGoldMagikarp sector and achieve increased margins on SolidGoldMagikarp.