Hi all!

I'm writing a large package in LaTeX3 right now (a rewrite of 'concepts', actually). I keep feeling the need to write some general-purpose functions. For now I'm putting them as private functions in my package, but I was wondering whether they might not be a nice addition to the standard modules. However, even the example idea I mentioned in TeX.se chat seemed to be controversial, so I'll start with just that one.

It's a small set of functions I wrote to scan ahead in the input stream to find modifiers and various delimited optional arguments without having to write auxiliary functions.

First there's `\scan_tokens:nnw`, which scans ahead for any tokens that are in `#1`, grabs them, and calls the (literal) function `#2` with the resulting token list as an argument. In my package I use it to grab digits and modifiers. Example:

------------------------------------------------------------
\ExplSyntaxOn
\NewDocumentCommand \GrabDigits {}
  { \scan_tokens:nnw {1234567890} { (##1) } }
\NewDocumentCommand \GrabModifiers {}
  { \scan_tokens:nnw {!*^} { (\tl_to_str:n{##1}) } }
\ExplSyntaxOff

\GrabDigits 76543xyz     % yields (76543)xyz
\GrabModifiers ^!^*$x$   % yields (^!^*)$x$
------------------------------------------------------------

I've also written several related functions, like `\scan_tokens:nnFw`, which calls `F` if no tokens were found, and `\scan_delimited:NNnw` and `\scan_delimited:NNnFw`, which are essentially an inline wrapper for an xparse `r` argument. I've attached an MWE.

The implementation could be more efficient; I just felt like doing a recursive version. It would have been expandable if only `\peek_meaning_remove:NTF` were. :-)

Please tell me what you think!

-- 
www.mhelvens.net
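
P.S. To give a feel for the other variants, here's a hypothetical usage sketch. The interfaces are my guess from the signatures above (the attached MWE is authoritative), and `\GrabDigitsOrNone` and `\GrabGroup` are made-up names for illustration:

------------------------------------------------------------
\ExplSyntaxOn
% F variant: the extra argument is run when no matching tokens follow.
\NewDocumentCommand \GrabDigitsOrNone {}
  { \scan_tokens:nnFw {1234567890} { (##1) } { (none) } }
% Delimited variant: grab everything between the two delimiter
% tokens, like an xparse 'r' argument, but inline.
\NewDocumentCommand \GrabGroup {}
  { \scan_delimited:NNnw ( ) { [##1] } }
\ExplSyntaxOff

\GrabDigitsOrNone 42abc    % would yield (42)abc
\GrabDigitsOrNone abc      % would yield (none)abc
\GrabGroup (hello) world   % would yield [hello] world
------------------------------------------------------------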