- AbbIE: Autoregressive Block-Based Iterative Encoder for Efficient Sequence Modeling
  We introduce the Autoregressive Block-Based Iterative Encoder (AbbIE), a novel recursive generalization of the encoder-only Transformer architecture that achieves better perplexity than a standard Transformer and allows dynamic scaling of compute at test time. This simple recursive approach complements scaling large language model (LLM) performance through parameter and token counts. AbbIE performs its iterations in latent space but, unlike latent reasoning models, requires no specialized dataset or training protocol. We show that AbbIE upward generalizes (i.e., generalizes at test time to iteration counts beyond those seen in training) while training with only 2 iterations, far outperforming alternative iterative methods. AbbIE's ability to scale its computational expenditure to the complexity of the task yields up to a 12% improvement on zero-shot in-context learning tasks over other iterative and standard methods, and up to a 5% improvement in language perplexity. These results open a new avenue for Transformer performance scaling. We perform all evaluations on model sizes up to 350M parameters.
  10 authors · Jul 11, 2025
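  The abstract gives no implementation details, so the following is a minimal, hypothetical sketch of the core idea only: a weight-shared causal encoder block whose latent output is fed back into itself, trained with 2 iterations and run with more at test time. The class name `AbbIESketch`, the layer sizes, the causal mask, and the plain-feedback scheme are all assumptions for illustration, not the paper's actual architecture.

  ```python
  import torch
  import torch.nn as nn

  class AbbIESketch(nn.Module):
      """Hypothetical sketch of the recursive idea in the AbbIE abstract.

      Assumptions (not specified by the paper): a single weight-shared
      Transformer encoder block, a causal mask for autoregressive
      modeling, and direct feedback of the latent output into the block.
      """

      def __init__(self, d_model=512, n_heads=8, n_layers=4):
          super().__init__()
          layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
          self.block = nn.TransformerEncoder(layer, num_layers=n_layers)

      def forward(self, x, n_iters=2):
          # x: (batch, seq_len, d_model) token embeddings
          causal = nn.Transformer.generate_square_subsequent_mask(x.size(1))
          h = x
          for _ in range(n_iters):  # iterate in latent space
              h = self.block(h, mask=causal)
          return h

  model = AbbIESketch()
  tokens = torch.randn(2, 16, 512)      # dummy embeddings
  train_out = model(tokens, n_iters=2)  # iteration budget used in training
  test_out = model(tokens, n_iters=8)   # scaled-up test-time compute
  ```

  Because the same block weights are reused at every iteration, raising `n_iters` at inference adds compute without adding parameters, which is one plausible reading of the "upward generalization" the abstract describes.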
- Intercepting Interstellar Objects
  We describe how the ESA Comet Interceptor mission, which is due to launch in 2028/29 to a yet-to-be-discovered target, can provide a conceptual basis for a future mission to visit an interstellar object. Comet Interceptor will wait in space until a suitable long-period comet is discovered, allowing a rapid response to perform a fast flyby of an object that will be in the inner Solar System for only a few years; an enhanced version of this concept could realistically provide the first in situ investigation of a visitor from another star system.
  6 authors · Nov 29, 2025