Grounding Inferred Relationships in Complex World Models with Continual Reasoning
This paper proposes a Continual Reasoning framework that improves language models' ability to infer relationships in complex world models, as probed by benchmarks such as ARC-AGI and DABStep. By maintaining a structured external memory for hypothesis generation and refinement, our approach lets models iteratively learn relationships at inference time, improving their adaptability to out-of-distribution tasks.