Collective dynamics emerge from individual-level decisions, yet the link between individual-level decision-making processes and collective outcomes in realistic physical environments remains poorly understood.
Using collective foraging to study the key trade-off between personal and social information use, we present a mechanistic, spatially explicit agent-based model that combines individual-level evidence accumulation of personal and (visual) social cues with particle-based movement.
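As a rough illustration only (not the paper's implementation), the sketch below pairs a drift-diffusion-style accumulator over personal versus visual social cues with simple particle-based movement; all function names, parameters, and values are illustrative assumptions.

```python
# Minimal sketch, assuming drift-diffusion evidence accumulation and a
# correlated-random-walk movement rule; not the authors' actual model.
import numpy as np

rng = np.random.default_rng(0)

def accumulate_social_evidence(social_cue, personal_cue,
                               k_social=1.0, k_personal=1.0,
                               noise=0.1, threshold=1.0, dt=0.1):
    """Integrate noisy evidence until the social (+) or personal (-) bound is reached."""
    x = 0.0
    while abs(x) < threshold:
        drift = k_social * social_cue - k_personal * personal_cue
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
    return "approach_other" if x > 0 else "search_alone"

def step(position, heading, decision, neighbour_pos, speed=0.05, turn_noise=0.3):
    """Particle-based movement: head toward a successful neighbour or keep searching."""
    if decision == "approach_other":
        direction = neighbour_pos - position
        heading = np.arctan2(direction[1], direction[0])
    else:
        heading += turn_noise * rng.standard_normal()
    new_position = position + speed * np.array([np.cos(heading), np.sin(heading)])
    return new_position, heading

# Toy usage: a strongly visible successful forager versus a weak personal cue.
pos, head = np.zeros(2), 0.0
choice = accumulate_social_evidence(social_cue=0.8, personal_cue=0.2)
pos, head = step(pos, head, choice, neighbour_pos=np.array([1.0, 1.0]))
print(choice, pos)
```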
Under idealized conditions without physical constraints, our mechanistic framework reproduces findings from established probabilistic models, but additionally shows how individual-level decision processes generate collective outcomes from the bottom up.
Groups performed best in clustered environments when agents quickly accumulated social information and approached successful others, whereas individualistic search was most beneficial in uniform environments.
Incorporating different real-world physical and perceptual constraints profoundly shaped collective performance, occasionally buffering maladaptive herding and generating self-organized exploration.
Our study uncovers the mechanisms linking individual cognition to collective outcomes in human and animal foraging and paves the way for decentralized robotic applications.