Abstract: Autonomous systems often operate in environments where the behavior of all agents is largely governed by the perception of a specific feature of the environment. When an autonomous system cannot recover this feature, the consequences can be disastrous. We introduce a novel framework for agent-aware state estimation that exploits the dependency of all agents' behavior on a feature to observe that feature indirectly and more accurately. To allow for fast and accurate inference, we provide a mapping of our framework to a dynamic Bayesian network and show that inference speed scales favorably with the number of agents in the environment. We then apply our approach to traffic light classification, focusing on instances where direct vision of the light may be obstructed by glare, heavy rain, vehicles, or other environmental factors. Finally, we show that agent-aware state estimation outperforms prevailing methods that use only direct image data of the traffic light, on a real-world autonomous vehicle data set of challenging scenarios.
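To make the core idea concrete, the following is a minimal illustrative sketch (not the paper's actual model) of inferring a hidden traffic-light state from observed agent behavior with a discrete Bayesian filter. The state space, transition model, and behavior likelihoods are all assumed values chosen for demonstration; the key point is that, with agents conditionally independent given the light, each agent contributes one multiplicative likelihood term, so the update cost grows linearly with the number of agents.

```python
LIGHT_STATES = ("red", "green")

# Assumed transition model P(light_t | light_{t-1}) per time step.
TRANSITION = {
    "red":   {"red": 0.95, "green": 0.05},
    "green": {"red": 0.05, "green": 0.95},
}

# Assumed likelihood P(behavior | light) for a single observed agent.
BEHAVIOR_LIKELIHOOD = {
    "red":   {"stopped": 0.9, "moving": 0.1},
    "green": {"stopped": 0.2, "moving": 0.8},
}

def filter_step(belief, agent_behaviors):
    """One predict/update step of a discrete Bayesian filter.

    Agents are assumed conditionally independent given the light,
    so their behavior likelihoods simply multiply.
    """
    # Predict: propagate the belief through the transition model.
    predicted = {
        s: sum(belief[p] * TRANSITION[p][s] for p in LIGHT_STATES)
        for s in LIGHT_STATES
    }
    # Update: multiply in each agent's behavior likelihood.
    for s in LIGHT_STATES:
        for b in agent_behaviors:
            predicted[s] *= BEHAVIOR_LIKELIHOOD[s][b]
    # Normalize to a proper distribution.
    z = sum(predicted.values())
    return {s: p / z for s, p in predicted.items()}

belief = {"red": 0.5, "green": 0.5}
# Three agents observed stopped strongly suggests a red light,
# even with no direct view of the light itself.
belief = filter_step(belief, ["stopped", "stopped", "stopped"])
print(belief)
```

Even with a uniform prior, three stopped agents push the posterior probability of "red" above 0.95, which is the sense in which agent behavior serves as an indirect observation of the occluded feature.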