Decision making is of crucial interest in many disciplines such as psychology, neuroscience, economics and machine learning. Binary perceptual decision theories concern situations where an observer (or machine) is confronted with one of two possible noisy stimuli. For example, a human reader has to decide whether a handwritten character is an n or a u; a trader has to decide whether to sell or to hold; a monkey has to decide whether dots on a screen are moving to the left or to the right. While engineering and economic decision theories focus on how to compute optimal decisions, psychology and neuroscience investigate the actual decision making process in humans and animals. Humans need only a fraction of a second to recognize objects. This astonishing speed is also evident in sports such as table tennis or soccer, which require rapid reactions to moving balls. In these tasks, the brain has to decide rapidly based on visual information available for only a hundred milliseconds or less. However, most experimental and theoretical work on decision making focuses on stationary paradigms, and surprisingly few studies have taken the timing of the stimulus into account. Decisions about noisy stimuli require evidence integration over time. Traditionally, evidence integration and decision making are described as a one-stage process: a decision is made when the evidence for the presence of a stimulus crosses a threshold. We will show that this model is incompatible with psychophysical experiments on feature fusion, where two visual stimuli are presented in rapid succession. Paradoxically, the second stimulus biases decisions more strongly than the first one, contrary to both intuition and the predictions of one-stage models. We present a two-stage model in which sensory information is integrated and buffered before it starts driving the diffusion process.
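The contrast between the two model classes can be sketched in a few lines of code. The following is a minimal, noise-free illustration (the diffusion noise is omitted for clarity, and all parameter values, time constants, and function names are illustrative choices, not taken from the original work): in the one-stage model the stimulus drives the decision variable directly, so a brief first stimulus can reach the threshold before the opposing second stimulus arrives; in the two-stage sketch, a leaky sensory integrator runs while the stimuli are on, and its value at stimulus offset is buffered and then drives the diffusion stage, so the more recent (second) stimulus dominates.

```python
DT = 1.0      # simulation time step (ms) -- illustrative
TAU = 20.0    # sensory leak time constant (ms) -- illustrative
THETA = 15.0  # decision threshold -- illustrative

def stimulus(t):
    """Two brief opposing stimuli: +1 for 0-20 ms, then -1 for 20-40 ms."""
    if t < 20.0:
        return +1.0
    if t < 40.0:
        return -1.0
    return 0.0

def one_stage(max_t=500.0):
    """One-stage accumulation: the stimulus drives the decision
    variable directly until it crosses a threshold."""
    x, t = 0.0, 0.0
    while t < max_t:
        x += stimulus(t) * DT
        t += DT
        if abs(x) >= THETA:
            return (+1 if x > 0 else -1), t
    return 0, t  # no decision reached

def two_stage(max_t=500.0):
    """Two-stage sketch: a leaky sensory integrator runs during
    stimulation; its value at offset is buffered and then drives
    the decision stage as a constant drift."""
    a, t = 0.0, 0.0
    while t < 40.0:                       # sensory stage, stimuli on
        a += (-a / TAU + stimulus(t)) * DT
        t += DT
    drift = a                             # buffered sensory evidence
    x = 0.0
    while t < max_t:
        x += drift * DT                   # decision stage
        t += DT
        if abs(x) >= THETA:
            return (+1 if x > 0 else -1), t
    return 0, t

print(one_stage())  # decision +1: the first stimulus wins
print(two_stage())  # decision -1: the second stimulus wins
```

In the one-stage run the decision variable crosses the positive bound while the first stimulus is still on, so the first stimulus determines the choice; in the two-stage run the leaky sensory trace at offset is dominated by the more recent stimulus, reproducing the second-stimulus bias described above in this deterministic caricature.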