We investigate the problem of online detection of complex activities (such as cooking, lunch, or work at a desk), i.e., recognizing them while the activities are still being performed, using only a partial observation of the sensor data. In contrast to prior work, where complex activity recognition is performed offline, with the observation of the activity available for its entire duration and with deeply instrumented environments, we focus on online activity detection using only accelerometer data from a single body-worn smartphone. We present window-based algorithms for online detection that achieve different tradeoffs between classification accuracy and detection latency. We report results of our exploration using a longitudinally extensive and carefully annotated smartphone accelerometer data trace that captures the real-life complex activity behavior of five subjects.