In this paper, we describe our approach to the MediaEval 2012 Visual Privacy task and discuss the evaluation results. The goal of the task is to obscure the faces of people visible in the provided surveillance clips in order to preserve their personal privacy. Although it is not explicitly stated in the task description, we additionally assume that the privacy protection should be applied in an automated way and that the privacy tool should be reversible and robust against attacks. We use a combination of a face detection algorithm and a transform-domain scrambling technique, which pseudo-randomly scrambles bits during encoding, applied to the detected face regions. Evaluation of the resulting automated privacy protection tool showed that inaccuracies of the face detection algorithm affected both the objective and the subjective results. An interesting finding, however, is that the scrambling, while not distracting to the evaluating subjects, was rated as significantly irritating, with a score of 0.8, but only for the 'evening' subset of the dataset.
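
As a rough illustration of the transform-domain scrambling idea (a minimal sketch, not the exact implementation evaluated in this paper), the snippet below pseudo-randomly flips the signs of a block's AC coefficients with a keyed PRNG; the function name, block layout, and key handling are illustrative assumptions. Because the same sign flips undo themselves, applying the operation twice with the same key restores the block, which is what makes such protection reversible for key holders.

```python
import numpy as np

def scramble_block(coeffs: np.ndarray, key: int) -> np.ndarray:
    """Pseudo-randomly flip the signs of a block's AC coefficients.

    `coeffs` stands in for one quantized 8x8 transform block of a
    detected face region; `key` seeds the PRNG shared by encoder and
    decoder. The same call with the same key descrambles the block.
    """
    rng = np.random.default_rng(key)
    flips = rng.integers(0, 2, size=coeffs.shape) * 2 - 1  # +1 or -1
    flips.flat[0] = 1  # keep the DC coefficient, preserving brightness
    return coeffs * flips
```

In a real codec the flips would be applied to the entropy-coded bits or quantized coefficients of only the macroblocks overlapping detected faces, so the rest of the frame stays untouched and standard decoders still produce a viewable (but face-obscured) video.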