Measures of signal complexity can be used to distinguish neurophysiological activation from noise in neuroimaging techniques that record variations of brain activity over time, e.g., fMRI, EEG, ERP. In this paper we explore a recently developed approach to computing a quantitative measure of deterministic signal complexity and information content: the Rényi number. The Rényi number is by definition an entropy, i.e., a classical measure of disorder in physical systems, and is calculated in this paper on the basis of time-frequency representations (TFRs) of the measured signals. When calculated in this form, the Rényi entropy (RE) indirectly characterizes the complexity of a signal by providing an approximate count of the number of separate elementary atoms that compose the time series in the time-frequency plane. In this sense, the measure conforms closely to our visual notion of complexity, since low complexity values are obtained for signals formed by a small number of "components". The most remarkable properties of this measure are twofold: 1) it does not rely on assumptions about the time series such as stationarity or Gaussianity, and 2) no model of the neural process under study is required, e.g., no hemodynamic response model for fMRI. The method is illustrated in this paper using fMRI, intracranial ERPs, and intracranial potentials estimated from scalp-recorded ERPs through an inverse solution (ELECTRA). The main theoretical and practical drawbacks of this measure, especially its dependence on the selected TFR, are discussed. We also emphasize the capability of this approach to produce, under less restrictive hypotheses, results comparable to those obtained with more standard methods.
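The idea above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: it uses a spectrogram as the TFR (the paper's choice of TFR may differ), order α = 3 (a common choice in the time-frequency literature), and a hypothetical helper name `renyi_entropy_tfr`. The TFR is normalized to a two-dimensional distribution p, and the Rényi entropy of order α is H_α = log2(Σ p^α) / (1 − α); a signal with more separated time-frequency "atoms" yields a higher value.

```python
import numpy as np
from scipy import signal

def renyi_entropy_tfr(x, fs, alpha=3.0):
    """Rényi entropy of order `alpha` computed over a spectrogram TFR.

    Illustrative sketch only; the TFR and parameters are assumptions,
    not the specific choices made in the paper.
    """
    # Positive time-frequency representation of the signal
    f, t, S = signal.spectrogram(x, fs=fs, nperseg=128, noverlap=96)
    P = S / S.sum()        # normalize to a 2-D distribution on the TF plane
    P = P[P > 0]           # drop zero cells before the power/log step
    return np.log2(np.sum(P ** alpha)) / (1.0 - alpha)

if __name__ == "__main__":
    # One spectral component vs. two: the two-tone signal occupies roughly
    # twice the time-frequency support, so its Rényi entropy is higher.
    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    one_tone = np.sin(2 * np.pi * 60 * t)
    two_tones = np.sin(2 * np.pi * 60 * t) + np.sin(2 * np.pi * 240 * t)
    print(renyi_entropy_tfr(one_tone, fs), renyi_entropy_tfr(two_tones, fs))
```

Roughly, 2^H approximates the number of elementary components, which is what gives the measure its "counting" interpretation described above.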