Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available for communication, with perfect causal output feedback. Is there a coding scheme (possibly with variable transmission time) that achieves the Burnashev error exponent uniformly over Q? For two families of channels we show that the answer is yes. Moreover, for each of these two classes, in addition to achieving the maximum error exponent, it is possible to uniformly attain any given fraction of the channel capacity. Hence, in terms of achievable rates and delay, there are situations in which knowledge of the channel becomes irrelevant.

In the second part of the thesis, we show that for arbitrary sets of channels the Burnashev error exponent cannot, in general, be uniformly achieved. In particular, we give a sufficient condition on a pair of channels under which no coding strategy attains Burnashev's exponent simultaneously on both.

In the third part, we study a scenario in which communication is carried out by first probing the channel with a training sequence, and then coding according to the resulting channel estimate. We derive an upper bound on the maximum error exponent achievable by such schemes. This bound is typically much lower than the maximum error exponent achievable over a channel with feedback; for binary symmetric channels, for example, its slope vanishes at capacity. This result suggests that, in terms of error exponent, a good universal feedback scheme combines channel estimation with information delivery rather than separating them.

In the final chapter, we address the question of communicating quickly and reliably. We consider the simple setting of two-message communication over a known channel with feedback. We propose a simple decoding rule and show that it minimizes a weighted combination of the error probability and the decoding delay for a certain range of crossover probabilities and combination weights.
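The two-message setting above can be illustrated with a minimal sketch. The code below is not the decoding rule studied in the thesis; it is a standard sequential probability ratio test (SPRT) for distinguishing two messages sent by repetition over a binary symmetric channel, which exhibits the same error-versus-delay trade-off: raising the stopping threshold lowers the error probability at the cost of a longer expected decoding delay. All names and parameter values are illustrative assumptions.

```python
import math
import random


def sprt_decode(p, msg, threshold, rng):
    """Receive repeated transmissions of `msg` (0 or 1) over a BSC
    with crossover probability `p`, and stop as soon as the
    log-likelihood ratio between the two messages crosses +/-threshold.

    Returns (decision, delay): the decoded message and the number of
    channel uses consumed before stopping."""
    step = math.log((1 - p) / p)  # per-observation LLR increment
    llr = 0.0
    delay = 0
    while abs(llr) < threshold:
        received = msg ^ (1 if rng.random() < p else 0)  # BSC flip
        llr += step if received == 0 else -step
        delay += 1
    return (0 if llr > 0 else 1), delay


def simulate(p=0.1, threshold=5.0, trials=2000, seed=0):
    """Estimate error probability and average decoding delay by
    Monte Carlo simulation over uniformly chosen messages."""
    rng = random.Random(seed)
    errors = 0
    total_delay = 0
    for _ in range(trials):
        msg = rng.randrange(2)
        decision, delay = sprt_decode(p, msg, threshold, rng)
        errors += decision != msg
        total_delay += delay
    return errors / trials, total_delay / trials
```

Varying `threshold` traces out the trade-off curve: a weighted objective of the kind considered in the final chapter amounts to picking the stopping rule that minimizes (error probability) + weight × (expected delay).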