In recent years, Independent Component Analysis has become a fundamental tool in signal and data processing, especially in the field of Blind Source Separation (BSS); under mild conditions, independent source signals can be recovered from mixtures of them by maximizing a so-called contrast function. Neither the mixing system nor the original sources are needed for this purpose, which justifies the term "blind". Among the existing BSS methods is the class of approaches maximizing Information-Theoretic Criteria (ITC), which rely on Rényi's entropies, including the well-known Shannon and Hartley entropies as special cases. These ITC are maximized via adaptive optimization schemes. Two major issues arise in this field: i) are ITC really contrast functions? and ii) since most algorithms in fact converge to a local maximum point, how relevant are these local optima from the BSS point of view? Even though partial answers to these questions exist in the literature, most of them are based on simulations and conjectures; formal developments are often lacking. This thesis aims to fill this gap while also providing intuitive justifications.

The BSS problem is stated in Chapter 1 and viewed from the information-theoretic angle. The next two chapters address the above questions specifically: Chapter 2 discusses the contrast-function property of ITC, while the possible existence of spurious local maximum points in ITC is the subject of Chapter 3. Finally, Chapter 4 deals with a range-based criterion, the only "entropy-based" contrast function that is discriminant, i.e. free from spurious local maxima. The value of this approach is confirmed by testing the proposed technique on various examples, including the MLSP 2006 data analysis competition benchmark; our method outperforms previously reported results on large-scale and noisy mixture samples obtained through ill-conditioned mixing matrices.
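To make the range-based idea concrete, the following is a minimal sketch (not taken from the thesis) of a two-source, noise-free toy case: bounded uniform sources are mixed, the mixtures are whitened, and a rotation angle is chosen by maximizing a range-based contrast, here taken as minus the sum of the log output ranges. The mixing matrix, sample size, and grid search are illustrative assumptions; for independent bounded sources the range of a unit-norm linear combination is minimized exactly when only one source contributes, which is why this criterion has no spurious maxima in this setting.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two independent bounded (uniform) sources.
S = rng.uniform(-1.0, 1.0, size=(2, n))

# Hypothetical mixing matrix for illustration.
A = np.array([[1.0, 0.8],
              [0.4, 1.0]])
X = A @ S  # observed mixtures

# Whitening: decorrelate the mixtures and scale them to unit variance.
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / n
d, E = np.linalg.eigh(cov)
W = (E / np.sqrt(d)).T          # W = D^{-1/2} E^T
Z = W @ Xc                      # whitened data

def range_contrast(theta, Z):
    """Range-based contrast: minus the sum of log output ranges."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    Y = R @ Z
    return -np.sum(np.log(Y.max(axis=1) - Y.min(axis=1)))

# After whitening, only a rotation remains unknown; in 2-D a quarter
# turn covers all separating solutions up to permutation and sign.
thetas = np.linspace(0.0, np.pi / 2, 1000)
best = max(thetas, key=lambda t: range_contrast(t, Z))
R = np.array([[np.cos(best), -np.sin(best)],
              [np.sin(best),  np.cos(best)]])
Y = R @ Z  # recovered sources, up to permutation, sign, and scale
```

A simple grid search is used here for clarity; in practice an adaptive optimization scheme of the kind discussed in the thesis would replace it.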