I present a didactic project that introduces the concept of information, with all its interdisciplinary ramifications, to students of physics and the neighbouring sciences. Proposed by Boltzmann in the guise of entropy, information has evolved into a common paradigm in science, the economy, and culture, superseding energy in this role. As an integrating factor of the natural sciences at least, it lends itself as a guiding principle for innovative teaching that transcends the frontiers of the traditional disciplines and emphasizes general viewpoints. Based on this idea, the postgraduate topical lecture presented here is intended to provide a firm conceptual basis, technically precise yet versatile enough to be applied to specific topics from a broad range of fields. Basic notions of physics such as causality, chance, irreversibility, symmetry, disorder, chaos, and complexity can be reinterpreted on a common footing in terms of information and information flow. Dissipation and deterministic chaos, exemplifying information currents between macroscopic and microscopic scales, receive special attention. An important part is dedicated to quantum mechanics as an approach to physics that takes the finiteness of information systematically into account; emblematic features such as entanglement and non-locality then appear as natural consequences. The course has been planned and tested for an audience comprising, besides physicists, students of other natural sciences as well as of mathematics, informatics, engineering, sociology, and philosophy. I sketch the history and objectives of this project, provide a résumé of the course, report on the experience gained in teaching it in various formats, and indicate possible future developments.