Lecturer(s)
|
-
Andres Jan, prof. RNDr. dr hab. DSc.
-
Zámečník Hadwiger Lukáš, Mgr. Ph.D.
|
Course content
|
(1) Introduction to information theory
Data compression:
(2) Probability and entropy
(3) Shannon's source coding theorem
(4) Types of code
Communication:
(5) Noisy-channel capacity
(6) Noisy-channel coding theorem
(7) Other types of code
Probability:
(8) Entailment models
(9) Decision theory
(10) Bayes's theorem
Neural networks:
(11) Hopfield network
(12) Multilayer network
|
Learning activities and teaching methods
|
Lecture, Monologic Lecture (Interpretation, Training), Dialogic Lecture (Discussion, Dialog, Brainstorming), Work with Text (with Book, Textbook), Methods of Written Work
|
Learning outcomes
|
The course is designed to introduce students to the fundamentals of information theory. Following the Theory of Communication course, a new aspect of the communication process is introduced - information. Students will learn about the information-related features of language signal transmission and become familiar with the basic mathematical apparatus used for information analysis (entropy, redundancy, encoding/decoding, Bayes' theorem, etc.). They will gain hands-on experience with mathematical analysis of texts, a skill they can reuse in a number of other courses and elsewhere. The main topics covered in the course are: introduction to information theory, data compression, communication, probability, and neural networks.
Conceptual analysis; Scholarly text analysis; Scholarly text analysis - other than CZ; Presentation
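The entropy and redundancy mentioned above can be illustrated with a few lines of code. The following is an illustrative sketch (not part of the course materials): it computes the per-character Shannon entropy of a text sample and its redundancy relative to the maximum entropy for the symbols it uses.

```python
import math
from collections import Counter

def entropy(text: str) -> float:
    """Shannon entropy of a text, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = "information theory"
H = entropy(sample)

# Redundancy compares H to the maximum possible entropy log2(k)
# for k distinct symbols (attained when all symbols are equally likely).
H_max = math.log2(len(set(sample)))
redundancy = 1 - H / H_max
```

Because letters in natural language are far from equally frequent, real texts show substantial redundancy, which is what makes data compression possible.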
|
Prerequisites
|
Basic reading knowledge of English is expected, as the key texts are available in English only. No prerequisite courses.
|
Assessment methods and criteria
|
Oral exam, Written exam, Student performance, Linguistic analysis, Dialog
(1) regular class attendance (80%)
(2) regular homework / reading assignments
(3) one-to-one discussion of a selected text (students are expected to be knowledgeable in selected topics and know the extended/recommended literature)
|
Recommended literature
|
-
Eco, U. (2009). Teorie sémiotiky. Praha: Argo.
-
Gleick, J. (2013). Informace. Praha.
-
Kvasnička, V. - Pospíchal, J. (2006). Matematická logika. Bratislava.
-
MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge.
-
Neubauer, J. - Sedlačík, M. - Kříž, O. (2012). Základy statistiky. Praha.
|