
Courses

Entropy, Time, Encoding
Explore time, entropy, and Transformers, linking physics, information, and AI.

Uncertainty, Information, Shannon Entropy
Build on entropy concepts: see how encoding sequence order lowers uncertainty in AI models, improving clarity, prediction accuracy, and real-world decision-making.
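As a quick taste of the entropy concept this course builds on, here is a minimal sketch (the function name and values are illustrative, not course material) of Shannon entropy, which quantifies uncertainty in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # < 1.0 -- the outcome is more predictable
```

Lowering entropy in this sense is exactly what "reducing uncertainty" means for a model's predictions.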

Temporal Encoding in Transformers
Understand how AI learns order in language and data, why position matters, and how this unlocks clearer meaning and smarter predictions.
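One common way Transformers encode position is the sinusoidal scheme from the original Transformer paper; the sketch below (dimension size and function name are illustrative) shows how each position gets a distinct vector the model can use to tell word order apart:

```python
import math

def sinusoidal_position(pos, d_model=8):
    """Sinusoidal positional encoding: even dims use sin, odd dims use cos,
    at geometrically spaced frequencies 1 / 10000^(i / d_model)."""
    enc = []
    for i in range(0, d_model, 2):
        freq = 1.0 / (10000 ** (i / d_model))
        enc.append(math.sin(pos * freq))
        enc.append(math.cos(pos * freq))
    return enc

# Position 0 is [0, 1, 0, 1, ...]; every later position differs from it.
print(sinusoidal_position(0))
print(sinusoidal_position(1))
```

Because the encoding is deterministic and unique per position, adding it to token embeddings lets attention layers distinguish "dog bites man" from "man bites dog".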

Information, Time, Memory
Learn how information is physical: erasing bits generates heat (Landauer's principle), linking memory, entropy, and efficiency, with insights vital for AI, computing, and cognition.
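The heat cost of erasing a bit has a concrete lower bound, k_B T ln 2 per bit (Landauer's limit); this small sketch (function name is illustrative) computes it at a given temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def landauer_limit(temperature_k):
    """Minimum heat in joules dissipated by erasing one bit: k_B * T * ln 2."""
    return K_B * temperature_k * math.log(2)

# At room temperature (~300 K) the bound is tiny but strictly nonzero.
print(landauer_limit(300.0))  # ~2.87e-21 J per erased bit
```

Real hardware dissipates far more than this bound, which is why it matters as an efficiency target rather than a practical limit today.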

Causal, Bidirectional, Time
Learn why causal Transformers generate step-by-step, while bidirectional models excel at comprehension, helping you pick the right AI approach for real tasks.
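The difference between the two model families comes down to the attention mask; a minimal sketch (pure-Python, illustrative) of the causal mask that forces step-by-step generation:

```python
def causal_mask(n):
    """Lower-triangular mask: token i may attend only to tokens j <= i.
    Bidirectional models simply drop this mask, so every token sees the
    whole sequence at once -- better for comprehension, unusable for
    left-to-right generation."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

for row in causal_mask(4):
    print(row)
```

Reading the rows top to bottom shows why generation is sequential: each new token can only look backward, never ahead.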