In mathematics, a Sturmian word (also Sturmian sequence or billiard sequence), named after Jacques Charles François Sturm, is a certain kind of infinitely long sequence of characters. Such a sequence can be generated by considering a game of English billiards on a square table: the struck ball successively hits the vertical and horizontal edges, labelled 0 and 1, generating a sequence of letters. This sequence is a Sturmian word. (Wikipedia).
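The billiard description above is equivalent to coding an irrational rotation: letter k of the characteristic Sturmian word of slope α is floor((k+1)α) − floor(kα). A minimal sketch under that standard formula (the function name is my own):

```python
import math

def sturmian_prefix(alpha, n):
    """First n letters of the characteristic Sturmian word of slope alpha
    (0 < alpha < 1, irrational): letter k is floor((k+1)*alpha) - floor(k*alpha)."""
    return "".join(str(math.floor((k + 1) * alpha) - math.floor(k * alpha))
                   for k in range(1, n + 1))

# Slope 2 - phi = (3 - sqrt(5))/2 yields the Fibonacci word,
# the best-known Sturmian word.
print(sturmian_prefix((3 - math.sqrt(5)) / 2, 10))  # 0100101001
```

Each letter records whether the rotation orbit of α crosses an integer at that step, which is exactly the horizontal-vs-vertical bounce of the billiard ball.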
How to recognize billiard strings #SOME2
For the curious, "Sturmian sequences" is the name for infinite balanced binary sequences that are not eventually periodic. Sturmian sequences are related in some very cool ways to continued fractions and several other topics. That may be the subject of Part 3, if I ever get that far.
From playlist Summer of Math Exposition 2 videos
Aleksi Saarela : k-abelian complexity and fluctuation
Abstract: Words u and v are defined to be k-abelian equivalent if every factor of length at most k appears as many times in u as in v. The k-abelian complexity function of an infinite word can then be defined so that it maps a number n to the number of k-abelian equivalence classes of length-n factors of the word.
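The equivalence defined in the abstract can be checked directly by counting short factors; a small sketch following that definition (function names are illustrative):

```python
from collections import Counter

def short_factor_counts(word, k):
    """Multiset of all factors of word of length at most k."""
    return Counter(word[i:i + m]
                   for m in range(1, k + 1)
                   for i in range(len(word) - m + 1))

def k_abelian_equivalent(u, v, k):
    """Per the definition above: u and v are k-abelian equivalent when
    every factor of length at most k occurs equally often in both."""
    return short_factor_counts(u, k) == short_factor_counts(v, k)

print(k_abelian_equivalent("abba", "baab", 1))  # True: same letter counts
print(k_abelian_equivalent("abba", "baab", 2))  # False: "bb" occurs only in "abba"
```

For k = 1 this reduces to ordinary abelian equivalence (equal letter counts), which is why "abba" and "baab" agree at k = 1 but split at k = 2.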
From playlist Combinatorics
Jörg Thuswaldner: S-adic sequences: a bridge between dynamics, arithmetic, and geometry
Abstract: Based on work done by Morse and Hedlund (1940), it was observed by Arnoux and Rauzy (1991) that the classical continued fraction algorithm provides a surprising link between arithmetic and diophantine properties of an irrational number α, the rotation by α on the torus 𝕋 = ℝ/ℤ
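As a concrete companion to the continued fraction algorithm the abstract refers to, a small floating-point sketch (function name is my own; only reliable for a few terms because of rounding):

```python
import math

def partial_quotients(alpha, n):
    """First n partial quotients of the continued fraction of alpha,
    computed with floating-point arithmetic (small n only)."""
    quotients = []
    x = alpha
    for _ in range(n):
        a = math.floor(x)
        quotients.append(a)
        if x == a:
            break  # alpha was rational (or precision ran out)
        x = 1 / (x - a)
    return quotients

# The golden ratio (1 + sqrt(5))/2 has continued fraction [1; 1, 1, 1, ...].
print(partial_quotients((1 + math.sqrt(5)) / 2, 6))  # [1, 1, 1, 1, 1, 1]
```

The partial quotients of α govern how the orbit of the rotation by α distributes on the torus, which is the bridge between arithmetic and dynamics mentioned above.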
From playlist Dynamical Systems and Ordinary Differential Equations
Valérie Berthé: Dimension groups and recurrence for tree subshifts
Abstract: Dimension groups are invariants of orbital equivalence. We show in this lecture how to compute the dimension group of tree subshifts. Tree subshifts are defined in terms of extension graphs that describe the left and right extensions of factors of their languages: the extension g
From playlist Mathematical Aspects of Computer Science
The Story of Chinese Character : 犬
In modern Chinese, 犬 has been replaced by another word, 狗; it is hardly used on its own and survives only in idioms, while 犬 (inu) is still in everyday use in Japanese.
From playlist The Story of HanZi (Chinese Characters)
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 2 - Neural Classifiers
For more information about Stanford's Artificial Intelligence professional and graduate programs visit: https://stanford.io/2ZB72nu
Lecture 2: Word Vectors, Word Senses, and Neural Network Classifiers
1. Course organization (2 mins)
2. Finish looking at word vectors and word2vec (13 mins)
From playlist Stanford CS224N: Natural Language Processing with Deep Learning | Winter 2021
Introduction to Modern Linguistics by Prof. Shreesh Chaudhary & Prof. Rajesh Kumar, Department of Humanities and Social Sciences, IIT Madras. For more details on NPTEL visit http://nptel.ac.in
From playlist IIT Madras: Introduction to Modern Linguistics | CosmoLearning.org English Language
Stanford CS224N: NLP with Deep Learning | Winter 2021 | Lecture 1 - Intro & Word Vectors
For more information about Stanford's Artificial Intelligence professional and graduate programs visit: https://stanford.io/3w46jar
This lecture covers:
1. The course (10 min)
2. Human language and word meaning (15 min)
3. Word2vec algorithm introduction (15 min)
4. Word2vec objective func
From playlist Stanford CS224N: Natural Language Processing with Deep Learning | Winter 2021
Language - Lecture 6 - CS50's Introduction to Artificial Intelligence with Python 2020
00:00:00 - Introduction
00:00:15 - Language
00:04:55 - Syntax and Semantics
00:10:23 - Context-Free Grammar
00:20:35 - nltk
00:28:00 - n-grams
00:30:28 - Tokenization
00:38:00 - Markov Models
00:42:41 - Bag-of-Words Model
00:46:38 - Naive Bayes
01:09:18 - Information Retrieval
01:12:06 - t
From playlist CS50's Introduction to Artificial Intelligence with Python 2020
Word Embedding and Word2Vec, Clearly Explained!!!
Words are great, but if we want to use them as input to a neural network, we have to convert them to numbers. One of the most popular methods for assigning numbers to words is to use a neural network to create Word Embeddings. In this StatQuest, we go through the steps required to create Word Embeddings.
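To make the words-to-numbers idea concrete (this is not the video's own code), a toy lookup table standing in for learned word2vec embeddings, with cosine similarity for comparing the resulting vectors:

```python
import math
import random

# Toy embedding table: each vocabulary word maps to a dense vector of floats.
# Real word2vec *learns* these vectors from context windows; here they are
# random placeholders that only illustrate the lookup-and-compare mechanics.
random.seed(0)
vocab = ["king", "queen", "man", "woman"]
dim = 4
embeddings = {w: [random.gauss(0, 1) for _ in range(dim)] for w in vocab}

def cosine(u, v):
    """Cosine similarity, the usual way embedding vectors are compared."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(embeddings["king"], embeddings["queen"]))  # some value in [-1, 1]
```

With trained vectors, semantically related words end up with high cosine similarity; with these random placeholders the number is meaningless, which is exactly why the training step matters.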
From playlist StatQuest
The Story of Chinese Character :覺
覺 is composed of a picture of practising divination together with 見 (to look). Since 覺 originated from 爻 and 見, the pronunciations of 覺 and 爻 are similar. Divination is abstract and not easy to understand; learners would not grasp it from narration alone, but once they have seen the demonstration
From playlist The Story of HanZi (Chinese Characters)
PSY 523 Word Recognition Part 2
Lecturer: Dr. Erin M. Buchanan, Missouri State University, Summer/Fall 2016. PSY 523 Psychology and Language lectures cover material from Harley's The Psychology of Language: From Data to Theory. Lecture materials and assignments are available at statisticsofdoom.com.
From playlist PSY 523 Psychology and Language