Zipf's Law 

Martin Hilpert
33K subscribers
9K views

Published: 29 Sep 2024

Comments: 29
@shamsuddeenhassanmuhammad2143 3 years ago
Excellent video. You teach excellently; your students must be happy with you.
@MartinHilpert 3 years ago
Thank you for your kind words. I teach linguistics to my students, but my students taught me how to do that, if that makes any sense. ;)
@languagetv4756 2 years ago
thanks a lot
@ChildSarcophagus 2 years ago
This blew my mind.
@LeonaDarkwind 1 year ago
I LOVE that you've linked to Michael Stevens' video. I'm playing around with predictive language models and I'm really happy you're talking about WORD TOKENS in this video!
@TheRealGnolti 3 years ago
Martin, Zipf's law makes me wonder about the value of MI scores. Not that they aren't meaningful, but when you review collocation results for a word, MI seems to have nothing to do with absolute frequency; it's just mutual attraction continuing to exert its pull regardless of frequency. Collocation is a function of context, and it's the frequency of contexts that varies, analogous to the way certain climatic circumstances can promote the health of, say, vegetation and insects. Plug "miserable" into COCA and you get "creature" at rank 15 with an MI of 7.38, after a long line of MIs in the 3.0 range, because "miserable creature" is a construction that occurs on certain rhetorical occasions. Am I overthinking this?
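For context on the comment above: the MI score reported by tools like COCA is a pointwise mutual information measure, the log (base 2) of the observed co-occurrence frequency over the frequency expected if the two words were independent. A minimal sketch with made-up counts (not COCA data):

```python
import math

def mi_score(cooc, f_node, f_coll, corpus_size):
    """MI as used in collocation analysis: log2(observed / expected),
    where expected = f_node * f_coll / corpus_size."""
    expected = f_node * f_coll / corpus_size
    return math.log2(cooc / expected)

# Made-up counts: a pair can be rare in absolute terms and still
# score a high MI, because MI normalizes by both words' frequencies.
print(round(mi_score(cooc=20, f_node=5_000, f_coll=1_000,
                     corpus_size=100_000_000), 2))  # 8.64
```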
@yingyusu3529 4 years ago
Hello :) I watched your Abralin talk live on Wednesday. I study generative syntax, and I was very inspired by your discussion of negative evidence in the Q&A session! Thank you for all the wonderful videos!
@MartinHilpert 4 years ago
Thanks a lot, Yingyu Su, that's very kind of you to say!
@Temerold_se 3 years ago
But what if you make a language with "aaa" before every word? Does Zipf's law apply then?
@MartinHilpert 3 years ago
Mathematically, adding "aaa" to each word does not change the distribution. In the real world, languages like that don't exist, though. Speakers would be too lazy to pronounce extra vowels that don't mean anything, and so some of the "aaa"s would disappear very soon.
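A quick way to check the mathematical point: prefixing every word maps each word type one-to-one onto a new type, so the frequency counts, and hence the rank-frequency distribution, are unchanged. A minimal sketch with a toy text:

```python
from collections import Counter

text = "the cat sat on the mat and the dog sat on the rug".split()
prefixed = ["aaa" + w for w in text]

# Only the labels of the word types change; each type keeps its
# count, so the sorted rank-frequency profile is identical.
original = sorted(Counter(text).values(), reverse=True)
modified = sorted(Counter(prefixed).values(), reverse=True)
print(original)              # [4, 2, 2, 1, 1, 1, 1, 1]
print(original == modified)  # True
```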
@Temerold_se 3 years ago
@MartinHilpert Ehm, OK, but there's this Asian language where they say something like "Praise God" before every sentence. Also, real language or not, how does it apply?
@Temerold_se 3 years ago
@MartinHilpert Btw, how does it not change the distribution? Take an existing text and add "aaa" to the beginning of each word; it wouldn't work, right?
@thinaradesilva9351 2 years ago
I'm doing a project on this same thing; would there be any chance for me to get in contact with you for a possible interview? Awesome video, by the way.
@zerobit778 3 years ago
Great professor
@carolynknight4233 4 years ago
Hi, thank you for your wonderful videos. Does this law hold true for words uttered or written by non-native speakers of a language? Or uttered by children before having mastered the language?
@MartinHilpert 4 years ago
Hey Carolyn! Both L2 language and child language in first language acquisition show Zipfian distributions.
Here is an interesting lecture by Nick C. Ellis on Zipf and L2 language use: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-7cKaX57tEXc.html
Better video & audio, similar content: www.uttv.ee/naita?id=25911
Here is a study about Zipf and child language: journals.plos.org/plosone/article?id=10.1371/journal.pone.0053227
@carolynknight4233 4 years ago
@MartinHilpert Thank you so much, Dr. Hilpert! I'm very excited about learning more about this, and I always look forward to your videos 🙂
@MartinHilpert 4 years ago
@carolynknight4233 Thank you, Carolyn!
@cidiladamourasemedo1805 2 years ago
Hi Martin, thank you for the wonderful and very helpful video. I am applying Zipf's law to a task of building a dictionary of words that are specific to a particular category. However, I wonder if I could use the curve to determine a threshold for the most significant words for the dictionary? For instance, use the intercept to determine this?
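One way to make the idea in the question above concrete (an illustration, not the video author's recommendation): fit a line to log frequency against log rank, which yields the Zipf slope and intercept, and cut the word list where the observed curve drops below the fitted line. A hypothetical sketch, assuming a frequency list is already in hand:

```python
import numpy as np

# Hypothetical descending frequency list for the category's words.
freqs = np.array([1000, 480, 330, 240, 150, 90, 40, 12, 5, 2])
ranks = np.arange(1, len(freqs) + 1)

# Least-squares fit in log-log space: log f ≈ slope * log r + intercept.
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
predicted = np.exp(slope * np.log(ranks) + intercept)

# One possible cutoff: keep words whose observed frequency stays
# at or above the Zipf line fitted for their rank.
n_keep = int(np.sum(freqs >= predicted))
print(slope, intercept, n_keep)
```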
@BassmanTh 3 years ago
Thanks for that extensive video! It added great value to my master's thesis. Even though I'm dealing with distributions in geographical data, it was a great and easy way to understand Zipf's law.
@Mustafghan 2 years ago
Never seen the normal distribution explained so clearly and in such an easy-to-understand way.
@Melnish 3 years ago
Thank you for the video :D I'm trying to download the newest version of AntConc on Mac, but it can't be opened because "Apple cannot check it for malicious software." Also, when I force it open, it doesn't show a way to open files. I was wondering, are there any ways to fix those problems?
@MartinHilpert 3 years ago
It's hard to diagnose these issues from afar, but Laurence Anthony has a great series of tutorials on his webpage: www.laurenceanthony.net/software/antconc/ Good luck!
@coffecoding 2 years ago
This is great. You teach so clearly.
@daquarlow 3 years ago
A mathematician's paradise right here.
@duck2608 4 years ago
Hi, thank you! I will follow all of your videos.
@Pakanahymni 4 years ago
Have you ever tried plotting the product "position × n"? It would be interesting to see how much it varies. (If it was in the video, I missed it.)
@MartinHilpert 4 years ago
Hi Järvi! The common way of visualizing Zipf's Law is the scatterplot of rank and frequency with logged axes. I adopted that format in order to match up with other explanations that are out there.
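To reproduce that standard visualization, a minimal sketch (any tokenized corpus would do; the short string below just stands in for one):

```python
from collections import Counter
import matplotlib.pyplot as plt

tokens = "the of and the to in the a of the and of".split()
freqs = sorted(Counter(tokens).values(), reverse=True)
ranks = range(1, len(freqs) + 1)

# Zipf's law predicts a roughly straight, downward-sloping line
# once both axes are logged.
plt.scatter(ranks, freqs)
plt.xscale("log")
plt.yscale("log")
plt.xlabel("rank (log scale)")
plt.ylabel("frequency (log scale)")
plt.show()
```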
@topsiterings 3 years ago
awesome!