Leo Isikdogan
I make concise educational videos on artificial intelligence, deep learning, image and video processing, computer vision, computational photography, and various topics in computer science.

Playlists
📌 Deep Learning Crash Course
📌 Image and Video Processing
📌 Hands-on Deep Learning: TensorFlow Coding Sessions
📌 Computer Science in 5 Minutes
📌 Machine Learning in Finance
📌 Cognitive Science Inspirations

Subscribe for more videos!

This is my personal YouTube channel. Opinions are my own and do not express the opinions of my current or former employers, including Netflix, Apple, Intel, and Motorola.
How Diffusion Models Work
9:17
1 year ago
Image Filters Explained
8:57
3 years ago
Neural Networks for Babies
2:32
3 years ago
Can AI Create Original Art?
8:28
3 years ago
Reward Hacking in AI
5:58
4 years ago
Perceptual Fusion
4:43
4 years ago
Optical Illusions Explained
7:45
4 years ago
How Super Resolution Works
9:29
4 years ago
Computer Science Electives
5:11
5 years ago
Computer Science Curriculum
4:21
5 years ago
Comments
@newbie8051 21 hours ago
5:09 I'll remember this forever, was even asked in our midsem exam hahaah
@ojaspatil2094 2 days ago
Rarely do I stop taking notes and actually just watch the video because of how interesting it is. Very well done, thank you!
@BenStoneking 15 days ago
Excellent explanation! Thanks for uploading! :)
@Floofskan 17 days ago
I don't think that I'm cut out for this...
@Danielle-ew1el 1 month ago
every video you upload is a gem, filled with wisdom and fun!
@LD-dt1sk 1 month ago
He does not sound like he looks.
@mm-ro7th 1 month ago
thank you for your video
@TheTrainMaster15 2 months ago
I just want a picture of a gotdang hotdog
@newbie8051 3 months ago
Read about this in my Image Processing class, but tbh we didn't have much time or clarity on this topic. Thanks!
@Monkeymario. 3 months ago
i wonder what's the point of image compression... oh wait, people usually store a lot of images on their devices, and people usually take pictures on phones, which have limited storage
@jeffsad8391 3 months ago
Question: What kind of math should I know?
@avramukk 3 months ago
good job
@Opinionman2 3 months ago
Awesome explanation dude.
@cityandcountryit4289 3 months ago
Ok, it's now 2024, where is this? we need it for Teams/Zoom calls etc.
@davidsling 4 months ago
thanks broo!!
@max2009 4 months ago
bruv sounds like a woman lmao
@ShrirangKanade 4 months ago
LOVED THE EXPLANATION
@thatdude3685 4 months ago
i came from the mkbhd video, a little late i know, but fuck it, i'm in!
@kushalgalipally3510 4 months ago
Amazing quick explanation. Thanks for the video!
@d.bebfjcgfdfijhj 4 months ago
I've suspected for a while that the best way to eliminate any lingering vestiges of an accent, should you wish to do so, is probably to record your own voice and listen to yourself from a 3rd-person perspective, which this video seems to confirm. Interestingly, I found that as you (partly/mostly) lose your original accent, you also lose the ability to speak authentically/consistently with that (former) accent. Meaning, you can't fake it any more than your current accent permits. In fact, to "regain authenticity," you would probably have to record your own voice and compare it to a recording of someone who's still accustomed to speaking the language that forms the basis of the accent on a daily basis. Of course the same is ultimately true for dialects as well.
@user-maymay2002 4 months ago
damn, one of the best SRGAN & ECRGAN explanation vids out there! It was straight to the point and the way of explaining was flawless. Thanks!
@rafaellara9264 4 months ago
Beautiful video
@Alzter0 5 months ago
At 5:40 (Model Distillation), why does the new model to be trained on the new, specific task receive as an input the same large dataset as the large pre-trained model? I thought that only the mainstream model should receive the large dataset as an input.
@shawn22459 5 months ago
Actually, 1920x1080i (interlaced) is 1.5 Gb/s and 1920x1080p (progressive scan) is 3 Gb/s. This is raw uncompressed video.
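For context, here is a rough back-of-the-envelope check of those raw data rates. This is a minimal sketch, not taken from the comment or the video: the 10-bit 4:2:2 sampling and the 30/60 frames-per-second figures are assumptions.

```python
# Rough sanity check of uncompressed HD video data rates.
# Assumptions: 10-bit 4:2:2 sampling (~20 bits per pixel on average),
# 1080i carries 30 full frames/s, 1080p carries 60 frames/s.

def raw_rate_gbps(width, height, frames_per_sec, bits_per_pixel=20):
    """Active-picture data rate in gigabits per second."""
    return width * height * frames_per_sec * bits_per_pixel / 1e9

print(f"1920x1080i (30 frames/s): {raw_rate_gbps(1920, 1080, 30):.2f} Gb/s")  # ~1.24
print(f"1920x1080p (60 frames/s): {raw_rate_gbps(1920, 1080, 60):.2f} Gb/s")  # ~2.49
# The 1.485 Gb/s (HD-SDI) and 2.97 Gb/s (3G-SDI) link rates often quoted for
# these formats add blanking intervals on top of this active-picture payload.
```

Either way, the takeaway is the same: raw HD video runs to gigabits per second, which is why compression is essential.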
@busterfranken9105 5 months ago
Very interesting and well-explained, love your content
@fzycold8652 5 months ago
As a prospective engineer interested in numerical analysis and mathematical modeling, this was a very interesting presentation, thanks! :)
@DevilNeverKnows 6 months ago
I'm debating burning 200GB of footage to a 100GB disc by compressing it all but I don't know if 50% is too low
@user-uc8nn9kf8l 6 months ago
You have explained this so well. I have seen so many videos, but the way you explain from start to end is very relevant to what we are learning. Very, very good explanation of the stable diffusion workflow.
@Dan-hw9iu 6 months ago
Surprisingly, you have it backwards: LLMs always _extrapolate_ and never _interpolate*._ Essentially due to the curse of dimensionality. Even in compressed latent spaces, neighbors are simply light-years away from each other. There is no convex hull boundary around the training distribution from which the model can sample/interpolate.

These things are the real deal. Turns out that Sutskever was right -- if you train on a sufficiently general problem with enough compute/parameters/data, then reasoning pops out. It's obvious in retrospect. If brute force works well enough for mother nature, why not for us? It's maybe a disappointingly prosaic, inefficient, or intellectually humbling discovery; but reality has a long history of popping humanity's ego bubble. Turns out that humanity's intellect is no more special than our place within the solar system. So it goes. Bring on the autoregressive revolution.

The skeptic's position will slowly erode away as their arbitrary goal posts are continually scored against and consequently moved further back. AI masters the "impossible" game Go. Then language, textual knowledge. Produces incredible audio, and understands yours. Understands images and produces remarkable works of its own. AI reaches modality after modality, with continuous improvements. As I write this, transformers are revolutionizing robotics. Surely once an AI fetches a glass of water for a skeptic, they'll _finally_ have an epiphany...right? An alien mind can only be intelligent if it supports the exact same modalities as us, right? And only makes mistakes like humans, right? And masters _every_ human skill at _only_ prodigy level or above, right? And reciting past material implies a good memory for humans, but merely being an autocorrect for LLMs, right? And when machines imagine something we enjoy, it's imitating creativity, but if it's something we dislike, it's hallucinating. And...

..Ah, let myself go a little bit, didn't I? I'll stop ranting at vague critics. 😬 Can I make one last observation though? About job anxiety: you know who's not jazzed about AI taking out grunt work? The grunts. They're rarely emphasized, but universally present, collateral damage from technological disruption. The net good of revolutions contains the gross misery of displaced millions. I worry about them, and their chance of adapting to our neat new tools. I'm not sure a laid off career truck driver will be so chipper struggling to understand what these tools are, let alone using them to pay his family's mortgage.

I apologize for the negativity. It just felt like you were cheerfully waving away justified anxiety for what is the heaviest, most profound shift in human history. We're _commoditizing an on-demand supply of general intelligence._ We aren't just adding handy buttons to Photoshop or whatever. That's like watching the electrification of America and pointing to the potential of brighter lamps. What we're going through requires gravitas, deep empathy, and steadfast mindfulness. It's the only way we'll make it through this together. Buckle up.

* See _Learning in High Dimension Always Amounts to Extrapolation_ (LeCun et al., 2021)
@Zoecatsplants 6 months ago
Awesome content! Thank you!
@kutay8421 6 months ago
Podcast creation is also sth AI can't achieve yet 😂
@nunoalexandre6408 6 months ago
Markets are driven by faith only...
@nunoalexandre6408 6 months ago
Love it!!!!!!!!!!!!!!!
@lambda653 7 months ago
Kinda crazy to think that just 3 years ago the idea of an AI making art was a niche and curious idea. Now look at where we are. Mind-boggling. I remember a time when OpenAI's GPT technology was practically a toy used for auto-generated text adventure games, and now it has reshaped our entire civilization. Everyone has heard of ChatGPT; even my grandmother knows it.
@user-di4vl2lu8b 7 months ago
how interesting! thanks for your explanation :) I clicked the 'good' bb
@aloglute 7 months ago
You explained it really well; honestly, I'm sad I didn't watch it earlier ❤
@ShuZhang-tx3vb 7 months ago
Good stuff!
@user-lw6jq3fe7x 7 months ago
Great video! A great start for the podcast, and I wish you the best for future episodes!
@leoisikdogan 7 months ago
Thanks!
@salmanmaghsoudi9127 7 months ago
Add Persian subtitles too, damn it. What's your problem with us? You put the crappiest languages in the world in all your programs, but you won't add Persian.
@leoisikdogan 8 months ago
Starting the New Year with something new! I've just launched my own podcast: Cognitive Creations! Episodes cover cognitive science, computer science, AI, and more. Join me on this journey into the science and art of the human mind. Spotify: podcasters.spotify.com/pod/show/cognitive-creations/
@mohammadatif1614 8 months ago
You are so great. My professor only teaches the compression techniques but doesn't even explain when, how, and why they are used. Thanks to you, I understand this now.
@ananthakrishnank3208 8 months ago
Thanks, Leo, for compressing it all within 5 minutes. :)
@Ksensei41 8 months ago
I like that you put the names of all the processes involved, so one can easily know where to look further.
@TheEngineerpodcast 9 months ago
5:50
@AbhinavanandSingh 9 months ago
YouTube should crown this human!
@deleted-something 9 months ago
Lol
@jcorey333 9 months ago
This is a really great video and a great breakdown of these ideas! Thank you!
@nirshadnijam2291 9 months ago
Massive respect to the pioneers behind this technology. It is insane how humans are capable of thinking like this and implementing it on a computer. Crazy!
@NasserTabook 9 months ago
Thank you for the simple and informative step-by-step explanation. The whole subject was simple and very clear. 👏👏👏
@trueberryless 9 months ago
Wow, that's well explained. The only thing I think is (kinda) wrong: it should not be megabytes (MB) but mebibytes (MiB). Megabytes are base 10, which means 12 MB = 12,000,000 bytes, while mebibytes are base 2, which means 12 MiB = 12,582,912 bytes. Another fix would be to say that the original image is 4,096x3,072 pixels... However, this mistake is in almost every piece of literature, so it doesn't actually matter and, moreover, it's not the point of the video... ❤
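For reference, a quick sketch of the decimal vs. binary prefixes the comment contrasts. This is just an illustrative check; the 1-byte-per-pixel figure is an assumption, not something stated in the comment or the video.

```python
# Decimal (SI) vs. binary (IEC) prefixes.
MB = 10**6    # megabyte
MiB = 2**20   # mebibyte

print(f"12 MB  = {12 * MB:,} bytes")    # 12,000,000
print(f"12 MiB = {12 * MiB:,} bytes")   # 12,582,912

# 4096 x 3072 pixels is exactly 12 MiB at 1 byte per pixel:
print(f"4096 * 3072 = {4096 * 3072:,} pixels")  # 12,582,912
```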