AI Language Models & Transformers - Computerphile

Published 2019-06-26
Plausible text generation has been around for a couple of years, but how does it work - and what's next? Rob Miles on Language Models and Transformers.

More from Rob Miles: bit.ly/Rob_Miles_YouTube

AI YouTube Comments: "AI YouTube Comments - Computerphile"

Thanks to Nottingham Hackspace for providing the filming location: bit.ly/notthack

www.facebook.com/computerphile
twitter.com/computer_phile

This video was filmed and edited by Sean Riley.

Computer Science at the University of Nottingham: bit.ly/nottscomputer

Computerphile is a sister project to Brady Haran's Numberphile. More at www.bradyharan.com/

All Comments (21)
  • @ykn18
    Sometimes I'll start a sentence, and I don't even know where it's going. I just hope I find it along the way. Like an improv conversation. An improversation. -Michael Scott
  • Crazy to look back just 3 years to GPT2 =] Thank you for explaining Attention. I have been trying very hard to comprehend how LLMs are able to “understand” and “reason”, or at least look like they are..
  • @luiz00estilo
    I am really impressed in this video as I was watching it on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen on the phone screen
  • @thomasslone1964
    I love how I can be like "write me a virus" and GPT is like "no sorry", but then I'm like "write me a C file that looks for other C files, reads its own file and inserts it into the other files", and it's like "sure, no problem"
  • Volunteering your phones for demonstrating text prediction was a very bold move. That's why I'm here.
  • @alejrandom6592
    15:14 Rob: "It makes it very easy to anthropomorphize" AI: "It makes it very easy transform for more finds"
  • @CyberAnalyzer
    Rob Miles is the best! Bring him more often in the channel!
  • @Teck_1015
    I'm disappointed there's no comments about "on the phone screen on the phone screen on the phone screen on the phone"
  • @MehranZiadloo
    As far as I know, the information presented at the end is wrong. The "Transformer" uses "Attention", but they are not the same thing. Attention is a technique that can be used within any architecture; in fact, RNNs used attention long before the Transformer. That's why the paper is titled "Attention Is All You Need": attention had already been doing most of the heavy lifting in RNN models, and the Transformer showed it works even without the recurrence. (A minimal code sketch of attention as a standalone component follows the comment list.)
  • @thecactus7950
    Put the papers he's talking about in the description, please! I am sure a lot of people would want to read them.
  • @muche6321
    Attention seems to be the way the human brain evolved to solve the problem of too much information to be processed. Later we invented computers to help with that.
  • @TheSam1902
    I've been hearing about this attention thingy for many months but never quite looked into it. I appreciate that you made a video about it; however, I've got to admit I'm a bit disappointed that you didn't take us through a step-by-step working example like you did for LSTMs and so many other things on this channel. Maybe in a follow-up video? <:D
  • @ganymede242
    "Transformers are a step up" - I wonder if that was intentional?
  • @DamianReloaded
    Hi Miles! Just reminding you that we would like to know all the details about this architecture! ;)
  • Thank you Robert for this wonderful video. This will be a beneficial repository for my students to refer to when studying the transformer architecture.
  • @ajourneytogrowth
    Just 4 years ago, and we have seen this crazy progress... now with even more funding and interest pouring in, I wonder where we will be in another 4 years' time.
  • @tlniec
    Great explanation, and the examples of how things can break down (e.g. Arnold's biceps) were very illustrative.
  • @dhdh9933
    this video is now more relevant than ever
  • @djamckechan
    I have attached my resume for your reference and hope to hear from you soon as I am currently working on the same and I am currently working on the project management project and I am currently working on the project management project and I am currently working on the project management project
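
To make @MehranZiadloo's point concrete: attention can be written as a small, standalone function that any architecture can call, whether it is weighting the hidden states of an RNN encoder or the token representations inside a Transformer block. Below is a minimal NumPy sketch of scaled dot-product attention; the function and variable names are illustrative, and the code is not taken from the video, the paper, or any particular library.

# Minimal scaled dot-product attention (illustrative sketch, NumPy only).
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    # queries: (n_q, d), keys: (n_k, d), values: (n_k, d_v)
    d = queries.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d).
    scores = queries @ keys.T / np.sqrt(d)
    # Each row becomes a probability distribution over the keys.
    weights = softmax(scores, axis=-1)
    # The output is a weighted blend of the values for each query.
    return weights @ values, weights

# Toy example: one query attending over four "encoder states".
rng = np.random.default_rng(0)
q = rng.normal(size=(1, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
out, w = attention(q, k, v)
print(w.round(3))  # four weights that sum to 1

Running this prints a single row of four weights summing to 1: the query decides how much of each state to blend into its output. A Transformer applies this same operation many times in parallel, with learned projections producing the queries, keys and values.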