AI/ML+Physics Part 4: Crafting a Loss Function [Physics Informed Machine Learning]

Published 2024-03-22
This video discusses the fourth stage of the machine learning process: (4) designing a loss function to assess the performance of the model. There are opportunities to incorporate physics into this stage of the process, such as adding regularization terms to promote sparsity or extra loss functions to ensure that a partial differential equation is satisfied, as in PINNs.
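As a rough illustration of the idea above (not code from the video), the sketch below combines a data-fit loss, a PINN-style PDE-residual loss (using the 1D viscous Burgers equation, u_t + u u_x = nu u_xx, as a stand-in PDE), and an L1 penalty to promote sparsity. The network architecture, the viscosity nu, and the weights lambda_pde and lambda_l1 are all illustrative assumptions.

```python
# Minimal sketch of a physics-informed loss (illustrative only):
#   data mismatch + PDE residual + L1 regularization.
import torch
import torch.nn as nn

# Small fully connected network mapping (x, t) -> u; architecture is an assumption.
net = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
nu = 0.01                         # assumed viscosity
lambda_pde, lambda_l1 = 1.0, 1e-4  # assumed loss weights

def physics_informed_loss(x_data, t_data, u_data, x_col, t_col):
    # (1) Supervised loss on measured velocities (tensors of shape [N, 1]).
    u_pred = net(torch.cat([x_data, t_data], dim=1))
    loss_data = torch.mean((u_pred - u_data) ** 2)

    # (2) PDE residual at collocation points, via automatic differentiation.
    x = x_col.clone().requires_grad_(True)
    t = t_col.clone().requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    residual = u_t + u * u_x - nu * u_xx
    loss_pde = torch.mean(residual ** 2)

    # (3) L1 penalty on the network weights to promote sparsity.
    loss_l1 = sum(p.abs().sum() for p in net.parameters())

    return loss_data + lambda_pde * loss_pde + lambda_l1 * loss_l1
```

In practice the relative weights on the PDE and sparsity terms have to be tuned; how they are balanced is one of the design choices discussed in the video.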

This video was produced at the University of Washington, and we acknowledge funding support from the Boeing Company.

%%% CHAPTERS %%%
00:00 Intro
00:55 Case Study: Fluid Velocity & Navier-Stokes
05:56 Case Study: Incompressible Flows & Poisson
07:46 Case Study: Lagrangian Neural Networks & Euler-Lagrange
09:38 Sparse Loss and the L1 Norm
12:51 Case Study: SINDy + AutoEncoder
15:41 SINDy and Loss Regularization
17:59 Parsimonious Modeling
20:16 Equivariant Loss
21:59 Outro

All comments (21)
  • Dear Brunton, your classes are wonderful. The didactics are perfect, and you convey complex information in a simple and practical way. Congratulations!
  • @pgrudzien1221
    Thank you Steve, I was waiting to see that video. I'm excited for the series, great work
  • That's gold. I can never wait for the next video; I wish I could jump forward into the future and just watch them all at once!
  • Thank you, Prof. I have seen the videos you cite here, and this new video was an effective review that helped me understand them. Your lectures are parsimonious! I promise only people with a mechanical engineering background have the ability and tendency to understand and describe complex math parsimoniously ;)
  • Thank you for your excellent planning and knowledge base. You are a learned machine!
  • @brady1123
    Another great video. It would be interesting to do a video about Physics Informed ML in the context of Sutton's Bitter Lesson, since this seems to be a case where adding extra knowledge into the architecture/loss of the network actually beats out more generalist approaches. This is probably due to the lack of training data in physics/engineering domains, but maybe building in physics knowledge helps in the large data regime as well.
  • @grapix1184
    Great video, this is the good stuff! Can't wait for the next one!
  • @drskelebone
    I guess the fluid dynamics example actually does have a case where the Lagrangian constraint could result in observable errors between model and reality (where training is done from t0 to t1, and then the fit is evaluated from t1 to tf).
  • Hi Steve. I was recently diving into ML and AI applications for CFD, but I realized it's better to focus on robust solvers. I don't know much about this, but in another video you show the adaptive wavelet method for CFD, and it seems to handle a whole range of CFD problems, whereas ML can only tackle a specific problem. Am I right? Btw, I am a huge fan of your videos. Thanks.
  • @BreakingMathPod
    Are there loss functions or optimization schemes that can switch tactics (or be swapped out completely) depending on what stage of training a machine learning model is at? I was studying the transformer/self-attention architecture, and it made me curious whether "self-attention" could be applied specifically to the loss or the optimizer, either to tighten the focus on a more specific goal or to broaden it depending on where the training is. Does that make sense? This is the "multiple loss or optimization functions activated at different times by different triggers" approach. A self-attention method for deciding when to tune, modify, or replace an optimizing function would be spectacular to demonstrate! Self-attention is the key architecture used in OpenAI's Sora, ChatGPT 4 (and 3 and 2), and many other successful machine learning tools. Also, an open question: in what ways can LLMs or image classifiers/generators be utilized in physics-informed machine learning? Could Sora figure out some physics on its own simply by studying video footage, together with the ability to identify, label, and categorize objects in a video *and* learn and generalize how those objects change over time (like objects falling)? I know a lot depends on the training data (watching leaves fall vs. watching rocks fall...). That's my big question!!
  • @datagigs5478
    Dear Brunton, your teaching is excellent. My learning path is book-oriented. Could you recommend a book or paper where I can learn physics-informed neural networks, so that I can apply them in my nuclear engineering field, e.g., reactor design parameters, radiation transport, neutron transport, etc.?
  • @khawar0o7
    Please make this video listed; I couldn't find it on your channel directly.