What is Optimization? + Learning Gradient Descent | Two Minute Papers #82

  • Duration: 5:31
  • Updated: 29 Jul 2016
  • views: 8001
Let's talk about what mathematical optimization is, how gradient descent can solve simpler optimization problems, and Google DeepMind's proposed algorithm that automatically learns optimization algorithms. The paper "Learning to learn by gradient descent by gradient descent" is available here: http://arxiv.org/pdf/1606.04474v1.pdf Source code: https://github.com/deepmind/learning-to-learn ______________________________ Recommended for you: Gradients, Poisson's Equation and Light Transport - https://www.youtube.com/watch?v=sSnDTPjfBYU WE WOULD LIKE TO THANK OUR GENEROUS PATREON SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE: David Jaenisch, Sunil Kim, Julian Josephs, Daniel John Benton. https://www.patreon.com/TwoMinutePapers We also thank Experiment for sponsoring our series. - https://experiment.com/ Subscribe if you would like to see more of these! - http://www.youtube.com/subscription_center?add_user=keeroyz The chihuahua vs muffin image is courtesy of teenybiscuit - https://twitter.com/teenybiscuit More fun stuff here: http://twistedsifter.com/2016/03/puppy-or-bagel-meme-gallery/ The thumbnail background image was created by Alan Levine - https://flic.kr/p/vbEd1W Splash screen/thumbnail design: Felícia Fehér - http://felicia.hu Károly Zsolnai-Fehér's links: Facebook → https://www.facebook.com/TwoMinutePapers/ Twitter → https://twitter.com/karoly_zsolnai Web → https://cg.tuwien.ac.at/~zsolnai/
https://wn.com/What_Is_Optimization_Learning_Gradient_Descent_|_Two_Minute_Papers_82
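The description above mentions how gradient descent solves simpler optimization problems. As a minimal, hypothetical illustration (plain gradient descent on a made-up quadratic, not the paper's learned optimizer; the function and learning rate are my own choices):

```python
# Minimal gradient descent sketch on a simple convex function.
# f(x, y) = (x - 3)^2 + (y + 1)^2 has its minimum at (3, -1).

def grad(x, y):
    """Analytic gradient of f at (x, y)."""
    return 2 * (x - 3), 2 * (y + 1)

x, y = 0.0, 0.0          # arbitrary starting point
learning_rate = 0.1      # step size, hand-picked for this example

for step in range(100):
    gx, gy = grad(x, y)
    x -= learning_rate * gx   # move against the gradient
    y -= learning_rate * gy

print(f"approximate minimum at ({x:.4f}, {y:.4f})")  # ~ (3.0000, -1.0000)
```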
How To Program For Beginners | Episode 1: Algorithms

  • Duration: 24:10
  • Updated: 30 May 2016
  • views: 962
This is the start of a new series, and I hope to teach you guys all the tricks and tips you need to become a successful programmer! If you're interested in more videos, and you want to continue to get better at programming, please subscribe for all future episodes!
https://wn.com/How_To_Program_For_Beginners_|_Episode_1_Algorithms
Algorithms: Graph Search, DFS and BFS

  • Duration: 11:49
  • Updated: 27 Sep 2016
  • views: 145271
Learn the basics of graph search and two common operations: Depth First Search (DFS) and Breadth First Search (BFS). This video is part of HackerRank's Cracking The Coding Interview Tutorial with Gayle Laakmann McDowell. http://www.hackerrank.com/domains/tutorials/cracking-the-coding-interview?utm_source=video&utm_medium=youtube&utm_campaign=ctci
https://wn.com/Algorithms_Graph_Search,_Dfs_And_Bfs
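As a compact companion to the tutorial above, here is one common way to write BFS and DFS over an adjacency-list graph in Python. The graph itself is made up for illustration, and HackerRank's own reference solutions may differ:

```python
from collections import deque

graph = {              # small example adjacency list (invented)
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["E"],
    "E": [],
}

def bfs(start):
    """Breadth First Search: visit nodes level by level using a queue."""
    visited, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in graph[node]:
            if nbr not in visited:
                visited.add(nbr)
                queue.append(nbr)
    return order

def dfs(start, visited=None):
    """Depth First Search: follow one branch as deep as possible, then backtrack."""
    if visited is None:
        visited = set()
    visited.add(start)
    order = [start]
    for nbr in graph[start]:
        if nbr not in visited:
            order.extend(dfs(nbr, visited))
    return order

print(bfs("A"))  # ['A', 'B', 'C', 'D', 'E']
print(dfs("A"))  # ['A', 'B', 'D', 'E', 'C']
```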
R11. Principles of Algorithm Design

  • Duration: 58:26
  • Updated: 14 Jan 2013
  • views: 27966
MIT 6.006 Introduction to Algorithms, Fall 2011 View the complete course: http://ocw.mit.edu/6-006F11 Instructor: Victor Costan License: Creative Commons BY-NC-SA More information at http://ocw.mit.edu/terms More courses at http://ocw.mit.edu
https://wn.com/R11._Principles_Of_Algorithm_Design
13. Classification

  • Duration: 49:54
  • Updated: 19 May 2017
  • views: 6881
MIT 6.0002 Introduction to Computational Thinking and Data Science, Fall 2016 View the complete course: http://ocw.mit.edu/6-0002F16 Instructor: John Guttag Prof. Guttag introduces supervised learning with nearest neighbor classification using feature scaling and decision trees. License: Creative Commons BY-NC-SA More information at http://ocw.mit.edu/terms More courses at http://ocw.mit.edu
https://wn.com/13._Classification
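The lecture summary above mentions nearest neighbor classification and feature scaling. A toy sketch of both ideas with invented data (my own example, not code from the course):

```python
# k-nearest-neighbor classification with min-max feature scaling.
# The data points and labels below are made up for illustration.

def minmax_scale(rows):
    """Rescale each feature column to the [0, 1] range."""
    cols = list(zip(*rows))
    lows = [min(c) for c in cols]
    spans = [max(c) - min(c) or 1.0 for c in cols]
    return [[(v - lo) / sp for v, lo, sp in zip(r, lows, spans)] for r in rows]

def knn_predict(train_x, train_y, query, k=3):
    """Label the query by majority vote among its k closest training points."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), y)
        for x, y in zip(train_x, train_y)
    )
    votes = [y for _, y in dists[:k]]
    return max(set(votes), key=votes.count)

# Features: (weight in kg, height in cm) -- very different scales,
# which is why scaling matters before measuring distances.
animals = [[4, 25], [5, 30], [300, 160], [250, 150], [3, 22], [280, 155]]
labels  = ["cat", "cat", "pony", "pony", "cat", "pony"]

scaled = minmax_scale(animals + [[6, 28]])   # scale training points and query together
train, query = scaled[:-1], scaled[-1]
print(knn_predict(train, labels, query))     # expected: "cat"
```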
Algorithm using Flowchart and Pseudo code Level 1 Flowchart

  • Duration: 5:41
  • Updated: 27 Aug 2013
  • views: 379932
Algorithm using Flowchart and Pseudo code Level 1 Flowchart
By: Yusuf Shakeel
http://www.dyclassroom.com/flowchart/introduction
0:05 Things we will learn
0:21 Level
0:28 Level 1 Flowchart
0:33 Important terms
0:37 Procedure
0:45 Algorithm
0:54 Flowchart
1:00 Pseudo code
1:08 Answer this simple question
1:14 How will you log into your facebook account
1:30 Next question
1:32 Write an algorithm to log into your facebook account
1:44 Algorithm to log in to facebook account in simple English
2:06 Writing Algorithm
2:14 Flowchart
2:16 There are 6 basic symbols that are commonly used in Flowchart
2:20 Terminal
2:27 Input/Output
2:35 Process
2:42 Decision
2:52 Connector
3:00 Control Flow
3:06 All the 6 symbols
3:13 Flowchart rules
3:25 Flowchart exercise
3:28 Add 10 and 20
4:00 Another exercise
4:03 Find the sum of 5 numbers
4:34 Another exercise
4:35 Print Hello World 10 times
5:06 Another exercise
5:07 Draw a flowchart to log in to facebook account
5:26 Note! End of Level 1
Related Videos
Algorithm Flowchart and Pseudo code
Level 1 Flowchart http://youtu.be/vOEN65nm4YU
Level 2 Important Programming Concepts http://youtu.be/kwA3M8YxNk4
Level 3 Pseudo code http://youtu.be/r1BpraNa2Zc
https://wn.com/Algorithm_Using_Flowchart_And_Pseudo_Code_Level_1_Flowchart
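The exercises listed in the video (add 10 and 20, find the sum of 5 numbers, print Hello World 10 times) map directly from flowchart boxes to code. A hypothetical rendering in Python, just to show the correspondence; the specific input numbers are my own:

```python
# The Level 1 flowchart exercises from the video, written as plain code.

# Exercise: add 10 and 20.
a, b = 10, 20
total = a + b
print(total)                      # 30

# Exercise: find the sum of 5 numbers.
numbers = [3, 7, 1, 9, 5]         # any five example inputs
running_sum = 0
for n in numbers:                 # the flowchart's process + decision loop
    running_sum += n
print(running_sum)                # 25

# Exercise: print Hello World 10 times.
for _ in range(10):
    print("Hello World")
```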
Brian Christian & Tom Griffiths: "Algorithms to Live By" | Talks at Google

  • Duration: 1:07:28
  • Updated: 12 May 2016
  • views: 47769
Practical, everyday advice which will easily provoke an interest in computer science. In a dazzlingly interdisciplinary work, acclaimed author Brian Christian and cognitive scientist Tom Griffiths show how the algorithms used by computers can also untangle very human questions. They explain how to have better hunches and when to leave things to chance, how to deal with overwhelming choices and how best to connect with others. From finding a spouse to finding a parking spot, from organizing one's inbox to understanding the workings of memory, Algorithms to Live By transforms the wisdom of computer science into strategies for human living. Brian Christian is the author of The Most Human Human, a Wall Street Journal bestseller, New York Times editors’ choice, and a New Yorker favorite book of the year. His writing has appeared in The New Yorker, The Atlantic, Wired, The Wall Street Journal, The Guardian, and The Paris Review, as well as in scientific journals such as Cognitive Science, and has been translated into eleven languages. He lives in San Francisco. Tom Griffiths is a professor of psychology and cognitive science at UC Berkeley, where he directs the Computational Cognitive Science Lab. He has published more than 150 scientific papers on topics ranging from cognitive psychology to cultural evolution, and has received awards from the National Science Foundation, the Sloan Foundation, the American Psychological Association, and the Psychonomic Society, among others. He lives in Berkeley. On behalf of Talks at Google this talk was hosted by Boris Debic. eBook https://play.google.com/store/books/details/Brian_Christian_Algorithms_to_Live_By?id=yvaLCgAAQBAJ
https://wn.com/Brian_Christian_Tom_Griffiths_Algorithms_To_Live_By_|_Talks_At_Google
Sorting in Python  ||  Learn Python Programming  (Computer Science)

  • Duration: 6:24
  • Updated: 08 Oct 2017
  • views: 14138
Sorting is a fundamental task in software engineering. In Python, there are a variety of ways to sort lists, tuples, and other objects. Today we talk about the sort() method, which is an in-place algorithm for sorting lists. We also cover the sorted() function, which can be used on more kinds of objects and creates a sorted copy, leaving the original object unchanged. We were able to make this Python video with the help of our Patrons on Patreon! We would like to recognize the generosity of our VIP Patrons Matt Peters, Andrew Mengede, Martin Stephens, and Markie Ward. Thank you so much for helping us continue our work!
➢➢➢➢➢➢➢➢➢➢
To help us continue making videos, you can support Socratica at:
Patreon: https://www.patreon.com/socratica
Socratica Paypal: https://www.paypal.me/socratica
We also accept Bitcoin! :) Our address is: 1EttYyGwJmpy9bLY2UcmEqMJuBfaZ1HdG9
Thank you!!
➢➢➢➢➢➢➢➢➢➢
If you’d like a reference book, we recommend “Python Cookbook, 3rd Edition” from O’Reilly: http://amzn.to/2sCNYlZ
The Mythical Man Month - Essays on Software Engineering & Project Management http://amzn.to/2tYdNeP
➢➢➢➢➢➢➢➢➢➢
You can also follow Socratica on:
- Twitter: @socratica
- Instagram: @SocraticaStudios
- Facebook: @SocraticaStudios
➢➢➢➢➢➢➢➢➢➢
Python instructor: Ulka Simone Mohanty (@ulkam on Twitter)
Written & Produced by Michael Harrison (@mlh496 on Twitter)
https://wn.com/Sorting_In_Python_||_Learn_Python_Programming_(Computer_Science)
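As the description says, list.sort() works in place while sorted() returns a new list and accepts more kinds of iterables. A quick demonstration (the examples are my own, not taken from the video):

```python
# list.sort() mutates the list in place and returns None.
scores = [42, 7, 19, 3]
scores.sort()
print(scores)                     # [3, 7, 19, 42]

# sorted() works on any iterable and returns a new sorted list,
# leaving the original object unchanged.
letters = ("d", "a", "c", "b")    # a tuple, which has no .sort() method
print(sorted(letters))            # ['a', 'b', 'c', 'd']
print(letters)                    # ('d', 'a', 'c', 'b')  -- untouched

# Both accept key= and reverse= for custom orderings.
words = ["pear", "fig", "banana"]
print(sorted(words, key=len, reverse=True))   # ['banana', 'pear', 'fig']
```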
Let’s Write a Decision Tree Classifier from Scratch - Machine Learning Recipes #8

  • Duration: 9:53
  • Updated: 13 Sep 2017
  • views: 43898
Hey everyone! Glad to be back! Decision Tree classifiers are intuitive, interpretable, and one of my favorite supervised learning algorithms. In this episode, I’ll walk you through writing a Decision Tree classifier from scratch, in pure Python. I’ll introduce concepts including Decision Tree Learning, Gini Impurity, and Information Gain. Then, we’ll code it all up. Understanding how to accomplish this was helpful to me when I studied Machine Learning for the first time, and I hope it will prove useful to you as well. You can find the code from this video here: https://goo.gl/UdZoNr https://goo.gl/ZpWYzt Books! Hands-On Machine Learning with Scikit-Learn and TensorFlow https://goo.gl/kM0anQ Follow Josh on Twitter: https://twitter.com/random_forests Check out more Machine Learning Recipes here: https://goo.gl/KewA03 Subscribe to the Google Developers channel: http://goo.gl/mQyv5L
https://wn.com/Let’S_Write_A_Decision_Tree_Classifier_From_Scratch_Machine_Learning_Recipes_8
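Two of the quantities the episode builds on, Gini impurity and information gain, are compact enough to sketch here. This is my own minimal version, not the code linked above:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: the chance of mislabeling a random item if we label it
    according to the label distribution of the set."""
    counts = Counter(labels)
    total = len(labels)
    return 1.0 - sum((n / total) ** 2 for n in counts.values())

def info_gain(parent, left, right):
    """Impurity of the parent minus the weighted impurity of the two children."""
    p = len(left) / len(parent)
    return gini(parent) - p * gini(left) - (1 - p) * gini(right)

parent = ["apple", "apple", "grape", "grape", "lemon"]
left   = ["apple", "apple"]            # a candidate split...
right  = ["grape", "grape", "lemon"]   # ...and its complement
print(round(gini(parent), 3))                        # 0.64
print(round(info_gain(parent, left, right), 3))      # 0.373
```

A decision tree learner simply tries many candidate splits and keeps the one with the highest information gain, recursing on each side.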
How Random Forest algorithm works

  • Duration: 5:47
  • Updated: 04 Apr 2014
  • views: 213763
In this video I explain very briefly how the Random Forest algorithm works, with a simple example composed of 4 decision trees.
https://wn.com/How_Random_Forest_Algorithm_Works
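The video describes a forest of 4 decision trees that each vote on a prediction. A rough sketch of that idea using scikit-learn trees on bootstrap samples (my own toy, which skips the per-split feature subsampling a real random forest also does; the iris data set is just a convenient stand-in and requires scikit-learn to be installed):

```python
from collections import Counter
import random

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

# A hand-rolled "forest" of 4 trees: train each tree on a bootstrap sample
# of the data, then combine their predictions with a majority vote.
X, y = load_iris(return_X_y=True)
rng = random.Random(0)

trees = []
for _ in range(4):
    idx = [rng.randrange(len(X)) for _ in range(len(X))]   # bootstrap sample
    trees.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))

sample = X[100].reshape(1, -1)                  # one flower to classify
votes = [int(t.predict(sample)[0]) for t in trees]
print(votes, "->", Counter(votes).most_common(1)[0][0])    # majority vote across the 4 trees
```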
Cognition: How Your Mind Can Amaze and Betray You - Crash Course Psychology #15

  • Duration: 10:42
  • Updated: 19 May 2014
  • views: 1314043
You can directly support Crash Course at http://www.subbable.com/crashcourse Subscribe for as little as $0 to keep up with everything we're doing. Also, if you can afford to pay a little every month, it really helps us to continue producing great content. We used to think that the human brain was a lot like a computer; using logic to figure out complicated problems. It turns out, it's a lot more complex and, well, weird than that. In this episode of Crash Course Psychology, Hank discusses thinking & communication, solving problems, creating problems, and a few ideas about what our brains are doing up there. -- Table of Contents Thinking & Communicating 01:39:16 Solving Problems 03:21:03 Creating Problems 05:46:06 -- Want to find Crash Course elsewhere on the internet? Facebook - http://www.facebook.com/YouTubeCrashCourse Twitter - http://www.twitter.com/TheCrashCourse Tumblr - http://thecrashcourse.tumblr.com Support CrashCourse on Subbable: http://subbable.com/crashcourse
https://wn.com/Cognition_How_Your_Mind_Can_Amaze_And_Betray_You_Crash_Course_Psychology_15
Algorithms and Tips You Need to know to Master EPLL

  • Duration: 4:12
  • Updated: 06 Apr 2017
  • views: 2703
Hope this video helped! Thanks for watching! Video idea I may or may not continue: Basically, you guys can film yourselves at any event, official or unofficial, and you can send in a good solve for YOUR standards. Note: Good reactions will be targeted. Then, each month maybe, or 2 months, I'll upload a 'best solves of the month' featuring your videos. What defines best? Just an interesting solve of yours, maybe an event you're good at, maybe an official solve, and good reactions will be fun to watch. How to send in videos: Nothing fancy, just upload to YouTube public or unlisted and private message me a link to it. Or if you really want, you can just send the link here in the comments. Now remember, I'm not sure I will continue this series, I just want to see how successful and entertaining it is. Also, my PBs! (Let me know if you can't access them) https://docs.google.com/spreadsheets/d/1-_G72PqdH3o4V3UpeWQLaoPLd6M5G_BxlY7iB03-rg8/edit#gid=0
https://wn.com/Algorithms_And_Tips_You_Need_To_Know_To_Master_Epll
Master Class with Prof. Monica Higgins | "Learning to Lead Through Case Discussion"

  • Duration: 1:19:01
  • Updated: 09 Oct 2014
  • views: 12042
The Harvard Graduate School of Education is pleased to continue "Master Class," a series that celebrates inspiring teaching at Harvard. Each event involves a demonstration of teaching followed by a reflective discussion with the participants. The “demonstration” part of the time will be an authentic experience of learning for members of the audience, drawing on the faculty member’s chosen teaching approach and topic; the “reflection” part will be a dialogue in which the faculty member shares his or her pedagogical assumptions, intentions, and moves, and engages in a conversation with a discussant and the audience that “pulls back the curtain” on his or her teaching. This event precedes the Ed School's annual Teaching and Learning Week, which will run from October 6 - 10. In the third class of the series Professor Monica Higgins will teach a session entitled, "Learning to Lead through Case Discussion." HGSE Senior Lecturer James Honan will serve as the discussant.
https://wn.com/Master_Class_With_Prof._Monica_Higgins_|_Learning_To_Lead_Through_Case_Discussion
YouTube Algorithm 2017 Explained - The A.I. Behind The Curtain

  • Duration: 33:07
  • Updated: 09 Aug 2017
  • views: 22681
Here is part of my CVXLive 2017 Presentation: YouTube Algorithm 2017 Explained - The A.I. Behind The Curtain. This is the 3rd part of the 'Decoding the YouTube Algorithm' series on how to grow fast on YouTube with algorithm-driven views. Learn about YouTube Machine Learning Technology and how it interacts with viewers, videos, and channels. Get Access to Free VidSummit Replays https://vidsummit.com/freereplays Tickets to VidSummit 2017 https://vidsummit.com Decoding the YouTube Algorithm and learn how to grow fast on YouTube with Algorithm Driven views series ➜ https://goo.gl/sa6aGp Get More Great Tips - Subscribe ➜ http://goo.gl/dWNo9H Share this Video: ➜ https://youtu.be/Ix1gnDmuMyI My Favorite YouTube Tool TubeBuddy Download TubeBuddy Free Today! ➜ http://derral.link/tubebuddy Join My Patreon Community for More Advanced Training Check Out Our Community! ➜ http://derral.link/patreon ★ ★ Be the Next Lucky Subscriber to get an In-depth Channel Evaluation: 1. Must be subscribed to My YouTube Channel Subscribe ➜ http://goo.gl/dWNo9H 2. Must be one of my Patrons on Patreon. Check it out! ➜ http://derral.link/patreon 3. Must be uploading good quality content frequently to your YouTube Channel and really trying hard to make it 4. Must be engaged in my channel by liking, commenting, posting, sharing and encouraging others to subscribe to my channel. Ask me A Question by using hashtag on YouTube or Twitter #AskDerral @derraleves
https://wn.com/Youtube_Algorithm_2017_Explained_The_A.I._Behind_The_Curtain
Decision Trees and Boosting, XGBoost | Two Minute Papers #55

  • Duration: 3:37
  • Updated: 24 Mar 2016
  • views: 19795
A decision tree is a great tool to help make good decisions from a huge bunch of data. In this episode, we talk about boosting, a technique to combine a lot of weak decision trees into a strong learning algorithm. Please note that gradient boosting is a broad concept and this is only one possible application of it! __________________________________ Our Patreon page is available here: https://www.patreon.com/TwoMinutePapers If you don't want to spend a dime or you can't afford it, it's completely okay, I'm very happy to have you around! And please, stay with us and let's continue our journey of science together! The paper "Experiments with a new boosting algorithm" is available here: http://www.public.asu.edu/~jye02/CLASSES/Fall-2005/PAPERS/boosting-icml.pdf Another great introduction to tree boosting: http://homes.cs.washington.edu/~tqchen/pdf/BoostedTree.pdf WE WOULD LIKE TO THANK OUR GENEROUS SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE: Sunil Kim, Vinay S. The thumbnail image background was created by John Voo (CC BY 2.0), content-aware filling has been applied - https://flic.kr/p/BLphju Splash screen/thumbnail design: Felícia Fehér - http://felicia.hu Károly Zsolnai-Fehér's links: Patreon → https://www.patreon.com/TwoMinutePapers Facebook → https://www.facebook.com/TwoMinutePap... Twitter → https://twitter.com/karoly_zsolnai Web → https://cg.tuwien.ac.at/~zsolnai/
https://wn.com/Decision_Trees_And_Boosting,_Xgboost_|_Two_Minute_Papers_55
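The episode describes boosting as combining many weak decision trees into a strong learner. Here is a gradient-boosting-flavored sketch for regression, fitting each shallow tree to the current residuals (my own toy example, not the AdaBoost algorithm from the linked paper; it assumes numpy and scikit-learn are available and uses a made-up noisy sine dataset):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy boosting for regression: each shallow tree (a "weak learner") is fit
# to the residual errors left over by the ensemble built so far.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)   # noisy target

learning_rate = 0.1
prediction = np.full_like(y, y.mean())     # start from a constant model
trees = []

for _ in range(50):
    residual = y - prediction              # what the ensemble still gets wrong
    tree = DecisionTreeRegressor(max_depth=1).fit(X, residual)   # a decision stump
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

print("training mean squared error:", round(float(np.mean((y - prediction) ** 2)), 4))
```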
CppCon 2017: Nicholas Ormrod “Fantastic Algorithms and Where To Find Them”

  • Duration: 46:58
  • Updated: 27 Oct 2017
  • views: 5820
Presentation Slides, PDFs, Source Code and other presenter materials are available at: https://github.com/CppCon/CppCon2017 — Come dive into some exciting algorithms — tools rare enough to be novel, but useful enough to be found in practice. Want to learn about "heavy hitters" to prevent DOS attacks? Come to this talk. Want to avoid smashing your stack during tree destruction? Come to this talk. Want to hear war stories about how a new algorithm saved the day? Come to this talk! We'll dive into the finest of algorithms and see them in use — Fantastic Algorithms, and Where To Find Them. — Nicholas Ormrod: Facebook, Software Engineer Nicholas is an infrastructure engineer at Facebook. If he talks too much, disable him with a well-placed nerd snipe. — Videos Filmed & Edited by Bash Films: http://www.BashFilms.com
https://wn.com/Cppcon_2017_Nicholas_Ormrod_“Fantastic_Algorithms_And_Where_To_Find_Them”
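One of the teasers in the abstract, "heavy hitters", usually refers to streaming algorithms that find frequent items in a stream using very little memory. The talk itself is about C++; here is a Python sketch of the classic Misra-Gries summary just to illustrate the idea (the request stream and IP addresses are made up):

```python
def misra_gries(stream, k):
    """Keep at most k-1 counters. Any item occurring more than len(stream)/k
    times is guaranteed to survive in the summary (counts are lower bounds)."""
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # Decrement every counter; drop the ones that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

requests = ["10.0.0.1"] * 60 + ["10.0.0.2"] * 25 + ["10.0.0.%d" % i for i in range(3, 18)]
print(misra_gries(requests, k=5))   # the two heavy IPs survive as candidates
```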
Introduction To Optimization: Gradient Based Algorithms

  • Duration: 5:27
  • Updated: 29 Mar 2017
  • views: 1842
A conceptual overview of gradient based optimization algorithms. This video is part of an introductory optimization series. QUIZ: https://goo.gl/forms/1NaFUcqCnWgWbrQh1
TRANSCRIPT: Hello, and welcome to Introduction To Optimization. This video covers gradient based algorithms. Gradient based algorithms and gradient free algorithms are the two main types of methods for solving optimization problems. In this video, we will learn the basic ideas behind how gradient based solvers work. Gradient based solvers use derivatives to find the optimum value of a function. To understand how this works, imagine that you are hiking on a mountainside, trying to find your way to a campsite at the bottom of the mountain. How would you know where to go? Perhaps you could follow a trail, look at a map, or use a GPS. You might even be able to see your destination, and head straight there. Now imagine that you have no map, no GPS, no trail, and there are trees all around that keep you from seeing anything but the area immediately around you. Now what? Knowing nothing except for the fact that the campsite is at the bottom of the mountain, one possible option is to head downhill. You could look around, evaluate the slope of the mountain in the small area you can see, and walk in the direction with the steepest downhill slope. You could continue doing this, pausing once in a while to find the best path forward, and eventually make it to the campsite. On a basic level, this is the same thing that gradient based algorithms do. There are three main steps:
Search Direction: The first step is to pick a direction to go. The solver evaluates the slope by taking the derivative at its current location. In one dimension this derivative is the slope. In more than one dimension, this is called the gradient. The solver then uses this information together with other rules to pick a direction to go. This is called the search direction.
Step Size: The next step is to decide how far to go in the chosen direction. You don’t want to go too far in one direction, or you might end up going back up a different mountain. However, you do want to go far enough to make some progress towards your goal. The value the solver chooses is called the step size.
Convergence Check: Once a direction and a step size are chosen, the solver moves in the chosen direction. Then it checks to see if it has reached the bottom. If not, it uses the slope again to pick a new direction and step size. This continues until the solver reaches the bottom of the mountain, or the minimum. We call this convergence.
There are many variations on the way that these steps are performed, but these are the basic ideas behind how a gradient based optimization algorithm works. Let’s take a look at what this might look like on an actual function. We’ll try to find the minimum of the equation x^3 + 15x^2 + y^3 + 15y^2. We’ll start out by visualizing the function. This is a plot of the function values over a range from -10 to 10 in both directions. Notice how the function slopes down towards a minimum in the center. To begin, we’ll need to give the optimizer an initial guess. Let’s choose (8,8). Another way we can represent this information is with a contour plot, where the lines represent constant levels or function values. We can watch now as the optimizer chooses a search direction, and takes a step, a direction, and a step. Eventually it reaches the minimum point at x = 0, y = 0.
Gradient based algorithms have their own strengths and weaknesses. They are widely used, have fast performance, and scale well to large problems. However, they do require smooth, continuous function gradients, and computing those gradients can be computationally expensive. Many gradient based optimizers are also susceptible to finding local minima rather than a global optimum, meaning that they will find the bottom of the closest valley, rather than the lowest point on the whole map. Gradient based optimizers are a powerful tool, but as with any optimization problem, it takes experience and practice to know which method is the right one to use in your situation.
https://wn.com/Introduction_To_Optimization_Gradient_Based_Algorithms
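The transcript's worked example, minimizing x^3 + 15x^2 + y^3 + 15y^2 from an initial guess of (8, 8), is straightforward to reproduce. A small sketch with a hand-picked fixed step size (the video does not specify one; a real solver would choose it with a line search):

```python
# Gradient descent on f(x, y) = x^3 + 15x^2 + y^3 + 15y^2, as in the transcript.
# Analytic gradient: (3x^2 + 30x, 3y^2 + 30y).

def gradient(x, y):
    return 3 * x**2 + 30 * x, 3 * y**2 + 30 * y

x, y = 8.0, 8.0        # the initial guess used in the video
step = 0.002           # fixed step size, chosen by hand for this sketch

for i in range(2000):
    gx, gy = gradient(x, y)
    if (gx**2 + gy**2) ** 0.5 < 1e-6:    # convergence check on the gradient norm
        break
    x, y = x - step * gx, y - step * gy

print(f"converged near ({x:.4f}, {y:.4f}) after {i} iterations")   # ~ (0.0000, 0.0000)
```

The three steps from the transcript appear directly: the gradient gives the search direction, `step` plays the role of the step size, and the gradient-norm test is the convergence check.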
Programming For Beginners | Episode 1: Algorithms

  • Duration: 24:10
  • Updated: 30 May 2016
  • views: 17
I can't wait to continue this series! I hope you guys found this first episode useful. Please leave comments to tell me how I can improve this series, or if you have any questions! Thanks for watching, see ya next time! If you enjoyed the video please hit the "Like" button, leave a comment on what you want to see next, and hit the link below to Subscribe! ➤ Twitter: goo.gl/aUPMZD ➤ Subscribe: goo.gl/lQl7mw
https://wn.com/Programming_For_Beginners_|_Episode_1_Algorithms
Java Programming

  • Duration: 34:30
  • Updated: 03 Jun 2014
  • views: 3086177
Cheat Sheet is Here : http://goo.gl/OPMjte Slower Java Tutorial : http://goo.gl/UHdlyP How to Install Java & Eclipse : http://goo.gl/vEEEJE Best Java Book : http://amzn.to/2l27h2h Support Me on Patreon : https://www.patreon.com/derekbanas In this Java programming Tutorial I'll teach you all of the core knowledge needed to write Java code in 30 minutes. This is the most popular request from everyone. I specifically cover the following topics: primitive data types, comments, class, import, Scanner, final, Strings, static, private, protected, public, constructors, math, hasNextLine, nextLine, getters, setters, method overloading, Random, casting, toString, conversion from Strings to primitives, converting from primitives to Strings, if, else, else if, print, println, printf, logical operators, comparison operators, ternary operator, switch, for, while, break, continue, do while, polymorphism, arrays, for each, multidimensional arrays and more.
https://wn.com/Java_Programming
Decoding The New YouTube Algorithm 2017 - How To Grow Fast on YouTube Pt. 2

  • Duration: 51:27
  • Updated: 11 Jul 2017
  • views: 43531
Here is my VidCon 2017 presentation, Part 2 of Decoding the YouTube Algorithm: how to grow fast on YouTube with algorithm-driven views. Share this Video: ➜ https://youtu.be/BfxgJet8ym0 Get Access to Free VidSummit Replays https://vidsummit.com/freereplays Tickets to VidSummit 2017 https://vidsummit.com Get More Great Tips - Subscribe ➜ http://goo.gl/dWNo9H My other presentation on the YouTube Algorithm How To Grow on Youtube Fast - Decoding The New YouTube Algorithm 2017 ➜ https://goo.gl/fSpCRX My Favorite YouTube Tool TubeBuddy Download TubeBuddy Free Today! ➜ http://derral.link/tubebuddy Join My Patreon Community for More Advanced Training Check Out Our Community! ➜ http://derral.link/patreon ★ ★ Be the Next Lucky Subscriber to get an In-depth Channel Evaluation: 1. Must be subscribed to My YouTube Channel Subscribe ➜ http://goo.gl/dWNo9H 2. Must be one of my Patrons on Patreon. Check it out! ➜ http://derral.link/patreon 3. Must be uploading good quality content frequently to your YouTube Channel and really trying hard to make it 4. Must be engaged in my channel by liking, commenting, posting, sharing and encouraging others to subscribe to my channel. Ask me A Question by using hashtag on YouTube or Twitter #AskDerral @derraleves
https://wn.com/Decoding_The_New_Youtube_Algorithm_2017_How_To_Grow_Fast_On_Youtube_Pt._2
Lecture 3 | Loss Functions and Optimization

  • Duration: 1:14:40
  • Updated: 11 Aug 2017
  • views: 54361
Lecture 3 continues our discussion of linear classifiers. We introduce the idea of a loss function to quantify our unhappiness with a model’s predictions, and discuss two commonly used loss functions for image classification: the multiclass SVM loss and the multinomial logistic regression loss. We introduce the idea of regularization as a mechanism to fight overfitting, with weight decay as a concrete example. We introduce the idea of optimization and the stochastic gradient descent algorithm. We also briefly discuss the use of feature representations in computer vision. Keywords: Image classification, linear classifiers, SVM loss, regularization, multinomial logistic regression, optimization, stochastic gradient descent Slides: http://cs231n.stanford.edu/slides/2017/cs231n_2017_lecture3.pdf -------------------------------------------------------------------------------------- Convolutional Neural Networks for Visual Recognition Instructors: Fei-Fei Li: http://vision.stanford.edu/feifeili/ Justin Johnson: http://cs.stanford.edu/people/jcjohns/ Serena Yeung: http://ai.stanford.edu/~syyeung/ Computer Vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification, localization and detection. Recent developments in neural network (aka “deep learning”) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems. This lecture collection is a deep dive into details of the deep learning architectures with a focus on learning end-to-end models for these tasks, particularly image classification. From this lecture collection, students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in computer vision. Website: http://cs231n.stanford.edu/ For additional learning opportunities please visit: http://online.stanford.edu/
https://wn.com/Lecture_3_|_Loss_Functions_And_Optimization
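The multiclass SVM (hinge) loss discussed in the lecture is short enough to write out. A numpy sketch for a single example with made-up scores, using the standard margin of 1 (my own illustration, not code from the course materials):

```python
import numpy as np

def multiclass_svm_loss(scores, correct_class, margin=1.0):
    """Sum of hinge losses: every wrong class is penalized whenever its score
    comes within `margin` of the correct class's score."""
    margins = np.maximum(0.0, scores - scores[correct_class] + margin)
    margins[correct_class] = 0.0            # the correct class contributes nothing
    return float(np.sum(margins))

scores = np.array([3.2, 5.1, -1.7])   # invented classifier scores for one image
print(round(multiclass_svm_loss(scores, correct_class=0), 2))   # 2.9
```

In training, this per-example loss is averaged over a minibatch, a regularization term such as weight decay is added, and the result is minimized with stochastic gradient descent.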
'The Algorithm' - How YouTube Search & Discovery Works

  • Duration: 2:02
  • Updated: 28 Aug 2017
  • views: 132621
Welcome to this series of videos on how YouTube's search & discovery system works. In this first installment, we talk about how our 'algorithm' follows the audience.
WATCH THE NEXT VIDEO: https://goo.gl/SJiwDS
GO TO THE LESSON: https://goo.gl/qV5PgY
SUBSCRIBE: https://goo.gl/So4XIG
With over 400 hours of video uploaded every minute, that can be a challenge. YouTube’s recommendation system provides a real-time feedback loop to cater to each viewer and their varying interests. It learns from over 80 billion bits of feedback from the audience, daily, to understand how to serve the right videos to the right viewers at the right time. Our goal is to get people to watch more videos that they enjoy, so that they come back to YouTube regularly. Creators often ask, “What kind of videos does the algorithm like most?” Our system has no opinion about what type of video you make and doesn’t favor any particular format. Rather, it tries its best to follow the audience by paying attention to things like:
• what they watch
• what they don’t watch
• how much time they spend watching
• likes and dislikes
• ‘not interested’ feedback
Instead of worrying about what the algorithm likes, it’s better to focus on what your audience likes. If you do that and people watch, the algorithm will follow. So, which videos do they enjoy most? How often do they like to watch your channel? Check your YouTube Analytics to answer these questions. Whether you’re pursuing a passion or a business, we strive to give every video a chance to reach its potential audience. We realize, however, that YouTube has a lot of features, and it can be easy to get confused. Keep watching to learn about six key places where your videos appear, and what you can do to improve your chances for success: Search, Suggested Videos, Home, Trending, Subscriptions, and Notifications, in no particular order.
- Level up your YouTube skills with Creator Academy lessons: http://goo.gl/E9umlU
- See index of all lessons: http://goo.gl/x2h1NG
- Get how-to step-by-step help: http://goo.gl/fBzr7
https://wn.com/'The_Algorithm'_How_Youtube_Search_Discovery_Works
Ray Kurzweil | Our Brain Is a Blueprint for the Master Algorithm | Singularity Hub

  • Duration: 7:50
  • Updated: 30 Jun 2017
  • views: 7536
Ray Kurzweil is an inventor, thinker, and futurist famous for forecasting the pace of technology and predicting the world of tomorrow. In this video, Kurzweil suggests the blueprint for the master algorithm—or a single, general purpose learning algorithm—is hidden in the brain. The brain, according to Kurzweil, consists of repeating modules that self-organize into hierarchies that build simple patterns into complex concepts. We don’t have a complete understanding of how this process works yet, but Kurzweil believes that as we study the brain more and reverse engineer what we find, we’ll learn to write the master algorithm. Hub Article: https://wp.me/phyoN-t5u Subscribe: http://bit.ly/1Wq6gwm Connect with Singularity University: Website: http://su.org Hub: http://singularityhub.com Facebook: https://www.facebook.com/singularityu Twitter: https://twitter.com/singularityu Linkedin: https://www.linkedin.com/company/singularity-university About Singularity University: Singularity University is a benefit corporation headquartered at NASA’s research campus in Silicon Valley. We provide educational programs, innovative partnerships and a startup accelerator to help individuals, businesses, institutions, investors, NGOs and governments understand cutting-edge technologies, and how to utilize these technologies to positively impact billions of people. Singularity University http://www.youtube.com/user/SingularityU
https://wn.com/Ray_Kurzweil_|_Our_Brain_Is_A_Blueprint_For_The_Master_Algorithm_|_Singularity_Hub
Algorithms - 03 Introduction to Time Complexity - Unsolvable problems (ADA)

  • Duration: 9:07
  • Updated: 29 Jun 2017
  • views: 323
https://wn.com/Algorithms_03_Introduction_To_Time_Complexity_Unsolvable_Problems_(Ada)