Maybe they saw the future profits they'd be sitting on and realized walking away would be a massive loss? Sam hyped up AGI/ASI and its possibilities
Naa, not rly. But they have weird practices and shifted to for-profit
https://youtu.be/WLQkSmyW5rE?si=UVh4oFwjLEkWk8OH
if you need a good base, refer to these videos: https://www.youtube.com/playlist?list=PL5-TkQAfAZFbzxjBHtzdVCWE0Zbhomg7r
this prof taught me well 🙏
Gradients are basically the changes that are supposed to be made to the following variables, right?
yeah for sure! just do a simple network and you will understand
that's done through gradient descent, so you need to propagate (or send) those gradients all the way from y to each of these variables
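here's a minimal sketch of that idea (my own toy example, not from the course): one gradient-descent step for a single neuron, with the chain rule carrying the gradient from the output y back to each variable:

```python
# toy example: one gradient-descent step for a single neuron y = w*x + b
x, true_value = 2.0, 10.0      # one training example
w, b = 0.5, 0.0                # the variables we want to update
lr = 0.1                       # learning rate

y = w * x + b                  # forward pass
loss = (y - true_value) ** 2   # squared-error loss

# chain rule: gradient at the output y, then sent back to each variable
dy = 2 * (y - true_value)      # d(loss)/dy
dw = dy * x                    # d(loss)/dw
db = dy * 1.0                  # d(loss)/db

w -= lr * dw                   # gradient-descent update
b -= lr * db
print(w, b, loss)
```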
But like, we can understand how it's done, right?
so autodiff is a way to compute those gradients automatically for your weight update step, propagating them from the output all the way back to each parameter
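for example, with PyTorch's autograd (just one common choice, no framework is named here) the same chain-rule bookkeeping from the toy example above happens automatically:

```python
# minimal autodiff sketch using PyTorch's autograd (assumes torch is installed)
import torch

x = torch.tensor(2.0)
true_value = torch.tensor(10.0)
w = torch.tensor(0.5, requires_grad=True)   # variables we want gradients for
b = torch.tensor(0.0, requires_grad=True)

y = w * x + b                    # forward pass builds a computation graph
loss = (y - true_value) ** 2

loss.backward()                  # autodiff: propagate gradients back
print(w.grad, b.grad)            # same chain-rule results, done automatically
```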
I used the callback that lowers the learning rate by monitoring the loss
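that sounds like Keras's ReduceLROnPlateau (an assumption, the framework isn't named here); a sketch of how it's wired in:

```python
# sketch assuming Keras/TensorFlow: ReduceLROnPlateau watches a metric
# and shrinks the learning rate when it stops improving
from tensorflow.keras.callbacks import ReduceLROnPlateau

reduce_lr = ReduceLROnPlateau(
    monitor="val_loss",  # metric to watch
    factor=0.5,          # multiply the learning rate by this when triggered
    patience=3,          # epochs without improvement before reducing
    min_lr=1e-6,         # floor for the learning rate
)

# hypothetical usage; model and data are whatever you're training:
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=50, callbacks=[reduce_lr])
```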
RIP my student. Hope you will reach the depths of deep learning, nice knowing you!
I think someone sent me the CNN video yesterday
Like today I was learning linear algebra, and since it was all about vectors and such, I went a little deeper and worked out the regression formulas
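the regression formula you likely derived is the least-squares solution w = (XᵀX)⁻¹Xᵀy; a quick sketch with made-up data:

```python
# sketch: closed-form linear regression with NumPy (made-up data)
import numpy as np

X = np.array([[1.0, 1.0],   # first column of ones gives the intercept
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 3.9, 6.1])

# solves the least-squares problem behind w = (X^T X)^{-1} X^T y,
# but via lstsq, which is the numerically stable route
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)   # [intercept, slope]
```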
e.g., you need to update theta_10, theta_11, theta_20, theta_21, theta_30, theta_31 and phi_0, phi_1, phi_2, phi_3
to update these 10 variables, you need to send the information from the output back (i.e. in which direction to change these vars so the output aligns with true_value).
and when your network gains depth, it becomes even more complicated
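to make those 10 variables concrete, here's a sketch of that shallow network (assumptions: ReLU activation, squared-error loss, made-up values), showing the one gradient at the output fanning out to every parameter:

```python
import numpy as np

# made-up parameter values, just for illustration
theta = np.array([[0.1, 0.5],    # theta_10, theta_11
                  [0.2, -0.3],   # theta_20, theta_21
                  [0.0, 0.8]])   # theta_30, theta_31
phi = np.array([0.1, 1.0, -0.5, 0.7])   # phi_0 .. phi_3

x, true_value = 1.5, 2.0

# forward pass
pre = theta[:, 0] + theta[:, 1] * x   # each hidden unit's pre-activation
h = np.maximum(0.0, pre)              # ReLU activations
y = phi[0] + phi[1:] @ h              # network output

# backward pass: one gradient at the output reaches all 10 variables
dy = 2 * (y - true_value)             # d(loss)/dy for squared error
dphi = np.concatenate(([dy], dy * h)) # gradients for phi_0..phi_3
dtheta0 = dy * phi[1:] * (pre > 0)    # gradients for theta_10, theta_20, theta_30
dtheta1 = dtheta0 * x                 # gradients for theta_11, theta_21, theta_31
print(dphi, dtheta0, dtheta1)
```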
I'd say we need to make that one guy fall flat on his face, so go for accuracy