
Comments (14)

ZuoGangwei commented on May 3, 2024

The pumpkin-book's explanation of formula (5.24) is too long and hard to follow. Could you record a video walking through it? Thanks!

from pumpkin-book.

Sm1les commented on May 3, 2024

@ZuoGangwei Sure, we'll schedule one. It will be announced in the pumpkin-book reader group; see the bottom of the README for how to join.

TonyHzr commented on May 3, 2024

Chapter 5, equation (5.17):
"After introducing the squared sum of connection weights and thresholds, the error objective function becomes"

[screenshot of equation (5.17)]

Could I ask whether w_i here stands for the thresholds and connection weights of every layer (input-to-hidden and hidden-to-output)?

Many thanks.

Sm1les commented on May 3, 2024

@TonyHzr Yes. A regularization term is normally taken over all of the model's parameters, and in a neural network the parameters are exactly the connection weights and the thresholds.
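As a rough illustration of what the regularizer ranges over (toy numbers and a tiny 2-3-2 network of my own choosing, not from the book), here every connection weight and every threshold goes into the penalty term of eq. (5.17):

```python
# Hypothetical tiny network: 2 inputs, 3 hidden units, 2 outputs.
V = [[0.1, -0.2, 0.3], [0.4, 0.0, -0.1]]    # input->hidden connection weights
gamma = [0.05, -0.05, 0.2]                   # hidden-layer thresholds
W = [[0.2, -0.3], [0.1, 0.4], [-0.2, 0.1]]   # hidden->output connection weights
theta = [0.1, -0.1]                          # output-layer thresholds

# "w_i" in eq. (5.17) ranges over ALL of these parameters:
params = [p for row in V for p in row] + gamma \
       + [p for row in W for p in row] + theta
penalty = sum(p * p for p in params)  # the sum_i w_i^2 term

lam = 0.9  # hypothetical trade-off lambda in (0, 1)
# regularized_error = lam * empirical_error + (1 - lam) * penalty
print(len(params), penalty)
```

The trade-off lambda weights the empirical error against the penalty; the key point is that no parameter is left out of `params`.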

TonyHzr commented on May 3, 2024

> @TonyHzr Yes. A regularization term is normally taken over all of the model's parameters, and in a neural network the parameters are exactly the connection weights and the thresholds.

Got it, many thanks!

chenshijin1 commented on May 3, 2024

https://datawhalechina.github.io/pumpkin-book/#/chapter5/chapter5
The online reader renders equation (5.10) incorrectly.

[screenshot of the broken rendering of equation (5.10)]

It would help if datawhalechina projects listed a contact email so feedback could reach you faster.
I couldn't find a dedicated place for error reports, so I'm posting here. Sorry about that.

Sm1les commented on May 3, 2024

@chenshijin1 Hi, thanks for the report. Equation (5.10) has been fixed. And you posted in the right place: this is the error-report area :)

libo-huang commented on May 3, 2024

The partial derivative in (5.12) does not sum over j, but the one in (5.13) does. Could you briefly explain why? Many thanks.

[screenshots of equations (5.12) and (5.13)]

Sm1les commented on May 3, 2024

@HLBayes Hi. In (5.12) we differentiate with respect to one specific theta_j, which does not depend on the other values of j, so no summation is needed: when differentiating with respect to theta_1, for example, theta_2, theta_3, theta_4 and so on play no role. In (5.13) we differentiate with respect to one specific v_ih, and since changing any v_ih affects every output y_1 through y_l, the derivatives of y_1 through y_l with respect to v_ih must all be added up.
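This can be checked numerically. Below is a minimal sketch (hypothetical toy values, not from the book): one input, one sigmoid hidden unit, and l = 3 linear outputs. The analytic derivative of the total error with respect to v sums one term per output j, and a central finite difference agrees:

```python
import math

# Toy network: 1 input -> 1 sigmoid hidden unit -> 3 linear outputs.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x = 0.5                    # single input
v = 0.8                    # input-to-hidden weight (the v_ih in question)
w = [0.3, -0.4, 0.7]       # hidden-to-output weights w_hj
y = [0.2, 0.1, 0.9]        # targets

def error(v):
    b = sigmoid(v * x)                          # hidden output b_h
    yhat = [wj * b for wj in w]                 # outputs y_j
    return 0.5 * sum((yh - yj) ** 2 for yh, yj in zip(yhat, y))

# Analytic gradient: every output j contributes a term, so we SUM over j.
b = sigmoid(v * x)
db_dv = b * (1 - b) * x
grad = sum((w[j] * b - y[j]) * w[j] * db_dv for j in range(3))

# Central finite-difference check of dE/dv.
eps = 1e-6
num = (error(v + eps) - error(v - eps)) / (2 * eps)
print(abs(grad - num) < 1e-8)  # the summed gradient matches the numeric one
```

Dropping any one of the three terms in the sum would make the analytic and numeric gradients disagree, which is exactly why (5.13) needs the summation over j.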

libo-huang commented on May 3, 2024

@Sm1les Thanks, understood. v_ih is part of the input to hidden neuron h and thus determines its output b_h, which, together with all the hidden-to-output weights w_hj and the output-layer thresholds theta_j, produces y_j for j = 1, ..., l.

wanyixue commented on May 3, 2024

In the gradient derivation for equation (5.2) in Chapter 5, \hat{y}_i should also be a function of w, but it is not differentiated and is instead treated as a constant, which seems incorrect.
Moreover, \hat{y}_i is the output of the step function, which is non-differentiable and has no subgradient, yet page 30 writes \hat{y}_i directly into the derivative. That also seems incorrect.

Sm1les commented on May 3, 2024

@wanyixue Hi. The loss L is a function of w and theta; only w and theta are unknown variables, while \hat{y}_i and y_i are known quantities, so they need not be differentiated, and the non-differentiability issue does not arise.

wanyixue commented on May 3, 2024

> @wanyixue Hi. The loss L is a function of w and theta; only w and theta are unknown variables, while \hat{y}_i and y_i are known quantities, so they need not be differentiated, and the non-differentiability issue does not arise.

But \hat{y}_i is also a function of w and theta: \hat{y}_i = epsilon(w^T x + b). It is itself a prediction, so how can it be treated as a known quantity?

Sm1les commented on May 3, 2024

@wanyixue We start by randomly initializing w_0 and theta_0, run all samples through the model, and collect the misclassified points into a set M. We then pick one misclassified point (x_i, y_i) from M at random, compute \hat{y}_i from w_0 and theta_0, compute the gradient, and update w_0 and theta_0 to w_1 and theta_1 via the parameter update rule. At that moment \hat{y}_i has already been computed from w_0 and theta_0, so it is a known quantity. The pumpkin-book does not spell this out; I recommend the perceptron section of Li Hang's Statistical Learning Methods, which has a worked example you can follow step by step.
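A minimal sketch of one such update step (toy sample and learning rate of my own choosing, not the book's example): \hat{y} is computed once from the current w_0 and theta_0, and from then on it is held fixed as a known quantity while the update delta w_i = eta * (y - \hat{y}) * x_i is applied:

```python
# One perceptron update step on a single misclassified sample.
def step(z):
    return 1 if z >= 0 else 0

eta = 0.1            # learning rate (hypothetical)
w = [0.0, 0.0]       # initial weights w_0
theta = 0.5          # initial threshold theta_0
x, y = [1.0, 1.0], 1 # a sample misclassified under (w_0, theta_0)

# \hat{y} is a known number once w_0 and theta_0 are fixed:
y_hat = step(w[0] * x[0] + w[1] * x[1] - theta)   # -> 0, so misclassified

# Update rule; y_hat is treated as a constant here, so nothing
# non-differentiable is ever differentiated.
w = [wi + eta * (y - y_hat) * xi for wi, xi in zip(w, x)]
theta = theta - eta * (y - y_hat)  # threshold acts as a weight on a fixed -1 input
print(w, theta)
```

The next iteration recomputes \hat{y} from the new (w_1, theta_1), so the prediction is only "known" within a single step, which is the point of the reply above.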
