
DaiDai · 2022-05-30

Could you explain option C in Chinese?

NO.PZ2015120204000038

Question:

Regarding neural networks (NNs), which of the following statements is least accurate?

Options:

A. NNs must have at least 10 hidden layers to be considered deep learning nets.

B. The activation function in a node operates like a light dimmer switch since it decreases or increases the strength of the total net input.

C. The summation operator receives input values, multiplies each by a weight, sums up the weighted values into the total net input, and passes it to the activation function.

Explanation:

A is correct. It is the least accurate answer because neural networks with many hidden layers—at least 3, but often more than 20 hidden layers—are known as deep learning nets.

B is incorrect, because the node’s activation function operates like a light dimmer switch which decreases or increases the strength of the (total net) input.

C is incorrect, because the node’s summation operator multiplies each (input) value by a weight and sums up the weighted values to form the total net input. The total net input is then passed to the activation function.

As stated in the question above.

1 answer

星星_品职助教 · 2022-05-31

Hello,

The process in an NN node is: the summation operator receives the inputs, assigns a weight to each input, and sums the weighted values into a single total net input, which is then passed to the activation function.

Option C is a slight rewording of the handout (i.e., the original curriculum text). If you are not familiar with this process, please refer to the corresponding part of the screenshot.
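For reference, here is a minimal Python sketch of the flow described above. The input values, the weights, and the choice of a sigmoid activation are illustrative assumptions, not something specified by the curriculum:

```python
import numpy as np

# Hypothetical inputs received by a single node and one weight per input.
inputs = np.array([0.5, -1.2, 0.8])
weights = np.array([0.4, 0.1, 0.7])

# Summation operator: multiply each input by its weight and sum the
# weighted values into the total net input.
total_net_input = np.dot(inputs, weights)

# Activation function (sigmoid assumed here): like a light dimmer switch,
# it rescales the total net input before it is passed on to the next layer.
output = 1.0 / (1.0 + np.exp(-total_net_input))

print(total_net_input, output)
```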

Related questions

NO.PZ2015120204000038: Is the latest requirement a minimum of 2 hidden layers or 3? The basic class said 3, while the intensive class said 2.

2023-08-22 12:47 · 1 answer

NO.PZ2015120204000038: Doesn't the question ask for the least accurate statement? Why does the answer pick the only correct one?

2021-03-23 10:32 · 1 answer