<p>In <a href="https://github.com/mlpack/mlpack/pull/757#discussion_r73980642">src/mlpack/methods/ann/layer/inception_layer.hpp</a>:</p>
<pre style='color:#555'>> + // Forward pass for 3x3 pool path.
> + pool3.InputParameter() = input;
> + pool3.Forward(input, pool3.OutputParameter());
> + Pad(pool3.OutputParameter(), 1, 1, convPool.InputParameter());
> + convPool.Forward(pool3.OutputParameter(), convPool.OutputParameter());
> + biasPool.Forward(convPool.OutputParameter(), biasPool.OutputParameter());
> + basePool.InputParameter() = biasPool.OutputParameter();
> + basePool.Forward(convPool.OutputParameter(), basePool.OutputParameter());
> +
> + // concatenate outputs of all the paths.
> + output = arma::join_slices(
> + arma::join_slices(
> + arma::join_slices(
> + base1.OutputParameter(), base3.OutputParameter() ),
> + base5.OutputParameter() ), basePool.OutputParameter());
> +
</pre>
<p>Since the network class initializes the weights, you have to define the weight matrix size beforehand, e.g. using:</p>
<pre><code>weights.set_size(rows, cols);
</code></pre>
<p>Since the inception layer combines different layers, we have to calculate the overall weight dimension accordingly. You can get the weight dimensions of each layer by calling its <code>Weights()</code> function:</p>
<pre><code>conv1.Weights().n_rows
conv1.Weights().n_cols
</code></pre>
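<p>As a minimal sketch of the idea (with hypothetical dimensions standing in for the real <code>conv1.Weights().n_rows</code> / <code>conv1.Weights().n_cols</code> values of each sub-layer), the overall size is just the sum of the element counts of all sub-layer weight matrices:</p>
<pre><code>#include &lt;cstddef&gt;
#include &lt;iostream&gt;
#include &lt;utility&gt;
#include &lt;vector&gt;

int main()
{
  // Hypothetical (rows, cols) weight dimensions of the inception layer's
  // sub-layers; in the real code these come from e.g.
  // conv1.Weights().n_rows and conv1.Weights().n_cols.
  std::vector&lt;std::pair&lt;std::size_t, std::size_t&gt;&gt; dims = {
      {64, 192},   // conv1  (1x1 path)
      {128, 192},  // conv3  (3x3 path)
      {32, 192},   // conv5  (5x5 path)
      {32, 192}    // convPool (pool path)
  };

  // Sum the element counts; the combined parameter matrix could then be
  // sized with weights.set_size(totalElements, 1).
  std::size_t totalElements = 0;
  for (const auto&amp; d : dims)
    totalElements += d.first * d.second;

  std::cout &lt;&lt; totalElements &lt;&lt; std::endl; // 49152
  return 0;
}
</code></pre>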