Commit d5c4c48

Updated equations for BGNN blogpost

1 parent 96566ea · commit d5c4c48

File tree

8 files changed: +22 −22 lines changed


authors/vijay-dwivedi/index.xml

+3 −3

@@ -296,7 +296,7 @@ The <strong>quality</strong> of these datasets also leads one to que
 <blockquote>
 <p>MLP node update equation at layer $\ell$ is:
 $$
-h^{\ell+1}<em>{i} = \sigma \left( W^{\ell} \ h^{\ell}</em>{i} \right)
+h_{i}^{\ell+1} = \sigma \left( W^{\ell} \ h_{i}^{\ell} \right)
 $$</p>
 </blockquote>
 <p>MLP evaluates to consistently low scores on each of the datasets which shows the necessity to consider graph structure for these tasks. This result is also indicative of how appropriate these datasets are for GNN research as they statistically separate model’s performance.</p>
@@ -309,13 +309,13 @@ $$</p>
 <blockquote>
 <p>Isotropic layer update equation:
 $$
-h^{\ell+1}<em>{i} = \sigma\Big(W_1^{\ell} \ h^{\ell}</em>{i} + \sum_{j\in\mathcal{N}_i} W_2^{\ell} \ h^{\ell}_{j} \Big)
+h_{i}^{\ell+1} = \sigma \Big( W_1^{\ell} \ h_{i}^{\ell} + \sum_{j \in \mathcal{N}_i} W_2^{\ell} \ h_{j}^{\ell} \Big)
 $$</p>
 </blockquote>
 <blockquote>
 <p>Anisotropic layer update equation:
 $$
-h^{\ell+1}<em>{i} = \sigma\Big(W_1^{\ell} \ h^{\ell}</em>{i} + \sum_{j\in\mathcal{N}_i} <span style="color:red">\eta_{ij}</span> W_2 h^{\ell}_{j} \Big)
+h_{i}^{\ell+1} = \sigma \Big( W_1^{\ell} \ h_{i}^{\ell} + \sum_{j \in \mathcal{N}_i} \eta_{ij} W_2 h_{j}^{\ell} \Big)
 $$</p>
 </blockquote>
 <p>As per the above equations, GCN, GraphSage and GIN are isotropic GCNs whereas GAT, MoNet and GatedGCN are anisotropic GCNs.</p>
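The three corrected update rules (plain MLP, isotropic, anisotropic) can be illustrated numerically. Below is a minimal NumPy sketch on a toy 4-node graph; all shapes are hypothetical, ReLU stands in for the nonlinearity $\sigma$, and random edge weights stand in for the learned attention coefficients $\eta_{ij}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes with d-dimensional features (hypothetical sizes).
n, d = 4, 8
H = rng.standard_normal((n, d))            # rows are h_i^l
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)  # adjacency; N_i = nonzero entries of row i
W, W1, W2 = (rng.standard_normal((d, d)) for _ in range(3))

def sigma(x):
    return np.maximum(x, 0.0)              # ReLU as the nonlinearity sigma

# MLP update: h_i^{l+1} = sigma(W^l h_i^l) -- ignores the graph entirely.
H_mlp = sigma(H @ W.T)

# Isotropic update: h_i^{l+1} = sigma(W1 h_i^l + sum_{j in N_i} W2 h_j^l);
# every neighbor contributes through the same weight W2.
H_iso = sigma(H @ W1.T + A @ (H @ W2.T))

# Anisotropic update: each neighbor message is scaled by an edge weight
# eta_ij (here random, standing in for a learned attention/gating mechanism).
eta = rng.random((n, n)) * A
H_aniso = sigma(H @ W1.T + eta @ (H @ W2.T))
```

Note how the isotropic case is the anisotropic one with every $\eta_{ij}$ fixed to 1 on existing edges, which is exactly the distinction the post draws between GCN/GraphSage/GIN and GAT/MoNet/GatedGCN.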

index.json

+1 −1

Large diffs are not rendered by default.

index.xml

+3 −3

Hunks identical to those in authors/vijay-dwivedi/index.xml above (lines 296–321).

post/benchmarking-gnns/index.html

+3 −3

@@ -982,7 +982,7 @@ <h3 id="benchmarking-gnns-on-the-proposed-datasets">Benchmarking GNNs on the pro
 <blockquote>
 <p>MLP node update equation at layer $\ell$ is:
 $$
-h^{\ell+1}<em>{i} = \sigma \left( W^{\ell} \ h^{\ell}</em>{i} \right)
+h_{i}^{\ell+1} = \sigma \left( W^{\ell} \ h_{i}^{\ell} \right)
 $$</p>
 </blockquote>
 <p>MLP evaluates to consistently low scores on each of the datasets which shows the necessity to consider graph structure for these tasks. This result is also indicative of how appropriate these datasets are for GNN research as they statistically separate model’s performance.</p>
@@ -995,13 +995,13 @@ <h3 id="benchmarking-gnns-on-the-proposed-datasets">Benchmarking GNNs on the pro
 <blockquote>
 <p>Isotropic layer update equation:
 $$
-h^{\ell+1}<em>{i} = \sigma\Big(W_1^{\ell} \ h^{\ell}</em>{i} + \sum_{j\in\mathcal{N}_i} W_2^{\ell} \ h^{\ell}_{j} \Big)
+h_{i}^{\ell+1} = \sigma \Big( W_1^{\ell} \ h_{i}^{\ell} + \sum_{j \in \mathcal{N}_i} W_2^{\ell} \ h_{j}^{\ell} \Big)
 $$</p>
 </blockquote>
 <blockquote>
 <p>Anisotropic layer update equation:
 $$
-h^{\ell+1}<em>{i} = \sigma\Big(W_1^{\ell} \ h^{\ell}</em>{i} + \sum_{j\in\mathcal{N}_i} <span style="color:red">\eta_{ij}</span> W_2 h^{\ell}_{j} \Big)
+h_{i}^{\ell+1} = \sigma \Big( W_1^{\ell} \ h_{i}^{\ell} + \sum_{j \in \mathcal{N}_i} \eta_{ij} W_2 h_{j}^{\ell} \Big)
 $$</p>
 </blockquote>
 <p>As per the above equations, GCN, GraphSage and GIN are isotropic GCNs whereas GAT, MoNet and GatedGCN are anisotropic GCNs.</p>

post/index.xml

+3 −3

Hunks identical to those in authors/vijay-dwivedi/index.xml above (lines 296–321), with the markup HTML-escaped in the feed.

tags/benchmark/index.xml

+3 −3

Hunks identical to those in authors/vijay-dwivedi/index.xml above (lines 296–321), with the markup HTML-escaped in the feed.

tags/deep-learning/index.xml

+3 −3

Hunks identical to those in authors/vijay-dwivedi/index.xml above (lines 296–321), with the markup HTML-escaped in the feed.

tags/graph-neural-networks/index.xml

+3 −3

Hunks identical to those in authors/vijay-dwivedi/index.xml above (lines 296–321), with the markup HTML-escaped in the feed.
