Commit 67ff574

Added Axel profile
1 parent 62c89d9 commit 67ff574

19 files changed: +1665 −26 lines

Diff for: authors/axel-nilsson/avatar.jpg

905 KB

Diff for: authors/axel-nilsson/index.html

+851
Large diffs are not rendered by default.

Diff for: authors/axel-nilsson/index.xml

+16
@@ -0,0 +1,16 @@
+<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
+<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
+<channel>
+<title>NTU Graph Deep Learning Lab</title>
+<link>https://graphdeeplearning.github.io/authors/axel-nilsson/</link>
+<atom:link href="https://graphdeeplearning.github.io/authors/axel-nilsson/index.xml" rel="self" type="application/rss+xml" />
+<description>NTU Graph Deep Learning Lab</description>
+<generator>Source Themes Academic (https://sourcethemes.com/academic/)</generator><language>en-us</language><copyright>Xavier Bresson © 2020</copyright>
+<image>
+<url>https://graphdeeplearning.github.io/images/icon_hu027d87ac1e37f4f802995042c9999554_21044_512x512_fill_lanczos_center_2.png</url>
+<title>NTU Graph Deep Learning Lab</title>
+<link>https://graphdeeplearning.github.io/authors/axel-nilsson/</link>
+</image>
+
+</channel>
+</rss>

Diff for: authors/chaitanya-joshi/index.html

+2
@@ -690,6 +690,8 @@ <h3>
 <h3>Interests</h3>
 <ul class="ul-interests">
 
+<li>Graph Neural Networks</li>
+
 <li>Combinatorial Optimization</li>
 
 <li>Natural Language Processing</li>

Diff for: authors/chaitanya-joshi/index.xml

+3-3
@@ -341,7 +341,7 @@ are all promising new ideas for better Transformers.&lt;/p&gt;
 
 &lt;h4 id=&#34;are-transformers-learning-neural-syntax&#34;&gt;Are Transformers learning &amp;lsquo;neural syntax&amp;rsquo;?&lt;/h4&gt;
 &lt;p&gt;There have been &lt;a href=&#34;https://pair-code.github.io/interpretability/bert-tree/&#34;&gt;several&lt;/a&gt; &lt;a href=&#34;https://arxiv.org/abs/1905.05950&#34;&gt;interesting&lt;/a&gt; &lt;a href=&#34;https://arxiv.org/abs/1906.04341&#34;&gt;papers&lt;/a&gt; from the NLP community on what Transformers might be learning.
-The basic premise is that performing attention on all word pairs in a sentence&amp;ndash;with the purpose of identifying which pairs are the most interesting&amp;ndash;enables Transformers to learn something like a &lt;strong&gt;task-specific syntax&lt;/strong&gt;.&lt;br&gt;
+The basic premise is that performing attention on all word pairs in a sentence&amp;ndash;with the purpose of identifying which pairs are the most interesting&amp;ndash;enables Transformers to learn something like a &lt;strong&gt;task-specific syntax&lt;/strong&gt;.
 Different heads in the multi-head attention might also be &amp;lsquo;looking&amp;rsquo; at different syntactic properties.&lt;/p&gt;
 &lt;p&gt;In graph terms, by using GNNs on full graphs, can we recover the most important edges&amp;ndash;and what they might entail&amp;ndash;from how the GNN performs neighbourhood aggregation at each layer?
 I&amp;rsquo;m &lt;a href=&#34;https://arxiv.org/abs/1909.07913&#34;&gt;not so convinced&lt;/a&gt; by this view yet.&lt;/p&gt;
@@ -419,8 +419,8 @@ For a code walkthrough, the DGL team has &lt;a href=&#34;https://docs.dgl.ai/en/
 &lt;p&gt;Finally, we wrote &lt;a href=&#34;https://graphdeeplearning.github.io/publication/xu-2019-multi/&#34;&gt;a recent paper&lt;/a&gt; applying Transformers to sketch graphs. Do check it out!&lt;/p&gt;
 &lt;hr&gt;
 &lt;h4 id=&#34;updates&#34;&gt;Updates&lt;/h4&gt;
-&lt;p&gt;The post has also been translated to &lt;a href=&#34;https://mp.weixin.qq.com/s/DABEcNf1hHahlZFMttiT2g&#34;&gt;Chinese&lt;/a&gt;.
-Do join the discussion on &lt;a href=&#34;https://twitter.com/chaitjo/status/1233220586358181888?s=20&#34;&gt;Twitter&lt;/a&gt; or &lt;a href=&#34;https://www.reddit.com/r/MachineLearning/comments/fb86mo/d_transformers_are_graph_neural_networks_blog/&#34;&gt;Reddit&lt;/a&gt;!&lt;/p&gt;
+&lt;p&gt;The post is also available on &lt;a href=&#34;https://medium.com/@chaitjo/transformers-are-graph-neural-networks-bca9f75412aa?source=friends_link&amp;amp;sk=c54de873b2cec3db70166a6cf0b41d3e&#34;&gt;Medium&lt;/a&gt;, and has been translated to &lt;a href=&#34;https://mp.weixin.qq.com/s/DABEcNf1hHahlZFMttiT2g&#34;&gt;Chinese&lt;/a&gt; and &lt;a href=&#34;https://habr.com/ru/post/491576/&#34;&gt;Russian&lt;/a&gt;.
+Do join the discussion on &lt;a href=&#34;https://twitter.com/chaitjo/status/1233220586358181888?s=20&#34;&gt;Twitter&lt;/a&gt;, &lt;a href=&#34;https://www.reddit.com/r/MachineLearning/comments/fb86mo/d_transformers_are_graph_neural_networks_blog/&#34;&gt;Reddit&lt;/a&gt; or &lt;a href=&#34;https://news.ycombinator.com/item?id=22518263&#34;&gt;HackerNews&lt;/a&gt;!&lt;/p&gt;
 &lt;blockquote class=&#34;twitter-tweet&#34;&gt;&lt;p lang=&#34;en&#34; dir=&#34;ltr&#34;&gt;Transformers are a special case of Graph Neural Networks. This may be obvious to some, but the following blog post does a good job at explaining these important concepts. &lt;a href=&#34;https://t.co/H8LT2F7LqC&#34;&gt;https://t.co/H8LT2F7LqC&lt;/a&gt;&lt;/p&gt;&amp;mdash; Oriol Vinyals (@OriolVinyalsML) &lt;a href=&#34;https://twitter.com/OriolVinyalsML/status/1233783593626951681?ref_src=twsrc%5Etfw&#34;&gt;February 29, 2020&lt;/a&gt;&lt;/blockquote&gt; &lt;script async src=&#34;https://platform.twitter.com/widgets.js&#34; charset=&#34;utf-8&#34;&gt;&lt;/script&gt;
 </description>
 </item>

Diff for: authors/index.html

+10-1
@@ -570,11 +570,20 @@ <h1>Authors</h1>
 <li><a href="https://graphdeeplearning.github.io/authors/suyash-lakhotia/">Suyash Lakhotia</a></li>
 
 
-<li><a href="https://graphdeeplearning.github.io/authors/david-low/">David Low Jia Wei</a></li>
+<li><a href="https://graphdeeplearning.github.io/authors/axel-nilsson/">Axel Nilsson</a></li>
 
 </ul>
 
 
+<nav>
+<ul class="pagination justify-content-center">
+
+
+<li class="page-item"><a class="page-link" href="/authors/page/2/">&raquo;</a></li>
+
+</ul>
+</nav>
+
 
 </div>
 
0 commit comments