Commit 8916a1e

committed
new: feature vs node
1 parent 9266a50 commit 8916a1e

File tree: 4 files changed, +24 −0 lines changed

Diff for: site/content/3.10/data-science/arangographml/_index.md

+6
@@ -66,6 +66,12 @@ Node classification is a **supervised learning** task where the goal is to predi
 
 Node embedding is an **unsupervised learning** technique that converts nodes into numerical vector representations, preserving their **structural relationships** within the graph. Unlike simple feature aggregation, node embeddings **capture the influence of neighboring nodes and graph topology**, making them powerful for downstream tasks like clustering, anomaly detection, and link prediction. Consider using [ArangoDB's Vector Search](https://arangodb.com/2024/11/vector-search-in-arangodb-practical-insights-and-hands-on-examples/) capabilities to find similar nodes based on their embeddings.
 
+**Feature Embeddings vs Node Embeddings**
+
+**Feature Embeddings** are vector representations derived from the attributes or features associated with nodes. These embeddings aim to capture the inherent characteristics of the data. For example, in a social network, a feature embedding might encode user attributes like age, location, and interests. Techniques like **Word2Vec**, **TF-IDF**, or **autoencoders** are commonly used to generate such embeddings.
+
+In the context of graphs, **Node Embeddings** are a **combination of a node’s feature embedding and the structural information from its connected edges**. Essentially, they aggregate both the node’s attributes and the connectivity patterns within the graph. This fusion helps capture not only the individual properties of a node but also its position and role within the network.
+
 **How It Works in ArangoGraphML**
 - The model learns an embedding (a vector representation) for each node based on its **position within the graph and its connections**.
 - It **does not rely on labeled data**—instead, it captures structural patterns through graph traversal and aggregation of neighbor information.
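To make the feature-embedding idea in the added text concrete, here is a minimal sketch using scikit-learn's TF-IDF over node attributes only, with no graph structure involved; the user attributes and values are hypothetical, not part of ArangoGraphML.

```python
# Sketch: feature embeddings derived purely from node attributes (no edges).
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical social-network users with textual attributes.
users = [
    {"_key": "alice", "location": "Berlin", "interests": "graphs databases hiking"},
    {"_key": "bob",   "location": "Paris",  "interests": "databases cooking"},
    {"_key": "carol", "location": "Berlin", "interests": "hiking climbing"},
]

# Concatenate the attributes into one text document per node.
docs = [f'{u["location"]} {u["interests"]}' for u in users]

vectorizer = TfidfVectorizer()
feature_embeddings = vectorizer.fit_transform(docs).toarray()

for user, vec in zip(users, feature_embeddings):
    print(user["_key"], vec.round(2))
```

Nodes with similar attributes (here, the two Berlin hikers) end up with similar feature vectors, even if they are far apart in the graph.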

Diff for: site/content/3.11/data-science/arangographml/_index.md

+6
@@ -66,6 +66,12 @@ Node classification is a **supervised learning** task where the goal is to predi
 
 Node embedding is an **unsupervised learning** technique that converts nodes into numerical vector representations, preserving their **structural relationships** within the graph. Unlike simple feature aggregation, node embeddings **capture the influence of neighboring nodes and graph topology**, making them powerful for downstream tasks like clustering, anomaly detection, and link prediction. Consider using [ArangoDB's Vector Search](https://arangodb.com/2024/11/vector-search-in-arangodb-practical-insights-and-hands-on-examples/) capabilities to find similar nodes based on their embeddings.
 
+**Feature Embeddings vs Node Embeddings**
+
+**Feature Embeddings** are vector representations derived from the attributes or features associated with nodes. These embeddings aim to capture the inherent characteristics of the data. For example, in a social network, a feature embedding might encode user attributes like age, location, and interests. Techniques like **Word2Vec**, **TF-IDF**, or **autoencoders** are commonly used to generate such embeddings.
+
+In the context of graphs, **Node Embeddings** are a **combination of a node’s feature embedding and the structural information from its connected edges**. Essentially, they aggregate both the node’s attributes and the connectivity patterns within the graph. This fusion helps capture not only the individual properties of a node but also its position and role within the network.
+
 **How It Works in ArangoGraphML**
 - The model learns an embedding (a vector representation) for each node based on its **position within the graph and its connections**.
 - It **does not rely on labeled data**—instead, it captures structural patterns through graph traversal and aggregation of neighbor information.
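The fusion of a node's own feature embedding with structural information from its connected edges can be illustrated with a one-hop mean aggregation over hypothetical neighbors; ArangoGraphML's models learn this aggregation rather than using a fixed average, so this is only a sketch of the idea.

```python
# Sketch: combine a node's feature vector with the mean of its neighbors'
# feature vectors to obtain a structure-aware node embedding.
import numpy as np

# Hypothetical feature embeddings (one vector per node).
features = {
    "alice": np.array([1.0, 0.0, 0.5]),
    "bob":   np.array([0.0, 1.0, 0.5]),
    "carol": np.array([1.0, 1.0, 0.0]),
}

# Hypothetical adjacency (undirected edges).
neighbors = {
    "alice": ["bob", "carol"],
    "bob":   ["alice"],
    "carol": ["alice"],
}

def node_embedding(node: str) -> np.ndarray:
    """Concatenate the node's own features with its neighbors' mean features."""
    neigh_mean = np.mean([features[n] for n in neighbors[node]], axis=0)
    return np.concatenate([features[node], neigh_mean])

for name in features:
    print(name, node_embedding(name).round(2))
```

Two nodes with identical attributes but different neighborhoods now get different embeddings, which is exactly what plain feature embeddings cannot express.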

Diff for: site/content/3.12/data-science/arangographml/_index.md

+6
@@ -66,6 +66,12 @@ Node classification is a **supervised learning** task where the goal is to predi
 
 Node embedding is an **unsupervised learning** technique that converts nodes into numerical vector representations, preserving their **structural relationships** within the graph. Unlike simple feature aggregation, node embeddings **capture the influence of neighboring nodes and graph topology**, making them powerful for downstream tasks like clustering, anomaly detection, and link prediction. Consider using [ArangoDB's Vector Search](https://arangodb.com/2024/11/vector-search-in-arangodb-practical-insights-and-hands-on-examples/) capabilities to find similar nodes based on their embeddings.
 
+**Feature Embeddings vs Node Embeddings**
+
+**Feature Embeddings** are vector representations derived from the attributes or features associated with nodes. These embeddings aim to capture the inherent characteristics of the data. For example, in a social network, a feature embedding might encode user attributes like age, location, and interests. Techniques like **Word2Vec**, **TF-IDF**, or **autoencoders** are commonly used to generate such embeddings.
+
+In the context of graphs, **Node Embeddings** are a **combination of a node’s feature embedding and the structural information from its connected edges**. Essentially, they aggregate both the node’s attributes and the connectivity patterns within the graph. This fusion helps capture not only the individual properties of a node but also its position and role within the network.
+
 **How It Works in ArangoGraphML**
 - The model learns an embedding (a vector representation) for each node based on its **position within the graph and its connections**.
 - It **does not rely on labeled data**—instead, it captures structural patterns through graph traversal and aggregation of neighbor information.
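The suggestion to find similar nodes based on their embeddings can be sketched in memory with plain cosine similarity over made-up vectors; ArangoDB's Vector Search serves the same purpose at scale, but the snippet below is only an illustration of the ranking idea.

```python
# Sketch: rank nodes by cosine similarity of their embeddings.
import numpy as np

# Hypothetical node embeddings produced by a node-embedding model.
embeddings = {
    "alice": np.array([0.9, 0.1, 0.4]),
    "bob":   np.array([0.1, 0.8, 0.5]),
    "carol": np.array([0.8, 0.2, 0.3]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = "alice"
ranked = sorted(
    (k for k in embeddings if k != query),
    key=lambda k: cosine(embeddings[query], embeddings[k]),
    reverse=True,
)
print(f"Nodes most similar to {query}: {ranked}")
```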

Diff for: site/content/3.13/data-science/arangographml/_index.md

+6
@@ -66,6 +66,12 @@ Node classification is a **supervised learning** task where the goal is to predi
 
 Node embedding is an **unsupervised learning** technique that converts nodes into numerical vector representations, preserving their **structural relationships** within the graph. Unlike simple feature aggregation, node embeddings **capture the influence of neighboring nodes and graph topology**, making them powerful for downstream tasks like clustering, anomaly detection, and link prediction. Consider using [ArangoDB's Vector Search](https://arangodb.com/2024/11/vector-search-in-arangodb-practical-insights-and-hands-on-examples/) capabilities to find similar nodes based on their embeddings.
 
+**Feature Embeddings vs Node Embeddings**
+
+**Feature Embeddings** are vector representations derived from the attributes or features associated with nodes. These embeddings aim to capture the inherent characteristics of the data. For example, in a social network, a feature embedding might encode user attributes like age, location, and interests. Techniques like **Word2Vec**, **TF-IDF**, or **autoencoders** are commonly used to generate such embeddings.
+
+In the context of graphs, **Node Embeddings** are a **combination of a node’s feature embedding and the structural information from its connected edges**. Essentially, they aggregate both the node’s attributes and the connectivity patterns within the graph. This fusion helps capture not only the individual properties of a node but also its position and role within the network.
+
 **How It Works in ArangoGraphML**
 - The model learns an embedding (a vector representation) for each node based on its **position within the graph and its connections**.
 - It **does not rely on labeled data**—instead, it captures structural patterns through graph traversal and aggregation of neighbor information.
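For the downstream tasks mentioned in the text (clustering, anomaly detection), the learned node embeddings can be handed to standard tooling; as a minimal sketch, this clusters a few made-up embedding vectors with scikit-learn's KMeans.

```python
# Sketch: clustering node embeddings as a downstream task.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical node embeddings produced by a node-embedding model.
node_keys = ["alice", "bob", "carol", "dave"]
embeddings = np.array([
    [0.9, 0.1, 0.4],
    [0.1, 0.8, 0.5],
    [0.8, 0.2, 0.3],
    [0.2, 0.9, 0.4],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(embeddings)
for key, label in zip(node_keys, kmeans.labels_):
    print(key, "-> cluster", label)
```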
