.github/ISSUE_TEMPLATE/bug_report.md (+17 -6)
@@ -9,27 +9,38 @@ assignees: 'andre-marcos-perez'
 
 ## Introduction
 
-Hi there, thanks for helping the project! We are doing our best to help the community to learn and practice parallel computing in distributed environments through our projects. :sparkles:
+Hi there, thanks for helping the project! We are doing our best to help the community to learn and practice
+parallel computing in distributed environments through our projects. :sparkles:
 
 ## Bug
 
+Please fill the template below.
+
 ### Expected behaviour
 
+*Describe the expected behaviour*
+
 ### Current behaviour
 
+*Describe the current behaviour*
+
 ### Steps to reproduce
 
-1. Step 1
-2. Step 2
-3. Step 3
+1.*Step 1*
+2.*Step 2*
+3.*Step 3*
 
 ### Possible solutions (optional)
 
+*Add some solutions, if any*
+
 ### Comments (optional)
 
+*Add some comments, if any*
+
 ### Checklist
 
 Please provide the following:
 
--[] Docker Engine version:
--[] Docker Compose version:
+-[] Docker Engine version:*Can be found using `docker version`, e.g.: 19.03.6*
+-[] Docker Compose version:*Can be found using `docker-compose version`, e.g.: 1.21.0*
.github/ISSUE_TEMPLATE/feature_request.md (+8 -1)
@@ -9,10 +9,17 @@ assignees: 'andre-marcos-perez'
 
 ## Introduction
 
-Hi there, thanks for helping the project! We are doing our best to help the community to learn and practice parallel computing in distributed environments through our projects. :sparkles:
+Hi there, thanks for helping the project! We are doing our best to help the community to learn and practice
+parallel computing in distributed environments through our projects. :sparkles:

-Hi there, thanks for helping the project! We are doing our best to help the community to learn and practice parallel computing in distributed environments through our projects. :sparkles:
+Hi there, thanks for helping the project! We are doing our best to help the community to learn and practice
+parallel computing in distributed environments through our projects. :sparkles:
 
 ## Pull Request
 
-### Description
+### Issue
+
+-*Issue number with link, e.g.: [#22](https://github.com/andre-marcos-perez/spark-standalone-cluster-on-docker/issues/22)*
 
 ### Changes
 
-- Change 1
-- Change 2
+-*High level description of change 1*
+-*High level description of change 2*
+-*...*
 
 ### Comments (optional)
 
+*Add some comments, if any*
+
 ### Checklist
 Please make sure to check the following:
 
--[] I have followed the steps in the [CONTRIBUTING.md](../CONTRIBUTING.md) file.
+-[] I have followed the steps in the [CONTRIBUTING.md](../CONTRIBUTING.md) file.
+-[] I am aware that pull requests that do not follow the rules will be automatically rejected.

 > The project just got its [own article](https://towardsdatascience.com/apache-spark-cluster-on-docker-ft-a-juyterlab-interface-418383c95445) at Towards Data Science Medium blog! :sparkles:
 
-This project gives you an out-of-the-box **Apache Spark** cluster in standalone mode with a **JupyterLab** interface and a simulated **Apache Hadoop Distributed File System**, all built on top of **Docker**. Learn Apache Spark through its Python API, **PySpark**, by running the [Jupyter notebooks](build/workspace/pyspark.ipynb) with examples on how to read, process and write data.
+This project gives you an **Apache Spark** cluster in standalone mode with a **JupyterLab** interface built on top of **Docker**.
+Learn Apache Spark through its Scala and Python API (PySpark) by running the Jupyter [notebooks](build/workspace/) with examples on how to read, process and write data.
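
For context on the read, process and write workflow the updated README points to, here is a minimal PySpark sketch. It is not taken from the repository's notebooks: the master URL, the file paths, and the `category` column are placeholder assumptions that depend on how the Docker Compose cluster is configured.

```python
# Minimal read/process/write sketch in PySpark (illustrative, not from the repo's notebooks).
# The master URL, file paths, and the "category" column are placeholders.
from pyspark.sql import SparkSession

# Connect to a standalone Spark master (hypothetical address and port).
spark = (
    SparkSession.builder
    .appName("read-process-write-sketch")
    .master("spark://spark-master:7077")
    .getOrCreate()
)

# Read: load a CSV file into a DataFrame.
df = spark.read.csv("data/input.csv", header=True, inferSchema=True)

# Process: a simple aggregation over a column assumed to exist in the input.
summary = df.groupBy("category").count()

# Write: persist the result as Parquet.
summary.write.mode("overwrite").parquet("data/output.parquet")

spark.stop()
```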