* Terraform executor as Docker image
* Implement docker-compose for tf executor
* Enable _dev/deploy for data streams
* Fix
* Gohack: github.com/mtojek/package-spec/code/go
* Install service deployer as static resource
* Cleanup
* Define sample test config for AWS SNS
* WIP
* Fix
* Build terraform environment
* Mount TF_DIR
* Set envs in policies
* Fix: lint
* Fix: build dir (empty value)
* Fix: yml file
* Fix: service name
* Fix: brackets
* Fix
* Fix: reference issue
* Fix: period
* Set time
* Do not print request body to prevent leaks
* Fix: format
* Sample for EC2 metrics
* Fix EC2 system test
* Disable SNS test
* Pass test run ID
* Next fixes
* Fix: formatter
* Try: enable monitoring
* Update dependency on package-spec
* Address PR comments
* Fix failing test
* Fix: go.sum
* Don't set ports
* Update package-spec
* Collect logs for integrations
* Use cloud environment
* Fix: go mod tidy
* Update package-spec dependency
* Support session token
* Terraform service deployer
* nit-picks
* Don't connect to the stack network
* writeTerraformDeployerResources
* Link: env-file
* Link GH issue
* Fix comment
* Next improvements
* nit: separator
* Docs
* Update docs/howto/system_testing.md
Co-authored-by: Shaunak Kashyap <[email protected]>
* Update docs/howto/system_testing.md
Co-authored-by: Shaunak Kashyap <[email protected]>
* Introductory sentences
* Docker compose, terraform
* More generic
* Don't set hostname
* Check if there is a single definition of service deployer
* Fix
* Another fix
Co-authored-by: Shaunak Kashyap <[email protected]>
docs/howto/system_testing.md (+66 -16)
@@ -12,19 +12,18 @@ Conceptually, running a system test involves the following steps:
 1. Depending on the Elastic Package whose data stream is being tested, deploy an instance of the package's integration service.
 1. Create a test policy that configures a single data stream for a single package.
 1. Assign the test policy to the enrolled Agent.
-1. Wait a reasonable amount of time for the Agent to collect data from the
+1. Wait a reasonable amount of time for the Agent to collect data from the
    integration service and index it into the correct Elasticsearch data stream.
 1. Query the first 500 documents based on `@timestamp` for validation.
 1. Validate mappings are defined for the fields contained in the indexed documents.
 1. Validate that the JSON data types contained in `_source` are compatible with
-   mappings declared for the field.
+   mappings declared for the field.
 1. Delete test artifacts and tear down the instance of the package's integration service.
 1. Once all desired data streams have been system tested, tear down the Elastic Stack.
 
 ## Limitations
 
 At the moment system tests have limitations. The salient ones are:
-* They can only test packages whose integration services can be deployed via Docker Compose. Eventually they will be able to test packages that can be deployed via other means, e.g. a Terraform configuration.
 * There isn't a way to assert that the indexed data matches data from a file (e.g. golden file testing).
 
 ## Defining a system test
@@ -39,21 +38,38 @@ Packages have a specific folder structure (only relevant parts shown).
   manifest.yml
 ```
 
-To define a system test we must define configuration at two levels: the package level and each data stream's level.
+To define a system test we must define configuration on at least one level: the package level or the data stream level.
 
-### Package-level configuration
-
-First, we must define the configuration for deploying a package's integration service. As mentioned in the [_Limitations_](#Limitations) section above, only packages whose integration services can be deployed via Docker Compose are supported at the moment.
+First, we must define the configuration for deploying a package's integration service. We can define it on either the package level:
 
 ```
 <package root>/
   _dev/
     deploy/
-      docker/
-        docker-compose.yml
+      <service deployer>/
+        <service deployer files>
+```
+
+or the data stream's level:
+
 ```
+<package root>/
+  data_stream/
+    <data stream>/
+      _dev/
+        deploy/
+          <service deployer>/
+            <service deployer files>
+```
+
+`<service deployer>` - the name of a supported service deployer: `docker` (Docker Compose service deployer) or `tf` (Terraform service deployer).
+
+### Docker Compose service deployer
 
-The `docker-compose.yml` file defines the integration service(s) for the package. If your package has a logs data stream, the log files from your package's integration service must be written to a volume. For example, the `apache` package has the following definition in it's integration service's `docker-compose.yml` file.
+When using the Docker Compose service deployer, the `<service deployer files>` must include a `docker-compose.yml` file.
+The `docker-compose.yml` file defines the integration service(s) for the package. If your package has a logs data stream,
+the log files from your package's integration service must be written to a volume. For example, the `apache` package has
+the following definition in its integration service's `docker-compose.yml` file.
 
 ```
 version: '2.3'
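
The hunk ends just as the `apache` compose file begins, so the volume wiring the new text describes isn't visible here. Below is a minimal sketch of the shape such a file takes; the image and port are illustrative assumptions rather than the actual `apache` package definition, while the `${SERVICE_LOGS_DIR}` volume mount follows the pattern the surrounding text describes:

```
version: '2.3'
services:
  apache:
    # Illustrative image; the real package may build or pin its own.
    image: httpd:2.4
    ports:
      - 80
    volumes:
      # ${SERVICE_LOGS_DIR} is provided by the test runner; writing the
      # service's log files into this volume is what makes them readable
      # by the Elastic Agent.
      - ${SERVICE_LOGS_DIR}:/usr/local/apache2/logs
```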
@@ -66,7 +82,43 @@ services:
 
 Here, `SERVICE_LOGS_DIR` is a special keyword. It is something that we will need later.
 
-### Data stream-level configuration
+### Terraform service deployer
+
+When using the Terraform service deployer, the `<service deployer files>` must include at least one `*.tf` file.
+The `*.tf` files define the infrastructure using the Terraform syntax. A Terraform-based service can be handy for booting up
+resources in a selected cloud provider and using them for testing (e.g. observing and collecting metrics).
+
+Sample `main.tf` definition:
+
+```
+variable "TEST_RUN_ID" {
+  default = "detached"
+}
+
+provider "aws" {}
+
+resource "aws_instance" "i" {
+  ami           = data.aws_ami.latest-amzn.id
+  monitoring    = true
+  instance_type = "t1.micro"
+  tags = {
+    Name = "elastic-package-test-${var.TEST_RUN_ID}"
+  }
+}
+
+data "aws_ami" "latest-amzn" {
+  most_recent = true
+  owners      = ["amazon"] # AWS
+  filter {
+    name   = "name"
+    values = ["amzn2-ami-hvm-*"]
+  }
+}
+```
+
+Notice the use of the `TEST_RUN_ID` variable. It contains a unique ID, which can help differentiate resources created in potentially concurrent test runs.
+
+### Test case definition
 
 Next, we must define configuration for each data stream that we want to system test.
 
@@ -97,10 +149,8 @@ The `data_stream.vars` field corresponds to data stream-level variables for the
 
 Notice the use of the `{{SERVICE_LOGS_DIR}}` placeholder. This corresponds to the `${SERVICE_LOGS_DIR}` variable we saw in the `docker-compose.yml` file earlier. In the above example, the net effect is as if the `/usr/local/apache2/logs/access.log*` files located inside the Apache integration service container become available at the same path from Elastic Agent's perspective.
 
-When a data stream's manifest declares multiple streams with different inputs
-you can use the `input` option to select the stream to test. The first stream
-whose input type matches the `input` value will be tested. By default, the first
-stream declared in the manifest will be tested.
+When a data stream's manifest declares multiple streams with different inputs you can use the `input` option to select the stream to test. The first stream
+whose input type matches the `input` value will be tested. By default, the first stream declared in the manifest will be tested.
 
 #### Placeholders
 
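To make the `{{SERVICE_LOGS_DIR}}` placeholder concrete, here is a minimal sketch of the data stream test configuration the hunk above is editing, assuming a hypothetical `access` data stream with a `logfile` input (the path and variable names are illustrative, not taken from the diff):

```
# Hypothetical path: data_stream/access/_dev/test/system/test-default-config.yml
input: logfile
data_stream:
  vars:
    paths:
      - "{{SERVICE_LOGS_DIR}}/access.log*"
```

At run time the placeholder resolves to the directory where the service's log volume is mounted, so the Agent tails the same files the integration service writes.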
@@ -152,4 +202,4 @@ Finally, when you are done running all system tests, bring down the Elastic Stack