docs/v4/console-connector.md (+4 −3)

@@ -2,15 +2,16 @@

## Installation

-You can install @nlpjs/console-connector:
+You can install the console connector @nlpjs/console-connector using:

```bash
npm install @nlpjs/console-connector
```

## Example of use inside NLP.js

-This is a little bit special component. It allows to manage scenarios where the main interface way is the console. You can find an example of use on **`examples/02-qna-classic`**.
+This is a somewhat special component.
+It allows you to manage scenarios where the main interface is the console. You can find an example of use in **`examples/02-qna-classic`**.
docs/v4/language-support.md (+14 −6)

@@ -7,7 +7,7 @@ BERT Support means that the tokenizer and stemmer are supported through a BERT A

Microsoft Builtins mean that the Builtin Entity extraction is supported directly in javascript, while the ones supported by Duckling requires the deployment of a Duckling instance.

-Languages not included in this list can be still supported, but without stemming, only tokenizing. That means less precission, but most of the times can be good enough, as an example you can use it for fantasy languages (at unit tests you'll find tests in klingon from Star Trek).
+Languages not included in this list can still be supported, but without stemming, only tokenizing. That means less precision, but most of the time this can be good enough. As an example, you can use it for fantasy languages (in the unit tests you'll find tests in Klingon from Star Trek).

| Locale | Language | Native Support | BERT Support | Microsoft Builtins | Duckling Builtins | Sentiment |

@@ -270,7 +278,7 @@ Languages not included in this list can be still supported, but without stemming

## Example with several languages

-Example with three languages, where one of the language is klingon, to show that NLP will work even with support of the language, because it will use tokenizer but not stemmers.
+This example uses three languages, one of which is Klingon, to show that NLP will work even without support for the language, because it will use the tokenizer but not the stemmers.
docs/v4/ner-quickstart.md (+11 −10)

@@ -1,7 +1,8 @@
# NER Quick Start

## Install the needed packages

-At the folder where is your node project, install the @nlpjs/basic, @nlpjs/express-api-server and @nlpjs/directline-connector packages.
+In your node project folder, install the @nlpjs/basic, @nlpjs/express-api-server and @nlpjs/directline-connector packages.

```bash
npm i @nlpjs/basic @nlpjs/express-api-server @nlpjs/directline-connector
```

@@ -25,13 +26,13 @@ Create the file _conf.json_ with this content:
}
```

-You'll telling the applicaition to use 4 plugins:
+You are telling the application to use 4 plugins:

- Basic: the basic plugins for an NLP backend, that includes evaluator, javascript compiler, logger, and NLP classes
-- LangEn: the plugin to use english language
-- ExpressApiServer: the plugin to have an API server done with express
+- LangEn: the plugin to use the English language
+- ExpressApiServer: the plugin to have an Express API server
- DirectlineConnector: the plugin that uses the ExpressApiServer to serve an API for the chatbot

-Also this configure the ExpressApiServer to be exposed in the port 3000 and to serve the chatbot frontend (serveBot: true).
+Also, this configures the ExpressApiServer to be exposed at port 3000 and to serve the chatbot frontend (serveBot: true).

Finally, it tells the NLP to import the corpus defined in the file _corpus.json_.
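For reference, a _conf.json_ consistent with that description could look like the following sketch (the exact keys are assumed from the plugins, port, serveBot flag and corpus file named above; the quickstart's full file may differ):

```json
{
  "settings": {
    "nlp": {
      "corpora": ["./corpus.json"]
    },
    "api-server": {
      "port": 3000,
      "serveBot": true
    }
  },
  "use": ["Basic", "LangEn", "ExpressApiServer", "DirectlineConnector"]
}
```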
## Create the corpus.json

@@ -77,13 +78,13 @@ Add the file _corpus.json_ with this content:
```

This creates 2 intents: one to know the real name of a hero and other one to know where the hero lives.
-Also creates the entity to recognize thre heros: spiderman, ironman and thor, and also their synonyms.
+It also creates the entity to recognize the heroes spiderman, ironman and thor, along with their synonyms.

There is a part in the json to tell the NLP to load some contextData that will be used to generate the answers:

```json
"contextData": "./heros.json",
```

-If you take a look at one answer, ```_data[entities.hero.option].city``` as an example, the content at the json _heros.json_ will be accesible at the context as data. Also, the entities are accesible at the property _entities_, so as the entity name is _hero_ you'll have the result from the NER for the entity _hero_ stored at_entities.hero_
+If you take a look at one answer, ```_data[entities.hero.option].city``` for example, the content of the json _heros.json_ will be accessible in the context as _data_. Also, the entities are accessible in the property _entities_, so because the entity name is _hero_ you'll have the result from the NER for the entity _hero_ stored in _entities.hero_.
-This initializes the project and load all the jsons building the structure when you call _dockStart()_ and returns to you a dock for the containers.
+This initializes the project and loads all the jsons. It also builds the structure when you call _dockStart()_ and then returns a dock for the containers.

Then you can retrieve instances from the container, in this case we retrieve the _nlp_ instance to train it.
## Start the application

@@ -135,8 +136,8 @@ Then you can navigate to http://localhost:3000 to use it.

## Stored context

-You'll see that you can ask for information of a hero, but also that if you're talking with the bot about a hero then you can omit the reference to the hero you're talking about.
-This context is stored per conversation, so different conversations have its own context variables.
+You'll see that you can ask for information about a hero, and also that if you're already talking with the bot about a hero you can omit the reference to the hero you're talking about.
+This context is stored per conversation, so different conversations have their own context variables.
docs/v4/neural.md (+7 −7)

@@ -14,7 +14,7 @@ _NeuralNetwork_ is a class of the package _@nlpjs/neural_, that you can install

## Corpus Format

-For training the classifier you need a corpus. The corpus format is an array of objects where each object contains an input and output, where the input is an object with the features and the output is an object with the intents:
+To train the classifier you need a corpus. The corpus format is an array of objects, where each object contains an input and an output: the input is an object with the features and the output is an object for the intent:

```json
[

@@ -47,7 +47,7 @@ For training the classifier you need a corpus. The corpus format is an array of
-_iterations_: maximum number of iterations (epochs) that the neural network can run. By default this is 20000.
-_errorThresh_: minimum error threshold, if the loss is lower than this number, then the training ends. By default this is 0.00005.
--_deltaErrorThresh_: minimum delta error threshold, this is, the difference between the current and the last errors. If the delta error threshold is lower than this number, then the training ends. By default this is 0.000001.
+-_deltaErrorThresh_: minimum delta error threshold, that is, the difference between the current error and the last error. If the delta error is lower than this number, then the training ends. By default this is 0.000001.
-_learningRate_: learning rate for the neural network. By default this is 0.6.
-_momentum_: momentum for the gradient descent optimization. By default this is 0.5.
-_alpha_: Multiplicator or alpha factor for the ReLu activation function. By default this is 0.07.
--_log_: If is *false* then no log happens, if is *true* then there is log in console. Also a function can be provided, and will receive two parameters: the status and the elapsed time of the last epoch. By default this is false.
+-_log_: if *false*, nothing is logged; if *true*, details are logged to the console. You can also provide a function, which will receive two parameters: the status and the elapsed time of the last epoch. By default this is false.
docs/v4/nlu.md (+3 −3)

@@ -10,7 +10,7 @@ You can install @nlpjs/nlu:

## NluNeural

-Class _NluNeural_ is an abstraction built on top of _NeuralNetwork_ that help to use it with a corpus.
+Class _NluNeural_ is an abstraction built on top of _NeuralNetwork_ that helps in using _NeuralNetwork_ with a corpus.
A language can be used as a plugin in order to use the correct tokenizer and stemmer for this language.
In this example both versions, with language and without language, are used in order to compare the results.

@@ -64,7 +64,7 @@ async function measure(useStemmer) {
```

## DomainManager

-_DomainManager_ is the class that is an abstraction on top of _NluNeural_.
+_DomainManager_ is a class abstraction on top of _NluNeural_.
It adds the concept of _domain_, so each intent belongs to one domain; that way we can have domains for _smalltalk_, _human resources_, _claims_, or whatever logical split of intents that we want to have.
Each _DomainManager_ instance has only one language.
It can be trained by domain or all together:

@@ -167,7 +167,7 @@ function addPersonalityDomain(manager) {

## NluManager

_NluManager_ is the abstraction over _DomainManager_: it contains one _DomainManager_ instance per each language that we want to use. It is also able to guess automatically the language of the sentence, so we can provide the locale of the sentence or omit it.

-This is an example with two languages (english and spanish) with two domains each (personality and food).
+This is an example with two languages (English and Spanish) with two domains each (personality and food).