diff --git a/content/tutorial-nlp-from-scratch.md b/content/tutorial-nlp-from-scratch.md
index 68a31d27..e51d6b1d 100644
--- a/content/tutorial-nlp-from-scratch.md
+++ b/content/tutorial-nlp-from-scratch.md
@@ -1051,7 +1051,7 @@ To further enhance and optimize your neural network model, you can consider one
 
 Nowadays, LSTMs have been replaced by the [Transformer](https://jalammar.github.io/illustrated-transformer/), which uses [Attention](https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/) to tackle the problems that plague an LSTM, such as the lack of [transfer learning](https://en.wikipedia.org/wiki/Transfer_learning), the lack of [parallel training](https://web.stanford.edu/~rezab/classes/cme323/S16/projects_reports/hedge_usmani.pdf), and a long gradient chain for lengthy sequences.
 
-Building a neural network from scratch with NumPy is a great way to learn more about NumPy and about deep learning. However, for real-world applications you should use specialized frameworks — such as PyTorch, JAX, TensorFlow or MXNet — that provide NumPy-like APIs, have built-in automatic differentiation and GPU support, and are designed for high-performance numerical computing and machine learning.
+Building a neural network from scratch with NumPy is a great way to learn more about NumPy and about deep learning. However, for real-world applications you should use specialized frameworks — such as PyTorch, JAX or TensorFlow — that provide NumPy-like APIs, have built-in automatic differentiation and GPU support, and are designed for high-performance numerical computing and machine learning.
 
 Finally, to learn more about how ethics come into play when developing a machine learning model, you can refer to the following resources:
 - Data ethics resources by the Turing Institute. https://www.turing.ac.uk/research/data-ethics