diff --git a/README.md b/README.md
index 285a97b4..7baccd3e 100644
--- a/README.md
+++ b/README.md
@@ -176,7 +176,7 @@ $ npx serverless invoke -f hello -d '{"foo":"bar"}'

 #### Docker

-Alternatively, you can build a Rust-based Lambda function in a [docker mirror of the AWS Lambda provided runtime with the Rust toolchain preinstalled](https://github.com/softprops/lambda-rust).
+Alternatively, you can build a Rust-based Lambda function in a [docker mirror of the AWS Lambda provided runtime with the Rust toolchain preinstalled](https://github.com/rustserverless/lambda-rust).

 Running the following command will start an ephemeral docker container that builds your Rust application and produces a zip file, with its binary auto-renamed to `bootstrap` to meet AWS Lambda's expectations, under `target/lambda_runtime/release/{your-binary-name}.zip`. Typically `{your-binary-name}` is just the name of your crate if you are using the default cargo binary (i.e. `main.rs`).

@@ -186,7 +186,7 @@ $ docker run --rm \
   -v ${PWD}:/code \
   -v ${HOME}/.cargo/registry:/root/.cargo/registry \
   -v ${HOME}/.cargo/git:/root/.cargo/git \
-  softprops/lambda-rust
+  rustserverless/lambda-rust
 ```

 With your application built and packaged, it's ready to ship to production. You can also invoke it locally to verify its behavior using the [lambci `:provided` docker container](https://hub.docker.com/r/lambci/lambda/), which is also a mirror of the AWS Lambda provided runtime, with build dependencies omitted.
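
For reference, a local invocation against the packaged zip might look like the sketch below. This is illustrative rather than definitive: the unzip destination `/tmp/lambda`, the sample event, and the `handler` placeholder argument are assumptions, so check the lambci/lambda documentation for the exact invocation supported by your image version.

```sh
# unpack the packaged binary so the container can mount it at /var/task
$ unzip -o target/lambda_runtime/release/{your-binary-name}.zip -d /tmp/lambda

# invoke the :provided runtime mirror with a sample event
$ docker run --rm \
  -v /tmp/lambda:/var/task \
  lambci/lambda:provided \
  handler '{"foo":"bar"}'
```

The `:provided` image executes the `bootstrap` binary it finds in `/var/task`, so the response printed to stdout should match what you would see from `serverless invoke` against the deployed function.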