AWS || Lambda Functions || CD Steps || CI with GitHub Actions
To set up continuous integration (CI) using GitHub Actions, follow these steps:
- Create a GitHub Repository: Start by creating a repository on GitHub to host your source code. If you already have a repository, you can skip this step.
- Define Workflow: Inside your GitHub repository, create a new directory named .github/workflows. In this directory, create a YAML file (e.g., ci.yml) to define your CI workflow.
- Configure Workflow: Open the YAML file and define the workflow using the GitHub Actions syntax. Specify the trigger event, such as pushes to specific branches or pull requests. You can also configure other event types like schedule or repository dispatch.
- Specify Jobs and Steps: Define one or more jobs within the workflow. Each job represents a set of steps that will be executed. For example, you can have a job for building your application, running tests, and generating code coverage reports.
- Set up Environment: Specify the environment for your CI workflow. This includes the operating system, programming language, and any required dependencies. GitHub Actions provides a variety of pre-configured environments, or you can create a custom environment using Docker containers.
- Define Steps: Within each job, define the steps that need to be executed. These can include cloning the repository, installing dependencies, running commands or scripts, executing tests, and generating artifacts.
- Configure Caching: To optimize the CI workflow, consider caching dependencies or build artifacts between workflow runs. This can significantly speed up subsequent executions by reusing the cached content.
- Add Optional Features: GitHub Actions provides additional features such as parallelism, matrix builds, and environment variables. Explore these features to enhance your CI workflow as per your requirements.
- Commit and Push: Save the YAML file and commit it to your repository. Push the changes to trigger the CI workflow. GitHub Actions will automatically detect the workflow file and start executing the defined steps.
- Monitor Workflow Execution: Monitor the workflow execution from the GitHub Actions tab in your repository. You can view logs, see the status of each step, and troubleshoot any issues that may arise during the CI process.
By following these steps, you can set up continuous integration using GitHub Actions. This enables automatic testing and validation of your codebase whenever changes are pushed to the repository, helping to catch issues early in the development cycle.
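The steps above can be sketched as a minimal workflow file. The file name, branch, and Python toolchain below are illustrative assumptions; adjust them to your project:

```yaml
# .github/workflows/ci.yml -- minimal CI sketch (names are illustrative)
name: CI
on:
  push:
    branches: [main]
  pull_request:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3          # clone the repository
      - uses: actions/setup-python@v4      # set up the language runtime
        with:
          python-version: "3.8"
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests
        run: pytest
```

Committing this file is enough for GitHub Actions to detect it and run the jobs on the configured triggers.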
To set up continuous delivery (CD) using AWS CodePipeline, you can follow these steps:
Define your Application Architecture: Determine the architecture of your application, including the different components and deployment targets (e.g., Amazon EC2 instances, AWS Lambda functions, AWS Elastic Beanstalk, etc.).
Create an IAM Role: Start by creating an IAM role that CodePipeline can use to access and manage AWS resources, such as your source code repository, build environment, and deployment targets. Ensure the role has the necessary permissions for these actions.
Set up your Source Stage: Configure the source provider for your code repository. CodePipeline supports various source providers, including AWS CodeCommit, GitHub, and Bitbucket. Provide the necessary information to connect to your repository, such as the repository name, branch, and authentication details.
Configure your Build Stage: Select the build provider you want to use, such as AWS CodeBuild. Configure the build settings, including the build environment, build specifications, and any additional build options. You can define custom build scripts or use predefined build configurations.
Configure your Test Stage: Set up a stage for testing your application. You can use AWS CodeBuild, AWS CodeDeploy, or any other testing tool that integrates with CodePipeline. Define the necessary tests and configurations to ensure the quality of your application.
Set up Deployment Stages: Create deployment stages for your application. This can include deploying to a staging environment for further testing or deploying to a production environment. Configure the deployment settings based on your chosen deployment targets, such as Amazon EC2, AWS Elastic Beanstalk, or AWS Lambda.
Add Additional Stages: Depending on your CD requirements, you can add more stages to the pipeline. This might include manual approval stages, security and compliance checks, or any other necessary steps in your deployment process.
Configure Notifications: Set up notifications to receive alerts and updates about pipeline execution. CodePipeline can send notifications to Amazon SNS, Amazon Simple Queue Service (SQS), or email, allowing you to stay informed about the status of your pipeline.
Review and Create Pipeline: Review the pipeline configuration and ensure that all stages are correctly set up. Validate the settings, permissions, and integration with the selected services. Once you're satisfied, create the pipeline.
Monitor and Iterate: Monitor the execution of your pipeline, review logs and error messages, and iteratively improve your CD process. Gather feedback, make adjustments, and optimize your pipeline for faster and more reliable deployments.
By following these steps, you can establish a continuous delivery workflow using AWS CodePipeline. This will enable you to automate the build, test, and deployment processes for your application, resulting in faster and more efficient software delivery.
To set up continuous delivery (CD) with AWS CodePipeline from the AWS Management Console, follow these steps:
Create an IAM Role: Start by creating an IAM role that will be used by CodePipeline to access and manage AWS resources. Ensure the role has the necessary permissions to interact with your source code repository, build environment, and deployment targets.
Create a CodePipeline: Go to the AWS Management Console and navigate to CodePipeline. Click on "Create pipeline" to begin the setup process.
Configure Pipeline Settings: Provide a name for your pipeline and select the service role you created in Step 1. Choose whether you want to start with a new pipeline or use a pipeline template. Click on "Next" to proceed.
Set up Source Stage: Select the source provider for your code repository, such as AWS CodeCommit, GitHub, or Bitbucket. Provide the necessary information to connect to your repository, including repository name, branch, and authentication details. CodePipeline will automatically detect changes in your repository and trigger the pipeline accordingly.
Configure Build Stage: Choose the build provider you want to use for your project, such as AWS CodeBuild. Configure the build settings, including the build environment, build specification file location, and any additional build options.
Set up Deployment Stage: Select the deployment provider based on your application's deployment target, such as AWS Elastic Beanstalk, Amazon ECS, or AWS Lambda. Configure the deployment settings, including the target environment, application name, and deployment options.
Add Additional Stages: Depending on your CD requirements, you can add additional stages to the pipeline. These stages can include testing, approval, or any other necessary steps before deploying to production.
Review and Create Pipeline: Review the pipeline configuration to ensure everything is set up correctly. Click on "Create pipeline" to create the pipeline in AWS CodePipeline.
Monitor Pipeline Execution: Once the pipeline is created, CodePipeline will automatically start executing the stages based on the changes in your source repository. You can monitor the pipeline's progress, view logs, and troubleshoot any issues from the CodePipeline console.
Iterate and Improve: CD is an iterative process. Continuously monitor and improve your pipeline by incorporating feedback, automating more processes, and enhancing your testing and deployment strategies.
By following these steps, you can set up a CD workflow using AWS CodePipeline, which will automate the build, test, and deployment processes for your application.
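As a sketch, the same pipeline can also be defined programmatically with boto3's CodePipeline client. Every name and ARN below is hypothetical, and the actual `create_pipeline` call (commented out) needs valid AWS credentials and existing resources:

```python
# Build the pipeline structure expected by codepipeline.create_pipeline.
def build_pipeline(name, role_arn, repo, branch, build_project):
    return {
        "name": name,
        "roleArn": role_arn,
        "artifactStore": {"type": "S3", "location": f"{name}-artifacts"},
        "stages": [
            {
                "name": "Source",
                "actions": [{
                    "name": "Source",
                    "actionTypeId": {"category": "Source", "owner": "AWS",
                                     "provider": "CodeCommit", "version": "1"},
                    "configuration": {"RepositoryName": repo, "BranchName": branch},
                    "outputArtifacts": [{"name": "SourceOutput"}],
                }],
            },
            {
                "name": "Build",
                "actions": [{
                    "name": "Build",
                    "actionTypeId": {"category": "Build", "owner": "AWS",
                                     "provider": "CodeBuild", "version": "1"},
                    "configuration": {"ProjectName": build_project},
                    "inputArtifacts": [{"name": "SourceOutput"}],
                }],
            },
        ],
    }

pipeline = build_pipeline("demo-pipeline",
                          "arn:aws:iam::123456789012:role/pipeline-role",  # hypothetical
                          "demo-repo", "main", "demo-build")
# import boto3
# boto3.client("codepipeline").create_pipeline(pipeline=pipeline)
print([stage["name"] for stage in pipeline["stages"]])  # ['Source', 'Build']
```

Deployment and approval stages would be appended to `stages` in the same shape.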
- Setting an S3 bucket trigger on Lambda (whenever a file is uploaded to the S3 bucket, the event fires)
- Reading file content from the S3 bucket on a Lambda trigger
- Setting cron jobs with scheduled time intervals
- AWS Lambda Layers: a .zip archive added to the bundle to provide extra libraries
- Whitelisting IP addresses
Compute --> Lambda --> Function
1) Author from scratch
2) Use a blueprint
3) Browse serverless app repository.
1) Author from scratch
1.1) Function name
1.2) Runtime -> Java, Python, Ruby
1.3) Permissions (to print logs, the service needs permission; for bucket put & get access -> AWSLambdaExecute)
1.4) Edit code and Test
Setting an S3 bucket trigger on Lambda (whenever a file is uploaded to the S3 bucket, the event fires)
Steps:-
create an S3 bucket
create an IAM role -> select service Lambda -> attach a permission (AWSLambdaExecute)
create a Lambda function -> using 1.1, 1.2, 1.3
Add trigger -> go to the Lambda function, +Add trigger -> All object create events
test -> go to the Lambda function -> Monitoring -> View logs in CloudWatch
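With the trigger in place, S3 delivers an event describing each created object. A minimal handler sketch (the event shape follows S3 notification records; the bucket and key below are fake sample data):

```python
def lambda_handler(event, context):
    # Each record in the event describes one object-created notification
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object: s3://{bucket}/{key}")  # shows up in CloudWatch Logs
        results.append((bucket, key))
    return results

# Local smoke test with a minimal fake S3 event:
sample_event = {"Records": [{"s3": {"bucket": {"name": "my-bucket"},
                                    "object": {"key": "data/file.txt"}}}]}
print(lambda_handler(sample_event, None))  # [('my-bucket', 'data/file.txt')]
```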
How to read file content from S3 on a Lambda trigger.
https://youtu.be/WBgedoH3Vn4
1) check permissions + add permission (AWSLambdaExecute)
2) Modify the code of the lambda_trigger function (import boto3)
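A sketch of step 2: read the object's content inside the trigger handler. In the real function the client comes from `boto3.client("s3")`; the client is passed as a parameter here so the parsing logic can be exercised without AWS:

```python
def read_s3_record(record, s3_client):
    # Pull the bucket/key out of the trigger record and fetch the object body
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    obj = s3_client.get_object(Bucket=bucket, Key=key)
    return obj["Body"].read().decode("utf-8")

def lambda_handler(event, context):
    import boto3  # bundled with the Lambda Python runtime
    s3 = boto3.client("s3")
    return [read_s3_record(record, s3) for record in event.get("Records", [])]
```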
Setting a cron job on a Lambda function:- (a job that runs at a time interval)
1) create a Lambda function
2) Designer page, +Add trigger -> EventBridge (CloudWatch Events)
3) Add rule +select EC2 machine
4) schedule expression: set a time interval, e.g. rate(5 minutes) or
cron(0 17 ? * MON-FRI *) -> 0 minutes, 17 hours, Monday to Friday, every week of the year
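When the scheduled rule fires, EventBridge invokes the function with a scheduled event. A minimal handler sketch (field names follow EventBridge scheduled events; the timestamp is fake sample data):

```python
def lambda_handler(event, context):
    # Scheduled events arrive with source "aws.events" and the fire time
    fired_at = event.get("time")
    source = event.get("source")
    print(f"Scheduled run at {fired_at} from {source}")
    return {"time": fired_at, "source": source}

sample_event = {"time": "2023-01-02T17:00:00Z", "source": "aws.events"}
print(lambda_handler(sample_event, None))
```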
AWS Lambda Layers:- a .zip archive added to the bundle to provide libraries.
1) Create a Lambda function; Additional resources --> Layers, +Add layer -> custom layer
2) imported libraries become available to the function
3) create the layer using a shell script
4) https://youtu.be/pj9svK2nfmk
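Building the layer archive with a shell script is a common approach; a sketch (the `requests` library and layer name are placeholders, and Python-runtime layers must put libraries under a top-level `python/` directory):

```shell
# Sketch: build and publish a Lambda layer (placeholder names; needs pip, zip, AWS CLI)
mkdir -p layer/python
pip install requests -t layer/python          # install libraries into python/
(cd layer && zip -r ../my-layer.zip python)   # python/ must sit at the archive root
aws lambda publish-layer-version --layer-name my-layer \
  --zip-file fileb://my-layer.zip --compatible-runtimes python3.8
```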
To whitelist IP addresses within Lambda
1) Create a Lambda function
2) Click on Configuration, +Add function URL
3) Write code to whitelist the APIs, +Save, +Deploy
4) Lambda function -> ip-address-validates -> edit environment variables
"""
-*- coding: utf-8 -*-
========================
AWS Lambda
========================
Contributor: Chirag Rathod (Srce Cde)
========================
"""
import os
import ast
import json
from ipaddress import ip_network, ip_address
def check_ip(IP_ADDRESS, IP_RANGE):
VALID_IP = False
cidr_blocks = list(filter(lambda element: "/" in element, IP_RANGE))
if cidr_blocks:
for cidr in cidr_blocks:
net = ip_network(cidr)
VALID_IP = ip_address(IP_ADDRESS) in net
if VALID_IP:
break
if not VALID_IP and IP_ADDRESS in IP_RANGE:
VALID_IP = True
return VALID_IP
def return_func(
status_code=200,
message="Invocation successful!",
headers={"Content-Type": "application/json"},
isBase64Encoded=False,
):
return {
"statusCode": status_code,
"headers": headers,
"body": json.dumps({"message": message}),
"isBase64Encoded": isBase64Encoded,
}
def lambda_handler(event, context):
IP_ADDRESS = event["requestContext"]["http"]["sourceIp"]
IP_RANGE = ast.literal_eval(os.environ.get("IP_RANGE", "[]"))
METHOD = event["requestContext"]["http"]["method"]
if not IP_RANGE:
return return_func(status_code=500, message="Unauthorized")
VALID_IP = check_ip(IP_ADDRESS, IP_RANGE)
if not VALID_IP:
return return_func(status_code=500, message="Unauthorized")
if METHOD == "GET":
return return_func(status_code=200, message="GET method invoked!")
if METHOD == "POST":
return return_func(status_code=200, message="POST method invoked!")
return return_func()
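The `check_ip` whitelist logic can be exercised locally; below is a standalone copy of the matching logic for a quick check (sample addresses are illustrative):

```python
from ipaddress import ip_network, ip_address

def check_ip(IP_ADDRESS, IP_RANGE):
    # Standalone copy of the whitelist check: CIDR blocks first, then exact match
    for cidr in (entry for entry in IP_RANGE if "/" in entry):
        if ip_address(IP_ADDRESS) in ip_network(cidr):
            return True
    return IP_ADDRESS in IP_RANGE

print(check_ip("10.0.0.5", ["10.0.0.0/24"]))     # True: inside the CIDR block
print(check_ip("192.168.1.9", ["192.168.1.9"]))  # True: exact match
print(check_ip("8.8.8.8", ["10.0.0.0/24"]))      # False: not whitelisted
```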
AWS Lambda Cheatsheet
This cheatsheet is primarily Python-oriented.
Runtime Versions

| Type | Versions | AWS SDK | Operating System |
|---|---|---|---|
| Node.js | nodejs10.x | (JavaScript) 2.712.0 | Amazon Linux 2 |
| | nodejs12.x | (JavaScript) 2.712.0 | Amazon Linux 2 |
| Java | java11 | (JDK) amazon-corretto-11 | Amazon Linux 2 |
| | java8.al2 | (JDK) amazon-corretto-8 | Amazon Linux 2 |
| | java8 | (JDK) java-1.8.0-openjdk | Amazon Linux |
| Python | python3.8 | (Python) boto3-1.14.40 botocore-1.17.40 | Amazon Linux 2 |
| | python3.7 | (Python) boto3-1.14.40 botocore-1.17.40 | Amazon Linux |
| | python3.6 | (Python) boto3-1.14.40 botocore-1.17.40 | Amazon Linux |
| | python2.7 | (Python) boto3-1.14.40 botocore-1.17.40 | Amazon Linux |
| Ruby | ruby2.7 | (Ruby) 3.0.3 | Amazon Linux 2 |
| | ruby2.5 | (Ruby) 3.0.3 | Amazon Linux |
| .NET Core | dotnetcore3.1 | -- | Amazon Linux 2 |
| | dotnetcore2.1 | -- | Amazon Linux |
| Go | go1.x | -- | Amazon Linux |
| Custom Runtime | provided.al2 | -- | Amazon Linux 2 |
| | provided | -- | Amazon Linux |
Available Operating Systems

| Type | Image | Kernel |
|---|---|---|
| Amazon Linux | amzn-ami-hvm-2018.03.0.20181129-x86_64-gp2 | 4.14.171-105.231.amzn1.x86_64 |
| Amazon Linux 2 | Custom | 4.14.165-102.205.amzn2.x86_64 |
Settings & Limits

| Description | Limit | Can be increased |
|---|---|---|
| Writable path & space | /tmp/ 512 MB | -- |
| Default memory & execution time | 128 MB memory, 3-second timeout | -- |
| Max memory & execution time | 10,240 MB (1 MB increments), 900-second (15-minute) timeout | -- |
| Number of processes and threads (total) | 1,024 | -- |
| Number of file descriptors (total) | 1,024 | -- |
| Maximum deployment package size | 50 MB (zipped, direct upload), 250 MB (unzipped, including layers) | -- |
| Container image code package size | 10 GB | -- |
| Maximum deployment package size for console editor | 3 MB | -- |
| Total size of deployment packages per region | 75 GB | Can be increased up to terabytes |
| Maximum size of environment variables set | 4 KB | -- |
| Maximum function layers | 5 layers | -- |
| Maximum test events (console editor) | 10 | -- |
| Invocation payload limit (request and response) | 6 MB (synchronous), 256 KB (asynchronous) | -- |
| Elastic network interfaces per VPC | 250 | Can be increased up to hundreds |
| Lambda Destinations | -- | Can be increased up to hundreds |
| Monitoring tools | -- | -- |
| VPC | -- | -- |
| Concurrency | -- | Can be increased up to hundreds of thousands |
| DLQ (Dead Letter Queue) | -- | -- |
| Throttle | -- | -- |
| File system | -- | -- |
| State machines | -- | -- |
| Database proxies | -- | -- |
| Execution Role (Common Execution Roles Available) | |
|---|---|
| AWSLambdaBasicExecutionRole | Grants permissions only for the Amazon CloudWatch Logs actions to write logs. |
| AWSLambdaKinesisExecutionRole | Grants permissions for Amazon Kinesis Streams actions, and CloudWatch Logs actions. |
| AWSLambdaDynamoDBExecutionRole | Grants permissions for DynamoDB streams actions and CloudWatch Logs actions. |
| AWSLambdaVPCAccessExecutionRole | Grants permissions for Amazon Elastic Compute Cloud (Amazon EC2) actions to manage elastic network interfaces (ENIs). |
| AWSXrayWriteOnlyAccess | Grants permission for X-Ray to upload trace data for debugging and analysis. |
Add new permission

```python
import boto3

client = boto3.client('lambda')

# The function ARN is shown at the top right corner of the Lambda console
response = client.add_permission(
    FunctionName='string',
    StatementId='string',
    Action='string',
    Principal='string',
    SourceArn='string',
    SourceAccount='string',
    EventSourceToken='string',
    Qualifier='string'
)
```
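A filled-in sketch of the same call, granting S3 permission to invoke a function (all names are hypothetical; the commented-out call needs valid AWS credentials):

```python
# Hypothetical values for add_permission: let S3 invoke the function
permission_kwargs = dict(
    FunctionName="my-function",           # hypothetical function name
    StatementId="s3-invoke-1",            # unique statement id within the policy
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::my-bucket",   # hypothetical bucket ARN
)
# import boto3
# boto3.client("lambda").add_permission(**permission_kwargs)
print(permission_kwargs["Action"])  # lambda:InvokeFunction
```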
| Execution / Invoke Tweaks | |
|---|---|
| A Lambda can invoke another Lambda | Yes |
| A Lambda in one region can invoke a Lambda in another region | Yes |
| A Lambda can invoke itself | Yes |
| Exceed the 15-minute execution time | Yes (can tweak around) |
| How to exceed the execution time limit | Self-invoke, SNS, SQS |
| Asynchronous execution | Yes (async exec) |
| Invoke the same Lambda with a different version | Yes |
| Set the Lambda invoke max retry attempts to 0 | Yes |
| Triggers | Description | Requirement |
|---|---|---|
| API Gateway | Trigger AWS Lambda function over HTTPS | API Endpoint name API Endpoint Deployment Stage Security Role |
| AWS IoT | Trigger AWS Lambda for performing specific action by mapping your AWS IoT Dash Button (Cloud Programmable Dash Button) | DSN (Device Serial Number) |
| Alexa Skill Kit | Trigger AWS Lambda to build services that give new skills to Alexa | -- |
| Alexa Smart Home | Trigger AWS Lambda with desired skill | Application ID (Skill) |
| Application Load Balancer | Trigger AWS Lambda from ALB | Application Load Balancer Listener (the port on which the ALB receives traffic) Host Path |
| CloudFront | Trigger AWS Lambda based on different CloudFront events. | CloudFront distribution, Cache behaviour, CloudFront event (Origin request/response, Viewer request/response). To set a CloudFront trigger, one needs to publish a version of the Lambda. Limitations: Runtime is limited to Node.js 6.10; /tmp/ space is not available; Environment variables, DLQ & Amazon VPCs cannot be used |
| CloudWatch Events | Trigger AWS Lambda on a desired time interval (rate(1 day)) or on a state change of EC2, RDS, S3, Health. | A rule based on either a Schedule Expression (time interval) or an Event Pattern (events like instance launch and terminate, Auto Scaling, an AWS API call via CloudTrail) |
| CloudWatch Logs | Trigger AWS Lambda based on the CloudWatch Logs | Log Group Name |
| Code Commit | Trigger AWS Lambda based on the AWS CodeCommit version control system | Repository Name Event Type |
| Cognito Sync Trigger | Trigger AWS Lambda in response to event, each time the dataset is synchronized | Cognito Identity Pool dataset |
| DynamoDB | Trigger AWS Lambda whenever the DynamoDB table is updated | DynamoDB Table name Batch Size (the largest number of records that AWS Lambda will retrieve from your table at the time of invoking your function; your function receives an event with all the retrieved records) |
| Kinesis | Trigger AWS Lambda whenever the Kinesis stream is updated | Kinesis Stream Batch Size |
| S3 | Trigger AWS Lambda in response to file dropped in S3 bucket | Bucket Name Event Type (Object Removed, Object Created) |
| SNS | Trigger AWS Lambda whenever the message is published to Amazon SNS Topic | SNS Topic |
| SQS | Trigger AWS Lambda on message arrival in SQS | SQS queue Batch size Limitation: It only works with Standard queue and not FIFO queue |
| Troubleshooting | ||
|---|---|---|
| Error | Possible Reason | Solution |
| File "/var/task/lambda_function.py", line 2, in lambda_handler return event['demoevent'] KeyError: 'demoevent' | Event does not have the key 'demoevent' or either misspelled | Make sure the event is getting the desired key if it is receiving the event from any trigger. Or if the not outside event is passed than check for misspell. Or check the event list by printing event. |
| botocore.exceptions.ClientError: An error occurred (AccessDeniedException) when calling the GetParameters operation: User: arn:aws:dummy: | Lacks Permission to access | Assign appropriate permission for accessibility |
| ImportError: Missing required dependencies ['module'] | A dependent module is missing | Install/upload the required module |
| sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not translate host name "host.dummy.region.rds.amazonaws.com" to address: Name or service not known | RDS Host is unavailable | Make sure the RDS instance is up and running. Double check the RDS hostname |
| [Errno 32] Broken pipe | The connection was lost (either on your side or possibly on AWS), or, when invoking another Lambda, the payload size exceeded the limit | Make sure you are passing a payload of the right size. Check the connection |
| Unable to import module 'lambda_function/index' No module named 'lambda_function' | The handler configuration does not match the main file name | Update the handler configuration to filename.function_name |
| OperationalError: (psycopg2.OperationalError) terminating connection due to administrator command SSL connection has been closed unexpectedly | RDS/Database System has been rebooted. In a typical web application using an ORM (SQLAlchemy) Session, the above condition would correspond to a single request failing with a 500 error, then the web application continuing normally beyond that. Hence the approach is “optimistic” in that frequent database restarts are not anticipated. | Give second try |
| Error code 429 | The function is throttled: the reserved concurrency is set to zero, or the account-level throttle has been reached. (A synchronously invoked function returns a 429 error when throttled; an asynchronously invoked function retries the throttled event for up to 6 hours.) | Check the reserved concurrency limit or throttle status for the individual function, or check the account-level concurrent execution limit |
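For the KeyError row above, a defensive access pattern avoids the crash when a key may be absent from the event (a sketch; the key name is taken from the error example):

```python
def lambda_handler(event, context):
    # dict.get returns a fallback instead of raising KeyError
    return event.get("demoevent", "missing")

print(lambda_handler({"demoevent": "hello"}, None))  # hello
print(lambda_handler({}, None))                      # missing
```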
AWS Lambda CLI commands
Add Permission
Adds the specified permission to the Lambda function's resource policy
Syntax
add-permission
--function-name <value>
--statement-id <value>
--action <value>
--principal <value>
[--source-arn <value>]
[--source-account <value>]
[--event-source-token <value>]
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
add-permission --function-name functionName --statement-id role-statement-id --action lambda:CreateFunction --principal s3.amazonaws.com
Create Alias
Creates an alias for the given Lambda function
Syntax
create-alias
--function-name <value>
--name <value>
--function-version <value>
[--description <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
create-alias --function-name functionName --name aliasName --function-version version
Create Event Source Mapping
Maps an event source (an Amazon Kinesis stream or an Amazon DynamoDB stream) to the Lambda function
Syntax
create-event-source-mapping
--event-source-arn <value>
--function-name <value>
[--enabled | --no-enabled]
[--batch-size <value>]
--starting-position <value>
[--starting-position-timestamp <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
create-event-source-mapping --event-source-arn arn:aws:kinesis:us-west-1:1111 --function-name functionName --starting-position LATEST
Create Function
Creates a new Lambda function
Syntax
create-function
--function-name <value>
--runtime <value>
--role <value>
--handler <value>
[--code <value>]
[--description <value>]
[--timeout <value>]
[--memory-size <value>]
[--publish | --no-publish]
[--vpc-config <value>]
[--dead-letter-config <value>]
[--environment <value>]
[--kms-key-arn <value>]
[--tracing-config <value>]
[--tags <value>]
[--zip-file <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
create-function --function-name functionName --runtime python3.6 --role arn:aws:iam::account-id:role/lambda_basic_execution
--handler main.handler
Delete Alias
It deletes the alias
Syntax
delete-alias
--function-name <value>
--name <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
delete-alias --function-name functionName --name aliasName
Delete Event Source Mapping
It deletes the event source mapping
Syntax
delete-event-source-mapping
--uuid <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
delete-event-source-mapping --uuid 12345kxodurf3443
Delete Function
It will delete the function and all the associated settings
Syntax
delete-function
--function-name <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
delete-function --function-name FunctionName
Get Account Settings
It will fetch the user’s account settings
Syntax
get-account-settings
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Get Alias
Returns the desired alias information, such as description and ARN
Syntax
get-alias
--function-name <value>
--name <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
get-alias --function-name functionName --name aliasName
Get Event Source Mapping
It returns the config information for the desired event source mapping
Syntax
get-event-source-mapping
--uuid <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
get-event-source-mapping --uuid 12345kxodurf3443
Get Function
It returns the Lambda Function information
Syntax
get-function
--function-name <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
get-function --function-name functionName
Get Function Configuration
It returns the Lambda function configuration
Syntax
get-function-configuration
--function-name <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
get-function-configuration --function-name functionName
Get Policy
Returns the policy linked to the Lambda function
Syntax
get-policy
--function-name <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
get-policy --function-name functionName
Invoke
Invokes the specified Lambda function
Syntax
invoke
--function-name <value>
[--invocation-type <value>]
[--log-type <value>]
[--client-context <value>]
[--payload <value>]
[--qualifier <value>]
Example
invoke --function-name functionName
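A fuller invoke sketch with a payload and captured response (the function name is a placeholder; running it requires configured AWS credentials):

```shell
aws lambda invoke \
  --function-name functionName \
  --invocation-type RequestResponse \
  --payload '{"key": "value"}' \
  response.json
cat response.json   # the function's return value
```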
List Aliases
Returns all the aliases created for the Lambda function
Syntax
list-aliases
--function-name <value>
[--function-version <value>]
[--marker <value>]
[--max-items <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
list-aliases --function-name functionName
List Event Source Mappings
Returns all the event source mappings created with create-event-source-mapping
Syntax
list-event-source-mappings
[--event-source-arn <value>]
[--function-name <value>]
[--max-items <value>]
[--cli-input-json <value>]
[--starting-token <value>]
[--page-size <value>]
[--generate-cli-skeleton <value>]
Example
list-event-source-mappings --event-source-arn arn:aws:arn --function-name functionName
List Functions
Returns all the Lambda functions
Syntax
list-functions
[--master-region <value>]
[--function-version <value>]
[--max-items <value>]
[--cli-input-json <value>]
[--starting-token <value>]
[--page-size <value>]
[--generate-cli-skeleton <value>]
Example
list-functions --master-region us-west-1 --function-version ALL
List Tags
Returns the list of tags assigned to the Lambda function
Syntax
list-tags
--resource <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
list-tags --resource arn:aws:function
List Versions by functions
Returns all the versions of the desired Lambda function
Syntax
list-versions-by-function
--function-name <value>
[--marker <value>]
[--max-items <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
list-versions-by-function --function-name functionName
Publish Version
Publishes a version of the Lambda function from the $LATEST snapshot
Syntax
publish-version
--function-name <value>
[--code-sha-256 <value>]
[--description <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
publish-version --function-name functionName
Remove Permission
Removes a single permission from the policy linked to the Lambda function
Syntax
remove-permission
--function-name <value>
--statement-id <value>
[--qualifier <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
remove-permission --function-name functionName --statement-id role-statement-id
Tag Resource
Creates tags for the Lambda function as key-value pairs
Syntax
tag-resource
--resource <value>
--tags <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
tag-resource --resource arn:aws:arn --tags key=pair
Untag Resource
Removes tags from the Lambda function
Syntax
untag-resource
--resource <value>
--tag-keys <value>
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
untag-resource --resource arn:aws:complete --tag-keys key1 key2
Update Alias
Updates the alias of the desired Lambda function
Syntax
update-alias
--function-name <value>
--name <value>
[--function-version <value>]
[--description <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
update-alias --function-name functionName --name aliasName
Update Event Source Mapping
Updates the event source mapping in case you want to change the existing parameters
Syntax
update-event-source-mapping
--uuid <value>
[--function-name <value>]
[--enabled | --no-enabled]
[--batch-size <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
update-event-source-mapping --uuid 12345kxodurf3443
Update Function Code
It updates the code of the desired Lambda function
Syntax
update-function-code
--function-name <value>
[--zip-file <value>]
[--s3-bucket <value>]
[--s3-key <value>]
[--s3-object-version <value>]
[--publish | --no-publish]
[--dry-run | --no-dry-run]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
update-function-code --function-name functionName
Update Function Configuration
It updates the configuration of the desired Lambda function
Syntax
update-function-configuration
--function-name <value>
[--role <value>]
[--handler <value>]
[--description <value>]
[--timeout <value>]
[--memory-size <value>]
[--vpc-config <value>]
[--environment <value>]
[--runtime <value>]
[--dead-letter-config <value>]
[--kms-key-arn <value>]
[--tracing-config <value>]
[--cli-input-json <value>]
[--generate-cli-skeleton <value>]
Example
update-function-configuration --function-name functionName