Write, package, and deploy a Lambda function using only the AWS CLI. Trigger it via a public URL. Understand what serverless actually means.
By the end of this post you'll have a Python function deployed to AWS Lambda, reachable from the public internet via a Function URL, and you'll understand the basic cost model. About 25 minutes. Stays in the AWS free tier.
You'll need an AWS account and the AWS CLI installed and configured (aws configure).
Serverless does not mean "no servers." There are servers; AWS just doesn't make you provision, patch, or scale them. You upload code, AWS runs it on demand, and you pay per invocation + per millisecond of execution.
The trade: you give up control over the runtime environment (max execution time, memory limits, no persistent disk between invocations) in exchange for never thinking about capacity again. Good fit for event-driven workloads (S3 triggers, scheduled jobs, webhooks, light APIs). Bad fit for sustained high-throughput services or anything with persistent state.
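The cost model is worth a back-of-envelope check. As of writing, on-demand pricing in us-east-1 is roughly $0.20 per million requests and about $0.0000167 per GB-second of compute, with a monthly free tier of 1M requests and 400,000 GB-seconds; treat the figures below as approximate and check the pricing page for your region:

```python
# Back-of-envelope Lambda cost model. Prices are approximate (us-east-1,
# on-demand, as of writing) and ignore the free tier.
PRICE_PER_REQUEST = 0.20 / 1_000_000   # USD per invocation
PRICE_PER_GB_SECOND = 0.0000166667     # USD per GB-second of compute

def monthly_cost(invocations, avg_ms, memory_mb):
    """Estimate monthly request + compute cost in USD, before the free tier."""
    gb_seconds = invocations * (avg_ms / 1000) * (memory_mb / 1024)
    return invocations * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# 1M invocations/month at 100 ms each on 128 MB: ~$0.41 before the free
# tier, and $0.00 after it (12,500 GB-seconds is well under 400,000).
print(round(monthly_cost(1_000_000, 100, 128), 2))  # 0.41
```

The function you build in this post will sit comfortably inside the free tier.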
Three things to know:
1. Your code ships as a .zip (or a container image).
2. AWS unpacks it onto an "execution environment" (a tiny container) and runs it there.
3. Your entry point is a handler, named module.function: here, lambda_function.lambda_handler(event, context).
That's the whole programming model. Everything else is wiring.
mkdir lambda-tutorial && cd lambda-tutorial
Create lambda_function.py:
import json
import datetime

def lambda_handler(event, context):
    name = event.get("queryStringParameters", {}).get("name", "world") if event else "world"
    body = {
        "message": f"Hello, {name}!",
        "time_utc": datetime.datetime.utcnow().isoformat() + "Z",
        "request_id": context.aws_request_id if context else "local",
    }
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

if __name__ == "__main__":
    print(lambda_handler({}, None))
The if __name__ == "__main__" block lets you test locally:
python lambda_function.py
You should see a dict with Hello, world!, current time, and request_id: local. Locally context is None; on Lambda it'll be a real object with the actual request ID.
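You can also simulate the event shape a Function URL will send once deployed. This is a sketch with a heavily trimmed event (real Function URL payloads carry many more fields), and it inlines a copy of the handler so it runs standalone:

```python
import json
import datetime

# Inlined copy of the handler above, so this snippet runs on its own.
def lambda_handler(event, context):
    name = event.get("queryStringParameters", {}).get("name", "world") if event else "world"
    body = {
        "message": f"Hello, {name}!",
        "time_utc": datetime.datetime.utcnow().isoformat() + "Z",
        "request_id": context.aws_request_id if context else "local",
    }
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# A request to <url>?name=ada arrives roughly in this shape (trimmed).
event = {"queryStringParameters": {"name": "ada"}}
response = lambda_handler(event, None)
print(json.loads(response["body"])["message"])  # Hello, ada!
```

This kind of fake-event test catches shape bugs (a missing key, a None where you expected a dict) before you ever deploy.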
Lambda wants a .zip containing your code:
zip function.zip lambda_function.py
For a function with no extra dependencies, that's the whole package. With dependencies (e.g. requests), you'd pip install requests --target . first, then zip -r function.zip . so the libraries sit alongside your code at the zip root.
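If you don't have the zip CLI handy (stock Windows, for instance), the same package can be built in pure Python; a minimal sketch:

```python
import pathlib
import zipfile

# Pure-Python equivalent of `zip function.zip lambda_function.py`.
def package(zip_name, *files):
    with zipfile.ZipFile(zip_name, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            # arcname keeps each file at the zip root: Lambda expects
            # lambda_function.py at the top level, not in a subdirectory.
            zf.write(f, arcname=pathlib.Path(f).name)

# package("function.zip", "lambda_function.py")
```

The arcname detail is the common failure mode: if your handler ends up inside a folder in the zip, Lambda reports "Unable to import module 'lambda_function'".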
Lambda runs as an IAM role. The simplest viable role lets the function write logs to CloudWatch:
cat > trust-policy.json <<EOF
{
"Version": "2012-10-17",
"Statement": [{
"Effect": "Allow",
"Principal": {"Service": "lambda.amazonaws.com"},
"Action": "sts:AssumeRole"
}]
}
EOF
aws iam create-role --role-name lambda-tutorial-role \
--assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name lambda-tutorial-role \
--policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
The first command creates a role Lambda is allowed to assume. The second attaches a managed policy that allows writing CloudWatch logs. Wait ~10 seconds before the next step — IAM is eventually consistent and a freshly-created role isn't immediately usable.
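If you're scripting the whole flow, sleeping a fixed amount is fragile; a blunt retry loop around the next step is more robust. This helper is illustrative (the names and timings are my own, not an AWS API), and the quoted error is the one Lambda typically returns while the role propagates:

```python
import subprocess
import time

# IAM is eventually consistent: for a few seconds after create-role,
# `aws lambda create-function` can fail with "The role defined for the
# function cannot be assumed by Lambda". Retrying papers over that window.
def run_with_retries(cmd, attempts=5, delay=3):
    for _ in range(attempts):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        time.sleep(delay)
    raise RuntimeError(result.stderr)

# run_with_retries(["aws", "lambda", "create-function", ...])
```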
Get your AWS account ID for the next step:
ACCOUNT=$(aws sts get-caller-identity --query Account --output text)
echo $ACCOUNT
aws lambda create-function \
--function-name hello-tutorial \
--runtime python3.12 \
--role arn:aws:iam::${ACCOUNT}:role/lambda-tutorial-role \
--handler lambda_function.lambda_handler \
--zip-file fileb://function.zip
You should get back JSON describing the function. The important field is FunctionArn.
Test it (AWS CLI v2 expects a base64 payload by default, so pass --cli-binary-format raw-in-base64-out to send raw JSON):
aws lambda invoke --function-name hello-tutorial \
  --cli-binary-format raw-in-base64-out \
  --payload '{}' /tmp/out.json
cat /tmp/out.json
You'll see your response. Lambda is running.
aws lambda create-function-url-config \
--function-name hello-tutorial \
--auth-type NONE
This gives you a public URL — no API Gateway needed for this simple case. Copy the FunctionUrl from the output.
Lambda also needs an explicit permission to be invoked anonymously:
aws lambda add-permission \
--function-name hello-tutorial \
--statement-id public-invoke \
--action lambda:InvokeFunctionUrl \
--principal "*" \
--function-url-auth-type NONE
Now hit it:
curl "<your-function-url>?name=ada"
You should see:
{"message": "Hello, ada!", "time_utc": "...", "request_id": "..."}
That's a serverless function reachable from the public internet, in five steps.
Change something in lambda_function.py (e.g. add "version": 2 to the body), then:
zip function.zip lambda_function.py
aws lambda update-function-code --function-name hello-tutorial --zip-file fileb://function.zip
The next request hits the new version. No restart, no provisioning. That's the part teams fall in love with.
aws lambda delete-function-url-config --function-name hello-tutorial
aws lambda delete-function --function-name hello-tutorial
aws iam detach-role-policy --role-name lambda-tutorial-role \
--policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam delete-role --role-name lambda-tutorial-role
Cold starts ignored. The first request after the function has been idle takes longer because Lambda is spinning up a new execution environment. For a tiny Python function it's ~200ms; for a large Java function with heavy startup it's seconds. If user-facing latency matters, use Provisioned Concurrency or a different platform.
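One practical consequence of the execution-environment model: code at module level runs once, at cold start, and warm invocations reuse the result. That's where expensive setup (DB connections, loaded config) belongs. A sketch, with a counter standing in for real work:

```python
import time

# Module-level code runs once per execution environment (at cold start);
# every warm invocation in that environment reuses it.
EXPENSIVE_SETUP = {"loaded_at": time.time()}  # stand-in for DB connections, config, etc.
INVOCATIONS = 0

def lambda_handler(event, context):
    global INVOCATIONS
    INVOCATIONS += 1  # per-invocation work happens here
    # loaded_at stays constant across warm invocations in this environment.
    return {"invocation": INVOCATIONS, "loaded_at": EXPENSIVE_SETUP["loaded_at"]}

first = lambda_handler({}, None)
second = lambda_handler({}, None)
print(first["loaded_at"] == second["loaded_at"], second["invocation"])  # True 2
```

The flip side is the pitfall below: that reused state is a cache, never a source of truth, because AWS recycles environments whenever it likes.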
Storing state on disk. /tmp exists in Lambda but is per-execution-environment and ephemeral. State that needs to survive across invocations goes to S3, DynamoDB, or another store.
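When /tmp is the right tool, the safe pattern is cache-with-rebuild: always be prepared for the file to be gone. A sketch (the path and the fake "slow fetch" are illustrative):

```python
import os
import time

CACHE = "/tmp/hello-tutorial-cache.txt"  # survives warm invocations, NOT cold starts

def get_data():
    # Treat /tmp strictly as a cache: use it if present, rebuild if not.
    if os.path.exists(CACHE):
        with open(CACHE) as f:
            return f.read()
    data = f"fetched-at-{int(time.time())}"  # stand-in for a slow fetch
    with open(CACHE, "w") as f:
        f.write(data)
    return data
```

Anything that must survive a cold start goes to S3 or DynamoDB instead.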
Long-running tasks. Lambda has a 15-minute hard limit. Workloads near that should split into smaller invocations or move to ECS / EKS.
Loose IAM permissions. It's tempting to grant the function * permissions to skip the IAM dance. Don't — the role is your blast radius if the function is ever compromised. Grant exactly what it needs.
You've shipped your first function.
Lambda is one of AWS's best products when used for event-driven work and short-lived tasks. The hardest part of getting started is the IAM dance — once your first function is running, every subsequent one is faster.