How to search across S3 buckets
Amazon S3 has no native search API — every tool lists keys with ListObjectsV2 and filters client-side. The trick is paginating through the full result set instead of stopping at the first 1,000-key page. For small to medium buckets, `aws s3 ls --recursive | grep` is fast enough. For interactive cross-bucket search across providers, S3 Viewer paginates in the background with Tab autocomplete. For buckets with millions of keys, S3 Inventory + Athena is the right pattern.
Step-by-step.
01. In S3 Viewer: type in the search bar

Search runs as you type. Hit Tab to autocomplete the next prefix segment, so finding `data/models/v2/training/checkpoint-42/weights.pt` takes a few Tab presses. S3 Viewer paginates through ListObjectsV2 in the background and filters in-memory across the buckets you have open, including buckets on different providers.
02. Open the result

Hits show the full key path so you can see exactly where each match lives. Click to jump straight to the file.
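The Tab behavior in step 01 boils down to completing the typed text up to the next `/`-delimited segment. A toy sketch of that idea as a pure function over an in-memory key list; this is not how any particular tool implements it, and the keys below are illustrative:

```python
def complete_next_segment(keys, typed):
    """Extend `typed` to the next '/' boundary when the matching
    keys agree on a single next segment; otherwise return it as-is."""
    segments = set()
    for key in keys:
        if key.startswith(typed):
            rest = key[len(typed):]
            seg, sep, _ = rest.partition("/")  # sep is "/" unless this is the last segment
            segments.add(seg + sep)
    return typed + segments.pop() if len(segments) == 1 else typed

keys = [
    "data/models/v2/training/checkpoint-42/weights.pt",
    "data/models/v2/training/checkpoint-42/optimizer.pt",
]
print(complete_next_segment(keys, "data/mo"))
# → data/models/
```

When several keys diverge at the next segment (as with `weights.pt` vs `optimizer.pt` above), there is no unique completion and the input is returned unchanged.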
03. AWS CLI: recursive list + grep

Quick and dirty; works for small to medium buckets.

```
aws s3 ls s3://my-bucket --recursive | grep <query>
```
04. AWS CLI: list-objects-v2 with --prefix

Use `--prefix` to narrow the listing first, then filter the output. Faster on large buckets: the prefix narrowing happens server-side, while the `--query` JMESPath filter still runs client-side, just over far fewer keys.

```
aws s3api list-objects-v2 \
  --bucket my-bucket \
  --prefix logs/2026/ \
  --query 'Contents[?Size > `1000000`]'
```
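The size filter in that `--query` expression is ordinary client-side filtering over the returned pages. The equivalent step in plain Python, over a dict shaped like one ListObjectsV2 response (the keys and sizes below are made up for illustration):

```python
response = {  # shaped like a single ListObjectsV2 page; sample data only
    "Contents": [
        {"Key": "logs/2026/01/app.log", "Size": 2_400_000},
        {"Key": "logs/2026/01/debug.log", "Size": 512},
    ]
}

# Same effect as --query 'Contents[?Size > `1000000`]'
big = [obj for obj in response["Contents"] if obj["Size"] > 1_000_000]
print([obj["Key"] for obj in big])
# → ['logs/2026/01/app.log']
```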
05. For very large buckets: S3 Inventory + Athena

Enable S3 Inventory to get a daily CSV or Parquet manifest of every key, then query it with Athena. This is the right pattern for buckets with millions or billions of objects, where listing them via the API would take hours.
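For a quick look without setting up Athena, the inventory files themselves can be scanned directly. A sketch that filters one CSV inventory chunk for matching keys, assuming the default column order of bucket then key; the rows here are made up, and real chunks are gzipped CSV files referenced from a manifest:

```python
import csv
import io

# Stand-in for one decompressed S3 Inventory CSV chunk (sample rows).
inventory_chunk = io.StringIO(
    '"my-bucket","logs/2026/01/app.log"\n'
    '"my-bucket","data/models/v2/weights.pt"\n'
)

# Column 0 is the bucket name, column 1 the object key.
matches = [row[1] for row in csv.reader(inventory_chunk) if "weights" in row[1]]
print(matches)
# → ['data/models/v2/weights.pt']
```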
What's actually happening.
S3 has no search API — you list keys (ListObjectsV2) and filter client-side. The AWS console's filter operates on the visible page (1,000 keys), which is fine for small buckets but not for deep ones. S3 Viewer paginates through ListObjectsV2 in the background and filters in-memory; results stream in as pages come back. Tab autocompletes the next prefix segment as you type, so navigating deep paths is keyboard-fast. The same pattern works for Cloudflare R2, MinIO, Backblaze B2, and any other S3-compatible provider, since they all implement the same listing API. For multi-million-key buckets, S3 Inventory + Athena is the right pattern.
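That list-and-filter loop is small enough to sketch. A minimal illustration of the pattern: drain the paginated listing by following `NextContinuationToken` until `IsTruncated` goes false, filtering keys as pages arrive. The `fetch_page` callable here is fed hand-built pages shaped like ListObjectsV2 responses; in practice each call would be a real API request:

```python
def list_all_keys(fetch_page):
    """Yield every key from a paginated ListObjectsV2-style listing,
    following NextContinuationToken instead of stopping at page one."""
    token = None
    while True:
        page = fetch_page(token)  # one ListObjectsV2 call per iteration
        yield from (obj["Key"] for obj in page.get("Contents", []))
        if not page.get("IsTruncated"):
            break
        token = page["NextContinuationToken"]

def search(fetch_page, query):
    return [key for key in list_all_keys(fetch_page) if query in key]

# Fake two-page listing standing in for real API responses.
_pages = {
    None: {"Contents": [{"Key": "logs/2026/01/app.log"},
                        {"Key": "data/models/v2/weights.pt"}],
           "IsTruncated": True, "NextContinuationToken": "t1"},
    "t1": {"Contents": [{"Key": "data/models/v1/weights.pt"}],
           "IsTruncated": False},
}
fetch_page = _pages.__getitem__

print(search(fetch_page, "weights"))
# → ['data/models/v2/weights.pt', 'data/models/v1/weights.pt']
```

The console's behavior corresponds to filtering only the first page; the tools discussed above differ mainly in where this loop runs and how the results are presented.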
Common questions.
How do I search for a file in an S3 bucket?
List the keys and filter them: `aws s3 ls --recursive | grep` for small buckets, an interactive tool like S3 Viewer for cross-bucket search, or S3 Inventory + Athena at scale.

Why doesn't the AWS S3 console search find my file?
The console filter only matches the page of keys it has loaded (up to 1,000), so matches deeper in the listing never appear.

Can I search across multiple S3 buckets at once?
Yes. S3 Viewer filters in-memory across every bucket you have open, including buckets on different providers.

How do I search a Cloudflare R2 bucket?
R2 implements the same ListObjectsV2 API as S3, so the same list-and-filter pattern and tools work unchanged.

Is there an S3 search API?
No. Every tool lists keys with ListObjectsV2 and filters client-side; for multi-million-key buckets, use S3 Inventory + Athena.
Skip the CLI. Try it in the browser.
S3 Viewer turns the steps above into a single click. Open source, self-hostable, free for personal use.
More how-tos
- Switch buckets fast: Cmd-K fuzzy switcher across every connected provider, plus AWS CLI named profiles for scripts.
- View S3 + R2 together: one credential per provider, one sidebar, Cmd-K to jump, and right-click to copy across clouds.
- Rename an S3 file: Amazon S3 has no rename API (keys are immutable); the standard pattern is copy + delete, with metadata and tags preserved.