libcurl also has AWS auth with --aws-sigv4, which gives you a fully compatible S3 client without installing anything! (You probably already have curl installed.)
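For reference, a SigV4-signed GET with curl looks roughly like this. This is a sketch only: the bucket name, region, and credential variables below are placeholders, and --aws-sigv4 needs curl 7.75 or newer.

    # fetch an object with SigV4 auth (bucket/region/key are made-up placeholders)
    curl --aws-sigv4 "aws:amz:us-east-1:s3" \
         --user "$AWS_ACCESS_KEY_ID:$AWS_SECRET_ACCESS_KEY" \
         "https://my-bucket.s3.us-east-1.amazonaws.com/path/to/object.txt" \
         -o object.txt

An upload is the same idea with --upload-file pointing at the local file.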
akouri
This is awesome! Been waiting for something like this to replace the bloated SDK Amazon provides. Important question: is there a pathway to getting signed URLs?
linotype
This looks slick.
What I would also love to see is a simple, single-binary S3 server alternative to MinIO. Maybe a small built-in UI similar to the DuckDB UI.
everfrustrated
Presumably smaller and quicker because it's not doing any checksumming
dev_l1x_be
for Node.
These are nice projects. I had a few rounds with Rust S3 libraries, and a simple low- or no-dependency client is much needed. The problem is that you start to support certain features (async, HTTP/2, etc.) and your nice no-dep project starts to grow.
> https://raw.githubusercontent.com/good-lly/s3mini/dev/perfor...
It gets slower as the instance gets faster? I'm looking at ops/sec and time/op. How am I misreading this?
tommoor
Interesting project, though it's a little amusing that you announced this before actually confirming it works with AWS?
nodesocket
Somewhat related, I just came across s5cmd[1], which is mainly focused on performance and fast upload/download and sync of S3 buckets.
> 32x faster than s3cmd and 12x faster than aws-cli. For downloads, s5cmd can saturate a 40Gbps link (~4.3 GB/s), whereas s3cmd and aws-cli can only reach 85 MB/s and 375 MB/s respectively.
[1] https://github.com/peak/s5cmd
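To give a sense of the interface, typical s5cmd invocations look something like the following. The bucket and paths are placeholders, not anything from the linked benchmark.

    # copy a whole prefix down in parallel
    s5cmd cp 's3://my-bucket/logs/*' ./logs/
    # mirror a local directory up to the bucket
    s5cmd sync ./build/ s3://my-bucket/build/

As I understand it, the speed comes largely from issuing many requests concurrently rather than transferring one object at a time.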
How does this compare to obstore? [1]
[1] https://developmentseed.org/obstore/latest/
Same as this https://github.com/minio/minio ?
This is good to have. A few months ago I was testing an S3 alternative but ran into issues getting it to work. It turned out AWS had made changes that had the effect of blocking non-first-party clients. Just sheer chance on my end, but I imagine that was infuriating for folks who have to rely on that client. There is an obvious need for a compatible client like this that AWS doesn't manage.
busymom0
Does this allow generating signed URLs for uploads with size limit and name check?
dzonga
This looks dope.
But has anyone done a price comparison of edge computing vs. say your boring Hetzner VPS?
EGreg
You know what would be really awesome? A FUSE-based drop-in replacement for mapping a folder to a bucket, like goofys. Maybe a Node.js process could watch files and back them up, or even better, back the folder without actually taking up space on the local machine (except for a cache).
https://github.com/kahing/goofys
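For anyone who hasn't tried it, mounting a bucket with goofys is a one-liner. Sketch only: bucket name, mountpoint, and endpoint below are placeholders, and credentials come from the usual AWS config or environment.

    # mount a bucket at an existing directory
    goofys my-bucket /mnt/my-bucket
    # non-AWS, S3-compatible endpoints work too
    goofys --endpoint https://s3.example.com my-bucket /mnt/my-bucket

The usual caveat is that it's not fully POSIX-compliant, so it suits sequential read/write workloads better than general filesystem use.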
I found the words used to describe this jarring: to me it makes sense to have an S3 client on my computer, but less so client-side in a web app. On further reading it makes sense, but highlighting what problem this package solves in the first few lines of the README would be valuable, at least for people like me.
yard2010
Tangentially related: Bun has a built-in S3-compatible client. Bun is a gift; if you're using npm, consider making the switch.