Update 20150527
Update 20120412
My initial answer apparently missed the point, so to clarify:
If you want to do browser-based uploads via a simple HTML form, you are limited to using the POST Object operation, which adds an object to a specified bucket using an HTML form:
POST is an alternate form of PUT that enables browser-based uploads as
a way of putting objects in buckets. Parameters that are passed to PUT
via HTTP Headers are instead passed as form fields to POST in the
multipart/form-data encoded message body. […]
The upload is performed as a single operation here, which consequently rules out pause/resume and limits you to the original maximum object size of 5 GB or less.
This obviously requires a server (e.g. on EC2) to handle the operations triggered via the browser (which also eases access control via S3 Bucket Policies and/or IAM Policies).
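To make the server-side piece concrete, here is a minimal sketch of the signing step, assuming the Signature Version 2 POST policy mechanism; the bucket name, key prefix and credentials are placeholders rather than anything prescribed by the operation:

    <?php
    // Minimal sketch (Signature Version 2 POST policy): the server signs a
    // policy document so the browser can POST the file straight to S3.
    $bucket         = 'my-upload-bucket';   // placeholder
    $awsAccessKeyId = 'AKIA...';            // placeholder
    $awsSecretKey   = 'YOUR-SECRET-KEY';    // placeholder

    // The policy document limits what the signed form may upload.
    $policy = array(
        'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600),
        'conditions' => array(
            array('bucket' => $bucket),
            array('starts-with', '$key', 'uploads/'),
            array('acl' => 'private'),
            // POST is a single operation, hence the 5 GB ceiling.
            array('content-length-range', 0, 5368709120),
        ),
    );

    $policyB64 = base64_encode(json_encode($policy));
    // Signature V2: base64(HMAC-SHA1(secret, base64-encoded policy))
    $signature = base64_encode(hash_hmac('sha1', $policyB64, $awsSecretKey, true));

    // Render these as hidden fields of the multipart/form-data form,
    // next to the 'key', 'acl' and 'file' fields.
    echo json_encode(array(
        'AWSAccessKeyId' => $awsAccessKeyId,
        'policy'         => $policyB64,
        'signature'      => $signature,
    ));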
An alternative might be to use a JavaScript library and perform the upload client-side; see jQuery Upload Progress and AJAX file upload for initial pointers. Unfortunately, there is no canonical JavaScript SDK for AWS available (aws-lib surprisingly doesn't even support S3). Apparently, some forks of knox have added multipart upload, e.g. slakis's fork, but I haven't used either of these for the use case at hand.
Initial answer
If it’s possible to upload [large files] directly to S3, how can I handle
pause/resume?
The AWS SDK for PHP exposes a low-level API that closely resembles the
Amazon S3 REST API for multipart upload (see Using the REST API for
Multipart Upload). Use the low-level API when you need to pause and
resume multipart uploads, vary part sizes during the upload, or do not
know the size of the data in advance. Use the high-level API (see
Using the High-Level PHP API for Multipart Upload) whenever you don’t
have these requirements. [emphasis mine]
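For illustration, a pause/resume flow along those lines might look roughly as follows; this is a sketch against the low-level API of the SDK 1.x line (method names such as initiate_multipart_upload() and list_parts() belong to that version and differ in later SDKs), with bucket, key and file path as hypothetical placeholders:

    <?php
    require_once 'sdk.class.php';

    $s3       = new AmazonS3();
    $bucket   = 'my-upload-bucket';          // hypothetical
    $keyname  = 'backups/huge-file.bin';     // hypothetical
    $filepath = '/tmp/huge-file.bin';        // hypothetical

    // First run: initiate the upload and persist the UploadId somewhere
    // durable; on resume you would load it back instead of initiating again.
    $response  = $s3->initiate_multipart_upload($bucket, $keyname);
    $upload_id = (string) $response->body->UploadId;

    // On resume: ask S3 which parts already arrived, so they can be skipped.
    $uploaded = array();
    foreach ($s3->list_parts($bucket, $keyname, $upload_id)->body->Part as $part) {
        $uploaded[(integer) $part->PartNumber] = true;
    }

    // Split the file into 50 MB chunks (part sizes from 5 MB to 5 GB are valid).
    $parts = $s3->get_multipart_counts(filesize($filepath), 50 * 1024 * 1024);

    foreach ($parts as $i => $part) {
        $part_number = $i + 1;
        if (isset($uploaded[$part_number])) {
            continue; // already on S3 from a previous run
        }
        $s3->upload_part($bucket, $keyname, $upload_id, array(
            'fileUpload' => $filepath,
            'partNumber' => $part_number,
            'seekTo'     => (integer) $part['seekTo'],
            'length'     => (integer) $part['length'],
        ));
    }

    // Once all parts are in place, stitch the object together.
    $s3->complete_multipart_upload($bucket, $keyname, $upload_id,
        $s3->list_parts($bucket, $keyname, $upload_id));

Pausing then simply means stopping after any completed part; since S3 retains the parts under the persisted UploadId, a later run can pick up where the previous one left off.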
Amazon S3 can handle objects from 1 byte all the way up to 5 terabytes (TB); see the respective introductory post Amazon S3 – Object Size Limit Now 5 TB:
[…] Now customers can store extremely
large files as single objects, which greatly simplifies their storage
experience. Amazon S3 does the bookkeeping behind the scenes for our
customers, so you can now GET that large object just like you would
any other Amazon S3 object.
In order to store larger objects you would use the new Multipart Upload API that I blogged about last month to upload the object in parts. […]
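As a side note on the numbers involved (based on the documented multipart limits rather than the quoted post): a multipart upload may comprise at most 10,000 parts of 5 MB to 5 GB each, so actually reaching the 5 TB ceiling requires parts of roughly 500 MB or more (5 TB / 10,000 parts ≈ 525 MB per part).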