Upload progress statistics using AWS S3 createMultipartUpload (AWS PHP SDK v3)

Problem

Currently I’m using the AWS S3 createMultipartUpload function to upload large files (videos) to AWS S3 storage. This works like a charm.

However, because uploading a video to AWS S3 Storage takes a long time, I’m trying to find a way to retrieve upload progress statistics (e.g. uploadedBytes and totalBytesToUpload) so I can calculate the time remaining until a video upload has completed.

I’ve searched through dozens of Stack Overflow questions and blog posts on how to retrieve these statistics in real time while a file is uploading, but haven’t found a working solution.

What have I tried?

Original code

$result = $this->s3Client->createMultipartUpload([
    'Bucket'       => $this->bucket,
    'Key'          => $path,
    'StorageClass' => 'STANDARD',
    'ACL'          => 'public-read',
    'Metadata'     => $metaDataValues
]);

Code using @http progress (GuzzleHttp) (found on https://stackoverflow.com/a/43700547/4686900)

$result = $this->s3Client->createMultipartUpload([
    'Bucket'       => $this->bucket,
    'Key'          => $path,
    'StorageClass' => 'STANDARD',
    'ACL'          => 'public-read',
    'Metadata'     => $metaDataValues,
    '@http'        => [
        'progress' => function ($downloadTotalSize, $downloadSizeSoFar, $uploadTotalSize, $uploadSizeSoFar) {
            $this->loggerService->logError("down [" . $downloadSizeSoFar . "]/[" . $downloadTotalSize . "] up [" . $uploadSizeSoFar . "]/[" . $uploadTotalSize . "]");
        }
    ]
]);

This code logs the following output after uploading a video to AWS S3 Storage:

[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [0]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [435]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [435]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [435]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [435]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [435]/[0] up [0]/[0] [] []
[2020-09-29 16:29:45] customLog.ERROR: down [435]/[0] up [0]/[0] [] []

In the logs you can see that the up stats stay at [0]/[0], so no upload progress is ever reported.
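
As far as I understand, createMultipartUpload only initiates the multipart upload (the ~435 bytes coming down are presumably the XML response containing the UploadId) and doesn’t transfer any file data itself, so there may simply be nothing for the callback to report at that point. One idea I haven’t managed to verify yet is attaching the same @http progress option to each uploadPart call in the loop further down (see the complete code below), since those are the requests that actually carry the file bytes. A sketch of that idea, assuming the progress option behaves the same on uploadPart as it does on createMultipartUpload, and that the reported sizes would then refer to the current part rather than the whole file:

$result = $this->s3Client->uploadPart([
    'Bucket'     => $this->bucket,
    'Key'        => $path,
    'UploadId'   => $uploadId,
    'PartNumber' => $partNumber,
    'Body'       => fread($file, 500 * 1024 * 1024),
    '@http'      => [
        // Assumption: per-command @http options also apply to uploadPart;
        // the callback would then report progress for this part only.
        'progress' => function ($downloadTotalSize, $downloadSizeSoFar, $uploadTotalSize, $uploadSizeSoFar) {
            $this->loggerService->logError("part up [" . $uploadSizeSoFar . "]/[" . $uploadTotalSize . "]");
        }
    ]
]);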

I’ve also tried to think of other solutions, such as calculating uploadedBytes with a JavaScript AJAX XHR progress handler, see code:

$.ajax({
    // ...
    xhr: function () {
        let xhr = new window.XMLHttpRequest();
        xhr.upload.addEventListener("progress", function (evt) {
            if (evt.lengthComputable) {
                let percentComplete = (evt.loaded / evt.total) * 100;
                console.log(percentComplete);
            }
        }, false);
        return xhr;
    }
    // ...
});

This code, however, only captures the number of bytes transferred between my local machine and our own server, not between my local environment and the AWS S3 Storage server.
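
The only other direction I can think of is to have the server itself publish how far its S3 upload has got, so the existing AJAX code could poll an endpoint for it instead of relying on XHR upload events. A rough sketch of the server-side half (the ProgressStore class and its store() method are hypothetical, purely to illustrate persisting the counter somewhere another request can read it, e.g. a cache, Redis or the database):

// Hypothetical helper: persists upload progress so a separate HTTP request
// (an AJAX poll from the browser) can read it back later.
class ProgressStore
{
    public function store(string $uploadKey, int $uploadedBytes, int $totalBytes): void
    {
        // For illustration only: a temp file keyed by the upload path.
        // In practice this would probably be Redis/Memcached or a DB row.
        file_put_contents(
            sys_get_temp_dir() . '/upload-progress-' . md5($uploadKey) . '.json',
            json_encode(['uploaded' => $uploadedBytes, 'total' => $totalBytes])
        );
    }
}

The upload loop would then call something like $this->progressStore->store($path, $uploadedBytes, $totalBytes) after each part, but that still depends on getting reliable uploadedBytes figures in the first place.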

Complete code

public function uploadMultipart($file, $path, $metaDataValues = [])
{
        ini_set('max_execution_time', 0);

        $parts = array();
        $result = $this->s3Client->createMultipartUpload([
            'Bucket'       => $this->bucket,
            'Key'          => $path,
            'StorageClass' => 'STANDARD',
            'ACL'          => 'public-read',
            'Metadata'     => $metaDataValues,
            '@http' => [
                'progress' => function ($downloadTotalSize, $downloadSizeSoFar, $uploadTotalSize, $uploadSizeSoFar) {
                    $this->loggerService->logError("down [" . $downloadSizeSoFar ."]/[".$downloadTotalSize."] up [".$uploadSizeSoFar."]/[".$uploadTotalSize."]");
                }
            ]
        ]);

        $uploadId = $result['UploadId'];

        // Upload the file in parts.
        try
        {
            $file = fopen($file, 'r');
            $partNumber = 1;

            while (!feof($file))
            {
                $result = $this->s3Client->uploadPart([
                    'Bucket'     => $this->bucket,
                    'Key'        => $path,
                    'UploadId'   => $uploadId,
                    'PartNumber' => $partNumber,
                    'Body'       => fread($file, 500 * 1024 * 1024),
                ]);

                $parts['Parts'][$partNumber] = [
                    'PartNumber' => $partNumber,
                    'ETag' => $result['ETag'],
                ];

                $partNumber++;
            }
            fclose($file);
        }
        catch (S3Exception $e) {
            $this->s3Client->abortMultipartUpload([
                'Bucket'   => $this->bucket,
                'Key'      => $path,
                'UploadId' => $uploadId
            ]);

            $this->logger->error("Error multipart uploading: " . $e->getMessage() . " on " . $e->getLine() . " for " . $e->getFile());

            // Bail out here: completing a multipart upload that has just been
            // aborted would only fail again.
            return null;
        }

        // Complete the multipart upload.
        $result = $this->s3Client->completeMultipartUpload([
            'Bucket'   => $this->bucket,
            'Key'      => $path,
            'UploadId' => $uploadId,
            'MultipartUpload' => $parts,
        ]);

        return $result['Location'];
}
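
If the callback route turns out to be a dead end, I could probably at least approximate the statistics by counting bytes myself around the uploadPart loop above, since the total file size is known up front. A stripped-down sketch of that idea (the ETag/parts bookkeeping and error handling from the full method are omitted; it assumes uploadPart blocks until the part is fully transferred, so progress would only advance in 500 MB steps unless I pick a smaller part size):

$totalBytes    = filesize($file);   // total bytes to upload, known before starting
$uploadedBytes = 0;

$handle     = fopen($file, 'r');
$partNumber = 1;

while (!feof($handle)) {
    $body = fread($handle, 500 * 1024 * 1024);

    $this->s3Client->uploadPart([
        'Bucket'     => $this->bucket,
        'Key'        => $path,
        'UploadId'   => $uploadId,
        'PartNumber' => $partNumber,
        'Body'       => $body,
    ]);

    // Progress only advances once per completed part, not continuously.
    $uploadedBytes += strlen($body);
    $this->loggerService->logError("up [" . $uploadedBytes . "]/[" . $totalBytes . "]");

    $partNumber++;
}
fclose($handle);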

Question

How can I retrieve the number of uploaded bytes and the total number of bytes to upload between my local system and the AWS S3 Storage server, using the S3 createMultipartUpload function or an alternative approach?

p.s. I’m using PHP 7.2.20, Symfony 3.4 & AWS SDK for PHP v3

Source: Symfony Questions
