Automatically Uploading Local Files to Amazon S3 with Multithreaded Asynchronous Transfers
I recently put together a small tool that uses the AWS SDK to watch a local directory and automatically sync its files to S3. Uploads run asynchronously across multiple threads, and large files are split into chunks for upload.
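The fan-out pattern behind the multithreaded upload can be sketched as follows (in Python for brevity; the `upload_one` function is a stand-in for the real SDK call and the names here are illustrative, not part of the tool):

```python
import concurrent.futures

def upload_one(path):
    # Stand-in for the real S3 PUT; it only returns a status tuple
    # so the fan-out/collect pattern is visible on its own.
    return (path, "uploaded")

def upload_all(paths, workers=4):
    # Submit every file to a small thread pool and collect results
    # as the individual uploads complete, in whatever order they finish.
    results = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(upload_one, p): p for p in paths}
        for fut in concurrent.futures.as_completed(futures):
            path, status = fut.result()
            results[path] = status
    return results

print(upload_all(["a.txt", "b.txt", "c.txt"]))
```

In the real tool each worker would call the SDK's put/upload routine; the pool size bounds how many transfers run at once.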

References:
https://www.codeproject.com/Articles/131678/Amazon-S-Sync
https://aws.amazon.com/cn/documentation/s3/
Introduction
The SprightlySoft S3 Sync application allows you to take a folder on your computer and upload it to Amazon S3. You can make additions, deletions, and changes to your local files, and the next time you run the application, it will detect these changes and apply them to S3. This program allows you to create a mirror of a local folder on S3 and always keep it up to date.
Background
Amazon Simple Storage Service (Amazon S3) is a service that allows you to store files in Amazon's cloud computing environment. When your files are in Amazon's system, you can retrieve the files from anywhere on the web. You can also use Amazon's CloudFront service in conjunction with S3 to distribute your files to millions of people. Amazon S3 is the same highly scalable, reliable, secure, fast, inexpensive infrastructure that Amazon uses to run its own global network of web sites. Best of all, Amazon S3 is free for the first 5 GB of storage.
SprightlySoft S3 Sync uses the free SprightlySoft AWS component for .NET to interact with Amazon S3. In the code, you will see an example of uploading files to S3, deleting files from S3, listing files, and getting the properties of files.
Using the code
The SprightlySoft S3 Sync program works by listing your files on S3, listing your files locally, and comparing the differences between the two lists. Here is the logic of the program:
- The code starts by getting a list of all user settings. These include your Amazon AWS credentials, the S3 bucket you are syncing to, and the local folder you are syncing from.
- The next major call is the PopulateS3HashTable function. Here, all the files in your Amazon S3 bucket are listed. The code calls the ListBucket function from the SprightlySoft AWS component. This function returns an ArrayList of objects that represent each item in S3. Properties of each object include the S3 key name, size, date modified, and ETag. These objects are added to a Hashtable so they can be used later on.
private static Boolean PopulateS3HashTable(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
    String BucketName, String UploadPrefix,
    ref System.Collections.Hashtable S3HashTable)
{
    Boolean RetBool;

    SprightlySoftAWS.S3.ListBucket MyListBucket =
        new SprightlySoftAWS.S3.ListBucket();

    RetBool = MyListBucket.ListBucket(UseSSL, RequestEndpoint, BucketName,
        "", UploadPrefix, AWSAccessKeyId, AWSSecretAccessKey);

    if (RetBool == true)
    {
        foreach (SprightlySoftAWS.S3.ListBucket.BucketItemObject
            MyBucketItemObject in MyListBucket.BucketItemsArrayList)
        {
            WriteToLog(" Item added to S3 list. KeyName=" +
                MyBucketItemObject.KeyName, 2);
            S3HashTable.Add(MyBucketItemObject.KeyName, MyBucketItemObject);
        }
    }
    else
    {
        WriteToLog(" Error listing files on S3. ErrorDescription=" +
            MyListBucket.ErrorDescription);
        WriteToLog(MyListBucket.LogData, 3);
    }

    return RetBool;
}
- The next call is the PopulateLocalArrayList function. This function lists all local files and folders you want to synchronize. Each file and folder is added to an ArrayList so it can be used later on. The function has the option to include only certain files or exclude certain files. The function is recursive; that means it calls itself for each sub folder.
private static void PopulateLocalArrayList(String BaseFolder,
    String CurrentFolder, Boolean IncludeSubFolders,
    System.Collections.ArrayList ExcludeFolderslArrayList,
    String IncludeOnlyFilesRegularExpression,
    String ExcludeFilesRegularExpression,
    ref System.Collections.ArrayList LocalArrayList)
{
    System.IO.DirectoryInfo CurrentDirectoryInfo;
    CurrentDirectoryInfo = new System.IO.DirectoryInfo(CurrentFolder);

    String FolderName;

    foreach (System.IO.FileInfo MyFileInfo in CurrentDirectoryInfo.GetFiles())
    {
        if (ExcludeFilesRegularExpression != "")
        {
            //check if the file is excluded
            if (System.Text.RegularExpressions.Regex.IsMatch(
                MyFileInfo.FullName, ExcludeFilesRegularExpression,
                System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
            {
                WriteToLog(" Local file excluded by " +
                    "ExcludeFilesRegularExpression. Name=" +
                    MyFileInfo.FullName, 2);
            }
            else
            {
                if (IncludeOnlyFilesRegularExpression != "")
                {
                    if (System.Text.RegularExpressions.Regex.IsMatch(
                        MyFileInfo.FullName, IncludeOnlyFilesRegularExpression,
                        System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
                    {
                        WriteToLog(" File added to local " +
                            "list. Name=" + MyFileInfo.FullName, 2);
                        LocalArrayList.Add(MyFileInfo.FullName);
                    }
                    else
                    {
                        WriteToLog(" Local file not included by " +
                            "IncludeOnlyFilesRegularExpression. Name=" +
                            MyFileInfo.FullName, 2);
                    }
                }
                else
                {
                    WriteToLog(" File added to local list. Name=" +
                        MyFileInfo.FullName, 2);
                    LocalArrayList.Add(MyFileInfo.FullName);
                }
            }
        }
        else
        {
            if (IncludeOnlyFilesRegularExpression != "")
            {
                if (System.Text.RegularExpressions.Regex.IsMatch(
                    MyFileInfo.FullName, IncludeOnlyFilesRegularExpression,
                    System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
                {
                    WriteToLog(" File added to list. Name=" +
                        MyFileInfo.FullName, 2);
                    LocalArrayList.Add(MyFileInfo.FullName);
                }
                else
                {
                    WriteToLog(" Local file not included by " +
                        "IncludeOnlyFilesRegularExpression. Name=" +
                        MyFileInfo.FullName, 2);
                }
            }
            else
            {
                WriteToLog(" File added to local list. Name=" +
                    MyFileInfo.FullName, 2);
                LocalArrayList.Add(MyFileInfo.FullName);
            }
        }
    }

    if (IncludeSubFolders == true)
    {
        foreach (System.IO.DirectoryInfo SubDirectoryInfo
            in CurrentDirectoryInfo.GetDirectories())
        {
            if (ExcludeFolderslArrayList.Contains(
                SubDirectoryInfo.FullName.ToLower()) == true)
            {
                WriteToLog(" Local folder excluded by " +
                    "ExcludeFolders. Name=" +
                    SubDirectoryInfo.FullName, 2);
            }
            else
            {
                FolderName = SubDirectoryInfo.FullName;
                if (FolderName.EndsWith("\\") == false)
                {
                    FolderName += "\\";
                }

                WriteToLog(" Folder added to local list. Name=" +
                    FolderName, 2);
                LocalArrayList.Add(FolderName);

                PopulateLocalArrayList(BaseFolder, SubDirectoryInfo.FullName,
                    IncludeSubFolders,
                    ExcludeFolderslArrayList,
                    IncludeOnlyFilesRegularExpression,
                    ExcludeFilesRegularExpression,
                    ref LocalArrayList);
            }
        }
    }
    else
    {
        WriteToLog(" Sub folders excluded by IncludeSubFolders.", 2);
    }
}
- Next, the program creates a list of files that should be deleted from Amazon S3. This is done through the PopulateDeleteS3ArrayList function. This function goes through each item in the list of files on S3 and checks if it exists in the list of local items. If the S3 item does not exist locally, it is added to DeleteS3ArrayList.
private static void PopulateDeleteS3ArrayList(ref System.Collections.Hashtable
    S3HashTable, ref System.Collections.ArrayList LocalArrayList,
    String UploadPrefix, String S3FolderDelimiter, String SoureFolder,
    ref System.Collections.ArrayList DeleteS3ArrayList)
{
    //For each S3 item, check if it exists locally.
    String KeyName;
    String LocalPath;

    foreach (System.Collections.DictionaryEntry MyDictionaryEntry in S3HashTable)
    {
        KeyName = MyDictionaryEntry.Key.ToString();
        KeyName = KeyName.Substring(UploadPrefix.Length);

        LocalPath = System.IO.Path.Combine(SoureFolder,
            KeyName.Replace(S3FolderDelimiter, "\\"));

        if (LocalArrayList.Contains(LocalPath) == false)
        {
            DeleteS3ArrayList.Add(MyDictionaryEntry.Key);
        }
    }
}
- Next, the program finds which local files do not exist on S3. This is done through the PopulateUploadDictionary function. This function goes through the local ArrayList and checks if each item exists in the S3 Hashtable. An item may exist both on S3 and locally while the local file has different content. To determine whether a file is the same on S3 and locally, the program has an option to compare files by ETag. An ETag is an identifier based on the content of a file; if the file changes, the ETag changes. Amazon stores the ETag of each file you upload, and when you list files on S3, the ETag for each file is returned. If you choose to compare by ETag, the program calculates the ETag of the local file and checks if it matches the ETag returned by Amazon. Any file that doesn't match is added to a Dictionary of items that need to be uploaded to S3.
private static void PopulateUploadDictionary(ref System.Collections.Hashtable
    S3HashTable, ref System.Collections.ArrayList LocalArrayList,
    String UploadPrefix, String S3FolderDelimiter,
    String SoureFolder, String CompareFilesBy,
    ref Dictionary<string, string> UploadDictionary)
{
    //Check which local items need to be uploaded to S3.
    String LocalPathAsKey;
    SprightlySoftAWS.S3.CalculateHash MyCalculateHash =
        new SprightlySoftAWS.S3.CalculateHash();
    String LocalETag;
    System.IO.FileInfo MyFileInfo;

    CompareFilesBy = CompareFilesBy.ToLower();

    foreach (String LocalPath in LocalArrayList)
    {
        LocalPathAsKey = LocalPath;
        LocalPathAsKey = LocalPathAsKey.Substring(SoureFolder.Length);
        LocalPathAsKey = LocalPathAsKey.Replace("\\", S3FolderDelimiter);
        LocalPathAsKey = System.IO.Path.Combine(UploadPrefix, LocalPathAsKey);

        if (S3HashTable.ContainsKey(LocalPathAsKey) == true)
        {
            //Only check files to see if the content is different.
            if (LocalPath.EndsWith("\\") == false)
            {
                //The local file exists on S3. Check if the files are different.
                SprightlySoftAWS.S3.ListBucket.BucketItemObject MyBucketItemObject;
                MyBucketItemObject = S3HashTable[LocalPathAsKey]
                    as SprightlySoftAWS.S3.ListBucket.BucketItemObject;

                if (CompareFilesBy == "etag")
                {
                    LocalETag = MyCalculateHash.CalculateETagFromFile(LocalPath);

                    if (LocalETag == MyBucketItemObject.ETag.Replace("\"", ""))
                    {
                        //Files are the same.
                    }
                    else
                    {
                        UploadDictionary.Add(LocalPath, LocalETag);
                    }
                }
                else if (CompareFilesBy == "size")
                {
                    MyFileInfo = new System.IO.FileInfo(LocalPath);

                    if (MyFileInfo.Length == MyBucketItemObject.Size)
                    {
                        //Files are the same.
                    }
                    else
                    {
                        UploadDictionary.Add(LocalPath, "");
                    }
                }
                else
                {
                    //If the FileName is different the file
                    //will not exist on S3. No need to do a check here.
                }
            }
        }
        else
        {
            //The local file does not exist on S3, add it to the upload list.
            UploadDictionary.Add(LocalPath, "");
        }
    }
}
Now we have a list of files that need to be deleted on S3 and a list of files that need to be uploaded to S3. The program has an option to list these changes, or go ahead and apply these changes to S3.
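At its core, building those two lists is a set difference in each direction. A minimal Python sketch (keys are S3-style relative paths; the names are illustrative):

```python
def plan_sync(local_keys, s3_keys):
    # Files present locally but not on S3 must be uploaded;
    # files present on S3 but not locally must be deleted.
    local, remote = set(local_keys), set(s3_keys)
    to_upload = sorted(local - remote)
    to_delete = sorted(remote - local)
    return to_upload, to_delete

to_upload, to_delete = plan_sync(["a.txt", "b.txt"], ["b.txt", "old.txt"])
```

The real program refines the "present in both" case with the ETag or size comparison described above before deciding a file is unchanged.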
If we are deleting files on S3, the DeleteExtraOnS3 function is called. This function goes through the DeleteS3ArrayList and deletes each file in it. This is done by calling the MakeS3Request function. This function uses the SprightlySoft AWS component to send the appropriate command to S3. For more information about using the AWS component, see the documentation included with the source code and read the Amazon S3 API Reference documentation from Amazon. The function has the ability to retry the command if it fails.
private static void DeleteExtraOnS3(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
    String BucketName, ref System.Collections.ArrayList DeleteS3ArrayList,
    int S3ErrorRetries)
{
    Boolean RetBool;
    int ErrorNumber = 0;
    String ErrorDescription = "";
    String LogData = "";
    int ResponseStatusCode = 0;
    String ResponseStatusDescription = "";
    Dictionary<string, string> ResponseHeaders = new Dictionary<string, string>();
    String ResponseString = "";
    int DeleteCount = 0;

    foreach (String DeleteItem in DeleteS3ArrayList)
    {
        //Send the correct parameters to the MakeS3Request function
        //to delete a file on S3. This function will wait
        //and retry if there is a 503 error.
        RetBool = MakeS3Request(AWSAccessKeyId, AWSSecretAccessKey,
            UseSSL, RequestEndpoint, BucketName, DeleteItem,
            "", "DELETE", null, "", S3ErrorRetries,
            ref ErrorNumber, ref ErrorDescription, ref LogData, ref ResponseStatusCode,
            ref ResponseStatusDescription, ref ResponseHeaders, ref ResponseString);

        if (RetBool == true)
        {
            WriteToLog(" Delete S3 file successful. S3KeyName=" + DeleteItem);
            DeleteCount += 1;
        }
        else
        {
            WriteToLog(" Delete S3 file failed. S3KeyName=" +
                DeleteItem + " ErrorNumber=" + ErrorNumber +
                " ErrorDescription=" + ErrorDescription +
                " ResponseString=" + ResponseString);
            WriteToLog(LogData, 2);
            WriteToLog(" Canceling deletion of extra files on S3.");
            ExitCode = 1;
            break;
        }
    }

    WriteToLog(" Number of items deleted: " + DeleteCount);
}

private static Boolean MakeS3Request(String AWSAccessKeyId, String AWSSecretAccessKey,
    Boolean UseSSL, String RequestEndpoint, String BucketName,
    String KeyName, String QueryString, String RequestMethod,
    Dictionary<string, string> ExtraHeaders, String SendData, int RetryTimes,
    ref int ErrorNumber, ref String ErrorDescription, ref String LogData,
    ref int ResponseStatusCode, ref String ResponseStatusDescription,
    ref Dictionary<string, string> ResponseHeaders, ref String ResponseString)
{
    SprightlySoftAWS.REST MyREST = new SprightlySoftAWS.REST();
    String RequestURL;
    Dictionary<string, string> ExtraRequestHeaders;
    String AuthorizationValue;
    Boolean RetBool = true;
    LogData = "";

    for (int i = 0; i <= RetryTimes; i++)
    {
        RequestURL = MyREST.BuildS3RequestURL(UseSSL, RequestEndpoint,
            BucketName, KeyName, QueryString);

        ExtraRequestHeaders = new Dictionary<string, string>();

        if (ExtraHeaders != null)
        {
            foreach (KeyValuePair<string, string> MyKeyValuePair in ExtraHeaders)
            {
                ExtraRequestHeaders.Add(MyKeyValuePair.Key,
                    MyKeyValuePair.Value);
            }
        }

        ExtraRequestHeaders.Add("x-amz-date",
            DateTime.UtcNow.ToString("r"));

        AuthorizationValue = MyREST.GetS3AuthorizationValue(RequestURL,
            RequestMethod, ExtraRequestHeaders, AWSAccessKeyId, AWSSecretAccessKey);
        ExtraRequestHeaders.Add("Authorization", AuthorizationValue);

        RetBool = MyREST.MakeRequest(RequestURL, RequestMethod,
            ExtraRequestHeaders, SendData);

        //Set the return values.
        ErrorNumber = MyREST.ErrorNumber;
        ErrorDescription = MyREST.ErrorDescription;
        LogData += MyREST.LogData;
        ResponseStatusCode = MyREST.ResponseStatusCode;
        ResponseStatusDescription = MyREST.ResponseStatusDescription;
        ResponseHeaders = MyREST.ResponseHeaders;
        ResponseString = MyREST.ResponseString;

        if (RetBool == true)
        {
            break;
        }
        else
        {
            if (MyREST.ResponseStatusCode == 503)
            {
                //A Service Unavailable response was returned. Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else if (MyREST.ErrorNumber == 1003)
            {
                //Getting the response failed.
                //This may be a network disconnection. Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else
            {
                //An error occurred but retrying would not solve the problem.
                break;
            }
        }
    }

    return RetBool;
}
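The retry loop's Sleep(1000 * i * i) is a quadratic backoff: 0 s before the first retry, then 1 s, 4 s, 9 s, and so on. The same pattern can be sketched in Python; the `request` callable and `retryable` predicate stand in for the real S3 call and its 503/disconnect checks (names are illustrative):

```python
import time

def with_retries(request, retries, retryable=lambda exc: True, sleep=time.sleep):
    # Try the call up to retries+1 times, sleeping i*i seconds between
    # attempts (0 s, 1 s, 4 s, ...), mirroring Sleep(1000 * i * i).
    for i in range(retries + 1):
        try:
            return request()
        except Exception as exc:
            if i == retries or not retryable(exc):
                raise  # out of retries, or a non-retryable error
            sleep(i * i)

attempts = []
def flaky():
    # Fails twice, then succeeds, to exercise the retry path.
    attempts.append(1)
    if len(attempts) < 3:
        raise IOError("503 Service Unavailable")
    return "ok"

print(with_retries(flaky, retries=5, sleep=lambda s: None))  # prints "ok"
```

In production the `retryable` predicate would return True only for a 503 response or a dropped connection, matching the C# logic.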
Finally, the program calls the UploadMissingToS3 function to upload the files in UploadDictionary to S3. Here, the program sets header information such as Content-MD5, Content-Type, and metadata that stores the local file's timestamps in S3. It then calls the UploadFileToS3 function, which is very similar to the MakeS3Request function. The difference is that UploadFileToS3 has parameters that are only relevant to uploading a file. The program uses a variable called MyUpload, which is a SprightlySoft AWS component object. This object raises an event whenever the progress of an upload changes. The program hooks into this progress event and shows the progress of the upload while it is taking place. This is done in the MyUpload_ProgressChangedEvent function.
private static void UploadMissingToS3(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL,
    String RequestEndpoint, String BucketName,
    String UploadPrefix, String S3FolderDelimiter,
    String SoureFolder, Dictionary<string, string> UserRequestHeaders,
    Dictionary<string, string> UserContentTypes,
    Boolean CalculateMD5ForUpload,
    ref Dictionary<string, string> UploadDictionary,
    Boolean SaveTimestampsInMetadata, Single UploadSpeedLimitKBps,
    Boolean ShowUploadProgress, int S3ErrorRetries)
{
    Boolean RetBool;
    int ErrorNumber = 0;
    String ErrorDescription = "";
    String LogData = "";
    int ResponseStatusCode = 0;
    String ResponseStatusDescription = "";
    Dictionary<string, string> ResponseHeaders = new Dictionary<string, string>();
    String ResponseString = "";

    Dictionary<string, string> ExtraHeaders = new Dictionary<string, string>();
    String DiffName;
    String LocalMD5Hash;
    String MyExtension;
    String KeyName;
    System.IO.FileInfo MyFileInfo;
    System.IO.DirectoryInfo MyDirectoryInfo;
    int UploadCount = 0;

    SprightlySoftAWS.S3.CalculateHash MyCalculateHash =
        new SprightlySoftAWS.S3.CalculateHash();
    SprightlySoftAWS.S3.Helper MyS3Helper = new SprightlySoftAWS.S3.Helper();

    //Get a dictionary of content types from the SprightlySoftAWS.S3.Helper class.
    Dictionary<string, string> ContentTypesDictionary;
    ContentTypesDictionary = MyS3Helper.GetContentTypesDictionary();

    foreach (KeyValuePair<string, string> UploadKeyValuePair in UploadDictionary)
    {
        DiffName = UploadKeyValuePair.Key.Substring(SoureFolder.Length,
            UploadKeyValuePair.Key.Length - SoureFolder.Length);
        DiffName = DiffName.Replace("\\", "/");

        KeyName = UploadPrefix + DiffName;

        if (UploadKeyValuePair.Key.EndsWith("\\") == true)
        {
            ExtraHeaders = new Dictionary<string, string>();
            if (SaveTimestampsInMetadata == true)
            {
                MyDirectoryInfo = new System.IO.DirectoryInfo(UploadKeyValuePair.Key);
                ExtraHeaders.Add("x-amz-meta-local-date-created",
                    MyDirectoryInfo.CreationTime.ToFileTimeUtc().ToString());
            }

            RetBool = MakeS3Request(AWSAccessKeyId, AWSSecretAccessKey,
                UseSSL, RequestEndpoint, BucketName, KeyName, "",
                "PUT", ExtraHeaders, "", S3ErrorRetries,
                ref ErrorNumber, ref ErrorDescription, ref LogData,
                ref ResponseStatusCode, ref ResponseStatusDescription,
                ref ResponseHeaders, ref ResponseString);

            if (RetBool == true)
            {
                WriteToLog(" Create S3 folder successful. S3KeyName=" + KeyName);
                UploadCount += 1;
            }
            else
            {
                WriteToLog(" Create S3 folder failed. S3KeyName=" +
                    KeyName + " ErrorNumber=" + ErrorNumber +
                    " ErrorDescription=" + ErrorDescription +
                    " ResponseString=" + ResponseString);
                WriteToLog(LogData, 2);
                WriteToLog(" Canceling upload of missing files to S3.");
                ExitCode = 1;
                break;
            }
        }
        else
        {
            //Calculate the MD5 for upload if required.
            //Set the content type.
            //Add extra headers.
            ExtraHeaders = new Dictionary<string, string>();

            if (CalculateMD5ForUpload == true)
            {
                if (UploadKeyValuePair.Value == "")
                {
                    //Calculate the MD5.
                    LocalMD5Hash =
                        MyCalculateHash.CalculateMD5FromFile(UploadKeyValuePair.Key);
                }
                else
                {
                    //Convert the ETag to MD5.
                    LocalMD5Hash = MyS3Helper.ConvertETagToMD5(UploadKeyValuePair.Value);
                }

                ExtraHeaders.Add("Content-MD5", LocalMD5Hash);
            }

            MyExtension = System.IO.Path.GetExtension(UploadKeyValuePair.Key).ToLower();

            if (UserContentTypes.ContainsKey(MyExtension) == true)
            {
                ExtraHeaders.Add("Content-Type", UserContentTypes[MyExtension]);
            }
            else if (ContentTypesDictionary.ContainsKey(MyExtension) == true)
            {
                ExtraHeaders.Add("Content-Type",
                    ContentTypesDictionary[MyExtension]);
            }

            if (SaveTimestampsInMetadata == true)
            {
                MyFileInfo = new System.IO.FileInfo(UploadKeyValuePair.Key);
                ExtraHeaders.Add("x-amz-meta-local-date-modified",
                    MyFileInfo.LastWriteTimeUtc.ToFileTimeUtc().ToString());
                ExtraHeaders.Add("x-amz-meta-local-date-created",
                    MyFileInfo.CreationTime.ToFileTimeUtc().ToString());
            }

            //Add the user supplied headers to the other headers that will be sent to S3.
            foreach (KeyValuePair<string, string> MyKeyValuePair in UserRequestHeaders)
            {
                ExtraHeaders.Add(MyKeyValuePair.Key, MyKeyValuePair.Value);
            }

            WriteToLog(" Uploading file. S3KeyName=" + KeyName);
            RetBool = UploadFileToS3(AWSAccessKeyId, AWSSecretAccessKey, UseSSL,
                RequestEndpoint, BucketName, KeyName, "PUT",
                ExtraHeaders, UploadKeyValuePair.Key, UploadSpeedLimitKBps,
                S3ErrorRetries, ref ErrorNumber, ref ErrorDescription,
                ref LogData, ref ResponseStatusCode,
                ref ResponseStatusDescription, ref ResponseHeaders,
                ref ResponseString);

            if (RetBool == true)
            {
                WriteToLog(" Upload file to S3 was successful.");
                UploadCount += 1;
            }
            else
            {
                WriteToLog(" Upload file to S3 failed. ErrorNumber=" +
                    ErrorNumber + " ErrorDescription=" +
                    ErrorDescription + " ResponseString=" +
                    ResponseString);
                WriteToLog(LogData, 2);
                WriteToLog(" Canceling upload of missing files to S3.");
                ExitCode = 1;
                break;
            }
        }
    }

    WriteToLog(" Number of items uploaded: " + UploadCount);
}

private static Boolean UploadFileToS3(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
    String BucketName, String KeyName, String RequestMethod,
    Dictionary<string, string> ExtraHeaders, String LocalFileName,
    Single UploadSpeedLimitKBps, int RetryTimes, ref int ErrorNumber,
    ref String ErrorDescription, ref String LogData,
    ref int ResponseStatusCode, ref String ResponseStatusDescription,
    ref Dictionary<string, string> ResponseHeaders, ref String ResponseString)
{
    String RequestURL;
    Dictionary<string, string> ExtraRequestHeaders;
    String AuthorizationValue;
    Boolean RetBool = true;
    LogData = "";

    if (UploadSpeedLimitKBps > 0)
    {
        MyUpload.LimitKBpsSpeed = UploadSpeedLimitKBps;
    }

    for (int i = 0; i <= RetryTimes; i++)
    {
        RequestURL = MyUpload.BuildS3RequestURL(UseSSL, RequestEndpoint,
            BucketName, KeyName, "");

        ExtraRequestHeaders = new Dictionary<string, string>();

        if (ExtraHeaders != null)
        {
            foreach (KeyValuePair<string, string> MyKeyValuePair in ExtraHeaders)
            {
                ExtraRequestHeaders.Add(MyKeyValuePair.Key,
                    MyKeyValuePair.Value);
            }
        }

        ExtraRequestHeaders.Add("x-amz-date",
            DateTime.UtcNow.ToString("r"));

        AuthorizationValue = MyUpload.GetS3AuthorizationValue(RequestURL,
            RequestMethod, ExtraRequestHeaders,
            AWSAccessKeyId, AWSSecretAccessKey);
        ExtraRequestHeaders.Add("Authorization", AuthorizationValue);

        RetBool = MyUpload.UploadFile(RequestURL, RequestMethod,
            ExtraRequestHeaders, LocalFileName);

        //Set the return values.
        ErrorNumber = MyUpload.ErrorNumber;
        ErrorDescription = MyUpload.ErrorDescription;
        LogData += MyUpload.LogData;
        ResponseStatusCode = MyUpload.ResponseStatusCode;
        ResponseStatusDescription = MyUpload.ResponseStatusDescription;
        ResponseHeaders = MyUpload.ResponseHeaders;
        ResponseString = MyUpload.ResponseString;

        if (RetBool == true)
        {
            break;
        }
        else
        {
            if (MyUpload.ResponseStatusCode == 503)
            {
                //A Service Unavailable response was returned. Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else if (MyUpload.ErrorNumber == 1003)
            {
                //Getting the response failed.
                //This may be a network disconnection. Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else
            {
                //An error occurred but retrying would not solve the problem.
                break;
            }
        }
    }

    return RetBool;
}
When the program completes, it has the option to send the log information by email. This is useful if you run the program as a scheduled task and want to be notified when there is a failure.
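Composing such a notification is straightforward with standard-library email support; a minimal Python sketch (the addresses are placeholders, and actually sending the message via smtplib is left out):

```python
from email.message import EmailMessage

def build_log_email(log_text, exit_code, sender, recipient):
    # Compose a plain-text report; the subject flags success or failure
    # so a scheduled-task run can be triaged from the inbox alone.
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "S3 Sync " + ("FAILED" if exit_code != 0 else "succeeded")
    msg.set_content(log_text)
    return msg

msg = build_log_email("3 uploaded, 1 deleted", 0,
                      "sync@example.com", "admin@example.com")
```

Sending would then be a call like `smtplib.SMTP(host).send_message(msg)` against your own mail server.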