
c# - System.Net.WebException while downloading file

Problem description:

I have a list of mp3s which I am downloading. After some of the files are downloaded - around 5-7, not all of them - I get a WebException. I printed the stack trace and this is the result:

    Exception thrown: 'System.Net.WebException' in System.dll
    Debug message: The operation has timed out
    InnerEx: at System.Net.HttpWebRequest.GetResponse()
       at iBlock.Main._InetGetHTMLSearch(String sArtist) in C:\Users\...\Main.cs:line 590

My _InetGetHTMLSearch method looks like this:

    private void _InetGetHTMLSearch(string sArtist)
    {
        aLinks.Clear();
        if (AudioDumpQuery == string.Empty)
        {
            //return string.Empty;
        }
        string[] sStringArray;
        string sResearchURL = "http://www.audiodump.biz/music.html?" + AudioDumpQuery + sArtist.Replace(" ", "+");
        string aRet;
        HttpWebRequest webReq = (HttpWebRequest)HttpWebRequest.Create(sResearchURL);
        webReq.UserAgent = "Mozilla / 5.0(Macintosh; Intel Mac OS X 10_9_3) AppleWebKit / 537.75.14(KHTML, like Gecko) Version / 7.0.3 Safari / 7046A194A";
        webReq.Referer = "http://www.audiodump.com/";
        webReq.Timeout = 5000;
        try
        {
            webReq.CookieContainer = new CookieContainer();
            webReq.Method = "GET";
            using (WebResponse response = webReq.GetResponse())
            {
                using (Stream stream = response.GetResponseStream())
                {
                    StreamReader reader = new StreamReader(stream);
                    aRet = reader.ReadToEnd();
                    //Console.WriteLine(aRet);
                    string[] aTable = _StringBetween(aRet, "<BR><table", "table><BR>", RegexOptions.Singleline);
                    if (aTable != null)
                    {
                        string[] aInfos = _StringBetween(aTable[0], ". <a href=\"", "<a href=\"");
                        if (aInfos != null)
                        {
                            for (int i = 0; i < aInfos.Length; i++)
                            {
                                //do some magic here
                            }
                        }
                        else
                        {
                            //debug
                        }
                    }
                    else
                    {
                        //debug 2
                    }
                }
                response.Dispose();
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Debug message: " + ex.Message + "InnerEx: " + ex.StackTrace);
            aLinks.Clear();
            return;
            //throw exception
        }
    }

What this method does is simple: it runs a search for the given sArtist on audiodump.com.
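
_StringBetween is just a small helper of mine that pulls every substring found between two delimiters out of the HTML with a regex. A rough sketch of what it does (the real implementation may differ slightly; it needs System.Text.RegularExpressions):

    // Sketch of the helper: returns every substring between sStart and sEnd,
    // or null when nothing matches (assumed implementation, shown for context).
    private string[] _StringBetween(string sSource, string sStart, string sEnd,
                                    RegexOptions options = RegexOptions.None)
    {
        MatchCollection matches = Regex.Matches(
            sSource,
            Regex.Escape(sStart) + "(.*?)" + Regex.Escape(sEnd),
            options);
        if (matches.Count == 0) return null;

        string[] results = new string[matches.Count];
        for (int i = 0; i < matches.Count; i++)
            results[i] = matches[i].Groups[1].Value;
        return results;
    }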

I have a timer which fires very frequently, every 10 ms:

    private void MainTimer_Tick(object sender, EventArgs e)
    {
        _DoDownload(DoubleDimList[i][y], ref mp3ToPlay);
        if (muted) Mute(0);
        if (Downloading)
        {
            StatusLabel.Text = "Downloading: " + DoubleDimList[i][y];
        }
    }

Now this timer handles the download in the background, with the state kept at a global (class) scope.
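
The shared state referenced above is roughly the following set of fields (a sketch only; the exact types and initializers in my project may differ, and the usual System.Collections.Generic, System.Diagnostics and System.IO namespaces are assumed):

    // Shared ("global") state used by the timer and the two methods above.
    // Types are assumptions inferred from how the fields are used.
    List<string> aLinks = new List<string>();   // search results (mp3 links)
    List<List<string>> DoubleDimList;           // playlists -> track names
    string[] _playlists;                        // ini file per playlist
    int i = 0, y = 0;                           // current playlist / track index
    bool Downloading, Contain, muted;
    int skip;
    string mp3Path, mp3ToPlay, lastArtistName, AudioDumpQuery;
    Stopwatch hTimmer = new Stopwatch();        // measures download progress time
    Stream strm;                                // response stream opened by DownloadOne
    FileStream fs;                              // target .mp3 file opened by DownloadOne
    const int arrSize = 8192;
    byte[] barr = new byte[arrSize];            // read buffer
    long bytesCounter, fileLength, preloadedLength;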

The _DoDownload method, which basically starts the entire process, looks like this:

    private void _DoDownload(string dArtist, ref string dPath)
    {
        if (!Contain && skip <= 3 && !Downloading)
        {
            try
            {
                _InetGetHTMLSearch(dArtist);
                if (aLinks.Count < 1)
                {
                    //skip and return
                    Console.WriteLine("Skipping: " + dArtist);
                    IniWriteValue(_playlists[i], "Track " + y, dArtist + " -iBlockSkip");
                    y++;
                    return;
                }
                string path = mp3Path + "\\" + dArtist + ".mp3";
                if (DownloadOne(aLinks[0], path, false))
                {
                    hTimmer.Start();
                    Downloading = true;
                }
            }
            catch (Exception Ex)
            {
                MessageBox.Show("Download start error: " + Ex.Message);
            }
        }
        else if (Downloading)
        {
            try
            {
                int actualBytes = strm.Read(barr, 0, arrSize);
                fs.Write(barr, 0, actualBytes);
                bytesCounter += actualBytes;
                double percent = 0d;
                if (fileLength > 0)
                    percent = 100.0d * bytesCounter / (preloadedLength + fileLength);
                label1.Text = Math.Round(percent) + "%";
                if (Math.Round(percent) >= 100)
                {
                    string path = mp3Path + "\\" + dArtist + ".mp3";
                    label1.Text = "";
                    dPath = path;
                    aLinks.Clear();
                    hTimmer.Stop();
                    hTimmer.Reset();
                    fs.Flush();
                    fs.Close();
                    lastArtistName = "N/A";
                    Downloading = false;
                    y++;
                    if (y >= DoubleDimList[i].Count)
                    {
                        i++;
                    }
                }
                if (Math.Round(percent) <= 1)
                {
                    if (hTimmer.ElapsedMilliseconds >= 3000)
                    {
                        string path = mp3Path + "\\" + dArtist + ".mp3";
                        hTimmer.Stop();
                        hTimmer.Reset();
                        fs.Flush();
                        fs.Close();
                        System.IO.File.Delete(path);
                        Contain = false;
                        skip += 1;
                        Downloading = false;
                    }
                }
            }
            catch (Exception Ex)
            {
                MessageBox.Show("Downloading error: " + Ex.Message);
            }
        }
    }
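
DownloadOne itself isn't shown here; for context, it essentially opens the remote mp3 stream and the target file and leaves them in strm / fs for the timer to pump. A sketch of what it is assumed to do (not the exact code):

    // Sketch of DownloadOne (assumed, shown only so the timer branch above makes sense):
    // open the remote mp3 and the local file, remember both in strm / fs,
    // and let the timer copy the bytes chunk by chunk.
    private bool DownloadOne(string url, string path, bool resume)
    {
        try
        {
            preloadedLength = resume && File.Exists(path) ? new FileInfo(path).Length : 0;
            bytesCounter = preloadedLength;

            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
            req.Timeout = 5000;
            WebResponse resp = req.GetResponse();
            fileLength = resp.ContentLength;
            strm = resp.GetResponseStream();    // read later, chunk by chunk, in the timer
            fs = new FileStream(path, resume ? FileMode.Append : FileMode.Create);
            return true;
        }
        catch (WebException)
        {
            return false;
        }
    }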

Once the exception is thrown it messes up the entire project. As you can see in the last method, if _InetGetHTMLSearch doesn't update the search (returns nothing), I skip it and move on to the next search. However, the exception is then thrown on every subsequent search. I tried setting new cookies on every search, but that still didn't work.

Are there any solutions for avoiding this issue?

P.S. I should mention that if I change the timer's Interval to 500 ms, it downloads more mp3s before the exception is thrown, but still not all of them.

Edit: The issue here is obvious: the request times out. But even if I set the timeout to Timeout.Infinite, it just hangs there forever.
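
For what it's worth, this is the kind of minimal standalone test one could run to check whether the search request itself times out outside of the timer loop (a sketch; the query string is built the same way as above, and it needs System, System.IO and System.Net):

    // Minimal repro sketch: fire the same search request several times in a row,
    // disposing everything, to see whether the timeout still appears outside the timer.
    private static void TestSearchRepeatedly(string query)
    {
        for (int attempt = 0; attempt < 10; attempt++)
        {
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(
                "http://www.audiodump.biz/music.html?" + query);
            req.Timeout = 5000;
            try
            {
                using (WebResponse resp = req.GetResponse())
                using (Stream s = resp.GetResponseStream())
                using (StreamReader r = new StreamReader(s))
                {
                    string html = r.ReadToEnd();
                    Console.WriteLine("Attempt " + attempt + ": got " + html.Length + " chars");
                }
            }
            catch (WebException ex)
            {
                Console.WriteLine("Attempt " + attempt + " failed: " + ex.Message);
            }
        }
    }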
