Windows XP SP2
This topic was started by Jbuckley,
I was having major issues when I installed Service Pack 2. I found a download that fixed my machine, if anyone else is interested. The software fixed that whole Event ID 4226 thing and tripled my internet speed. Check it out at [edit: LINK REMOVED]
[please don't post rubbish links to paysites for FREE tweaks
go here for the real one: http://www.LvlLord.de ]
Responses to this topic
Besides, nobody needs such a fix. It only fixes something in your head. That's all.
That's quite untrue ...
This fix IS very important ... BUT it depends on the user's needs.
When there was no patch, I often had the problem of empty web pages, which I first blamed on Firefox ...
but then I realised those pages were empty because the connection limit was rejecting the transfers ...
Imagine opening 10 web pages at once and 5 remain blank ... that's quite annoying, huh?
I had to look up what pipelining means and found this: http://www.mozilla.org/projects/netlib/htt...lining-faq.html. I think you misunderstand two things here. First, the TCP/IP connection restriction in SP2 only applies to connection attempts. Secondly, from what I understand of this document, pipelining is not just a bundling of requests to save TCP/IP packets. It isn't even used for opening new connections, only for existing "keep-alive" connections.
So, what is wrong:
- The SP2 connection limit restricts all concurrent connections to 10
- Pipelining reduces the overall amount of packets used for opening new connections
And what is right:
- The SP2 connection limit only restricts concurrent connection attempts
- Pipelining helps reduce the amount of packets for existing connections, provided the server also supports it
The rumor that SP2 limits all concurrent connections to 10 spread like a disease, but it is utterly wrong.
The only time the limit may apply is when you first start up P2P software and the peers it found have yet to be connected to. This process may be a bit slower due to the queueing, but it doesn't affect the overall functionality. As no limit applies to active, live connections, P2P programs aren't slowed down.
How P2P software works:
1.) Starting the program
2.) Connecting to a server (if there is one used)
3.) Searching for material, loading torrents or ED2K links, etc. (if this hasn't been done in a previous session)
4.) Finding peers and connecting to them (this is where a limit may apply)
5.) Waiting in queue
6.) Receiving data
7.) Finishing a download
8.) Closing the program
Only in step 4.) can a limit apply. And that step is in fact a very short one. Most of the time is spent in steps 5.) and 6.).
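The distinction above can be sketched in Python as a toy model (this is not the actual Windows TCP/IP stack): only the connection attempts of step 4.) pass through a gate of 10, while established connections are untouched, so every peer still gets connected.

```python
import threading
import time

# Toy model of the SP2 behaviour: cap concurrent connection *attempts*
# at 10; established connections are not limited at all.
MAX_HALF_OPEN = 10
attempt_gate = threading.BoundedSemaphore(MAX_HALF_OPEN)

established = []
lock = threading.Lock()

def connect(peer_id):
    with attempt_gate:       # queue here if 10 attempts are already in flight
        time.sleep(0.01)     # stand-in for the TCP handshake (the half-open phase)
    with lock:               # once established, no limit applies
        established.append(peer_id)

threads = [threading.Thread(target=connect, args=(i,)) for i in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(established))  # → 50: all peers connect, the cap only delays attempts
```

The gate slightly staggers step 4.), exactly as described above, but the final result is unchanged.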
.... you are right and wrong ...
First: yes, it only counts for half-open connections ... I never said anything against that ...
lol, 10 connections all in all would be hell.
BUT
you can easily hit the limit of 10 ... even though it's only for a short period of time ... true ... I mean, if it were purely theoretical, OK ... but I have proof here ...
I mean ... this thing was something new ... you see it this way ... hey, there is a patch ... is it really needed???
But in my case it was:
HELL, WHAT THE FUCK IS GOING ON ... WHY DOES THE INTERNET SUCK THIS MUCH ... it really drove me crazy.
Then I found the patch and it fixed the problem immediately.
2nd
One of the main reasons for pipelining is an extra cool feature, plain and simple ... load things in parallel, not serially.
e.g.
you may have noticed the green ad text ....
the server http://itxt.vibrantmedia.com takes endlessly long to respond ... page loads were a pain until I noticed that somehow the pipelining option had been turned off ... and now?
Now the server still sucks, but everything else gets a chance to pass by, being processed in parallel ...
Now imagine ...
a site pulls its data from at least 8 servers (90% ad servers); with pipelining on, that's how you get max performance on broadband ...
now open, e.g., 5 web pages:
that makes ~40 connections at once (AROUND ... this is just an example) ... and poof, one of them remains blank.
Again, this is an example; in most cases it will work fine ... but as soon as you do anything in parallel, like downloading, the situation gets more complicated
and the limit is easily reached ... also, notice that warning message in the Windows event log .... it isn't there for fun (it literally says: hey, I restricted connection XY because there were too many open connection attempts).
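The back-of-envelope arithmetic from that example (the page count and servers-per-page figures are the poster's illustration, not measurements):

```python
# Rough arithmetic from the example above: the burst of simultaneous
# connection attempts when several ad-heavy pages are opened at once.
pages = 5               # web pages opened at the same time
servers_per_page = 8    # distinct servers each page pulls from (mostly ads)
half_open_limit = 10    # SP2's cap on concurrent connection attempts

attempts = pages * servers_per_page
print(attempts)                    # → 40 attempts in one burst
print(attempts > half_open_limit)  # → True: the burst far exceeds the cap
```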
And if you think of how eMule or P2P in general works ....
thousands of connections are made (btw, one of the main reasons P2P is so slow ... at least 1/5 to 1/3 of the time is wasted on making connections).
From the Mozilla pipelining FAQ:
Normally, HTTP requests are issued sequentially, with the next request being issued only after the response to the current request has been completely received. Depending on network latencies and bandwidth limitations, this can result in a significant delay before the next request is seen by the server.
HTTP/1.1 allows multiple HTTP requests to be written out to a socket together without waiting for the corresponding responses. The requestor then waits for the responses to arrive in the order in which they were requested. The act of pipelining the requests can result in a dramatic improvement in page loading times, especially over high latency connections.
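The quoted mechanism can be sketched as follows; `www.example.com` and the paths are placeholders, and this only builds the pipelined bytes for a single socket write rather than talking to a real server:

```python
# Sketch of HTTP/1.1 pipelining: several requests are written out
# back-to-back on one keep-alive connection; the responses must then
# be read in the same order they were requested.
def build_pipelined_requests(host, paths):
    reqs = []
    for i, path in enumerate(paths):
        last = (i == len(paths) - 1)
        reqs.append(
            f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            + ("Connection: close\r\n" if last else "Connection: keep-alive\r\n")
            + "\r\n"
        )
    return "".join(reqs).encode("ascii")

payload = build_pipelined_requests("www.example.com", ["/", "/style.css", "/logo.png"])
print(payload.count(b"GET "))  # → 3 requests queued in a single write
```

The saving comes from not waiting one round-trip per request before issuing the next one, which is exactly the high-latency case the FAQ describes.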