#6
May 31st, 2003
David91
91 is my age not my IQ!
 
Join Date: February 24th, 2003
Location: Singapore
Posts: 325

Hi trap_door

Good to have the opportunity to learn something from you again, since you have always been able to point out the error of my ways. In experiments reported by the University of Montreal:

                           Experiment A     Experiment B
Total addresses received   7482             19484
Invalid addresses          2240 (30%)       7042 (36%)
Repeated addresses         1432 (19%)       5298 (27%)
Already in cache           1696 (23%)       3978 (20%)
Retained                   2114 (28%)       3166 (16%)
Unique good addresses      1514 (20%)       1792 (9%)
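For what it's worth, the percentages in that table are easy to reproduce from the raw counts, and the four categories partition the totals exactly. A quick sanity check (the dictionary keys are my own shorthand, not labels from the Montreal report):

```python
# Category counts quoted from the University of Montreal experiments above.
experiments = {
    "A": {"total": 7482, "invalid": 2240, "repeated": 1432,
          "cached": 1696, "retained": 2114, "unique_good": 1514},
    "B": {"total": 19484, "invalid": 7042, "repeated": 5298,
          "cached": 3978, "retained": 3166, "unique_good": 1792},
}

for name, e in experiments.items():
    # Invalid + repeated + cached + retained accounts for every address.
    assert e["invalid"] + e["repeated"] + e["cached"] + e["retained"] == e["total"]

    def pct(n):
        return round(100 * n / e["total"])

    print(name, pct(e["invalid"]), pct(e["repeated"]),
          pct(e["cached"]), pct(e["retained"]), pct(e["unique_good"]))
# A 30 19 23 28 20
# B 36 27 20 16 9
```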

only 20% and 9% of the search returns were useful addresses. Given that such a high proportion of the returns were invalid hosts, why is filtering such a bad option? It maximises the utility of the search list displayed, and if the user repeats the search at regular intervals, the quality of the results should improve significantly.

But you also neglect the more fundamental point I was making: because these invalid returns are in the majority, a cap on the number of search returns actually reduces any given user's chance of getting good hosts.
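In case it clarifies the kind of filtering I have in mind, here is a minimal sketch that drops the three useless categories from the table (invalid, repeated, already cached) before anything is displayed. The `is_valid` callback and the cache set are hypothetical stand-ins for illustration, not any real servent's API:

```python
# Hedged sketch: keep only new, unique, valid addresses from a result set,
# and remember them so a repeated search improves over time.
def filter_results(addresses, cache, is_valid):
    """Return retained addresses in order of arrival; update the cache."""
    seen = set()       # addresses already encountered in this result set
    retained = []
    for addr in addresses:
        if not is_valid(addr):   # invalid host: discard
            continue
        if addr in seen:         # repeated within this search: discard
            continue
        seen.add(addr)
        if addr in cache:        # already in cache from earlier searches
            continue
        retained.append(addr)
        cache.add(addr)
    return retained
```

For example, with a cache already holding `"c"` and a validity check that rejects `"bad"`, the input `["a", "b", "a", "bad", "c"]` yields `["a", "b"]`: the repeat, the invalid host, and the cached address are all filtered out.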