A mashup of two different types of web search tools could revolutionise the effectiveness of internet searching, academics believe.
Information scientists Liu Wei and Chen Junjie, of the Taiyuan University of Technology in Shanxi, China, have combined two distinct types of computer software to build a search engine that can intelligently crawl other search engines.
"Traditional search engines cannot cope easily with this rapid expansion of information resources," explained Junjie.
Junjie and his colleagues turned to the concept of 'search agents' in a bid to cope with this problem.
Search agents are intelligent virtual robots that can scan data very quickly looking for keywords and assessing the context of their findings.
The researchers then combined the search agent idea with the so-called meta search engine. Meta searches involve scanning information not from a single source, such as the Google or Yahoo indexes, but from all available sources.
The Chinese team has developed a new intelligent search agent and combined it with a meta search tool.
The intelligent agent can determine the context of the user's search terms and choose appropriate search engines to scan. It then retrieves the most relevant results.
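The dispatch-and-merge behaviour described above can be sketched roughly as follows. This is a minimal illustration, not the researchers' implementation; all names (topic lists, engine names, the `backends` callables) are hypothetical:

```python
# Hypothetical mapping from a detected query topic to the search
# engines best suited to answering it.
ENGINES_BY_TOPIC = {
    "academic": ["scholar", "core"],
    "news": ["news_a", "news_b"],
    "general": ["engine_x", "engine_y"],
}

# Hypothetical keyword sets used to guess the query's context.
TOPIC_KEYWORDS = {
    "academic": {"paper", "journal", "study"},
    "news": {"today", "breaking", "latest"},
}

def classify_query(query: str) -> str:
    """Crude context detection: match query words against topic keywords."""
    words = set(query.lower().split())
    for topic, keywords in TOPIC_KEYWORDS.items():
        if words & keywords:
            return topic
    return "general"

def meta_search(query: str, backends: dict) -> list:
    """Send the query to the chosen engines, then merge and rank results.

    `backends` maps an engine name to a callable that returns
    (url, relevance_score) pairs for the query.
    """
    topic = classify_query(query)
    pooled = []
    for engine in ENGINES_BY_TOPIC[topic]:
        pooled.extend(backends[engine](query))
    # De-duplicate by URL, keeping the best score, then rank descending.
    best = {}
    for url, score in pooled:
        best[url] = max(score, best.get(url, 0.0))
    return sorted(best, key=best.get, reverse=True)
```

The key design point is that engine selection happens before any query is sent, so the agent only scans sources likely to be relevant rather than broadcasting to every available index.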
Junjie explained that this approach improves both the precision rate and the recall rate of traditional search engines, and fulfils users' query requests more effectively.
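Precision and recall are the standard information-retrieval metrics the researchers refer to: precision is the fraction of retrieved results that are relevant, while recall is the fraction of all relevant results that were retrieved. In code (a textbook definition, not the authors' evaluation script):

```python
def precision(retrieved: set, relevant: set) -> float:
    """Fraction of retrieved results that are actually relevant."""
    return len(retrieved & relevant) / len(retrieved) if retrieved else 0.0

def recall(retrieved: set, relevant: set) -> float:
    """Fraction of all relevant results that were retrieved."""
    return len(retrieved & relevant) / len(relevant) if relevant else 0.0
```

The two metrics pull in opposite directions: retrieving everything maximises recall but destroys precision, so improving both at once, as the researchers claim, is the meaningful result.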
The scientists describe their new search robot in Inderscience's International Journal of Agent-Oriented Software Engineering.