Google has launched a beta version of a tool that lets website owners tell the search engine more about their sites directly, rather than waiting for Google's web spider to discover changes on its own.
The Sitemaps application lets website owners tell the search engine about the individual site's structure and the frequency at which it changes. In theory, this allows Google to create a more accurate index and deliver better search results.
The tool can benefit both amateur sites and large, professionally run websites, Google said on its website.
Website owners can download the free application, which generates an XML file describing the site's pages and how often they change.
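A minimal sitemap file looks roughly like the sketch below. The URL and values are illustrative only, and the namespace shown is the one later standardized at sitemaps.org; the original Google beta used a Google-hosted schema.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page the owner wants indexed -->
  <url>
    <loc>http://www.example.com/</loc>
    <!-- When the page last changed, and roughly how often it changes -->
    <lastmod>2005-06-03</lastmod>
    <changefreq>daily</changefreq>
    <!-- Relative importance within this site, from 0.0 to 1.0 -->
    <priority>0.8</priority>
  </url>
</urlset>
```

The file is placed on the web server where the crawler can fetch it, so the search engine learns about new or updated pages without having to discover them by following links.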
Sitemaps is similar to another technology that aids search engines, the Robots Exclusion Standard (RES). By adding a robots.txt file to the root of a website, site owners can instruct search robots or spiders which files and directories they should not crawl or index.
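A typical robots.txt file is only a few lines. The directory name below is illustrative:

```
# Applies to all crawlers
User-agent: *
# Ask crawlers to stay out of this directory
Disallow: /private/
```

Note the inverse relationship between the two mechanisms: robots.txt tells crawlers what to leave out, while a sitemap tells them what to include.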
While RES is an industry standard, the Sitemaps technology was developed by Google. Nothing, however, prevents competing search engines from reading the XML file and using it to improve their own indexes.