12 packages returned for Tags:"Robots.txt"

A robots.txt parser for .NET. Supports Allow directives, Crawl-delay directives, Sitemap declarations, and * and $ wildcards. See https://bitbucket.org/cagdas/robotstxt for usage examples.
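The package's full API isn't reproduced in this listing; the sketch below assumes an entry point along the lines of Robots.Load plus an IsPathAllowed query (names unverified, treat them as placeholders):

```csharp
using System;
using RobotsTxt; // assumed namespace of the parser package

class RobotsDemo
{
    static void Main()
    {
        // In practice this content would be downloaded from https://example.com/robots.txt.
        string content =
            "User-agent: *\n" +
            "Disallow: /private/\n" +
            "Allow: /private/public-page.html\n" +    // Allow directive
            "Crawl-delay: 10\n" +                      // Crawl-delay directive
            "Sitemap: https://example.com/sitemap.xml\n";

        // Assumed API: parse the file, then query rules per user agent and path.
        Robots robots = Robots.Load(content);
        bool allowed = robots.IsPathAllowed("MyCrawler", "/private/public-page.html");
        Console.WriteLine(allowed ? "Allowed by the Allow directive" : "Disallowed");
    }
}
```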
SimpleSitemap is a lightweight library that helps you create web sitemaps for collections or lists of items. These sitemaps follow the Sitemap Protocol. Both sitemapindex and urlset links are generated, based on the data collection size and page size. Examples of this could be a list of your users,...
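SimpleSitemap's own types are not shown on this page; the sketch below only illustrates the Sitemap Protocol split the description refers to, using plain System.Xml.Linq: a collection that fits within the page size becomes a single urlset, anything larger becomes a sitemapindex pointing at per-page urlset files. All names here are illustrative.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

static class SitemapSketch
{
    static readonly XNamespace Ns = "http://www.sitemaps.org/schemas/sitemap/0.9";

    // Illustrative only: pick urlset vs. sitemapindex from collection size and page size.
    public static XDocument Build(IReadOnlyList<string> itemUrls, int pageSize, string baseUrl)
    {
        if (itemUrls.Count <= pageSize)
        {
            // Small collection: one urlset document is enough.
            return new XDocument(
                new XElement(Ns + "urlset",
                    itemUrls.Select(u =>
                        new XElement(Ns + "url", new XElement(Ns + "loc", u)))));
        }

        // Large collection: reference one urlset page per chunk from a sitemapindex.
        int pages = (int)Math.Ceiling(itemUrls.Count / (double)pageSize);
        return new XDocument(
            new XElement(Ns + "sitemapindex",
                Enumerable.Range(1, pages).Select(p =>
                    new XElement(Ns + "sitemap",
                        new XElement(Ns + "loc", $"{baseUrl}/sitemap-{p}.xml")))));
    }
}
```

For example, 2,500 item URLs with a page size of 1,000 would yield a sitemapindex referencing sitemap-1.xml through sitemap-3.xml.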
A small utility used to check robots.txt on a web app and make sure that it is valid. The robots.txt check is based on https://www.nuget.org/packages/RobotsTxt, but that project is no longer maintained, so I copied it for myself and introduced some improvements that I find useful.
It adds a robots.txt and a robots_closed.txt to the root of the website, and adds a rewrite rule to the web.config that rewrites robots.txt to robots_closed.txt for all URLs ending with tamtam.nl.
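The entry above describes the result in words; an IIS URL Rewrite rule doing roughly that could look like the snippet below (the actual rule the package writes into web.config may differ):

```xml
<!-- Illustration only: serve robots_closed.txt in place of robots.txt
     whenever the host name ends in tamtam.nl -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Closed robots.txt" stopProcessing="true">
        <match url="^robots\.txt$" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="tamtam\.nl$" />
        </conditions>
        <action type="Rewrite" url="robots_closed.txt" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```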
Search engine management tools. This library helps developers create routing for robots.txt and sitemap.xml. It also provides facilities for creating a sitemap.xml file and loading an existing sitemap.xml file.
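The library's own registration calls are not shown in this listing; as a rough, hand-rolled illustration of what routing for robots.txt and sitemap.xml amounts to, an ASP.NET Core minimal API could map the two paths itself (content and URLs are placeholders):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.CreateBuilder(args).Build();

// Hand-rolled equivalents of the routes such a library would typically register for you.
app.MapGet("/robots.txt", () => Results.Text(
    "User-agent: *\nDisallow: /admin/\nSitemap: https://example.com/sitemap.xml\n",
    "text/plain"));

app.MapGet("/sitemap.xml", () => Results.Text(
    "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" +
    "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n" +
    "  <url><loc>https://example.com/</loc></url>\n" +
    "</urlset>\n",
    "application/xml"));

app.Run();
```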
A simple middleware built on reflection, designed to support your search engine optimized (SEO) app by dynamically creating a Sitemap.xml and Robots.txt. This package is designed to be very simple for simple use cases but allows for custom disallowed routes, user agents, and custom node endpoints by...
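This is not the package's actual API; the sketch below is just one plausible shape for middleware that emits a robots.txt with configurable disallowed routes, to make the idea concrete:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Illustrative middleware: intercept /robots.txt and write Disallow lines for
// whatever routes the application chose to exclude from crawling.
public class RobotsTxtMiddleware
{
    private readonly RequestDelegate _next;
    private readonly string[] _disallowedRoutes;

    public RobotsTxtMiddleware(RequestDelegate next, string[] disallowedRoutes)
    {
        _next = next;
        _disallowedRoutes = disallowedRoutes;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        if (context.Request.Path == "/robots.txt")
        {
            context.Response.ContentType = "text/plain";
            await context.Response.WriteAsync("User-agent: *\n");
            foreach (var route in _disallowedRoutes)
            {
                await context.Response.WriteAsync($"Disallow: {route}\n");
            }
            return;
        }

        await _next(context);
    }
}

// Registration sketch, e.g. in Program.cs (the cast keeps the string[] as a
// single constructor argument for UseMiddleware's params object[]):
// app.UseMiddleware<RobotsTxtMiddleware>((object)new[] { "/admin/", "/internal/" });
```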