A simple, reflection-based middleware that supports Search Engine Optimization (SEO) by dynamically generating a Sitemap.xml and a Robots.txt for your app. The package is designed to be simple for simple use cases, but optional parameters on the middleware extension methods allow custom disallowed routes, user agents, and custom node endpoints.

The Sitemap middleware can parse your existing controller structure and automatically include every GET endpoint in the dynamic Sitemap.xml, ignoring any POST, PUT, or DELETE endpoints. Add the [NoSiteMap] attribute to any class or method you do not want included in the Sitemap.xml, and the [Priority] attribute to set a custom priority per route. You can also provide detailed routing information for a dynamic sitemap of items, e.g. a link for each product in your products database.

The Robots middleware lets you add any number of RobotRules defining your User-Agent and Disallowed routes.
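A minimal sketch of how the pieces described above might fit together. Only the [NoSiteMap] and [Priority] attribute names and the RobotRule type come from the description; the extension-method names (UseSitemap, UseRobots), the [Priority] constructor signature, and the RobotRule constructor shown here are assumptions for illustration:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Mvc;

// Excluded entirely from the generated Sitemap.xml.
[NoSiteMap]
public class AdminController : Controller
{
    public IActionResult Index() => View();
}

public class ProductsController : Controller
{
    // Assumed signature: sets a custom sitemap priority for this route.
    [Priority(0.8)]
    public IActionResult Index() => View();

    // POST endpoints are ignored by the sitemap automatically.
    [HttpPost]
    public IActionResult Create() => Ok();
}

public class Startup
{
    public void Configure(IApplicationBuilder app)
    {
        // Hypothetical registration; check the package's actual
        // extension-method names and option types before using.
        app.UseSitemap();
        app.UseRobots(new[]
        {
            new RobotRule(userAgent: "*", disallow: new[] { "/admin" }),
        });
    }
}
```

The middleware would then serve /sitemap.xml and /robots.txt built from the controllers and rules above.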
Uplifted to .NET Standard.
Package Manager: Install-Package SiteMaps.NET -Version 2.1.3
.NET CLI: dotnet add package SiteMaps.NET --version 2.1.3
PackageReference: <PackageReference Include="SiteMaps.NET" Version="2.1.3" />
Paket CLI: paket add SiteMaps.NET --version 2.1.3
Script & Interactive: #r "nuget: SiteMaps.NET, 2.1.3"
Cake Addin: #addin nuget:?package=SiteMaps.NET&version=2.1.3
Cake Tool: #tool nuget:?package=SiteMaps.NET&version=2.1.3