HUMANS

From NikkiWiki
A "<code>[[ROBOTS|/robots.txt]]</code>" file instructs robot users how to access your website but it's also nice to have a "<code>/humans.txt</code>" file available for your human users. <ref><code>https://humanstxt.org/</code></ref>&ensp; The HUMANS ("we are people, not machines") website protocol will tell human visitors about the human(s) that built the website.
 
== documentation ==
 
* [https://humanstxt.org/ <code>humans.txt</code>: We Are People, Not Machines]
 
== editor ==
 
{{webtext editor}}
 
== directory ==
 
"<code>humans.txt</code>" should be saved to the root webdirectory ("<code>/</code>") together with "<code>/robots.txt</code>", "<code>[[SITEMAP|/sitemap.txt]]</code>", and "<code>[[SECURITY|/security.txt]]</code>".
 
<u><code>https://www.nicolesharp.net/humans.txt</code></u>
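
For example, on a typical webserver the root webdirectory might correspond to a filesystem path such as "<code>/var/www/html/</code>" (this path is only illustrative and depends on your webserver configuration), giving a layout like the following.

<pre>
/var/www/html/    (root webdirectory, served as "/")
  humans.txt      (HUMANS)
  robots.txt      (Robots Exclusion Protocol)
  security.txt    (SECURITY)
  sitemap.txt     (SITEMAP)
</pre>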
 
== Nicole Sharp's Website ==
 
"<code>/humans.txt</code>" for <u>[[Nicole Sharp's Website]]</u>.
 
<code><syntaxhighlight lang="text">
/* TEAM */
Admin: Nicole Sharp
Site: https://www.nicolesharp.net/
Twitter: https://www.twitter.com/nicolesharp100/
Location: Cumberland, Allegany, Maryland, United States of America (US/USA)
 
/* THANKS */
Allegany.EDU: https://www.allegany.edu/
FileZilla: https://www.filezilla-project.org/
Frostburg.EDU: https://www.frostburg.edu/
GIMP: https://www.gimp.org/
Google: https://www.google.com/
HighlightJS: https://www.highlightjs.org/
HumansTXT: https://humanstxt.org/
Inkscape: https://www.inkscape.org/
IrfanView: https://www.irfanview.com/
Matomo: https://www.matomo.org/
Microsoft: https://www.microsoft.com/
Mozilla: https://www.mozilla.org/
Notepad-Plus-Plus: https://www.notepad-plus-plus.org/
Oracle: https://www.oracle.com/
PDFSAM: https://www.pdfsam.org/
PHP: https://www.php.net/
RobotsTXT: https://www.robotstxt.org/
SecurityTXT: https://www.securitytxt.org/
Sitemaps: https://www.sitemaps.org/
Sumatra PDF Reader: https://www.sumatrapdfreader.org/
W3C: https://www.w3.org/
WHATWG: https://www.whatwg.org/
Wikimedia: https://www.wikimedia.org/
Yandex: https://www.yandex.com/
 
/* SITE */
Last update: 2023
Language: English (EN/ENG)
Software: Microsoft Windows, Notepad-Plus-Plus, FileZilla, Matomo, Wikimedia MediaWiki, HighlightJS, PDFSAM, Sumatra PDF Reader, Inkscape, GIMP, IrfanView
</syntaxhighlight></code>
 
== TEAM ==
 
"<code>TEAM</code>" provides contact info for the human development team (you).
 
<pre>
/* TEAM */
Title: Name
Site: URL
Twitter: URL
Location: City, County, State, Country (ISO3166Alpha2/ISO3166Alpha3)
</pre>
 
I recommend including the translingual two-letter and three-letter [[wikipedia:ISO 3166|ISO 3166]] country codes for non-English speakers.
 
Additional URLs (uniform resource locators) to social media webprofiles can be added as desired.
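
For example, a filled-in "<code>TEAM</code>" block for a hypothetical webmaster (all names and URLs below are placeholders, not real contact information) could look like this.

<pre>
/* TEAM */
Webmaster: Jane Example
Site: https://www.example.net/
Twitter: https://www.twitter.com/example/
Location: Toronto, Ontario, Canada (CA/CAN)
</pre>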
 
== THANKS ==
 
"<code>THANKS</code>" gives thanks to anyone who helped you build the website.&ensp; I use this field to give thanks to organizations and corporations that have provided free software to help me build the website.
 
<pre>
/* THANKS */
Name: URL
Name: URL
</pre>
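
For example, a minimal "<code>THANKS</code>" block crediting two (hypothetical) projects might read as follows.

<pre>
/* THANKS */
ExampleCMS: https://www.example.org/
ExampleHost: https://www.example.com/
</pre>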
 
== SITE ==
 
"<code>SITE</code>" provides additional information about the website.
 
<pre>
/* SITE */
Last update: Year
Language: LanguageName (ISO639-1/ISO639-3)
Software: Application, Application
</pre>
 
"<code>Last update</code>" should be year-only to avoid confusion between whether this refers to the last update of "<code>/humans.txt</code>" or the last update of the website ("<code>/*</code>").&ensp; Content on the website may be being updated every day, but you shouldn't have to update "<code>/humans.txt</code>" that often.
 
I recommend including the translingual two-letter and three-letter [[wikipedia:ISO 639|ISO 639]] language codes for non-English speakers.
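
For example, a hypothetical "<code>SITE</code>" block for a German-language website (values are placeholders) could look like this, with the ISO 639-1 and ISO 639-3 language codes given in parentheses.

<pre>
/* SITE */
Last update: 2023
Language: German (DE/DEU)
Software: Debian, Vim, GIMP
</pre>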
 
== comments ==
 
The HUMANS protocol doesn't formally specify a way to add comments.&ensp; <abbr title="Cascading Style Sheets">CSS</abbr>-style comment notation is already used for the section titles.&ensp; Since this is a protocol designed to be read by humans and not by machines, you have considerable flexibility to modify the syntax of "<code>/humans.txt</code>" however you like as long as the content is understandable by humans (there is already a machine-readable "<code>/robots.txt</code>" for the bots).&ensp; If you need to add a comment, I recommend using the same comment notation as the ROBOTS and SECURITY protocols, with each comment on a new line beginning with a hash ("<code>#</code>"). <ref><code>https://www.rfc-editor.org/rfc/rfc9309</code></ref> <ref><code>https://www.rfc-editor.org/rfc/rfc9116</code></ref>
 
<code><pre>
# A comment.
</pre></code>
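
Used this way, the top of a commented "<code>/humans.txt</code>" (with placeholder values) might look like the following.

<code><pre>
# This file follows the HUMANS protocol (https://humanstxt.org/).
# Machine-readable crawler rules belong in /robots.txt instead.

/* TEAM */
Admin: Jane Example
Site: https://www.example.net/
</pre></code>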
 
== ROBOTS ==
 
HUMANS can be added to the Robots Exclusion Protocol ("<code>/robots.txt</code>") as a comment ("<code>#</code>"). <ref><u><code>[[ROBOTS#SECURITY]]</code></u></ref> <ref><code>https://www.robotstxt.org/</code></ref>&ensp; An example Robots Exclusion Protocol with HUMANS is given below.
 
<highlight lang="robots">
User-agent: *
Disallow:
Sitemap: https://www.example.net/sitemap.txt
# Security: https://www.example.net/security.txt
# Humans: https://www.example.net/humans.txt
</syntaxhighlight>


== see also ==


* <u><code>[[SITEMAP]]</code></u>
* <u><code>https://www.nicolesharp.net/humans.txt</code></u>
* <code>https://humanstxt.org/</code>
* <u><code>[[SECURITY]]</code></u>
* <u><code>[[ROBOTS#SECURITY]]</code></u>
 
== references ==
 
<references />


== keywords ==


<code>development, HUMANS, humans.txt, ROBOTS, robots.txt, TXT, webdevelopment</code>


{{#seo:|keywords=development, HUMANS, humans.txt, ROBOTS, robots.txt, TXT, webdevelopment}}


[[category:Nicole Sharp's Website]]
[[category:webdevelopment]]
