HUMANS
A "/robots.txt" file instructs robots on how to access your website, but it's also nice to have a "/humans.txt" file available for your human visitors. [1] The HUMANS ("we are people, not machines") website protocol tells human visitors about the human(s) who built the website.
https://www.nicolesharp.net/humans.txt
documentation
editor
As with all webtext files, you should use an advanced text editor such as Notepad-Plus-Plus (not Microsoft Windows Notepad). [2] Files should be saved with Unix line endings and UTF-8 (Unicode Transformation Format Eight-Bit) character encoding.
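These two settings can also be enforced programmatically rather than through the editor. A minimal Python sketch (the file contents here are only a placeholder example): passing encoding="utf-8" and newline="\n" to open() produces UTF-8 with Unix (LF) line endings even on Microsoft Windows, where the default would otherwise be CRLF.

```python
# Write a minimal "humans.txt" with UTF-8 encoding and Unix (LF) line
# endings. The contents below are a placeholder, not a required template.
lines = [
    "/* TEAM */",
    "Admin: Example Admin",
    "Site: https://www.example.net/",
]

# newline="\n" forces LF endings even on Microsoft Windows,
# where the text-mode default would otherwise translate to CRLF.
with open("humans.txt", "w", encoding="utf-8", newline="\n") as f:
    f.write("\n".join(lines) + "\n")
```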
directory
"humans.txt" should be saved to the root webdirectory ("/") together with "/robots.txt", "/sitemap.txt", and "/security.txt".
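A quick way to see which of these root webtext files a site actually serves is to issue HTTP HEAD requests for each path. A best-effort Python sketch using only the standard library (the check_root_files helper and its return shape are my own, not part of any of these protocols):

```python
# Check which of the root webtext files a host serves, via HTTP HEAD.
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

ROOT_FILES = ["/robots.txt", "/humans.txt", "/sitemap.txt", "/security.txt"]

def check_root_files(host: str) -> dict[str, bool]:
    """Return {path: served?} for each root webtext file on the host."""
    results = {}
    for path in ROOT_FILES:
        try:
            req = Request(f"https://{host}{path}", method="HEAD")
            with urlopen(req, timeout=10) as resp:
                results[path] = resp.status == 200
        except (HTTPError, URLError, OSError):
            results[path] = False
    return results
```

For example, check_root_files("www.nicolesharp.net") would report whether each of the four files is reachable at that site's root.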
Nicole Sharp's Website
"/humans.txt" for Nicole Sharp's Website.
/* TEAM */
Admin: Nicole Sharp
Site: https://www.nicolesharp.net/
Twitter: https://www.twitter.com/nicolesharp100/
Location: Cumberland, Allegany, Maryland, United States of America (US/USA)
/* THANKS */
Allegany.EDU: https://www.allegany.edu/
FileZilla: https://www.filezilla-project.org/
Frostburg.EDU: https://www.frostburg.edu/
GIMP: https://www.gimp.org/
Google: https://www.google.com/
HighlightJS: https://www.highlightjs.org/
HumansTXT: https://humanstxt.org/
Inkscape: https://www.inkscape.org/
IrfanView: https://www.irfanview.com/
Matomo: https://www.matomo.org/
Microsoft: https://www.microsoft.com/
Mozilla: https://www.mozilla.org/
Notepad-Plus-Plus: https://www.notepad-plus-plus.org/
Oracle: https://www.oracle.com/
PDFSAM: https://www.pdfsam.org/
PHP: https://www.php.net/
RobotsTXT: https://www.robotstxt.org/
SecurityTXT: https://www.securitytxt.org/
Sitemaps: https://www.sitemaps.org/
Sumatra PDF Reader: https://www.sumatrapdfreader.org/
W3C: https://www.w3.org/
WHATWG: https://www.whatwg.org/
Wikimedia: https://www.wikimedia.org/
Yandex: https://www.yandex.com/
/* SITE */
Last update: 2023
Language: English (EN/ENG)
Software: Microsoft Windows, Notepad-Plus-Plus, FileZilla, Matomo, Wikimedia MediaWiki, HighlightJS, PDFSAM, Sumatra PDF Reader, Inkscape, GIMP, IrfanView
TEAM
"TEAM" provides contact info for the human development team (you).
/* TEAM */
Title: Name
Site: URL
Twitter: URL
Location: City, County, State, Country (ISO3166Alpha2/ISO3166Alpha3)
I recommend including the translingual two-letter and three-letter ISO 3166 country codes for non-English speakers.
Additional URLs (uniform resource locators) to social media webprofiles can be added as desired.
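Because these sections are just informal "Key: Value" lines, they can be read back with a simple best-effort parser. A minimal Python sketch (the parse_fields helper is hypothetical, not part of any HUMANS tooling):

```python
# Parse the "Key: Value" lines of a humans.txt section into a dict.
# Section titles ("/* TEAM */"), hash comments, and blank lines are
# skipped; the format is informal, so this is only best-effort.
def parse_fields(text: str) -> dict[str, str]:
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("/*") or line.startswith("#"):
            continue  # skip section titles, comments, and blank lines
        if ":" in line:
            # Split on the FIRST colon only, so URL values survive intact.
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields

team = parse_fields("""/* TEAM */
Admin: Nicole Sharp
Site: https://www.nicolesharp.net/
""")
```

Splitting on only the first colon matters here, since most of the values are URLs that themselves contain colons.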
THANKS
"THANKS" gives thanks to anyone who helped you build the website. I use this field to thank the organizations and corporations that have provided free software to help me build the website.
/* THANKS */
Name: URL
Name: URL
SITE
"SITE" provides additional information about the website.
/* SITE */
Last update: Year
Language: LanguageName (ISO639-1/ISO639-3)
Software: Application, Application
"Last update" should be year-only to avoid confusion over whether it refers to the last update of "/humans.txt" or the last update of the website as a whole ("/*"). Content on the website may be updated every day, but you shouldn't have to update "/humans.txt" that often.
I recommend including the translingual two-letter and three-letter ISO 639 language codes for non-English speakers.
comments
The HUMANS protocol doesn't formally specify a way to add comments, and CSS-style comment notation is already used for the section titles. Since this is a protocol designed to be read by humans and not by machines, you have considerable flexibility to modify the syntax of "/humans.txt" however you like, as long as the content is understandable by humans (there is already a machine-readable "/robots.txt" for the bots). If you need to add a comment, I recommend using the same comment notation as the ROBOTS and SECURITY protocols, with each comment on a new line beginning with a hash ("#"). [3] [4]
# A comment.
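If a machine ever does need to skim "/humans.txt", hash comments in this style are easy to remove first. A minimal Python sketch (strip_comments is a hypothetical helper, not part of any standard):

```python
# Remove lines whose first non-whitespace character is "#", i.e. the
# ROBOTS/SECURITY-style comments recommended above for humans.txt.
def strip_comments(text: str) -> str:
    return "\n".join(
        line for line in text.splitlines()
        if not line.lstrip().startswith("#")
    )
```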
ROBOTS
HUMANS can be added to the Robots Exclusion Protocol ("/robots.txt") as a comment ("#"). [5] An example Robots Exclusion Protocol file with HUMANS is given below.
User-agent: *
Disallow:
Sitemap: https://www.example.net/sitemap.txt
# Security: https://www.example.net/security.txt
# Humans: https://www.example.net/humans.txt
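Since the "Security:" and "Humans:" lines ride along in "/robots.txt" only as comments, a reader has to look inside the "#" lines to recover them. A best-effort Python sketch (find_comment_links is my own helper; these field names come from the example above and are not part of the ROBOTS standard):

```python
# Recover "Key: URL" hints hidden in "#" comment lines of a robots.txt.
def find_comment_links(robots_txt: str) -> dict[str, str]:
    links = {}
    for line in robots_txt.splitlines():
        line = line.strip()
        if not line.startswith("#"):
            continue  # only comment lines can carry these hints
        body = line.lstrip("#").strip()
        key, sep, value = body.partition(":")
        if sep and value.strip().startswith("http"):
            links[key.strip()] = value.strip()
    return links

robots = """User-agent: *
Disallow:
Sitemap: https://www.example.net/sitemap.txt
# Security: https://www.example.net/security.txt
# Humans: https://www.example.net/humans.txt
"""
```

Note that the uncommented "Sitemap:" line is intentionally ignored, since robots already parse it as part of the ROBOTS protocol itself.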
references
- [1] https://humanstxt.org/
- [2] https://www.notepad-plus-plus.org/
- [3] https://www.rfc-editor.org/rfc/rfc9309
- [4] https://www.rfc-editor.org/rfc/rfc9116
- [5] https://www.robotstxt.org/
keywords
development, HUMANS, humans.txt, ROBOTS, robots.txt, TXT, webdevelopment