Embarrassingly, I lost the code to this site when my laptop died.
After losing the code, I stopped updating the site. But a few days ago I decided I wanted to be able to post things again, and I'm not a normal person, so I wasn't going to let my website disappear.
If only there was an organisation that archives the internet. Oh wait, there is.
So how do I force a crawl? There is a Stack Overflow post saying I can poke a particular URL to trigger one, and it even includes a curl command to do it.
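For the record, the trick is just an HTTP request against the Wayback Machine's Save Page Now endpoint. I'm going from memory rather than quoting the exact Stack Overflow answer, but it's roughly this (with example.com standing in for your own URL):

    # Ask the Wayback Machine's Save Page Now endpoint to capture a page.
    # https://example.com/ is a placeholder; swap in the URL you want archived.
    curl -s "https://web.archive.org/save/https://example.com/" -o /dev/null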
However, it's rate-limited, and it doesn't seem like the blessed way.
Take 2. ArchiveBot.
I had to jump on IRC and ask. The nice people did it straight away. Superb experience, would recommend.
It didn't appear on The Wayback Machine straight away, but this was expected; the wiki had told me it would take a while. ArchiveBot produces a WARC file, which gets sent to the Internet Archive, which eventually gets loaded into The Wayback Machine. It's a nice system, and at this level, easy to understand. I'm sure there are a ton of complicated details, and I have some questions about authenticity, but I trust the people involved have already thought about all of this way more than I have.
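As an aside, if you're impatient and want to check whether a capture has landed yet, the Wayback Machine has an availability API you can poll; something along these lines should do it (again, example.com is a stand-in):

    # Query the Wayback Machine availability API for the latest snapshot of a URL.
    # example.com is a placeholder for the site being checked.
    curl -s "https://archive.org/wayback/available?url=example.com"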
Many thanks to everyone involved.