On Friday, 8 June 2018 at 22:06:29 UTC, Walter Bright wrote:
On 6/8/2018 3:02 PM, Brad Roberts wrote:
Essentially (if not actually) everything on GitHub is available through their APIs. No need for scraping or other heroics to gather it.

That's good to know! The situation I was concerned with was it all going dark suddenly.

BTW, if someone wants to build a scraper that'll produce static web pages of the dlang PR discussions, that would be pretty cool!

There are plenty of third-party tools that archive GitHub.

For example, https://www.gharchive.org/. GitHub advertises some of them at https://help.github.com/articles/about-archiving-content-and-data-on-github/#third-party-archival-projects and https://help.github.com/articles/backing-up-a-repository/.
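For what it's worth, Walter's static-page idea doesn't need scraping at all; the PR comment threads are exposed by GitHub's public REST API. Here's a minimal sketch in Python (the endpoint path is the documented GitHub v3 REST API; the repo, PR number, and output filename are just illustrative assumptions):

```python
# Sketch: dump a PR's discussion to a static JSON file via the GitHub REST API.
# No auth token is used, so unauthenticated rate limits apply.
import json
import urllib.request

API_ROOT = "https://api.github.com"

def pr_comments_url(owner, repo, pr_number):
    """Build the v3 REST endpoint for a pull request's comment thread.

    PR discussion comments live under the /issues/ route, since PRs
    are issues in GitHub's data model.
    """
    return f"{API_ROOT}/repos/{owner}/{repo}/issues/{pr_number}/comments"

def fetch_comments(owner, repo, pr_number):
    """Download the comment thread for one PR as a list of dicts."""
    req = urllib.request.Request(
        pr_comments_url(owner, repo, pr_number),
        headers={"Accept": "application/vnd.github+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Hypothetical example: archive one dlang/dmd PR discussion locally.
    comments = fetch_comments("dlang", "dmd", 1)
    with open("dmd-pr-1.json", "w") as f:
        json.dump(comments, f, indent=2)
```

From there, rendering the JSON into static HTML pages is straightforward templating; pagination (the `page` query parameter) would be needed for long threads.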

Personally, I think the fear of Microsoft ruining GitHub is completely unfounded. Just look at what they did with Xamarin: they bought an interesting product, made it free for individuals, open-sourced it, and improved it drastically. And they hardly hate Linux these days; .NET Core exists partly to improve Linux / cross-platform support.
