Smalyshev added a comment.
@Nikki As a side note - please avoid running queries like "give me ten million entries" unless you really, really, really need to. Such a query most probably won't be stable, and it creates load on the service that makes it harder for others to use - we're still running on only two machines. And your browser may not be happy loading several hundred megs of JSON into memory either.
This particular query produces hundreds of megabytes of data. I don't really consider such a thing a recommended use case for the service - if you really need data to the tune of several hundred megs, processing the dump is the best avenue for now. Once we have an LDF implementation, there will be other ways too.
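If the dump route isn't an option, a result set that large is usually better fetched in pages rather than in one request. A minimal sketch of LIMIT/OFFSET paging (not from this thread - the helper name and page size are illustrative assumptions; the endpoint URL is the public WDQS one):

```python
# Sketch: page a large SPARQL result set instead of requesting it all at once.
# The page size (10,000) and helper name are assumptions for illustration.

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"  # public query endpoint

def paged(query: str, page_size: int, page: int) -> str:
    """Append LIMIT/OFFSET so each request returns one manageable page."""
    return f"{query}\nLIMIT {page_size}\nOFFSET {page * page_size}"

# Example: request pages of 10,000 rows rather than millions in one call.
base = "SELECT ?item WHERE { ?item wdt:P31 wd:Q5 }"
first_page = paged(base, 10_000, 0)   # LIMIT 10000 OFFSET 0
second_page = paged(base, 10_000, 1)  # LIMIT 10000 OFFSET 10000
```

Each paged query can then be sent to `WDQS_ENDPOINT` in turn, keeping individual responses small and the load on the service bounded. (Note that LIMIT/OFFSET paging is only reliable with a stable ORDER BY, and deep offsets still cost the server work - which is why the dump remains the better choice for truly huge extractions.)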
Cc: Smalyshev, gerritbot, Jonas, Nikki, Aklapper, mschwarzer, Avner, Lewizho99, Maathavan, debt, Gehel, D3r1ck01, FloNight, Xmlizer, Izno, jkroll, Wikidata-bugs, Jdouglas, aude, Deskana, Manybubbles, Mbch331
_______________________________________________ Wikidata-bugs mailing list Wikidataemail@example.com https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs