matej_suchanek added a comment.

  With some refactors done, this is the last thing that Pywikibot doesn't 
support yet.
  
  > Not sure what we have to do exactly for it.
  
  Perhaps let's outline where we want to get:
  
  ----
  
  - We want to query associated mediainfo objects, similar to how we query 
Wikidata items associated with articles. Specifically, we want the following 
to work (and not break):
  
    import pywikibot
    site = pywikibot.Site('commons', 'commons')
    page = pywikibot.Page(site, 'Wikipedia')
    item = page.data_item()  # pywikibot.ItemPage(wikidata, 'Q52')
    file = pywikibot.FilePage(site, 'Wikipedia-logo-v2.svg')
    file_item = file.data_item()  # an object for M10337301
  
  ----
  
  - We want to load data from mediainfo objects, and that data must correctly 
refer to its origin:
  
    import pywikibot
    
    file_repo = pywikibot.Site('commons', 'commons')
    file = pywikibot.FilePage(file_repo, 'Würfelzucker_--_2018_--_3564.jpg')
    file_item = file.data_item()  # an object for M71019999
    caption = file_item.get()['labels']['en']  # 'Sugar cubes (2018)'
    depicts = file_item.statements['P180'][0].getTarget()  # pywikibot.ItemPage(wikidata, 'Q1042920')
  
  ----
  
  - We want to save only data that refers to the appropriate sites:
  
    import pywikibot
    
    file_repo = pywikibot.Site('commons', 'commons')
    data_repo = pywikibot.Site('wikidata', 'wikidata')
    
    file = pywikibot.FilePage(file_repo, 'FooBar.jpg')
    file_data = file.data_item()
    
    item = pywikibot.ItemPage(data_repo, 'Q146')
    depicts = pywikibot.Claim(data_repo, 'P180')
    depicts.setTarget(item)
    
    file_data.addClaim(depicts)
  
  (It seems nobody has cared about this yet; I was just able to save a claim 
on Wikidata 
<https://www.wikidata.org/w/index.php?title=Q4115189&diff=1190652304&oldid=1190580729>
 with Q1 constructed as `pywikibot.ItemPage(pywikibot.Site('test', 'wikidata'), 
'Q1')`.)
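
  For clarity, here is a minimal sketch of the kind of repository check that is 
currently missing; the inline validation below is only an assumption about how 
it could behave, not existing Pywikibot behaviour:

    import pywikibot

    data_repo = pywikibot.Site('wikidata', 'wikidata')
    test_repo = pywikibot.Site('test', 'wikidata')

    claim = pywikibot.Claim(data_repo, 'P180')
    # Q1 from the test repository; at the moment nothing stops it from
    # becoming the target of a claim saved on the production repository.
    target = pywikibot.ItemPage(test_repo, 'Q1')

    # Hypothetical validation: reject targets from a foreign repository.
    if target.site != claim.repo:
        raise ValueError('target belongs to a different repository than the claim')
    claim.setTarget(target)

  Such a check could live in `Claim.setTarget()` or in the save methods once 
the repository can be determined reliably.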
  
  ----
  
  The data from the API doesn't seem to indicate the provenance 
<https://commons.wikimedia.org/w/api.php?action=wbgetentities&format=json&ids=M71019999&props=info%7Cclaims>,
 and neither does the general site information from the API 
<https://commons.wikimedia.org/w/api.php?action=query&format=json&meta=siteinfo%7Cwikibase&siprop=general%7Cnamespaces&wbprop=url%7Csiteid>,
 so we will have to hard-code this in our "family files". Either way, Commons 
has to become a `DataSite` in Pywikibot, and `DataSite` will get new methods 
for determining the repository. They will be useful e.g. for `Claim` objects 
when setting or validating the repository.
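
  To illustrate the last point, a rough sketch of what such a hard-coded 
mapping could look like once Commons is a `DataSite`; the `repository_for` 
helper and the mapping itself are assumptions for illustration, not existing 
Pywikibot API:

    import pywikibot

    # Hypothetical hard-coded mapping: entity type -> (code, family) of the
    # repository hosting entities of that type.
    ENTITY_REPOSITORIES = {
        'mediainfo': ('commons', 'commons'),
        'item': ('wikidata', 'wikidata'),
        'property': ('wikidata', 'wikidata'),
    }

    def repository_for(entity_type):
        """Return the site hosting entities of the given type."""
        code, family = ENTITY_REPOSITORIES[entity_type]
        return pywikibot.Site(code, family)

    # e.g. a mediainfo edit would go to repository_for('mediainfo') (Commons),
    # while an item target of a P180 claim would be validated against
    # repository_for('item') (Wikidata).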

TASK DETAIL
  https://phabricator.wikimedia.org/T173195
