On Mon, Aug 3, 2009 at 12:55 PM, Paul Houle p...@ontology2.com wrote:
For what it's worth, metaweb seems to largely remove ListOf pages
when adding wikipedia resources to Freebase.
Actually they get added to Freebase, but usually without a type
(unless they accidentally get mistyped as a Person).
[I left dbpedia-discuss on the distribution, but I'm not sure why they
got tacked on at the very end of the conversation. They'll probably
need to go check the data-modeling archives to get caught up on what
the conversation was about.]
On Tue, Aug 18, 2009 at 10:01 AM, Paul
This example query from the DBpedia page returns a list of companies
which all have exactly 151,000 employees:
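A query along those lines (a sketch only; the `dbo:` class and property names are my assumption, not copied from the DBpedia page) would look something like:

```sparql
PREFIX dbo: <http://dbpedia.org/ontology/>

SELECT ?company WHERE {
  ?company a dbo:Company ;
           dbo:numberOfEmployees 151000 .
}
```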
On Tue, Apr 20, 2010 at 1:45 AM, Jens Lehmann
lehm...@informatik.uni-leipzig.de wrote:
For some link datasets in DBpedia, there is no proper update mechanism
included in the DBpedia SVN repository. In such cases, the link data
sets are copied from the previous release. For Geonames, this means
On Tue, Apr 20, 2010 at 3:22 PM, Jens Lehmann
lehm...@informatik.uni-leipzig.de wrote:
Hello,
Tom Morris wrote:
On Tue, Apr 20, 2010 at 1:45 AM, Jens Lehmann
lehm...@informatik.uni-leipzig.de wrote:
For some link datasets in DBpedia, there is no proper update mechanism
included
On Thu, Apr 29, 2010 at 12:24 AM, Kingsley Idehen
kide...@openlinksw.com wrote:
Tom Morris wrote:
On Wed, Apr 28, 2010 at 6:28 PM, Kingsley Idehen kide...@openlinksw.com
wrote:
We are syncing Live with Wikipedia now that 3.5.1 cut is out.
Do those announcements not get posted here? Where
2010/5/4 Benjamin Großmann benja...@neofonie.de:
The abstract property has changed since DBpedia version 3.5:
instead of dbpedia2:abstract you now have to query for dbo:abstract. Then
your query works.
Do properties get deprecated for some period of time before they go
away? Are
On Mon, Jun 21, 2010 at 2:53 PM, Paul Houle p...@ontology2.com wrote:
Benjamin Good wrote:
Cassio,
The short answer to your question (as I understood it) is that you could
not issue such a query to the dbpedia sparql endpoint by itself. Somehow
you would need to get access to an endpoint
You could use the Freebase data dumps to narrow down what you're
looking for and then go to DBpedia for any missing information.
They're produced weekly and include both DBpedia IDs and the
original Wikipedia article numbers, so you can easily link to either.
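For illustration, pulling those fields out of a dump line might look like the sketch below. The tab-separated layout (subject, predicate, target, value) is assumed from the `bzgrep` output quoted later in this thread, not taken from a spec.

```python
# Parse one line of the old Freebase quad dump. The four-column
# tab-separated layout is an assumption based on the bzgrep output
# quoted in this thread, not on official documentation.
def parse_quad(line):
    parts = line.rstrip("\n").split("\t")
    parts += [""] * (4 - len(parts))  # pad short lines to four fields
    return dict(zip(("subject", "predicate", "target", "value"), parts))

sample = "/m/010pld\t/type/object/key/wikipedia/it\t\tYorktown_$0028Virginia$0029"
quad = parse_quad(sample)
# quad["subject"] holds the Freebase MID, quad["value"] the Wikipedia key
```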
On Thu, Nov 11, 2010 at 12:32 PM, Kingsley Idehen
kide...@openlinksw.com wrote:
On 11/11/10 11:55 AM, Tom Morris wrote:
You could use the Freebase data dumps to narrow down what you're
looking for and then go to DBpedia for any missing information.
They're produced weekly and include both DBpedia
On Mon, Jan 31, 2011 at 11:36 AM, Dan Brickley dan...@danbri.org wrote:
Hi folks
I have just taken the list of old Archive.org-hosted movies in
http://tech.blorge.com/2010/08/11/top-40-best-free-legal-movies-you-can-download-right-now/
and linked them by hand to DBpedia. In
On Mon, Jan 31, 2011 at 11:36 AM, Dan Brickley dan...@danbri.org wrote:
I have just taken the list of old Archive.org-hosted movies in
http://tech.blorge.com/2010/08/11/top-40-best-free-legal-movies-you-can-download-right-now/
and linked them by hand to DBpedia.
...
My working
I agree it'd be nice to fix the encoding of parentheses, but with regard to #:
A similar thing is also the case for URLs that include an #
e.g.
http://dbpedia.org/resource/Midfielder%23Winger is its own resource, while
http://dbpedia.org/resource/Midfielder#Winger returns the contents
for
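The difference is standard URI semantics: a raw `#` begins a fragment identifier that clients strip before sending the request, while `%23` stays part of the path. A quick Python illustration using the URIs from the message above:

```python
from urllib.parse import quote, urlsplit

encoded = "http://dbpedia.org/resource/Midfielder%23Winger"
fragment = "http://dbpedia.org/resource/Midfielder#Winger"

# A raw '#' starts a fragment; everything after it never reaches the server.
path_of_fragment_uri = urlsplit(fragment).path  # '/resource/Midfielder'

# A percent-encoded '%23' stays inside the path, naming a distinct resource.
path_of_encoded_uri = urlsplit(encoded).path    # '/resource/Midfielder%23Winger'

# quote() is how you would mint the percent-encoded form yourself.
assert quote("Midfielder#Winger", safe="") == "Midfielder%23Winger"
```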
2011/6/30 Luis Galárraga shamant...@gmail.com:
Thank you very much for your prompt response and your help. I do not have
too much time for the presentation so I have used the information you have
provided plus the dates of release in a simple OpenOffice Calc chart which I
am sharing now.
On Mon, Dec 26, 2011 at 7:26 PM, Patrick Cassidy p...@micra.com wrote:
I have looked briefly at the DBpedia ontology and it appears to leave a
great deal to be desired in terms of what an ontology is best suited for: to
carefully and precisely define the meanings of terms so that they can be
On Wed, Feb 15, 2012 at 9:39 AM, Martin Giese marti...@ifi.uio.no wrote:
in general, what is the process to get new extraction functionality
added to dbpedia? Do I submit a feature request and wait and hope, or
should I write code and propose it for inclusion in the extraction
framework?
On Thu, Feb 16, 2012 at 4:11 AM, Martin Giese marti...@ifi.uio.no wrote:
Norway has the additional advantage of being so small that only a few
hundred companies, organizations, and institutions are present on
Wikipedia. (At least with an org. nr. For the rest, we have little
hope of linking
On Wed, May 23, 2012 at 5:44 AM, Ziqi Zhang
ziqizhang.em...@googlemail.com wrote:
My task is to extract candidate concepts/entities for an ambiguous term
from dbpedia, e.g., cat (disambiguation).
Like some of the other answers, not directly relevant, but another
signal that you can use is
On Wed, May 30, 2012 at 10:29 AM, Jona Christopher Sahnwaldt
j...@sahnwaldt.de wrote:
@developers: We will have to discuss what's the best way to do this...
- Add a configuration value decimalSeparator whose value may be dot or
comma: ',' or '.'. A bit hard to read... We would also need a
I know it has been correct in the past, so this only affects some subset of
the releases (not sure which ones).
Tom
On Jul 10, 2012 8:50 AM, Jona Christopher Sahnwaldt j...@sahnwaldt.de
wrote:
DBpedia linked to the wrong URI, but that has been fixed. In the
upcoming release, we'll use the dot.
URIs like http://rdf.freebase.com/ns/en.steve_martin
would require a bit more coding, but if people (someone at Freebase?)
think DBpedia should do that, we could probably add it for 3.9.
Christopher
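For illustration, converting a Freebase id to the dotted rdf.freebase.com namespace form mentioned above could be sketched as follows. The helper name is mine; the slash-to-dot rule is the commonly used Freebase namespace convention.

```python
def freebase_id_to_ns_uri(fb_id):
    """Map a Freebase id such as '/en/steve_martin' or '/m/010pld' to the
    dotted rdf.freebase.com namespace form: drop the leading slash and
    turn the remaining slashes into dots. (Helper name is mine; the
    slash-to-dot rule is the usual Freebase namespace convention.)"""
    return "http://rdf.freebase.com/ns/" + fb_id.lstrip("/").replace("/", ".")

assert freebase_id_to_ns_uri("/en/steve_martin") == \
    "http://rdf.freebase.com/ns/en.steve_martin"
```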
On Tue, Jul 10, 2012 at 2:57 PM, Tom Morris tfmor...@gmail.com wrote:
I know it has been
On Wed, Aug 29, 2012 at 11:22 AM, Michael Douma micha...@idea.org wrote:
I have a question about using DBpedia properties.
I'm working on an iPad app for visually browsing Wikipedia with radial
tree layouts. My colleagues and I published a proof-of-concept app
called 'WikiNodes':
On Thu, Aug 30, 2012 at 7:58 AM, Kingsley Idehen kide...@openlinksw.com wrote:
On 8/29/12 11:53 PM, Michael Douma wrote:
@Kingsley:
Our app will work offline, so we will need to create packages. We
probably will not send users to DBpedia in realtime. Also, we are
merging and massaging
On Fri, Sep 14, 2012 at 1:25 PM, Kingsley Idehen kide...@openlinksw.com wrote:
All,
Here is a rehash of a post I made last week about DBpedia attribution in
line with the fundamental goals and principles of Linked Open Data.
** Attribution Guidelines Start ***
A better place for this would
On Wed, Oct 10, 2012 at 12:23 PM, Julien Cojan julien.co...@inria.fr wrote:
Ok, it is hard to discuss results on changing data.
There is a mess with this example, Piyu_Bole
(http://dbpedia.org/resource/Piyu_Bole), because it was the page of a
song:
On Thu, Oct 11, 2012 at 9:48 AM, Venkatesh Channal
venkateshchan...@gmail.com wrote:
Hi,
I think I am still missing on how to query the information.
On executing the query:
SELECT * WHERE { ?s <http://purl.org/dc/terms/subject>
<http://dbpedia.org/resource/Category:Hindi-language_films> . ?s
structuring styles by different Wikipedians.
I'd love to see the extractor get this smart, but I'm not holding my breath.
Tom
On Thu, Oct 11, 2012 at 4:15 PM, Tom Morris tfmor...@gmail.com wrote:
On Thu, Oct 11, 2012 at 9:48 AM, Venkatesh Channal
venkateshchan...@gmail.com wrote:
Hi,
I
/Category:Hindi_songs . ?song rdf:type
<http://dbpedia.org/ontology/Work> . ?song
<http://dbpedia.org/ontology/artist> ?artist . ?song
<http://dbpedia.org/ontology/runtime> ?runtime . FILTER
(regex(str(?song),'P'))} LIMIT 100
Cheers,
Pablo
On Thu, Oct 11, 2012 at 4:15 PM, Tom Morris tfmor...@gmail.com wrote
The Freebase dump includes direct links for topics which have interwiki
links:
$ bzgrep /wikipedia/it freebase-datadump-quadruples.tsv.bz2 | head
/m/010pld /type/object/key/wikipedia/it
Yorktown_$0028Virginia$0029
/m/010pld /type/object/key/wikipedia/it_id
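The `$0028`/`$0029` sequences are MQL key escaping, where `$` plus four hex digits encodes a Unicode code point. A small decoder sketch, assuming that escaping rule:

```python
import re

def decode_mql_key(key):
    """Decode Freebase MQL key escaping: '$' followed by four hex digits
    stands for that Unicode code point, e.g. '$0028' is '('."""
    return re.sub(r"\$([0-9A-Fa-f]{4})",
                  lambda m: chr(int(m.group(1), 16)),
                  key)

assert decode_mql_key("Yorktown_$0028Virginia$0029") == "Yorktown_(Virginia)"
```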
On Wed, Oct 17, 2012 at 10:23 AM, Dimitris Kontokostas jimk...@gmail.com wrote:
On Wed, Oct 17, 2012 at 5:10 PM, Tom Morris tfmor...@gmail.com wrote:
The Freebase dump includes direct links for topics which have interwiki
links:
$ bzgrep /wikipedia/it freebase-datadump-quadruples.tsv.bz2
will see whether it is worth automating the task or not.
Thank you all for the advice!
Cheers,
Marco
On 10/17/12 4:10 PM, Tom Morris wrote:
The Freebase dump includes direct links for topics which have interwiki
links:
$ bzgrep /wikipedia/it freebase-datadump-quadruples.tsv.bz2 | head
I'm confused by the editorial comments at the end:
On Fri, Nov 16, 2012 at 12:52 PM, p...@ontology2.com wrote:
I’d like to advise Freebase to resume publication of the quad dump until
it can demonstrate the correctness of any alternative data export. In fact,
with infovore available under
On Wed, Nov 28, 2012 at 10:42 AM, Jona Christopher Sahnwaldt
j...@sahnwaldt.de wrote:
On Wed, Nov 28, 2012 at 4:41 PM, Jona Christopher Sahnwaldt
j...@sahnwaldt.de wrote:
On Wed, Nov 28, 2012 at 2:29 AM, Hugh Williams hwilli...@openlinksw.com
wrote:
Hi Christopher,
I am not aware of a
On Mon, Dec 24, 2012 at 5:20 AM, Mohamed Morsey
mor...@informatik.uni-leipzig.de wrote:
You can use the Scala script [3], to generate the owl:sameAs links to
Freebase.
A word of warning about that script: it works off the old quad dump
format which is deprecated and may have been generated for
On Fri, Jan 4, 2013 at 6:54 AM, Robert Glaß rgl...@avantgarde-labs.de wrote:
Dear dbpedia team,
I have a question concerning the extraction of literature information
from Wikipedia pages.
How can I get an integrated view of the data
inside the literature template?
On Wed, Jan 16, 2013 at 9:06 AM, Sebastian Hellmann
hellm...@informatik.uni-leipzig.de wrote:
we thought that it might be a nice idea to simplify the workflow for
creating outgoing links from DBpedia to your data sets. This is why we
created the following GitHub repository:
On Sat, Feb 9, 2013 at 5:22 AM, Mohamed Morsey
mor...@informatik.uni-leipzig.de wrote:
the one which equals "138"^^<http://www.w3.org/2001/XMLSchema#int> is
dbpprop:municipalityCode
That seems very wrong. Where is this mapping/extraction defined? And why
are there two different mappings?
On Thu, Mar 14, 2013 at 4:03 PM, Aiden Bell aiden...@gmail.com wrote:
Hi all
I'm wondering if there is a way to produce spotlight type wikipage link
surface form RDF using the dbpedia framework rather than Spotlight? I'm
implementing a disambiguation algorithm with a large amount of existing
I suspect multistream bzip2 is the culprit (which is a sensible
correlation with parallel bzip).
For what it's worth Python 2.x can't read these files either. There's
a backport of the 3.x support, but it requires installing a separate
package.
Tom
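A multistream .bz2 file is just several bzip2 streams concatenated, which is what parallel compressors like pbzip2 produce. A minimal sketch of the behavior described above:

```python
import bz2

# A multistream .bz2 file is just concatenated bzip2 streams, which is
# what parallel compressors like pbzip2 emit.
multistream = bz2.compress(b"first stream\n") + bz2.compress(b"second stream\n")

# Python 3's bz2 reads across stream boundaries; Python 2's built-in
# module stopped after the first stream (the bz2file backport fixes that).
data = bz2.decompress(multistream)
```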
On Wed, Mar 20, 2013 at 9:48 PM, Jona
Can someone point to the part of the discussion which talks about what the
problem is? This thread seems to start in mid-stream...
Freebase's MQL key encoding (http://wiki.freebase.com/wiki/MQL_key_escaping)
is a completely private encoding which shouldn't have any effect on
external
the freebase
discussion list about this problem.
Agree with Jona that the number of problematic references is not relevant.
Cheers
Andrea
2013/3/25 Jona Christopher Sahnwaldt j...@sahnwaldt.de
On Mar 25, 2013 3:32 AM, Tom Morris tfmor...@gmail.com wrote:
Can someone point to the part
escaping undone and the redirects resolved.
Tom
On Mon, Mar 25, 2013 at 9:18 AM, Tom Morris tfmor...@gmail.com wrote:
I wouldn't claim that Freebase is bug-free, but that's a quite old and
simple algorithm, so unless they're triples from very early in its life
(say, 2007), I'd guess that bad
On Fri, Apr 5, 2013 at 9:40 AM, Jona Christopher Sahnwaldt
j...@sahnwaldt.de wrote:
thanks for the heads-up!
On 5 April 2013 10:44, Julien Plu julien@redaction-developpez.com
wrote:
Hi,
I saw a few days ago that, for the past month, MediaWiki has allowed
creating infoboxes
(or parts of them)
On Thu, Apr 11, 2013 at 8:40 AM, Jona Christopher Sahnwaldt j...@sahnwaldt.de
wrote:
Thanks for the heads up and the mail to noc. Please let us know what they
say. We'll have to find a new language list file.
Are you after languages or wikis? You might be able to use one of the
other files
encyclopedia.
Let me try restating it in a different way. Information which is invisible
(because persondata template is not rendered) is much less likely to be
correct.
Tom
Cheers
Andrea
2013/4/26 Tom Morris tfmor...@gmail.com
On Fri, Apr 26, 2013 at 2:14 PM, Jona Christopher
I too would be interested in more info on Airpedia. What forum/list is
used to discuss it?
On Tue, Jun 4, 2013 at 6:40 AM, Jona Christopher Sahnwaldt
j...@sahnwaldt.de wrote:
Hi Alessio,
I think Airpedia looks really interesting. Could you tell us a bit
more about the precision of these
On Thu, Jun 13, 2013 at 7:52 PM, Young,Jeff (OR) jyo...@oclc.org wrote:
Stephen,
While you're waiting, you could get the owl:sameAs assertions in the
reverse direction from the VIAF data dumps. An N-TRIPLE form was added to
the processing about a week ago, which should make it easier to
Speaking of wrong mappings, do the algorithms used to generate the Airpedia
class mappings have any concept of classes which are (or should be)
disjoint with each other? I was looking at the distribution of the number
of classes assigned to entities and was curious what classes were assigned
to
I've been looking at an analysis of the Airpedia entity types and I have a
question about how things are counted between DBpedia and Airpedia.
If I look at the DBpedia stats
http://wiki.dbpedia.org/Datasets/DatasetStatistics it says there are 71,715
films in EN wikipedia. If I count the Airpedia
On Thu, Aug 1, 2013 at 2:18 PM, Andy Mabbett a...@pigsonthewing.org.uk wrote:
I wonder whether one of you good folk could kindly answer a quick
question for me, please?
How many articles on the English Wikipedia have infoboxes? As of what date?
I appreciate that there will be caveats!
Your
I'm not going to weigh in on the URI minting, but I did want to correct a
couple of misconceptions.
On Sat, Aug 31, 2013 at 3:20 AM, Dimitris Kontokostas jimk...@gmail.com wrote:
Just like in Wikipedia when we see an Infobox_Person we assume that the
resource is a dbo:Person
That's a bad
Congratulations on the new release!
On Mon, Sep 23, 2013 at 6:27 AM, Christian Bizer ch...@bizer.de wrote:
1. the new release is based on updated Wikipedia dumps dating from March /
April 2013 (the 3.8 release was based on dumps from June 2012), leading to
an overall increase in the number
That's an interesting teaser, but it'd be useful to include a link to get
more detail. It took me a while to track down the actual GSoC progress
page:
https://github.com/dbpedia/extraction-framework/wiki/GSOC2013_Progress_Kasun
Tom
On Fri, Nov 29, 2013 at 6:57 AM, Marco Fossati
On Tue, Dec 3, 2013 at 1:44 PM, Paul Houle ontolo...@gmail.com wrote:
Something I found out recently is that the page links don't capture
links that are generated by macros, in particular almost all of the
links to pages like
On Sat, Dec 21, 2013 at 2:24 PM, Ali Gajani aligaj...@gmail.com wrote:
... I want to make sure I can use this dataset to count indegrees (high
influencers) properly. It is impossible to survey all the rows to ensure
the knowledge is true, but I am asking anyway.
Presumably you mean
On Sat, Dec 21, 2013 at 7:49 PM, Tom Morris tfmor...@gmail.com wrote:
On Sat, Dec 21, 2013 at 2:24 PM, Ali Gajani aligaj...@gmail.com wrote:
... I want to make sure I can use this dataset to count indegrees (high
influencers) properly. It is impossible to survey all the rows to ensure
For any type of search application, you not only want to do case and accent
folding, but also Unicode normalization http://unicode.org/reports/tr15/
(you could have both precomposed and combining accent versions of the è in
Isère). Typically a search engine could be directed to normalize both the
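A minimal sketch of that kind of folding (NFD decomposition, accent stripping, then casefold) using Python's standard library; the function name is mine, and real search engines do this inside their analyzers:

```python
import unicodedata

def fold_for_search(text):
    """Decompose to NFD, drop combining marks (accent folding), then
    casefold. A simplified version of what search-engine analyzers do."""
    decomposed = unicodedata.normalize("NFD", text)
    stripped = "".join(ch for ch in decomposed if not unicodedata.combining(ch))
    return stripped.casefold()

precomposed = "Is\u00e8re"   # 'Isère' with a precomposed è (U+00E8)
combining = "Ise\u0300re"    # the same word with e + combining grave accent

# Without normalization the two spellings compare unequal.
assert precomposed != combining
assert fold_for_search(precomposed) == fold_for_search(combining) == "isere"
```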
On Mon, Jan 12, 2015 at 6:02 AM, Volha Bryl
vo...@informatik.uni-mannheim.de wrote:
At the time of the extraction the infobox at the corresponding wiki page
had a line with a strange syntax:
| [[type]] = [[Public]]
See the wiki page history, May 2014. This had caused the extraction
Dimitris, Soren, and DBpedia team,
That sounds like an interesting project, but I got lost between the
statement of intent, below, and the practical consequences:
On Tue, Mar 10, 2015 at 5:05 PM, Dimitris Kontokostas
kontokos...@informatik.uni-leipzig.de wrote:
we made some different design
://semanticsimulations.com
*From:* Sebastian Hellmann [mailto:hellm...@informatik.uni-leipzig.de]
*Sent:* Wednesday, March 11, 2015 3:12 AM
*To:* Tom Morris; Dimitris Kontokostas
*Cc:* Wikidata Discussion List; dbpedia-ontology;
dbpedia-discussion@lists.sourceforge.net; DBpedia-Developers
On the surface that sounds like a useful bit of research, but I fear
you're building on a foundation of quicksand.
My very first entity was labelled Guinness but had no information
whether it was the beer, the company, the brand or one of the other
similarly named entities. Given that we don't
Freebase has mappings to both Wordnet and EN Wikipedia, so you might be
able to bridge from Wordnet to DBpedia via that route if you can't find
anything more direct.
Tom
On Sat, Oct 3, 2015 at 2:54 AM, Nasr Eddine wrote:
> I wonder if there is a mapping between
Two other sources you might consider are Freebase and Wikidata. Using them
together with DBpedia might give you better results.
Tom
On Tue, Dec 15, 2015 at 5:27 AM, Vihari Piratla
wrote:
> Thanks Dimitris for a detailed response.
> I see 2,945,956 unique titles in
Hi Katie. I don't think there are universally agreed best practices in this
space and people often have strongly held views on either side. You don't
mention internationalization/localization which is, in my experience, a
bigger concern for folks than semantic drift. Those who believe in numeric
Perhaps here:
https://github.com/dbpedia/dbpedia/tree/master/tools/DBpediaAsTables
On Sun, Nov 6, 2016 at 10:24 AM, Dimitris Kontokostas
wrote:
> Hi Petar,
>
> There is some interest in reviving this project, but I cannot recall /
> find where the code to generate these dumps is.
On Wed, Aug 21, 2019 at 3:23 AM Sebastian Hellmann <
hellm...@informatik.uni-leipzig.de> wrote:
> we switched completely to:
>
> - https://forum.dbpedia.org
> - Slack https://dbpedia-slack.herokuapp.com/
>
> and http://blog.dbpedia.org for announcements.
>
That's too bad. Why abandon such a nice