Hello all,

One question, for the sake of curiosity: I've added different launchers
and I can run them on my machine using:
mvn scala:run -Dlauncher=GenerateLLMasterFile

but it needs sudo when running it on the server; otherwise it gives me
the error:

Error: Main class org.codehaus.classworlds.Launcher could not be found
or loaded

I wonder why that is; I don't think sudo permissions should be needed to
access the pom files.

thanks
Regards


On Fri, Aug 9, 2013 at 8:26 AM, Dimitris Kontokostas <[email protected]> wrote:

> Hi Hady,
>
> Although I am not a Maven expert, you are probably missing some Maven
> basics. Maven searches for a pom.xml file to read its tasks/launchers;
> if you run Maven from the directory of your script, that's an expected
> message. Please take a look at /scripts/pom.xml and the /run script to
> see how you can define your script as a launcher and run it with Maven.
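
For reference, a launcher entry in such a pom.xml looks roughly like the sketch below. The plugin coordinates and the main class name are assumptions based on the Scala Maven plugin of that era; check the actual /scripts/pom.xml for the real entries:

```xml
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  <configuration>
    <launchers>
      <!-- invoked with: mvn scala:run -Dlauncher=GenerateLLMasterFile -->
      <launcher>
        <id>GenerateLLMasterFile</id>
        <!-- hypothetical class name; point this at your script's main object -->
        <mainClass>org.dbpedia.extraction.scripts.GenerateLLMasterFile</mainClass>
      </launcher>
    </launchers>
  </configuration>
</plugin>
```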
>
> Best,
> Dimitris
>
>
> On Thu, Aug 8, 2013 at 5:15 PM, Hady elsahar <[email protected]> wrote:
>
>> That makes sense. I wasn't used to running Scala from Maven; I always
>> used IntelliJ and the scala command for running Scala scripts.
>>
>> I didn't get what is meant by a Scala launcher. Shouldn't we clean,
>> compile, and then execute the created JAR file?
>>
>> When I run any of the Maven commands, mvn clean for example, it keeps
>> telling me that the main class doesn't exist:
>>
>> Error: Main class org.codehaus.classworlds.Launcher could not be found
>> or loaded
>>
>> thanks
>> Regards
>>
>>
>> On Thu, Aug 8, 2013 at 3:02 PM, Jona Christopher Sahnwaldt <
>> [email protected]> wrote:
>>
>>> On 8 August 2013 12:55, Hady elsahar <[email protected]> wrote:
>>> > I also see that there is no Scala installed on the server;
>>> > we will need that for the LL code to run.
>>>
>>> Are you sure? If you use Maven, you can just call a Scala launcher.
>>> Maven will download all that's necessary. I rarely install Scala on
>>> any of the machines where I run DBpedia code.
>>>
>>> JC
>>>
>>> >
>>> >
>>> >
>>> > On Thu, Aug 8, 2013 at 12:49 PM, Hady elsahar
>>> > <[email protected]> wrote:
>>> >>
>>> >> Hello Dimitris,
>>> >>
>>> >> Sorry for being idle for some days; I was travelling to Leipzig.
>>> >>
>>> >> The Python code is running now. I don't know how long it will take;
>>> >> hopefully three hours, as Markus mentioned.
>>> >>
>>> >> Sebastian also gave me access to the server lgd.aksw.org; I'll run
>>> >> the code again there to speed up the process, and then the LL
>>> >> extraction code.
>>> >>
>>> >> I just need to install some Python modules for the code to run, and
>>> >> maybe some packages like pip to ease the install. Is it okay to
>>> >> download packages and modules, or is there a protocol to follow in
>>> >> such cases?
>>> >>
>>> >> Thanks,
>>> >> Regards
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>> >> On Sat, Aug 3, 2013 at 6:25 PM, Dimitris Kontokostas
>>> >> <[email protected]> wrote:
>>> >>>
>>> >>> Hi Hady,
>>> >>>
>>> >>> This might be what we were waiting for :)
>>> >>> If no one else objects, can you create a Turtle dump and
>>> >>> re-test / adapt your existing ILL code?
>>> >>> Afterwards, we can start the mappings process.
>>> >>>
>>> >>> Best,
>>> >>> Dimitris
>>> >>>
>>> >>>
>>> >>> ---------- Forwarded message ----------
>>> >>> From: Markus Krötzsch <[email protected]>
>>> >>> Date: Sat, Aug 3, 2013 at 4:48 PM
>>> >>> Subject: [Wikidata-l] Wikidata RDF export available
>>> >>> To: "Discussion list for the Wikidata project."
>>> >>> <[email protected]>
>>> >>>
>>> >>>
>>> >>> Hi,
>>> >>>
>>> >>> I am happy to report that an initial, yet fully functional
>>> >>> RDF export for Wikidata is now available. The exports can be
>>> >>> created using the wda-export-data.py script of the wda
>>> >>> toolkit [1]. This script downloads recent Wikidata database
>>> >>> dumps and processes them to create RDF/Turtle files. Various
>>> >>> options are available to customize the output (e.g., to
>>> >>> export statements but not references, or to export only texts
>>> >>> in English and Wolof). The file creation takes a few (about
>>> >>> three) hours on my machine, depending on what exactly is
>>> >>> exported.
>>> >>>
>>> >>> For your convenience, I have created some example exports
>>> >>> based on yesterday's dumps. These can be found at [2]. There
>>> >>> are three Turtle files: site links only,
>>> >>> labels/descriptions/aliases only, statements only. The fourth
>>> >>> file is a preliminary version of the Wikibase ontology that
>>> >>> is used in the exports.
>>> >>>
>>> >>> The export format is based on our earlier proposal [3], but
>>> >>> it adds a lot of details that had not been specified there
>>> >>> yet (namespaces, references, ID generation, compound
>>> >>> datavalue encoding, etc.). Details might still change, of
>>> >>> course. We might provide regular dumps at another location
>>> >>> once the format is stable.
>>> >>>
>>> >>> As a side effect of these activities, the wda toolkit [1] is
>>> >>> also getting more convenient to use. Creating code for
>>> >>> exporting the data into other formats is quite easy.
>>> >>>
>>> >>> Features and known limitations of the wda RDF export:
>>> >>>
>>> >>> (1) All current Wikidata datatypes are supported.
>>> >>> Commons-media data is correctly exported as URLs (not as
>>> >>> strings).
>>> >>>
>>> >>> (2) One-pass processing. Dumps are processed only once, even
>>> >>> though this means that we may not know the types of all
>>> >>> properties when we first need them: the script queries
>>> >>> wikidata.org to find missing information. This is only
>>> >>> relevant when exporting statements.
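
The one-pass strategy described above (stream the dump once, fetch an unknown property's datatype on demand, and cache it for the rest of the pass) can be sketched as follows. All names here are hypothetical, and a stub stands in for the real wikidata.org lookup:

```python
# Sketch of one-pass processing with on-demand property-type lookup.
# All names are hypothetical; fetch_type would wrap a real query
# against wikidata.org in the actual script.

def make_type_resolver(fetch_type):
    """Cache property datatypes so each unknown property is fetched
    at most once during the single pass over the dump."""
    cache = {}

    def resolve(pid):
        if pid not in cache:
            cache[pid] = fetch_type(pid)
        return cache[pid]

    return resolve

# Stub standing in for the real wikidata.org query:
calls = []

def stub_fetch(pid):
    calls.append(pid)
    return "commonsMedia" if pid == "P18" else "string"

resolve = make_type_resolver(stub_fetch)
types = [resolve(pid) for pid in ["P18", "P31", "P18"]]
# P18 appears twice in the stream but is fetched only once
```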
>>> >>>
>>> >>> (3) Limited language support. The script uses Wikidata's
>>> >>> internal language codes for string literals in RDF. In some
>>> >>> cases, this might not be correct. It would be great if
>>> >>> somebody could create a mapping from Wikidata language codes
>>> >>> to BCP 47 language codes (let me know if you think you can do
>>> >>> this, and I'll tell you where to put it).
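
A minimal sketch of such a mapping, assuming a pass-through with an exception table. The entries below are a few well-known Wikipedia code quirks, not a vetted or complete list:

```python
def to_bcp47(code):
    """Map a Wikidata-internal language code to a BCP 47 tag.
    Codes without a known exception pass through unchanged."""
    exceptions = {
        "simple": "en",           # Simple English has no distinct BCP 47 tag
        "als": "gsw",             # the 'als' wiki is actually Alemannic
        "be-x-old": "be-tarask",  # Taraskievica Belarusian
    }
    return exceptions.get(code, code)

print(to_bcp47("als"))  # -> gsw
print(to_bcp47("de"))   # -> de
```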
>>> >>>
>>> >>> (4) Limited site language support. To specify the language
>>> >>> of linked wiki sites, the script extracts a language code
>>> >>> from the URL of the site. Again, this might not be correct in
>>> >>> all cases, and it would be great if somebody had a proper
>>> >>> mapping from Wikipedias/Wikivoyages to language codes.
>>> >>>
>>> >>> (5) Some data excluded. Data that cannot currently be edited
>>> >>> is not exported, even if it is found in the dumps. Examples
>>> >>> include statement ranks and timezones for time datavalues. I
>>> >>> also currently exclude labels and descriptions for simple
>>> >>> English, formal German, and informal Dutch, since these would
>>> >>> pollute the label space for English, German, and Dutch
>>> >>> without adding much benefit (other than possibly for simple
>>> >>> English descriptions, I cannot see any case where these
>>> >>> languages should ever have different Wikidata texts at all).
>>> >>>
>>> >>> Feedback is welcome.
>>> >>>
>>> >>> Cheers,
>>> >>>
>>> >>> Markus
>>> >>>
>>> >>> [1] https://github.com/mkroetzsch/wda
>>> >>>     Run "python wda-export-data.py --help" for usage instructions
>>> >>> [2] http://semanticweb.org/RDF/Wikidata/
>>> >>> [3] http://meta.wikimedia.org/wiki/Wikidata/Development/RDF
>>> >>>
>>> >>> --
>>> >>> Markus Kroetzsch, Departmental Lecturer
>>> >>> Department of Computer Science, University of Oxford
>>> >>> Room 306, Parks Road, OX1 3QD Oxford, United Kingdom
>>> >>> +44 (0)1865 283529               http://korrekt.org/
>>> >>>
>>> >>> _______________________________________________
>>> >>> Wikidata-l mailing list
>>> >>> [email protected]
>>> >>> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>>> >>>
>>> >>>
>>> >>>
>>> >>> --
>>> >>> Kontokostas Dimitris
>>> >>
>>> >>
>>> >> -------------------------------------------------
>>> >> Hady El-Sahar
>>> >> Research Assistant
>>> >> Center of Informatics Sciences | Nile University
>>> >>
>>> >>
>>> >
>>> >
>>> >
>>> > --
>>> > -------------------------------------------------
>>> > Hady El-Sahar
>>> > Research Assistant
>>> > Center of Informatics Sciences | Nile University
>>> >
>>> >
>>>
>>
>>
>>
>> --
>> -------------------------------------------------
>> Hady El-Sahar
>> Research Assistant
>> Center of Informatics Sciences | Nile 
>> University<http://nileuniversity.edu.eg/>
>>
>>
>>
>
>
> --
> Kontokostas Dimitris
>



-- 
-------------------------------------------------
Hady El-Sahar
Research Assistant
Center of Informatics Sciences | Nile University<http://nileuniversity.edu.eg/>
_______________________________________________
Dbpedia-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/dbpedia-developers
