Thanks, Sean.
I followed the guide and imported the codebase into IntelliJ IDEA as a Maven project, with the hadoop-2.4 and yarn profiles enabled.
In the Maven Projects view, I ran Install against the root module (Spark Project Parent POM). After a pretty long time, all the modules built successfully.
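(For reference, I believe what I ran in the IDE corresponds roughly to "mvn -Pyarn -Phadoop-2.4 -DskipTests clean install" on the command line; the profile names are just my reading of the building-Spark docs, so please correct me if they are off.)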
But when I run the LocalPi example, compile errors appear:
1. EventBatch and SparkFlumeProtocol don't exist
2. A bunch of errors complaining that q is not a member of StringContext in
CodeGenerator.scala (see the sketch below)
I then tried clicking "Generate Sources and Update Folders For All
Projects" and re-ran the Maven install; the build still succeeds, but the same compile errors remain.

Sean, any guidance on this? Thanks.

At 2015-01-09 18:08:11, "Sean Owen" <so...@cloudera.com> wrote:
>What's up with the IJ questions all of the sudden?
>
>This PR from yesterday contains a summary of the answer to your question:
>https://github.com/apache/spark/pull/3952 :
>
>"Rebuild Project" can fail the first time the project is compiled,
>because generate source files are not automatically generated. Try
>clicking the "Generate Sources and Update Folders For All Projects"
>button in the "Maven Projects" tool window to manually generate these
>sources.
>
>On Fri, Jan 9, 2015 at 10:03 AM, bit1...@163.com <bit1...@163.com> wrote:
>> Hi,
>> When I fetch the Spark code base, import it into IntelliJ IDEA as an SBT
>> project, and build it with SBT, there are compile errors in the examples
>> module complaining about EventBatch and SparkFlumeProtocol; it looks like
>> they should be in the org.apache.spark.streaming.flume.sink package.
>>
>> Not sure what is happening.
>>
>> Thanks.
