Yeah, I got it. The chmod command is failing on Windows. Please install
Cygwin (it lets you run Unix commands in a Windows environment); I believe
this will solve the issue.
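
For reference, here is a minimal Java sketch (my own illustration, not code
from Blur or Hadoop) of what the failing step boils down to: Hadoop's
RawLocalFileSystem shells out to the Unix "chmod" binary via ProcessBuilder,
so the test can only pass once a chmod executable (for example from Cygwin's
bin directory) is on the PATH. The class and file names below are just
placeholders.

    import java.io.File;
    import java.io.IOException;

    // Standalone check (illustrative only): runs "chmod" the same way
    // Hadoop's Shell utility does, via ProcessBuilder. If chmod cannot be
    // resolved on the PATH, start() throws the same "CreateProcess error=2"
    // IOException that appears in the test output below.
    public class ChmodCheck {
        public static void main(String[] args) throws Exception {
            File tmp = File.createTempFile("chmod-check", ".tmp"); // placeholder file
            tmp.deleteOnExit();
            ProcessBuilder pb = new ProcessBuilder("chmod", "644", tmp.getAbsolutePath());
            try {
                int exit = pb.start().waitFor();
                System.out.println("chmod is available, exit code = " + exit);
            } catch (IOException e) {
                System.out.println("chmod is not on the PATH: " + e.getMessage());
            }
        }
    }

Once Cygwin's bin directory is added to the PATH, this check (and the chmod
call inside the tests) should succeed.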

Regards,
Gagan

On Mon, Feb 4, 2013 at 11:42 AM, Li Li <[email protected]> wrote:

> I have not installed hadoop/hdfs on my Windows 7 box. Is that the problem?
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.blur.store.blockcache.BlockCacheTest
> 13/02/04 13:47:31 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=blur, sessionId=1359956851299
> Cache Hits    = 24
> Cache Misses  = 9976
> Store         = 0.0061420423
> Fetch         = 0.0013997785
> # of Elements = 8191
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.576 sec
> Running org.apache.blur.store.blockcache.BlockDirectoryTest
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.166 sec
> Running org.apache.blur.store.compressed.CompressedFieldDataDirectoryTest
> 13/02/04 13:47:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.504 sec
> Running org.apache.blur.store.HdfsDirectoryTest
> Working on pass [0] seed [-7756543742976133877] contains [false]
> java.io.IOException: Cannot run program "chmod": CreateProcess error=2, ?????????
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
>         at org.apache.hadoop.util.Shell.run(Shell.java:182)
>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
>         at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:533)
>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:524)
>         at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:292)
>         at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:386)
>         at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:365)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:607)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:588)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:495)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:487)
>         at org.apache.blur.store.hdfs.HdfsFileWriter.<init>(HdfsFileWriter.java:39)
>         at org.apache.blur.store.hdfs.HdfsDirectory.createOutput(HdfsDirectory.java:72)
>         at org.apache.blur.store.HdfsDirectoryTest.createFile(HdfsDirectoryTest.java:173)
>         at org.apache.blur.store.HdfsDirectoryTest.testRandomAccessWrites(HdfsDirectoryTest.java:131)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>         at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>         at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>         at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>         at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
>         at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
>         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
>         at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
>         at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
>         at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
>         at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
>         at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
>         at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
>         at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
>         at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
> Caused by: java.io.IOException: CreateProcess error=2, ?????????
>         at java.lang.ProcessImpl.create(Native Method)
>         at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
>         at java.lang.ProcessImpl.start(ProcessImpl.java:30)
>         at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
>         ... 47 more
> Tests run: 3, Failures: 1, Errors: 2, Skipped: 0, Time elapsed: 1.4 sec <<< FAILURE!
>
> Results :
>
> Failed tests:
>   testRandomAccessWrites(org.apache.blur.store.HdfsDirectoryTest): Test failed with seed [-7756543742976133877] on pass [0]
>
> Tests in error:
>   testEOF(org.apache.blur.store.HdfsDirectoryTest): Cannot run program "chmod": CreateProcess error=2, ?????????
>   testWritingAndReadingAFile(org.apache.blur.store.HdfsDirectoryTest): Cannot run program "chmod": CreateProcess error=2, ?????????
>
> Tests run: 9, Failures: 1, Errors: 2, Skipped: 0
>
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Blur .............................................. SUCCESS [1.060s]
> [INFO] Blur Util ......................................... SUCCESS [21.730s]
> [INFO] Blur Thrift ....................................... SUCCESS [15.863s]
> [INFO] Blur Store ........................................ FAILURE [11.651s]
>
> On Mon, Feb 4, 2013 at 1:53 PM, Gagan Juneja <[email protected]> wrote:
> > Hi,
> > Can you please post which test case is failing?
> >
> > Regards,
> > Gagan
> >
> > On Mon, Feb 4, 2013 at 11:19 AM, Li Li <[email protected]> wrote:
> >
> >> hi all
> >>     when I build blur, I found some problems:
> >>     1. it warns "[WARNING] Using platform encoding (GBK actually) to
> >> copy filtered resources, i.e. build is platform dependent!"
> >>       I am building on windows. maybe the pom.xml should explicitly
> >> configure UTF-8 or another encoding to avoid this warning
> >>     2. can't build with jdk1.7.
> >>       I get some errors with jdk1.7; it compiles correctly with jdk1.6
> >>     3. can't pass the tests
> >>       I can only build with mvn install -DskipTests=true
> >>       maybe hadoop isn't available on windows?
> >>
>
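
Regarding the platform-encoding warning in point 1 of the quoted mail: a
common fix (an assumption on my part, I have not checked the Blur poms) is to
pin the source encoding in the top-level pom.xml so the build no longer
depends on the platform default (GBK in your case):

    <!-- Hypothetical addition to the parent pom.xml: pin the build to UTF-8
         so Maven stops falling back to the platform encoding. -->
    <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    </properties>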
