I have a backup server, so it is very unlikely they both failed at the
same time. That said, I had 2 spot checks or whatever in total (<2
minutes each). Don't think I'd waste my time on ADC2.
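
For anyone who wants the same kind of paper trail Tom describes below, a
monitoring script is only a few lines of work. Here's a rough sketch in
Python (the URL, the mail addresses and the interval are all just
placeholders): it probes the app once a minute, appends each result to a
log, and sends a single mail when the server first stops answering.

#!/usr/bin/env python3
# Rough uptime monitor: probe the app periodically, append every result
# to a log file, and send one mail on the up -> down transition.
# The URL and mail addresses below are placeholders.

import smtplib
import time
from email.mime.text import MIMEText
from urllib.request import urlopen

APP_URL = "http://example.com/app/"       # placeholder URL
CHECK_EVERY = 60                          # seconds between probes
LOG_FILE = "uptime.log"

def notify(message):
    mail = MIMEText(message)
    mail["Subject"] = "App server down"
    mail["From"] = "[email protected]"   # placeholder address
    mail["To"] = "[email protected]"          # placeholder address
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(mail)

was_up = True
while True:
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    try:
        urlopen(APP_URL, timeout=10)
        up = True
    except Exception:
        up = False
    with open(LOG_FILE, "a") as log:
        log.write("%s %s\n" % (stamp, "UP" if up else "DOWN"))
    if was_up and not up:                 # alert only once per outage
        notify("Server stopped responding at %s" % stamp)
    was_up = up
    time.sleep(CHECK_EVERY)

A cron job doing a one-off check works just as well; the point is simply
to have timestamped evidence of exactly when the server was and wasn't
reachable.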

On May 1, 2:33 pm, "Kevin Galligan" <[EMAIL PROTECTED]> wrote:
> I have the same issue.  I've had two people look at my app.  One for a
> very short period.  I do not know of my server ever being down.
>
> From the info I had from Dan and elsewhere, one of them was almost
> certainly a spot check.  However, every day it seems less likely that
> somebody is going to stop by.
>
> On Thu, May 1, 2008 at 6:18 AM, tomgibara <[EMAIL PROTECTED]> wrote:
>
> >  A monitoring script notifies me of any disruption and logs on the
> >  server leave a precise record - 3hrs is actually rounded up.
>
> >  Your last comment about the video is pretty much my concern, which as
> >  I stated is not for my application but for others: what about the
> >  entrants who produced capable applications, but no videos? I spent
> >  time doing both.
>
> >  I'm concerned that entrants who committed all of their time to
> >  producing a good application and not a highly polished readme with
> >  supporting videos etc. might be unfairly disadvantaged. Both should
> >  certainly get credit, but if time constraints mean that judges just
> >  read documentation and don't use the application much, then what is
> >  being judged is the presentation of ideas, whereas my understanding
> >  was that Google's staff were looking for good applications.
>
> >  On May 1, 10:34 am, Incognito <[EMAIL PROTECTED]> wrote:
> >  > > The server hosting my application has failed twice in the last month
> >  > > (for about 3hrs each time) which I'm upset about, but there's nothing
> >  > > I can do (of course it hadn't failed for at least 6 weeks _before_ the
> >  > > deadline). If a judge had attempted to use the application during one
> >  > > of these periods they would have received a red message box warning
> >  > > them of the problem on the application's home screen (this was also
> >  > > explained in the supplied readme). It seems very unlikely (and
> >  > > unlucky) that two judges would have tested the application at these
> >  > > times; even more unfortunate if they were given an explicit warning
> >  > > that there was a problem connecting to the server but chose to
> >  > > disregard it.
>
> >  > Well, how do you know it wasn't down more than 3 hrs each time?
>
> >  > > What troubles me more is that the part of the application that is
> >  > > usable without a camera feed, barcode publishing, does not appear to
> >  > > have been tested either, even though it's a very accessible part of
> >  > > the application. One judge published one barcode. Is that the testing
> >  > > that an application which took 6 months to write merits?
>
> >  > Judges are not testers, i.e. a judge will not go through every
> >  > single feature to verify that it works. This is what I think is really
> >  > happening. First they read the first pages of your manual, or at least
> >  > the introduction (or watch a video demo if you have one), to understand
> >  > what your application is about. Based on this they'll know how to
> >  > rate it on the innovation criteria. Second, they may open it just to
> >  > check out the GUI. If they really like it they may explore more of
> >  > your app. However, if all it does is scan a bar code, then they
> >  > believe you that it does that and don't verify it themselves. Or it
> >  > could just be that your application failed and didn't allow the judge
> >  > to continue. The only way to know for sure is to ask the judges.
>
> >  > If you provided a video, it could be that they are relying heavily on
> >  > that to rate your application and just opening your app for a minute
> >  > or two to verify that it launches. I know I would if I had 76
> >  > applications to judge and was running out of time.
>
> >  > On May 1, 4:58 am, tomgibara <[EMAIL PROTECTED]> wrote:
>
> >  > > I scanned a log of the #android IRC channel this morning and now I'm
> >  > > confused (concerned?) by the judging process. Dan Morrill said that
> >  > > judging had progressed to halving the final 100 applications to 50.
>
> >  > > I'm not logging much information on the server side of my Moseycode
> >  > > application, but I do know how many devices (emulators) have
> >  > > authenticated with the demo account I provided for testing: 2. But as
> >  > > far as I know, each application is supposed to be judged by 4 judges.
>
> >  > > The server hosting my application has failed twice in the last month
> >  > > (for about 3hrs each time) which I'm upset about, but there's nothing
> >  > > I can do (of course it hadn't failed for at least 6 weeks _before_ the
> >  > > deadline). If a judge had attempted to use the application during one
> >  > > of these periods they would have received a red message box warning
> >  > > them of the problem on the application's home screen (this was also
> >  > > explained in the supplied readme). It seems very unlikely (and
> >  > > unlucky) that two judges would have tested the application at these
> >  > > times; even more unfortunate if they were given an explicit warning
> >  > > that there was a problem connecting to the server but chose to
> >  > > disregard it.
>
> >  > > What troubles me more is that the part of the application that is
> >  > > usable without a camera feed, barcode publishing, does not appear to
> >  > > have been tested either, even though it's a very accessible part of
> >  > > the application. One judge published one barcode. Is that the testing
> >  > > that an application which took 6 months to write merits?
>
> >  > > Since my application requires interactive use of the camera, I was
> >  > > resigned to judges not actually being able to use the scanning part of
> >  > > the application without setting up a camera (and I know from my logs
> >  > > that the judges certainly did not scan any barcodes), so I made a video
> >  > > of that, but I did expect judges to fully explore the other elements
> >  > > of the application.
>
> >  > > My Moseycode application is being developed with the goal of fully
> >  > > realizing a new barcode system, not just as an entry into the ADC.
> >  > > Ever since explanations about the judging process were forthcoming I
> >  > > always felt that it probably wouldn't do well in the challenge because
> >  > > of its reliance on a real camera feed. But my concern is more general
> >  > > than that: what degree of testing did all the other applications
> >  > > receive?
>
> >  > > Of course this is all conjecture: perhaps my Moseycode application was
> >  > > too buggy, or perhaps two judges did try to use the application while
> >  > > the server was down, but I am disappointed by Moseycode's ignominious
> >  > > evaluation.