Recapping a discussion on IRC, this is great progress...  Thanks Ankesh!

On Jun 17, 2014, at 4:01 PM, Ankesh Anand wrote:

> Quick summary of the progress so far:
> - File Uploads: A web interface to upload the Benchmark logs, files are 
> transferred to an archive folder. File validations and drag n drop are 
> supported.

Suggest uploading all files (regardless of whether via interactive upload or 
e-mail or http/rest/post) into a queue directory.  That way, we'll be able to 
throttle processing.  You'll probably want to create a public queue dashboard 
so we can check on the status of log processing.
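
Something like this rough sketch is all I have in mind for the upload side
(assuming a Flask view; the paths and the form field name are just
placeholders):

import os
import time
from flask import Flask, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
QUEUE_DIR = "/var/benchmark/queue"  # hypothetical queue location

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["logfile"]  # hypothetical form field name
    # A timestamp prefix keeps queue entries ordered and avoids name clashes.
    name = "%d_%s" % (int(time.time()), secure_filename(f.filename))
    f.save(os.path.join(QUEUE_DIR, name))
    # A separate cron-driven worker later moves processed files to the
    # archive, so uploads never block on parsing and processing can be
    # throttled independently.
    return "queued", 202

The point is just that the web layer only ever writes into the queue
directory; everything downstream pulls from it at its own pace.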

> - Visualizations: Flot has been used to plot the aggregate data from the 
> database. An initial roster of the plots that we discussed earlier on the 
> list has been implemented, along with a few suggestions.

I don't think we'll need to see individual test results (the benchmark 
currently has 6 tests: m35, world, sphflake, etc.), but we will want to see #1 
a view of a specific result, #2 a comparison of a given result against the 
database, and #3 views of the database in aggregate.

For both #2 and #3, we'll probably want to provide ways to look at 
similar/different CPUs, architectures (endian types), memory levels, clock 
cycles, results submitted "near my location", results submitted in a given 
timeframe (year, date range, etc.), ...
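
As a rough sketch of how the #2/#3 queries might look against the results
database (the table and column names below are assumptions, not the actual
schema):

import sqlite3

def similar_results(db_path, cpu_family, start_date, end_date):
    """Aggregate ratings for results with a similar CPU in a date range."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT cpu_model, AVG(vgr_rating), COUNT(*) "
        "FROM benchmark_results "
        "WHERE cpu_family = ? AND submitted BETWEEN ? AND ? "
        "GROUP BY cpu_model",
        (cpu_family, start_date, end_date),
    ).fetchall()
    conn.close()
    return rows

Each of the filters above (architecture, memory, location, timeframe) would
just become another WHERE clause along the same lines.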

> - Processing Uploaded Files: I have leveraged the script from the previous 
> GSoC project, this needs to run as a cron job on the server.

No problems setting up cron jobs.
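
For the worker itself, something along these lines would do (a hedged sketch:
the parse_log() stub stands in for whatever the existing GSoC import script
provides, and the paths are placeholders):

# Example crontab entry (hypothetical paths):
#   */10 * * * * /usr/bin/python /srv/benchmark/process_queue.py
import os
import shutil

QUEUE_DIR = "/var/benchmark/queue"      # where uploads land
ARCHIVE_DIR = "/var/benchmark/archive"  # where processed logs end up

def parse_log(path):
    # Placeholder for the existing parsing/import script.
    pass

def process_queue():
    for name in sorted(os.listdir(QUEUE_DIR)):
        path = os.path.join(QUEUE_DIR, name)
        try:
            parse_log(path)
        except Exception as err:
            print("skipping %s: %s" % (name, err))
            continue
        shutil.move(path, os.path.join(ARCHIVE_DIR, name))

if __name__ == "__main__":
    process_queue()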

> Screenshots: http://imgur.com/a/RhPLe

Looks great.  The visual appearance should match our new style, though: 
http://www.google-melange.com/gci/task/view/google/gci2013/5844328496758784
 
> Issues:
> -  Deployment: I don't have access to a server with a static IP where I can 
> host my work. There are lots of things that need to be done differently in a 
> deployment setup (serving static files, for example). This also prevents me 
> from demoing my work and getting quick feedback.
> 
> - Scarcity of Data: I only have the logs from my system, which prevents me 
> from having enough data to test all the visualizations. 

These will get taken care of as soon as they can be attended to; in the 
meantime, you'll have to make do and keep busy with other tasks.

> Questions:
> - What would be the best way to serve performance data to the developers? I 
> looked at OpenBenchmarking.org and they display performance indexes 
> categorized by Processors, GPUs etc. We could mimic the same with the 
> categorizations relevant to us.

Another good example is speedtest.net, and there's more at 
http://www.dslreports.com/speedtest
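
For serving it to developers, one simple approach is a small JSON endpoint
per category, in the spirit of OpenBenchmarking's per-processor pages.  A
hedged sketch (the route, table, and column names are assumptions):

import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "/var/benchmark/results.db"  # hypothetical database location

@app.route("/api/performance/cpu")
def performance_by_cpu():
    conn = sqlite3.connect(DB_PATH)
    rows = conn.execute(
        "SELECT cpu_model, AVG(vgr_rating) AS avg_rating, COUNT(*) AS samples "
        "FROM benchmark_results GROUP BY cpu_model ORDER BY avg_rating DESC"
    ).fetchall()
    conn.close()
    return jsonify([
        {"cpu": cpu, "avg_rating": rating, "samples": n}
        for cpu, rating, n in rows
    ])

The same pattern would cover GPUs, architectures, and so on, and the
front-end plots can consume the JSON directly.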

> - Should I work on the documentation of the parts completed now or should it 
> be carried out at the end of the project?

Hold off on docs.  If they have to look at docs to use the interface, we 
probably need to work more on usability.  The intent is to make this stupid 
easy to use, to get a performance value for your system, to look at how it 
compares to others, and to see the state of the computing industry over time.

Cheers!
Sean
