Declaration of interest: I am the Editor-in-Chief of the Journal of
Open Research Software, a journal which publishes software
metapapers which are similar to the Elsevier Original Software
Publication idea. I also maintain a list of journals which accept
papers about scientific software:
Hi,
I hope that people don't mind me advertising this here. A number of
people in the Software Carpentry community have been authors, reviewers
and editors for the Journal of Open Research Software, so hopefully it
isn't off-topic.
The Journal of Open Research Software is an international,
open-access,
> I'll add that besides a workflow system which would have helped (but
> not necessarily a panacea, errors could be made there too), an open data
> reproducible paper would have helped even more: with all those
> skeptics, someone would have tried to re-run the analysis looking for
> errors, and
I would have also liked the ability to vote for N candidates (where N =
number of positions being elected). Otherwise, I feel that I have to vote
strategically to get a good selection of candidates (i.e. is there someone
I think will get a lot of votes and thus I can use my vote on someone I
think
Hi All,
as part of the outreach activity we're undertaking with EPCC, we're looking
to pull together resources and examples for young scientists and engineers
in the UK (where young is classified as 7-19, but we are aiming at 10-15).
I wondered if people on the list had come across:
- tutorials
Hi Andreas,
I see you've already been directed towards the two resources I'm most
familiar with:
FOSTER Open Science collaboration (https://www.fosteropenscience.eu/)
- I have given "open source for open science" seminars at some of
their events before.
Sophie Kay's (née Kershaw) Open Science
> I've often heard it said that research code doesn't need as much unit
> testing or code review as scientific code, but it isn't obvious to me
> that the consequences of error are less in research code. Or better
> put, it is obvious to me that that is so, but that is not a sign of
> good health
Wait, is "engaging in the Software Carpentry drinking game
before/during teaching" not on the list of worst practices yet?
In all seriousness, thanks for recording this Greg!
Neil
On 5 May 2016 at 18:45, Karen Cranston wrote:
> With this video + list of Worst
+1 to the overall sentiment, and +1 to Titus' modification as most
hosts will not be able to pay without the claim and receipts being
submitted.
However, as someone who has to authorise many guest expenses claims
through their institution, I know that it can be a struggle to get a
claim for
Thanks for passing it on - that's one of the best "how to actually use
this stuff without it getting in the way of what you're trying to do"
posts I've seen in a while.
Neil
On 30 April 2016 at 14:18, C. Titus Brown wrote:
> Have people seen this? I didn’t see it make the
A few years back, I tried to map the different types of software
project management infrastructure, based on when they're needed at
different stages of maturity for a project (see attached). This drew
upon sources like Fogel's Producing OSS, with tweaks specifically for
working in a research
Hi All,
Fiona Murphy is running a workshop on digital skills curriculum
planning at the EGU conference in Vienna on 28th April.
If there are any members of the Software Carpentry / Data Carpentry
community attending EGU who are interested in and able to
participate in the workshop, Fiona would
Nice!
Are you aware of the CHAOSS (Community Health Analytics Open Source Software)
initiative under the Linux Foundation? https://chaoss.community/
Most of the major interested parties in community health metrics are using
CHAOSS as the place to define both metrics/indicators and develop