> The experiment I worked on was BaBar at SLAC.

Oh, that's nice and also kind of a funny coincidence. That means you were 
involved with an experiment studying weak CP violation, whereas we're now 
trying to study strong CP violation, heh.

> Although the simulation, reconstruction, and analysis code was written in 
> C++, book-keeping was better done in a scripting language. It was easier to do 
> this in Python. I _think_, but you would know better, that Python has moved to 
> the analysis area too. So, more researchers are using Python there.

Yes, a lot of people use Python all over the place in physics now. Most of my 
ATLAS colleagues write their ROOT scripts using pyroot / rootpy instead of 
ROOT's C++ abomination. It's always kind of funny when you see them talking 
about "running Python" but encountering segmentation faults... :)

> For analysis, I think Nim could have an advantage as it's faster. I think the 
> time to develop the code is about the same, but Nim would reduce the 
> execution time. It would also help to reduce the debugging time as a chunk of 
> time is spent keeping track of which variables are which type.

Oh yes, for sure! Unfortunately I feel most physicists don't realize that 
dynamic typing is a burden.

> I think I recall that someone wrote an interface to ROOT for Nim, so you can 
> read and manipulate your data in Nim.

Yes? I haven't seen that. Sounds to me like creating that wrapper would be kind 
of a pain, given that ROOT 1. is C++ and 2. essentially provides its own 
standard library.

> For your analysis are you using Nim? I think that would be a great thing. I 
> know nothing about Axions, or searches for Axions. I should look it up to 
> find out more.

Yep, I'm writing my whole analysis in Nim. Since the axion community is still 
pretty tiny (although it has been growing and should grow even more now after 
the last European Strategy for Particle Physics update, which really endorses 
axion searches) I'm not really forced to use some existing analysis framework. 
There was some code written by my predecessor, but that was all ROOT and 
MarlinTPC. I threw it all away and started from scratch. The code is here:

[https://github.com/Vindaar/TimepixAnalysis](https://github.com/Vindaar/TimepixAnalysis)

Essentially it's one big mono-repository for my whole thesis, though. The most 
interesting code is in the Analysis directory.

In general, axion searches all deal with the same big problem: given that 
axions haven't been detected yet, detecting them via some interaction must be 
hard (-> the coupling constants are tiny). What does that imply? That no matter 
what kind of experiment one builds, all sorts of background will massively 
dominate everything one measures. So they are all very low-rate experiments, 
which need the best background suppression possible (both hardware- and 
software-wise). In that sense it's a little similar to neutrino experiments, 
except even worse. Neutrino experiments nowadays of course have the benefit of 
a lot more manpower and money, allowing them to build in better locations (e.g. 
waaayy below ground to shield from cosmics) than we do.

My experiment - CAST - is simply sitting in a random hall at surface level at 
CERN. The only shielding from muons I have is ~20 cm of lead, so there's still 
about one muon every couple of seconds in my detector. What we want to measure 
are X-rays, which are the result of axions entering our magnet (an LHC 
prototype dipole magnet, 9 m long, 9 T magnetic field) and interacting with the 
virtual photons of the magnetic field. The axions themselves would be the 
result of X-rays in the Sun randomly converting to axions and then leaving the 
Sun unhindered.
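(For a rough sense of why the field strength and magnet length matter: in the 
coherent limit, the textbook helioscope expression for the axion-to-photon 
conversion probability scales as

P_{a→γ} ≈ (g_{aγ} · B · L / 2)²   (natural units)

where g_{aγ} is the axion-photon coupling, B the field strength, and L the 
magnet length. So the 9 T over 9 m is what buys us sensitivity to those tiny 
couplings.)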

Have a great weekend!
