That could depend on where the beam is focused: if it is focused on the crystal,
then it diverges from that point, like the bulk of the scattered X-rays
that give rise to background. If it is focused on the detector, it could actually
be convergent over that distance while the scattering is divergent.
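
As a rough sketch of that geometry (assuming straight-line rays, a single
crossover point, and illustrative numbers of 1 mrad divergence and a 0.3 m
crystal-to-detector distance, none of which come from the thread):

    def beam_diameter(z, focus_z, divergence, waist=0.0):
        # Diameter at position z (m) of a beam that crosses over at focus_z (m)
        # with full divergence `divergence` (rad); `waist` is the crossover size.
        return waist + abs(z - focus_z) * divergence

    crystal_z, detector_z = 0.0, 0.3   # crystal at 0 m, detector 0.3 m downstream
    div = 1e-3                         # assumed 1 mrad full divergence

    # Focused on the crystal: the direct beam (and the Bragg spots it produces)
    # spreads over the whole crystal-to-detector path.
    print(f"focused on crystal:  {beam_diameter(detector_z, crystal_z, div)*1e3:.2f} mm at the detector")

    # Focused on the detector: the beam is still converging over that same path,
    # so the footprint shrinks toward the detector instead of growing.
    print(f"focused on detector: {beam_diameter(detector_z, detector_z, div)*1e3:.2f} mm at the detector")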

It also depends on the ratio of divergence to beam diameter: if the beam were coming
out of an infinitely small pinhole at the crystal, then whatever the divergence,
the spot diameter would increase in proportion to distance, the spot area with
its square, and the intensity inversely with the square. But if the beam diameter
at the crystal is 0.5 mm and the divergence is 1 in 10000 (a unitless angle),
the diameter will not have doubled until 5 meters, and the 1/x^2 law only
starts to kick in beyond that.
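
To put numbers on that (a minimal Python sketch using the 0.5 mm and
1-in-10000 figures above; the list of distances is illustrative):

    d0 = 0.5e-3     # beam diameter at the crystal, m
    theta = 1e-4    # divergence, unitless angle (rad)

    for L in (0.1, 0.3, 1.0, 5.0, 10.0):   # crystal-to-detector distance, m
        d = d0 + theta * L                 # spot diameter at the detector
        rel = (d0 / d) ** 2                # per-area intensity relative to the crystal
        print(f"L = {L:5.1f} m   spot = {d*1e3:.2f} mm   relative intensity = {rel:.2f}")

At ordinary distances (0.1-0.3 m) the spot is barely larger than 0.5 mm and its
per-area intensity is essentially unchanged, whereas background scattered from
the (nearly point-like) crystal is already falling as 1/x^2 over the same range.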

James Stroud wrote:
The flux from the spots falls off as the square of the distance as well. Assuming that
flux at the detector is linear with respect to measured intensity, I'm
not sure where the benefit would be. I'm also assuming an ideal beam and
ignoring other sources of noise.

James



On Nov 23, 2009, at 2:54 PM, Richard Gillilan wrote:

It seems to be widely known and observed that diffuse background
scattering decreases more rapidly with increasing detector-to-sample
distance than Bragg reflections. For example, Jim Pflugrath, in his
1999 paper (Acta Cryst. D55, 1718-1725) says: "Since the X-ray
background falls off as the square of the distance, the expectation is
that a larger crystal-to-detector distance is better for reduction of
the x-ray background. ..."

Does anyone know of a more rigorous discussion of why background
scatter fades while Bragg reflections remain collimated with distance?


Richard Gillilan
MacCHESS
