What is Seeing Atoms with X-Rays?
Bragg's Law is the foundational equation of X-ray crystallography (the technique famously used by Rosalind Franklin to produce the diffraction images that revealed the double-helix structure of DNA). Because the spacing between atoms in a crystal is comparable to X-ray wavelengths, X-rays striking a crystal reflect off successive layers of atoms and interfere with one another. By measuring the angle at which the reflection is brightest, you can calculate how far apart the atomic layers are.
Mathematical Foundation
Laws & Principles
- Constructive Interference: At most angles, the X-ray reflecting off the top layer of atoms and the X-ray reflecting off the layer beneath it arrive out of phase and cancel, so the detector records nothing. But at the special angles $\theta$ satisfying Bragg's Law, $n\lambda = 2d\sin(\theta)$, the lower wave travels a whole number of wavelengths ($n\lambda$) further, the crests align, and a bright spot appears on the detector.
- The Sine Limit Lockout: Because $\sin(\theta)$ can never exceed 1, the Bragg condition cannot be satisfied when the wavelength $\lambda$ is more than twice the atomic spacing $d$ (for $n = 1$). Shooting radio waves at a crystal therefore yields no diffraction pattern at all; the long wave simply cannot resolve the lattice (see the sketch after this list).
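A minimal Python sketch of this limit is shown below. It is illustrative only: the helper name `bragg_angles` and the 0.200 nm spacing are assumptions, not from the original text. It solves $\sin(\theta) = n\lambda / (2d)$ for each diffraction order $n$ and skips any order where the sine would exceed 1.

```python
import math

def bragg_angles(wavelength_nm: float, spacing_nm: float, max_order: int = 5):
    """Return the Bragg angles (in degrees) for each diffraction order n.

    Orders where n * lambda / (2 * d) > 1 are skipped: sin(theta) cannot
    exceed 1, so no diffraction peak exists for that order.
    """
    angles = {}
    for n in range(1, max_order + 1):
        sin_theta = n * wavelength_nm / (2 * spacing_nm)
        if sin_theta > 1.0:   # the "sine limit": geometrically impossible
            break
        angles[n] = math.degrees(math.asin(sin_theta))
    return angles

# Cu K-alpha X-rays (0.154 nm) against an assumed 0.200 nm lattice spacing
print(bragg_angles(0.154, 0.200))   # first-order peak near 22.6 degrees

# A much longer wavelength (1.0 nm > 2d) produces no peaks at all
print(bragg_angles(1.0, 0.200))     # {} -- diffraction impossible
```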
Step-by-Step Example Walkthrough
" An engineer shoots an X-ray (wavelength 0.154 nm) at a mystery metal. The detector registers the first major bright spot (n=1) at an angle of exactly 22.6 degrees. What is the distance between the atoms? "
- 1. Identify the knowns: n = 1, λ = 0.154 nm, θ = 22.6°.
- 2. Rearrange Bragg's Law, n × λ = 2 × d × sin(θ), to solve for d: d = (n × λ) / (2 × sin(θ)).
- 3. Calculate sine: sin(22.6°) ≈ 0.384.
- 4. Denominator: 2 × 0.384 = 0.768.
- 5. Divide: (1 × 0.154) / 0.768 ≈ 0.200 nm, so the atomic layers are about 0.200 nm apart.
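For reference, the same arithmetic can be checked in a few lines of Python (variable names are illustrative):

```python
import math

# Worked example: solve Bragg's Law for the lattice spacing d
wavelength_nm = 0.154        # Cu K-alpha X-ray wavelength
theta_deg = 22.6             # angle of the first bright spot
n = 1                        # first-order reflection

d = (n * wavelength_nm) / (2 * math.sin(math.radians(theta_deg)))
print(f"d = {d:.3f} nm")     # d = 0.200 nm
```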