"Aliasing" is the appearance of a signal that is not the true signal, due to undersampling.
Aliasing can manifest as jagged edges or as Moiré patterns.
Antialiasing reduces these artifacts by taking more samples per pixel.
A pixel, $P$, should be coloured with the average of all the colours coming through the pixel:
$\textrm{colour} = \frac{\int_P C(x) \; dx}{\int_P dx}$
where $C(x)$ is the colour arriving at position $x$ within the pixel and $\int_P dx$ is the area of the pixel.
In ray tracing, we can send rays through many locations within a pixel and average the colours of those rays.
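As a rough sketch of this estimate in Python, the per-pixel average might look like the following. The name `colour_through` is a placeholder for whatever routine traces a ray through a point on the image plane and returns its colour; it is not part of any particular ray tracer.

```python
def pixel_colour(i, j, sample_offsets, colour_through):
    """Estimate the colour of pixel (i, j) by averaging one ray per sample.

    sample_offsets  -- (dx, dy) pairs, each component in [0, 1)
    colour_through  -- placeholder: traces a ray through the image-plane
                       point (x, y) and returns an (r, g, b) tuple
    """
    r_sum = g_sum = b_sum = 0.0
    for dx, dy in sample_offsets:
        r, g, b = colour_through(i + dx, j + dy)
        r_sum += r
        g_sum += g
        b_sum += b
    n = len(sample_offsets)
    return (r_sum / n, g_sum / n, b_sum / n)
```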
A single ray at the centre of the pixel will result in aliasing:
A denser but still regular sampling will reduce the aliasing, but the aliasing will still be present at a finer scale:
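A regular $n \times n$ grid of sample positions might be generated as below (a sketch; the function name `regular_offsets` is only for illustration). Note that with $n = 1$ this degenerates to the single ray through the pixel centre.

```python
def regular_offsets(n):
    """Return n*n offsets on a regular grid: the centre of each of the
    n x n sub-cells of the pixel, with coordinates in [0, 1)."""
    step = 1.0 / n
    return [((col + 0.5) * step, (row + 0.5) * step)
            for row in range(n)
            for col in range(n)]
```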
Sampling at uniformly randomly chosen positions within the pixel hides the aliasing by adding noise. However, a set of uniformly random rays will likely be "clumpy", missing some parts of the pixel and giving too much weight to other parts:
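Uniform random sampling is the simplest pattern to generate; a sketch:

```python
import random

def uniform_random_offsets(n):
    """Return n offsets chosen uniformly at random within the pixel.
    The samples are independent, so they tend to clump and leave gaps."""
    return [(random.random(), random.random()) for _ in range(n)]
```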
Jittered sampling regularly subdivides the area of the pixel and takes a single sample chosen uniformly randomly within each subdivision. This hides the aliasing, but with less noise than uniform random sampling:
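Jittered sampling combines the two ideas: a regular grid of sub-cells, with one uniformly random sample inside each. A sketch (again, the function name is only for illustration):

```python
import random

def jittered_offsets(n):
    """Return n*n offsets: one uniformly random point inside each of the
    n x n sub-cells of the pixel, with coordinates in [0, 1)."""
    step = 1.0 / n
    return [((col + random.random()) * step,
             (row + random.random()) * step)
            for row in range(n)
            for col in range(n)]
```

Any of these offset lists can be passed to the per-pixel averaging sketch above.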
(If interested, also look up Poisson disc sampling.)