digital scope "noise"

If you set a digital scope to a low setting, say 2 mV/div, and then short the probe tip to its ground lead, what sort of trace is typically seen? Would you get a thin line as on an analog scope, or will it be a fatter trace?
 
I see about 100 µV rms with a x1 probe and the input shorted, at a bandwidth of 100 MHz, so about 10 nV/√Hz (SDS1104X-E).
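The 10 nV/√Hz figure follows from dividing the rms noise by the square root of the bandwidth, assuming the noise is roughly white across the band. A minimal sketch of that arithmetic (the function name is mine, just for illustration):

```python
import math

# Convert a measured rms noise voltage and a measurement bandwidth into
# a spectral noise density, assuming flat (white) noise across the band.
def noise_density(v_rms, bandwidth_hz):
    """Return noise density in V/sqrt(Hz)."""
    return v_rms / math.sqrt(bandwidth_hz)

# Figures from the post: 100 uV rms over a 100 MHz bandwidth.
density = noise_density(100e-6, 100e6)
print(f"{density * 1e9:.1f} nV/sqrt(Hz)")  # → 10.0 nV/sqrt(Hz)
```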

Because the trace is digital, none of this is normally hidden: you see the full width of the noise waveform on a running display. Some 'scopes have configurable low-pass filtering, which can be useful; mine only has a 20 MHz option.
 
At 2 mV sensitivity on my old Tek 2445A CRT oscilloscope, with no filters and the 20 MHz bandwidth limit on, there is a minute amount of noise, so low it's in and out of triggering. Changing to 5 mV, the trace is hairline thin. That's why I still use a CRT scope: no processing of the signal, straight to the CRT.
 
X1 or X10 probe? The 9 MΩ series resistor in a x10 probe doesn't help. And bandwidth matters a lot. On my 7854 with the 250 MHz preamp there is significant noise, and even more with the 20 GHz sampling plug-in.

My digital scopes are similar: wider bandwidth means more noise. Also, the higher-speed CRTs tend to have a larger spot size. The 547 was the sharpest, but limited to 50 MHz.
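The "wider band = more noise" point is just the density relation run the other way: for white noise, rms noise scales as the square root of bandwidth. A quick sketch using the 10 nV/√Hz density quoted earlier in the thread (function name is mine):

```python
import math

# For flat (white) noise, rms voltage grows with the square root of
# bandwidth: v_rms = density * sqrt(BW).
def rms_noise(density_v_per_rthz, bandwidth_hz):
    return density_v_per_rthz * math.sqrt(bandwidth_hz)

density = 10e-9  # 10 nV/sqrt(Hz)
for bw in (20e6, 100e6, 250e6):
    print(f"{bw/1e6:>5.0f} MHz -> {rms_noise(density, bw)*1e6:.0f} uV rms")
```

So the same front end that shows 45 µV rms with a 20 MHz limit shows roughly 158 µV rms at 250 MHz, which matches the experience reported above.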
 
Digital scopes add another layer of noise, quantisation noise, which is affected both by ADC linearity (worst at very low signal levels) and by clock jitter. Ironically, the noise in the analogue front end ahead of the sampler can theoretically act as dither and improve the performance of the A-D, although that implies the digital conversion is losing some low-level signal detail.
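The dither effect is easy to demonstrate: a level well below 1 LSB is invisible to a bare quantizer, but with some Gaussian noise ahead of the quantizer the output codes toggle, and averaging recovers the sub-LSB level. A minimal simulation sketch (the quantizer model and noise level are my assumptions, not any particular scope's ADC):

```python
import random

random.seed(0)
LSB = 1.0  # quantizer step, arbitrary units

def quantize(x):
    # Idealised mid-tread rounding quantizer, step = 1 LSB.
    return round(x / LSB) * LSB

true_level = 0.3 * LSB  # a signal well below one ADC step
n = 100_000

# Without dither: every sample rounds to the same code, detail is lost.
undithered = sum(quantize(true_level) for _ in range(n)) / n

# With ~0.5 LSB rms Gaussian noise ahead of the quantizer acting as
# dither, the codes toggle and averaging recovers the sub-LSB level.
dithered = sum(quantize(true_level + random.gauss(0, 0.5 * LSB))
               for _ in range(n)) / n

print(undithered)  # 0.0 — the 0.3 LSB signal vanished
print(dithered)    # close to 0.3 — recovered by averaging
```

This is why front-end noise can paradoxically help: it linearises the quantizer on average, at the cost of burying low-level detail in any single acquisition.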