You measure the HV at the tube's anode connection - that's the thick red wire that plugs into the tube under the suction cup. On a 19" monitor it should be around 19.5 kV or so, give or take a thousand volts. The probe's ground lead just gets clipped to the metal frame of the monitor.
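As a quick sanity check, the numbers above work out to roughly 1 kV of anode voltage per diagonal inch, plus or minus about a kilovolt. That folklore rule can be sketched in a few lines - the numbers here are illustrative, not a service spec for any particular chassis:

```python
# Rough sanity check for a measured CRT anode voltage.
# Rule of thumb (hedged, not a service spec): ~1 kV per diagonal
# inch of tube size, within about +/- 1 kV.

def hv_in_range(measured_kv, tube_inches, tolerance_kv=1.0):
    """Return True if the measured HV is plausibly in range."""
    nominal_kv = tube_inches * 1.0  # ~1 kV per diagonal inch
    return abs(measured_kv - nominal_kv) <= tolerance_kv

print(hv_in_range(19.5, 19))  # 19" tube reading 19.5 kV: plausible -> True
print(hv_in_range(25.0, 19))  # way high, regulation problem -> False
```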
HV can change slightly with beam current - a heavier beam load can make the power supply sag, which pulls the HV down with it. But it won't fluctuate much. In most monitors the B+ feeds the HV directly, so if the B+ is right on, the HV will be too. Raster monitors don't usually have an HV adjustment; it's all pretty fixed, because the flyback runs locked to the horizontal scan frequency. In a color vector monitor the HV can be a lot more variable, and that's the only place I ever use the HV probe. On a monitor like the WG6100, the HV can jump around a lot depending on how well the circuit is working and whether the regulator transistors are good. It's still pretty tightly tied to the B+ though - on a 6100, when I dial the B+ to 180V, the HV is spot on.
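The reason the HV tracks the B+ is that the flyback is essentially a fixed step-up: the HV scales linearly with whatever B+ you feed it. A minimal sketch of that relationship - the step-up ratio below is made up for illustration, not a real WG6100 figure:

```python
# Sketch of why HV tracks B+: the flyback acts as a fixed step-up,
# so the HV scales linearly with B+. The ratio is hypothetical,
# chosen only so 180 V lands near a plausible anode voltage.

def estimated_hv_kv(b_plus_volts, step_up_ratio=108.0):
    """Estimate anode voltage in kV from B+ and a fixed ratio."""
    return b_plus_volts * step_up_ratio / 1000.0

print(estimated_hv_kv(180))  # 19.44 (kV) - B+ dialed in, HV spot on
print(estimated_hv_kv(160))  # low B+ drags the HV down proportionally
```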
If you're really paranoid, then sure, get an HV probe and measure the HV. But again, I don't think it's physically possible to produce damaging radiation from a modern monitor. The protection circuits will cut it off well before it gets that high, and even if they didn't, the picture wouldn't come close to fitting on the screen because the B+ would be so high. You also get weird blooming when the HV regulation is shot. When the HV diode fails on a black and white vector, the HV gets screwed up and the picture gets huge and washed out. Of course, that's a black and white monitor - there's no real possibility of X radiation from a black and white picture tube anyway.
-Ian