This video is from a 1987 Tektronix VHS tape. It goes over the basics of using a waveform monitor to judge the technical quality of an analog SD video signal.
Today's digital waveform monitors, such as those on the TriCaster, don't show IRE; they show a scale from 0 to 235 in 8-bit. How do you translate 0-100 IRE values to digital equivalents? I think 235 = 100 IRE?
Pretty much. 235 is used as the digital white level so that any ringing from a black-to-white transition has headroom before it hits the 255 cap and clips visibly. In NTSC-M terms, code 16 would be your 7.5 IRE black level (it'd be 0 IRE for PAL/SECAM).
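If you want the actual conversion, here's a minimal Python sketch assuming the standard studio-range luma mapping (black = 16, white = 235 in 8-bit, as in ITU-R BT.601); the function name ire_to_code and the clamping choices are mine, not anything from the video:

```python
def ire_to_code(ire: float, setup_ire: float = 7.5, bit_depth: int = 8) -> int:
    """Map an analog IRE level to a digital luma code value.

    Assumes the studio-range mapping: black -> 16, white -> 235
    (scaled up for higher bit depths). setup_ire is 7.5 for NTSC-M,
    0.0 for PAL/SECAM/NTSC-J.
    """
    scale = 1 << (bit_depth - 8)            # 1 for 8-bit, 4 for 10-bit
    black, white = 16 * scale, 235 * scale
    # Linear map: setup level -> black code, 100 IRE -> white code
    frac = (ire - setup_ire) / (100.0 - setup_ire)
    code = round(black + frac * (white - black))
    # Clamp into the legal coding range; the extreme codes are
    # reserved for synchronization in serial digital interfaces.
    return max(1 * scale, min(254 * scale + (scale - 1), code))
```

With the defaults, ire_to_code(7.5) gives 16 and ire_to_code(100) gives 235; with setup_ire=0.0 (PAL/SECAM/NTSC-J), 0 IRE lands on 16 instead.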
In the NTSC system, black level in active video is kept distinct from the blanking (beam-off) level in the transmitted signal: active picture black sits at 7.5 IRE while the blanking intervals sit at 0 IRE, which made it easier to line up the image on one's monitor. As PAL and SECAM came later than NTSC/System M, their designers decided that pedestal wasn't necessary, and instead used PLUGE bars in a test pattern to set the appropriate black level (surrounding the desired black with slightly darker and slightly brighter patches). Japanese NTSC (NTSC-J) also dropped the setup level, just like PAL and SECAM, but American NTSC still had to retain it for compatibility reasons.
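To make the PLUGE idea concrete in digital code values, a small sketch under the same assumed studio-range mapping as above (the ±2% step loosely follows ITU-R BT.814-style patterns, and the helper name is mine):

```python
def pluge_codes(bit_depth: int = 8, step: float = 0.02):
    """Code values for a simple PLUGE pattern: a patch slightly below
    black, black itself, and a patch slightly above black.

    step is the offset as a fraction of the black-to-white span
    (2% here, loosely following BT.814-style PLUGE bars).
    """
    scale = 1 << (bit_depth - 8)
    black, span = 16 * scale, 219 * scale   # studio-range luma
    delta = round(step * span)
    return (black - delta, black, black + delta)
```

In 8-bit this gives (12, 16, 20): the below-black patch is still a legal code thanks to the footroom under 16. On a correctly adjusted monitor the code-12 patch merges invisibly into the surrounding black, while the code-20 patch stays just barely visible.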