The reason is that even though the webcam is much less sensitive than my main camera and has a much smaller dynamic range (fewer distinguishable brightness levels), it can take up to 25 frames per second, whereas the main camera is limited to one 0.1-second exposure per second. To minimize the effects of seeing (air turbulence), I can therefore use the technique of "lucky imaging" with the webcam. This technique uses exposure times short enough that the atmosphere changes very little during each exposure. From these images (really a movie), I select the frames least affected by the atmosphere and combine them into a single image by shifting and adding the short exposures. This yields a much higher resolution than would be possible with a single, longer exposure and allows me to reach the diffraction limit of my telescope, about 0.5 arc seconds (one arc second is the apparent size of a dime about 3.7 kilometers away). By adding hundreds of individual frames like this, the effective dynamic range of the webcam increases and the effects of noise are reduced, and I can apply advanced image processing techniques to further increase the resolution of the final image.
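The select-shift-and-add idea can be sketched in a few lines of Python. This is only an illustration, not my actual processing pipeline: the gradient-variance sharpness metric, the FFT-based integer-pixel registration, and the synthetic star frames are all assumptions made for the sketch.

```python
import numpy as np

def sharpness(frame):
    # Simple "luckiness" metric: variance of the image gradient.
    gy, gx = np.gradient(frame.astype(float))
    return np.var(gx) + np.var(gy)

def register_shift(ref, frame):
    # Integer-pixel shift of frame relative to ref via FFT cross-correlation.
    xc = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))).real
    dy, dx = np.unravel_index(np.argmax(xc), xc.shape)
    # Wrap the circular indices into signed shifts.
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return dy, dx

def lucky_stack(frames, keep_fraction=0.10):
    # Keep only the sharpest fraction of frames, then shift-and-add them.
    ranked = sorted(frames, key=sharpness, reverse=True)
    best = ranked[:max(1, int(len(frames) * keep_fraction))]
    ref = best[0].astype(float)
    stack = np.zeros_like(ref)
    for f in best:
        dy, dx = register_shift(ref, f.astype(float))
        stack += np.roll(f.astype(float), (dy, dx), axis=(0, 1))
    return stack / len(best)

# Tiny synthetic demo: one bright "star" that wanders between frames.
rng = np.random.default_rng(0)
base = np.zeros((64, 64))
base[32, 32] = 100.0
frames = [np.roll(base, (s, s), axis=(0, 1)) + rng.normal(0, 0.1, base.shape)
          for s in (-2, 0, 1, 3)]
result = lucky_stack(frames, keep_fraction=1.0)
```

Real software for this (RegiStax, AutoStakkert and the like) uses sub-pixel registration and per-region quality estimates, but the structure is the same: rank, select, align, add.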
Because the exposures are short, I can also use the simpler ALT-AZ setup for the telescope, which is less sensitive to disturbances and wind-induced vibrations. During the exposures, the telescope passively tracks the object to counter the rotation of the Earth, which moves objects out of view at up to 15 arc seconds per second or, at the image scale generally used for these images, between 30 and 60 pixels per second. The telescope mount can counter this rotation, but with my telescope the remaining tracking errors include an eight-minute periodic component of 21 arc seconds. With the built-in software of the telescope mount I reduced that to an eight-minute periodic error of 7 arc seconds peak to peak (in polar mode; I never measured it in ALT-AZ mode, but it is surely much higher).
But because the individual exposures are very short, this remaining eight-minute periodic tracking error does not lead to image smearing.
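A back-of-the-envelope calculation shows why. Modelling the periodic error as a sinusoid is an assumption, as is the 0.3 arc seconds per pixel image scale (chosen to match the 30-60 pixels per second quoted above for the 15"/s sidereal rate):

```python
import math

# Why a 7" peak-to-peak, 8-minute periodic error is harmless at 1/25 s.
AMPLITUDE = 7.0 / 2     # arc seconds (half the peak-to-peak error)
PERIOD = 8 * 60         # seconds
IMAGE_SCALE = 0.3       # arc seconds per pixel (assumed)

# Worst-case drift rate of a sinusoidal error: 2*pi*A/T.
max_rate = 2 * math.pi * AMPLITUDE / PERIOD   # arc seconds per second
smear_arcsec = max_rate * (1 / 25)            # drift during one frame
smear_pixels = smear_arcsec / IMAGE_SCALE
# max_rate is about 0.046 "/s, so one frame smears by roughly
# 0.006 pixel: completely negligible.
```

The shift-and-add step then removes the frame-to-frame drift that the error does accumulate over minutes.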
Another benefit of the webcam for these kinds of objects is that, unlike my main camera, it is not a monochrome but a color camera. The 640x480 pixel CCD contains a Bayer filter (50% of the pixels have a green filter, 25% a red filter and 25% a blue filter). Although this means that each RGB pixel has at least two interpolated color components, it has the benefit that all colors are captured at the same time. The only post-processing required is a relative shift of the color channels, because the Earth's atmosphere refracts light at a slightly different angle for each color, which can amount to several pixels at the image scale used.
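The channel shift itself is trivial once the offsets are known. In the sketch below the (dy, dx) offsets are hypothetical example values; in practice they would be measured, for instance by cross-correlating the red and blue channels against the green one:

```python
import numpy as np

def align_channels(rgb, red_shift, blue_shift):
    """Shift the red and blue channels onto the green reference channel.

    red_shift and blue_shift are (dy, dx) integer pixel offsets."""
    out = rgb.astype(float).copy()
    out[..., 0] = np.roll(out[..., 0], red_shift, axis=(0, 1))   # red
    out[..., 2] = np.roll(out[..., 2], blue_shift, axis=(0, 1))  # blue
    return out

# Synthetic example: atmospheric dispersion has displaced the red image of
# a star two pixels one way and the blue image two pixels the other way,
# relative to green.
rgb = np.zeros((40, 40, 3))
rgb[22, 20, 0] = 1.0  # red
rgb[20, 20, 1] = 1.0  # green
rgb[18, 20, 2] = 1.0  # blue
aligned = align_channels(rgb, red_shift=(-2, 0), blue_shift=(2, 0))
```

After the shift, all three channels of the star land on the same pixel again.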
The maximum number of frames that can be combined using this method is limited to a few thousand: to maximize the resolution of the result, only a small fraction of the frames, typically around 10%, can be used. My wish to use an ALT-AZ setup for these kinds of images also limits the total exposure time, because of field rotation in this mode. In addition, the imaged objects themselves rotate, which leads to smearing if the movies are longer than about 10 minutes.
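The field-rotation limit can be quantified with the standard rate formula for an ALT-AZ mount. The latitude, pointing direction and image geometry below are purely illustrative assumptions, not values from my setup:

```python
import math

EARTH_RATE = 15.04        # sidereal rate, arc seconds per second
ARCSEC_PER_RAD = 206265.0

def field_rotation_rate(lat_deg, az_deg, alt_deg):
    """Field-rotation rate (arc seconds per second) on an ALT-AZ mount,
    using the standard formula omega * cos(lat) * cos(az) / cos(alt)."""
    lat, az, alt = (math.radians(v) for v in (lat_deg, az_deg, alt_deg))
    return EARTH_RATE * math.cos(lat) * math.cos(az) / math.cos(alt)

# Example: observer at latitude 52 N, object in the south-east at
# 45 degrees altitude.
rate = field_rotation_rate(52.0, 135.0, 45.0)   # about -9.3 "/s

# Smear of a point 300 pixels from the frame centre over a 10-minute movie:
angle_rad = abs(rate) * 600 / ARCSEC_PER_RAD
smear_px = 300 * angle_rad   # roughly 8 pixels at these assumed values
```

A smear of several pixels at the frame edge is why longer runs need either software de-rotation of the frames or a shorter movie.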