Astronomers have been measuring the parallax of astronomical bodies since at least the time of Ptolemy and Hipparchus. Ptolemy calculated the parallax of the Moon at one position and obtained a value of 1 + 7/60 degrees. The Sun showed no measurable parallax. Unfortunately, comets, like meteors, were considered phenomena of the sublunary (atmospheric) realm, so we have no information from him about the paths of comets. Tycho Brahe observed comets by measuring their positions relative to nearby stars to an accuracy of about 1 minute of arc.
When tracking an object passing near the Earth, a single observer who does not invoke a theory of motion can determine only the object's direction at various times. To obtain the object's range, simultaneous observations must be made from two places on the Earth's surface. Consider the situation shown in the figure below, where two observers at positions P1 and P2 sight object X. The distances of the object from the two positions are λ1 and λ2 along the unit directions e1 and e2 respectively. Writing X = P1 + λ1 e1 = P2 + λ2 e2, so that λ1 e1 − λ2 e2 = ΔP, and taking dot products with e1 and e2, shows that the two lambdas are solutions of the pair of equations

λ1 − (e1·e2) λ2 = ΔP·e1
(e1·e2) λ1 − λ2 = ΔP·e2

where ΔP = P2 − P1 and the dot indicates a vector dot product.
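As a sketch of how that pair of equations can be solved numerically (the positions and sight directions in the usage example are illustrative, not the ones in the figure):

```python
import numpy as np

def ranges(P1, P2, e1, e2):
    """Solve for the ranges lambda1, lambda2 given observer positions
    P1, P2 and unit sight directions e1, e2. The 2x2 system comes from
    dotting lambda1*e1 - lambda2*e2 = dP with e1 and with e2."""
    dP = P2 - P1
    c = e1 @ e2                      # cosine of the angle between the sight lines
    A = np.array([[1.0, -c],
                  [c, -1.0]])
    b = np.array([dP @ e1, dP @ e2])
    lam1, lam2 = np.linalg.solve(A, b)
    return lam1, lam2

# Illustrative check: place an object at X and recover its ranges.
X = np.array([0.0, 10.0])
P1 = np.array([-0.25, 0.0])
P2 = np.array([0.25, 0.0])
e1 = (X - P1) / np.linalg.norm(X - P1)
e2 = (X - P2) / np.linalg.norm(X - P2)
print(ranges(P1, P2, e1, e2))
```

The solve fails (the matrix is singular) when e1 and e2 are parallel, which is the geometric statement that two parallel sight lines carry no range information.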
Now consider two observers, one in Sydney and a second in Perth, who can both sight an object whose parallax is 0.1° in the directions shown. Solving the equations above gives the geocentric distance and range of the object. For simplicity we take the radius of the Earth as 1, so the two observers are separated by about half an Earth radius. The range of the object works out to 294 Earth radii.
If the parallax is increased to 1°, the range of the object decreases to 30.2 Earth radii.
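As a rough cross-check on those figures (not the full solution of the equations above), the small-angle relation range ≈ baseline / parallax, with the ~0.5 Earth-radius Sydney–Perth baseline taken from the text, gives values of the same order:

```python
import math

def approx_range(baseline, parallax_deg):
    """Small-angle estimate of range: baseline divided by the
    parallax in radians. Baseline in Earth radii."""
    return baseline / math.radians(parallax_deg)

print(approx_range(0.5, 0.1))   # ~286 Earth radii, vs. 294 from the exact solution
print(approx_range(0.5, 1.0))   # ~29 Earth radii, vs. 30.2
```

The small discrepancies are expected: the exact values depend on the particular sight directions in the figure, which the approximation ignores.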
For an object at a geocentric distance of about 5 Earth radii, the parallax for observers separated by 1° is about 0.2°.
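Conversely, the parallax at a given geocentric distance can be estimated from the chord between the two observers. A minimal sketch, again assuming an Earth radius of 1 and a small-angle parallax:

```python
import math

def approx_parallax_deg(distance, separation_deg):
    """Estimate the parallax (degrees) of an object at the given
    geocentric distance (Earth radii), seen by two observers whose
    geocentric angular separation is separation_deg degrees."""
    chord = 2.0 * math.sin(math.radians(separation_deg) / 2.0)  # baseline between observers
    return math.degrees(chord / distance)                       # small-angle parallax

print(approx_parallax_deg(5.0, 1.0))   # about 0.2 degrees, as quoted in the text
```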
Edit (Jan 9): Corrected e2 so that the parallax works out correctly; the corresponding range values have changed accordingly.