This paper presents a novel interactive projector calibration method for arbitrary multi-projector-camera environments. The method requires no calibration rig and is not restricted to any particular arrangement of the display devices. The cameras in the system must be pre-calibrated so that a common world coordinate system can be defined. Each projector is activated in turn and displays a series of self-identifying tags. These tags are detected accurately and robustly, and each must be seen by a minimum subset of the cameras. This is achieved by freely moving a target object, e.g. a sheet of paper, onto which the tags are projected. The 3D position of each tag is computed by minimizing the reprojection errors in the relevant camera images while eliminating potential outliers. Once a tag has been seen a sufficient number of times, it is masked out in the display, giving the user visual feedback on the current detection state. The resulting 3D point cloud, coupled with the known 2D tag positions in the projector frame, serves as input for a nonlinear optimization of the projector's intrinsic and extrinsic parameters as well as its lens distortion coefficients. We show that the overall procedure takes less than a minute and yields low reprojection errors.
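The core triangulation step, recovering a tag's 3D position from its observations in pre-calibrated cameras, can be sketched as follows. This is an illustrative linear (DLT) two-view triangulation under assumed camera matrices, not the paper's full pipeline, which minimizes the true reprojection error over all observing cameras and rejects outliers:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two views.

    P1, P2: 3x4 camera projection matrices of pre-calibrated cameras.
    x1, x2: 2D image observations (pixel coordinates) of the same tag.
    Returns the 3D point minimizing the algebraic error; a full pipeline
    would refine this by nonlinear minimization of the reprojection error.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X: x * (P[2] @ X) = P[0] @ X, etc.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of A with smallest
    # singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical setup: two cameras sharing intrinsics K, the second
# translated by one unit along the x-axis.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known tag position into both views, then triangulate it back.
X_true = np.array([0.2, -0.1, 4.0])
project = lambda P, X: (P @ np.append(X, 1))[:2] / (P @ np.append(X, 1))[2]
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noise-free observations the DLT solution recovers the point up to numerical precision; with real detections, the nonlinear refinement and outlier elimination described above become essential.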