This article presents a method to improve camera calibration by separating the determination of the non-linear calibration parameters from that of the linear ones. We measure correspondences between the distorted image on a camera target and a flexible undistorted calibration pattern displayed on a TFT monitor. Using these correspondences, the camera image can be projected onto the plane of the calibration pattern to remove the distortion. We prove that this projection follows the principles of a pinhole camera and call it a virtual camera. When the undistorted images of the virtual camera are used as input to traditional camera calibration methods, the linear camera parameters can be calculated with 7.5% to 39.5% higher precision than with the methods of Zhang and Heikkilä. The described procedure allows the camera to be calibrated in a non-iterative way.
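The core projection step onto the pattern plane can be sketched as fitting a planar projection (homography) from measured correspondences via the direct linear transform. This is a simplified, distortion-free illustration of the idea, not the paper's full method; the function names and point values are invented:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 planar projection H with dst ~ H @ src (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the null vector of A, obtained from the SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pts):
    """Apply H to 2-D points, including homogeneous normalization."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Synthetic correspondences between the camera image and the monitor's
# pattern plane (illustrative values; real ones come from pattern detection).
image_pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
pattern_pts = np.array([[10.0, 5.0], [110.0, 8.0], [105.0, 95.0], [12.0, 98.0]])

H = fit_homography(image_pts, pattern_pts)
print(np.allclose(project(H, image_pts), pattern_pts))  # True: exact fit for 4 points
```

With four correspondences the fit is exact; with the dense correspondences described in the article, the same least-squares machinery averages out measurement noise.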