Note: Please remember that this post is over 5 years old and therefore not current, so code at your own risk.
In the summer of 2010, Apple opened up UIGetScreenImage() as a way of taking screenshots in iOS apps. There was great joy in the land.
Then the following September, Apple decided that it was better for app developers to use either UIImagePickerController or methods from AVFoundation to capture images and present a camera view. Happiness was replaced with great sadness in the land.
To help developers, Apple’s iOS Team came out with four Technical Q&As that tried to show developers how to accomplish the same thing without the now-prohibited UIGetScreenImage(). To put it simply, what had been a one-line job became a many-line task.
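To make the contrast concrete, here is a rough sketch of the before and after. The replacement follows the layer-rendering approach Apple documented in those Q&As; the method name screenshotOfWindow: is just illustrative, not anything from Apple’s sample code:

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Before: one (now-prohibited) call did it all.
//   UIImage *screenshot = UIGetScreenImage();

// After: render the window's layer tree into a bitmap context.
- (UIImage *)screenshotOfWindow:(UIWindow *)window
{
    // Use the scale-aware variant when it is available (iOS 4+),
    // falling back on older systems, as Apple's Q&A did.
    if (UIGraphicsBeginImageContextWithOptions != NULL) {
        UIGraphicsBeginImageContextWithOptions(window.bounds.size, NO, 0.0);
    } else {
        UIGraphicsBeginImageContext(window.bounds.size);
    }

    [window.layer renderInContext:UIGraphicsGetCurrentContext()];

    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenshot;
}
```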
Worse, none of the Apple-supplied Technical Q&As offered an elegant solution for those of us interested in augmented reality applications who wanted to take a screenshot. So, in plain English, if you wanted a screenshot of your augmented reality app including its UIKit layer content…sort of like, well, a gun camera…you were out of luck. This bothered me greatly.
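To see why, note that renderInContext: walks the UIKit layer tree but draws nothing for a live AVCaptureVideoPreviewLayer, so an AR screenshot has to be composited by hand: grab a camera frame separately (for example from an AVCaptureStillImageOutput), then draw the UIKit overlay on top of it. Here is a minimal sketch of that kind of compositing, with cameraFrame and overlayView assumed to be supplied by the app (neither name comes from Apple’s Q&As):

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Composite a separately captured camera frame with the app's
// transparent UIKit overlay. `cameraFrame` is assumed to have been
// obtained elsewhere, e.g. via AVCaptureStillImageOutput, because
// renderInContext: alone will not include the live preview layer.
- (UIImage *)compositeCameraFrame:(UIImage *)cameraFrame
                      withOverlay:(UIView *)overlayView
{
    CGSize size = overlayView.bounds.size;
    UIGraphicsBeginImageContextWithOptions(size, YES, 0.0);

    // Draw the camera frame first, scaled to fill the overlay's bounds.
    [cameraFrame drawInRect:CGRectMake(0, 0, size.width, size.height)];

    // Then render the UIKit overlay's layer tree on top of it.
    [overlayView.layer renderInContext:UIGraphicsGetCurrentContext()];

    UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return composite;
}
```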