Wanted to also add: I'm not knocking Apple too much for heavy use of lens correction. All of the new point-and-shoot cameras do it.
Once you get out of the realm of plastic lenses, the lens easily costs more than the image sensor, so companies try to use as little lens glass as possible and lean on heavy software lens correction, which lets them reuse the same lens elements on as many cameras as possible.
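To give a rough idea of what that in-camera correction does, here is a minimal sketch of the common polynomial radial (barrel) distortion model. This is only an illustration: the coefficients k1 and k2 below are made-up values, not Canon's or Apple's actual calibration data, and real firmware also resamples the full image rather than just remapping points.

```python
import numpy as np

def undistort_points(xy, k1=-0.15, k2=0.05):
    """Map normalized image coordinates through a radial correction.

    Uses the common polynomial radial model:
        r_corrected = r * (1 + k1*r**2 + k2*r**4)
    k1, k2 are illustrative values, not real calibration data.
    """
    xy = np.asarray(xy, dtype=float)
    r2 = np.sum(xy**2, axis=-1, keepdims=True)  # squared radius from center
    scale = 1.0 + k1 * r2 + k2 * r2**2
    return xy * scale

# The center of the frame (r = 0) is untouched, while points near the
# edge are moved the most -- which is why edge pixels get stretched and
# resampled the hardest.
center = undistort_points([[0.0, 0.0]])
edge = undistort_points([[1.0, 0.0]])
```

The key point is that the correction is purely geometric: it moves pixels around, so wherever the mapping stretches the image, the camera has to invent in-between pixel values by interpolation.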
For example, here is a Canon PowerShot SX230 HS's JPEG output:
http://www.flickr.com/photos/hamster768/9727068436/sizes/o/in/photostream/
Now here is the raw frame buffer, converted to JPEG (raw output was enabled through a firmware hack):
http://www.flickr.com/photos/hamster768/9723838921/sizes/o/in/photostream/
And here is the image that was displayed on the screen, which I took a picture of with the camera:
http://www.psdgraphics.com/file/metal-grid.jpg
If you look at the distorted areas in the raw image and compare them to the corrected JPEG the camera produces, you will see blurring in those areas: the grain at the pixel level looks stretched.
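That blurring falls out of the resampling itself. Where the correction stretches the image, output pixels are interpolated between fewer real samples, so fine pixel-level detail loses contrast. Here is a toy 1-D analogue of that effect, with made-up values, using simple linear interpolation:

```python
import numpy as np

# A signal with maximal fine detail: alternating 0/1 at the pixel level.
signal = np.array([0.0, 1.0] * 8)

# Resample it at 1.5x magnification (a "stretched" region of the frame),
# using linear interpolation between the original samples.
stretched_coords = np.linspace(0, len(signal) - 1, int(len(signal) * 1.5))
stretched = np.interp(stretched_coords, np.arange(len(signal)), signal)

# Interpolated values are averages of neighbors, pulled toward 0.5,
# so the stretched region has visibly less pixel-level contrast.
```

The same thing happens in 2-D in the corrected areas of the JPEG, which is why the grain there looks smeared compared to the raw frame.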
The Canon point-and-shoot mainly has it at the edges, but the iPhone has it at the edges plus a ring of distortion around the center (two large rings of distortion).
To get an image that is sharp edge to edge, you need a quality lens where the glass is very large, so you are using only the best part of the glass and not the sloped edges where barrel distortion takes place.
Though at that level, the lens will cost more than the camera.
The main thing holding Apple back is the lens. Even on the iPhone 5, it is the main limit on the camera's sharpness and detail, due to the software corrections the camera has to make.
(P.S. Even the highest-end compact point-and-shoot that Canon makes shows this same behavior: they use the smallest lens possible for the sensor size, then rely on lens correction in the camera's built-in JPEG processing.)