Pages: [1]   Go Down
  Print  
Author Topic: Manual camera controls in iOS 8: Explained  (Read 381 times)
HCK
Global Moderator
Hero Member
*****
Posts: 79425



« on: September 07, 2014, 09:00:29 am »

Manual camera controls in iOS 8: Explained

<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p><a href='http://www.imore.com/camera-api-ios-8-explained' title="Manual camera controls in iOS 8: Explained"><img src='http://www.imore.com/sites/imore.com/files/styles/large/public/field/image/2014/09/wwdc_2014_camera_api_icon.jpg?itok=0YZNrg8y' />[/url]</p> <p>While the <a href="http://www.imore.com/camera" title="Camera app for iPhone and iPad features, help, and how-to">Camera[/url] app in <a href="http://www.imore.com/ios-8" title="iOS 8 help, how-to, news, reviews">iOS 8[/url] is only getting a few new features, the Camera application programming interfaces (API) — what developers use to make App Store camera apps — is getting the most significant update in the history of the platform, including and especially manual controls for focus, exposure, and white balance. Not much will change for casual photographers, but for pros and enthusiasts, the best camera we have with us will be getting a whole lot better. So, how does it all work?<!--break--></p> <h2>Automatic vs. manual</h2> <p>Nokia offers great lenses with optical image stabilization (OIS). They want to capture the best light possible right from the start. Google makes everything awesome on the servers. They never know which device or what quality camera they'll get the data from, they concentrate on finishing strong. Apple, however, has focused on the best custom image signal processors (ISP) in the business. They control not only the software but the hardware down to the chip so they optimize each part to get the best whole.</p> <p>That's why, with the tap of a finger, the iPhone locks on the most obvious subject in the frame, exposes for the best balance of light and shadow, makes sure white is as close to white as technologically possible, and produces an image that, 9 times out of 10, looks as good if not better than phones with much better optics or server farms can provide.</p> <p>But what about that 10th time out of 10? What about when the object you want to focus on isn't the most obvious? When you want to force a scene bright or darker for artistic or practical effect? When you want the set a custom white balance?</p> <p>Just like automatic transmission is the quickest and most reliable way for most people to drive a car most of the time, automatic cameras are the quickest and most reliable way for most people to capture the memories that will matter to them most of the time. For the pros, however, for the artists and experimenters, for those who want to control every aspect themselves — nothing beats full-on manual, not on the road and not on the shoot.</p> <p>And that's what Apple's providing with iOS 8. The built-in Camera app is getting time-lapse photography and a sun icon you can swipe to change exposure, but developers are getting more. They're getting complete manual control over focus, exposure, and white balance.</p> <h2>Manual focus</h2> <p>Focus means making sure whatever the most important thing is in your photo, whether it's as close a flower petal or as far away as a sunset, is crisp and as sharply captured as possible. Apple has done a lot to make focus "just work" on the iPhone. There's auto-focus, tap to focus, and multiple face detection. All of that is designed so that the camera will sharply capture what it believes is are the most important elements in the scene.</p> <p>Manual focus control is for when you want to determine for yourself what, if anything, should be sharply captured. Maybe you want the entire photo to be blurry and dreamy, maybe you want to stack focus to avoid any blur, maybe you want to pull focus on a moving subject, or maybe you want to change focus over time.</p> <p>The manual camera controls let you do just that. Instead of tap to focus, you can do something functionally akin to turning the focusing ring on a traditional camera lens. Hold your iPhone up and frame two objects, one super close, one some distance away. Manually change the focus and watch as the one in front goes from sharp to blurry, and the one in back goes from blurry to sharp.</p> <p>Manual focus in iOS works on a scale of 0.0 to 1.0, with macro on one end and "infinity" on the other. Developers can lock focus at any lens position to achieve focus along any point of that scale. Apple chose to use a scale rather than distance measures because of the way focus is implemented on the iPhone and iPad.</p> <p>To change focus, the lens is physically moved via a spring and a magnet. That means that there's bounce when it moves, there's a stretch dependent on the angle of gravity, and changes over time as the spring gets used more and more often. So, telling the lens to move to a certain position can and will produce different results at different times. Telling it you want a certain scale will move the lens to the position required to achieve it, regardless of what that position might be at any given time.</p> <p>Because even a Retina screen isn't as high resolution as a photo capture — currently 1136x640 vs. 3264x2448 on an iPhone 5s with an 8mp iSight camera — the preview image has to be scaled down. That can make manual focus more challenging. To help compensate for that, Apple is providing ways for developers to show zoomed in previews, to compute their own focus scores, and to highlight sharp areas (focus peaking).</p> <p>In other words, many of those fancy focusing tools you had on your DSLR are finding their way to your iPhone.</p> <h2>Manual exposure</h2> <p>To determine how bright or dark your image is, you "expose" your camera's sensor to greater and longer or smaller and shorter amounts of light. Normally, in automatic mode, the camera will constantly calculate the best exposure for any given seen so you get the best exposed photo of that scene. Sometimes, however, you might want an image that's surrealistically bright or sullenly dark, an image with minimal motion blur or with a lot of it, and image with as little noise as possible, or as bright as possible regardless of the amount of noise it general. Enter manual exposure.</p> <p>Exposure is determined by shutter speed, ISO (light sensitivity), and lens aperture.</p> <p>Shutter speed is the duration of exposure. The faster the shutter closes, the shorter the amount of time the sensor is exposed to light. That means the image will be darker but also have less motion blur will result (because things won't have had much time to move). The slower the shutter closes, the longer the amount of time the sensor is exposed to light. That means the image will be brighter but will have more motion blur (because things will have had time to move more).</p> <p>Generally you want shorter exposures/faster shutter speed for well-lit action shots, and longer exposures/slower shutter speed for low-light stills.</p> <p>ISO (International Standards Organization) originally measured how sensitive film stock was to light. Now it means how sensitive the digital camera capture is to light. Low ISO is less sensitive to light which makes for darker images but less noise. High ISO is more sensitive to light which makes for brighter images but with more noise (the result of spikes that occur when amplifying the signal off the camera's CMOS chip).</p> <p>Aperture is the size of the lens opening. If shutter speed is how long you sip from a straw, aperture is how big the straw is. The bigger the aperture, the more light you can take in while the shutter is open. To date, however, Apple has only shipped fixed-aperture cameras on the iPhone, iPod touch, and iPad. So, manual exposure controls are limited to shutter speed and ISO.</p> <p>Automatic exposure on iOS tries to ensure a properly exposed image by dynamically changing shutter speed (duration of exposure) and ISO (light sensitivity) based on a constant stream of metering stats it receives from the scene being photographed.</p> <p>Manual exposure lets you control all that yourself. You could, for example, could choose to minimize noise in a lower-light setting by cranking down the ISO and, if you're stable enough, cranking up the duration. That would give you a better lit, far less noisy image.</p> <p>Developers can set duration and ISO together or can lock one and only let the other be set. iOS will continue to feed them the metering stats, and supply an offset value that they can use if they want to, but duration and ISO will no longer be bound to it.</p> <h2>Exposure compensation</h2> <p>Sometimes you may want slightly more control than automatic exposure allows, but without the complexity of manual exposure controls. Instead of manipulating duration and ISO, you just want to make an image a little brighter or a little less bright. That's where exposure compensation, also known as exposure target bias, comes in.</p> <p>With exposure compensation, Apple's automatic exposure algorithms still handle all the heavy lifting, but you get to bias it one way or another to get closer to the look you want. And it works in both continuous and locked modes. So, you can bias the exposure to make a scene brighter, move the camera, and exposure will keep adjusting to keep that level of enhanced brightness. Or, you can lock exposure based on a particular scene and nothing will chance unless you bias the exposure from there.</p> <p>Exposure compensation is expressed in f-stops. +1 f-stop doubles the brightness, -1 f-stop halves the brightness.</p> <p>Developers can currently set exposure target biases between -8 and +8 for all existing iOS devices. However, Apple warns that that could change in the future.</p> <p>Exposure compensation is also the basis for the new adjustable exposure in the iOS 8 Camera app. Tap to focus, get the sun icon, swipe it up to bias exposure and make the scene brighter, or swipe it down to bias exposure and make the scene darker.</p> <h2>Manual white balance</h2> <p>White balance is just what the name implies — making sure the whites (and grays) in your image are as close to white (and gray) as possible. Too cool, and everything looks blueish. Too warm, and everything looks yellowish. In other words, white balance is all about making the colors in your image look as realistic as possible. Why is that hard? Because different light sources give off cooler or warmer light. Incandescent lights are warm and yellow. Daylight is cooler and more blue.</p> <p>Cameras need to adjust for light temperate by boosting other colors to compensate. For example, if color temperate is tinting a scene blue, the camera software has to boost red and a little green. Under mixed lighting conditions like a blue tinted computer display and a yellow tinted desk lamp, compensating can become more complicated. (See Planckian locus if you're interested in how it works.)</p> <p>The iOS Camera app handles all this automatically. Traditional cameras often offer automatic white balance as well, along with specific ones optimized for sunlight, cloudy outdoor light, shadowy conditions, incandescent bulbs, florescent lights, and flash photography, along with the ability to set custom white balances.</p> <p>All of that, and more, is what manual white balance allows.</p> <p>With iOS 8, Apple is giving developers full control of device red/green/blue (RGB) gains. That includes temperature casts between yellow and blue, and tints between green and magenta. Apple is also providing conversion routines to and from device independent color spaces. That means developers can go to or from the device specific values to x,y chromaticity values or temperature and tint values. That's important when cameras and the RGB gains coming off of them vary from device to device, but apps have to work across all devices.</p> <p>Developers set red, green, and blue gains all at once in a new struct. Currently, the maximum white balance gain a developer can set on any iOS device is 4, but Apple again cautions this might change in the future. x,y chromaticity, and temperature/tint are also set in new structs. Chromaticity can range from 0 to 1. Temperature is a floating point value in Kelvin, and tint is a green/magenta offset from 0 to 150. Conversion routines don't take into account whether their results are legal color values or not (i.e. can be seen by humans or not), so developers need to check for out-of-range values.</p> <p>Custom white balances using gray cards is now possible as well. A longtime tool of traditional photographers, a gray card can be invaluable in setting the proper white balance for a scene with mixed or otherwise tricky lighting.</p> <p>Gray cards are literally cards colored neutral gray that fill the center 50% of the frame. That way the auto white balance can lock onto a known neutral gray value and ignore any colors or reflections that might otherwise bias or misinform it.</p> <p>For example, if you wanted to take a picture of someone all dressed up in yellow sitting on top of a pile of bananas, it's possible the automatic white balance might mistake sunlight reflecting off all that yellow for incandescent light. So, it might boost the blues to compensate resulting in an image that looks sickly and wrong. Stick a gray card in there, however, lock white balance on the card, and the auto white balance will work to make that gray look gray regardless of any other colors or casts in the frame. You get great looking yellows without them messing up all the other colors in the photo.</p> <h2>Bracketed capture</h2> <p>Bracketed capture allows for bursts of images to be taken, with the option for changing camera values from image to image.</p> <p>Burst mode on the iPhone 5s is an example of a simple bracket where nothing changes but you make sure you capture all the action from a flip, finish line, or even a baby with eyes wide open.</p> <p>High-dynamic range (HDR) is the classic example of a bracket with changes. Take photos with the exposure biased to -2, 0, and +2, and then fuse them together pull out detail in both light and shadow.</p> <p>Combine bracket capture with the new manual camera controls and developers can make apps that do both those things, but also with the potential to do much, much more.</p> <h2>Bar codes, permission requestors, H.264 encoder, and PhotoKit</h2> <p>In addition to the manual controls, a few other features are coming to the iOS 8 camera and audio/video foundation as well.</p> <p>The camera is gaining support for three new types of bar codes, data matrix, interleaved 2 of 5, and ITF14, as well as global support for camera and mic permission requesters.</p> <p>Developers will also getting direct access to the hardware H.264 video encoder for real-time capture. Yeah, we're in for some fun.</p> <p>Then there's the new Photos app and PhotoKit, which ties into the new iCloud Photo Libraries, and gives developers faster performance, read and write access to the library, non-destructive edits, and the ability to delete photos (with permission). And there's photo extensions, which brings App Store filters and transformations into the main Photos app.</p> <p>In other words, there's a lot. A lot, a lot.</p> <h2>Bottom line</h2> <p>To say this is a major release for photography and photographers would be to greatly undersell it. With <a href="http://www.imore.com/ios-8" title="iOS 8 help, how-to, news, reviews">iOS 8[/url], Apple is taking the best automatic camera on a smartphone and making a run at the title of best manual camera on a smartphone as well. That Apple isn't including all the new controls in their own Camera app, but is leaving them for developers to implement might even allow for the best of both worlds.</p> <p>Casual photographers can stay comfortable in the largely automatic, easy and simple to use confines of the Camera app, and developers can make App Store apps that offer those full on manual controls. They can appeal to those pros, those artists, and those experimenters.</p> <h2>More of iOS 8: Explained</h2> <div id="block-block-69" class="block block-block"> <div class="content"> <ul><li>Handoff in iOS 8 and OS X Yosemite: Explained</li> <li>Making and receiving phone calls on iOS 8 for iPad and OS X Yosemite: Explained</li> <li>Sending and receiving SMS/MMS on iOS 8 for iPad and OS X Yosemite: Explained</li> <li>AirDrop and Instant Hotspot in iOS 8 and OS X Yosemite: Explained</li> <li>QuickType keyboard in iOS 8: Explained</li> <li>Interactive notifications in iOS 8: Explained</li> <li>SceneKit in iOS 8: Explained</li> <li>Metal in iOS 8: Explained</li> <li>Widgets in iOS 8: Explained</li> <li>Share extensions in iOS 8: Explained</li> <li>Action extensions in iOS 8: Explained</li> <li>Inter-app photo and video editing in iOS 8: Explained</li> <li>Custom keyboards in iOS 8: Explained</li> <li>Family Sharing on iOS 8: Explained</li> <li>iCloud Drive and Document Picker for iOS 8: Explained</li> <li>Document provider extensions in iOS 8: Explained</li> <li>TestFlight in iOS 8: Explained</li> <li>Apple Maps in iOS 8: Explained</li> <li>iMessage in iOS 8: Explained</li> <li>Photos in iOS 8: Explained</li> <li>Spotlight in iOS 8: Explained</li> <li>Health in iOS 8: Explained</li> <li>Touch ID in iOS 8: Explained</li> <li>HomeKit in iOS 8: Explained</li> <li>Adaptive UI in iOS 8: Explained</li> <li>Manual camera controls in iOS 8: Explained</li> </ul></div> </div> </div></div></div><div id="comment-wrapper-nid-26762"></div><img width='1' height='1' src='http://tipb.com.feedsportal.com/c/33998/f/616881/s/3e359916/sc/4/mf.gif' border='0'/><br clear='all'/>

<a href="http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/rc/1/rc.htm" rel="nofollow"><img src="http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/rc/1/rc.img" border="0"/>[/url]
<a href="http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/rc/2/rc.htm" rel="nofollow"><img src="http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/rc/2/rc.img" border="0"/>[/url]
<a href="http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/rc/3/rc.htm" rel="nofollow"><img src="http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/rc/3/rc.img" border="0"/>[/url]

<img src="[url]http://da.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/a2.img" border="0"/>[/url]<img width="1" height="1" src="http://pi.feedsportal.com/r/204367377609/u/49/f/616881/c/33998/s/3e359916/sc/4/a2t.img" border="0"/><img src="http://feeds.feedburner.com/~r/TheIphoneBlog/~4/KHp8-XSvKiQ" height="1" width="1"/>

Source: Manual camera controls in iOS 8: Explained
Logged
Pages: [1]   Go Up
  Print  
 
Jump to: