Warm Southern Breeze

"… there is no such thing as nothing."

#iPhoneography Tips & Tricks

Posted by Warm Southern Breeze on Monday, March 16, 2015

The iPhone is a simple, though powerful, tool. Its ease of operation often belies the sophisticated mathematical and computer algorithms which undergird its operation. The integrated movie and still camera is one of the iPhone’s highlights, and Apple, Inc. takes every opportunity to improve the images the camera produces.

Quite honestly, the iPhone is perhaps my favorite creative imaging tool… as you may well attest, if you’ve been following my photo stream for any length of time. There are very few things the iPhone can do that my Nikon DSLR cannot. There are a few features on the iPhone which the Nikon does not have, and vice versa. One is the Nikon’s ability to create RAW images, whereas the iPhone creates only JPEG images. By the same token, the iPhone has a “Burst” mode, whereas the Nikon does not. And I’ve been able to obtain images with my iPhone that my Nikon could never get, such as bluebird hatchlings in their nest box.

The diminutive size of the iPhone belies its strength, and the modern smartphone with integrated camera is now so commonplace that it’s difficult – if not impossible – to find a smartphone without a camera at all.

#iPhoneography #HowTo - vss9896

iPhoneography Tips & Tricks –
This screen shot shows THREE helpful hints on using the iPhone camera to its fullest capacity.

One would imagine that a software-driven camera, such as the modern high-dollar DSLR, would have even more features incorporated into it than it already does. And yet, it appears that only Sony has taken such a matter under consideration and incorporated those numerous features into its α (Alpha) series mirrorless cameras.

Part of the beauty of any smartphone is the practical ease with which it enables or empowers us to communicate – with text, with images, and more.

Some note, with a sense of frustration and consternation, that smartphone manufacturers increasingly do not include operating instructions with their premier products, and unfortunately, such is also the case with Apple. It’s not that such instructions do not exist, but that those details are often hidden.

So while the ease of operation of the iPhone and other smartphone cameras has put photography within reach of almost everyone (remember, that was Kodak’s goal with its “Brownie” and “Instamatic” cameras of years gone by), it remains the case that not everyone knows how to obtain a good image using their smartphone.

And that was precisely the case which inspired me to create this image and to share it with you.

A long-time friend had posted some images which appeared quite blasé and poorly made. I inquired about the tool he used to create them, to which he replied that he had used an iPhone, and had made the images around dusky dark… the veritable “blue hour” or Twilight Zone.

Since possessing a powerful tool is no guarantee that one knows how to use it, I surmised what he had done, and created this image to assist and improve his efforts. He replied that my supposition was correct, which reinforced my desire to assist him.

Perhaps this may help you, too!

4 Responses to “#iPhoneography Tips & Tricks”

  1. jvlivs said

    Reading this, as much as I love my iPhone, I am seriously leaning towards the Note 4/Note Edge. The reasons being that its camera is stupendous, and, unlike the iPhone, you can add more memory to it. But it’s like comparing apples to oranges (no pun intended): just as the iPhone has its issues, likewise with the Note/Note Edge. With that in mind, it’s all about preference. Now, the things I love about the current iPhone are 1) the camera, and 2) that both the 6 and the 6 Plus function like the iPad. If you wish to use it sideways, you can do that now, even though Apple might be a bit late adding that feature, seeing that Samsung had pretty much surpassed them on many features. But the cameras on both are very impressive. They both have a nice, crisp, vivid, and clear presentation. There was some debate in recent years as to whether or not iPhones/smartphones would eventually surpass the DSLR. I, for one, feared that they would, but after a while I was convinced otherwise. Despite some conveniences that the iPhone/smartphone has, it has its limits.

    One major thing that has really taken the guesswork out of the iOS camera is the “trigger” in the camera app. I have the 4S, and it had iOS 5 when I first bought it, and I’m at iOS 8.1 now. Up until iOS 7 you couldn’t hold the trigger to fire off some quick shots (something Canon made very popular in recent years), and even though I don’t use that feature often, it always makes a good backup in case your DSLR wants to act choppy. Mine does that often – especially in the middle of some really good shots.

    I’ll give credit where it’s due. Both Apple and Samsung have stepped it up with their cameras. I’d be very shocked if – and that’s a BIG if – I can get iOS 9 on a 4S. I’ll be checking out your other two blog posts on the subject.

    • Warm Southern Breeze said

      Among knowledgeable critics, Apple’s sensor is regarded as hands-down the best. Interestingly enough, it’s made by Sony. And as you likely know, Samsung has made many components for the iPhone – even as Apple has pursued civil suits against Samsung for patent infringement, with Apple’s touch-screen technology being perhaps the most well-known example. But more on the camera aspects…

      As you know, the modern camera – whether in a smartphone, a point-and-shoot, a DSLR, or even high-end medium format – is software-driven, and that’s perhaps the easiest part. The other two parts are optics and hardware. Since optics are virtually down pat, the next “thing” is the hardware, and that’s where Apple is leading the way. In the iPhone 6 and 6 Plus, the CMOS sensor’s tiny photosites each sit beneath a color filter, “tuned” to be particularly sensitive to one of the THREE additive colors – Red, Green, or Blue – used for electronic display. Here’s what chipworks.com wrote about the sensor: “Fabricated by Sony, the iSight camera chip is a stacked (Exmor RS), back-illuminated CMOS image sensor (CIS) featuring 1.5 µm generation pixels (introduced for the iPhone 5s). The die size is 4.8 mm x 6.1 mm (29.3 mm2).” And “…the iPhone 6 Plus FaceTime camera is a stacked Sony CIS…”
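
      As a quick sanity check on those quoted figures, here is a minimal back-of-the-envelope sketch in Python. (The 8 MP / 3264 x 2448 resolution is Apple’s published iSight spec, assumed here; it isn’t part of the Chipworks quote.)

      # Back-of-the-envelope check of the quoted Chipworks figures.
      die_w_mm, die_h_mm = 6.1, 4.8      # quoted die size
      pixel_pitch_um = 1.5               # quoted pixel pitch

      print(f"die area: {die_w_mm * die_h_mm:.1f} mm^2")   # ~29.3 mm^2, as quoted

      # Assumption: 8 MP iSight resolution of 3264 x 2448 (Apple's spec sheet).
      pixels_w, pixels_h = 3264, 2448
      array_w_mm = pixels_w * pixel_pitch_um / 1000.0
      array_h_mm = pixels_h * pixel_pitch_um / 1000.0
      print(f"active pixel array: {array_w_mm:.1f} x {array_h_mm:.1f} mm")
      # ~4.9 x 3.7 mm -- it fits on the 6.1 x 4.8 mm die, with the remaining
      # silicon available for the stacked readout circuitry described above.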

      As you and other photographers may understand (particularly “old-school” photogs like myself), the modern electronic pixel size correlates to grain size in standard B&W photography, in which silver halide particles were the photosensitive compound, embedded in a gelatin emulsion upon a nitrate – then later plastic – base. The larger the silver grains, the more light-sensitive the film. But there was a trade-off, which was granularity. The images couldn’t be significantly enlarged without losing some, or a significant portion, of their detail – whereas with “slower,” or less light-sensitive, films, the fine grain permitted significant print enlargement, albeit with the trade-off of needing significant lighting. Thus, those films were the ones most often used for studio work, whereas the “faster” films were used predominantly for action or sports events, some of which were held in evening hours or in poorly lit indoor arenas.

      More to the point, however: in the modern pixel-based, software-driven camera, pixel size is a wonderfully analogous example of that principle. The larger the pixels in the sensor, the more light-sensitive the sensor, but the poorer the quality (the resolving power) of the image. The smaller the pixels, the higher the image quality, but the lower the light sensitivity.
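
      To make that trade-off concrete, here is a minimal illustrative sketch (the sensor dimensions and photon flux are assumed round numbers, not Apple’s or Sony’s actual figures):

      # Illustrative only: a fixed sensor area with varying pixel pitch.
      SENSOR_W_MM, SENSOR_H_MM = 4.9, 3.7   # assumed iPhone-class active area
      PHOTON_FLUX = 1000.0                  # hypothetical photons per um^2 per exposure

      for pitch_um in (1.12, 1.5, 2.0):
          photons_per_pixel = PHOTON_FLUX * pitch_um ** 2
          megapixels = (SENSOR_W_MM * 1000 / pitch_um) * (SENSOR_H_MM * 1000 / pitch_um) / 1e6
          print(f"{pitch_um} um pitch: ~{megapixels:.1f} MP, ~{photons_per_pixel:.0f} photons/pixel")

      # Larger pixels gather more photons apiece (stronger low-light signal),
      # but fewer of them fit on the same silicon (less resolving power) -- and vice versa.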

      How to fix that seemingly insurmountable obstacle of pixel size/image quality and light sensitivity?

      Use small pixels, each specifically “tuned” to be more sensitive to a particular frequency of light, and stack the supporting circuitry beneath them. As AppleInsider writes of Sony’s stacked IMX230 sensor, at “5,344 pixels-by-4,016 pixels, the IMX230 has 21 effective megapixels, allowing the sensor to push out 4K video,” and “by stacking circuitry below the sensor’s backside illuminated pixel array, Sony was able to squeeze in advanced phase detection signal processing hardware to parse data from 192 available AF reference points (Focus Pixels).”
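
      A quick check of the arithmetic in that quote (a small sketch; taking “4K” to mean the common UHD frame of 3840 x 2160 pixels, an assumption not stated in the quote):

      # Verify the AppleInsider figures for the Sony IMX230 quoted above.
      sensor_w_px, sensor_h_px = 5344, 4016
      print(f"{sensor_w_px * sensor_h_px / 1e6:.1f} effective megapixels")  # ~21.5 MP, quoted as "21"

      # Assumption: "4K" = UHD, i.e. 3840 x 2160 pixels per video frame.
      uhd_w, uhd_h = 3840, 2160
      print(sensor_w_px >= uhd_w and sensor_h_px >= uhd_h)  # True: the array is large
      # enough to read out a full 4K frame without upscaling.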

      So, yeah… manufacturers are focusing (pun intended) upon hardware.

  2. jvlivs said

    Wow. It’s kinda like how the auto industry has been run all these many decades… Different product, same comparison.

  3. […] I have earlier written about that in an entry dated Monday, March 16, 2015 which is entitled “#iPhoneography Tips & Tricks.” […]
