INSTAGRAM UPDATE – PORTRAIT MODE
Instagram is one-upping Apple with a portrait mode feature that runs on a wider variety of phones and works with video, not just photos. Instagram is rolling out Focus, which blurs the background while keeping someone’s face sharp for a stylized, professional photography look.
Focus can be found in the Instagram Stories format options alongside Boomerang and Superzoom in both the selfie and rear-facing cameras, and it rolls out globally today on iPhone 6s, 6s+, 7, 7+, 8, 8+ and X, as well as select Android devices. That’s compared to Apple’s portrait mode that only works on the iPhone 7+, 8+, and X, and Android portrait mode that exists on the Pixel 2 and Pixel 2 XL. Instagram’s launch could also suck attention away from “fake portrait mode” apps like Magic Portrait Mode, FabFocus, LightX and Point Blur that can also add blurry background “bokeh effects” to images.
The iPhone’s portrait mode, which takes advantage of newer models’ dual cameras, does a slightly better job of keeping the whole face in focus, but it’s not available on older iPhones and can’t do video.
Focus perhaps gives people another reason to choose Instagram over Snapchat, and could make shooting inside the Instagram app more appealing. After eight years of sunsets and latte art, it’s the selfies and portraits that still feel fresh on Instagram. Making them look as good as possible could keep Instagram from growing stale as it rockets toward 1 billion users.
Meanwhile, Instagram is starting to roll out Mentions stickers that make it easy to tag friends in a Story with a stylized graphic instead of just text. Instagram tested these last month, but now they’re becoming available to all iOS users. Just like adding emoji to photos and videos, you can select the Mention sticker, use the typeahead to find a friend’s username and tag them in a resizable sticker. That lets people tap through to view their profile, and generates a notification to the tagged user.
Instagram has had text mentions since November 2016, soon after it launched Stories, but Snapchat only added them last month. Mentions could make it easier for creators on both apps to collaborate and cross-promote each other, or encourage fans to spread their name to friends.
As Facebook endures unending scandals, Instagram has remained relatively unscathed by the backlash. Without links and resharing, it’s immune to a lot of the fake news and politics that have made Facebook exhausting. Instagram seems to see rapid feature development as the best distraction.
TWITTER’S LATEST PHOTO UPDATE
Twitter’s latest behind-the-scenes photo upgrade is significant and quite brilliant, but many users won’t even notice it.
In a recent post on the Twitter Engineering blog, researchers Lucas Theis and Zehan Wang detail how neural networks are now being used to dramatically improve the way images are presented on Twitter’s web and mobile platforms.
It’s certainly no Instagram, but Twitter users upload millions of images to the social networking service each day and, unlike the mostly square pictures posted on Instagram, Twitter images tend to come in all manner of shapes and sizes.
This presents a problem for the platform’s interface designers, as the Twitter timeline displays most images as horizontally cropped previews which the user can then tap to view full-size. These previews are rarely the same shape as the original images, and therefore only a small section can be displayed.
But Twitter doesn’t just crop out a rectangle from the middle of each image. As you might imagine this would often result in the main subject of the picture, such as a face, being ‘cropped out’ of the smaller image, rendering the preview somewhat useless.
You’ve probably never given it a second thought, but algorithms have been in place for some time which attempt to prevent this from happening by using facial detection to help determine which section of the original image should appear in the preview.
This should ensure that if there’s a face in the original image, it won’t be cut off in the preview.
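The core of this face-aware approach is simple: centre the preview crop on the detected face, then clamp the crop so it stays within the image. Twitter hasn’t published this exact routine, so the following is only a minimal sketch of the idea, with a hypothetical `crop_around_face` helper that takes a face bounding box from any detector:

```python
def crop_around_face(img_w, img_h, face, crop_w, crop_h):
    """Pick the top-left corner of a crop_w x crop_h preview so that the
    detected face (x, y, w, h) sits as close to the centre as possible,
    while the crop stays inside the image bounds.

    This is an illustrative sketch, not Twitter's actual implementation.
    """
    fx, fy, fw, fh = face
    # Centre the crop on the centre of the face...
    cx = fx + fw // 2 - crop_w // 2
    cy = fy + fh // 2 - crop_h // 2
    # ...then clamp so the crop never extends past the image edges.
    cx = max(0, min(cx, img_w - crop_w))
    cy = max(0, min(cy, img_h - crop_h))
    return cx, cy
```

For example, a face near the right edge of a 1000×800 image still ends up inside a 400×200 preview, because the clamp pushes the window back into frame rather than letting it slide off the image.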
Useful as it is, this technique doesn’t help with pictures of non-human faces such as cats, or the limitless kinds of interesting objects that may appear outside the middle section of the original image. The face detection algorithm also sometimes failed to spot faces in images or, as humans sometimes do, detected faces where none were present.
To improve on this method, Twitter has implemented a new and more sophisticated system which attempts to guess automatically which part of an image people will be most interested in viewing and then makes that into the image preview.
The new method uses the concept of ‘saliency’ to identify these most interesting regions of an image. Those most likely to be looked at are considered to have high saliency and are usually those featuring faces, text, animals and other more abstract concepts such as areas of high contrast.
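In Twitter’s system the saliency map comes from a neural network trained on eye-tracking data; the cropping step then just finds the window with the highest total saliency. As a toy illustration of that sliding-window step, the sketch below stands in a crude contrast-based saliency proxy (every name here is hypothetical, and the real model is far more sophisticated):

```python
def contrast_saliency(img):
    """Toy saliency map: each pixel's absolute deviation from the mean
    intensity. A stand-in for Twitter's learned neural saliency model."""
    h, w = len(img), len(img[0])
    mean = sum(sum(row) for row in img) / (h * w)
    return [[abs(p - mean) for p in row] for row in img]

def best_crop_x(img, crop_w):
    """Slide a crop_w-wide window across the image and return the x
    offset whose window contains the highest total saliency."""
    sal = contrast_saliency(img)
    w = len(img[0])
    # Total saliency in each column, so window scores are cheap to sum.
    col_sums = [sum(row[x] for row in sal) for x in range(w)]
    best_x, best_score = 0, float("-inf")
    for x in range(w - crop_w + 1):
        score = sum(col_sums[x:x + crop_w])
        if score > best_score:
            best_x, best_score = x, score
    return best_x
```

Given a mostly flat image with a bright region off to one side, the chosen window shifts toward that region rather than defaulting to the centre, which is exactly the behaviour the new previews aim for.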
WHAT DO YOU THINK OF THESE PHOTO UPDATES? WHAT WILL THEY COME UP WITH NEXT? AS FAR AS WE’RE CONCERNED, BRING IT ON!