Thursday, 13 September 2018

JBL’s smart display combines Google smarts with good sound

If you’re looking for a smart display that’s powered by the Google Assistant, you now have two choices: the Lenovo Smart Display and the JBL Link View. Lenovo was first out of the gate with its surprisingly stylish gadget, but it also left room for improvement. JBL, given its heritage as an audio company, is putting the emphasis on sound quality, with stereo speakers and a surprising amount of bass.

In terms of the overall design, the Link View isn’t going to win any prizes, but its pill shape definitely isn’t ugly either. JBL makes the Link View in any color you like, as long as that’s black. It’ll likely fit in with your home decor, though.

The Link View has an 8-inch high-definition touchscreen that is more than crisp enough for the maps, photos and YouTube videos you’ll play on it. In my two weeks with it, the screen turned out to be a bit of a fingerprint magnet, but you’d expect that, given that I put it on the kitchen counter and regularly used it to entertain myself while waiting for the water to boil.

At the end of the day, you’re not going to spend $250 just to get a nice speaker with a built-in tablet. What matters most here is whether the visual side of the Google Assistant works for you. I find that it adds an extra dimension to the audio responses, whether that’s a weather report, a map of my daily commute (which can change depending on traffic) or a video news report. Google’s interface for these devices is simple and clear, with large buttons and clearly presented information. And maybe that’s no surprise. These smart displays are the ideal surface for its Material Design language, after all.

As a demo, Google likes to talk about how these gadgets can help you while cooking, with step-by-step recipes and videos. It is a nice demo indeed, and I thought it would help me get a bit more creative about trying new recipes. In reality, though, I never have the ingredients I need to cook what Google suggests. If you are a better meal planner than I am, your mileage will likely vary.

What I find surprisingly useful is the display’s integration of Google Duo. I’m aware that the Allo/Duo combo is a bit of a flop, but the display does make you want to use Duo because you can easily have a video chat while just doing your thing in the kitchen. If you set up multiple users, the display can even receive calls for all of them. And don’t worry, there is a physical slider you can use to shut down the camera whenever you want.

The Link View also made me appreciate Google’s Assistant routines more (and my colleague Lucas Matney found the same when he tried out the Lenovo Smart Display). And it’s just a bit easier to look at the weather graphics instead of having the Assistant rattle off the temperature for the next couple of days.

Maybe the biggest letdown, though (and this isn’t JBL’s fault but a feature Google needs to enable), is that you can’t add a smart display to your Google Assistant speaker groups. That means you can’t use it as part of your whole-house Google Home audio system, for example. It’s an odd omission for sure, given the Link View’s focus on sound, but my understanding is that the same holds true for the Lenovo Smart Display. If this is a deal breaker for you, then I’d hold off on buying a Google Assistant smart display for the time being.

You can, however, use the display as a Chromecast receiver to play music from your phone or watch videos. While you are not using it, the display can show the current time or simply go blank.
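Since the display behaves as a standard cast target, you can also drive it from code. Here’s a minimal sketch using the third-party pychromecast Python library; the device name and stream URL are placeholders I made up, not anything specific to the Link View.

```python
# Minimal sketch: cast an audio stream to a Google smart display over the
# local network, using the third-party pychromecast library
# (pip install pychromecast). Device name and URL are placeholders.
import pychromecast

# Discover the display by its friendly name (hypothetical name here).
chromecasts, browser = pychromecast.get_listed_chromecasts(
    friendly_names=["Kitchen display"])
if not chromecasts:
    raise SystemExit("No matching cast device found")

cast = chromecasts[0]
cast.wait()  # block until the connection is ready

mc = cast.media_controller
mc.play_media("http://example.com/stream.mp3", "audio/mp3")  # placeholder URL
mc.block_until_active()

browser.stop_discovery()  # clean up the discovery browser when done
```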

Another thing that doesn’t work on smart displays yet is Google’s “Continued Conversation” feature, which lets you add a follow-up command without having to say “OK, Google” again. For now, the smart displays only work in English, too.

When I first heard about these smart displays, I wasn’t sure if they were going to be useful. Turns out, they are. I do live in the Google Assistant ecosystem, though, and I’ve got a few Google Homes set up around my house. If you’re looking to expand your Assistant setup, then the Link View is a nice addition — and if you’re just getting started (or only need one Assistant-enabled speaker/display), then opting for a smart display over a smart speaker may just be the way to go, assuming you can stomach the extra cost.



from Android – TechCrunch https://ift.tt/2xamwO3
via IFTTT

Indonesian fintech startup Moka raises $24M led by Sequoia India

Indonesia’s Moka, a startup that helps SMEs and retailers manage payment and other business operations, has pulled in a $24 million Series B round for growth.

The investment is led by Sequoia India and Southeast Asia — which recently announced a new $695 million fund — with participation from new backers SoftBank Ventures Korea, EDBI — the corporate investment arm of Singapore’s Economic Development Board — and EV Growth, the later-stage fund from Moka seed investor East Ventures. Existing investors Mandiri Capital, Convergence and Fenox also put money into the round.

The deal takes Moka to $27.9 million raised to date, according to data from Crunchbase.

Moka was started four years ago primarily as a point-of-sale (POS) terminal with some basic business functionality. Today, it claims to work with 12,500 retailers in Indonesia and its services include sales reports, inventory management, table management, loyalty programs, and more. Its primary areas of focus are retailers in the F&B, apparel and services industries. It charges upwards of IDR 249,000 ($17) per month for its basic service and claims to be close to $1 billion in annual transaction volume from its retail partners.

That’s the company’s core offering, a mobile app that turns any Android or iOS device into a point-of-sale terminal, but CEO and co-founder Haryanto Tanjo — who started the firm with CTO Grady Laksmono — said it harbors larger goals.

“Our vision is to be a platform, we want to be an ecosystem,” he told TechCrunch in an interview.

That’s where much of this new capital will be invested.

Tanjo said the company is opening its platform up to third-party providers, who can use it to reach merchants with services such as accounting, payroll, HR and more. The focus is initially on local services that cater to SMEs in Indonesia, but as Moka targets larger enterprises as clients, he said that it will integrate larger, global solutions, too.

Moka offers services beyond point-of-sale, but the core offering is turning any smart device into a cash register

Moka itself is expanding its capabilities on the payment side.

Indonesia, the world’s fourth most populous country and Southeast Asia’s largest economy, is in the midst of a fintech revolution, with numerous companies pioneering mobile wallet services aimed at ending the country’s fixation on cash-based transactions. That means there is a plethora of options available today. Tanjo said Moka is working to support them all in order to help its merchants grow their businesses and make life easier for consumers.

“There are so many wallets here in Indonesia,” he said. “There are more than 10 right now and maybe in the next few months there’ll be 15-20. We want to be the platform that works with all of them.”

Already it works with the likes of OVO, T-Cash and Akulaku, and e-wallets including DANA and Kredivo. The startup is also working in another area of fintech: loans.

As an extension of its platform, Moka has tied up with SME loan companies that can reach its merchants through the platform. With a merchant’s consent, Moka can share business data, including revenue and profit, to help assess a loan application. That matters because the lending process is particularly challenging in Southeast Asia, where few organized credit-checking facilities exist, so it makes sense for Moka, which has built its business around encouraging business growth and management, to use the data it already holds to help its partners.
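To make that concrete, here is a hypothetical sketch of the kind of signal a lender might derive from POS records. The record format and the scoring rule are invented for illustration; they are not Moka’s actual data model or API.

```python
# Hypothetical sketch: derive a crude growth signal from point-of-sale
# records for loan pre-screening. Field names and the scoring rule are
# invented for illustration, not Moka's real data model.
from collections import defaultdict
from datetime import date

def monthly_revenue(transactions):
    """Sum transaction amounts per (year, month), oldest first."""
    totals = defaultdict(float)
    for t in transactions:
        totals[(t["date"].year, t["date"].month)] += t["amount"]
    return [v for _, v in sorted(totals.items())]

def revenue_trend(transactions):
    """Ratio of last month's revenue to the average of prior months."""
    months = monthly_revenue(transactions)
    if len(months) < 2:
        return None  # not enough history to score
    prior_avg = sum(months[:-1]) / (len(months) - 1)
    return months[-1] / prior_avg if prior_avg else None

txns = [  # amounts in IDR
    {"date": date(2018, 6, 3), "amount": 1_200_000},
    {"date": date(2018, 7, 9), "amount": 1_500_000},
    {"date": date(2018, 8, 21), "amount": 1_900_000},
]
print(revenue_trend(txns))  # ~1.41; above 1.0 suggests growing revenue
```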

Tanjo said the company takes an undisclosed cut of the loan in cases where it has successfully connected the two parties. He said that he doesn’t expect that to initially become a major revenue stream, but over time he anticipates it will help its customer base grow and become a more important source of income for the startup.

Sequoia India has some experience with POS startups, having backed Pine Labs in India, which recently landed a big $125 million round from PayPal and Singapore sovereign fund Temasek. Still, there are plenty of local players across various markets in Southeast Asia, including StoreHub, which is backed by Temasek subsidiary Vertex Ventures, and Malaysia’s SoftSpace.

While those two competitors have established a presence in multiple markets in Southeast Asia, Tanjo — the Moka CEO — said there are no plans to venture overseas for at least the next 12 months.

“We’re still scratching the surface,” he said. “So it doesn’t make sense to expand too soon.”



from Android – TechCrunch https://ift.tt/2MrZEOS
via IFTTT

Wednesday, 12 September 2018

Security flaw in ‘nearly all’ modern PCs and Macs exposes encrypted data

Most modern computers, even devices with disk encryption, are vulnerable to a new attack that can steal sensitive data in a matter of minutes, new research says.

In new findings published Wednesday, F-Secure said that the existing firmware security measures in every laptop it tested fail to “do a good enough job” of preventing data theft.

F-Secure principal security consultant Olle Segerdahl told TechCrunch that the vulnerabilities put “nearly all” laptops and desktops — both Windows and Mac users — at risk.

The new exploit is built on the foundations of a traditional cold boot attack, which hackers have long used to steal data from a shut-down computer. Modern computers overwrite their memory when a device is powered down so the data can’t be read. But Segerdahl and his colleague Pasi Saarinen found a way to disable the overwriting process, making a cold boot attack possible again.

“It takes some extra steps,” said Segerdahl, but the flaw is “easy to exploit.” So much so, he said, that it would “very much surprise” him if this technique isn’t already known by some hacker groups.

“We are convinced that anybody tasked with stealing data off laptops would have already come to the same conclusions as us,” he said.

It’s no secret that if you have physical access to a computer, the chances of someone stealing your data are usually greater. That’s why so many use disk encryption — like BitLocker for Windows and FileVault for Macs — to scramble and protect data when a device is turned off.

But the researchers found that in nearly all cases they can still steal data protected by BitLocker and FileVault regardless.

After the researchers figured out how the memory overwriting process works, they said it took just a few hours to build a proof-of-concept tool that prevented the firmware from clearing secrets from memory. From there, the researchers scanned for disk encryption keys, which, when obtained, could be used to mount the protected volume.
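To give a feel for that scanning step, here is a simplified sketch that flags high-entropy regions in a raw memory image. Real cold boot tools go further and verify candidate AES key schedules; this version only measures randomness, so it will produce false positives. The file path is a placeholder.

```python
# Simplified sketch of scanning a memory dump for likely key material.
# Real tools verify AES round-key schedules; this only flags high-entropy
# 32-byte windows, so expect false positives.
import math
from collections import Counter

def shannon_entropy(block: bytes) -> float:
    """Bits of entropy per byte over the block."""
    counts = Counter(block)
    n = len(block)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def find_candidates(image: bytes, window=32, step=16, threshold=4.5):
    """Yield offsets whose contents look random enough to be key material."""
    for off in range(0, len(image) - window, step):
        block = image[off:off + window]
        if block.count(0) > window // 2:
            continue  # skip regions the firmware managed to zero out
        if shannon_entropy(block) >= threshold:
            yield off

with open("memdump.bin", "rb") as f:  # placeholder path
    image = f.read()

for off in find_candidates(image):
    print(f"possible key material at offset {off:#x}")
```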

It’s not just disk encryption keys at risk, Segerdahl said. A successful attacker can steal “anything that happens to be in memory,” like passwords and corporate network credentials, which can lead to a deeper compromise.

Their findings were shared with Microsoft, Apple and Intel prior to release. According to the researchers, only a smattering of devices aren’t affected by the attack. Microsoft said in a recently updated article on BitLocker countermeasures that using a startup PIN can mitigate cold boot attacks, but Windows users with “Home” licenses are out of luck. And any Mac equipped with Apple’s T2 chip is not affected, though a firmware password would still improve protection.

Both Microsoft and Apple downplayed the risk.

Acknowledging that an attacker needs physical access to a device, Microsoft said it encourages customers to “practice good security habits, including preventing unauthorized physical access to their device.” Apple said it was looking into measures to protect Macs that don’t come with the T2 chip.

When reached, Intel would not comment on the record.

In any case, the researchers say, there’s not much hope that affected computer makers can fix their fleet of existing devices.

“Unfortunately, there is nothing Microsoft can do, since we are using flaws in PC hardware vendors’ firmware,” said Segerdahl. “Intel can only do so much, their position in the ecosystem is providing a reference platform for the vendors to extend and build their new models on.”

Companies, and users, are “on their own,” said Segerdahl.

“Planning for these events is a better practice than assuming devices cannot be physically compromised by hackers because that’s obviously not the case,” he said.



from Apple – TechCrunch https://ift.tt/2Mt1XBv

Apple’s Watch isn’t the first with an EKG reader but it will matter to more consumers

Apple’s COO Jeff Williams exuberantly proclaimed Apple’s Watch was the first to get FDA clearance as an over-the-counter electrocardiogram (EKG) reader during the special event at Apple headquarters on Wednesday. While Apple loves to be first to things, that statement is false.

AliveCor has held the title of first since late last year for its KardiaMobile device, a $100 stick-like metal unit you attach to the back of a smartphone. Ironically, it also received FDA clearance for the KardiaBand, an ECG reader designed to integrate with the Apple Watch and sold in Apple stores. And just this week, the FDA gave the go-ahead for AliveCor’s technology to screen for blood diseases, sans blood test.

However, the Apple Watch could be the first to matter to a wider range of consumers. For one, Apple holds a firm 17 percent of the world’s wearables market, with an estimated shipment volume of 28 million units in 2018 alone. While we don’t know how many AliveCor KardiaBand and KardiaMobile units were sold, the figure is very unlikely to be anywhere near those numbers.

For another thing, a lot of people, even those who suspect they have a heart condition, might hesitate to buy a separate device just to check. Automatic integration makes it easy for the curious to start monitoring without purchasing any extra equipment. Also, while heart disease is the number one killer in the U.S. and a leading cause of death worldwide, most of us probably aren’t thinking about our heart rhythm on a daily basis. Building an EKG reader straight into the Watch makes monitoring seamless and could take away the fear some may have about finding out how their heart is doing.

Then there’s the Apple brand itself. Many hospitals are now partnering with Apple to use iPads, and it’s reasonable to think there could be some collaboration with the Watch.

“Doctors, hospital systems, health insurers, and self-insured employers don’t want to manage separate partnerships with each of Apple, Xiaomi, Fitbit, Huawei, Garmin, Polar, Samsung, Fossil, and every other wearable manufacturer. They need a cross-platform product that works for all of their patients,” Cardiogram founder and EKG researcher Brandon Ballinger told TechCrunch. “So if Apple becomes the Apple of healthcare, then a company like Cardiogram or AliveCor can become the Microsofts of this space.”

How does this announcement from Apple affect AliveCor? CEO Vic Gundotra shrugs it off. He tells TechCrunch the vast majority of AliveCor’s business comes from KardiaMobile, not its Apple-integrated ECG reader. “Apple has long alluded they were building something like this into the device,” Gundotra said, “so we’ve been anticipating it.”



from Apple – TechCrunch https://ift.tt/2MrzsUr

The 7 most egregious fibs Apple told about the iPhone XS camera today

Apple always drops a few whoppers at its events, and the iPhone XS announcement today was no exception. And nowhere were they more blatant than in the introduction of the devices’ “new” camera features. No one doubts that iPhones take great pictures, so why bother lying about it? My guess is they can’t help themselves.

Now, to fill this article out I had to get a bit pedantic, but honestly, some of these are pretty egregious.

“The world’s most popular camera”

There are a lot of iPhones out there, to be sure. But defining the iPhone as some sort of decade-long continuous camera, which Apple seems to be doing, is sort of a disingenuous way to do it. By that standard, Samsung would almost certainly be ahead, since it would be allowed to count all its Galaxy phones going back a decade as well, and they’ve definitely outsold Apple in that time. Going further, if you were to say that a basic off-the-shelf camera stack and common Sony or Samsung sensor was a “camera,” iPhone would probably be outnumbered 10:1 by Android phones.

Is the iPhone one of the world’s most popular cameras? To be sure. Is it the world’s most popular camera? You’d have to slice it pretty thin and say that this or that year and this or that model was more numerous than any other single model. The point is this is a very squishy metric and one many could lay claim to depending on how they pick or interpret the numbers. As usual, Apple didn’t show their work here, so we may as well coin a term and call this an educated bluff.

“Remarkable new dual camera system”

As Phil would explain later, a lot of the newness comes from improvements to the sensor and image processor. But since he said the system was new while backed by an exploded view of the camera hardware, we may take him to be referring to the hardware as well.

It’s not actually clear what in the hardware is different from the iPhone X. Certainly if you look at the specs, they’re nearly identical.

If I said these were different cameras, would you believe me? Same F numbers, no reason to think the image stabilization is different or better, and so on. It would not be unreasonable to guess that these are, as far as optics, the same cameras as before. Again, not that there was anything wrong with them — they’re fabulous optics. But showing components that are in fact the same and saying it’s different is misleading.

Given Apple’s style, if there were any actual changes to the lenses or OIS, they’d have said something. It’s not trivial to improve those things and they’d take credit if they had done so.

The sensor of course is extremely important, and it is improved: the 1.4-micrometer pixel pitch on the wide-angle main camera is larger than the 1.22-micrometer pitch on the X. Since the megapixels are similar we can probably surmise that the “larger” sensor is a consequence of this different pixel pitch, not any kind of real form factor change. It’s certainly larger, but the wider pixel pitch, which helps with sensitivity, is what’s actually improved, and the increased dimensions are just a consequence of that.
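The arithmetic backs this up. Assuming both sensors are roughly 12 megapixels in a 4032 × 3024 array (a reasonable guess, not a confirmed spec), the die dimensions fall straight out of the pixel pitch:

```python
# Back-of-envelope check: sensor dimensions implied by pixel pitch,
# assuming a 4032 x 3024 (12 MP) array for both generations. The array
# size is an assumption, not a confirmed spec.
PIXELS_W, PIXELS_H = 4032, 3024

for name, pitch_um in [("iPhone X", 1.22), ("iPhone XS", 1.4)]:
    w_mm = PIXELS_W * pitch_um / 1000  # microns -> millimeters
    h_mm = PIXELS_H * pitch_um / 1000
    print(f"{name}: {w_mm:.2f} x {h_mm:.2f} mm")

# iPhone X:  4.92 x 3.69 mm
# iPhone XS: 5.64 x 4.23 mm  (about 32% more area, from pitch alone)
```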

We’ll look at the image processor claims below.

“2x faster sensor… for better image quality”

It’s not really clear what he means here, beyond “to take advantage of all this technology.” Is it the readout rate? Is it the processor that’s faster, since that’s what would probably produce better image quality (more horsepower to calculate colors, encode better, and so on)? “Fast” also refers to light-gathering — is that faster?

I don’t think it’s accidental that this was just sort of thrown out there and not specified. Apple likes big, simple numbers and doesn’t want to play the spec game the same way the others do. But this, in my opinion, crosses the line from simplifying to misleading. This, at least, Apple or some detailed third-party testing can clear up.

“What it does that is entirely new is connect together the ISP with that neural engine, to use them together.”

Now, this was a bit of sleight of hand on Phil’s part. Presumably what’s new is that Apple has better integrated the image processing pathway between the traditional image processor, which is doing the workhorse stuff like autofocus and color, and the “neural engine,” which is doing face detection.

It may be new for Apple, but this kind of thing has been standard in many cameras for years. Both phones and interchangeable-lens systems like DSLRs use face and eye detection, some using neural-type models, to guide autofocus or exposure. This (and the problems that come with it) go back years and years. I remember point-and-shoots that had it, but unfortunately failed to detect people who had dark skin or were frowning.

It’s gotten a lot better (Apple’s depth-detecting units probably help a lot), but the idea of tying a face-tracking system, whatever fancy name you call it, into the image-capture process is old hat. It’s probably not “entirely new” even for Apple, let alone the rest of photography.

“We have a brand new feature we call smart HDR.”

Apple’s brand new feature has been on Google’s Pixel phones for a while now. A lot of cameras now keep a frame buffer going, essentially snapping pictures in the background while the app is open, then using the latest one when you hit the button. And Google, among others, had the idea that you could use these unseen pictures as raw material for an HDR shot.

Probably Apple’s method is a little different, but fundamentally it’s the same thing. Again, “brand new” to iPhone users, but well known among Android flagship devices.
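The underlying pattern is simple enough to sketch. Keep a short ring buffer of recent frames, then fuse them when the shutter fires; the toy version below weights each pixel by how well exposed it is. Real pipelines also align the frames and tone-map the result, which this deliberately skips.

```python
# Toy sketch of the buffered-HDR pattern: keep the last few preview frames
# in a ring buffer, then fuse them on shutter press, weighting pixels by
# well-exposedness. No alignment or tone mapping. Frames are HxWx3 float
# arrays in [0, 1].
from collections import deque
import numpy as np

class FrameBuffer:
    def __init__(self, size=4):
        self.frames = deque(maxlen=size)  # oldest frames fall off the back

    def on_new_frame(self, frame):
        self.frames.append(frame)  # called continuously while the app is open

    def capture_hdr(self):
        stack = np.stack(list(self.frames))            # (N, H, W, 3)
        # Well-exposedness: pixels near mid-gray get the highest weight.
        weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
        return (weights * stack).sum(0) / weights.sum(0).clip(1e-6)

buf = FrameBuffer()
for _ in range(4):                     # stand-in for the live preview feed
    buf.on_new_frame(np.random.rand(480, 640, 3))
hdr = buf.capture_hdr()
```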

“This is what you’re not supposed to do, right, shooting a photo into the sun, because you’re gonna blow out the exposure.”

I’m not saying you should shoot directly into the sun, but it’s really not uncommon to include the sun in your shot. In the corner like that it can make for some cool lens flares, for instance. It won’t blow out these days because almost every camera’s auto-exposure algorithms are either center-weighted or intelligently shift around — to find faces, for instance.

When the sun is in your shot, your problem isn’t blown out highlights but a lack of dynamic range caused by a large difference between the exposure needed to capture the sun-lit background and the shadowed foreground. This is, of course, as Phil says, one of the best applications of HDR — a well-bracketed exposure can make sure you have shadow details while also keeping the bright ones.

Funnily enough, in the picture he chose here, the shadow details are mostly lost — you just see a bunch of noise there. You don’t need HDR to get those water droplets — that’s a shutter speed thing, really. It’s still a great shot, by the way, I just don’t think it’s illustrative of what Phil is talking about.

“You can adjust the depth of field… this has not been possible in photography of any type of camera.”

This just isn’t true. You can do this on the Galaxy S9, and it’s being rolled out in Google Photos as well. Lytro was doing something like it years and years ago, if we’re including “any type of camera.” I feel kind of bad that no one told Phil. He’s out here without the facts.
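For what it’s worth, the technique Phil is describing is synthetic refocus: given an aligned depth map, you blur each pixel in proportion to its distance from the chosen focal plane. A bare-bones layered version, under those assumptions:

```python
# Bare-bones synthetic refocus: blur each depth layer in proportion to its
# distance from the chosen focal plane, then composite. Assumes an HxWx3
# float image in [0, 1] and an aligned HxW depth map in [0, 1].
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(image, depth, focal_depth, layers=8, max_sigma=8.0):
    out = np.zeros_like(image)
    total = np.zeros(depth.shape)
    edges = np.linspace(0.0, 1.0001, layers + 1)  # cover depth == 1.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = ((depth >= lo) & (depth < hi)).astype(float)
        sigma = max_sigma * abs((lo + hi) / 2 - focal_depth)
        if sigma < 1e-3:               # the in-focus layer stays sharp
            blurred, soft_mask = image, mask
        else:                          # blur layer and mask so edges blend
            blurred = np.stack([gaussian_filter(image[..., c], sigma)
                                for c in range(3)], axis=-1)
            soft_mask = gaussian_filter(mask, sigma)
        out += blurred * soft_mask[..., None]
        total += soft_mask
    return out / np.maximum(total, 1e-6)[..., None]

# Re-render the same shot "focused" near or far, after the fact.
img = np.random.rand(480, 640, 3)   # stand-in for a captured photo
dep = np.random.rand(480, 640)      # stand-in for its depth map
near, far = refocus(img, dep, 0.1), refocus(img, dep, 0.9)
```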

Well, that’s all the big ones. There were plenty more, shall we say, embellishments at the event, but that’s par for the course at any big company’s launch. I just felt like these ones couldn’t go unanswered. I have nothing against the iPhone camera — I use one myself. But boy are they going wild with these claims. Somebody’s got to say it, since clearly no one inside Apple is.




from iPhone – TechCrunch https://ift.tt/2NCXS1X

Beats did announce something today, after all

Turns out those rumors that Beats wouldn’t have anything to show off during today’s big Apple event weren’t 100 percent true. Granted, there was no mention of the headphone maker during the event itself. Instead, the company sent out a bit of an also-ran press release as things were still unfolding here in Cupertino.

The big reason the brand got no love during today’s event: these aren’t new products, really. Rather, they’re color updates to two of Beats’ existing lines. In fact, the new shades were designed to match Apple’s new hardware. Fittingly, the headphones are priced to match their corresponding handsets.

The $300 over-ear Beats Solo3 Wireless now come in Satin Gold and Satin Silver to match the new colors of the iPhone XS/XS Max, while the $60 urBeats3 earphones are available in Yellow, Blue and Coral to match the cheaper iPhone XR.

The latter also sport a Lightning cable, so you don’t have to futz with the dongle (which is a fun phrase). They also snap together magnetically, so they can be worn around the neck.

The new Beats Solo 3 Wireless colors are shipping now and the urBeats3 are coming later this fall.




from Apple – TechCrunch https://ift.tt/2x5VPt4
