Apple A13 Chip: AI and AR all in one chip.

The A13 is the latest in a series of chips that has been the driving force behind Apple products for many years now. If you have read any of my other content, you know that we are passionate about AI and AR, and Apple looks to feel the same, with the chipset set to heavily support these technologies.

Check out some of the key points below from Inverse.com.



A13 Chip: Artificial Intelligence-Powered Image Editing


Last year, Apple highlighted the “Neural Engine” that comes built into the 7-nanometer architecture of its A12 Bionic processor. This enables iPhone photographers to adjust the background blur of Portrait Mode images — an effect known as bokeh — after an image has already been taken. We could see improvements to these capabilities this year.


Apple’s neural engine advancements essentially build a machine learning model straight into iPhones to accomplish tasks, like editing background blur after a photo has been taken. Before the A12 Bionic, users would have needed to send their images to Apple’s servers, edit them there, and get them back on their phone. That’s slower and requires an internet connection, both of which are cumbersome for one simple photo tweak.
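To make that concrete, here is a minimal Swift sketch of what editing the blur on the handset can look like, assuming a Portrait Mode photo at a hypothetical photoURL that embeds a disparity (depth) map. The "reblur" helper and its parameters are illustrative; the ImageIO, AVFoundation, and Core Image calls are standard Apple APIs.

```swift
import AVFoundation
import CoreImage
import ImageIO

// A sketch of post-capture blur editing that runs entirely on the device,
// assuming `photoURL` points to a Portrait Mode image carrying a disparity map.
func reblur(photoURL: URL, radius: Double) -> CIImage? {
    guard let source = CGImageSourceCreateWithURL(photoURL as CFURL, nil),
          // Portrait shots store a depth/disparity map alongside the pixels.
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo),
          let photo = CIImage(contentsOf: photoURL),
          let blur = CIFilter(name: "CIMaskedVariableBlur") else {
        return nil
    }

    // Use the depth map as a mask: the farther away a pixel is, the more blur
    // it receives. `radius` stands in for the slider the user drags in the editor.
    let mask = CIImage(cvPixelBuffer: depthData.depthDataMap)
    blur.setValue(photo, forKey: kCIInputImageKey)
    blur.setValue(mask, forKey: "inputMask")
    blur.setValue(radius, forKey: kCIInputRadiusKey)
    return blur.outputImage
}
```

In practice the depth map is lower resolution than the photo and would need scaling and smoothing before being used as a mask, but the takeaway is the same: the entire edit happens on the phone's own silicon, with nothing sent to a server.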



Now users can make Photoshop-like edits right from the iOS Photos app, and this year could see further upgrades to what they’re able to edit. Users might be able to retroactively edit the field of view in non-Portrait Mode pictures, giving iPhone photographers even more powerful editing tools.


A13 Chip: A.I. Improvements to Siri

An improvement to the A13’s neural engine could also result in serious Siri upgrades. Currently, Apple’s voice assistant needs to route voice commands through Apple’s machine learning algorithms, housed in its servers. But figuring out a way to house those A.I. models inside the phone would be a Siri game changer.
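As a rough illustration of what housing the model inside the phone means, here is a minimal Swift sketch using the Speech framework's on-device recognition mode introduced with iOS 13. The function name and audio file are hypothetical; the recognizer calls and the requiresOnDeviceRecognition flag are part of Apple's public API.

```swift
import Speech

// A sketch of speech recognition that never leaves the device
// (speech authorization and microphone permission requests omitted for brevity).
func transcribeOffline(audioFileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is not available on this hardware/OS")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
    request.requiresOnDeviceRecognition = true  // keep the audio off Apple's servers

    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result = result, result.isFinal {
            print("Heard: \(result.bestTranscription.formattedString)")
        }
    }
}
```

Siri's full command handling is obviously far more involved, but the principle is the same: once the model lives on the chip, a request can be understood with no internet connection at all.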


The voice assistant could, say, set a timer, turn on your flashlight, set a reminder, and open apps while an iPhone is offline or in Airplane Mode. A previous Apple patent described this capability, and the company poached Google’s former chief of search and artificial intelligence, John Giannandrea, early in 2018 to boost its A.I. initiative. That executive could have the secret sauce to improve Siri in this way.



Google recently debuted the “Next-Generation Google Assistant” during its annual I/O developers conference on May 7. This update compressed a 100GB machine learning model into just half a gigabyte so it can fit on its future Pixel phones. It’s possible that Giannandrea is trying to recreate this kind of project at Apple, as the patent would suggest.


Now the question is whether the company will be able to pull it off by September or whether an improved Siri will need to wait until 2020.


A13 Chip: Augmented Reality Levels Up


It’s no secret that Apple has made AR core to its iPhone strategy. Last year, it touted that its A12 Bionic chip can detect flat surfaces in images to better overlay virtual objects. This lays the groundwork for future AR apps and games using its ARKit 2.0 development tools and, eventually, its AR glasses, which are expected to be an iPhone accessory.
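For a sense of what that flat-surface detection looks like from a developer's side, here is a minimal Swift sketch using ARKit's standard world-tracking API. The view controller is illustrative, but the configuration and the delegate callback are the ones ARKit actually exposes.

```swift
import UIKit
import ARKit
import SceneKit

// A sketch that asks the chip to find flat surfaces and reports each one it detects.
class PlaneViewer: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Plane detection is the groundwork for anchoring virtual objects in the real world.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(config)
    }

    // Called each time the tracking pipeline recognises a new flat surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Detected a plane roughly \(plane.extent.x) m by \(plane.extent.z) m")
    }
}
```

The heavy lifting here, feature tracking and surface estimation, runs on the A-series chip itself, which is why Apple keeps tying AR performance to each new processor.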


Apple employees have been quietly meeting with AR firms and filing a flurry of AR headset patents that describe how they could work. Plus, a third lens on the high-end iPhone will give the processor more data to work with, which could lead to better holograms.



AR was a big focus for Apple in 2018, and with its headset predicted to launch as early as 2020, it will more than likely remain that way this year. Check out the article below on the A13 chip and its capabilities.


