
Google activates the Pixel Visual Core: this is how applications will take better photos on the Pixel 2

At the end of last year we learned that the new Pixel 2 and Pixel 2 XL hid inside them the first coprocessor designed by Google, an image chip called Pixel Visual Core, which, after several months of waiting, is finally being activated so it can be put to work.

Google has announced that, starting today, the Pixel Visual Core image coprocessor will be activated in the Pixel 2 and Pixel 2 XL through this February's security update, which will roll out to users in stages over the next few days.

Pixel Visual Core, the chip that improves the photos we take with any application

Until now, the HDR+ mode of the Pixel 2, which substantially improves photos in tricky scenes that mix very dark areas with brightly lit ones, was only available through the Google Camera application. Now this enhanced imaging mode also reaches third-party applications.

Pixel 2 users who shoot with the camera inside applications such as WhatsApp, Instagram, Snapchat and the like will now get better pictures thanks to the Pixel Visual Core chip, which can process a burst of images five times faster than the main processor to create a single HDR+ shot, while also achieving more detail and greater clarity thanks to RAISR technology.

HDR+ photos will be taken as long as the user leaves the camera at its defaults: no effects enabled, flash off, automatic white balance and no exposure compensation. As soon as the user changes any of these camera settings, the photographic enhancement is disabled.
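Since third-party apps reach the camera through the standard Android Camera2 API, those default conditions map onto ordinary capture-request parameters. The following is only a minimal, hypothetical sketch (the helper function is ours, not Google's code) of a still-capture request left at the defaults the article describes, which, on an updated Pixel 2, should let the Pixel Visual Core apply HDR+ to the shot:

```kotlin
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.view.Surface

// Hypothetical helper: builds a Camera2 still-capture request at the defaults
// described above (no color effect, flash off, auto white balance, zero
// exposure compensation). Changing any of these settings would disable the
// HDR+ enhancement on the Pixel 2.
fun buildDefaultStillRequest(camera: CameraDevice, target: Surface): CaptureRequest {
    val builder = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
    builder.addTarget(target)
    builder.set(CaptureRequest.CONTROL_EFFECT_MODE, CameraMetadata.CONTROL_EFFECT_MODE_OFF)
    builder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF)
    builder.set(CaptureRequest.CONTROL_AWB_MODE, CameraMetadata.CONTROL_AWB_MODE_AUTO)
    builder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 0)
    return builder.build()
}
```

The point is that no Pixel-specific call is involved: an app that already uses Camera2 with these ordinary defaults benefits from the coprocessor transparently once the update arrives.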

Pixel Visual Core: comparison

To show how photos improve with HDR+ on the Pixel Visual Core, Google shares the following examples taken with the Pixel 2 camera:

The improvement is evident in all four examples. The photographs show better-lit scenes, with more natural colors, less grain, more sharpness and no detail lost in the dark areas.