Tuesday, April 23, 2024

How to enable the Pixel Visual Core for HDR+ on Android 8.1 — and what it actually does


It’s time for a photo processing upgrade.

Google’s new Pixel Visual Core co-processor has been sitting dormant inside the Pixel 2 and 2 XL since launch, but now with the Android 8.1 Developer Preview 2 (aka Beta 2) release we have an early look at what it can do. Well, sort of — it actually isn’t enabled by default on the phones, and turning it on only gives us a glimpse of what it’s capable of in third-party apps.

But if you know where to go, you can turn on the Pixel Visual Core and see what it does for your photos on the Pixel 2 or Pixel 2 XL. Here’s how.

How to enable Pixel Visual Core processing

The process for enabling the Pixel Visual Core is a bit funky: the setting isn't in the Camera app itself but in the Developer options. Chances are that won't be an issue for you if you're already running beta software on your phone. Provided you're on the latest Developer Preview / Beta, here are the steps:


1. Go into Settings, System, About phone.
2. Find the Build number at the bottom of the screen and tap it seven times.

  • You'll also need to confirm your screen lock.

3. Go back and tap the new Developer options menu.
4. Scroll down to the "Debugging" subsection and tap the toggle marked Camera HAL HDR+.

  • (FYI: "HAL" stands for Hardware Abstraction Layer.)

5. Reboot your phone for the change to take effect.
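There's no public setting that exposes the Camera HAL HDR+ toggle itself, but if you want to confirm from code that Developer options has actually been unlocked (the Build number taps in step 2), a check like the sketch below works on recent Android builds. The function name is ours, and it only tells you the menu is available, not the state of the HDR+ toggle.

```kotlin
import android.content.Context
import android.provider.Settings

// Sketch only: confirms the Developer options menu has been unlocked
// (i.e. the Build number taps worked). It does NOT read the Camera HAL
// HDR+ toggle, which has no public setting key we know of.
fun developerOptionsEnabled(context: Context): Boolean =
    Settings.Global.getInt(
        context.contentResolver,
        Settings.Global.DEVELOPMENT_SETTINGS_ENABLED,
        0
    ) == 1
```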

What does the Pixel Visual Core do right now?

So here's the thing: enabling Camera HAL HDR+ doesn't change anything about how the Pixel 2's built-in camera performs; it already applies HDR+ on its own, without using the Pixel Visual Core. Because this is a beta release, the focus is on opening up the Pixel Visual Core to third-party apps. Once you turn on HAL HDR+ processing, any third-party app that plugs into the standard Android camera API will have its photos processed by the Pixel Visual Core, giving them the HDR+ treatment in much the same way the Google Camera app already does purely in software.
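To make the "standard Android camera API" point concrete, here's a minimal Kotlin sketch of the kind of camera2 code a third-party app already ships. Nothing in it mentions HDR+ or the Pixel Visual Core; the function name is our own, and the point is simply that ordinary camera2 captures are what pick up the new processing once the toggle is on.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

// Ordinary camera2 plumbing, exactly what a third-party camera app would do.
// No HDR+-specific keys are set anywhere; on a Pixel 2 with Camera HAL HDR+
// enabled, stills captured through this API get the Pixel Visual Core treatment.
fun findBackCameraId(context: Context): String? {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    return manager.cameraIdList.firstOrNull { id ->
        manager.getCameraCharacteristics(id)
            .get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_BACK
    }
}
// From here the app opens the camera (manager.openCamera), builds a request from
// TEMPLATE_STILL_CAPTURE, and reads JPEGs off an ImageReader -- standard camera2 code.
```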

The change isn’t massive right now, but the future is bright with this dedicated co-processor.

That means when you fire up Instagram or any of the many other apps with an in-app camera, the photos you get directly out of those third-party apps will be closer to the quality you see when shooting with the built-in camera app. The goal is to shrink the drop-off in camera quality between shooting inside an app and using the built-in camera and sharing the photo afterward. That's a big win for developers and users alike.

Chances are you won't notice a huge difference in quality or processing speed just yet; remember, this is the first time Google has enabled the Pixel Visual Core for consumers (and only for beta testers who flip the toggle, at that). But the computational capabilities of this co-processor go well beyond those of most image signal processors (ISPs) in phones. There's also a machine learning component to the way the Pixel Visual Core works, meaning it has the potential to "learn" and improve as it's used. Its capabilities could be leveraged far better in the future, both by third-party apps and by the built-in camera. With hardware like this, the future is bright.

Once you enable the Pixel Visual Core on your Pixel 2 or 2 XL, let us know how you’re finding its capabilities!

Google Pixel 2 and Pixel 2 XL

https://www.youtube.com/watch?v=5YK63cXyJ2Q

  • Pixel 2 FAQ: Everything you need to know!
  • Google Pixel 2 and 2 XL review: The new standard
  • Google Pixel 2 specs
  • Google Pixel 2 vs. Pixel 2 XL: What’s the difference?
  • Join our Pixel 2 forums

