Adaptive Brightness in Android 9 Pie uses machine learning to automatically adjust the display
Before Android 9, Android device manufacturers relied on recommendations from screen manufacturers to determine how bright the display should be under a given level of ambient lighting. That baseline mapping was the same for every user, so phone owners had to constantly reach for the brightness slider to adjust the display to their liking: setting the slider to the left of center made the screen dimmer than the preset, and moving it to the right of center made it brighter. Because those manual tweaks weren't remembered, the user had to repeat them every time the ambient lighting changed.
With Android 9, Google overhauled Adaptive Brightness to address this. Handsets running the latest Android build now use machine learning to figure out how bright or dim a user wants the screen under various ambient lighting conditions. The phone "remembers" every time the user moves the brightness slider to perfect the look of the display; in other words, the device is "trained" to adjust the brightness automatically each time the ambient lighting changes. Under Android Oreo, the user had to make those adjustments manually.
"We believe that screen brightness is one of those things that should just work, and these changes in Android Pie are a step towards realizing that. For the best performance no matter where you are models run directly on the device rather than the cloud, and train overnight while the device charges."-Google
Google says the model behind the improved Adaptive Brightness can be updated and will be fine-tuned based on real-time usage. The feature is available on Pixel phones now, and the company is talking to other Android manufacturers about adding the improvement to the Android Pie builds for their handsets.