Who's to blame for Samsung's lackluster AR Emoji? You and your short attention span
If you've had the chance to check out the Galaxy S9/S9+ and took the AR Emoji feature of the selfie camera for a spin, there's a good chance you were left disappointed by its often jittery, sometimes super-stiff take on Apple's Animoji. And if you're anything like us, you probably flinched when you saw the overly simplified virtual avatar your phone created for you. All in all, AR Emoji feels like a beta version at this point, even though it clearly isn't one.
As it turns out, the real reason Samsung's AR Emoji are a bit disappointing, especially compared to the more polished Animoji, is none other than the average consumer's short attention span. Seriously.
Let's break this bold claim down to its basic ingredients.
Turns out that Samsung didn't develop AR Emoji on its own. The tech giant sourced the underlying technology for the feature from Loom.ai, a company co-founded by CGI veteran and Oscar winner Kiran Bhat, who developed Industrial Light & Magic's VFX facial expression-capture technology used in ultra-popular flicks like The Avengers, Pirates of the Caribbean, Teenage Mutant Ninja Turtles, Warcraft, Star Wars: Episode VII, and Rogue One: A Star Wars Story. The near-lifelike CGI appearances of Hulk, Davy Jones, and Grand Moff Tarkin in those movies can be traced directly back to Mr. Bhat and his Oscar-winning work. As you might imagine, CGI that accurate requires extensive computational resources; remember that for later.
After conquering Hollywood's CGI summits, Kiran Bhat co-founded Loom.ai together with Mahesh Ramasubramanian. Loom.ai's main goal is simple enough to fit in a single sentence: help regular Joes carve out a super-accurate virtual avatar from a simple selfie, without the hours upon hours of rendering that studio-grade CGI normally requires.
"We've simplified it into a single photograph," said Bhat. However, "people's attention span is low. People lose it after 5 seconds," Bhat added.
Loom.ai's current software is capable of creating super-realistic virtual avatars from a regular selfie, but it reportedly takes an average of 7 minutes or more to complete the task, which Samsung unsurprisingly shot down as an unacceptably and impractically long amount of time.
Once the tech giant was handed the framework, it was the one calling the shots, and it decided that people would rather have speed than better-looking avatars, a polarizing choice that many of us wouldn't necessarily agree with. The resulting AR Emoji "use a 2D tracker provided from Samsung on how the face moves, which is what gets fed into the SDK," as Kiran Bhat points out. Sure, speed is crucial, but users could have been given the choice between a basic and a more advanced avatar upon tapping the "Create My Emoji" button. And this is where the "gotcha" comes in: had the software been allowed to do its computational magic for longer, the result would have certainly turned out far better and more lifelike than the AR Emoji currently available as a dedicated camera mode for your selfie snapper. Samsung simply chose speed over super-realistic virtual avatars.
In addition, unlike the iPhone X, which uses a plethora of sensors to scan and create a detailed 3D map of the user's face, the Galaxy S9 and S9+ are only capable of two-dimensional scans, which severely limits accuracy and is a big reason why AR Emoji looks so unimpressive.
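If you're curious what "a 2D tracker feeding into the SDK" actually means in practice, here's a minimal, purely illustrative Python sketch. Loom.ai's actual SDK isn't public, so every name below is hypothetical; the flow, though, follows Bhat's description: 2D landmark positions from the phone's face tracker go in, expression weights for the avatar rig come out.

```python
# Illustrative sketch only: hypothetical names, not Loom.ai's real API.
# A 2D tracker reports where facial landmarks sit in the selfie frame;
# the avatar SDK consumes expression weights derived from those points.

from dataclasses import dataclass

@dataclass
class Landmark:
    x: float  # normalized horizontal position in the selfie frame (0..1)
    y: float  # normalized vertical position in the selfie frame (0..1)

def mouth_open_weight(upper_lip: Landmark, lower_lip: Landmark,
                      face_height: float) -> float:
    """Map the gap between two tracked lip points to a 0..1 expression weight."""
    gap = abs(lower_lip.y - upper_lip.y)
    return max(0.0, min(1.0, gap / (0.15 * face_height)))

# Per frame: the phone's 2D tracker hands over landmark positions...
upper_lip = Landmark(x=0.50, y=0.62)
lower_lip = Landmark(x=0.50, y=0.68)

# ...and the avatar is driven by the resulting expression weights.
weights = {"jawOpen": mouth_open_weight(upper_lip, lower_lip, face_height=0.8)}
print(weights)  # e.g. {'jawOpen': 0.5}
```

A depth-sensing setup like the iPhone X's would feed the same kind of rig genuine 3D measurements of the face rather than flat screen coordinates, which is exactly the accuracy gap described above.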
Fortunately, there's always room for improvement: since Samsung decides how to use Loom.ai's framework, future AR Emoji could technically be far superior, provided Samsung deems it worth the effort. Here's hoping. Aside from adding depth sensors to the front of a future Galaxy device, the framework's algorithms could be further optimized to produce better-looking avatars. So it's a fairly safe bet that the second generation of AR Emoji will be much better than the first.
"The exact timing is up to Samsung," said Mahesh Ramasubramanian, Loom.ai's CEO, "But it is something you will see evolve."
Oh, and in case you're wondering what Loom.ai's tech is ultimately capable of given enough time, check out Elon Musk's virtual avatar down below. Considering it was created from a simple two-dimensional photo, it's as accurate as they come. You can also check out all the personalized avatars of the Loom.ai team right here.
source: CNET