SAN FRANCISCO, CA--(Marketwired - July 28, 2016) - Neon Labs (https://neon-lab.com/), the video and image performance company, today announced the availability of Neon Pro, a free web app that makes the company's deep learning technology, and NeonScore™, available to individual content creators. Previously, the technology -- which identifies and serves high-performing video thumbnails and images -- was only available to global image, video, eCommerce and content platforms operating at massive scale, through Neon Enterprise™.
Founded on a decade of neurocognitive research at Brown University, Carnegie Mellon University, and Harvard Medical School, Neon technology combines the science of human perception, deep learning, and the world's largest and most comprehensive dataset of emotional responses to images. Using deep neural nets trained on human visual perception data, rather than just clickstream data, Neon predicts how people will emotionally respond to an image, and how effective the image will be in driving engagement.
Neon's predictive image technology helps businesses drive significantly higher clicks, likes and shares for videos and images, resulting in increased revenue. The company guarantees that Neon Enterprise customers will increase their overall engagement rate. Depending on the content and context, increases in engagement of 30% and higher are common.
Neon Pro, a free, slimmed-down version of the company's enterprise offering, is now available as a web app and allows individuals and content producers to get NeonScores for their videos and images. Neon Pro identifies the thumbnails and images that are guaranteed to increase engagement over human-selected ones. Neon Pro is available at https://app.neon-lab.com.
How NeonScore Works
The NeonScore is a number from 0 to 99, common to Neon Pro and Neon Enterprise, that measures the predicted emotional impact of an image for a given audience, device or platform. The higher the number, the higher the predicted engagement. NeonScore uses the company's patent-pending methods to analyze every image or video frame for over 1,000 unique and interrelated "valence" features that drive human interest, such as eye gaze, instability, brightness, and incompleteness.
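For illustration only, the idea of combining many per-image valence features into a single 0-99 score can be sketched as a weighted average. The function name, feature names, and weights below are hypothetical; Neon's actual model is a deep neural net trained on human perception data, not a hand-weighted formula.

```python
# Toy sketch of a NeonScore-style aggregation (hypothetical, not Neon's model).
# Each valence feature is assumed to be normalized to [0, 1]; the weighted
# average is scaled into the 0-99 NeonScore range.

def neon_score_sketch(features, weights):
    """Combine per-image valence features (each in [0, 1]) into a 0-99 score."""
    total_weight = sum(weights.values())
    raw = sum(weights[name] * features.get(name, 0.0) for name in weights)
    return round(99 * raw / total_weight)

# Invented example values for a single image.
features = {"eye_gaze": 0.8, "brightness": 0.6, "instability": 0.3, "incompleteness": 0.5}
weights = {"eye_gaze": 2.0, "brightness": 1.0, "instability": 1.5, "incompleteness": 1.0}
print(neon_score_sketch(features, weights))  # → 57
```

A real system would learn the mapping from pixels to score directly, but the sketch captures the shape of the output: more strongly weighted, higher-valence features push the score toward 99.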
Data Points (Industry & Proprietary)
- Over 3B images are shared daily across Snapchat, Facebook, Facebook Messenger, Instagram, and WhatsApp (Source: Mary Meeker, Internet Trends 2016).
- Because 81 percent of video ads are muted and require consumers to click to play sound, the visual content is more important than ever to drive engagement (Source: Mary Meeker, Internet Trends 2016).
- Neon thumbnails are proven to increase CTRs and revenue over and above hand-selected and automatically generated thumbnails.
- Neon has analyzed over 6.2 billion images.
- World's largest dataset of human response to images: over 1,000 valence features per image across 3.9 million images, yielding nearly 4 billion data points.
Comments on the news
- "Competition for consumer attention has become exponentially compounded by the constant influx of visual stimuli. Neon helps publishers, content marketers, and now entrepreneurs and individuals, break through these distractions by serving up the most compelling images, at the right time, to the right person, on the right device. Publicly offering this technology will also have an impact on platforms; for example, Neon is able to generate a higher number of clicks than the thumbnail images YouTube automatically selects using its algorithm. The ability to analyze an image and determine how consumers will engage emotionally -- even before viewing it -- is truly a significant development for the emerging visual economy," said Christopher Kruse, Neon CEO.
- "Unlike the traditional deep neural networks that are trained to identify objects in a scene, Neon uses deep learning in a creative way to predict the emotional response to images at massive scale. This novel approach has helped us identify the features of an image that drive engagement, allowing us to predict the images that will go viral, even before they are published," said Sophie Lebrecht, Neon Chief Science Officer.
- "Not only did Neon take a significant amount of work off our hands, it also improved the clickability of our thumbnails by 30% on average. This is a huge win for us," said Jim Hall, IGN VP of Technology.
Additional Resources
- Watch the video: NeonScore, Predictive Image Engagement https://www.youtube.com/watch?v=R3DsJqC5PXo
- Download the Media Kit for images, video and more: http://bit.ly/29WVKR0
- Discover the Neon blog: https://neon-lab.com/blog/
- Follow Neon on Twitter: https://twitter.com/neonlab
About Neon
Founded on over ten years of neurocognitive research at Brown University, Carnegie Mellon University, and Harvard Medical School, Neon's technology helps digital organizations deliver a superior customer experience and increase revenue. Neon combines the neuroscience of human perception and machine learning to automatically identify images and thumbnails that people are most likely to click. Businesses, including IGN Entertainment, a Ziff Davis company, achieve significantly higher engagement on video and images using Neon.
Founded in 2012 by Dr. Sophie Lebrecht and Dr. Michael Tarr, Neon is headquartered in San Francisco and is venture funded by MDV Ventures and XSeed Capital. Neon has been awarded three patents, and currently has 10 others under review. To learn more about Neon, please visit https://neon-lab.com/.
Embedded Video Available: https://www.youtube.com/watch?v=R3DsJqC5PXo