CLIPr, a Video Analysis and Management (VAM) platform that uses AI and Machine Learning (ML) to index video content and make it searchable, has announced a partnership with Grip, an event success platform.
Grip will use CLIPr to index live session content in real time, allowing its event organiser customers to offer their attendees and exhibitors CLIPr-enriched video content, including the ability to search by topic, subtopic or transcription.
Tim Groot (pictured), co-founder and CEO of Grip, said: “Our customers are increasingly blending in-person and virtual events, and it’s important we partner with innovative technologies like CLIPr that maximise the value of session content 365, and deliver the most engaging event experience."
CLIPr’s VAM platform helps users quickly extract important moments from recorded session content so they can search, interact with, and share it. Recorded videos are processed by CLIPr’s ML algorithms, which use natural language processing and emotion detection to analyse audio and visual cues (e.g. presentation slides) and index the key topics and moments in each video.
Event organisers using CLIPr-enriched videos on Grip’s platform can extend post-event viewership and highlight the value of live event content, making it easier for attendees to consume and recall valuable moments of each session. Viewers can search videos by transcription, topics and subtopics, and react to moments with emojis or comments.
Humphrey Chen, co-founder and CEO of CLIPr, added: “Recorded session content has long been an underutilised asset for event professionals in terms of return on engagement and obtaining valuable metrics to inform future decision making. We are empowering organisers to reach the elusive 365 community engagement and can help tailor content to individuals.”