Going back to our biology discussion, the human brain recognizes pictures in less time than the blink of an eye. Neuroscientists at the Massachusetts Institute of Technology (MIT) found that the brain can process an image in as little as 13 milliseconds, compared to the roughly 600 milliseconds it takes to process a single word. Our fourth techtacle knows this well and offers integration with MAM systems to make stories appealing to the human eye.
But first: what exactly is MAM? As the number of multimedia files grew, so did the number of workflows surrounding them, from storage to production to distribution. It became clear that newsrooms needed a system that could navigate these files automatically and take action based on their metadata. That is precisely the purpose of Media Asset Management (MAM) systems, which manage massive, rich media assets and their workflows from a single centralized source. In news production, MAM technology enables a wide range of broadcast operations, especially when optimally paired with a newsroom system.
Octopus’ fourth techtacle enables smooth workflows by integrating with MAM technology providers. This means that you can navigate multimedia files stored in your MAM natively within the Octopus interface. Our unique plug-less approach enables production teams to work with clips residing on these systems.
Inside Octopus’ media library you can look directly into multimedia files and preview low-res videos streamed from wherever the clip currently resides. You can search for clips using full-text search or by adding filters. Once you have located the ideal clip, you can drag and drop it into your Evening News rundown or your “international” story folder. You can also access MAM-stored files while writing your script: clips matching your search keyword automatically appear in Octopus, ready to preview and add to your story.
Translated into workflows, this means that Octopus users can search for clips, preview the files, and insert them into their stories, rundowns, assignments, or scripts directly in the Octopus interface, without having to run the MAM system’s plugin in parallel.
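The search-and-insert workflow described above can be sketched in a few lines of Python. This is a hypothetical, in-memory model for illustration only, not the actual Octopus or any MAM vendor’s API: the `Clip`, `Story`, and `search_clips` names, and the duration filter, are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    clip_id: str
    title: str
    keywords: list
    duration_s: int

@dataclass
class Story:
    name: str
    clips: list = field(default_factory=list)

# A toy in-memory "MAM index"; real systems expose this through their own APIs.
MAM_INDEX = [
    Clip("c1", "Summit arrival", ["international", "summit"], 42),
    Clip("c2", "Local weather", ["weather"], 30),
]

def search_clips(index, text, max_duration_s=None):
    """Full-text match on title and keywords, plus an optional duration filter."""
    text = text.lower()
    hits = [c for c in index
            if text in c.title.lower() or text in (k.lower() for k in c.keywords)]
    if max_duration_s is not None:
        hits = [c for c in hits if c.duration_s <= max_duration_s]
    return hits
```

In this sketch, “drag and drop” amounts to `story.clips.extend(search_clips(MAM_INDEX, "international"))`: the clip is referenced in the story while the media itself stays in the MAM.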
A placeholder workflow is also achievable with the help of our techtacles. You can create placeholders inside stories for clips that have not yet been gathered. For example, one member of the team may be writing the script for an important interview while another is out in the field acquiring footage for the story. In this case, the journalist working on the story can create a placeholder: a reserved space where the video will sit once it has been gathered and edited. When the video is ready, it is enough to upload it to the MAM system. Based on the clip’s metadata, the MAM sends this information back to Octopus, and the clip is routed to its designated placeholder automatically, with no need to reopen the story and attach it manually. Octopus recognizes when the placeholder is populated and labels it accordingly in the interface. And because information is synchronized across systems in real time, the clip’s status is immediately updated to “ready” in the Octopus UI, letting the producer know that it can go on air.
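To make the placeholder logic concrete, here is a minimal sketch of how a metadata match might resolve a placeholder and flip its status to “ready”. Every name here (`Placeholder`, `on_mam_ingest`, the `slug` matching key) is an assumption for illustration; real integrations work through vendor-specific metadata fields and callbacks.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Placeholder:
    slug: str                      # metadata key the MAM clip is matched on
    clip_id: Optional[str] = None  # filled in once the clip arrives
    status: str = "awaiting media"

def on_mam_ingest(placeholders, clip_id, slug):
    """Called when the MAM reports a newly ingested clip.

    Finds the first unresolved placeholder whose slug matches the clip's
    metadata, attaches the clip, and marks it ready for air.
    """
    for ph in placeholders:
        if ph.slug == slug and ph.clip_id is None:
            ph.clip_id = clip_id
            ph.status = "ready"  # surfaced in the rundown so the producer sees it
            return ph
    return None
```

The key design point the sketch illustrates is that the journalist never touches the story again: the resolution is driven entirely by the metadata the MAM reports back.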
Working with videos sometimes means watching hours of footage before finding the person or element we are looking for. We stand on the brink of the Fourth Industrial Revolution, which is altering the way we live and work. In the newsroom, we can turn this to our advantage. Our fifth techtacle comes from the future to show you how Artificial Intelligence can delegate some of that work to the machines.