This may be a stupid question; I have been digging for days and haven't been able to find a definitive answer.

I would like to know if it is possible to use the AVFoundation framework to synchronize time-based data without an audio or video media file.
I am building an application that needs to precisely trigger events on a synchronized timeline. These events must sometimes be synchronized with an audio or video file, but not always. I would love to be able to use AVFoundation, since, at least conceptually, it seems to contain everything I need. An AVMutableComposition made of tracks of values recorded at specific times, which I could play back and use to trigger events (the same way a subtitle is displayed over video at precisely the right moment), would be perfect.
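To make that concrete, this is roughly the kind of triggering I do by hand today, using AVPlayer boundary time observers instead of a timed-data track. It is only a sketch of my current workaround: the media path, the `eventTimes` values, and the print statement are placeholders for my own code.

```swift
import AVFoundation

// Current workaround: fire callbacks at known offsets while an AVPlayer
// plays, rather than driving the events from a track of timed data.
let playerItem = AVPlayerItem(url: URL(fileURLWithPath: "/path/to/media.mov")) // placeholder file
let player = AVPlayer(playerItem: playerItem)

// Hypothetical event times, in seconds, that I need to hit precisely.
let eventTimes: [Double] = [1.0, 2.5, 7.25]
let boundaryTimes = eventTimes.map {
    NSValue(time: CMTime(seconds: $0, preferredTimescale: 600))
}

let observerToken = player.addBoundaryTimeObserver(forTimes: boundaryTimes, queue: .main) {
    // In the real app this would dispatch whatever event belongs to the current time.
    print("event fired at \(player.currentTime().seconds)s")
}

player.play()
// Later, when finished: player.removeTimeObserver(observerToken)
```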
In my investigations of the framework documentation, there are plenty of references to an AVAsset being made of tracks, "each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles," and a track consisting of text data would work for me. However, every single actual code example I can find seems to insist that the AVAsset be tied to an audio or video file on disk, and Apple's list of supported file formats consists only of video and audio file types.
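For example, this is the sort of thing I have been experimenting with. The API happily lets me add a text track to a composition, but the only way I can see to populate it is to pull a source track out of an AVAsset loaded from a real media file on disk, which is exactly what I am trying to avoid. The file URL here is a placeholder and the error handling is minimal.

```swift
import AVFoundation

let composition = AVMutableComposition()

// Adding a text track to a composition is allowed by the API...
guard let textTrack = composition.addMutableTrack(
    withMediaType: .text,
    preferredTrackID: kCMPersistentTrackID_Invalid
) else {
    fatalError("could not add text track")
}

// ...but every example populates it from a track of an asset that is
// backed by an actual audio/video file on disk.
let sourceAsset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/movie-with-text-track.mov")) // placeholder
if let sourceTextTrack = sourceAsset.tracks(withMediaType: .text).first {
    try? textTrack.insertTimeRange(
        CMTimeRange(start: .zero, duration: sourceAsset.duration),
        of: sourceTextTrack,
        at: .zero
    )
}
```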
I understand why, and I can accept that the framework may have been built around the requirement of an AV file, but I keep getting drawn back to the tantalizing possibility that it might solve a lot of my problems without my having to reinvent the wheel.
So, for my question to be answered, I think I need one of two things: either a definitive answer that I will not be able to use AVFoundation for my needs, or some kind of pointer in the right direction toward a compatible file type I could use to store non-AV data for use in an AVMutableComposition.
Thanks in advance!