[Lazarus] Application Idea Request for Comments

Anthony Walter sysrpl at gmail.com
Sun Jul 1 16:50:53 CEST 2018

I am likely going to create a simple program that allows users to create
real-time effects using one or more source video files, and am
requesting comments before I start. I'd like to know what other
cross-platform (Mac/Windows/Linux/Pi) programs exist that work as I am
going to describe, and I am also open to suggestions.

Use case scenarios:

   - Fun video effects to display on a large home theater screen.
   - Programmable NeoPixel effects
   <https://www.youtube.com/watch?v=CWI676b6hE8> based on video and/or
   audio sources.
   - Picture in picture displays with 3D transforms and animation.
   - Custom detailed audio equalization and visualization.
      - Easy reverb, echo, delay, flanger, chorus effects.
   - Any other idea you have to either manipulate video and audio in real
   time, or forward data from video/audio to other external APIs such as
   Raspberry Pi GPIO pins.
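As a concrete example of the kind of audio effect I have in mind, a
feedback delay is only a few lines once a script has access to raw
samples. The function name and the plain-array buffer format here are
only assumptions, not a committed API:

```javascript
// Minimal sketch of a feedback delay (echo) on raw audio samples.
// Assumption: the host program hands the script a buffer of floats;
// here it is just a plain array so the sketch is self-contained.
function applyDelay(samples, delaySamples, feedback) {
  const out = samples.slice();
  for (let i = delaySamples; i < out.length; i++) {
    // Mix in the (already processed) sample from delaySamples ago,
    // so repeats decay geometrically by the feedback factor.
    out[i] += out[i - delaySamples] * feedback;
  }
  return out;
}
```

Reverb, chorus, and flanger are variations on the same delay-line idea,
which is why I think they can all live in user scripts.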

Here are the details of what I am thinking about its operation:

   - Purely text based input.
   - Run from the command line with a JavaScript file as the main script.
   - The program creates a full screen video player given input from the
   script file.
   - The script file can then alter the video and audio in real time as it
   plays. It can add effects such as mosaic, posterization, ASCII matrix,
   mirroring/3D transforms, overlay text, color substitution or chroma key,
   etc. It can also apply effects to the audio track.
   - The script can load other script files, and call subroutines.
   - Your own custom GLSL fragment shaders implement the actual pixel
   effects.
   - Multiple video, image, and audio sources can be composited to one
   final video/audio stream.
   - The program will include many prefabricated scripts with reusable
   subroutines and examples.
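To make the operation concrete, a user script might look something like
the sketch below. Every name in it (player, load, addEffect) is
hypothetical, only illustrating the style of API the program could
expose; a small stub stands in for the real player so the sketch is
self-contained:

```javascript
// Hypothetical user script. A stub player object stands in for the
// real full-screen video player the program would provide.
const player = {
  effects: [],
  load(file) {
    this.file = file; // the real player would open and decode the file
    return this;
  },
  addEffect(name, params) {
    this.effects.push({ name, params }); // queued for per-frame application
    return this;
  },
};

// Chain a couple of real-time effects onto one source video.
player
  .load("intro.mp4")
  .addEffect("mosaic", { blockSize: 16 })
  .addEffect("chromaKey", { color: "#00ff00", tolerance: 0.1 });
```

The chaining style is just one option; I am open to other API shapes.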

The details of implementation include:

   - The mpv video player <https://mpv.io> will be used to decode video.
   - The SDL media library <https://www.libsdl.org> will be used to create
   a window, graphics context, and audio mixer.
   - The mpv API gives access to video frames through an OpenGL frame
   buffer object. I assume it also has an audio API.
   - JavaScript will come from WebKit's JSC
   <https://webkit.org/blog/7536/jsc-loves-es6/> engine.
   - GLSL fragment shaders
   <https://www.khronos.org/opengl/wiki/Fragment_Shader> will be connected
   to the rendering pipeline by user-defined JavaScript functions.
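As a sketch of that shader wiring, a script could hand GLSL source to
the engine through a registration call. registerShader and its
signature are my assumptions, stubbed here so the snippet stands alone;
the real program would compile and link the source into the pipeline:

```javascript
// GLSL fragment shader source kept as a JavaScript string. This one
// inverts the frame's colors, a minimal per-pixel effect.
const invertShader = `
  uniform sampler2D frame;
  varying vec2 uv;
  void main() {
    vec4 c = texture2D(frame, uv);
    gl_FragColor = vec4(1.0 - c.rgb, c.a);
  }
`;

// Stub registry: the real program would compile the source and attach
// it to the rendering pipeline instead of just storing the string.
const shaders = {};
function registerShader(name, source) {
  shaders[name] = source;
}

registerShader("invert", invertShader);
```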