Later in the DNN anomaly detection project we had the opportunity to temporarily integrate our AI technology into a Barco control-room product called OpSpace that was to be demoed at the 2018 ISE Expo. The OpSpace system contained a subsystem called the EDP Analytics Service (EAS), which was built on Node-RED, a graphical IoT flow editor.
In the early days, JSON was parsed by passing it to JavaScript's eval function, which was a security risk (a maliciously crafted payload could cause eval to misbehave). At the time, JSON also lacked schemas and, unlike XML, offered no way to embed comments. The ecosystem has since filled those gaps: JSON Schema provides validation, and supersets such as JSON5 allow comments. So I now use JSON Schemas, and I exchange JSON over WebSockets and with NoSQL databases such as IndexedDB and MongoDB.
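A minimal TypeScript sketch of that workflow, assuming the Ajv validator and a browser WebSocket; the message shape, schema, and endpoint are hypothetical:

```typescript
import Ajv, { JSONSchemaType } from "ajv";

// Hypothetical message shape used for illustration.
interface SensorReading {
  sensor: string;
  value: number;
}

// JSON Schema describing the message; Ajv compiles it into a validator.
const schema: JSONSchemaType<SensorReading> = {
  type: "object",
  properties: {
    sensor: { type: "string" },
    value: { type: "number" },
  },
  required: ["sensor", "value"],
  additionalProperties: false,
};

const ajv = new Ajv();
const validate = ajv.compile(schema);

// Hypothetical endpoint; validate the payload before sending it as JSON.
const ws = new WebSocket("wss://example.com/telemetry");
ws.onopen = () => {
  const reading: SensorReading = { sensor: "cam-01", value: 0.87 };
  if (validate(reading)) {
    ws.send(JSON.stringify(reading));
  }
};
```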
Experiences using this skill are shown below:
During the greater part of 2017 and 2018, our team researched Deep Neural Net (DNN) anomaly detection. The problem we set out to solve was that control centers typically have far too many surveillance cameras and control panels to monitor and not enough personnel to watch them all. Why not apply machine-learning (ML) anomaly detection to camera and computer-monitor feeds and alert control-center personnel to abnormal events as soon as they occur?
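Illustratively (this is not our production model), a TypeScript sketch of the alerting idea: score each frame by its reconstruction error against a stand-in for a trained autoencoder, and raise an alert when the score crosses a threshold:

```typescript
// Stand-in for a trained autoencoder's inference call; a real model would
// return its learned reconstruction of the frame. A simple blur is used
// here only so the sketch runs.
function reconstruct(frame: Float32Array): Float32Array {
  const out = new Float32Array(frame.length);
  for (let i = 0; i < frame.length; i++) {
    const prev = frame[Math.max(0, i - 1)];
    const next = frame[Math.min(frame.length - 1, i + 1)];
    out[i] = (prev + frame[i] + next) / 3;
  }
  return out;
}

// Mean squared error between a frame and its reconstruction: frames the
// model reconstructs poorly are likely anomalous.
function anomalyScore(frame: Float32Array): number {
  const recon = reconstruct(frame);
  let sum = 0;
  for (let i = 0; i < frame.length; i++) {
    const d = frame[i] - recon[i];
    sum += d * d;
  }
  return sum / frame.length;
}

// Alert when the score exceeds a threshold calibrated on normal footage.
const THRESHOLD = 0.05; // illustrative value
function checkFrame(frame: Float32Array, onAlert: (score: number) => void): void {
  const score = anomalyScore(frame);
  if (score > THRESHOLD) onAlert(score);
}
```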
This was a student intern project that I took over and turned into a useful application for internal company use. Even though it wasn't a research project, we thought it would be a good way to make Barco Labs better known throughout the company, as many employees viewed us as an "ivory tower" doing esoteric research of little practical value. The Smart Meeting Room App (SMRA) had the very practical benefit of finding and scheduling meeting rooms on the Barco campus.
By 2019, our machine-learning research project was integrating and managing multiple cameras and video sources, and it was becoming increasingly difficult to configure through configuration files alone. I was given the task of creating a professional-looking desktop UI that could be accessed from a web browser on the company's locked-down PCs. It was decided that mobile devices would not be supported and that the GUI would be packaged as a Docker image.
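A minimal TypeScript sketch of that deployment shape, assuming an Express server baked into the Docker image; the paths, port, and /api/config endpoint are hypothetical:

```typescript
import express from "express";

const app = express();

// Serve the pre-built GUI bundle; inside the Docker image this directory
// would be copied in at build time (hypothetical path).
app.use(express.static("/app/ui-dist"));

// Hypothetical endpoint replacing hand-edited config files: the browser
// GUI reads and writes camera/source configuration through the server.
app.get("/api/config", (_req, res) => {
  res.json({ cameras: [], videoSources: [] }); // placeholder payload
});

app.listen(8080, () => {
  console.log("GUI available at http://localhost:8080");
});
```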
I was given the task of researching automatic white-balance algorithms, with the goal of calibrating multiple video cameras to the same color balance. The problem was that when switching between cameras covering the same scene, a noticeable color shift was visible in the video stream, especially when the cameras came from different manufacturers.
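As one example of this class of algorithm (not necessarily the one we settled on), a TypeScript sketch of gray-world white balancing, which scales each channel so the image averages out to neutral gray:

```typescript
// Gray-world white balance: assume the average color of a scene is neutral
// gray, and scale R, G, B so each channel's mean matches the common gray
// level. `pixels` is interleaved RGBA data, e.g. from getImageData().
function grayWorldBalance(pixels: Uint8ClampedArray): void {
  let rSum = 0, gSum = 0, bSum = 0;
  const n = pixels.length / 4;
  for (let i = 0; i < pixels.length; i += 4) {
    rSum += pixels[i];
    gSum += pixels[i + 1];
    bSum += pixels[i + 2];
  }
  const rMean = rSum / n, gMean = gSum / n, bMean = bSum / n;
  const gray = (rMean + gMean + bMean) / 3;
  // Per-channel gains pull each mean toward the common gray level.
  const rGain = gray / rMean, gGain = gray / gMean, bGain = gray / bMean;
  for (let i = 0; i < pixels.length; i += 4) {
    pixels[i] = Math.min(255, pixels[i] * rGain);
    pixels[i + 1] = Math.min(255, pixels[i + 1] * gGain);
    pixels[i + 2] = Math.min(255, pixels[i + 2] * bGain);
  }
}
```

Applying the same gains to every camera's frames (or computing gains against a shared reference frame) is what pulls the streams toward a common color balance.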
Created several proof-of-concept web apps in 2017 experimenting with ideas to make virtual meetings more immersive. Our approach was largely inspired by the Star Wars™ films: the Jedi Council held meetings in which remote participants appeared in their seats as holographic projections of themselves, and vice versa. I prototyped the same two-way immersive meeting idea using WebGL 3-D and WebRTC in web browsers.
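A minimal TypeScript sketch of the core trick, assuming three.js for the WebGL side: a participant's live video is textured onto a plane in the 3-D scene. In the real prototype the stream would arrive over a WebRTC peer connection; the local camera stands in here for simplicity:

```typescript
import * as THREE from "three";

// Basic scene: the "meeting room" into which remote video is projected.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

async function addParticipant(): Promise<void> {
  // A WebRTC remote stream would normally be used here.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  // Texture the live video onto a plane: the "holographic" seat.
  const texture = new THREE.VideoTexture(video);
  const plane = new THREE.Mesh(
    new THREE.PlaneGeometry(1.6, 0.9),
    new THREE.MeshBasicMaterial({ map: texture })
  );
  scene.add(plane);
}

addParticipant();
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```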
Developed several browser-based video playback and video device management applications. Some examples (in reverse chronological order):
Designed and implemented complex, real-time, universal map display applications in Adobe Flash for the web and Cocoa Touch for the Apple iPad. These yielded a significant increase in revenue and, according to the President/CEO, Maurice Bailey, helped the company achieve financial independence.
Designed and created all of the Universal Flash Viewer plug-ins, including all of the graphic artwork. They were implemented in pure ActionScript 3 without use of the main timeline, with all ActionScript code kept in separate source files. All plug-ins followed a standard design pattern: