Video output writing is not asynchronous

I use a NAS to store the output, and I discovered that whenever the network interface is under load, GPU utilisation drops to zero.
This becomes especially significant with the ProRes format because of the large amount of data that has to be written for every frame.

Please refactor the file I/O to run asynchronously from the main thread. I think this could be achieved by buffering the previous frame(s) in memory and writing them out on a background thread, concurrently with the other tasks.
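
To illustrate, here is a minimal sketch in Python of what I mean (all names are made up for illustration, render_frame just stands in for the actual GPU work; the real fix obviously belongs in the renderer's own I/O code): a bounded queue hands finished frames to a writer thread, so a slow NAS write stalls the writer instead of the render loop.

import queue
import threading

def render_frame(index):
    # Stand-in for the real GPU render; just produces a dummy frame as bytes.
    return bytes(1024)

frame_queue = queue.Queue(maxsize=4)  # bounded, so memory use stays capped at a few frames
SENTINEL = None                       # marks the end of the stream

def writer_loop(output_path):
    # Runs on a background thread and drains finished frames to disk.
    with open(output_path, "wb") as output_file:
        while True:
            frame = frame_queue.get()
            if frame is SENTINEL:
                break
            output_file.write(frame)  # the slow NAS write happens here, off the render thread

def render(num_frames, output_path):
    writer = threading.Thread(target=writer_loop, args=(output_path,))
    writer.start()
    for i in range(num_frames):
        frame_queue.put(render_frame(i))  # only blocks if the writer falls 4 frames behind
    frame_queue.put(SENTINEL)             # tell the writer we are done
    writer.join()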

In case anyone has the same problem, I have figured out a workaround.

First I split the file into several parts using ffmpeg like so:
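
Something along these lines (adjust the input file, part length and output folder to your setup); it uses ffmpeg's segment muxer with stream copy, so nothing gets re-encoded, and writes the parts into the local temp folder that the script below watches:

ffmpeg -i input.mov -map 0 -c copy -f segment -segment_time 60 -reset_timestamps 1 c:\tmp\part_%03d.mov

ProRes is all-intra (every frame is a keyframe), so the stream copy can cut anywhere and the parts come out very close to the requested length.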

Then I wrote a Python script that runs in the background while the encode is going (replace watched_location and destination_location with your own values):

import os
import time
import shutil
from datetime import datetime, timedelta
from pathlib import Path

# Local folder being watched for finished parts, and where to move them (the NAS in my case)
watched_location = r"c:\tmp"
destination_location = r"w:\encode\dst"

# A file that has not been modified for this long is assumed to be finished
max_age = timedelta(minutes=2)


def file_age(file_stat: os.stat_result):
    """How long ago the file was last modified."""
    return datetime.now() - datetime.fromtimestamp(file_stat.st_mtime)


def move_file(file: Path, destination: Path):
    """Move a single file into the destination folder, keeping its name."""
    destination_file = destination.joinpath(file.name)
    print("Moving file from {} to {}".format(file, destination_file))
    try:
        shutil.move(file, destination_file)
    except OSError as error:
        print("Failed to move {}:".format(file))
        print(error)


# Poll the watched folder forever and move anything that has not been
# touched for at least max_age, i.e. files ffmpeg has finished writing.
watched_path = Path(watched_location)
destination_path = Path(destination_location)
while True:
    files = [(file, file.stat()) for file in watched_path.iterdir() if file.is_file()]
    for (file, file_stat) in files:
        if file_age(file_stat) > max_age:
            move_file(file, destination_path)
    time.sleep(max_age.total_seconds())

Hope that helps, but please, devs, fix it!