The introduction of Computer Graphics (CG) and Visual Effects (VFX) into filmmaking was not just a technological advancement but an art form in itself. In their nascent stages, these tools were wielded with precision and purpose. Every step, from performance to camera movement, lighting to set design, was meticulously planned to serve a singular vision.
In the early days, the complexity of CG and VFX meant that only those with a deep understanding of the technology could oversee its application. As these tools became more ubiquitous, however, there was a shift in leadership. Experts who once helmed these departments were gradually replaced by more “business-oriented” leaders, often with only a superficial grasp of the intricacies of VFX and CG. This transition ushered in the era of “fix it in post” filmmaking. Suddenly, sets were designed without considering performance, lighting was decided without regard to camera angles, and actors were left to perform without cohesive creative direction. This approach deferred critical creative decisions, leaving them to be made by individuals who might lack the necessary artistic insight.
Today, we’re nearing a point where films are shot first and scripted later, leading to inefficiencies at multiple levels. The repercussions are evident in the waste of resources, from the hours invested by artists and developers to the underutilization of talent. Modern films, despite having VFX budgets that eclipse those of iconic movies like “Jurassic Park” or “Terminator 2,” often appear lackluster and undercooked in comparison.
It’s crucial to understand that the tools aren’t the issue. Blaming CG for the decline in film quality is akin to blaming a pencil for a poorly written story. The real problem lies in the mismanagement of the creative process. This not only results in subpar art but also threatens the very fabric of the film industry.