“The choice for mankind lies between freedom and happiness and for the great bulk of mankind, happiness is better.”

~ George Orwell

Acclaimed documentary filmmaker Shalini Kantayya has brought as timely a film as one could imagine to the 2020 Sundance Film Festival. Screening in the U.S. Documentary Competition, “Coded Bias” welcomes viewers to the worlds of algorithms and facial recognition, two of the most malleable facets of artificial intelligence.

It’s fairly old news, of course, that AI and the manner in which it is fed data effectively control our lifestyle choices in this digital age, but seeing the process laid out and insightfully deconstructed down to its molecular level is startling, to say the least.

As researcher Joy Buolamwini of the MIT Media Lab discovers while dabbling in facial recognition, much of the software involved misidentifies darker-skinned faces and the faces of women at far higher rates than it does those of white men.

During her subsequent investigations into the rampant bias inherent in the algorithms in question, it quickly becomes apparent that AI and the white males it rode in on at the technology's genesis are not neutral. Not that any of this comes as a surprise to those in-house administrators packing a Y chromosome.

The film follows Buolamwini as she interacts with an assortment of other female scientists at the forefront of pushing this obvious civil rights issue toward a more balanced place on the scale. To that end, Buolamwini founds the Algorithmic Justice League.

What they are after is legislative protection against widespread injustices in the use of facial scanning, from law enforcement and surveillance to automated hiring, with its built-in workplace biases, and credit decisions within the loan industry.

When so much of AI's data-driven learning depends on ingesting massive quantities of already-biased data that are part and parcel of the technology itself, there is little wonder about the shape of the resulting output. Or, as they say, “horse-feathers in, horse-feathers out.”
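To make that “horse-feathers in, horse-feathers out” point concrete, here is a minimal, hypothetical sketch (not from the film; the groups, numbers, and distributions are invented for illustration). A toy classifier is trained on data in which “Group B” is barely represented, and the very same model then performs far worse on Group B than on Group A.

```python
# Hypothetical toy illustration: biased (skewed) training data in,
# biased error rates out. Group A dominates the training set; Group B
# is nearly absent, and its features are distributed differently.
import numpy as np

rng = np.random.default_rng(0)

# Group-specific feature means per class label (values chosen arbitrarily).
MEANS = {"A": {0: 0.0, 1: 2.0},   # well-represented group
         "B": {0: 1.5, 1: 3.5}}   # under-represented group

def sample(group, label, n):
    """Draw n feature values for a given group and class label."""
    return rng.normal(MEANS[group][label], 1.0, n)

def make_set(n_a, n_b):
    """Build a dataset with n_a samples from Group A and n_b from Group B."""
    xs, ys = [], []
    for group, n in (("A", n_a), ("B", n_b)):
        for label in (0, 1):
            xs.append(sample(group, label, n // 2))
            ys.append(np.full(n // 2, label))
    return np.concatenate(xs), np.concatenate(ys)

# Skewed training set: 950 samples from Group A, only 50 from Group B.
x_train, y_train = make_set(950, 50)

# "Training": a nearest-centroid rule, i.e. a threshold at the midpoint
# of the two class means, which ends up calibrated almost entirely to Group A.
threshold = (x_train[y_train == 0].mean() + x_train[y_train == 1].mean()) / 2

# Balanced per-group test sets expose the disparity the skewed data baked in.
for group in ("A", "B"):
    x_parts, y_parts = [], []
    for label in (0, 1):
        x_parts.append(sample(group, label, 5000))
        y_parts.append(np.full(5000, label))
    x_test, y_test = np.concatenate(x_parts), np.concatenate(y_parts)
    preds = (x_test > threshold).astype(int)
    print(f"Group {group} error rate: {(preds != y_test).mean():.1%}")
# Typical output: roughly 16% error for Group A versus roughly 34% for Group B;
# same model, strikingly different accuracy.
```

Nothing in the model is explicitly told to treat the groups differently; the imbalance in the data alone produces the unequal outcomes, which is precisely the dynamic Buolamwini documents in commercial facial recognition systems.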

The manner in which opinions are dissected is equally interesting, as in the industry-wide substitution of “popular” for “good” (see Facebook’s “Like” button).

The fact that these obviously flawed technological constructs are at the heart of shape-shifting our very lives while remaining completely free from public and governmental scrutiny is problematic, to say the least. When you add that the hands on the wheel belong to non-altruistic and voracious appetites, the “Big Brother” issues multiply.

Once a representative segment of a population becomes “hip” to the surveillance quandary, however, preventive action often follows. Last spring, for instance, San Francisco banned the use of facial recognition software by police and other city agencies.

There are those on both sides of the privacy equation who believe it’s psychologically unhealthy when people know they’re being watched in every aspect of their public and private lives. That this opinion registers as a civil rights concern on only one side is the real problem. Go figure!

In China, it’s a given. Everyone is watched all the time. It’s just totally transparent. There is no choice but to “buy in.” The creepiness inherent in our system, both legislative and corporate, has long been “top-down.”

There is also room for an expanded infusion of the old privacy-versus-national-security debate, although that isn’t actually the question being illuminated by Kantayya’s brilliant film. How about, as a good start, using legislative protections to make facial recognition neutral? Remove the biases. Do not miss this film. “Coded Bias” should be mandatory viewing.