As of early October, more than 84.2 million absentee ballots had been requested or sent to U.S. voters in 47 states and the District of Columbia ahead of the U.S. general election. According to some estimates, the swing state of Florida has already doubled California's total of 1 million, with nearly 2 million voters casting mail-in ballots in the weeks leading up to November 3.
Delays in verifying mail-in ballots will slow the election tally, with tasks like processing ballots — verifying voters and separating that information from their ballot — anticipated to take longer than in previous years. Existing technology could expedite some processes, like software that matches signatures on ballot envelopes to voter records. (Thirty-three states require that voters’ signatures undergo validation.) But many question whether the algorithms underpinning this software might be biased against certain groups of voters.
How signature verification works
The category of algorithms used to verify signatures on ballots is known as “offline” signature verification because it relies on images of signatures when real-time information (like the downward pressure of a pen) isn’t available. Offline signature verification algorithms are trained on datasets that attempt to capture two feature types: global features that describe the signatures as a whole and local features that describe individual parts of the signatures (like symmetry and stroke directions).
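To make the global/local distinction concrete, here is a minimal sketch of that two-level feature approach. All function names, the toy 2x2 local grid, and the distance threshold are illustrative assumptions, not any vendor's actual pipeline; a real system would extract stroke directions, turning points, and intersections from scanned images.

```python
# Illustrative sketch of offline signature verification on a binary ink grid.
# Feature choices and the threshold are assumptions for demonstration only.

def global_features(img):
    """Whole-signature features: ink density and width/height ratio."""
    rows = [r for r in range(len(img)) if any(img[r])]
    cols = [c for c in range(len(img[0])) if any(row[c] for row in img)]
    height = rows[-1] - rows[0] + 1
    width = cols[-1] - cols[0] + 1
    ink = sum(sum(row) for row in img)
    return [ink / (width * height), width / height]

def local_features(img):
    """Per-region features: a coarse 2x2 grid of ink counts, standing in
    for the stroke-direction and intersection descriptors real systems use."""
    h, w = len(img), len(img[0])
    feats = []
    for r0 in (0, h // 2):
        for c0 in (0, w // 2):
            block = [row[c0:c0 + w // 2] for row in img[r0:r0 + h // 2]]
            feats.append(sum(sum(row) for row in block))
    return feats

def match_score(img_a, img_b):
    """Distance between combined feature vectors; lower means more similar."""
    fa = global_features(img_a) + local_features(img_a)
    fb = global_features(img_b) + local_features(img_b)
    return sum(abs(a - b) for a, b in zip(fa, fb))

def verify(reference, candidate, threshold=2.0):
    """Accept the candidate if its features are close enough to the reference."""
    return match_score(reference, candidate) <= threshold
```

Training on a dataset would amount to tuning which features to extract and where to set the threshold, which is exactly where a skewed training corpus can skew outcomes.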
Several studies on automatic signature verification have been published, most recently by the Central Police University’s Department of Forensic Science in Taiwan. The study found that an algorithm trained on an open source dataset from the International Conference on Document Analysis and Recognition attained accuracy between 94.37% and 99.96%. A more comprehensive paper published in the EURASIP Journal on Advances in Signal Processing concluded the accuracy of matching algorithms varied depending on the data used. Identification rates ranged from 74.3% for an algorithm trained on samples from 1,000 writers to 96.7% for an algorithm trained on a 657-writer dataset.
Portia Allen-Kyle leads the American Civil Liberties Union (ACLU) of Alabama's non-litigation advocacy. She notes that automated signature-matching software is often trained on single-language (i.e., English) handwriting to refine the algorithm toward the best matches. Certain voters, such as those with mental or physical disabilities, stress-related ailments, or who don't write in English, are at higher risk of having their ballots rejected. Even voters with short or hyphenated names are at a disadvantage, since mistakes are more common on signatures with fewer "turning points and intersections."
More than 750,000 absentee ballots went uncounted in the 2016 and 2018 elections because of signature discrepancies, according to NBC. And a recent ACLU survey found that in 2018, Florida voters of color made up less than 28% of those voting absentee but 47% of all rejected ballots, with out-of-state voters and military dependents also experiencing disproportionately higher rejection rates.
Benchmarks of deployed signature verification software remain hard to come by, but a 2020 study published by Stanford University's Law and Policy Lab found that automated signature-matching systems in California increased the rejection rate by 1.7 percentage points (a 74% relative increase) in counties that lacked human review. Allen-Kyle and Surveillance Technology Oversight Project cofounder Liz O'Sullivan point out that many voters now register at a motor vehicle agency, where their signature is digitized using a signature pad. These signatures look distinct from those handwritten on paper because people move their hands differently and because the pads have low resolution.
“Even from a nontechnical standpoint, signature verification powered by AI or any form of automation is more likely to flag folks who have undergone a name change. This means that married women, trans people, or domestic abuse survivors will all be disproportionately likely to have their vote cast out,” O’Sullivan told VentureBeat via email.
Reuters reports that at least 29 counties across eight states use AI on mail-in ballots to ease the workload of staff enforcing signature rules. Most sourced the software from Parascript, a Colorado developer of document capture and recognition solutions.
To account for unpredictability in things like signature spaces on ballot envelope designs and scanning equipment, Parascript says its software allows election officials to set their own minimum scores for approving signatures. The performance variability is evident in Colorado, where Parascript’s software approves 40% of signatures in Douglas County, 20% in Denver County, and 50% in Larimer, according to Reuters. The approval rate for Adams County reportedly jumped when it boxed the signature space on envelopes, generating more readable images, while Larimer’s percentage fell as more signature matches came from fuzzy motor vehicle records.
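The county-configurable cutoff described above can be sketched as a simple routing rule. The threshold values and county names below are invented for illustration; they are not Parascript's real settings, and Parascript's actual scoring interface is not public.

```python
# Hypothetical sketch of county-tunable auto-approval, mirroring how election
# officials set their own minimum scores. All threshold values are invented.

COUNTY_THRESHOLDS = {
    "Douglas": 0.60,   # lower cutoff -> more signatures auto-approved
    "Denver": 0.80,    # stricter cutoff -> more sent to human review
    "Larimer": 0.55,
}

def route_ballot(county, confidence):
    """Auto-approve at or above the county's cutoff; otherwise route the
    ballot envelope to a human election judge for manual comparison."""
    cutoff = COUNTY_THRESHOLDS.get(county, 0.90)  # strict default for unknowns
    return "approved" if confidence >= cutoff else "human review"
```

The same matcher score can thus yield different outcomes in different counties, which is one reason approval rates vary so widely, alongside upstream factors like envelope design and the quality of reference signatures.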
Some states offer recourse when automated verification triggers a rejection. In Sarasota County, Florida, officials send a letter to voters whose ballots were challenged and attempt to alert them by text or call if the county has their phone number. Beyond Florida, 17 states require that voters be notified when there’s a missing signature or discrepancy and given an opportunity to correct it — though the protocols vary. A study published by University of Florida researchers found that smaller counties often simply mail notices, which may not be received before the voting deadline.
A lack of transparency exacerbates the challenges inherent in automatic signature verification. The U.S. Election Assistance Commission, which serves as a national clearinghouse and resource of information regarding election administration, says software should be set only to accept nearly perfect signature matches and that humans should double-check a sample. But the Commission doesn’t lay out acceptable error rates or sample sizes, and vendors of automated signature verification, like Parascript, aren’t required to publish their error rates.
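The Commission's guidance, accept only near-perfect matches and have humans double-check a sample, could look something like the sketch below. Because the Commission specifies no error rates or sample sizes, the 0.98 cutoff and 2% audit rate here are pure assumptions.

```python
# Hypothetical sketch of the Commission's guidance: auto-accept only
# near-perfect matches and send a random sample of acceptances to a human.
# The accept_cutoff and sample_rate values are assumptions; no official
# thresholds or sample sizes have been published.
import random

def process(scores, accept_cutoff=0.98, sample_rate=0.02, rng=None):
    rng = rng or random.Random(0)  # seeded for reproducibility in this demo
    accepted, flagged, audit = [], [], []
    for i, score in enumerate(scores):
        if score >= accept_cutoff:
            accepted.append(i)
            if rng.random() < sample_rate:
                audit.append(i)  # human double-checks this accepted ballot
        else:
            flagged.append(i)  # below cutoff: full human review
    return accepted, flagged, audit
```

Without published error rates, officials tuning these two numbers have no way to know what rejection disparities a given configuration produces, which is the transparency gap critics highlight.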
Advocacy groups continue to mount legal challenges over state signature verification processes. Ruling on one of these lawsuits, the Pennsylvania Supreme Court determined last Friday that mail-in ballots can’t be rejected if a voter’s signature looks different from the one on their registration form.
“If the software uses image recognition, it is likely to be some kind of neural network,” O’Sullivan said. “These are subject to all the usual biases — anything that isn’t sufficiently represented in training data will perform worse. Think immigrant names, especially those with non-English characters, including accent markings,” O’Sullivan told VentureBeat. “But these algorithms aren’t available for public use. How could we test them? How can we trust their claims? This is why there must be public availability of tools used in public service and independent review bodies to validate these tests.”