(As modified for OpEdNews)
http://www.opednews.com/articles/opedne_rady_ana_070117_annotated_bibliograp.htm

Annotated Bibliography of 15 Expert Reports

on Voting Systems and Fair Vote Counts



Electronic Voting and Fair Vote Counts (retitled by editor)
Summarized by Rady Ananda
On behalf of J30 Coalition
Columbus, Ohio
January 17, 2007



In response to the numerous failings of electronic voting
systems summarized below, the majority of these experts
offer electronic audit solutions, enhanced security protocols,
greater enforcement of existing laws, and proposals for new
laws, election procedures, and backup systems, all at
exorbitant additional cost to taxpayers.

The well-financed and most visible portion of the election
integrity movement agrees with these solutions.

None of these solutions, however, meets the Fair Vote Count
standard enumerated by international authority (the OSCE, below,
to which the US is a signatory; see page 9).  Humans cannot
observe the vote count when it is conducted inside a machine,
be it touch screen, optical scan, mechanical lever, or any other
machine tabulator.  No amount of audits, security protocols, or
paper trails will change the fact that machines count the vote
secretly.

Key policy makers, on the other hand, see no urgency in
reconsidering electronic voting systems. Warren Stewart of
www.VoteTrustUSA.org recently advised,

“The incoming chair of the Committee on House
Administration (which crafted HAVA), Juanita Millender-
McDonald (D-CA), has let it be known … that it will not mandate
any voting technology changes until 2011.”   

This position ignores the science.  

As to the best next step, a vocal portion of non-experts
envisions an entirely different solution.  We rely
on expert conclusions about what doesn’t work, and expert
descriptions of what constitutes a democratic election: hand-
counted paper ballots, at the precinct, before all who wish to
observe.

Emphasis in the annotations below appeared in the original
document.  






REPORTS REVIEWED


Brennan Center, The Machinery of Democracy: Protecting
Elections in an Electronic World,
2006
http://www.brennancenter.org/programs/downloads/Full%20Report.pdf

Compuware Corp. DRE Technical Security Assessment
Report for Ohio, November 2003.  

Congressional Research Service, Election Reform and
Electronic Voting Systems (DREs): Analysis of Security
Issues. (Order Code RL32139) November 4, 2003.  
http://theory.lcs.mit.edu/~rivest/voting/reports/Fischer-ElectionReformAndElectronicVotingSystemsDREs.pdf

Cuyahoga Election Review Panel, Final Report, July 20, 2006
www.cuyahogavoting.org/CERP_Final_Report_20060720.pdf  
Reviewed by Kim Zetter of www.wired.com

Election Science Institute, 2006, “DRE Analysis for May 2006
Primary Cuyahoga County, Ohio”
http://www.cuyahogacounty.us/bocc/GSC/pdf/esi_cuyahoga_final.pdf or see
http://www.electionscience.org (click on the Cuyahoga County Report tab).

Government Accountability Office, 2005, Elections: Federal
Efforts to Improve Security and Reliability of Electronic Voting
Systems Are Under Way, but Key Activities Need to Be
Completed
http://www.gao.gov/new.items/d05956.pdf  

Harry Hursti, Black Box Report Security Alert: July 4, 2005
Critical Security Issues with Diebold Optical Scan Design
(1.94w), 2005,
http://www.blackboxvoting.org/BBVtsxstudy.pdf  

Inter-Parliamentary Union, Free & Fair Elections, 2006
http://www.ipu.org/PDF/publications/Free&Fair06-e.pdf

Rebecca Mercuri, Ph.D., who has studied electronic vote
tabulation since 1989, Affidavit filed in Squire v. Geer, Franklin
County (Ohio) Court of Appeals, 06APD-12-1285.

OSCE – Office of Democratic Institutions and Human Rights,
2005, Election Observation Manual, 17 Criteria for a Fair Vote
Count (p. 62)
http://www.osce.org/publications/odihr/2005/04/14004_240_en.pdf

Princeton Study: Feldman, Ariel J., J.A. Halderman, and E.W.
Felten, “Security Analysis of the Diebold AccuVote-TS Voting
Machine,” Center for Information Technology Policy and Dept.
of Computer Science, Woodrow Wilson School of Public and
International Affairs, Princeton University, 2006.
http://itpolicy.princeton.edu/voting

RABA Technologies LLC. Trusted Agent Report: Diebold
AccuVote-TS Voting System (report prepared for Department
of Legislative Services, Maryland General Assembly,
Annapolis, Md., January 2004).
http://www.raba.com/press/TA_Report_AccuVote.pdf

Aviel Rubin, News article: “On My Mind: Pull The Plug,”
Forbes Magazine, 8/2006
http://www.forbes.com/forbes/2006/0904/040.html?partner=alerts&_requestid=2972

U.S. Commission on Federal Election Reform, 2006.  News
article: “Reversing Course on Electronic Voting: Some Former
Backers of Technology Seek Return to Paper Ballots, Citing
Glitches, Fraud Fears,” Wall Street Journal, May 12, 2006.  
http://online.wsj.com/public/article_print/SB114739688261250925-q5rh2ocioxu6mgjmS6bZPCZL0HY_20060610.html

David Wagner, Ph.D., Computer Science Division, University of
California, Berkeley. Written Testimony before the Committee
on Science and the Committee on House Administration, U.S.
House of Representatives, July 19, 2006.


ANNOTATIONS



BRENNAN CENTER, The Machinery of Democracy: Protecting
Elections in an Electronic World,
2006
http://www.brennancenter.org/programs/downloads/Full%20Report.pdf

Studied 3 voting systems by type: DRE, DRE w/VVPAT, and
Optical Scan. Brennan identified 120 vulnerability points.  

Report is limited to identifying the least difficult way to alter
results on a statewide basis. It is also limited to studying
attacks that cannot be prevented by physical security and
accounting measures taken by election officials.  The analysis
further assumed that certain fundamental physical security
and accounting procedures were already in place.

Concluded that it would take only one person, with
sophisticated technical knowledge and timely access to the
software that runs the voting machines, to change the
outcome.

All three voting systems have significant security and
reliability vulnerabilities, which pose a real danger to the
integrity of national, state, and local elections.

The most troubling vulnerabilities of each system can be
substantially remedied if proper countermeasures are
implemented at the state and local level.

Few jurisdictions have implemented any of the key
countermeasures that could make the least difficult attacks
against voting systems much more difficult to execute
successfully.

For all three types of voting systems:

1. When the goal is to change the outcome of a close
statewide election, attacks that involve the insertion of
Software Attack Programs or other corrupt software are the
least difficult attacks.

2. Voting machines that have wireless components are
significantly more vulnerable
to a wide array of attacks.

DREs without voter-verified paper trails do not have available
to them a powerful countermeasure to software attacks: post-
election Automatic Routine Audits that compare paper records
to electronic records.

For DREs w/VVPT and PCOS:

1. The voter-verified paper record, by itself, is of questionable
security value. The paper record has significant value only if
an Automatic Routine Audit is performed (and a well-designed
chain of custody and physical security procedures is
followed).

2. Even if jurisdictions routinely conduct audits of voter-
verified paper records, DREs w/VVPT and PCOS are
vulnerable to certain software attacks or errors.
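
The Automatic Routine Audit the Brennan Center recommends is, at
bottom, a per-precinct comparison of two independently produced
tallies. A minimal sketch of that comparison, with hypothetical
precinct names and vote totals, might look like this:

```python
# Minimal sketch of an Automatic Routine Audit: compare hand-counted
# paper-record tallies against electronic tallies, precinct by precinct.
# All precinct names and totals are hypothetical.

paper_tallies = {
    "Precinct 1A": {"Smith": 412, "Jones": 388},
    "Precinct 1B": {"Smith": 290, "Jones": 305},
}

electronic_tallies = {
    "Precinct 1A": {"Smith": 412, "Jones": 388},
    "Precinct 1B": {"Smith": 310, "Jones": 285},  # disagrees with paper
}

def audit(paper, electronic):
    """Return the precincts where the two record sets disagree."""
    return [precinct for precinct, counts in paper.items()
            if electronic.get(precinct) != counts]

for precinct in audit(paper_tallies, electronic_tallies):
    print(f"{precinct}: paper and electronic records do not match")
```

As the report stresses, a check of this kind detects tampering
only if the paper records being compared are themselves intact,
which is why it ties the audit's value to chain-of-custody and
physical-security procedures.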





COMPUWARE CORP. DRE Technical Security Assessment
Report for Ohio, November 2003.  Confidential report prepared
for Ohio Secretary of State Ken Blackwell, and later published
on the web.  High risks include:

With access to the supervisor card, someone could guess the
four-digit PIN. The four-digit PIN is a factory default from
Diebold and cannot be changed. In our test it was guessed in
less than two minutes.
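
That finding is unsurprising given the size of the search space:
a four-digit PIN allows only 10,000 combinations. A minimal
sketch of exhaustive guessing (the target PIN and the per-attempt
delay below are assumptions for illustration, not values from the
report):

```python
import itertools

# A four-digit PIN has only 10**4 = 10,000 possible values, so exhaustive
# guessing is trivial. The target PIN and per-attempt delay are hypothetical.
TARGET_PIN = "1234"          # assumed value for illustration only
SECONDS_PER_ATTEMPT = 0.01   # assumed time for the device to reject a guess

attempts = 0
for digits in itertools.product("0123456789", repeat=4):
    attempts += 1
    if "".join(digits) == TARGET_PIN:
        break

print(f"PIN found after {attempts} attempts")
print(f"Worst case: {10_000 * SECONDS_PER_ATTEMPT / 60:.1f} minutes "
      f"at {SECONDS_PER_ATTEMPT} s per attempt")
```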

Smart Card Writer - with access to the small handheld writer,
someone could use a voting card more than once while at the
voting booth.

Diebold’s voting system uses MS Access as the database to
store the ballot definition, audit logs, and tally results. The
database has no password protection. The audit logs and the
tally results can be changed.



CONGRESSIONAL RESEARCH SERVICE, Election Reform
and Electronic Voting Systems (DREs): Analysis of Security
Issues. (Order Code RL32139) November 4, 2003.  
http://theory.lcs.mit.edu/~rivest/voting/reports/Fischer-ElectionReformAndElectronicVotingSystemsDREs.pdf
This is a comprehensive report on several expert studies of
electronic voting systems.  Problems noted include:

There appears to be an emerging consensus that in general,
current DREs do not adhere sufficiently to currently accepted
security principles for computer systems, especially given the
central importance of voting systems to the functioning of
democratic government.

The ballot itself consists of redundant electronic records in the
machine’s computer memory banks, which the voter cannot
see. This is analogous to the situation with mechanical lever
voting machines, where casting the ballot moves counters that
are out of view of the voter. In a lever machine, if the
appropriate counters do not move correctly when a voter
casts the ballot, the voter will not know, nor would an
observer. Similarly, with a DRE, if the machine recorded a
result in its memory that was different from what the voter
chose, neither the voter nor an observer would know.

The same is true with a computerized counting system when it
reads punch cards or optical scan ballots. Even if the ballot is
tabulated in the precinct and fed into the reading device in the
presence of the voter, neither the voter nor the pollworker
manning the reader can see what it is recording in its memory.

Malicious computer code, or malware, can often be written in
such a way that it is very difficult to detect.

DRE software is moderately complex, and it is generally
accepted that the more complex a piece of software is, the
more difficult it can be to detect unauthorized modifications.

Most manufacturers of DREs treat their software code as
proprietary information and therefore not available for public
scrutiny. Consequently, it is not possible for experts not
associated with the companies to determine how vulnerable
the code is to tampering.



Scientists at the California Institute of Technology and the
Massachusetts Institute of Technology performed the most
extensive examination of security. The Caltech/MIT report
identified four main security strengths of the electoral process
that has evolved in the United States:

•        the openness of the election process, which permits
observation of counting and other aspects of election
procedure;
•        the decentralization of elections and the division of labor
among different levels of government and different groups of
people;
•        equipment that produces “redundant trusted recordings”
of votes; and
•        the public nature and control of the election process.

The report expressed concern that current trends in electronic
voting are weakening those strengths and pose significant
risks.


CUYAHOGA ELECTION REVIEW PANEL, July 20, 2006 Final
Report
www.cuyahogavoting.org/CERP_Final_Report_20060720.pdf
Kim Zetter of wired.com summarized the report as follows:
•        Due to poor chain of custody for supplies and equipment,
812 voter-access cards (which voters place in touch-screen
machines to cast their ballot) were lost, along with 215 card
encoders, which program the voter-access cards. Three
hundred thirteen keys to the voting machines' memory-card
compartments, where votes are stored, also went missing.
•        Officials set up two user accounts on the computer
running vote-tabulation software, then assigned one
password to both accounts and allowed multiple people to
use them, thwarting any effort to identify individuals who
might access and alter the system.
•        Sixty Board of Election employees took touch-screen
machines home a weekend before the election to test a
procedure for transmitting data on election night.
•        The election board hired 69 taxis to transport observers
to precincts to collect memory cards and paper rolls on
election night. But many cab drivers ended up gathering the
materials themselves, and about half the cabs returned to the
warehouse with election data, but no observer.
In at least 79 precincts, the number of voters who signed the
poll books did not match the number of ballots cast. At least
eight precincts had more ballots cast than registered voters.
Because some polling places served several precincts, some
of the discrepancies are explained by voters being directed to
the wrong machines, an error that did not result in uncounted
votes. But even when investigators tallied ballots and
signatures for all precincts in a polling place, 15 locations still
had mismatches. In one case, investigators found 342 more
votes than ballots.
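
The reconciliation the panel performed (comparing poll-book
signatures to ballots cast by precinct, then re-checking at the
polling-place level where several precincts share machines) can
be expressed as a simple pair of checks. A sketch, with
hypothetical locations and numbers:

```python
# Sketch of the panel's reconciliation: compare poll-book signatures to
# ballots cast per precinct, then aggregate by polling place to see whether
# voters sent to the wrong precinct's machines explain the mismatch.
# All names and numbers are hypothetical.

rows = [
    # (polling place, precinct, signatures, ballots cast)
    ("Lincoln School", "3-A", 400, 380),
    ("Lincoln School", "3-B", 350, 370),   # offsets the 3-A mismatch
    ("Fire Station 7", "5-C", 210, 342),   # unexplained discrepancy
]

def precinct_mismatches(rows):
    return [p for _, p, sigs, ballots in rows if sigs != ballots]

def polling_place_mismatches(rows):
    totals = {}
    for place, _, sigs, ballots in rows:
        s, b = totals.get(place, (0, 0))
        totals[place] = (s + sigs, b + ballots)
    return [place for place, (s, b) in totals.items() if s != b]

print("Precinct-level mismatches:     ", precinct_mismatches(rows))
print("Polling-place-level mismatches:", polling_place_mismatches(rows))
```

In this hypothetical data, the Lincoln School mismatches cancel
when the two precincts are combined, mirroring the panel's
finding that some discrepancies were explained by voters being
directed to the wrong machines, while others were not.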


ELECTION SCIENCE INSTITUTE, 2006, “DRE Analysis for May
2006 Primary Cuyahoga County, Ohio”
http://www.electionscience.org (click on the Cuyahoga County
Report tab) or see
http://www.cuyahogacounty.us/bocc/GSC/pdf/esi_cuyahoga_final.pdf

The current election system contains significant threats to
inventory control of mission critical election assets, error-free
vote tabulation, and tabulation transparency.

The machines’ four sources of vote totals – VVPAT individual
ballots, VVPAT summary, election archive, and memory cards
– did not agree with one another.

Due to limits in the data, software computational abnormality
contributing to the count inaccuracies cannot be ruled out.
Computational abnormality could be the result of a failure to
adequately test the voting equipment before the election or to
manage the various databases appropriately.

A lack of inventory controls and gaps in the chain of custody
of mission critical assets, such as DRE memory cards, DRE
units, and VVPAT cartridges, resulted in a significant amount
of missing data. Because of the missing data, ESI is unable to
give a definitive opinion of the accuracy of the Diebold TSX
system.

In multi-precinct polling places, voters could vote on machines
located in other precincts. Accordingly, ballots from a number
of precincts appeared on the same VVPAT tape. VVPAT
ballots, however, lack a header identifying the precinct.
Without this information, it is not possible to conduct a
precinct-level tally of the VVPAT ballots.

Consider that each machine has a printer and potentially
multiple rolls of paper. Paper records of votes (the official
records) may be lost without voters’ awareness because of
paper jams, paper not being loaded properly, ink issues, and
other problems.  

Lack of a standardized proven manual count process is likely
to result in recount error and inefficiency.

ESI founder Steve Hertzberg spoke with wired.com’s Kim
Zetter in October, 2006.
http://www.wired.com/news/technology/0,71999-0.html?tw=wn_politics_evote_5
Zetter writes:

Out of 467 touch-screen machines assigned to 145 precincts
that ESI audited, officials could not locate 29 machines after
the election, despite days of searching. And 24 machines that
were found had no data on them. "All their paperwork says
(the machines) were deployed to polling locations but we can't
figure out why there's no election data on them," says ESI
founder Steve Hertzberg.  Cuyahoga County Board of
Elections Director Michael Vu provided no explanation for the
missing machines.


GOVERNMENT ACCOUNTABILITY OFFICE, 2005, Elections:
Federal Efforts to Improve Security and Reliability of Electronic
Voting Systems Are Under Way, but Key Activities Need to Be
Completed
http://www.gao.gov/new.items/d05956.pdf  Voting
system vulnerabilities and problems found include:

•        Cast ballots, ballot definition files, and audit logs could be
modified;
•        Supervisor functions were protected with weak or easily
guessed passwords;
•        Systems had easily picked locks and power switches that
were exposed and unprotected;
•        Local jurisdictions misconfigured their electronic voting
systems, leading to election day problems;
•        Voting systems experienced operational failures during
elections;
•        Vendors installed uncertified software;
•        Some electronic voting systems did not encrypt cast
ballots or system audit logs, and it was possible to alter both
without being detected;
•        It was possible to alter the files that define how a ballot
looks and works so that the votes for one candidate could be
recorded for a different candidate.
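
The last finding above describes the mechanism of a
ballot-definition attack: the machine credits whatever candidate
the definition file maps a ballot position to. A minimal sketch
of that mechanism, with hypothetical candidates and an invented
definition format:

```python
# Sketch of a ballot-definition attack: a vote is recorded for whichever
# candidate the definition file associates with the position the voter
# selected. Candidates and the definition layout are hypothetical.

honest_definition  = {1: "Candidate A", 2: "Candidate B"}
altered_definition = {1: "Candidate B", 2: "Candidate A"}  # mapping swapped

def record_vote(definition, position_selected, totals):
    """Credit the candidate the definition maps to the selected position."""
    candidate = definition[position_selected]
    totals[candidate] = totals.get(candidate, 0) + 1

totals = {}
# Three voters all select position 1, intending to vote for Candidate A,
# but the altered definition credits Candidate B instead.
for _ in range(3):
    record_vote(altered_definition, 1, totals)

print(totals)  # {'Candidate B': 3}
```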





HARRY HURSTI, BLACK BOX REPORT Security Alert: July 4,
2005 Critical Security Issues with Diebold Optical Scan Design
(1.94w), 2005,
http://www.blackboxvoting.org/BBVtsxstudy.pdf
Some of the key findings include:

With this design, the functionality – the critical element to be
certified during the certification process -- can be modified
every time an election is prepared. Functionality is
downloaded separately into each and every machine, via
memory card, for every election. With this design, there is no
way to verify that the certified or even standard functionality is
maintained from one voting machine to the next.

1. Paper trail falsification – Ability to modify the election
results reports so that they do not match the actual vote data
1.1) Production of false optical scan reports to facilitate
checks and balances (matching the optical scan report to the
central tabulator report), in order to conceal attacks like
redistribution of the votes or Trojan horse scripts such as
those designed by Dr. Herbert Thompson. (19)

2. Removal of information about pre-loaded votes
2.1) Ability to hide pre-loaded votes
2.2) Ability to hide a pre-arranged integer overflow

The exploits demonstrated in the false optical scan machine
reports (“poll tapes”) shown on page 16 do not change the
votes, only the report of the votes. When combined with the
Trojan horse attack demonstrated by Dr. Thompson, this
attack vector maintains an illusion of integrity by producing
false reports to match the contaminated central tabulator
report. The exploit demonstrated in the poll tape example (a
true report containing false votes) pre-stuffs the ballot box
in such a way as to produce an integer overflow.

In this exploit, a small number of votes is loaded for one
candidate, offset by a large number of votes for the opposing
candidate such that the sum of the numbers, because of the
overflow, will be zero. The large number is designed to trigger
an integer overflow such that after a certain number of votes is
received it will flip the vote counter over to begin counting
from zero for that candidate.
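
The arithmetic behind the exploit is ordinary modular wraparound.
A sketch that simulates a 16-bit vote counter (the preload sizes
and candidate names are hypothetical; the modulus below only
stands in for the fixed-width counters a real machine stores in
memory):

```python
# Simulate the pre-loaded integer-overflow exploit described above with a
# 16-bit counter that wraps modulo 2**16. Preload sizes and candidate
# names are hypothetical.

MOD = 2 ** 16  # a 16-bit counter rolls over to zero at 65,536

counters = {
    "Candidate A": 50,         # small positive preload
    "Candidate B": MOD - 50,   # large preload chosen so the two sum to zero
}
# A pre-election zero-total check passes: (50 + 65486) % 65536 == 0
assert sum(counters.values()) % MOD == 0

def cast_vote(candidate):
    counters[candidate] = (counters[candidate] + 1) % MOD

# 100 real voters choose each candidate.
for _ in range(100):
    cast_vote("Candidate A")
    cast_vote("Candidate B")

print(counters)
# Candidate A: 150 (100 real votes plus the 50-vote preload)
# Candidate B: 50  (the counter wrapped past 65,535 and restarted at zero)
```

Note that the reported total (200) still equals the number of
ballots cast, so a simple ballots-versus-votes reconciliation
would not catch the 50-vote shift from one candidate to the
other.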


INTER-PARLIAMENTARY UNION, Free & Fair Elections, 2006.
Page 157 presents a summary of the theory behind an observable
vote count, and describes the benefits of a parallel election.
http://www.ipu.org/PDF/publications/Free&Fair06-e.pdf

Finally, there is the count and, in appropriate cases, the
transfer of power to the successful party in the election.
Complementary to the principle of secret ballot is the integrity
of the count, which looks both to ensure that the expressed
wish of the elector is taken into account, and that the result
declared corresponds with the totality of the votes cast.

Sometimes, the ballots will be counted on the spot, and at
others, the ballot boxes are transported to central or regional
counting stations. In either case, transparency of process is as
valuable as accuracy in counting.

Transportation of ballot boxes commonly gives rise to fear of
substitution…  Confidence in the process can be enhanced by
the presence of party representatives both at the count and
during any interim period of transport.


As to citizen-run parallel elections, the Inter-Parliamentary
Union explains:

Parallel vote tabulation has also proven its value as a means
of independently verifying the results reported by electoral
authorities. In this process, monitors record results obtained
from selected polling sites and compare them with the official
results. The monitoring of vote counts as part of an overall
election-observation effort can:

•        Boost the confidence of voters suspicious of
possible fraud;
•        Permit results to be projected more quickly than the
official results;
•        Allow for the identification of actual winners; and
•        Allow for the consequent exposure of any attempted
manipulations.


REBECCA MERCURI, Ph.D., Affidavit attached as Exhibit A to
Squire v. Geer Complaint, Franklin County (Ohio) Court of
Appeals, 06APD-12-1285.
Dr. Rebecca Mercuri has been studying electronic vote
tabulation since 1989, and has published over 40 scientific
papers on electronic voting technology. She observed the
partial recount of Franklin County, Ohio’s November 7, 2006
election.  She also oversaw the Signature Audit of 25% of
Franklin County’s records.  Her report found systemic
problems, concluding “there cannot be full confidence in the
results of these (35) problematic precincts.”  
She describes Franklin County’s recount process as
constituting “a breach of procedure that thwarts any
meaningfully appropriate and independent recount of the
election from the RTALs” (real time audit logs that serve as the
ballot of record in Ohio.)
“The recount methodology used by Franklin County did not
conform, and in fact significantly varied from the method
prescribed by Ohio Secretary of State’s Directive No. 2006.50
in many respects.”
Dr. Mercuri concludes:
“In summary, there are numerous reasons why there cannot
be confidence in the election process, the recount, and the
vote totals for the Franklin County, Ohio November 7, 2006
election.  These reasons include:
a)        the denial of an appropriate recount from the
VVPAT/RTAL materials for the requested precincts;
b)        significant evidence that parts of original RTALs and
end tally reports were missing;
c)        evidence the voting system was inappropriately
configured and improperly used during the election;
d)        indication that election procedures were violated,
including the possibility of password overrides during setup,
and use of the machines to cast ballots after RTAL paper
supplies had run out;
e)        evidence of inappropriate impounding and handling of
election materials at the County warehouse following the
election, including improper exposure of the VVPAT/RTALs;
f)        unexplained disparities between the public counters of
ballots cast and the number of voters who signed the poll
books in many precincts; and
g)        misleading information provided to voters, and not
properly followed up by the County, regarding the safety and
examination of the voting machines and system.”


OSCE (Organization for Security and Cooperation in Europe):
Office of Democratic Institutions and Human Rights, 2005,
Election Observation Manual,
http://www.osce.org/publications/odihr/2005/04/14004_240_en.pdf
The Seventeen Criteria for a Fair Vote Count (p. 62) preclude
machine tabulation:
1. Is the count performed by polling-station officials, or are
other persons involved?

2. Do election officials appear to understand and adhere to the
required procedures?

3. Are ballots counted in an orderly and secure manner?

4. Is the count conducted in a transparent environment, with
adequate arrangements for domestic observers?

5. Does the number of registered voters recorded as having
voted correspond with the number of ballots cast?

6. Are unused ballots secured, cancelled, or destroyed after
being counted?  

7. Are invalid ballots properly identified in a uniform manner?
Are invalid ballots appropriately segregated and preserved for
review?

8. Do the ballots contain any unusual markings intended to
violate the secrecy of the vote?

9. Does the number of invalid ballots seem inordinately high?  

10. Does the counting adhere to the principle that the ballot is
deemed valid if the will of the voter is clear?

11. Are ballots for each party or candidate separated correctly
and counted individually?

12. Are any disputes or complaints resolved in a satisfactory
manner?

13. Are official counting records correctly completed at the end
of the count and signed by all authorized persons?  

14. Are domestic observers and poll watchers from political
parties able to obtain official copies of the protocol for the
polling station?  

15. Are the results publicly posted at the polling station?

16. Are there inappropriate activities by police and/or security
forces, such as taking notes and reporting figures or results
by telephone?  

17. Did polling-station officials agree on the vote-count
procedures and results, and, if not, what action was taken in
case of disagreement?  


PRINCETON STUDY: Feldman, Ariel J., J.A. Halderman, and E.
W. Felten, “Security Analysis of the Diebold AccuVote-TS
Voting Machine,” Center for Information Technology Policy
and Dept. of Computer Science, Woodrow Wilson School of
Public and International Affairs, Princeton University, 2006.
http://itpolicy.princeton.edu/voting  

The Diebold AccuVote-TS and its newer relative the AccuVote-
TSx are together the most widely deployed electronic voting
platform in the United States [8]. In the November 2006 general
election, these machines are scheduled to be used in 357
counties representing nearly 10% of registered voters (~ 15
million).

All of Maryland and Georgia will employ the AccuVote-TS
model. More than 33,000 of the TS machines are in service
nationwide.

The machine is vulnerable to a number of extremely serious
attacks that undermine the
accuracy and credibility of the vote counts it produces.

Malicious software running on a single voting machine can
steal votes with little if any risk of detection.  The malicious
software can modify all of the records, audit logs, and
counters kept by the voting machine, so that even careful
forensic examination of these records will find nothing amiss.
We have constructed demonstration software that carries out
this vote-stealing attack.

Anyone who has physical access to a voting machine, or to a
memory card that will later be inserted into a machine, can
install said malicious software using a simple method that
takes as little as one minute. In practice, poll workers and
others often have unsupervised access to the machines.

AccuVote-TS machines are susceptible to voting-machine
viruses—computer viruses that can spread malicious software
automatically and invisibly from machine to machine during
normal pre- and post-election activity. We have constructed a
demonstration virus that spreads in this way, installing our
demonstration vote-stealing program on every machine it
infects.

While some of these problems can be eliminated by improving
Diebold’s software, others cannot be remedied without
replacing the machines’ hardware. Changes to election
procedures would also be required to ensure security.


RABA TECHNOLOGIES LLC. TRUSTED AGENT REPORT:
DIEBOLD ACCUVOTE-TS VOTING SYSTEM (report prepared
for Department of Legislative Services, Maryland General
Assembly, Annapolis, Md., January 2004).
http://www.raba.com/press/TA_Report_AccuVote.pdf

The general lack of security awareness, as reflected in the
Diebold code, is a valid and troubling revelation. In addition, it
is not evident that widely accepted standards of software
development were followed.

Knowing the password, a smart card can be replicated, and
the voter can vote multiple times. RABA was able to guess the
passwords quickly, and access each card’s contents
(Supervisor Card, Voter Card, and Security Key Card).  Given
access to the cards’ contents it became an easy matter to
duplicate them, to change a voter card to a supervisor card
(and vice versa) and to reinitialize a voter card so that it could
be used to vote multiple times.  

The use of hardcoded passwords is surprising both as an
inferior design principle and in light of their being published
openly in the Hopkins report. It must be assumed these
passwords are well known.

The contents of these cards are neither encrypted nor digitally
signed. Thus, for example, the
PIN associated with a Supervisor Card can be read directly
from the card – provided the password is known. This means
creating Supervisor Cards is a simple task: a perpetrator could
program his card with an arbitrary PIN that the AccuVote-TS
would readily accept.
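
RABA's point is that nothing on the card cryptographically binds
its contents to a secret the terminal can verify. A minimal
sketch of the kind of check that is missing, using an HMAC as a
stand-in for whatever authentication a real design might choose
(the key, the card layout, and the values are hypothetical):

```python
import hashlib
import hmac

# Sketch of the integrity check the smart cards lack: without a MAC or
# digital signature, altered card data is indistinguishable from genuine
# data. The key and the card layout below are hypothetical.

SECRET_KEY = b"hypothetical-election-key"

def sign(card_data: bytes) -> bytes:
    return hmac.new(SECRET_KEY, card_data, hashlib.sha256).digest()

def verify(card_data: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(card_data), tag)

genuine = b"role=voter;uses=1"
tag = sign(genuine)

forged = b"role=supervisor;pin=0000"   # attacker rewrites the card contents
print(verify(genuine, tag))            # True
print(verify(forged, tag))             # False: the forgery is detected

# On the cards RABA examined, as described above, there is no tag to
# verify, so rewritten contents are simply accepted.
```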

It is reasonable to assume that a working key to the AccuVote
hardware is available to an attacker.  The hardware consists of
a touch-screen voting terminal with two locked bays. Maryland
has ordered approximately 16,000 AccuVote-TS terminals
each equipped with two locking bays and supplied with two
keys accounting for 32,000 locks and keys. Surprisingly, each
lock is identical and can be opened by any one of the 32,000
keys. Furthermore, team members were able to have
duplicates made at local hardware stores.

One team member picked the lock in approximately 10
seconds. Individuals with no experience (in picking locks)
were able to pick the lock in approximately 1 minute.

A sampling of the vulnerabilities found as a result of poor
physical security, coupled with software that fails to use robust
encryption and authentication, includes six methods of attack.
(Not reproduced herein.)

The GEMS server lacks several critical security updates from
Microsoft. The team was able to remotely upload, download
and execute files with full system administrator privileges.

The server enables the “autorun” feature. Given physical
access to the server, one can insert a CD that will
automatically upload malicious software, modify or delete
elections, or reorder ballot definitions.

The back panel of the GEMS server is not protected. Given
physical access to a running device it is possible to insert a
USB flash drive and upload malicious software onto the server.

The database files that contain the election definition (and
results) are neither encrypted nor authentication protected.

By removing the front panel of the server (this is held in place
by a small keyed lock), one can insert a CD, power up the
server, and have it boot its operating system off the CD. A
sophisticated user can automate this procedure, requiring only
a few minutes’ access to the server.

Because both the database password and audit logs are
stored within the database itself, it is possible to modify the
contents without detection. Furthermore, system auditing is
not configured to detect access to the database. Given either
physical or remote access it is possible to modify the GEMS
database.
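
A standard way to make an audit log tamper-evident, which the
report indicates GEMS lacks, is to chain each entry to the one
before it and store the final hash somewhere outside the
database. A sketch, with hypothetical log entries:

```python
import hashlib

# Sketch of a hash-chained audit log: each link's hash covers the previous
# hash, so altering any earlier entry changes every hash after it. The
# entries are hypothetical; GEMS, as described above, keeps its audit log
# (and the database password) inside the same unprotected database.

def chain_digest(entries):
    digest = b"\x00" * 32                     # fixed starting value
    for entry in entries:
        digest = hashlib.sha256(digest + entry.encode()).digest()
    return digest.hex()                       # anchor this value externally

log = [
    "2006-11-07 06:30 polls opened",
    "2006-11-07 19:30 polls closed",
    "2006-11-07 20:05 results uploaded",
]
anchored = chain_digest(log)

log[1] = "2006-11-07 21:00 polls closed"      # someone edits an entry
print(chain_digest(log) == anchored)          # False: tampering is evident
```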

The procedure by which precincts upload votes to their LBE
(local board of elections) is vulnerable to a “man-in-the-middle”
attack.

The team identified fifteen additional Microsoft patches that
have not been installed on the servers. In addition, the servers
lack additional measures (all considered best practice) for
defense, such as the use of firewall and antivirus programs, as
well as the application of least privilege, i.e., turning off the
services that are unused or not needed. Each of these represents a
potential attack vector for the determined adversary.


AVIEL RUBIN, Director of the National Science Foundation-funded
ACCURATE Center, and one of the authors, with Tadayoshi
Kohno, Adam Stubblefield, and Dan S. Wallach, of “Analysis of an
Electronic Voting System,” IEEE Symposium on Security and
Privacy, May 2004.

Also see www.avirubin.com and “On My Mind: Pull The Plug,”
Forbes Magazine, 8/2006,
http://www.forbes.com/forbes/2006/0904/040.html?partner=alerts&_requestid=2972

Why am I advocating the use of 17th-century technology for
voting in the 21st century?

The boot loader controls which operating system loads, so it is
the most security-critical piece of the machine. To (install
overwriting software), a night janitor at the polling place would
need only a few seconds’ worth of access to the computer’s
memory card slot.

If the defense against the attack is not built into the voting
system, the attack will work, and there are virtually limitless
ways to attack a(n electronic) system.


U.S. COMMISSION ON FEDERAL ELECTION REFORM, 2006.
See the Wall Street Journal article, “Reversing Course on
Electronic Voting: Some Former Backers of Technology Seek
Return to Paper Ballots, Citing Glitches, Fraud Fears,”
May 12, 2006.
http://online.wsj.com/public/article_print/SB114739688261250925-q5rh2ocioxu6mgjmS6bZPCZL0HY_20060610.html

Former Secretary of State James A. Baker III and former
President Jimmy Carter, who were co-chairmen of the
bipartisan Commission on Federal Election Reform, warned in
their 2005 final report that (fraud) could happen.

"Software can be modified maliciously before being installed
into individual voting machines. There is no reason to trust
insiders in the election industry any more than in other
industries."  


DAVID WAGNER Written Testimony, Computer Science
Division, University of California, Berkeley, submitted to the
Committee on Science and Committee on House
Administration U.S. House of Representatives, July 19, 2006:

The federal qualification process is not working. Federal
standards call for voting machines to be tested by
Independent Testing Authorities (ITAs) before the machines
are approved for use, but ITA-approved machines have:

* Lost thousands of votes across the country, and have
reported thousands more votes than voters;

* Failed to catch numerous security defects found by
academics, industry consultants and interested outsiders.

The 2005 VVSG standards contain significant shortcomings
regarding the security, reliability, and auditability of electronic
voting:

* ITAs are paid by the vendors whose systems they are
evaluating, raising conflicts of interest between the voting
public and client-vendors;

* The process lacks transparency, rendering effective public
oversight difficult or impossible;

* Technical information about voting systems is often
considered proprietary and secret by vendors, and voting
system source code is generally not available to independent
experts. In the rare cases where independent experts have
been able to gain access to source code, they have
discovered reliability and security problems;

* Testing is too lax to ensure the machines are secure, reliable,
and trustworthy;

* Many standards in the requirements appear to be ignored
during ITA testing;

* If serious flaws are discovered in a voting system after it has
been approved, there is no mechanism to decertify the flawed
system.




FURTHER READING

Author’s relevant pieces:

A colorful, 2-sided flyer summarizing the above points can be
found at
http://tinyurl.com/kwycu

“DREs, Magic and Other Sleights of Hand,” Jan. 2007, Recount
Observations and Signature Audit of Franklin County, Ohio,
Nov. 7, 2006 Election.
http://www.freepress.org/images/departments/2321.pdf


Rady Ananda is a self-employed researcher, and is trained and
experienced in legal investigations.  She has been studying
election integrity issues since November 2, 2004, contributing
research, analysis and public outreach materials to the public
domain.

This annotated bibliography was expanded from a 12-report
annotation published on www.OpEdNews.com for the J30
Coalition to present to Ohio public officials.  J30 is a Columbus-
based group of citizens interested in election integrity.