How the Bellingham AI Contract Scandal Exposed Serious Algorithmic Bias Risks in Local Government
The City of Bellingham is under intense scrutiny after an independent investigation revealed a staffer allegedly used ChatGPT to "tip the scales" in a major city contract. This case highlights the growing dangers of algorithmic bias and the urgent need for strict AI governance in public procurement.
A Modern Veneer on Aging Procurement Issues
In early 2026, the City of Bellingham, Washington, found itself at the center of a national conversation regarding artificial intelligence and ethical governance. What began as a standard effort to modernize utility billing software quickly spiraled into a controversy involving allegations of "bid rigging" with a high-tech twist. Public records recently brought to light show a city employee allegedly using ChatGPT to generate contract requirements specifically designed to favor one vendor while excluding another.
This incident has sent shockwaves through local governments across the United States. While the use of AI for efficiency is often encouraged, this case demonstrates how easily these tools can be weaponized to bypass fair competition. According to experts cited by Cascade PBS, the staffer reportedly asked the AI to "tailor" requirements to ensure a specific Houston-based company won the bid, despite its proposal being significantly more expensive than competitors'.
The Mechanics of Algorithmic Bias in Procurement
How exactly does a chatbot influence a million-dollar government contract? In the Bellingham case, the employee used generative AI to draft a "requirements matrix"—the technical checklist used to score potential vendors. By inserting highly specific, AI-generated language that only the preferred vendor could meet, the staffer effectively automated the exclusion of rival firms. This highlights a critical vulnerability: algorithmic bias isn't always an accident of the data; sometimes, it is a deliberate human choice coded into the system.
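To make the mechanics concrete, here is a minimal, purely illustrative sketch of how a weighted requirements matrix scores vendors. All vendor names, criteria, and weights are invented for this example; the point is that a single overly specific, heavily weighted requirement can mechanically guarantee one bidder's victory.

```python
# Illustrative sketch only: hypothetical requirements matrix with weights.
# "proprietary_module_x_integration" stands in for the kind of narrow,
# tailor-made criterion alleged in the Bellingham case.
REQUIREMENTS = {
    "supports_utility_billing": 1.0,          # legitimate, broad requirement
    "iso_27001_certified": 1.0,               # legitimate security requirement
    "proprietary_module_x_integration": 3.0,  # so specific only one vendor qualifies
}

# Invented vendor capability sets for illustration.
VENDORS = {
    "Vendor A": {"supports_utility_billing", "iso_27001_certified",
                 "proprietary_module_x_integration"},
    "Vendor B": {"supports_utility_billing", "iso_27001_certified"},
}

def score(capabilities: set[str]) -> float:
    """Sum the weights of every requirement a vendor satisfies."""
    return sum(w for req, w in REQUIREMENTS.items() if req in capabilities)

for name, caps in VENDORS.items():
    print(f"{name}: {score(caps)}")
```

Run against these toy inputs, the tailored criterion more than doubles Vendor A's score relative to an otherwise equally qualified rival, which is exactly why a single AI-drafted line item can decide a contract before bids are even opened.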
Mayor Kim Lund has since launched an independent investigation, describing the allegations as "serious." The core of the issue lies in the "permissive approach" many cities have taken toward AI. While tools like ChatGPT can help draft emails or summarize reports, using them to create legal frameworks for public spending without oversight creates a "black box" where transparency goes to die. Without human-in-the-loop verification, the line between efficiency and corruption becomes dangerously thin.
National Implications for Public Sector AI
Bellingham is not alone in its struggle to keep pace with rapid technological shifts. Across Washington state and the broader U.S., policies are trailing behind adoption. A recent report from KNKX points out that while nearly 80% of local government IT directors are concerned about the lack of clear AI regulations, many employees are already using these tools daily for sensitive tasks.
The risks extend beyond just contract favoritism. Algorithmic systems used in public sectors can inadvertently perpetuate social inequities. For instance, if an AI is used to screen job applicants or determine social service eligibility, it may rely on historical data that reflects existing biases against marginalized groups. The Washington State AI Task Force is currently racing to finalize recommendations by July 2026 to address these "high-risk" uses of technology.
Building a Framework for Trustworthy AI
To prevent future scandals, experts suggest that local governments must move away from "permissive use" and toward "structured governance." Key steps include:
- Mandatory Disclosure: Any AI-generated language used in public documents should be clearly cited and reviewed by a separate legal team.
- Bias Auditing: Regular audits of procurement outcomes to ensure that AI-assisted decisions aren't disproportionately favoring specific types of vendors.
- Staff Training: Moving beyond "how to use" AI and teaching employees the ethics of "when not to use" AI.
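The bias-auditing step above can be sketched in a few lines. This is a hypothetical example with invented award data, not an actual audit tool: it simply flags any vendor whose share of AI-assisted contract wins crosses an arbitrary threshold, the kind of simple frequency check an auditor might start from.

```python
# Hypothetical procurement-outcome audit sketch. The award records and the
# 50% threshold are invented for illustration.
from collections import Counter

# Each record: (winning_vendor, was_the_bid_process_AI_assisted)
awards = [
    ("Vendor A", True), ("Vendor A", True), ("Vendor A", True),
    ("Vendor B", False), ("Vendor C", False), ("Vendor B", False),
]

# Count wins only among AI-assisted procurements.
ai_wins = Counter(vendor for vendor, ai_assisted in awards if ai_assisted)
total_ai = sum(ai_wins.values())

for vendor, wins in ai_wins.items():
    share = wins / total_ai
    if share > 0.5:  # arbitrary audit threshold for illustration
        print(f"FLAG: {vendor} won {share:.0%} of AI-assisted contracts")
```

A real audit would of course need far more context (bid pools, pricing, requirement provenance), but even a crude check like this would surface the pattern alleged in Bellingham: one vendor sweeping every AI-assisted award.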
As Bellingham awaits the results of its independent investigation, the lesson for other cities is clear: efficiency must never come at the cost of equity. The "spirit of public procurement" relies on a level playing field, and as we move further into the age of automation, maintaining that balance will require more than just better software—it will require unwavering human accountability.