Former Google Consultant Advises Transparency Measures for Big Tech Following AI Backlash

DataGrade CEO Joe Toscano suggests policies for accountability and transparency in the wake of Google’s Gemini AI struggles.

In the aftermath of the recent backlash against Google’s Gemini artificial intelligence (AI), former Google consultant and DataGrade CEO Joe Toscano has shed light on the root cause of the controversy and proposed measures for transparency in the AI industry.

Speaking exclusively to Fox News Digital, Toscano attributed Google’s predicament to launching Gemini prematurely in response to the competitive surge in generative AI products. The unveiling of groundbreaking tools like ChatGPT and Midjourney led Google to rush the release of Gemini, resulting in eyebrow-raising responses and historically inaccurate images.

Toscano emphasised that building an AI capable of answering diverse questions in multiple languages is an extraordinary feat. Still, he believes the product suffered from improper training rather than malicious intent. Google has since apologised for the erratic responses and paused the image generation feature, promising fixes before its re-release in the coming weeks.

While acknowledging the difficulty in creating intelligent systems, Toscano suggested that Silicon Valley adopt policies to insulate companies from harm and bring transparency to the public. He called for documenting AI processes, including decisions, people involved, and training data—a move to hold companies accountable for their AI outputs.


In 2023, most AI-related laws were part of comprehensive state consumer privacy laws, with federal guidelines emerging but no specific legislation passed. Toscano advocated for companies to undergo “algorithmic audits,” independent reviews conducted by professionals with specialised expertise, to avoid regulatory capture. He stressed the importance of controls preventing companies from influencing audit results, comparing the system to drug testing in professional sports. Regular, intermittent audits would ensure safety and guard against reckless negligence.

Expressing concerns about archiving information, Toscano warned about the potential consequences if governments and companies leverage AI to control narratives. He argued that physical materials, like paper, could gain value as a more concrete way to assess truth versus fiction in historical documentation.

In response to Meta’s recent outage, Toscano highlighted the vulnerability of relying solely on digital records. He shared his practice of creating digital logs with paper backups to ensure continued operations in case of internet disruptions.

Toscano concluded with a stark observation about modern warfare: the invisible hand that controls information ecosystems. According to Toscano, deleting archives and manipulating information has become a powerful tool for shaping societal narratives, influencing conversations from the macro to the micro level, and driving the future of democracy.
