“To Govern AI, We Must Govern Compute” | Lawfare | “…and to protect humanity we must ban big, fast computers”

This blog, in February:

“we should totally expect to see legislation, quite soon, telling us that “nobody should be permitted more than [so many bogoMIPS, computrons, or teraflops] at home””

https://alecmuffett.com/article/109130

Now comes Lawfare:

The importance of compute to AI capabilities and the feasibility of governing it make it a key intervention point for AI governance efforts. In particular, compute governance can support three kinds of AI governance goals: It can help increase visibility into AI development and deployment, allocate AI inputs toward more desirable purposes, and enforce rules around AI development and deployment.

Visibility is the ability to understand which actors use, develop, and deploy compute-intensive AI, and how they do so. The detectability of compute allows for better visibility in several ways. For example, cloud compute providers could be required to monitor large-scale compute usage. By applying processes such as know-your-customer requirements to the cloud computing industry, governments could better identify potentially problematic or sudden advances in AI capabilities. This would, in turn, allow for faster regulatory response.

https://www.lawfaremedia.org/article/to-govern-ai-we-must-govern-compute

…and

One enforcement mechanism discussed in our paper is physically limiting chip-to-chip networking to make it harder to train and deploy large AI systems

…and

The power to decide how large amounts of compute are used could be allocated via digital “votes” and “vetoes,” with the aim of ensuring that the most risky training runs and inference jobs are subject to increased scrutiny. While this may appear unnecessary relative to the current state of largely unregulated AI research, there is precedent in the case of other high-risk technologies: Nuclear weapons use similar mechanisms, called permissive action links (security systems that require multiple authorized individuals in order to unlock nuclear weapons for possible use).

In their defence, they do make a small nod to the possibility that such controls could have illiberal consequences, but they don’t appear to be very worried about that in proportion to the rest of the content; certainly they believe it is solvable with even more regulation.
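The “votes and vetoes” mechanism quoted above amounts to an m-of-n approval gate over compute jobs. A minimal sketch of that idea, with invented party names and thresholds (the paper specifies no concrete mechanism):

```python
from dataclasses import dataclass, field


@dataclass
class ComputeAuthorization:
    """Hypothetical gate for a large training run or inference job."""
    authorized_parties: frozenset  # who may vote
    quorum: int                    # approvals required (m of n)
    approvals: set = field(default_factory=set)
    vetoes: set = field(default_factory=set)

    def approve(self, party: str) -> None:
        if party not in self.authorized_parties:
            raise PermissionError(f"{party} is not an authorized party")
        self.approvals.add(party)

    def veto(self, party: str) -> None:
        if party not in self.authorized_parties:
            raise PermissionError(f"{party} is not an authorized party")
        self.vetoes.add(party)

    def may_proceed(self) -> bool:
        # A single veto blocks the run, mirroring how permissive action
        # links require every interlock to be satisfied before use.
        return not self.vetoes and len(self.approvals) >= self.quorum


auth = ComputeAuthorization(
    authorized_parties=frozenset({"regulator", "provider", "auditor"}),
    quorum=2,
)
auth.approve("regulator")
auth.approve("auditor")
print(auth.may_proceed())  # True: quorum met, no vetoes
auth.veto("provider")
print(auth.may_proceed())  # False: any veto blocks the run
```

Which, of course, is exactly the kind of chokepoint that makes the illiberal-consequences question unavoidable: whoever holds the veto holds the technology.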

So, back to ITAR and Cryptowars 1.0 thinking, then?

Previously:

California State Senator Pushes Bill To Remove Anonymity From Anyone Who Is Influential Online | Techdirt

So many problems with this naive concept… and not just the constitutional ones.

This bill would require a large online platform, as defined, to seek to verify the name, telephone number, and email address of an influential user, as defined, by a means chosen by the large online platform and would require the platform to seek to verify the identity of a highly influential user, as defined, by asking to review the highly influential user’s government-issued identification.

https://www.techdirt.com/2024/03/28/california-state-senator-pushes-bill-to-remove-anonymity-from-anyone-who-is-influential-online/

Neal Weigel, RIP

Neal first taught me to code. He and Wanda were a cosmopolitan pair who always had interesting tea, an Apple //e, and the latest copy of Scientific American on hand.

Alas.

A few years ago he was doing close-up magic on a cruise ship and ran into somebody from Facebook Engineering, and explained that he had taught me to code, which I was later happy to confirm. It was a fun reconnection.

https://www.linkedin.com/in/neal-weigel-8130ab48

Clause 14 – Powers to obtain communications data: 25 Mar 2024: House of Commons debates | TheyWorkForYou

Sometimes the Scottish National Party says the right thing:

We get the motivations for this Bill; they are understood and we are sympathetic with some of what the Bill seeks to achieve. However, we are not convinced that all the powers are shown to have been necessary and proportionate and that there are not other ways to get to where those seeking the new powers need to be.

https://www.theyworkforyou.com/debates/?id=2024-03-25b.1350.0#g1352.1

7 years ago, Alec Muffett on Twitter: “…the Home Secretary is literally calling for certain hashtags or words to elicit censorship/blocking on social media”

This is a debate which is now playing out everywhere, especially in the USA, where the First Amendment people are pointing out that there needs to be a test to distinguish government suppression of content from mere government advice regarding it.

Still scary as…

TruthSocial’s $11bn valuation rests on old Mastodon code which may be poorly maintained

Whatever people have paid for, it is not the engineering capability.

Click through to the entire thread because it is all worth a read and has several updates.

I don’t know who needs to hear this but #TruthSocial, which is running a forked version of Mastodon, does not from the source code appear to have appropriate mitigations in place for CVE-2023-36460, which theoretically allows attackers to create and overwrite any file Mastodon has access to, allowing Denial of Service and arbitrary Remote Code Execution https://nvd.nist.gov/vuln/detail/CVE-2023-36460 (probably other CVE’s as well, but some rely on federation which Truth Social doesn’t use?) #infosec

https://digipres.club/@ryanfb/112146904149736275
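The CVE cited above concerns crafted media letting Mastodon create or overwrite files outside its intended directories. As a generic illustration of the mitigation class (this is not Mastodon’s actual fix, and the paths are invented), resolving a destination path and refusing anything that escapes the upload root defeats traversal-style filenames:

```python
from pathlib import Path

# Hypothetical upload directory for illustration.
UPLOAD_ROOT = Path("/var/lib/app/uploads")


def safe_destination(untrusted_name: str) -> Path:
    """Return a write path under UPLOAD_ROOT, or refuse.

    Resolving the candidate collapses any "../" components, so a name
    like "../../etc/cron.d/evil" ends up outside the root and is rejected
    before any file is created or overwritten.
    """
    candidate = (UPLOAD_ROOT / untrusted_name).resolve()
    if not candidate.is_relative_to(UPLOAD_ROOT.resolve()):
        raise ValueError(f"refusing path outside upload root: {untrusted_name!r}")
    return candidate
```

An unpatched write path that skips this kind of containment check is what turns “process this media file” into arbitrary file overwrite, and from there into the denial of service and remote code execution the CVE describes.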

UK counter-eavesdropping agency gets slap on the wrist for eavesdropping

If you give the state a surveillance power it will be recklessly or wilfully misapplied, misused, or otherwise abused.

UK NACE is tasked with protecting the country’s most sensitive information and sites both in Britain itself and in embassies around the world. As part of this duty, the agency was given new powers back in October 2021 to acquire communications data — the legal term for metadata — in the interests of national security.

The new powers came with increased oversight. According to the 2022 annual report of the oversight body — the Investigatory Powers Commissioner’s Office (IPCO) — published Monday, UK NACE had “a high incidence of errors” and the agency was acquiring communications data without the appropriate authorizations.

“Of most concern, we identified five authorisations (resulting from one single tasking) to identify a journalistic source in respect of which UK NACE had failed to seek the requisite approval from a Judicial Commissioner,” stated the IPCO report.

Despite the lack of the necessary approval — intended to be a crucial safeguard for protecting journalists’ sources — UK NACE was successful in acquiring this data, although it was not able to identify a journalistic source.

https://therecord.media/uk-nace-unlawful-surveillance-journalistic-source