Apologies

My blog got hacked at the start of the week; that’s why you haven’t been able to get in. The hackery has been resolved with the outstanding and patient help of my hosting service, Pair Networks, and you should be able to read to your heart’s content, again.

Unfortunately, as part of the cleanup, all users had to be deleted in order to be sure all the hackers had been deleted. Those of you who wish, or wished, to comment can still do so, but you’ll have to register again. For that, too, I apologize.

Eric Hines

“Should AI Have Access to Your Medical Records? What if It Can Save Many Lives?”

The Wall Street Journal asked that question last week. And their subheadline:

We asked readers: Is it worth giving up some potential privacy if the public benefit could be great?

A good many of the published answers centered on Yes, with oversight by, among others, medical professionals.

This reader (unpublished in the WSJ) says, resoundingly, No. Not now, and not for the foreseeable future, say I. Personal data aggregators, whether government or private enterprise, have shown no ability to protect our personal data, whether from hackers or from organizational carelessness, incompetence, or ignorance. With our medical data especially, very good protection, even six sigma-level protection, isn’t good enough. This is one of the few areas where perfection must be the standard. Since that’s an unachievable standard, AIs must not be permitted any access to our personal data, including our personal medical data.

There are additional reasons for saying no. One is the inherent bias programmers build into AIs. Alphabet’s overtly bigoted Gemini is an extreme example, but the programmers build their biases into AIs through the data sets they use and have their AIs use in training.

There’s also the just as overt bigotry too many medical training institutions apply through their emphasis on diversity, equity, inclusion claptrap at the expense of teaching actual medicine. Those institutions are producing the doctors that would be the second generation of “medical” professionals doing the oversight.

In the current state of affairs, and for that foreseeable future, it’s not feasible to let AIs into any aspect of our personal lives. The blithely assumed public benefit is vastly overwhelmed by the threat to our individual privacy—the “public,” after all, is all of us individuals aggregated.

Is PRC-Level Surveillance Coming to California?

California, whose gas taxes are among the highest in the nation, is on net losing revenue from those taxes as ICE motorists drive less and the number of motorists driving battery cars increases. The Progressive-Democratic Party, which reigns over California, is looking hard at implementing a…solution…straight out of the People’s Republic of China. Party is

piloting the idea of a “road charge,” which would charge drivers based on the number of miles they drive rather than how much gas they purchase.

So far, driver participation is voluntary, but when the pilot program becomes a permanent one, look for participation to become mandatory. How would the State track the number of miles driven? That’ll be via uplink of odometer readings to the California government.

It’s a short step from there to uplinking all the places the motorists’ cars stop, and the routes the cars took to get there.

At least nanny states can claim to be looking out for the welfare of their citizens. This is Party looking out for its own welfare by snooping increasingly into citizens’ lives.

Concerns Regarding “Unreasonable” Searches

There are concerns that a bill under consideration in the House, the Fourth Amendment Is Not For Sale Act, goes too far in protecting us Americans from 4th Amendment violations by the government at the expense of our counterintelligence capabilities.

The bill…would ban the government from buying information on Americans from data brokers. This would include many things in the cloud of digital exhaust most Americans leave behind online, from information on the websites they visit to credit-card information, health information, and political opinions.

Worse, goes the argument, the bill

would prohibit the US government from buying digital information that would remain available to the likes of China and Russia.

That last is a non sequitur, though. The fact that the data are readily available to our enemies doesn’t legitimize their collection by our government, which faces Constitutional bars against most kinds of searches. It’s further the case that if we can’t be secure against the unwarranted [sic] intrusions of our own government, how can we expect our own government to keep us secure from the intrusions of foreign governments, especially enemy foreign governments?

There also is a misunderstanding buried in the claim regarding that digital exhaust [that] most Americans leave behind online. A significant fraction of that “digital exhaust” is not voluntary; it’s left behind as a condition of doing business with those enterprises that require collection of the data. Some of those data are legitimately needed by businesses: credit card account numbers if payment is being offered via credit card, shipping addresses so the seller can deliver the product, personal names so the seller can be sure of the credit card numbers and shipping addresses, and the like. Other data are demanded by the business as a condition of doing business with the customer for reasons unique to the specific enterprise.

Better would be to bar the sale, rather than bar the purchase, of such data.

That sale, too, should be barred universally, not just with respect to our government, within the following boundaries. All data that an enterprise demands be collected in order to do business must be barred from sale or any other transfer to any other entity, whether government or not. There should be no default position, no opting in or out; the sale or transfer of these data should be prohibited outright. Government legitimately can still access those data on presentation in court of a probable cause, supported by Oath or affirmation, and particularly describing the [data] to be searched, and the [data] to be seized. Voluntarily left data should require affirmative opt-in before those data can be sold or transferred. Failure to choose should be taken as not opting in—the enterprise cannot sell or transfer the data.