04 January 2006

EULA

Just about any piece of software you buy, or even just download and use for free, demands that you sign an ā€œend user license agreementā€ — EULA for short — a legal contract between you and the software maker. It may come up in one of the endless dialogue boxes you click through during the initial install. Or it may come with the CD, with a note saying that opening the shrink wrap constitutes agreement to this contract. Or it may come in some other form.

You may never have noticed this. You may have started blocking it out after years of dealing with software. But I want to talk about how very weird it is.

First, an ironic comment:



A Eulogy for the EULA by factoryjoe

See, EULAs basically say two things. The software manufacturer isn’t responsible for anything bad that happens to anyone as a result of using the software. And the software manufacturer doesn’t promise that the software will work consistently, or at all, or really do anything.

And they say it in dense, verbose, unreadable legal language—pulling an example off of the web practically at random, I find a EULA from Microsoft that runs over 4700 words ... and in the first paragraph, it says that might not be all of it!

An amendment or addendum to this EULA may accompany the Product

Think about it for a minute. Most Americans, and practically all American corporations, are bound by a mountain of legal language generated by the software industry. I’m pretty sure that I personally am bound by more words of legal language from software EULAs than from every other source put together.

Mind you, we agree to these things under coercive circumstances in which we cannot possibly reflect on the legal consequences. Ian Goldberg and Kat Hanna tell a story about trying to confront Dell about the opacity of its EULA, and the series of misadventures they have just trying to get a straight answer about what it contains.

He said he installs things all the time without reading the license agreements. He says I should just do that. I ask if he's really telling me to lie and to agree to legal documents I haven’t seen.

This is madness of Kafkaesque proportions, and its purpose is to excuse the software industry from making any quality guarantees for its products. What other industry imagines it could even attempt such a scam? Not promising that its products will work at all?

Last year, I was at a client’s office and I heard that some US congressional representative was kicking around the office somewhere, doing a tour of Silicon Valley and looking to hear from industry leaders what Congress should be doing to help the Dynamic World of High Tech fulfill the promise of the Economy of Tomorrow ... yadda yadda yadda. Though neither I nor the folks I was meeting with were going to meet the rep, we spent a few minutes brainstorming about what legislation would be good for the industry.

I said we need legislation limiting these damned EULAs. It shouldn’t be legal to make your customer sign a contract saying they don’t mind if your product doesn’t work. Aside from being unreasonable, and maybe even immoral, it’s bad for the industry. The ability to make folks agree to a restrictive EULA creates a race to the bottom. If you’re a company committed to spending the time and money to make an airtight, reliable working software system, you have to compete with companies publishing slipshod crap that looks the same on the surface. Sure, in the long run you’ll win customer loyalty ... but that’s if you survive long enough to see the long run, which is a tough bet in the software industry.

So EULAs make the industry a competition to see who can get the sloppiest code out the door fastest. That’s not just bad for product users, it’s bad for the companies, which are compelled to ignore questions of quality, producing the rushed production schedules, poor planning, and indifference to customers’ needs that are endemic to the industry. A little legislation saying that you can’t write the kind of EULAs that everyone writes now would transform the whole industry for the better, and serve the public, too.


A forthcoming documentary on this subject.


Over on Mastodon, Sindarina, Edge Case Detective lays this out more generally:

There is only one form of valid user consent:

INFORMED, ENTHUSIASTIC CONSENT, REVOCABLE.

That’s it. No ifs, no buts.

The user must understand what they are giving consent for, and the scope for which their consent is valid.

They must be enthusiastic, wholly onboard with the decision, not begrudgingly agreeing to it because they feel like they have no other choice.

And they must be able to revoke that consent at any time, whether five minutes from now, or five years in the future.

Their consent should be time-limited, and expire automatically when they no longer interact with your service or product.

If you change the scope, you need to ask for their consent again, and make sure they understand the impact of the changes you are making.

The scope includes who owns and operates the service or product. If you want to be acquired, you need to ask for their consent again.

User consent is NOT transferrable, period, no matter what modern terms of service claim.

Most people in tech do not want to hear this, because it invalidates the vast majority of their business models, AI/ML training data, business intel operations, and so forth. Anything that’s based on gathering data that is ā€œpublicā€ suddenly becomes suspect, if the above is applied.

And yes, that includes internet darlings like the Internet Archive, which also operates on a non-consensual, opt-out model.

It’s so ingrained in white, Western internet culture that there are now whole generations who consider anything that can be read by the crawler they wrote in a weekend to be fair game, regardless of what the user’s original intent was.

Republishing, reformatting, archiving, aggregating, all without the user being fully aware, because if they were, they would object.

It’s dishonest as fuck, and no different from colonial attitudes towards natural resources.

“It’s there, so we can take it.” šŸ˜’

Oh, and also, fuck off with the patronising “lawl, don’t you know the NSA monitors you anyway?!” that seems rife among people from infosec circles.

People KNOW this. Just because some three or four letter agency is hard to fight against doesn’t mean that the objection against the next tech bro wanting to invisibly index our data is invalid.

Informed, enthusiastic consent, revocable. Or get the fuck out.
