LI + AI = GIGO



Please enjoy this article in The Stack, in which two Glaswegians, including yours truly, sit in the late summer sun and have a natter about the news that LinkedIn has not only decided to train its AI on its users’ data and content, but has actively opted in UK users without their knowledge or consent.

As the article notes, the active nonconsensual opt-in does not apply to people in the EU, EEA, and Switzerland. Why?

We’ll put it this way: 404 subsequently reported that LI is also applying the active nonconsensual opt-in to users outside those regions, not just the UK.

You might think the reasoning here is obvious, meaning that it’s based in the EU corpus of data protection law, but I suspect it might well be something else.

You see, the more you try to use LinkedIn for the things you actually need it to do, the more of its nonsense it pushes at you. It’s how you find yourself discovering yet more layers of nonsense that reveal some quite worrying problems with the site’s architecture. Those problems may well render LI’s AI vision impossible in Europe. I’ll explain why.

But before I get into that, we shall have

THE AIRING OF GRIEVANCES

Skip to the next section if this isn’t for you.

If it is, please read this section in the shouted voice of Jerry Stiller.

I hate LinkedIn. Despise it.

It’s a necessary evil for job searching, and for keeping in touch with colleagues in a sector like mine where most people work on short-term project contracts. Aside from that, I hate everything about it.

I personally equate it to having the most toxic-positive-manipulative American you’ve ever met sitting next to you, blinding you with the colour of their teeth, overwhelming you with sickly-sweet inspirational quotes and juvenile cheerleading, whilst constantly exhorting you to play the games – meaning the gamification – which the site forces on you at every click.

Upgrade to premium! Wish someone a happy work anniversary! Download our mobile app! Take a survey! Join a group! Enhance your profile! Write a post! Play a game! Follow an influencer! Oh do fuck off.

It’s as if the whole UI and UX are designed to nudge you into dropping whatever it was you came to the site to do, so you can instead play their made-up games and score their made-up points towards becoming a toxic-positive-manipulative American too. That’s not you, it’s not me, and that is frankly not who anyone reading this post should aspire to become.

The site, its business model, and the algorithms that drive it make no attempt at either localisation or relevance. Sometimes this is silly, as in that time the site forced all global users to replace their physical location with a wider area, using American terminology which is simply not used here. (“Greater Glasgow Area”? Seriously? WTF is that?)

Sometimes this goes deeper, such as the site’s insistence – and this is such a case study in cultural bias – that the postsecondary school you entered at 18 is the most important institutional relationship of your life as well as the definition of your identity. Again, nope. Only America does the “rah rah go school” shit. For the rest of the world, your undergraduate university is transactional. You goes in, you gets what you need, you leaves. You don’t spend the rest of your life wearing a tacky ring and shouting “go pedos!”. Tell that to LinkedIn, which is algorithmically programmed to elevate any rando who has ever set foot on a campus you haven’t set foot on in over two decades into the most important person in your network. Yes, you worked intensely with a great team based in Brussels for a while last month. But here’s a rando from “~your school~” who LinkedIn thinks matters more. Rinse and repeat.

And if your luck works out so that, 25 years later, you’re adding a new postgraduate experience, LinkedIn still wants to treat you like an 18-year-old American, as it nudges you to add your “activities and societies”. Once again: we just do not do that here. Or anywhere.

I mean, technically, we do have marching bands in Glasgow. Good luck explaining that to an American corporate executive.

And then there’s the job searching.

The first thing you need to know about that is how much of LinkedIn’s UX, both in the jobs bit and across the wider site, is fake. You can click “I’m not interested in this job” a thousand times. It will still appear. You can click “I’m not interested in this employer” a thousand times. Same. You can click off a screen full of “people you may know” (who are people you met once at a breakfast networking event in 2009) a thousand times. Same. (I’m not kidding here, folks – LinkedIn seems algorithmically fixated on networking events I attended as a small business owner in 2008-2009.)

When you’re out of work, living on UC, and job searching 40 hours a week, your energy levels are rock bottom. Your emotional resilience is nonexistent. Same goes for your tolerance for bullshit. And yet on LinkedIn, you’re not just dealing with fake UX showing you jobs which you have indicated, so many times that you’re on the verge of tears, aren’t for you.

You’re also dealing with LinkedIn’s algorithm insisting that certain jobs are perfect for you. In last year’s job search, LinkedIn’s litany of specially curated “your profile matches this job” selections included, and I am not making any of these up:

  • HRH Princess Anne’s travel coordinator
  • Online journalist, “Yacht Buyer” magazine
  • An in-home euthanasia and palliative care veterinarian (in which I would drive around London being the actual angel of death for geriatric pets. I’m not in London, and I don’t drive)
  • A secretarial role with a large, high net worth, high maintenance family based in Palm Beach, Florida, which owns a lot of documents for some strange reason (you gotta see this)
  • Sushi chef
  • Integrity investigator at the English FA, based at Wembley Stadium
  • Nuclear terrorism expert at the Home Office
  • A Danish language localisation translator, which is a weird take on my love of “Borgen”
  • A volunteer writer specialising in Saint Augustine
  • Housekeeping assistant (e.g. pillow fluffer), The Royal Household’s estates in Scotland

While we’re on this: one of the gamified nudges LinkedIn constantly pushes at you, when you’re out of work, is to add an “open to work” banner to your profile. FOR THE LOVE OF CHRIST DO NOT DO THIS. Let me tell you what this does (I’m still shouting in Jerry Stiller’s voice, people). All it does is set your DMs to “open” so that anyone can contact you. That’s it. In other words, it removes your quality pre-screening filter.

What results is two things: one, the usual deluge of fuckwit recruitment consultants who literally do not read your profile before sending you completely irrelevant jobs in other sectors or cities, because they’ve gotta hit that daily quota.

Two, and yes you knew this was coming, is completely random men contacting you to mansplain what they think it is you are doing wrong with your job search.

One of whom, again I am not making this up, found my phone number, randomly phoned me when I was loaded with a head cold, and immediately started speaking at me, without a hello, in Russian. Why? Because he’d spotted it listed as my undergraduate minor from the 1990s. He then proceeded to speak at me for fifteen minutes, reading my entire LinkedIn profile back to me over the phone – “Oh, and I see you’ve done that too! Wow! You have a lot of broad experience! That should help you to get a job!” – as I stood there, dripping with snot, contemplating whether or not hiring a hitman to get rid of someone is in fact an ethical act.

OK so I gotta wind up the ranting here, but here’s the point you should be taking from all of this.

None of this nonsense – the wrong jobs, the wrong connections, the wrong schools, the wrong everything – should be happening.

Because LinkedIn already has all the correct data that should be getting you the right results. They have that data because you put it there when you updated your profile.

And yet, somewhere along the line, LinkedIn decides that you aren’t good enough for you. You’ve gotta play stupid games to win stupid prizes, and transform yourself, by their ridiculous American standards, into an entirely different person.

Now let me show you what I mean by “entirely different person”.

Am I not good enough for you?

Obviously AI is all the rage now, but LinkedIn has been experimenting with janky machine learning and AI generated content for years. One area where they implemented this, a good four or five years ago, was in creating the ability to automatically generate a profile summary. In theory, that would take the data in your profile and spin it into a rather useful paragraph or two, right?
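
For what it’s worth, the honest version of that feature isn’t hard to picture. If the generator only ever read the structured fields you have actually filled in, a faithful summary would be little more than string assembly. Here’s a minimal sketch of that idea (my own illustration, with made-up field names and a made-up profile; it has nothing to do with LinkedIn’s real code):

```python
# A toy "summary from your own profile fields" generator.
# Everything here is hypothetical and purely illustrative.
from dataclasses import dataclass, field


@dataclass
class Profile:
    name: str
    headline: str
    location: str
    roles: list[str] = field(default_factory=list)   # most recent first
    skills: list[str] = field(default_factory=list)


def summarise(profile: Profile) -> str:
    """Build a summary strictly from the profile's own fields; no outside data."""
    current = profile.roles[0] if profile.roles else "an unspecified role"
    top_skills = ", ".join(profile.skills[:3]) or "unspecified skills"
    return (
        f"{profile.name} is a {profile.headline} based in {profile.location}, "
        f"currently working as {current}. Key skills: {top_skills}."
    )


if __name__ == "__main__":
    me = Profile(
        name="Example Person",
        headline="tech policy specialist",
        location="Glasgow",
        roles=["a policy consultant on short-term project contracts"],
        skills=["data protection", "web standards", "policy writing"],
    )
    print(summarise(me))
```

The point of the toy is the constraint: every claim in the output can be traced back to a field the user typed in. Hold that thought.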

So I thought, let’s just push that button and see what it says.

This is what it said.

That’s … not me.

Not even close.

That is a list of things I do not and never have done, places I have never worked, skills I don’t have, and tasks I just don’t work on.

It’s 100% wrong. So wrong that it’s as if it was pulled off someone else’s profile entirely.

So where the hell was LinkedIn getting that information from, given that – again – all the correct information was right there in my profile?

Is that a peek into what’s in the back end of your profile too, regardless of the data you’ve actually input in the front? Does that explain why my algorithmic results are so consistently wrong across the board – network contacts, jobs, and so forth?

Seriously, LinkedIn, how do you fail to understand the task that badly?

So let’s bring this back to yesterday’s news about LI’s nonconsensual opt-in to AI training.

Garbage in, garbage out

Here’s what I think may be happening here.

The fact that LinkedIn is not rolling out this generative AI model in the EU, EEA, or Switzerland – places which unconditionally require opt-in consent, while also giving users the right to challenge and correct inaccurate information that companies hold about them – is a bit of a confession to me.

It tells me that they know their site is built on data slop, and that slop is a feature, not a bug.

As I’ve shown you above, LinkedIn was generating bizarre data hallucinations long before AI made it trendy. Those data hallucinations are built using unknown incorrect data from unknown sources which are, nevertheless, intrinsically tied up with correct data about people’s professional identities.

Those data hallucinations are LinkedIn’s content.

That’s the content they’re training their AI on.

Slop inputs create slop outputs. GIGO.
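
To make GIGO concrete, here’s a deliberately dumb toy (mine, not a description of LinkedIn’s pipeline): a “model” that simply memorises whatever phrases it has seen alongside a skill in its training corpus. Slip a couple of misattributed records into that corpus and the generated “summary” hands them straight back, right next to the true ones:

```python
# A toy GIGO demonstration. Nothing here describes LinkedIn's actual systems.
from collections import defaultdict

# Training corpus: (skill, phrase) pairs. The last two records are slop -
# claims that were never true of the person being described.
training_corpus = [
    ("data protection", "advises on GDPR compliance"),
    ("policy writing", "drafts public policy briefings"),
    ("web standards", "contributes to open web standards work"),
    ("data protection", "manages a yacht brokerage"),       # garbage in
    ("policy writing", "performs in-home pet euthanasia"),  # garbage in
]


def train(corpus):
    """'Learn' by memorising every phrase seen alongside each skill."""
    model = defaultdict(list)
    for skill, phrase in corpus:
        model[skill].append(phrase)
    return model


def generate_summary(model, skills):
    """Stitch together everything the model 'knows' about the given skills."""
    claims = [phrase for skill in skills for phrase in model.get(skill, [])]
    return "This person " + "; ".join(claims) + "."


if __name__ == "__main__":
    model = train(training_corpus)
    # Garbage out: the misattributed records surface beside the correct ones.
    print(generate_summary(model, ["data protection", "policy writing"]))
```

Swap the toy for a real language model and a few hundred million profiles and the principle does not change: if misattributed or invented records are tangled into the training data, they come out the other side as confident prose about you.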

That output of slop is so bad that they can’t even try to roll it out in Europe.

And they know that, despite all the American gamified rah-rah. They know that.

There you go.

The Author

I’m a UK tech policy wonk based in Glasgow. I work for an open web built around international standards of human rights, privacy, accessibility, and freedom of expression. The content and opinions on this site are mine alone and do not reflect the opinions of any current or previous team.

2 Comments

  1. As someone currently using LI to look at jobs I felt every word of this. Spot on.

    • Special commiserations if, as I did last year, you have to explain this state of play every week to a sour-faced boomer at the Jobcentre, who is determined to force you into a secretarial or janitorial role to punish you for having ideas above your station.
